Things change in different ways.
A couple of weeks ago, I took a deep dive into Glenda Eoyang’s Human Systems Dynamics, learning about her theory of complexity and getting my hands on the tools and methods that HSD uses to work in complex adaptive systems. (The tools are very good by the way, and highly recommended as ways to both get a good introductory grasp on complex problems, and work within those contexts to make decisions and lead).
One of the useful ways of looking at things concerns the kinds of change that happen, and if you’ve been reading my blog lately, you’ll know that accurately describing your theory of change is a key discipline for me.
In HSD we talk about three kinds of change: static, dynamic and dynamical. I’m not 100% sold on the terminology, but I invite you to think of these as ways of describing the start and end points of an intervention.
Static change begins and ends with a fairly stable system. An example is nailing drywall to a frame. You start with a frame, a sheet of drywall and some nails. The act of change is a predictable and controllable action that fastens the drywall to the framing and creates a wall. The system is stable to begin with and stable after the intervention.
Dynamic change is full of motion and movement, but that motion follows a predictable trajectory, and it too has a fairly stable beginning and end point. To extend our metaphor, this is about building a house, or using a crane to raise and lower materials on the building site. There are dynamics at play, but the beginning is knowable and the end state is predictable. The interventions are dynamic, requiring small adjustments as you go, applied with expertise. Hire a crane operator if you want to avoid accidents.
Dynamical change comes from the world of physics, where small perturbations in a system result in massive changes and emergent outcomes. The beginning state is in motion and has a history that matters. The end state is also in motion and has a trajectory that matters. The intervention will alter the future state in unpredictable ways. This is what happens in most complex systems. Small changes make big and unpredictable differences. Extending our house building metaphor even further, this is what happens when you build a variety of structures in a neighbourhood and fill them with people. The neighbourhood changes, sometimes for the better, sometimes for the worse.
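The physics here is worth seeing concretely. A classic illustration (my own example, not one from HSD) is the logistic map, a one-line dynamical system where two starting points that differ by one part in a million soon follow completely different trajectories:

```python
def logistic_map(x0, r=4.0, steps=50):
    """Iterate x -> r * x * (1 - x) and return the full trajectory.

    At r = 4.0 the logistic map is chaotic: nearby starting
    points diverge exponentially fast.
    """
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_map(0.2)        # one starting condition
b = logistic_map(0.2000001)  # perturbed by one part in a million

# Early on the trajectories are indistinguishable; within a few
# dozen steps they bear no resemblance to each other.
for step in (0, 10, 25, 50):
    print(step, round(a[step], 4), round(b[step], 4))
```

No amount of extra precision in measuring the starting point buys you a proportional amount of extra foresight, which is exactly the predicament of intervening in a complex system.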
We can try to reduce the amount of unpredictability in our work, but there are limits to that. Externalizing the results of our decisions is not without peril, and in fact I would say that there is a moral imperative to take responsibility for the kinds of interventions that we make in a system. While we can’t know everything that is going to happen, we need to bear some responsibility for our actions. In highly ordered systems where causality is attributable, we can do this with solid accountability mechanisms. In highly unordered, complex and emergent systems, we can’t attribute causality and accountability, but we can take care to use the right tools and views. This sometimes paralyzes people into not acting – the well-known “analysis paralysis” situation. But not acting, or simply ignoring consequences, comes with moral peril of its own. The problem is that, whatever the nature of the problem, we still need to act.
I find in general that it helps to know that complexity is fundamentally unknowable in its totality. In this kind of system, no amount of data and research will give us definitive answers before making decisions about what to do. This is why adaptive action is so important: it shortens the feedback loop between planning, acting and evaluating, so that you can start small and begin to watch for the effects of your decisions right away. Of course, with large-scale system work, the process of understanding the system is important, but it’s a never-ending process. One studies it, but one shouldn’t treat a large complex system as if it is always subject to static change, moving neatly from one state to another. We need to learn to see and operate within a dynamic and changing environment, finding “just enough” information to initiate changes and then watching for what happens, adjusting as we go.