
Things change in different ways.
A couple of weeks ago, I took a deep dive into Glenda Eoyang’s Human Systems Dynamics, learning about her theory of complexity and getting my hands on the tools and methods that HSD uses to work in complex adaptive systems. (The tools are very good by the way, and highly recommended as ways to both get a good introductory grasp on complex problems, and work within those contexts to make decisions and lead).
One of the useful ways of looking at things concerns the kinds of change that happen, and if you’ve been reading my blog lately, you’ll know that accurately describing your theory of change is a key discipline for me.
In HSD we talk about three kinds of change: static, dynamic and dynamical. I’m not 100% sold on the terminology, but I invite you to think of these as ways of describing the start and end points of an intervention.
Static change begins and ends with a fairly stable system. An example is nailing drywall to a frame. You start with a frame, a sheet of drywall and some nails. The act of change is a predictable and controllable action that fastens the drywall to the framing and creates a wall. The system is stable to begin with and stable after the intervention.
Dynamic change is full of motion and movement, but that motion follows a predictable trajectory between a fairly stable beginning and end point. To extend our metaphor, this is about building a house, or using a crane to raise and lower materials on the building site. There are dynamics at play, but the beginning is knowable and the end state is predictable. The interventions are dynamic, requiring little adjustments as you go, applied with expertise. Hire a crane operator if you want to avoid accidents.
Dynamical change comes from the world of physics, where small perturbations in a system result in massive changes and emergent outcomes. The beginning state is in motion and has a history that matters. The end state is also in motion and has a trajectory that matters. The intervention will alter the future state in unpredictable ways. This is what happens in most complex systems. Small changes make big and unpredictable differences. Extending our house building metaphor even further, this is what happens when you build a variety of structures in a neighbourhood and fill them with people. The neighbourhood changes, sometimes for the better, sometimes for the worse.
We can try to reduce the amount of unpredictability in our work, but there are limits to that. Externalizing the results of our decisions is not without peril, and in fact I would say that there is a moral imperative to take responsibility for the kinds of interventions we make in a system. While we can’t know everything that is going to happen, we need to bear some responsibility for our actions. In highly ordered systems where causality is attributable, we can do this with solid accountability mechanisms. In highly unordered, complex and emergent systems, we can’t attribute causality and accountability, but we can take care to use the right tools and views. This sometimes paralyzes people into not acting – the well-known “analysis paralysis” situation. But not acting, or simply ignoring consequences, also comes with moral peril. The problem is that, despite the nature of the problem, we still need to act.
I find in general that it helps to know that complexity is fundamentally unknowable in its totality. In this kind of system, no amount of data and research will give us definitive answers before making decisions about what to do. This is why adaptive action is so important. It shortens the feedback loop between planning, acting and evaluating so that you can start small and begin to watch for the effects of your decisions right away. Of course with large-scale system work, the process of understanding the system is important, but it’s a never-ending process. One studies it, but one shouldn’t treat a large complex system as if it is always subject to static change: moving between one state and another. We need to learn to see and operate within a dynamic and changing environment, finding “just enough” information to initiate changes and then watching for what happens, adjusting as we go.
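The shortened feedback loop of adaptive action can be sketched in a few lines of Python. This is purely illustrative: the function names and the toy “accident count” below are my own inventions, not part of HSD; the point is only that each small action is followed immediately by a fresh observation.

```python
# An illustrative sketch of an adaptive action cycle:
# What? (observe) -> So what? (interpret) -> Now what? (act) -> repeat.
def adaptive_action(observe, interpret, act, cycles=5):
    for _ in range(cycles):
        state = observe()         # What? Describe the current state.
        sense = interpret(state)  # So what? Make sense of the patterns.
        act(sense)                # Now what? Try a small change, then re-observe.

# Toy example: nudging a hypothetical accident count downward,
# re-checking the system after every small intervention.
world = {"accidents": 10}

def observe():
    return world["accidents"]

def interpret(count):
    return "reduce" if count > 0 else "hold"

def act(decision):
    if decision == "reduce":
        world["accidents"] -= 2   # one small, monitored intervention

adaptive_action(observe, interpret, act, cycles=5)
```

Nothing here predicts the end state in advance; the loop simply keeps acting and re-observing, which is the whole discipline.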

In the world of non-profits, social change, and philanthropy it seems essential that change agents provide funders with a theory of change. This is nominally a way for funders to see how an organization intends to make change in their work. Often on application forms, funders provide guidance, asking that a grantee provide an articulation of their theory of change and a logic model to show how, step by step, their program will help transform something, address an issue or solve a problem.
In my experience, most of the time “theory of change” is really just another term for “strategic plan,” in which an end point is specified and steps are articulated backwards from that end point, with outcomes identified along the way. Here’s an example. While that is helpful in situations where you have a high degree of control and influence, and where the nature of the problem is well ordered and predictable, it is not useful with complex emergent problems. Most importantly, these are not theories of change but descriptions of activities.
For me a theory of change is critical. Looking at the problem you are facing, ask yourself how do these kinds of problems change? If, for example, we are trying to work on a specific change to an education policy, the theory of change needs to be based on the reality of how policy change actually happens. For example, to change policy you need to be influential enough with the government in power to be able to design and enact your desired changes with politicians and policy makers. How does policy change? Through lobbying, a groundswell of support, pressure during elections, participation in consultation processes and so on. From there you can design a campaign – a strategic plan – to see if you can get the policy changed.
Complex problems are a different beast altogether. They are non-linear, unpredictable and emergent. Traffic safety is an example. A theory of change for these kinds of problems looks much more like the dynamics of flocking behaviour. The problem changes through many, many small interactions and butterfly effects. A road safety program might work for a while until new factors come into play, such as distractions, raised speed limits or increased use of particular sections of road. Suddenly the problem changes in a complex and adaptive way. It is not logical or rational, and one certainly can’t predict the outcome of actions.
In my perfect world, it would be entirely acceptable for grantees to say “Our theory of change is complexity.” Complexity, to quote Michael Quinn Patton, IS a theory of change. Understanding that reality has radical implications for doing change work. This is why I am so passionate about teaching complexity to organizations and especially to funders. If funders believe that all problems can be solved with predictive planning and a logic model adhered to through accountability structures, then they will constrain grantees in ways that prevent them from actually addressing the nature of complex phenomena. Working with foundations to change their grant forms is hugely rewarding, but it needs to be supported with change theory literacy at the more powerful levels of the organization and with those who are making granting decisions.
So what does it look like?
I’m trying these days to be very practical in describing how to address complex problems in the world of social change. For me it comes down to these basic activities:
Describe the current state of the system. This is a process of describing what is happening. It can be done through a combination of looking at data, conducting narrative research and, indeed, sitting in groups full of diversity and different lived experiences and talking about what’s going on. If we are looking at road safety we could say “there have been 70 accidents here this year” or “I don’t feel safe crossing the road at this intersection.” Collecting data about the current state of things is essential, because no change initiative starts from scratch.
Ask what patterns are occurring in the system. Gathering scads of data will reveal patterns that are repeating and recurring in the system. Being able to name these patterns is essential. It often looks as simple as “hey, do you notice that there are way more accidents at night, concentrated on this stretch of road?” Pattern logic, a process used in the Human Systems Dynamics community, is one way that we make sense of what is happening. It is an essential step because in complexity we cannot simply solve problems; instead we seek to shift patterns.
Ask yourself what might be holding these patterns in place. Recently I have been doing this by asking groups to look at the patterns they have identified and answer this question: “If this pattern were the result of a set of principles and advice that we have been following, what would those principles be?” This helps you to see the structures that keep problems in place, and that is essential intelligence for strategic change work. It is an adaptation of part of the process called TRIZ, which seeks to uncover principles and patterns. So in our road safety example we might say that “make sure you drive too fast in the evening on this stretch of road” is a principle that, if followed, would increase danger at this intersection. Ask what principles would give you the behaviours that you are seeing. You are trying to find principles that are hypotheses, things you can test and learn more about. Those principles are what you are aiming to change, in order to shift behaviour. A key piece of complexity as a theory of change is that constraints influence behaviour. These are sometimes called “simple rules,” but I’m going to refer to them as principles, because that will later dovetail better with a particular evaluation method.
Determine a direction of travel towards “better.” As opposed to starting with an end point in sight, in complexity you get to determine which direction you want to head in, and you get to do it with others. “Better” is a set of choices you get to make, and those choices can be socially constructed and socially contested. “Better” is not inevitable and it cannot be predicted, but choosing an indicator like “fewer accidents everywhere and a feeling of safety amongst pedestrians” will help guide your decisions. In a road safety initiative this will direct you towards a monitoring strategy and towards context-specific actions for certain places that are more unsafe than others. Note that “eliminating accidents” isn’t possible, because the work you are trying to do is dynamic and adaptive, and changes over time. The only way to eliminate accidents is to ban cars. That may be one strategy, and in certain places that might be how you do it. It will of course generate other problems, and you have to be aware of and monitor for those as well. In this work we are looking for what is called an “adjacent possible” state for the system. What can we possibly change to take us towards a better state? What is the system inclined to do? Banning cars might not be that adjacent possible.
Choose principles that will help guide you away from the current state towards “better.” It’s a key piece of complexity as a theory of change that constraints in a system cause emergent actions. One of my favourite writers on constraints is Mark O’Sullivan, a soccer coach with AIK in Sweden. He pioneers and researches constraint-based learning for children at the AIK academy. Rather than teach children strategy, he creates the conditions so that they can discover it for themselves. He gives children simple rules to follow in constrained, game-simulated situations and lets them explore and experiment with solutions to problems in a dynamic context. In this presentation he shows a video of kids practicing simple rules like “move away from the ball” and “pass,” and watches as they discover ways to create and use space, which is an essential tactical skill for players but which cannot be taught abstractly and must be learned in application. Principles aimed at changing the constraints will help design interventions to shift patterns.
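This kind of emergence, pattern arising from simple local rules rather than from instruction, is easy to demonstrate in miniature. The following sketch is a hypothetical one-dimensional cousin of flocking simulations: every agent follows only two local rules, yet the group contracts into a loose cluster that no rule ever mentions. All the numbers are arbitrary.

```python
import random

def step(positions, cohesion=0.1, min_gap=1.0):
    """One synchronous update: each agent follows two simple rules."""
    new = []
    for i, p in enumerate(positions):
        others = positions[:i] + positions[i + 1:]
        centre = sum(others) / len(others)
        move = cohesion * (centre - p)       # rule 1: drift toward the group
        nearest = min(others, key=lambda q: abs(q - p))
        if abs(nearest - p) < min_gap:       # rule 2: don't crowd your neighbour
            move -= 0.5 * (nearest - p)
        new.append(p + move)
    return new

random.seed(1)
positions = [random.uniform(0, 100) for _ in range(10)]
spread_before = max(positions) - min(positions)
for _ in range(50):
    positions = step(positions)
spread_after = max(positions) - min(positions)
# The agents end up far closer together than they started,
# although no rule ever says "form a group".
```

No agent is told about the cluster; clustering is a property of the interactions, which is exactly why principles that shape constraints can shift patterns that instructions cannot.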
Design actions aimed at shifting constraints and monitor them closely. Using these simple rules (principles) and a direction of travel, you can begin to design and try actions that give you a sense of what works and what doesn’t. These are called safe-to-fail probes. In the road safety example, probes might include placing temporary speed bumps on the road, installing reflective tape or silhouettes on posts at pedestrian crossings, or placing a large object on the road to narrow the driving lanes and cause drivers to slow down. All of these probes will give you information about how to shift the patterns in the system, and some might produce results that will inspire you to make them more permanent. But in addition to monitoring for success, you also have to monitor for emergent side effects. Slowing traffic down might increase delays for drivers, meaning that they drive with more frustration, meaning more fender-benders elsewhere in the system. Complex adaptive systems produce emergent outcomes. You have to watch for them.
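The probe-and-monitor logic can be sketched as a toy loop: run several small interventions against a baseline measure, keep the ones that move it towards “better,” and dampen the ones that don’t. The probe names and the additive toy model below are invented for illustration; real probes are run in the world, not in code.

```python
def run_probes(probes, baseline, trial):
    """Try each probe, record its observed effect, and keep the helpful ones.

    trial(effect, baseline) -> the measure observed after running the probe.
    """
    results = {name: trial(effect, baseline) for name, effect in probes.items()}
    keep = [name for name, value in results.items() if value < baseline]
    return results, keep

# Hypothetical probes, modelled as simple shifts to a weekly accident count.
probes = {"speed bumps": -3, "reflective tape": -1, "new signage": +1}

def trial(effect, baseline):
    return baseline + effect   # toy model: the probe shifts the measure

results, keep = run_probes(probes, baseline=10, trial=trial)
# keep -> ["speed bumps", "reflective tape"]: the probes worth amplifying.
# "new signage" made things worse; in a safe-to-fail design it is dampened,
# and its side effects are themselves worth examining.
```

The important design choice is that every probe is small enough to dampen when it fails, and that even the “successful” ones stay under observation for emergent side effects.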
Evaluate the effectiveness of your principles in changing the constraints in the system. Evaluation in complex systems is about monitoring and watching what develops as you work. It is not about measuring the results of your work, doing a gap analysis and making recommendations. There are many, many approaches to evaluation, and you have to be smart in using the methods that work for the nature of the problem you are facing. In my opinion we all need to become much more literate in evaluation theory, because done poorly, evaluation can have the effect of constraining change work to a few easily observed outcomes. One form of evaluation that is getting my attention is principles-based evaluation, which helps you to look at the effectiveness of the principles you are using to guide action. This is why using principles as a framework helps you to plan, act and evaluate.
Monitor and repeat. Working on complex problems has no end. A traffic safety initiative will change over time due to factors well outside the organization’s control. And so there never can be an end point to the work. Strategies will have an effect, and then you need to look at the current state again and repeat the process. Embedding this cycle in daily practice is actually good capacity building, and teams and organizations that can do this become more responsive and strategic over time.
Complexity IS indeed a theory of change. I feel like I’m on a mission to help organizations, social change workers and funders get a sense of how and why adapting to that reality is beneficial all round.
How are you working with complexity as a theory of change?

When working in complexity, and when trying to create new approaches to things, it’s important to pay attention to ideas that lie outside of the known ways of doing things. These are sometimes called “weak signals” and by their very nature they are hard to hear and see.
At the Participatory Narrative Inquiry Institute, they have been thinking about this stuff. On May 31, Cynthia Kurtz posted a useful blog post on how we choose what to pay attention to:
If you think of all the famous detectives you know of, fictional or real, they are always distinguished by their ability to hone in on signals — that is, to choose signals to pay attention to — based on their deep understanding of what they are listening for and why. That’s also why we use the symbol of a magnifying glass for a detective: it draws our gaze to some things by excluding other things. Knowing where to point the glass, and where not to point it, is the mark of a good detective.
In other words, a signal does not arise out of noise because it is louder than the noise. A signal arises out of noise because it matters. And we can only decide what matters if we understand our purpose.
That is helpful. In complexity, purpose and a sense of direction help us to choose courses of action, from making sense of the data we are seeing to acting on it.
By necessity that creates a narrowing of focus, and so paying attention to how weak signals work is also important. Yesterday the PNI Institute discussed this on a call, which resulted in a nice set of observations about the people seeking weak signals and the nature of the signals themselves:
We thought of five ways that have to do with the observer of the signal:
- Ignorance – We don’t know what to look for. (Example: the detective knows more about wear patterns on boots than anyone else.)
- Blindness – We don’t look past what we assume to be true. (No example needed!)
- Disinterest – We don’t care enough about what we’re seeing to look further. (Example: parents understand their toddlers, nobody else does.)
- Habituation – We stopped looking a long time ago because nothing ever seems to change. (Example: A sign changes on a road, nobody notices it for weeks.)
- Unwillingness – It’s too much effort to look, so we don’t. (Example: The “looking for your keys under the street light” story is one of these.)
And we listed five ways a signal can be weak that have to do with the system in which the observer is embedded:
- Rare – It just doesn’t happen often.
- Novel – It’s so new that nobody has noticed it yet.
- Overshadowed – It does happen, but something else happens so much more that we notice that instead.
- Taboo – Nobody talks about it.
- Powerless – Sometimes a signal is literally weak, as in, those who are trying to transmit it have no power.
You can see that this has important implications for building equity and diversity into sense-making processes. People with different lived experiences, ways of knowing and ways of seeing will pay attention to signals differently. If you are trying to build a group with an increased capacity to scan and make sense of a complex problem, having cognitive and experiential diversity will help you to find many new ideas that are useful in addressing complex problems. Furthermore, you need to pay attention to people whose voices are traditionally quieted in a group, so as to amplify their perspectives on powerless signals.

Sonja Blignaut has been blogging some terrific stuff on Paul Cilliers’ work on complexity. Specifically, she has been riffing on Cilliers’ seven characteristics of complex systems and the implications of complexity for organizations.
Yesterday I was teaching an Art of Hosting here in Calgary, where we were looking at Cynefin and then followed with a discussion about how the nature of complex systems compels us to make important design choices when we are facilitating participatory processes to do work in organizations.
This is a cursory list, but I thought it would be helpful to share it here. Cilliers’ text is in bold.
Complex systems consist of a large number of elements that in themselves can be simple.
If you are planning participatory processes, don’t focus on working on the simple problems that are the elements in complexity. Instead, gather information about those many simple elements and use facilitation methods to look for patterns together. We talk about describing the system before interpreting it. Getting a sense of the bits and pieces ensures that you don’t begin strategic process work with only high-level aspirations.
The elements interact dynamically by exchanging energy or information. These interactions are rich. Even if specific elements only interact with a few others, the effects of these interactions are propagated throughout the system. The interactions are nonlinear.
Non-linearity is truly one of those things that traditional planning processes fail to understand. We always want to be heading towards a goal, despite the fact that in complex systems such controlled progress is impossible. What we need to be doing is choosing a direction to move in and making decisions and choices that are coherent with that direction, all the while keeping a careful watch on what is happening and what effect our decisions have. Participatory processes help us to make sense of what we are seeing, and convening regular meetings of people to look through data and see what is happening is essential, especially if we are making decisions on innovative approaches. Avoid creating processes that assume causality going forward; don’t make plans that are based on linear chains of events that take us from A to B. Traditional vision, mission, goals and objectives planning has little usefulness in a complex system. Instead, focus on the direction you want to move in and a set of principles or values that help you make decisions in that direction.
There are many direct and indirect feedback loops.
The interactions between the parts of a system happen in a myriad of ways. To keep your strategy adapting, you need to build in feedback loops that work at a variety of time scales. Daily journalling, weekly sense-making and project cycle reporting can all be useful. Set up simple and easily observable monitoring criteria that help you to watch what you are doing and decide how to adjust when those criteria are triggered. Build in individual and collective ways to harvest and make sense of what you are seeing.
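A “simple and easily observable monitoring criterion” can be as plain as a threshold on a regularly collected measure. Here is a hedged sketch in which the measure, the cadence and the threshold are all hypothetical:

```python
def weeks_needing_review(weekly_counts, threshold):
    """Return the week numbers at which the monitoring criterion triggers."""
    return [week for week, count in enumerate(weekly_counts, start=1)
            if count > threshold]

# e.g. near-misses reported at an intersection, collected weekly
near_misses = [2, 3, 1, 7, 2, 9]
review_weeks = weeks_needing_review(near_misses, threshold=5)
# review_weeks -> [4, 6]: weeks 4 and 6 cross the criterion and
# trigger a sense-making session rather than an automatic fix.
```

The trigger doesn’t decide what to do; it decides when the humans convene, which is what keeps the feedback loop short without automating the judgment away.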
Complex systems are open systems—they exchange energy or information with their environment—and operate at conditions far from equilibrium.
You need to understand that there are factors outside your control that are affecting the success or failure of your strategy. You and your people are constantly interacting with the outside world. Understand these patterns, as they can often be more important than your strategy. In participatory process and strategy building, I love it when we bring in naive experts to contribute ideas from outside our usual thinking. In natural systems, evolution and change are powered by what happens at the edges and boundaries, where a forest interacts with a meadow, or a sea with a shoreline. These ecotones are the places of greatest life, variety and influence in a system. Build participatory processes that bring in ideas from the edge.
Complex systems have memory, not located at a specific place, but distributed throughout the system. Any complex system thus has a history, and the history is of cardinal importance to the behavior of the system.
Complex systems are organized into patterns, and those patterns are the results of many, many decisions and actions over time. Decisions and actions often converge around attractors and boundaries in a system, and so understanding these “deep yes’s and deep no’s,” as I call them, is essential to working in complexity. You are never starting from a blank slate, so begin by engaging people in understanding the system, look for the patterns that enable and the patterns that keep us stuck, and plan accordingly.
The behavior of the system is determined by the nature of the interactions, not by what is contained within the components. Since the interactions are rich, dynamic, fed back, and, above all, nonlinear, the behavior of the system as a whole cannot be predicted from an inspection of its components. The notion of “emergence” is used to describe this aspect. The presence of emergent properties does not provide an argument against causality, only against deterministic forms of prediction.
So again, work with patterns of behaviour, not individual parts. And of course, as Dave Snowden is fond of saying, to shift patterns, shift the way the actors interact. Don’t try to change the actors. Once, when working on the issue of addictions stigma in health care, the health authority tried running a project to address stigmatizing behaviours with awareness workshops. The problem was, they couldn’t find anyone who admitted to stigmatizing behaviours. Instead, we ran a series of experiments to change the way people work together around addictions and people with addictions (including providing recognition and help for health care workers who themselves suffered from addictions). That is the way to address an emergent phenomenon.
Complex systems are adaptive. They can (re)organize their internal structure without the intervention of an external agent.
And so your strategy must also be adaptive. I’m learning a lot about Principles-Based Evaluation these days, which is a useful way to craft strategy in complex domains. Using principles allows people to make decisions consistent and coherent with the preferred direction of travel the strategy is taking us in. When the strategy needs to adapt because conditions have changed, managers can rely on principles to structure new responses to changing conditions. Participatory processes become essential in interpreting principles for current conditions.
This is a bit of a brain dump, and as usual it makes more sense to me than perhaps it does to everyone else. But I’d be very interested in your reflections on what you are hearing here, especially as it relates to how we craft, design and deliver participatory processes in the service of strategy, planning and implementation.

Spending a couple of weeks in Europe, and being here in Glasgow this past week, has heightened my sensitivity to how democracy, devoid of deliberation and focused only on numeric results, has been hijacked and rendered ineffective for making complex decisions related to the governance of complex issues. The UK is currently paying the price for a ridiculous decision made in June 2016 to leave the European Union. Whatever you think of the merits of Brexit, there can be no denying that the method for doing so has been deeply flawed, both in its democratic implementation and in the subsequent negotiation. Britain is currently mired in a political, constitutional and economic mess of its own making.
So how do we make better decisions together? This video has some very interesting hypotheses that combine complexity science with deliberation practice. It’s worth reflecting on.