Evaluation is a powerful constraint in organizational and community life. When resources and attention are tied to evaluation results, a kind of circularity gets set up: one begins managing projects towards the evaluation outcomes, in order to give an initiative the best chance of surviving and continuing to attract resources. One of the things I appreciate about developmental evaluation is its deliberate engagement with emergence. Making sense of emergence, however, can be a time-consuming affair, and so I’m thinking about how we can make good use of time, using dialogue and collective meaning-making to help make sense of data and direction.
Developmental evaluation is for the complex domain. That means we are not evaluating actions against desired end states; instead, we are noticing and paying attention to vectors and directions – intentions and hypotheses that help shape emerging strategy. Developmental evaluation is the process of gathering information about our work to give us intelligence about what we are doing.
Think of the information needs of two different kinds of athletes. A golfer relies on solid objective data (how many yards to the hole, where the wind is coming from, the nature of the lie of the ball and so on) and interprets that data through her own self-knowledge (I hit a five iron 160 yards; adjusting for wind, lie and the target topography, I should hit a four iron with backspin). Of course, the better a golfer is, the easier it is to execute a plan and understand exactly where one succeeded or failed.
By contrast soccer players work in a dynamic environment. The information available to them only becomes apparent as they begin to play the match. They may know something about the other team, but they learn rapidly in the first ten minutes or so how the game is going to go. A team will discover where the opposition’s weakness is, or what its attacking strategy is, or where the open spots are on the pitch. Making good use of this information requires excellent communication in real time to share what is being learned. It requires players to play with potentials and patterns rather than certainties. Every move provides yet more information. The better a team works together, the faster they can adjust their strategy to take advantage of potentials.
When we evaluate work, a mix of these two approaches is at play. Summative evaluation looks at the gap between expected outcomes and what actually happened and suggests how to adjust for next time. Budget planning and auditing are a good example of this technical kind of results-based evaluation: count the money, compare against projections and look for causes. Some of these causes will be technical and some will come down to culture.
Developmental evaluation requires a different strategic approach. Simply put, it comes down to four things (I’m aiming for simplicity here, so this is easy to describe):
- Data points that give us the ability to capture information about the current state of an evolving system. These can render a series of pictures that allow us to see patterns and trends. You need multiple snapshots over time to make sense of what is happening; one photo of a soccer game in progress tells you nothing. You need to monitor indicators, not manage end points. Soccer is much more than putting the ball in the net, even though that is the desired end result.
- Feedback loops from data to human sensemaking, so that data can be used in real time to develop strategy and adjust the direction of the work.
- A facilitated sensemaking process to bring together multiple perspectives to interpret what is happening. In a complex system the data won’t give you answers. It will provide information to form hypotheses about the patterns that are emerging, and that information can give you guidance for action.
- A way of acting that doesn’t over-commit resources to emerging potential strategies, but gives enough momentum to see if we can shift things in a desired way. Snowden calls this “safe-to-fail.” It is tricky and calls for good context-dependent leadership, but it is the essence of good decision making.
There are all kinds of ways of implementing these strategies. You can use surveys to discover what people are accessing on your website, and interviews or sensemaking tools to find out HOW they are using that information. You can use a strategic group to interpret these results and see whether they are coherent with your intentions or at odds with them. You can then create new initiatives that support what is emerging, or figure out ways to abandon what is not working. There are thousands of dialogue methods and processes for asking questions about, and developing action around, the data that is emerging.
Importantly, developmental evaluation needs to be a part of the way you work strategically. It needs a rhythm and a cadence to it, so that you know you are coming back on a regular basis to the emerging picture of what is happening. You need outsiders occasionally to come in and disrupt your point of view and offer alternative views of the patterns, and you need to choose a longer rhythm to continue to develop and refine your evaluation strategy as a whole.
I want this to be a simple process to use. Strategy without information is just a wild guess. But if we tie our decisions too closely to the data emerging from dynamic systems, we can get equally stuck, making decisions that try to game the system towards desired results – sometimes with disastrous consequences for clients, customers and, ultimately, organizational integrity. It’s a balance and a practice. How can we make this easy?