
Reflecting these days on some two day courses I have coming up, including one on complexity and social change, one on invitation practice and one on Open Space.
Each of these courses is a workshop introducing people to a practice or a set of practices, as opposed to techniques and skills. In each of these workshops, people will come away with an ability to enter into the practice, literally as artists. These are not technical trainings designed to download procedures and methods. They are courses that will leave you ready to practice, ready to make mistakes and learn as you go, and ready to improve.
It’s always hard to explain to people when they come on these courses that they will not leave as competent practitioners of the stuff they are learning. All artists make mistakes when they are first using a tool. What’s most important is that you have a way of developing your mastery with a tool, which is to say that you have a framework that helps you understand what you are doing and how well you are doing it. In traditional settings, mentorship is an important piece of this, helping you develop mastery from every attempt as you learn. The point is that these kinds of tools are useful in complexity, meaning that they are context and practitioner dependent. How you use these tools and where you use them matters.
Teaching, therefore, requires a disruption to the pedagogy of filling another person’s brain and body with competence. In my courses, my favourite answer to questions about application is “it depends.” But what doesn’t change over time is the body of theory that needs to inform one’s practice.
Theory is the constant, and therefore heuristics (a basic set of measurable principles) are the way to develop practice that is appropriate in context. By theory I mean a serious understanding, drawn from the natural sciences, of the ways systems work.
Courses that are pure theory are generally not helpful without grounding in practice, and courses that are just collections of tools and practices are somewhat useful but can lead practitioners astray if they don’t understand why things work (or aren’t able to see why things aren’t likely to work). So my basic approach to teaching these kinds of things is to use the following heuristic:
- Theory
- Framework
- Practice examples
- Application
Teaching theory – in my case usually complexity theory – is critical for setting the groundwork for the practices that follow. If you don’t understand the nature of the context you are working in, you are likely to make serious errors in applying practices: linear problem solving doesn’t work in non-linear settings. That seems intuitive, but you need to know why and be able to explain it.
Frameworks are helpful because they provide touchstones to connect theory to practice. When we were teaching the harvesting course last year, we came up with the mnemonic PLUME to describe five heuristics that help practitioners design methods that are coherent with good theory. (We have a new one for the invitation course by the way: VALUE. You can learn more about it on the course or in the blog posts that come as a result of the teaching). Sometimes that framework is Cynefin, sometimes it’s the chaordic path.
The important thing about a framework is that it helps you to create something and then it can fall away and what you have created can stand on its own. If your practice relies on maintaining the integrity of the framework then your framework isn’t effective. This is an issue I see sometimes with things like sociocracy where in poor application it’s important that people retain accountability to the framework (but not even necessarily the theory). Frameworks should be important enough containers to inspire grounded and coherent action, but not so critical that the action depends on the framework.
Dave Snowden uses the metaphor of the scaffold, which is useful. Build a scaffold to build your house. But if the scaffolding is a part of your house and your house depends on the scaffolding for its structural integrity, you haven’t succeeded.
Once we are grounded in theory and have a way of carrying it with us, we can share practices that help practitioners ground this in real life. I always combine this with an opportunity to apply the learning on real projects. This gives people an opportunity to work together to make sense of what they are learning. It means that folks working on projects get a variety of perspectives from people who have just learned something, including naive and oblique perspectives, which is good when you are trying to do new things. For those offering their help with projects, they learn a lot by stepping into the coach or critic role, as they are forced to think about what they have been learning in an application context.
So that’s my basic pedagogy these days. I’ve been on a few facilitation workshops over the years and been shocked at two things: the lack of theory (so how do I know why your methods work?) and the over-reliance on tips and tricks, which is basically a kind of addictive mechanism for people learning facilitation. Many people are super-interested in adding a few things to their tool box, and while I love helping people add tools, I would never give an apprentice carver a knife without helping them understand why it works and what happens if you use it incorrectly. And I would never say “here’s a knife, now go make your masterpiece.” Their first effort is going to be terrible, and that’s what practice is. We need more folks teaching the art of facilitation as artists teaching artists, and less shady selling of recipes and tools for guaranteed success.
Share:

My friends over at the Social Labs Revolution website have been fielding questions about the prototyping phase of labwork and today published a nice compilation of prototyping resources. It’s worth a visit. It got me thinking this morning about some of the tools I use for planning these days.
Share:

My friend Avril Orloff shared this beautiful quote on her Facebook page.
“Nobody tells this to people who are beginners. I wish someone told me. All of us who do creative work, we get into it because we have good taste. But there is this gap. For the first couple of years you make stuff, it’s just not that good. It’s trying to be good, it has potential, but it’s not. But your taste, the thing that got you into the game, is still killer. And your taste is why your work disappoints you.
A lot of people never get past this phase – they quit. Most people I know who do interesting, creative work went through years of this. We know our work doesn’t have this special thing that we want it to have. We all go through this. And if you are just starting out or you are still in this phase, you gotta know it’s normal and the most important thing you can do is do a lot of work…
It is only by going through a volume of work that you will close that gap, and your work will be as good as your ambitions. And I took longer to figure out how to do this than anyone I’ve ever met. It’s gonna take awhile. It’s normal to take awhile. You’ve just gotta fight your way through.”
That is from Ira Glass.
Share:

Evaluation is such an influential constraint in organizational and community life. When resources and attention are tied to evaluation results, a kind of tautology gets set up. One begins managing projects towards the evaluation outcomes, in order to give an initiative the best chance of surviving and continuing to attract resources. One of the things I appreciate about developmental evaluation is its deliberate engagement with emergence. Making sense of emergence, however, can be a really time consuming affair, and so I’m thinking about how we can make good use of time, using dialogue and collective meaning making to help make sense of data and direction.
Developmental evaluation is for the complex domain. That means that we are not working with evaluating actions against desired end states, but instead noticing and paying attention to vectors and directions – intentions and hypotheses that help shape emerging strategy. Developmental evaluation is the process of gathering information about our work to give us some intelligence about what we are doing.
Think of the information needs of two different kinds of athletes. A golfer relies on solid objective data (how many yards to the hole, where the wind is coming from, the nature of the lie of the ball and so on) and interprets that data through her own self-knowledge (I hit a five iron 160 yards. Adjusting for wind and lie and the target topography, I should hit a four iron with backspin…). Of course, the better a golfer one is, the easier it is to execute a plan and understand exactly where one succeeded or failed.
By contrast soccer players work in a dynamic environment. The information available to them only becomes apparent as they begin to play the match. They may know something about the other team, but they learn rapidly in the first ten minutes or so how the game is going to go. A team will discover where the opposition’s weakness is, or what its attacking strategy is, or where the open spots are on the pitch. Making good use of this information requires excellent communication in real time to share what is being learned. It requires players to play with potentials and patterns rather than certainties. Every move provides yet more information. The better a team works together, the faster they can adjust their strategy to take advantage of potentials.
When we are evaluating work there is a mix of these two types of approaches at play. Summative evaluation will look at the gap between expected outcomes and what actually happened and suggest how to adjust for next time. Budget planning and auditing is a good example of this technical kind of results based evaluation. Count the money and compare against projections. Look for causes. Some of these causes will be technical and some will be down to culture.
Developmental evaluation requires a different strategic approach. Simply put (I’m aiming for simplicity here, so this can be described in an easy way), it might fall into these four things:
- Data points that give us the ability to capture information about the current state of an evolving system. This can render a series of pictures that allow us to see patterns and trends. You need multiple snapshots over time to make sense of what is happening. One photo of a soccer game in progress tells you nothing. You need to monitor indicators, not manage end points. Soccer is much more than just putting the ball in the net, even though that is the desired end result.
- Feedback loops from data to human sensemaking so that data can be used in real time to develop strategy and adjustments to the directionality of work.
- A facilitated sensemaking process to bring together multiple perspectives to interpret what is happening. In a complex system the data won’t give you answers. It will provide information to form hypotheses about the patterns that are emerging, and that information can give you guidance for action.
- A way of acting that doesn’t over-commit resources to emerging potential strategies, but which gives enough momentum to see if we can shift things in a desired way. Snowden calls this “safe-to-fail.” This is tricky and calls for good context-dependent leadership, but it is the essence of good decision making.
There are all kinds of ways of implementing these strategies. You can use surveys to discover what people are accessing on your website, and you can use interviews or sensemaking tools to find out HOW they are using that information. You can use a strategic group to interpret these results and see how they are either coherent with your intentions, or at odds with them. You can then create new initiatives that support what is emerging, or figure out ways to abandon what is not working. There are thousands of dialogue methods and processes to use to ask questions about and develop action around the data that is emerging.
Importantly, developmental evaluation needs to be a part of the way you work strategically. It needs a rhythm and a cadence to it, so that you know you are coming back on a regular basis to the emerging picture of what is happening. You need outsiders occasionally to come in and disrupt your point of view and offer alternative views of the patterns, and you need to choose a longer rhythm to continue to develop and refine your evaluation strategy as a whole.
I want this to be simple as a process to use. Strategy without information is just a wild guess. But if we tie our decisions too closely to the data emerging from dynamic systems we can get equally stuck making decisions that try to game the system towards desired results, with sometimes disastrous results for clients, customers and ultimately, organizational integrity. It’s a balance and a practice. How can we make this easy?
Share:

A few years ago, Juanita Brown shared a very powerful image with me. She talked about how those of us that practice dialogue and facilitation in a deep way have access to various gateways that take us into a “central garden.” All of our pathways invite us into this garden where we come to discover and realize something about the role of dialogue, meaning making and collaboration. It is a set of realizations that lies beneath the practice of methods.
On a call today with my friend Mark McKergow, we were discussing this image. There are a bunch of us – although not a large bunch – from different practitioner communities who are always interested in transcending our methods and entering into this conversation. Alongside Juanita, Mark has also been wondering, “where is everybody else, and how come we’re not connecting?”
Today we were discussing the failure of dialogue to have enough presence to provide workable and practical alternatives to everything from public policy decisions (such as the EU referendum in Britain, or the polarization of US society) to the everyday challenges of managing and running large organizations, evaluating, strategizing and controlling outcomes, people and money.
We know that our field of dialogic practice is massive, well researched and well documented. We know that the leadership literature is filled with the importance of relational and sensemaking work. We know that mid-career professionals come to our various workshops to take on skills and ideas that are fundamentally transformative to their work and lives, and that they go back to places where “it’s difficult to implement,” because other mid-career professionals are wedded to globalized management practices that are good enough for what they are trying to do, within the highly constrained performance frameworks within which they are forced to operate. We even know (thanks to people like Jon Husband) that global organizations like Hay Associates have spent the better part of a century ensuring that these management science constraints are widely deployed and understood. They frame everything, not without utility, but to the exclusion of almost every other way of organizing and being together in human endeavour.
So what is the problem? Are we just lousy storytellers? Are we being deliberately marginalized? Is there something fundamentally flawed about the ability of dialogic practice to actually be of value? And how do we disrupt the standard set of management tools and the narcissism of our own communities of practice in a way that creates some serious openings for change?
What do you think?