Chris Corrigan
  • Blog
  • Chaordic design
  • Resources for Facilitators
    • Facilitation Resources
    • Books, Papers, Interviews, and Videos
    • Books in my library
    • Open Space Resources
      • Planning an Open Space Technology Meeting
  • Courses
  • About Me
    • Services
      • What I do
      • How I work with you
    • CV and Client list
    • Music
    • Who I am
  • Contact me

Category Archives: "Evaluation"

Intervening in a complex system: 5 Ps

February 8, 2016 By Chris Corrigan Art of Harvesting, Art of Hosting, Complexity, Conversation, Design, Emergence, Evaluation, Featured, Leadership, Stories

When I was up in Whitehorse last week I got to spend time with folks from the Public Service Commission discussing a project that would see us looking at discrimination in the workplace from a complexity angle.  Using Cynefin and SenseMaker(tm), we hope to understand the ways in which the landscape of discrimination shifts and changes over time so that the PSC can make wiser decisions about the kinds of initiatives it sculpts.  One of the problems with diversity initiatives in the public service (in any large public organization, really) is the feeling that they need to be broad-based and rolled out to everyone.  This usually results in a single initiative that spreads across the whole organization but, except for a little awareness raising, does little to address specific instances of discrimination.  Everything from awareness-raising "cultural competency training" to zero-tolerance accountability measures has limited effect because a) discriminatory behaviour is highly context- and situation-dependent, and b) the public service has a permeable boundary with the outside world, meaning ideas, behaviours and people move between the two contexts all the time.  The larger your organization, the more like the real world you have to be.

At any rate, I took a bit of time to do a mini-Cynefin teaching to explain how strategy works in the complex domain, and my friend Pawa Haiyupis and I added two Ps to my concentric circles of intervention in a complex system.  So to review:

  • Patterns: Study the patterns in a complex setting using narrative capture and sense-making.  This can be done with the SenseMaker(tm) software, and it can also be done with dialogic interventions.  The key thing is to let the people themselves tag their stories, or at the very least have a group of people reviewing data and finding patterns together.  For example, you might notice a correlation between stressful times in an organization and an increase in feelings of discriminatory behaviour.
  • Probe: Once you have identified some patterns, you can make some hypotheses about what might work and it’s time to develop some safe to fail probes.  These aren’t meant to be successful: they are meant to tell you whether or not the patterns you are sensing have developmental potential.  Failure is entirely welcome. What if we offered stress reduction activities during high stress times to help release pent up feelings? We want to be okay with the possibility that that might not work.
  • Prototype: If a probe shows some promise, you might build a prototype to develop the concept further. Prototypes are designed to have tolerance for failure, in that failure helps you to iterate and improve the concept.  The goal is to develop something that is working.
  • Pilot: A pilot project is usually a limited time proof of concept.  Roll it out over a year and see what you learn.  In Pilot projects you can begin to use some summative evaluation methods to see what has changed over time.  Because of their intensive resource commitment, pilot projects are hardly ever allowed to fail, making them very poor ways of learning and innovating, but very good ways to see how stable we need to make an approach.
  • Project/Program/Policy: Whatever the highest level and most stable form of an initiative is, you will get there if your pilot shows promise and the results are clear. Work at this level will last over time, but needs regular monitoring so that an organization knows when it’s time to tinker and when it’s time to change it.
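As a purely illustrative aside, the five stages above can be sketched as a small ordered structure. This is my own toy shorthand, not part of Cynefin or SenseMaker(tm); the stage names and "failure tolerance" notes simply restate the list above:

```python
# Illustrative sketch of the 5 Ps as nested stages of intervention.
# The tolerance notes are shorthand for the idea that failure becomes
# less and less acceptable as an initiative stabilizes.

STAGES = [
    # (stage, purpose, tolerance for failure)
    ("Patterns",  "sense-making on captured stories",              "nothing deployed yet"),
    ("Probe",     "test whether a pattern has potential",          "failure entirely welcome"),
    ("Prototype", "iterate a concept toward something that works", "failure drives iteration"),
    ("Pilot",     "time-limited proof of concept",                 "failure rarely allowed"),
    ("Program",   "stable initiative, regularly monitored",        "failure signals redesign"),
]

def next_stage(current: str) -> str:
    """Return the stage an initiative moves to if the current one shows promise."""
    names = [name for name, _, _ in STAGES]
    i = names.index(current)
    # A stable program doesn't advance further; it gets monitored and tinkered with.
    return names[i + 1] if i + 1 < len(names) else current

print(next_stage("Probe"))  # a promising probe becomes a prototype
```

The point of the sketch is only the ordering and the shifting tolerance for failure; in a real organization all five levels run concurrently, as noted below.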

Cynefin practitioners will recognize that what I’m writing about here is the flow between the complicated and the complex domains (captured by Dave Snowden’s blue dynamic in this post).  My intention is to give this some language and context in service organizations, where design thinking has replaced the (in some ways more useful) intuitive planning and innovation used in non-profits and the public service.

Since October, when I first started sketching out these ideas, I’ve learned a few things which might be helpful as you move through these circles.

  1. Dialogue is helpful at every scale.  When you are working in a complex system, dialogue ensures that you are getting dissent, contrary views and outlying ideas into the process.  Complex problems cannot be addressed well with a top-down roll out of a change initiative or highly controlled implementations of a single person’s brilliant idea.  If at any point people are working on any stage of this alone, you are in danger territory and you need another pair of eyes on it at the very least.
  2. Evaluation is your friend and your enemy. At every stage you need to be making meaning and evaluating what is going on, but it is critically important to use the right evaluation tools.  Developmental evaluation tools – with their emphasis on collective sense making, rapid feedback loops and visible organizational and personal learning – are critical in any complexity project, and they are essential in the first three stages of this process.  As you move to more and more stable projects, you can use more traditional summative evaluation methods, but you must always be careful not to manage towards targets.  Such an error results in data like “We had a 62% participation rate in our diversity training,” which tells you nothing about how you changed things, but can shift the project focus to trying to achieve a 75% participation rate next cycle.  This is an especially pervasive metric in engagement processes. And so you must…
  3. Monitor, monitor, monitor. Intervening in a complex system always means acting without the certainty that what you are doing is helpful.  You need data and you need it on a short term and regular basis.  This can be accomplished by formal and informal ongoing conversations and story captures about what is happening in the system (are we hearing more stories like the ones we want?) or through a SenseMaker(tm) monitoring project that allows employees to end their day with a little data capture.
  4. These practices are nested, not linear. It’s always important to remember that this is not a five-step process for intervening in a complex system.  In a large organization, you can expect all of these things to be going on all the time.  Building the capacity for that is a kind of holy grail, and would constitute a 21st-century version of the Learning Organization in my books.
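The monitoring question above — "are we hearing more stories like the ones we want?" — is about direction rather than targets, and can be illustrated with a toy tally. The `wanted`/`unwanted` tags and the period numbering are my own invented stand-ins for whatever signification scheme a real project would use:

```python
# Toy monitoring sketch: instead of managing toward a target
# ("62% participation"), track the *direction* of the story landscape:
# is the share of wanted stories rising or falling, period over period?

from collections import Counter

def story_vector(stories):
    """stories: list of (period, tag) pairs, where tag is 'wanted' or 'unwanted'.
    Returns the share of wanted stories per period, in period order."""
    wanted = Counter()
    total = Counter()
    for period, tag in stories:
        total[period] += 1
        if tag == "wanted":
            wanted[period] += 1
    return [round(wanted[p] / total[p], 2) for p in sorted(total)]

captures = [
    (1, "unwanted"), (1, "unwanted"), (1, "wanted"),
    (2, "unwanted"), (2, "wanted"), (2, "wanted"),
]
print(story_vector(captures))  # rising share of wanted stories: [0.33, 0.67]
```

A rising series suggests the probes are nudging the landscape the right way; a falling one is a signal to intervene differently, not evidence that a target was missed.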


Probes, Prototypes and Pilot projects

October 5, 2015 By Chris Corrigan Complexity, Evaluation, Featured One Comment

I’ve been working in the world of program development with a lot of complexity and innovation and co-creation lately and have seen these three terms used sometimes interchangeably to describe a strategic move. As a result, I’ve been adopting a more disciplined approach to these three kinds of activities.

First some definitions.

Taken explicitly from Cynefin, a probe is an activity that teaches you about the context that you are working with. The actual outcome of the probe doesn’t matter much because the point is to create an intervention of some kind and see how your context responds. You learn about the context and that helps you make better bets as you move forward – “more stories like this, less stories like this” to quote Dave Snowden. Probes are small, safe to fail and easily observed. They help to test different and conflicting hypotheses about the context. If 8 out of 10 of your probes are not failing, you aren’t learning much about the limits of your context. Probes are actually methods of developmental evaluation.

A prototype is an activity that is designed to give you an idea of how a concept might work in reality. Prototypes are designs that are implemented for a short time, adjusted through a few iterations and improved upon. The purpose of a prototype is to put something into play and look at its performance. You need to have some success with a prototype in order to know what parts of it are worth building upon. Prototypes straddle the worlds of “safe to fail” and “fail safe.” They are both developmental evaluation tools and they also require some level of summative evaluation in order to be fully understood. Prototypes are also probes, and you can learn a lot about the system from how they work.

A pilot is a project designed to prove the worthiness of an approach or a solution. You need it to have an actual positive effect in its outcomes, and it’s less safe to fail. Pilots are often designed to achieve success, which is a good approach if you have studied the context with a set of probes and maybe prototyped an approach or two. Without good intelligence about the context you are working with, pilots are often shown to work by manipulating the results. A pilot project will run for a discrete amount of time and will then be summatively evaluated in order to determine its efficacy. If it shows promise, it may be repeated, although there is always a danger of creating a “best practice” that does not translate across different contexts. If a pilot project is done well and works, it should be integrated with the basic operating procedure of an organization, and tinkered with over time, until it starts showing signs of weakened effectiveness. From then on, it can become a program. And pilots are also probes, and as you work with them they too will tell you a lot about what is possible in the system.

The distinctions between these three things are quite important. Often change is championed in the non-profit world with the funding of pilot projects, the design of which is based on hunches and guesses about what works, or worse, a set of social science research data that is merely one of many possible hypotheses, privileged only by the intensity of effort that went into the study. We see this all the time with needs assessments, gap analyses and SWOT-type environmental scans.

Rather than thinking of these as gradients on a line though, I have been thinking of them as a nested set of circles:

Each one contains elements of the one within it. Developing one will be better if you have based your development on the levels below it. When you are confronted with complexity and several different ideas of how to move forward, run a set of probes to explore those ideas. When you have an informed hunch, start prototyping to see what you can learn about interventions. What you learn from those can be put to use as pilots that eventually become standard programs.

By far, the most important mindshift in this whole area is adopting the right thinking about probes. Because pilot projects and even prototyping are common in the social development world, we tend to rely on these methods as ways of innovating. And we tend to design them from an outcomes basis, looking to game the results towards positive outcomes. I have seen very few pilot projects “fail” even if they have not been renewed or funded. Working with probes turns this approach inside out. We seek to explore failure so we can learn about the tolerances and the landscape of the system we are working in. We “probe” around these fail points to see what we can learn about the context of our work. When we learn something positive we design things to take advantage of this moment. We deliberately do things to test hypotheses and, if you’re really good and you are in a safe-to-fail position, you can even try to create failures to see how they work. That way you can identify weak signals of failure and notice them when you see them so that when you come to design prototypes and pilots, you “know when to hold ‘em and know when to fold ‘em.”
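The "8 out of 10 probes not failing" heuristic mentioned earlier can be made concrete with a small sketch. The 50% threshold below is an arbitrary illustration of my own, not a figure from Cynefin practice; the point is only that a portfolio of safe-to-fail probes with a *low* failure rate was probably too timid to find the system's limits:

```python
# Hedged sketch of the "most probes should fail" heuristic: review a
# portfolio of safe-to-fail probes and flag a low failure rate as a sign
# the probes were too safe to map the tolerances of the system.

def portfolio_review(outcomes, min_failure_rate=0.5):
    """outcomes: list of booleans, True meaning the probe failed.
    Returns a short verdict on whether the portfolio probed hard enough."""
    rate = sum(outcomes) / len(outcomes)
    if rate < min_failure_rate:
        return f"{rate:.0%} failed: probes too safe, push closer to the system's limits"
    return f"{rate:.0%} failed: learning about tolerances as intended"

print(portfolio_review([True, False, True, True]))   # 75% failed: learning as intended
print(portfolio_review([False, False, False, True])) # 25% failed: too safe
```

Note how this inverts the pilot-project logic: here a high failure count is the desired signal, because each fail point marks a boundary of the context being explored.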


Dave Snowden’s reflections on a theory of change

August 21, 2015 By Chris Corrigan Complexity, Culture, Design, Evaluation, Leadership 4 Comments

Dave is working on a theory of change, which I think is a good thing. In this latest post he has a nice summation of the way to move to action in complex situations (like cultures):

So where we are looking at culture change (to take an example), we first map the narrative landscape to see what the current dispositional state is. That allows us to look at where we have the potential to change, and where change would be near impossible to achieve. In those problematic cases we look more to stimulating alternative attractors rather than attempting to deal with the problem directly. Our method is to look at the narrative landscape and then ask the question: What can I (we) do tomorrow to create more stories like these and fewer like those? The question engages people in action without analysis and it allows us to take an approach that measures vectors (speed and direction) rather than outcome. The question also allows widespread engagement in small actions in the present, which reduces the unexpected (and potentially negative) consequences of large scale interventions.

In sum, complexity work is about understanding the context to understand where the potential for evolution might lie.  From there you try experiments to see what you can learn, and support what works while removing support for what doesn’t.

It’s an old saw, but it’s actually a simple thing.  And I keep writing about it because it seems TOO simple for most folks.  Shouldn’t strategy be more ordered, laid out and thought through than this?

As always the answer depends, but with complex situations the answer is no.  Save your discipline and rigour for understanding things as they evolve rather than trying to get it all right from the start.


via Change through small actions in the present – Cognitive Edge.


A couple of great days in Montreal

May 26, 2015 By Chris Corrigan Art of Harvesting, Art of Hosting, Conversation, Evaluation, Facilitation, Learning, Travel, World Cafe 3 Comments

Just about to leave Montreal this morning for Toronto and north to Thornbury, Ontario to visit family.  I was here for the conference of the Canadian Evaluation Society, where I participated on a panel on innovative dialogue methods (and yes, I noted the irony in my remarks) and later led a World Cafe where I presented some of the sense-making processes I’ve been working on.  I was here on the recommendation of Juanita Brown, who has been in some good conversations with evaluators around the use of the World Cafe for evaluation purposes.  Originally Amy Lenzo and I were scheduled to host a cafe here that was much more ambitious: a plenary cafe with the participants to explore the learning field of the conference.  Through various machinations that was cut back to a panel presentation and a very small world cafe at the end of the day with 16 people. The conference was one of those highly scripted and tightly controlled affairs that I hardly ever go to.

The session before us was a case competition where student teams were responding to a mock RFP from Canada World Youth to evaluate an Aboriginal youth leadership program.  Not a single team had an Aboriginal person on it, and every single presentation was basically the same: full of fundamental flaws about what constitutes success (“Did the youth return to their communities?”) or what constitutes a cultural lens (“We are using a medicine wheel to understand various parts of the program”).  One group of fresh-faced non-Aboriginal students even had the temerity to suggest that they were applying a decolonizing strategy.  Their major exposure to indigenous communities was through a single book on decolonizing methodology and some internet searches about medicine wheels.  It was shocking actually, because these were the students that made the finals of this competition.  They looked like fresh versions of the kinds of evaluation firms that show up in First Nations certain they know what’s going on.

To make matters worse, the case competition organizer had a time mix-up with the conference planner, meaning that our panel started 30 minutes late, which gave me very little time to present.  As I was doing a cafe directly afterwards, I ceded most of my time to my panel colleagues Christine Loignon and Karoline Truchon, who did a very interesting presentation on their use of PhotoVoice.  It was clear to me at the conference that the practitioners among us had a better grasp of complexity theory, power and non-linear sense-making than any of the professional evaluators I met.

I presented most of the work that I have been documenting here over the last few months, and later led a small group through a cafe where we engaged in the creation of a sensemaking framework and used a pen and paper signification framework.

By far the better experience for me was hanging out with friends and colleagues.  On the first night I arrived I had dinner and drinks with my friends from Percolab: Paul Messer, Samantha Slade and Elizabeth Hunt.  We ate fish and chips, drank beer and whisky and caught up.  On Sunday I met Jon Husband for lunch on the grass at McGill with his delightful godson and then joined the Percolab folks for a visit to the new co-operative ECTO co-working space on Mount Royal in the Plateau, followed by a barbeque with family and friends.

And last night, after my presentations, I had a great evening with Juan Carlos Londono and Lisa Gravel. We had dinner at Lola Rosa and spent hours going over the new French translation of the GroupWorks Pattern Language Deck.  This was a brilliant time.  I learned a bunch of new French words and, most fun of all, we discussed deeper etymology, nuance and the limitations and benefits of our respective languages in trying to convey some of the more esoteric practices of hosting groups.  The new deck has some beautiful reframing and some names for patterns that need some work.  But it’s exciting to see this translation and I always love diving into the language.

I really do like Montreal a lot, and in the past number of years I have come to love it more as I have lost my inhibition about speaking French.  The more French I speak, the more French I learn, and the more the heart of the city opens up.  Many English Canadians have the idea that Montreal is a cold-hearted city to English speakers, but I find that isn’t true at all.  Just offer what you can in French and people open up.  And if you’re lucky enough to sit down with lovers of words like the friends I have, your learning explodes.

Off for a couple of days to visit family and then home to Bowen Island for a series of small local facilitation gigs, all of which will tell me something deeper about my home place.


You can’t fix this. So please stop trying. Start thinking differently.

May 1, 2015 By Chris Corrigan Community, Complexity, Evaluation, Leadership

I want to invite you to bite down hard and read this article by Rich Lowry, the editor of the National Review: Baltimore, a Great Society Failure:

Read More



© 2015 Chris Corrigan. All rights reserved. | Site by Square Wave Studio
