A couple of good blog posts in my feed this morning that provoked some thinking. These quotes reminded me how much evaluation and planning is directed towards goals, targets and patterns that cause us to look for data that supports what we want to see rather than learning what the data is telling us about what’s really going on. These helped me to reflect on a conversation I had with a client yesterday, where we designed a process for dealing with this.
First is this one, on one good tip to help your research:
And we need all the tricks we can muster because our psyche is playing them on us all the time: making us think things are solely our decision, our idea or our considered opinion, constantly nudging us not to think for ourselves. Perhaps this is the basis of the easy tendency to lean so heavily on research but that’s not the point. The point is that by looking to it for answers we do little more than search for reassurance that we were right all along. If nature abhors a vacuum, your psyche abhors cognitive dissonance and just because you’re aware of the notion of confirmation bias doesn’t mean you’re protected from it. As 20th century philosopher Bertrand Russell puts it:
“If a man is offered a fact which goes against his instincts, he will scrutinise it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence.”
And so to the tip.
Rather late in the day I’m discovering the joys of a bit of Scandinavian hard-boiled in the person of Jo Nesbo’s detective, Harry Hole. In the novel Nemesis, Harry recalls his Police College lecturer’s advice on house searches:
“Don’t think about what you are searching for. Think about what you find. Why is it there? Should it be there? What does it mean? It’s like reading – if you think about an “I” while looking at a “k”, you won’t see the words.”
The lecturer’s advice is spot on. Think about what you find. Don’t try to mash things into pre-determined categories. In the land of Cognitive Edge this is recast as “data precedes the framework.” When we start with a scheme for categorizing our work, we are tempted to do two things, which are fatal strategic mistakes. First, we are tempted to fit every outlier into a category, which renders difference and novelty impotent. A gradient of difference, or a range of diversity in a set of data, helps us to discover innovative practices and sustain creative tensions. Second, we are tempted to erase differences by working at a scale of granularity that is too broad to be useful. And why would we do that?
Often, consultation and engagement is not undertaken with an innovative mindset. Rather we are seeking to justify the planning targets that we have set. We are looking for justification rather than innovation. And here is why:
A disconnect exists between corporate strategic plans, which typically define the company’s targets for growth, and their day-to-day execution activities….
…There is the fashionable argument that you abandon annual plans and you seek fresh planning ways to create an organization ready to react to critical changes in the marketplace. That is great for the little fella, all nimble and lean but for the larger complex organization they struggle. Each of the different organizational parts have different reaction and response times, are governed in totally different ways. Plans are essential as the instrument of strategic design but how can these become more reflective and responsive, this is a real thorny problem to crack.
A recent report by Strategy& suggests being more agile requires a clear focus on two attributes of ‘strategic responsiveness’ and ‘organizational flexibility’ being built into the design of the larger organizations so they can move far quicker as conditions change. A combination of ‘sensing new risk and opportunities’ to craft quick responses and also being able to “shift execution rapidly”, applying fast retooling and rework, applying this progressively over weeks and months.
In other words, organizations are trying to build a culture of innovation and adaptability, but they are constrained by management processes that manage to the targets in the plan. What gets sacrificed is the innovation, because no one ever lost their job for hitting the target.
A story to illustrate: yesterday I was coaching a client who was designing a meeting to bring together stakeholders to discuss the results of a massive engagement process and to start thinking about new ways to address their mandate. They have over 17,000 pieces of data from the engagement process, and a consultant rolled these up into a number of themes. The plan was to address the themes, see if they made sense and then come up with new ideas to work within them.
There are several problems with this approach:
- It assumes that the consultant got the scheme right.
- If the consultant didn’t get the scheme right, there is no way of checking it.
- In order to get 17,000 pieces of data down to a dozen themes you have to do a substantial amount of reduction and “chunking up” resulting in a loss of granularity and diversity.
- The themes are boilerplate, motherhood statements that anyone could have come up with with 30 minutes of thought.
- There is nothing inspiring to work with.
- The stakeholders have no context to be able to evaluate the scheme or discuss new ideas.
So we took a complexity approach (distributed cognition, disintermediated sense-making, fine levels of granularity, data precedes the framework) to working with the data and designed this process:
- Print out a random selection of 500 pieces of data.
- At tables of 8 people, hand out 80 pieces of data and give the group 15 minutes to cluster and theme.
- Have people switch tables and work on other groups’ schemes (to avoid premature convergence).
- Compare the themes that are emerging across the tables. Adopt all of the schemes.
- Then go into a process of brainstorming experiments and ideas to address these themes, inspired by the data and the new schemes.
- Staff in the organization then can go away and design some questions and plans for addressing the discrepancy between the stakeholder group and the consultant’s report.
- As an option, the organization can repeat this exercise in local communities across its region, working with data selected from that locale. That way the strategy would be tailored to each location.
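For anyone wanting to prepare the handouts, the sampling-and-dealing step is trivial to script. Here is a minimal sketch, assuming the engagement data is available as a flat list of short text fragments; the function name, sample size of 500, and 80-per-table stack size follow the process above, while everything else is illustrative:

```python
import random

def deal_data(items, sample_size=500, per_table=80, seed=None):
    """Randomly sample pieces of data and deal them into per-table stacks.

    `items` is the full pool (e.g. 17,000 text fragments from the
    engagement process). Returns a list of stacks, one per table.
    Any remainder that doesn't fill a full stack is set aside.
    """
    rng = random.Random(seed)
    sample = rng.sample(items, min(sample_size, len(items)))
    # Cut the shuffled sample into consecutive stacks of `per_table` pieces.
    return [
        sample[i : i + per_table]
        for i in range(0, len(sample) - per_table + 1, per_table)
    ]

# Example: 17,000 synthetic fragments stand in for the real data.
pool = [f"fragment-{n}" for n in range(17000)]
stacks = deal_data(pool, seed=42)
print(len(stacks), len(stacks[0]))  # 6 tables, 80 pieces each
```

With 500 sampled pieces and 80 per table you get six full stacks (20 pieces left over), which matches a room of six tables of eight people. Random sampling matters here: it is what keeps the exercise anchored in the raw data rather than in anyone’s pre-selected highlights.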
The fears and objections to this new design were predictable, and easy to address:
- What if we get the scheme wrong? In a complex system, there is no way to know if you got it right. You are trying to make sense of the system so you can try new things, not get the truth about the system. No amount of research will give you certainty, but strategic sense-making will help you make decisions.
- What if people are overwhelmed by the amount of data? This is a feature, not a bug. Being overwhelmed (cognitive overload) helps you to break patterns. The scheme that you come to rely on will fall apart and outliers will become more visible.
- What if we can’t agree on a set of themes? That is also a feature; seeing different things creates a diverse set of perspectives, which is exactly what you need to engage an organization in innovation.
- What if people try to game the system? In fact it is easier to game the system when you are working with 12 high-level boilerplate themes. It is much more difficult to game the system when the schemes you are developing are emergent and based on actual pieces of data.
I’ll let you know how that goes.