So we’ve talked about the clustering of obvious/average ideas in ideation and we’ve talked about how our perspectives shape what’s obvious (and what’s unthinkable) to us. What can we do about it?
In 1578, after eight months of imprisonment in a tiny cell by his fellow Catholic monks, not-yet-Saint John of the Cross wrote the poem The Dark Night of the Soul. The subject matter was the period of spiritual deprivation immediately before the culmination of the spiritual journey of the contemplative monk.
Having first weaned oneself off attachment to worldly pleasures, the aspiring mystic enjoys delightful spiritual pleasures which encourage him or her on the path. But attachment to these, too, has to be cut off. There follows a time of total desolation and a sense of utter fruitlessness.
John of the Cross advised that crossing this abyss requires faith that, having left behind all of the reassuring worldly, emotional and intellectual pleasures, there’s something more out there.
(The term “dark night of the soul” has since been coopted to mean any old period of bleakness in life after stubbing one’s toe or whatever. You have my permission to look down on anyone who uses it in this way.)
Our brains have evolved to rely on heuristics to navigate life. Heuristics are basically those unspoken rules of thumb, knee-jerk tactics for making decisions quickly. As anyone who has read anything about behavioural psychology has been told over and over, these heuristics are extremely useful and we basically couldn’t function without them.
For example, when you’re crossing the street and you hear a loud horn honking from outside your field of vision, it’s not a great survival strategy to get creative with possible explanations – e.g., that it’s a harmless child playing with an airhorn. By the time you’ve decided that it’s probably a car, you’ve run out of time to get out of the way.
In addition to these life-or-death reflexes, we build up other assumptions about the world that seem to work well 95% of the time. Sometimes they’re learned through repeated experience, like a mechanic encountering the same problems with the same underlying cause over and over. Sometimes they’re imbued through social conditioning and pop culture, like baseless racial and gender prejudices.
Whenever we’re trying to come up with ideas, whether they’re creative ideas for marketing campaigns or novel solutions to problems or opportunities, our brains immediately latch on to the most easily available concepts they can find. And heuristics are, by definition, the most easily available.
These heuristics take two forms. One is simple beliefs, unquestioned assumptions about what is true in the world. If we took a moment to think critically, we would admit that we’re not 100% certain of those facts, but unless we consciously consider that, we operate as if they’re definitely true.
The other kind of heuristic is subtler and relates to our previous discussion about horizons of disclosure. These are our mental models, the conceptual frameworks within which we understand a challenge or situation. As with horizons of disclosure, mental models are both necessary for coming up with ideas or solutions and also make some ideas or solutions unthinkable.
I’ll come back to this issue over and over in the next few posts.
We’re naturally lazy. It’s an evolutionarily advantageous trait to get away with spending the least energy possible – and thinking takes energy. For the purposes of surviving long enough to raise children who live long enough to do the same, creativity and critical thinking are mostly useless. Evolution has been described as “survival of the fittest”, but really it’s “survival of the good enough”. And for enough of the time, in enough of the situations, lazily relying on heuristics is good enough.
So when we start having ideas, coming up with possible solutions, the first ones that come to mind will be based in our most familiar mental models. And, where relevant, they will take our unquestioned assumptions as true. While some future posts will suggest ways to avoid this in the first place, one solid piece of advice from the experience of generations of creative people is this:
Just have the obvious ideas first and get them out of the way.
Write them down as if they’re genuine ideas you’re considering. If you think of them and try to ignore them, your brain is going to say, “Why are you still trying to solve this? It’s solved, buddy. Here, I’ll tell you again: here’s the really obvious solution. Hey. Hey. Listen to me. Here’s the idea. It’s so obvious.” So treat them as if they’re serious ideas, your brain will be satisfied that you’re taking its obviousness seriously, and then keep going.
Get all of the obvious ideas out of the way and then keep going. You will slow down, because those first obvious things are, by definition, the easiest things to think of.
This is one of several ways I oversimplified things when I described the distribution of the quality of ideas as a bell curve and talked about any given idea having a certain probability of being good enough. It is true that if we ran a workshop off a single straightforward ideation prompt, came up with 100 ideas and collated them all at the end, the distribution of their quality would be bell-curvey. That is, most ideas would be average, some mediocre or good, and a few terrible or great.[1]
However, technically, the probabilities change between the start and the end of ideation. Yes, any idea is more likely to be average than it is to be an outlier, but earlier ideas are even more likely to be average. And so, later ideas are more likely than earlier ideas to be great (or terrible).
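If you like to see this sort of claim in code, here's a purely illustrative toy simulation – not from the post, and with made-up numbers – in which idea quality is drawn from a normal distribution whose spread widens as ideation progresses. Early ideas hug the average; later ideas reach further out towards great (and terrible):

```python
import random
import statistics

random.seed(42)

def idea_quality(progress):
    # progress runs from 0.0 (first idea) to 1.0 (last idea).
    # Quality is centred on the average (0); the spread widens as the
    # obvious ideas get used up. The 0.5 and 1.5 are arbitrary.
    sigma = 0.5 + 1.5 * progress
    return random.gauss(0, sigma)

# Simulate many early-session and late-session ideas.
early = [idea_quality(0.1) for _ in range(5000)]
late = [idea_quality(0.9) for _ in range(5000)]

print(statistics.pstdev(early))  # ~0.65: early ideas cluster near average
print(statistics.pstdev(late))   # ~1.85: late ideas spread out to the edges
```

Both groups are centred on the same average; what changes is how far ideas stray from it, which is exactly the "more likely to be great (or terrible)" effect.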
Chapter 7 of Faris Yakob’s book Paid Attention makes a similar point with a different visualisation for thinking about it, as ever-expanding territory away from the nearby and familiar. (You can find Faris and Rosie on Substack at Strands of Genius.)
It should now be fairly clear why all of this happens.
Earlier ideas reflect our close-at-hand mental models and assumptions – they are obvious.
We have lots of obvious ideas, so they tend to set the standard for averageness – both within a single ideation session and compared with ideas that people in the world have in general (which also tend to be obvious, using similar shared mental models and assumptions).
Once we run out of obvious ideas, the only ideas left to have are less obvious.
These less obvious ideas have more potential to be non-average in their outcomes.
(There are plenty of non-obvious ideas with average outcomes. But for all intents and purposes, there are zero obvious ideas with great outcomes.)
So when we think about trying to get to those good-enough ideas to the right of the red line, the ideas we have start in the middle and work their way out to the edges (both good and bad). The faster you get those obvious ideas out of the way, the sooner you get to the possibility of good and great.
That’s all very well in theory, but how does it actually feel when coming up with ideas? It feels discouraging. Whether individually or as a workshop room, you start out with this high rate of ideation and suddenly things slow down and it feels like there are no ideas left to have.
Much like with St John of the Cross and his dark night of the soul, it’s only once the reassuring and familiar and seemingly fruitful territories of everyday ideas are left behind that we can think in different ways and have an altogether different sort of idea. If you give up on ideation once you hit that wall, you will forever limit yourself to the obvious, the ordinary, and therefore the average in terms of outcomes.[2]
In individual ideation, this requires some faith. Experienced creatives are familiar with this process and know that persistence pays off, however creatively barren things feel. And in group ideation, it’s the role of activity planning and prompts to push through the collective slowdown as we move beyond the obvious.
Getting the obvious out of the way quickly is always good advice for any ideation. Next we’re going to talk about how to consciously shift the mental models that make some ideas more obvious than others.
[1] Craig Page of Weirdworks raised a good point: in his experience, the curve is unlikely to be exactly symmetrical – there are more great ideas than equally terrible ideas – and this technically pushes up the mean quality, if not the median. This is true for two main reasons: firstly, we tend to self-censor and not table truly terrible ideas; and secondly, most situations have a lower limit on how bad an outcome can be, but no upper limit. (For example, a promotional idea generally can’t do worse than zero sales.) I’ll cover different distribution shapes soon.
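That floor-but-no-ceiling asymmetry is easy to demonstrate with another toy simulation (illustrative only; the numbers are invented): clip a symmetric distribution of outcomes at zero and the mean drifts above the median, skewing the curve to the right:

```python
import random
import statistics

random.seed(7)

# Outcomes have a hard floor (you can't sell fewer than zero units)
# but no ceiling. Clipping a symmetric bell curve at zero piles up
# the bad tail at the floor while the good tail runs free.
outcomes = [max(0.0, random.gauss(100, 80)) for _ in range(10000)]

mean = statistics.mean(outcomes)
median = statistics.median(outcomes)
print(mean)    # roughly 104: dragged up by the unbounded right tail
print(median)  # roughly 100: unmoved, since the clipping is below it
```

The median stays where the symmetric curve put it, but the floor pushes the mean upward – the "pushes up the mean quality if not the median" effect in miniature.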
[2] For a less fanciful analogy, consider endurance exercise. You start off working with readily available blood sugar. When that runs out, your body switches to converting glycogen into more blood sugar. And when that runs out, the body switches to converting fatty acids into energy. The point is that the body only switches to new sources of energy once the more readily available sources are exhausted. (And it feels rough.) (So I hear, anyway. I play computer games.)
Thanks for all the hat tips!