Experts, savants and complexity.

It has been my experience and observation – not once, but a few times – that when someone I know to have some level of ability in a domain that goes beyond straightforward expertise is asked honest, interested questions about how some outcome of theirs was achieved, they become unhelpful. Sometimes tetchy, sometimes incommunicative, sometimes vague.

And this can be surprising. Somehow, we can have an assumption that anyone who can produce extraordinary results in their domain has such a lucid command of that domain that they should be able to tell us just how and why they do it – and further, that they ought to be happy to explain. After all, we are in awe of them, fascinated by their capacity to produce such work – surely they’ll be happy to be listened to.

But no, not always. Quite often not, in my experience.

In the age of science, the cultural story is that power, control, mastery must come from more-or-less complete knowledge, from total command of the tools, the analytics, the process. That we have moved beyond the occult, the mystery of past ages. That any unwillingness to explain or to give detail is evidence of either charlatanry or ‘knowledge as power’ hoarding.

And there are certainly people of whom this is true. But again, my own experience – direct and indirect – is that this awkwardness around explication can come from the most straightforward and open people.

This is my take on what’s going on.

There’s being an expert, and there’s what used to be called ‘mastery’ – and for which a new term is obviously urgently needed. For the rest of this post, I’m going to use the word ‘savant’ – not that it really works either – but it saves confusion, and saves me from using a loaded term like ‘master’*.

There is a great difference between the two, and the road to that difference helps to explain this perceived arrogance of otherwise approachable and helpful domain ‘savants’ in some conditions.

To become an expert you need, of course, to engage deeply with your domain; to study, to practise, to analyse, to develop methods, to gain experience; learn to bring all these together to confidently, appropriately and reliably produce outcomes that meet high standards within that domain.

But there is only so far that being an expert can take you. If we accept – stipulate, actually – that all worthwhile human endeavour has some element of creativity about it, some dimensions in which even sophisticated application of knowledge within finite frameworks doesn’t get you beyond a certain point, then there is a level beyond expertise. We sometimes hear such ability recognised as ‘artistry’, or someone described as ‘making an art-form’ of their domain.

To become an expert takes aptitude, of course – but also dedication, commitment, hard work, practice … all those things. To sustain these requires that some positive feedback loop exists – some experienced personal satisfaction at your developing expertise that keeps you at it when the going is tough – even if that comes only from within yourself, even if it is only felt as a need that is being satisfied. And often this is a form of pride – justified pride, but pride nevertheless. There is a feeding of ego involved. This is not a bad thing – it supports confidence, empowerment, capacity, appreciation of self.

And within the sphere of the expert, such egotism is relatively safe, even valuable – constrained as it is by the need to demonstrate that expertise, and to justify it in terms of analysis, good practice, methods, standards.

But to go beyond expertise, to become a savant, to move beyond ‘good practice’ into ‘emergent practice’ – to use these terms with the connotations ascribed to them in the Cynefin Framework – the practitioner has to do something different – and something which seems like the opposite of what they have done to arrive at their expertise.

In learning the ins and outs of their domain, as a neophyte, a practitioner, a would-be expert, they have been seeking and developing control over/within that domain: rich and detailed knowledge, assurance, understanding, predictability around outcome quality.

They have been helped by teachers who have communicated the instruments of control: analysis, technique, frameworks, best practice, selection and use of tools.

But now, to go beyond that, they need to relinquish control, and submit. To set the ego aside. To accept, at a deep level, that the dimensions of that domain which require/invite creativity have their own shape, their own landscape, their own reality – that, in the end, complete maps of these dimensions are out of reach**.

And, to be worth it, this submission, this egolessness, needs to be deep. An acceptance that the realities of the domain have to be the final arbiters of quality. Not what will be acceptable, not what is cool, not what is clever, not what is amazing, not what makes money, looks professional, not what is ‘up-to-date’, works neatly with something else, makes the client happy – not what feeds self-satisfaction. Although the work may well do all of these things, in the end, they are all nice-to-haves, not fundamentals. Further, consideration of any of these as the outcome develops is strongly likely to detract from final quality.

In accepting this, an ‘expert’ often has trouble – real trouble. It is hard – really hard – to accept that, in the end, all that your hard-won, long-practised expertise is useful for is to get you to the place where you know that you don’t have all the answers. And what’s worse, accepting this is no guarantee that you will gain in ability from your submission. The risk is huge.

One of the all-time-great jazz saxophonists, Sonny Rollins, achieved real success in the first ten years of his career: by the age of 29 he had wide recognition and was playing with the best – with Miles Davis, Thelonious Monk, John Coltrane – and had written tunes which were becoming jazz standards. He was an expert; there is no doubt about it. At this point, though, in the summer of 1959, he simply withdrew; disappeared from ‘the scene’ and for around two years spent time almost every day practising, alone, on a pedestrian walkway of the Williamsburg Bridge in New York, next to the subway track.

He had become frustrated with what he considered to be his musical limitations.

He had no ideas beyond the practice: “I would be up there 15 or 16 hours at a time spring, summer, fall and winter”; “I could have probably spent the rest of my life just going up on the bridge”. His career stalled. He didn’t play again in public until the end of 1961, and didn’t record again until the year after.

The reception of his new recordings was mixed; critics expected that something radical would have resulted from this strange withdrawal – but it hadn’t. Not in terms of his sound (he retained the full, assured, powerful tone and commitment to melody), not in terms of his domain or repertoire. Over time, though, his reputation has grown and grown; he is now considered one of the most brilliant and flexible of jazz saxophonists; ‘the greatest living improviser’, the ‘saxophone colossus’.

I don’t know, of course, what happened during those hundreds of hours of solo practice alongside the noise of the tracks, but there was certainly no audience, no critic, no external reward, no teacher, no feedback but his own self; nothing but Rollins and what he could make with his instrument. Nothing but that, for 15 hours at a time, day after day.

As an ‘expert’, having worked to acquire the skills, the knowledge, and the recognition by others of this status, this practice of submission can seem perverse, bewildering, frightening. It requires a willingness to accept that all one’s skill and learning might be getting in the way of an optimal result.

The ‘answers’ one gets from working in this way are sometimes hard to justify or explain in the way that experts typically communicate – with facts and figures, analytics, accordance with good practice guidelines and the like. Instead, decisions may arise from a strong feeling of rightness, or betterness.

Further, the results may not always look like the product of expertise. When submission to the realities of the domain is your final arbiter, it sometimes happens that what one produces lacks some accepted characteristics – perhaps in detail, perhaps in underlying structure, sometimes unpolished, sometimes awkward, even odd. This can generate kickback and questioning; ‘If you’re so good, why have you done this?’.

And here comes my take on the communications problem. While the quality of the work should be evident, its whys and wherefores may be less so. Its character is likely to be strange in some way.

To non-experts, this may or may not be noticeable. Their reaction will be simple – they will like it, or not; they will trust the savant or they won’t. Communications in this context are easy – the savant ‘knows’ that they have offered the best response they could deliver in the context. Justifications can be in broad terms. If the non-expert’s response is material, then the context will have changed, and the response can be reassessed. If not, not.

The difficulty comes with experts in the same domain – the strangeness is irritating – experienced sometimes as a direct challenge, more often as a subtle unease. The questions asked may be couched in expert terms, but the real, underlying questions are almost emotional in tone; ‘Why would you do it like that?’, ‘This looks weird/old fashioned/clunky/unfinished’, ‘How can you call this professional?’.

Another version of this question – often heard – hides behind “I hate jazz”.

And so the communication problem. If the savant answers in ‘expert’ terms, then at some point they are likely to find they cannot justify some decision, while if the true answer – ‘It was the best I could do in the circumstances’, or ‘It felt right’ – is given, it comes across as arrogant or egotistical. This problem can be even worse when the questioner is half-expert or an interested amateur.

And this experienced arrogance, this egotism, can feel rather surprising, given our assumptions about the way that mastery gives confidence, and thus calm.


This topic may seem a little inconsequential. It may also seem to be the sort of nonsense that leads to mystical bullshit being set up as having equivalence to hard-won knowledge.

I don’t think it is the former, and I certainly do not intend the latter. But to deal with these acknowledged points, more is needed.

For me the topic is consequential because I think it illuminates an enormous current problem – the absolute requirement for us to become competent at dealing with complexity wisely. The requirement to accept the unavoidable characteristics of complex systems is where the appearance of mysticism creeps in.

Until quite recently the scientific mode, as it has developed over the last few centuries, has set its face resolutely against complexity – for some superficially sound, and certainly pragmatic, reasons.

First, the domain of complexity was experienced as hard to distinguish from past mysticism – the dragon that science set out to slay. This, of course, is superficial reasoning (ignorance perhaps – but science is the enemy of ignorance, and so cannot make this excuse), and while the history of complexity science – the reactions and insights of scientists as they came up against complex aspects of their subjects – is yet to be written (as far as I know), it certainly will include many clear-eyed appreciations of the implications and significance of complexity. Not all scientists dismissed the ineluctably complex as evidence of as-yet unexplored simplicity or mystical nonsense.

However, they did typically put aside their insights for the simple, pragmatic reason that the computational and analytical tools needed to make progress were not just unavailable, but nigh-on unimaginable, until the advent of fast computing devices and dynamic displays.

By that time, though, reductivism had ceased to be a novel tool, applied against all ‘common sense’ to the evident boiling complexity of interrelations that is the world everyone experiences on a daily basis.

Instead – within the world of science, at least – it had become the ‘obvious’ and ‘correct’ mode with which to approach reality, if one wished to arrive at ‘empirical’ knowledge. And this knowledge, as the appliance of science transformed human culture, became widely conflated with ‘facts’.

The underlying nature of those fast computing devices – binary logic – coupled with their own intensely transformative impact further cemented the idea that knowledge was either binary/yet-to-be-known or woo-woo shit. Not that everyone thinks this, of course not – but it became the unspoken, unacknowledged currency of rational thought. As knowledge became elided with ‘facts’, so facts began to be elided with the notion of eternal, binary, immutable truths.

In such a context even complexity researchers tend to cast their own work in the mould of reductive science. One such I spoke to said, rather ruefully, something along these lines: ‘the problem with our work is that we aren’t producing strong predictions’.

This is a serious problem. Because the current problems that beset us are all wildly complex – and interact in additionally complex ways (climate change is connected to soil depletion, is connected to the fresh water crisis, is connected to biodiversity collapse, is connected to economic instability, is connected to the demographic time-bomb…), and trying to dress complexity science results up in the clothes of empiricist reductionism destroys almost all of the valuable knowledge obtained.

In case it’s not obvious by now, what I’m driving at is that these domains of expertise in which the most valuable results are achieved by savants are ones which are inherently complex – which cannot be fully captured by ‘good practice’. Where what is needed is ’emergent practice’. Where ‘being expert’ is sub-optimal.

And sub-optimal outcomes from our decisions and actions around these urgent civilisational challenges are, well, simply not good enough.

Complexity science should not produce ‘strong predictions’ – it should produce ‘strong descriptions’. What complexity science is good for is describing the conditions within which some particular pattern of outcomes is more likely to happen – or conversely, the conditions outside which it is unlikely to happen. Guarantees are not a given. Sensitivity to starting conditions (commonly known as the ‘Butterfly effect’) typically is. The only iron law of complex systems is the Law of Unintended Consequences.
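To make that contrast concrete, here is a minimal sketch – my own illustration, not part of the original argument – using the logistic map, a standard toy model of chaotic dynamics (the function name, the parameter r = 3.9 and the step counts are my choices). Two starting points that differ by one part in a billion soon diverge completely, so long-range point prediction fails; what survives is a description of the conditions under which particular behaviours appear.

```python
# Illustrative sketch only: the logistic map shows why sensitivity to starting
# conditions rules out long-range point predictions while still permitting
# 'strong descriptions' of the conditions in which behaviours occur.

def logistic_map(x0, r=3.9, steps=50):
    """Iterate x -> r * x * (1 - x) from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion...
a = logistic_map(0.200000000)
b = logistic_map(0.200000001)

# ...diverge completely within a few dozen steps.
for step in (10, 30, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")

# What remains reliable is descriptive: for r between roughly 3.57 and 4 the
# map behaves chaotically; below roughly 3 it settles to a fixed point.
# Conditions, not point forecasts.
```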

This brings us harshly up against an extraordinary problem. Science itself is – despite its best intentions – resisting the development of the kinds of knowledge that are necessary for the adequate resolution of civilisation extinction level crises.

Actually, not ‘science’ – and not scientists either – but ‘science culture’.

The science culture that pushes complexity scientists to undervalue or distort the reality of their domain.

The science culture that sets a standard of reductive reasoning that must be met before a decision can be considered evidence based.

The science culture that resists, at a level which seems at base emotional, the reality – discovered and explored by science – that complexity defeats reductive attempts at understanding; reveals these to be, at the limit, dangerous.

I am not a scientist, but I see evidence of this resistance in debates around publication of models without experimental data, in the presentation of AI outcomes as something beyond curve-fitting in domains with simplistic success metrics (highly sophisticated multi-dimensional curve fitting, but hey, you burned days and weeks of supercomputer petaflops, so I should hope so too). And of course in the self-circumscription of complexity scientists themselves.

A resistance which has interesting parallels with the struggle for acceptance that statistical mechanics had in the late C19th.
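As an aside on the ‘curve-fitting’ point above, here is a hedged, minimal sketch – my example, not the author’s – of what fitting a model against a single simple success metric looks like in its plainest form. Modern AI systems work in vastly more dimensions, but the logic of optimising one scalar score is the same.

```python
# Illustrative sketch only: 'curve fitting' as choosing parameters that
# minimise a single, simple success metric (here, squared error).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)  # noisy observations

# Fit a degree-7 polynomial by minimising squared error (the 'success metric').
coeffs = np.polyfit(x, y, deg=7)
fit = np.polyval(coeffs, x)

print("mean squared error:", np.mean((fit - y) ** 2))
# A low score on this one metric says nothing about behaviour outside the
# observed range, or about whether squared error was the right thing to
# optimise in the first place.
```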

So here’s the thing. Complex systems as large as the biosphere (the most complex system we know of, save the universe itself) are beyond computation. And yet we must act. Now.

We want to act in accordance with the best possible evidence, but at the same time in the knowledge that we cannot rely on reductive approaches – because the requirement of such approaches to simplify is dangerous – in that it leads to focus on particular measurable outcomes as proxies for ‘success’.

And in complex systems, success itself must be considered complex. Simplistic success metrics (wealth measured in money, growth measured in GDP, agricultural efficiency measured in output per hectare, ‘will of the people’ measured in simple majorities – there is a long list) are what got us here in the first place.

This means that we are going to need to operate at the boundary between hard evidence and ‘a feel for the domain’. And that we need to do this in the most sane manner possible.

The resistance of science culture to the advances and meaning of complexity science is an extraordinary self-inflicted barrier to our ability to do this.

Because the world is full of people who have no trouble at all operating at that boundary, for all the wrong reasons – the very mystifiers that science deplores; politicians, demagogues, venture capitalists. But these people are being given a free pass by science’s unwillingness to take seriously the most important advance in human understanding of the fabric of reality for a century – our increasing capacity to map the landscapes of complexity.

  • We need to recognise the limitations and distortions on our deepest interactions with systems which are imposed by a science culture which has adopted reductive empiricism as a straitjacket.
  • Clear evidence of the weakness of this culture comes from the strange unwillingness of domain savants to communicate about their process – and from the self-imposed constraints adopted by complexity scientists.
  • Evidence of the harm this weakness does is in the desperately poor record science culture has in influencing policy where it matters most. It turns out that pitting one set of reductive success metrics against another is ineffective.
  • Science needs to learn from the experience of acceptance of the math-mediated reality of statistical mechanics and apply it to the new frontier of complexity science.
  • Domain savants should be bolder and more open about their practice – and look into the use of tools and resources which can map complexity in useful ways – Pattern Language, for instance.

*I use the term ‘savant’ here without much pleasure. I avoid the word ‘master’ for reasons I hope are obvious, but in my search for other terms could only find ones that either imply ever-greater degrees of domination and control, or of consummate perfection – neither of which make sense in the way that I intend.
** In describing complete domain maps as being ‘out of reach’ I am not for one moment implying that there is an end to our capacity to learn more about these domains; far from it. The development of tools for exploring formally complex conditions is, for me, the most hopeful and powerful advance of the last half century. What I mean is that such complexity has an intrinsic characteristic – sensitivity to initial conditions – which means that complete maps (which require some loss of detail) can never be made. This is good news! Science can go on and on, forever, without exhausting mystery. There will always be awe and wonder in the universe.

