Complexity and Uncertainty Theory: The Folly of Specialists

The Russo-Ukrainian war has spawned a whole new generation of defence analysts who glibly draw specific lessons from the ongoing war.

This article is for them. Since they obviously do not read thinkers like Nassim Nicholas Taleb and other writers on complexity theory, this should help them.

War is not a complicated domain but a complex one. In a complicated domain, the cause-and-effect relationships are clear, albeit only to the expert; hence the experts come out and educate us ignoramuses. In a complex domain, however, cause and effect is hidden as well as ever-changing, so expertise matters little.

From the tantalising proposition that a ‘butterfly flapping its wings in Texas can cause a tornado in California’, the experts draw the wrong lesson: that an expert, like them of course, can predict a tornado in California merely by watching a butterfly flutter its wings in Texas. The right lesson, unfortunately, is that cause and effect are so uncertain that to trace the chain of events behind a tornado in California, you may need to go all the way back to a butterfly in a park in Texas.

And that too, back in time.

Despite that, not every flap of a butterfly’s wings results in a tornado. That is the whole point – and it is sorely missed. In complex domains, the inter-relationships between the constituent parts change continually.

In complex domains, the future does not mimic the past but is created anew every moment. Had it not been so, a fat turkey would have been right to draw from its daily feeding the lesson that the master is a nice chap. But come Thanksgiving, it is no more than a delicious dinner on a plate.

So, what should generals and defence analysts do, if not speak of lessons learnt? That is the problem of what Taleb calls Platonicity, i.e., “the desire to cut reality into crisp shapes.” In the book The Black Swan he says,

“They do not know about the subject matter (because it is unknowable – interpretation added by me), they are much better at narrating, or smoking you with complicated mathematical models. They are also likely to wear a tie.”


We humans also suffer from the Narrative Fallacy. Taleb says he would have preferred to call it Narrative Fraud, but settled for Narrative Fallacy. We look for patterns – cause and effect – where none exist. Patterning makes things easier for the brain. We like to simplify and summarise because that reduces the dimensions of the matter. We simply cannot look at a sequence of events and not force a causal link between them. Post hoc ergo propter hoc – after this, therefore because of this.

This would be harmless, did it not leave us with the impression that we understand this random world.

Professionals, particularly those who get paid for their words, can provide a reason for anything. When Saddam Hussein was captured on December 13, 2003, Bloomberg put out the news at about 1 am – ‘US Treasuries rise; Hussein capture may not curb terrorism’. When the markets actually fell, Bloomberg put out another bulletin soon after – ‘US Treasuries fall; Hussein capture boosts allure of risky assets!’

Talk is cheap.

Why is it wrong to draw a general lesson from a specific event? Because that would be inductive reasoning, which never proves anything; it only fools us.

To create a general theory from specific observed events is inductive reasoning, which is unscientific. The only way to approach the truth is through falsification: you show that such-and-such claim is not true because you have directly observed the opposite.

Falsification as the way to right logic was first propounded by Karl Raimund Popper.

Frogs die when you put them in a freezer for a week, but that alone proves nothing. If you put a frog in a freezer for a week and it survives, that is relevant to the search for truth: you have actually seen the opposite. If ‘all frogs die when put in a freezer for a week’ were true, then this frog, put in a freezer for a week, should die. But it doesn’t. Hence it is not true that frogs die when put in a freezer for a week.

You can never prove a thing right, only wrong. Essentially, it is chipping away, which removes some of the wrong, unwanted stuff. When all the bad stuff is removed, what is left is the good stuff.
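The asymmetry between corroboration and refutation can be sketched in a few lines of code. This is a toy illustration of the frog example above; the function name and the data are my own construction, not from Taleb or Popper.

```python
# Falsification in miniature: a general claim survives only until
# a single counter-example is observed.

def claim_survives(observations):
    """The claim 'every frog put in a freezer for a week dies' stands
    only while every observed frog (True = died) has in fact died."""
    return all(observations)

# A thousand dead frogs corroborate the claim but can never prove it.
print(claim_survives([True] * 1000))            # True: not yet falsified

# One surviving frog (False) refutes the claim outright.
print(claim_survives([True] * 1000 + [False]))  # False: falsified
```

No number of confirming observations changes the claim's status, but one disconfirming observation settles it for good – that is the whole of the asymmetry.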

All this war has proved is that it is wrong that tank columns and aircraft forays are immune to relatively cheap missiles, wrong that political gains come only from military successes, and wrong that the world would jump in to stop a war. These hold as falsifications, but not as general theory.


We know what is wrong with greater precision than what is right. Seeing frogs live is a corroborative fact, but not evidence. Evidence is always negative. I cannot say a man is not a murderer because he did not murder anyone while I had breakfast with him last Friday. But if I did see him murder someone, I can say that he is a murderer.

You get closer to the truth by citing negative instances, not by verification. In any case, we verify more to confirm our held beliefs than to really test them.

The psychologist P.C. Wason ran this experiment. He gave subjects a three-number sequence, e.g. 2-4-6, and asked them to discover the rule that generated it. The procedure was that the subject would call out further numbers, based on what he thought the rule was. The researcher would answer ‘yes’ or ‘no’, depending on whether the offering fitted the real rule. Once confident they had found the rule, subjects would announce it. The correct rule was simply ‘numbers in ascending order’. But almost no subject got it right, because to discover it they would have had to offer numbers in descending order, say 6-4-2, to which the researcher would say ‘no’. Hardly anyone did, because they were bent upon getting a ‘yes’ from the researcher, i.e. trying to confirm something rather than disconfirm it. Subjects tenaciously sought confirmation for rules they had made up themselves – rules that did not exist.
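The trap in the 2-4-6 task can be made concrete with a short simulation. The function names and the specific probe triples below are my own illustration of the idea, not Wason's actual protocol.

```python
# A minimal sketch of Wason's 2-4-6 task.

def hidden_rule(triple):
    """The experimenter's real rule: any three numbers in ascending order."""
    a, b, c = triple
    return a < b < c

def guessed_rule(triple):
    """The rule most subjects form after seeing 2-4-6: 'add 2 each time'."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Confirming probes fit the guessed rule AND the hidden rule, so each earns
# a 'yes' that teaches the subject nothing.
for probe in [(2, 4, 6), (10, 12, 14), (100, 102, 104)]:
    assert guessed_rule(probe) and hidden_rule(probe)

# Only probes that could disconfirm the guess carry information:
print(hidden_rule((1, 2, 3)))  # True: breaks 'add 2' yet earns a 'yes' -- guess falsified
print(hidden_rule((6, 4, 2)))  # False: descending order earns the revealing 'no'
```

As long as subjects only offer triples that fit their own hypothesis, every answer is ‘yes’ and the hypothesis is never tested; the informative probes are precisely the ones designed to fail.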

This is confirmation bias. You always look for answers that would support your hypothesis, not falsify it.

Taleb aptly derides it as ‘Confirmation Shmonfirmation’.

I am told the Indian Army has put great faith in two books – the Bhagavad Gita and the Arthashastra – to guide it for the future. That would be inductive reasoning, and one relying on stories from a very distant past at that. I advise them to instead read Taleb and complexity theory, particularly the Cynefin framework by Dave Snowden.

Col Alok Asthana is a veteran, presently a consultant on leadership and innovation. He is the author of two books – Leadership for Colonels and Business Managers and Reclaim Your Democracy. He can be contacted at [email protected]

Featured image credit: DEF – TALKS by Aadi/YouTube