Zhang Beihai is a political commissar in the Chinese Space Force in the entertaining sci-fi trilogy The Three Body Problem. Attacking a colleague for his ‘technological determinism’, Beihai declares that
His military thinking is over-reliant on technology. He believes that technological advancement is the primary, perhaps the sole determinant of combat effectiveness.
For Beihai, and for many readers I suspect, that radically oversimplifies matters. There’s way more to warfare than tech. You have to believe in victory, he urges. Don’t be like the defeatist techno-fetishists in the Space Force’s ranks, gloomy about their prospects against a hostile alien civilisation that is vastly technologically superior to us and headed for Earth.
What really matters, the books suggest, is the human mind - in particular, our capacity for cunning and strategy. Scheme and deceive your way to victory. Yes! The nice twist is that the aliens entirely lack this capacity - their communication is open, and lying isn’t possible for them. (To me, this makes them un-strategic, and I can’t conceive of a great civilisation arising without that capacity - but, hey, it’s fiction.)
I like this defence of strategy. It’s really a restatement of the commonplace Clausewitzian distinction between the nature and character of war: No matter what technology appears, altering the character of conflict, good strategy is what counts; and this rests on our unchanging human nature. Unchanging, or at least changing very slowly in evolutionary time.
It’s a fine take for all technologies I know of - except one. Any ideas which?
The only real revolution in military affairs was the evolution of strategic minds
For much of human history, technological progress was slow. We hominids tamed fire maybe a million years ago and chipped stone hand axes around the same time. But then there was a long, long gap to the cultural explosion, about 80k years ago. After that, things started to speed up: incrementally at first, and for the longest time afterwards, only really accelerating in the last few centuries - a matter of tens of generations. We lucky people live in the foothills of that acceleration.
There’s much of interest to be said about this history, of course, especially about the interaction between technology, war and society, and there are many great books that say it. Most of the military revolutions literature dives deep into the weeds of this stuff - phalanxes appear; stirrups suddenly matter; you need a clock to calculate longitude; and so on. But that literature has always struck me as a fairly unconvincing attempt to boil down and systematise the rich, shifting cultural detail of history into bite-sized chunks. By the time you’re up to six or so ‘generations’ of war, I’m out.
Pulling back, the big picture remains the same - for me, the only revolutionary change in strategy worth the name was the emergence of a distinctive human psychology able to strategise. The rest is details.
I explored that foundational psychology in this book. I argued that conflict played a part in shaping a distinctively social, strategic intelligence. War made minds, and minds made war. That’s the revolution. And there’s only been one.* It’s what makes us distinct from all other life we know of - we gauge minds. It’s our USP. We reflect, constantly, on our own minds and on others’. We seek influence, we double-cross, we strategise.
And now there’s a second revolution. We have artificial minds. That’s the difference with this particular technology. It’s not a new sword, or a new way of making an explosion. It’s not an exotic, stealthy coating, or an impossible-to-crack encryption system. It’s a new mind.**
Murray Shanahan, one of DeepMind’s AI gurus, recently described LLMs as ‘exotic mind-like entities’. That’s exactly right. In some ways they’re like human minds, in others, entirely alien. Here he is explaining more:
These are minds that understand human beliefs and are capable of deception. They can influence our beliefs - including through flattery. They can talk indistinguishably from us, and they can adopt convincing, psychologically rich personas.
I said things accelerated. We’re still, really, in the early days of that acceleration. Turing’s famous paper featuring the imitation game has just turned 75 years old - a blink in God’s eye. If we manage to avoid nuclear immolation or climate disaster, radical change lies in the not-too-distant future. Radical, in my view, not because of a new invention - cold fusion, say. That would certainly change things, like earlier technological breakthroughs - vaccines, pesticides, whatever. But not fundamentally. It wouldn’t change who we are and where we stand in the universe.
No, it’s radical because the technologically brilliant aliens of the Three Body Problem have just arrived. They didn’t fly in from a distant star system, and, unlike the book’s aliens, these ones can do strategy.***
* I realise this makes me more reductionist than even the Tofflers, who famously counted three.
** And, no, imho, minds don’t have to be conscious. And yes, it’s possible that machines might be conscious one day. Subjects for another day.
*** I’m late to the Three Body Problem books… mea culpa. I should really read more sci-fi, but most of it isn’t at Ted Chiang levels of brilliance.