Forty-two, famously, is the answer to Life, the Universe and Everything, given ‘with infinite majesty and calm’ by the super-powerful computer Deep Thought in The Hitchhiker’s Guide to the Galaxy. It’s an answer that points to the absurdity of asking a machine about meaning. Machines are good for computing numbers and probabilities, but not for anything deeper, more human. (Although, in fairness, there’s something absurd in the answer that might appeal to existentialists. Or even to stoics, who are supposed to take whatever nonsense life throws their way on the chin.)
Still, I’m tired of people saying that machines don’t get what things mean. That they are basically souped-up calculators. I hear that a lot nowadays about language models, the AI of the moment. What they’re doing, I’m confidently informed, is pattern matching words – nothing deeper than that. They have no real grasp of what things actually mean.
Meaning. It’s such a short word. And, like many deceptively simple terms, people usually fly right by it without a moment’s thought. It means what they want it to mean – which in this case is that humans are qualitatively different from, and superior to, machines. Are we?
The wisdom of crowds
We’re certainly into meaning, as a species. We are homo sapiens – the wise human. The one that knows stuff. But that’s not quite right: the ‘human that wants to know stuff’ would be better. (Homo qui vult scire, less catchily.) Viktor Frankl was onto something profound with his book Man’s Search for Meaning – where he argued that concentration camp survivors, in addition to astounding luck, had in common that they were able to find some sort of meaning or purpose to motivate their struggle against horrible odds.
While Frankl was searching for meaning in the camps, the American psychologist Abraham Maslow published his landmark paper ‘A Theory of Human Motivation,’ setting out our ‘hierarchy of needs’ – later popularised as a pyramid. At the base, suggesting some sort of primary importance, sit food and physical security. As you advance up the pyramid, the needs become more abstract – through belonging and esteem, until right at the top we reach our need for self-actualisation and transcendence.
I’ve long thought that Maslow got it 180 degrees wrong. Imagine yourself lost and alone in the rainforest. How long would you survive? A day? Two? I wouldn’t last long. There’d be shock, followed soon after by rapidly diminished hope and then listless despair. To survive we need first to thrive: to know what we are about, to be motivated to engage in the world around us. Often that comes from membership of our group. We find our place and our sense of self through it. And from that comes access to the basics of survival – the food and security, and the cultural knowledge that is essential to obtain them.
So, it’s meaning first, munching second. And for us, much of that meaning is social – after all, it takes a village. It’s why loneliness, depression and morbidity correlate. And why ostracism activates the same brain regions that are associated with physical pain. One of the predictions of ‘terror management theory’ is that we invest in our group when reminded of our own mortality. The story of us is a large part of what makes life meaningful.
This social meaning even extends into the realms of the aesthetic and spiritual. Why did several thousand of us troop into the museum that day to see Van Gogh’s Sunflowers? We were there because we all agree it’s magnificent. Why do we all find mountains sublime? As Robert Macfarlane argues, we didn’t, until fairly recently, when reverence for mountains became all the rage. And yet, I still feel moved when in the high mountains.
Perhaps such herd-like behaviour should give us a moment’s pause when congratulating ourselves on our superior sense of meaning.
Machine meaning
At its most abstract, meaning is just an appraisal of something’s value, judged from our own perspective. Typically that value reflects our motivation – what do we want to achieve and why? How do things stand, relative to how we want them to be?
Machines do that too. In computer science, we might say that motivation is captured in the reward function – the thing to be optimised. And meaning comes from appraising where the algorithm is, relative to that. Like us, there’s a degree of contingency, of uncertainty involved. Machine and human alike are immersed in a world of probabilities – of figuring out what’s valuable, and how to get to it. At whatever level – Bayesian reasoning in the mind, the electrochemical signalling of neural networks, even molecular and subatomic processes – human minds are weighing probabilities and computing value, much like … computers.
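To make the reward-function picture concrete, here is a toy sketch – every name and number in it is my own illustrative invention, not a description of any real system. A ‘motivation’ is a reward function; ‘meaning’ is the appraisal of possible actions against it, weighted by an uncertain world:

```python
# A toy illustration of 'machine meaning': appraising options against a
# reward function under uncertainty. All names and numbers are hypothetical.

def reward(state, goal):
    """The 'motivation': value is higher the closer the state is to the goal."""
    return -abs(goal - state)

def appraise(candidate_moves, state, goal, transition_probs):
    """Weigh each possible move by how likely it is to pay off -
    an expected-value calculation over an uncertain world."""
    expected = {}
    for move in candidate_moves:
        # Each move may succeed (outcome 1) or fizzle (outcome 0), with
        # given probabilities; sum the probability-weighted rewards.
        expected[move] = sum(
            p * reward(state + move * outcome, goal)
            for outcome, p in transition_probs.items()
        )
    return max(expected, key=expected.get)

# An agent at position 2 wanting to reach 5; moves succeed 80% of the time.
best = appraise(candidate_moves=[-1, 0, 1], state=2, goal=5,
                transition_probs={1: 0.8, 0: 0.2})
print(best)  # -> 1, the move that maximises expected reward
```

That single number coming out at the end – a bare appraisal of value relative to a motivation – is all the ‘meaning’ this sketch has.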
Of course, computers have it comparatively easy – their reward function is typically narrow; the universe in which they seek to optimise it, a thin, insubstantial slice of the rich, multifaceted ‘real’ world we inhabit. So for language models, the value comes from working out which words go best with others. We humans, by contrast, are trying to find the best solution for a plethora of motivations – deep, intermediate, and proximate. It’s as if our world is multicoloured, and theirs monochrome. Their sense of meaning is correspondingly meagre. A bare, numerical value. 42, perhaps.
Is that all so different from us?
Feelings mediate our social world. They are the currency of human meaning. Value for us isn’t just a number, it’s felt. This is what separates the world of animal meaning from the world of the machine. When we say that the computer doesn’t get meaning, we mean it doesn’t feel what we do. So, while intelligent machines are increasingly social, including with us, their value isn’t experienced. And that, really, is what we mean when we say they lack meaning.
But ultimately, that’s rather a slim difference on which to hang the contention that machines don’t get meaning at all. For one thing, biocomputers may soon have the capacity for feeling. For another, feelings are only one way to represent value relative to motivations. The language model doesn’t feel the words it adroitly churns out. But does that mean they have no meaning for it? I’m not so sure.
Know what I mean?