Rachel Thomas argues that our growing self-identification with machines could have the severest consequences for the vulnerable in society. 20.03.2019
This week I found myself calling a friend an ‘absolute machine’ after he’d worked a fourteen-hour shift almost entirely uninterrupted. He’d worked so hard that he ended up falling asleep across a self-assembled line of bean bags, which, in a see-through office, led onlookers to assume he’d actually died.
Of course, the fact that he ‘shut down’ after working so hard made us all aware that he definitely isn’t an actual machine. Yet I was struck by just how easily we compare people to machines, and what implications this could have for those who don’t easily fit the metaphor.
We compare ourselves to machines in everyday language when we speak of ‘processing information’, of ‘cogs whirring’ inside our heads, and of ‘letting off steam’ when we’re ‘fuming’.
Ever since the Industrial Revolution, as machines have become more and more complex, we have looked to them to better understand ourselves. By the time Freud was developing his theory of mind, for example, consciousness was understood like a steam engine: emotional pressure bubbling away until it has to escape, fuelled by unseen forces.
Now, the metaphor of choice for understanding ourselves is the computer. Max Tegmark’s recent bestselling book, ‘Life 3.0: Being Human in the Age of Artificial Intelligence,’ applies the metaphor in one of the most explicit ways I’ve seen, though we adopt it more subtly in everyday language. “Your synapses store all your knowledge and skills as roughly 100 terabytes’ worth of information,” he writes, “while your DNA stores merely about a gigabyte, barely enough to store a single movie download.”
The comparison between humans and machines has proved useful in a number of fields, especially in medicine, where adopting a mechanical outlook on the human body allows treatments to be modelled with precision. The more recent comparison of our minds to computers has been especially useful in cognitive psychology.
Yet as mechanical language has been increasingly adopted into everyday conversation, we have subtly started to elevate the mechanical parts of ourselves to the highest status, at the cost of other more vulnerable, and more human, traits. This could have the severest consequences for the vulnerable in society, and it underlies the dangerous prioritisation of the perfectly functioning mind found in Tegmark’s book. One of the reasons we are equally excited and threatened by recent developments in artificial intelligence, for example, is that they tap into the qualities we have come to hold most valuable about ourselves: our ability to learn, process and store information.
For people with low IQs, who don’t find it easy to pick up new information quickly, and for the 50 million people worldwide who suffer from dementia and its associated memory loss, this comparison with computers sends out a less than accepting message. How will these individuals feel valued in a society whose most prized attributes are intelligence, quick learning and a sharp memory? If it is a machine these individuals see when they look in the mirror, it is one that is badly programmed, faulty, or shutting down.
I recently spoke with the Head of Computer Engineering at Cambridge University, who works at the forefront of machine programming. I expected him to be virtually obsessed with machines, making a lot of the comparison between humans and the robots he had made. Yet he couldn’t have been more passionately against the link, when taken literally. He gave the moving example of his father running into a series of health problems at the end of his life. It was precisely that his father wasn’t a machine, he said, that made him want to care for him and look after him.
The main issue with mechanical metaphors is that they not only play down the profound vulnerability of people, but also play up the competency of machines. Even the most advanced computers require careful human input and guidance.
In a recent episode of Channel 4’s The Secret Life of 4 and 5 Year Olds, a group of children were introduced to a sophisticated companion robot, representing the height of technological advancement. At one point the robot hit its head on an unfortunately positioned beam (its vision evidently not quite developed enough yet). One boy, moved with empathy, sourced a bandage and affectionately wrapped the robot up in it. He later said that the robot, which he named ‘Bobby’, was his friend, and he was sad to see it leave.
Studies have shown that the robots we feel intuitively connected to are the ones that depend upon us in some way for care. Researchers at the Massachusetts Institute of Technology (MIT) showed that when a humanoid robot is attacked and cries out for help, human observers display the same level of empathy and care that they extend to vulnerable humans. This is why technology companies deliberately design robots not to look creepy or threatening, but to be childlike and vulnerable. Though the metaphors we employ in conversation treat machines as perfectly functioning systems, the reality of our interaction with them reveals that they are needier than the stereotype suggests.
Maybe the solution is not only to realise that humans aren’t perfectly working mechanical systems, nor would we want them to be, but also that machines aren’t as independent as we think. Recognising the vulnerability of both humans and machines would enable us to reclaim the metaphor in a way that includes those in society who don’t always function perfectly.
Rachel Thomas is a Religion & Science Researcher and former Theos Research Assistant.
Theos researches and investigates the intersection of religion, politics and society in the contemporary world.