Driverless cars have hit the headlines a lot recently. Auto manufacturers, technology companies including Google, and legislators predict that by 2020, self-driving cars will be navigating public roads in countries such as the US, UK and Finland. An article by the BBC on Monday predicts autonomous vehicles will be tested between London and Oxford as soon as 2019.
Most of what we read contains uncertainty, coupled with anticipation. Which auto-manufacturing company will win the race? Which country will get them on the roads first? Will their arrival be sped up now that Chinese internet giant Baidu has released its technology to be used by other companies?
Whilst the time of their arrival remains an issue of conjecture, that they will arrive seems to be pretty much a given.
In a recent hearing on the future of driverless cars, for example, engineering professor David Noyce remarked, “they’re coming, whether we like it or not.”
We could say that Noyce is overstating the inevitability of their arrival. But I want to suggest that this sense of inevitability around technological progression is pretty prevalent in society and its consequences aren’t great.
Namely, it puts technology in the driver's seat rather than us, a kind of modern-day equivalent to the way ancient societies spoke of the inevitable actions of 'the gods'.
The idea that technology deserves a place in society by virtue of the very fact that it exists is not a totally new thing either. John von Neumann, for example, claimed back in 1955 that "technological possibilities are irresistible to man." Progression must happen simply because it is possible.
John Paul II also sheds some interesting light on the topic. Recognising that technology aids human productivity if used in the right way, he also warns that it can “cease to be man’s ally” and almost become an enemy, “when it supplants him, taking away all…incentive to creativity and responsibility… or when, through exalting the machine, it reduces man to the status of its slave.”
Looking at the language we use when discussing technological progression, we would not be crazy for thinking this has already happened.
Noyce's quote, "they're coming, whether we like it or not," implies we have sacrificed ourselves, our futures, and all other values into the hands of the sovereign technology gods. Even the terms 'autonomous' and 'driverless' imply that responsibility is lost at our end when technology becomes sufficiently sophisticated, regardless of the intent behind the terms. Liberalism and its singular emphasis on agency has so far done nothing to challenge this.
You may ask what the danger in all of this is. Aren’t driverless cars meant to make society safer anyway?
It is true that we have good reason to believe self-driving cars will make for much safer roads and eliminate around 99.7% of road injuries. But going back to Noyce's language of inevitability, whether or not driverless cars actually contribute positively to society is beside the point.
It is our overly accepting and unquestioning attitude that needs challenging: we need an openness to the idea that some progress may be harmful rather than good.
The point is that in order to prioritise human safety, morality and dignity in the face of technological developments, a reflective and deliberative approach is needed. We need scientists, engineers and policymakers to really believe that other valuable priorities might rule out some technological developments, and a willingness therefore to put on the brakes in the face of powerful tech companies. And the idea that driverless cars will hit the roads 'no matter what' hardly conjures up images of thoughtful reflection.
We also need to move beyond the language of inevitability. If we do actually see ourselves as responsible for balancing the technological drive against other equally (if not significantly more) important drives in society like human safety, morality, and protection of the environment, then the fatalistic way we talk about technological developments needs to be challenged to reflect that.
At the moment, claims like Noyce's are all too prevalent, and even if we don't literally mean them, we are nevertheless telling ourselves a dangerous story in which progression trumps all other drives in society. Through this resigned language, we become passengers rather than drivers, with technology steering the way.
In changing our language to reflect our responsibility, perhaps our journey with technology could be steered in a fruitful direction, by agents who can successfully navigate between values higher than mere progression.
Rachel Fidler is an Assistant Researcher at Theos
Image from Wikimedia Commons, available under Creative Commons License 2.0