Defending Privacy: How to Love Your Neighbour in the Digital Age

Nathan Mladin unpacks his essay Data and Dignity, and expands on the importance of privacy in the digital age. 26/01/2023

The Online Safety Bill is the UK’s new legislation for making the internet safer for its citizens. It’s an ambitious but difficult project, as tricky as it is necessary. Recent controversies bear this out in spades: the proposal to remove the reference to so–called “legal but harmful” content and, more recently, the proposal to make executives of big tech companies liable for breaching their duty of care to children. Some celebrate a tougher approach that would protect the vulnerable and detoxify the internet. Others fear arbitrary censorship, the erosion of free speech, and all the negative repercussions that come with it.

The legislation follows, and seeks to address, increasingly well–known problems with digital technologies and platforms powered by algorithms and big data. Indeed, it’s becoming old news by now: we’re constantly tracked and profiled by powerful AIs working on behalf of private companies (though profiling and algorithmic decision–making are also on the increase in the public sector). Our data – about our internet habits, purchase histories, romantic preferences, job performance, every trace of our lives as we interact with digital technology – is hoovered up and analysed to generate profiles and predictions about us. The apps that we use can become apps that use us, without any regard to the vulnerability of… well, all of us, but particularly the young.

Many concerns have been raised about this system in recent years: declining teen mental health caused by “performative social media” (e.g. Instagram and TikTok), the fraying of our democracies through filter bubbles and conspiracy theories going viral, injustices against minority groups through the use of data–driven policing and facial recognition tech, and so much more.

Invasion of privacy almost always comes up. Indeed, privacy has come to act as a point of convergence between tech critics, regulators, and other stakeholders within civil society concerned with the data economy. But what is privacy anyway, and why does it matter (‘if I have nothing to hide’)?  

Most often, privacy is treated as an individual’s right to control what information about them is collected and analysed, by whom and on what terms. And there is much to be said about this view, when so much control has been lost to data–hungry corporations. But privacy – in this narrow, individualist, rights–focused sense – is not enough to address the wider moral risks of emerging tech.  

Privacy understood as our right and ability to control our data wrongly assumes that everything we reveal about ourselves is the result of deliberate or conscious acts. This is often not the case. Our data disclosures can be manipulated through web and app design, unwieldy terms and conditions, and default settings that encourage maximum data collection. Deeper still, an emphasis on control places impossible demands on us. It unwittingly overwhelms our agency just as it tries to uphold it. If data is used against us, or even merely to profit from us, it then seems like it’s somehow our fault, or at least a process to which we have consented.

It’s also not merely that we’ve lost control over our data. In fact, much of the information derived, and many of the predictions made, about us are based on seemingly anodyne metadata (data about data) that we wouldn’t even recognise as ours in any meaningful way, or as important to us. Rather, the key issue is that the whole system treats us as less than human. What are we seen as? Digital livestock: farmed for our money, milked for our attention, our relationships and connections harvested for profit or for government power.

If we are to resist such dehumanising uses of technology, privacy must be reimagined around a truer, more rounded understanding of what it means to be human.  

Drawing explicitly on Christian thought – although anticipating overlap with other religious or philosophical traditions – Data and Dignity, our latest essay, puts privacy on a wider, more robust anthropological footing. To cut to the chase, privacy is about our dignity as precisely the sort of creatures that we are: embodied, with limitations and susceptibilities to be honoured rather than violated for gain; relational, made for relationships of trust and mutual care rather than exploitation; agential, with a capacity for intentional action to be upheld rather than undermined.

This wider anthropological foundation allows us to see more clearly that privacy does not simply have to do with what others (e.g. advertisers or government agencies) know about me – an individual concern – but also with the systems that my data feeds into, the consequences of which apply not simply to me but to more vulnerable others. Rooted in collective dignity rather than individual rights, privacy remains an important way to think about what is at stake and to begin addressing the challenges facing us all in the digital age, and most importantly our vulnerable digital neighbours.

So what to do? Responses have to be both individual and collective, regulatory and entrepreneurial. As individuals, we should reject default settings on websites, apps and devices that seek maximal data collection, and opt for privacy–conscious technologies and services, including privacy–friendly search engines (e.g. DuckDuckGo, Brave) and email providers (e.g. ProtonMail). Regulatory efforts to curb the extractive business models of today’s technology giants, at both national and supra–national levels, should be encouraged. But regulation is often outrun by technology. What we need, then, are entrepreneurial responses to the status quo: alternative business models and innovation that place human flourishing and dignity at their centre. As Data and Dignity tries to show, theological anthropology should be seen as a rich source of wisdom to draw on, especially if what is sought is a rounded, rather than a reductionist, and a realistic, rather than an idealised, understanding of the human person.

Privacy is not dead, nor should we allow it to die. Protecting privacy is how we love our neighbour in the digital age.  

Download the essay here.

Photo by ThisIsEngineering on Pexels 

Nathan Mladin

Nathan joined Theos in 2016. He holds a PhD in Systematic Theology from Queen’s University Belfast and is the author of several publications, including the Theos reports Data and Dignity: Why Privacy Matters in the Digital Age, Religious London: Faith in a Global City (with Paul Bickley), and ‘Forgive Us Our Debts’: lending and borrowing as if relationships matter (with Barbara Ridpath).

