Technology, Responsibility, and the Common Good
This week, a significant court case in the United States has brought renewed scrutiny to some of the world’s largest social media companies. In a landmark development, a federal judge ruled that claims against companies including Meta, TikTok, and YouTube could proceed, finding there was sufficient evidence to argue that aspects of their platforms had been deliberately designed in ways that foster compulsive and addictive use, particularly among children and young people.
The case, brought by a coalition of US states, centres on the allegation that features like infinite scroll, algorithmically curated feeds, and persistent notifications are not incidental, but have been intentionally engineered to maximise the amount of time users spend on these platforms.
Many of us will recognise the experience. I dread to think of how many times I have sat down on the sofa next to my wife and my first instinct has been to reach for my phone to start scrolling on a social media site, or how often I have been drawn in by the subtle pull of notifications that are difficult to ignore. But this case pushes us beyond anecdote to think about a rather more serious question: what happens when the pull is not accidental, but designed?
And perhaps more importantly: what does it mean for a society when some of its most influential tools shape us in ways we barely even notice?
Underlying questions
It is tempting to reduce this conversation to a warning about “too much screen time”. Indeed, in the UK today, new government guidance has been issued suggesting under-5s should be limited to one hour per day, and under-2s should not be left alone with screens at all. Such practical measures are helpful, as far as they go.
But what is at stake here runs much deeper. Increasingly, evidence points to the impact of social media on mental health, particularly among young people. Rising anxiety, diminished attention spans, disrupted sleep, and a growing sense of comparison and inadequacy are now widely discussed topics both in the media and in government.
But underneath all of this lies something more fundamental: the formation of habits, desires, and identity. These platforms are not simply tools we use occasionally. They are environments we inhabit daily. And like any environment, they shape us over time.
This raises several questions we ought to think about. How can we nurture healthy relationships and attentiveness in a world of constant digital interruption? Where are we seeking our identity and worth, and are we in danger of looking for validation in likes and shares? And for all of us, it raises one deeper, underlying question: who - or what - is forming us?
To be clear, innovation and user engagement are not wrong in themselves. But the ethical questions emerge when engagement starts to look more like dependence.
- When does persuasion become manipulation?
- When does convenience become compulsion?
- When does a business model built on attention begin to erode human wellbeing?
These are not merely technical questions; they are moral ones.
Freedom and formation
In the West, personal freedom has been elevated to become the foremost value of our age; it is freedom that lies at the heart of society’s debates, whether on life issues, sexuality and gender, smoking bans or social media use. And freedom is a good gift from God. But Christians have long been concerned not only with freedom, but with formation as well.
The Apostle Paul writes, “‘I have the right to do anything,’ you say, ‘but not everything is beneficial…I will not be mastered by anything’” (1 Corinthians 6:12, NIV). Those words still resonate today. True freedom is not simply the ability to choose, but the ability to choose well, and to live without being quietly mastered by habits or systems that diminish us.
Similarly, Paul’s call in Romans 12:2 to “be transformed by the renewing of your mind” reminds us that we are always being shaped, whether intentionally or by default.
In his book The Tech-Wise Family, Andy Crouch writes about how we become more like what we pay attention to. This case has also been made vividly by Jonathan Haidt in The Anxious Generation, a book that has done much to inform us about the impact smartphones and social media are having on young people.
Haidt’s work acts as modern-day evidence for what Paul wrote two thousand years earlier. The technologies that relentlessly capture and direct our attention are not neutral. They are formative. They shape what we love, what we notice, and ultimately, who we become.
Where does responsibility lie?
One of the strengths of this week’s legal developments is that they resist the temptation to place responsibility solely on the individual user.
Instead, they point to a more complex and honest picture, one that CARE has long sought to emphasise in public life.
Technology companies bear real responsibility for the design choices they make. With influence comes accountability, particularly where products are used by children and young people.
Government also has a role. In the UK, legislation such as the Online Safety Act 2023 is an important step forward. But there is still more to do. Much of our current approach focuses on content, while the deeper questions of platform design - and the incentives that drive it - remain less explored.
CARE has been a consistent voice in this debate. We have been at the heart of shaping some of the positive steps taken in recent years, particularly in ensuring that the protection of children and the most vulnerable remains at the heart of public policy. But our work is far from complete, and technology does not stand still.
If this week’s case tells us anything, it is that we need to think not only about what appears on our screens, but how and why it gets there.
But this is not a matter for the Government alone.
Families and communities play a vital role in modelling healthy habits and setting boundaries. And the Church has an important calling to help people navigate these questions wisely, forming disciples who are attentive, present, and free.
A better vision for technology
It would be easy at this point to lapse into pessimism. But that would be a mistake.
Technology is not inherently harmful. At its best, it can connect, inform, and empower. The question is not whether we use technology, but how, and to what end.
What would it look like for technology to be designed in ways that genuinely support human flourishing? What would it mean for business models to be built not only with profit in mind, but with the wellbeing of users? And how can we rethink attention itself, so that as a society we view it not as a commodity to be captured, but as a gift to be stewarded?
At CARE, our vision is to see “politics renewed and lives transformed”. That vision must surely include our digital lives: the spaces where so much of life now unfolds.
We want to ask better questions about the kind of digital world we are inviting people into. If our technologies are shaping us, then we must ask who is shaping them. If our attention is being captured, we must ask what is capturing it, and why. If our habits are being formed, we must consider whether they are helpful or harmful, and whether we are becoming more like the people we are called to be.
As supporters of CARE, you are partnering with us as we seek to bring a better story into these debates, and you help us to bring thoughtful, compassionate, and principled arguments into complex areas in public life.
We also want to equip you with resources as you seek to act with discernment, and as you bring God’s better story into the conversations you are having, whether in your family, church or community.
It is both our challenge and our responsibility not to withdraw from the digital world, but to engage with it wisely, with clarity, with care, and with hope.
Because ultimately, the goal is not simply to be less distracted or entertained. It is to be more fully human; to become the people God created us to be.