‘We must stop kids going down lethal rabbit holes like Molly’: The fight for the Online Safety Act

After a five-year Telegraph campaign, new laws to protect young people online are set to be given final approval by the House of Lords


It has been nearly six years since Molly Russell’s parents and siblings woke up to discover they were no longer a family of five. Molly took her own life in November 2017 after she was bombarded with images and videos about self-harm and suicide on social media, leading her father, Ian, to declare: “Instagram helped kill my daughter.”

Now, bathed in warm September sunshine on the terrace of the Houses of Parliament, Russell recalls how her loss “fractured” his life and turned him from grieving father into a champion for online safety. 

“I’ve never seen myself as a campaigner. I’ve just seen myself as someone who has some lived experience to tell, in the hope that other people don’t have to live through a similar experience,” he says. 

Today, following years of campaigning by this newspaper, and thanks to Mr Russell’s hard work and determination, new laws to protect young people online will be given final approval by the House of Lords. The Online Safety Bill aims to police the tech giants whose algorithms Ian blames for his 14-year-old daughter’s death.

He says not a day goes by without the memory of Molly and the moment they found her in her bedroom intruding upon his thoughts. “It was like a fracturing of my life. I knew instantly the moment that I saw Molly’s body that my life [had been] broken,” says Ian, who has set up the Molly Rose Foundation suicide prevention charity in her name.

“That’s not to say it wouldn’t continue. But I knew that there’d be a before-Molly’s-death and an after-Molly’s-death. She was so future-looking, so positive, so adorable, full of emotional intelligence, caring and showed no obvious signs of any mental ill health or depression.

“But the question of why, how, what made Molly contemplate ending her life and then what led her to carry out that act – we just felt there was an extraordinarily huge chunk of what Molly had been experiencing that was missing, that we were unaware of. 

“There must be something. There was a piece of the jigsaw that was missing and that led us to finding online harms on her social media accounts.”

Molly Russell Credit: Family handout/PA Wire

Because of obstruction and evasion by Meta, the owner of Instagram, it took Ian, his family, their lawyers and coroner Andrew Walker five years to finally uncover the 16,000 “destructive” posts in Molly’s accounts encouraging self-harm, anxiety and suicide in her final six months. In a landmark decision, Mr Walker ruled last September that “the negative effects of online content” had “more than minimally contributed” to her death.

That finding marked a turning point in the debate over the need for regulation of social media. 

“It’s obviously a painful and personal story, but it has become adopted by the wider public,” says Russell. “I think that’s simply because there’s probably not a parent of a teenage child in this land who isn’t, in some way, worried about the potential dangers that can be found online.”

Before Molly’s death, the Government had been steering a course away from legislation. Its Green Paper, published just a month before she died, spoke only of “working together” with the industry to “develop safer online communities” and “empower” citizens to manage risks online.

This was why on June 11, 2018, The Telegraph launched its campaign for social media firms to be subject to a statutory “duty of care” to protect children from online harms. What our investigation uncovered was growing evidence of an “association” between the increasing amount of time children spent online on social media and their rising levels of mental ill health, anxiety and depression.

The campaign was backed by bodies such as the Royal Colleges of Paediatrics and of Psychiatry, along with US tech engineers who had themselves invented the algorithms and “addictive” features like the “infinite scroll”. The creator of the scroll tool, Aza Raskin, was part of a US-based campaign for new laws, warning his never-ending reward loop was “wasting 200,000 lifetimes per day”.

The campaign proved persuasive. As Michelle Donelan, the Secretary of State for Science, Innovation and Technology who has finally steered the legislation onto the statute book, says: “The Telegraph has played a really key role in shining a spotlight on some of the reasons why we have to act. And especially on the issue of young children accessing social media, when they shouldn’t be. It was something that I personally was taken aback by when I took this office.”

In April 2019 – 10 months after The Telegraph launched its campaign – the government published a White Paper that adopted the concept of a legal duty of care on the tech firms to protect people from identifiable legal and illegal “harms”. This would be enforced by a regulator – later named as Ofcom – with powers to fine and potentially prosecute individual directors.

It was, however, never going to be an easy passage into law. This was the first time a government had attempted to regulate the online world in such a comprehensive way, and was taking on some of the wealthiest and most powerful companies in the world. Additionally, there were reservations, including within the Conservative Party, about the Bill’s potential for restricting freedom of speech and threatening users’ privacy.

Kemi Badenoch, the Trade Secretary, articulated those concerns about “overreach” in the Tory leadership race. “We should not be legislating for hurt feelings,” she said. At the heart of the row were plans to regulate “legal but harmful” online content for adults such as that relating to suicide, self-harm, eating disorders, abuse or incitement to hatred of people because of their race or sex.

In the face of a Tory revolt, Ms Donelan navigated a compromise whereby the “legal but harmful” provisions for adults were dropped with two caveats: people can still legally request the companies to protect them from such content; and the firms must abide by their terms and conditions, many of which ban abuse, self-harm, suicide and hatred, or face multi-million-pound fines.

The solution has stuck, but with a clear distinction between the online regulation of adults and children, who still require protection under the law from “legal but harmful” content. This means age checks to prevent children from being exposed to such adult material are critical.

Ms Donelan told The Telegraph she wanted social media companies to fully comply with the new, tougher age checks or face “humongous” fines. “If that means deactivating the accounts of nine- or eight-year-olds, then they’re going to have to do that,” she says.

While child-safety campaigners like Ian Russell feel the climbdown by Ms Donelan weakened the Bill, it was also given more teeth as a result of a separate Tory rebellion led by Red Wall MP Miriam Cates and veteran Brexiteer Bill Cash. They defied lobbying by the tech firms to demand that Ofcom be given powers to prosecute social media bosses, with a jail sentence of up to two years if they persistently fail in their duty to protect children.

Ian Russell, centre, set up the Molly Rose Foundation suicide prevention charity in his daughter’s name Credit: Jeff Gilbert

Further amendments in the Lords have also strengthened the Bill, to ensure bereaved parents get access to their dead children’s social media accounts, to protect against “addictive” technology, to toughen action against abuse of women and girls online, and to criminalise the promotion of self-harm online.

“This is the most detailed and advanced set of protections for children which, if delivered intelligently and willingly, will transform their safety and agency online,” says Baroness Kidron, the former film producer who gave up her Hollywood career to campaign for online safety.  

There are still battles ahead, not least over the end-to-end encryption that firms like Meta plan to extend across their platforms to preserve the privacy of direct messaging. 

Russell is also worried that tech companies will try to “tangle” Ofcom in appeals and litigation and that the regulator will not be proactive enough. “[Ofcom] needs to look at the basic design of social networks and the fundamental decisions about how they function,” he says. This includes making its own independent decision on whether posts are “harmful” or not, and whether algorithms need to be altered to stop them funnelling harmful or life-threatening content to children.

“If this act doesn’t stop children and teenagers being pushed down lethal rabbit holes like Molly was, it will have failed,” he says. 

He acknowledges the importance of free speech, but says it must not overrule a right to life. 

“In a way those two fundamental human rights can be at odds with each other from time to time,” he says. “If you regulate to maintain one of them, you can be going against the other. So you have to find a compromise and I think this has settled on a compromise.”

The need for legislation is underlined by a bereaved mother who recently contacted him, having lost her daughter in February this year in near-identical circumstances to Molly. He believes that children are still being bombarded with self-harm and suicide content because of the failure of social media firms to adequately rein in their algorithms.

“She is going through a similar struggle [to the one] we went through. Her daughter also showed no obvious signs of any mental ill health. They’ve trawled through the content on her digital devices, and found material relating to depression, anxiety, methods of suicide,” says Ian.

“I was speaking to this mother yesterday and I could not believe I was talking to another poor, bereaved parent who was having to personally come to terms with those horrors and see with her eyes what her daughter had seen.”

Children have the right to enjoy a life online without being exposed to risk

Sir Peter Wanless, NSPCC Chief Executive


The Online Safety Bill will soon become an Act. Five years since legislation was first promised there will be ground-breaking legal protections for children navigating the online world in the UK. 

It is a momentous day for children that should be celebrated, not least because the landmark legislation has received cross-party approval and overwhelming public support.

It has been scrutinised by joint committees and overseen by seven secretaries of state. It is now on the statute book thanks to the tireless campaigning of online abuse survivors, bereaved parents, supporters, experts, and children and young people themselves. 

Through Childline and the NSPCC Young People’s Board for Change, we hear from children about the completely unacceptable levels of abuse and harm they face online every day. 

They tell us how they have had enough of being unable to engage with the positives of the online world without being put at risk because the sites and apps they use have not been designed with their basic safety in mind.

Children have every right to enjoy a life online without being exposed to preventable risk and to be protected from disgraceful abuse. 

I pay tribute to The Daily Telegraph’s Duty of Care Campaign. With determination and patience, it has been instrumental in constantly reminding Government why this legislation is crucial for future generations to thrive.

The need is clear and has been rising. Our own analysis found an 82 per cent increase in online grooming crimes through the period in which legislation has been discussed. Child abuse image crimes rose by two thirds.

The Bill was on hold amid Prime Ministerial changes when last September a coroner ruled that social media had contributed to the death of 14-year-old Molly Russell.

Molly’s father Ian has campaigned with other bereaved parents to make sure companies are accountable for the way their platforms irresponsibly fuel the proliferation of harmful material with devastating consequences for young lives.

Of course there is still detailed work to be done to define and implement with rigour the protections in the new Act. But a reset is required; there can be no doubt of that. 

And so, rather than companies responding in the wake of successive tragedies, they can now set a standard by proactively and positively adhering to new regulations. 

Rather than having their actions characterised as reluctant adherence to minimum standards, companies should champion the duty of care set out in the Online Safety Act. Now is the moment to embrace and promote the interests of their young users.

What a great opportunity for the tech industry to actively engage with young people as digital citizens by involving them in service design and product development. Children are often the real experts in what is happening online and deserve to help shape the solutions.

Young people make up one in three global internet users. Industry should be embracing this large part of their customer base, expanding their markets rather than threatening to cut a section of the population off from what they are offering. 

This week tech leaders heard from over 100 survivors of online abuse and child safety experts, urging them to put safety by design front and centre in their work. Our worldwide coalition highlighted how online child abuse is a global issue that warrants a global response. 

Here in the UK, I’d like to thank all the Parliamentarians who have worked so hard to ensure the Online Safety Act gives Ofcom enforcement powers necessary to change the thinking at the top of the world’s biggest companies, including criminal liability for senior managers. 

I hope we never get to see eye-watering fines or criminal charges being levelled because this would signal irresponsible leadership and continued harm to children when the UK has been clear this will no longer be tolerated.

However, I’m delighted to see tough financial and criminal sanctions in place. They are the just and necessary response to industrial levels of child sexual abuse and systemic, sometimes fatal, harm to children that is utterly unacceptable.

It is now over to companies to deliver the change that will see children feeling safe and empowered as a result of the Online Safety Act.

This means engaging meaningfully with young people. The NSPCC looks forward to continuing its efforts to champion and amplify children’s voices so the full benefits of online regulation are secured for young people, who have deserved so much better in their online lives. 

