Tech platforms could be required to stop illegal content from going viral and to limit the ability for people to send virtual gifts to, or record, a child's livestream, under additional online safety measures proposed by Ofcom.
The UK regulator published a consultation on Monday seeking views on further protections to keep citizens, particularly children, safer online.
These could also include making some larger platforms assess whether they need to proactively detect terrorist material.
Oliver Griffiths, online safety group director at Ofcom, said its proposed measures seek to build on existing UK online safety rules while keeping up with "constantly evolving" risks.
"We're holding platforms to account and launching swift enforcement action where we have concerns," he said.
"But technology and harms are constantly evolving, and we're always looking at how we can make life safer online."
The consultation highlighted three main areas in which Ofcom thinks more could be done:
- stopping illegal content going viral
- tackling harms at source
- giving further protections to children
The BBC has approached TikTok, livestreaming platform Twitch and Meta – which owns Instagram, Facebook and Threads – for comment.
Ofcom's range of proposals targets a number of issues – from intimate image abuse to the danger of people witnessing physical harm on livestreams – and they vary in what type or size of platform they would apply to.
For example, proposals that providers have a mechanism to let users report a livestream if its content "depicts the risk of imminent physical harm" would apply to all user-to-user sites that allow a single user to livestream to many, where there may be a risk of showing illegal activity.
Meanwhile, potential requirements for platforms to use proactive technology to detect content deemed harmful to children would only apply to the largest tech firms that present higher risks of relevant harms.
The proposals put forward by Ofcom look to expand upon the measures already in place to try to improve online safety.
Some platforms have already taken steps to clamp down on features that experts have warned could expose children to grooming, such as through livestreaming.
In 2022, TikTok raised its minimum age for going live on the platform from 16 to 18 – shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps with children begging for donations.
YouTube recently said it would raise its threshold for users to livestream to 16, from 22 July.
But some groups say the regulator's potential new requirements highlight core issues with the Online Safety Act – the UK's sweeping rules that Ofcom is tasked with enforcing.
"Further measures are always welcome but they will not address the systemic weaknesses in the Online Safety Act," said Ian Russell, chair of the Molly Rose Foundation – an organisation set up in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing hundreds of images promoting suicide and self-harm.
"As long as the focus is on sticking plasters, not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats," Mr Russell said.
He added that Ofcom showed a "lack of ambition" in its approach to regulation.
"It's time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on by fully compelling companies to identify and fix all the risks posed by their platforms."
Leanda Barrington-Leach, executive director of children's rights charity 5Rights, said the regulator should require companies to "think more holistically" about safeguards for children, rather than mandate "incremental changes".
"Children's safety must be embedded into tech companies' design of features and functionalities from the outset," she said.
But the NSPCC's Rani Govender said Ofcom's move to require more safeguards for livestreaming "could make a real difference to protecting children in these high-risk areas".
The consultation is open until 20 October 2025, and Ofcom hopes to receive feedback from service providers, civil society, law enforcement and members of the public.
Additional reporting by Chris Vallance