Ofcom investigating Telegram over child sexual abuse material concerns

The UK communications regulator, Ofcom, has launched an investigation into Telegram over concerns that it may be failing to prevent the sharing of child sexual abuse material (CSAM). Ofcom stated on Tuesday that it was probing the popular messaging service after gathering evidence suggesting that CSAM was present and being shared on the platform.

Under current UK law, user-to-user services operating in the country must have systems in place to prevent individuals from encountering CSAM and other illegal content, as well as mechanisms to tackle such material. Failure to comply can result in substantial fines.

Telegram, in a statement, “categorically denies Ofcom’s accusations.” The company told the BBC that “Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations].” Telegram added, “We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy.”

This investigation is part of a wider crackdown by Ofcom on services it suspects could be flouting the UK’s comprehensive online safety requirements, which include strengthened rules for tech firms to combat CSAM, material that is illegal to possess or share in the UK.

Suzanne Cater, director of enforcement at Ofcom, emphasized the gravity of the issue, stating, “Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities.” She noted that while progress had been made in tackling CSAM on smaller services, the issue “extends to big platforms too.”

Children’s charity the NSPCC welcomed Ofcom’s probe into Telegram. Rani Govender, its associate head of policy, highlighted the scale of the problem: “Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day.” She added, “The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram.”

Ofcom initiated its probe into Telegram after being contacted by the Canadian Centre for Child Protection regarding the alleged presence and sharing of CSAM on the messaging app. The regulator also announced investigations into services Teen Chat and Chat Avenue over potential grooming risks identified through its collaboration with child protection agencies.

Cater warned, “Teen-focused chat services are too easily being used by predators to groom children.” She stressed that “These firms must do more to protect children, or face serious consequences under the Online Safety Act.”

The illegal content duties of the Online Safety Act, which came into effect in March 2025, mandate that user-to-user services like messaging apps and social networks demonstrate their efforts in tackling “priority illegal content.” This category includes CSAM, terrorism, grooming, and extreme pornography.

Ofcom has previously issued fines to providers accused of failing to comply with its duties regarding illegal content or age verification. In cases of non-compliance, the regulator has the authority to fine companies up to £18 million or 10% of their global revenues, whichever is higher. While some firms, such as the US message board 4chan, have resisted Ofcom’s threats, the regulator noted that one file-sharing service it contacted with concerns about its systems for dealing with illegal content had since made “material improvements” to comply with its duties.

