The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) from being shared.

Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.

Under the UK's Online Safety Act, user-to-user services operating in the UK must have systems in place to prevent users from encountering CSAM and other illegal content, as well as mechanisms to tackle it, or they risk large fines for breaches.

Telegram categorically denied Ofcom's accusations, saying it has virtually eliminated the public spread of CSAM on its platform through advanced detection algorithms and collaboration with NGOs.

The company expressed surprise at the investigation, suggesting it might be part of a broader attack on online platforms that advocate for freedom of speech and the right to privacy.

This case is part of a wider crackdown by Ofcom on services suspected of flouting the UK's stringent online safety requirements. Enforcement of these regulations, particularly against CSAM, is a critical priority for the regulator.

Rani Govender, associate head of policy at the children's charity NSPCC, highlighted the urgency of tackling child sexual exploitation and abuse, noting that police record around 100 child sexual abuse image offences every day.

Ofcom's investigation into Telegram was prompted by alerts from the Canadian Centre for Child Protection, and it follows ongoing scrutiny of other online services, including Teen Chat and Chat Avenue, over potential child-grooming risks. Ofcom has already fined several providers for non-compliance with its illegal content duties, and companies that fail to meet these requirements face significant penalties.