Telegram faces probe after reports of illegal content and grooming risks
Ofcom launched a formal investigation into Telegram following evidence of illegal material
Britain’s communications regulator, Ofcom, has launched a formal investigation into the messaging app Telegram over concerns regarding child safety and the sharing of illegal material.
The probe was initiated on Tuesday following evidence provided by the Canadian Centre for Child Protection, which suggested that child sexual abuse material (CSAM) was present on the platform.
Ofcom stated that the investigation will examine whether the Dubai-based company has failed to comply with its legal duties under the UK’s 2023 Online Safety Act, which mandates tougher standards for digital platforms.
Telegram has categorically denied the allegations, asserting that it has "virtually eliminated" the public spread of harmful content through sophisticated detection algorithms since 2018.
In an official statement, the company expressed surprise at the probe, suggesting it might be part of a "broader attack on online platforms that defend freedom of speech and the right to privacy."
Despite these claims, Ofcom says it is not satisfied that adequate protections are in place for British children.
The investigation into Telegram coincides with broader regulatory action, as Ofcom also opened probes into Teen Chat and Chat Avenue to investigate the risk of grooming by predators.
Prime Minister Keir Starmer has signalled that the government may go further than current legislation, having recently met with tech executives to discuss a potential social media ban for children under sixteen.
"These firms must do more to protect children, or face serious consequences," warned Suzanne Cater, Ofcom’s Director of Enforcement.
The investigation follows a fine issued to Telegram by Australian regulators in February for failing to address questions regarding extremist and abusive content.