EU urged to act after Meta child safety cover-up claims

Padraig Conlon 09 Sep 2025

Explosive allegations that Meta deleted or doctored internal child safety research have triggered urgent calls in Strasbourg for tougher EU rules to keep minors away from harmful content online.

According to a Washington Post investigation, Meta, the parent company of Instagram, Facebook and WhatsApp, allegedly removed or altered findings showing children being exposed to grooming, sexual harassment and violence on its virtual reality platforms.

Speaking from the European Parliament, Midlands North West MEP Nina Carberry, a member of the Parliament’s Intergroup on Children’s Rights, said:

“This is gravely worrying.

“The investigation appears to show that Meta knew that minors were seeing harmful and age-inappropriate content, yet manipulated the data to show this wasn’t the case.”

Carberry, who is working on a European Parliament report to strengthen protections for minors online, called on the European Commission to urgently implement EU-wide age verification tools to block children from harmful or inappropriate content.

She warned that current rules under the Digital Services Act are not keeping pace with fast-moving risks. Pointing to artificial intelligence models capable of generating synthetic child sexual abuse material, she said stronger EU action is essential to ensure children are properly protected in the digital world.
