TikTok “must fix toxic algorithms”, says Andrews

Mike Finnerty 22 Apr 2024

Dublin MEP Barry Andrews has warned TikTok that it could face billions of euro in fines unless it overhauls the algorithms that target children with harmful content.

In February, the EU Digital Services Act came fully into force, legally compelling social media platforms to safeguard the privacy, security and well-being of young users.

Andrews said that in the context of Irish law, Coimisiún na Meán now has enforcement powers over TikTok, Twitter and Meta.

Should companies be found in breach of the rules, they could be fined up to 6% of their global turnover.

The fines will apply where a social media platform is found not to be doing enough to monitor content accessible to children.

“The recent RTÉ Prime Time investigation shone a light on the poisonous content accessible to children on TikTok. It is alarming to see how quickly and easily children could access videos discussing self-harm techniques and suicide. This social media platform is widely used by children from the moment they get their first phone.”

Andrews noted that, statistically, the age bracket with the highest incidence of self-harm is teenagers aged 15 to 19, while self-harm rates among those aged 10 to 14 have risen in recent years.

“Parents need to know what actions can be taken to prevent such poisonous content from reaching young people at such a vulnerable and influential time of their lives,” he said.

“These tech giants have been allowed to wield unparalleled control, utilising algorithms crafted to lure young users with tailored content and then retain their attention.”

“The EU has now provided us with a weapon to tackle this and it needs to be enforced to the fullest extent. Under the new regulations, online platforms must safeguard the privacy, security, and overall well-being of younger users. This includes implementing default privacy and security settings tailored for their protection. It means tech companies can now be held accountable for monitoring the content they make accessible to children. Failure to comply will result in fines of up to 6% of their global turnover – potentially billions of euro.

“Too often, young people fall victim to toxic advertisements disguised as organic content on these platforms, exposing them to harmful material with profound consequences. However, this scenario has changed with the implementation of the DSA, which prohibits targeted advertising to minors on online platforms.

“This is just the beginning in addressing the mental health crisis fuelled by social media giants. This legislation now needs to be robustly enforced while we continue to work towards protecting our children online.”
