
TikTok fails to detect disinformation ahead of General Election 

An investigation carried out by Global Witness' Digital Threats team has found that TikTok approved ads containing disinformation about the upcoming General Election. 

According to the campaign group, TikTok approved ads containing harmful disinformation and voter suppression messages relating to the Irish general election which takes place on Friday. 

Ahead of the vote, the campaign group submitted 28 ads containing clear and obvious disinformation to the platform. The ads included messages in English and Irish. 

Results: 

After platform review, Global Witness deleted all the ads before they were published on the platform, so none went live.

The majority of the approved ads (eight) were in Irish, pointing to a clear blind spot in TikTok's moderation system. 

According to the latest transparency report TikTok submitted under the EU Digital Services Act, the platform has no dedicated Irish-language moderators. 

 Ava Lee, Campaign Lead – Digital Threats at Global Witness, said: 

“Social media platforms have a responsibility to keep elections safe.  

“As more and more people get their news from their social media feeds, the least platforms should do is ensure that this content is free from the most obvious forms of disinformation. 

 “Yet this test shows once again that platforms are prioritising profit over safety and proper risk mitigation measures.  

“The layoffs implemented by TikTok’s leadership speak for themselves.” 

The failings of TikTok are particularly noteworthy as the platform bans all forms of political advertising, not only election disinformation. 

According to Global Witness, the disinformation included in the ads should have been immediately clear to any human reviewer. 

Edel McGinley, Executive Director at Hope and Courage Collective, said:

"Election integrity is fundamental to democracy.

“The power to monitor, investigate and combat the dissemination of disinformation and misinformation lies with the Electoral Commission.

“However, the relevant parts of legislation have not been enacted and therefore there is no regulatory oversight of this right now. 

"Should these adverts have gotten through, we know from experience that algorithms could have amplified this disinformation.

“Enacting the legislation and holding social media bosses to account for these failures must be of the highest priority for those seeking election this Friday and for the next government.” 

This investigation builds on Global Witness' previous work in Ireland around the last EU parliamentary elections. 

In May 2024, the group investigated TikTok's and other platforms' abilities to detect similar election disinformation messages.

During that test, TikTok approved 100% of the submitted ads, prompting Global Witness to file a complaint asking the EU regulator to investigate potential breaches of the Digital Services Act. 

In response to the previous investigation, TikTok said they “instituted new practices for moderating ads that may be political in nature to help prevent this type of error from happening in the future”.

In a statement, Global Witness said:

“While TikTok improved in this latest investigation, our findings continue to call into question the platform’s ability to consistently protect users from blatant disinformation. 

"We approached TikTok for comment and a TikTok spokesperson confirmed that all of the ads we submitted violated their advertising policies.

“TikTok conducted an investigation into why some of the ads were not rejected.

“The spokesperson highlighted that ads may go through additional stages of review as certain conditions are met, such as reaching certain impression thresholds or being reported by users once the ad has gone live.”

Additionally, the spokesperson stated that the platform is "focused on keeping people safe and working to ensure that TikTok is not used to spread harmful misinformation that reduces the integrity of civic processes or institutions" and said that they do this by, among other things, "enforcing robust policies to prevent the spread of harmful misinformation". 

This investigation, however, suggests that the platform does not enforce those policies well enough. 
