
Threads is getting its own fact-checkers to combat misinformation

The Threads logo displayed on a smartphone, with Threads visible in the background.

Meta plans to add direct fact-checking to Threads, with the aim of addressing misinformation on the app itself rather than by referencing rulings made on its other platforms.

Though the owner of Facebook and Instagram uses third-party fact-checking teams to debunk misinformation and disinformation on those platforms (whether it’s wholly successful is another matter), Meta’s answer to Twitter/X doesn’t have its own standalone fact-checking team.

“Early next year, our third-party fact-checking partners will be able to review and rate false content on Threads,” Meta outlined in an update. “Currently, when a fact-checker rates a piece of content as false on Facebook or Instagram, we extend that fact-check rating to near-identical content on Threads, but fact-checkers cannot rate Threads content on its own.”

Instagram head Adam Mosseri also shared a post about the program, though he didn’t give much detail beyond the fact that it’s coming “next year”.

“We currently match fact-check ratings from Facebook or Instagram to Threads, but our goal is for fact-checking partners to have the ability to review and rate misinformation on the app,” Mosseri wrote. “More to come soon.”

Meta has long faced criticism for allowing misinformation (as well as hate speech) to run rampant on its platforms, especially related to COVID-19 and during the 2016 and 2020 U.S. presidential elections. Meta has revealed its plans for political advertising during the 2024 global elections, though, as Mashable’s Meera Navlakha writes, “Political advertising on Meta’s platforms has and continues to be a contentious matter. During previous elections, accusations of rampant misinformation — and a clear failure to block said misinformation — have tainted Meta’s self-declared reputation of prioritising the protection of elections online.”

In 2021, Facebook started flagging Pages that repeatedly spread fake news, and officially banned all Instagram accounts, Facebook Pages, and Groups related to the QAnon conspiracy theory. After Russia’s invasion of Ukraine, Meta set up a Special Operations Center to combat misinformation and remove hate speech and content inciting violence. However, these very types of content have continued to surge on Facebook and Instagram during the ongoing Israel-Hamas war — with the EU even stepping in to investigate. Meta has taken action in some of these cases.

A key factor here is Threads’ connection to news. Though Threads is working toward surfacing trending topics more intuitively, Meta doesn’t really push the platform as a news and current affairs-forward space, with Mosseri writing in July, “Politics and hard news are inevitably going to show up on Threads – they have on Instagram as well to some extent – but we’re not going to do anything to encourage those verticals.”

Notably, certain words have been blocked from Threads’ search, with The Washington Post reporting that terms like “coronavirus,” “vaccines,” “vaccination,” “sex,” “porn,” “nude,” and “gore” are intentionally blocked. Threads still doesn’t have its own community guidelines; instead, the company says Threads is “specifically part of Instagram, so the Instagram Terms of Use and the Instagram Community Guidelines” apply to Threads too.

However, Threads already has a hate speech problem, as Mashable’s Chase DiBenedetto reported in July, citing warnings from civil rights groups. At the time, a Meta spokesperson told Mashable and Media Matters for America in a statement, “Our industry leading integrity enforcement tools and human review are wired into Threads. Like all of our apps, hate speech policies apply. Additionally, we match misinformation ratings from independent fact checkers to content across our other apps, including Threads. We are considering additional ways to address misinformation in future updates.”

This seems to be that update, set to roll out “early next year”.
