Twitter on Tuesday announced a new feature that allows users to flag content that may contain misinformation, a scourge that only grew during the pandemic.
“We are testing a feature for you to report tweets that look misleading – as you see them,” the social network said via its safety account.
We’re testing a feature for you to report Tweets that look misleading – as you see them. Starting today, some people in the US, South Korea and Australia will find the option to flag a tweet as “misleading” after clicking on Report Tweet.
– Twitter Security (@TwitterSafety) August 17, 2021
As of Tuesday, some users in the US, South Korea and Australia can select “It’s misleading” after clicking “Report Tweet”.
Users can then be more specific, flagging the misleading tweet as potentially containing incorrect information about “health”, “politics” or “other” topics.
“We are evaluating whether this is an effective approach, so we are starting small,” said the San Francisco-based company.
“We may not take action on, and cannot respond to, every report in the experiment, but your input will help us identify trends so we can improve the speed and scale of our broader disinformation work.”
Twitter, like Facebook and YouTube, is regularly criticized for not doing enough to combat the spread of disinformation.
But the platform lacks the resources of its Silicon Valley neighbors and often relies on experimental techniques that are cheaper than recruiting armies of moderators.
These efforts have increased as Twitter tightened its disinformation rules during the COVID-19 pandemic and during the US presidential election between Donald Trump and Joe Biden.
For example, in March Twitter began blocking users who had been warned five times for spreading false information about vaccines.
And during the 2020 re-election campaign, the network began flagging Trump’s tweets with banners warning of their misleading content, before the then-president was ultimately banned from the site for posting incitements to violence and messages discrediting the election results.
Concern over COVID-19 vaccine misinformation became so widespread that in July Biden said Facebook and other platforms were responsible for “killing” people by allowing false information about vaccines to spread.
He later walked back the comments, clarifying that it is the false information itself that can harm or even kill those who believe it.