- YouTube videos with false information gather more shares on social media than the videos of five leading news broadcasters combined.
- Facebook is the most popular platform for sharing YouTube misinformation videos, ahead of Twitter and Reddit.
- Analysis shows the failure of Facebook’s content moderation policies, with third-party fact-checks catching less than 1% of COVID-19 misinformation videos.
- Google’s fact-checkers for YouTube eventually deleted most of the videos – but not before they had slipped through Facebook’s net and been shared thousands of times.
A new study by the Oxford Internet Institute and the Reuters Institute for the Study of Journalism reveals that coronavirus-related misinformation videos are spread primarily through social media, and that Facebook is the main channel for sharing them because it lacks sufficient fact-checking to moderate content.
Google’s fact-checking is evidently much more rigorous than Facebook’s. The company has improved its algorithm for filtering out fake news, so people searching on YouTube itself for information about the coronavirus will find accurate, valid sources. Meanwhile, Facebook is rife with fake news, and it is the videos and articles containing misinformation that get more exposure.
The Oxford study examined over a million YouTube videos about COVID-19 that circulated on social media, identifying those that YouTube eventually removed because they contained false information.
However, the study found that these misinformation videos do not find their audience through YouTube itself, but largely by being shared on Facebook; data analysis revealed that YouTube videos containing coronavirus-related misinformation were shared nearly 20 million times on Facebook between October 2019 and June 2020. They had a higher reach on social media than the five largest English-language news sources on YouTube (CNN, ABC News, BBC, Fox News and Al Jazeera) combined, whose videos were shared 15 million times.
Facebook also generated a higher number of reactions than other social media platforms. Misinformation videos shared on Facebook generated a total of around 11,000 reactions (likes, comments or shares), before being deleted by YouTube. In comparison, videos posted on Twitter were retweeted on average around 63 times.
The Oxford researchers also found that out of the 8,105 misinformation videos shared on Facebook between October 2019 and June 2020, only 55 videos had warning labels attached to them by the company’s third-party fact checkers, which is less than 1% of all misinformation videos. This failure of fact-checking helped COVID-related misinformation videos spread on Facebook and find a large audience.
Just 250 Facebook groups are responsible for half of the visibility that misinformation videos acquire among public social media accounts. The most popular misinformation videos often include individuals who claim medical expertise making unscientific claims about treatments for and protection from the coronavirus disease.
Dr Aleksi Knuutila, Postdoctoral Researcher at the Oxford Internet Institute, said:
“People searching for Covid-related information on YouTube will see credible sources, because the company has improved its algorithm. The problem, however, is that misinformation videos will spread by going viral on other platforms, above all Facebook. The study shows that misinformation videos posted on YouTube found a massive audience, and it is likely to have changed people’s attitudes and behaviour for the worse. For the sake of public health, platform companies need to ensure the information people receive is accurate and trustworthy.”