YouTube will counter COVID-19 misinformation with new “fact checking information panels” that’ll try to set the record straight on the virus.
The panels will appear on YouTube searches for specific coronavirus topics. You’ll see an information box from a third-party fact-checking group, which will tell you whether the coronavirus claim is true, partially true, or false alongside a link explaining why.
“During fast-moving news cycles, these panels will highlight fact-checked articles above search results, so viewers can make their own informed decision about claims made in the news,” YouTube said.
An example of a fact-checking information panel.
The Google-owned service already displays fact-checking information panels for longstanding conspiracy theories, like videos that claim the Earth is flat. Users who search for these topics will see an information bar that links to Wikipedia or the Encyclopedia Britannica.
However, last year YouTube also began testing the fact-checking information panels in Brazil and India on videos about current news topics. Now the company is expanding the feature to the US market at a time when the whole tech industry is trying to address the misinformation and scams swirling around the ongoing pandemic.
“We’re now using these panels to help address an additional challenge: Misinformation that comes up quickly as part of a fast-moving news cycle, where unfounded claims and uncertainty about facts are common. (For example, a false report that COVID-19 is a bio-weapon),” YouTube wrote in a blog post.
The fact checking will come from more than a dozen verified US publishers, including The Dispatch, FactCheck.org, PolitiFact, and The Washington Post Fact Checker. However, an information panel isn’t guaranteed to appear for every news topic. It’ll depend on what you search for and if YouTube has a relevant fact-checked article available on the topic. “As always, it will take some time for our systems to fully ramp up,” YouTube added. “Our systems will become more accurate, and over time, we’ll roll this feature out to more countries.”
The bigger issue is whether people will pay attention to the information panels; in 2017, Facebook said its warning labels on potentially fake news weren’t very effective. YouTube is, however, also taking down videos that promote more serious forms of COVID-19 misinformation, including content pushing supposed miracle cures that are actually dangerous. In addition, YouTube has been removing videos that claim 5G is connected to the coronavirus, a conspiracy theory that has prompted people to vandalize cellular towers in the UK and Europe.