YouTube has removed 1 million videos for dangerous COVID-19 misinformation

Karissa Bell, Senior Editor at Engadget

YouTube has removed 1 million videos for dangerous COVID-19 misinformation since February 2020, according to YouTube’s Chief Product Officer Neal Mohan.

Mohan shared the statistic in a blog post outlining how the company approaches misinformation on its platform. “Misinformation has moved from the marginal to the mainstream,” he wrote. “No longer contained to the sealed-off worlds of Holocaust deniers or 9-11 truthers, it now stretches into every facet of society, sometimes tearing through communities with blistering speed.”

At the same time, the YouTube executive argued that “bad content” accounts for only a small percentage of YouTube content overall. “Bad content represents only a tiny percentage of the billions of videos on YouTube (about .16-.18% of total views turn out to be content that violates our policies),” Mohan wrote. He added that YouTube removes almost 10 million videos each quarter, “the majority of which don’t even reach 10 views.”

Facebook recently made a similar argument about content on its platform. The social network published a report last week claiming that its most popular posts are memes and other non-political content. And, faced with criticism over its handling of COVID-19 and vaccine misinformation, the company has argued that vaccine misinformation isn’t representative of the kind of content most users see.

Both Facebook and YouTube have come under particular scrutiny for their policies around health misinformation during the pandemic. Both platforms have well over a billion users, which means that even a small fraction of content can have a far-reaching impact. And both platforms have so far declined to disclose details about how vaccine and health misinformation spreads or how many users are encountering it. Mohan also said that removing misinformation is only one aspect of the company’s approach. YouTube is also working on “ratcheting up information from trusted sources and reducing the spread of videos with harmful misinformation.”

Editor’s note: This post originally appeared on Engadget.
