Election 2020: YouTube announces ban on misleading election-related content

YouTube on Monday said it plans to remove misleading election-related content from its site. (Olly Curtis/Future via Getty Images)


YouTube announced Monday in a blog post it plans to remove misleading election-related content that can cause "serious risk of egregious harm."

The Google-owned site rolled out its new policy on the day of the Iowa caucuses, The New York Times reported. YouTube's new policy is targeted toward misleading and false viral posts, the newspaper reported.

"Over the last few years, we've increased our efforts to make YouTube a more reliable source for news and information, as well as an open platform for healthy political discourse," Leslie Miller, the vice president of government affairs and public policy at YouTube, said in the blog post. She added that YouTube would be enforcing its policies "without regard to a video's political viewpoint."

That will be a daunting task. More than 500 hours of video are uploaded to YouTube every minute, the Times reported.

In its blog post, YouTube said it would ban videos that give viewers the wrong election date, spread false information about participating in the 2020 census, or spread lies about a political candidate's citizenship status or eligibility for public office.

The company also said it would ban videos that attempt to impersonate another person or channel, misrepresent that person's country of origin or hide their association with a government actor. Users also will be banned for artificially inflating the number of views, comments, likes or other metrics through automated systems or by serving videos to unsuspecting viewers.

Ivy Choi, a YouTube spokeswoman, told the Times a video's context and content would determine whether it was taken down or allowed to remain on the site.

Choi added that YouTube would target videos that were “technically manipulated or doctored in a way that misleads users beyond clips taken out of context,” the newspaper reported.

Renée DiResta, the technical research manager for the Stanford Internet Observatory, told the Times that YouTube's new policy was trying to address "what it perceives to be a newer form of harm."

“The downside here, and where missing context is different than a TV spot with the same video, is that social channels present information to people most likely to believe them,” DiResta said.
