YouTube changes policy to allow false claims about elections

The YouTube app is displayed on an iPad on March 20, 2018 in Baltimore.

YouTube will stop removing content that falsely claims the 2020 or other previous US presidential elections were affected by “widespread fraud, errors or glitches,” the platform announced Friday.

The change is a reversal for the Google-owned video service, which said a month after the 2020 election that it would begin removing new posts that falsely claimed widespread voter fraud or errors changed the outcome.

What you need to know

YouTube says it will stop removing content that falsely claims the 2020 election or other previous US presidential elections were affected by widespread fraud, errors or glitches.

The Google-owned video service said in a blog post on Friday that it wanted to avoid the unintended effect of restricting political speech without significantly reducing the risk of violence or other real-world harm.

The updated policy will not prevent YouTube from removing material that attempts to mislead voters in future elections, including the upcoming 2024 presidential election.

The change comes as YouTube and other major social media companies have come under fire for not doing more to combat election disinformation.

YouTube said in a blog post that the updated policy was an attempt to protect the ability to “openly debate political ideas, even those that are controversial or based on disproven assumptions.”

“In today’s environment, we find that while removing such content curbs some misinformation, it could also have the unintended effect of curtailing political discourse without significantly reducing the risk of violence or other real-world harm,” the blog post said.

The updated policy, which will take effect immediately, will not prevent YouTube from removing content that attempts to mislead voters in the upcoming 2024 election or other future races in the US and abroad. The company said its other existing rules against election disinformation remain unchanged.

That could prove difficult to enforce, said John Wihbey, an associate professor at Northeastern University who studies social media and disinformation.

“It doesn’t take a genius, if you’re on the side of the ‘they wronged us in 2020’ misinformation, to say, ‘wait a minute, we’re saying voting in general isn’t worth it.’ And 2020 is our example,” he said. “I don’t know how you can disentangle the rhetoric that’s about both past wrongs and future possibilities. The content moderation teams that will try to do that will be tied in knots trying to figure out exactly where that line is.”

The announcement comes after YouTube and other major social media companies, including Twitter and Meta-owned Facebook and Instagram, have been under fire in recent years for not doing more to combat the flood of election disinformation and misinformation spread on their platforms.

Left-wing media watchdog group Media Matters said the policy change comes as no surprise, as YouTube was one of the “last major social media platforms” to keep such a policy in place.

“YouTube and the other platforms that preceded it in weakening their election disinformation policies, such as Facebook, have made it clear that an attempted insurrection was not enough. They’re setting the stage for an encore,” Media Matters vice president Julie Millican said in a statement.
