YouTube, a subsidiary of Alphabet Inc, announced on Friday that it will stop removing content that spreads false claims about the 2020 and other past U.S. presidential elections.
The change to YouTube’s election misinformation policy takes effect immediately.
“In the current environment, we find that while removing such content may curb some misinformation, it may also have the unintended effect of curbing political discourse,” YouTube said in a blog post.
The platform also said that its other policies against hate speech, harassment and incitement to violence still apply to all user content, including content about elections.
The spread of disinformation has raised questions about how social media platforms enforce their policies against misleading content about elections.
Other social media platforms, such as Twitter and Meta Platforms Inc’s Facebook, have also seen an increase in election-related misinformation.