Historically, YouTube has taken down videos containing offensive slurs, misinformation about Covid vaccines, and false claims related to elections, citing violations of its platform policies.
Since the recent change in the White House administration, YouTube has directed its content moderators to allow certain videos that might breach platform rules to remain online if they are deemed to serve the public interest. This includes content covering political, social, and cultural topics.
This previously undisclosed policy adjustment aligns YouTube with other social media platforms that have eased content policing amid political pressure to reduce moderation. Earlier this year, Meta discontinued a fact-checking program on its platforms, and Elon Musk’s social media site shifted moderation responsibilities largely to its users.
However, unlike these platforms, YouTube has not publicly announced its moderation changes. The updated guidelines were introduced in mid-December through internal training materials reviewed by Multinational Times.
Under the new rules, videos deemed to be in the public interest may remain online even if up to half of their running time violates platform policies, up from the previous threshold of a quarter. Such content includes footage of city council meetings, political rallies, and public debates. That is a departure from pandemic-era practice, when YouTube removed videos such as local government meetings and a discussion between Florida's governor and scientists over medical misinformation concerns.
These expanded exceptions could particularly benefit political commentators who produce lengthy videos mixing news, opinions, and various claims. The updated policy also helps YouTube deflect criticism from politicians and activists dissatisfied with the platform’s handling of content related to Covid-19 origins, the 2020 election, and topics involving Hunter Biden.