In a press release late last month, YouTube announced that they will begin reducing recommendations of borderline content on their platform. YouTube says their aim with this policy change, which only affects U.S. users for now, is to reduce the chance that users will see content that could misinform them in “harmful ways.”
Some of the harmful content YouTube listed as examples included “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
While the site says that this will only affect one percent of content on YouTube, with 1.9 billion monthly users, even a small change like this will alter the types of videos millions of people watch on YouTube every day.
Critics on the left
Criticism stemming from the Left side of the political spectrum often focuses on the role YouTube and other social media platforms play as distributors of content that sometimes contains hate speech.
It appears that with this change, YouTube may be partially acquiescing to the Change the Terms demands, released in October 2018, which seek to crack down on “hateful activities” online.
In their statement, YouTube admitted they will be using both “machine learning and real people” to determine what type of content is not fit to be recommended. These human evaluators will be working off of hate group reports from “well-established organizations” to determine if the content needs to be downplayed or even banned.
So what organizations are these? Well, YouTube never makes this very clear in their guidelines. When I reached out for clarification, none was provided. However, later in the guidelines, YouTube does provide an example of some of these potential organizations.
Using the white supremacist website Stormfront as an example of a “lowest quality page,” YouTube cites proof of this claim from respected organizations like the Anti-Defamation League and Pew Research, as well as newspaper articles from USA Today and Fox News.
While it is encouraging to see reputable organizations cited, the fact that newspaper articles can be used to discredit a channel’s content is concerning. That concern, coupled with the fact that YouTube has partnered in the past with a widely discredited organization like the Southern Poverty Law Center to determine acceptable content, is troubling.
Free speech issues
The lack of clarity over what counts as hateful or even borderline content is a major issue for a website that plays a crucial role in the dissemination of information online. These vague standards have drawn complaints not only from right-wing channels like PragerU and Steven Crowder but also from centrist or apolitical YouTubers like Philip DeFranco, PewDiePie, and Nerd City.
Even self-proclaimed Leftists like mathematician and economist Eric Weinstein have been criticizing YouTube for the lack of clarity surrounding their recommendation rules. In a Twitter thread Sunday, Weinstein called for YouTube to be “hyper-specific” about what content comes close to violating their terms of service.
This request from Weinstein gets to the heart of the matter of content restriction and free speech on YouTube. Without clear and specific standards of what content is acceptable and what content comes close to crossing “the line,” the confusion and calls for action from both sides will only continue to increase.
YouTube violating their own founding principles
If YouTube does not step up to the plate and spell out some clear standards, those voices on the outskirts of the conversation will be slowly edged out as the Overton window of acceptable discourse continues to shrink.
While restricting speech that directly leads to physical violence, something the site already does, is a good and necessary thing, shutting down unpopular or even unsavoury opinions on the edges of the public dialogue is not.
YouTube’s own mission statement says that it is their goal to “give everyone a voice and show them the world.” This is a laudable goal, and one that is achieved every day when people log on to YouTube to learn about and explore the myriad topics the platform offers.
By acting to restrict content which does not physically harm people, YouTube is going against its own mission statement and violating the four principles upon which the platform is based: freedom of expression, freedom of information, freedom of opportunity, and freedom to belong.
YouTube originally set out to do better than this. It’s time they returned to their founding principles and protected freedom of speech for everyone.