Patreon's policy of policing speech leaves people poor at Christmas

Many creators lost thousands of dollars and face an uncertain future just in time for Christmas.

Joseph Fang, Toronto, Ontario

On Christmas Eve, the New York Times finally covered what’s going on at Patreon. The paper of record weighed in for the first time on the #PatreonPurge that effectively began with the deplatforming of YouTuber Carl Benjamin (Sargon of Akkad) and escalated when Sam Harris walked away last week.

The article by Nellie Bowles features comments from Jordan Peterson, who is building a free-speech Patreon alternative along with Dave Rubin; Jack Conte, CEO of Patreon; Jacqueline Hart, head of Patreon’s Trust and Safety team; and other players in the growing scandal.

Bowles seeks comment from both sides and produces what is probably the most useful piece of reporting on the Patreon debacle thus far. But this piece also reveals that what Sam Harris said is true: “These recent expulsions seem more readily explained by political bias.”

With regard to the specifics of the Carl Benjamin/Sargon of Akkad situation, Hart says, “You cannot say those words on our platform. It doesn’t matter who you’re directing them at.”

The thing is, Benjamin did not say those words on their platform. He did not violate their terms of service at all, and Hart unwittingly confirms this in the New York Times of all places.

It has been noted and confirmed by various sources that he was booted from Patreon for off-platform activity—politically incorrect language on a relatively small YouTube channel years ago.

Hart explains that her team spends its time watching for and investigating complaints about Patreon creators’ behavior, not just on Patreon but all over the internet. Policing platforms other than Patreon is new, and it directly contradicts what Jack Conte said when he appeared on The Rubin Report last year and assured creators that Patreon would judge only content posted on its own platform.

Now, it appears that if the Trust and Safety team deems a complaint from anywhere on the internet serious enough, it will reach out to the creator and ask them to mend their ways and apologize publicly.

It’s a kind of low-key, digital struggle session, carried out for the sake of the “safety” of the online community.

It’s not only Orwellian; it’s also arbitrary. Journalist Nick Monroe recently pointed out that the #11 creator on Patreon, a podcast called “Cum Town,” regularly uses racist and sexist language in content it posts directly to Patreon.

This is one of many examples of language that actually violates Patreon’s own terms of service. A simple keyword search for the offending language Benjamin used yields pages upon pages of creators using the exact same words. Nothing has been done in any of these cases.

Jack Conte is quoted as saying, “You can use a press debacle like this to drum up your community and rile people up and get them to support a cause.” He seems to be suggesting that there is a conspiratorial element to all of this, and that those who have been affected by Patreon’s arbitrary policing of speech are somehow benefitting.

The truth is that no one is benefitting. The hardest hit are the small creators who have seen massive losses in support and do not have a large enough pulpit to announce alternate plans to regain the support they have lost.

Perhaps the most revealing part of this New York Times piece is the following:

Patreon takes a highly personal approach to policing speech. While Google and Facebook use algorithms as a first line of defense for questionable content, Patreon has human moderators. They give warnings and reach out to talk to offenders, presenting options for “education” and “reform.” Some activists hope this will become a model for a better and kinder internet.

“A highly personal approach to policing speech” is an interesting phrase. It is good that Patreon takes a human rather than an automated approach to enforcing its terms of service. But if enforcement is personal, it is also necessarily subjective. And if your Trust and Safety team has zero ideological diversity, you get automation of another kind: decisions and punishments driven entirely by political bias.

As for a “better and kinder internet,” it's not looking great.

Patreon’s new pastime of arbitrarily policing people’s speech has led to many creators losing thousands of dollars and facing an uncertain future just in time for Christmas.
