Press "Enter" to skip to content

A New Lawsuit May Force YouTube To Own Up To The Mental Health Consequences Of Content Moderation

According to The Verge, Facebook agreed to pay out $52 million to moderators suffering from PTSD and other conditions, and now YouTube is being asked to do the same.

For big technology platforms, one of the more urgent questions to arise during the pandemic’s early months was how the forced closure of offices would change their approach to content moderation.

Facebook, Twitter, and YouTube all rely on huge numbers of third-party contract workers to police their networks, and traditionally those workers have worked side by side in big offices. So when the companies shuttered their offices, they closed down most of their content moderation facilities as well.

The companies kept paying their moderators, even those who could no longer work because their jobs required the use of secure facilities. But with an election on the horizon and usage of social networks surging, there has never been a greater need for moderation. So Silicon Valley largely shifted moderation duties to automated systems.

As James Vincent wrote in The Verge, the question was whether this would work, and this week we began to get some details. Both YouTube and Facebook had warned that automated systems would make more mistakes than humans.

The FT reports that around 11 million videos were removed from YouTube between April and June, about twice the usual rate. Roughly 320,000 of these takedowns were appealed, and half of the appealed videos were reinstated. The FT added that this is roughly double the usual figure, a sign that the AI systems were over-zealous in their attempts to spot harmful content.

Neal Mohan, YouTube’s chief product officer, told the FT: “One of the decisions we made at the beginning of the pandemic, when it came to machines that couldn’t be as precise as humans, was that we were going to err on the side of making sure that our users were protected, even though that might result in a slightly higher number of videos coming down.”

But it turns out that the automated systems didn’t take down a slightly higher number of videos; they took down twice the number. This is worth thinking about for all of us, but especially for those who complain that tech companies censor too much content. For many reasons, some of which I’ll get to in a minute, companies like YouTube are under increasing pressure both to remove more bad posts and to do so automatically. Surely the systems will improve over time, but the past few months have shown us the limits of this approach. They have also shown that when you pressure technology companies to remove more harmful posts, for good reasons, the trade-off is an uptick in censorship.

Yet we almost never talk about these two pressures in tandem, even though doing so is essential for crafting solutions we can all live with.

He continues: there is another, more urgent trade-off in content moderation, between the use of automated systems that are error-prone but invisible and the use of human beings who are more skilled but vulnerable to the effects of the job. I traveled to Austin and Washington, DC last year to profile current and former moderators for YouTube and Google. I spent most of my time with people who worked on YouTube’s terror queue, the ones who examine videos of violent extremism each day so it can be removed from the company’s services. That reporting was part of a year-long series about content moderators that attempted to document the long-term consequences of doing this work. And at YouTube, as at Facebook, many of the moderators I spoke to suffer from post-traumatic stress disorder.
