YouTube Took Down More Videos Than Ever Last Quarter As It Relied More On Non-human Moderators. YouTube is an American online video-sharing platform. According to Wikipedia, the platform was founded by Chad Hurley, Steve Chen, and Jawed Karim in 2005, and the site was bought by Google in November 2006.
The platform allows users to upload, view, share, rate, and report videos, add them to playlists, comment on them, and subscribe to other users. It hosts a wide variety of user-generated and corporate media content, including video clips, music videos, TV show clips, movie trailers, live streams, short and documentary films, audio recordings, and more.
Due to the COVID-19 pandemic, YouTube reduced its in-office staffing, which forced it to rely more on non-human moderators. As a result, the platform took down more videos than ever last quarter.
According to The Verge, YouTube removed more videos in the second quarter of 2020 than ever before, as the company relied more on its algorithms in place of most of its human content moderators. That is according to the Community Guidelines Enforcement report the company released on Tuesday, which shows that it took down more than 11.4 million videos between April and June. Over the same period last year, it removed fewer than 9 million videos.
The company wrote in a blog post: "When reckoning with greatly reduced human review capacity because of COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement. Because responsibility is our top priority, we chose the latter, using technology to help with some of the work normally done by reviewers."
What Google Told Employees
Google told employees in March that it was extending its work-from-home policy until the end of 2020 due to the pandemic. However, the company warned that these measures meant it would rely more on technology than on human reviewers, and that videos that would normally be fine on the platform might end up being removed in error. According to the company, its human moderators work from offices specifically set up for review; allowing such work to be done outside of a controlled environment would risk inadvertently exposing user data and sensitive videos.
Moreover, YouTube knew that taking down more videos that didn't violate its rules would also mean more appeals from content creators. So it added more staff to its appeals process to handle requests as fast as possible. The number of appeals against takedowns rose from 166,000 in the first quarter of the year to more than 325,000 in the second quarter. The company also reversed itself and reinstated more videos in the second quarter: more than 160,000, compared to just over 41,000 in the first quarter.
In its blog post, the company said that for sensitive policy areas such as child safety and violent extremism, it saw more than triple the usual number of removals during the second quarter of the year. But it judged the temporary inconvenience for creators to be worth the end result. It added, "We accepted a lower level of accuracy to ensure that we were removing as many pieces of violative content as possible."