YouTube Machine Learning Software Axed Millions Of Unsavory Videos Before Anyone Saw Them
YouTube announced that last quarter its software helped pull down millions of objectionable videos before any users were able to view them. The machine learning software behind this effort was trained with the assistance of human moderators and has made policing the crush of new videos uploaded daily easier.
YouTube has offered up hard numbers, noting that it pulled down 8 million videos in the last quarter, most of them spam or adult content. Of that 8 million, 6.7 million were first flagged by computers. YouTube has been targeting violent extremism and says that at the start of 2017, only 8% of violent extremist videos were taken down before they had received ten views. YouTube says that now more than half of all violent extremist videos are pulled with fewer than ten views.
YouTube plans to have 10,000 people working on pulling objectionable content from its network this year. Given YouTube's massive scale, it has to rely on software and computer systems to remove objectionable content. The catch for the video streaming platform is that it does no moderation before videos go live, meaning that anything can be uploaded and remain up until a human moderator gets to it.
The challenge for YouTube is that if objectionable content stays up for even a short period of time, it has the chance to offend a viewer or advertiser. The other side of the challenge is users who make content that many don't see as objectionable at all. Gun-related channels have been hit particularly hard by YouTube's policies, with many creators barred from making money on the platform.
These channels were previously monetized and have seen their ability to host ads pulled. Demonetization of this sort is driven in part by advertisers not wanting their brands associated with that type of video content. Some video makers see this as a way for YouTube to censor content: by making previously profitable and sustainable channels unable to earn money, it can potentially force creators to stop making content or leave the platform. The real problem is that objectionable content is often in the eye of the beholder. YouTube pulls all sorts of objectionable content beyond the extremist videos; for instance, the network pulled all Tide Pod challenge videos back in January.