YouTube’s Adpocalypse

Ed. Note: This post by Arvind Pennathur is a part of the TLF Editorial Board Test 2018

YouTube may have started out as an innocuous platform for sharing funny cat videos, but in today’s world it is a giant in the field of entertainment: a place where people can earn a living by uploading original and creative content for the world to see. A business of this size depends on many moving parts fitting together, and throughout 2017 the ties between the company and its advertisers were continuously tested during what is now widely called the ‘YouTube Adpocalypse’.

In January of 2017, the biggest personality on the platform, Felix Kjellberg (aka ‘PewDiePie’), uploaded a video in which he showed two young men laughing and holding up a sign that said ‘Death to All Jews’. The video was immediately noticed by other creators and criticized as offensive and anti-Semitic. What separated this from YouTube controversies of the past, however, was the wide coverage the incident received from mainstream news outlets, likely due to his status as the most followed person on the platform. Disney promptly severed its commercial ties with Felix, and his original YouTube Red series was cancelled. This set into motion a chain of events that led several major advertisers to pull their spending from the platform by May of the same year, on the grounds that YouTube hosted a great deal of unsuitable and inappropriate content that companies did not want their brands associated with. This, of course, cost YouTube a significant amount of revenue.

In order to stem the losses, YouTube introduced a stricter monetization system, applying stronger filters to ensure that only advertiser-safe content could carry ads. Under this demonetization system, each video would be reviewed using its metadata (the title, tags, and description) to determine whether it was safe for advertisers. On the face of it, this seemed terrific, but there was an inherent flaw in the design.

Channels began getting hit with demonetization left, right and center, even though most of the affected content contained nothing remotely in violation of the community and advertising guidelines that YouTubers must abide by. Those guidelines ban sexually explicit content, hate speech, violent or graphic content, inappropriate language, and any videos that breach privacy in any form. However, they also ban ‘controversial issues and sensitive events’ without giving an exact definition of what these constitute. As a result, by several accounts, YouTubers who uploaded videos exploring sexuality, opening up about depression, or even discussing complex global events found those videos flagged for their seemingly ‘controversial’ content. This is a mistake on YouTube’s part: while it makes sense that issues of this nature shouldn’t be thrown around casually, imposing a blanket ban on even talking about them is problematic, because discussing them openly is precisely how people gain perspective and initiate discourse on such issues.

YouTube’s terms of service state that when you upload a video to the platform, you retain ownership of it, but you also grant the company the right to use your content however it wishes. This includes the right to review the video for questionable content. However, the implementation of combing through each and every video has not been effective, since the metadata review takes words out of context and strikes down videos on that basis.

Another issue with this system is the lack of communication between the company and the people affected by it. A creator can request a review of a demonetized video, and if the video is deemed advertiser-friendly, it is allowed to earn revenue again. However, aside from this and the occasional update to the guidelines, YouTube did not address the issues in any meaningful way. In fact, many of the complaints that were raised received only an automated, boilerplate reply. If the company is going to take away people’s means of earning money, there should be some transparency about the process, so that creators know what is acceptable and what isn’t. The guidelines are broad and cannot account for every single piece of content in a video.

In early 2018, the scale of the problem increased when controversial YouTuber Logan Paul uploaded a video showing the body of a man who had died by suicide in a Japanese forest. The video proceeded to climb the Trending page, eventually reaching number one. YouTube responded by taking strong action against Logan, but only 10 days after the actual incident. During those 10 days, it struck down other creators’ videos discussing the incident, yet, strangely, did not take the original video down. People accused the company of playing favourites and of pandering to advertisers who saw people like Logan as the face of the platform. While these accusations were mostly knee-jerk in nature, the episode does raise the question of how far the platform would go to protect its relationship with advertisers.

The ‘Adpocalypse’ is not likely to reach any definitive resolution anytime soon, so in order to prevent long-term damage, YouTube needs to take corrective measures that assure the creators who contribute to it that they are appreciated and that their content means something to the platform. Preventing them from talking about critical issues will not solve anything; it will only stifle discussion of issues that need to be talked about in order to progress as a society.