Hardly anyone today hasn’t spent hours browsing YouTube. Let’s face it: certain YouTube channels and their ‘creators’ are just too hard to ignore. But like all social media, the platform has its negative effects.
YouTube’s first brush with trouble came in 2006, when the Japanese Society for Rights of Authors, Composers, and Publishers successfully issued takedown requests for more than 30,000 pieces of content. Following suit, Comedy Central filed a copyright infringement complaint that forced its content off the platform.
According to Vice, it wasn’t long before YouTube, faced with the possibility of being blocked in even more locales, was forced to take a stronger stance on certain types of content. One of the company’s first controversial decisions came in 2007, when it suspended Egyptian user Wael Abbas. Abbas, an award-winning blogger and anti-torture activist, had used the platform to draw attention to police brutality in Egypt, uploading more than a hundred such clips. Although there were rumors that YouTube’s decision had come at the behest of the Egyptian government, the company later told Abbas that it had removed the videos after receiving numerous complaints from other users.
But the real question is: who is to blame for this lack of censorship, the creators or the platform that hosts them? The argument that creators should take responsibility for what they put out is compelling, but shouldn’t there also be a filter for content that doesn’t pass muster?
Type ‘how to kill a man’ into YouTube’s search bar and the first video that pops up is ‘how to kill a man with a newspaper’. The disclaimer says the creator ‘does not condone any violence’ and presents the video as a form of self-defence, but there’s no denying that such content can be severely misleading. And although YouTube requires your consent before you can view certain videos, plenty of others are accessible with no such gate at all.
Scroll down a little and you’ll find a video on ten poisons used to kill people. We certainly don’t need to make it easier for someone to end our lives, or anyone else’s for that matter. And since YouTube imposes no meaningful age restrictions on viewers, it’s hard to imagine what effect such videos might have on inquisitive toddlers or vulnerable teens. Videos promoting hatred and violence, especially when they are so easily accessible, will ultimately do more harm than good.
On the lighter side, makeup tutorials teach you how to apply makeup flawlessly, like a professional. But seeing moms upload videos titled ‘my five-year-old daughter applies makeup’ is a testament to the negative effects of even such light-hearted videos. Five-year-olds shouldn’t be allowed anywhere near makeup, and watching adults encourage such behaviour in the garb of valuable ‘content’ is disheartening.
There are also a number of pornographic videos floating around on YouTube. Says Susan (name changed), whose seven-year-old son stumbled upon such a video: “My son used to love watching cartoons on YouTube, so we would allow him 45 minutes of screen time every day. But one day he came up to us and asked the meaning of certain filthy and obscene words. We don’t even know how he reached such videos, but it certainly has a lot to do with the lack of any censorship on YouTube’s part. We don’t even have accounts on YouTube, so there’s no question of consent here.”
There’s no telling what could end up on YouTube, especially when it’s user-generated content produced for the sake of views. And when people have such unrestricted access to it, the conclusion is hard to avoid: quality content has taken a backseat while mass appeal has been given priority.