YouTube’s Child Exploitation Video Problem, and Finding Answers to Fix It
The last few months have been pretty rocky for YouTube. A string of upsetting and exploitative videos has surfaced, either featuring children in revealing clothing and creepy situations, or aimed at children while being wildly insensitive.
According to media reports, some videos show children restrained with ropes or tape, sometimes crying in visible distress, while others traffic in gross-out themes: children covered in fake blood, children seemingly eating feces, bugs attacking children, and scantily clad young girls talking and singing in front of webcams.
There are also videos ranging from “Peppa Pig Crying at the Dentist Doctor Pull Teeth!”, in which the character is essentially tortured at the dentist, to “BURIED ALIVE Outdoor Playground Finger Family Song Nursery Rhymes Animation Education Learning Video”. Most of these videos had racked up millions of views, and their channels had millions of subscribers, before YouTube took them down.
These disturbing videos have trickled into the YouTube Kids app as well, which is supposed to contain only child-friendly content. Slipping past YouTube’s filters, whether by accident or because creators exploit loopholes in its algorithms, videos featuring well-known children’s characters in vile or violent situations, sometimes set to nursery rhymes or disguised as harmless everyday scenarios, have made their way onto children’s screens. Worse, the comments below them are often creepy and predatory, rounding out the unsettling experience.
YouTube’s Response
Back in November, when the first reports appeared in the media, Johanna Wright, Vice President of Product Management at YouTube, published a blog post titled ‘5 ways we’re toughening our approach to protect families on YouTube and YouTube Kids’, highlighting YouTube’s efforts to remove such videos from the platform. In it, Wright describes the channels and videos purged under the community guidelines, the ads removed from inappropriate videos, and the blocking of inappropriate comments on videos featuring minors.
Content Creation Guide
The response addresses some points but misses quite a few as well. The post says that YouTube wants to help creators produce quality content for the app, and that it will release a comprehensive guide on creating family-friendly content.
But creators already know what content is suitable for kids. The videos that end up on YouTube Kids are not there because well-intentioned creators somehow messed up (except perhaps in a few cases). These inappropriate videos are made deliberately, with children as their target audience. They are intentionally masqueraded as children’s favorite shows, exploiting the filters in place with tags so that they appear in kids’ streams. They are also made to appeal to people who get a kick out of watching children, or children’s characters, in distressing scenarios.
The YouTube Algorithm
Then there is the autoplay feature, which starts playing a similar recommended video within a few seconds of the current one finishing. When a child watches one upsetting video, the algorithm-driven recommendations make it likely they will see more and more of the same. Yes, autoplay can be turned off, but the recommended-videos list still leads viewers down a rabbit hole of content similar to what they just watched.
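To see why one bad video begets more, consider a minimal sketch of a recommender that ranks videos by how many metadata tags they share with the one just watched. This is not YouTube’s actual algorithm, which is far more sophisticated; the videos, tags, and scoring here are all hypothetical, purely to illustrate the feedback loop.

```python
# A minimal sketch of a naive tag-overlap recommender, illustrating the
# rabbit-hole effect. NOT YouTube's actual algorithm; all data is hypothetical.

def recommend(watched, catalog, top_n=3):
    """Rank catalog videos by how many tags they share with the watched one."""
    watched_tags = set(watched["tags"])
    scored = [
        (len(watched_tags & set(v["tags"])), v)
        for v in catalog if v["title"] != watched["title"]
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [v["title"] for score, v in scored[:top_n] if score > 0]

catalog = [
    {"title": "Genuine nursery rhymes", "tags": ["nursery rhymes", "kids"]},
    {"title": "Knock-off A", "tags": ["peppa", "dentist", "nursery rhymes", "kids"]},
    {"title": "Knock-off B", "tags": ["peppa", "dentist", "scary", "kids"]},
]
watched = {"title": "Disturbing parody", "tags": ["peppa", "dentist", "scary"]}

# The knock-offs share the watched video's tags, so they dominate the
# recommendations, while the genuine content (zero overlap) drops out.
print(recommend(watched, catalog))  # ['Knock-off B', 'Knock-off A']
```

Any recommender that optimizes purely for similarity and watch time has this property: once a disturbing video enters a child’s stream, the system dutifully serves up its nearest neighbors.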
Also, as mentioned before, keyword stuffing, cramming titles and descriptions with search terms to rank higher and gain more views, needs addressing. The process is simple: create a video, pack as many keywords and tags as possible into the title, description, and other fields while uploading, and there is a high probability that the algorithm will boost the video’s position in search results. Google even provides a keyword-planner tool to help you come up with combinations of keywords along with the number of monthly searches each gets.
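A toy ranking function makes the incentive obvious. Assume, hypothetically, that a video’s search score grows with the number of query terms matched in its title and tags; real search ranking is vastly more complex, but the sketch shows why stuffing every popular keyword into the metadata pays off for bad actors.

```python
# A toy search-ranking function: score = number of query terms found in the
# video's title and tags. A hypothetical simplification, not YouTube search.

def match_score(query, video):
    terms = query.lower().split()
    haystack = (video["title"] + " " + " ".join(video["tags"])).lower()
    return sum(term in haystack for term in terms)

honest = {"title": "Finger Family Song", "tags": ["nursery rhymes"]}
stuffed = {"title": "BURIED ALIVE Finger Family Song Nursery Rhymes "
                    "Animation Education Learning Video",
           "tags": ["peppa", "elsa", "spiderman", "learn colors", "kids"]}

query = "finger family nursery rhymes learning video"
print(match_score(query, honest))   # 4 -- matches finger, family, nursery, rhymes
print(match_score(query, stuffed))  # 6 -- the stuffed video outranks the honest one
```

The stuffed title, lifted almost verbatim from the real examples quoted earlier, wins on nearly any child-oriented query, which is exactly the point of those word-salad titles.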
Plus, when a certain character or trend goes mainstream, thousands of copycat videos jump on the bandwagon to grab views. Most of these iterations then surface in the sidebar or the Up Next list.
Why can’t this algorithm be reworked to sort content better? Why can’t it be turned against these shadier knock-offs, content assembled from stock animations and audio and padded with keywords to churn out endless clips designed to rank high in search results? We have seen how quickly videos are removed from YouTube when they infringe copyright. Why can’t the same urgency and accuracy be applied to this content? The algorithm, in its current form and function, is an open invitation to exploitation.
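To be concrete about what “applying the algorithm” could even mean here: even a crude metadata heuristic, sketched below, would catch many of the titles quoted earlier. Everything in it, the term lists, the logic, the routing to human review, is a hypothetical illustration, not a description of anything YouTube actually runs.

```python
# A crude heuristic flagger -- a sketch of one cheap signal a platform could
# compute at upload time. Hypothetical term lists; not YouTube's systems.
# It flags videos whose metadata mixes kid-oriented terms with disturbing
# ones -- the exact combination these knock-offs rely on for discoverability.

KID_TERMS = {"peppa", "nursery", "finger family", "elsa", "kids"}
DISTURBING_TERMS = {"buried alive", "blood", "scary", "crying", "injection"}

def needs_review(title, tags):
    text = (title + " " + " ".join(tags)).lower()
    kid_hit = any(t in text for t in KID_TERMS)
    bad_hit = any(t in text for t in DISTURBING_TERMS)
    return kid_hit and bad_hit  # route to human moderators, don't auto-delete

print(needs_review("BURIED ALIVE Finger Family Song", ["kids", "nursery rhymes"]))  # True
print(needs_review("Finger Family Song", ["kids", "nursery rhymes"]))               # False
```

A filter this blunt would throw up false positives, which is why it should feed a human review queue rather than delete outright. But it shows that the raw signals are sitting right there in the metadata the uploaders themselves provide.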
Community Guidelines
YouTube says it will ensure “Tougher application of community guidelines and faster enforcement through technology”, and will grow its human content-moderation staff to 10,000 people in 2018. Given the sheer scale of the platform, with hours upon hours of video uploaded every minute, adequate human vigilance is nearly impossible, and nobody knows how much would be enough. The move is welcome, but Facebook’s experience has shown that moderators can’t do much without an understanding of child rights and cultural contexts.
More importantly, there has to be a will and an inclination to act urgently, and Google is failing miserably at that. According to a BuzzFeed report, Matan Uziel, a producer and activist who leads Real Women, Real Stories, tried multiple times to bring live-action child exploitation videos to YouTube’s attention, yet no substantive action was taken.
Ease of Reporting
The option to flag a video is prominent in the YouTube app, appearing right below the video. On a computer, however, the reporting option is tucked away in a sub-menu, and the same goes for the YouTube Kids app.
What can Google do? Make reporting simpler and the reporting action more prominent. Instead of a subtle cue such as a flag icon, why not an obvious button that simply says “Report”? That would make it easier for parents to teach their kids to report a video whenever what they are watching feels bizarre or wrong.
We also need to explain these services to children, including the chance that they may stumble upon a video not intended for them. Reassure them that it is not their fault if they land on such content, and encourage them to speak up whenever they see something upsetting.
When characters that children love, adore, and trust are suddenly shown in a darker light, engaging in weird scenarios or nasty activities, it can have a deeply worrying effect on them; the psychological and social implications can be serious. While Google works on fixing the problem, we as parents need to make sure not to leave our children unattended while they are plugged into smartphones and tablets.