Spider-Man, Elsa, pregnancy, and comically large medical-grade syringes. These elements hardly belong together, least of all on a platform hosting content aimed at pre-verbal children. Yet this disturbing mix of mismatched mascots and adult themes has long been a winning content formula on video platforms such as YouTube Kids. When parents use these videos, cartoons, and nursery rhymes to pacify their children, they unwittingly pay a sinister price: their children’s innocence.
In early 2016, the emergence of a YouTube channel called Webs and Tiaras generated unprecedented traffic for the platform. Its bizarre videos, unlicensed by Disney or Marvel, depicted Spider-Man and Elsa in adult situations unsuitable for children, including childbirth, abuse, and disturbing medical procedures. Shockingly, Webs and Tiaras amassed more than one billion combined views between 2016 and 2017, presumably from the young audience that had flooded the online media market.
For video streaming platforms like YouTube, children are one of the most lucrative advertising demographics. They tend to watch advertisements to completion, never complain about content quality, and consume content for hours each day. And with the introduction of the autoplay feature just a year earlier, children could independently binge inappropriate content for extended periods without any parental involvement.
Over the last decade, parents have become increasingly dependent on technology as a means of pacifying their children. Kids’ content became a generalized, band-aid solution for temper tantrums, boredom, and education alike. The bright colors and recognizable mascots in these videos captured the attention of developmentally vulnerable toddlers, stifling outbursts before they began. In some instances, these videos were even tagged with keywords such as “education” and “learning” to evade content safety algorithms. Yet parents rarely stopped to review the nature of the content YouTube’s algorithms had been introducing to their children, especially since thumbnails depicting licensed characters (without authorization) suggested a sense of “brand safety.”
The origin of this content is an even stranger story: most of it, from flash animations and live-action skits to claymation productions, continues to circulate on YouTube Kids and is produced outside the United States, in non-English-speaking regions including Eastern Europe and parts of Asia. One clue that reveals this bizarre detail to domestic audiences is the distinct lack of human speech throughout the Elsagate genre. Despite being marketed as “educational” content, these wordless skits rely heavily on sound effects, an overreliance that can stunt children’s development, since verbal communication is a critical contributor to early language ability.
Beyond the disturbing content on video platforms that children frequent, early childhood exposure to technology has poorly understood long-term implications. Still, we are beginning to be able to examine the immediate effects of this content and of tablet usage on children. In 2023, a study from Australia found that the average 24-month-old toddler experienced more than two hours of screen time per day. Early technology use impairs a child’s emotional development and ability to cope with their environment without screens. Excessive screen time is bad enough, but when the content in question is also disturbing and uneducational, the problem is compounded.
Parents and regulators have long been concerned about the media children consume. In 1990, the Federal Communications Commission began enforcing the Children’s Television Act, passed by Congress, which broadly requires television broadcasters to document their efforts to produce content that benefits the social, emotional, and cognitive development of young children. While the merits of government intervention in broadcast television are debatable, the point stands that platforms like YouTube are not subject to this same level of scrutiny. The nature of user-generated content and the sheer volume of videos on the platform make moderation a challenging task.
Despite the mainstream discovery of this disturbing content in 2017, the original wave of Elsagate videos has re-emerged behind new mascots such as the indie horror icon Poppy Playtime. The profitability of children as a demographic means this problem will likely continue until platforms are held more accountable for the content they enable.
Ultimately, it is the responsibility of parents to monitor what their children watch. No amount of “educational” video content can replace the human connection parents should have with their children, nor can any level of regulation stop corporations from exploiting a profitable demographic. So the next time a younger family member, whether a sibling, nephew, niece, or your own child, has an outburst, consider leaving the tablet in the diaper bag.