Is YouTube's Algorithm Endangering Kids?

Amid reports of disturbing kid-oriented content and pedophilic comments on its site, YouTube says it is increasing enforcement of guidelines relating to content featuring or targeting children.
d3sign / Getty Images

A strange and unsettling thing was happening this morning on YouTube. If you typed the words "how to have" into the site's search bar, one of the suggested searches was "how to have s*x with kids."

By the afternoon, that autocomplete result and a few related ones no longer appeared.

Google, which owns YouTube, did not reply to NPR's request for an explanation of why this would be a suggested search, but the company told BuzzFeed News that it had been "alerted to this awful autocomplete result" and is investigating the matter.

As BuzzFeed reports, the occurrence was likely brought about by people gaming YouTube's algorithm: "Motivated trolls, for example, could theoretically search for 'how to have s*x with your kids' with enough frequency in order to make the search result appear far more popular than it is."

Though brief, the incident is a troubling reminder of other issues involving children that have recently cast a harsh glare on YouTube and the billion hours of video users watch there each day.

Earlier this month, two articles brought attention to the countless videos on YouTube that feature well-known characters from kids' entertainment. Instead of being made by Disney or Nickelodeon, they are made by obscure production companies that crank out the videos at high speed, label them with keyword-packed titles, and make money off the ads that appear alongside or during the videos. Google makes money, too, selling those ads.

If such videos were merely ad-driven, noneducational drivel, they would be unfortunate but arguably not wholly unlike the Saturday morning cartoons many of us watched as kids.

But many of the videos on YouTube that feature well-known characters from kids' shows are a strange and different beast. Playing in endless, auto-playing succession on phones and tablets, some of the videos seem crafted to disturb children, while garnering millions of views and winning the favor of the site's algorithms.

For a time on Sunday and Monday morning, YouTube was showing troubling auto-suggested search terms.
Screengrab by NPR

As The New York Times reported in early November, one mother found her 3-year-old son watching a video titled "PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized" on the YouTube Kids app, which involved a vehicle smashing into a light pole and some of the characters from the Nick Jr. series dying.

The newspaper noted that not all the site's troubling videos feature cartoon characters: "Alisa Clark Wilcken of Vernal, Utah, said her 4-year-old son had recently seen a video of a family playing roughly with a young girl, including a scene in which her forehead is shaved, causing her to wail and appear to bleed."

Two days after the Times story, the writer James Bridle posted an essay on Medium about the epidemic of violent and disturbing content on YouTube: cheaply made videos that purport to teach colors or nursery rhymes but instead veer into something far more sinister.

Videos such as those showing the character Peppa Pig drinking bleach or eating her father, Bridle writes, are so widespread they "make up an entire YouTube subculture."

He says these videos are the product of an algorithm-powered system that only cares about clicks and ad revenue: "What we're talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatise and disturb them, via networks which are extremely vulnerable to exactly this form of abuse."

And, he argues, YouTube and Google are complicit: "The architecture they have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale."

Most of the videos mentioned in the Times article and Bridle essay have since been removed, and last week YouTube published a blog post titled "5 ways we're toughening our approach to protect families on YouTube and YouTube Kids."

The company said it had terminated over 50 channels, removed thousands of videos, and taken steps to age-restrict content with "family entertainment characters but containing mature themes or adult humor." It said it had updated its policies around this type of content in June, and had removed ads from 3 million videos since then, as well as an additional 500,000 once it "further strengthened the application of that policy."

A second type of child endangerment on the site emerged on Friday, as the BBC and The Times of London reported that videos of children on YouTube, some uploaded by children themselves, were attracting explicitly pedophilic comments. Alongside those videos appeared ads from major brands.

The outlets found that even after members of YouTube's Trusted Flagger program reported the explicit comments, 23 of the 28 comments flagged by the group remained in place until the BBC inquired.

Brands including Adidas, Mars, Cadbury and Deutsche Bank subsequently pulled their ads from the site.

England's Children's Commissioner Anne Longfield said that YouTube is "complacent" and that regulation is "looming if companies don't self-regulate themselves," the Times reported.

What that regulation would look like is anyone's guess, given that 400 hours of content are uploaded to the site every minute.

YouTube said it was applying "machine learning technology and automated tools" to quickly spot content that violates its guidelines and escalate it for human review.

"Across the board we have scaled up resources to ensure that thousands of people are working around the clock to monitor, review and make the right decisions across our ads and content policies," wrote Vice President of Product Management Johanna Wright. "We're wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right."

The company did not reply to a request for comment on issues related to children's content.

In his essay, Bridle writes that YouTube and Google "have so far showed absolutely no inclination" to change the system that fuels such troubling content.

"I have no idea how they can respond without shutting down the service itself, and most systems which resemble it," he writes. "We have built a world which operates at scale, where human oversight is simply impossible. ... [T]his is being done by people and by things and by a combination of things and people. Responsibility for its outcomes is impossible to assign but the damage is very, very real indeed."


Laurel Wamsley is a reporter for NPR's News Desk. She reports breaking news for NPR's digital coverage, newscasts, and news magazines, as well as occasional features. She was also the lead reporter for NPR's coverage of the 2019 Women's World Cup in France.