Talk about a hot-potato topic.
One Arkansas advertising executive sent me a link to a CNBC story, just wanting to make sure I was aware of the subject. He didn’t want to be quoted, though.
Then another marketing pro told me that parents have a moral right to protect their kids from internet threats, but as a businesswoman she fears overreaction in an era of viral backlashes. But that was on background. She had no comment for the record.
So why all the sensitivity?
Pedophiles had discovered YouTube, and advertisers were fleeing.
Pedophile comments attached to YouTube videos featuring children pushed multimillion-dollar advertisers like AT&T, Disney, Hasbro and Nestle to halt commercials on the Google-owned video website, demanding that YouTube clamp down on lewd comments and adjust algorithms and search-suggestion functions to avoid catering to fetishists and criminals.
On Thursday, just as this column was going to press, YouTube announced that it was disabling comments on all videos involving kids. That should comfort children’s safety advocates and businesses that see YouTube advertising as effective. Many had feared a widening boycott.
The solution defies YouTube’s viewer engagement goals, which rely on comments, but it preserves advertising. A big percentage of commercials seen today are wedged between videos watched online. Channels like YouTube offer a direct line to viewers, who can’t skip the ads.
And it was a YouTube celebrity, Matt Watson, who drew attention to what Futurism.com’s Kristin Houser called “the latest shocking example of how difficult it is for Silicon Valley platforms to police their own communities at scale, even in the face of egregious abuse.”
Watson, in a video that racked up more than 2 million views, said YouTube’s comment sections were offering a platform for a “soft-core pedophilia ring” to exchange contact information and links.
Before disabling the comments altogether, YouTube outlined corrective policies, but advertisers wanted more direct action. (AT&T had pulled its advertising from YouTube before, in a 2017 wave of cancellations over ads’ proximity to videos from terror organizations and hate groups.) The company issued a statement: “Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We took immediate action by deleting accounts, reporting illegal activity to authorities and disabling violative comments.”
A Google representative, Chi Hea Cho, told The New York Times that YouTube disabled comments on millions of videos featuring children and deleted thousands of comments about them, including timestamps flagging parts of videos — perhaps a moment where a child takes off her shoes or plays on a swing — for pedophiles. “There’s more to be done,” Cho was quoted as saying, and shortly thereafter, the comment shutdown was announced.
Advertisers’ disgust with their brands appearing near offensive content has been a persistent problem for YouTube, which suffered significant revenue losses in what users called 2017’s “adpocalypse.” One local ad executive called the situation a shame, a case of a few wicked people effectively holding an attractive ad platform hostage.
The problem with pedophile comments relates to another YouTube-and-children cautionary tale examined last week. Arstechnica.com reported on horrid content, including tips on committing suicide, spliced into cartoons on the YouTube Kids app. “I’d say YouTube, like Facebook, is broken,” Arkansas Business Online Editor Lance Turner posted on Twitter last week. “But both have been designed to be the way they are.”
Caroline Knorr, senior parenting editor for Common Sense Media, says parents should take great care in posting their children’s videos on YouTube, even though comments have now been disabled. “Privacy settings are crucial,” she said in a phone interview. “If you want to upload kids’ recitals or school plays, I’d argue against posting to the wide internet.” Instead, keep your channel private, sharing links only with trusted people.
Knorr said she’s not certain that all the people who caused trouble in YouTube’s comment sections are pedophiles. “Internet trolls may be trolling YouTube to expose weaknesses and exploit vulnerabilities in the algorithm.” Either way, the episode put YouTube on notice, she said, and the harm to children was egregious.
“That’s often the problem with user-created content. People want to share wonderful moments of a child’s upbringing, those funny and sweet things along the way. What could go wrong?”
Well, now we know.