Mom Discovers Suicide Instructions Hidden in Videos on YouTube Kids

Photo: behance.net/PrtSc
YouTube Kids, supposed to be a safe place for children, has recently left some parents worried about their children’s psychological health after several disturbing videos popped up on the platform with short clips, spliced into otherwise innocuous content, instructing children how to harm themselves.

Have you ever explained to your child how to properly slit their wrists? Don’t worry, there’s no need: YouTube Kids has already taken care of it.

A Florida pediatrician who also happens to be a mother was far from the first to learn about the app’s new “options,” but she managed to draw attention to the problem and warn other parents to monitor their kids’ online activity more closely.

Dr. Free Hess found multiple videos on YouTube Kids containing hidden suicide instructions and wrote about them on her blog, PediMom.com. According to Hess, one such video has appeared twice on YouTube and YouTube Kids since July.

“Looking at the comments, it had been up for a while, and people had even reported it eight months prior,” Hess told CBS News.

“It makes me angry and sad and frustrated,” she told CNN. “I’m a pediatrician, and I’m seeing more and more kids coming in with self-harm and suicide attempts. I don’t doubt that social media and things such as this is contributing.”

The instructions appear in a 9-second clip spliced between clips of the popular Nintendo game Splatoon. About four minutes in, a man comes on screen and starts instructing the viewer on how to kill themselves.

“Remember, kids, sideways for attention, longways for results. End it,” a man says as he pretends to cut his forearm.

Photo: boingboing.net/PrtSc

The man in the clip has since been identified: he is Filthy Frank, a YouTuber with over 6.2 million subscribers, or “the embodiment of everything a person should not be,” as he calls himself. CBS News reports that there is no evidence suggesting Frank was involved in creating the doctored video.

Hess immediately took action and notified YouTube of the video. She said it took about a week for the video to be pulled.

In a response to CNN about the videos Hess found, YouTube said it takes its commitment to parents seriously.

“We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed,” their statement said.

“We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.”

But Hess said she saw the video again in February, after it had been flagged by several parents. She does note that since she flagged the first video, YouTube Kids has been faster about pulling such content from its site: this time it took only a couple of days.

A YouTube representative said the company works hard “to ensure YouTube is not used to encourage dangerous behavior.”

“We rely on both user flagging and smart detection technology to flag this content for our reviewers,” the representative said.

“Every quarter we remove millions of videos and channels that violate our policies and we remove the majority of these videos before they have any views. We are always working to improve our systems and to remove violative content more quickly, which is why we report our progress in a quarterly report and give users a dashboard showing the status of videos they’ve flagged to us.”

The platform’s assurances nevertheless did not stop Hess from looking for other offensive content. She said she has made it her mission to seek out these kinds of videos after seeing higher rates of child suicide in her emergency room over the past few years.

Since what happened to her own child, she has reported seven more disturbing videos on YouTube Kids glorifying suicide, sexual exploitation and abuse, and even school shootings.

“I had to stop, but I could have kept going,” Hess said. “Once you start looking into it, things get darker and weirder. I don’t understand how it’s not getting caught.” 

Hess also runs a website encouraging other parents to educate themselves and be vigilant in monitoring what their children watch.

In a blog post, she recalled how another mom told her last July that her sons had been watching a cartoon on YouTube Kids and that she was shocked by what she saw: right in the middle of the video, a man in sunglasses popped up and showed kids how to slit their wrists.

“Suicide is the SECOND leading cause of death in individuals between the ages of 10 and 34 and the numbers of children exhibiting some form of self-harm is growing rapidly,” Hess said.

“Every year 157,000 young people between the ages of 10 and 24 present to Emergency Departments for self-inflicted injuries and/or suicide attempts.”

In addition to videos with direct instructions on how to harm yourself, there are also so-called “challenges” aimed at children that can be inserted into some videos.

For instance, the “Momo challenge” gained widespread attention on the web in mid-2018 after a report that a 12-year-old Argentinian girl had been motivated by the “Momo Game” to hang herself from a tree in her family’s backyard near Buenos Aires.

The Momo challenge is supposedly a form of cyberbullying prevalent on platforms such as WhatsApp and YouTube, through which children receive anonymous threatening messages tied to pictures of “Momo,” an unrelated sculpture of a grinning figure with dark hair and bulging eyes created by a Japanese special effects company.

Photo: sohu.com/PrtSc

The “Momo” messages allegedly compel youngsters to engage in perilous activities such as taking pills, stabbing other people, and even killing themselves, Snopes reports.

Other warnings emerging in early 2019 cautioned that Momo-related threats and suicide imagery were being inserted into videos (such as Peppa Pig) viewed by children on YouTube and elsewhere.

As reported by Populist Media:

“News emerged that YouTube removed a channel called Buds131. The channel, created by Don Shipley, dedicated to exposing stolen valor, exposed Nathan Phillips as being a liar and fraud. The event caused outrage because the press painted Sandman as the aggressor when in reality it was Phillips.

Just on Sunday, Steven Crowder, a well-known right-wing YouTuber had his long advertised Oscar night viewing party pulled during the event after about 25,000 viewers watched. The feed was in reality pulled by ABC and the company cited copyright infringement as the reason. However, from the feed that reappeared later, posted by another YouTuber, there wasn’t even any footage of the Oscars played on the live feed. Meanwhile, more left-leaning channels still have Oscar party videos up as we speak.”

So please be careful with YouTube and its “child,” YouTube Kids, because in the end it is all about censoring content: they censor videos from people who lean right, yet allow videos with suicide instructions aimed at your children to appear.

Author: USA Really