TikTok is trying to remove graphic videos circulating on the app that show a man shooting himself with a gun, and is banning accounts that re-upload the clip.
TikTok users are complaining that a gruesome video showing a man’s apparent suicide, which began circulating Sunday night, is still popping up on the social media platform, raising concerns over TikTok’s ability to stamp out objectionable content.
TikTok says the clip was originally streamed on Facebook and has appeared on other apps. (Facebook did not immediately respond to Forbes’s questions about the original livestream).
As the TikTok community became aware of the clip, many creators started posting videos warning their followers to look out for an image — a man with a grey beard sitting in front of his desk — and swipe away from the video. Other creators warned that the most disturbing part of the video was being hidden inside more innocuous-looking TikToks.
A TikTok representative confirmed to The Verge that “clips of a suicide” started circulating on Sunday night.
“Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide,” the spokesperson said. “We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”
These types of videos have appeared on other sites in the past, including Facebook, Instagram, and Reddit. Since TikTok videos are surfaced into one main feed — known as the For You page — that people scroll through, it can be harder to avoid the footage.
That could explain why the TikTok community is trying to be extra vigilant about warning others about the imagery in the video. Warnings have also started popping up on Instagram where clips of the video are circulating. On sites like Twitter, parents have spoken about their children and teenagers coming across the video, and the toll it’s taken.
Users are also saying that when they try to report the video, TikTok responds with a note saying that it isn’t against community guidelines (TikTok did not immediately respond to Forbes’s question about this problem).
“Please try not to go on TikTok today or tomorrow,” wrote one user, explaining: “If you watch a video and it starts with a man with long hair and a beard on the phone scroll the hell away from it as quick as you can!”
This is not the first time TikTok has struggled to deal with a suicide video circulating on its platform. Earlier this year, a 19-year-old user livestreamed his apparent suicide on TikTok and the company was criticized for waiting almost three hours to alert authorities. Other social media platforms have also struggled with this issue, including Facebook, which came under fire for delays in removing a live-streamed massacre at a mosque in New Zealand in 2019.
18 million: the number of daily TikTok users the company estimated to be 14 or younger, as of July 2020.
TikTok is currently in the middle of a standoff with the U.S. government. President Trump last month signed an executive order giving Chinese parent company ByteDance 90 days to divest TikTok’s U.S. operations. TikTok sued the Trump administration last month in response, and is currently considering offers from several U.S. companies, including Oracle, as well as Microsoft and Walmart, which have reportedly put in a joint bid.
“If anyone in our community is struggling with thoughts of suicide or concerned about someone who is, we encourage them to seek support, and we provide access to hotlines directly from our app and in our Safety Center,” the spokesperson said.