Why Did Facebook Keep a Man's Livestreamed Suicide Up for Hours?
Joshua Steen wasn’t particularly surprised when he saw his friend Ronnie McNutt, 33, on Facebook Live on August 31st. “He often used a livestreaming platform as his form of therapy,” Steen tells Rolling Stone about his friend, whom he met during a community theater production of Footloose and with whom he cohosted a podcast. “He would get on whatever service it was and just ramble. He liked to talk; he liked to argue with people about theology [and] geek and pop culture news. He just liked the back and forth.”
It took only a few seconds for Steen to realize that this time was different. McNutt appeared to be heavily inebriated and despondent, having broken up with his girlfriend (though, contrary to some press reports, he had not lost his job). An Iraq War veteran, McNutt had long struggled with depression and PTSD, and hundreds of comments from people watching the stream urged him to get help. At one point, he appeared to fire a rifle in the air.
Steen and a few of McNutt’s other friends contacted the police. They also reported the livestream to Facebook at 10 p.m. but heard nothing until 11:51 p.m., when Facebook refused to take down the livestream on the grounds that it did not violate the platform’s community standards.
But by that point, McNutt had already been dead for more than an hour. He had taken his own life during the livestream, in front of his friends and family members, including Steen. What’s more, Steen says Facebook did not remove the footage until hours after McNutt’s death. In a statement, Facebook tells Rolling Stone, “We removed the original video from Facebook last month on the day it was streamed and have used automation technology to remove copies and uploads since that time. Our thoughts remain with Ronnie’s family and friends during this difficult time.”
Since McNutt’s death, the footage has gone viral, with users posting it on multiple platforms including Twitter and TikTok. Steen says that his wife saw it embedded in a video that opened with puppies, and parents have complained that their children were similarly tricked into watching it. Because TikTok’s For You page automatically plays videos, many users did not realize what they were watching until it was too late, prompting some to warn others about the video or even to stay off the app entirely.
(“Our systems have been automatically detecting and flagging these clips which violate our policies against content that displays, praises, glorifies, or promotes suicide,” TikTok said in a statement. “We are removing content and banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.”)
But much of Steen’s ire is directed at Facebook for reportedly refusing to take down the livestream while McNutt was still alive, as it has done in the past when other crimes were being broadcast. The efforts of McNutt’s loved ones to remove the video from social platforms have prompted the hashtag campaign #ReformForRonnie. “Had their response been adequate and they just ended his livestream, I honestly don’t think he would’ve killed himself,” he says. “It would’ve diverted his attention and would’ve been a crucial factor in changing the situation as it was.”
Steen says he continues to see the video posted in private groups and even in comments on McNutt’s final post on his Facebook page. In a screengrab obtained by Rolling Stone, Facebook responded to someone reporting a link to the graphic video by saying that “it doesn’t go against one of our Community Standards.” (Facebook’s community standards specifically prohibit content “that encourages suicide or self-injury.”)
This is far from the first time Facebook has come under scrutiny for failing to police violent content. The platform was criticized for failing to remove livestreamed footage of the 2019 Christchurch mosque mass shootings. Shortly afterward, it announced that it had machine-learning technology that could ostensibly screen livestreams for weaponry, such as the rifle McNutt used.
Instagram, which is owned by Facebook, was also criticized when a photo of a murdered 17-year-old named Bianca Devins went viral, with trolls posting the gruesome image to her loved ones’ Instagram and Facebook pages. Instagram struggled to contain the spread of those photos, even though it said in a statement to Rolling Stone at the time that it had “taken steps to prevent others from re-uploading the content posted to that account to Instagram.”

Facebook has also been criticized for enforcing certain policies, such as anti-nudity guidelines, more rigorously than others prohibiting violence. “If some woman posts a topless photo, their software will detect that, remove it, and ban their account,” says Steen. “That’s [apparently] more offensive than my friend killing himself.”
Anyone experiencing a crisis is encouraged to call the National Suicide Prevention Lifeline at 1-800-273-8255 or contact the Crisis Text Line by texting TALK to 741741.