When word broke that the massacre in New Zealand was live-streamed on Facebook, I immediately thought of Robert Godwin Sr. In 2017, Godwin was murdered in Cleveland, Ohio, and initial reports indicated that the attacker streamed it on Facebook Live, at the time a relatively new feature of the social network.
Facebook later clarified that the graphic video was uploaded after the event, but the incident called public attention to the risks of live-streaming violence.
In the wake of Godwin’s murder, I recommended that Facebook Live broadcasts be time-delayed, at least for Facebook users who had told the company they were under 18. That way, adult users would have an opportunity to flag inappropriate content before children were exposed to it.
Though the company has since hired more than 3,000 additional human content moderators, Facebook is no better at keeping horrifying violence from streaming live online without any filter or warning for users.
In the 24 hours after the New Zealand massacre, 1.5 million videos and images of the killings were uploaded to Facebook’s servers, the company announced. Facebook highlighted the fact that 1.2 million of them “were blocked at upload.”
However, as a social media researcher and educator, I heard that as an admission that 300,000 videos and images of a mass murder passed through its automated systems and were visible on the platform.
The company recently released some analytic details, noting that fewer than 200 people viewed the live-stream of the massacre and that, surprisingly, no users reported it to Facebook until after it ended.
These details make painfully clear how dependent Facebook is on users to flag harmful content. They also suggest that people don’t know how to report inappropriate content – or don’t have confidence the company will act on the complaint.
The video that remained after the live-stream ended was viewed nearly 4,000 times – which doesn’t include copies of the video uploaded to other sites and to Facebook by other users.
It’s unclear how many of the people who saw it were minors; youth as young as 13 are allowed to set up Facebook accounts and could have encountered unfiltered footage of murderous hatred.
It’s past time for the company to step up and fulfil the promise its founder and CEO, Mark Zuckerberg, made two years ago, after Godwin’s murder: “We will keep doing all we can to prevent tragedies like this from happening.”
A Simple Time-Delay
In the television industry, short time-delays of a few seconds are typical during broadcasts of live events. That time allows a moderator to review the content and confirm that it’s appropriate for a broad audience.
Because Facebook relies on users as moderators, and a live-stream may not draw a TV-sized audience, its delay would need to be longer – perhaps a few minutes. Only then would enough adult users have screened the broadcast and had the chance to report its content.
Major users, including publishers and corporations, could be permitted to live-stream without a delay after completing a training course. Facebook could even let people request a company moderator for upcoming live-streams.
Facebook has not yet taken this relatively simple step – and the reason is clear. Time-delays took hold in TV only because broadcasting regulators penalised broadcasters for airing inappropriate content during live shows.
Whether and how to regulate social media is a political question, but many US politicians have developed deep ties with platforms like Facebook. Some have relied on social media to collect donations, target supporters with advertising and help them get elected. Once in office, they continue to use social media to communicate with supporters in hopes of getting reelected.
Federal agencies also use social media to communicate with the public and influence people’s opinions – even in violation of US law. In my view, Facebook’s role as a tool to gain, keep and spread political power makes politicians far less likely to rein it in.
US Regulation Isn’t Coming Soon
Congress has not yet taken any meaningful action to regulate social media companies. Despite strong statements from politicians and even calls for hearings about social media in response to the New Zealand attack, US regulators aren’t likely to lead the way.
European Union officials are handling much of the work, especially around privacy. New Zealand’s government has stepped up, too, banning the live-stream video of the mosque massacre, meaning anyone who shares it could face up to NZ$10,000 in fines and 14 years in prison. At least two people have already been arrested for sharing it online.
Facebook Could – And Should – Act Now
Much of the discussion about regulating social media has considered using antitrust and monopoly laws to force enormous technology giants like Facebook to break up into smaller, separate companies.
But if that happens at all, it will take a very long time – the breakup of AT&T lasted a decade, from the 1974 lawsuit to the 1984 launch of the “Baby Bell” companies.
In the interim, there will be many more dangerous and violent incidents people will try to live-stream. Facebook should evaluate its products’ potential for misuse and discontinue them if the effects are harmful to society.
No child should ever see the sort of “raw and visceral content” that has been produced on Facebook Live – including mass murder. I don’t think adult users should be exposed to witnessing such heinous acts either, as studies have shown that viewing graphic violence has health risks, such as post-traumatic stress.
That’s why I’m no longer recommending just a live-stream delay for adolescent users – that was a compromise meant to protect children at a time when more sweeping platform changes seemed unlikely.
But everyone deserves better, safer social media. I’m now calling on Mark Zuckerberg to shut down Facebook Live in the interest of public health and safety.
In my view, that feature should be restored only if the company can prove to the public – and to regulators – that its design is safer.
Handling live-streaming safely would require, at a minimum, enough professional content moderators to manage the workload.
Those workers also must have appropriate access to mental health support and safe working environments, so that even Facebook employees and contractors are not unduly scarred by brutal violence posted online.
(This is an opinion piece and the views expressed above are the author’s own. The Quint neither endorses nor is responsible for the same. This article was originally published on The Conversation. Read the original article here.)