Former Facebook Programmer Pleads For Stricter Regulation After New Zealand Attack

Remigio Civitarese
March 17, 2019

The video, which shows a first-person view of the killings in Christchurch, New Zealand, was readily accessible during and after the attack - as was the suspect's hate-filled manifesto. "What's going on here?" she said, referring to the shooter's ability to livestream for 17 minutes.

New Zealand police urged people not to share the footage, and many internet users called for tech companies and news sites to take the material down.

The live footage of Friday's attacks, New Zealand's worst-ever mass shooting, was first posted to Facebook and has since been shared on Twitter, Alphabet Inc's YouTube and Facebook-owned WhatsApp and Instagram.

"Police alerted us to a video on Facebook shortly after the live-stream commenced, and we quickly removed both the shooter's Facebook and Instagram accounts and the video", the company said on its Twitter account.

"While Google, YouTube, Facebook and Twitter all say that they're cooperating and acting in the best interest of citizens to remove this content, they're actually not because they're allowing these videos to reappear all the time", Lucinda Creighton, a senior adviser at the Counter Extremism Project, a worldwide policy organization, told CNN. Authorities have charged a 28-year-old man in the attack and taken others into custody.

Facebook has spent years building artificial intelligence and in May 2017 it promised to hire another 3,000 people to speed the removal of videos showing murder, suicide and other violent acts. Still, the problem persists. "We are working with social media platforms, who are actively removing this content as soon as they are made aware of an instance of it being posted".

A Twitter spokesperson said in a statement released to media that it has "rigorous processes and a dedicated team in place for managing exigent and emergency situations such as this". "We also cooperate with law enforcement to facilitate their investigations as required", it said. "We are working to have any footage removed".

Damian Collins, the chairman of the Commons culture committee, said it appeared to be a "terror attack designed for social media" and demonstrated why there had to be "statutory regulation of the distribution of content online through social networks".

"Any content containing links to the video stream are being removed in accordance with our site-wide policy", the popular message boards platform told The Washington Post in a statement.

Users intent on sharing the violent video took several approaches, at times with nearly military precision.

In a 15-minute window, Reuters found five copies of the footage on YouTube uploaded under the search term "New Zealand" and tagged with categories including "education" and "people & blogs". Others shared shorter sections or screenshots from the gunman's livestream.

All platforms encourage reporting such videos.

With billions of users, Facebook and YouTube are "ungovernable" at this point, said Vaidhyanathan, who called Facebook's livestreaming service a "profoundly stupid idea". "Because if you do, sharing this video is exactly how you do it", Moore said, adding: "Take some ownership".
