Social Media Platforms Rush To Stop Spread Of New Zealand Shooting Footage

Remigio Civitarese
March 17, 2019

But hours after the attack, copies of the video were still available on Facebook, Twitter and Alphabet Inc's YouTube, as well as Facebook-owned Instagram and WhatsApp. "We also cooperate with law enforcement to facilitate their investigations as required". In 2017 it said it would hire 3,000 people to review videos and other posts, on top of the 4,500 people Facebook already tasks with identifying criminal and other questionable material for removal.

Rasty Turek, CEO of Pex, a video analytics platform working with YouTube on a tool to identify re-uploaded or stolen content, told The Verge that it is almost impossible to stop live streams as they happen, since the content is constantly changing. New Zealand Prime Minister Jacinda Ardern also labeled it a "terrorist attack".

Action was not taken to remove it until New Zealand police alerted Facebook.

Other violent crimes that have been live-streamed on the internet include a father in Thailand in 2017 who broadcast himself killing his daughter on Facebook Live.

Because it is 2019, and livestreaming has had roughly five years to grow into a mainstream activity, horrific acts of violence and terror around the world now have a greater-than-zero chance of carrying some video component.

But the viral reach of yet another obscene video caused politicians around the globe on Friday to voice the same conclusion: Tech companies are failing.

PewDiePie, whose real name is Felix Kjellberg, said on Twitter he felt "absolutely sickened" that the alleged gunman referred to him during the livestream.

In a social media post shortly before the attack, an account believed to belong to one of the attackers posted a link to an 87-page manifesto filled with anti-immigrant, anti-Muslim ideas and explanations for an attack.

Facebook told CNET it had removed the unverified footage and was also pulling down "praise or support" posts for the shootings.

Users intent on sharing the violent video took several approaches.

YouTube, which promotes itself as a leader in taking down copyrighted content on its platform, was unable to remove videos that contained even part of the massacre footage.

The Australian Broadcasting Corporation reported that Tarrant was a personal trainer in Grafton, New Zealand.

New Zealand authorities said that three people had been arrested, but their identities were not made public.

"The government has been clear that all companies need to act more quickly to remove terrorist content".

In footage that at times resembled scenes from a first-person shooter video game, the mosque shooter was seen spraying terrified worshippers with bullets, sometimes re-firing at people he had already cut down. "That's unacceptable, it should have never happened, and it should have been taken down a lot more swiftly".

"Our hearts are broken over today's awful tragedy in New Zealand", YouTube, which is operated by Google, said in a Twitter posting.
