Gunmen staged the terrorist attacks on two mosques packed with worshipers attending Friday prayers in the South Island city of Christchurch, killing 49 people and injuring 48 others. Facebook said 1.2 million of those videos were blocked at upload, but it is not clear how many people watched the 300,000 videos that slipped through before they were deleted.
Sajid Javid, the Home Secretary, condemned tech firms' failure to stop the 17-minute video being shared for more than 10 hours after the self-professed white supremacist killer Brenton Tarrant opened fire on Muslims at prayer.
As the alleged gunman callously picked off his victims in Christchurch's Al Noor mosque, he livestreamed the gruesome scene on Facebook Live, apparently using a camera mounted on his body, after also tweeting a racist "manifesto".
For digital platforms, the challenge of policing a live video is enormous.
The company said it started "hashing" the Christchurch shooting video, which means that videos posted on Facebook and Instagram that are visually similar to the original can be detected and automatically removed.
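Facebook has not published the details of its matching system, but the general idea behind this kind of "hashing" can be sketched with a minimal, purely illustrative example: reduce a frame to a compact bit fingerprint, then treat uploads whose fingerprints are close (small Hamming distance) as near-duplicates. Everything below — the tiny 4x4 "frames", the `average_hash` and `hamming_distance` helpers — is a toy assumption for illustration, not Facebook's actual technology.

```python
# Illustrative sketch of perceptual hashing for near-duplicate detection.
# NOT Facebook's real system: a minimal "average hash" over toy 4x4
# grayscale frames (pixel values 0-255), shown for intuition only.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the frame's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means 'visually similar'."""
    return sum(a != b for a, b in zip(h1, h2))

original = [
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [10, 10, 200, 200],
    [10, 10, 200, 200],
]
# A re-encoded copy with slight brightness changes still hashes the same,
# so it can be flagged automatically even though the bytes differ.
reupload = [
    [190, 210, 20, 5],
    [205, 195, 15, 12],
    [8, 14, 198, 203],
    [12, 9, 210, 190],
]

h_orig = average_hash(original)
h_copy = average_hash(reupload)
print(hamming_distance(h_orig, h_copy))  # small distance -> treat as a match
```

Real systems hash many frames per video and use far more robust fingerprints, but the principle is the same: matching is done on fingerprints, not raw files, which is why visually similar re-uploads can be caught without an exact byte-for-byte copy.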
Social media regularly comes under criticism for not doing enough to curb extremism and hate speech, with no single platform left unscathed by controversy.
"If we can cut down the messenger, we can we can put strict regulations on these big social media sites once they reach a certain membership this can have a big impact on actually limiting you know all of this". "It's used in a way in which it becomes integral part of the crime", Helfgott told InsideEdition.com.
Social media expert and Buzzfeed journalist Craig Silverman said the killer "created the equivalent of a multiplatform content strategy" that was "meticulously planned". Although the feed was soon taken down, the video was copied and more than 235,000 people viewed it on Facebook.
The Twitter account, which violated the platform's policies, has since been suspended and the site is working on taking down any existing video content related to the violence.
"The rapid and wide-scale dissemination of this hateful content - live-streamed on Facebook, uploaded on YouTube and amplified on Reddit - shows how easily the largest platforms can still be misused", Warner said in a statement.
Ms Mia Garlick, a spokeswoman for Facebook in New Zealand, said: "We will continue working directly with the New Zealand police as their response and investigation continue".
Authorities also urged members of the public not to share the harrowing footage.
"We would strongly urge that the link not be shared".
Facebook, Twitter, Alphabet Inc and other social media companies have previously acknowledged the challenges they face policing content on their platforms.