The social media platform is introducing a "one-strike" policy for its Facebook Live feature: users who violate the social network's most serious guidelines will be temporarily barred from livestreaming, the company said in a statement.
Ahead of a summit on online extremism, Ardern was responding to a question about whether other countries can learn from New Zealand. The summit is part of a global push for major tech companies and governments to prevent the spread of hateful content online, which can lead to radicalization and, as in the Christchurch mosque shootings, mass violence.
Facebook also said that it was partnering with the University of Maryland, Cornell University, and the University of California, Berkeley to develop ways to identify videos of events like the Christchurch mosque attack that have been edited specifically to evade Facebook's detection systems.
The United States' decision not to join puts it at odds with American tech companies including Facebook and Google, which are expected to support the effort.
Facebook's move came hours before its executives were to face the prime minister of New Zealand, where an attacker killed 51 people in March and livestreamed parts of the massacre on Facebook.
Facebook toughened its livestreaming policies Wednesday as it prepared to huddle with world leaders and other tech CEOs in Paris to find ways to keep social media from being used to spread hate, organize extremist groups and broadcast terror attacks.
"This work will be critical for our broader efforts against manipulated media, including deepfakes," Rosen said, a reference to videos altered using artificial intelligence.
Knott said the United States should be involved in the effort, and warned that its refusal to engage could undermine an issue of global importance, particularly because virtually all of the major social media companies are based there.
Prior to this, Facebook had simply taken down content that violated its community standards; if a person kept posting violating content, they would be blocked from the whole platform for a period of time. But "ultimately the regulation of these tools that transmit information should be a matter for governments, not just the whims of private companies."