UK Prime Minister Theresa May has called on countries around the world to unite to implement new cyberspace regulations to stop the spread of extremism on platforms like Facebook and Twitter.

Speaking after the weekend's terror attack in London, in which seven people were killed, May said new international agreements are needed to regulate the internet, and criticised tech companies for not doing enough to prevent the spread of extremism online.

“We cannot allow this ideology the safe space it needs to breed,” May said in her speech.

“Yet that is precisely what the internet and the big companies that provide internet-based services provide. We need to work with allied democratic governments to reach international agreements that regulate cyberspace, to prevent terrorist and extremist planning. And we need to do everything we can at home to reduce the risks of extremism online.”

The speech has placed further scrutiny on the role that tech companies like Facebook and Twitter play in the spread of radical and extremist views, and on how that role is balanced against freedom of speech.

Facebook in particular has faced intense criticism for hosting violent videos and hate speech, with its moderation policies, recently leaked to the media, allowing much of this content to remain.

But Facebook director of policy Simon Milner quickly hit back following May’s comments, saying the social media platform is already doing a lot to combat extremism.

“We do not allow groups or people that engage in terrorist activity, or posts that express support for terrorism. We want Facebook to be a hostile environment for terrorists,” Milner said in a statement.

“Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it. If we become aware of an emergency involving imminent harm to someone’s safety, we notify law enforcement.”

Twitter was also quick to respond to May's speech, saying it had proactively suspended nearly half a million accounts in the second half of 2016 and would continue to crack down on extremism.

“Terrorist content has no place on Twitter,” UK head of public policy at Twitter Nick Pickles said in a statement.

“We continue to expand the use of technology as part of a systematic approach to removing this type of content.”

May’s speech has placed further pressure on Facebook in particular, which is already facing calls from European politicians for it to be treated more like a publisher and held more accountable for the content it hosts.

The social media giant recently lost a court battle in Austria, with the court ruling that it had to remove hate speech globally, rather than just in the specific country where it was posted.

Facebook recently hired 3,000 new employees for its moderation team, aiming to remove violent content and hate speech in accordance with its controversial moderation policies.