A Thai soldier who killed at least 26 people in a shooting over the weekend was allowed to post live to Facebook for more than five hours during the massacre, despite the tech giant pledging to do better at removing violent content during active incidents.

Over the weekend, the soldier went on a killing rampage in the northeastern Thai city of Nakhon Ratchasima, with most of the violence taking place in the Terminal 21 shopping centre.

The alleged shooter, 32-year-old Jakrapanth Thomma, was shot dead by authorities on Sunday morning.

The siege saw at least 26 people killed and 57 injured.

Thomma posted live updates to his Facebook profile during the attack.

The attack began at about 6pm local time, but his account was not removed until about 11pm that night.

Before the killings began, Thomma posted a photo of a pistol with three sets of bullets, along with captions reading “It is time to get excited” and “nobody can avoid death”.

He posted further updates in the hours that followed, including one questioning whether or not he should surrender.

He also posted a live-streamed video of himself during the attack, in which he was seen wearing an army helmet and holding a gun while talking about how tired he was.

The live-streamed video did not show any of the actual violence.

Facebook only removed the account after a request from Thailand’s Ministry of Digital Economy and Society, the BBC has reported.

The account was removed at about 11pm, five hours after the massacre had begun.

“We have removed the gunman’s presence on our services and have found no evidence that he broadcasted this violence on FB Live,” a Facebook spokesperson said.

“We are working around the clock to remove any violating content related to this attack.

“Our hearts go out to the victims, their families and the community affected by this tragedy in Thailand.

“There is no place on Facebook for people who commit this kind of atrocity, nor do we allow people to praise or support this attack.”

Facebook now has a team of 15,000 people who are tasked with reviewing content posted on the platform in 50 languages, but the company still predominantly relies on users reporting content for it to be flagged.

Its ability to deal with mass murderers utilising the platform and posting live violent content was called into question following the Christchurch terrorist attack last year.

Fifty people were killed during the Christchurch attack, which was live-streamed to Facebook for 17 minutes by the shooter and was widely shared across the internet.

It led to growing pressure on Facebook to improve how it deals with violent and hateful content, especially during active incidents.

Facebook said it removed 1.5 million videos of the attack worldwide in the 24 hours that followed, with 1.2 million of them blocked at the point of upload.

The Christchurch Call was later signed by representatives from 18 countries and eight global tech companies, including Facebook.

The Call committed signatories to stopping the spread of terrorist and violent extremist content online and to setting industry standards that prevent their platforms being used to disseminate such content.

“Many of you have rightly questioned how online platforms such as Facebook were used to circulate horrific videos of the attack...we have heard feedback that we must do more – and we agree,” Facebook chief operating officer Sheryl Sandberg said following the Christchurch attack.