The Turkish military is taking delivery of the first-ever drone with a mounted machine gun, one capable of hitting a person-sized target from 200m away.
Weighing 25kg and providing enough lift to carry the machine gun and 200 rounds of ammunition, Asisguard’s Songar drone will be delivered to the country’s government before year’s end.
Drones have long been used to drop explosives on targets, but Songar marks a new era of morally ambiguous airborne weapons that were technically impractical until now.
Researchers have worked out how to manage the forces generated when drones interact with physical objects – for example, a roof-repairing drone equipped with a nail gun – but the recoil from a machine gun would send most drones spinning uncontrollably.
Songar uses cameras and a laser rangefinder to compensate for distance, angle and wind speed, while a gimbal system – conceptually similar to those that keep drone and action cameras rock-steady – cancels out the recoil.
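Songar's actual fire-control algorithms are not public, but the basic geometry of rangefinder-based aim correction is straightforward: knowing the distance to the target, you can estimate bullet drop and tilt the gun up by a small compensating angle. The sketch below is purely illustrative – the muzzle velocity figure is an assumption, and drag and wind are ignored.

```python
# Illustrative sketch only: the broad kind of elevation correction a
# rangefinder-equipped weapon gimbal could apply. All figures are
# assumptions for demonstration, not Songar specifications.
import math

G = 9.81  # gravitational acceleration, m/s^2

def pitch_correction(range_m: float, muzzle_velocity: float = 900.0) -> float:
    """Angle (degrees) to raise the barrel so a bullet fired flat
    lands on a target at the lasered range, ignoring drag and wind.
    The 900 m/s default is an assumed rifle-calibre muzzle velocity."""
    flight_time = range_m / muzzle_velocity         # seconds to target
    drop = 0.5 * G * flight_time ** 2               # gravity drop over that time
    return math.degrees(math.atan2(drop, range_m))  # small corrective angle

# At the quoted 200m engagement range:
print(round(pitch_correction(200.0), 3))  # ≈ 0.069 degrees of extra elevation
```

Even at 200m the correction is a fraction of a degree – which is why the hard engineering problem is not the ballistics but holding the platform steady against recoil and wind.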
The new aerial battlefield
Tactical use of deadly drones has been a concern for years – this year’s Hollywood blockbuster ‘Angel Has Fallen’, for one, featured a spectacular assassination attempt by a swarm of facial recognition-guided explosive drones – but actually equipping a drone to shoot targets from hundreds of metres away is “a bit of a game-changer”, says Oleg Vornik, CEO of Australian anti-drone maker DroneShield.
DroneShield has built a solid export market for its DroneGun devices, which let authorities monitor and jam drones over populated or sensitive areas such as France’s Bastille Day celebrations in July or this month’s Southeast Asian Games, where seven drones were detected and disabled.
The prospect of drones that can shoot back, however, “is quite scary really”, Vornik told Information Age.
While he is confident DroneShield’s core technology would work as well against Songar as against other drones – all of which use similar RF-based links to transmit commands to the drone and stream live video back to operators – evolving technology could complicate the situation as killer drones leverage AI and automation to recognise and attack targets without human control.
“Imagine sending 100 of those drones with machine guns and you have a little army of Terminators that can fight in any kind of situation,” he said – presaging a situation that is already in motion, with reports suggesting Turkey is also set to deploy swarming STM Kargu drones in Syria by next year.
Tightening regulations for safety
Managing drone activities has been an ongoing issue for regulators, with a government enquiry into drone noise recently attracting 92 submissions and CASA this year working to tighten drone rules that balance safety with convenience.
Manufacturers are working in step with regulators to make drones easier to identify and track, with market leader DJI, for example, offering Aeroscope technology that can identify drones and locate their operators.
From 1 January, all DJI drones over 250g will be equipped with AirSense, a receiver based on the widely used ADS-B aviation standard, which picks up the identification and location broadcasts of nearby helicopters and airplanes and warns the drone’s pilot.
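The ADS-B broadcasts that such receivers listen for are an open, well-documented format: manned aircraft transmit 112-bit messages carrying their 24-bit ICAO address and, in identification messages, an 8-character callsign packed as 6-bit characters. As a minimal sketch of what decoding one of these frames involves (using a sample frame that circulates widely in the open Mode S literature):

```python
# Minimal sketch of decoding an ADS-B aircraft identification message,
# the kind of broadcast an ADS-B receiver listens for. Simplified: no
# CRC check, and only identification messages (type codes 1-4) handled.

# 6-bit character set from the ADS-B specification ('#' marks unused codes)
CHARSET = ("#ABCDEFGHIJKLMNOPQRSTUVWXYZ" + "#" * 5 + " " + "#" * 15
           + "0123456789" + "#" * 6)

def decode_identification(msg_hex: str):
    bits = bin(int(msg_hex, 16))[2:].zfill(len(msg_hex) * 4)
    df = int(bits[0:5], 2)        # downlink format; 17 = ADS-B Extended Squitter
    icao = msg_hex[2:8].upper()   # 24-bit aircraft address
    me = bits[32:88]              # 56-bit message field
    tc = int(me[0:5], 2)          # type code; 1-4 = aircraft identification
    if df != 17 or not 1 <= tc <= 4:
        return None
    callsign = "".join(CHARSET[int(me[8 + 6*i : 14 + 6*i], 2)] for i in range(8))
    return icao, callsign.strip()

print(decode_identification("8D4840D6202CC371C32CE0576098"))
# → ('4840D6', 'KLM1023')
```

Because the format is an open broadcast, any nearby receiver – a drone, a hobbyist dongle, an airport system – can read the same traffic picture.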
“Every day we are blown away with new applications of this very new, very exciting technology,” said DJI corporate communication director Adam Lisberg, who noted that it is “enormously frustrating” having to counter the stream of negative publicity fuelled by incidents such as last year’s drone shutdown of Gatwick Airport.
“The technology already exists to deal with the problems that people have with drones,” he said, offering “a round of applause” to CASA for its proactivity around drone regulations and cautioning naysayers not to let their imagination fly away from them.
“We can think of all sorts of crazy scenarios,” he said.
“In practice there is a lot of good, smart, creative work being done in labs around the world about how to have drones do smart things autonomously – but regulatory requirements will require humans at the controls.”
Can flying death machines be ethical?
Yet military planners have long anticipated technology that would remove humans from the loop, with the first global Meeting of Experts on Lethal Autonomous Weapons (LAWs) held in Geneva in 2014.
As militarised drones push armed conflict into the skies, governments will need to fast-track decisions around the rules of engagement for self-targeting technology that can now easily cross national borders and bypass ground-based defences.
A growing number of AI researchers have pushed back against military applications for their technology – with employee pressure driving Google to pull out of the US Department of Defense’s Project Maven surveillance-drone project (recently transferred to military contractor Palantir).
Groups like the Campaign to Stop Killer Robots argue for a total ban that would place autonomous flying killing machines in the same category as biological and chemical weapons, which are outlawed under UN conventions.
Some have argued that LAWs could, and should, be used only in ways that make conflicts more ethical, while a UNSW Canberra academic recently commenced a $9m study exploring necessary ethical constraints on the systems.
Yet for all the ethical discussions, Vornik warned that longstanding moral objections to such technology would evaporate as armies faced the need to fight back against adversaries using the killer drones.
“Although they’re designed for a good purpose – taking soldiers off the field and replacing them with drones – these things inevitably end up in the wrong hands,” he explained.
“If your enemy is going to have better technology, there is a strong argument about deploying the technology regardless of what you think about the ethics of it.”