As robots become more advanced and more autonomous, they will be asked to do tasks that humans need done but don’t want to do themselves. Unfortunately, in some cases, those tasks may include going to war. Ghost Robotics’ Q-UGV was recently fitted with a rifle in a demonstration of its potential on the battlefield.
The four-legged figure slowly climbs the side of a hill. It probes the uneven terrain (crumbly, dry soil mixed with scattered rocks) with its front legs, looking for the best spots to step. Every now and then, a particularly fragile patch of ground disintegrates underfoot and the dog-shaped robot’s hind legs fall through, throwing it off balance. But only for a second. The bot recovers quickly, righting itself and steadfastly continuing the climb.
After it reaches steady ground, it switches to a trot, just like a flesh-and-blood hound. When it paces along the street, it’s easy to mistake for a live animal.
“Sometimes people tell us, ‘I thought that was a real dog, and when I got close I realized it wasn’t,’” said Gavin Kenneally, one of the creators of the Q-UGV. Kenneally is now chief executive officer of Ghost Robotics, a Philadelphia-based technology company. “That’s a huge compliment to us, because if you could build a robot close to a real dog, it would be amazing.”
Ghost Robotics drew on the locomotive agility of dogs to create a bot that can master complex terrain that conventional wheeled robots cannot.
Philly’s streets aren’t the only place where these bots amble. In March 2021, several of these semi-autonomous robodogs arrived at Tyndall Air Force Base in Florida to join the 325th Security Forces Squadron, where they became man’s mechanical best friends. The Q-UGVs (short for Quad-legged Unmanned Ground Vehicle) carried visible-light and thermal cameras and helped patrol the base, walking a few miles through harsh, swampy grounds.
“The robots have charging docks inside their shipping Conex containers,” Kenneally said. “When they are given the command, the door will open so they get out of their charging base and do a pre-programmed perimeter security control route. Essentially, they are a walking camera system doing a security patrol.”
When the robot sees anything out of the ordinary, it can stream the video or send an alert requesting an investigation by a human guard.
If cameras were the only payload a Q-UGV could carry, it would be little more than a high-tech watchdog. But there’s more. At a recent Association of the United States Army conference in Washington, D.C., Q-UGVs were demoed carrying a remotely controlled rifle, custom-built by SWORD Defense Systems. The image, which evoked memories of RoboCop or a particularly grisly Black Mirror episode, didn’t sit well with some. Could these robots, fitted with such weapons, become the infantry of 21st-century warfare?
Kenneally said their work with SWORD is just an early-stage experiment.
“We are listening to our government customers,” he said. “We are doing the preliminary integration and making sure that robots can deal with that payload.”
It’s only one application of many, he noted. Q-UGVs could have many uses, from security to industrial settings, where they can inspect equipment in a plant, read gauges, and monitor temperatures, particularly in environments that are unhealthy or undesirable for humans. Fully ruggedized, they can walk through oil or chemical spills and enter unsafe buildings.
“Our robots can be deployed in situations where you wouldn’t want to use people,” Kenneally said.
FINDING THEIR FOOTING
Sending machines to war is not a new thing. In fact, it is human nature. Mechanical ingenuity has been put to use to create rapid-fire rifles, terrain-chewing tanks, and intercontinental missiles. In 1978, science historian James Burke wrote the documentary series Connections, which traced the development of eight landmark inventions and then showed how they were all brought together in the B-52 bomber.
Stewart Maslen, honorary professor of international law at the University of Pretoria in South Africa, believes that our instinct to recoil at the concept of autonomous weapons comes from our faith that humans will make more humane decisions than bots. But that’s not necessarily true, he notes, citing the war horrors in Yemen, Syria, and now Ukraine.
“When you look at what human beings do to other human beings when they are in full control of the use of force, whether it’s artillery, rifles, or bombing, you can see how horrible we are able to be,” he said. In war, the worst atrocities are often committed up close and personal.
To be sure, this talk of mechanical war dogs is a large step from where Kenneally started his work on robotics. Growing up, Kenneally had thought about becoming a veterinarian—in part because his uncle was a vet and he shadowed the man as a teenager.
But he was also interested in technology and robots in particular, so he studied mechanical engineering at Concordia University in Montreal. He later joined the GRASP lab at the University of Pennsylvania, which is known for its innovative approaches to making agile robots. It was there that he met Avik De, who is now chief technology officer at Ghost Robotics.
Most conventional robots used in assembly lines or factories are rigid. They are programmed to move in a specific way and do a set of predefined tasks. Even if they are able to interact with their environment, there is an inflexibility in their responses.
“If you interact with an industrial robot, for example by trying to push its arm, it’s not going to move,” Kenneally explained. “It’s good for painting or welding or picking up parts and placing them in a very controlled environment, but it doesn’t respond to you.”
One of GRASP lab’s robotic models was called RHex. It had springy, C-shaped legs that enabled it to bound over uneven terrain. But RHex had limits—it was simple and not very responsive. “It does the same thing no matter the environment,” Kenneally said.
De and Kenneally wanted to build a more responsive robot, one able not only to travel over difficult terrain but also to adjust to that terrain as it changes underneath it.
“Every time the leg touches the ground, it’s a little bit of an uncertain collision. You don’t know exactly where the ground is,” Kenneally explained. “Plus, there can be stones, rubble, or a layer of snow. The robot’s leg needs to be much more flexible in terms of having collisions with ground that is not exactly known.”
It’s a hard feat for traditional robots to master, as they tend to have small electric motors that spin at high speed and produce low torque—the force that makes an object rotate and acquire angular acceleration. Such robots are slow and unresponsive.
Roboticists solve that problem by adding sensors at the bot’s feet or knees, which lets it “feel” if it’s slipping or whether the ground is suddenly closer than expected. Every added sensor brings more complexity, cost, and fragility.
“Sensors are a real liability for the overall robustness of the systems,” Kenneally said. “You can pick up a good amount of sensitivity to an average footstep, but if the robot falls, most sensors aren’t capable of handling that kind of shock.”
De and Kenneally opted for a very different approach when building their quadrupeds. They built large, custom-designed motors and motor controllers, and instead of using gears, they powered each of the bot’s legs with a direct-drive motor.
“One of the key innovations of our tech is back-drivable actuators,” Kenneally said. “We developed custom-made larger motors that deliver much more torque and are more back-drivable.”
Back-drivability provides actuators with high force sensitivity, high impact resistance, and quick adaptation to external forces. That essentially allows the robot to “feel” the surface through the motors themselves without having to rely on force sensors. The design enabled the robots to be less fragile, which was a huge advantage, and it also had the added benefit of allowing the machines to instantaneously sense and adapt to changes in terrain. De and Kenneally’s robodogs turned out to be so agile that they managed to keep their balance and pace even on icy roads.
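The idea of “feeling” the ground through the motors can be illustrated with a toy calculation. The sketch below is not Ghost Robotics’ control code; the torque constant and contact threshold are invented for illustration. It shows the core principle: in a direct-drive (gearless) actuator, output torque is roughly proportional to motor current, so a controller can infer a foot strike from current draw alone, without a force sensor at the foot.

```python
# A minimal sketch of proprioceptive contact detection with a
# direct-drive actuator. All numbers (torque constant, threshold)
# are hypothetical; real leg controllers run at kilohertz rates
# with far richer dynamic models.

TORQUE_CONSTANT = 0.95   # N·m per ampere -- assumed motor property
CONTACT_TORQUE = 4.0     # N·m above which we infer a foot strike

def estimated_torque(current_amps: float) -> float:
    """Output torque is roughly proportional to winding current
    (tau = Kt * i) in a gearless, back-drivable motor."""
    return TORQUE_CONSTANT * current_amps

def foot_in_contact(current_amps: float) -> bool:
    """Infer ground contact from a sudden rise in required torque."""
    return estimated_torque(current_amps) >= CONTACT_TORQUE

# During the swing phase the leg moves freely (low current); at
# touchdown the collision forces the controller to push harder
# (high current), and the spike is the "feel" of the ground.
print(foot_in_contact(1.0))  # free swing -> False
print(foot_in_contact(6.0))  # touchdown  -> True
```

Because there is no gearbox between the motor and the leg, external forces back-drive the rotor directly, which is why the current signal tracks ground contact closely enough to replace a dedicated foot sensor.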
A 2016 video produced by the company shortly after it launched showed a pizza-box flat robot with isosceles-triangle legs bouncing around, looking almost as happy as a puppy.
Then, the robot reaches a chain-link fence and starts to claw at it until its feet catch. The robot scales the barrier. In another scene, the robot leaps up and unlatches a door handle to gain entry to a room. The video is subtitled, “Provocative Outdoor Behaviors.” Indeed.
HUMANS IN THE LOOP
The idea of a machine relentlessly, autonomously pursuing human targets was the chilling concept behind the Terminator movies. It is part of what unsettles some people about the Q-UGV. Maslen, however, suggests that the cool detachment of a machine might reduce some of the dangers to civilians inherent in war zones.
“You can also program autonomous weapons to be extremely respectful of human beings,” Maslen said. “I’m not persuaded that the use of autonomous weapons should be outlawed in all circumstances.”
Unlike humans, robots don’t get emotional and irrational. A military bot will never shoot civilians out of anger or revenge for lost comrades. An autonomous machine would have no independent incentive to torture or rape. Plus, targeting the enemy’s machinery is what most often wins wars.
“If you have an autonomous weapon system, why would you program it to target civilians when it could be so much more effective by taking out the enemy’s machinery—and then you just walk in?” Maslen asked. “If you are a military officer, you want to achieve your military objectives as soon as possible with as little loss of life as possible. It would be in your interest to destroy military materiel.”
Emilia Javorsky, who leads autonomous weapons advocacy at the Future of Life Institute, a Cambridge, Mass.-based nonprofit that works to reduce extreme risks posed by novel technologies, said systems that target military materiel such as tanks or missiles indeed have a place on the battlefield. But machines that decide on their own whether to kill humans do not.
“We are ceding life and death decisions to algorithms,” she said. “Whoever is programming the target profile decides what the AI will do. Machines don’t have their own moral values. They have the values of whoever programmed them.”
If such machines are mass produced with no ethical oversight, they will indeed become indiscriminate killer drones, so it’s important to draw the line, she said. “Weapon systems that use AI to target humans should be prohibited.”
A smarter way to approach this conundrum is to have machines and people be partners, Javorsky said. Robots are superior at precision and speed while people excel at the situational context and psychological nuances of combat.
“When you put humans and machines together, you get the best of both worlds,” Javorsky said. “The argument is that you should keep humans in the loop.”
For now, that is the intent of Ghost Robotics. One of the company’s first customers was the United States Naval Special Warfare Command. They were interested in intelligence, surveillance, and reconnaissance—activities that require a human on the receiving end of the data stream. Even the use of the Q-UGV by SWORD Defense Systems as a platform for its Special Purpose Unmanned Rifle emphasizes the role of the operator, even as the company touts its weapon with a 10-round magazine and a 1,200-meter range.
While keeping humans in the loop sounds like an answer, advanced technology amplifies human ability for good and for ill. Over the battlefields of Ukraine this year, the Turkish-made Bayraktar drones—unmanned aerial vehicles carrying lightweight, laser-guided bombs—have been so decisive against invading tanks and troop carriers that defenders have quite literally sung their praises. It doesn’t take much imagination, however, to envision scenarios where autonomous weapons systems such as these could be tools of oppression.
Perhaps one day, armed Q-UGVs will extend and expand the fighting power of people defending their homes from marauding armies. Robot canines could be revered by the underdogs as man’s best friend just like their flesh-and-blood counterparts, but only as long as they heed their owners’ commands.