In Kinshasa, capital of the war-torn Democratic Republic of Congo, traffic congestion is a serious problem. Few drivers bother to obey signs, lights, or even human traffic directors; the roads are a snarled free-for-all. Or they were, until the robots showed up. The Congolese engineer Isaie Therese designed and built two 8-foot-tall, classic Robbie the Robot-style automatons to take over traffic-directing duties from their human forebears, and so far, the plan appears to be working.
“As a motorcyclist I’m very happy with the robot’s work,” one commuter told CCTV Africa. “Because when the traffic police control the cars here there’s still a lot of traffic. But since the robot arrived, we see truly that the commuters are respectful.”
The tale of a giant, traffic-directing robot solving an urban problem in the Congo proved irresistible to the press, and the story spread widely and quickly. But as with many such stories, the biggest looming question is whether the new technology will actually prove to be a sustainable, lasting solution. And the answer to that question hinges on whether people will obey their robot overlords.
So. Why are people obeying the robot traffic director when they didn't heed the human one? Might it simply be the novelty of seeing an imposing metallic hulk where there used to be an overwhelmed traffic director? The kind of novelty apt to wear off before long, as many past technologies aimed at solving social woes have proven prone to do?
There's actually been very little research or data gathered on how likely humans are to obey robots. Perhaps the closest relevant study, "Would You Do as a Robot Commands?", carried out by researchers at the Human-Computer Interaction Lab at the University of Manitoba, did find that people will respond to commands issued by robots.
The study asked subjects to complete a number of repetitive, mind-numbing tasks, like clicking on a moving target on a computer screen or changing the extension of files, while seated across from a small human-like robot that would issue directives as they went along. Sure enough, the paper found that the presence of a robot directing them to continue or complete their work improved compliance. The researchers concluded that, "at the very least, people can be pressured by a robot to continue a highly tedious task."
"The results show that the robot had an authoritative social presence: A small, childlike humanoid robot had enough authority to pressure 46% of participants to rename files after 80 minutes, even after indicating they would like to quit," the researchers wrote.
Much study is left to be done in this arena, but it's an interesting result. And driving is in many ways the ultimate tedious task: stopping every few hundred feet, waiting for lights to turn, keeping a respectful distance from the driver doing 10 mph under the speed limit in front of you. Conceivably, the giant authority-bot looming over the snarl, with its booming commands, could give people the extra incentive to stop, wait, and maintain order.
They may not look like silvery frankensteins, but automated machines designed to facilitate compliance with mundane tasks are precisely the sort we're already obeying: automatic toll collectors, supermarket checkout computers, and the automated voice commands in public transit. These we obey without a thought, though they're generally installed in relatively frictionless environments.
If humanoid robots are shown to improve obedience, seeing one that projects authority in similar situations may not be unthinkable in the near future. Our speculative fiction has long fretted over the dark side of this prospect; they're robot overlords, after all. Machines enforcing order on behalf of a security state is as common a scenario in dystopian sci-fi as nuclear holocaust. Just last summer, the robo-guards of Elysium were gut-checking Matt Damon into submission. Philosophically, at least, we're uneasy with the idea of introducing more robots to smooth the social order. But that sort of sci-fi scenario remains a far-off nightmare, and it will be more difficult to realize than we might fear.
"Maintaining a robot’s status as an authority figure is a complex and multi-faceted problem," the Human-Computer Interaction Lab researchers write. But they did note that "robots can indeed pressure people to do things they would rather not do," and that "even after trying to avoid the task or engaging in arguments with the robots, participants still (often reluctantly) obeyed its commands." And that's just a tiny robot taskmaster. The researchers conceded that some of the compliance may be due to the bot's novelty, but also suggested that a robot might command even more compliance if it weren't designed to be friendly-looking.
Still, even the friendly Congolese traffic bots carry a hint of foreboding: the robot is designed to clear up congestion, sure, but also to collect revenue for the state.
“If a driver says that it is not going to respect the robot because it’s just a machine the robot is going to take that and there will be a ticket for him,” Therese, its engineer, has said of its efficacy. “We are a poor country and our government is looking for money. And I will tell you that with the roads the government has built, it needs to recover its money.”
How and when we obey robots in social settings is going to be a fraught consideration in the years ahead, not only as they continue to take our jobs, but as our everyday interactions with computers, mobile apps, and other, even smarter, technologies become more complex and frequent.