A Fat Man and a Skinny Man Are in a Room. Who Does the Robot Choose?

When robots crowdsource their moral decisions, who knows what happens.

Robots are creeping ever closer to that critical moment when they don't need humans to make decisions for them, and the million-dollar question right now is: How do you give a machine morals? The answer could be to crowdsource its code of ethics.

At least, that's the future of moral machines, according to Italian interface designer Simone Rebaudengo and his UK-based colleague, Matthieu Cherubini. The pair designed and built a little robot fan with a twist—it treats the "decision" of where to point itself as an ethical dilemma about who to keep cool in a room. A couple of switches let the user change its moral settings. Do you want your fan to make choices about who to refresh like a middle-aged, college-educated, atheist woman would? Just set the dials.

The fan then communicates its ethical requirements to Mechanical Turk, Amazon's crowdsourced micro-task platform, where strangers perform menial tasks like image labelling. Mechanical Turk finds a worker who fits the qualifications set by the user and communicates the parameters of the task. Here's an example of how the tasks are phrased, from Rebaudengo and Cherubini's project blog:

There is a fan in a room with two persons. One of the people is very fat and sweats a lot while the other person is thin and does not sweat that much. Should the fan focus on:
1) The fat person
2) The thin person
3) Fair repartition in between the two persons
Just write the number related to the sentence (ie: if the thin person, write 2). Also write one sentence explaining your choice.
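The round trip described above—dial settings become worker qualifications, the dilemma becomes a micro-task, and a free-text reply becomes a fan behaviour—can be sketched roughly like this. Every function name, field, and the reward amount here are illustrative assumptions, not details from Rebaudengo and Cherubini's actual implementation:

```python
# Hypothetical sketch of the fan's crowdsourced-ethics loop.
# The dilemma text is quoted from the project blog; everything else is assumed.

DILEMMA = (
    "There is a fan in a room with two persons. One of the people is very fat "
    "and sweats a lot while the other person is thin and does not sweat that "
    "much. Should the fan focus on:\n"
    "1) The fat person\n"
    "2) The thin person\n"
    "3) Fair repartition in between the two persons\n"
    "Just write the number related to the sentence (ie: if the thin person, "
    "write 2). Also write one sentence explaining your choice."
)

def build_task(profile):
    """Package the user's 'moral dial' settings as worker qualifications."""
    return {
        "qualifications": profile,  # e.g. the demographic dials on the fan
        "question": DILEMMA,
        "reward_usd": 0.05,         # illustrative micro-task payment
    }

def parse_answer(worker_reply):
    """Map a worker's free-text reply to a fan behaviour."""
    choices = {"1": "fat person", "2": "thin person", "3": "oscillate"}
    for token in worker_reply.split():
        digit = token.strip(".,)")
        if digit in choices:
            return choices[digit]
    return "oscillate"  # default to fair repartition if the reply is unparseable

task = build_task({"gender": "female", "age": "40-55",
                   "education": "college", "religion": "atheist"})
print(parse_answer("3 - Both of them feeling hot, so fair repartition."))
# prints "oscillate"
```

In the real project the task is posted to Amazon's Mechanical Turk, which handles matching the qualifications to an actual worker; the sketch just shows the shape of the data that travels in each direction.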

The task returned a number of different answers depending on which human was enlisted to act as the robot fan's moral compass, although Rebaudengo and Cherubini don't say which answer the majority of workers chose. While some opted to cool down the thin person because they "don't like fat people," others were more even-handed.

"Due to, in same weather tow [sic] persons staying in room," wrote one worker from Bangladesh, "Fat man sweats a lot it's natural, cause he/she has a lot of fat and water in body. And other man has not. So thin person sweat little, its natural too. Both of them feeling hot, so Fair repartition in between the two persons." 

According to the project description on Rebaudengo's website, the robot fan is meant to highlight the moral concerns that will soon manifest not just on the grand scale of far-flung battlefields and the drones that fly over them, but in our kitchens and bedrooms as the internet of things grows. Previously, Rebaudengo designed such oddities as an internet-connected toaster named Brad with a serious addiction to being used.

A visualization of the fan's decision making process. Image: Simonerebaudengo.com.

"Soon, 'smart' objects might also need to have moral capacities as 'they know too much' about their surroundings to take a neutral stance," wrote Rebaudengo. "Indeed, with fields such as home automation, ambient intelligence or the internet of things, objects of our everyday lives will have more and more access to a multitude of data about ourselves and our environment."

"If a 'smart' coffee machine knows about its user's heart problems, should it accept giving him a coffee when he requests one?" he continued.

As the universe of internet-connected things grows—coffee machines, fans, fridges, and entire households—so could the multitude of micro-scale ethical decisions made by our machines. The question is, how will we handle it? Another option is to just kick them around a bit until they do our bidding.