


This Moth-Driven Robot Is a Step Towards Automatons That Mimic Life

To model how an animal tracks scents, a team from the University of Tokyo developed a tiny robot and gave it to a silkmoth to drive.

Replicating nature's various noses is an impressively difficult task. If sniffing bots are to replace environmental monitors and drug dogs in the future, they'll have to clear a pair of major hurdles. First is developing sensors that can even measure trace scents, and second is developing the systems to guide a nose-bot to a smell's source. That brings us to the curious contraption you see above. To see if an animal can work with a synthetic system to track scents, a team from the University of Tokyo developed a small robot and gave it to a silkmoth to drive.


"The most difficult subject in odor tracking is that the odorants are distributed intermittently in the atmosphere, not distributed continuously. Therefore it is difficult to find out its source by simple chemotaxis, tracking the concentration gradient of an odor," said lead author Dr. Noriyasu Ando in an email. "Olfaction is essential for organisms to survive because they use it for finding foods, nests and mating partners for example  and they evolved fine strategies for tracking odour and localize it source. From a biomimetic perspective, this is the reason why the animal odor tracking is attractive for developing the artificial [version] useful for finding chemical spills or hazardous materials."

The experiment was fairly straightforward: The pheromones from a female silkmoth were placed at one end of a wind tunnel and wafted, at about the air speed a female's beating wings would produce, towards where a male was set inside his robo-car. As the male walked towards the scent (because "Hey ladies!"), the plastic ball he stood on rolled and acted as a controller for the entire apparatus. Boom, you've got a moth-driven robot.
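The control loop is about as simple as it sounds: read how the ball rolls under the moth's feet, turn that into wheel commands. Here's a minimal sketch of that mapping for a differential-drive robot; the sensor names and gain are hypothetical, since the rig's actual hardware details live in the paper.

```python
from dataclasses import dataclass

@dataclass
class BallReading:
    forward: float  # ball rotation from the moth walking forward
    turn: float     # ball rotation from the moth turning on the spot

def wheel_commands(reading: BallReading, gain: float = 1.0) -> tuple[float, float]:
    """Map the moth's walking motion to left/right wheel velocities."""
    left = gain * (reading.forward - reading.turn)
    right = gain * (reading.forward + reading.turn)
    return left, right

# Example: the moth surges forward while veering slightly left.
print(wheel_commands(BallReading(forward=1.0, turn=0.3)))  # -> (0.7, 1.3)
```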

A moth held in place in its robo-car.

The exercise was aimed at developing a biomimetic model for autonomous sensory robots by seeing if a silkmoth's processing abilities can work well enough with a robotic system that they might be replicated artificially. Silkmoths were chosen by the Japanese team, whose results are published in Bioinspiration & Biomimetics, because their behavior for pheromone detection is well studied. On their own, the moths will amble towards the scent source with high accuracy, which, considering scents are diffused through the air, isn't as easy as you might think.

"The simple and robust odor tracking behaviour of the silkmoth allows us to analyse its neural mechanisms from the level of a single neuron to the moth's overall behaviour," said Ando in a release. "By creating an 'artificial brain' based on the knowledge of the silkmoth's individual neurons and tracking behaviour, we hope to implement it into a mobile robot that will be equal to the insect-controlled robot developed in this study."


The team found that the silkmoth was still able to locate the source of a pheromone in its rig. In fact, because the rig is designed with fans on its sides that catch pheromone-laden air and direct it towards the silkmoth, its receptive zone was wider than when it was traveling on its own, as shown in the figure below.

Essentially, the team was able to develop a scent-detecting robot that is directed by a silkmoth, and which was actually able to track scents even better than the silkmoth could on its own. All 14 silkmoths tested in the rig were successful in driving it towards the goal.

Now, if you've watched the above video (if you haven't, you're missing out), you're probably wondering what the bit about a turning bias is. The research team, to test how well a moth could adapt to dynamic conditions, also tested moths driving bots that were programmed to turn slightly clockwise (pull to the right), like a shopping cart with a bad wheel. In a separate test, the researchers covered the front of the car so that the moths couldn't see where they were going.

Impressively, the moths were able to drive their cars to the pheromone source despite the turning bias, screen, and a combination of the two. Success rates did drop, from 100 percent for the unhindered car, to 84.2 percent and 80.8 percent for the covered and turn-biased tests, respectively. With both hindrances, the success rate dropped all the way to 53.8 percent.


Additionally, since the moth's behavior was being tested to see if it could be used to create an artificial tracking model for a sniff-bot, the team also measured the moths' ability to track the scent plume when a delay was inserted between the moth's movement on the trackball and the car actually moving. Under normal conditions, the moths were 100 percent successful with a delay of 200ms, and were still 90 percent successful at 600ms, which is enough time to simulate a processing lag between a high-tech sensor and motor functions. The rates dropped when a turn bias was added, which makes sense, as the robot would be veering off in the wrong direction during those lag periods.
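Both manipulations, the motor delay and the constant clockwise pull, are easy to picture in code. Here's a rough sketch that extends the wheel-command idea from earlier: commands go into a queue and come out a few ticks late, with an optional bias tacked on. The delay length and bias size are illustrative, not the paper's values.

```python
from collections import deque

class DelayedBiasedDrive:
    def __init__(self, delay_ticks: int = 3, turn_bias: float = 0.0):
        # Queue pre-filled with "stand still" commands to realize the delay.
        self.queue = deque([(0.0, 0.0)] * delay_ticks)
        self.turn_bias = turn_bias  # constant clockwise pull, like a bad shopping-cart wheel

    def step(self, forward: float, turn: float) -> tuple[float, float]:
        """Push the moth's current command in; pop the delayed one out, plus bias."""
        self.queue.append((forward, turn))
        delayed_forward, delayed_turn = self.queue.popleft()
        return delayed_forward, delayed_turn + self.turn_bias

drive = DelayedBiasedDrive(delay_ticks=3, turn_bias=-0.1)
for tick in range(5):
    print(tick, drive.step(forward=1.0, turn=0.0))
```

For the first few ticks the car does nothing while the moth is already walking, which is exactly the mismatch the animal has to compensate for.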

"Most chemical sensors, such as semiconductor sensors, have a slow recovery time and are not able to detect the temporal dynamics of odours as insects do," Ando said. "Our results will be an important indication for the selection of sensors and models when we apply the insect sensory-motor system to artificial systems."

"Now, I am become Death, the destroyer of worlds."

Still, that a blind moth can drive a wonky car suggests the moths are more flexible trackers than one might think. That the moths were able to control a robot that didn't move in direct relation to their own movements is a cool result, and suggests that animal-powered bots could work outside of very strict lab conditions. It also suggests that such bots, or even bots using a processor modeled after animal behavior, aren't outside the realm of possibility.

"Animal behaviours are attractive for engineers in terms of behaving adaptively in ceaselessly changing environments," Ando wrote. "We think that exploring the mechanisms behind the adaptability will give us valuable knowledge fordeveloping autonomous robots which can behave and complete tasks even inuncertain environments."

As Motherboard's Austin Considine explained in a recent feature, using animals as drones is a growing trend. For one, we can't yet create machine analogs of living brains, and we're also finding that animal-machine interfacing can actually work. It doesn't have to mean wiring machine guns into an eagle's brain; in this case, it's as simple as designing a mobile exoskeleton that even a moth can drive.

Sure, it seems totally wacky and futuristic, but the holy grail of this realm of biomimetics is to be able to replicate the analytical prowess and flexibility of organic brains on silicon. It appears that the best way to get there is to first figure out how brains best work with machines, and then try to copy from there. It may end up that creating a gas-leak sniffing drone powered by insects isn't feasible in the long term. But for the moment, moth cars and whatever they lead to next do appear to be a promising avenue of research towards the life-mimicking automatons of the future.

@derektmead