Flower power & insect intelligence pave the way for better drone landings

January 21, 2021

The researchers present a learning process based on optical flow with which robots can estimate distances from the visual appearance (shape, color, texture) of the objects they see.

This artificial intelligence (AI)-based learning strategy improves the navigation skills of small flying drones and gives rise to a new hypothesis about the intelligence of insects.

If you’ve ever watched a honeybee hop elegantly from flower to flower, or deftly avoid you as it flies past, you may have wondered how such a small insect can have such perfect navigational skills. The abilities of these flying insects are partly explained by the concept of optical flow: they sense the speed at which objects move through their field of view. Robotics researchers have tried to mimic these strategies on flying robots, but so far with limited success.

A team from TU Delft, together with researchers from the Westphalian University of Applied Sciences, is therefore presenting a learning process based on optical flow with which robots can estimate distances from the visual appearance (shape, color, texture) of the objects they see. This artificial intelligence (AI)-based learning strategy improves the navigation skills of small flying drones and gives rise to a new hypothesis about the intelligence of insects. The article is published today in Nature Machine Intelligence.

How do honeybees land on flowers or avoid obstacles? One would expect such questions to be of interest mainly to biologists. However, the rise of small electronic and robotic systems has made them relevant to robotics and artificial intelligence (AI) as well. Small flying robots, for example, are extremely limited in the sensors and processing power they can carry on board. If these robots are to become as autonomous as the much larger self-driving cars, they will have to use an extremely efficient type of artificial intelligence, similar to the highly developed intelligence of flying insects.

Optical flow

One of the main tricks up the insect’s sleeve is its extensive use of “optical flow”: the way objects move through its field of view. Insects use it to land on flowers and to avoid obstacles or predators, and they do so with surprisingly simple and elegant strategies. When landing, for example, honeybees use the divergence of the optical flow, which captures how quickly things grow larger in view. If a honeybee simply fell towards the ground, this divergence would keep increasing: the grass in its view would get bigger and bigger, ever faster. When landing, however, honeybees keep the divergence constant by slowing down as they descend. The result is that they make smooth, soft landings.
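To make the divergence idea concrete, here is a minimal numerical sketch (not the authors’ implementation; the starting height, gain, setpoint, and time step are made-up values) of a constant-divergence landing, in which a drone keeps the observed expansion rate of the ground fixed and therefore slows down as it descends:

```python
import numpy as np

# Minimal sketch of a constant-divergence landing (illustrative only, not the
# controller from the article). For a purely vertical descent with a
# downward-looking camera, the optical flow divergence is D = -v / h:
# the relative rate at which the ground texture expands in the image.

def simulate_landing(h0=10.0, D_set=0.3, k_p=2.0, dt=0.02, t_end=30.0):
    """Proportional control of vertical acceleration to hold divergence at D_set."""
    h, v = h0, 0.0                       # height [m], vertical velocity [m/s]
    log = []
    for step in range(int(t_end / dt)):
        D = -v / max(h, 1e-6)            # observed divergence (v < 0 while descending)
        a = k_p * (D_set - D)            # descend faster while divergence is too low
        v -= a * dt                      # more negative v -> larger divergence
        h += v * dt
        log.append((step * dt, h, v, D))
        if h <= 0.05:                    # effectively landed
            break
    return np.array(log)

if __name__ == "__main__":
    t, h, v, D = simulate_landing()[-1]
    print(f"after {t:.1f} s: height {h:.2f} m, descent speed {abs(v):.3f} m/s")
```

Because the commanded descent speed shrinks in proportion to the remaining height, the touchdown speed approaches zero, which is exactly the smooth landing behavior described above.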

“Our work on optical flow control began with enthusiasm for the elegant, simple strategies of flying insects,” says Guido de Croon, professor of bio-inspired micro air vehicles and first author of the article. “Developing control methods to actually implement these strategies on flying robots, however, turned out to be anything but trivial. For example, our flying robots would not actually land; instead they started to oscillate, continuously moving up and down just above the landing surface.”

Fundamental limitations

Optical flow has two fundamental limitations that have been extensively described in the growing literature on bio-inspired robotics. The first is that optical flow only provides mixed information about distance and speed, not about distance or speed separately. To illustrate: if one of two landing drones flies twice as high and twice as fast as the other, the two experience exactly the same optical flow. However, for good control these two drones should respond differently to deviations in the optical flow divergence. If a drone does not adapt its control response to its altitude when landing, it will never touch down and will start to oscillate above the landing surface. Second, for obstacle avoidance it is very unfortunate that there is very little optical flow in the direction in which a robot is moving. This means that optical flow measurements in that direction are noisy and therefore provide very little information about the presence of obstacles. As a result, the most important obstacles, the ones the robot is moving straight towards, are actually the hardest to detect!
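A tiny sketch (with made-up numbers) of the ambiguity described above: because divergence only encodes the ratio of speed to height, a drone at twice the height descending twice as fast sees exactly the same value.

```python
# The divergence only encodes the ratio of descent speed to height.
def divergence(height_m, descent_speed_ms):
    return descent_speed_ms / height_m

print(divergence(2.0, 0.6))   # drone A: 2 m high, descending at 0.6 m/s -> 0.3
print(divergence(4.0, 1.2))   # drone B: twice as high, twice as fast   -> 0.3
```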

Learning visual appearance as a solution

“We realized that both problems of optical flow would disappear if the robots could interpret not only the optical flow but also the visual appearance of objects in their environment,” adds Guido de Croon. “This would allow robots to estimate distances to objects in the scene, much as we humans can estimate distances in a still image. The only question was: how can a robot learn to see such distances?”

The key to this question lay in a theory recently developed by De Croon, which showed that flying robots can actively induce optical flow oscillations to perceive the distances to objects in the scene. In the approach proposed in the Nature Machine Intelligence article, the robots use such oscillations to learn how the objects in their environment look at different distances. In this way, a robot can, for example, learn how fine the texture of grass looks at different heights during landing, or how thick tree bark appears at different distances when navigating through a forest.
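The article’s implementation details are not reproduced here, but the self-supervised idea can be sketched roughly as follows: distance estimates obtained while the robot oscillates serve as training labels for a regressor that maps visual appearance to distance, which the robot can afterwards use without oscillating. Everything in this sketch, the toy appearance features, the synthetic texture data, and the linear least-squares fit, is a hypothetical stand-in for whatever the real system uses.

```python
import numpy as np

# Hypothetical sketch: distance labels obtained while the robot oscillates are
# used to train an appearance-based regressor, which can then estimate distance
# from a single image without oscillating. Features and data are illustrative only.

def appearance_features(image):
    """Toy appearance descriptor: mean intensity, contrast, edge energy, bias."""
    gy, gx = np.gradient(image.astype(float))
    return np.array([image.mean(), image.std(), np.mean(gx**2 + gy**2), 1.0])

def fit_distance_model(images, distance_labels):
    """Least-squares fit from appearance features to oscillation-derived distances."""
    X = np.stack([appearance_features(im) for im in images])
    w, *_ = np.linalg.lstsq(X, np.asarray(distance_labels), rcond=None)
    return w

def predict_distance(w, image):
    return float(appearance_features(image) @ w)

# Usage sketch with synthetic "ground texture" whose contrast decreases with height.
rng = np.random.default_rng(0)
heights = rng.uniform(0.5, 5.0, size=200)
images = [rng.normal(0.5, 0.5 / h, size=(32, 32)) for h in heights]
w = fit_distance_model(images, heights)
print(f"predicted {predict_distance(w, images[0]):.2f} m, label {heights[0]:.2f} m")
```

The design point is simply that the supervision signal comes from the robot’s own oscillation-based distance estimates, so no external ground truth is needed.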

Relevance to robotics and applications

“Learning to see distances from visual appearance resulted in much faster and smoother landings than before,” says Christophe De Wagter, researcher at TU Delft and co-author of the article. “Moreover, for obstacle avoidance, the robots could now also see obstacles in the flight direction very clearly. This not only improved obstacle-detection performance, but also allowed our robots to fly faster.” The proposed methods are highly relevant for resource-constrained flying robots, especially when they operate in rather confined environments, such as monitoring crops in greenhouses or keeping track of stock in warehouses.

Relevance to biology

The results are not only relevant to robotics, but also provide a new hypothesis about the intelligence of insects. “Typical honeybee experiments start with a learning phase in which the honeybees exhibit various oscillating flight behaviors while they familiarize themselves with a new environment and with related novel cues such as artificial flowers,” says Tobias Seidl, biologist and professor at the Westphalian University of Applied Sciences.

“The final measurements presented in articles usually take place after this learning phase has been completed, and they focus primarily on the role of optical flow. The learning process presented here creates a novel hypothesis about how flying insects improve navigational skills such as landing over the course of their lives. This suggests that we should set up more studies to investigate and report on this learning phase.”