LIDAR gives laser vision to robotic bees

SUNY Buffalo leads a RoboBee initiative project funded by a $1.1 million NSF grant to create robotic bees that can see.

Robot insects may someday be used in agriculture and disaster relief situations. (Image credit: Microrobotics Lab, Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering, SUNY UB)

The State University of New York (SUNY) University at Buffalo (UB) is leading a research project, funded by a $1.1 million National Science Foundation grant, that includes researchers from Harvard University and the University of Florida. The project is an offshoot of the RoboBee initiative, led by Harvard and Northeastern University, which aims to create insect-inspired robots that may someday be used in agriculture and disaster relief.

RELATED ARTICLE: LIDAR nears ubiquity as miniature systems proliferate

By equipping the miniature robotic bees and other insect-like robots with tiny, laser-powered sensors that act as eyes, the researchers aim to let the machines sense the size, shape, and distance of approaching objects.

"Essentially, it's the same technology that automakers are using to ensure that driverless cars don't crash into things," says UB computer scientist Karthik Dantu. "Only we need to shrink that technology so it works on robot bees that are no bigger than a penny."

Researchers have shown that robot bees are capable of tethered flight and of moving while submerged in water. One of their limitations, however, is a lack of depth perception: a robot bee cannot sense what lies in front of it.

This is problematic if you want the bee to avoid flying into a wall or to land on a flower, says Dantu, who worked on the RoboBee project as a postdoctoral researcher at Harvard before joining UB's School of Engineering and Applied Sciences in 2013 as an assistant professor.

The UB-led research team will address the limitation by outfitting the robot bee with remote sensing technology called LIDAR, the same laser-based sensor system that is making driverless cars possible.

LIDAR sensors emit laser pulses and measure the time it takes for the light to reflect back from an object; computer algorithms then analyze that timing information to form a coherent image of the car's path. This enables the car to "see" its environment, follow traffic signs, avoid obstacles, and make other adjustments. These systems, typically mounted on the car roof, are about the size of a traditional camping lantern. The team Dantu leads wants to make them much smaller, a version called "micro-lidar."
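The underlying time-of-flight calculation is straightforward: the distance to an object is the round-trip travel time of the laser pulse multiplied by the speed of light, divided by two. A minimal Python sketch of the idea follows; the function names and the sample scan values are illustrative assumptions, not part of the project.

    import math

    SPEED_OF_LIGHT = 299_792_458.0  # meters per second

    def tof_distance(round_trip_time_s: float) -> float:
        """Convert a round-trip time-of-flight measurement to a distance in meters."""
        return SPEED_OF_LIGHT * round_trip_time_s / 2.0

    def scan_to_points(scan):
        """Turn (angle_radians, round_trip_time_s) samples into 2D (x, y) points
        in the sensor frame -- the raw material a mapping algorithm works from."""
        points = []
        for angle, t in scan:
            r = tof_distance(t)
            points.append((r * math.cos(angle), r * math.sin(angle)))
        return points

    # Example: three returns at 0, 45, and 90 degrees, each ~6.7 ns round trip (~1 m away)
    sample_scan = [(0.0, 6.7e-9), (math.pi / 4, 6.7e-9), (math.pi / 2, 6.7e-9)]
    print(scan_to_points(sample_scan))

Mapping and navigation algorithms like those Dantu plans to develop would start from point sets of this kind, accumulated as the robot moves, to build a picture of nearby obstacles.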

University of Florida researchers will develop the tiny sensor that measures the light's reflection, while Dantu will create novel perception and navigation algorithms that enable the bee to process and map the world around it. Harvard researchers will then incorporate the technology into the bees.

The technology the team develops likely won’t be limited to robot insects. The sensors could be used, among other things, in wearable technology; endoscopic tools; and smartphones, tablets and other mobile devices.

SOURCE: SUNY University at Buffalo; http://www.buffalo.edu/news/releases/2015/10/042.html
