LIDAR gives laser vision to robotic bees

Oct. 24, 2015
SUNY Buffalo leads a RoboBee initiative project funded by a $1.1 million NSF grant to create robotic bees that can see.

The State University of New York (SUNY) University at Buffalo (UB) is leading a research project, funded by a $1.1 million National Science Foundation grant, that includes researchers from Harvard University and the University of Florida. The project is an offshoot of the RoboBee initiative, led by Harvard and Northeastern University, which aims to create insect-inspired robots that someday may be used in agriculture and disaster relief.

Equipped with tiny laser-powered sensors that act as eyes, the miniature robotic bees and other insect-like robots will be able to sense the size, shape, and distance of approaching objects.

"Essentially, it's the same technology that automakers are using to ensure that driverless cars don't crash into things," says UB computer scientist Karthik Dantu. "Only we need to shrink that technology so it works on robot bees that are no bigger than a penny."

Researchers have shown that robot bees are capable of tethered flight and of moving while submerged in water. One of their limitations, however, is a lack of depth perception: a robot bee cannot sense what is in front of it.

This is problematic if you want the bee to avoid flying into a wall or have it land on a flower, says Dantu, who worked on the RoboBee project as a postdoctoral researcher at Harvard before joining UB’s School of Engineering and Applied Sciences in 2013 as an assistant professor.

The UB-led research team will address the limitation by outfitting the robot bee with remote sensing technology called LIDAR, the same laser-based sensor system that is making driverless cars possible.

The sensors measure the time it takes for emitted laser light to return from an object; computer algorithms then analyze this information to form a coherent image of the car's path. This enables the car to "see" its environment, follow traffic signs, avoid obstacles, and make other adjustments. These systems, which are typically mounted on the car roof, are about the size of a traditional camping lantern. The team Dantu leads wants to make them much smaller, a version called "micro-lidar."
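To make the time-of-flight principle concrete, here is a minimal Python sketch (illustrative only, not the project's code): the distance to an object is the measured round-trip time of the laser pulse multiplied by the speed of light, divided by two.

# Illustrative sketch of the lidar time-of-flight relation;
# not code from the RoboBee project.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """One-way distance to the target from a pulse's round-trip time.

    The pulse travels to the object and back, so the one-way distance
    is half the total path length covered at the speed of light.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a return detected about 66.7 nanoseconds after emission
# corresponds to an object roughly 10 meters away.
print(distance_from_round_trip(66.7e-9))  # ~10.0 m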

University of Florida researchers will develop the tiny sensor that measures the light's reflection, while Dantu will create novel perception and navigation algorithms that enable the bee to process and map the world around it. Harvard researchers will then incorporate the technology into the bees.
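As a rough, hypothetical illustration of what such a perception algorithm might do (this is not the team's method; the function name, grid dimensions, and cell size are invented for the example), the Python sketch below converts a few bearing-and-range lidar readings into a simple occupancy grid centered on the robot.

# Hypothetical sketch: turn (bearing, range) lidar readings into a
# small 2D occupancy grid. Nothing here comes from the RoboBee project.

import math
import numpy as np

def build_occupancy_grid(readings, grid_size=50, cell_m=0.02):
    """Mark cells as occupied from (bearing_rad, range_m) readings.

    The robot sits at the grid center; each return marks the cell at
    the measured bearing and range as occupied (1).
    """
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    center = grid_size // 2
    for bearing, rng in readings:
        col = center + int(round(rng * math.cos(bearing) / cell_m))
        row = center + int(round(rng * math.sin(bearing) / cell_m))
        if 0 <= row < grid_size and 0 <= col < grid_size:
            grid[row, col] = 1
    return grid

# Example: one return 0.3 m straight ahead and one 0.2 m to the side.
occupied = build_occupancy_grid([(0.0, 0.3), (math.pi / 2, 0.2)])
print(int(occupied.sum()))  # 2 occupied cells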

The technology the team develops likely won’t be limited to robot insects. The sensors could also be used in wearable technology, endoscopic tools, and mobile devices such as smartphones and tablets, among other applications.

SOURCE: SUNY University at Buffalo; http://www.buffalo.edu/news/releases/2015/10/042.html

About the Author

Gail Overton | Senior Editor (2004-2020)

Gail has more than 30 years of engineering, marketing, product management, and editorial experience in the photonics and optical communications industry. Before joining the staff at Laser Focus World in 2004, she held many product management and product marketing roles in the fiber-optics industry, most notably at Hughes (El Segundo, CA), GTE Labs (Waltham, MA), Corning (Corning, NY), Photon Kinetics (Beaverton, OR), and Newport Corporation (Irvine, CA). During her marketing career, Gail published articles in WDM Solutions and Sensors magazine and traveled internationally to conduct product and sales training. Gail received her BS degree in physics, with an emphasis in optics, from San Diego State University in San Diego, CA in May 1986.
