Le Plessis Robinson, France--Researchers on an EU-funded project are developing autonomous robots that use LIDAR (light detection and ranging) in a new way. Rather than following a single-step process, in which the robot takes a LIDAR measurement and then analyzes it to determine what is nearby and where to go next, the new approach splits the work into two steps.
The first step is a comprehensive mapping of the surrounding area, performed before the robot enters normal operation. In the second step, the robot uses its LIDAR to detect what is around it; because the surroundings are already stored, the robot only needs to distinguish what is new. This separation of tasks helps the robot in many situations -- for example, when navigating autonomously through an airport, the robot starts out with a solid knowledge of the setting (keeping it oriented) and only has to determine what is new and different, such as the appearance and movement of people.
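The article does not publish the IRPS algorithms; a minimal Python sketch of the two-step idea, assuming both the pre-built map and the live scan have been reduced to occupied grid cells (all names and coordinates here are hypothetical, for illustration only):

```python
# Hypothetical sketch of the two-step process: a real system would work on
# registered 3D point clouds, but reducing each "scan" to a set of occupied
# 2D grid cells keeps the offline/online split visible.

def build_static_map(calibration_scans):
    """Step 1 (offline): union of all cells seen while mapping the static space."""
    static_map = set()
    for scan in calibration_scans:
        static_map |= scan
    return static_map

def detect_new_objects(live_scan, static_map):
    """Step 2 (online): only cells absent from the stored map count as new."""
    return live_scan - static_map

# Offline pass: walls and columns occupy (0,0), (0,1), and (5,5).
walls = {(0, 0), (0, 1), (5, 5)}
static_map = build_static_map([walls])

# Online pass: a person appears at (2,3); the known walls are ignored.
live = {(0, 0), (2, 3), (5, 5)}
print(detect_new_objects(live, static_map))  # {(2, 3)}
```

The set difference is what makes the second step cheap: the robot never re-analyzes the static environment, only the residual.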
The project, called IRPS (Intelligent Robot Porter System), is being pursued by a consortium of institutions and companies (four from Europe and one each from Canada and Israel) funded by the European Union.
The researchers, who call the technology 3D LIMS (3D LIDAR Imaging and Measurement System), foresee a broad range of applications for it, from navigating autonomous vehicles around airports to monitoring industrial equipment and enhancing security surveillance.
"This two-step LIDAR process, involving first calibration and then real-time navigation, is the key innovation. It allows the system to accurately and rapidly detect changes in the environment," says Maurice Heitz, the manager of the IRPS project and a researcher at French technology firm CS Communication & Systèmes.
The technology not only detects objects with greater accuracy but, unlike camera-based robotic vision systems, is unaffected by shadows, rain, or fog; it also provides angular and distance information for each pixel, making it suitable for use in virtually any environment.
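Because each LIDAR pixel carries both a range and a direction, converting a return into a 3D position is a direct polar-to-Cartesian calculation. A short illustration (the angles and range here are invented for the example, not taken from the IRPS system):

```python
import math

def lidar_to_xyz(r, azimuth, elevation):
    """Convert one LIDAR return (range in meters, angles in radians) to x, y, z."""
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return x, y, z

# A return 10 m away, dead ahead and level, maps to (10, 0, 0).
print(lidar_to_xyz(10.0, 0.0, 0.0))
```

This per-pixel geometry is what a camera image lacks: a camera pixel gives a direction but no range, so depth must be inferred rather than measured.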
Robotic airport buggies
To highlight the potential of 3D LIMS, the IRPS team built a prototype application in which the technology was used to navigate buggy-like autonomous vehicles that might one day transport passengers or luggage around an airport.
Showcased at Faro Airport in Portugal last December, the robotic porters first built up a 3D image of the static airport environment: walls, columns, doors, staircases, and so on. The buggies then used onboard LIDAR to accurately calculate their position and detect obstacles as they moved around the airport.
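The article does not detail how the buggies compute their position from the stored 3D image; one common way to do it is to align observed landmarks (walls, columns) against their known map positions. A toy translation-only sketch of that idea, with invented landmark coordinates (a real system would solve for rotation as well, e.g. with ICP-style registration):

```python
# Hypothetical translation-only localization: if known landmarks appear
# shifted from where the stored map puts them, the average shift estimates
# the robot's own displacement from its assumed position.

def estimate_offset(observed, mapped):
    """observed[i] and mapped[i] are matched (x, y) landmark positions."""
    n = len(observed)
    dx = sum(o[0] - m[0] for o, m in zip(observed, mapped)) / n
    dy = sum(o[1] - m[1] for o, m in zip(observed, mapped)) / n
    return dx, dy

# The map places two columns at (0, 0) and (4, 0); the robot measures them
# at (1, 2) and (5, 2), so the scan is offset from the map by (1, 2).
print(estimate_offset([(1, 2), (5, 2)], [(0, 0), (4, 0)]))  # (1.0, 2.0)
```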
"Our vision is that one day people, perhaps elderly or with a disability, will go to the airport and, by speaking to a porter control center on their mobile phone or through a web interface on their PDA, will be able to order a vehicle to take them to their boarding gate," says Heitz. "The vehicle would transport them autonomously, weaving its way between moving objects such as passengers and piles of luggage."
However, Heitz says it will probably be many years before robotic buggies start buzzing around airports autonomously, owing to a combination of safety concerns and the need for further technological advances.
The IRPS project received funding from the ICT strand of the EU's Sixth Framework Programme for research. (Source: ICT Results)
About the Author
John Wallace
Senior Technical Editor (1998-2022)
John Wallace was with Laser Focus World for nearly 25 years, retiring in late June 2022. He obtained a bachelor's degree in mechanical engineering and physics at Rutgers University and a master's in optical engineering at the University of Rochester. Before becoming an editor, John worked as an engineer at RCA, Exxon, Eastman Kodak, and GCA Corporation.