Outsight says hyperspectral material sensor outperforms cameras and lidar for autonomous vehicles

Sept. 19, 2019
Its Semantic Camera can perceive the environment from hundreds of meters, providing the key chemical composition of objects like snow, skin, and plastic.
Outsight
The Semantic Camera looks at more than a point cloud of the surroundings; it uses hyperspectral imaging to identify the chemical constituents of the environment--highly useful for autonomous vehicle applications.

Outsight (Paris, France), a new entity formed from Dibotics, a pioneer in smart machine perception and developer of real-time processing solutions for 3D data, has launched its 3D Semantic Camera, which it says takes world perception and comprehension to new levels for autonomous driving and other industries.

Outsight's founders, Raul Bravo (co-founder and CEO of Dibotics) and Cedric Hutchings (co-founder of Withings and former VP of Nokia Technologies), joined forces to create a new entity that combines the software assets of Dibotics with a new 3D sensor technology. Together with Dibotics' other co-founder, Oliver Garcia, and Scott Buchter (co-founder of Lasersec), the four have assembled a global team across San Francisco, Paris, and Helsinki to turn their vision into reality.

“Mobility is evolving rapidly, with new ways to travel being introduced every day. In the US alone, approximately 4 million people are seriously injured by car accidents every year. At Outsight, we believe in building safer mobility by making vehicles much smarter. We are excited to unveil our 3D Semantic Camera that brings an unprecedented solution for a vehicle to detect road hazards and prevent accidents,” says Cedric Hutchings, CEO and co-founder of Outsight.

Outsight says its 3D Semantic Camera will not only bring full situation awareness and new levels of safety and reliability to currently human-controlled machines such as Level 1-3 ADAS (Advanced Driver Assistance Systems), construction and mining equipment, and helicopters, but also accelerate the emergence of fully automated smart machines such as Level 4-5 self-driving cars, robots, drones, and even autonomous flying taxis.

The company says its technology provides full situation awareness in a single device, with the ability to simultaneously perceive and comprehend the environment from hundreds of meters, including the key chemical composition of objects like skin, cotton, ice, snow, plastic, metal, and wood.

This is made possible in part by a low-power, long-range, eye-safe broadband laser that identifies material composition through active hyperspectral analysis. Combined with the company's 3D SLAM-on-chip capability (simultaneous localization and mapping), the Semantic Camera operates in real time, delivering actionable information and object classification from its onboard SoC (system on a chip). Because the approach does not rely on machine learning, it consumes less power, works at low bandwidths, and eliminates the need for massive training data sets by actually "measuring" surrounding objects. Knowing the material of an object adds a new level of confidence about what the camera is actually seeing.
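As a rough illustration of how classification without machine learning can work, here is a minimal sketch that matches a measured hyperspectral signature against a small library of reference reflectance spectra using a spectral-angle similarity measure. The reference values, band count, and function names are hypothetical and do not represent Outsight's implementation.

# Minimal sketch (hypothetical, not Outsight's implementation): classify a material
# by comparing a measured hyperspectral reflectance signature against a small
# library of reference signatures, with no machine-learning training step.
import numpy as np

# Hypothetical reference reflectance signatures, one value per spectral band.
REFERENCE_SIGNATURES = {
    "snow":    np.array([0.90, 0.88, 0.85, 0.60, 0.30]),
    "skin":    np.array([0.35, 0.45, 0.55, 0.60, 0.50]),
    "plastic": np.array([0.40, 0.42, 0.41, 0.43, 0.44]),
    "asphalt": np.array([0.10, 0.11, 0.12, 0.12, 0.13]),
}

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle between two spectra; smaller means more similar (spectral angle mapper)."""
    cos_sim = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos_sim, -1.0, 1.0)))

def classify_material(measured: np.ndarray) -> str:
    """Return the reference material whose signature is closest to the measurement."""
    return min(REFERENCE_SIGNATURES,
               key=lambda name: spectral_angle(measured, REFERENCE_SIGNATURES[name]))

# A measurement that most resembles the snow reference prints "snow".
print(classify_material(np.array([0.88, 0.86, 0.80, 0.55, 0.28])))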

The company says the camera better comprehends the world by providing the position, size, full velocity, and chemical composition of all moving objects in its surroundings, giving valuable information for path planning and decision making.

The 3D Semantic Camera can provide important information about road conditions, identifying hazards such as black ice, a capability that is vital for the safety of ADAS. The system can also quickly identify pedestrians and bicyclists through its material identification capabilities.
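To make concrete how material-tagged detections might be consumed downstream, the following is a minimal sketch assuming a hypothetical detection structure carrying position, size, velocity, and material; it is not Outsight's API or data format, only an illustration of the kind of hazard logic such data could enable.

# Minimal sketch (hypothetical): material-tagged 3D detections feeding a simple
# hazard check in an ADAS / path-planning layer.
from dataclasses import dataclass

@dataclass
class Detection:
    position_m: tuple[float, float, float]    # x, y, z in the vehicle frame, meters
    velocity_mps: tuple[float, float, float]  # full 3D velocity, meters per second
    size_m: tuple[float, float, float]        # bounding-box dimensions, meters
    material: str                             # e.g. "ice", "skin", "metal", "asphalt"

def hazard_response(det: Detection) -> str:
    """Rough illustration: escalate for ice on the road or exposed skin (pedestrian/cyclist)."""
    if det.material == "ice":
        return "reduce speed: possible black ice ahead"
    if det.material == "skin":
        return "caution: likely pedestrian or bicyclist"
    return "no action"

# Example detections such a perception stack might emit.
detections = [
    Detection((42.0, -1.5, 0.0), (0.0, 0.0, 0.0), (3.0, 2.0, 0.01), "ice"),
    Detection((18.0, 2.0, 0.0), (1.2, 0.0, 0.0), (0.5, 0.5, 1.7), "skin"),
]
for d in detections:
    print(hazard_response(d))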

Outsight has already started joint development programs with key OEMs and Tier 1 providers in the automotive, aeronautics, and security-surveillance markets, and will progressively open the technology to other partners in Q1 2020.

SOURCE: Outsight; https://www.outsight.tech/hubfs/PR/EN_Outsight_Autosens_Sept19.pdf?hsLang=en

About the Author

Gail Overton | Senior Editor (2004-2020)

Gail has more than 30 years of engineering, marketing, product management, and editorial experience in the photonics and optical communications industry. Before joining the staff at Laser Focus World in 2004, she held many product management and product marketing roles in the fiber-optics industry, most notably at Hughes (El Segundo, CA), GTE Labs (Waltham, MA), Corning (Corning, NY), Photon Kinetics (Beaverton, OR), and Newport Corporation (Irvine, CA). During her marketing career, Gail published articles in WDM Solutions and Sensors magazine and traveled internationally to conduct product and sales training. Gail received her BS degree in physics, with an emphasis in optics, from San Diego State University in San Diego, CA in May 1986.
