Synchronization-free LEDs and smartphone become 3D image-capturing device

Feb. 24, 2021
A photometric stereo imaging setup with top-down illumination, based on a handheld mobile phone, four unsynchronized pulsed LEDs, and a reconstruction algorithm, captures 3D images.

Researchers from the University of Strathclyde (Glasgow, Scotland) and Bristol Robotics Laboratory (Bristol, England) are tapping into dynamically controlled (modulated) LEDs to create a simple illumination system for 3D optical imaging. They have demonstrated that this can be performed using just a cell phone and LEDs without requiring any complex manual processes to synchronize the camera with the lighting.

Human vision relies on the brain to reconstruct depth information from a stereo input. Depth information can also be acquired using photometric stereo imaging, in which one camera is combined with illumination that comes from multiple directions. This lighting setup allows images to be recorded with different shadowing, which can then be used to reconstruct a 3D image. Photometric stereo imaging traditionally requires four light sources, such as LEDs, which are deployed symmetrically around the viewing axis of a camera. In the new work, the researchers show that 3D images can also be reconstructed when objects are illuminated from the top down, but imaged from the side. This setup allows overhead room lighting to be used for illumination.
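The reconstruction at the heart of photometric stereo can be sketched with the classic least-squares formulation for a Lambertian (matte) surface: each pixel's brightness under several known light directions is fit to a single surface normal and albedo. The minimal Python sketch below is illustrative only; the array names and light directions are assumptions, not the authors' calibration.

import numpy as np

def photometric_stereo(images, light_dirs):
    # images:     (K, H, W) array, one grayscale frame per light direction
    # light_dirs: (K, 3) array of unit vectors pointing from object to light
    K, H, W = images.shape
    I = images.reshape(K, -1)                            # stack pixels: (K, H*W)
    # Lambertian model: I = L @ (albedo * normal); solve per pixel in least squares
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, H*W)
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)               # unit surface normals
    return normals.T.reshape(H, W, 3), albedo.reshape(H, W)

The recovered normal field is then integrated into a height map, which is the 3D surface the technique ultimately delivers.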

In work supported under the UK’s EPSRC Quantic research program, the researchers developed algorithms that modulate each LED in a unique way; this modulation acts like a fingerprint, allowing the camera to determine which LED produced which image and thereby facilitating the 3D reconstruction. The new modulation approach, called Manchester-encoded binary frequency-division multiple access (MEB-FDMA), also carries its own clock signal, so image acquisition can be self-synchronized with the LEDs simply by using the camera to passively detect the LED clock signal. In MEB-FDMA, an arbitrary phase mismatch between two LED signals does not affect the decoding result; in other words, the two signals remain “orthogonal” to each other.
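The frequency-division part of this idea can be illustrated with a short sketch: each LED toggles at its own subcarrier frequency, and a pixel's brightness over the high-speed frames is demultiplexed by correlating against sine and cosine at each frequency, so the arbitrary phase of each unsynchronized LED drops out of the result. The frame rate, frequencies, and weights below are hypothetical, and the Manchester bit layer that embeds the clock is omitted for brevity; this is not the authors' exact MEB-FDMA encoder.

import numpy as np

fps = 960                          # high-speed camera frame rate (frames/s)
t = np.arange(fps) / fps           # one second of frame timestamps
freqs = [60, 80, 120, 140]         # hypothetical subcarrier frequencies (Hz)
weights = [0.9, 0.4, 0.7, 0.2]     # how strongly each LED lights this pixel
rng = np.random.default_rng(0)

# Pixel time series: sum of on/off LED waveforms with arbitrary phase offsets
pixel = sum(w * (np.sign(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))) + 1) / 2
            for w, f in zip(weights, freqs))

# Demultiplex by quadrature correlation; the magnitude is insensitive to phase
for f, w in zip(freqs, weights):
    i = np.mean(pixel * np.cos(2 * np.pi * f * t))
    q = np.mean(pixel * np.sin(2 * np.pi * f * t))
    print(f"{f} Hz: recovered ~ {np.pi * np.hypot(i, q):.2f}, true weight {w}")

Because each subcarrier completes a whole number of cycles in the correlation window, the contributions of the other LEDs average out, leaving one shading image per LED for the photometric stereo step.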

Smartphone in high-speed mode

To demonstrate this new approach, the researchers tried out their modulation scheme using a photometric stereo setup based on commercially available LEDs. A simple Arduino board provided the electronic control for the LEDs. Images were captured using the high-speed (960 fps) video mode of a Samsung Galaxy S9 smartphone. The researchers imaged a 48-mm-tall figurine that they 3D-printed with a matte material to avoid any shiny surfaces that might complicate imaging (see figure). After identifying the best position for the LEDs and the smartphone, the researchers achieved a reconstruction error of just 2.6 mm for the figurine when imaged from 42 cm away. This level of error shows that the quality of the reconstruction was comparable to that of other photometric stereo imaging approaches.
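Turning the recovered normals into a surface and a quoted error figure involves integrating the normal field into a height map and comparing it with the known shape of the printed object. A very rough sketch of that last step follows; the naive cumulative-sum integration and the ground_truth array are assumptions for illustration, not the authors' evaluation pipeline, and the 2.6 mm figure above comes from their own measurements.

import numpy as np

def integrate_normals(normals, pixel_pitch_mm=1.0):
    # normals: (H, W, 3) unit surface normals from the photometric-stereo step
    nz = np.clip(normals[..., 2], 1e-6, None)
    p = -normals[..., 0] / nz                  # surface gradient dz/dx
    q = -normals[..., 1] / nz                  # surface gradient dz/dy
    # Naive integration: average of row-wise and column-wise cumulative sums
    z = (np.cumsum(p, axis=1) + np.cumsum(q, axis=0)) / 2
    return z * pixel_pitch_mm                  # height map in millimetres

def rms_error_mm(z, ground_truth):
    # Root-mean-square deviation between reconstructed and reference surfaces
    return float(np.sqrt(np.mean((z - ground_truth) ** 2)))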

In a test of whether a moving object could be imaged, an ellipsoidal object 100 mm long and 60 mm wide was rotated at 7.5 RPM and imaged at 960 fps over an 8 s span; the data was then processed into a fully 3D-reconstructed 25 fps video. In a signal-to-noise ratio (SNR) test, static images of a 48-mm-diameter sphere were taken at increasing levels of background room illumination; the reconstruction error varied between 4 and 6 mm over an SNR range of -1 to 6 dB.

In the current system, the image reconstruction takes a few minutes on a laptop. To make the system practical, the researchers are working to decrease the computational time to just a few seconds by incorporating a deep-learning neural network that would learn to reconstruct the shape of the object from the raw image data.

“Our new approach could be used to illuminate different indoor areas to allow better surveillance with 3D images, create a smart work area in a factory, or to give robots a more complete sense of their environment,” says Emma Le Francois, a doctoral student in the research group led by Martin Dawson, Johannes Herrnsdorf, and Michael Strain at the University of Strathclyde.

REFERENCE

1. E. Le Francois et al., Optics Express, 29, 2, 1502-1515 (2021); https://doi.org/10.1364/oe.408658.

About the Author

John Wallace | Senior Technical Editor (1998-2022)

John Wallace was with Laser Focus World for nearly 25 years, retiring in late June 2022. He obtained a bachelor's degree in mechanical engineering and physics at Rutgers University and a master's in optical engineering at the University of Rochester. Before becoming an editor, John worked as an engineer at RCA, Exxon, Eastman Kodak, and GCA Corporation.
