Researchers at the MIT Media Lab (Cambridge, MA) have developed a three-dimensional (3D) "nano-camera" that can operate at the speed of light and costs only $500 to build. The camera has utility in medical imaging and in collision-avoidance detectors for cars, and could improve the accuracy of motion tracking and gesture-recognition devices used in interactive gaming.
The camera is based on "time-of-flight" technology like that used in Microsoft's recently launched second-generation Kinect device, in which the location of objects is calculated by how long it takes a light signal to reflect off a surface and return to the sensor. However, unlike existing devices based on this technology, the new camera is not fooled by rain, fog, or even translucent objects, says co-author Achuta Kadambi, a graduate student at MIT.
"Using the current state of the art, such as the new Kinect, you cannot capture translucent objects in 3D," Kadambi says. "That is because the light that bounces off the transparent object and the background smear into one pixel on the camera. Using our technique, you can generate 3D models of translucent or near-transparent objects."
In a conventional time-of-flight camera, a light signal is fired at a scene, where it bounces off an object and returns to strike the pixel. Since the speed of light is known, it is then simple for the camera to calculate the distance the signal has traveled and therefore the depth of the object from which it has been reflected.
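The distance calculation in a conventional time-of-flight camera is straightforward to sketch. The function name and the 10-nanosecond round trip below are illustrative, not figures from the MIT device:

```python
# Minimal sketch of the basic time-of-flight depth calculation:
# depth = (speed of light x round-trip time) / 2, since the signal
# travels to the surface and back.
C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Depth of the reflecting surface; the signal covers the distance twice."""
    return C * t_seconds / 2.0

# A return delayed by 10 nanoseconds puts the surface about 1.5 m away.
print(round(depth_from_round_trip(10e-9), 3))  # 1.499
```

Because light covers about 30 cm per nanosecond, even nanosecond-scale timing gives depth resolution useful at room scale.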
Unfortunately, though, changing environmental conditions, semitransparent surfaces, edges, or motion all create multiple reflections that mix with the original signal and return to the camera, making it difficult to determine which is the correct measurement.
Instead, the new device uses an encoding technique commonly used in the telecommunications industry to calculate the distance a signal has traveled, says Ramesh Raskar, an associate professor of media arts and sciences and leader of the Camera Culture group within the Media Lab, who developed the method alongside Kadambi, Refael Whyte, Ayush Bhandari, and Christopher Barsi at MIT and Adrian Dorrington and Lee Streeter from the University of Waikato in New Zealand.
"We use a new method that allows us to encode information in time," Raskar says. "So when the data comes back, we can do calculations that are very common in the telecommunications world, to estimate different distances from the single signal."
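The unmixing idea can be illustrated with a toy correlation decoder: emit a known code rather than a lone pulse, and correlate the smeared return against that code to recover each path's delay. Everything below (the random ±1 code, the two delays, the amplitudes) is invented for illustration; the actual MIT camera uses custom hardware modulation codes, not this simple scheme:

```python
# Toy sketch of coded time-of-flight unmixing. Two light paths with
# different delays smear into one received signal; correlating against
# the known emitted code separates them again.
import random

random.seed(0)
N = 256
code = [random.choice([-1, 1]) for _ in range(N)]  # known emitted code

def delayed(seq, d):
    """Shift a sequence right by d samples, zero-padding the front."""
    return [0] * d + seq[: len(seq) - d]

# Two returns (say, a translucent surface and the wall behind it)
# arrive at delays of 5 and 12 samples and mix into one signal.
received = [a + 0.5 * b for a, b in zip(delayed(code, 5), delayed(code, 12))]

# Correlate the mixed return with the code at every lag; the random
# +/-1 code correlates strongly only at the true delays.
corr = [sum(r * c for r, c in zip(received[d:], code)) for d in range(N)]
peaks = sorted(range(N), key=lambda d: corr[d], reverse=True)[:2]
print(sorted(peaks))  # the two recovered path delays
```

In the same spirit as the quote above, a single mixed measurement yields multiple distances once the illumination carries a decodable structure.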
The idea is similar to existing techniques that clear blurring in photographs, says Bhandari, a graduate student in the Media Lab. "People with shaky hands tend to take blurry photographs with their cellphones because several shifted versions of the scene smear together," Bhandari says. "By placing some assumptions on the model, for example that much of this blurring was caused by a jittery hand, the image can be unsmeared to produce a sharper picture."
In 2011, Raskar's group unveiled a trillion-frame-per-second camera capable of capturing a single pulse of light as it traveled through a scene. The camera does this by probing the scene with a femtosecond impulse of light and then using fast but expensive laboratory-grade optical equipment to take an image each time. However, this "femto-camera" costs around $500,000 to build.
In contrast, the new "nano-camera" probes the scene with a continuous-wave signal that oscillates at nanosecond periods. This allows the team to use inexpensive hardware: off-the-shelf light-emitting diodes (LEDs) can strobe at nanosecond periods, for example, meaning the camera can reach a time resolution within one order of magnitude of femtophotography while costing just $500.
"By solving the multipath problem, essentially just by changing the code, we are able to unmix the light paths and therefore visualize light moving across the scene," Kadambi says. "So we are able to get similar results to the $500,000 camera, albeit of slightly lower quality, for just $500."