HIGH-SPEED IMAGING: World’s fastest camera images with time-domain data stream
Researchers at the University of California, Los Angeles (UCLA; Los Angeles, CA) have developed a real-time camera using a serial time-encoded amplified imaging technique that achieves frame rates at least 1000 times faster than those of traditional sensor-based cameras.1
While CCD and CMOS sensor-based cameras serve numerous high-speed imaging applications at speeds of tens of thousands of frames per second (and up to 10⁶ frames/s in the case of the Shimadzu HPV-1 high-speed video camera), their fundamental architecture prevents ultrafast real-time imaging with high sensitivity and resolution: it simply takes time to read out data from sensor arrays. Even streak cameras, which can capture attosecond-scale events, operate in burst mode and must be synchronized with the event to be captured, making them unable to record random or unique events.
Building up STEAM
Rather than reading out individual pixel data from a two-dimensional (2-D) sensor array, the UCLA imaging technique is adapted from serial time-encoded amplified microscopy (STEAM) imaging technology.
In essence, a 2-D image of an object is encoded into a serial time-domain waveform that is amplified and captured by a single-pixel photodiode and an oscilloscope. Amplifying the image in the optical domain enables continuous, real-time operation at a 6.1 MHz frame rate and a 440 ps shutter speed.
Mapping of the 2-D image into an amplified serial time-domain waveform is accomplished by first encoding the 2-D spatial information onto the spectrum of a broadband pulse using a 2-D spatial disperser that includes an orthogonally oriented virtually imaged phased array (VIPA) and a diffraction grating.
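The raster-style mapping performed by the orthogonal VIPA/grating pair can be sketched numerically: treating each discrete wavelength channel as one pixel, the VIPA's steep dispersion within each free spectral range sets the column, while the grating separates successive ranges into rows. The pixel counts below are illustrative assumptions, not values from the article.

```python
def spectral_raster(channel, cols):
    """Map a 1-D wavelength-channel index to (row, col) in the 2-D raster.

    The VIPA disperses wavelengths steeply within each free spectral
    range (fast axis -> columns); the grating separates successive
    ranges (slow axis -> rows). Pixel counts here are assumed."""
    return divmod(channel, cols)

# Example: a hypothetical 4 x 8 raster (32 wavelength channels).
rows, cols = 4, 8
mapping = [spectral_raster(ch, cols) for ch in range(rows * cols)]
assert mapping[0] == (0, 0)   # first channel -> top-left pixel
assert mapping[7] == (0, 7)   # end of first range -> end of first row
assert mapping[8] == (1, 0)   # next range wraps to the next row
```

The same divmod logic, run in reverse, is what lets a single time-ordered wavelength sweep carry a full 2-D image.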
The optical source for the STEAM camera is a modelocked femtosecond laser with a 6.1 MHz repetition rate that is input to a length of nonlinear optical fiber to generate a supercontinuum with a bandwidth of 40 nm centered around 1590 nm. The source is sent through the 2-D spatial disperser to an object. The dispersed pulse is then reflected off the object and back through the disperser for wavelength recombination.
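For orientation, the 6.1 MHz repetition rate quoted above fixes the frame period, and hence the time available to stretch and digitize each frame; a back-of-the-envelope check (the derived figure is not stated in the article):

```python
# Frame period implied by the laser repetition rate (rate from the article).
rep_rate = 6.1e6                 # Hz
frame_period = 1.0 / rep_rate    # seconds per frame
print(f"{frame_period * 1e9:.0f} ns per frame")  # prints "164 ns per frame"
```

Each stretched time-domain waveform must therefore fit within roughly 164 ns to avoid overlapping the next frame.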
This signal is amplified by 10 dB in an erbium-doped fiber amplifier (EDFA) and boosted an additional 15 dB by distributed Raman amplification in a dispersive fiber, which also performs a Fourier-transform operation, mapping the spectrum into the time domain. As the image enters the fiber, the fiber's aperture rejects scattered light from out-of-focus planes, effectively rendering the system a confocal microscope.
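The wavelength-to-time mapping in the dispersive fiber follows the dispersive Fourier transform relation Δt = D·L·Δλ. In the sketch below, the net dispersion D·L is an assumed illustrative value (the article gives only the 40 nm bandwidth), while the net gain simply combines the stated 10 dB and 15 dB stages:

```python
# Dispersive Fourier transform: a wavelength offset d_lambda arrives with
# a time delay d_t = DL * d_lambda, where DL is the fiber's total
# group-velocity dispersion.
DL = -1.0e3        # ps/nm; ASSUMED illustrative value, not from the article
bandwidth = 40.0   # nm; supercontinuum bandwidth (from the article)
stretch = abs(DL) * bandwidth   # ps
print(f"stretched waveform spans {stretch / 1e3:.0f} ns")  # prints "40 ns"

# Net optical gain from the two stated amplification stages.
gain_db = 10 + 15               # dB (EDFA + Raman, from the article)
gain_linear = 10 ** (gain_db / 10)
print(f"{gain_db} dB is a factor of {gain_linear:.0f}")  # factor of 316
```

With this assumed dispersion, the 40 ns stretched waveform fits comfortably within one frame period of the 6.1 MHz pulse train.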
Optical amplification is necessary to raise the signal above the noise floor of the single-pixel photodiode receiver. The resulting time-domain waveform can then be analyzed with a conventional oscilloscope; to obtain an image, the one-dimensional temporal data stream is sorted into a 2-D matrix.
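That final sorting step amounts to reshaping the digitized sample stream, frame by frame, into image matrices. A minimal NumPy sketch, assuming an illustrative frame geometry (the pixel counts are not from the article):

```python
import numpy as np

# Assumed illustrative frame geometry.
rows, cols = 4, 8
n_frames = 3

# Stand-in for the oscilloscope record: one sample per image pixel,
# with successive frames concatenated back to back in time.
stream = np.arange(n_frames * rows * cols, dtype=float)

# Sort the 1-D temporal data stream into a stack of 2-D image matrices.
frames = stream.reshape(n_frames, rows, cols)
assert frames.shape == (3, 4, 8)
assert frames[1, 0, 0] == rows * cols  # frame 2 starts where frame 1 ended
```

Because the wavelength-to-time mapping is fixed by the disperser and fiber, this reshape is all the "image reconstruction" the camera requires.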
Capturing laser ablation
To demonstrate real-time operation of the STEAM camera, the researchers captured images of laser ablation of a sample consisting of a bilayer of aluminum and silicon dioxide on a silicon-on-insulator substrate (see figure). A real-time sequence of ablation images corresponds to the dynamics of laser-induced mass ejection caused by the ablation pulse.
Bahram Jalali, professor of electrical engineering and principal investigator of the research at UCLA, says, “Ultrahigh-speed imaging is a potentially effective solution for detection of abnormal, rogue cells in blood that are indicative of early stage cancer and other disease.”
Keisuke Goda, postdoctoral fellow and lead author of the paper at UCLA, adds, “Traditional blood analyzers can count cells and extract information about their size, but they cannot take pictures of every cell because there are no cameras fast and sensitive enough for the job.”
1. K. Goda et al., Nature 458, 1145 (April 30, 2009).