TRACKING AND ID FOR SECURITY: SWIR technology takes surveillance to a new level

Sept. 1, 2007
A short-wave-infrared camera, in combination with an eye-safe laser-radar system, provides sophisticated tracking abilities, and can take advantage of natural “nightglow” radiation.


Facility security and surveillance technology is critical to both military and commercial markets. Although radar has been a long-time solution and an excellent technology for long-range object detection, it is not ideal for short-range perimeter defense or intruder identification and tracking. The most recent developments in target tracking, target identification, and high-speed free-space communication involve the shortwave-infrared (SWIR) region of the electromagnetic spectrum. Lasers at SWIR wavelengths can be eye-safe at high power and propagate more efficiently through the atmosphere than visible lasers. These laser emissions are undetectable to most imaging technologies, including silicon CCD cameras, night-vision tubes, and long-wave IR cameras.

A newly developed laser-radar-based area-surveillance system, called the Laser Perimeter Awareness System (LPAS), operates in the SWIR and can simultaneously detect a perimeter breach, track multiple targets, and slew a video detector to identify a moving object. An indium gallium arsenide (InGaAs) camera can identify targets day or night, supplying enhanced image data not available from other camera technologies. The system is also covert to most detection methods, including night-vision goggles, radar-signal detectors, visible CCD cameras, and thermal imagers.

In free-space laser-tracking applications, high-speed acquisition of the laser source enables fast tracking of its position to ensure the best signal strength. The signal can then be collected by a bore-sighted optic that focuses the light onto an InGaAs avalanche photodiode capable of gigabit data-transfer rates. A large-format (640 × 512-pixel), high-speed windowing SWIR camera developed by Goodrich SUI (Princeton, NJ) meets this fast-tracking requirement with 16 × 16-pixel windowed frame rates of 15,151 frames per second.
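Windowing trades field of view for update rate. A quick calculation, using only the figures quoted above, shows why the windowed mode supports fast tracking:

```python
# Arithmetic on the quoted camera figures: a 16 x 16-pixel window read out
# at 15,151 frames/s gives a position update roughly every 66 microseconds,
# far faster than the full 640 x 512-pixel format could be read.
WINDOW_FPS = 15_151                      # quoted windowed frame rate
window_pixels = 16 * 16                  # region-of-interest size
full_frame_pixels = 640 * 512            # full sensor format

update_interval_us = 1e6 / WINDOW_FPS
pixel_rate = window_pixels * WINDOW_FPS  # pixels read per second in windowed mode

print(f"update interval: {update_interval_us:.1f} microseconds")
print(f"windowed pixel rate: {pixel_rate / 1e6:.2f} Mpixel/s")
```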

Imaging through haze

Mie and Rayleigh scattering in the atmosphere can obscure the transmission of shorter visible wavelengths, often producing extremely low contrast and unidentifiable images with visible cameras. Rayleigh scattering applies only to particles much smaller than the wavelength of interest; its scattering coefficient is inversely proportional to the fourth power of the wavelength (λ⁻⁴), so shorter wavelengths are scattered significantly more than longer ones. Scattering from particles whose size approaches or exceeds the wavelength of the incident light is described by Mie theory; this effect is also wavelength-dependent, again scattering shorter wavelengths more readily.
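As a rough illustration of the λ⁻⁴ dependence, the sketch below compares Rayleigh scattering of a mid-visible wavelength with the 1550 nm SWIR wavelength discussed later in the article; the 550 nm choice is illustrative, not from the text:

```python
# Relative Rayleigh scattering strength scales as wavelength**-4, so the
# ratio between two wavelengths is (lambda_long / lambda_short)**4.
def rayleigh_ratio(lambda_short_nm, lambda_long_nm):
    """Ratio of Rayleigh scattering coefficients at two wavelengths."""
    return (lambda_long_nm / lambda_short_nm) ** 4

ratio = rayleigh_ratio(550.0, 1550.0)
print(f"550 nm light is Rayleigh-scattered ~{ratio:.0f}x more than 1550 nm")
```

The roughly 63:1 ratio is one reason SWIR imagery retains contrast through haze that defeats visible cameras.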

Longer wavelengths transmit more readily through hazy atmosphere, allowing objects to be seen over greater distances. In one example, three IR cameras each detected a different wavelength range (see Fig. 1). The selected scene was not visible through the haze to the naked eye, nor was a CCD camera able to resolve the scene shown by the longer-wavelength cameras. All three cameras produced images through the mist and haze, but the nature of the reflected, rather than emitted, light produced a higher-contrast image in the SWIR wavelength range of 0.9 to 1.7 µm.

A SWIR camera can identify objects through moderate mist and haze conditions more accurately than can visible, near-IR, and some thermal cameras. Longer wavelengths transmit through the hazy atmosphere to produce images of objects over greater distances. As a result, the InGaAs SWIR camera produces high-contrast, high-resolution images with important information like license plate numbers on vehicles and identification details such as facial features, hair on a person’s head, and the style of clothing being worn. This information is often lost when relying on thermal-camera options.

Eye-safe laser radar

The LPAS detects laser light reflected from an object and computes its range from the round-trip time required for the light to travel to the object and return to the sensor. The system can detect multiple walkers at ranges from 20 to 250 m, and vehicles the size of cars or small boats up to one kilometer away.
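The time-of-flight relation behind this ranging is simply range = c·t/2. A minimal sketch, using the quoted 250 m walker limit as the example:

```python
# Time-of-flight ranging: the laser pulse travels out and back, so the
# target range is half the round-trip distance.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_round_trip(t_seconds):
    """Target range in meters from the measured laser round-trip time."""
    return C * t_seconds / 2.0

# A walker at 250 m returns an echo after roughly 1.67 microseconds.
t = 2 * 250.0 / C
print(f"round trip for a 250 m target: {t * 1e6:.2f} microseconds")
```

The sub-two-microsecond echo time is what lets the system interrogate many beam directions per second while scanning a perimeter.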

The 1550 nm laser wavelength of the LPAS beam lies outside the detection band of CCD cameras and standard night-vision equipment and does not interfere with video-surveillance systems. Laser radiation at 1550 nm is eye-safe at much higher power than visible (400 to 700 nm) or near-IR (700 to 1000 nm) lasers. The surface of the eye absorbs 1550 nm radiation efficiently, and at the low power levels used by the LPAS it does so without burning or other harm, so the radiation never reaches the retina; visible and near-IR wavelengths, by contrast, pass through to the retina and can cause damage at the same power levels.
The LPAS detects intruders by first generating a background clutter map of the terrain. After storing the clutter map in memory, it continues to collect laser echoes and searches for intrusions in real time against this three-dimensional map. Laser echoes that do not match the clutter map within a specified tolerance trigger an intruder alert at an appropriate interface. The interface can be a display screen on which intruder icons form a track overlaid on an aerial photo, an audible alarm, or a network linking multiple sensors together (see Fig. 2). The alert contains the range, bearing, and elevation coordinates of the intruder, as well as a time stamp.
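The clutter-map test can be sketched as follows. The data structures, the one-meter tolerance, and the per-direction map layout are illustrative assumptions, not details of the LPAS design:

```python
# Sketch of background-subtraction ranging: each beam direction stores the
# background echo range; a live echo that deviates by more than a tolerance
# produces an alert carrying range, bearing, elevation, and a time stamp.
import time

TOLERANCE_M = 1.0  # assumed range tolerance, not from the article

def check_echo(clutter_map, direction, echo_range_m):
    """Return an alert dict if the echo deviates from the stored background."""
    background = clutter_map.get(direction)
    if background is None or abs(echo_range_m - background) > TOLERANCE_M:
        bearing_deg, elevation_deg = direction
        return {
            "range_m": echo_range_m,
            "bearing_deg": bearing_deg,
            "elevation_deg": elevation_deg,
            "timestamp": time.time(),
        }
    return None  # echo matches the background clutter map

clutter = {(45.0, 0.0): 120.0}                   # e.g. a fence line at 120 m
print(check_echo(clutter, (45.0, 0.0), 119.8))   # matches background: no alert
print(check_echo(clutter, (45.0, 0.0), 80.0))    # closer echo: intruder alert
```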

When the LPAS locates an intruder, it can automatically cue another sensor (such as a CCD camera, SWIR camera, or thermal imager) to slew its field of view to the target, allowing security personnel to assess the threat level. Used in this capacity, high-sensitivity SWIR cameras with built-in image-enhancement algorithms provide excellent night-vision performance not possible with visible cameras and night-vision intensifier-tube-coupled CCDs (see Fig. 3).
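Because the alert carries range, bearing, and elevation, cueing a second sensor is a geometry problem. A hedged sketch, assuming a flat local coordinate frame and an illustrative camera offset (none of these numbers come from the article):

```python
# Convert an LPAS detection (range, bearing, elevation) into pan/tilt angles
# for a camera mounted at a different position. Frame: x east, y north, z up;
# bearing measured clockwise from north.
import math

def cue_camera(range_m, bearing_deg, elevation_deg, cam_offset_xyz):
    """Pan/tilt (degrees) that point a displaced camera at the target."""
    b = math.radians(bearing_deg)
    e = math.radians(elevation_deg)
    # Target position in the LPAS frame.
    x = range_m * math.cos(e) * math.sin(b)
    y = range_m * math.cos(e) * math.cos(b)
    z = range_m * math.sin(e)
    # Vector from the camera to the target.
    dx = x - cam_offset_xyz[0]
    dy = y - cam_offset_xyz[1]
    dz = z - cam_offset_xyz[2]
    pan = math.degrees(math.atan2(dx, dy))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Camera 10 m east of the LPAS head, 2 m up; target 200 m out at bearing 30.
pan, tilt = cue_camera(200.0, 30.0, 0.0, (10.0, 0.0, 2.0))
print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg")
```

For a camera colocated with the LPAS head, pan and tilt reduce to the reported bearing and elevation, which is a handy sanity check.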


When artificial lights are not present in a scene at night, many cameras cannot produce an image because of the lack of available photons. However, the natural phenomenon dubbed “nightglow” provides SWIR cameras with a ready light source. Nightglow is radiation produced in the Earth’s mesosphere (at altitudes from 50 to 85 km) at wavelengths that lie predominantly in the SWIR band but are not significant in the visible or near-IR regions to which night-vision goggles are sensitive. The phenomenon arises from a chemical reaction involving the hydroxyl radical, along with other molecular and atomic emissions, and produces radiation at wavelengths detectable only by the most sensitive shortwave-infrared cameras. Nightglow illuminates any object that is not under direct cover and is often strong enough for SWIR cameras to produce images in the dead of night, even when there is no moonlight.

Because cloud cover can reduce the available nightglow, a SWIR camera can be equipped with an active-illumination system to light up the field of view while still remaining covert to other laser-identification and night-vision systems. A dispersed eye-safe 1550 nm laser provides illumination over much greater distances than lasers detectable by other technologies. Coupled with the very high (greater than 80%) quantum efficiency of the SWIR camera at 1550 nm, the combination produces a system capable of illuminating and identifying targets well beyond the range of other active-illumination systems.


1. Jamieson et al., Laser Perimeter Awareness System (LPAS), Goodrich Corp., U.S. Patent 6,985,212 (2006).

Marc Hansen is an applications engineer at SUI-Goodrich, 3490 Route 1, Building 12, Princeton, NJ 08540; e-mail: [email protected]; Mark D. Ray is a senior principal engineer and Owen D. Evans is a principal engineer at Goodrich Sensor Systems, 14300 Judicial Rd., Burnsville, MN 55306.
