JAPANWATCH: Varying focal length improves ranging


Nov 1st, 1999

Incorporating news from O plus E magazine, Tokyo

TOKYO—The research laboratory of Makoto Sato, graduate student Michio Sanwa, and others in the engineering department of the Tokyo Institute of Technology has developed an improved method of measuring the distance to objects by taking multiple pictures with different focal points. By analyzing the degree to which the objects are out of focus, the distance can be estimated. Previously, this type of method could measure objects up to 5 m away, but the new variation can measure distances up to 20 m.

An image taken by shifting the focal point over many shots is known as a multiple-focal-point image. After focusing on a single point of an object, a change in the focal point causes a degree of blurring. By tracking this change in focus, the distance to the object can be estimated.


PHOTO. A novel distance-measuring system effectively increases the resolving power so a shallow object depth can be achieved (left). In this experiment, an object depth of 4 m was achieved with an object 20 m distant (right). The areas that are completely black (the sidewalk in front and some signs) are regions in which the distance could not be measured because there were no discernible patterns.

If the focal point of the camera is fixed, the blurring of the image changes as a function of the distance between the object and the camera, u. The range of values of u for which the object is in focus is called the object depth. "Being in focus" means that the spread of blurring is less than the pixel pitch of the imaging device.

The object depth becomes larger as the distance from the lens becomes longer: the angular size of the lens as seen from the object becomes smaller, the distance between the lens and the image focus becomes shorter, and the pixel pitch becomes larger relative to the size of the image (the resolving power becomes lower).

To determine distance u from the degree of blurring, the object depth must be shallow. However, when a conventional charge-coupled-device camera and lens combination is focused at a large distance, the object depth becomes so large that the method cannot be used. Consequently, this method could only be used to measure short distances in the past.
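The growth of object depth with distance can be sketched with the thin-lens equation. This is an illustrative model only, not the group's actual optics; the focal length, aperture, and pixel pitch in the usage below are assumed values chosen for the example.

```python
# Illustrative thin-lens model of blur and object depth.
# Not the research group's system; all parameters are assumptions.

def blur_diameter(u, u_focus, f, aperture):
    """Blur-circle diameter on the sensor for a point at distance u,
    with a lens of focal length f and aperture diameter `aperture`
    focused at distance u_focus.  All lengths in metres."""
    v = u * f / (u - f)                 # image distance for the object (thin lens: 1/f = 1/u + 1/v)
    v_s = u_focus * f / (u_focus - f)   # sensor position when focused at u_focus
    return aperture * abs(v_s - v) / v

def object_depth(u_focus, f, aperture, pixel_pitch, step=0.01):
    """Numerically find the range of distances whose blur stays below
    one pixel pitch -- 'in focus' as the article defines it."""
    lo = hi = u_focus
    u = u_focus
    while u > f * 1.1 and blur_diameter(u, u_focus, f, aperture) <= pixel_pitch:
        lo = u          # walk toward the lens while still in focus
        u -= step
    u = u_focus
    while u < 1000.0 and blur_diameter(u, u_focus, f, aperture) <= pixel_pitch:
        hi = u          # walk away from the lens while still in focus
        u += step
    return lo, hi
```

With an assumed 50-mm f/2 lens (aperture 25 mm) and 10-µm pixels, `object_depth(20.0, 0.05, 0.025, 10e-6)` yields a range many times wider than `object_depth(5.0, 0.05, 0.025, 10e-6)`, illustrating why a camera focused far away blurs too gradually for ranging.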

The research group has developed a camera that effectively increases the resolution, so a shallow object depth can be achieved. In this experiment, an object depth of 4 m was achieved with an object 20 m distant (see figure).

To measure the degree of blurring, the spatial-frequency components of the image can be measured. Contrast decreases in blurry images, and higher-frequency components die out. In this experiment, the intensity of each pixel was compared with the intensities of the surrounding pixels. The blurring of each block was measured and compared among three images with different focal points, and the distance was estimated.
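The block-by-block comparison described above can be sketched as follows. This is a hedged reconstruction of the general idea, not the group's algorithm: local contrast (here, intensity variance within a block) stands in for the spatial-frequency measure, the focal setting with the highest contrast is taken as best-focused, and low-texture blocks are marked unmeasurable, like the black regions in the figure.

```python
# Sketch of depth-from-focus over a stack of images taken at
# different focal points.  Function and parameter names are
# hypothetical, not from the article.
import numpy as np

def estimate_depth(images, focus_distances, block=8, texture_thresh=1e-3):
    """images: list of 2-D grayscale arrays, one per focal setting.
    focus_distances: the distance each image was focused at.
    Returns a coarse depth map (one value per block); NaN where no
    pattern is discernible."""
    h, w = images[0].shape
    depth = np.full((h // block, w // block), np.nan)
    for by in range(h // block):
        for bx in range(w // block):
            sl = (slice(by * block, (by + 1) * block),
                  slice(bx * block, (bx + 1) * block))
            # Local contrast: variance of pixel intensities in the block.
            # Blurring lowers contrast, so the sharpest image wins.
            contrasts = [img[sl].var() for img in images]
            if max(contrasts) > texture_thresh:   # enough texture to judge blur
                depth[by, bx] = focus_distances[int(np.argmax(contrasts))]
    return depth
```

In practice more than three focal settings, or interpolation between them, would give finer depth resolution; the article's experiment used three images.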

Using such distance information, objects at specified distances can be "cut away" from the background of an image. The group anticipates that this technology can be used in applications such as robotic vision.
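The "cut away" operation amounts to masking an image by its depth map. A minimal sketch, assuming a per-pixel depth map (the helper name and tolerance are hypothetical, not from the article):

```python
# Hypothetical depth-based cutout: keep only pixels whose estimated
# distance is near a target distance; everything else goes black.
import numpy as np

def cut_out(image, depth_map, target, tol=0.5):
    """Return a copy of `image` keeping pixels within `tol` metres of
    `target`; unmeasured pixels (NaN depth) also go black, since a
    NaN comparison is always False."""
    mask = np.abs(depth_map - target) <= tol
    return np.where(mask, image, 0)
```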

Courtesy O plus E magazine, Tokyo
