# Case Study: 3D Image Analysis

### Image Analysis of a 3D Image

Several manufacturers sell 3D cameras that use 2D sensor arrays sensitive to the phase of reflected laser light. All of them spread the laser light so that it continuously illuminates the entire scene of interest. The laser light is modulated synchronously with the pixel sensitivity, so the phase of the light reflected back to the sensor pixels depends on the distance to the reflection points. This is the basis for calculating the XYZ positions of the illuminated surface points.
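The phase-to-distance conversion described above can be sketched as follows. This is an illustrative model, not any particular camera's firmware; the 20 MHz modulation frequency in the example is an assumption, not a specification of the cameras discussed here.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift of continuously modulated laser light.

    The light travels to the target and back, so a full 2*pi phase wrap
    corresponds to half a modulation wavelength; hence the factor of 4*pi.
    """
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range(mod_freq_hz: float) -> float:
    """Largest distance measurable before the phase wraps around."""
    return C / (2.0 * mod_freq_hz)

# Assumed example: 20 MHz modulation, quarter-cycle phase shift
d = phase_to_distance(math.pi / 2.0, 20e6)   # about 1.87 m
r = unambiguous_range(20e6)                  # about 7.5 m
```

Higher modulation frequencies give finer phase resolution per unit distance but shrink the unambiguous range, which is one of the design trade-offs behind the quality differences mentioned above.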

While this basic principle of operation is shared by a number of 3D cameras, many technical details determine the quality of the obtained data.

The best known of these 3D cameras is the Microsoft Kinect. It also provides the best distance measurement resolution: according to our measurements, the standard deviation of the distance to both white and relatively dark objects is below 2 mm. Most 3D cameras have higher distance measurement noise, often unacceptably high even for a relatively high target reflectivity of 20 percent. Here we show example data obtained with one not-so-good European 3D camera that uses a 2D array of time-of-flight-sensitive pixels. Using the camera's default settings, we measured distances to a white target 110 cm from the camera, which is close to the camera's default calibration setup distance. Deviations of the measured distance from a smooth approximating surface in the center of the image are shown by the point cloud here:

Here X and Y are distances from the image center measured in pixels.
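A minimal way to compute such deviations is to fit a smooth reference surface by least squares and examine the residuals. The sketch below fits a plane in pure Python; the source does not say which surface model was actually used, so the plane and the function names are illustrative assumptions.

```python
import statistics

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to (x, y, z) samples,
    solving the 3x3 normal equations with Cramer's rule."""
    n = float(len(points))
    sx = sy = sz = sxx = syy = sxy = sxz = syz = 0.0
    for x, y, z in points:
        sx += x;  sy += y;  sz += z
        sxx += x * x;  syy += y * y;  sxy += x * y
        sxz += x * z;  syz += y * z
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    v = [sxz, syz, sz]

    def det3(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

    d = det3(m)
    coeffs = []
    for col in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][col] = v[r]
        coeffs.append(det3(mi) / d)
    return tuple(coeffs)  # (a, b, c)

def deviation_std(points):
    """Standard deviation of the residuals from the fitted plane."""
    a, b, c = fit_plane(points)
    residuals = [z - (a * x + b * y + c) for x, y, z in points]
    return statistics.pstdev(residuals)
```

For a curved target or an uncorrected lens, a low-order polynomial surface would replace the plane, but the residual-based noise estimate works the same way.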

Here is a histogram of the distance noise:

Both figures show that, in addition to Gaussian-looking noise, some points have very large deviations. Such large deviations are caused by strong fixed-pattern noise (differences between pixel sensitivities).
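One common way to separate such fixed-pattern outliers from the Gaussian-looking noise is a robust threshold based on the median absolute deviation; this is an illustrative sketch, not the method used in the study above.

```python
import statistics

def flag_outliers(residuals, k=3.5):
    """Return a per-point flag: True where the deviation from the median
    exceeds k robust standard deviations.

    The factor 1.4826 scales the median absolute deviation (MAD) to the
    sigma of a Gaussian, so the threshold is insensitive to the outliers
    it is trying to find (unlike a plain standard deviation).
    """
    med = statistics.median(residuals)
    mad = statistics.median([abs(r - med) for r in residuals])
    sigma = 1.4826 * mad
    return [abs(r - med) > k * sigma for r in residuals]
```

Pixels that are flagged repeatedly across frames are candidates for a fixed-pattern calibration map rather than random noise.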

While the noise of this camera is at least 8 times higher than that of the Kinect, more problems become visible when looking at 2D projections of the 3D point cloud. When projected onto the camera plane, the color-coded distances (shown in cm) do not look too bad for a simple scanned scene:

Here the X index and Y index are simply pixel numbers in the X and Y directions.

The picture becomes more interesting when looking at the projection of the same point cloud onto the X-Distance plane:

We can clearly see stripes separated by about 4 cm in distance.
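Stripes like these can be quantified directly from the raw distances by histogramming and measuring the spacing between local maxima. The helper names below are invented for illustration; real stripe analysis would likely use finer bins and smoothing.

```python
def distance_histogram(distances_cm, bin_cm=1.0):
    """Bin distances; spurious stripes appear as evenly spaced peaks."""
    lo = min(distances_cm)
    counts = {}
    for d in distances_cm:
        b = int((d - lo) / bin_cm)
        counts[b] = counts.get(b, 0) + 1
    # Emit every bin, including empty ones, as (center, count) pairs
    return [(lo + (b + 0.5) * bin_cm, counts.get(b, 0))
            for b in range(max(counts) + 1)]

def peak_spacings(hist):
    """Gaps between consecutive local maxima of the histogram."""
    centers = [hist[i][0] for i in range(1, len(hist) - 1)
               if hist[i][1] > hist[i - 1][1] and hist[i][1] > hist[i + 1][1]]
    return [b - a for a, b in zip(centers, centers[1:])]
```

A run of near-equal spacings (here, about 4 cm) distinguishes a systematic artifact from random noise, which would produce irregularly spaced peaks.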

Nothing like these spurious stripes appears in point clouds from good 3D cameras such as the Kinect.

Our conclusion from this 3D image analysis is that this European 3D camera is not competitive with the best available 3D cameras.
