NIST creates tools and procedure to dynamically characterize optical trackers used on robotic arms
The ability to pinpoint exactly where the end of a robotic arm is in 3D space is critical for most robotic applications, such as operating a robotic forklift around a factory, manipulating a mechanical arm on an assembly line, or guiding a remote-controlled laser scalpel inside a patient. To make that measurement more reliable, a public-private team led by the National Institute of Standards and Technology (NIST; Gaithersburg, MA) has created a new standard test method to evaluate how well an optical tracking system can define an object's position and orientation—known as its "pose"—with six degrees of freedom: up/down, right/left, forward/backward, pitch, yaw, and roll.
Optical tracking systems work on a principle similar to the stereoscopic vision of a human. In an optical tracking system, two or more cameras record the room and are partnered with beam emitters that bounce a signal—infrared, laser, or lidar—off objects in the area. With both data sources feeding into a computer, the room and its contents can be virtually recreated.
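As a rough illustration of the geometry involved, the sketch below triangulates a single point seen by two calibrated cameras using a direct linear transform. The camera parameters and the point are made-up numbers for illustration only, not a description of any particular tracking product.

```python
# Minimal stereo-triangulation sketch (illustrative, not any vendor's implementation):
# two cameras with known 3x4 projection matrices observe the same point, and the 3D
# position is recovered by linear least squares (direct linear transform).
import numpy as np

def project(P, X):
    """Project a 3D point through a 3x4 camera matrix to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from its pixel coordinates in two calibrated cameras."""
    u1, v1 = uv1
    u2, v2 = uv2
    # Each observation contributes two linear constraints on the homogeneous point.
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical camera geometry: shared intrinsics, 0.5 m horizontal baseline.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

point = np.array([0.1, -0.05, 2.0])   # true 3D point (meters)
print(triangulate(P1, P2, project(P1, point), project(P2, point)))  # ~ [0.1, -0.05, 2.0]
```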
Determining the pose of an object is relatively easy if it doesn't move, and previous performance tests for optical tracking systems relied solely on static measurements. However, for systems such as those used to pilot automated guided vehicle (AGV) forklifts, which operate in many factories and warehouses, static measurements aren't good enough: these systems must locate both stationary and moving objects accurately to work efficiently and safely. To address this need, a recently approved ASTM International standard (ASTM E3064-16) now provides a standard test method for evaluating the performance of optical tracking systems that measure pose in six degrees of freedom for static—and for the first time, dynamic—objects.
NIST engineers helped develop both the tools and procedure used in the new standard. "The tools are two barbell-like artifacts for the optical tracking systems to locate during the test," says NIST electronics engineer Roger Bostelman. "Both artifacts have a 300 mm bar at the center, but one has six reflective markers attached to each end while the other has two 3D shapes called cuboctahedrons [a solid with 8 triangular faces and 6 square faces]." Optical tracking systems can measure the full poses of both targets.
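The article doesn't say how a tracker converts a set of marker positions into a full pose, but one common approach is a least-squares rigid-body (Kabsch) fit between the markers' known layout on the artifact and their measured positions in the room. The sketch below uses a hypothetical four-marker layout, not the geometry specified in the standard.

```python
# Sketch of a rigid-body (Kabsch) fit: recover the rotation R and translation t that
# map marker coordinates in the artifact's own frame to their measured positions in
# the tracker's frame. The marker layout below is hypothetical, not the ASTM artifact.
import numpy as np

def fit_pose(model_pts, measured_pts):
    """Least-squares rigid transform (R, t) such that measured ~= model @ R.T + t."""
    cm, cd = model_pts.mean(axis=0), measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cm
    return R, t

# Hypothetical marker layout on one end of a barbell-like artifact (meters).
model = np.array([[0, 0, 0], [0.03, 0, 0], [0, 0.03, 0], [0, 0, 0.03]], dtype=float)

# Simulated measurement: the artifact rotated 30 degrees about z and shifted.
ang = np.deg2rad(30)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                   [np.sin(ang),  np.cos(ang), 0],
                   [0, 0, 1]])
measured = model @ R_true.T + np.array([1.0, 2.0, 0.5])

R, t = fit_pose(model, measured)
print(np.allclose(R, R_true), t)   # True [1.  2.  0.5]
```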
According to Bostelman’s colleague, NIST computer scientist Tsai Hong, the test is conducted by having the evaluator walk two defined paths—one up and down the test area and the other from left to right—with each artifact. Moving an artifact along the course provides the x-, y-, and z-axis measurements, while turning it three ways relative to the path provides the pitch, yaw, and roll aspects.
"Our test bed at NIST's Gaithersburg, Maryland, headquarters has 12 cameras with infrared emitters stationed around the room, so we can track the artifact throughout the run and determine its pose at multiple points," Hong says. "And since we know that the reflective markers or the irregular shapes on the artifacts are fixed at 300 mm apart, we can calculate and compare with extreme precision the measured distance between those poses."
Bostelman says the new standard can evaluate the ability of an optical tracking system to locate objects in 3D space with unprecedented accuracy. "We found that the margin of error is 0.02 millimeters for assessing static performance and 0.2 mm for dynamic performance," he says.
Along with robotics, optical tracking systems are at the heart of a variety of applications including virtual reality in flight/medical/industrial training, motion capture in film production, and image-guided surgical tools. "The new standard provides a common set of metrics and a reliable, easily implemented procedure that assesses how well optical trackers work in any situation," Hong says.
The E3064-16 standard test method was developed by ASTM Subcommittee E57.02 on Test Methods, a group with representatives from various stakeholders, including manufacturers of optical tracking systems, research laboratories, and industrial companies. The E3064-16 document, which details construction of the artifacts, setup of the test course, the formulas for deriving pose measurement error, and the procedure for conducting the evaluation, may be found on the ASTM website (https://www.astm.org).

John Wallace | Senior Technical Editor (1998-2022)
John Wallace was with Laser Focus World for nearly 25 years, retiring in late June 2022. He obtained a bachelor's degree in mechanical engineering and physics at Rutgers University and a master's in optical engineering at the University of Rochester. Before becoming an editor, John worked as an engineer at RCA, Exxon, Eastman Kodak, and GCA Corporation.