GigE Vision delivers a higher standard

May 1, 2006
Machine vision meets Gigabit Ethernet networking.

New standards and technologies usually take a while to develop, especially in the machine-vision industry. This spring, following a recent burst in development of Gigabit Ethernet cameras and with the Automated Imaging Association (AIA; Ann Arbor, MI) readying the final draft of its GigE Vision standard, changes in the form of networked industrial cameras are set to impact the future of machine-vision systems design.

These Gigabit Ethernet cameras are poised to open new applications and price/performance points for machine-vision system integrators. The combination of high data rates, standard interface hardware and camera controls, and low-cost cabling makes Gigabit Ethernet cameras potentially very attractive. However, questions still surround the AIA's GigE Vision standard: how it will be implemented, how it relates to important emerging standards such as GenICam, under development by the European Machine Vision Association (EMVA; Frankfurt, Germany), and what the system integrator requires to build GigE-based machine-vision systems.

Engineers in the machine-vision industry typically approach the subject of GigE cameras from a communications-protocol perspective, citing the benefits of a large installed base of networked CPUs and PLCs and the ease of transferring data between components using standards-based protocols. But until now, without accepted machine-vision-specific standards, Ethernet has not offered a practical means of transferring image data between cameras and computers.

Adding to the confusion, camera vendors have different perspectives on how to implement Gigabit Ethernet interfaces in their products. Some smart cameras, for example, perform high-speed image-processing functions on-board, producing pass/fail decisions that are transmitted to the host computer using the well-defined TCP/IP protocol suite. While these companies may call their cameras Gigabit Ethernet-compliant, the function of the communications protocol is simply to transmit pass/fail data, rather than images, over the network.
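The distinction matters in practice: a smart camera of this kind needs only a few bytes of network traffic per inspection. The following sketch illustrates the idea in Python; the host address, port, and the PASS/FAIL message format are hypothetical, not part of any vendor's actual protocol.

```python
import socket

def send_result(host: str, port: int, passed: bool) -> None:
    """Transmit a single pass/fail decision from a smart camera to a
    host over TCP. The message format here is illustrative only: the
    image itself never crosses the network, just the decision."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(b"PASS\n" if passed else b"FAIL\n")
```

A full image transfer at the same resolution would consume megabytes per frame; this is why "Gigabit Ethernet-compliant" in a datasheet does not by itself imply image streaming.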

Implementing the Standard

To implement the AIA GigE Vision standard, camera vendors must format image and control data into IP packets for transmission over Gigabit Ethernet networks. Currently, camera vendors are taking three approaches. The first is to use the iPORT IP engine from Pleora Technologies (Kanata, ON, Canada), either embedded in an FPGA or as an off-the-shelf product; both Imperx (Boca Raton, FL) and Mikrotron (Unterschleissheim, Germany) have incorporated the iPORT engine into their cameras. Other camera vendors off-load these functions using an FPGA in conjunction with a general-purpose processor and a Gigabit Ethernet controller chip.
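Whichever approach a vendor takes, the core job is the same: slicing each image into network-sized packets with enough header information for the host to reassemble the frame and detect loss. The sketch below shows the general shape of such a scheme in Python; the header layout, field sizes, and 1476-byte payload figure are illustrative assumptions, not the actual GigE Vision wire format.

```python
import struct

MTU_PAYLOAD = 1476  # assumed image bytes per packet on a standard 1500-byte MTU

def packetize(image: bytes, block_id: int) -> list[bytes]:
    """Split one image into a leader packet, N payload packets, and a
    trailer packet, in the spirit of a streaming protocol over UDP.
    Header layout (big-endian block id, packet id) is hypothetical."""
    packets = []
    # Leader (packet id 0) announces the image and carries its length.
    packets.append(struct.pack(">HHI", block_id, 0, len(image)))
    packet_id = 0
    # Payload packets carry consecutive slices of the image data.
    for i in range(0, len(image), MTU_PAYLOAD):
        packet_id = i // MTU_PAYLOAD + 1
        packets.append(struct.pack(">HH", block_id, packet_id) + image[i:i + MTU_PAYLOAD])
    # Trailer marks end of block so the host can detect missing packets.
    packets.append(struct.pack(">HHI", block_id, packet_id + 1, 0))
    return packets
```

Because the per-packet header includes a sequence number, a receiving host can reassemble frames arriving out of order and request or flag any gaps, which is the essential service the offload engines described above provide in hardware.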

Rather than use third-party products, Prosilica (Burnaby, BC, Canada) and Tattile (Brescia, Italy) have developed their own offload engines in hardware using an FPGA, and provide customers with custom drivers and software development kits. In the design of its forthcoming GigE camera, Alacron (Nashua, NH) uses an off-the-shelf processor with a built-in Gigabit Ethernet controller and IP software that will allow the camera to support both TCP/IP and UDP protocols.

To allow interoperability between cameras and third-party software, the GigE Vision standard can use the GenICam standard, which is designed to provide a single camera-control interface to all types of cameras, whether they use the GigE Vision, Camera Link, or 1394 FireWire standards. The GigE Vision and GenICam committees are also working together to produce a list of mandatory, optional, and recommended feature names and descriptions.
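The practical payoff of a single control interface is that application code addresses camera features by standardized name rather than by vendor-specific register. A minimal Python sketch of the idea follows; the class, feature names, and default values are hypothetical placeholders, not the GenICam API itself.

```python
class Camera:
    """Toy model of a name-based feature interface: application code
    written against get_feature/set_feature is identical regardless of
    whether the transport underneath is GigE Vision, Camera Link, or
    FireWire. Feature names and defaults here are illustrative."""

    def __init__(self):
        self._features = {"Width": 640, "Height": 480, "ExposureTime": 10000}

    def get_feature(self, name: str):
        return self._features[name]

    def set_feature(self, name: str, value) -> None:
        if name not in self._features:
            raise KeyError(f"unknown feature: {name}")
        self._features[name] = value
```

An agreed list of mandatory and recommended feature names, as the committees are producing, is what lets the same `set_feature("ExposureTime", ...)` call work across vendors.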

In addition to these development tasks, system latency remains a challenge: unlike a dedicated point-to-point link, Ethernet does not guarantee deterministic delivery times. GigE cameras with I/O capabilities can work around this limitation to a certain extent. By time-stamping each image and correlating it to an encoder count, the I/O system can monitor the location of, for example, a product on a conveyor belt, even when network delivery of the image is delayed.
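The correlation step amounts to interpolating between encoder readings captured by the I/O system. A simple Python sketch of one way this could be done is below; the sampling scheme and linear interpolation are assumptions for illustration, not a prescribed method from the standard.

```python
def encoder_at(timestamp: float, samples: list[tuple[float, int]]) -> float:
    """Estimate the encoder count at an image's timestamp by linearly
    interpolating between the two nearest (time, count) samples logged
    by the I/O system. Assumes samples are sorted by time and that the
    conveyor moves roughly uniformly between samples."""
    for (t0, c0), (t1, c1) in zip(samples, samples[1:]):
        if t0 <= timestamp <= t1:
            frac = (timestamp - t0) / (t1 - t0)
            return c0 + frac * (c1 - c0)
    raise ValueError("timestamp outside sampled range")
```

Because the timestamp is attached at the camera, the position estimate stays valid no matter how long the image takes to cross the network.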

Finally, benchmarking CPU use across each different hardware incarnation, network interface card, and software development kit will not be easy. That is a challenge the AIA and EMVA are aware of, and one that should be addressed by committee members to provide system integrators the benchmark data they will need when choosing such cameras.
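CPU use matters because, without hardware offload, packet reassembly runs on the host processor. A crude, single-process way to estimate it is to compare CPU time to wall-clock time while a receive loop runs, as in this Python sketch; the approach is a rough illustration, not a committee-endorsed benchmark methodology.

```python
import time

def cpu_utilization(workload, *args) -> float:
    """Return the ratio of CPU time to wall-clock time consumed while
    `workload` runs -- e.g. a driver's image-receive loop. A value near
    1.0 means the host CPU is saturated by the workload."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    workload(*args)
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    return cpu / wall if wall > 0 else 0.0
```

Comparable figures gathered this way across NICs and SDKs would give integrators the data the committees have yet to standardize.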

About the Author

Conard Holton | Editor at Large

Conard Holton has 25 years of science and technology editing and writing experience. He was formerly a staff member and consultant for government agencies such as the New York State Energy Research and Development Authority and the International Atomic Energy Agency, and engineering companies such as Bechtel. He joined Laser Focus World in 1997 as senior editor, becoming editor in chief of WDM Solutions, which he founded in 1999. In 2003 he joined Vision Systems Design as editor in chief, while continuing as contributing editor at Laser Focus World. Conard became editor in chief of Laser Focus World in August 2011, a role in which he served through August 2018. He then served as Editor at Large for Laser Focus World and Co-Chair of the Lasers & Photonics Marketplace Seminar from August 2018 through January 2022. He received his B.A. from the University of Pennsylvania, with additional studies at the Colorado School of Mines and Medill School of Journalism at Northwestern University.
