A big optics/vision opportunity: service robots

By Tom Hausken
Looking for a new opportunity in optics and vision systems? Check out this new market report on vision for service robots from my colleagues at Vision Systems Design. This is a market set to take off. To give you an idea: where industrial robots sell in the tens of thousands of units per year, service robots could potentially sell in the millions.

Most robots today are not the futuristic kind we remember from the Jetsons, or the somewhat creepy Actroid kind commercialized in Japan. An industrial robot today is basically just factory automation with an articulating arm that makes it seem like a robot.

A service robot is closer to the futuristic version--mobile, uncontained, and diverse--but not trying to act human, like some insecure, fawning android. More precisely, it operates semi- or fully autonomously to perform service functions, excluding manufacturing. An industrial robot can be a service robot too, if it meets this definition.

Examples of service robots include UAVs, explosive and hazard disposal, automated cow milking, driver assistance, inspection and maintenance of hard-to-reach places, medical rehabilitation, surgery, and scientific exploration. The UAV is the biggest market opportunity, because of the sophistication involved. There are also many smaller, fast-growing segments.

This is a big deal for photonics because most service robots require machine vision of some kind. This means the use of structured light (like what is used in the Microsoft Kinect), time-of-flight (like what is used in virtual keyboards), LIDAR, and so forth. The vision data has to be fused with other technologies, like GPS, radar, sonar, and inertial guidance. For more sophisticated robots, simultaneous localization and mapping (SLAM) is critical: it builds maps of unknown environments, or updates maps of known ones, while at the same time keeping track of the robot's current location.
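To make the fusion idea concrete, here is a toy sketch--not anything from the report--of the simplest possible version: a one-dimensional Kalman filter that blends noisy odometry with a noisy range-sensor reading. Real SLAM systems track full 2-D or 3-D poses plus a landmark map, but the predict-then-correct cycle below is the core move. All the function and parameter names are made up for illustration.

```python
def kalman_step(x, p, u, z, q=0.1, r=0.5):
    """One predict/update cycle of a 1-D Kalman filter.

    x: current position estimate, p: its variance (uncertainty)
    u: odometry motion since the last step, z: range-sensor position reading
    q: motion noise variance, r: sensor noise variance (hypothetical values)
    """
    # Predict: dead-reckon forward on odometry; uncertainty grows.
    x_pred = x + u
    p_pred = p + q

    # Update: blend in the sensor reading, weighted by relative confidence.
    k = p_pred / (p_pred + r)          # Kalman gain (0 = trust odometry, 1 = trust sensor)
    x_new = x_pred + k * (z - x_pred)  # correct the estimate toward the measurement
    p_new = (1 - k) * p_pred           # uncertainty shrinks after the correction
    return x_new, p_new


# Example: the robot believes it starts at 0.0 m with variance 1.0,
# odometry says it moved 1.0 m, but the sensor reads 1.2 m.
x, p = kalman_step(x=0.0, p=1.0, u=1.0, z=1.2)
```

The fused estimate lands between the odometry prediction (1.0 m) and the sensor reading (1.2 m), and the variance drops--exactly the "keep track of where you are while updating the map" behavior the SLAM literature builds on.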

The technology is still emerging and remains to be worked out. That means lots of hardware and software, and pretty deep stuff. Keep in mind that the system doesn't necessarily need to "see" things the way we do--it just has to get the information it needs from its sensors.

For more information on the report, click here.
Copyright © 2007-2014. PennWell Corporation, Tulsa, OK. All Rights Reserved.