


Content Filed Under:

Automotive and Robotics

Visual Inspection & Testing

Machine Vision in the Automotive Industry

POSTED 08/24/2006  | By: Paul Kellett, AIA Director of Market Analysis

Increased complexity in today’s automobiles brings the potential for greater production errors, but automobile manufacturers can ill afford such errors in a highly competitive market. To achieve the quality that customers demand, manufacturers and their suppliers are increasingly relying on a highly effective approach to preventing defects at multiple stages of production. That approach is machine vision.

In the automotive industry, machine vision (MV) is used in a range of applications, primarily inspection and robotic guidance.  Using embedded vision sensors to find objects in two- or three-dimensional space and adjust paths for the positions of those objects, robots utilize machine vision for far greater accuracy in critical activities, including auto racking (picking parts out of racks), bin picking and the positioning of parts (such as doors and panels) for assembly.
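At its core, such guidance is a coordinate correction: the vision system reports where the part actually sits relative to a taught nominal pose, and the robot shifts its taught path accordingly. A minimal 2D sketch of that frame shift follows; all pose names and numbers here are illustrative, not drawn from any production system.

```python
import math

def pick_pose_correction(nominal, measured):
    """Given a taught nominal part pose and the pose the vision system
    actually measured (x, y in mm, theta in degrees), return the offset
    the robot controller should apply to its taught pick path.

    This is the simple 2D case; 3D guidance adds z and two more rotations.
    """
    dx = measured[0] - nominal[0]
    dy = measured[1] - nominal[1]
    dtheta = measured[2] - nominal[2]
    return (dx, dy, dtheta)

def apply_offset(point, offset, pivot):
    """Rotate a taught path point about the part's nominal pivot by the
    angular offset, then translate -- the frame shift a vision-guided
    robot performs before picking."""
    dx, dy, dtheta = offset
    rad = math.radians(dtheta)
    px, py = point[0] - pivot[0], point[1] - pivot[1]
    rx = px * math.cos(rad) - py * math.sin(rad)
    ry = px * math.sin(rad) + py * math.cos(rad)
    return (rx + pivot[0] + dx, ry + pivot[1] + dy)

# Part found 5 mm right, 2 mm up, rotated 1.5 degrees from nominal
offset = pick_pose_correction((100.0, 200.0, 0.0), (105.0, 202.0, 1.5))
corrected = apply_offset((120.0, 210.0), offset, pivot=(100.0, 200.0))
```

In practice the measured pose would come from the vision sensor's part-location result and the corrected points would be streamed to the robot controller; the arithmetic, however, is exactly this rigid-body transform.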

MV systems also efficiently perform various types of inspections, determining essentially whether the sundry items that make up an automobile pass muster and rejecting those that do not.  This includes surface inspection for cosmetic flaws (such as dings, dents and wrinkles in body panels) as well as detection of functional flaws (such as irregularities on the bearing surfaces of automotive rocker arms or incorrect spacing and size of mounting holes on disc brake pads).  Machine vision systems also verify the presence (or absence) of parts and the correctness of their shapes (such as in the case of gears, which can have missing or malformed teeth).  Finally, machine vision inspections for assembly verification ensure error-free assembly (such as with closure panels, which include doors, hoods, lift gates and tail gates).
MV systems also perform parts recognition.  For example, they can read the treads of different makes and types of tires and direct their correct routing by conveyor belt to designated vehicles.  MV systems can also perform parts recognition via OCR functions where printed labels have been attached to parts.

Machine vision moreover enables dimensional gauging of precision-machined components (such as fasteners, transmissions and other sub-assemblies).  In so doing, MV systems ensure that only parts falling within the correct tolerances find their way into vehicles departing the assembly line.
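In software terms, a gauging station reduces to comparing each vision-measured dimension against a tolerance band and rejecting any part that falls outside one. A minimal sketch, with an invented tolerance table for a hypothetical fastener (the feature names and limits are illustrative, not actual manufacturer specifications):

```python
# Hypothetical tolerance table for a machined fastener -- feature names
# and (low, high) limits in mm are invented for illustration.
TOLERANCES = {
    "shank_diameter_mm": (7.95, 8.05),
    "head_height_mm":    (5.90, 6.10),
    "overall_length_mm": (39.8, 40.2),
}

def gauge_part(measurements):
    """Compare each vision-measured dimension against its tolerance band.
    Return (passed, failures) so a reject can be traced to the exact
    out-of-tolerance feature."""
    failures = [
        name for name, value in measurements.items()
        if not (TOLERANCES[name][0] <= value <= TOLERANCES[name][1])
    ]
    return (len(failures) == 0, failures)

ok_part  = {"shank_diameter_mm": 8.01, "head_height_mm": 6.00, "overall_length_mm": 40.0}
bad_part = {"shank_diameter_mm": 8.10, "head_height_mm": 6.00, "overall_length_mm": 40.0}
```

Calling `gauge_part(ok_part)` accepts the part, while `gauge_part(bad_part)` rejects it and names the oversized shank diameter; real stations add calibration, measurement uncertainty and statistical process control on top of this check.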

Finally, machine vision can also be used for 2D data matrix reading.  An example of this application is the reading of codes that are laser-etched into camshaft bar stock to provide precise instructions for grinding the camshaft, ensuring a correct fit between cam and engine block.
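Downstream of the decode itself (which the MV system performs), the read string simply keys into a table of machining parameters, with unknown codes flagged so a mismatched blank never reaches grinding. A minimal sketch of that lookup step; the code strings and grinding parameters below are invented for illustration.

```python
# Hypothetical mapping from decoded data matrix strings to grinding
# profiles -- codes and parameter names are invented for illustration.
CAM_PROFILES = {
    "CAM-A17": {"lobe_lift_mm": 11.2, "base_circle_mm": 34.0},
    "CAM-B03": {"lobe_lift_mm": 10.8, "base_circle_mm": 33.5},
}

def grinding_instructions(decoded):
    """Validate the decoded code and look up the grinding profile for
    this camshaft blank; unknown codes are flagged for rejection so a
    mismatched cam never reaches an engine block."""
    profile = CAM_PROFILES.get(decoded)
    if profile is None:
        return {"status": "reject", "reason": f"unknown code {decoded!r}"}
    return {"status": "grind", **profile}
```

A production system would add read-verification (re-reading the etched code after marking) and would log every decode for the traceability uses discussed below.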

As suggested by this cursory overview, different MV applications abound today in the automotive industry.  But will machine vision continue to advance in the automotive industry?   To answer this general question, we spoke to:

  • Valerie Bolhouse of Ford,
  • Steve Jones of GM and
  • Brad Dailey of Daimler Chrysler

1) What are the machine vision (MV) applications currently utilized by your company?

Valerie Bolhouse (Ford): “Ford uses vision for traceability, error proofing, in-station process control, robot guidance and dimensional control.”

Steve Jones (GM): “Machine vision applications in GM Powertrain manufacturing facilities can generally be divided into four categories:  inspection/error proofing, part ID/tracking, gauging, and robot guidance.”

Brad Dailey (Daimler Chrysler): “Vision-guided robotic material handling systems, inspection stations for adhesive application, dimensional validation systems, process control, and 3-D camera scanning for dimensional inspections.”

2) Which of these applications are most important?

Valerie Bolhouse (Ford): “The number one application of vision at Ford is reading data matrix codes for traceability.  We have really driven this technology, both internally and externally with our suppliers.  We also have a number of systems installed for dimensional control and vision-guided robotics.”

Steve Jones (GM): “Each type of application contributes in a unique way to the success of the manufacturing operations, ensuring quality and eliminating waste.”

Brad Dailey (Daimler Chrysler): “Each system is very important in providing safe operation, improving efficiency of manufacturing and inspection of critical characteristics for a quality product.”

3) Do you foresee an increase in use of machine vision in your company?  If yes, why?  What are the drivers of increased use for machine vision within your firm?

Valerie Bolhouse (Ford): “We are using more and more vision in our plants.  One of the drivers is the demand for more flexible automation.  Vision can replace hard-tooled custom fixtures with adaptable tooling.  It can also be used for model verification on a flexible line.  Another driver is the requirement for improved quality.  Vision provides feedback for in-station process control.  The dramatic improvement in price and performance of vision systems makes them a viable option just when the need for the benefits they can provide is increasing.”

Steve Jones (GM): “The use of machine vision will continue to grow.  Factors contributing to the growth include improved reliability, easier use (GUI improvements), better vision algorithms, improved maintainability, lower implementation costs, application successes, and leadership confidence.  All of these factors contribute to improved quality and reduced operational costs, which will in turn drive continued use of machine vision.”

Brad Dailey (Daimler Chrysler): “Yes, because of tighter tolerancing and increased productivity.  The drivers of increased machine vision use are: safety, ergonomics, quality and delivery without delays, and improved costs.”

4) Will different MV applications become more important in the future of your firm?

Valerie Bolhouse (Ford): “I think we will continue with our current strategy and look at vision for quality and productivity improvements, two areas in which Ford is committed to exceed standard market capabilities.”

Steve Jones (GM): “Robot guidance has the potential to change the way we look at material handling applications.”

Brad Dailey (Daimler Chrysler): “Absolutely, DCX has invested heavily in additional, new robotic load systems for Advanced Flexible Manufacturing programs coming into the plants.  MV makes everything easier in these types of manufacturing systems.”

5) Does your firm use smart cameras?  If yes, for what applications?

Valerie Bolhouse (Ford): “Many of our applications use smart cameras.  We use smart cameras for traceability.  We also use them for error-proofing applications, which are typically just easier vision applications.”

Steve Jones (GM): “Smart cameras are used in inspection/error proofing, part ID/tracking, and gauging applications.”

Brad Dailey (Daimler Chrysler): “Smart cameras have proven to be very cost competitive when used in the right applications: inspection/error proofing, part ID/tracking, and gauging applications.”

6) Does your firm use PC-based machine vision systems?  If yes, for what applications?

Valerie Bolhouse (Ford): “Any PC-based machine vision systems we have come into our plants integrated as an application solution.  These would typically be systems where the supplier has developed a product using multiple cameras with a frame grabber and PC.  While we have concerns about the long-term support of custom software solutions, we do have robot guidance applications and body sealer and adhesive verification, and maybe a few others, with this architecture.  For multi-camera solutions, there is a crossover point on price where PC-based vision makes sense.

“However, there are advantages to smart cameras where we will pay a premium to use them.  For robot-mounted cameras, the smart camera provides a more robust packaging solution.  The camera is already in a protective housing so you can reduce size and weight.  You can replace the problematic video cable with standard high-flex Ethernet cables.  LED strobing can be controlled at the source by the smart camera.  If you mount an Ethernet switch on the robot's end-of-arm tooling, the only wires you have to dress out are one power cable and one Ethernet line.”

Steve Jones (GM): “PC-based machine vision systems are primarily used for robot guidance.”

Brad Dailey (Daimler Chrysler): “Most of our systems are PC-based machine vision systems.”

7) If your firm uses both smart cameras and PC-based based machine vision systems, which of these does your firm rely on the most?

Valerie Bolhouse (Ford): “Smart cameras.”

Steve Jones (GM): “We rely on both types of systems for different applications.  We do not rely on smart cameras for 3D robot guidance applications, nor would we utilize a PC-based machine vision system in a simple error-proofing part presence application.  Choosing the right format for the application is fundamental.”

Brad Dailey (Daimler Chrysler): “PC-based machine vision systems for major applications and smart cameras for simple ones.”

8) Where is machine vision used primarily in your firm?  At the end of the production line, or in-station process control? 

Valerie Bolhouse (Ford): “We primarily use vision for in-station process control.  By the time the product reaches the end of the line, it is much more difficult to inspect and assure that our product is of the highest quality.  We also like in-station process control because it allows us a level of immediate feedback that helps us ensure we're reaching our high Ford standards by catching issues before they can negatively impact our quality.  When using in-station process control, we have a greater ability to catch the first bad part in the pipeline.  Repairs at the end of the line are more costly, so catching problems in-station keeps costs low and is also better from a quality standpoint.”

Steve Jones (GM): “In many GM Powertrain plants, machine vision is used throughout the manufacturing floor in a variety of applications.”

Brad Dailey (Daimler Chrysler): “Both.”

9) What type of cameras – area scan or line scan – are typically used by your firm for MV applications?  Are cameras typically monochrome or color?

Valerie Bolhouse (Ford): “Our applications tend to use monochrome area scan cameras.  We're usually looking at metal, and color cameras don't add much information.  We've had the occasional request to error-proof plastic parts for correct color, but usually these parts are so close in hue that color cameras can't discriminate between them anyhow.  Even though many of our applications are on a moving line, capturing an area image of the part on the fly with a part-present trigger is easier and cheaper than synchronizing a line scan camera to the conveyor to acquire an image.  The other reason we've looked at line scan in the past would be for improved resolution, but even there, today's high-resolution area cameras can do most of what we need.”

Steve Jones (GM): “Although there are specialized exceptions for line scan and color, most cameras are area scan monochromatic cameras.”

10) What type of lighting is typically used for MV applications in your company?  “White” light, UV, IR, NIR, etc.?

Valerie Bolhouse (Ford): “We have moved to mostly LED lighting due to the long life and general robustness of the LEDs.  We use a lot of red LEDs, because they are readily available and the color is closely matched to the spectral response of the camera.  With metal parts, we do not get any advantage with color, and since people like to see the light as they are setting up the system and debugging it, we usually work in the visible spectrum.  We generally strobe the LEDs, especially the high-power LumiLEDs, to extend their life and reduce the heat.  However, for strobing applications it is really nice to use the near-infrared (NIR) LEDs.  The operator is not bothered by the strobing lights and often is not even aware of the system.  Now that infrared lighting is finally available and inexpensive, we're finding the camera manufacturers are matching the spectral response of human vision instead of favoring infrared.  CCD and CMOS cameras provide only about 20% of the quantum efficiency in the infrared region that they do in the visible spectrum.  I would really like to see a better response in the near infrared, so that we can use NIR LEDs for most of our applications.”

Steve Jones (GM): “Near IR (red LEDs) is the most common lighting used for illuminating metallic engine parts.  Many applications also utilize white light and some utilize other specialized lighting techniques.”

Brad Dailey (Daimler Chrysler): “Various lighting based on the application: white, red LED.”

11) What role, if any, do telecentric lenses play in MV applications in your firm?

Valerie Bolhouse (Ford): “We typically do not use telecentric lenses in our applications.  For any gauging or dimensional control applications we tend to use laser-based gauges, which are calibrated throughout their volume and use conventional lenses.  We usually require a larger field of view than telecentric imaging can provide.  And in automotive, we haven't had the need for the other advantages that telecentric imaging can provide (limited perspective error and change in magnification).”

Steve Jones (GM): “Telecentric and other specialty lenses are used in gauging applications.”

Brad Dailey (Daimler Chrysler): “I don't know of any applications we have that use telecentric lenses.”

12) Does your firm itself perform system integration of MV components or is system integration outsourced?

Valerie Bolhouse (Ford): “System integration is usually outsourced.  Most of our vision systems come in as a subsystem as part of the larger cell.  The vision integrator will be a Tier 2 or Tier 3 supplier to Ford.  That doesn't mean that we are not directly involved in the vision application.  Just as we have engineers who work with our Tier 1 integrators on the welding or sealing equipment and process, we will develop and implement solutions with our vision suppliers, and then work with them on the integration so that it meets our requirements.  We do integrate some of our own error-proofing systems at the plant level with smart cameras, but this is generally when the need is identified after the line is in the plant and launched.”

Steve Jones (GM): “Although engineers within GM Powertrain have successfully integrated vision applications, most integration is outsourced.”

Brad Dailey (Daimler Chrysler): “Combination of both, depending on program timing.”

13) In deciding to purchase MV components, software or systems what are the criteria that are most important to your firm?

Valerie Bolhouse (Ford): “Application robustness, system reliability, ease of use, and commonality of equipment.  Each of these is equally important.  Ford will not consider an application if we are not confident of a reliable, robust solution.  So we spend a lot of time up front evaluating system capability.  Theoretically, if you had a reliable, robust solution, the operator would never have to do anything with the system and usability would be of lesser importance.  However, that is not realistic, so we have to have systems with a good operator interface.  Then we're always looking for ways to reduce the cost of the capital equipment, so if we are presented with a lower-cost alternative, we will evaluate it.  Our supply base has been very good at recognizing this, so we've seen their system capability improve and the cost of the equipment come down without having to jump around from supplier to supplier chasing the lower cost.  We're able to meet our last objective, equipment commonality, without sacrificing cost or performance.”

Steve Jones (GM): “Experience, durability, ease of use, repeatability, maintainability, and cost.”

Brad Dailey (Daimler Chrysler): “Reliability, ease of use, familiarity with current user base at the plant.”

14) Where in the production process does your firm employ vision-guided robots?

Valerie Bolhouse (Ford): “We use vision-guided robots for decking windshields and back lights to the vehicle.  We also use vision on a number of sealing systems in our paint shop.  Sealing in body can typically be done blind because we have the body panels fixtured in location, so we know where they are.  But once the body is completely built up, and no longer precisely fixtured, vision is used to guide the operation for precise location.  We also use vision-guided robots for material handling, loading engine blocks and cylinder heads to the line in our Powertrain plants and completed assemblies to racks in Stamping and Body.”

Steve Jones (GM): “Vision-guided robots are used in material handling applications.  Applications include loading and unloading of conveyors, trays, racks, and pallets.”

Brad Dailey (Daimler Chrysler): “Vision-guided robots are used in material handling applications, sealing, glass decking, bin picking, and loading and unloading of SGVs (self-guided vehicles), reducing manual material handling as much as possible to improve efficiencies.”

15) Of all the robots utilized by your firm, what percent is vision-guided?  Do you anticipate a future increase in this percentage?

Valerie Bolhouse (Ford): “Because we have hundreds of robots in our body shops working on fixtured parts, the majority of our robots are blind.  We have been installing vision-guided robotics for Op 10 -- loading parts to the line -- and for loading the finished product into the racks at the end of the line.  These applications have worked out so well at eliminating ergonomic issues and improving productivity that we definitely expect an increase in vision-guided applications.  Also, there is a lot of exciting work being done for assembly-on-the-fly applications using vision to track parts on a moving line, or picking parts off a moving conveyor.  Once this technology gets commercialized, many more applications will open up in Powertrain, Final Assembly and Stamping.  So I do think that more and more robotic applications will use vision.”

Steve Jones (GM): “With the improvements in reliability and the potential to reduce fixture and dunnage costs, the increase in vision-guided robots is likely to continue.  Today GM Powertrain uses just over 1,000 robots in its North American manufacturing facilities; between five and ten percent are vision guided.  Most of the robots have been installed for several years, while all of the vision-guided robots have been installed more recently.”

Brad Dailey (Daimler Chrysler): “The current percentage is low, but that is because you need to pick the right locations to implement them.  As the technology progresses, I see this becoming one of the major ways everyone in the industry will increase productivity.”

Going forward, AIA will continue to monitor the use of machine vision in the automobile industry and is holding an important workshop on this subject.  The workshop, “Vision & Robots for Automotive Applications”, is scheduled for October 17 & 18, 2006 at the Sheraton Detroit Novi Hotel in Novi, Michigan (suburban Detroit).  The workshop is exclusively designed for Engineering and Manufacturing leaders at automotive OEMs and tier suppliers looking for:

  • New application ideas
  • Proven vision & robotic solutions
  • Methods to reduce cost, boost productivity, improve quality and increase flexibility