
Content Filed Under:
Industry: Automotive
Application: Visual Inspection & Testing

Maturity Leads to Segmentation for Automotive Automation Systems

POSTED 01/26/2006  | By: Winn Hardin, Contributing Editor


Segmentation and optimization are common characteristics of maturing technologies. As the microprocessor matured, computer systems that originally came only as large mainframes split into dozens of varieties: mainframes, minis, servers, workstations and desktop platforms.

Machine vision systems are following suit, as a close look at one of the largest end-user markets for machine vision reveals. Vision applications in the automotive industry are benefiting from several recent system advances: slimmed-down smart cameras, sometimes called 'smart gauges'; improvements in direct part marking, which provides part traceability that is more powerful and robust than RFID tags; and advanced algorithm development for robot guidance applications.

Automotive's 'Big Three' Applications
Machine vision applications in automotive manufacturing fall into three broad categories, according to Frank Maslar of Ford Motor Company’s Advanced Manufacturing Technology Development: part identification, inspection and gauging, and robot guidance.

Direct part marking (DPM) is driving vision deeper into identification, or part tracking, applications in the automotive industry. These marks, typically one of a variety of 2D Data Matrix codes, carry serialization numbers for tracking individual parts and can also support more flexible manufacturing lines by encoding process-related information.

"For example, when we machine engine blocks, there are bearings that go with the crankshaft," explains Ford's Maslar. "We measure the sizes of the journals and the engine assembly and use that information to pick the right size bearing liners. If we make the parts in one plant and assemble in another plant, we have to transfer that information with the parts. We have done this with [radio frequency] RF tags, but you have to recycle the tags. They don’t last forever; they get lost, etc., which leads to expense. So what we do now is take the size information and the measurement data for the bearing liner and encode that into a data matrix code and mark it directly on the part. When the part gets to assembly, we know exactly what part goes with what assembly."
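
As a rough illustration of how measurement data can travel with a part, the sketch below builds and parses a compact text payload of the kind that might be encoded into a 2D Data Matrix symbol and marked onto an engine block. The field names, layout, and values are hypothetical, not Ford's actual format; generating and marking the symbol itself would be handled by the marking and reading hardware, and short payloads matter because Data Matrix capacity is limited at small mark sizes.

```python
# Hypothetical sketch: packing a part's serial number and bearing-journal
# measurements into a compact text payload for a 2D Data Matrix mark.
# Field names, order, and units are illustrative only.

def build_dpm_payload(serial: str, journal_sizes_mm: list) -> str:
    """Serialize the serial number and journal diameters into one string."""
    sizes = ",".join(f"{d:.3f}" for d in journal_sizes_mm)
    return f"SN={serial}|JRN={sizes}"

def parse_dpm_payload(payload: str) -> dict:
    """Recover the fields at the assembly plant after the code is read."""
    fields = dict(item.split("=", 1) for item in payload.split("|"))
    return {
        "serial": fields["SN"],
        "journal_sizes_mm": [float(x) for x in fields["JRN"].split(",")],
    }

if __name__ == "__main__":
    payload = build_dpm_payload("ENG0001234", [52.987, 52.991, 52.989, 52.993])
    print(payload)                     # string that would be encoded as a Data Matrix
    print(parse_dpm_payload(payload))  # decoded downstream to select bearing liners
```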

DPM allows each part to be individually identified for its lifetime, which may be 25 years or more, but also allows manufacturers like Ford Motor Company to be more flexible in their operations, since the car model number associated with a particular part can be encoded directly onto the part. "DPM gives us portability as well as greater flexibility in utilizing our infrastructure," Maslar said.

DPM is among the hottest applications in automotive automation today, according to John Lewis, public relations specialist at Cognex Corp. (Natick, Massachusetts). "Interest is driven in part by the need to comply with the TREAD (Transportation Recall Enhancement, Accountability, and Documentation) Act, but also because part traceability plays a key role in error proofing initiatives."

"DPM has moved past the early adoption phase," Lewis continued. "In that phase of adoption, customers were marking parts for internal traceability projects (i.e. closed loop) and needed 'verifiers' for internal 'process control.' Now, auto makers are pushing their suppliers to comply, and these marks (primarily 2D codes) must be read throughout the supply chain. This is driving tremendous growth right now for ID readers and verification systems."



Inspection and Gauging
Machine vision systems were first designed to automatically find features in an image and make objective measurements based on that visual information, essentially inspecting parts for quality evaluation or closed-loop process control. Today, the maturing of vision systems is leading to a segmentation of these inspection and gauging systems, with particular emphasis on working in a three-dimensional (3D) world while offering more flexible, cost-effective solutions for tasks once handled by discrete sensors, better known as error-proofing or 'go-no-go' applications.

According to Cognex's Lewis, absence/presence sensors are significantly expanding the number of vision-supported operations throughout the automotive manufacturing chain. "Automotive manufacturers have been replacing multiple photoelectric sensors and PLC logic with the recently released Checker sensor because it's more cost-effective. Checker has solved many error proofing applications throughout the automotive manufacturing process including verifying the presence of piston rings, weld nuts, threads in holes, correct buttons, clips on seats and many more."

"You don't need a lot of image processing power for these 'go-no-go' applications," adds Ford's Maslar. "You don't need a lot of filtering, or correlation…they're just straightforward absence/presence applications. When we need more power, we go with smart cameras for the most part, because the power of the smart camera is approaching the power of the traditional, PC-host vision system. And from a maintainability standpoint, many of the smart cameras are often easier to use. Many of the traditional systems are programmed in C or C++ and there aren’t many maintenance people who are really comfortable doing that kind of thing. We found that the smart camera graphic interfaces are simple enough that maintenance technicians can go out and maintain the system, which is a big issue for us."
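
The kind of lightweight check Maslar describes can be approximated in a few lines. The sketch below assumes OpenCV, a fixed camera view, and a known region of interest where a clip or weld nut should appear; the file name, ROI coordinates, and thresholds are placeholders, not parameters from any particular sensor.

```python
# Minimal absence/presence ("go-no-go") check: threshold a fixed region of
# interest and decide whether enough bright pixels are present.
# Assumes a fixed camera, consistent lighting, and placeholder parameters.
import cv2

def part_present(image_path: str, roi=(100, 100, 60, 60),
                 gray_thresh=128, min_pixels=500) -> bool:
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(image_path)
    x, y, w, h = roi
    patch = img[y:y + h, x:x + w]                    # crop the inspection window
    _, binary = cv2.threshold(patch, gray_thresh, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(binary) >= min_pixels    # "go" if the feature shows up

if __name__ == "__main__":
    print("GO" if part_present("station_42.png") else "NO-GO")
```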

A 3D World
According to Adil Shafi, president of Shafi Inc. (Brighton, Michigan), non-laser-scanning, passive camera systems for 3D measurements are helping to provide flexibility in assembly operations by operating as in-line coordinate measuring machines (CMMs), offering CMM precision without the need to take the part to an offline CMM station.

"These systems look at various regions in a frame, and make 3D measurements based on the location of linkages, brackets, cross-members and such," Shafi explained. This same capability also is greatly expanding the role of vision in robotic assembly workcells.
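
A minimal sketch of the geometry behind such passive 3D measurements, assuming two calibrated cameras whose 3x4 projection matrices are already known from calibration: once the same feature (a bracket hole, say) is located in both images, its 3D position can be recovered by linear triangulation. The matrices and coordinates below are placeholders for illustration.

```python
# Linear (DLT) triangulation: recover a 3D point from its pixel locations in
# two calibrated views. P1 and P2 are 3x4 camera projection matrices obtained
# from calibration; the values below are placeholders for illustration.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Return the 3D point (in the calibration frame) seen at uv1 and uv2."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize to (x, y, z)

if __name__ == "__main__":
    # Two toy cameras: identical intrinsics, second camera shifted 100 mm in x.
    K = np.array([[1000.0, 0, 320], [0, 1000.0, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])
    point = np.array([50.0, 20.0, 800.0, 1.0])       # known 3D point, mm
    uv1 = (P1 @ point)[:2] / (P1 @ point)[2]
    uv2 = (P2 @ point)[:2] / (P2 @ point)[2]
    print(triangulate(P1, P2, uv1, uv2))             # ~[ 50.  20. 800.]
```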

"The move from 2D to 3D robot applications is making vision very popular for robot guidance," explains Ford's Maslar. "Today, companies like Fanuc (Rochester Hills, Michigan), ISRA (Lansing, Michigan), Shafi, Braintech (North Vancouver, BC, Canada), Adept (Livermore, California) and ABB (Zurich, Switzerland) all offer 3D vision systems of one type or another. What we're really looking forward to is vision guidance for moving objects."

"As 3D vision technologies improve, these systems are going into body-in-white applications, which is very exciting because this is an area that rarely used a great deal of robot guidance," said ISRA's Kevin Taylor, sales manager for North America. "Some of these stations are very complex because of all the tooling in them, and we're showing that vision systems and robots can do the tasks that end users thought were not possible just a few years ago."
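
The article does not describe any vendor's internals, but a common building block of 3D vision guidance is estimating a part's pose from a handful of known features. The sketch below, assuming OpenCV and placeholder CAD coordinates, camera intrinsics, and detected pixel locations, recovers a part's rotation and translation relative to the camera, from which a robot offset could be computed.

```python
# Pose estimation sketch for vision-guided robotics: given the 3D positions of
# a few part features (from CAD) and their detected 2D pixel locations, solve
# for the part's rotation and translation relative to the camera (PnP).
# All numeric values are placeholders, not data from any vendor's system.
import cv2
import numpy as np

object_points = np.array([      # feature locations in the part's CAD frame, mm
    [0.0, 0.0, 0.0],
    [120.0, 0.0, 0.0],
    [120.0, 80.0, 0.0],
    [0.0, 80.0, 0.0],
], dtype=np.float64)

image_points = np.array([       # where those features were found in the image, px
    [310.5, 240.2],
    [452.1, 238.7],
    [455.4, 334.9],
    [308.0, 337.5],
], dtype=np.float64)

camera_matrix = np.array([[1200.0, 0, 320], [0, 1200.0, 240], [0, 0, 1]])
dist_coeffs = np.zeros(5)       # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)      # rotation matrix of the part in the camera frame
print("translation (mm):", tvec.ravel())
print("rotation:\n", R)
```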

Though the technique is sometimes called vision servoing, Maslar prefers the term object tracking because it implies fewer disciplines. Maslar expects object tracking to be particularly useful in retrofit applications, which make up the majority of vision-guided robot workcell applications, because most manufacturing lines do not have encoders built in, nor an easy place to install one. In other applications, such as truck assembly, the frame hangs from metal chains and is constantly moving, which poses a challenge for a vision-guided robot workcell. "Object tracking has already been demonstrated successfully in labs," Shafi added.
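
As a rough illustration of what object tracking involves on a line without encoders, the sketch below runs a simple constant-velocity (alpha-beta) filter over periodic vision measurements of a moving frame's position, so a robot could predict where the part will be between camera updates. The filter gains, timing, and readings are illustrative only and are not drawn from any demonstrated system.

```python
# Illustrative object-tracking sketch: an alpha-beta filter that smooths noisy
# vision measurements of a moving part and predicts its position between
# camera updates. Gains, timing, and measurements are placeholders.

def track(measurements, dt=0.05, alpha=0.5, beta=0.1):
    """Yield a (predicted_position, estimated_velocity) pair per measurement."""
    x, v = measurements[0], 0.0          # initialize from the first reading
    for z in measurements[1:]:
        x_pred = x + v * dt              # predict where the part should be now
        residual = z - x_pred            # how far the measurement disagrees
        x = x_pred + alpha * residual    # corrected position estimate
        v = v + (beta / dt) * residual   # corrected velocity estimate
        yield x, v

if __name__ == "__main__":
    # Noisy position readings (mm) of a frame drifting along the line.
    readings = [0.0, 4.8, 10.3, 14.9, 20.2, 24.7, 30.1]
    for pos, vel in track(readings, dt=0.05):
        print(f"position ~{pos:6.2f} mm, velocity ~{vel:6.1f} mm/s")
```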

"This technology can also help us compensate for part-to-part variations," Maslar explained. "The tools made here will go a long way to handling variations and improving tolerances in many other assembly operations, leading to more automation."