Self-Driving Vehicle Success is Tied to Machine Vision
Rideshare company Uber made a big splash when it introduced self-driving cars in Pittsburgh to pick up riders. Autonomous vehicles have moved off the drawing board and into the streets. During this test phase, Uber will keep engineers behind the wheel until the system is fully developed. Passenger pick-ups in the city of bridges and winding streets have so far proven successful.
This transportation breakthrough is a major opportunity for machine vision and the use of multiple devices to relay data. The makers of autonomous vehicles aren’t just changing how people will drive. They’re also re-thinking safety.
Self-driving cars being developed for consumers shift safety systems from reacting to crashes, as air bags do, to preventing crashes in the first place. These active safety systems are opening many opportunities for makers of machine vision systems.
One reason vision is so powerful is highlighted in a 1989 paper from the International Centre for Mechanical Sciences, Issues on Machine Vision: "[Vision] allows us to interact with the environment and to make decisions without being in physical contact with the objects around us."
Vehicles on the move that constantly monitor their environment with machine vision systems, as noted in the article Autonomous Car Industry Comes Knocking on Machine Vision’s Front Door, operate in a mode of preventing accidents rather than merely reacting to them.
Expect cars to use multiple imaging devices and components, including sensors, cameras, LIDAR (Light Detection and Ranging), and radar. These devices will be tasked with monitoring and eventually controlling everything from lane departure to parking.
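The article mentions LIDAR only in passing, but the ranging principle behind it is simple to show: a scanner times a laser pulse's round trip and converts that time to distance. A minimal Python sketch of that calculation (an illustration of the general principle, not part of any system described in the source):

```python
# Illustrative sketch: LIDAR estimates distance by timing a light pulse's
# round trip. Distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_distance_m(round_trip_time_s: float) -> float:
    """Distance to a target, given a pulse's round-trip time in seconds."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse returning after 200 nanoseconds indicates a target roughly 30 m away.
print(round(lidar_distance_m(200e-9), 1))
```

The division by two accounts for the pulse travelling to the target and back; real scanners repeat this measurement millions of times per second across many angles to build a 3D point cloud.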
But all good ideas have challenges to overcome. In the development of autonomous vehicles, power and size constraints need to be addressed in the design stage. The cables alone that connect the various components can add considerable weight to a vehicle and negatively impact its fuel efficiency.
A January 2016 article in Automotive News featuring a major supplier of electrical harnesses for cars describes how luxury cars currently carry “miles” of electrical cables. But the point made in Yazaki rethinks wiring for autonomous age is that Yazaki Corporation is setting out to build fewer harnesses, because “there simply isn’t room to pack in all of the extra wiring that vehicles are expected to need.”
In keeping with the age of digital communications, the world’s biggest maker of harness systems is considering wireless communication among components within cars.
A presentation at the Auto-Sens conference in September 2016, Challenges Facing Autonomous Vehicles, highlighted a few conditions that must be met before self-driving vehicles are found in every residential driveway.
- Imaging solutions need further development so that distance calculation and range finding work in all lighting conditions.
- LIDAR is used in prototypes, yet a single scanner can cost $80,000. The conference also raised the question of what happens when “hundreds of vehicles using LIDAR share the same frequency band on busy multi-lane roads?”
- Firms like Yazaki are addressing system architecture, but questions remain about details such as the placement of processors and sensors.
- Suppliers must commit to long-term support: parts built to specification will need to be provided “up to and perhaps over a decade after original implementation.” In return, as the use of autonomous vehicles increases, component suppliers should be able to count on a steady stream of income.
The autonomous or self-driving car industry will change how people relate to their cars and what they expect from them. Conquering roadways is a high-stakes effort that will demonstrate automation’s flexibility in uncertain environments. In Uber’s Pittsburgh test, cars have had to adapt to human drivers, as noted in the Business Insider article Uber’s driverless cars have problems. Changes in lighting and poor conditions during storms are other variables that the vehicles will have to negotiate.
Machine vision will guide autonomous vehicles just as it guides collaborative robots. It will play a major role in the cars of the future as well as in the autonomous robots of today’s and tomorrow’s factories.
Automation brings together a range of disciplines and specialists in areas like imaging and motion control to create systems. Stay in the know and access resources through A3automate.org.