Embedded Vision Explained: How Do Autonomous Cars See?

If autonomous cars are going to drive better than humans, they must use embedded vision to “see” better than humans. Developers are creating detection systems that can sense a vehicle’s environment better than human eyesight through a combination of camera, radar, and LiDAR (Light Detection and Ranging) sensors. LiDAR is a remote sensing method that uses light in the form of a pulsed laser to measure ranges.

Working together, these diverse sensors cross-check one another to verify accurate detection. Cameras, radar, and LiDAR give the car a view of its surroundings and detect the speed and distance of nearby objects, while inertial measurement units (IMUs) track the vehicle’s acceleration and position.
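
To make the IMU’s role concrete, here is a minimal dead-reckoning sketch in Python. The 100 Hz sample rate and the constant-acceleration data are illustrative assumptions, not values from any particular vehicle; the point is simply that acceleration samples are integrated once to get velocity and again to get position.

```python
# Minimal dead-reckoning sketch: integrating IMU acceleration samples
# into velocity and position. Sample rate and data are illustrative.

def dead_reckon(accels, dt=0.01, v0=0.0, x0=0.0):
    """Integrate 1D acceleration samples (m/s^2) into velocity and position."""
    v, x = v0, x0
    for a in accels:
        v += a * dt  # velocity is the integral of acceleration
        x += v * dt  # position is the integral of velocity
    return v, x

# One second of constant 2 m/s^2 acceleration from rest:
v, x = dead_reckon([2.0] * 100)
print(f"velocity: {v:.2f} m/s, position: {x:.2f} m")  # ~2.00 m/s, ~1.01 m
```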

Cameras Provide a Visual

Cameras are the most accurate way for an embedded vision system to capture a visual representation of the world. Autonomous cars use cameras placed on every side: front, rear, left, and right. The images from these cameras are stitched together into a 360-degree view of the vehicle’s environment.

Some cameras use wide-angle lenses with a shorter range, while others offer a narrower field of view capable of long-range visuals. Fish-eye cameras provide panoramic views and are often used to help the vehicle park itself.

Vehicle manufacturers often use a series of CMOS imaging sensors to produce images between 1 and 2 megapixels; most use relatively inexpensive 2D cameras, but some are incorporating 3D cameras as well. Embedded vision systems need sensors with a high dynamic range of more than 130 dB to ensure a clear image in all conditions, including direct sunlight.
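
For a sense of what that 130 dB figure means: dynamic range compares the brightest signal a sensor can capture against the darkest it can still distinguish, on a logarithmic scale. The quick calculation below uses an illustrative 4,000,000:1 signal ratio to show how such a ratio converts to decibels.

```python
import math

def dynamic_range_db(max_signal, min_signal):
    """Dynamic range in decibels: 20 * log10(brightest / darkest signal)."""
    return 20 * math.log10(max_signal / min_signal)

# Illustrative numbers: resolving a 4,000,000:1 ratio between direct
# sunlight and deep shadow clears the 130 dB bar.
print(f"{dynamic_range_db(4_000_000, 1):.1f} dB")  # ~132.0 dB
```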


Radar Sensors Extend Capabilities

Radar sensors are used to supplement camera vision in low visibility, such as night driving or poor weather. Radar transmits radio waves in pulses -- those waves bounce off an object and return to the sensor, providing information about the object’s speed and location.
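
The arithmetic behind those two measurements is straightforward. The sketch below converts a pulse’s round-trip time into range and a Doppler frequency shift into radial speed; the 77 GHz carrier is an assumption (a common automotive radar band), and the example inputs are illustrative.

```python
C = 299_792_458.0  # speed of light, m/s

def radar_range(round_trip_s):
    """Range from pulse round-trip time: the wave travels out and back."""
    return C * round_trip_s / 2

def doppler_speed(freq_shift_hz, carrier_hz=77e9):
    """Radial speed from the Doppler shift of an assumed 77 GHz carrier."""
    return C * freq_shift_hz / (2 * carrier_hz)

# A pulse returning after 400 ns puts the object about 60 m away;
# a 5.1 kHz Doppler shift corresponds to roughly 10 m/s closing speed.
print(f"range: {radar_range(400e-9):.1f} m")   # ~60.0 m
print(f"speed: {doppler_speed(5.1e3):.1f} m/s")  # ~9.9 m/s
```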

Radar sensors also usually surround the car to detect objects at all angles. They can detect speed and distance, but can’t identify the type of object -- this is where another sensor used in embedded vision systems comes into play.

LiDAR Provides a 3D View

Cameras and radar are included on most new vehicles. Cars use them for advanced driver assistance and park-assist technologies. They can also offer some level of autonomy as long as a human remains in the driver’s seat with control of the vehicle.

For a completely autonomous car, the embedded vision system uses LiDAR. LiDAR can provide a 3D view of the environment, with the added benefit of working well in low-light conditions. It detects the shape and depth of surrounding cars, pedestrians, and road geography. Only a few LiDAR sensors are needed for an effective embedded vision system.
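
Each LiDAR return is essentially a range measured along a known beam direction, and the 3D view is built by converting those returns into Cartesian points. Here is a minimal sketch of that conversion; the range and angles are illustrative, and the axis convention (x forward, y left, z up) is one common choice.

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return (range plus beam angles) to an x, y, z point."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return x, y, z

# A return at 25 m, 30 degrees left of center, 2 degrees above horizontal:
print(lidar_point(25.0, 30.0, 2.0))
```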

Like the human brain, an autonomous car’s embedded vision system must take in the data from all of these sensors and interpret it. Having different types of sensors provides redundancy and overlap, a fail-safe that lets the system respond in real time.
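
As a rough illustration of that overlap, the hypothetical fuse_detections helper below cross-checks one object’s distance as reported by several sensors and trusts the detection only when at least two agree within a tolerance. Real fusion systems are far more sophisticated; this sketch only shows the redundancy idea.

```python
from itertools import combinations

def fuse_detections(readings, tol_m=1.0):
    """Cross-check one object's distance (m) as seen by several sensors;
    trust the detection only if at least two sensors agree within tol_m."""
    agreeing = [
        (a, b)
        for (a, ra), (b, rb) in combinations(readings.items(), 2)
        if abs(ra - rb) <= tol_m
    ]
    return len(agreeing) >= 1, agreeing

# All three sensors place the object near 24.5 m, so the detection holds:
ok, pairs = fuse_detections({"camera": 24.2, "radar": 24.8, "lidar": 24.5})
print(ok, pairs)
```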
