Need a Ride? Prepare for the Robo-Taxi with Embedded Vision
Tesla hopes to be the first company to put a large fleet of robo-taxis on the road by the end of 2020. CEO Elon Musk claims the neural network and embedded vision technology already in use in all current-generation Tesla vehicles will allow the company's self-driving cars to go head-to-head with services like Uber and Lyft.
Tesla’s Robo-Taxi Service
Tesla’s plan is to eliminate the need for human drivers and to use the Tesla vehicles already on the road for rides. Tesla owners would be able to put their self-driving cars to work to earn money when they’re not using them. A self-driving car would no longer be just another depreciating asset; it could generate income for its owner. Tesla would take a 25-30% cut of each fare.
Tesla says that if owners aren’t sharing enough cars, the company would provide dedicated robo-taxis. Musk says robo-taxis may even change vehicle design entirely; a car would no longer need a steering wheel or pedals.
How Self-Driving Cars “See”
Embedded vision systems use a diverse set of overlapping sensor technologies, both to verify that what is being detected is accurate and to provide redundancy if any part of the system fails. Most embedded vision systems combine cameras, radar, and LiDAR (Light Detection and Ranging, a remote sensing method that uses pulsed laser light to measure ranges).
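To get a rough feel for how overlapping sensors can cross-check one another, here is a minimal Python sketch that accepts an object only when a camera detection and a radar return roughly agree on its range. The readings, field names, and tolerance are illustrative assumptions, not any manufacturer's actual interface.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # what the sensor thinks it sees
    distance_m: float  # estimated range to the object, in meters

def fuse(camera: Detection, radar: Detection, tolerance_m: float = 2.0) -> bool:
    """Accept a detection only when camera and radar roughly agree on range.

    Returns True when both sensors report an object within tolerance_m of each
    other, giving a simple redundancy/consistency check.
    """
    return abs(camera.distance_m - radar.distance_m) <= tolerance_m

# Example: the camera sees a pedestrian ~18 m ahead; radar reports an object at 17.4 m.
camera_hit = Detection(label="pedestrian", distance_m=18.0)
radar_hit = Detection(label="object", distance_m=17.4)

if fuse(camera_hit, radar_hit):
    print(f"Confirmed {camera_hit.label} at roughly {radar_hit.distance_m} m")
else:
    print("Sensors disagree; fall back to the other sensor or flag for further checks")
```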
Cameras surround the vehicle on all sides and are used to put together a 360-degree view of its environment. Different camera types provide wide fields of view, long-range visuals, and super-wide panoramic coverage. Software identifies objects and judges their distance and movement.
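As a rough illustration of one way software can judge distance from a camera image, the short Python sketch below uses the pinhole-camera relationship between an object's real-world size and its size in pixels. The focal length and object height values are made up for the example; production systems rely on calibrated cameras and far richer models.

```python
def estimate_distance_m(real_height_m: float, pixel_height: float, focal_length_px: float) -> float:
    """Pinhole-camera approximation: distance = focal_length * real_height / pixel_height."""
    return focal_length_px * real_height_m / pixel_height

# Illustrative values: a car roughly 1.5 m tall that appears 60 px tall
# through a lens with a 1200 px focal length sits about 30 m away.
print(estimate_distance_m(real_height_m=1.5, pixel_height=60, focal_length_px=1200))  # 30.0
```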
Radar sensors supplement the cameras by providing the speed and position of surrounding objects, and they keep working in low light and poor visibility. Radar sensors are placed around the car so that objects can be detected from every angle.
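Radar measures relative speed from the Doppler shift of the returned signal. The Python sketch below shows that relationship; the 77 GHz carrier is a typical automotive radar band, and the Doppler shift value is an illustrative assumption.

```python
C = 299_792_458.0  # speed of light, m/s

def relative_speed_mps(doppler_shift_hz: float, carrier_freq_hz: float = 77e9) -> float:
    """Recover relative (closing) speed from a radar Doppler shift: v = f_d * wavelength / 2."""
    wavelength_m = C / carrier_freq_hz
    return doppler_shift_hz * wavelength_m / 2.0

# Illustrative: a ~5.1 kHz Doppler shift corresponds to roughly 10 m/s (36 km/h) closing speed.
print(round(relative_speed_mps(5137.0), 1))
```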
LiDAR is used in many self-driving vehicles; its pulsed lasers give the car a 3D view of its environment, and that data is used to build a map that helps the vehicle navigate. Like radar, LiDAR also works at night and in low visibility.
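LiDAR turns the round-trip time of each laser pulse into a range, since the pulse travels at the speed of light to the object and back. Here is a minimal Python sketch of that calculation with an illustrative return time.

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_range_m(round_trip_time_s: float) -> float:
    """A LiDAR pulse's range is half the round-trip distance: range = c * t / 2."""
    return C * round_trip_time_s / 2.0

# Illustrative: a pulse that returns after ~200 nanoseconds hit something about 30 m away.
print(round(lidar_range_m(200e-9), 1))  # ~30.0
```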
Tesla’s Embedded Vision System
Tesla’s vehicles use cameras and radar rather than LiDAR for their autonomous driving systems. Tesla’s self-driving platform is built around what the company calls a full self-driving computer (FSDC), which includes two duplicate systems onboard to provide redundancy and added safety.
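The article doesn’t detail how the two duplicate systems arbitrate, so the Python sketch below is only a simplified illustration of the general idea: compare the two computers’ outputs and fall back to a conservative action when they disagree. The specific actions and fallback behavior are assumptions for the example.

```python
def redundant_decision(primary: str, backup: str) -> str:
    """Compare the outputs of two duplicate computers.

    If they agree, use the shared answer; if they disagree, fall back to a
    conservative action (the fallback here is an illustrative assumption).
    """
    if primary == backup:
        return primary
    return "slow down and transition to a safe fallback behavior"

# Both computers agree on braking for an obstacle:
print(redundant_decision("brake", "brake"))
# The computers disagree, so the system falls back conservatively:
print(redundant_decision("brake", "continue"))
```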
Each of Tesla’s vehicles is equipped with an embedded vision system that includes cameras, ultrasonic sensors, and radar. These sensors gather data that enables Tesla’s neural network to learn to recognize images, determine what objects are, and figure out what to do next. Tesla’s self-driving software keeps collecting images and improving rapidly thanks to the hundreds of thousands of Tesla vehicles already on the road.
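To make the "figure out what to do next" step concrete, here is a deliberately simplified, hypothetical Python sketch that maps a recognized object class to a next action. Real driving software weighs many detections, tracks, and predictions at once; the labels and rules below are invented purely for illustration.

```python
# Hypothetical mapping from a recognized object class to a next action.
NEXT_ACTION = {
    "pedestrian": "yield and brake",
    "stop_sign": "come to a complete stop",
    "vehicle": "keep following distance",
    "clear_road": "maintain speed",
}

def decide(detected_label: str) -> str:
    """Pick a next action for a recognized object, defaulting to caution."""
    return NEXT_ACTION.get(detected_label, "slow down until the object is identified")

print(decide("pedestrian"))     # yield and brake
print(decide("shopping cart"))  # slow down until the object is identified
```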