The Future of Warehouses: Advances in AMR Vision Systems and Machine Learning

Artificial intelligence turbocharges two powerful technologies to pave the way for a revolution in logistics.

Where automated guided vehicles (AGVs) require the equivalent of guardrails to navigate, most commonly lines or wires on the ground to follow, autonomous mobile robots (AMRs) in principle enjoy complete freedom of movement.

That freedom of movement comes at a price. AGVs can be trusted to stay on schedule while sticking to paths that provide maximum safety for humans and minimal interference with other vehicles. AMRs, on the other hand, must perceive their surroundings and calculate their position within them in real time before taking action.

Even AMRs sporting vision systems with edge inference and the most sophisticated machine learning algorithms still may not provide the desired speed, accuracy, or safety when faced with the highly variable conditions of a busy warehouse or order fulfillment center.

However, incorporating the latest AI technology into AMRs, at both the hardware and software levels, changes the speed and accuracy equations entirely.

Sensor Fusion and Smart Navigation

Most AMRs use combinations of 3D vision technologies like LiDAR, structured light, time-of-flight, and stereo vision for navigation and localization.

Cameras and laser sensors paired with inertial measurement units (IMUs) can provide simultaneous localization and mapping (SLAM) capability, for example. This allows the AMR to continually revise its map of the immediate environment while in motion. Units equipped with 4G/5G transmitters can also share their locations with each other or with control systems that monitor entire AMR fleets.
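To make the fusion idea concrete, here is a minimal sketch of one common pattern: a complementary filter that dead-reckons heading from an IMU gyro and periodically corrects drift with an absolute heading fix, such as one from lidar scan matching. The function name, sample rates, and blending weight are illustrative assumptions, not any vendor's implementation.

```python
# Minimal complementary-filter sketch (illustrative only): integrate a
# gyro at high rate, and pull the estimate back toward an absolute
# heading fix whenever one is available.

def fuse_heading(gyro_rates, scan_headings, dt=0.01, alpha=0.98):
    """Blend integrated gyro rates with absolute heading fixes.

    gyro_rates: angular velocity samples (rad/s), one per time step
    scan_headings: absolute heading fixes (rad), or None when no fix
    alpha: trust in the gyro; (1 - alpha) is trust in the fix
    """
    heading = scan_headings[0] if scan_headings[0] is not None else 0.0
    for rate, fix in zip(gyro_rates, scan_headings):
        heading += rate * dt                      # dead-reckon with gyro
        if fix is not None:                       # absolute correction
            heading = alpha * heading + (1 - alpha) * fix
    return heading
```

The same blend-and-correct structure scales up to full SLAM pipelines, where the "fix" becomes a scan-matched pose rather than a single angle.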

Fiducial tag readers or RFID scanners can assist AMRs with identifying the correct shelves or totes to move. For more delicate operations like pick-and-place, AMRs can use machine vision to guide attached camera-equipped robot arms.

AMRs also utilize machine learning for tasks like translating historical movement data into optimal pathing. The AMR may “learn” that certain rows within a warehouse almost always provide obstructions to movement, for example, because humans or other machines are usually present.

Machine learning algorithms running at the edge on the AMR itself, or remotely in the cloud alongside fleet management systems, can analyze the movement data and calculate paths that avoid these rows as much as possible.
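One way to picture this is as shortest-path planning over a warehouse grid whose per-cell costs have been inflated wherever historical data shows frequent obstruction. The grid, costs, and function name below are invented for illustration; a real system would learn the costs from movement logs.

```python
import heapq

def plan_path(grid_cost, start, goal):
    """Dijkstra over a 2D grid of per-cell traversal costs.

    grid_cost: 2D list; higher values mark historically congested cells.
    start, goal: (row, col) tuples. Assumes goal is reachable.
    """
    rows, cols = len(grid_cost), len(grid_cost[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid_cost[nr][nc]  # cost to enter neighbor
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:        # walk predecessors back to start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]
```

Given a grid where one aisle cell carries a high congestion cost, the planner routes around it even though the detour is longer in cells, which is exactly the "avoid the busy row" behavior described above.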

Vision system and machine learning technologies have advanced to the point where AMRs can reliably complete all of these operations in a variety of warehouse environments. The questions now are how quickly AMRs may operate, how complicated a set of tasks they can complete, and how safely they can operate in close proximity to human beings.

Faster AI Enables Superior AMRs


Neural computing, which processes data on artificial neural networks modeled loosely on the human brain, has dramatically increased the speed of AI software. These benefits now extend to the hardware level in the form of neural processing units (NPUs) that offload AI calculations from the rest of the computer system. NPU technology has become so affordable that the chips have been standard in smart devices for years, and the newest laptop PCs include NPUs in their standard chipsets.

Neural computing is practically tailor-made for AMRs. The technology accelerates inference tasks like localization, navigation, and object recognition, on which AMR operations depend.

The maximum speed at which an AMR may safely operate is determined by the complexity of its environment and the processing speed of its guidance software. AMRs powered by neural computing will be able to sense and process data at previously impossible speeds in even the most challenging environments.
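The link between processing speed and safe travel speed can be sketched with a back-of-the-envelope stopping-distance model: the robot must come to a full stop within its sensing range, so reaction distance plus braking distance cannot exceed that range, i.e. v * t_react + v^2 / (2 * a_brake) <= d. Solving the quadratic for v gives a speed cap. The formula is standard kinematics; the numbers in the usage example are illustrative assumptions, not vendor specifications.

```python
import math

def max_safe_speed(sensing_range_m, reaction_s, brake_decel_mps2):
    """Largest v satisfying v*t + v**2 / (2*a) <= d (full stop in range).

    Derived from the positive root of v**2 + 2*a*t*v - 2*a*d = 0.
    """
    a, t, d = brake_decel_mps2, reaction_s, sensing_range_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)
```

Holding sensing range and braking power fixed, shrinking the reaction term, which is where faster inference helps, directly raises the permissible speed; that is the quantitative intuition behind the claim above.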

Companies like Symbotic are already manufacturing AMRs that come standard with NPUs. Walmart has deployed this technology in its regional supply networks since 2021 and signed an agreement in 2022 to deploy the advanced AMRs in all 42 of its regional distribution centers by 2030.

Faster inference also makes it possible for AMRs to operate safely in close proximity to humans. In 2022, Amazon developed a new AMR called Proteus designed to carry out operations in the exact same spaces as human workers. Proteus employs cameras, LiDAR, fiducial tag readers, and other collision-avoidance sensors and relies heavily on advanced AI.

In addition to the benefits enjoyed by individual AMRs, AI also improves fleet management software like KUKA’s Mobile Robot expert System (KMReS). New AI-powered control systems will be able to manage larger AMR fleets than ever before, determining and adjusting optimal pathing for hundreds of robots simultaneously.

Fleet-wide AMR predictive maintenance systems will gain increased speed and accuracy when powered by AI. Advanced fleet management software could even schedule work orders for AMRs to increase the efficiency of human maintenance teams.
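At its simplest, fleet-wide predictive maintenance means noticing when a robot's telemetry drifts away from its own recent baseline. The sketch below flags an outlier reading, say, drive-motor current, using a rolling z-score; the window size, threshold, and sensor values are invented for the example and would be tuned (or replaced by a learned model) in practice.

```python
import statistics

def flag_anomaly(readings, window=20, z_threshold=3.0):
    """Return True if the latest reading is an outlier vs. its recent window."""
    if len(readings) <= window:
        return False                       # not enough history yet
    baseline = readings[-window - 1:-1]    # the window before the latest
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    if stdev == 0:
        return readings[-1] != mean        # any change from a flat baseline
    return abs(readings[-1] - mean) / stdev > z_threshold
```

A fleet manager running checks like this across hundreds of robots could then open work orders only for the units that actually need attention, which is the efficiency gain described above.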

See the Future of AMR Technology in Person

Thanks to neural computing and AI, the future is extremely bright for AMR technology. To learn more about the future of AMR vision systems and machine learning, register for the Humanoid Robot Forum on Monday, October 7, and the Autonomous Mobile Robots & Logistics Conference immediately following in the same location on October 8 and 9. Register for both events and save $250!
