
Component Supplier

Member Since 2021


Prophesee is the inventor of the world’s most advanced neuromorphic vision systems. Prophesee’s patented sensors and AI algorithms introduce a new computer vision paradigm based on how the human eye and brain work, revealing the invisible.


Event-Based Metavision® for Industrial Applications

POSTED 01/11/2024

Reveal the invisible in your factory

With Metavision® neuromorphic sensors, see what traditional cameras can't, for drastically improved productivity, quality, safety and predictive maintenance.

   

 

Blur-free: no exposure time

High-speed: >10k fps time-resolution equivalent

Resolution: 1280x720 px

Dynamic range: >120 dB

Low-light cutoff: 0.08 Lux

Shutter-free: neither global nor rolling shutter; asynchronous pixel activation
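The frameless, asynchronous output described above can be pictured as a stream of independent pixel events. The following is a minimal sketch using synthetic data and plain NumPy (this is an illustration, not the Metavision SDK API): each event is an (x, y, polarity, timestamp) record, and only pixels that see a change produce data.

```python
import numpy as np

# Each event carries pixel coordinates, a polarity (+1: brighter,
# -1: darker) and a microsecond timestamp -- there are no frames.
EVENT_DTYPE = np.dtype([("x", np.uint16), ("y", np.uint16),
                        ("p", np.int8), ("t", np.int64)])

def make_synthetic_events(n, width=1280, height=720, duration_us=10_000, seed=0):
    """Generate a random, timestamp-sorted stream of synthetic events."""
    rng = np.random.default_rng(seed)
    ev = np.empty(n, dtype=EVENT_DTYPE)
    ev["x"] = rng.integers(0, width, n)
    ev["y"] = rng.integers(0, height, n)
    ev["p"] = rng.choice([-1, 1], n)
    ev["t"] = np.sort(rng.integers(0, duration_us, n))
    return ev

events = make_synthetic_events(1_000)
# Only changing pixels emit events, so the stream is sparse by
# construction -- this is where the low data rate comes from.
print(events[:3])
```

Because the stream is already sorted in time, downstream processing can consume events incrementally instead of waiting for full frames.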

 

 

ADD VALUE TO YOUR INDUSTRIAL APPLICATIONS

 

 

BOOST PRODUCTIVITY

Boost your productivity by counting and measuring more than 1,000 objects/sec and inspecting objects moving at more than 10 m/s, while generating 100x less data to process.

 

UNPRECEDENTED QUALITY CONTROL

Cut reject rates with real-time feedback loops for advanced processing methods, down to 5 µs time resolution. Unlock high-speed recognition applications (e.g., OCR) with blur-free asynchronous event output.

 

PREDICTIVE MAINTENANCE

Detect early signs of machine failure: the unique ability to measure vibration frequencies from Hz to kHz makes it possible to predict abnormalities, minimizing spare-parts inventory and machine downtime.

 

MONITORING + SAFETY

Monitor in real time areas where workers and machines interact for next-generation safety levels, without capturing images, even in complex lighting environments.

 

CUT COMPLEXITY & COSTS

Streamline – The Metavision® sensor is frameless, with a low data rate, high dynamic range and high sensitivity in low-light scenes. These characteristics let you simplify cumbersome pipelines by removing frame grabbers, industrial PCs and custom illumination from your visual acquisition and processing chain.

 

ONE SENSOR, MANY APPLICATIONS

 

XYT MOTION ANALYSIS

Typical use cases: Movement analysis – Equipment health monitoring – Machine Behavior monitoring

Discover the power of time-space continuity for your application by visualizing your data with our XYT viewer.

See between the frames

Zoom in time and understand fine motion in the scene

 

ULTRA SLOW MOTION

Typical use cases: Kinematic Monitoring, Predictive Maintenance

Slow down time, down to the time-resolution equivalent of over 200,000 frames per second, live, while generating orders of magnitude less data than traditional approaches. Understand the finest motion dynamics hiding in ultra-fast, fleeting events.

Over 200,000 fps (time resolution equivalent)
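Because events are timestamped individually, the replay rate can be chosen after acquisition: binning one and the same event stream into count images at an arbitrary bin width gives an arbitrary "frame rate", with 5 µs bins corresponding to a 200,000 fps equivalent. A toy illustration on synthetic coordinates (not the Metavision SDK):

```python
import numpy as np

def events_to_frames(x, y, t, bin_us, width, height):
    """Accumulate events into per-bin count images: "frames" at a
    replay rate chosen after the fact, not fixed by the sensor."""
    n_bins = int(t.max() // bin_us) + 1
    frames = np.zeros((n_bins, height, width), dtype=np.int32)
    b = (t // bin_us).astype(np.int64)
    np.add.at(frames, (b, y, x), 1)   # scatter-add each event into its bin
    return frames

# 5 us bins -> a 200,000 fps-equivalent replay of the same event stream.
t = np.array([0, 3, 7, 12, 14])          # microsecond timestamps
x = np.array([1, 2, 1, 0, 3])
y = np.zeros(5, dtype=int)
frames = events_to_frames(x, y, t, bin_us=5, width=4, height=1)
print(frames.shape)   # (3, 1, 4): three 5 us "frames"
```

Choosing a wider bin on the same data yields an ordinary-speed video; nothing needs to be re-recorded.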

 

OBJECT TRACKING

Typical use cases: Part pick and place – Robot Guidance – Trajectory monitoring

Track moving objects in the field of view. Leverage the low data-rate and sparse information provided by event-based sensors to track objects with low compute power.

Continuous tracking in time: no more “blind spots” between frame acquisitions

Native segmentation: analyze only motion, ignore the static background
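A minimal sketch of this low-compute tracking style on synthetic events (the coarse grid clustering and greedy matching below are illustrative simplifications, not Prophesee's algorithm): since only moving objects generate events, clustering the events directly segments the movers from the static background.

```python
import numpy as np

def cluster_centroids(x, y, cell=16):
    """Coarse clustering: bucket events into a grid and return the
    centroid of each occupied cell. The static background emits no
    events, so it contributes nothing (native segmentation)."""
    keys = (x // cell) * 10_000 + (y // cell)
    cents = []
    for k in np.unique(keys):
        m = keys == k
        cents.append((x[m].mean(), y[m].mean()))
    return np.array(cents)

def associate(tracks, cents, max_dist=20.0):
    """Greedy nearest-neighbour matching of new centroids to tracks."""
    matches = {}
    for i, (tx, ty) in enumerate(tracks):
        if len(cents) == 0:
            break
        d = np.hypot(cents[:, 0] - tx, cents[:, 1] - ty)
        j = int(d.argmin())
        if d[j] < max_dist and j not in matches.values():
            matches[i] = j
    return matches

# Two moving objects: event bursts around (10, 10) and (100, 50).
x = np.array([9, 10, 11, 99, 100, 101])
y = np.array([10, 10, 10, 50, 50, 50])
cents = cluster_centroids(x, y)
tracks = [(8.0, 10.0), (98.0, 50.0)]   # positions from the previous update
print(associate(tracks, cents))        # {0: 0, 1: 1}
```

Running this update per small time slice gives continuous trajectories with no inter-frame blind spots.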

 

OPTICAL FLOW

Typical use cases: Conveyor Speed Measurement – Part/Object Speed Measurement, Trajectory Monitoring, Trajectory Analysis, Trajectory Anticipation

Rediscover this fundamental computer vision building block, with an event twist. Understand motion far more efficiently through continuous pixel-by-pixel tracking rather than sequential frame-by-frame analysis.

17x less power compared to traditional image-based approaches 

Get features only on moving objects
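One common event-based flow technique (local plane fitting on event timestamps, used here as an illustrative assumption rather than the method behind the figures above) fits t = a·x + b·y + c to a small patch of events; the timestamp gradient directly encodes the edge's velocity.

```python
import numpy as np

def plane_fit_flow(x, y, t):
    """Fit the plane t = a*x + b*y + c to a local patch of event
    timestamps. The gradient (a, b) has units time/px, so the edge
    velocity along the motion direction is (a, b) / ||(a, b)||^2."""
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
    g2 = a * a + b * b
    return a / g2, b / g2   # velocity in px per time unit

# Synthetic edge moving along +x at 2 px/us: timestamps grow 0.5 us/px.
x = np.array([0., 1., 2., 3., 0., 1., 2., 3.])
y = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
t = 0.5 * x
vx, vy = plane_fit_flow(x, y, t)
print(vx, vy)   # recovers ~2.0, ~0.0
```

Note the fit only uses events, i.e. moving edges, which is why flow features appear only on moving objects.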

 

HIGH-SPEED COUNTING

Typical use cases: Object counting and gauging – pharmaceutical pill counting – Mechanical part counting 

Count objects at unprecedented speed and high accuracy, while generating less data and without any motion blur. Objects are counted as they pass through the field of view, triggering each pixel independently as they go by.


>1,000 Obj/s. Throughput

>99.5% Accuracy @1,000 Obj/s.
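The per-pixel triggering described above suggests a very cheap counting scheme: watch a trigger line of pixels and count bursts of event activity separated by quiet gaps. A hedged sketch on synthetic timestamps (the gap threshold and burst model are assumptions, not the product's actual pipeline):

```python
import numpy as np

def count_objects(timestamps_us, gap_us=500):
    """Count objects from events on a trigger line: each burst of
    activity separated by a quiet gap longer than gap_us is one object."""
    if len(timestamps_us) == 0:
        return 0
    ts = np.sort(np.asarray(timestamps_us))
    # One object starts the count; every large gap starts a new burst.
    return int(1 + np.sum(np.diff(ts) > gap_us))

# Three objects crossing the line ~1 ms apart, each firing a burst.
ts = np.concatenate([np.arange(0, 100, 10),
                     np.arange(1000, 1100, 10),
                     np.arange(2000, 2100, 10)])
print(count_objects(ts))   # 3
```

Since only gap statistics are needed, the compute cost per object is tiny, which is what makes >1,000 objects/sec throughput plausible on modest hardware.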

 

SPATTER MONITORING

Typical use cases: Traditional milling, laser & process monitoring, Quality prediction

Track small particles with spatter-like motion. Thanks to the high time resolution and dynamic range of our Event-Based Vision sensor, small particles can be tracked in the most difficult and demanding environments.

Up to 200kHz tracking frequency (5µs time resolution)

Simultaneous XYT tracking of all particles  

 

VIBRATION MONITORING

Typical use cases: Motion monitoring, Vibration monitoring, Frequency analysis for predictive maintenance

Monitor vibration frequencies continuously, remotely, with pixel precision, by tracking the temporal evolution of every pixel in a scene. For each event, the pixel coordinates, the polarity of the change and the exact timestamp are recorded, thus providing a global, continuous understanding of vibration patterns.

From 1 Hz to the kHz range

1 Pixel Accuracy
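Because each event carries exact pixel coordinates, polarity and timestamp, a pixel's vibration frequency falls directly out of the intervals between its same-polarity events. A toy estimate on synthetic data (illustrative only; it assumes one ON/OFF event pair per vibration cycle, as when an edge sweeps back and forth across the pixel):

```python
import numpy as np

def vibration_frequency_hz(t_us, p):
    """Estimate one pixel's vibration frequency from its event stream:
    with one ON (+1) event per cycle, the mean interval between ON
    events is the vibration period."""
    t_on = np.sort(t_us[p == 1])
    periods_us = np.diff(t_on)
    return 1e6 / periods_us.mean()

# Synthetic ~120 Hz vibration: ON event every 8333 us, OFF halfway between.
t_on = np.arange(0, 100_000, 8333)
t = np.concatenate([t_on, t_on + 4166])
p = np.concatenate([np.ones_like(t_on), -np.ones_like(t_on)])
print(round(vibration_frequency_hz(t, p), 1))   # 120.0
```

Doing this independently for every pixel is what gives a full-field, pixel-accurate vibration map without any contact sensor.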

 

PARTICLE / OBJECT SIZE MONITORING

Typical use cases: High speed counting, Batch homogeneity & gauging

Control, count and measure the size of objects moving at very high speed in a channel or a conveyor.

Get instantaneous quality statistics in your production line, to control your process.

Up to 500,000 pixels/second

99% Counting precision

 

EDGELET TRACKING

Typical use cases: High speed location, Guiding and fitting for pick & place

Track 3D edges and/or fiducial markers for your AR/VR application. Benefit from the high temporal resolution of events to increase the accuracy and robustness of your edge-tracking application.

Automated 3D object detection with geometrical prior

3D object real-time tracking

 

VELOCITY & FLUID MONITORING

Typical use cases: Fluid dynamics monitoring, Continuous process monitoring of liquid flow

 

CABLE / YARN VELOCITY & SLIPPING MONITORING

Typical use cases: Yarn quality control, Cable manufacturing monitoring

 

PLUME MONITORING

Typical use cases: Dispensing uniformity & Coverage control, Quality & efficiency of dispersion, Fluid dynamics analysis for inline process monitoring

 

NEUROMORPHIC VISION AND TOUCH COMBINED FOR ENHANCED ROBOTIC CAPABILITIES

Researchers at the Collaborative, Learning, and Adaptive Robots (CLeAR) Lab and TEE Research Group at the National University of Singapore are combining Prophesee’s Event-Based Vision with touch sensing to build new visual-tactile datasets for the development of better learning systems in robotics. This neuromorphic sensor fusion of touch and vision is being used to help robots grip and identify objects.


 

1,000x faster than human touch

0.08s rotational slip detection

 

Don’t see a use case that fits?
Our team of experts can provide access to additional libraries of privileged content.
Contact us >