Making event-based image sensors up to ten times more frugal


When speed, robustness, and low power are non-negotiable, event-based image sensors have intrinsic qualities that clearly set them apart from other kinds of devices. However, they have one major flaw: when the entire scene moves, a huge number of events is generated. An advance recently made at CEA-List solves this problem.

Published on 25 October 2022

Event-based sensors are different from conventional sensors in that individual pixels detect changes in brightness at a point in the scene to trigger recording. If no changes are picked up, nothing is captured, vastly reducing the amount of data to process and the associated computing resources needed. These very sensitive sensors have high time resolutions (around a microsecond), making them ideal for the observation of fast, transient events. The images captured are also high-quality, regardless of how well-lit the scene is. The main drawback is the large number of events generated when the entire scene moves. If the output interface is saturated by too much information, data can be lost.
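To make the principle concrete, here is a minimal sketch in Python of the per-pixel contrast-threshold rule commonly used to model event generation. The function name, the threshold value, and the frame-based input are illustrative assumptions, not the design of any particular device: real event pixels operate asynchronously in analog circuitry rather than on sampled frames.

```python
import numpy as np

def generate_events(frames, timestamps, threshold=0.15):
    """Toy model of per-pixel event generation from a brightness sequence.

    frames: array of shape (T, H, W), linear intensity per frame
    timestamps: array of shape (T,), one time per frame (e.g. in microseconds)
    threshold: log-intensity change needed to trigger an event

    Returns a list of (t, x, y, polarity) tuples, polarity +1 or -1.
    """
    log_frames = np.log(frames + 1e-6)    # event pixels respond to log intensity
    reference = log_frames[0].copy()      # last level at which each pixel fired
    events = []
    for t, frame in zip(timestamps[1:], log_frames[1:]):
        delta = frame - reference
        fired = np.abs(delta) >= threshold    # pixels whose change crosses the threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            polarity = 1 if delta[y, x] > 0 else -1
            events.append((t, x, y, polarity))
            # reset the reference level of pixels that just fired
            reference[y, x] += polarity * threshold
        # pixels that did not fire record nothing at all
    return events
```

Because only changing pixels emit anything, a static scene produces an empty event list; conversely, if every pixel changes at once, the list grows with the full pixel count, which is exactly the saturation problem described above.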

The startup Prophesee, which pioneered event-based sensor technology, markets an advanced device that a PhD candidate at CEA-List recently used as a test case. The PhD project focused on filtering at the sensor itself to reduce the number of events generated, introducing parallel processing based on space-time convolution filters. Inspired by the first layer of the human visual cortex, the filters extract the oriented edges (vertical, horizontal, or diagonal lines) that form the basic building blocks of images as perceived by humans. 3D integration technologies were then investigated, showing that eight convolution filters can be integrated onto each 32x32 pixel array using a 28 nm CMOS process. This first filtering stage localizes each event to a group of pixels, reducing the number of events generated tenfold with no loss of information.
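As an illustration of the filtering idea, the sketch below accumulates events over a short time window and convolves the resulting map with oriented-edge kernels, keeping an output event only where some orientation responds strongly. The kernels, window length, threshold, and function names are assumptions chosen for clarity; they do not reproduce CEA-List's actual in-sensor hardware filters.

```python
import numpy as np
from scipy.signal import convolve2d

# Four simple oriented-edge kernels (horizontal, vertical, two diagonals).
# The design described in the article uses eight filters per 32x32 tile.
KERNELS = [
    np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]], dtype=float),  # horizontal edge
    np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float),  # vertical edge
    np.array([[0, 1, 1], [-1, 0, 1], [-1, -1, 0]], dtype=float),  # diagonal /
    np.array([[1, 1, 0], [1, 0, -1], [0, -1, -1]], dtype=float),  # diagonal \
]

def filter_events(events, shape, window_us=1000, response_threshold=3.0):
    """Reduce an event stream by keeping only oriented-edge responses.

    events: list of (t, x, y, polarity) tuples, assumed sorted by time
    shape: (H, W) of the pixel array
    Accumulates events into a map over a short time window, convolves the
    map with the oriented-edge kernels, and emits one filtered event per
    location where some orientation responds strongly.
    """
    H, W = shape
    filtered = []
    surface = np.zeros((H, W))
    window_start = events[0][0] if events else 0

    def flush(t):
        # run all orientation filters on the accumulated event map
        for k, kernel in enumerate(KERNELS):
            response = convolve2d(surface, kernel, mode="same")
            ys, xs = np.nonzero(np.abs(response) >= response_threshold)
            for x, y in zip(xs, ys):
                filtered.append((t, x, y, k))  # orientation index replaces polarity

    for t, x, y, p in events:
        if t - window_start >= window_us:
            flush(t)
            surface[:] = 0.0     # start a new accumulation window
            window_start = t
        surface[y, x] += p
    flush(window_start + window_us)  # flush the final partial window
    return filtered
```

Isolated events that do not form an edge fail the response threshold and are dropped, while coherent edges survive as a handful of oriented events per group of pixels, which is how a roughly tenfold reduction can be obtained without losing the structure of the scene.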

SLAM (Simultaneous Localization and Mapping) algorithms were then added to the filter-equipped sensor for testing on an indoor drone-piloting scenario. After 69 meters of travel, the drone's position was measured with an error of just 10 cm, three to ten times better than the current state of the art. The sensors proved particularly well-suited to fast visual odometry and to use cases where speed and sensitive detection of small details matter, qualities that would be especially beneficial for in-line quality inspection in the manufacturing industries.

Three patents have been filed as a result of this research. CEA-List is now working with Prophesee to develop algorithms and applications that take full advantage of event-based sensors' intrinsic qualities.
