The field of artificial vision could greatly benefit from smart bio-inspired vision sensors capable of extracting relevant information from the luminosity signal, instead of simply sampling the external world at fixed time intervals. State-of-the-art neuromorphic vision sensors asynchronously convey information about local variations of the input signal, either over time or over space. A new sensor is proposed that merges the two approaches, with the aim of concurrently evaluating the temporal and spatial derivatives of the visual input. The Asynchronous Space Variant vision sensor reports the temporal derivative and the absolute level of illumination in four regions of each pixel, allowing the spatial derivative to be computed as up/down and left/right differences. Space variance is achieved through two regions of different resolution, a fovea and a periphery, which provide high central resolution and a wide field of view while minimizing the number of pixels and hence the amount of sensory data.
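The up/down and left/right differencing described above can be sketched as follows. This is a minimal illustration only: the function name, the quadrant layout (up-left, up-right, down-left, down-right), and the exact differencing scheme are assumptions, not the sensor's documented read-out.

```python
# Hedged sketch: spatial derivatives from the four sub-pixel illumination
# levels reported by each pixel. Quadrant ordering and the differencing
# scheme are illustrative assumptions.

def spatial_derivatives(ul, ur, dl, dr):
    """Given absolute illumination in the four pixel regions
    (up-left, up-right, down-left, down-right), return the
    horizontal and vertical spatial derivatives as simple
    left/right and up/down differences."""
    dx = (ur + dr) - (ul + dl)  # left/right difference (horizontal gradient)
    dy = (ul + ur) - (dl + dr)  # up/down difference (vertical gradient)
    return dx, dy

# A pixel straddling a vertical edge (bright on the right) yields a
# purely horizontal gradient:
print(spatial_derivatives(1.0, 2.0, 1.0, 2.0))
```

Because each pixel carries its own four illumination levels, such differences can be evaluated locally and asynchronously, without waiting for a full frame.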