This work demonstrates an SNN deployed on an FPGA running the SPLEAT neuromorphic accelerator. In the demo, a computer reads events from the camera and streams them to the board. The board processes the events with the SNN and sends the predicted class back to the computer, which displays the result on screen.
This video is extracted from a test-set sample of the SEENIC dataset.
This sample shows a Hubble mockup recorded on a robotic test bench with an event-based camera.
This tool converts standard RGB video into an event-based representation and provides utilities to visualise and export event frames. It includes a file-to-event video converter and an interactive live USB camera converter with real-time controls (threshold, noise, merge mode, spatial/temporal low-pass, display scale) and recording. The project is intended primarily as a pedagogical tool rather than a research-grade one.
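The core idea of such a converter can be sketched as follows: an event camera reports a per-pixel event whenever the log-intensity change since the last event crosses a contrast threshold. This is a minimal sketch of that frame-pair conversion, not the project's actual implementation; the function name `frames_to_events` and its parameters are hypothetical.

```python
import numpy as np

def frames_to_events(prev, curr, threshold=0.15, eps=1e-6):
    """Generate ON/OFF events from two consecutive grayscale frames.

    Compares the per-pixel log-intensity change against a contrast
    threshold, mimicking an event-based camera. Returns an (N, 3)
    array of (x, y, polarity) rows, with polarity +1 for brightening
    and -1 for dimming pixels. Hypothetical sketch, not the tool's API.
    """
    log_prev = np.log(prev.astype(np.float64) + eps)
    log_curr = np.log(curr.astype(np.float64) + eps)
    diff = log_curr - log_prev
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    pol = np.sign(diff[ys, xs]).astype(np.int64)
    return np.stack([xs, ys, pol], axis=1)
```

Raising `threshold` suppresses small intensity fluctuations (fewer, noisier-free events), which corresponds to the real-time threshold control mentioned above.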
This work is one of the results from the paper: IJCNN - Embedded Event Based Object Detection with Spiking Neural Network.