Asynchronous Convolutional Networks for Object Detection in Neuromorphic Cameras

IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (2019)

Abstract
Event-based cameras, also known as neuromorphic cameras, are bio-inspired sensors able to perceive changes in the scene at high frequency with low power consumption. Since these devices have become available only very recently, a limited amount of work addresses object detection on them. In this paper we propose two neural network architectures for object detection: YOLE, which integrates the events into surfaces and uses a frame-based model to process them, and fcYOLE, an asynchronous event-based fully convolutional network which uses a novel and general formalization of the convolutional and max pooling layers to exploit the sparsity of camera events. We evaluate the algorithms on different extensions of publicly available datasets and on a novel synthetic dataset.
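To make the event-integration step described in the abstract concrete, the sketch below is a minimal, hypothetical example (not the authors' implementation) of accumulating sensor events (x, y, timestamp, polarity) into a decayed 2-D surface that a frame-based detector such as YOLE could then process; the event field layout and the exponential decay constant tau are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the paper's code) of integrating events
# into a 2-D surface suitable for a frame-based detector such as YOLE.
import numpy as np

def events_to_surface(events, height, width, tau=50e3):
    """Accumulate (x, y, t, p) events into a time-decayed surface.

    events: list of (x, y, t_microseconds, polarity in {-1, +1})
    tau:    decay constant in microseconds (assumed value)
    """
    surface = np.zeros((height, width), dtype=np.float32)
    if not events:
        return surface
    t_ref = events[-1][2]  # most recent timestamp as the reference time
    for x, y, t, p in events:
        # Older events contribute less; the newest events dominate the surface.
        surface[y, x] += p * np.exp(-(t_ref - t) / tau)
    return surface

# Example: three synthetic events on a 128x128 sensor.
demo_events = [(10, 20, 1000, +1), (10, 21, 1500, -1), (64, 64, 2000, +1)]
frame_like = events_to_surface(demo_events, 128, 128)
print(frame_like.shape, frame_like[20, 10])
```

The resulting dense array can be fed to a standard convolutional detector, whereas fcYOLE instead operates asynchronously on the sparse events themselves.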
Keywords
neural network architectures, object detection, frame-based model, asynchronous event-based fully convolutional network, convolutional layers, max pooling layers, camera events, neuromorphic cameras, event-based cameras, bio-inspired sensors