Ultra-Low Bandwidth Video Streaming Using A Neuromorphic, Scene-Driven Image Sensor

2016 IEEE International Symposium on Circuits and Systems (ISCAS), 2016

Abstract
This live demonstration shows ultra-low bandwidth video streaming based on a scene-driven, event-encoding imaging sensor. The approach exploits the inherent focal-plane redundancy suppression and video compression achieved by an array of autonomous, auto-sampling pixels. The data readout from the camera is optimized for transmission bandwidth using variable bit-length pixel address encoding and spatio-temporal pre-filtering of the raw image data, resulting in instantaneous bit rates that vary between 0 bps and a set maximum rate, e.g. 256 kbps, depending only on scene activity. The demonstrated device is a small form-factor, stand-alone camera that, besides the sensor and optics, includes hardware-based data encoding and filtering and a wireless data transmission module. The device is designed for applications such as surveillance in resource-limited environments, e.g. sensor networks or the IoT.
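The paper does not specify its encoding scheme beyond "variable bit-length pixel address encoding", so the following is only a minimal sketch of the general idea: event addresses from active pixels are delta-encoded and packed with a variable-length integer code (here a hypothetical LEB128-style varint), so spatially clustered activity yields short codes and an inactive scene produces zero output bytes, consistent with the 0 bps lower bound described in the abstract.

```python
def encode_varint(n: int) -> bytes:
    """LEB128-style variable-length encoding: small values need fewer bytes.
    Each byte carries 7 payload bits; the high bit flags continuation."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # more bytes follow
        else:
            out.append(b)         # final byte
            return bytes(out)


def encode_events(addresses):
    """Encode a sorted stream of active-pixel addresses as deltas + varints.
    Clustered activity gives small deltas, hence short codes; an empty
    stream (no scene activity) produces zero bytes."""
    out = bytearray()
    prev = 0
    for a in addresses:
        out += encode_varint(a - prev)
        prev = a
    return bytes(out)
```

With no events the encoder emits nothing (instantaneous rate 0 bps); a burst of nearby events compresses to roughly one byte per event, illustrating how the output bit rate tracks scene activity rather than frame rate.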
Keywords
ultra-low bandwidth video streaming, scene-driven image sensor, scene-driven event-encoding imaging sensor, inherent focal-plane redundancy suppression, video compression, auto-sampling pixels, spatio-temporal pre-filtering, wireless data transmission module