On-line Access: 2022-01-24

Received: 2020-08-11

Revision Accepted: 2021-05-16


ORCID:
Peiwen ZHANG: https://orcid.org/0000-0002-0040-8546
Jiangtao XU: https://orcid.org/0000-0003-1162-6562
Zhiyuan GAO: https://orcid.org/0000-0001-6708-6222
Kaiming NIE: https://orcid.org/0000-0002-5383-0580


Frontiers of Information Technology & Electronic Engineering  2022 Vol.23 No.1 P.113-122

http://doi.org/10.1631/FITEE.2000407


Motion detection for high-speed high-brightness objects based on a pulse array image sensor


Author(s):  Peiwen ZHANG, Jiangtao XU, Huafeng NIE, Zhiyuan GAO, Kaiming NIE

Affiliation(s):  School of Microelectronics, Tianjin University, Tianjin 300072, China; Tianjin Key Laboratory of Imaging and Sensing Microelectronic Technology, Tianjin 300072, China

Corresponding email(s):   authorneptune@tju.edu.cn, xujiangtao@tju.edu.cn, niehuafeng_ee@163.com, flyuphigher@outlook.com, nkaiming@tju.edu.cn

Key Words:  Optical flow, Retina-like image sensor, Pulse triggered, High-speed targets, Vision processing


Peiwen ZHANG, Jiangtao XU, Huafeng NIE, Zhiyuan GAO, Kaiming NIE. Motion detection for high-speed high-brightness objects based on a pulse array image sensor[J]. Frontiers of Information Technology & Electronic Engineering, 2022, 23(1): 113-122.

@article{zhang2022motion,
title="Motion detection for high-speed high-brightness objects based on a pulse array image sensor",
author="Peiwen ZHANG, Jiangtao XU, Huafeng NIE, Zhiyuan GAO, Kaiming NIE",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="23",
number="1",
pages="113-122",
year="2022",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2000407"
}

%0 Journal Article
%T Motion detection for high-speed high-brightness objects based on a pulse array image sensor
%A Peiwen ZHANG
%A Jiangtao XU
%A Huafeng NIE
%A Zhiyuan GAO
%A Kaiming NIE
%J Frontiers of Information Technology & Electronic Engineering
%V 23
%N 1
%P 113-122
%@ 2095-9184
%D 2022
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.2000407

TY - JOUR
T1 - Motion detection for high-speed high-brightness objects based on a pulse array image sensor
A1 - Peiwen ZHANG
A1 - Jiangtao XU
A1 - Huafeng NIE
A1 - Zhiyuan GAO
A1 - Kaiming NIE
JO - Frontiers of Information Technology & Electronic Engineering
VL - 23
IS - 1
SP - 113
EP - 122
SN - 2095-9184
Y1 - 2022
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2000407
ER -


Abstract: 
We describe a method of optical flow extraction for high-speed high-brightness targets based on a pulse array image sensor (PAIS). PAIS is a retina-like image sensor with pixels triggered by light; it converts light into a series of pulse intervals. The method obtains optical flow directly from the pulse data by accumulating continuous pulses, and the triggered points can be used to filter redundant data when the target is brighter than the background, taking full advantage of the rapid response of PAIS to high-brightness targets. We applied the method to extract the optical flow of high-speed turntables under different background brightness, using both a sensor model and real captured data. Under a sampling condition of 2×10⁴ frames/s, optical flow could be extracted from a high-speed turntable rotating at 1000 r/min, and more than 90% of redundant points could be filtered out. Experimental results showed that the optical flow extraction algorithm based on pulse data can efficiently extract the optical flow information of high-brightness objects without the need to reconstruct images.
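The pipeline the abstract outlines — accumulate pulses into per-pixel counts, keep only the triggered (bright) points to discard redundant background data, then estimate flow from the filtered maps — can be sketched as follows. This is not the authors' algorithm: the reciprocal pulse-interval model, the fixed trigger threshold, and the single global Lucas-Kanade solve are all simplifying assumptions made for illustration.

```python
import numpy as np

def pulse_counts(intervals, window):
    # A pulse-triggered pixel fires once per `interval` time units, so over
    # a sampling window it accumulates roughly window / interval pulses
    # (brighter light -> shorter interval -> more pulses).
    return np.floor(window / intervals).astype(int)

def filter_triggered(counts, thresh):
    # Keep only triggered points: pixels whose pulse count exceeds the
    # background level; dimmer (redundant) pixels are zeroed out.
    return np.where(counts > thresh, counts, 0)

def lk_flow(f0, f1):
    # One global Lucas-Kanade step: least-squares fit of a single (u, v)
    # to the gradient constraint fx*u + fy*v + ft = 0 over all pixels.
    f0 = f0.astype(float)
    f1 = f1.astype(float)
    fy, fx = np.gradient(f0)          # spatial gradients (rows = y, cols = x)
    ft = f1 - f0                      # temporal gradient between count maps
    A = np.stack([fx.ravel(), fy.ravel()], axis=1)
    uv, *_ = np.linalg.lstsq(A, -ft.ravel(), rcond=None)
    return uv                         # [u, v] in pixels per frame

# Toy scene: a bright bar on a dim background, shifted 1 pixel to the right.
H, W = 16, 16
i0 = np.full((H, W), 50.0)            # long interval = dim background
i0[6:10, 4:6] = 2.0                   # short interval = bright target
i1 = np.full((H, W), 50.0)
i1[6:10, 5:7] = 2.0                   # same target, moved right
c0 = filter_triggered(pulse_counts(i0, 100.0), 10)
c1 = filter_triggered(pulse_counts(i1, 100.0), 10)
u, v = lk_flow(c0, c1)                # u should come out positive (rightward)
```

Because the background pixels fall below the trigger threshold, they contribute nothing to the flow solve — in this toy frame over 90% of the pixels are zeroed before any flow computation, which mirrors the redundant-point filtering the paper reports.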

Motion detection for high-speed high-brightness objects based on a pulse array image sensor

Peiwen ZHANG¹,², Jiangtao XU¹,², Huafeng NIE¹,², Zhiyuan GAO¹,², Kaiming NIE¹,²
¹School of Microelectronics, Tianjin University, Tianjin 300072, China
²Tianjin Key Laboratory of Imaging and Sensing Microelectronic Technology, Tianjin 300072, China

Abstract: We propose an optical flow extraction method for high-speed high-brightness targets based on a pulse array image sensor (PAIS). PAIS is a retina-like image sensor that converts the light signal into a series of pulse intervals. Optical flow is obtained directly from the pulse data stream by accumulating continuous pulses; when the target is much brighter than the background, the triggered points can filter out redundant data. The method makes full use of the rapid response of PAIS to high-brightness targets. It was applied to optical flow extraction from high-speed turntables under different background brightness, with experiments on both a sensor model and real captured data. Imaging a turntable rotating at 1000 r/min under a sampling condition of 2×10⁴ frames/s, more than 90% of redundant points could be filtered out. Experimental results show that the optical flow extraction algorithm based on pulse data can effectively extract the optical flow information of high-brightness targets without reconstructing grayscale images.

Key words: Optical flow; Retina-like image sensor; Pulse triggered; High-speed targets; Vision processing




Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - 2022 Journal of Zhejiang University-SCIENCE