
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
ORCID: https://orcid.org/0000-0002-0040-8546; https://orcid.org/0000-0003-1162-6562
Peiwen ZHANG, Jiangtao XU, Huafeng NIE, Zhiyuan GAO, Kaiming NIE. Motion detection for high-speed high-brightness objects based on a pulse array image sensor[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2000407
Motion detection for high-speed high-brightness objects based on a pulse array image sensor
1 School of Microelectronics, Tianjin University, Tianjin 300072, China
2 Tianjin Key Laboratory of Imaging and Sensing Microelectronic Technology, Tianjin 300072, China
Abstract: We propose a method for extracting the optical flow of high-speed, high-brightness objects based on a pulse array image sensor (PAIS). PAIS is a retina-like image sensor that converts light signals into a series of pulse intervals. Optical flow is obtained directly from the pulse data stream by accumulating consecutive pulses; when the object is much brighter than the background, trigger points can filter out redundant data. The method takes full advantage of the fast response of PAIS to high-brightness objects. It was applied to extract the optical flow of high-speed rotating disks under different background luminances, in experiments on both a sensor model and real captured data. For a disk rotating at 1000 r/min captured at a sampling rate of 2×10^4 frames/s, more than 90% of the redundant points could be filtered out. Experimental results show that the pulse-based optical-flow extraction algorithm can effectively extract the optical-flow information of high-brightness objects without reconstructing grayscale images.
Key words:
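The filtering idea in the abstract — each pixel fires pulses at a rate tied to its brightness, so a bright target can be separated from a dim background by thresholding pulse counts over a sampling window — can be sketched as follows. This is a minimal illustrative simulation, not the authors' implementation: the `full_well` reset model, the 3-sigma trigger threshold, and all function names are assumptions.

```python
import numpy as np

def simulate_pulse_counts(brightness, n_samples, full_well=255.0):
    """Pulses fired per pixel over n_samples sampling steps.

    Each pixel integrates light every step and emits a pulse
    (subtracting full_well from its accumulator) whenever the
    accumulated charge crosses full_well, so brighter pixels
    fire more often and have shorter pulse intervals.
    """
    acc = np.zeros_like(brightness, dtype=float)
    counts = np.zeros(brightness.shape, dtype=int)
    for _ in range(n_samples):
        acc += brightness
        fired = acc >= full_well
        counts += fired
        acc[fired] -= full_well
    return counts

def trigger_points(counts, k=3.0):
    """Keep only pixels whose pulse count stands out from the background.

    A simple mean + k*std threshold stands in for the paper's
    trigger-point selection; background pixels fall below it and
    are discarded as redundant.
    """
    return counts > counts.mean() + k * counts.std()
```

On a synthetic frame with a small bright patch on a dim background, the mask returned by `trigger_points` keeps only the patch and discards well over 90% of the pixels, consistent with the redundancy reduction reported in the abstract.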

