Citation formats: GB/T7714, BibTeX, EndNote, RefMan (RIS)
https://orcid.org/0000-0002-0040-8546
https://orcid.org/0000-0003-1162-6562
Peiwen ZHANG, Jiangtao XU, Huafeng NIE, Zhiyuan GAO, Kaiming NIE. Motion detection for high-speed high-brightness objects based on a pulse array image sensor[J]. Frontiers of Information Technology & Electronic Engineering, 2022, 23(1): 113-122.
@article{zhang2022motion,
title="Motion detection for high-speed high-brightness objects based on a pulse array image sensor",
author="Peiwen ZHANG and Jiangtao XU and Huafeng NIE and Zhiyuan GAO and Kaiming NIE",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="23",
number="1",
pages="113-122",
year="2022",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2000407"
}
%0 Journal Article
%T Motion detection for high-speed high-brightness objects based on a pulse array image sensor
%A Peiwen ZHANG
%A Jiangtao XU
%A Huafeng NIE
%A Zhiyuan GAO
%A Kaiming NIE
%J Frontiers of Information Technology & Electronic Engineering
%V 23
%N 1
%P 113-122
%@ 2095-9184
%D 2022
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.2000407
TY - JOUR
T1 - Motion detection for high-speed high-brightness objects based on a pulse array image sensor
A1 - Peiwen ZHANG
A1 - Jiangtao XU
A1 - Huafeng NIE
A1 - Zhiyuan GAO
A1 - Kaiming NIE
JO - Frontiers of Information Technology & Electronic Engineering
VL - 23
IS - 1
SP - 113
EP - 122
SN - 2095-9184
Y1 - 2022
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2000407
ER -
Abstract: We describe a method of optical flow extraction for high-speed high-brightness targets based on a pulse array image sensor (PAIS). PAIS is a retina-like image sensor whose pixels are triggered by light, converting incident light into a series of pulse intervals. The method obtains optical flow directly from the pulse data by accumulating consecutive pulses, and uses the triggered points to filter out redundant data when the target is brighter than the background, taking full advantage of the rapid response of PAIS to high-brightness targets. We applied the method to extract the optical flow of high-speed turntables against backgrounds of different brightness, using both a sensor model and real sensor data. At a sampling rate of 2×10⁴ frames/s, optical flow could be extracted from a turntable rotating at 1000 r/min, and more than 90% of the redundant points were filtered out. Experimental results showed that the pulse-based optical flow extraction algorithm can extract the optical flow information of high-brightness objects efficiently, without the need to reconstruct images first.
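Since the abstract only outlines the pipeline, the short Python sketch below illustrates the two ideas it names rather than the authors' implementation: pulses are accumulated over a fixed window into a per-pixel count map (in a PAIS, brighter light triggers pulses faster), redundant points are filtered by keeping only pixels well above the background level, and a crude motion vector is then read off the displacement of the triggered region between consecutive windows. The event layout, function names, and the median-based threshold are all assumptions made for this sketch.

import numpy as np

# Illustrative sketch only, assuming the PAIS pulse stream arrives as
# (t, x, y) events; not the authors' implementation.

def accumulate_pulses(events, t0, window, shape):
    """Count pulses per pixel over the interval [t0, t0 + window).
    events: (N, 3) array with columns (t, x, y); shape: (H, W)."""
    counts = np.zeros(shape, dtype=np.int32)
    sel = (events[:, 0] >= t0) & (events[:, 0] < t0 + window)
    for t, x, y in events[sel]:
        counts[int(y), int(x)] += 1
    return counts

def filter_redundant(counts, ratio=2.0):
    """Keep only pixels clearly brighter than the background: brighter
    light triggers pulses faster, so a pixel whose count exceeds
    `ratio` times the median count is treated as part of the bright
    target; everything else is discarded as redundant."""
    background = max(float(np.median(counts)), 1.0)
    return counts > ratio * background

def centroid_flow(mask_prev, mask_curr, dt):
    """Crude motion estimate: displacement of the triggered-region
    centroid between two consecutive windows, divided by dt
    (pixels per second). Returns None if either mask is empty."""
    def centroid(mask):
        ys, xs = np.nonzero(mask)
        return np.array([xs.mean(), ys.mean()]) if xs.size else None
    c0, c1 = centroid(mask_prev), centroid(mask_curr)
    if c0 is None or c1 is None:
        return None
    return (c1 - c0) / dt

For scale: at the reported 2×10⁴ frames/s, a turntable at 1000 r/min (about 16.7 r/s) advances only 360° × 16.7 / 20 000 ≈ 0.3° between consecutive windows, so the triggered region moves very little from one window to the next and is easy to track.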