CLC number: TP391.4
Crosschecked: 2011-11-04
Wei Wang, Zhi-xun Su, Jin-shan Pan, Ye Wang, Ri-ming Sun. Robust optical flow estimation based on brightness correction fields[J]. Journal of Zhejiang University Science C, 2011, 12(12): 1010-1020.
@article{wang2011robust,
title="Robust optical flow estimation based on brightness correction fields",
author="Wei Wang and Zhi-xun Su and Jin-shan Pan and Ye Wang and Ri-ming Sun",
journal="Journal of Zhejiang University Science C",
volume="12",
number="12",
pages="1010-1020",
year="2011",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.C1100062"
}
%0 Journal Article
%T Robust optical flow estimation based on brightness correction fields
%A Wei Wang
%A Zhi-xun Su
%A Jin-shan Pan
%A Ye Wang
%A Ri-ming Sun
%J Journal of Zhejiang University SCIENCE C
%V 12
%N 12
%P 1010-1020
%@ 1869-1951
%D 2011
%I Zhejiang University Press & Springer
%DOI 10.1631/jzus.C1100062
TY - JOUR
T1 - Robust optical flow estimation based on brightness correction fields
A1 - Wei Wang
A1 - Zhi-xun Su
A1 - Jin-shan Pan
A1 - Ye Wang
A1 - Ri-ming Sun
JO - Journal of Zhejiang University Science C
VL - 12
IS - 12
SP - 1010
EP - 1020
SN - 1869-1951
Y1 - 2011
PB - Zhejiang University Press & Springer
DOI - 10.1631/jzus.C1100062
ER -
Abstract: Optical flow estimation remains an important task in computer vision with many interesting applications. However, the results obtained by most optical flow techniques are degraded by motion discontinuities or illumination changes. In this paper, we introduce a brightness correction field, combined with a gradient constancy constraint, to reduce the sensitivity to brightness changes between the input images. The advantage of this brightness correction field is its simplicity in terms of computational complexity and implementation. By analyzing the deficiencies of the traditional total variation regularization term in weakly textured areas, we also adopt a structure-adaptive regularization based on the robust Huber norm to preserve motion discontinuities. Finally, the proposed energy functional is minimized by solving its corresponding Euler-Lagrange equation within a more effective multi-resolution scheme that integrates a twice-downsampling strategy with a support-weight median filter. Numerous experiments show that our method is effective and produces accurate results for optical flow estimation.
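To make the data term described in the abstract concrete, the following Python sketch (a minimal illustration under assumptions, not the paper's actual energy functional) combines a brightness-constancy residual relaxed by an additive per-pixel brightness correction field with a gradient-constancy residual, each penalized with the robust Huber norm. The function names, the weight gamma, the threshold eps, and the additive form of the correction field are illustrative assumptions.

    import numpy as np

    def huber(x, eps=1e-2):
        """Huber penalty: quadratic near zero, linear in the tails (robust to outliers)."""
        ax = np.abs(x)
        return np.where(ax <= eps, x * x / (2.0 * eps), ax - eps / 2.0)

    def data_term(I1, I2_warped, correction, gamma=10.0):
        """Robust brightness-constancy plus gradient-constancy cost (illustrative only).

        I1         : first frame (2D array)
        I2_warped  : second frame warped by the current flow estimate (2D array)
        correction : per-pixel brightness correction field (2D array, assumed additive)
        gamma      : weight of the gradient-constancy term (assumed value)
        """
        # Brightness constancy, relaxed by the correction field.
        bright = I2_warped - I1 - correction
        # Gradient constancy: compare spatial gradients of the two frames.
        g1y, g1x = np.gradient(I1)
        g2y, g2x = np.gradient(I2_warped)
        grad = (g2x - g1x) + (g2y - g1y)
        return float(np.sum(huber(bright) + gamma * huber(grad)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        I1 = rng.random((32, 32))
        I2 = I1 + 0.05  # uniform brightness shift, no motion
        print(data_term(I1, I2, np.full_like(I1, 0.05)))  # near zero: shift absorbed by the field
        print(data_term(I1, I2, np.zeros_like(I1)))       # larger: shift is penalized

In this toy example, a uniform brightness shift between the two frames incurs almost no cost once the correction field absorbs it, which is the intuition behind relaxing the brightness constancy assumption.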
[1]Baker, S., Roth, S., Scharstein, D., Black, M., Lewis, J., Szeliski, R., 2007. A Database and Evaluation Methodology for Optical Flow. IEEE 11th Int. Conf. on Computer Vision, p.1-8.
[2]Barron, J., Fleet, D., Beauchemin, S., 1994. Performance of optical flow techniques. Int. J. Comput. Vis., 12(1):43-77.
[3]Black, M.J., Anandan, P., 1996. The robust estimation of multiple motions: parametric and piecewise-smooth flow fields. Comput. Vis. Image Understand., 63(1):75-104.
[4]Brox, T., Malik, J., 2011. Large displacement optical flow: descriptor matching in variational motion estimation. IEEE Trans. Pattern Anal. Mach. Intell., 33(3):500-513.
[5]Brox, T., Bruhn, A., Papenberg, N., Weickert, J., 2004. High Accuracy Optical Flow Estimation Based on a Theory for Warping. European Conf. on Computer Vision, p.25-36.
[6]Bruhn, A., Weickert, J., Schnörr, C., 2005. Lucas/Kanade meets Horn/Schunck: combining local and global optic flow methods. Int. J. Comput. Vis., 61(3):211-231.
[7]Cassisa, C., Simoens, S., Prinet, V., 2009. Two-frame optical flow formulation in an unwarped multiresolution scheme. LNCS, 5856:790-797.
[8]Dessauer, M.P., Dua, S., 2010. Optical flow object detection, motion estimation, and tracking on moving vehicles using wavelet decompositions. SPIE, 7694:76941J.
[9]Efros, A., Berg, A., Mori, G., Malik, J., 2003. Recognizing Action at a Distance. Proc. 9th IEEE Int. Conf. on Computer Vision, p.726-733.
[10]Fakih, A., Zelek, J., 2008. Structure from Motion: Combining Features Correspondences and Optical Flow. 19th Int. Conf. on Pattern Recognition, p.1-4.
[11]Gennert, M.A., Negahdaripour, S., 1987. Relaxing the Brightness Constancy Assumption in Computing Optical Flow. Technical Report, Massachusetts Institute of Technology, Cambridge, MA, USA.
[12]Gilland, D.R., Mair, B.A., Parker, J.G., 2008. Motion estimation for cardiac emission tomography by optical flow methods. Phys. Med. Biol., 53(11):2991-3006.
[13]Gray, R.M., 2006. Toeplitz and circulant matrices: a review. Found. Trends Commun. Inform. Theory, 2(3):155-239.
[14]Haussecker, H.W., Fleet, D.J., 2001. Computing optical flow with physical models of brightness variation. IEEE Trans. Pattern Anal. Mach. Intell., 23(6):661-673.
[15]Horn, B., Schunck, B., 1981. Determining optical flow. Artif. Intell., 17(1-3):185-203.
[16]Hsiao, I., Rangarajan, A., Gindi, G., 2003. A new convex edge-preserving median prior with applications to tomography. IEEE Trans. Med. Imag., 22(5):580-585.
[17]Huber, P.J., 1973. Robust regression: asymptotics, conjectures and Monte Carlo. Ann. Stat., 1(5):799-821.
[18]Kim, Y.H., Martinez, A.M., Kak, A.C., 2005. Robust motion estimation under varying illumination. Image Vis. Comput., 23(4):365-375.
[19]Lempitsky, V., Roth, S., Rother, C., 2008. FusionFlow: Discrete Continuous Optimization for Optical Flow Estimation. IEEE Conf. on Computer Vision and Pattern Recognition, p.1-8.
[20]Li, Y., Osher, S., 2009. A new median formula with applications to PDE based denoising. Commun. Math. Sci., 7(3):741-753.
[21]Liu, C., Yuen, J., Torralba, A., Sivic, J., Freeman, W.T., 2008. SIFT flow: dense correspondence across different scenes. LNCS, 5304:28-42.
[22]Myronenko, A., Song, X., 2009. Image Registration by Minimization of Residual Complexity. IEEE Conf. on Computer Vision and Pattern Recognition, p.49-56.
[23]Negahdaripour, S., 1998. Revised definition of optical flow: integration of radiometric and geometric cues for dynamic scene analysis. IEEE Trans. Pattern Anal. Mach. Intell., 20(9):961-979.
[24]Sand, P., Teller, S., 2008. Particle video: long-range motion estimation using point trajectories. Int. J. Comput. Vis., 80(1):72-91.
[25]Shulman, D., Herve, J.Y., 1989. Regularization of Discontinuous Flow Fields. Proc. Workshop on Visual Motion, p.81-86.
[26]Steinbrücker, F., Pock, T., Cremers, D., 2009. Large Displacement Optical Flow Computation without Warping. Int. Conf. on Computer Vision, p.1609-1614.
[27]Strang, G., 1999. The discrete cosine transform. SIAM Rev., 41(1):135-147.
[28]Sun, D., Roth, S., Black, M.J., 2010. Secrets of Optical Flow Estimation and Their Principles. IEEE Conf. on Computer Vision and Pattern Recognition, p.2432-2439.
[29]Teng, C.H., Lai, S.H., Chen, Y.S., Hsu, W.H., 2005. Accurate optical flow computation under non-uniform brightness variations. Comput. Vis. Image Understand., 97(3):315-346.
[30]Wang, M.Y., Hu, H.B., Qin, B.J., 2007. Robust Deformable Medical Image Registration Using Optical Flow and Multilevel Free Form Deformation. Nuclear Science Symp. Conf. Record, p.4552-4555.
[31]Wedel, A., Pock, T., Zach, C., Bischof, H., Cremers, D., 2009a. An improved algorithm for TV-L1 optical flow. LNCS, 5604:23-45.
[32]Wedel, A., Cremers, D., Pock, T., Bischof, H., 2009b. Structure- and Motion-Adaptive Regularization for High Accuracy Optic Flow. IEEE 12th Int. Conf. on Computer Vision, p.1663-1668.
[33]Werlberger, M., Trobin, W., Pock, T., Wedel, A., Cremers, D., Bischof, H., 2009. Anisotropic Huber-L1 Optical Flow. British Machine Vision Conf., p.1-11.
[34]Yoon, K.J., Kweon, I.S., 2006. Adaptive support-weight approach for correspondence search. IEEE Trans. Pattern Anal. Mach. Intell., 28(4):650-656.
[35]Zach, C., Pock, T., Bischof, H., 2007. A duality based approach for realtime TV-L1 optical flow. LNCS, 4713:214-223.