CLC number: TP391
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
WU Xue-dong, SONG Zhi-huan. Gaussian particle filter based pose and motion estimation[J]. Journal of Zhejiang University Science A, 2007, 8(10): 1604-1613.
@article{WuSong2007,
title="Gaussian particle filter based pose and motion estimation",
author="WU Xue-dong and SONG Zhi-huan",
journal="Journal of Zhejiang University Science A",
volume="8",
number="10",
pages="1604-1613",
year="2007",
publisher="Zhejiang University Press \& Springer",
doi="10.1631/jzus.2007.A1604"
}
%0 Journal Article
%T Gaussian particle filter based pose and motion estimation
%A WU Xue-dong
%A SONG Zhi-huan
%J Journal of Zhejiang University SCIENCE A
%V 8
%N 10
%P 1604-1613
%@ 1673-565X
%D 2007
%I Zhejiang University Press & Springer
%R 10.1631/jzus.2007.A1604
TY - JOUR
T1 - Gaussian particle filter based pose and motion estimation
A1 - WU Xue-dong
A1 - SONG Zhi-huan
JO - Journal of Zhejiang University Science A
VL - 8
IS - 10
SP - 1604
EP - 1613
SN - 1673-565X
Y1 - 2007
PB - Zhejiang University Press & Springer
DO - 10.1631/jzus.2007.A1604
ER -
Abstract: Determination of the relative three-dimensional (3D) position, orientation, and motion between two reference frames is an important problem in robotic guidance, manipulation, and assembly, as well as in other fields such as photogrammetry. A solution to the pose and motion estimation problem that uses two-dimensional (2D) intensity images from a single camera is desirable for real-time applications. The difficulty of this measurement is that the projection of 3D object features onto 2D images is a nonlinear transformation. In this paper, the 3D transformation is modeled as a nonlinear stochastic system whose state estimate provides the six degrees-of-freedom position and motion values, using line features in the image plane as measurements and a dual quaternion to represent rotation and translation in a unified notation. A filtering method called the Gaussian particle filter (GPF), based on the particle filtering concept, is presented for 3D pose and motion estimation of a moving target from monocular image sequences. The method has been implemented with simulated data, and simulation results are provided along with comparisons to the extended Kalman filter (EKF) and the unscented Kalman filter (UKF) to show the relative advantages of the GPF. The simulation results show that the GPF is a superior alternative to the EKF and the UKF.
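As a rough illustration of the GPF recursion described in the abstract, the sketch below implements one predict/update cycle of a generic Gaussian particle filter in Python/NumPy: particles are drawn from the current Gaussian posterior, propagated through the nonlinear dynamics, weighted by the measurement likelihood, and a single Gaussian is then moment-matched to the weighted set in place of resampling. The transition function f, measurement function h, noise covariances Q and R, and the toy constant-velocity example are hypothetical placeholders, not the paper's six degrees-of-freedom dual-quaternion state or line-feature measurement model.

import numpy as np

def gpf_step(mean, cov, z, f, h, Q, R, n_particles=500, rng=None):
    """One predict/update cycle of a Gaussian particle filter.

    mean, cov : Gaussian approximation of the posterior at time k-1
    z         : measurement at time k (e.g. image-plane line features)
    f, h      : nonlinear state-transition and measurement functions
    Q, R      : process- and measurement-noise covariances
    """
    rng = np.random.default_rng() if rng is None else rng
    dim_x = mean.shape[0]

    # 1) Draw particles from the current Gaussian posterior approximation.
    particles = rng.multivariate_normal(mean, cov, size=n_particles)

    # 2) Propagate each particle through the nonlinear dynamics (prediction).
    particles = np.array([f(x) for x in particles])
    particles += rng.multivariate_normal(np.zeros(dim_x), Q, size=n_particles)

    # 3) Weight particles by the Gaussian measurement likelihood p(z | x).
    innov = z - np.array([h(x) for x in particles])
    R_inv = np.linalg.inv(R)
    log_w = -0.5 * np.einsum('ij,jk,ik->i', innov, R_inv, innov)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # 4) Moment-match a single Gaussian to the weighted particles; this
    #    replaces the resampling step of a conventional particle filter.
    new_mean = w @ particles
    centered = particles - new_mean
    new_cov = (w[:, None] * centered).T @ centered
    return new_mean, new_cov

# Hypothetical usage with a toy 2D constant-velocity model (not the paper's
# 6-DOF dual-quaternion model): state = [position, velocity].
if __name__ == "__main__":
    f = lambda x: np.array([x[0] + x[1], x[1]])   # transition
    h = lambda x: np.array([x[0]])                # observe position only
    Q = np.diag([1e-3, 1e-3])
    R = np.array([[1e-2]])
    mean, cov = np.zeros(2), np.eye(2)
    for z in [np.array([0.1]), np.array([0.25]), np.array([0.4])]:
        mean, cov = gpf_step(mean, cov, z, f, h, Q, R)
    print(mean)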