CLC number: TP13
On-line Access: 2021-02-01
Received: 2019-08-31
Revision Accepted: 2019-10-09
Crosschecked: 2020-05-18
Wei Li, Rong Xiong. A hybrid visual servo control method for simultaneously controlling a nonholonomic mobile and a manipulator[J]. Frontiers of Information Technology & Electronic Engineering, 2021, 22(2): 141-154.
Abstract: Visual servo control refers to methods that plan robot motion using image data acquired from a camera mounted on the robot; such methods have been widely applied to the motion control of robotic arms and mobile robots. They are usually classified as image-based visual servo (IBVS), position-based visual servo (PBVS), and hybrid visual servo (HVS) control. Mobile manipulation extends the working range and flexibility of robotic arms, but little work has applied visual servo control to the motion of the whole mobile manipulation robot. We propose an HVS motion control method for a mobile manipulation robot that combines a six-degree-of-freedom (6-DOF) robotic arm with a nonholonomic mobile base. Based on the kinematic differential equations of the mobile manipulation robot, the global Jacobian matrix of the whole robot is derived, and the HVS control equation is obtained by combining this Jacobian with position and visual image information. The distance between the gripper and the target is calculated from observations of a marker by a camera mounted on the gripper, and the differences between the positions of the marker's feature points and their expected positions in the image coordinate system are also computed. Substituting these differences into the control equation yields the velocity control law for each degree of freedom of the mobile manipulation robot. To suppress position errors caused by observation noise, a Kalman filter is introduced to correct the position and orientation of the end of the manipulator. Finally, the proposed algorithm is validated on a mobile manipulation platform consisting of a Bulldog chassis, a UR5 robotic arm, and a ZED camera.
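The control structure the abstract describes — stacking a pose/image-feature error and mapping it through a whole-robot Jacobian to per-DOF velocities — follows the standard visual-servo velocity law v = -λ J⁺ e. The sketch below illustrates that generic law only; the Jacobian entries, dimensions, and error vector are illustrative placeholders, not the paper's actual whole-body Jacobian or gains.

```python
import numpy as np

def hvs_velocity_command(J, error, gain=0.5):
    """Map a stacked error vector to DOF velocities via v = -gain * pinv(J) @ error.

    J     : (m, n) whole-robot Jacobian relating DOF velocities to task motion
    error : (m,) stacked error (e.g., image-feature offsets plus pose error)
    gain  : scalar convergence gain (lambda)
    """
    # The Moore-Penrose pseudo-inverse gives a least-squares solution, which
    # handles the redundant case (more DOFs than task dimensions) that arises
    # when a mobile base is controlled together with a 6-DOF arm.
    return -gain * np.linalg.pinv(J) @ error

# Illustrative example: a 6-dimensional task driven by 8 DOFs
# (2 for a nonholonomic base + 6 arm joints). Values are random placeholders.
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 8))
e = rng.standard_normal(6)
v = hvs_velocity_command(J, e)
print(v.shape)  # one velocity per DOF: (8,)
```

With a full-row-rank Jacobian, this command drives the task error toward zero at rate λ, since J v = -λ e; the paper additionally filters the observed end-effector pose (via a Kalman filter) before forming the error.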