CLC number: TP242.6
Crosschecked: 2016-03-09
Feng-yu Zhou, Xian-feng Yuan, Yang Yang, Zhi-fei Jiang, Chen-lei Zhou. A high precision visual localization sensor and its working methodology for an indoor mobile robot[J]. Frontiers of Information Technology & Electronic Engineering, 2016, 17(4): 365-374.
@article{title="A high precision visual localization sensor and its working methodology for an indoor mobile robot",
author="Feng-yu Zhou, Xian-feng Yuan, Yang Yang, Zhi-fei Jiang, Chen-lei Zhou",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="17",
number="4",
pages="365-374",
year="2016",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.1500272"
}
%0 Journal Article
%T A high precision visual localization sensor and its working methodology for an indoor mobile robot
%A Feng-yu Zhou
%A Xian-feng Yuan
%A Yang Yang
%A Zhi-fei Jiang
%A Chen-lei Zhou
%J Frontiers of Information Technology & Electronic Engineering
%V 17
%N 4
%P 365-374
%@ 2095-9184
%D 2016
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.1500272
TY - JOUR
T1 - A high precision visual localization sensor and its working methodology for an indoor mobile robot
A1 - Feng-yu Zhou
A1 - Xian-feng Yuan
A1 - Yang Yang
A1 - Zhi-fei Jiang
A1 - Chen-lei Zhou
JO - Frontiers of Information Technology & Electronic Engineering
VL - 17
IS - 4
SP - 365
EP - 374
SN - 2095-9184
Y1 - 2016
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.1500272
ER -
Abstract: To overcome the shortcomings of existing robot localization sensors, such as low accuracy and poor robustness, a high-precision visual localization system based on infrared-reflective artificial markers is designed and described in detail in this paper. First, the hardware of the localization sensor is developed. Second, a novel infrared-reflective artificial marker is designed, whose features can be extracted through the acquisition and processing of infrared images. In addition, a confidence calculation method for marker identification is proposed to obtain probabilistic localization results. Finally, autonomous localization of the robot is achieved by computing the relative pose between the robot and the artificial marker with the perspective-3-point (P3P) visual localization algorithm. Extensive experiments and practical applications show that the designed localization sensor system is robust to changes in illumination and observation angle. The precision of the sensor is ±1.94 cm in position and ±1.64° in orientation, which fully satisfies the localization precision requirements of an indoor mobile robot.
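The abstract outlines a pipeline of infrared image acquisition, marker-dot extraction, and P3P-based pose computation. As a minimal illustration of the P3P step only, the following Python/OpenCV sketch recovers the camera (robot) pose relative to a marker, assuming a calibrated infrared camera, a planar marker with four reference dots of known layout, and dot centroids already extracted from the image. The intrinsics, the 10 cm square dot layout, and the pixel coordinates below are illustrative assumptions, not values from the paper.

import numpy as np
import cv2

# Intrinsics of the infrared camera, obtained from offline calibration
# (illustrative values, not from the paper).
K = np.array([[615.0,   0.0, 320.0],
              [  0.0, 615.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)  # lens distortion neglected in this sketch

# 3D coordinates (m) of four reference dots in the marker frame;
# a 10 cm square layout is assumed purely for illustration.
object_pts = np.array([[0.00, 0.00, 0.0],
                       [0.10, 0.00, 0.0],
                       [0.10, 0.10, 0.0],
                       [0.00, 0.10, 0.0]])

# Pixel centroids of the same dots, e.g. blob centers extracted from the
# thresholded infrared image (placeholder values).
image_pts = np.array([[310.4, 238.1],
                      [402.7, 240.6],
                      [400.9, 331.3],
                      [308.2, 329.8]])

# OpenCV's P3P solver needs exactly four correspondences; the fourth point
# disambiguates the up-to-four analytical P3P solutions.
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist,
                              flags=cv2.SOLVEPNP_P3P)
assert ok, "pose could not be recovered"

R, _ = cv2.Rodrigues(rvec)              # rotation: marker frame -> camera frame
cam_pos_in_marker = -R.T @ tvec         # camera position in the marker frame
heading_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))  # one possible heading convention

print("camera position (m):", cam_pos_in_marker.ravel())
print("heading (deg):", heading_deg)

Given the marker's known pose in the room, the robot pose in the global frame follows by composing the two transforms; that composition is omitted here.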
For autonomous indoor service robots, an embedded visual localization sensor is designed and a dot-matrix infrared-reflective artificial marker is presented. Based on a statistical analysis of the grey values of the marker dots, a confidence calculation method for marker identification is provided. Experimental results demonstrate the effectiveness of the visual localization sensor system.
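The abstract states only that the marker-identification confidence is obtained from a statistical analysis of the dots' grey values; the exact formula is not given there. The sketch below is a hypothetical way to build such a score, comparing the mean grey value of each candidate dot region with the range expected of the infrared-reflective material and combining the per-dot scores. The expected mean, spread, and geometric-mean combination are assumptions for illustration, not the paper's method.

import numpy as np

def dot_confidence(grey_patch, expected_mean=220.0, expected_std=15.0):
    # Hypothetical per-dot score: how closely the mean grey value of a
    # candidate dot region matches that expected of an IR-reflective dot.
    mu = float(np.mean(grey_patch))
    return float(np.exp(-0.5 * ((mu - expected_mean) / expected_std) ** 2))

def marker_confidence(dot_patches):
    # Combine per-dot scores into one marker-identification confidence.
    # A geometric mean penalizes any single weak dot (an assumed choice).
    scores = np.clip([dot_confidence(p) for p in dot_patches], 1e-6, 1.0)
    return float(np.exp(np.mean(np.log(scores))))

# Example: four bright candidate patches cut from a thresholded infrared image.
patches = [np.full((9, 9), v, dtype=np.uint8) for v in (224, 218, 230, 215)]
print("marker confidence:", round(marker_confidence(patches), 3))

A threshold on this confidence can then decide whether a candidate is accepted as a valid marker before the P3P pose computation is run.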