CLC number: TP391; TN2

On-line Access: 2016-08-05

Received: 2015-10-08

Revision Accepted: 2016-02-17

Crosschecked: 2016-07-14


 ORCID:

Gaetano C. La Delfa

http://orcid.org/0000-0002-1842-5467


Frontiers of Information Technology & Electronic Engineering  2016 Vol.17 No.8 P.730-740

http://doi.org/10.1631/FITEE.1500324


Performance analysis of visual markers for indoor navigation systems


Author(s):  Gaetano C. La Delfa, Salvatore Monteleone, Vincenzo Catania, Juan F. De Paz, Javier Bajo

Affiliation(s):  Department of Electrical, Electronics and Computer Engineering (DIEEI), University of Catania, Catania 95125, Italy

Corresponding email(s):   gaetano.ladelfa@dieei.unict.it

Key Words:  Indoor localization, Visual markers, Computer vision


Gaetano C. La Delfa, Salvatore Monteleone, Vincenzo Catania, Juan F. De Paz, Javier Bajo. Performance analysis of visual markers for indoor navigation systems[J]. Frontiers of Information Technology & Electronic Engineering, 2016, 17(8): 730-740.

@article{LaDelfa2016,
title="Performance analysis of visual markers for indoor navigation systems",
author="Gaetano C. La Delfa, Salvatore Monteleone, Vincenzo Catania, Juan F. De Paz, Javier Bajo",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="17",
number="8",
pages="730-740",
year="2016",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.1500324"
}

%0 Journal Article
%T Performance analysis of visual markers for indoor navigation systems
%A Gaetano C. La Delfa
%A Salvatore Monteleone
%A Vincenzo Catania
%A Juan F. De Paz
%A Javier Bajo
%J Frontiers of Information Technology & Electronic Engineering
%V 17
%N 8
%P 730-740
%@ 2095-9184
%D 2016
%I Zhejiang University Press & Springer
%DOI 10.1631/FITEE.1500324

TY - JOUR
T1 - Performance analysis of visual markers for indoor navigation systems
A1 - Gaetano C. La Delfa
A1 - Salvatore Monteleone
A1 - Vincenzo Catania
A1 - Juan F. De Paz
A1 - Javier Bajo
JO - Frontiers of Information Technology & Electronic Engineering
VL - 17
IS - 8
SP - 730
EP - 740
SN - 2095-9184
Y1 - 2016
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.1500324
ER -


Abstract: 
The massive diffusion of smartphones, the growing interest in wearable devices and the Internet of Things, and the exponential rise of location-based services (LBSs) have made localization and navigation inside buildings one of the most important technological challenges of recent years. Indoor positioning systems have a huge market in the retail sector and contextual advertising; in addition, they can be fundamental to increasing the quality of life of citizens if deployed inside public buildings such as hospitals, airports, and museums. In emergency situations, they can sometimes make the difference between life and death. Various approaches have been proposed in the literature. Recently, thanks to the high performance of smartphone cameras, marker-less and marker-based computer vision approaches have been investigated. In a previous paper, we proposed a technique for indoor localization and navigation that uses both Bluetooth low energy (BLE) and a 2D visual marker system deployed on the floor. In this paper, we present a qualitative performance evaluation of three 2D visual marker systems, Vuforia, ArUco marker, and AprilTag, which are suitable for real-time applications. Our analysis focuses on the specific case of visual markers placed onto floor tiles, with the aim of improving the efficiency of our indoor localization and navigation approach by choosing the best visual marker system.
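
As an illustration of what the marker-based computer vision approach involves, the sketch below (a minimal example, not the authors' implementation) detects ArUco markers in a single camera frame using OpenCV's aruco module (opencv-contrib-python); exact function names such as DetectorParameters_create vary slightly across OpenCV versions.

import cv2

def detect_aruco_markers(frame):
    # Marker detection operates on grayscale intensity images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # One of OpenCV's predefined ArUco dictionaries (4x4-bit markers, 50 ids).
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    parameters = cv2.aruco.DetectorParameters_create()
    # Returns the pixel corners and numeric ids of every marker found.
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary, parameters=parameters)
    return ids, corners

cap = cv2.VideoCapture(0)  # e.g., a smartphone or webcam stream
ok, frame = cap.read()
if ok:
    ids, _corners = detect_aruco_markers(frame)
    print("Detected marker IDs:", [] if ids is None else ids.flatten().tolist())
cap.release()

In an indoor navigation system of the kind described above, each detected marker id would then be looked up in a map of known floor-tile positions to estimate the user's location.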



