CLC number: TP242.6; V279

On-line Access: 2020-12-10

Received: 2020-01-27

Revision Accepted: 2020-04-07

Crosschecked: 2020-04-22

Frontiers of Information Technology & Electronic Engineering  2020 Vol.21 No.12 P.1695-1712


Multi-UAV collaborative system with a feature fast matching algorithm

Author(s):  Tian-miao Wang, Yi-cheng Zhang, Jian-hong Liang, Yang Chen, Chao-lei Wang

Affiliation(s):  School of Mechanical Engineering & Automation, Beihang University, Beijing 100191, China

Corresponding email(s):   zycet@126.com

Key Words:  Multiple UAVs, Collaboration, Simultaneous localization and mapping (SLAM), Feature description and matching

Tian-miao Wang, Yi-cheng Zhang, Jian-hong Liang, Yang Chen, Chao-lei Wang. Multi-UAV collaborative system with a feature fast matching algorithm[J]. Frontiers of Information Technology & Electronic Engineering, 2020, 21(12): 1695-1712.

@article{title="Multi-UAV collaborative system with a feature fast matching algorithm",
author="Tian-miao Wang, Yi-cheng Zhang, Jian-hong Liang, Yang Chen, Chao-lei Wang",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="21",
number="12",
pages="1695-1712",
year="2020",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2000047"
}

%0 Journal Article
%T Multi-UAV collaborative system with a feature fast matching algorithm
%A Tian-miao Wang
%A Yi-cheng Zhang
%A Jian-hong Liang
%A Yang Chen
%A Chao-lei Wang
%J Frontiers of Information Technology & Electronic Engineering
%V 21
%N 12
%P 1695-1712
%@ 2095-9184
%D 2020
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.2000047

TY - JOUR
T1 - Multi-UAV collaborative system with a feature fast matching algorithm
A1 - Tian-miao Wang
A1 - Yi-cheng Zhang
A1 - Jian-hong Liang
A1 - Yang Chen
A1 - Chao-lei Wang
JO - Frontiers of Information Technology & Electronic Engineering
VL - 21
IS - 12
SP - 1695
EP - 1712
SN - 2095-9184
Y1 - 2020
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2000047
ER -

We present a real-time monocular simultaneous localization and mapping (SLAM) system with a new distributed structure for multi-UAV collaboration tasks. The system differs from general SLAM systems in two respects: first, it does not aim to build a global map, but instead estimates the latest relative positions between nearby vehicles; second, there is no centralized structure, and each vehicle maintains its own metric map and ego-motion estimator, from which it computes the relative position between its own map and those of its neighbors. To realize these characteristics in real time, we introduce a new feature description and matching algorithm that avoids the catastrophic growth of the feature-matching workload as the number of UAVs increases. Based on hashing and principal component analysis, the algorithm reduces the matching time complexity from O(log N) to O(1). To evaluate its performance, the algorithm is verified on a well-known multi-view stereo benchmark dataset, with excellent results. Finally, the improved SLAM system with the proposed algorithm is validated through simulation and real flight experiments.
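The abstract does not give implementation details of the matching algorithm, but the combination it names (PCA plus hashing for expected O(1) descriptor lookup) can be illustrated with a minimal sketch. All names below are hypothetical: descriptors are projected onto the leading principal components, binarized by sign into an integer key, and stored in a dictionary that serves as the hash table, so matching a query costs one hash lookup instead of a tree search.

```python
import numpy as np

def pca_basis(descriptors, n_components=16):
    """Fit a PCA basis (mean + top principal directions) on sample descriptors."""
    mean = descriptors.mean(axis=0)
    centered = descriptors - mean
    # Right singular vectors of the centered data = principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def hash_key(descriptor, mean, basis):
    """Project onto the PCA basis and binarize by sign into an integer key."""
    proj = basis @ (descriptor - mean)
    key = 0
    for positive in (proj > 0):
        key = (key << 1) | int(positive)
    return key

# Build a hash table over one vehicle's map descriptors.
rng = np.random.default_rng(0)
map_desc = rng.standard_normal((500, 64))      # e.g. 500 64-D descriptors
mean, basis = pca_basis(map_desc)

table = {}
for i, d in enumerate(map_desc):
    table.setdefault(hash_key(d, mean, basis), []).append(i)

# Matching a query descriptor: a single expected-O(1) dictionary lookup
# returns the bucket of candidate matches for fine verification.
query = map_desc[42] + 0.01 * rng.standard_normal(64)
candidates = table.get(hash_key(query, mean, basis), [])
```

A noisy query can flip a sign bit near zero, so a practical system would also probe neighboring keys or verify candidates with a distance check; the point of the sketch is only that lookup cost is independent of the number of stored descriptors.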









Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - 2022 Journal of Zhejiang University-SCIENCE