|
Frontiers of Information Technology & Electronic Engineering
ISSN 2095-9184 (print), ISSN 2095-9230 (online)
2023 Vol.24 No.6 P.828-843
Underwater object detection by fusing features from different representations of sonar data
Abstract: Modern underwater object detection methods recognize objects in sonar data by their geometric shapes, yet the distortion that objects undergo during data acquisition and representation is seldom considered. In this paper, we present a detailed summary of sonar data representations and a concrete analysis of their geometric characteristics. On this basis, a feature fusion framework is proposed to make full use of the intensity features extracted from the polar image representation and the geometric features learned from the point cloud representation of sonar data. Three feature fusion strategies are presented to investigate the impact of fusion on different components of the detection pipeline, and these strategies can be easily integrated into other detectors, such as the You Only Look Once (YOLO) series. The effectiveness of the proposed framework and fusion strategies is demonstrated on a public sonar dataset captured in real-world underwater environments. Experimental results show that our method benefits both the region proposal and object classification modules of the detectors.
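To make the fusion idea in the abstract concrete, the following is a minimal PyTorch sketch of one possible late-fusion arrangement: a small CNN branch encodes the polar intensity image, a PointNet-style branch encodes the point cloud, and the two feature vectors are concatenated before a shared classifier. All module names, layer sizes, and the concatenation strategy are illustrative assumptions, not the paper's implementation or any of its three specific fusion strategies.

import torch
import torch.nn as nn

class IntensityBranch(nn.Module):
    # Toy CNN extracting intensity features from a polar sonar image (assumed layout).
    def __init__(self, out_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, out_dim)

    def forward(self, img):  # img: (B, 1, H, W)
        return self.fc(self.conv(img).flatten(1))

class GeometryBranch(nn.Module):
    # Toy PointNet-style encoder for the point cloud representation.
    def __init__(self, out_dim=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, out_dim), nn.ReLU(),
        )

    def forward(self, pts):  # pts: (B, N, 3)
        # Max-pool over points gives a permutation-invariant global feature.
        return self.mlp(pts).max(dim=1).values

class FusionHead(nn.Module):
    # Concatenation-based fusion feeding a shared classification layer.
    def __init__(self, feat_dim=128, num_classes=4):
        super().__init__()
        self.cls = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, f_img, f_pts):
        return self.cls(torch.cat([f_img, f_pts], dim=1))

# Usage: fuse features from the two representations of the same sonar frame.
img_branch, geo_branch, head = IntensityBranch(), GeometryBranch(), FusionHead()
logits = head(img_branch(torch.randn(2, 1, 64, 64)),
              geo_branch(torch.randn(2, 1024, 3)))
print(logits.shape)  # torch.Size([2, 4])

In the paper's full pipeline, such fused features would feed not only the classifier but also the region proposal module; this sketch only shows the classification path.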
Key words: Underwater object detection; Sonar data representation; Feature fusion
1 College of Information Science and Technology, Dalian Maritime University, Dalian 116026, China
2 College of Transportation Engineering, Dalian Maritime University, Dalian 116026, China
DOI: 10.1631/FITEE.2200429
CLC number: TN911.73; TP391.41
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2023-07-03