Journal of Zhejiang University SCIENCE C
ISSN 1869-1951(Print), 1869-196X(Online), Monthly
2014 Vol.15 No.3 P.174-186
K-nearest neighborhood based integration of time-of-flight cameras and passive stereo for high-accuracy depth maps
Abstract: Both time-of-flight (ToF) cameras and passive stereo can provide depth information for captured real scenes, but each has innate limitations. ToF cameras and passive stereo are intrinsically complementary for certain tasks, so it is desirable to appropriately leverage all the information available from both. Although some fusion methods have been presented recently, they fail to consider ToF reliability detection and ToF-based improvement of passive stereo. This study therefore proposes an approach that integrates ToF cameras and passive stereo to obtain high-accuracy depth maps. The main contributions are: (1) an energy cost function is devised to use data from ToF cameras to boost the stereo matching of passive stereo; (2) a fusion method is used to combine the depth information from both ToF cameras and passive stereo to obtain high-accuracy depth maps. Experiments show that the proposed approach achieves improved results with high accuracy and robustness.
Key words: Depth map, Passive stereo, Time-of-flight camera, Fusion
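For illustration only, the following minimal sketch (in Python, not the authors' code) shows one way an energy cost combining a photometric data term with a ToF-guided penalty could steer stereo matching, in the spirit of contribution (1) above. The function name, the absolute-difference data term, the amplitude-based confidence weight, and the parameter lam are assumptions of this sketch, not details taken from the paper.

import numpy as np

def tof_guided_stereo_cost(left, right, tof_depth, tof_amplitude,
                           max_disp, focal_length, baseline, lam=0.5):
    """Hypothetical per-pixel matching cost: a plain absolute-difference
    stereo term plus a ToF-derived guidance penalty (sketch only)."""
    H, W = left.shape
    cost = np.full((H, W, max_disp), np.inf, dtype=np.float32)

    # Normalised amplitude as a crude reliability weight: low-amplitude
    # ToF pixels contribute little guidance (an assumption of this sketch).
    conf = tof_amplitude / (tof_amplitude.max() + 1e-6)

    # Disparity predicted by the ToF depth: d = f * B / Z.
    tof_disp = focal_length * baseline / np.maximum(tof_depth, 1e-6)

    for d in range(max_disp):
        # Photometric data term: |I_L(x, y) - I_R(x - d, y)|.
        if d > 0:
            shifted = np.zeros_like(right)
            shifted[:, d:] = right[:, :-d]
        else:
            shifted = right
        data = np.abs(left - shifted)

        # ToF guidance term: penalise disparities far from the ToF
        # prediction, scaled by the amplitude-based confidence.
        guide = conf * np.abs(d - tof_disp)

        cost[:, :, d] = data + lam * guide

    # Winner-take-all disparity, only to keep the sketch short.
    return np.argmin(cost, axis=2)

The winner-take-all step is a simplification; the paper's actual optimization of the energy cost may differ.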
Key innovations: The regions where the ToF depth camera is reliable are used to guide the stereo matching process, improving the stereo matching results. In addition, a new cost-optimization depth fusion algorithm is proposed that merges the ToF depth camera measurements and the depths produced by stereo matching into a depth map of higher accuracy.
Methodology: The approach consists of two main algorithmic stages (the pipeline is shown in Fig. 1). First, an energy function is constructed from the depth measurements and the corresponding intensity-amplitude map provided by the ToF depth camera; combined with a K-nearest-neighbor algorithm, this energy function guides the original stereo matching process. Then, the refined stereo matching result is combined with the ToF depth map to build a cost function, and the optimal depth solution is selected as the final fused depth, as sketched below.
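As a rough illustration of the fusion stage described above (again, not the authors' implementation), the sketch below builds a per-pixel cost over a small set of candidate depths, weighted by ToF and stereo confidences, and keeps the lowest-cost candidate. The candidate set, the squared-error cost, and all names are assumptions of this sketch.

import numpy as np

def fuse_depths(tof_depth, stereo_depth, tof_conf, stereo_conf):
    """Illustrative per-pixel fusion: score candidate depths by their
    confidence-weighted disagreement with the ToF and stereo estimates,
    and keep the candidate with the lowest cost (sketch only)."""
    # Simplest candidate set (an assumption): the two input estimates
    # and their confidence-weighted average.
    avg = (tof_conf * tof_depth + stereo_conf * stereo_depth) / \
          np.maximum(tof_conf + stereo_conf, 1e-6)
    candidates = np.stack([tof_depth, stereo_depth, avg], axis=0)

    # Cost of each candidate: weighted squared distance to both inputs.
    cost = (tof_conf[None] * (candidates - tof_depth[None]) ** 2 +
            stereo_conf[None] * (candidates - stereo_depth[None]) ** 2)

    best = np.argmin(cost, axis=0)                       # (H, W) index map
    fused = np.take_along_axis(candidates, best[None], axis=0)[0]
    return fused

A richer candidate set (e.g., a sampled depth range per pixel) would make the cost minimization closer to a true per-pixel search; the three-candidate set here only keeps the example compact.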
Conclusions: Experimental results show that the depth maps obtained by the proposed algorithm are better than those obtained by a single active or passive method alone, and also better than those produced by another class of global-optimization based depth fusion algorithms.
DOI: 10.1631/jzus.C1300194
CLC number: TP317.4
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2014-02-19