Fast dense stereo matching using adaptive window in hierarchical framework

Sang Un Yoon, Dongbo Min, Kwanghoon Sohn

Research output: Contribution to journal › Conference article › Peer-reviewed



A new area-based stereo matching method in a hierarchical framework is proposed. Local methods generally measure the similarity between image pixels using a local support window, and an appropriate support window, in which the pixels have similar disparities, should be selected adaptively for each pixel. Our algorithm consists of two steps. In the first step, given an estimated initial disparity map, we obtain an object boundary map that distinguishes homogeneous regions from object boundary regions, based on the assumption that depth boundaries lie inside intensity boundaries. In the second step, to improve accuracy, we choose the size and shape of the window using this boundary information and obtain an accurate disparity map. In general, boundary regions are determined by the disparity information, which is itself the quantity to be estimated; we therefore propose a hierarchical structure for simultaneous boundary and disparity estimation. Finally, we propose a post-processing scheme for removing outliers. The algorithm does not use complicated optimization; instead, it concentrates on estimating an optimal window for each pixel within the improved hierarchical framework, and is therefore computationally efficient. Experimental results on the standard data set demonstrate that the proposed method achieves better performance than conventional methods in homogeneous regions and at object boundaries.
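The abstract gives no pseudocode, so as a rough illustration of the adaptive-window idea, the sketch below (all names, window radii, and thresholds are our own assumptions, not the paper's) shrinks the support window near intensity boundaries, uses a large window in homogeneous regions, and scores candidate disparities with a windowed sum of absolute differences:

```python
import numpy as np

def boundary_map(img, thresh=20):
    # Mark pixels with strong intensity gradients; following the paper's
    # assumption, depth boundaries are taken to lie inside these edges.
    g = img.astype(float)
    gx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))
    gy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))
    return (gx + gy) > thresh

def adaptive_window_stereo(left, right, max_disp, r_small=1, r_large=3):
    # For each pixel pick a window radius (small near boundaries, large in
    # homogeneous regions) and choose the disparity minimizing windowed SAD.
    h, w = left.shape
    bmap = boundary_map(left)
    L, R = left.astype(float), right.astype(float)
    disp = np.zeros((h, w), dtype=int)
    for y in range(h):
        for x in range(w):
            r = r_small if bmap[y, x] else r_large
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x0) + 1):  # keep window in bounds
                cost = np.abs(L[y0:y1, x0:x1]
                              - R[y0:y1, x0 - d:x1 - d]).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

This is a single-pass sketch; the paper's method additionally runs coarse-to-fine in a hierarchy (so the boundary map and disparities refine each other) and applies a post-processing step to remove outliers.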

Original language: English
Pages (from-to): 316-325
Number of pages: 10
Journal: Lecture Notes in Computer Science
Volume: 4292 LNCS - II
State: Published - 2006
Event: 2nd International Symposium on Visual Computing, ISVC 2006 - Lake Tahoe, NV, United States
Duration: 6 Nov 2006 – 8 Nov 2006


