Occlusion handling based on support and decision

Dongbo Min, Sehoon Yea, Anthony Vetro

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

This paper proposes a novel method for handling occluded pixels in stereo images, based on a probabilistic voting framework with a support-and-decision process. Occlusion handling aims to assign a reasonable disparity value to each occluded pixel in the disparity map. In an initial step, disparities and their corresponding supports at the occluded pixels are computed by probabilistic voting over the disparities of visible pixels, so that visible-pixel information is propagated into the occluded regions. The final disparities for the occluded pixels are then obtained through an iterative support-and-decision process that propagates information within each occluded region. An acceleration technique is also proposed to speed up this iterative process. Experimental results show that the proposed occlusion handling method works well on several challenging stereo images.
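The abstract describes a two-stage pipeline: voting from visible pixels to seed disparities and supports at occluded pixels, then an iterative support-and-decision pass inside the occluded region. The paper's exact weighting and acceleration scheme are not given here, so the sketch below is only one plausible reading of that idea; the function `fill_occlusions`, the Gaussian spatial/color weights, and all parameter names are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def fill_occlusions(disparity, occluded, image, max_disp=64,
                    radius=7, sigma_s=5.0, sigma_c=10.0, n_iters=5):
    """Illustrative voting-based occlusion filling (assumed scheme, not
    the paper's code). Each occluded pixel accumulates support for
    candidate disparities from nearby visible pixels, weighted by
    spatial and color proximity; the best-supported disparity is kept
    (the "decision"), and newly filled pixels become voters in later
    iterations so information propagates into the occluded region."""
    disp = disparity.astype(np.int32).copy()
    img = image.astype(np.float64)
    visible = ~occluded
    h, w = disp.shape
    for _ in range(n_iters):
        newly_filled = []
        for y in range(h):
            for x in range(w):
                if visible[y, x]:
                    continue
                # Support step: gather weighted votes from visible
                # neighbours within the window.
                support = np.zeros(max_disp + 1)
                for dy in range(-radius, radius + 1):
                    for dx in range(-radius, radius + 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and visible[ny, nx]:
                            w_s = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
                            dc = np.linalg.norm(img[y, x] - img[ny, nx])
                            w_c = np.exp(-(dc * dc) / (2.0 * sigma_c ** 2))
                            d = int(np.clip(disp[ny, nx], 0, max_disp))
                            support[d] += w_s * w_c
                # Decision step: keep the best-supported disparity if any
                # voter was found; otherwise retry in a later iteration.
                if support.sum() > 0.0:
                    disp[y, x] = int(support.argmax())
                    newly_filled.append((y, x))
        if not newly_filled:
            break
        # Filled pixels join the voters, so support propagates inward.
        for y, x in newly_filled:
            visible[y, x] = True
    return disp
```

Re-flagging filled pixels as visible is what lets later iterations reach pixels deep inside an occluded region that initially had no visible neighbors within the window, which matches the abstract's description of propagating information inside the occluded pixel region.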

Original language: English
Title of host publication: 2010 IEEE International Conference on Image Processing, ICIP 2010 - Proceedings
Pages: 1777-1780
Number of pages: 4
DOIs
State: Published - 2010
Event: 2010 17th IEEE International Conference on Image Processing, ICIP 2010 - Hong Kong, Hong Kong
Duration: 26 Sep 2010 - 29 Sep 2010

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 2010 17th IEEE International Conference on Image Processing, ICIP 2010
Country/Territory: Hong Kong
City: Hong Kong
Period: 26/09/10 - 29/09/10

Keywords

  • Occlusion handling
  • Probabilistic voting framework
  • Stereo matching
  • Support-and-decision process
