Randomized texture flow estimation using visual similarity

Sunghwan Choi, Dongbo Min, Kwanghoon Sohn

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)


Exploring the underlying texture flows, defined by orientation and scale, is of great interest in a variety of vision-related tasks. However, existing methods often fail to capture accurate flows, either because they over-parameterize the texture deformation or because they rely on a costly global optimization that makes the algorithm computationally demanding. In this paper, we address this inverse problem by casting it as a randomized correspondence search combined with locally-adaptive vector field smoothing. Given a small example patch as a reference, a randomized deformable matching is performed on a very densely quantized label space, enabling efficient estimation of the texture deformation without the quality degeneration, e.g., quantization artifacts, that often appears in optimization-driven discrete approaches. The visual similarity with respect to the deformation parameters is measured directly against the input texture image in an appearance space. Locally-adaptive smoothing is then applied to the intermediate flow field, yielding good continuation in the resultant texture flow. Experimental results on both synthetic and natural images show that the proposed method improves on existing methods in runtime efficiency, visual quality, or both.
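The randomized correspondence search described in the abstract can be illustrated with a minimal sketch. The code below is an assumption-laden toy reconstruction, not the authors' implementation: each pixel holds a candidate (orientation, scale) label, and a shrinking random search keeps whichever label makes the warped reference patch most similar (in SSD) to the local image window. The paper's densely quantized label space, propagation steps, and locally-adaptive smoothing are all omitted here; the scale range [0.5, 2.0] and patch size are illustrative choices.

```python
import numpy as np

def warp_patch(ref, theta, scale, size):
    # Sample the reference patch under a rotation by theta and uniform
    # scaling, via inverse mapping with nearest-neighbor lookup.
    h = size // 2
    ys, xs = np.mgrid[-h:h + 1, -h:h + 1]
    c, s = np.cos(theta), np.sin(theta)
    u = (c * xs + s * ys) / scale + ref.shape[1] // 2
    v = (-s * xs + c * ys) / scale + ref.shape[0] // 2
    u = np.clip(np.round(u).astype(int), 0, ref.shape[1] - 1)
    v = np.clip(np.round(v).astype(int), 0, ref.shape[0] - 1)
    return ref[v, u]

def estimate_flow(image, ref, patch=9, iters=4, seed=0):
    # Per-pixel randomized search over (orientation, scale) labels,
    # keeping the label that minimizes SSD against the local window.
    rng = np.random.default_rng(seed)
    h = patch // 2
    H, W = image.shape
    theta = rng.uniform(0.0, np.pi, (H, W))   # random initial orientations
    scale = rng.uniform(0.5, 2.0, (H, W))     # random initial scales
    cost = np.full((H, W), np.inf)

    def ssd(y, x, th, sc):
        if y - h < 0 or x - h < 0 or y + h >= H or x + h >= W:
            return np.inf
        win = image[y - h:y + h + 1, x - h:x + h + 1]
        return np.sum((win - warp_patch(ref, th, sc, patch)) ** 2)

    for _ in range(iters):
        for y in range(h, H - h):
            for x in range(h, W - h):
                # Shrinking random search around the current label.
                for r in (1.0, 0.5, 0.25):
                    th = theta[y, x] + rng.uniform(-np.pi, np.pi) * r
                    sc = np.clip(scale[y, x] * 2 ** (rng.uniform(-1, 1) * r),
                                 0.5, 2.0)
                    c = ssd(y, x, th, sc)
                    if c < cost[y, x]:
                        theta[y, x], scale[y, x], cost[y, x] = th, sc, c
    return theta, scale
```

Because the labels are continuous samples rather than a fixed discrete grid, this style of search avoids the quantization artifacts the abstract attributes to optimization-driven discrete approaches; the smoothing pass the paper applies afterward would then regularize the resulting flow field.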

Original language: English
Title of host publication: 2014 IEEE International Conference on Image Processing, ICIP 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
ISBN (Electronic): 9781479957514
State: Published - 28 Jan 2014

Publication series

Name: 2014 IEEE International Conference on Image Processing, ICIP 2014

Bibliographical note

Publisher Copyright:
© 2014 IEEE.


Keywords

  • Texture analysis
  • correspondence search
  • flow estimation
  • joint filtering


