An unsupervised approach for comparing styles of illustrations

Abstract

In creating web pages, books, or presentation slides, the consistent use of a tasteful visual style is quite important. In this paper, we consider the problem of style-based comparison and retrieval of illustrations. In their pioneering work, Garces et al. [2] proposed an algorithm for comparing illustrative style. The algorithm uses supervised learning that relies on stylistic labels present in a training dataset. In practice, obtaining such labels is quite difficult. In this paper, we propose an unsupervised approach to accurate and efficient stylistic comparison among illustrations. The proposed algorithm combines heterogeneous local visual features extracted densely. These features are aggregated into a feature vector per illustration and then processed by distance metric learning based on unsupervised dimension reduction for saliency and compactness. Experimental evaluation on multiple benchmark datasets indicates that the proposed method outperforms existing approaches.

Keywords: local image feature, illustration style tag, illustration style feature, unsupervised distance metric learning.
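The pipeline described in the abstract (dense local features, per-illustration aggregation, unsupervised dimension reduction, then distance comparison) can be sketched as follows. This is a minimal illustrative sketch only: the random descriptors, mean pooling, and PCA via SVD are stand-ins for the paper's actual heterogeneous features, aggregation scheme, and unsupervised metric learning, none of which are specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical densely sampled local descriptors for 3 illustrations
# (200 descriptors of dimension 64 each; random placeholders).
descriptors = [rng.normal(size=(200, 64)) for _ in range(3)]

# Aggregate the local descriptors into one feature vector per
# illustration (mean pooling stands in for the aggregation step).
X = np.stack([d.mean(axis=0) for d in descriptors])

# Unsupervised dimension reduction via PCA (computed with an SVD of
# the centered data), a stand-in for the unsupervised distance
# metric learning step that yields compact style features.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
Z = Xc @ Vt[:k].T  # compact per-illustration style features

def style_distance(a, b):
    """Euclidean distance between reduced style features."""
    return float(np.linalg.norm(a - b))

print(Z.shape)  # one k-dimensional style vector per illustration
```

Retrieval would then rank a database of illustrations by `style_distance` to a query's reduced feature vector.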

DOI: 10.1109/CBMI.2015.7153615

Cite this paper

@inproceedings{Furuya2015AnUA,
  title={An unsupervised approach for comparing styles of illustrations},
  author={Takahiko Furuya and Shigeru Kuriyama and Ryutarou Ohbuchi},
  booktitle={CBMI},
  year={2015}
}