ABSTRACT
Images can convey rich emotions to viewers. Recent research on image emotion analysis has focused mainly on affective image classification, seeking features that discriminate emotions well. We concentrate instead on affective image retrieval and investigate how features of different levels perform on different kinds of images within a multi-graph learning framework. First, we extract commonly used features at several levels for each image: generic features and features derived from elements-of-art serve as low-level features; attributes and interpretable principles-of-art based features serve as mid-level features; and semantic concepts described by adjective noun pairs, together with facial expressions, serve as high-level features. Second, we construct a single graph for each kind of feature and test its retrieval performance. Finally, we combine the multiple graphs in a regularization framework that learns an optimized weight for each graph, efficiently exploiting the complementarity of the different features. Extensive experiments on five datasets demonstrate the effectiveness of the proposed method.
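The multi-graph fusion described above can be sketched as graph-based manifold ranking: build one similarity graph per feature type, combine their normalized Laplacians with per-graph weights, and solve a regularized ranking problem against a query indicator vector. The sketch below is a minimal illustration, not the paper's exact formulation; the weight vector `alpha` is fixed here (uniform by default), whereas the paper learns it jointly within the regularization framework, and the kernel bandwidth and neighborhood size are assumed hyperparameters.

```python
import numpy as np

def knn_graph(X, k=5):
    """Gaussian-weighted k-NN similarity graph over row-vector features X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    sigma = np.median(d2) + 1e-12          # heuristic bandwidth (assumption)
    W = np.exp(-d2 / sigma)
    np.fill_diagonal(W, 0.0)
    idx = np.argsort(-W, axis=1)[:, :k]    # keep k strongest neighbors per node
    M = np.zeros_like(W)
    rows = np.arange(len(X))[:, None]
    M[rows, idx] = W[rows, idx]
    return np.maximum(M, M.T)              # symmetrize

def normalized_laplacian(W):
    d = W.sum(1)
    Dinv = np.diag(1.0 / np.sqrt(d + 1e-12))
    return np.eye(len(W)) - Dinv @ W @ Dinv

def multi_graph_rank(feature_sets, y, alpha=None, mu=1.0):
    """Rank images by solving min_f f'Lf + mu*||f - y||^2, where
    L is a weighted sum of per-feature-graph Laplacians."""
    G = len(feature_sets)
    if alpha is None:
        alpha = np.full(G, 1.0 / G)        # uniform weights; learned in the paper
    L = sum(a * normalized_laplacian(knn_graph(X))
            for a, X in zip(alpha, feature_sets))
    n = len(y)
    # closed-form minimizer: (mu*I + L) f = mu*y
    return np.linalg.solve(mu * np.eye(n) + L, mu * y)
```

Given a query image, `y` is 1 at the query index and 0 elsewhere; the returned scores rank all images, with graph smoothing propagating relevance along each feature graph in proportion to its weight.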