DOI: 10.1145/2647868.2655035

Affective Image Retrieval via Multi-Graph Learning

Published: 03 November 2014

ABSTRACT

Images can convey rich emotions to viewers. Recent research on image emotion analysis has mainly focused on affective image classification, trying to find features that classify emotions better. We concentrate on affective image retrieval and investigate the performance of different features on different kinds of images in a multi-graph learning framework. First, we extract commonly used features at different levels for each image. Generic features and features derived from elements-of-art are extracted as low-level features; attributes and interpretable principles-of-art based features are viewed as mid-level features, while semantic concepts described by adjective noun pairs and facial expressions are extracted as high-level features. Second, we construct a single graph for each kind of feature to test its retrieval performance. Finally, we combine the multiple graphs in a regularization framework that learns an optimized weight for each graph, so as to efficiently exploit the complementarity of the different features. Extensive experiments are conducted on five datasets and the results demonstrate the effectiveness of the proposed method.
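
The final step described above, combining the per-feature graphs in a regularization framework that learns a weight for each graph, follows the general pattern of graph-regularized score propagation over multiple graphs. The Python sketch below is an illustrative reconstruction of that pattern only, not the paper's exact formulation: it builds a hypothetical RBF affinity graph per feature level, then alternates between solving for ranking scores f and updating the graph weights alpha. The function names, the weight exponent r, and the trade-off parameter mu are all assumptions made for this example.

import numpy as np

def rbf_affinity(X, sigma=1.0):
    # Heat-kernel affinity matrix for one feature type (a hypothetical choice of graph).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def normalized_laplacian(W):
    # Symmetrically normalized graph Laplacian: L = I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(W.sum(axis=1), 1e-12))
    return np.eye(W.shape[0]) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]

def multi_graph_retrieval(feature_sets, y, mu=1.0, r=2.0, n_iters=20):
    # Toy multi-graph learning: minimize
    #     sum_k alpha_k^r * f^T L_k f + mu * ||f - y||^2,   s.t. sum_k alpha_k = 1,
    # by alternating closed-form updates of the scores f and the graph weights alpha.
    laplacians = [normalized_laplacian(rbf_affinity(X)) for X in feature_sets]
    n, K = len(y), len(laplacians)
    alpha = np.full(K, 1.0 / K)
    f = y.astype(float).copy()
    for _ in range(n_iters):
        # f-step: solve (sum_k alpha_k^r L_k + mu I) f = mu y
        L = sum(a ** r * Lk for a, Lk in zip(alpha, laplacians))
        f = np.linalg.solve(L + mu * np.eye(n), mu * y)
        # alpha-step: weights inversely related to each graph's smoothness cost f^T L_k f
        costs = np.array([max(float(f @ Lk @ f), 1e-12) for Lk in laplacians])
        alpha = (1.0 / costs) ** (1.0 / (r - 1.0))
        alpha /= alpha.sum()
    return f, alpha

Given a binary query-relevance vector y (1 for the query image or labeled relevant examples, 0 elsewhere), the returned f ranks all images by affective relevance to the query, and alpha indicates how strongly each feature graph contributed for that query.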

References

  1. D. Borth et al. Large-scale visual sentiment ontology and detectors using adjective noun pairs. In MM, 2013. Google ScholarGoogle ScholarDigital LibraryDigital Library
  2. E. S. Dan-Glauser and K. R. Scherer. The geneva affective picture database (gaped): a new 730-picture database focusing on valence and normative significance. Behavior research methods, 43(2):468--477, 2011.Google ScholarGoogle Scholar
  3. Y. Gao et al. 3-d object retrieval and recognition with hypergraph analysis. IEEE TIP, 21(9):4290--4303, 2012.Google ScholarGoogle ScholarDigital LibraryDigital Library
  4. A. Hanjalic. Extracting moods from pictures and sounds: Towards truly personalized tv. IEEE Signal Processing Magazine, 23(2):90--100, 2006.Google ScholarGoogle ScholarCross RefCross Ref
  5. K. Järvelin and J. Kekäläinen. Cumulated gain-based evaluation of ir techniques. ACM TOIS, 20(4):422--446, 2002. Google ScholarGoogle ScholarDigital LibraryDigital Library
  6. J. Jia et al. Can we understand van gogh's mood? learning to infer affects from images in social networks. In MM, 2012. Google ScholarGoogle ScholarDigital LibraryDigital Library
  7. P. Lang et al. International affective picture system (IAPS): Affective ratings of pictures and instruction manual. NIMH, Center for the Study of Emotion & Attention, 2005.Google ScholarGoogle Scholar
  8. X. Lu et al. On shape and the computability of emotions. In MM, 2012. Google ScholarGoogle ScholarDigital LibraryDigital Library
  9. P. Lucey et al. The extended cohn-kanade dataset (ck+): A complete dataset for action unit and emotion-specified expression. In CVPR Workshops, 2010.Google ScholarGoogle ScholarCross RefCross Ref
  10. J. Machajdik and A. Hanbury. Affective image classification using features inspired by psychology and art theory. In MM, 2010. Google ScholarGoogle ScholarDigital LibraryDigital Library
  11. J. Mikels et al. Emotional category data on images from the international affective picture system. Behavior research methods, 37(4):626--630, 2005.Google ScholarGoogle Scholar
  12. B. Pang and L. Lee. Opinion mining and sentiment analysis. Information Retrieval, 2(1--2):1--135, 2008. Google ScholarGoogle ScholarDigital LibraryDigital Library
  13. G. Patterson and J. Hays. Sun attribute database: Discovering, annotating, and recognizing scene attributes. In CVPR, 2012. Google ScholarGoogle ScholarDigital LibraryDigital Library
  14. P. Viola et al. Robust real-time face detection. IJCV, 57(2):137--154, 2004. Google ScholarGoogle ScholarDigital LibraryDigital Library
  15. M. Wang et al. Unified video annotation via multigraph learning. IEEE TCSVT, 19(5):733--746, 2009. Google ScholarGoogle ScholarDigital LibraryDigital Library
  16. W. Wang et al. A survey on emotional semantic image retrieval. In ICIP, 2008.Google ScholarGoogle ScholarCross RefCross Ref
  17. J. Xiao et al. Sun database: Large-scale scene recognition from abbey to zoo. In CVPR, 2010.Google ScholarGoogle ScholarCross RefCross Ref
  18. P. Yang et al. Exploring facial expressions with compositional features. In CVPR, 2010.Google ScholarGoogle ScholarCross RefCross Ref
  19. J. Yuan et al. Sentribute: image sentiment analysis from a mid-level perspective. In WISDOM, 2013. Google ScholarGoogle ScholarDigital LibraryDigital Library
  20. S. Zhao et al. Video indexing and recommendation based on affective analysis of viewers. In ACM MM, 2011. Google ScholarGoogle ScholarDigital LibraryDigital Library
  21. S. Zhao et al. Exploring principles-of-art features for image emotion recognition. In ACM MM, 2014. Google ScholarGoogle ScholarDigital LibraryDigital Library

Published in

MM '14: Proceedings of the 22nd ACM International Conference on Multimedia
November 2014, 1310 pages
ISBN: 9781450330633
DOI: 10.1145/2647868
Copyright © 2014 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


Qualifiers

• poster

Acceptance Rates

MM '14 Paper Acceptance Rate: 55 of 286 submissions, 19%
Overall Acceptance Rate: 995 of 4,171 submissions, 24%
