Automatic spoken affect classification and analysis

  • D. Roy, A. Pentland
  • Published 1996
  • Computer Science
  • Proceedings of the Second International Conference on Automatic Face and Gesture Recognition
  • This paper reports results from preliminary experiments on automatic classification of spoken affect valence. The task was to classify short spoken sentences into one of two classes: approving or disapproving. Using an optimal combination of six acoustic measurements, our classifier achieved 65% to 88% accuracy for speaker-dependent, text-independent classification. The results suggest that pitch and energy measurements may be used to automatically classify spoken affect valence, but more…
    52 Citations


    Recognition of negative emotions from the speech signal (164 citations)
    Classifying emotions in human-machine spoken dialogs (67 citations)
    Baby Ears: a recognition system for affective vocalizations. M. Slaney and G. McRoberts, Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '98), 1998 (98 citations)
    Automatic detection of stress in speech (8 citations)
    Toward detecting emotions in spoken dialogs (909 citations)
    Impact of Emotion on Prosody Analysis
    A Study on Prosody Analysis (1 citation)
    BabyEars: A recognition system for affective vocalizations (74 citations)


    References
    Emotions and speech: some acoustical correlates (746 citations)
    Toward the simulation of emotion in synthetic speech: a review of the literature on human vocal emotion. I. Murray and J. Arnott, The Journal of the Acoustical Society of America, 1993 (1,074 citations)
    Minimal cues in the vocal communication of affect: Judging emotions from content-masked speech (94 citations)
    Analysis, synthesis, and perception of voice quality variations among female and male talkers (1,621 citations)
    Acoustic and perceptual indicators of emotional stress (96 citations)
    The long-term spectrum and perceived emotion (46 citations)
    Generating expression in synthesized speech (133 citations)
    Digital Processing of Speech Signals (1,701 citations)