IEMOCAP: interactive emotional dyadic motion capture database

@article{Busso2008IEMOCAPIE,
  title={IEMOCAP: interactive emotional dyadic motion capture database},
  author={Carlos Busso and Murtaza Bulut and Chi-Chun Lee and Abe Kazemzadeh and Emily Mower Provost and Samuel Kim and Jeannette N. Chang and Sungbok Lee and Shrikanth Narayanan},
  journal={Language Resources and Evaluation},
  year={2008},
  volume={42},
  pages={335-359}
}
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communication. To facilitate such investigations, this paper describes a new corpus named the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory (SAIL) at the University of Southern California (USC). This database was recorded from ten actors in dyadic…
Highly Influential
This paper has highly influenced 80 other papers.
Highly Cited
This paper has 602 citations.

Citations

Publications citing this paper.
Showing 1-10 of 420 extracted citations

Cross-lingual and Multilingual Speech Emotion Recognition on English and French

2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2018
Highly Influenced

Human-Like Emotion Recognition: Multi-Label Learning from Noisy Labeled Audio-Visual Expressive Speech

2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) • 2018
Highly Influenced

Speech Emotion Recognition Using Semi-supervised Learning with Ladder Networks

2018 First Asian Conference on Affective Computing and Intelligent Interaction (ACII Asia) • 2018
Highly Influenced

603 Citations

[Figure: Citations per Year, '10–'19]
Semantic Scholar estimates that this publication has 603 citations based on the available data.


References

Publications referenced by this paper.
Showing 1-10 of 53 references

Primitives-based evaluation and estimation of emotions in speech

M. Grimm, K. Kroschel, E. Mower, S. Narayanan
Speech Communication • 2007
Highly Influenced

Facial actions as visual cues for personality

Journal of Visualization and Computer Animation • 2006
Highly Influenced

Least-Squares Fitting of Two 3-D Point Sets

IEEE Transactions on Pattern Analysis and Machine Intelligence • 1987
Highly Influenced

An Analysis of Multimodal Cues of Interruption in Dyadic Spoken Interactions

C.-C. Lee, S. Lee, S. Narayanan
Interspeech • 2008
