Multimodal Detection of Depression in Clinical Interviews

@inproceedings{Dibeklioglu2015MultimodalDO,
  title={Multimodal Detection of Depression in Clinical Interviews},
  author={Hamdi Dibeklioğlu and Zakia Hammal and Ying Yang and Jeffrey F. Cohn},
  booktitle={Proceedings of the 2015 ACM on International Conference on Multimodal Interaction},
  year={2015},
  url={https://api.semanticscholar.org/CorpusID:2810891}
}
It is suggested that automatic detection of depression from behavioral indicators is feasible and that multimodal measures afford the most powerful detection.

Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding

The findings suggest that automatic detection of depression severity from behavioral indicators in patients is feasible and that multimodal measures afford the most powerful detection.

Multimodal assessment of depression from behavioral signals

This chapter describes multimodal measures of behavior and physiology, how these measures can be processed to extract features sensitive to depression, and how classification or prediction may be used to provide automatic assessment of depression occurrence and severity.

Acoustic and Facial Features From Clinical Interviews for Machine Learning–Based Psychiatric Diagnosis: Algorithm Development

Machine learning models based on acoustic and facial movement features extracted from participant interviews are developed to predict diagnoses and detect clinician-coded neuropsychiatric symptoms; this work supports the development of a new generation of innovative clinical tools employing acoustic and facial data analysis.

Psychomotor cues for depression screening

This paper discusses two frameworks for automated depression screening based on the computation of facial movement features, followed by regression to yield a higher-level feature that correlates with an individual's depression severity.

Depression Severity Estimation from Multiple Modalities

It is demonstrated that, among the considered modalities, behavioral features extracted from speech yield the lowest MAE, outperforming the best system at the Audio/Visual Emotion Challenge (AVEC) 2017 depression sub-challenge.

Identifying psychiatric manifestations in schizophrenia and depression from audio-visual behavioural indicators through a machine-learning approach

The results suggest that machine-learning models leveraging audio-visual characteristics can help diagnose, assess, and monitor patients with schizophrenia and depression.

Explainable Depression Detection using Multimodal Behavioural Cues

Findings indicate that: (a) head motion patterns are effective cues for depression assessment, and (b) explanatory kineme patterns can be observed for the two classes, consistent with prior research.
...

Detecting depression from facial actions and vocal prosody

The findings suggest the feasibility of automatic detection of depression, raise new issues in automated facial image analysis and machine learning, and have exciting implications for clinical theory and practice.

Detecting Depression Severity from Vocal Prosody

It is suggested that analysis of vocal prosody could be a powerful tool to assist in depression screening and monitoring over the course of depressive disorder and recovery.

Eye movement analysis for depression detection

This paper analyses the performance of eye movement features extracted from face videos using Active Appearance Models for a binary classification task (depressed vs. non-depressed). Eye movement low-level features gave 70% accuracy using a hybrid classifier of Gaussian Mixture Models and Support Vector Machines, and 75% accuracy when using statistical measures with SVM classifiers over the entire interview.

Can body expressions contribute to automatic depression analysis?

The results show the effectiveness of the proposed system, built on a framework of space-time interest points and bag-of-words, in evaluating the contribution of various body parts to depression analysis.

Towards an affective interface for assessment of psychological distress

The results of the evaluation suggest that clinicians' ratings of nonverbal affective markers are less predictive of psychological distress than automatically quantified affective markers.

Head Pose and Movement Analysis as an Indicator of Depression

It is concluded that positive emotions are expressed less in depressed subjects at all times, and that negative emotions have less discriminatory power than positive emotions in detecting depression.

Automatic Nonverbal Behavior Indicators of Depression and PTSD: Exploring Gender Differences

It is shown that gender plays an important role in the automatic assessment of psychological conditions such as depression and post-traumatic stress disorder (PTSD), and that a gender-dependent approach significantly improves performance over a gender-agnostic one.

A Rating Scale for Depression

The present scale has been devised for use only with patients already diagnosed as suffering from affective disorder of depressive type; it is used for quantifying the results of an interview, and its value depends entirely on the skill of the interviewer in eliciting the necessary information.

Antidepressant drug effects and depression severity: a patient-level meta-analysis.

The magnitude of benefit of antidepressant medication compared with placebo increases with severity of depression symptoms and may be minimal or nonexistent, on average, in patients with mild or moderate symptoms.