School of Electronic Engineering and Computer Science

Machine Learning for Analysis of Affect and Mental Health

The project is in the area of Computer Vision and Machine Learning for the analysis of actions, activity and behaviour, with applications in the field of Affective Computing and Mental Health. More specifically, the focus is on Machine Learning methods for the analysis of facial expressions, body gestures, speech and audio for understanding affective and mental-health state in context. The studentship will build on the existing work of Patras and Priebe on the analysis of the facial non-verbal behaviour of patients with schizophrenia (e.g., "SchiNet: Automatic estimation of symptoms of schizophrenia from facial behaviour analysis", Bishay et al., IEEE Transactions on Affective Computing, 2019) and on the work of Purver on affect and mental health using audio and Natural Language Processing (e.g., "Multi-modal fusion with gating using audio, lexical and disfluency features for Alzheimer's dementia recognition from spontaneous speech", 2021). At a methodological level, the work will focus on the development of novel Machine Learning methods for the fusion of information from vision and language.
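To give a flavour of the gated multi-modal fusion mentioned above, the sketch below shows one common form of it: a sigmoid gate computed from the concatenated modality features decides, per dimension, how much each modality contributes to the fused representation. This is only an illustrative minimal example; the feature dimensions, weight matrices and the specific gating form here are assumptions, not the method of the cited papers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(audio_feat, text_feat, W_gate, W_a, W_t):
    """Fuse two modality vectors with a learned sigmoid gate.

    The gate (values in (0, 1)) weights, per output dimension,
    the projected audio features against the projected text features.
    """
    concat = np.concatenate([audio_feat, text_feat])
    gate = sigmoid(W_gate @ concat)
    return gate * (W_a @ audio_feat) + (1.0 - gate) * (W_t @ text_feat)

# Illustrative dimensions (assumed): 40-d audio features,
# 300-d lexical embeddings, fused into a 128-d representation.
rng = np.random.default_rng(0)
audio = rng.standard_normal(40)
text = rng.standard_normal(300)
W_gate = rng.standard_normal((128, 340)) * 0.01
W_a = rng.standard_normal((128, 40)) * 0.01
W_t = rng.standard_normal((128, 300)) * 0.01

fused = gated_fusion(audio, text, W_gate, W_a, W_t)
print(fused.shape)  # (128,)
```

In practice the weight matrices would be learned end-to-end alongside the downstream classifier, and the same gating idea extends to more than two modalities (e.g. vision, audio and language together).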

For more information, please contact Prof. Patras, i.patras@qmul.ac.uk with an email with subject that includes the string: [PHD-Principal]

Team

The student will be based in the Multimedia and Vision group in the School of EECS. The School has one of the largest Computer Vision teams in the UK and a very strong team in Computational Linguistics. For more information, please see:

Ioannis Patras: Home page, Google Scholar

Multimedia and Vision Research group: Homepage

Cognitive Sciences Research group: CogSci homepage

Computing Infrastructure: The team of Prof. Patras has a deep learning infrastructure with over 256 CPU cores, 6 large GPU servers with 175,248 CUDA (GPU) cores, and 36TB of storage.

References

[1] M. Bishay, G. Zoumpourlis, I. Patras, "TARN: Temporal Attentive Relation Network for Few-Shot and Zero-Shot Action Recognition", British Machine Vision Conference, Sept. 2019.

[2] M. Bishay, P. Palasek, S. Priebe, I. Patras, "SchiNet: Automatic estimation of symptoms of schizophrenia from facial behaviour analysis", IEEE Transactions on Affective Computing, 2019.

[3] G. Kordopatis-Zilos, C. Tzelepis, S. Papadopoulos, I. Kompatsiaris, I. Patras, "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval", arXiv, 2021.