School of Electronic Engineering and Computer Science

Computational analysis of chick vocalisations: from categorisation to live feedback

The assessment of animals’ emotional state and welfare is a central issue for behavioural neuroscience and ethical farming. Animal vocalisations carry rich information about animals’ internal states and can be used to influence animal behaviour in industrial settings, such as chicken farms. However, research on using vocalisations to monitor animal welfare, and on audio technologies for automatic welfare assessment of poultry in industrial settings, remains limited.

In this project, we will develop computational methods to automatically categorise vocalisations of domestic chickens, infer their emotional state, provide live feedback and identify stimuli that can improve animal welfare. The PhD project will build upon pilot work led by the supervisors [i] on automatic recognition of chick calls using machine learning and signal processing methods. The project will deliver computationally efficient and robust machine learning methods and systems for automatically monitoring poultry welfare from audio, and will investigate research questions related to poultry development, behaviour, and well-being in industrial settings. Prospective candidates should be curious, self-motivated and have experience in one or more of the following: Bioacoustics, Cognitive Science, Artificial Intelligence/Machine Learning, Digital Signal Processing.
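To make the kind of pipeline described above concrete, the sketch below shows one minimal way to classify short chick calls in Python: summarise each clip with fixed-length audio features and train an off-the-shelf classifier. This is an illustrative assumption only, not the pilot study's method (which used joint scattering features [i]); the feature choice (MFCC statistics), the two call categories and the synthetic stand-in waveforms are all hypothetical.

```python
# Minimal, illustrative chick-call classification sketch.
# Assumptions: MFCC-statistics features, two hypothetical call categories,
# and synthetic waveforms standing in for recorded calls.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

SR = 22050  # assumed sampling rate

def extract_features(y, sr=SR):
    """Summarise a short call as one fixed-length vector of MFCC statistics."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def synthetic_call(f0, duration=0.5, sr=SR):
    """Stand-in waveform: a noisy tone; real data would be recorded chick calls."""
    t = np.linspace(0, duration, int(sr * duration), endpoint=False)
    return np.sin(2 * np.pi * f0 * t) + 0.05 * np.random.randn(t.size)

# Two hypothetical call categories, mimicked by different fundamental frequencies.
rng = np.random.default_rng(0)
X, labels = [], []
for _ in range(100):
    X.append(extract_features(synthetic_call(f0=rng.uniform(2500, 3500))))
    labels.append("call_type_A")
    X.append(extract_features(synthetic_call(f0=rng.uniform(1000, 2000))))
    labels.append("call_type_B")

# Train and evaluate a simple classifier on the extracted features.
X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(labels), test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

In practice, the same structure applies with recorded and annotated chick calls in place of the synthetic waveforms, and with richer features or learned representations in place of the MFCC statistics.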

[i] C. Wang, E. Benetos, S. Wang, and E. Versace, “Joint Scattering for Automatic Chick Call Recognition”, 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), submitted.

This topic is co-supervised by Dr. Emmanouil Benetos and Dr. Elisabetta Versace.