
AI model accurately detects animal emotions through vocalizations

Scientists at the University of Copenhagen have developed a machine-learning model capable of distinguishing between positive and negative emotions in animals with 89.49% accuracy by analyzing their vocalizations. The study, published in iScience, presents the first cross-species AI model for detecting emotional valence.

The research focused on seven ungulate species, including cows, pigs, and wild boars. By examining acoustic features such as duration, frequency, and amplitude modulation, the model identified consistent vocal patterns associated with emotions, suggesting an evolutionarily conserved system of expression.
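The paper itself is summarized here without code, but a toy sketch of this kind of pipeline might look as follows: compute a few acoustic features per call (duration, amplitude, amplitude modulation) and fit a binary valence classifier. The feature choices, the RandomForestClassifier, and the synthetic recordings below are illustrative assumptions for readers curious about the general approach, not the authors' published method.

    # Hypothetical sketch: acoustic features -> positive/negative valence classifier.
    # Synthetic data only; not the model or features used in the iScience study.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def extract_features(waveform: np.ndarray, sample_rate: int) -> np.ndarray:
        """Return [duration_s, mean_amplitude, amplitude_modulation] for one call."""
        duration = len(waveform) / sample_rate
        envelope = np.abs(waveform)
        mean_amp = envelope.mean()
        # Coefficient of variation of the envelope as a crude amplitude-modulation proxy.
        am = envelope.std() / (mean_amp + 1e-9)
        return np.array([duration, mean_amp, am])

    # Synthetic stand-ins for labelled calls (1 = positive valence, 0 = negative).
    sample_rate = 16_000
    calls, labels = [], []
    for label in (0, 1):
        for _ in range(50):
            length = rng.integers(sample_rate // 2, sample_rate * 2)  # 0.5-2 s calls
            tone = np.sin(2 * np.pi * (300 + 200 * label) * np.arange(length) / sample_rate)
            noise = 0.1 * rng.standard_normal(length)
            calls.append(extract_features(tone + noise, sample_rate))
            labels.append(label)

    X, y = np.stack(calls), np.array(labels)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

With real recordings, the waveform arrays would come from the team's published dataset rather than synthetic tones, and the feature set would be richer than this three-value placeholder.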

Lead researcher Élodie F. Briefer emphasized the potential impact of this technology on animal welfare, livestock management, and conservation, where it could enable real-time monitoring of animal emotions. The team has made their dataset publicly available to encourage further research.
