Episode 19 ICML Expressive Vocalization Competition Panel | The Feelings Lab
Published on Apr 17, 2022
Join us for our podcast on expressive vocalizations and machine learning, where we discuss the powerful, contagious non-word utterances like yelps, laughs, and sighs that play a critical role in our social and emotional lives and provide new channels for human-computer interaction. Our guests include Hume AI CEO Dr. Alan Cowen, Creative Destruction Lab and DeepMind research scientist Dr. Kory Mathewson, Dr. Gauthier Gidel, professor at the Université de Montréal and Mila faculty member, and Hume AI Research Scientists Dr. Panagiotis Tzirakis and Alice Baird.
We begin with Hume AI CEO Dr. Alan Cowen explaining why vocal bursts deserve scientific study.
Dr. Gauthier Gidel shares the powerful story behind his involvement in the ICML Expressive Vocalization Workshop and Challenge.
Dr. Kory Mathewson and Dr. Gidel discuss how new datasets like the ExVo challenge data are essential to progress in understanding vocal expression.
Dr. Mathewson and Dr. Gidel then share their hopes for the future of auditory machine learning.