Episode 22 Listener Questions + Emotion Science News | The Feelings Lab
Published on May 31, 2022
Join Hume AI CEO Dr. Alan Cowen and podcast host Matt Forte as they venture through the best listener questions we've received so far this season: a veritable emotion science "mailbag." Do people who better understand the emotions of others also better interpret emotions conveyed through music? How should we responsibly address the ethics of emotion AI data collection and usage? Is there a healthy level of emotional expressivity conducive to emotional well-being? Are video calls bad for brainstorming? Do lobsters or hermit crabs have feelings? Tune in to hear the answers to these questions and more.
We begin with Dr. Alan Cowen, Hume AI CEO, and Matt Forte discussing recent scientific findings regarding video calls and how they change the way we think and communicate.
Hume AI CEO Dr. Alan Cowen and Matt Forte discuss how scientists grapple with emotional experience in animals, going back to Darwin's observations of seemingly "purposeless behaviors" in animals, which he attributed to emotional expression.
Dr. Alan Cowen, CEO of Hume AI, describes The Hume Initiative, a not-for-profit developing concrete guidelines for the use of empathic AI.
Hume AI CEO Dr. Alan Cowen and Matt Forte discuss how expressing emotion may be critical to our mental health and well-being and to healthy relationships.
Does the use of English-language terms to describe emotions introduce a Western bias into emotion science and empathic AI? Dr. Alan Cowen explains the importance of cross-cultural data.