Episode 23: Well-being | The Feelings Lab
Published on Jun 7, 2022
Can AI teach itself to improve our well-being? Join Dr. Alan Cowen, CEO of Hume AI, Dr. Dacher Keltner, professor of psychology at the University of California, Berkeley, and founding director of the Greater Good Science Center, and podcast host Matt Forte as they discuss how the future of technology hinges on the measurement of human well-being.
We begin with Dr. Dacher Keltner discussing the overall effects of new technologies on the well-being of Gen Z.
Dr. Alan Cowen, Hume AI CEO, discusses how technology companies are not simply seeking to maximize engagement at all costs, and points to developments on the horizon that take human well-being into account.
Dr. Alan Cowen, Hume AI CEO, elaborates on how well-being is the ultimate key to the ethical deployment of empathic AI.
Dr. Alan Cowen, Hume AI CEO, describes how AI technologies can incorporate self-report and objective indicators of user well-being.