Episode 2: Embarrassment | The Feelings Lab
Published on Oct 4, 2021
In this week's episode of The Feelings Lab, we're discussing embarrassment! Returning hosts Dr. Alan Cowen, Dr. Dacher Keltner, and Matt Forte will be joined by guest host Dr. Jessica Tracy (Director of the Emotion and Self Lab at the University of British Columbia) and special guest, comedian Ali Kolbert (as seen on The Tonight Show).
Begin by hearing Dr. Dacher Keltner describe the concept of emotional contagion and how the feeling of embarrassment helps strengthen our collective identity.
Next, hear about this in practice as guest Ali Kolbert explains how emotions seem to spread across comedy club crowds—and how stepping over the line can cause backlash.
Later in the episode, Dr. Alan Cowen, Hume's Chief Scientist, comments on how the experience of embarrassment has changed across time and generations. In particular, social media may make the embarrassment we experience in everyday life feel worse than it used to.
Finally, psychologist Dr. Jessica Tracy explains that while animals show recognizable displays of submission, your dog's apparent expression of remorse may not be one of them.