Episode 13: Well-being in a Remote World | The Feelings Lab
Published on Feb 14, 2022
In a world of remote living, how can we ensure people are cared for? Join Hume CEO Dr. Alan Cowen, Harris Poll CEO John Gerzema, renowned psychologist Dr. Dacher Keltner, and host Matt Forte as they discuss how we can track well-being in a remote world. Defining "well-being" turns out to be more important than ever—not just for humans, but for the automated systems that increasingly orchestrate our digital lives.
First, listen to Dr. Alan Cowen, John Gerzema, and Dr. Dacher Keltner discuss the diversity of experiences and the richness of emotions at play in defining one's sense of well-being.
Next, hear The Harris Poll CEO John Gerzema discuss recent data on how the pandemic and remote work have affected people's emotions, including statistics suggesting that certain groups, including BIPOC women, feel more relaxed, happier, and more confident working remotely.
Finally, hear Hume AI CEO Dr. Alan Cowen discuss how new technologies accelerated by AI have played an essential role in keeping our well-being intact throughout the pandemic, fostering social connectedness and enabling us to thrive in a remote workplace.