Episode 11 Compassion and Robots | The Feelings Lab
Published on Feb 1, 2022
In the Season 2 premiere of The Feelings Lab, join Hume AI CEO Dr. Alan Cowen and Embodied CEO Dr. Paolo Pirjanian, with host Matt Forte, as they discuss "Compassion and Robots."
What will it take to assuage some people's fear of robots? Can robots empathize? Can they deliver therapies, aid in child development, and give us deeper insight into ourselves? We discuss what it will take to make robots compassionate, and how the future of AI may hinge on this central challenge.
Dr. Paolo Pirjanian, CEO of Embodied, starts us off by noting how curious robots can help humans think through our own questions and even reflect on our feelings.
Next, hear Dr. Alan Cowen, CEO of Hume AI, discuss how giving robots the empathic abilities needed to care for human well-being will help us avoid the outcomes people fear most.
From R2-D2 to Her, Dr. Alan Cowen, CEO of Hume AI, and Dr. Paolo Pirjanian, CEO of Embodied, reflect on what sci-fi has gotten right and wrong about the future of robots.