Episode 21: Pain and Personalized Medicine | The Feelings Lab
Published on May 10, 2022
Join Hume AI CEO Dr. Alan Cowen; Dr. Daniel Barron, a Harvard Medical School psychiatrist and Director of the Pain Intervention & Digital Research Program; and Matt Forte as they discuss pain and personalized medicine. Different people express and describe their pain differently, and how these signals are understood can have life-altering implications. We discuss the different kinds of pain: acute vs. chronic, central vs. peripheral, the enigma of phantom limb pain, and how physicians evaluate pain syndromes and their treatment. Can pain be measured objectively? Is there a role for quantitative tools in treating pain? Can AI help us reduce bias in how pain is diagnosed and treated?
We begin with psychiatrist Dr. Daniel Barron and Hume AI CEO Dr. Alan Cowen discussing how culture affects the way people think about, describe, and express their pain.
Psychiatrist Dr. Daniel Barron discusses the need for tools to help patients communicate their pain symptoms to doctors.
Psychiatrist Dr. Daniel Barron discusses how we can zero in on the data that will help physicians measure pain symptoms in a more objective, less biased, and more personalized fashion.
Hume AI CEO Dr. Alan Cowen and psychiatrist Dr. Daniel Barron explain how digital tools that surface quantitative information could help clinicians arrive at more reliable recommendations for patients.