Back in 2014, when I first got an Amazon Echo and had an 8-year-old who frequently got colds, I thought a future where I could get my kid to cough into the Echo to help diagnose her illness would be super helpful. Is this cough croup? Pneumonia? Post-nasal drip? Any help in figuring out whether I should keep her home or take her to the doctor would have been appreciated.
My wish may soon come true, thanks to a company called Sonde Health, although I’m not sure I want this functionality anymore given the state of privacy regulations today. Sonde Health, founded in 2015, a year after I started wishing for some form of machine learning to diagnose a cough, now provides insights about depression, anxiety, COPD, asthma, and cognitive decline based on people’s voices.
I am being a bit cautious here, because Sonde Health doesn’t diagnose these conditions and maybe never will. Instead, its CEO David Liu told me that its technology analyzes a 30-second vocal sample for characteristics indicating that a person may have depression, anxiety, or cognitive decline. For asthma and COPD, patients provide a six-second vocal sample.
To formally diagnose someone, Sonde Health’s algorithms and app would need FDA approval, something Liu isn’t planning to seek. Instead the idea is to find clinical biomarkers for medical conditions in vocal samples, and build algorithms that look for those biomarkers. Then the patient gets feedback on individual samples and trends over time. If they don’t like what they see they can reach out to their doctor.
Anytime I hear an entrepreneur in the wellness sector discuss biomarkers and their decision not to get FDA approval, I get skeptical. I think there are plenty of disproved metrics promulgated by tech startups that are really just examples of digital snake oil. And yet, anyone who has spoken with a depressed or anxious person can certainly hear it in their voice.
When I’m depressed, my voice loses intonation and I tend to speak slowly and flatly. When anxiety hits, I speed up and can struggle to speak clearly because my mouth can’t keep up with my thoughts. I also tend to speak in a higher register. Sonde’s app tries to track these and other potential indications of mental stress before they might be obvious to a human’s ear.
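To make the idea of vocal indicators a bit more concrete, here is a minimal Python sketch of two of the qualities mentioned above: pitch (register), estimated by autocorrelation, and pitch variability, a crude proxy for how flat or expressive intonation is. This is my own illustration, not Sonde’s method (its biomarkers and algorithms are proprietary); the function names and the synthetic test tone are mine.

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=75.0, fmax=400.0):
    """Estimate fundamental frequency (Hz) via autocorrelation.

    fmin/fmax bound the search to a plausible human pitch range.
    """
    signal = signal - signal.mean()
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]           # keep non-negative lags only
    lag_min = int(sample_rate / fmax)      # highest pitch -> smallest lag
    lag_max = int(sample_rate / fmin)      # lowest pitch -> largest lag
    peak_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / peak_lag

def pitch_variability(signal, sample_rate, frame_ms=40):
    """Std. dev. of per-frame pitch: a rough proxy for intonation range.

    A very low value suggests flat, monotone delivery.
    """
    frame = int(sample_rate * frame_ms / 1000)
    pitches = [estimate_pitch(signal[i:i + frame], sample_rate)
               for i in range(0, len(signal) - frame, frame)]
    return float(np.std(pitches))

# Synthetic "voice": a steady 200 Hz tone, standing in for a flat delivery.
sr = 16000
t = np.arange(sr) / sr
flat_voice = np.sin(2 * np.pi * 200 * t)
print(estimate_pitch(flat_voice, sr))      # close to 200 Hz
print(pitch_variability(flat_voice, sr))   # near zero: no intonation change
```

A real system would add many more features (speaking rate, pauses, spectral measures) and learn which combinations correlate with clinical diagnoses, but the basic pipeline, extract acoustic features, then track them over time, looks like this.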
The idea then becomes that, much like I use my Fitbit to track my steps, sleep, and heart rate over time, companies deploying Sonde’s technology would use it to help patients track their emotional or cognitive states over time and get ahead of any issues. I imagine this sort of attention on a digital dashboard might increase some people’s anxiety, but when used appropriately it could help.
But (and this is a big but) I worry about who has access to this information and how it might be used — especially if it becomes accepted as part of popular knowledge as opposed to being diagnostically relevant. Think about employee wellness programs that hand out activity trackers and set goals of 10,000 steps a day, even though that number was chosen somewhat randomly.
I’m not sure I’d want people outside of my medical team monitoring trends for possible depression, COPD, asthma, or cognitive decline. It’s certainly not data I’d want sold to advertisers or accessed by my employer.
Sonde Health doesn’t currently share any data about its users with third parties. Rather, it provides an app for consumers and has deals with healthcare providers and remote monitoring companies to provide its algorithms and dashboards as a service. So far, the algorithms are designed to look for biomarkers captured through a near-field microphone, such as the one on your phone. Sonde also validates its algorithms and develops new biomarkers with a variety of medical partners, who formally diagnose anyone whose voice is used to train the algorithms.
Liu stressed to me that the training data includes people of different ethnicities, socioeconomic statuses, and genders. The algorithms are tested and developed in conjunction with different hospitals and medical practices. For example, Sonde Health is working with Massachusetts General Hospital’s Frontotemporal Disorders Unit to remotely detect and monitor mild cognitive impairment in people living at home.
The results of this work could lead to new biomarkers that could act as early warning signals for diseases such as Alzheimer’s or other forms of dementia.
In addition to expanding the number of biomarkers, Liu said the company is trying to build algorithms that could work on far-field microphones, such as those found in smart speakers or smart TVs. This makes me nervous, because none of these companies are bound by any of the strictures of HIPAA. Liu maintained that people are OK with their data being tracked for their benefit, especially when it comes to health and wellness, but I think most people simply don’t understand the lack of protection they have with regards to keeping that data private.
I’m glad Liu isn’t selling the data and has no plans to, but as Sonde expands its business relationships beyond remote healthcare and monitoring companies, the potential for someone to use this data against an individual grows. And the fact that this data isn’t proven to be diagnostically relevant means that people might experience painful repercussions thanks to tech that doesn’t even do what it purports to do.
JD Roberts says
As someone who has slurred speech due to a neuromuscular condition, I get tripped up on this regularly because of an automated wellness check that my healthcare provider uses. It’s not even testing a vocal sample, it just asks questions. And one of the questions is “Is your speech slow enough that other people notice?” Which it is, all the time. But the automated evaluator marks that as a potential sign of depression. Every time. I suppose it’s understandable, but it’s super annoying.
As the old medical saying goes, “correlation is not causation.” I know from experience that there are too many different things that can generate the “markers” being discussed that don’t really have anything to do with the condition being evaluated for. Humans are just more complicated than that.