Devices Can Now Know Your Feelings with Empathetic Technology

Empathetic Technology

For many people, the word “technology” conjures cold images of steely robots and complex computer algorithms. But a talk on “empathetic technology” at this year’s Wired Health conference did a lot to change that perception.

In the United States, around 39 million people currently have a smart speaker; technology that caters to our needs is increasingly everywhere, taking up more and more of our personal space.

But smart devices can do more than play our favorite song or search the internet when we ask them to. Soon, smart speakers may be able to diagnose us or tell how we are feeling.

At Wired Health — an annual conference which brings to the fore the latest developments in health tech — neuroscientist and technologist Poppy Crum, Ph.D., gave a talk aptly titled “Technology that knows what you’re feeling.”

Treading a fine line between cautionary and confident, the title made a powerful point: soon, consumer technology may know our mental and physical states before we do.

But how can technology achieve this? How can we harness its potential to help us understand mental and physical conditions, and what role does empathy play in all of this?

What is empathetic technology?

Crum, the chief scientist at Dolby Laboratories in San Francisco, CA, and an adjunct professor at Stanford University’s Center for Computer Research in Music and Acoustics, defines empathetic technology as “technology that is using our internal state to decide how it will respond and make decisions.”

So how is technology able to identify and read our internal states? Crum’s talk at Wired Health featured some interesting examples of neurophysiological “giveaways” that the right type of technology can now pick up easily — a phenomenon the scientist referred to as “the end of the poker face.”

Pupillometry research from the last few decades has shown that we can track many cognitive processes, such as memory, attention, and mental load, by measuring the diameter of our pupils. In fact, it is an experiment we can all “try at home.” In 1973, the renowned psychologist Daniel Kahneman wrote:

“Face a mirror, look at your eyes and invent a mathematical problem, such as 81 times 17. Try to solve the problem and watch your pupil at the same time, a rather difficult exercise in divided attention. After a few attempts, almost everyone is able to observe the pupillary dilation that accompanies mental effort.”
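To make the idea concrete, here is a minimal sketch of how software might flag the mental-effort episodes Kahneman describes from a stream of pupil-diameter samples. The readings, the baseline, and the 0.14 mm threshold are invented for illustration; real pupillometry systems also correct for luminance, blinks, and individual differences.

```python
# Toy sketch: flag moments of mental effort from pupil-diameter samples.
# All numbers here are made up for illustration, not clinical values.

def dilation_events(samples_mm, baseline_mm, threshold_mm=0.14):
    """Return indices where pupil diameter exceeds baseline by the threshold."""
    return [i for i, d in enumerate(samples_mm) if d - baseline_mm > threshold_mm]

# Diameter readings (mm) while a subject rests, then multiplies 81 x 17.
readings = [3.1, 3.0, 3.1, 3.4, 3.6, 3.5, 3.15, 3.1]
print(dilation_events(readings, baseline_mm=3.05))  # -> [3, 4, 5]
```

The three flagged samples correspond to the stretch where the subject is working on the arithmetic problem, which is the same dilation an observer can see in the mirror.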

Other experiments have shown that skin conductance, also known as galvanic skin response, can predict a person’s emotional response while they watch a movie or a football match.

The amount of sweat a person’s skin secretes, and the resulting changes in the skin’s electrical resistance, can reveal “stress, excitement, engagement, frustration, and anger.” Humans also exhale chemicals, such as carbon dioxide and isoprene, when they feel lonely or scared.
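A galvanic skin response signal is simply a series of conductance readings, and a sudden rise in conductance suggests an arousal event. The sketch below shows the idea with invented numbers; the trace values (in microsiemens) and the rise threshold are assumptions, and real systems use calibrated sensors and per-person baselines.

```python
# Toy sketch of galvanic skin response (GSR) processing. Conductance values
# (microsiemens) and the 0.05 uS rise threshold are invented for illustration.

def arousal_spikes(conductance_us, rise_threshold=0.05):
    """Count sample-to-sample conductance rises large enough to suggest arousal."""
    rises = [b - a for a, b in zip(conductance_us, conductance_us[1:])]
    return sum(1 for r in rises if r > rise_threshold)

# A calm stretch followed by a sudden scare while watching a film.
trace = [2.00, 2.01, 2.01, 2.02, 2.30, 2.55, 2.59, 2.58]
print(arousal_spikes(trace))  # -> 2
```

The two counted spikes fall exactly where the trace jumps, which is the kind of moment an empathetic device could associate with excitement or fear.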

Useful applications of empathetic tech

In her Wired Health talk, Crum explained that “empathetic” hearing aids could be personalized, adjusting to the amount of effort that a person with hearing problems must exert to make out what someone is saying.

This would not only help destigmatize those living with specific disabilities but also provide them with the best possible care.

Empathetic technology also has broad implications for our mental health. “With more capable cameras, microphones, thermal imaging, and exhalant measuring devices, we can capture prolific data,” writes Crum, data that can, in turn, serve to alert carers.

Crum explained in her Wired Health talk that, “On the subject of mental health, it is not only the eyes that offer a window into someone’s soul, but also the voice.”

Researchers have applied artificial intelligence (AI) to data they collected on parameters such as syntactic patterns, pitch, and pronoun use to accurately detect the onset of depression, schizophrenia, or Alzheimer’s disease.

“The model sees sequences of words or speaking style, and determines that these patterns are more likely to be seen in people who are depressed or not depressed […] Then if it sees the same sequences in new subjects, it can predict if they’re depressed too.” – Tuka Alhanai

“Every patient will talk differently,” said study co-author James Glass, a senior research scientist in CSAIL, of the findings at the time, “and if the model sees changes, maybe it will be a flag to the doctors […] This is a step forward in seeing if we can do something assistive to help clinicians.”

Other researchers have applied computer algorithms to half a million Facebook status updates to detect “depression-associated language markers,” such as sensory cues or greater use of first-person pronouns, like “I” or “me.”
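One of those language markers, the rate of first-person singular pronouns, is simple enough to sketch directly. The example sentences and the marker itself are illustrative only; the actual studies fit statistical models over many markers rather than a single pronoun count.

```python
# Toy sketch of one "depression-associated language marker": the share of
# first-person singular pronouns in a status update. Example texts are invented.

import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(text):
    """Return the fraction of words that are first-person singular pronouns."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(w in FIRST_PERSON for w in words) / len(words)

print(first_person_rate("I feel like no one understands me"))  # 2 of 7 words
print(first_person_rate("The match last night was fantastic"))  # 0.0
```

On its own, a high rate means nothing about any individual; it only becomes informative when aggregated across many posts and combined with other markers.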

Arthritis gloves and inclusive design

Besides enhancing our understanding of psychological conditions, empathetic technology can also help address physical ones.

Crum and her team conducted an experiment using arthritis simulation gloves to induce empathy in a group of participants. The researchers then asked these participants to design the menu of an app whose users, they were told, would have arthritis.

The participants in the arthritis simulation group designed a completely different user experience from those in the group who could not empathize with their users. People in the former group removed features such as drop-down menus, for example, which are difficult to interact with for people with limited finger mobility.

John Clarkson, a professor of engineering design at the University of Cambridge, U.K., and Roger Coleman, a professor emeritus of inclusive design at London’s Royal College of Art, designed the gloves, which were the result of 10 years of research into “inclusive design.”

Waller also uses a pair of glasses to simulate vision problems, and other researchers have used immersive technology, such as virtual reality simulators, to recreate the experience of living with “age-related macular degeneration, glaucoma, protanopia, and diabetic retinopathy.”

Towards an ‘era of the empath’

We are moving toward “the era of the empath,” as Poppy Crum has dubbed it: an era in which “technology will know more about us than we do,” but also one in which we will know more about each other than ever before.

“Consumer technology will know more about our mental and physical wellness than many clinical visits.” – Poppy Crum

Uniting machine learning with sensing technology, and with the vast amounts of data that this combination can collect, offers great opportunities for physicians, writes the scientist. “Here are just a few other examples of how this might play out,” she notes.

“By combining drug regimens with empathetic technology, doctors gain a closed feedback loop of data from the patient, changing drugs and therapies based on your signals.”

“Or, weeks before you go in for knee surgery, your orthopedic surgeon can gather much more data about your gait and how you use your knees in ways that may benefit from different considerations during your physical therapy rehabilitation post-surgery,” she continues.

At Wired Health, Crum appeared to have convinced her audience that empathetic technology, combined with AI, can significantly improve our lives rather than hold them back, a point the scientist drives home in many of her earlier articles.

“AI is often feared because people think it will replace who we are. With empathetic technology, AI can make us better, not replace us. It can also assure us and our doctors that the interventions they prescribe are actually solving the problems we have.” – Poppy Crum