Decoding Empathy: New Insights for Digital Health Design

You board a plane and take your seat. As you get situated, you see the flight attendants going through their pre-flight safety demonstration: buckling seat belts, pointing out exits, and miming how to pull on an oxygen mask.

The facial expressions and gestures are deliberately exaggerated, almost comically so. Why is that? Because in that context, the crew needs to get critical information across to distracted passengers quickly and effectively. The nonverbal communication is as important as the verbal.

This principle applies in healthcare contexts too, as highlighted by fascinating new research from Audrey Marcoux and colleagues published in Computers in Human Behavior. In two clever experiments using simulated medical consultations with digital characters, they systematically varied the characters' facial expressions, gaze direction, and body posture to see how these cues affected perceptions of empathy. The results provide a window into how even subtle nonverbal cues shape the empathic connection.

Some key takeaways:

  • Facial expressions mattered more for judged empathy than gaze direction or posture. Sad and pained expressions were seen as more empathetic than neutral ones.
  • Higher-intensity sad expressions were judged as more empathetic than lower-intensity ones, suggesting that a visible effort to understand and share feelings counts more than perfectly mirroring the patient's state.
  • Combinations of cues mattered. A direct gaze plus a forward lean amplified the effect of sad and pained expressions. But for very low-intensity expressions, the lean alone was enough.
  • Women's empathy judgments varied less across behavior combinations than men's. Gender-based stereotypes about empathy may fill in the gaps when expressions are ambiguous.

Implications

The findings carry practical implications for the burgeoning field of digital or virtual agents in healthcare. Designers should pay close attention to the facial expressions, gaze, and posture of their virtual characters or avatars. Combining a moderate-intensity sad expression with a direct gaze and a forward or upright posture seems to be the optimal formula for conveying empathy.

Interestingly, perfectly mirroring the user's emotional state may be less important than clearly demonstrating an effort to understand and connect. Designers should also be mindful of gender stereotypes that may color users' perceptions, especially when expressions are subtle. Careful testing of different nonverbal combinations with representative user samples can help optimize characters' empathic impact.
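As a rough illustration of what that kind of testing might look like, here is a minimal Python sketch that enumerates cue combinations for a hypothetical avatar so each can be put in front of users. The cue names, intensity levels, and the EmpathyCueConfig structure are illustrative assumptions for this sketch, not values from the study or from any particular avatar toolkit.

```python
from dataclasses import dataclass
from itertools import product

# Illustrative nonverbal cue parameters for a virtual health agent.
# The specific labels and levels are assumptions for this sketch.
EXPRESSIONS = ["neutral", "sad", "pained"]
INTENSITIES = [0.3, 0.6, 0.9]          # low / moderate / high
GAZES = ["averted", "direct"]
POSTURES = ["upright", "forward_lean"]

@dataclass(frozen=True)
class EmpathyCueConfig:
    expression: str
    intensity: float
    gaze: str
    posture: str

# A starting preset reflecting the combination the research suggests reads
# as most empathic: a moderate-intensity sad expression, a direct gaze,
# and a forward-leaning posture.
DEFAULT_PRESET = EmpathyCueConfig("sad", 0.6, "direct", "forward_lean")

def candidate_configs():
    """Yield every cue combination so each can be rated by test users."""
    for expr, intensity, gaze, posture in product(
        EXPRESSIONS, INTENSITIES, GAZES, POSTURES
    ):
        yield EmpathyCueConfig(expr, intensity, gaze, posture)

if __name__ == "__main__":
    configs = list(candidate_configs())
    print(f"{len(configs)} combinations to rate with a representative user sample")
```

In practice a design team would likely prune this grid to a handful of plausible contenders and compare users' empathy ratings against the default preset, rather than testing every cell.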

More broadly, this research underscores how nonverbal behaviors, even when mediated by technology, remain a potent channel for building understanding and connection. As healthcare interactions increasingly move into virtual spaces, getting the nonverbal piece right will be crucial, especially for applications where empathy and rapport are paramount, such as mental health. Like those flight attendants exaggerating for effect, digital agents may need to amp up the emotional cues to get the point across.

The science of empathic design is rapidly evolving, and this study offers valuable guideposts for the road ahead. As we work to humanize healthcare technology, cracking the "empathy code" will be an essential part of the journey. By using research insights and embracing an empirical mindset, designers can create virtual agents that inform, support, and connect with consumers.
