Why Headspace’s CPO says AI won’t replace your therapist

 

By Jessica Bursztynsky

 

Dialogue about the rise of artificial intelligence appears to have fallen into two categories: Generative AI will save the world, or it will ruin it. But there’s a middle ground that’s been left out, says Leslie Witt, the chief product and design officer at Headspace Health.

 

“What we’re invested in is, how do we superpower the expert so that they can scale, so that they can be that much more confident in what it is that they’re doing and that much more efficient, versus any notion that this acts as a replacement,” Witt tells Fast Company in an interview.

The company’s namesake product, the meditation and mindfulness app Headspace, has about 2,500 paying corporate customers and has been downloaded 100 million times across 190 countries.

In 2021, Headspace merged with Ginger, a leader in on-demand mental health care, to form Headspace Health. The next year it bought the AI-powered mental health and wellness startup Sayana to accelerate its AI efforts. Now, the company is working to integrate all of its mental health and wellness services into what will eventually be a single stand-alone Headspace app. It currently uses AI to match coaches with users, help coaches fill out note-taking summaries from sessions, and create so-called smart replies that coaches can use when responding to conversations.

 

“There’s a ton of excitement,” Witt says. “I think it is profoundly laced with a ‘not ready yet’ sentiment . . . particularly in areas that have deep sensitivity around them, like mental health care.” It’s “incredibly compelling” to think about how the tech can make therapy and other care more accessible, but it “also has the impact to be incredibly negatively impactful,” she adds.

Witt’s comments come as AI companies question how to move forward responsibly as an industry. Earlier in the week, a number of tech leaders, including Steve Wozniak and Elon Musk, signed an open letter calling for a moratorium on advancing AI, warning of “profound risks to society.”

For her part, Witt suggests that when it comes to healthcare and mental health care, it’s far too soon to replace traditional care with AI. “In the current state, particularly as it comes to provision and care, it’s too early [and] the safeguards don’t exist,” Witt says. “Both from a regulatory perspective and an evaluative perspective—like, how high-quality is this?”

 

Headspace has been running internal hackathons about potential AI applications while also running what it calls a “safety salon,” which gathers people to talk about how a responsible framework fits into this set of tools. Now, she argues, is the time for the industry to unite and decide how to move forward safely with the technology. 

“Tech is no longer video games and nice to have,” she says. “It’s just an essential part of the human condition, and in spaces like health tech, and mental health tech in particular, you can’t premium the tech without talking about the commitment to the person.” 

It’s still early days when it comes to how AI fits into mental health care. Most likely, Witt says, it’s not an either/or choice between AI and human-delivered care. Consumers still have watches and laptops and televisions and tablets and cellphones, not just one overarching device.

 

Before venturing into health tech at Headspace, Witt worked in fintech at Intuit, where observers had forecast for decades that such technology would eliminate accounting departments. “If you talk about something that is deeply rules-based, highly formulated, you can [machine learn] most of what it is, if not all, of what an accountant does,” Witt says. “And the accounting industry is as strong as ever because when it comes to the confidence that someone wants on the idiosyncrasies of a human being who can empathize and have meaningful conversation, they want an expert.”

 

Fast Company
