Kai Koerber was a student at Marjory Stoneman Douglas High School in Florida when a gunman murdered 14 students and three school workers there in February 2018. Seeing himself and other students struggle to return to normal life, Koerber wanted to do something to help people deal with their emotions.
Some of his classmates from the school have advocated for gun control policies or entered politics. Some took time to heal and work on their studies. Koerber’s background in technology led him in a different direction: to build a smartphone app.
The result was Joy: AI Wellness Platform. It uses artificial intelligence to suggest small mindfulness activities to people based on how they are feeling. The algorithm is designed to recognize how a person feels from the sound of their voice, no matter what words they say or what language they speak.
Like many of his fellow students from Marjory Stoneman Douglas, Koerber said he suffered from post-traumatic stress disorder for a “very long time.” Only recently has it eased a little.
Koerber started a research team at the University of California, Berkeley to test whether his idea was possible.
The idea was a platform that gives people who are struggling a “wellness practice on the go that meets our emotional needs on the go.”
He said it was important to offer quick activities. Sometimes the activities last just a few seconds and can be done anywhere the user might be.
Mohammed Zareef-Mustafa is a former classmate of Koerber's. He has been using the app for a few months.
“I use the app about three times a week, because the practices are short and easy to get into,” Zareef-Mustafa said. “It really helps me quickly de-stress before I have to do things like job interviews.”
To use Joy, the user simply speaks into the app on a phone or computer. The AI is supposed to recognize how the user is feeling from their voice, then suggest short activities.
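For readers curious about the mechanics, here is a minimal sketch of how a voice-based mood classifier of this kind could work. It is not Joy’s actual method: the feature choice, the mood labels, and the mood-to-activity mapping are illustrative assumptions, and a stand-in function takes the place of a trained model.

```python
# A minimal sketch of a voice-based mood classifier (NOT Joy's actual method).
# Assumptions: MFCC features, a three-mood label set, and a toy activity map.
import numpy as np
import librosa

MOODS = ["neutral", "sad", "stressed"]  # assumed label set
ACTIVITIES = {                          # assumed mood-to-activity mapping
    "neutral": "15-second 'mindful consumption' exercise",
    "sad": "track how many times you laugh this week",
    "stressed": "30 seconds of slow breathing",
}

def extract_features(waveform: np.ndarray, sr: int = 16000) -> np.ndarray:
    """Summarize a voice clip as language-independent acoustic features.

    MFCCs describe vocal timbre rather than word content, which is one way
    a model could work no matter what words or language a person speaks.
    """
    mfcc = librosa.feature.mfcc(y=waveform, sr=sr, n_mfcc=13)
    # Average each coefficient over time -> one fixed-length vector per clip.
    return mfcc.mean(axis=1)

def classify_mood(features: np.ndarray) -> str:
    """Stand-in for a trained model.

    A real system would run a classifier trained on labeled speech; here we
    just map the feature vector to a label so the sketch runs end to end.
    """
    return MOODS[int(abs(features.sum())) % len(MOODS)]

if __name__ == "__main__":
    sr = 16000
    # Synthetic 2-second "voice" clip so the example runs without an audio file.
    t = np.linspace(0, 2, 2 * sr, endpoint=False)
    clip = (0.1 * np.sin(2 * np.pi * 220 * t)).astype(np.float32)

    mood = classify_mood(extract_features(clip, sr))
    print(f"Detected mood: {mood}")
    print(f"Suggested activity: {ACTIVITIES[mood]}")
```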
If the user is feeling “neutral,” the app suggests several activities, like a 15-second exercise called “mindful consumption.” It gets you to “think about all the lives and beings involved in producing what you eat or use that day.”
Another activity calls for practicing how to make an effective apology. Another asks users to write a letter to their future self. One suggestion asks sad users to track how many times they laugh over a seven-day period. At the end of the week, the user counts up the laughs to see what moments made them happy.
The iPhone app costs $8 a month. It is a work in progress. And like other AI tools, the more people use it, the better it becomes.
Many wellness apps on the market claim to help people with mental health issues, but it is not always clear whether they work, said Colin Walsh. He is a professor of biomedical informatics at Vanderbilt University and has studied the use of AI in suicide prevention.
The stakes also matter. Facebook, for instance, has faced some criticism in the past for its suicide prevention tool. It used AI as well as humans to identify users who may be considering suicide. If the technology is simply directing someone to spend some time outside, and the stakes are lower, it is unlikely to cause harm, Walsh said.
Koerber said people often forget, after mass shootings, that survivors do not “bounce back right away” from the experience. It takes years to recover, he said.
His work has also been slower and more thoughtful than that of tech business leaders of the past, he added.
“I guess young Mark Zuckerberg was very ‘move fast and break things,’” Koerber said. “And for me, I’m all about building quality products that ... serve social good in the end.”
I’m Dan Novak.