Kai Koerber was a junior at Marjory Stoneman Douglas High School when a gunman murdered 14 students and three staff members there on Valentine's Day in 2018. Seeing his peers, and himself, struggle with returning to normal, he wanted to do something to help people manage their emotions on their own terms.
While some of his classmates at the Parkland, Florida, school have entered politics or simply taken a step back to heal and focus on their studies, Koerber's background in technology (he'd originally wanted to be a rocket scientist) led him in a different direction: building a smartphone app.
The result was Joy, which uses artificial intelligence to suggest bite-sized mindfulness activities for people based on how they are feeling. The algorithm Koerber's team built is designed to recognize how a person feels from the sound of their voice, regardless of the words or language they speak.
"In the immediate aftermath of the tragedy, the first thing that came to mind after we've experienced this horrible, traumatic event: how are we going to personally recover?" he said. "It's great to say OK, we're going to build a better legal infrastructure to prevent gun sales, increased background checks, all the legislative things. But people really weren't thinking about ... the mental health side of things."
Like many of his peers, Koerber said he suffered from post-traumatic stress disorder for a "very long time," and only recently has it gotten a little better.
"So when I came to Cal, I was like, let me just start a research team that builds some groundbreaking AI and see if that's possible," said the 23-year-old, who graduated from the University of California at Berkeley earlier this year. "The idea was to provide a platform to people who were struggling with, let's say sadness, grief, anger ... to be able to get a mindfulness practice or wellness practice on the go that meets our emotional needs."
He said it was important to offer activities that can be done quickly, sometimes lasting just a few seconds, wherever the user might be. It wasn't going to be your parents' mindfulness practice.
"The notion of mindfulness being a solo activity or something that's confined to sitting in your room breathing is something that we're very much trying to dispel," Koerber said.
Mohammed Zareef-Mustafa, a former classmate of Koerber's who's been using the app for a few months, said the voice-emotion recognition part is "different than anything I've ever seen before."
"I use the app about three times a week, because the practices are short and easy to get into. It really helps me quickly de-stress before I have to do things like job interviews," he said.
To use Joy, you simply speak into the app. The AI is supposed to recognize how you are feeling from your voice, then suggest short activities.
It doesn't always get your mood right, so it's possible to manually pick your disposition. Let's say you are feeling "neutral" at the moment. The app suggests several activities, such as a 15-second exercise called "mindful consumption" that encourages you to "think about all the lives and beings involved in producing what you eat or use that day."
One activity helps you practice making an effective apology. Another has you write a letter to your future self, with pen and paper (remember those?). Feeling sad? A suggestion pops up asking you to track how many times you've laughed over a seven-day period and tally it up at the end of the week to see what moments gave you a sense of joy, purpose or satisfaction.
The iPhone app is available for an $8 monthly subscription, with a discount if you subscribe for a whole year. It's a work in progress, and as is often the case with AI, the more people use it, the more accurate it becomes.
"Kai is a leader of this next generation who are thinking intentionally and with focus about how to use technology to meet the mental, physical, and climate crises of our times," said Dacher Keltner, a professor at UC Berkeley and Koerber's faculty advisor on the project. "It comes out of his life experience, and, unlike past technologists, he seems to feel this has to be what technology does: make the world healthier."
A plethora of wellness apps on the market claim to help people with mental health issues, but it's not always clear whether they work, said Colin Walsh, a professor of biomedical informatics at Vanderbilt University who has studied the use of AI in suicide prevention. According to Walsh, it is feasible to take someone's voice and glean some aspects of their emotional state.
"The challenge is if you as a user feel like it's not really representing what you think your current state is like, that's an issue," he said. "There should be some mechanism by which that feedback can go back."
The stakes also matter. Facebook, for instance, has faced criticism in the past for its suicide prevention tool, which used AI (as well as humans) to flag users who may be contemplating suicide and, in some serious cases, to contact law enforcement to check on the person. But when the stakes are lower, Walsh said, such as when the technology is simply directing someone to spend some time outside, it's unlikely to cause harm.
"The driver is there's a huge demand there, or at least the perception of a huge demand there," Walsh said of the explosion of wellness and mental health apps in the past few years. "Despite the best of intentions with our current system, and it does a lot of good work, obviously, there's still gaps. So I think people see technology as a tool to try to bridge that."
Koerber said people tend to forget, after mass shootings, that survivors don't just "bounce back right away" from the trauma they experienced. It takes years to recover.
"This is something that people carry with them, in some way, shape or form, for the rest of their lives," he said.
His work has also been slower and more deliberate than that of tech entrepreneurs of the past.
"I guess young Mark Zuckerberg was very 'move fast and break things,'" he said. "And for me, I'm all about building quality products that, you know, serve social good in the end."