#StartupsEverywhere: Bellevue, Wash.

#StartupsEverywhere profile: Dr. Grin Lord, Co-Founder and CEO, Empathy Rocks

This profile is part of #StartupsEverywhere, an ongoing series highlighting startup leaders in ecosystems across the country. This interview has been edited for length, content, and clarity.


Using Ethical AI to Promote Empathy

Empathy Rocks is a startup that’s using technology to teach people to be empathetic. We recently spoke with Dr. Grin Lord, the Co-Founder and CEO of Empathy Rocks, to learn more about her startup’s work, how she’s working to combat AI bias, and her experiences as a woman founder.

What in your background led you to launch Empathy Rocks?

I’m a licensed psychologist who has coached people in empathic listening skills for most of my career. When I first started out in 2006, I worked at the Level I Trauma Center in Seattle. Our hospital conducted a study on providing brief therapy to people who came into the emergency room after a traumatic injury or accident involving alcohol. A psychologist would listen to these patients with empathy for a short period of time, even just 15 minutes, using a form of therapy called motivational interviewing. We found that folks who received the empathic listening showed increased abstinence from drinking after leaving the hospital, and that effect lasted for three years. That one conversation, in fact, led to a 48 percent drop in hospital readmissions. So the hospital scaled the idea and decided to have psychologists come in and talk to every single patient using this empathic intervention, which saved the hospital millions in readmission costs annually. To me, that was an inspiration and a concrete example of how empathic listening helps people feel understood so they can change their behavior.

Over the past year, the world has really become attuned to and demanding of more empathy, and I wanted to scale those skills and concepts to make them accessible to anyone. 

Can you tell us about Empathy Rocks and how you’re working to improve human empathy through the use of artificial intelligence?

We founded Empathy Rocks to provide free, gamified skills training in empathy. We train people to be empathic through our games and with our empathy engine, called mpathic, which makes text-based corrections. Our empathy engine is like Grammarly for empathy, or a coach built into your keyboard to help you develop better communication skills. We use artificial intelligence—instead of a human coach—in our products because AI is consistent and can make expert-level corrections to conversations.

My co-founder Nic Bertagnolli and I built our AI model by sourcing data from therapists who are, ideally, some of the most empathic people on the planet. When they play our games, they’re helping us build our empathy model and we’re helping them improve their skills. And we’re also combing through empathic responses in therapy transcripts to further build out our AI models. 

Even though our global mission is to bring empathy to everyone and to have therapists guide our efforts, we also want to improve the quality of therapy. A lot of therapists go through training programs without getting immediate feedback on their skills. We have a partnership now with the Crisis Hotline in Idaho, and I’d like to see other crisis call centers or training centers use programs like Empathy Rocks to get real-time empathy feedback.

So our mission is improving empathy for everyone, and our games are especially targeted at improving the skills of evidence-based therapists. We want the basic skills that we know lead to positive behavior changes to become second nature to people who work in crisis settings.

I know that part of Empathy Rocks’ mission is to source and collect data in a way that includes underrepresented voices. Can you tell us more about how you do this, and why this type of intentional data collection is so important?

As people play our games, they’re helping us improve our empathy model. The reason we chose to do it in this way, rather than other ways of training AI models, is that a lot of other approaches just scrape tons of Internet data and then drop it into a model. We want to be sure that we’re specifically sourcing and curating data from really empathic people.

We wanted to be mindful of AI bias, where machines are basically trained to reproduce the status quo from historical data. If your data reflects systemic racism, sexism, and institutionalized oppression, then the algorithm will reproduce those same patterns. We don’t want our model to find a particular group to be non-empathic because we excluded them from our training data. So we’re trying to collect data from as many different genders, races, and backgrounds as possible to build a robust model. And as people play our game, the models evolve over time. People might change how they perceive empathy—even in nuanced ways—and we want our models to adapt to that.

What are some of the challenges you’ve had to overcome as a woman founder, and what can the startup community and policymakers do to better address some of the barriers to entry for underrepresented founders?

As a woman founder, I’m pitching a SaaS AI product to a sector that’s seemingly dominated by men who are often independently wealthy. I didn’t come into this experience with that background or cushion. So more access to early-stage, risk-tolerant capital would be great to see at the local, state, and federal levels.

When entering the startup world, I was most affected by access to child care. It’s cost prohibitive, and remote schooling plus lack of consistent child care almost stopped my startup journey. Since becoming a parent, I’ve taken jobs that barely made a profit when accounting for the cost of child care for two kids under 5 years old. That is scary to me given that I am a doctoral-level psychologist. I have an amazing amount of privilege and job opportunities, and yet I still struggle. I’ve basically run my startup in the evenings after the kids are asleep, which I think many parents do. As an employee of my startup, I can claim the annual child care tax credit. But where I live, that credit pays for around three months of full-time child care for my 3-year-old. My partner is a freelance artist and does the majority of caregiving at this point. I think countries that have federally supported child care must have more room for supporting innovation. I wouldn’t be doing this if I didn’t have a supportive partner and didn’t believe in myself and this company so strongly.

Another thing that helped me was joining PIE, a Portland-based accelerator that doesn’t charge founders to participate. They help many women like me access information, mentorship, and connections that might otherwise be inaccessible or costly. My Co-Founder and I are in the process of exploring grants and are about to start pitching to angels, but it’s still been hard to access the capital needed to start out in this space.

What are some of the startup-related policy issues and concerns that you believe should receive more attention from policymakers?

The government can do a better job of reducing barriers to entry. I recently spoke with Josh Carter, the director of the Maritime Blue Innovation accelerator, and we discussed some potential solutions. For example, the Air Force launched an innovation group that develops patents and innovation labs, and then connects with startups to help commercialize them. Larger companies also have consultants who go out and create startups, and then the company acquires them once they reach product-market fit. These are models that are already working, and the government can use them as a template. Policymakers don’t need to reinvent the wheel. Instead, they can look at these existing models and determine how best to apply them to underrepresented founders to help reduce barriers and provide entrepreneurs with the support and funding they need.

In terms of reducing AI bias, the government could consider a program that offers grants to bring experts in AI bias and its social consequences into tech companies to address these issues. If the government is worried about tech making problems like sexism and racism worse, then it should create incentives for the many people who have studied the consequences of AI bias, including psychologists and experts in institutionalized racism and sexism, to come into tech spaces and help mitigate those biases. A lot of programs focus on testing algorithms for bias after they are built, but we could also be doing work to help developers become more aware of their personal biases so they are thinking about these issues when selecting training data.

I’m doing a little of this work in a grassroots way through a group I founded called Therapists in Tech, which supports therapists entering tech spaces so they can understand the language of tech and bring their important perspectives to this work. We have over 300 members who are thought leaders in digital health supporting each other on our forum. We’re still figuring out our path, but my long-term vision is for every tech startup to have access to mental health experts who can help think through the consequences of their products on mental health, including AI bias and other problems, at the early stages. It would be great to see government dollars focused specifically on supporting these kinds of mental health initiatives in accelerators and startup spaces.

What is your goal for Empathy Rocks moving forward?

I want to stay focused on scaling empathic communication training. I hope we can continue to gain traction so that, in five years, every counselor can pass our empathy training before they go to a crisis clinic or call center. I’d also like to see us continue developing our ‘mpathic’ engine so it can be integrated with larger software services. I’d love to see people trained in empathy while typing a text or email, learning and practicing effective communication in their daily lives. We could all use some more understanding and compassion right now!


All of the information in this profile was accurate at the date and time of publication.

Engine works to ensure that policymakers look for insight from the startup ecosystem when they are considering programs and legislation that affect entrepreneurs. Together, our voice is louder and more effective. Many of our lawmakers do not have first-hand experience with the country's thriving startup ecosystem, so it’s our job to amplify that perspective. To nominate a person, company, or organization to be featured in our #StartupsEverywhere series, email edward@engine.is.