Podcast Summary
We Are For Good Podcast - Episode 649
Retention Is a Love Language: Using AI to Communicate With More Care (Not Just More Often)
Guests: Crystal Clark (Stop Soldier Suicide), Amina Mohamed (Cameras for Girls), Michael Mitchell (Feed the Children)
Date: September 29, 2025
Episode Overview
This episode centers on the idea that donor retention should be viewed as a “love language” for nonprofits—requiring care, intentional connection, and ethical communication, especially in the age of AI. Host Jon McCoy and his guests (Crystal Clark, Amina Mohamed, and Michael Mitchell) discuss how technology can support meaningful relationships, the risks of automating away authenticity, and how to balance efficiency with true care for donors. They emphasize that while AI can be a powerful tool, the heart of donor retention is still profoundly human.
Key Discussion Points & Insights
Framing Retention as a Love Language (04:57)
- Crystal Clark: Applies the concept of “love languages” to donor relations. At Stop Soldier Suicide, this means timely words of affirmation (personalized recognition) and quality time (deepening relationships):
“We are sending those personalized messages, making sure our donors feel loved, and we're doing that out of the gate… and then my love language is quality time, and that is really building the relationship.” (05:10)
- Michael Mitchell: Focuses on making donors feel seen, heard, and central to the story—moving from transactional to relational communications:
“It's about remembering that the donor is the hero of their story and putting them right there in the center…” (06:14)
- Amina Mohamed: Emphasizes bringing donors into the journey and showing them the real impact they enable, beyond automated thank-yous:
“It's less about thank you for giving and more about look what you made possible. And I think that shift… changes everything.” (07:13)
Barriers to Authentic Relationships: Noise & Trust (09:19)
- Michael: Identifies overwhelming “noise” and lack of trust as the biggest obstacles. AI has increased the volume of content, risking generic and impersonal messaging (“beige goo”).
“There's so much noise, so little signal… and AI has made that problem worse because now anyone can just generate a bunch of stuff to throw out there.” (09:19)
Care vs. Noise: How AI Fits In (10:40)
- Crystal: Shares a story about intentionally tailoring communications, picking up the phone instead of mass emailing sensitive information:
“We can't send an email to our lost survivors sharing the data findings that we have. So we picked up the phone, we called those donors, we let them know we have this white paper. Do you actually want to read it?” (10:40)
- Amina: Stresses that donor-centric communication must be genuine—simply using “you” language is not enough:
“Just because you wrote you instead of we does not make it donor centric. It's just noise.” (12:04)
- Michael: Encourages organic communication, driven by real excitement, rather than forcing rigid schedules:
“I'd rather 45 days go between a contact and you send something that you are really excited about, than... force it at, okay, it's been 30 days.” (12:47)
Building Trust with AI: Tools & Limits (14:49)
- Michael: Uses AI to help remember details and commitments, which helps build trust. He records and transcribes notes from donor conversations for accuracy.
“I'm using AI just to help me remember things. And when I remember things, that generates trust.” (14:49)
- Amina: Relies on handwritten, personalized notes with student photos, building trust through effort and authenticity. AI is used only to polish language, not replace personal touch.
“Every single note that I wrote was not a copy of the other. It was something personal to the person…” (17:30)
- Crystal: Uses AI to adjust her communication style for certain donors (e.g., more direct for military donors), but always aims to retain her authentic voice.
“I may write it in Crystal language and ask… AI to help me to be a little bit more formatted. That's going to fit the need of my donor…” (19:48)
Ethics in Storytelling: The Human Backstop (22:04)
- Amina: Advocates for participatory, ethical storytelling—asking program participants to review or provide their own stories, rejecting exploitative or colonial narratives.
“You got to stop the colonial narratives… They don't want to see it anymore, they don't care for it. It's not impactful, it's not real storytelling.” (22:04)
- Crystal: Highlights the importance of consent and avoiding retraumatization. Even after original consent, she rechecks before sharing stories, acknowledging family or emotional changes.
“We're still going back and making sure that they're okay because we don't know where they are on their journey of grief…” (24:43)
- Michael: Stresses that AI can fabricate (“hallucinate”) and the responsibility lies with humans to verify and uphold truth:
“The red line is AI hallucinates and so we have to be the backstop… We cannot tell stories that are not true.” (26:53)
Impact Reporting and Avoiding “Shiny, Shallow” Updates (29:15)
- Crystal: Warns against sending mass, decontextualized reports. Advocates for intentional, audience-specific segmentation to convey the larger story behind data.
“We're not just dispersing everything to the masses. We're actually doing things with intentionality, with the right audience…” (29:15)
- Amina: Refuses to use manipulative or “poverty porn” tactics to drive donations, drawing a hard ethical line:
“We will never manipulate donor emotions for conversion or for donations… it's always about respecting the dignity and the autonomy of the person…” (30:54)
Practical Applications: Experiments, Tools & Flops (33:07)
- Crystal: Is testing AI-assisted donor “journeys” in Facebook Messenger for small givers, focusing first on gathering their stories and reasons for engagement before making further asks.
“We are right now in the midst of doing what we're… doing journeys with them in Facebook messenger. And before we even start talking to them about do you want to become a monthly donor... we're actually collecting really good stories.” (33:21)
- Michael: Describes integrating scattered program data into an AI-driven Copilot agent that enables fast, accurate access to stats for one-to-one conversations.
“We have this large monitoring and evaluation program… we have been feeding it into an agent in Copilot that is now something that we can go and we can chat with…” (35:28)
- Amina: Uses Raisely for donations and automation, but always supplements automated workflows with personal, human updates and timely apologies (“oops” emails), which donors appreciate.
“You get the same thank you from me every month… But it's always about an update on top of our newsletter…” (37:20)
Notable Quotes & Memorable Moments
- Crystal Clark:
“I call it the peanut butter smear… But it's really thinking about, okay, what is it that we have in our lap to share with our donor? What is the appropriate way to share it?” (10:40)
- Michael Mitchell:
“Things that scale are actually the things that don't scale. Just continue to do the things that don't scale that you can't ask an AI to do and just be absurdly human…” (39:56)
- Amina Mohamed:
“AI for us, I call it my best friend because it's helped me be much more productive but not a substitute for that humanity that we also need to keep that top of mind, especially in the world we're living in today.” (41:01)
- On Ethics:
“We need a Hippocratic oath for philanthropy. Do no harm.” – Audience comment highlighted by the host (28:34)
Timestamps for Key Segments
- 01:12 – Session intro and panelist introductions
- 04:57 – What does “retention as a love language” look like?
- 09:19 – Barriers to relationship building: Noise & Trust
- 10:40 – When does communication feel like care vs. noise?
- 14:49 – How can AI be used to build trust (and not just automate)?
- 22:04 – Storytelling ethics in the AI era
- 29:15 – Impact reporting and avoiding shallow or manipulative updates
- 33:21 – Experiments & practical tools for retention
- 39:56 – One Good Thing (closing reflections)
Closing Reflections – “One Good Thing” (39:56)
- Michael:
“Do the things that don't scale… be absurdly human in that way. I think that is a great, great backstop against a lot of the ethics that we're talking about.”
- Amina:
“Donor retention… is love in action. Looks like slowing down to listen… Only using AI as a super tool, not a substitute for human connection.”
- Crystal:
“Be authentic. People feel that. And when you are using those AI tools… I want to sound like me. That donor is going to know that doesn't sound like her. So keep on tapping into you.”
Summary Takeaways
- AI can amplify connection but should never replace authenticity, care, and ethical boundaries in donor communication.
- Deep retention requires personalization, intentional listening, and centering the donor—not just in your messaging, but in your mission’s storytelling.
- Ethical lines must never be crossed, especially with AI—ensure dignity, seek consent, and “do no harm.”
- Keep experimenting—successful retention often arises from doing the unscalable, deeply human things.
For nonprofit professionals, this episode offers both inspiration and grounded tactics for leveraging technology with care. The recurring advice: lead with humanity—let AI enhance, not replace, your love language.
