Podcast Summary
Artificial Intelligence Podcast: ChatGPT, Claude, Midjourney, and All Other AI Tools
Episode: Is AI Changing the World of Consulting with Richard Hawkes
Host: Jonathan Green
Guest: Richard Hawkes, Change Consultant and Author
Date: June 16, 2025
Episode Overview
This episode explores the profound impact AI is having on the world of business consulting. Jonathan Green sits down with veteran change consultant and author Richard Hawkes to unpack how AI is reshaping consulting, team dynamics, business processes, and organizational change. Together, they scrutinize where AI adds value, where it falls short, and how consultants and executives can thoughtfully integrate AI into their work without losing sight of the essential human factors.
Key Discussion Points & Insights
1. How AI Is Redefining Consulting and Internal Advisory Roles
- Democratizing Consulting Tasks: Richard notes that AI has automated much of the traditional content-creation side of consulting (business plans, reports, analysis), enabling more people to ‘do it themselves’ (01:39–02:28).
- New Focus on Human Connection: With AI handling repetitive, technical work, consulting—and business in general—is placing heavier emphasis on the human side: facilitating hard conversations, driving closure, and building alignment.
  “What’s left is being able to drive the human being to human being conversations to actual closure.” —Richard Hawkes (01:39)
- Emergence of Management and Relationship Building: AI doesn’t just free up time; it pushes everyone up the value chain, emphasizing leadership, conversation, and collaboration.
  “It almost is pushing everyone into management. ... It gives you this ability to do less repetitive tasks and work more in relationship building.” —Jonathan Green (02:28)
2. AI, Organizational Tension, and Communication
- Inherent Tensions Are Unavoidable: Every business, worldwide, contains built-in tensions between departments—e.g., sales, marketing, product, operations.
  “They naturally exist in the business as a system.” —Richard Hawkes (03:56)
- AI as a Communication Accelerator: AI helps articulate perspectives quickly and creates written narratives that bridge gaps between departments, supporting alignment and understanding (05:00–06:54).
- Narrative-Driven Conversations Inspired by Amazon: Richard references Jeff Bezos’ narrative meetings at Amazon; similarly, AI makes narrative-building easier for everyone, potentially fostering a cultural shift toward deeper, more thoughtful business conversations.
3. The Nuances of Trusting AI
- Spectrum of Trust: People range from complete trust (blindly sending out AI-generated reports) to total skepticism (fear fueled by high-profile AI blunders). The right approach is cautiously iterative, with validation and alignment at each step (07:39–09:58).
- Trust Is Built Through Shared Experience:
  “Trust is the residue of promises kept.” —Richard Hawkes (07:41)
- Danger of Skipping Steps: AI tends to skip to conclusions, assuming you won’t ask follow-up questions—a mismatch with how most humans process complex topics.
  “It always jumps to giving you a conclusion because it always assumes you’re not going to ask another question.” —Jonathan Green (09:58)
- Breaking Down Complexity: The challenge (and the skill) is structuring change as a series of aligned, shared conversations rather than racing to a quick answer.
4. Choosing AI Tools: Problem First, Tool Second
- The Tool-First Trap: Many businesses buy a shiny AI tool before defining the problem to solve—a reversal of effective problem solving (12:10–12:22).
- Richard’s “Big Change Canvas”: His company developed an AI-assisted framework that guides organizations through the necessary stakeholder conversations, ensuring both satisfaction and trust along the way.
- AI Interfaces Are Solution-Centric, Human Change Is Process-Centric: Current AI chat interfaces are designed for speed and quick answers, often skipping the essential journey of stakeholder alignment (13:55–15:03).
5. The Design and Limitations of AI Models
- AI Is Programmed for Affirmation and Speed: AI models aim to maximize positive feedback and minimize resource use, sometimes at the cost of accuracy or alignment with user needs (15:03–17:01).
- “Hallucinations” Are Just Wrong Answers: Jonathan challenges the term “hallucinate,” highlighting that users experience these outputs as deception or error, not something benign.
- AI Embeds Its Creators’ Biases: Even image-generating AIs reflect their programmers’ worldviews and biases, underlining the non-neutrality of these tools.
6. Data Overload and Digital Hoarding
- From Too Little to Too Much Data: The pendulum has swung from not recording enough to compulsively transcribing and storing everything—data by the foot, rather than data for use (17:01–18:47).
- The Real Value Is in Shared Agreements: The information that matters most is what is agreed upon and acted upon together—not just what’s recorded (18:49–22:35).
  “The knowledge that impacts the world is the knowledge that results in shared agreements between us.” —Richard Hawkes (19:17)
7. The Importance of Shared Language and Social Contracts
- Clarifying Definitions Is Foundational: Key business terms and roles can mean radically different things in different contexts. Establishing shared definitions and agreements is crucial—and AI can only guess contextual meaning from surface-level dictionaries (22:35–25:50).
  “As soon as you go from one company to another … the CTO’s job is completely different, or the CEO sees their job as differently … the shared language or the social contract is different in each environment.” —Jonathan Green (23:24)
- AI and Memory—Not Always an Advantage: Perfect recall can stifle relationships; forgetting minor missteps enables smoother collaboration.
8. Diagnosing the Need for Change Consulting
- Complex Change Goes Beyond Individuals: Change is often misdiagnosed as an “individual skills” problem. In reality, the hardest and riskiest changes happen at the level of organizational systems and roles (26:45–30:35).
- Warning Signs:
  - Change requires altering leadership roles, capabilities, or culture (not just policy or strategy tweaks)
  - There are persistent tensions or deadlocks between departments or leaders (“who breaks the tie?”)
  - Top leaders are unwilling or unable to adapt to the new organization’s needs
- Case Study: Richard shares the story of a failed acquisition caused by a CEO’s inability to shift from top-down control to true enterprise leadership, highlighting the pitfalls of mismanaging change at the systems level.
  “An organization can never perform … at a higher level than the most level of leadership.” —Richard Hawkes (30:37)
Notable Quotes & Moments
- “What’s left is being able to drive the human being to human being conversations to actual closure.” —Richard Hawkes (01:39)
- “It gives you this ability to do less repetitive tasks and work more in relationship building. And that’s my favorite part of it.” —Jonathan Green (02:28)
- “Trust is the residue of promises kept.” —Richard Hawkes (07:41)
- “It always jumps to giving you a conclusion because it always assumes you’re not going to ask another question.” —Jonathan Green (09:58)
- “The knowledge that impacts the world is the knowledge that results in shared agreements between us. … It’s creating shared language.” —Richard Hawkes (19:17)
- “Whatever is easy or hard, I’ll deal with that. … There’s almost a perfect inverse correlation. The easier they think it is, the harder it actually is.” —Jonathan Green (25:50)
- “Consciousness is a controlled hallucination.” —Richard Hawkes (26:45)
- “An organization can never perform … at a higher level than the most level of leadership … the top leader was unwilling to learn a new role, couldn’t get it, inflexible … so they lost all the value they thought they were acquiring.” —Richard Hawkes (30:37)
Timestamps for Key Segments
- What AI Has Changed in Consulting: 01:07–02:28
- Departmental Tensions & AI’s Role: 03:56–06:54
- Trust and AI: Risks and Best Practices: 06:54–09:58
- The Problem with Solution-First AI Tools: 12:10–15:03
- Why AI “Hallucinates”: 15:03–17:01
- Data Hoarding & Shared Language: 17:01–25:50
- Diagnosing When Change Is Needed: 26:45–30:35
- Leadership Limits and Organizational Change: 30:35–31:39
- Closing & Where to Find Richard: 31:56–33:22
Further Resources
- Richard’s Website: growthriver.com (US)
- German Site: unternhungsbarate.de
- Book: Navigate the Swirl (Published by Wiley)
- Big Change Canvas project: (Interested consulting firms can reach out via Growth River)
This episode offers a nuanced perspective on AI’s real impact: not about tools replacing people, but shifting the focus towards high-value human conversations, deeper alignment, and change at the systems level. It’s essential listening for leaders, consultants, and anyone navigating digital transformation.
