Podcast Summary: Human Leadership for Humane Technology
New Books Network – Liminal Library
Host: Nicholas McKay
Guest: Cornelia C. Walther
Date: September 9, 2025
Episode Overview
In this episode of Liminal Library on the New Books Network, host Nicholas McKay interviews author and humanitarian Cornelia C. Walther. The conversation explores Walther's journey from humanitarian work with UNICEF to academia and authorship. They delve into her recent books on technology, social change, human leadership, and the concept of "prosocial AI." The episode provides insights into how technology both connects and divides, how values influence the design and use of technology, and the critical role of human leadership in shaping humane technology.
Key Discussion Points & Insights
Cornelia C. Walther's Journey: From Humanitarian Work to Academia
- Background (02:06): Walther describes her 20 years with UNICEF working globally in humanitarian operations, later moving into academia and writing to explore how individual transformation fuels social change.
- On the Shift:
- “I wouldn't really consider it as a switch... It seemed like an organic continuum where one piece is just building onto the others.” (06:05)
- Motivation for Writing:
- “I don't like writing... every single time when I'm once again in the middle of a book, I'm thinking, and why did I take this on?... It helps me to discipline my own thinking...” (07:58)
Reflections on Human Interconnectedness and Technology
COVID-19 and Exposing Digital Divides
- Main Idea:
- Technology made new forms of connection possible during the COVID pandemic, but also revealed and deepened inequalities.
- “Everything that can become a door can also be a wall in the moment where you don't have the necessary tool to access it.” (11:08–13:43)
- On Digital Exclusion:
- “The Internet can be an amazing tool… but… in the moment where the person either does not have a computer or… doesn't have Internet… that door might just as well be a glass wall.” (11:08–13:43)
The Posy Paradigm: Framework for Understanding Human Experience
- Explanation: Posy means peace in Haitian Creole. Walther conceptualizes it around four human dimensions: soul (aspirations), heart (emotions), mind (thoughts), and body (sensations/behavior).
- “The quest for meaning is not something you can delegate to a ChatGPT, but… only you yourself can give.” (16:00)
- Systems View: Individual and collective transformation requires attention to these interwoven dimensions.
- “We need to not only look at each of these dimensions, but start to recognize and systematically influence their interplays.” (09:57)
Technology: Not Neutral – Values and Intent Matter
- Reflects Designer and User Mindsets:
- “We can't expect the technology of tomorrow to be better than the humans of today… garbage in, garbage out versus values in, values out.” (23:13)
- On Aspirational Algorithms and Prosocial AI:
- “If we want good outcomes for humanity… we need to build that intent into the algorithm and into the design.” (23:13)
- Call for Broader Stakeholder Involvement:
- “We cannot leave it in the hands of the private sector alone.” (23:13)
Amplified (Hybrid) Humanity: Human and AI Collaboration
- Beyond Cyborgs:
- “To make the best out of our artificial tools, we need to have a holistic understanding of our natural tools.” (25:25)
- Proposes "amplified humanity" via hybrid intelligence—complementing artificial and natural intelligence.
Technology’s Diverse Global Paths
- Anecdotes of Local Innovation:
- Story of Turkish market vendor using phone translation:
- “For him that was like, duh, obviously.” (28:32)
- Emergence of Localized AI:
- Discusses Malaysia’s approach with the ILMU large language model as an alternative to Western tech dominance:
- “...there is an alternative to mainstream and that can potentially also be a way to go beyond efficiency and effectiveness and lead with values, as the Prime Minister here has said very clearly.” (28:43–30:44)
Values in Technology: Micro, Meso, Macro, and Meta Levels
- Multi-level Values Model:
- Micro (individual), Meso (community), Macro (country), Meta (planet).
- Proposes the “four As”: Awareness, Appreciation, Acceptance, Accountability (37:44)
- Universal Value Starting Point:
- The Golden Rule and Platinum Rule are proposed as universal ethical foundations.
- “Do to others what you want to have done to yourself... consider that what the other person needs is not really what you need...” (37:44–40:10)
- Foundational Values:
- “Curiosity, compassion, creativity and courage… are very helpful in our interplay with technology.” (41:12)
Critique of Economic-First AI and the Role of Moral Courage
- On the Dangers of Agency Decay:
- “We’re currently navigating a very dangerous transition... to full blown addiction... That’s what I call the scale of agency decay.” (42:17)
- Need for Moral Courage:
- “Moral courage sounds so grandiose, but it’s also moral courage to look ourselves into the face and to pull out the white elephants that we have shoved underneath the table because they are uncomfortable to look at.” (43:30)
Looking Forward: Hopeful Alternative Futures
- Two Scenarios:
- Emergence of countries pioneering prosocial AI for the common good.
- Awakening to the magic and sufficiency of human life, perhaps even stepping back from technology.
- “If we inject enough intent, [these] might come to fruition... Garbage in, garbage out. Values in, values out.” (44:23–46:22)
Notable Quotes & Memorable Moments
- On Luck and Human Frailty:
- “Being at the right time at the right place is sometimes the only thing that matters... five minutes that you might spend... waiting for a bus... makes such a big difference.” (03:38)
- On Writing:
- “I hate it. And every single time... in the middle of a book, I'm thinking, and why did I take this on?” (07:58)
- On Tech as Both Door and Wall:
- "Everything that can become a door can also be a wall in the moment where you don't have the necessary tool to access it." (11:08)
- On Intent and AI:
- “We can’t expect the technology of tomorrow to be better than the humans of today… We have a choice and we need to make it.” (23:13)
- On Universal Ethical Principles:
- "The golden rule... and maybe the platinum rule... consider that what the other person needs is not really what you need..." (37:44–40:10)
- On Agency Decay:
- “...a scale of agency decay... We might not realize what stage we're in until it is too late.” (42:17)
- On Alternative AI Futures:
- “I hope... there will be one or maybe more countries that will emerge as pioneers that actually embrace prosocial AI and show that an alternative path... is possible...” (44:23)
Timestamps for Major Segments
- [02:06] – Cornelia C. Walther’s background and shift to academia
- [03:38] – Life and death in the humanitarian field: “five minutes” story
- [06:05] – Writing motivations and process
- [09:57] – The Posy paradigm explained (four human dimensions)
- [11:08–13:43] – COVID-19, technology as connector and divider
- [23:13] – Aspirational algorithms, prosocial AI, values in/values out
- [25:25] – Amplified humanity and hybrid intelligence
- [28:32] – Turkish market anecdote: local adaptation of technology
- [30:44] – Discussion on Malaysia’s alternative AI development model
- [37:44] – Four As for individual and collective values
- [42:17] – Moral courage, agency decay, individual responsibility
- [44:23–46:22] – Two positive scenarios for the future with technology
Tone and Language
The episode is insightful and reflective, in keeping with thoughtful academic and humanitarian inquiry. Walther blends practical humanitarian anecdotes with deep, often philosophical critiques of technology and society, consistently emphasizing humility, individual responsibility, and collective well-being.
Concluding Thoughts
Cornelia C. Walther’s perspective invites listeners to deeply consider the interplay between humanity and technology, emphasizing that human leadership and values must guide the development and deployment of technology. Her optimism for prosocial AI and value-driven innovation is anchored in a belief that change begins from within and radiates outward through intentional action and collective agency.
For more on Cornelia C. Walther’s work or to connect on prosocial AI, she encourages listeners to reach out.
