Podcast Summary: The Ethics of the Cognitive Sciences – Privacy and Respect for Persons
LSE: Public lectures and events | LSE Film and Audio Team | November 13, 2013
Episode Overview
This public panel, organized by the Forum for European Philosophy at LSE, brings together leading philosophers and ethicists to explore the ethical challenges posed by cognitive neuroscience and neurotechnology, focusing on privacy and respect for persons. Speakers Sarah Edwards, Sarah Richmond, and Roger Brownsword examine the limits of current neuroimaging technologies, the conceptual importance of privacy and respect, and the implications for law, regulation, and society, including the dangers of overhyping technological risks, the challenges of consent, and surveillance.
Key Discussion Points and Insights
1. Technological Capabilities and Privacy Concerns
- Sarah Edwards introduces how diverse methods in cognitive science, including high-resolution brain imaging, are designed to uncover neural correlates of thought and intention (01:32).
- Brain as the "inner sanctum" – culturally and philosophically treated as the seat of individuality and creativity.
- Technological limitations: current methods (PET, EEG, fMRI) are variously invasive, costly, or limited in resolution (04:38).
- Expanding availability: Brain scanners are shifting from medical to consumer devices (e.g., EEG-based apps).
- Case study: U.S. legal case (Kyllo) shows how emerging technology can challenge and redefine legal standards for privacy (06:20).
- Challenges with Informed Consent:
- Consent does not always equal meaningful protection—subjects may not understand or foresee the implications and non-therapeutic uses of data (10:00).
- Incidental findings in imaging raise issues of unwanted, un-erasable knowledge and third-party access (10:45).
2. Respect for Persons and the Nature of Privacy
- Sarah Richmond distinguishes respect from the consequence-based reasoning that usually dominates discussion of privacy breaches (12:00).
- Respect concerns which attitudes others are entitled to hold toward us, not merely what they find out: "It's not, I don't want you to find this out, but I don't want you to have that attitude towards me." (15:36)
- Illustrations:
- Eliza Doolittle (Pygmalion): Fundamental discomfort even when one has "done nothing wrong" signals unmet respect, not just fear of harm.
- Ornithologist analogy: Animals don’t care about being observed, but humans do, reinforcing the unique value of respect for personal mental space.
- Thought experiment:
- Transparency of minds would radically change social life, in ways hard to imagine (17:45).
- Reference to Judith Jarvis Thomson's "people seeds"—demonstrates that radical social changes subvert easy analogies or predictions.
- Privacy and shame:
- Loss of privacy might have unexpected upsides: reducing unnecessary shame and fostering empathy (20:51).
3. Law, Regulation, Human Dignity, and Social Control
- Roger Brownsword frames cognitive neuroscience within broader debates on regulating emerging technologies (21:50).
- Those debates are driven by tension between technophiles, who fear that overregulation will stifle progress, and those more cautious about risks.
- Three key social values/concerns:
- Risk – Danger to public health, safety, or environment.
- Human Rights – Compatibility with human-rights commitments (privacy here is typically framed in the language of rights).
- Human Dignity – Especially prominent in European legal frameworks after the Treaty of Lisbon.
- Privacy Benchmarks (41:23):
- Privacy as property: Is access inherently wrong, beyond consequences?
- Reasonable expectations: What’s considered private changes with social and technological norms (52:00).
- Consent: Is consent too casual? Should it require explicit, opt-in signaling?
- Ranking of privacy: How does it stand against property or freedom of expression?
- Regulatory dangers:
- Surveillance and technological social control risk undermining moral agency: “You can only do the right thing for the right reason if you have some freedom to do the thing you’re doing and you have options available to you.” (39:32)
- Example: Supermarket trolleys with GPS; conduct managed without genuine moral engagement.
- The worry is a society where rule-following is engineered, not chosen: “No virtue in doing the right thing if you can’t do the wrong thing.” (41:04)
4. Audience Questions and Panel Discussion
A. Can We Learn from Other Regulatory Successes? (45:34)
- Example: IVF regulation in the UK is credited as a success: broad debate, expert panels, and careful, transparent licensing.
- Challenges:
- Global variation undermines strict national controls.
- Privacy threats are more diffuse and far harder to regulate than technologies like assisted conception.
B. Proportionality and Hype: Is Neurotechnology Special?
- Should brain scans be regulated differently from other ways of learning about people (coffee preferences, subtle behavioral cues)? Are we overestimating cognitive science's power? (49:33, 52:26)
C. The Collingridge Dilemma
- Early on, it's unclear what technologies’ risks or benefits will be. By the time risks are understood, it's often too late to regulate effectively (54:42).
D. Nature and Limits of Privacy—and Analogies to Property
- Is privacy truly analogous to property, allowing one to say "this is mine, back off" regardless of the consequences? (55:21)
E. Psychiatric Cases and Technological Change
- Example: Schizophrenia patients experiencing “idea broadcasting”—if technology advances, such beliefs might not seem delusional (57:31).
- Raises the prospect that new technologies fundamentally shift the boundaries of normality and pathology (60:09).
F. Profiling, Determinism, and Ethics (62:10)
- Concerns about linking brain/genetic data to deterministic predictions (e.g., insurance premiums, criminal risk), and the philosophical ambiguity of such predictions.
- Panel: Most actual science deals with probabilities, not certainties; risk assessments are not strictly deterministic (65:32).
G. Social Networks and Voluntary Surrender of Privacy (68:02)
- Why do people so willingly relinquish information? Partly due to lack of foresight, but also for inherent psychological rewards (social recognition, validation) (72:06).
- Quote: “Expressing information about yourself in a sort of publishing way…is inherently rewarding and people get satisfaction out of knowing that everybody knows something about them.” – Host (72:43)
H. Biometrics, Biocryptography, and Evolving Privacy Threats (75:09)
- New biometric methods (e.g., brain patterns as passwords) provoke questions: Should we treat "brain signatures" as fundamentally different from fingerprints?
- Integrated systems (mobile + biometrics + predictive analytics) could enable fine-grained surveillance (“digital dog tag”) (82:38).
- Right to erasure and new data protection frameworks will be tested by such advances.
I. The Erosion of Privacy and Mental Health (85:01)
- Does persistent exposure and loss of privacy damage well-being? Panel agrees privacy is vital for experimentation, non-conformity, and psychological comfort.
- “In a free society you need some privacy. Otherwise, you’re constantly…just worried about having to stay on the norm.” – Brownsword (85:19)
J. Public Engagement and Democratic Oversight (90:01)
- Principle: Deliberative democracy should drive tech regulation.
- Practice: True engagement is difficult. Public education and authentic participation lag behind technological change, and sometimes “public engagement” is performative (90:01–92:19).
K. Children and Privacy (92:46)
- Research on children is vexing for consent; reliance on "best interests" is problematic as it’s vague and susceptible to abuse (92:55).
Notable Quotes & Memorable Moments
"We often think of the brain as being the seat of our souls, the source of all value, of our creativity and of our mental freedom. Is it our inner sanctum that is out of bounds for science?"
– Sarah Edwards (02:12)
"It's not, I don't want you to find this out, but I don't want you to have that attitude towards me. That's not an attitude you're entitled to have, at least not without my consent."
– Sarah Richmond (15:36)
"No virtue in doing the right thing if you can’t do the wrong thing."
– Roger Brownsword (41:04)
“Expressing information about yourself…is inherently rewarding and people get satisfaction out of knowing that everybody knows something about them.”
– Host (72:43)
Timestamps for Key Segments
- Introduction and Format: 00:00–01:32
- Sarah Edwards: Technology and Privacy: 01:32–11:11
- Sarah Richmond: Respect and Privacy Beyond Consequences: 11:17–21:16
- Roger Brownsword: Law, Regulation, and the Risks of Social Control: 21:50–43:27
- Panel Q&A: Regulatory Models, Overhyped Technology, Reasonable Expectations: 43:27–55:08
- Audience Q&A: Determinism, Voluntary Exposure, Biocryptography, Social Media, Children's Rights: 61:27–93:53
Conclusion and Takeaways
The panel underscores that cognitive neuroscience, like other emergent technologies, raises especially acute privacy questions because the brain is seen as the seat of personhood and agency. Although neurotechnologies are not yet capable of true “mind-reading”, policy needs are shaped as much by hype, expanding consumer access, and social and cultural attitudes as by technical capability. Law, ethics, and regulation must grapple with the uniqueness (or otherwise) of brain data, the problem of meaningful consent, the risks of over-surveillance, and the deep psychological and social need for privacy and respect.
Ongoing public engagement, realistic appraisal of both risks and likely benefits, and a focus on respect for persons—not just instrumental outcomes—are vital for the future.
