Scrolling 2 Death – [BONUS EPISODE] The Heat is On...Big Tech on Trial: Meta Whistleblower Brian Boland
Overview
This bonus episode of Scrolling 2 Death features an in-depth interview with Brian Boland, a former Meta (Facebook) executive-turned-whistleblower. Boland, who testified in the ongoing "social media addiction" trial just after Mark Zuckerberg, reveals the inner workings of Meta, its prioritization of profit over user safety, and its lack of meaningful accountability. Hosts Nikki Petrossi and Sarah Gardner probe Boland's experiences inside Meta and discuss the broader implications for parents and young people navigating social media.
Key Discussion Points & Insights
1. Testifying Against Meta and Emotional Context
- Boland’s Testimony: Brian describes the immense pressure and emotional stress of testifying against his former employer, especially knowing bereaved parents were in the courtroom. He emphasizes the importance of helping the jury understand complex issues, drawing on his advocacy work with families who have experienced loss.
"I think part of what you're trying to do or what I was trying to do is just be helpful to the jury...it felt important, felt stressful." (01:28)
- The Presence of Bereaved Parents: The hosts underline the emotional impact of seeing parents of harmed or deceased children in court, facing company leaders whose decisions they blame for their losses.
2. Profit Over People: Systemic Incentives at Meta
- Market Forces and Accountability: Boland illustrates how shareholder and market pressures direct Meta’s behavior, not user welfare.
"These companies are chasing what our global economic system and what our markets tell us to prioritize, which is profits and growth and shareholder return." (03:53)
- Congressional & Legal Accountability: He supports legislative and court action as meaningful avenues for forcing tech companies to responsibly address harms, noting the historical effectiveness of litigation in the U.S.
"If you look historically, we've got a great history in the US of litigation...That kind of opportunity feels like a real opportunity to create that accountability that changes how a company would operate." (06:20)
3. Decision-Making & “Open” Culture at Meta
- Contradiction Between Openness & Control: Boland disputes Zuckerberg’s testimony about openness and diversity of thought, explaining that ultimate power always rested with Zuckerberg. "It has always been a one person show. Everything about the company is oriented around Mark's power and decision making." (08:44)
- Supervoting Stock Structure: Boland discloses how Zuckerberg’s control is structurally unassailable, making him unaccountable to both the board and shareholders. "Mark has the majority of the voting shares. And so there's no accountability there." (08:44)
4. Harm, Research, and Responsibility
- Denial and Failure to Act: Boland outlines how Meta failed to fund or act on meaningful research into user harms, despite being fully able to.
"If he wanted to get real answers around these questions, he could get them. He could have absolutely avoided testifying in court...by just meaningfully investing and understanding if these products are harmful." (08:44)
- Law of Large Numbers and Safety: Drawing from other expert testimonies, the discussion turns to statistical responsibility – even small harm percentages mean thousands are affected given Meta’s scale.
"At some point, you really should be responsible for the people on the margins, the people who have the highest chance of getting hurt." (14:18)
5. Advertising, Revenue, and Minors
- Ad Revenue from Minors: Boland concurs that Meta’s stated numbers on ad revenue from minors are misleadingly low, given basic math and outside research.
"If you know the total revenue...and how many youth are on a platform...You come out to a guesstimate number. That would be pretty solid. It's definitely not in the millions, right?" (17:05)
- Age Estimation Technology: Meta’s age estimation tools are highly accurate, but the company does not fully use them to keep underage users off its platforms. "Your prediction is probably pretty spot on...I saw one leaked document...not wanting to be public about how accurate these prediction algorithms were because then the company might be forced to actually enforce them." (19:02)
6. Product Feature Manipulation & Targeting Teens
- Targeting Vulnerable Moments: Boland acknowledges the plausibility of allegations that Instagram might serve beauty ads to teen girls after they delete a selfie, seeing it as technically possible and revealing a deeper moral issue.
"Completely doable on the...platform. But surprising, like morally." (20:37)
- Vast Data Collection: Boland describes the depth of user tracking, extending beyond app activity to data from other apps and potentially devices, highlighting privacy concerns. "You can look at every click, every engagement...endless, endless ways that data can be gathered." (22:43)
7. The Power and Impact of Algorithms
- Unintended Outcomes: Boland recounts Meta’s emotional contagion study, which showed that the newsfeed could alter users’ emotions at scale—provoking a pullback in research rather than reforms.
"This study looked at if we showed people more happy content...could we change how they feel? It showed, like, yeah, like, we can..." (29:25)
- Algorithmic Goals: User engagement, specifically time spent and habit formation, is at the center of product strategy—even when company leaders claim otherwise.
"Their goal is probably around number of users and engagement and revenue...increasing the time spent is going to get you there." (34:42)
- Addiction & Compulsiveness: Boland contests Meta’s comparisons between social media and binge-watching TV, articulating how platform use is much more compulsive and pervasive.
"I do think these things are addictive. I think they're compulsive. And I think there are ways you could make them so they're not." (46:55)
8. Whistleblowing and Culture of Denial
- Consequences for Whistleblowers: Boland shares the personal and professional costs, including ostracization, loss of future job prospects in tech, and being dismissed internally for speaking out.
"You'd be surprised at the amount of back channel that I get from people who are former or still there who are like, 'you're saying what we all believe.'...but...you're putting things on the line...you're risking career paths." (38:46)
- Internal Gaslighting & Culture: Meta’s internal rhetoric rationalizes or minimizes harm and vilifies critics, including other whistleblowers like Frances Haugen and Sarah Wynn-Williams.
"There's so much gaslighting that goes on...the mental gymnastics that people will do is pretty strong..." (41:59)
9. The Responsibility of Parents vs. Platforms
- Limits of Parental Control: Boland pushes back on the idea that harm is simply a parental responsibility, emphasizing the overwhelming power and reach of these platforms.
"That's why the argument that parents should be responsible for all these things is just false to me." (47:42)
- Advice for Parents: Citing Amy Neville, Boland starkly describes the risks:
"When should your kids get a phone?...when you're ready for your kid to meet a stranger, to see nudity and to watch someone kill themselves is like a good time to let them on these platforms." (47:42)
Notable Quotes & Memorable Moments
- Boland on Mark Zuckerberg's power:
"Everything about the company is oriented around Mark's power and decision making." (08:44)
- On research and responsibility:
"The scale of money that Meta generates in profits and that Mark has personally...he could have absolutely avoided testifying in court...by just meaningfully investing and understanding if these products are harmful." (08:44)
- On time spent and business incentives:
"If you really didn't want people spending a lot of time there, you would just tell people, here's the max time we want people to spend..." (34:03)
- On addiction and comparisons to TV:
"I've never been, like, standing in line, like, 'I need to catch a minute of this Netflix show...'" (46:54)
- On whistleblowing courage:
"You're saying what we all believe...but you're putting things on the line and you're risking career paths." (38:46)
- On internal denial:
"There's so much gaslighting that goes on...these are rare occurrences, these are people who are looking to get paid." (41:59)
- Message to Kaylee, the young plaintiff:
"She's brave...her bravery has mattered a ton...I do know that people are listening. I do know that it's made an impact." (43:40)
- On parent advice:
"When should your kids get a phone?...when you're ready for your kid to meet a stranger, to see nudity and to watch someone kill themselves..." (47:42)
Important Segments & Timestamps
- Testifying and parent presence in court (00:52–03:27)
- Profit vs. people; market incentives (03:27–05:34)
- Litigation as accountability (06:20–07:28)
- Meta’s decision-making structure (08:06–11:30)
- Business incentives and engagement metrics (11:52–14:51)
- Challenges of age estimation and ad revenue from minors (16:55–19:39)
- Advertising manipulation and data collection (20:10–24:12)
- Loss of control & the power of algorithms (24:41–28:59)
- Emotional contagion study & algorithmic change (29:24–31:07)
- Addiction vs. habit and platform incentives (32:43–35:15)
- Attempts to sound the alarm internally, Zuckerberg's response (35:23–36:55)
- Effects on whistleblowers and internal denial (38:46–42:57)
- Parental responsibility and reality (47:42–48:37)
- Final advice to parents & society (48:37–49:18)
Tone & Language
This episode is earnest and unvarnished, with Boland speaking candidly about both his personal convictions and Meta’s corporate evasions. The conversation is empathetic toward parents and young users but unsparing in its criticism of corporate priorities and the structural lack of accountability inside Meta.
For Listeners Who Haven't Heard the Episode
This episode provides a rare, firsthand look at how Meta prioritizes profit and user engagement over user well-being, the structural mechanisms ensuring Zuckerberg’s unaccountable control, and the real emotional toll on both affected families and employees struggling with their roles in a system that incentivizes harm. Boland not only breaks down the technical and business details but also issues a call for legislative, legal, and cultural action while recognizing the complexity—and limits—of parental responsibility.
The episode will leave parents more informed—and more wary—about the platforms their children use, and offers validation for whistleblowers and advocates seeking change.