A
This is an episode about Facebook. I know that might seem like an odd choice for a show about young people because most people my age avoid Facebook entirely. But if you want to understand why social media has had such a profound effect on my generation, you need to understand the story of Facebook.
B
I'm here in Palo Alto, California, chilling with Mark Zuckerberg of thefacebook.com.
A
It's 2005, only a few years after I was born, and a gangly 21-year-old Mark Zuckerberg is sitting down for an interview.
B
No, no, actually, I'm gonna mention the beer.
A
Hard at work, Zuckerberg's on a couch holding a red Solo cup, wearing basketball shorts and an old T-shirt. At one point in the interview, they cut to someone doing a keg stand.
B
For the uninitiated, or those without a computer, tell us, you know, simply what Facebook is and what it does.
C
So I think Facebook is an online directory for colleges. So if I want to look you up or get information about you, I just go to the Facebook and type in your name and it brings me up, like, hopefully all the information I'd care to know about you.
A
Zuckerberg had launched the Facebook a year earlier while he was a student at Harvard.
C
And within a couple of weeks, two thirds of the school had signed up. So at that point, my roommates were kind of like, you know, this is pretty cool. Like, I bet this would work at other schools.
A
It did. Within a few years, Facebook was on every college campus in America.
C
So, like, we'll go to parties on campus, like, we end up in someone's room. Like, there's a Facebook window open, and that's pretty cool, you know what I mean?
A
Back then, Zuckerberg was adamant Facebook was a platform exclusively for college students.
C
A lot of people are focused on taking over the world or doing the biggest thing, getting the most users. And there's a level of service that we provide when we're a college network that we wouldn't be able to provide if we went to other types of things. I mean, there doesn't necessarily have to be more.
A
Whether or not he meant it at the time, that's not the way things turned out. A year after this interview, Zuckerberg introduced Facebook to high schools and then made it available to everyone. And despite what a 21-year-old Mark Zuckerberg might have said, Facebook did, in a way, end up taking over the world.
D
How many people use it regularly?
C
It's 3.2 billion people use one of our services every day.
B
That's.
C
Yeah, no, it's wild.
B
More than a third of the planet.
A
Facebook would become one of the most profitable companies of all time. And in doing so, they wrote the blueprint for an entire industry to follow. So when Instagram and Snapchat and TikTok burst onto the scene, they adopted Facebook's business model. A model that relied on bringing young users to the platform and trying to capture their attention for as long as possible. I'm Ava Smithing from Paradigms and the Toronto Star. This is Left to Their Own Devices. Episode 3: Origin Story.
D
I got a phone call from somebody at Facebook who said his boss was facing an existential crisis.
A
In 2006, a year after that interview in Facebook's fratty headquarters, the company was at a crossroads. Bigger players like Yahoo and Microsoft were talking about making an offer to buy Facebook, and the future of the company was uncertain.
D
He needed to talk to somebody to get some help figuring out how to solve the problem. Would I be willing to meet Mark Zuckerberg?
A
The person Mark Zuckerberg wanted to talk to was Roger McNamee. McNamee had a reputation in Silicon Valley. Early in his career, he'd worked at a venture capital firm called Kleiner Perkins, where he'd been a part of some of the most pivotal events in tech history.
D
When I was at Kleiner Perkins, that's when Marc Andreessen brought in the idea for Netscape, and I was in the first meeting he had with them. I was at the first meeting that Jeff Bezos had with Kleiner Perkins.
A
McNamee is almost 30 years older than Zuckerberg, which meant he came up at a time when technologists were seen as free spirits, not oligarchs.
D
I was a true believer. Remember, for 50 years, tech was a positive force in society.
A
So with that in mind, McNamee agreed to meet with Zuckerberg.
D
Mark comes into my office. He sits down on a couch opposite me. We're probably three feet apart. And I said, Mark, here's the thing. I don't know why you're here, but I'm going to tell you why I took this meeting. And I took it because I'm afraid that if it hasn't already happened, either Microsoft or Yahoo is going to offer a billion dollars for Facebook and your board, your management team, your parents, everybody's going to tell you to take the money.
A
McNamee was right. Yahoo had already offered to buy Facebook, and everyone was telling Zuckerberg to take the money.
D
And I'm here to tell you that is all BS. You have created something that is really cool. You know, you've given privacy control to people. You've given authenticated identity. You've solved the core problems of social media. And I believe you're going to have a very successful company. And these guys do not have a right to sell this company out for money. It took me precisely that long to say that. What followed was astonishing. He looked at me like The Thinker. He's thinking about what I said. And this goes on first for a minute, then two minutes, three minutes, four minutes. I mean, it was the most extraordinary thing I've ever seen. It was so uncomfortable for me. I was literally carving holes in the upholstery of the chair I'm sitting in. I've never seen anybody do anything remotely like that before. He's being really weird. I mean, when somebody lays something profound on you, you don't make him wait four or five minutes before you say anything.
A
Finally, Zuckerberg does say something. He tells McNamee about the offer from Yahoo and says he's not interested in selling his company.
D
And that became the beginning of a very constructive relationship. We got along well. The product went from 0 to 9 million people in the first two years. That's a really good growth.
A
Getting to 9 million users in two years is profound growth, the kind of growth that can make you a lot of money. But how?
D
How do you sustain a business model in which users don't pay for your service?
A
Mark Zuckerberg tried to explain this to the U.S. Senate in 2018.
C
Senator, we run ads.
A
Social media's business model has always been predicated on advertising. The longer people stay on the site, the more ads they can show them, the more data they can collect from them, and the more money they can make off of them.
D
And then they start to look at how many minutes a day people are using it. And once they get into the 10, 20, 30, 40, 50 minute level, they are a really serious force in the media business.
A
But McNamee says that Facebook was hungry for more. So Zuckerberg started to experiment with different ways of keeping people on the platform.
D
The first thing they did was News Feed. And News Feed initially was met with a bunch of protests from the 9 million people who were there.
A
It might be hard to imagine, but there was a time when Facebook didn't have a centralized feed where you could scroll through everyone's posts. Instead, the site opened up to your profile page.
D
Once News Feed was there, that allowed for wildly higher engagement because it was much easier to see what your friends were doing. And engagement goes up and up and up. And then the like button allows you to react to what you see, and that's a huge deal. And then Friend Finder allows you to increase your network. And then when they start to give you notifications for photos and photo tagging and all that kind of stuff, all of those things increased engagement a lot. And that's all happening between 2006 and 2009, basically.
A
But as the company continued to grow, McNamee started to have concerns.
D
The thing with Facebook was that they weren't comfortable stopping with getting your attention. Their goal was to manipulate your attention. Their goal is to manipulate your behavior to drive you towards things that were economically valuable to them. You know, the thing that they figured out, the algorithms were all sitting there, optimized for attention. Well, the simplest way to grab most people's attention is to trigger fight or flight. You want to scare them or outrage them.
A
Facebook decided to optimize for fear and outrage. And that decision has had profound consequences. Research indicates that it's led to political polarization and radicalization. And those changes to the algorithm could also be why Instagram, which was purchased by Facebook in 2012, started showing me eating disorder content. It doesn't matter whether or not that content made me feel good. What mattered is that it kept me coming back.
D
The moral failing of Facebook is that they don't understand that manipulating people is a violation of those people's civil rights. That is, I would say, morally unconscionable.
A
Eventually, McNamee realized there was nothing he could say to Zuckerberg to get him to change course.
D
It became very clear to me that Mark's goal was to get the whole world onto Facebook. He said, look, when I get to a billion, then we're going for two and then three, until finally everybody in the world is on Facebook and I'm going, dude, that is not a good strategy.
A
Roger McNamee left Facebook in 2009 and later wrote a book called Zucked: Waking Up to the Facebook Catastrophe. But by then, it was too late. Facebook, which would go on to acquire WhatsApp and Instagram and rebrand under the name Meta, had gone all in on a business model that required them to hold their users' attention for as long as possible. And the ramifications of that decision would be felt by generations to come. We gave Meta the chance to respond to all of the claims made in this episode. They refused to comment on the record.
B
So at first, I had to convince people that these companies were using psychological tactics to get people to engage with their products and services. At first, people thought, oh, you know, Zuckerberg, this kid is just getting lucky, right?
A
As Facebook was exploding in popularity, a Stanford grad named Nir Eyal was taking notice.
B
So I was working at the intersection of gaming and advertising starting at around 2007. And so I was kind of at the right place at the right time to see the rise of these habit forming technologies like Facebook and Instagram and Snapchat. And so I saw this happening with my clients, my colleagues, that these companies, some of them were doing very, very well because they were incredibly good at getting customers hooked. And some of them kind of disappeared because they couldn't bring customers back.
A
Unlike Roger McNamee and many of the other experts we interviewed for this podcast, Eyal doesn't think getting people hooked on technology is necessarily a bad thing.
B
Are we gonna say stop making your devices so user friendly? No, we want them to be easy to use. That's not a problem, that's progress. I mean, let's be honest with ourselves. What are we doing here right now? Do you guys think this isn't psychological manipulation? Of course it is. Your program uses the exact same psychological hooks that the tech companies do. You're using variable rewards. You're preying on emotion. All media uses our psychological weaknesses to get us to pay attention. Now, is that a bad thing? No, but you're still using psychological manipulation.
A
To be fair, there may be some truth to that. But unlike podcasts and newspapers, social media companies have collected unprecedented amounts of your personal data, and they've figured out how to use that to turn psychological manipulation into an exact science. Eyal laid out those techniques in a book called Hooked: How to Build Habit-Forming Products.
B
So habit forming products all have this basic feature of a hook. A hook is a design experience to connect the user's problem with the maker's product with enough frequency to form a habit. And so we see these four basic parts to every hook: a trigger, an action, a reward, and then what's called an investment. And it's through successive cycles through these hooks that these products become so habit forming.
A
Trigger, action, reward, investment. Here's how that actually works.
B
What's an app that you find habit forming? Let's take Instagram. It all starts with what's called an external trigger. An external trigger is something in our outside environment that tells us what to do. So it's some kind of ping, ding, ring. Then you've got the action phase. So if the external trigger is a notification, the action is to open the app.
A
Once you've opened an app, there needs to be something that keeps you coming back. Usually this is what's called a variable reward, meaning that sometimes when you open the app there isn't much to see, and other times there's something really juicy like a picture of your ex boyfriend and his new girlfriend.
B
There's some kind of uncertainty, some kind of mystery, some kind of intermittent reinforcement that gets you scrolling. Then the investment is every time you use the app, you're giving the company something that makes the product better with use, whether that's data, content, followers, reputation. You're putting something into the product so that over time it gets better and better the more you use it. So that eventually, over time, the company doesn't even need to send you these external triggers at all. They don't need to send you these pings and dings. You're prompting yourself with what's called an internal trigger. People don't realize that only 10% of the time that you check your phone is it because of an external trigger, that ping, ding or ring. The other 90% of the time that you check your phone is because of what we call an internal trigger. What are internal triggers? Internal triggers are uncomfortable emotional states that you seek to escape: boredom, loneliness, fatigue, uncertainty, anxiety. That is 90% of the time that you get distracted. 90% of the time you check your phone, it's not because of a ping, ding or ring; it's because of a feeling. And that's when the habit really takes hold.
A
For Eyal, this is what hooks people and makes some apps habit forming. For others, it's what makes them addictive.
B
You're not addicted per se, you're just distracted. But of course, we don't like that terminology. We don't like to think that, oh, we're just distracted, that there's something we can do about the problem. Can't I just blame Mark Zuckerberg for this? Well, no, there's actually quite a bit you can do yourself.
A
Five years after he published Hooked, a blueprint for how tech companies could capture their users' attention, Eyal wrote Indistractable, a guide to resisting those very same technologies.
B
You think distraction is new? Plato was talking about this problem 2,500 years ago, before the Internet. People have always and will always get distracted unless you understand the root cause of the problem. The root cause of the problem is that you want to escape a feeling you don't know how to deal with in a healthy way. There is a bifurcation right now between people who allow their time and attention to be manipulated and controlled by others and people who stand up and say, no, I'm indistractable. I control my attention. I choose my life.
A
But for all his pro-tech, libertarian rhetoric, even Eyal acknowledges that there should be some guardrails.
B
Just to be clear for your episode, what I would very much hope you don't do is to say that I'm advocating using this on children. Okay. I do believe we need special regulation for children as a protected class, and I think we should even go beyond that and protect people who are pathologically addicted. I do believe that if you know someone is overusing your product to the extent that it might be harmful to them and there's a way to access that person, I do think you have an ethical imperative to try and help that person. If you don't, that's exploitation.
A
The problem is this kind of psychological manipulation is being used on kids. And that isn't just some oversight. It's fundamental to the business of social media. In 2019, a decade after Roger McNamee and Mark Zuckerberg parted ways, a product manager named Frances Haugen arrived at Facebook. What Haugen would see there would confirm McNamee's worst fears. Facebook's business model, which relied on keeping users glued to their screens, was having serious side effects, and kids were being hit the hardest.
E
So imagine you drive up upon this building, and it's a quarter of a mile long. It's largely one open floor plan.
A
But when she first joined the company, Haugen was cautiously optimistic.
E
This building was the largest open floor plan office in the world, right? 5,000 people sitting on one floor. And it's because Facebook views itself as inherently democratic, that everyone has an equal voice, that no one sits above anyone else. It's completely level, even if that is grossly impractical. So, for example, I used to walk for 15 minutes to go to half an hour meetings.
A
If the architecture was egalitarian, Haugen soon realized the company was not. And it quickly became clear who was calling the shots.
E
People in Silicon Valley are spoiled. They're used to being on teams where their opinion matters, where things are done collaboratively, that you're not just working for Team Mark, you know, Team Zuck.
A
Haugen had joined Facebook to work on something called civic misinformation. At the time, Facebook was being accused of influencing the 2016 US election and abetting genocide in Myanmar.
E
So there's not like, a lot of question out in the world on, like, is Facebook doing some bad things?
A
She was hired to try and fix things.
E
My team got formed to try to make it so it was less dangerous overall because there's lots of individual features in the product that make it riskier for misinformation.
A
But when Haugen came up with ideas to fix these problems, she was brushed off.
E
It wasn't that he tried to problem solve with me and say, how could we make these projects better? You know, these are serious issues. Like, Facebook has a responsibility to keep these people safe. He said, you shouldn't work on these projects. They're not measurable. And that was the moment where I was like, there's no way that Facebook can heal itself on its own.
A
Haugen decided the public needed to know what was happening inside Facebook, but she knew doing that would be hard and dangerous.
E
One of the things that I had to consider was I knew I was being watched. And so when I went to ask myself the question, how do you get information out? How do you make sure history has a record of what happened? I decided that the best way to do that was to take a burner phone. And I used that to go and take pictures of my computer screen.
A
Over the course of several weeks, Haugen took thousands of photos.
E
The thing I was most scared of was not being able to finish. I understood there was one chance to do this, that they were going to slam every door shut after I was done. And so there was this question of what were the questions that history deserved answers to.
A
The crazy thing was Facebook had these answers. They had asked users about the negative consequences of their products: compulsive use, body image issues, trouble sleeping. The problem was they were keeping the answers to those questions secret from the rest of us.
E
Facebook had asked hundreds of thousands of kids whether they felt they had a problem with compulsive use. So compulsive use is where you use your phone so much it harms your sleep, it harms your health, or your schoolwork or your relationships with others. And what they found was that rates of self reported problematic use peaked at 14, which really probably means the younger you are, the more problematic use problems you have.
A
The fact that 14 year olds were using Facebook and Instagram compulsively was troubling. But the real story might be even more concerning.
E
So I really want to put 14 in like air quotes, because you have to remember, as far back as 2022, one in three seven to nine year olds was on social media in the United States.
A
I got Instagram when I was 11. I think technically you need to be 13 to sign up, but neither Facebook nor Instagram makes you verify your age.
E
When Facebook looked at that data, they said, oh, this isn't so bad. You know, I think at that point it was something like 1 in 8, 1 in 7 expressed problematic use. So we're talking about 14% or something. At the same time, is that really the definition you would use? Right. Like if you say to someone, do you feel like you can put your phone down whenever you want to? That number would probably be much, much higher.
A
As Haugen dug through Facebook's internal documents, evidence that they may have been mistreating their young users started to pile up.
E
One of the most commonly requested interventions for changes to Instagram has been removing like counts from posts. These are sometimes called quantitative measures of popularity. There was a sizable population of kids who said, hey, I like having the like counts removed. It makes it less stressful to use this product.
A
But even if kids didn't want them, those like counts were valuable to Facebook for other reasons.
E
Advertisers really wanted the like counts. Influencers really wanted the like counts. And so it's one of these interesting questions around whose interests matter on these platforms.
A
There was research on notifications, too. The pings, dings and rings that Nir Eyal calls external triggers.
E
They had gone out and talked to kids and said, hey, how do you feel about notifications? We have notifications in the app because they make you come back. You know, if we want you to spend more minutes every month on the app, the best way to do that is find lots of compelling reasons to ping you and say, oh, what about this? What about this? What about this? Oh, did you see what your friend did? And it's to the point where they'll optimize it such that if they see that you are drifting away, they might hold back a particularly juicy notification just so you can see that your ex-boyfriend posted a new photo. They spent a huge amount of effort making these incredibly compelling. So they went and asked kids, when you receive these late at night, what happens to you? And the kids said, obviously, I can't sleep. They asked them what happens when we send them to you during the day. Kids said, obviously, it's distracting. It's hard for me to focus in school. It makes me feel stressed.
A
Facebook ran experiments that showed when they turned off notifications, kids slept better and they had an easier time paying attention in school. But again, what was good for kids wasn't necessarily good for Facebook.
E
You know, you see quotes from senior executives saying, we know that these experiments are positive, but the metric that Mark cares about this month is total minutes spent. And we can't push out something that's going to hit total minutes spent by 1 or 2%.
A
This is the crux of the problem. Facebook made its money by keeping people on the platform for as long as possible, and they designed products in service of that goal, regardless of the consequences.
E
The problem is if you don't have numbers that represent the downsides, the consequences, you will just keep cutting corners to get your metrics to go up. In a world where we don't poll people on, did this app impair your ability to sleep well at night? And in a world where you don't have to report that number, you will continue to see apps like Instagram just try to keep you up all night because there is no downside and their overall success metrics go up.
A
Before I let Frances go, there was one last thing I wanted to ask her, a question that was personal to me. What did these documents say about the impact Instagram was having on teenage girls?
E
One in eight teen girls said that when they felt bad about their bodies, Instagram made it worse.
A
This was something I'd always suspected when I was battling my eating disorder. But Haugen confirmed it. And the evidence she uncovered at Facebook helped me understand that the harmful things I was seeing on social media weren't always by choice. The algorithm was choosing them for me.
E
Anytime you trust a computer, you trust an AI to direct your attention, you end up getting what are sometimes called rabbit holes. Now, the AI says, what's the thing that you're most likely to engage with further? And it pulls you towards that concentration of content. So your subconscious can be drawn to something: it can be an eating disorder, it can be self harm, it can be a negative thought about yourself. And the AI will pick up on those vulnerabilities and show you more of it, because it's the content that you can't look away from. And when you ask Instagram about these things and you say, why do you show depressed kids more self harm content? Why do you show insecure girls eating disorder content? They say, we're not intentionally doing this. Our algorithms are agnostic. And what's so frustrating about having the word agnostic get applied to a system like this is once you are told your system has a problem, that system isn't agnostic anymore. Your lack of action is a choice.
A
In late 2024, Facebook and Instagram rolled out a feature that allowed you to reset your algorithms. That's a step in the right direction. But is it enough? A business model that relies on keeping users scrolling for as long as possible will inevitably push you back towards shocking, outrageous, or even dangerous content. And that's unlikely to change anytime soon, because that business model, the one that Facebook pioneered, has made social media companies a lot of money, like trillions of dollars. We've been living with the consequences of that business model for two decades now. That's two decades of doomscrolling, clickbait, and arguing in the comments, and I can't help but wonder: what is all this screen time doing to our brains? The red squiggly line is the increase and decrease in blood flow to a channel in our prefrontal cortex.
B
Kids had less well-developed white matter all over their brain, but particularly in areas that support language, literacy, and executive function.
E
And I was actually really surprised by the results because we found no significant findings. In other words, the time spent on social media was not related to either depression or anxiety.
A
That's next time on Left to Their Own Devices. Left to Their Own Devices is hosted and produced by me, Ava Smithing. The show is written and produced by Mitchell Stewart. This episode is mixed and sound designed by Reza Daya. Our story editor is Kathleen Goldhar. The executive producers for Paradigms are James Millward, Helen Hayes, Taylor Owen, and Mitchell Stewart. The executive producer for the Toronto Star is JP Fozo. If you want early access to upcoming episodes of Left to Their Own Devices, subscribe to the Toronto Star at thestar.com.
Episode 3: Big Tech's Origin Story
Host: Ava Smithing (Toronto Star)
Date: October 3, 2025
This episode dives deep into the origins of Facebook and the broader business model that would go on to define social media and shape the experiences—and vulnerabilities—of an entire generation. Host Ava Smithing, a survivor of social media’s darker impacts, retraces Facebook’s path from dorm room project to global behemoth. She underscores how the attention-driven business models pioneered by Facebook (now Meta) not only rewrote adolescence for millions but created consequences we’re still grappling with today. Through interviews with insiders, critics, and whistleblowers—including Roger McNamee, Nir Eyal, and Frances Haugen—the episode unpacks how these platforms were deliberately engineered to keep users, including children and teens, coming back, often at significant personal cost.
“A lot of people are focused on taking over the world or doing the biggest thing, getting the most users. … There doesn’t necessarily have to be more.”
– Mark Zuckerberg (01:55)
"It's 3.2 billion people use one of our services every day." – Mark Zuckerberg (02:31)
“Senator, we run ads.”
– Mark Zuckerberg, U.S. Senate hearing (07:02)
“You have created something that is really cool. … You’ve solved the core problems of social media. And I believe you’re going to have a very successful company.”
– Roger McNamee (05:19)
“Their goal was to manipulate your attention. Their goal is to manipulate your behavior to drive you towards things that were economically valuable to them. … The simplest way to grab most people’s attention is to trigger fight or flight. You want to scare them or outrage them.”
– Roger McNamee (08:33)
“A hook is a design experience to connect the user's problem with the maker's product with enough frequency to form a habit.”
– Nir Eyal (12:57)
“Are we gonna say stop making your devices so user friendly? No, we want them to be easy to use. … What are we doing here right now? Do you guys think this isn’t psychological manipulation? Of course it is.”
– Nir Eyal (12:03)
“Just to be clear for your episode, what I would very much hope you don’t do is to say that I’m advocating using this on children.”
– Nir Eyal (16:29)
Facebook’s Internal Knowledge (17:42–27:11):
“One of the most commonly requested interventions… was removing like counts… It makes it less stressful to use this product.”
– Frances Haugen (22:25)
“You see quotes from senior executives: ‘We know that these experiments are positive, but the metric that Mark cares about this month is total minutes spent. And we can’t push out something that’s going to hit total minutes spent by 1 or 2%.’”
– Frances Haugen (24:23)
Algorithmic Rabbit Holes (26:04–26:54):
“Anytime you trust a computer, you trust an AI to direct your attention. … Your lack of action is a choice.”
– Frances Haugen (26:04)
“One in eight teen girls said that when they felt bad about their bodies, Instagram made it worse.”
– Frances Haugen (25:43–25:49)
The episode is analytical, personal, and sometimes urgent, blending researched reporting with first-person stakes. Ava Smithing’s narrative anchors the episode emotionally while expert voices provide technical and strategic context. The tone is investigative but humane, refusing to let the listener forget who is most deeply affected: the young people raised by—and often victimized by—the machinery of Big Tech.
This summary captures the central themes, guest contributions, and pivotal revelations of the episode, providing essential orientation and detail for those who have not listened—or who want to recall its main arguments and evidence.