NO SUCH THING Podcast
Episode: "TINY CAMERAS!?! We found out what those customer satisfaction buttons actually do"
Hosts: Manny, Noah, Devin
Guest: Scott Erickson (VP, US Sales & Global Channels, HappyOrNot)
Release Date: April 1, 2026
Episode Overview
In this episode, Manny, Noah, and Devin dig deep into the world of those ubiquitous customer satisfaction buttons you see at airports, stadiums, gas stations, and bathrooms. Inspired by a listener question, the hosts embark on a journey to discover the truth behind these devices: Do they actually matter? Who collects the data? And, most shockingly, was a camera watching you push a button?
Key Segments, Insights & Discussion Points
1. Listener Question & Initial Theories
[02:58-04:38]
- Listener Emma asks: "What is the purpose of those immediate reaction in-person reviews? Like when they ask you to hit a smiley or frowny face after you go to the bathroom?"
- The hosts wonder if anyone actually reads the button data. Manny jokes it feels like “pressing the button at a crosswalk” where it’s just psychological theater.
- Noah: “It’s like the button thing at a crosswalk...so many of them are not even connected.”
[04:45] - Devin: “Maybe I’m mad because I had to wait in line and now I get to do this. It makes me feel better, and it might stop me from going off online or something.”
2. What Are These Devices Really?
[06:18-09:13]
- HappyOrNot is revealed as the main company behind these feedback buttons. It is Finnish (Finland has been voted the world’s happiest country for 8 years running).
- The buttons are intentionally binary, with no neutral option.
- Scott Erickson: “We take the neutral out of it. You’re either really happy, you’re happy, you’re unhappy, or you’re really unhappy. We truly do force that person to tell us which side of the fence they’re on.” [08:55]
- Manny: “There’s no meh.” [08:54]
- Devices are most familiar as a stand with four emoji-style buttons, ranging from very happy to very unhappy – no screens, just simple physical buttons.
3. Is Anyone Actually Watching the Data?
[09:26-13:03]
- The hosts are surprised to learn the data is tracked in real time via a SIM card in every device.
- Scott Erickson: “It is indeed real time. All our devices have a SIM card...so they’re transmitting via cellular network.” [09:54]
- Data is shown live on an analytics dashboard for clients (like airports) and can trigger alerts if negative feedback spikes.
- Noah: “So the people who work at that airport could get an alert on their phone saying, oh, go check out the bathroom near Terminal A, you got a code red.” [12:43]
- Anti-tampering: The system is designed to ignore button mashing and prevent manual ‘gaming’ of results by staff.
4. The Shocking Reveal: Tiny Cameras Inside
[14:39-18:39]
- The biggest shocker: The devices now include cameras—raising immediate privacy concerns.
- Devin: (on hearing this) “Camera in a bathroom immediately raising some red flags.” [17:50]
- Scott Erickson: “There is a camera in the device...we are assessing via some AI technology to allow us to predict that person’s age and gender.” [17:54]
- The camera’s primary purposes: filtering out feedback from non-target demographics (e.g., ignoring votes from kids mashing buttons) and analyzing group trends (such as how older passengers react to tech changes in airports).
- Since 2023, most devices include demographics-capturing cameras, but buyers (like airports or stores) must opt in and comply with local disclosure laws.
- Devin: “To be fair, as far as the bathroom concern is, they’re always positioned on the entryway...but you can imagine…” [21:53]
5. Privacy, Vectorization & Bias
[22:08-27:42]
- The hosts dive into privacy and data protection:
- Noah: “How do you protect customer data? Most people don’t realize this is happening...” [22:15]
- Scott Erickson: “We’re not storing anything on the actual device nor in the cloud. We simply take what we call a vector analysis of that person’s face...no record of any kind is maintained.” [22:37]
- In simple terms, a mathematical “vector” of your face is produced in real time, never stored, and immediately discarded—no photos are saved or uploaded.
- Bias: Tech experts (Jason Keibler, Matty Belichick) explain that AI’s reliability for age/gender detection is “notoriously...not great”, often less accurate for women of color, and even “up to 95% accuracy” means roughly 1 in 20 readings is wrong (about 500 misreads per 10,000 passengers), which adds up fast in large crowds. [24:47-26:26]
- Disclosure is up to the clients’ local laws. Airports/stores “should” post a sign but may just rely on blanket security-camera notices.
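The “vectorize and discard” flow Erickson describes can be sketched in a few lines. This is a minimal illustration of the pattern, not HappyOrNot’s actual code; `embed` and `classify_demographics` are hypothetical stand-ins for the on-device models:

```python
from typing import Dict, List

def embed(frame: bytes) -> List[float]:
    # Hypothetical stand-in for the on-device model: a real system would run
    # a neural network mapping a face crop to a fixed-length feature vector.
    return [float(b) / 255.0 for b in frame[:8]]

def classify_demographics(vector: List[float]) -> Dict[str, str]:
    # Hypothetical: coarse labels are derived from the vector alone,
    # never from the raw image.
    return {"age_band": "25-34", "gender": "undetermined"}

def handle_button_press(frame: bytes) -> Dict[str, str]:
    vector = embed(frame)                  # computed in real time
    labels = classify_demographics(vector)
    del frame, vector                      # drop local references; nothing is
                                           # written to disk or sent upstream
    return labels                          # only aggregate-friendly labels leave
```

The design point the company emphasizes is that only the final labels leave this function; the image and the vector exist just long enough to produce them.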
6. Data Participation & Human Behavior
[29:24-31:27]
- Only about 10% of people (sometimes up to 30%) actually press the buttons.
- Scott Erickson: “We’re in the kind of the 10% neighborhood...in certain environments, we see 20-30% engagement.” [29:30]
- Engagement is much higher than with email surveys, which typically see response rates under 1%.
- Noah: “I would assume...people are going to be more likely to be upset...but my boy Scott said that’s not the case. People...like to share positive feedback.” [31:09]
- Scott Erickson: “You’d be surprised how heavily weighted the positive feedback actually is.” [31:27]
- Devin: “I could see at a store...cashier is just friendly, nice, whatever, easy. I could see smashing a positive for sure.” [31:50]
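The engagement gap is easier to feel with quick arithmetic. A sketch assuming a hypothetical venue with 5,000 daily visitors (the visitor count is invented; the rates are the ones cited in the episode):

```python
def daily_responses(visitors: int, response_rate: float) -> int:
    """Expected number of feedback responses at a given response rate."""
    return round(visitors * response_rate)

# Hypothetical 5,000 daily visitors; rates from the episode.
buttons = daily_responses(5_000, 0.10)  # ~10% press the buttons
email = daily_responses(5_000, 0.01)    # <1% answer email surveys
print(buttons, email)
```

Even at the low end of button engagement, that is an order of magnitude more data points per day than an email survey would collect.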
Notable Quotes & Moments
- “This is kind of game changing...that this is happening in real time.” – Manny [12:54]
- “A camera in a bathroom? Immediately raising some red flags.” – Devin [17:50]
- “We don’t store anything...simply a vector analysis...no photos stored or uploaded.” – Scott Erickson, HappyOrNot [22:37]
- “These models are notoriously not great at detecting age and gender—even in perfect lab settings.” – Paraphrasing Matty Belichick [24:47]
- “If I’m paying for this movie ticket and I have to see ads...that’s disgusting. You should be paying me to show that.” – Manny, Dev’s Hot Takes segment [39:51]
Memorable Moments
- Hosts' disbelief: The surprise and mounting unease as the camera news drops. [14:39, 17:47]
- Discussion of hacking, disclosure, legalities: The trio jokes about trying to subpoena HappyOrNot’s vectors after a “fake murder,” riffing on privacy and surveillance. [27:59-29:19]
- ‘Dev’s Hot Takes’: A comedic postscript where Devin rails against excessive movie trailers and in-theater commercials. [35:06-40:32]
Segment Timestamps
- Listener Question & Button Theories: [02:58–06:09]
- What the Devices Are: [06:18–09:13]
- Who Looks At the Data & Real-Time Monitoring: [09:26–13:12]
- Tampering & Button Fiddling: [14:05–14:39]
- Cameras in the Devices & Privacy: [14:45–18:39; repeated at post-break 17:41]
- AI Bias & Security: [23:36–27:34]
- How Many People Actually Press? [29:24–31:27]
- Positive vs. Negative Feedback: [31:09–31:50]
- Dev’s Hot Takes Segment: [35:06–40:32]
Final Takeaways
- The “happy or not” buttons are real, cellular-connected devices tracked in real time by organizations, not just psychological theater.
- Since 2023, many devices come with built-in cameras—ostensibly not to record you, but to feed AI models used for demographic filtering and analytics.
- No images or personally identifiable data are stored, according to the company, but experts have concerns about reliability and bias in the tech.
- Only a small percentage of people use these buttons, but the positive-to-negative feedback ratio surprises even the hosts.
- Privacy signage/disclosure is inconsistent and relies on local law compliance.
For more, including links to the cited New Yorker piece and to Manny’s book, check the show notes at www.nosuchthing.show.
This summary captures the episode's major revelations, technical explanations, host banter, and key moments, providing essential context for everyone curious or concerned about those little feedback boxes.
