Transcript
Liberty Vittert (0:00)
Hello and welcome to the Harvard Data Science Review podcast. I'm Liberty Vittert, feature editor of the Harvard Data Science Review, joined by my co-host and editor-in-chief, Xiao-Li Meng. Once thought of as a gadget of futuristic fiction like The Jetsons, WHOOP is leading the charge in personal health from real-time data, revolutionizing how we eat, sleep, train, and recover. This month we're diving into that process with WHOOP's founder and CEO Will Ahmed, who will guide us through the entire data journey, from the moment it's captured on your wrist to the insightful metrics you see on your phone. Whether you're a health enthusiast, an aspiring entrepreneur, or just curious about how technology is reshaping our understanding of the human body, join us for an in-depth conversation on turning data into lasting positive change, potentially allowing us to live happier and much healthier lives.
Xiao-Li Meng (1:04)
Well, Will, thank you so much for joining us. You have built WHOOP into a multi-billion-dollar business by turning invisible data into actionable insights. Can you walk us through the journey of data collected by the WHOOP sensor, from collection to the metrics we see on the app?
Will Ahmed (1:22)
Yeah, absolutely. Well, thanks for having me. I founded WHOOP 12 years ago now, and our mission is to unlock human performance and health span. Ultimately, we want to help people live healthier and longer lives. We've developed the WHOOP sensor, which we're now on our fourth version of, and it collects the most data of any wearable on the human body.

If you think about the overall chain, or the overall path, of collecting this data, first we have raw data that comes from our hardware sensor. Let's take photoplethysmography, for example, which is the technique for capturing heart rate. We essentially have these LEDs that shine light underneath your skin, and the frequency with which those light waves return to a photodiode is the first input of data, so to speak. That information then gets processed and analyzed. Now, if you're completely still, the frequency with which light is returning to the photodiode can be very directly attributed to what your heart rate is. The challenge is, if you're moving a lot, the sensor may be moving and your blood underneath your skin might be moving, so all of a sudden the idea that there's a perfect relationship there changes. So we couple that data with raw accelerometry data, which is also measuring motion, and over the last decade we've built all these algorithms around understanding the relationship between motion and photoplethysmography, to ultimately develop a very accurate heart rate sensor.

That heart rate data, depending on how hard it is to process, may live on the strap and be processed on the wearable itself, or it might be processed on the phone, which is where we transfer the data to. Then we take the raw heart rate data and start interpreting it into other scores. I'm really just focused on one metric right now, but we do this for about 10 different metrics, and one example of a score might be our strain metric.
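[Editor's note: WHOOP's actual signal-processing algorithms are proprietary; the sketch below only illustrates the general idea described above, that a photodiode signal yields heart rate when the wrist is still, and that accelerometer data is used to handle motion. The function name, thresholds, and frequency band are illustrative assumptions, and a simple motion gate stands in for WHOOP's far more sophisticated motion-compensation algorithms.]

```python
import numpy as np

def estimate_heart_rate(ppg, accel, fs=25.0, motion_threshold=0.5):
    """Estimate heart rate (bpm) from one window of photodiode samples.

    ppg    : 1-D array of raw photoplethysmography samples
    accel  : 1-D array of accelerometer magnitudes, same length and rate
    fs     : sampling rate in Hz
    Returns a bpm estimate, or None when motion makes the window unreliable.
    """
    # When the wrist is moving a lot, the optical signal no longer maps
    # cleanly to heart rate; a real system would compensate, this toy rejects.
    if np.std(accel) > motion_threshold:
        return None

    # Remove the DC component, then find the dominant frequency in the
    # plausible heart-rate band (0.7-3.5 Hz, i.e. 42-210 bpm).
    centered = ppg - np.mean(ppg)
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 3.5)
    dominant = freqs[band][np.argmax(spectrum[band])]
    return dominant * 60.0
```

For a still wrist, a clean 1.2 Hz pulsatile component in the window comes back as 72 bpm; the same window paired with noisy accelerometer data is rejected instead of producing a bad estimate.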
So we look at the amount of time you spend in an elevated heart rate zone versus a lower heart rate zone, relative to your baseline. The more time you spend in an elevated heart rate zone, the more that contributes to a strain score, which shows how much exertion you've put on your body. Something like that strain score would then feed into another set of algorithms, logic, and coaching, which the large language model that sits inside the app would use to coach you. So, you know, the punchline for the end user, or member as we call them, is WHOOP recommending: you should do a 12.5 strain today, and it should be a 30-minute run at moderate intensity. But the actual method of capturing what happened during that run, or how we're talking to you, obviously starts all the way back at the raw sensor data level.
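[Editor's note: WHOOP's strain formula is proprietary. The toy function below only illustrates the structure described above, time spent at elevated heart rate accumulating into a bounded score, where higher zones count more. The weighting, the logarithmic compression, and the 0-21 scale cap are illustrative assumptions, not WHOOP's actual math.]

```python
import math

def strain_score(hr_samples, resting_hr, max_hr, sample_seconds=1.0):
    """Toy strain score from a stream of per-second heart rate samples.

    Each sample is weighted by how elevated it is within the wearer's
    heart rate reserve (resting -> max); squaring makes higher zones
    count disproportionately more. The accumulated load is then
    compressed logarithmically onto a 0-21 scale, so doubling the
    workout does not double the score.
    """
    reserve = max_hr - resting_hr
    load = 0.0
    for hr in hr_samples:
        intensity = max(0.0, (hr - resting_hr) / reserve)  # 0 at rest, 1 at max
        load += intensity ** 2 * sample_seconds
    return min(21.0, 21.0 * math.log1p(load / 60.0) / math.log1p(120.0))
```

With this shape, a 30-minute session at 160 bpm scores noticeably higher than the same duration at 100 bpm, and time spent at resting heart rate contributes nothing, which mirrors the zone-based accumulation described in the conversation.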
