Loading summary
Sponsor/Ad Reader
Get in the game with the College branded Venmo debit card. Rep your team with every tap and earn up to 5% cash back with Venmo Stash, a new rewards program from Venmo. No monthly fee, no minimum balance, just school pride and spending power. Get in the game and sign up for the Venmo debit card at venmo.com/collegecard. The Venmo Mastercard is issued by The Bancorp Bank, N.A. Select schools available. Venmo Stash terms and exclusions apply; max $100 cash back per month.
Host 1
It's not just that your phone listens to you anymore. That part is old news. You talk about shoes and ads for shoes appear. Everyone already accepts that. But this is different. This is quieter and darker. People are saying their phones respond to thoughts. Not words, but thoughts. You think about something random, an object you've never searched, never typed, never said out loud. Then hours later, there it is. An ad, a video, a post. Like your mind left fingerprints somewhere it shouldn't have.
Host 2
So there's this whole social media trend that's moving beyond that idea, as that little video on TikTok and Facebook went on to explain. We've moved beyond the fact that our phones are listening to us, right? That the AI technology finds words said in the environment and hooks them up through the algorithm for marketing. Guys, I don't know if it's happened to you, but it's happened to me on various occasions. I remember once we were talking about a love seat or something. I was talking to my wife about that, and lo and behold, later that evening I started getting all these ads on my social media feed for sofas and love seats. I'm like, what is going on? It's very bizarre. So it's getting a little creepy. Even what is projected to be an innovation and progress in science and technology is starting to get a little creepy. Like a brand new publication that just came out in the Green Journal ahead of print. This is not yet officially out; it was accepted ahead of print on March 5, 2026. And in full disclosure, this has ties to our sister organization. You know, we're at Texas A&M. Our competitor slash sister organization across the state is the UT Health System. So through UT Austin, at Dell Medical School, there are physicians and scientists leading a charge in innovation, which UT is known for, and of course through its namesake, Dell, which changed the world of computers as well. They took the iPhone and programmed machine learning through an AI algorithm. And here it is. Guys, this is quickie episode number three. We're just going to tell this very fast. I'll get into a lot of the science and the technology specifics, but let you know what is, like, wow, both a little creepy and fascinating at the same time. These authors slash researchers slash innovators are working on an algorithm. And they're not the first ones.
Others have done this as well, where you can put the iPhone (follow me here) on a patient's gravid abdomen, and the iPhone will, quote unquote, listen. Okay, not necessarily for audio sounds alone, but for vibrations. Listen for fetal movement. What? So let me explain; I've got to explain a lot, and we're going to do that in a minute after the intro. But yeah, this is a thing. Let me read you the title, and I'm going to go into this after the intro and explain it, because, wow. Not only is this potentially going to be a thing, because some of these authors have submitted patents for this technology, and by the way, they're not the only ones; others are doing similar kinds of studies. The idea is to use machine learning through a smartphone, in this case specifically the iPhone, to try to detect fetal movement when a patient complains of decreased fetal movement. Now, remember that we've covered decreased fetal movement on this show in the past. Decreased fetal movement is not an indication for induction in and of itself. Let me say that again, because we've covered that in a past episode. Decreased fetal movement is not an indication for induction of labor in and of itself. It is, however, an absolute indication to evaluate, whether that's an NST alone, which is not as good as an NST with fluid, because if there's decreased fetal movement, you've got to make sure there's ample room in the uterine environment for the child to move. So doing a modified biophysical profile seems to be better, but you've got to do some kind of assessment. And if the NST and the maximum vertical pocket, as part of the modified biophysical profile, are both good, then you can buy more time. All right. Because decreased fetal movement alone is not an indication for induction. How many patients do we put on the monitor or send to L&D for decreased fetal movement? And that's a good thing. Patients should pay attention to the baby's movement.
And we've learned a lot about that from the AFFIRM trial, if you remember. The AFFIRM trial grouped patients into a standardized kick count assessment versus just global awareness of movement. And neither group was any better at prevention of adverse outcomes or stillbirth. The only thing the strict kick counts guaranteed was more intervention, which is why we don't really do that anymore. It's just general awareness of fetal movement. But when there is a perceived decrease in fetal movement, wouldn't it be nice if we could, say, download this XYZ app, put the phone on your abdomen, and let the phone tell you if the baby is moving? What? Sci-fi crazy. The future is here. Now, I don't want to give any false perceptions here. Number one, I have no financial ties to UT Dell, except for the fact that I went to UT Austin and my son graduated as a Longhorn as well. So I have no financial ties to this. I'm not trying to promote this app, which is still experimental, but I thought this was fascinating. And again, as a quickie episode, I just want to let you know what's ahead of print, because this is brand new: Smartphone Detection of Fetal Movements Using Artificial Intelligence. I'm going to get into that in a minute. So that's number one. Number two, right now this is not in any ACOG or SMFM guidance; nowhere does it say, use a wearable device to check fetal movement. Right now, we don't do that. Both of those professional societies' guidances say there's technology for this, but we're way, way not yet into clinical integration at a population level, because they need good clinical validation studies. However, right now, the best that we have for detection of fetal movement is maternal perception. All right? We've known that since the AFFIRM trial. And so that is a thing. Okay? However, if this thing actually gets validated and actually does work, it's kind of a game changer.
Because think about how reassuring that would be if a patient could just put her smartphone on the abdomen and go, oh, baby's moving just fine. What? So we're going to explain after the intro what this is actually doing and what it's quote unquote listening for, because it's not really listening to the sound of baby movement. It's looking for a combination of vibrations and tones that were mapped using real-time ultrasound, with an iPhone placed on the abdomen to record. I know this sounds freaky. It sounds kind of weird and kind of gimmicky, and I'm not sure if I'm digging it 100%, just to be honest, but I think the technology and what they're trying to do is absolutely fascinating. Okay, so it's not like the baby moves and the phone goes, oh, I heard that movement. Even though our bodies do make sound, hence stomach growling, that's borborygmi. I mean, our bodies do make sounds, but you're not going to be sitting next to your spouse and going, ooh, I really heard that kick. That's not what we're talking about here. So the fetal movement is not being heard by the iPhone; it's a combination of acoustics and vibration that the iPhone is picking up. Now, you can definitely hear movement on a Doppler, right? How many times have you put a handheld Doppler on a patient's gravid abdomen and you hear boom, boom, right? The movement as it kicks off the Doppler beam. But that's using Doppler ultrasound. The iPhone is not using Doppler. It literally is placed on the abdomen with the little sensor at the bottom, trying to pick up the waves, the vibration, as the baby moves, which was matched against an abdominal ultrasound in real time. I'm not going to get into all of it. Remember, this is just a quickie episode.
I just want to give you the highlights of what they found, and to let you know that right now there is zero electronic tech device for fetal movement detection that is validated in clinical trials. So this is still considered experimental. Even this publication in the Green Journal says so. Guys, this actually got into the Green Journal. They say we definitely need further clinical trials before this thing is FDA cleared and/or put into clinical practice. Okay, but did it have some kind of reassuring findings? Yeah, it did. It actually did what it's supposed to do. And I'm going to tell you that right after our intro. Again, we're going to talk about the creepiness and the both fascinating and intriguing aspects of what our iPhones are capable of doing. Not only can they potentially read our minds, according to social media, but they are definitely listening to us. That's for darn sure. That's a known thing. They're picking up words to feed into AI algorithms and marketing. That's not a myth or some kind of urban legend. It does do that to a degree. And so there are real privacy concerns there. But now we're saying maybe your iPhone can detect fetal movement, and even breathing movement and hiccups. Guys, I don't know if this is one year down the road, two years, or five years, but the fact that this is being looked at is indeed pretty darn fascinating. Now, to be clear, I don't want to get too much into this, but we're going to take a look after the intro. This wasn't all phones. This was specifically an iPhone, and a specific model of iPhone that I'm going to tell you about here in a minute. So I think I've set it up enough now that our iPhones are all listening to us, because we've said iPhone multiple times. Not a sponsor. Let's get out of the intro and I'll be right back with the both creepy and fascinating details.
We'll be right back. Podcast family, welcome to another quickie episode of the Clinical Pearls No Spin podcast.
Sponsor/Ad Reader
Introducing Shark Facial Pro Glow, a first-of-its-kind at-home facial. Pro-level extraction removes impurities, and dual-temperature technology boosts radiance and de-puffs with both InstaHeat and InstaChill technology. Treat yourself to a hydro-fueled glow in minutes. And because Shark Facial Pro Glow fits in the palm of your hand, you can get that luminous post-spa glow anywhere. Visit sharkbeauty.com to learn more. This episode is brought to you by Indeed. Stop waiting around for the perfect candidate. Instead, use Indeed Sponsored Jobs to find the right people with the right skills, fast. It's a simple way to make sure your listing is the first one candidates see. According to Indeed data, sponsored jobs have four times more applicants than non-sponsored jobs. So go build your dream team today with Indeed. Get a $75 sponsored job credit at Indeed.com/podcast. Terms and conditions apply.
Host 2
All right, podcast family, so this is a new publication that's not officially out yet. Again, it is ahead of print in the Green Journal. The title is Smartphone Detection of Fetal Movements Using Artificial Intelligence. And they do have their financial disclosure here: this is out of UT Dell, and some of the authors have a patent pending for some of this technology. Okay, now this isn't an RCT. This is a prospective study of a small number of patients. They used regular ultrasound on the abdomen to look for fetal movement. At the same time, they placed a smartphone on the abdomen to try to capture any kind of sound, then used an algorithm to filter out noise and borborygmi and other signals the phone picked up. They tried to see if anything correlated between the baby actually moving and what the iPhone picked up. Crazy, I know. Again, it sounds kind of wacky. It seemed to work. Now, this wasn't every smartphone. This was specifically the iPhone X. So does this work with an 11? Does it work with a 9? Does it work with a 16? Whatever. This was the iPhone X. And they placed that iPhone vertically on the patient's abdomen, and they have a picture of it, in a paramedian location at the mid-uterus, held in position with a custom apparatus. Okay, so you've got to figure all that out. So it's not like you just slap it on; there's a standardized way to do this. Now, quote, a simultaneous continuous real-time ultrasound was performed, with movement documented by the sonographer, as well as asking the patient, do you feel the baby move? And then they correlated all of those things together to see if that iPhone, with its machine learning and specific algorithm, could detect the movement, y'all. Again, it's fascinating and creepy and kind of goofy at the same time. Like, you're telling me the phone is going to be able to, quote unquote, listen to baby movement?
Yes, but it's not listening like you think. It's not audio alone. It's sensing for some kind of vibration. All right, now, there's a lot of science and engineering language in here. I'm not going to get into all of that, because as clinicians, that's not our mojo. But the short of it is, I've got to tell you the term, because I've got to be true to the publication. They were looking for acoustic features known as MFCCs. I know you think about those all the time. MFCCs, i.e., mel-frequency cepstral coefficients. Y'all, whatever. It's an AI approach using a combination of audio and vibration and some kind of algorithm to detect movement. All right, so MFCCs. Just go with it. And they looked for detection of breathing, gross movements, and even baby hiccups. Yeah, try to figure that out. The short of it, members, it's just a quickie episode: yeah, this actually correlated pretty darn well, and it actually picked up more movement than maternal perception did at the same time. So there seems to be something here, although we're definitely not ready for prime time. Also, this wasn't a vast number of patients; this was still a relatively small number. But because fetal death is linked to decreased fetal movement, guys, when maternal perception isn't enough, maybe down the road this will be something we can consider. But again, a very, very small number of patients. Let me give you the numbers. A total of 136. That's it. One-three-six. Thirty patients were followed longitudinally and 106 had only one visit. So it was both a longitudinal and a cross-sectional view at the same time. Then they compared these results using area under the curve. And I'm just going to give you the results very quickly. But yeah, I mean, it seemed to be okay. Quote, gross fetal movements was detected at an accuracy of 64%, whereas maternal perception of fetal movements yielded an accuracy of 18%.
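For the curious, MFCCs are a standard audio feature: a compact summary of a signal's spectral shape on the perceptual mel scale. The paper's actual pipeline isn't described here, but a minimal numpy-only sketch of MFCC extraction looks roughly like this (the sample rate, frame size, and filter counts below are illustrative placeholders, not the study's values):

```python
import numpy as np

def mel_filterbank(n_filters, n_fft, sr):
    # Triangular filters spaced evenly on the mel scale, mapped to FFT bins
    def hz_to_mel(f): return 2595.0 * np.log10(1.0 + f / 700.0)
    def mel_to_hz(m): return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def mfcc(signal, sr=4000, frame_len=256, hop=128, n_filters=20, n_coeffs=13):
    # 1) Frame the signal and apply a Hann window
    frames = [signal[i:i + frame_len] * np.hanning(frame_len)
              for i in range(0, len(signal) - frame_len + 1, hop)]
    # 2) Power spectrum of each frame
    power = np.abs(np.fft.rfft(frames, n=frame_len)) ** 2
    # 3) Log of mel filterbank energies (small epsilon avoids log(0))
    energies = np.log(power @ mel_filterbank(n_filters, frame_len, sr).T + 1e-10)
    # 4) DCT-II decorrelates the log energies; keep the first n_coeffs
    n = energies.shape[1]
    dct = np.cos(np.pi / n * (np.arange(n) + 0.5)[None, :]
                 * np.arange(n_coeffs)[:, None])
    return energies @ dct.T  # shape: (n_frames, n_coeffs)
```

A classifier would then be trained on these per-frame coefficient vectors, labeled against the simultaneous ultrasound, rather than on raw audio.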
Now, remember, we're talking about using sonography as the gold standard. So you're looking at the baby with an ultrasound. You see the baby move and you go, mom, did you feel the baby move? And they're like, no. That's part of the human body's adaptation to the environment, right? If we felt everything, we couldn't function. That's why we can wear clothes, because the body goes through accommodation and acclimation. That's why your ring doesn't bother you all the time, because your body adjusts to it. Same thing with fetal movement, right? So the ultrasound can see movement, and mom's like, no, I didn't feel that. So let me just tell you again: this detected fetal movement at 64%, compared to mom's perception of movement at 18%, using ultrasound as the gold standard. Okay, that's pretty good. How about fetal breathing? Yeah. For fetal breathing it was something like 93%, versus 3% for maternal perception, because I'm not sure how you would be able to tell if the baby's moving or if that's a fetal breathing episode. So that's kind of weird. But patients can definitely feel hiccups when the baby has them, right? That rhythmic kind of contraction, those are baby hiccups. And the accuracy for that, using this kind of audio-recording iPhone technology, was 73% compared to 32% for maternal perception. The numbers don't really matter. The point is, it's impressive. It did something, using this iPhone AI machine learning, compared against maternal perception as the clinical gold standard and the ultrasound as the visual gold standard. This phone thing did something. Their conclusion, quote: audio-based assessment of fetal movement using a smartphone can reliably detect gross fetal movements as well as fetal breathing and hiccups observed on ultrasound,
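Since the results were compared using area under the curve, here's a quick reminder of what AUC actually computes. A minimal sketch, using the rank-based (Mann-Whitney) equivalence; the scores in the usage note below are made-up illustrations, not study data:

```python
def roc_auc(pos_scores, neg_scores):
    """AUC as the probability that a randomly chosen positive case
    scores higher than a randomly chosen negative case (ties count half)."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))
```

For example, `roc_auc([0.9, 0.8], [0.1, 0.2])` returns `1.0` (perfect separation of movement epochs from quiet epochs), while overlapping score distributions pull the value down toward the chance level of `0.5`.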
and it proved superior to maternal perception of movement, end quote. So this Green Journal article, guys, this is in the Green Journal, ACOG's premier journal, with a prospective study of 136 patients out of UT Dell. Welcome, Horns. Wow. How creepy is that? So, good water cooler discussion. Next time you're at the water cooler in the office, go: did you hear that the iPhone can possibly not only spy on us and listen to what we're saying, but one day may be able to, quote unquote, hear baby movement by putting the phone on your abdomen and using machine learning? Fascinating and creepy at the same time. Podcast family, this is quickie episode number three: using the iPhone to detect fetal movement. Wow. We're not ready for prime time. Don't go trying to download this from the Apple or your Samsung or Android store. It's not there yet, and it may or may not be coming soon. Podcast family, as always, we're thankful for you. We're glad you're part of our podcast community. And now that we've done all that, go grab your iPhone, go grab your Samsung, go grab your Android, and let's take it home. Podcast family, welcome to another quickie episode of the Clinical Pearls No Spin podcast.
Episode: Quickie #3: The iPhone AI Fetal Movement Detector?
Date: March 19, 2026
Host: Dr. Chapa
Podcast Audience: Medical students, residents, and healthcare providers interested in women’s health
This "Quickie" episode dives into a brand new, yet-to-be-published study proposing the use of iPhones, equipped with artificial intelligence algorithms, to detect fetal movement by being placed on a pregnant patient's abdomen. Dr. Chapa explains the research coming out of UT Dell Medical School, unpacks its clinical context, discusses the science and limitations behind the technology, and shares both the fascinating and slightly “creepy” aspects of our devices’ ever-growing capabilities in healthcare.
[00:38] - [01:06]
Dr. Chapa opens with the notion that while we're all used to our phones "listening" and marketing to us based on spoken words, social media rumors now allege that devices are even responding to thoughts, not just speech — setting the stage for even more advanced sensing, like fetal movement detection.
"It's not that just your phone listens to you anymore. That part is old news... This is quieter and darker. People are saying their phones respond to thoughts. Not words, but thoughts."
— Host 1, [00:38]
[01:06] - [05:00]
Dr. Chapa describes a newly-accepted (ahead of print) publication in the Green Journal from UT Dell, where researchers programmed machine learning algorithms on iPhones to “listen” for both audio and vibration data from a gravid abdomen to detect fetal movement. Patents for the technology are pending.
He emphasizes:
"Wouldn't it be nice if we could, say, download this XYZ app, put the phone on your abdomen and let the phone tell you if the baby is moving? What? Sci fi crazy. The future is here."
— Dr. Chapa, [04:44]
[12:27] - [16:15]
"It's some kind of AI thing using both a combination of audio and vibration... They look for detection of breathing, gross movements, and even baby hiccups. Yeah. Try to figure that out."
— Dr. Chapa, [13:36]
[15:00] - [16:49]
"Gross fetal movements was detected at an accuracy of 64%, whereas maternal perception... yielded an accuracy of 18%... The numbers don't really matter. The point is... this phone thing kind of did something."
— Dr. Chapa, [15:50]
"Audio based assessment of fetal movement using a smartphone can reliably detect gross fetal movements as well as fetal breathing and hiccups observed on ultrasound, and it proved superior to maternal perception of movement."
— Dr. Chapa, quoting the article, [16:30]
[16:49] - [17:38]
Dr. Chapa notes:
"We're not ready for primetime. Don't go try to download this on the Apple or your Samsung or Android store. It's not there yet and it may be or it may not be coming soon."
— Dr. Chapa, [17:26]
"Fascinating and creepy at the same time."
— Dr. Chapa, [16:54]
If you haven’t listened, this episode delivers a concise, spirited rundown of a genuinely novel clinical tech idea, what it could mean, and why OB care is still much more than an app away from incorporating such tools.