A (57:39)
Back from the break, and I just wanted to mention there's some stuff going on in the self-driving car world. NHTSA is continuing to monitor the situation, as it were. The push toward hands-free driving is frankly running up against some turbulence. A series of reports from the Wall Street Journal, written by Ryan Felton, Ben Glickman and Don Nico Forbes, paints a troubling picture of where automated driving technology stands right now. Internal documents from Ford reveal the company knew drivers were confused by its Blue Cruise system well before launching it. And Tesla's Full Self-Driving system is under an escalating federal investigation after crashes in poor-visibility conditions. And just today NHTSA expanded that Tesla probe to cover 3.2 million vehicles, upgrading it to an engineering analysis, a step that could lead to a recall. These stories together raise a fundamental question: are automakers moving faster than the technology, and the drivers using it, can safely handle? Let's start with the Blue Cruise problem, because when people hear about full self-driving, or self-driving at all, they immediately think of Tesla, right? Well, other cars are doing this too, and the sort of US darling that is Ford is also part of this. Ryan Felton's deep investigation, which was published earlier this year in February, reveals that Ford actually had extensive internal evidence that drivers misunderstood how hands-free driving systems work years before Blue Cruise launched. In June of 2018, Ford conducted a benchmarking study using GM's Super Cruise system: more than 40 people drove a Cadillac CT6 on a Michigan highway, and the results confirmed the company's own concern that drivers would be too complacent behind the wheel with the system active. The numbers from this study of 40 people, and you can ask what you can really say about a study of 40, were stark.
One in four drivers incorrectly believed that Super Cruise would reposition the car if it veered out of its lane. So again, they incorrectly believed that using Super Cruise would mean the car would be repositioned in the lane if it veered out of it. 80% of drivers failed to notice the initial flashing green light warning them to resume control. And then a follow-up study of Ford's own developing system in 2019 found that more than half of drivers, 60%, incorrectly believed the technology would steer the vehicle back into its lane if it drifted. So people don't know what self-driving can and can't do. "It's still not clear to the driver they are responsible for steering the vehicle back into the lane." That's what the presentation said. Then there were some fatal crashes leading to a federal inquiry. Ford launched Blue Cruise in 2021 despite knowing this was an issue, and the consequences have been severe. In September of that year, 2021, Barry Wooten was killed in Forsyth, Georgia, after losing control of his F-150. His family said he was in self-driving mode. His daughter Wendy Wells said her father was a lifelong car enthusiast and so, of course, wanted the latest and the greatest. In 2024, two more deadly incidents drew federal attention. In February, a Ford Mustang Mach-E using Blue Cruise struck a stopped Honda CR-V on a San Antonio highway, killing the Honda driver. Ford told NHTSA that warnings to take control of the wheel were ignored for 30 seconds before the collision. And then weeks later, another Mach-E, also using Blue Cruise, slammed into a stopped Hyundai Elantra in a Pennsylvania construction zone, pushing it into a Toyota Prius. Both drivers of those vehicles died at the scene. Arguably, this becomes a conversation not just about people understanding self-driving vehicles, but also about the deadliness of these gigantic trucks that we have on our highways.
But I don't think people want to hear about that, so I'm not going to get into it. Ford maintains that neither accident was caused by its system. The company says these incidents unfortunately illustrate that driving, whether by humans alone or with technology, requires adequate time to perceive, classify, confirm and react to events. So let's talk about what Ford told NHTSA. Ford says, hey, NHTSA, Blue Cruise's adaptive cruise control was designed to stop decelerating in response to stationary objects when traveling at or above 62 miles per hour. This is a feature that's supposed to avoid what's called phantom braking, where the system incorrectly reacts to bridges, overhead signs, or other objects that aren't actually in the roadway. Other safety features, like automatic emergency braking, are functional at higher speeds, but the design choice means the system may not slow down for a stopped vehicle in a travel lane at highway speeds. So to be clear: if the speed of the vehicle is at or above 62 miles per hour, Blue Cruise's adaptive cruise control does not decelerate upon detecting a stationary object in front of it. The idea is that too many times it was detecting a bridge or some other item in front of it that then caused it to slow down. NHTSA's administrator Jonathan Morrison said in a January conference, "We want to be crystal clear that the systems available in today's vehicles are driver assistance systems. You're driving." And I think that's important. That needs to continue to be the conversation. Now, Ford, of course, is dealing with the Blue Cruise scrutiny, but Tesla faces its own escalating federal investigation.
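Just to make that design choice concrete, here's a minimal sketch, in Python, of the threshold behavior Ford described to NHTSA. This is purely illustrative pseudologic, not Ford's actual implementation; the function name, inputs, and 62 mph constant are my own framing of what was reported.

```python
# Illustrative sketch ONLY, not Ford's code. It models the behavior
# Ford reportedly described to NHTSA: at or above 62 mph, Blue Cruise's
# adaptive cruise control was designed not to decelerate for stationary
# objects, to avoid "phantom braking" on bridges and overhead signs.

PHANTOM_BRAKE_CUTOFF_MPH = 62  # threshold reported to NHTSA

def should_decelerate(ego_speed_mph: float, object_is_stationary: bool) -> bool:
    """Return True if adaptive cruise would slow for a detected object."""
    if object_is_stationary and ego_speed_mph >= PHANTOM_BRAKE_CUTOFF_MPH:
        # At or above the cutoff, stationary detections are ignored by
        # adaptive cruise; separate systems like automatic emergency
        # braking may still intervene.
        return False
    return True

# A stopped car ahead at highway speed: adaptive cruise does not slow.
print(should_decelerate(70, True))   # False
# The same stopped car at lower speed: it does.
print(should_decelerate(55, True))   # True
```

The tradeoff the sketch makes visible is exactly the one in the reports: suppressing false positives above the cutoff also suppresses true positives, like a stopped vehicle in a travel lane.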
In October of 2024, Ben Glickman and, once again, Don Nico Forbes reported that NHTSA opened a preliminary evaluation into Tesla's Full Self-Driving system after identifying four crashes involving the technology, one in which a Tesla fatally struck a pedestrian. The crashes occurred in reduced-visibility conditions due to sun glare, fog, or airborne dust, and the agency started looking at whether the system could even respond in those scenarios, because those scenarios are going to come up a lot. That investigation covers an estimated 2.4 million vehicles: all Tesla models that can be equipped with the optional driver-assistance software, including certain versions of the Model 3, Y, X and S, as well as the Cybertruck. Now, that is one part of it. The probe came just a week after Tesla's robotaxi event, where the company outlined its plans to become a robotics and AI focused business. Elon Musk said Tesla would offer its driverless taxi service only where permitted, but gave no details on navigating regulatory hurdles. Now, as of today, March 19, 2026, that probe is expanding. The escalation of the investigation to an engineering analysis means that recalls could result, and the scope now covers 3.2 million vehicles. This has to do with several crashes, including a fatal one, where Full Self-Driving failed to alert drivers appropriately about reduced-visibility conditions. So it needed to say, "I can't tell what's going on due to these changes." That almost started to sound like Christopher Walken, but anyway: "I can't tell what's going on with the stuff that's happening outside, and so I need you to take control of the wheel." I think if Christopher Walken was telling me to do that, perhaps I'd pay more attention. Tesla, you should be listening to this. Anyway, if this is seen to be enough of a problem, then that does mean we could be looking at a huge recall.
And this has to do in part with Tesla's choice to go with these vision-based setups: no radar, no lidar. We still don't really understand Tesla's true thinking on switching away from radar and lidar and going only with vision-based systems, but yeah, not a good idea. Automakers are marketing increasingly sophisticated driver assistance systems despite the fact that internal research and, let's be real, real-world crashes show that drivers fundamentally misunderstand what these systems can and can't do. So federal regulators are having to respond, ramping up scrutiny. Unfortunately, automakers continue to push this technology forward. Ford is working toward eyes-off highway driving by 2028, and of course Tesla has claimed full self-driving for a while, but is also working toward fully driverless robotaxis that don't even have steering wheels or brake pedals. The gap between what's being promised and what's been proven continues to widen. So to those of you out there with vehicles like this: please be mindful, be aware, and don't think of these systems as something that can completely take over driving for you, because they simply are not that. Thank you all for tuning in to this week's episode of Tech News Weekly. The show publishes every Thursday at twit.tv/tnw. That is where you can go to subscribe, if you haven't already, to the audio and video versions of the show. We'd love to have you join us, and tell other people about the show as well. If you would like to get all of our shows ad-free, plus so much more, I mentioned Club TWiT: twit.tv/clubtwit. If you'd like to follow me online, I'm mikahsargent on many a social media network, or you can head to the now back-up-and-functioning Chihuahua Coffee. That's C-H-I-H-U-A-H-U-A dot coffee, where I've got links to the places I'm most active online. Be sure to check out my other shows that'll publish later today:
iOS Today and Hands-On Apple, as well as Hands-On Tech, which publishes every Sunday. Thanks so very much for being here. We'll catch you again soon on a future episode of Tech News Weekly. Bye-bye.