Hands free? Not safe for thee.

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.

VO: You are listening to There Auto Be A Law, the Center for Auto Safety Podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Anthony: Hey everybody.

Weather Updates and Premium Subscription

Anthony: It is Wednesday, November 12th, 2025. It’s snowing where Fred is. It’s dark and gray where I am, and Michael has the sun blinding his face. It’s amazing. So much so that the camera’s now out of fo... oh, it’s coming into focus, but only the people at the premium subscription can see that.

Yeah,

we don’t offer video to anybody.

That’s not what we’re here about.

Discussion on Hands-Free Driving

Anthony: Let’s talk about hands-free driving. Everyone’s done it. Come on. Even if you don’t have the feature, I know every single person listening has taken their hands off the wheel. You’re on the highway, you’ve [00:01:00] closed your eyes for a few minutes and you’re like, oh God.

Fred: It’s nice to relax. And then you

realize... I just wanted to say we’re in the second day of the first snowstorm of the year, and that’s keeping all the Waymos away. They’re apparently not used to this kind of weather in this part of the world.

Anthony: They are moving to Detroit, Fred.

So that’s coming up.

Fred: I think they’re just doing that for the crappy roads. Most of the manufacturers think of Detroit as an extended test range for their suspensions.

Anthony: Okay, so back to hands-free driving. I don’t know what just happened there.

Studies on Driver Assistance Systems

Anthony: So there’s there’s been some interesting studies coming out saying, Hey, do hands-free driving, self-driving features.

Now I’m not talking Tesla. I’m talking your lane keeping assist, your lane departure warning, your blind spot monitoring. Does [00:02:00] this make you safer as a driver? Does this make cars on the road safer? And you’re thinking, yeah, we love all these safety features, right?

Fred: Yes, we do.

Most of ’em. I turn off the lane keeping in my car.

Anthony: Okay.

Michael: Yeah, it’s hard to just give a blanket yes or no, we love all these systems, ’cause all of ’em are very different, right? We’ve seen proven benefits from automatic emergency braking, we’ve seen proven benefits from blind spot warning.

I think that ultimately we will see benefits from lane keeping technology, in terms of preventing head-on collisions, which are incredibly devastating. But when it comes to some of these other driver assistance features, particularly the ones that kind of trend towards convenience features versus safety features, things that are intended to give the driver the opportunity to [00:03:00] disengage from the driving task.

that’s where the problems start.

Anthony: The associate professor of risk management, Ashish Agarwal at Texas McCombs University, I don’t know, that’s the McCombs School of Business at UT Austin.

Michael: Oh, there you go. The McCombs School of Business. They did some research into this and it’s interesting.

Anthony: Quick summary. Blind spot detection reduced the daily number of hard braking events by 6.76% and speeding events by 9.34% compared to cars without ADAS. By contrast, lane departure and forward collision warnings led to 5.65% more hard braking and 5.34% more speeding. So it seems like the forward collision warning, almost the AEB, is making people a little complacent.

Anthony: They’re like, the car will just take care of it for me; when I get that beep, I’ll just slam on the brakes. Whereas the blind spot protection seems to be a good feature. [00:04:00] Michael, what did I get wrong?

Michael: I’ve been looking at this study and I’m trying to actually figure it out, because I don’t understand some of the correlations here.

Why would blind spot detection... I could see how it would reduce the daily number of hard braking events, right? You get a blind spot warning, you brake and make a move to avoid whatever might be in the other lane you’re going into.

Or not go into the other lane.

Yeah, I’m still wondering how that plays out in terms of speeding. Maybe they’re using speeding in a different way here than we’re traditionally used to. But obviously you can see why a forward collision warning is going to lead to a knee-jerk reaction by drivers who brake immediately to avoid the problem.

I still, again, I don’t see how that contributes to [00:05:00] speeding.

Anthony: I know when I’m driving and I see a car in the lane next to me and I see they have blind spot detection, ’cause you can see the little icon in their side view mirror, it makes me feel a little more comfortable. I go, oh, okay, I’ll sneak right into their blind spot and see if it’s working.

No, I feel a little safer. They didn’t do any study that related to this, but you have Tesla, and obviously Ford and GM, the ones that really promote the hands-free stuff: the Tesla nonsense stuff, the Ford BlueCruise, the GM Super Cruise. And of course Tesla comes out and says, hey, we recorded one crash for every 6.36 million miles driven in which drivers were using Autopilot.

For drivers not using Autopilot, we recorded one crash every 993,000 miles driven. I’m just gonna go right [00:06:00] into it, this is my gaslight. Oh, wow. Yeah, I know, right into it. This is because it’s data from Tesla: it can’t be trusted.

No, they’re the David Copperfield of crash data, right?

Yeah. One of the number one problems with their reporting is they are conflating miles driven by humans in every environment, every driving environment in America, with miles driven by Teslas primarily on divided highways on Autopilot, and then going back and comparing their vehicles on Autopilot and their vehicles without Autopilot to the general crashes per mile for humans, which is useless from a safety perspective. It is used primarily by Tesla for marketing, to say, hey, we’re safer than humans and we’re better than humans and you should buy our product. But I don’t think, if you broke that [00:07:00] data down, which no one can because Tesla doesn’t share it with anyone, that you would be able to conclude the same things that Tesla’s coming out with.
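The base-rate problem described here can be sketched with a couple of hypothetical numbers. Nothing below comes from Tesla’s actual (unpublished) data; the rates are invented purely to show how comparing highway-heavy Autopilot miles against an all-roads human baseline can manufacture an apparent safety advantage:

```python
# Illustrative sketch of the base-rate problem described above.
# All rates are hypothetical; Tesla does not publish the underlying data.

# Suppose crashes are simply rarer per mile on divided highways than
# on all roads combined (surface streets, intersections, etc.).
highway_crash_rate = 1 / 3_000_000    # crashes per mile (hypothetical)
all_roads_crash_rate = 1 / 500_000    # crashes per mile (hypothetical)

# Autopilot is used mostly on highways, so its fleet-wide rate is
# dominated by the safer environment even if the system itself adds
# no safety benefit at all.
autopilot_highway_share = 0.95
autopilot_rate = (autopilot_highway_share * highway_crash_rate
                  + (1 - autopilot_highway_share) * all_roads_crash_rate)

# Comparing against the human all-roads baseline makes the system look
# several times "safer" purely because of where it is driven.
apparent_benefit = all_roads_crash_rate / autopilot_rate
print(f"apparent safety multiple: {apparent_benefit:.1f}x")  # prints 4.8x
```

An apples-to-apples comparison would restrict both sides to the same road types, weather, and vehicle ages, which is exactly the breakdown that, as noted above, nobody outside Tesla can perform.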

Anthony: And we’ve learned from lawsuits and whatnot that Tesla’s Autopilot magically disengages seconds before a crash. So, okay, Autopilot wasn’t involved in the crash then, ’cause it was not there at the time of impact. So again, Tesla, this is just nonsense with their data and mixing things up. Yeah.

Michael: And the takeaway, sorry, real quick, is...

Yeah,

Anthony: Go for it.

The Importance of Driver Responsibility

Anthony: If you have any of these things, you are the driver. You are in control. Stop trying to read a book, go to sleep, or play Tetris.

Michael: We could scream that from the rooftops, but the companies that are selling these products are screaming just as loudly that they’re building a car that makes it okay for you to disengage from the driving task [00:08:00] and not pay attention.

And they’re going so far as to say that in a couple of years you’re going to be able to read a book, answer emails, do all these other things. I don’t think there’s any large agreement amongst researchers that we have the technology to do that, that we have effective driver monitoring to make it safe.

And once again, they’re taking the path that Tesla’s chosen to take, which is to make all of us on the roads guinea pigs for this new technology, when we don’t fully understand the safety implications of it.

And it’s clear from the information that Tesla puts out, I don’t think they understand.

Anthony: So that is my gaslight.

Fred: I’m sorry. There’s actually a tell in the data they report that it’s nonsense. When they say it is, I can’t [00:09:00] remember the exact number, 6.36 million, something like that, they’ve got three figures in it. The more figures you have in it, the more confidence you have to have in the data.

And I think that by using three significant figures in there, they’re basically just making things up, because there’s no way you can have high confidence in data that supports that many figures. It would be different if they said something like, it’s around six to seven million. Okay, I round like that intentionally, that’s a thing these days. But anyway, six to seven million: then you’d say, all right, it’s in the ballpark, and I can believe that with all the fuzziness in the data, eh, maybe it’s right. When they say it’s 6.36, that’s just a made-up number. There’s no way the data with high confidence can [00:10:00] support that much information.

So that’s a little bit subtle, but whenever you see statistics with that many figures in them, implying that kind of precision in the analysis, it’s nonsense.
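Fred’s false-precision point can be checked with a quick back-of-the-envelope confidence interval. The crash count below is invented (Tesla doesn’t publish raw counts); the point is only that with any plausible count, the statistical noise swamps the third significant figure:

```python
import math

# If you observe N crashes, the Poisson noise on that count is roughly
# sqrt(N), which limits how many digits of the rate are meaningful.
crashes = 400                        # hypothetical quarterly count
miles = crashes * 6_360_000          # miles implied by "6.36 million per crash"

# Normal approximation to the 95% interval on the crash count:
lo = crashes - 1.96 * math.sqrt(crashes)   # ~360.8
hi = crashes + 1.96 * math.sqrt(crashes)   # ~439.2

# Translate the interval back into miles per crash:
print(f"{miles / hi / 1e6:.2f} to {miles / lo / 1e6:.2f} million miles per crash")
# prints: 5.79 to 7.05 million miles per crash
```

With 400 crashes the honest statement is "roughly six to seven million miles per crash," which is exactly the rounding Fred suggests; the ".36" carries no information the data can support.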

Michael: Oh, great.

And just to back all that up, IIHS has tested all of these driver-assist vehicles, including Tesla Autopilot, BlueCruise, Super Cruise, Hyundai and Genesis stuff, Mercedes stuff, Nissan stuff.

And none of them got a good overall rating. I think that alone suggests they’re not ready yet. Which leads me to ask, why is it allowed on our roads?

I don’t know, scream it from the rooftops, but not from a moving vehicle. Related to this is an article on MSN. You were asking, Michael, I don’t understand this report.

And they had a little paragraph here, I don’t know if it really relates, but it’s basically asking why the difference between [00:11:00] these blind spot detectors and imminent collision warnings and whatnot. So quoting from this: “Urgent alarms trigger action that makes drivers feel safe without understanding. They create a sense of surplus safety that drivers ‘spend’ as riskier behavior later. Reflective cues, by contrast, provide micro-lessons: pause, scan, decide. Over time these micro-lessons accumulate into judgment.” So that’s, I think, what they’re saying with blind spot detection, where it’s a little thing here, whereas the imminent forward collision warning is more of a beep-beep-beep.

And it’s just a little more shocking. But I like that phrase, it makes drivers feel safe without understanding. Yeah.

Michael: I don’t know what that summary means, really. If your forward collision warning is giving you an urgent warning, then either there is a car or some other object, usually a car, since that’s what [00:12:00] AEB and forward collision warning are built to detect, right there in front of you, and you’re close to it. And I’ve got forward collision warning on my car. Anytime it’s gone off, it’s typically been when I’m making a lane change and there’s still a vehicle close in front of me. I’ve never been in a crash-imminent scenario where it’s gone off.

Anthony: You should understand why it’s going off: you’re too close to the vehicle in front of you. So I don’t know where they’re going with that. Perhaps I should go back and take another look at the study. I don’t really understand how the warnings from a forward collision warning system are going to give you this safety wallet [00:13:00] that you can then turn around and use to speed, to take riskier behavior later.

Fred: That doesn’t, I don’t know, that doesn’t compute with the way I drive. I don’t know. I’m completely clueless on this.

Michael: Could I put it out there that perhaps you’re an atypical driver?

Anthony: I would accept that I’m an atypical human in lots of ways.

Fred: Okay. I wasn’t going that far.

That’s a different show we do called What’s Wrong with Michael?

Yes. That’s, that one’s in its 12th year.

It’s just an hour of crying. It’s good. It’s like green noise.

Anthony: It’s not a great lesson.

But if you don’t wanna hear Michael cry, go to autosafety.org and click on Donate. That’s right.

Michael: Every time you donate an angel gets its wings. I don’t know. Okay. So that was my gaslight. Fred, do you have a [00:14:00] gaslight?

Fred: I do.

Automated Transportation Symposium Insights

Fred: I had the pleasure of attending the Automated Transportation Symposium in Arizona last week. It was pretty heavily attended, and I was pleased to find a lot of people there who confessed to listening to our podcast.

So thank you for listening, all those folks, and many of the self-driving vehicle companies. One of the presentations I went to was by the Texas, I can’t remember the exact name of the organization, but a Texas public safety organization, where they were talking about developing inspection procedures for automated freight trucks.

And it was interesting. Apparently the state of the art of what they’re doing is trying to find a way to inspect trucks to the same standard that’s used for human-driven trucks. So basically this is just all of the features like brakes, lights, horn, the standard features you’d check on a human-driven [00:15:00] truck, to essentially do the safety check.

Of course these are obsolete now, because they were all developed with the intention of a human being being responsible for actuating and taking advantage of these safety features. I asked them if they were concerned about, or thinking about, evaluating the safety-critical and life-critical logic that’s built into the computers,

pointing out that there are hundreds of logical functions in the vehicle that also impact safety to the same extent as the physical features they were trying to inspect. They said no, they’re not doing that, they have no intention of doing that. A manager of an automated trucking company happened to be standing nearby, listening in as I went through this, and I said one of the examples is the solid state memory that’s in the truck.

The solid state memory has limited life. So how [00:16:00] does a safety officer who stops the truck and wants to inspect it possibly know what the status is of these logical functions and life-limited components, all of which impact the safety of the vehicle as it’s driving down the road?

The company representative, the president, jumped in and said, we change out the solid state memory before every trip of the truck.

No, they don’t.

Which is a good rhetorical device, to jump down my throat that way. But it doesn’t really address the logical functions, the adherence to safety margins that are built in, and all those kinds of things that are buried deep inside the computer and not visible to a human being.

My conclusion is that the Texas safety officers are not really concerned about the overall safety of the [00:17:00] trucks that are now inhabiting their highways without human drivers. I don’t know if this is intentional or just people stubbornly adhering to practices they’ve grown accustomed to, but they get my Gaslight award this week, because what they’re doing is, if not counterproductive to autonomous trucking safety, certainly insufficient to monitor the safety of trucks going down Texas highways.

Anthony: Wow. My takeaway from that is, a CEO of a, we’ll call it mystery company, claims that after each trip of their self-driving trucks they replace the flash memory. And replace it with what, brand new flash memory? ’Cause that would be the implication he’s making. But flash memory will go through 10,000 write cycles before there’s a problem, if not more.

what they were replacing it with wasn’t clear.

And I have zero confidence that any of that is true, ’cause that is the dumbest, [00:18:00] just the dumbest statement I’ve ever heard in my life. That is batshit crazy, saying for each trip we’re replacing this with a brand new module.

That is dumb. I work with some people who work on security issues a lot, and they travel across borders and wipe their phones regularly. Every time they enter a new country they erase things, but they don’t go in there and replace the memory in their phone.

This is, oh, this is pure nonsense.
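The endurance argument works out on the back of an envelope, too. All the figures below are hypothetical ballpark values (no actual truck’s logging volume or storage spec is public here), assuming the commonly cited ~10,000 program/erase cycles for flash:

```python
# Back-of-envelope flash wear-out estimate. Every figure here is a
# hypothetical ballpark value, not a spec from any actual truck.

endurance_cycles = 10_000        # program/erase cycles per cell (ballpark)
drive_capacity_gb = 512          # on-board solid state storage
writes_per_day_gb = 200          # heavy sensor/video logging
write_amplification = 2          # SSD controller overhead factor

# Total data the drive can absorb before wear-out, spread across cells
# by the controller's wear leveling:
total_writes_gb = endurance_cycles * drive_capacity_gb / write_amplification

lifetime_days = total_writes_gb / writes_per_day_gb
print(f"estimated wear-out: {lifetime_days / 365:.0f} years")  # prints 35 years
```

Even under aggressive logging assumptions the drive outlives the truck by decades, which is why swapping it after every trip makes no engineering or economic sense.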

Am I winning this week, Anthony? Or am I at least in the lead so far?

You’re in the lead so far. I want these companies to explain this. And okay, let’s pretend this is actually happening. That means they’re sitting on piles and piles of flash memory.

Fred: If I’m an investor in [00:19:00] one of these companies, I’m gonna go, why are you just burning hundred-dollar bills at the end of each trip? This is so dumb. This is beyond stupid. Yeah, but the thing is, this was not your gaslight, ’cause we’re saying this is mystery company X.

Yeah. So I’m restricting my gaslight to the Texas public safety officers, because the statement could certainly be misinterpreted, and there may be additional details that would affect exactly how the hell they’re trying to do this.

But the idea that the Texas safety officials generally don’t care, or don’t appreciate that software is as important to the safety-critical operation of the vehicles as the physical components they’re inspecting, is the heart of it.

I’m gonna be a little more optimistic than you and suggest that it’s not that they don’t care, it’s just that they don’t know yet, [00:20:00] they haven’t been enlightened by someone as astute as you.

It could be. It’s fair to say that the physical appearance of these gentlemen is not typical of software developers, just to put it that way. If you had a cliché of good old boys in Texas, their physical appearance was closer to that.

Probably cowboy boots, cowboy hats. What do we got here, Fred?

Fred: We saw a lot of fairly bulky individuals, with a lot of experience ordering people around, it seems.

Anthony: You did violate the cardinal rule, which is don’t mess with Texas, but that in my book gets you an extra point. Before we jump to Michael’s gaslight: you told us about this interaction you had with Waymo, and I would love for you to share it with the listeners, because Waymo’s often the focus of your gaslighting.

I thought this would’ve been an amazing [00:21:00] gaslight as well, but please.

Fred: Sure. And this is not a slam on any individuals there; the people at Waymo seemed to be very nice, I must say that. But there was a panel where they were reviewing risk and the risk profiles of different vehicles. So I observed that the entire industry uses the definition of safety from ISO 26262, that safety is the absence of unreasonable risk. And so I observed to the panel that if that is true, then there is a residual risk that includes things that you need to accept and things that you try to mitigate, right? So every risk falls into one of those categories. And to remind the listeners, risk is the combination of the consequence of something happening and the probability that it will happen.

So if it’s really bad and it’s likely [00:22:00] to happen a lot, it’s very high risk. So we discussed that, and I said, isn’t it important for the... and I wanna back up a little bit more and say, this is not just Waymo. Waymo happened to be responding to this, but there was a panel of different industry people,

to whom I addressed the question. So anyway, the question is, if these risks are out there, if you know what they are, if you’ve categorized them into those that you accept and those that you have tried to mitigate somehow, and you have a list of the ones that you think are reasonable, isn’t it important to share that with the regulators in a community where you wanna operate these vehicles?

Shouldn’t they know, and shouldn’t the public know, what these risks are and what they actually need to be concerned about for these vehicles operating [00:23:00] on their roads? And the response I got, which happened to be from a Waymo representative, was that airplanes don’t have to report their risks to the public, so why should we?

And my response was that it’s really a different situation, because aircraft are required to go through certification. Those certification standards are publicly available; they’re actually written in law, I think FAR Part 25. Isn’t that right, Michael?

Yeah, they have a completely different certification system than vehicles do.

Right.

And the residual risk is one in a billion for every flight hour. That’s the standard they have to establish, the standard for how risky airplane travel is allowed to be. So I observed that it’s quite different with the automotive industry, where there are no such standards: there is no certification requirement, and there is no numerical value of residual risk.

Which [00:24:00] means, when all is said and done, what are the real hazards that are out there for the automotive industry? There was no response to that, except that everybody said, that’s kinda what we do. So that’s the heart of the exchange.
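Fred’s framing, risk as the combination of consequence and probability, with each residual hazard either accepted or mitigated, can be sketched as a toy triage. The hazards, scores, and threshold below are invented for illustration only, not anyone’s actual safety case:

```python
# Toy sketch of the ISO 26262-style residual-risk triage described above.
# Hazard names, scores, and the threshold are invented for illustration.

hazards = [
    # (description, severity 1-10, probability per 10,000 trips)
    ("phantom braking on highway", 6, 3.0),
    ("fails to yield to pedestrian", 10, 0.2),
    ("stops in active travel lane", 4, 8.0),
]

ACCEPT_THRESHOLD = 5.0   # below this, risk is accepted as-is (invented value)

for name, severity, probability in hazards:
    risk = severity * probability          # consequence x likelihood
    action = "accept" if risk < ACCEPT_THRESHOLD else "mitigate"
    print(f"{name}: risk={risk:.1f} -> {action}")
```

The question to the panel is, in effect: whatever a table like this looks like inside a company, shouldn’t regulators and the public get to see it?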

Michael: I find that entertaining. I have to jump back. It popped in my head while you were talking about the guy, the CEO saying, we replaced the memory after each trip.

That doesn’t solve the problem, because you can buy brand new shrink-wrapped memory modules that are faulty. It doesn’t eliminate the need to test whether this stuff is right. And the Texas public safety people, they need an ability to test: is the system not faulty?

And is the software safe?

The software is driving the vehicle, not the memory. So is the entire software suite safe? Are the [00:25:00] assumptions made in the software correct? And is the software conforming to the assumptions made by the programmers with respect to safety margins? Things like, for example, the commitment of the microprocessor to the driving task.

And normally you would want to have the microprocessor utilization in the range of 60 to 70% for automated operation. You want to have that margin left over, 30 to 40%, in case something happens that’s unanticipated. And it always does, by the way. You need to have some safety margin built into your software, as well as into the physical components that make up the data processing system, as well as the sensors, as well as everything else that’s attached to it.
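The headroom rule described here can be sketched as a simple budget check. The task names, load fractions, and the 70% ceiling are illustrative assumptions, not figures from any real autonomy stack:

```python
# Sketch of a steady-state CPU budget with safety headroom.
# Task names, loads, and the ceiling are illustrative assumptions.

MAX_STEADY_STATE_UTILIZATION = 0.70   # leave >= 30% for the unanticipated

task_loads = {          # fraction of processor capacity (hypothetical)
    "perception": 0.35,
    "planning": 0.20,
    "control": 0.08,
    "logging": 0.05,
}

total = sum(task_loads.values())
headroom = 1.0 - total
print(f"utilization {total:.0%}, headroom {headroom:.0%}")  # 68%, 32%
assert total <= MAX_STEADY_STATE_UTILIZATION, "no margin left for surprises"
```

An inspection regime that only checks brakes and lights never sees whether a budget like this is being honored in the field, which is the complaint being made here.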

And what the Texas Department of Public Safety is saying is, we don’t care, ’cause we don’t have to. It’s like the old AT&T skit [00:26:00] they did a hundred years ago. Oh, you weren’t born then, I’m sorry. But it was a pretty good skit: one ringy dingy, two ringy dingy. We don’t care. We don’t have to.

Okay. Okay, Michael, what’s your gaslight?

Michael: All right. Mine is coming from, it’s a recidivist. We’ve got a repeat violator here, and I don’t know if this is gonna be his last time, because he is,

oh, no, it’s not

Michael: Creepy. Creepy car salesman turned Senator Bernie Moreno set out to prove that Democrats who backed provisions in the 2021 transportation reauthorization, which mandated, or took steps to mandate, safety systems like alcohol detection technology, which I believe my last gaslight on poor Bernie was about. He took it upon himself, or his staff, to essentially stalk these other senators’ vehicles to try to prove that they hadn’t taken advantage of these [00:27:00] new safety systems by purchasing them. He, again, has no idea what he’s talking about; I think I talked about that in the last gaslight on him.

There are no alcohol detection sensors in cars now, so you’re not gonna find anybody that has those in their cars. He tracked down the VINs of the cars these senators were riding in. I don’t know if he ran them against some type of database that could tell him whether or not they had purchased the additional optional safety equipment, but

I think we need to turn this around and look at what’s really going on here. This is a former car salesman who supposedly divested himself of all the dealerships he owned when he became a senator. However, his son is still a car dealer, and he has supposedly invested in his son’s dealerships, and now he is [00:28:00] stalking other senators to try to show that Americans should be purchasing these safety systems as options, which of course inflates dealer profits significantly, rather than having the government mandate safety into all vehicles, so that we start to see the drunk driving, distraction, and all these other good provisions that were included in the 2021 reauthorization come to fruition, which none of them have yet.

Automatic emergency braking is still somewhat in dispute. NHTSA got the rule out, but NHTSA is still way behind on getting out research and mandates on drunk driving prevention technology, driver monitoring, and a few other safety system features they were required to look into in the reauth.

So Bernie, despite your creepy efforts, you’ve clearly got your hand in the cookie jar here as a former car [00:29:00] dealer, continually supporting your son and his car dealerships while they’re making a lot of money selling this kind of fancy new tech as options versus making the safety stuff mandatory. You’re gonna get my gaslight of the week. Stop being so creepy.

Anthony: Yeah. How accurate are VIN lookups? Because I kinda remember looking up the VIN on my car and it telling me that it’s not exactly the car I have. It had some features listed and I’m like, I don’t have that.

Michael: And they called it a specific model that is not mine.

Yeah, I’m not sure exactly how accurate they are. In fact, I bought a 2019 Volkswagen Jetta. That’s the car I still own and drive; it’s been a great car. About two months ago, I’m driving, and there is a vehicle probably, I don’t know, 50 yards ahead of [00:30:00] me.

Probably not that far, maybe 30 yards ahead of me. It brakes really quickly to take a right into a gas station. And I only thought I had forward collision warning, based on what I had looked up about my vehicle, my VIN and everything. Even in the Volkswagen literature, it looked like I only had forward collision warning.

Anthony: My brakes engaged, so I do have automatic emergency braking. I didn’t know I had it. So yeah, I would say, when these features aren’t mandatory and they’re not coming on every car, it’s sometimes hard to know whether or not you actually have them.

Michael: Yeah, I had a weird incident like that too, ’cause I’ve had this car for four or five years at this point. And the other day, for the first time ever, it gave me a blind spot [00:31:00] monitoring warning.

Anthony: Whoa.

Michael: And in the side view mirrors there’s that little icon, but it’s never turned on, ’cause I assumed we just didn’t pay for that feature.

Anthony: And I was leaving a lane, but somebody was racing towards me, and stuff on the dashboard and in that side view mirror lit up. Wow. Whoa. And I’m like, is my car getting smarter?

Michael: No, I think it’s just that you’re getting worse as a driver. You’ve never had the problem before, so you’ve never noticed it.

That’d be right, and you screwed up. Why don’t you take the blame instead of putting it on your car?

Anthony: I’m just saying the car is getting cool, man. That’s all I’m trying to say. But it leads me into our next story.

GM’s Infotainment System Issues

Michael: Is my car getting smarter? Well, GMs are not getting smarter, they’re getting dumber.

Anthony: This is an article [00:32:00] from gmauthority.com. Ooh. Titled, These GM Vehicles Can No Longer Download Apps Through Their Infotainment System. Quoting: GM is pulling the plug on app functionality for a significant number of vehicles equipped with its previous-generation infotainment technology. Owners of certain 2017 through 2020 models now find their in-vehicle app store inaccessible, a direct result of the company ending support for its legacy NGI systems. I guess NGI was the software or something like that. So you bought a car five years ago, and all of a sudden part of it doesn’t work anymore.

Michael: Yeah, that’s essentially what’s happening here. GM announced, and I think we talked about this maybe a year ago, that they were going to move away from the Apple and Android car interfaces and do their own, which we thought is a terrible idea. Why does GM think they can do that? And now, I [00:33:00] guess in pursuit of that ultimate goal, they are essentially shutting down this little app system that they have in these vehicles.

The oldest of these vehicles is, what, eight years old? I think the 2021 Buick Encore is the most recent one. You could have bought your 2021 Encore four years ago, and now you are losing the functionality of your app store and the apps that you might use in there, which could be navigation, your Apple CarPlay, and a few other things that a lot of people probably use. I don’t know. I don’t use my in-car stuff that much; I don’t use my phone or anything like that while I drive, as much as possible. So that’s something that is scary, in a way: you can buy a vehicle, and it shows you [00:34:00] now, and we’ve talked about subscriptions and those things a lot, how you don’t really own your vehicle.

You’re not really able to determine the future of your vehicle, because here you are five years into ownership, and the average age of vehicles on the road is 13 years. You get to go through the next eight years of vehicle ownership without a part of the vehicle that might have been important to you at the time of purchase.

Anthony: This type of ob... it’s hard to say this is obsolescence, ’cause I don’t know, when they designed these vehicles, if they intended to get rid of the functionality of these things. But that’s essentially what it is: you’re losing part of the vehicle that you purchased. And a car is a very expensive piece of equipment, right?

Michael: When you buy a phone or [00:35:00] a computer, and then five years later you’re not able to do certain things on it, that’s normal within the product lifecycle. With a car, not so much, especially due to the high price of cars: you wanna maintain your functionality as much as possible. I think this is disappointing for GM owners, and I hope this isn’t something that we continue to deal with, but I think we will, across the industry, for a lot of reasons.

One is that in-car electronics, I think, are going to become obsolete faster than the mechanical parts of the vehicle. And I think that manufacturers aren’t gonna just let that go. They’re not just gonna give you a new system to replace what you’ve got; you’re gonna be outside of your warranty and they’re gonna want to charge you for it.

Anthony: They’re trying to monetize every part of the owner [00:36:00] experience now. It’s not just buy a car, leave the dealership, and you’re good. It’s buy a car, leave the dealership, and then continue to subscribe to or buy everything that we want you to until you get rid of the car. So it’s a new ownership model, and I think it’s a terrible ownership model for consumers.

Michael: But it follows the general mantra in society today, which is that anything that can make money is good.

Fred: There’s another issue here, which is that, excuse me, it’s a brilliant marketing idea on the part of GM to negate the value of the phone that every single customer has in their pocket, and to make sure that if your child wants to call you, for example, when you’re in the car,

Instead of just having it come through the infotainment system, you’ve now gotta pull your phone out of your pocket or [00:37:00] out of your purse or wherever you keep it, and you gotta manipulate it, push a couple of buttons, before you’re able to talk to your child. This seems, let me say, a really stupid marketing idea, as well as an unsafe thing to do, to add a lot of manipulation to the operation of talking to somebody who calls you when you’re in the car.

Fred: Hey, since I’ve got your attention, Anthony, I wanted to give you one more anecdote from the ATS conference.

Anthony: Go for it.

Fred: This was my first contact with actual Waymo operation. Tempe, Arizona, where the conference is held, and Phoenix allow operation of robotaxis by Waymo.

Waymo Privacy Concerns and Experiences

Fred: So as I was sitting in the hotel these devices kept zooming up and down the long twisty access road to the hotel, which was pretty scary.

And somebody asked me if I’d [00:38:00] gotten a ride in the Waymo, and my response was that I thought about it one time, and then I actually downloaded the Waymo app and made the mistake of reading the user agreement, which stopped me from ever wanting to give them a nickel, because you can give up an awful lot in terms of your own personal privacy in order to use the Waymo.

And I encourage people to read that agreement. The only other interaction I had with them is when I was taking my taxi back to the airport to leave. We were riding along in the travel lane, and there was a Waymo by the side of the road that apparently decided that was a good time to leave the breakdown lane and enter the travel lane.

As we were passing it, it honked at us, even though it was stationary. [00:39:00] Apparently we weren’t driving fast enough or slow enough or far enough away to satisfy the Waymo’s computers that we were driving appropriately. So I was really puzzled by that. Or maybe it had been programmed to give me a kind salute on my way out of town, because then they wouldn’t have to answer any more questions I was posing.

But I just thought that was interesting that the Waymo was urging us along by giving us a little toot.

Michael: And I would suggest that, if Fred is concerned about privacy, there are probably some even more nefarious things in the user agreement regarding liability if you’re in a crash.

Oh

Anthony: yeah, that was all there.

Yeah.

You’ve taught me Michael,

and if you feel you’ve been taught, go to autosafety.org and click on donate and let us know that you’ve been taught. Before we [00:40:00] move on, I wanna jump back to this GM infotainment story. So I just wanna get this right.

So let’s say five years ago, I shell out $30,000, $35,000, get my Chevy Equinox. Now it’s five years later. I want to get a new car, I want to go sell that car, and essentially I’m selling a car with a known part of it that’s broken. Yeah. And there’s no dealer that can fix it. There’s no repair shop that can fix it.

Michael: There’s no one. It’s just literally, GM says, oh, you wanna sell your car? You can sell your car, but we’re gonna reach into your pocket and pull out an extra five grand, ’cause you’re selling a broken car. Is that about right?

Anthony: it’s gone. That functionality’s lost forever.

And I paid for it.

Michael: Yeah. And you paid for it at the time. You paid for it.

Anthony: That’s like the equivalent of, after five years: oh, the backseat? No, that’s not a thing anymore. You can’t use the backseat.

Subscription Models in the Auto Industry

Michael: I think we’re gonna see more and more of this. We already are. We’re seeing seat heaters and remote starting sold as subscriptions now, when they used to be features that were purchased with your vehicle.

If you look around, anything that can possibly be monetized in society is being monetized. I just don’t see where this is going to end. Hopefully we’re not paying for seats. And this is something we’ve maintained since the beginning: safety systems should never be included in subscription models.

When you buy a car with certain safety systems, those should remain functional with the vehicle for the entirety of the time it’s on the road. But I have concerns in that area, particularly with safety systems that aren’t mandatory right now, that aren’t required to be installed in vehicles under the Federal Motor Vehicle Safety Standards but have [00:42:00] a big safety upside.

And I’m just concerned that subscription models are an immoral way to go, frankly.

Fred: I wonder if they’ll sell automobile capability cards at the Piggly Wiggly, like the gift cards you can now buy. Yeah.

Michael: You get an airbag for a week if you buy this card.

Fred: Yeah. Or you’ve got a gift card that sells somebody their seat warmer for one winter season. I think there’s a lot there that could benefit everybody.

Michael: I hope that even our corporate overlords aren’t that evil when it comes down to it on safety systems, and hopefully this all stays in the range of heated seats and massaging chairs and all the other whizzbang features that they’re throwing into cars.

But I certainly think there’s another safety issue hiding there that no one wants to talk about, which is that you’re [00:43:00] building vehicles that have the equipment available to provide all those features. That means you’re adding weight to the vehicles, and we know what that does in crashes.

Anthony: Listeners, if you’ve been astute, you’ve noticed I have not scored this week’s gaslight, and it’s time to do that. So Fred, that was great. I think what you put out there is gonna make progress in the world. Your other gaslight could have been this mystery CEO, but okay.

Michael, a good one. You got the creepy senator in there. Mine was low-hanging fruit, wasn’t that great. So this week’s winner is Phil Koopman. That’s right, Phil Koopman, for last week’s ultimate gaslight of all time: sudden emergency, sudden unintended acceleration. That’s right. Phil Koopman, congrats.

Anthony: Okay, before we jump into the Tao, ’cause this will be somewhat related: Waymos are coming to Boston. Oh yeah. Quoting from an article from Bloomberg: Boston city councilors are proposing an ordinance that would require [00:44:00] Alphabet’s Waymo and other autonomous rideshare services to have a human operator present in the vehicle.

Hey, that sounds good. You read that thinking, all right, they want safety, this is a good way they’re doing it. That’s not why they’re doing it. Continuing: the legislation would mandate a study on the technology’s effects on the employment of current rideshare drivers and prohibit self-driving cars from operating in Boston until that’s been completed and other permitting requirements are established. Even then, the cars wouldn’t be allowed to operate unless a human safety operator is inside the vehicle and able to intervene, according to the proposed ordinance.

I like this. They’re making these cars somewhat safer by requiring a safety driver, but they’re not doing it under the language necessarily of safety. They’re doing it under protecting jobs.

Michael: Yeah,

clever those.

Yeah, and I don’t think that’s uncommon. I think you will see that the labor unions are pretty involved in the autonomous vehicle space, particularly for that purpose.

And I think we’ve seen Governor Newsom out in [00:45:00] California veto a bill requiring safety drivers that was supported by labor unions. So the labor component of all this is important, right? Humans are being replaced by computers in a lot of other areas right now.

And autonomous vehicles are one of many areas where the human labor force is threatened by autonomy or AI or other factors. And I think, when you have any questions about the safety of autonomy and autonomous operations, we know that the humans aren’t perfectly safe either, right?

A vehicle could still have problems. An autonomous vehicle with a safety driver killed a human in Arizona not that long ago. So we know that it’s not a perfect system. But I think at this point, and we’re still in the very early adoption of autonomous vehicles, that keeping a safety driver in the car, both from a labor perspective as well as a safety perspective, is warranted.

So we would [00:46:00] support Boston’s activity in that area.

Fred: Yeah. Let me also point out that if this autonomous or self-driving software can actually be proven to be much safer than a human being, or even incrementally safer than a human driver, it should be reconfigured so that it acts as a supervisory controller on conventional vehicles.

’Cause it would then add to the safety capabilities of a human driver rather than somehow being an alternative to that. I think you can make a solid argument that humans are what they are, and by adding this capability that we’ve now perfected for self-driving vehicles to the human driving experience, we’ll get a superior safety profile there.

Fred: I also think that it would then allow the vehicle to alert other people outside of the vehicle when the vehicle is being operated dangerously. For example, if it crosses a double [00:47:00] line, the lights should start flashing so that oncoming traffic knows that there’s something screwy going on ahead. This would be a very simple software adaptation of what the self-driving vehicle technology is already able to do and understand.

So I think there’s another dimension to this whole argument about autonomy versus human drivers that could perhaps expand the safety range of both.

Michael: Yeah, and I think, really importantly, the advent of a safe autonomous vehicle does not change the fact that humans are permitted to operate vehicles in the United States, right?

It’s going to be decades and decades, and maybe longer, who knows, before humans are taken out of the driver’s seat, and deploying technologies to keep the human-driven vehicles safe is going to be of utmost importance during those decades. [00:48:00] It’s something we talk about a lot.

There’s a lot of focus at the federal and state level on these autonomous vehicles that are coming. There’s scant focus, not nearly enough, on how we can use those technologies to make the human drivers on our roads safer. For instance, an autonomous vehicle won’t speed; that’s written into their code.

They cannot speed. And yet we’re not willing to put those same restrictions on the humans driving on our roads, even though the technology is available and cheap and could save 10,000 lives next year. You have the choice to ride at the speed limit in an autonomous vehicle, or you have the choice to speed operating a vehicle as a human.

I think we need to deploy those technologies in ways that effectively force the population to operate safely.

Anthony: Michael, I like it. You’re forcing people to do things they don’t want to do. Tell ’em to eat healthy, drive safely. [00:49:00]

Michael: No, look, authoritarianism is in its heyday in America right now.

So we might as well force people to be safe as well.

Anthony: There you go. Alright, with that, limber up, listeners. It’s time for the Tao with Fred.

Regulatory Checklist for Autonomous Vehicles

Fred: We have been talking about a checklist for regulators for the last couple of weeks. To refresh people’s memory, what we’ve done is put together a checklist that makes the safe-driving technology that could be embodied in self-driving vehicles accessible to regulators.

We cannot expect regulators to be experts on engineering as well. They’re busy doing what they’re doing and getting pulled in a hundred different directions every day. So we’ve put this together, and I actually shopped it at the ATS conference last week. Hopefully we’ll be getting back some comments from people about what they think, and there [00:50:00] has been a positive reaction so far.

For the most part. We talked about a few of these last week or two weeks ago, so I’ll add another few today. Number four that we’ve got is: has the applicant, presumably the AV developer, adequately documented the computer-driven vehicle’s ODD, or operational design domain? In other words, every self-driving vehicle has got restrictions on when it can operate, where it can operate, how it can operate.

These are all embodied in what’s called the ODD, or operational design domain. So the question that should be posed to the applicant is: have you actually documented this adequately, so that you know where the vehicle’s going to be allowed to operate and when? It’s a pretty basic thing. Asking the question puts [00:51:00] the burden of answering back onto the developer, not on the community to tell the developer what they need to do.

Next: has the applicant shown that all computer-driven vehicle operations within the operational design domain are legal? That’s important because companies like Tesla are now developing self-driving technology which they call level two to evade regulation, but which they intend to use to violate motor vehicle laws, zooming in and out.

I think they call it Mad Max mode, something like that. But zooming in and out of lanes, traveling at excessive speeds: somebody has to ask the question affirmatively, have you made sure that your operations within the design domain are going to be legal? The next one is: have all applicable federal, state, regional, and local authorities approved the operational design domain?

It’s one thing to have it; it’s another thing to have the regulators accept it and approve [00:52:00] it. Has the applicant received regulatory approval for computer-driven vehicle operation only within the ODD boundaries? In other words, is the vehicle enabled through engineering means to operate outside of the approved operational design domain?

The answer is either yes or no, but it’s important to ask the question, because there are situations where it would be expected that something weird’s gonna happen. For example, when you unload the vehicle from the truck and put it in a showroom: is that transit part of the operational design domain, or is there some other kind of control logic that governs that?

Another example would be if people have an operational design domain but they store the vehicle somewhere else. How do you get from the storage area to the approved ODD? That’s operating outside of the [00:53:00] operational design domain. So there need to be answers to that, and of course you’re only gonna get the answer if you ask the question.

The last one I can see on my screen here is number eight: has the applicant shown that the computer-driven vehicle will respond safely if, for any reason, it’s operated outside of its operational design domain? I gave a couple of examples previously, but another way that can happen is, say the vehicle is being operated

in normal weather conditions, but all of a sudden a snowstorm pops up, or a hailstorm, or a dust storm. A lot of things can happen where the vehicle is operating within its intended domain, but suddenly the domain changes. Another example would be if it requires cell phone service in order to adequately respond to emergency [00:54:00] notifications.

If there’s a fire somewhere and fire trucks are set up around it, you want to have the vehicle stay away from that. So the operational design domain suddenly changes, because the police say, keep the hell out of this area. What does the vehicle do in that case? Is it safe? Does it just go about its business and hope for the best?

You need to ask the questions in order to get the answers. So we’ll do more of these next week. We’d love your response to this whole idea of putting the checklist together for the regulators, and in particular, if the regulators are out there, we’d love to know from you what you think. I spoke with one woman who is a regulator in Chicago.

Her attitude seemed to be that the AVs are intrinsically good and all sorts of wonderful things are going to happen with the AVs, so you shouldn’t challenge that, you shouldn’t challenge the licensing process. Now, I may have [00:55:00] misinterpreted her attitude, but I hope that’s not a generally accepted approach to regulation of AVs.

Anthony: I hope not either.

Michael: Yeah, they’re gonna save us. Let them do whatever they want.

Anthony: Yeah, so we have a link to the checklist in the show notes, but you can also find it on autosafety.org: at the bottom, you can click on autonomous vehicle safety. We’ve got links to a bunch of cool things, and I’ll find a better, more prominent place to put this as well.

Vehicle Recalls: Tesla, Chrysler, Ford, Honda, and Toyota

Anthony: And with that, it is time for recalls. Let’s start off with Tesla. That’s right, Tesla’s recalling the Cybertruck. This is the 794th recall of the Cybertruck since it’s been released. Or maybe it’s the 10th. Is it the 10th?

Michael: That’s the 10th.

Anthony: Alright. And this is fascinating too, ’cause the car’s recalled, and also the head of the Cybertruck division’s like,

I quit, I’m out. I don’t know if they’re related, but anyway.

Michael: Yeah. I wonder if that’s more related to the fact that Elon just got that massive payday, and there’s a lot of stuff going on with Tesla stock, and it’s a good time [00:56:00] to exit. I think they’ve lost almost all of their original vehicle team over the past few weeks or months.

Yeah. This recall is 6,197 vehicles, the 2024 Cybertruck. A partially delaminated, service-installed optional off-road light bar accessory may create a noise detectable from inside the cabin. Separately, the customer may observe a gap between the light bar and the windshield, and the light bar may feel loose when touched.

Anthony: Oh. So this is not gonna be an over the air fix, is my guess.

No, it’s another adhesive problem. There seem to have been a number of adhesive or similar issues with this vehicle. I know we had an unintended acceleration issue with the pedal cover not being stuck on properly.

Michael: I think there’ve been a few other delamination issues

between the A and B pillar. Some,

yeah. Stainless steel. Yeah. So Tesla needs to work on their adhesives engineering. But at any rate, it looks like this is gonna be [00:57:00] ready the day after Christmas for owners. You can expect to make your appointment then, or just.

Anthony: Throw your Cybertruck into a river somewhere and forget about it.

Michael: Hey, if you own a Cybertruck, you probably don’t have friends and family to celebrate the holidays with anyway, so just park outside the Tesla dealership. Next up: Chrysler, 320,065 vehicles, the 2020 to 2025 Jeep Wrangler and 2022 to 2026 Grand Cherokee.

Anthony: And a vehicle fire can result in an increased risk of occupant injury and injury to persons outside the vehicle. Because of battery charge level... wait, sorry. In rare circumstances, a battery pack may contain cells with separator damage. Separator damage, combined with other complex interactions within cells, may lead to a vehicle fire.

Oh. And these are made by Samsung. This is part of their hybrid line; these are plug-in hybrid vehicles.

Michael: Yeah. And this one is a park outside warning. I don’t know if you mentioned that; I wasn’t paying attention, Anthony.

Park outside.

Yeah. Park outside. Do not park in your garage.

Park away from other things. This is a battery problem. Park

next to a Cybertruck.

Yeah. It looks like they don’t have an idea of what the remedy’s going to be yet, so there might be a pretty significant wait here for owners. They’re gonna send you an interim notification in early December that says, hey, we don’t have a fix yet.

Anthony: You need to wait. So keep checking back on this recall with NHTSA on their website, and sooner or later they’ll post an update, and you could be the first to know.

Continuing with our showdown is Ford. Holy cow: 163,256 vehicles, the 2021 to 2023 Ford Bronco. One of the seat frame height adjustment pivot bolts may become loose and eventually dislodge on one or both front seats.

Michael: [00:59:00] That doesn’t sound great.

Yeah. Always bad news when a seat is loose. I don’t think the seats are able to fully detach, but if the seat isn’t fixed in place, then you’re going to have some issues in crashes. I think the most interesting thing about this is that Ford says that in October 2025, Ford reviewed the results of the testing, which showed that a seat with a missing pivot bolt will meet FMVSS requirements but may not meet the Ford acceptance criteria in certain crash scenarios.

That struck me, because invariably, when a journalist goes to an automaker for a quote about a pending NHTSA investigation, the automaker’s first defense is: this vehicle is safe because it meets all Federal Motor Vehicle Safety Standards. Here you’ve got Ford saying it meets FMVSS but doesn’t meet their own internal acceptance criteria, which is essentially saying [01:00:00] that meeting the FMVSS is not a guarantee of safety.

The FMVSS sets a minimum standard, which is oftentimes a fairly low bar. And so it’s interesting to see a manufacturer explicitly come out and undermine decades of what the industry’s been claiming in interviews with journalists.

Oh, FMVSS, our new gaslight. Continuing with Ford: 34,481 vehicles. This is the...

Anthony: Ford. What the, huh?

Michael: These are a lot of different vehicles. Yeah, these are remanufactured transmissions.

Oh.

And basically they were using these as service repair parts, missing a bearing.

And this goes to 2020.

Yeah. And that could be anyone who’s had a repair on any Ford or Lincoln vehicle. I’m not sure if they even list the actual vehicles here or

not.

They do not. They just list the transmission number. So this is a harder one for consumers.

Yeah. So if you’ve had a [01:01:00] — what is it, 2020 to 2025 — Ford that’s had a transmission replaced with a remanufactured transmission in the last few years, be sure to pay attention to this, because this can cause a condition where you think your vehicle’s in park and it’s not.

Anthony: And we see a lot of problems in rollaway incidents when that kind of circumstance exists.

And that concludes our Ford section of the recalls. Jim Farley, anytime you want, we’re here for you. Next up: Honda, 406,290 vehicles. This is an alloy wheel, 18-inch alloy wheels.

Michael: This is another equipment problem.

Oh

boy. Yeah.

Anthony: So this is an optional accessory steering wheel for 2016 to 2021 Civics. And it looks like this steering wheel has a problem, a wheel nut that loosens, and that can mean you lose your steering wheel, which is funny in cartoons but not funny in real life. It looks like what they’re going to do [01:02:00] is give you a new steering wheel, but it may not be the accessory type of steering wheel that you originally purchased.

Michael: I didn’t realize this was the steering wheel. I figured this was like the wheel wheels. But anyway, next up: Toyota, 1,024,407 vehicles.

Anthony: You know what, I’m gonna go back. It isn’t a steering wheel. It’s, it is not a

Michael: steering wheel. I,

it’s not a steering wheel, it’s a wheel. Neither one is great.

Anthony: This is an 18-inch alloy wheel. There I go, not reading through all the recalls closely enough.

Okay. Our next recall is Michael Brooks, basic reading comprehension. Okay, no: Toyota.

Michael: They’re still trying to determine the root cause for me.

Toyota, 1,024,407 vehicles. Oh my God. The 2024 to 2026 Lexus RX, TX plug-in

Anthony: Hybrids. Don’t even start

that. There’s 20 different [01:03:00] vehicles. Basically it’s every Toyota from the last four years

Michael: or so. Yeah, from 2023 to 2024. And I’m scrolling. This is the Signia, the Lexuses. Oh my God, it keeps going. Yeah. This is a long

one, isn’t it?

Anthony: And pages

27 pages long.

Michael: Holy cow. I don’t even know if I can get... they’re equipped with a parking assist ECU, which is a component of the panoramic view monitor. Is this a goddamn rearview image? Goddammit. The software in this electronic control unit may cause the rearview image to freeze briefly during a backing event.

And if reverse is selected within a specific time after the ignition is turned on, the rearview image may not display on the next ignition-on if the ignition is turned on and off within a specific time. As a result, the subject vehicles may not meet the rear visibility requirements specified in FMVSS No. 111.

Anthony: Yeah. Another one. Sorry about that. That’s okay. I know how [01:04:00] much you hate those, especially 27 pages in.

Michael: Hey, I know. I was like, what’s the big takeaway? What’s the reward?

Yeah. It’s a lot of rearview camera problems for Toyota on virtually every vehicle they’ve made for the past few years. And beyond that, owners are only gonna get an interim notification late next month, and there’s no plan for when the actual fix is going to be available for owners.

Anthony: So that is a lot of vehicles out there with glitchy rear view cameras for the next few months.

Yeah, and it’s weird, ’cause it’s a unique recall in that the rearview image will freeze if you put it in reverse within a specified timeframe from ignition. It doesn’t say what that timeframe is.

Michael: Wouldn’t that

Anthony: be nice to know?

Michael: Yeah.

Anthony: A week and a half.

Michael: Yeah.

Anthony: What is this? Very strange. If you have these cars, what you can do is turn your neck and look over your shoulder. Pretty [01:05:00] effective. We used to do it back in the day. With that, ladies and gentlemen, I hope you’ve clicked like, clicked subscribe, clicked donate, clicked all the things.

Legal Implications of Vehicle Fires

Fred: Hey, can I ask a quick legal question, Michael?

Anthony: Oh, I thought it was, you don’t want my legal advice.

Fred: Michael, assuming you live in the American Southwest: a lot of those places have red flag warnings when things get dry. If you have to park your vehicle outside and it starts a forest fire, who’s responsible?

Michael: I don’t know. That determination wouldn’t be made instantly, presumably. The manufacturer would probably have to bear some sort of responsibility there. I can’t imagine they would pin it on an owner who simply parked their vehicle outside. That would be a pretty tough assignment of liability.

Fred: Oh, thank you.

Michael: I don’t have an answer, so

that’s my favorite kind of question.

Fred: There’s

Michael: no answer

Fred: [01:06:00] If I recall correctly, I think there have been forest fires started by vehicles that might have been involved in a crash. And in that circumstance you might trace it back to who was at fault in the crash. But simply

Michael: There were fires started by catalytic converters. I remember reading about those, because they get pretty hot.

parking a vehicle out in your driveway.

Fred: And then you’ve got a spontaneous battery fire.

Michael: Yeah, it’s hard to call that negligent behavior, particularly when your manufacturer has issued a notice saying you should not park this indoors to avoid the known risk there.

So that’s, that would be a difficult situation.

Conclusion and Farewell

Anthony: And with that, listeners, take that one. Think about it, and click subscribe. Tell all your friends. Bye-bye.

Fred: Thank you for listening, folks. Bye

Michael: bye.

VO: For more information, visit www.autosafety.org.