Waymo’s Woes, Tesla’s Troubles and more auto safety news.

A power outage leaves Waymos stranded physically and metaphorically as they, in our opinion, try to figure out who to blame. Tesla is finally told they cannot use the term “autopilot” to sell a feature that is not “autopilot”. Uber doesn’t seem to care if they hire criminals, New Hampshire thinks safety inspections are no longer needed, and some Senators try to make autonomous vehicles stick to an operating design domain. Plus recalls.

Support the only show that pulls the curtain back on tech-bro nonsense.

 

Subscribe using your favorite podcast service:

Transcript

Note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.

Anthony: You are listening to There Auto Be A Law, the Center for Auto Safety Podcast, with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Seminar. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Hey listeners, welcome to the Trump-branded Center for Auto Safety Podcast. No, I’m kidding. We’re still just the independent Center for Auto Safety. But if you don’t donate, we might just have to change our name and make everything golden.

Michael: Oh, come on. Is that your Kennedy Center reference this week?

Anthony: It’s everything. Battleships, Kennedy Center. Yeah. Anything stupid is what it is. But today is Tuesday, December 23rd.

Waymo’s Traffic Light Dilemma

Anthony: And let’s start off with a little company that Fred is a big supporter of, called Waymo. Those of you who may not be aware, Waymo operates in San Francisco, and San Francisco had [00:01:00] a power outage, and this caused all of the traffic lights to go out.

And so what does a Waymo do better than a human? It just stops. They just stopped at intersections and did nothing. Now Waymo says, Hey, when the power goes out, we just treat traffic lights when they’re out as a four-way stop, which could be my gaslight of the week because that sounds vaguely reasonable.

Like humans are like, oh, four-way stop. I’ve been there. I know what that means.

Michael: That’s what humans are supposed to do. What I mean, if traffic lights are out, they’re supposed to treat it like a four-way stop. Did Waymo do that?

Anthony: No, but the thing is, it’s more than that. Yes, humans do that.

And so we come to an intersection, it’s a four-way stop or the traffic light’s out. What is the number one thing all of us do? Come on, Fred. You come to a four-way stop, what do you do? Stop. Yeah. What’s the [00:02:00] next thing you do after you stop?

Michael: You look to see if anybody’s coming.

Anthony: You make eye contact with the other drivers and you have this non-verbal conversation. Hey, which asshole’s gonna go next? Every single person does that. That’s the benefit of being human. Look. Oh, are you gonna go? Are you gonna go?

Michael: Yeah. It’s a wait and see game.

Fred: It doesn’t matter what the law is. But to be clear, I’m an engineer.

So you know the story about engineers. How can you tell if an engineer likes you? He looks at your shoes instead of his own.

Anthony: But still, I would notice that you were looking at my tires.

Fred: Yeah, that’s possible.

Anthony: But I love that Waymo, as they say, we just treat it like a four-way stop, except in this case where we just gave up. Cars stopped entirely, and they said, we’re working actively with San Francisco city officials to make sure that blah, blah, blah, blah. More just bullshit and bullshit.

Fred: Yeah.

Waymo’s Operational Challenges

Fred: I don’t think they’ve figured out who to blame yet, but there’s [00:03:00] a lot of controversy about where the blame should lie.

Is it communications? Did they lose their backup center, remote supervision, whatever they call it? But the issue has arisen: what if there’s a really major event that happens? What if there’s a conflagration? What if there’s an earthquake? Earthquake, right?

What if buildings are crashing down? What do you do then? And the apologists for Waymo say it’s designed to do certain things. Designing is not the same thing as validating, right? So the design might be there, but the test that is required to make sure that it will perform as designed is not part of the plans.

They don’t do that. Apparently Missy Cummings pointed out that the TÜV SÜD audit that was recently proclaimed by Waymo did not include events like this, where you’ve got a massive power outage or [00:04:00] a massive blockage of communications. And apparently she was denied use of LinkedIn because of that.

So she’s back on now, but there are a lot of things going on. And this whole situation pointed out not only defects in the control circuits for Waymo, but also defects in how the public information is flowing from whatever their sources are for why it happened and how it happened, and which people will be allowed to report on the underlying problems. A lot is coming out. I think there’s more to come.

Michael: One thing that was interesting to me here was, rather than suspending service and taking action on their own, it required the mayor’s office. They didn’t suspend services until the mayor’s office contacted Waymo and said, hey, you’re causing all this gridlock.

You’ve gotta get rid of this, get your cars off the road until we figure this [00:05:00] out. Why is Waymo not voluntarily doing that immediately when they see that dozens of their vehicles are stopped in the middle of intersections, impeding traffic? It doesn’t speak too well to their operations that they can’t figure that kind of thing out.

Fred: And they’ll say over and over again that safety is their first priority. But clearly it’s not. Clearly, operations is the first priority. If safety were the first priority, they would’ve pulled the Waymos as soon as it became clear that they were blocking intersections all over the city. Just like their refusal to stand down their Waymo operations in Austin, when it became clear that the Waymos were violating the requirement to stop for school buses with flashing red lights.

Yeah,

Michael: Just to be clear, this isn’t like an edge case situation, similar to school buses. Traffic lights going out, it’s not super common, but it’s a relatively common experience that every driver’s encountered at some time or [00:06:00] other. And after big storms, it’s relatively common.

So it’s something that could be expected in the wake of a large power outage, and should be something that’s been considered and worked out in the design process, not something that comes as a surprise.

Fred: Hey, Michael, how many automobile accidents are there every year? Oh, there are millions.

Maybe more. Why millions? Yeah. But it’s interesting you brought up the concept of edge cases, because people say, normally this doesn’t happen, because it’s an edge case, it’s really rare. Every collision is an edge case, right? Every collision basically is something that was unanticipated by the driver or the automobile manufacturer.

So there are millions of edge cases that occur every single year. And to dismiss a collision or a crash or a critical traffic incident by saying it’s an edge case is really doing a disservice to public safety. [00:07:00]

Michael: Yeah. You could really go a lot of different ways with the edge case issue. I think the way I see it is a very narrow set of circumstances that wasn’t contemplated, or wasn’t a reasonable or common occurrence that’s relatively foreseeable.

It’s, sorry, I lost something there.

Anthony: Oh, Michael’s adjusting his headphones. I don’t see this as an edge case though at all, ’cause it says, okay, this problem happens, treat it as a four-way stop. And as I first started discussing, the problem with four-way stops is you really need humans there to make eye contact with other humans to decide, what do we do next?

Who’s going next?

Michael: Or you need a vehicle that has been designed and can handle those situations by observing those same humans.

Anthony: But even that I think is hard, because even if you can be observing... yeah. That is probably the most impossible

Michael: engineering problem. It’s hard, but it’s necessary.

If you want to be able to operate safely.

Anthony: I don’t know if it’s possible though. But [00:08:00] I’m gonna quote from this article in the San Francisco Chronicle. Waymo said its vehicles are designed to treat non-functioning traffic signals as four-way stops, but the company acknowledged that the scale of the outage created unusual conditions.

That’s bullshit. I’m in a Waymo, okay? I’ve come to a traffic light and the traffic lights are out. I have no idea that another Waymo six miles away is experiencing the same thing. Why does my Waymo care? Why would the operation of some other vehicle at a different location prevent mine from doing what Waymo said they’ve designed their cars to do, which is treat non-functioning traffic lights as a four-way stop? Who cares? There could be no traffic lights operating. Why is your system so fragile that if one vehicle runs into this edge case, all of them fail?

Michael: You’ve gotta wonder here too, what impact the power outage had on the remote operations.

Anthony: They’ve claimed it’s not remote operations at all. They’ve been very clear on that saying it had nothing to do with remote [00:09:00] operations.

Fred: Yeah, I think that’s bullshit too. You’re still looking for the right person to blame. I wanna go one level deeper on this though, please. The same article points out that many people in San Francisco are using Waymo, violating the terms of service, to send their children to school and pick up their children from school.

So let’s put those two together, right? You’ve got Waymos full of kids, violating the terms of service, who are not qualified to drive, scattered throughout the city. And then you’ve got a power outage, and all of a sudden all of those Waymos stop where they are, and your kids, who don’t really have the capacity to navigate the city safely, are now stuck in vehicles that are immobilizing traffic around them.

So there’s no public service available to rescue them, because all the intersections are blocked. How can this be a good idea?

Michael: That’s not... yeah. You’ve got a traffic jam at Child [00:10:00] Protective Services then, because there are gonna be a lot of parents in trouble.

Anthony: No, this is an amazing idea, Fred.

If anyone wants to fund Fred’s screenplay, The Horrors of Teen Waymo, go to autosafety.org and click on Donate. And I think that will encourage Fred to write this screenplay, ’cause this is a horror film waiting to start.

Michael: I was thinking more of it as like the Hunger Games type thing where the kids have to fight their way home.

Anthony: But the thing is, the kids unfortunately are just punching their phone over and over again. Why won’t it respond?

Fred: They all get, yeah. And I’ll put ’em in Teslas, where they can’t get out of the cars because the electricity has failed.

Gaslight of the Week: Tesla’s Claims

Anthony: So that brings up my Gaslight of the Week. Further down in this article from the San Francisco Chronicle, quoting: Tesla used this as an opportunity to take a jab at its competition, claiming its Full Self-Driving feature is trained on billions of real-world miles, including power outages.

And this is my Gaslight of the week because Tesla does not offer an autonomous vehicle service anywhere in the country. Nope. But, hey, [00:11:00] nice try.

Michael: Yeah. Elon has made a habit recently of poking at Waymo every time there’s an issue with Waymos, despite the giant festering sore on his forehead involving autonomy.

Anthony: Ah, hey, I jumped right into Gaslights. Gentlemen, the floor is open. Good luck. Mine was super succinct, right to the point, Michael.

Michael: I didn’t prepare one because I thought we were doing our yearly roundup today, right?

Anthony: You can jump in for your yearly roundup. An astute listener referred to this as the Skid Mark Award.

Michael: Yeah, that’s one way to refer to it. Gaseous Lanterns was another.

Uber’s Background Check Controversy

Michael: Oh, corporate PSYOPs, there’s all sorts of things I think we could call it. But if I did have a gaslight this week, it was gonna come from the Uber article on their terrible background checking to prevent sexual assault and worse. That’s horrendous.

There was a very good article from the New York Times, very well researched and long. One of the Uber folks in that article was making some [00:12:00] pretty sketchy claims around their background checking, even going so far as saying that fingerprinting was not accurate, not a good thing to do.

I think that’s almost necessary in some respects, because you can have false applicants who do have records, who need to prove their identity before they drive an Uber. You also have people who are driving Ubers who aren’t the applicant, right? They pass their phone off to someone else, who could be anyone.

And that’s clearly a problem, and Uber clearly needs to be vetting those people. They were also doing some other things, like not searching the national database, just searching in states where that person had resided. Just all sorts of problems, and not spending a little extra money to do that type of search.

Solely on cost. So that was my Gaslight of the Week. I forgot the person’s name. I think it was Mrs. Ms. Neils Les, I’m not sure. But [00:13:00] it was a pretty clear gaslighting of us, pretending that Uber’s focus is on safety when pretty clearly their focus was on profit.

Anthony: Yeah. I’m going to quote from the New York Times article on Uber that you referenced there.

The article’s titled Uber’s Sexual Assault Problem. That’s always great. Quoting: Uber rejects prospective drivers who have been convicted of murder, sexual assault, kidnapping, and terrorism. Good for you, Uber. But in 22 states, the Times found, the company approves people convicted of most other crimes, including child abuse, assault, and stalking.

So long as the convictions are at least seven years old.

Michael: Wow. Yeah. And they only use the more expensive background checking option if the state has passed a law requiring it. Otherwise, they use the cheap one that doesn’t find problems as well, which is just,

Fred: I wanna reinforce this discussion a little bit by considering our national president, Mr. Trump. Oh boy. Now, [00:14:00] according to the article, Donald Trump would not be able to drive for Lyft because of his felony convictions, and he would be allowed to drive for Uber because of the date of his convictions.

Anthony: Wait. So Lyft actually has stronger protections there.

Fred: Yeah. But now follow up with the fact that neither one considers civil penalties. So Trump being found liable for sexual assault, which was a civil matter, would not be considered by either Uber or Lyft in their consideration of his competency for being a driver, a contractor driver. Hopefully he’ll retire soon.

He can always look for a future as the driver for Lyft, or excuse me, for Uber, but you won’t see him in a Lyft, I don’t think. But again, neither one [00:15:00] considers the civil history, and I think that’s a defect in both of their vetting processes. Because of a wrongful death suit, for example, that they lose: O.J. Simpson would’ve been allowed to drive for either one.

Michael: That leaves us with a question: would you rather ride in a Waymo, a robotaxi Tesla, or an Uber driven by Donald Trump? I’m choosing a Waymo.

Fred: I’m gonna walk. It’s a hell of a choice. But presenting that choice is my Gaslight of the Week.

By the way, my head just exploded after that.

Anthony: Huh? That’s pretty good. All right. Clearly I won this week. Thanks for participating.

Michael: Yeah. The Gaslight has just become a nod to Anthony. No, in 2026 we’re gonna have to strengthen the criteria. Anthony’s judging has gotten a little outta control.

Anthony: I don’t think so. The last few weeks I have barely won. Barely. Come on. No,

Fred: But to give Anthony his due, he’s up to the standards of the New York Times, which consistently reports that there is a peer-reviewed article that [00:16:00] validates Waymo’s safety without going into the detail that those peers are all either employees or clients of Waymo.

So, Anthony, you’re following up that tradition. It’s really good precedent for you. Excellent. So go ahead. Yeah, declare yourself to be the winner every time. Perfect. Why not?

Anthony: I’m gonna send my resume to Bari Weiss and work at CBS by the end of this episode. Yeah.

Tesla’s Deceptive Marketing Ruling

Anthony: Let’s move on to a little company called Tesla.

That’s right. Hey, listeners, if you’ve not been paying attention, you just have this running in the background while you’re doing dishes, you’ve heard us talk about how the Federal Trade Commission should go after Tesla for calling things full self-driving when they’re not, autopilot when it’s not. And now a California judge ruled late Tuesday afternoon that Tesla engaged in deceptive marketing in reference to its Full Self-Driving system.

And that Tesla’s license to sell and produce cars in the state should be revoked for 30 days. However, the California DMV said, we’ll give you 60 days to comply and fix your marketing. This is a good thing. [00:17:00] It’s almost a decade late, but progress,

Michael: right? Yeah. We wrote the California DMV a letter in 2018 telling them exactly what they did here.

Obviously they could have done it a lot quicker, and if they had, they might have saved some lives. But they have finally done it, and they’ve declined to suspend Tesla’s manufacturing license. I’m sure there are some political considerations, since Tesla still operates industry in the state of California.

But they did tentatively suspend Tesla’s dealer license in the state for 30 days. And in order for Tesla to avoid that suspension, they actually have to change the name of Autopilot. They’ve got 60 days to cease using the Autopilot name to describe their ADAS features, or they have to implement changes that would bring them up to SAE Level 3, 4, or 5, which I would say is [00:18:00] impossible for them to do.

So ultimately, if Tesla’s going to satisfy these conditions, there’s a chance they’re gonna try to bullshit their way into saying they’re Level 3. There’s always that chance with Tesla. But I think ultimately they would have to cease using the Autopilot name, which, you know, frankly, they could probably do without significant penalties. They didn’t require them in this order to stop calling it Full Self-Driving, which I think is a problem. I think they should have addressed that as well. Yeah, there are still some remaining questions here.

I think overall this is a good thing that has happened. But ultimately, whether it stops some of the nonsense that Tesla’s been doing is a greater question, because I don’t see that happening. I think they could change the name of Autopilot, continue some of the crap they’re doing around robotaxis, continue taking advantage of the Level 2 loophole, and continue to insist that their vision [00:19:00] only vehicles are going to ultimately be safe.

So this is gonna be one of those situations. Plus they have an opportunity to appeal this. They have an opportunity to ask for reconsideration of the decision, which is probably not that great of an option for them. So this is gonna be another situation where we’re waiting to see what happens in the court.

Anthony: Elon and I were doing ketamine this weekend and I asked him about this. I was like, you can’t call it Autopilot anymore, what are you gonna call it? And he’s deep in a K-hole and he says, Danger Boobs 69.

Michael: That sounds like, yeah. Yeah. It does sound like him.

Anthony: But so, Full Self-Driving (Supervised), they have to change that as well?

Michael: I don’t see that in the order. Autopilot was not found to be unambiguously false by the court. But what it does is what we’ve always said it’s done. It’s a [00:20:00] very ambiguous term that gives some people the idea that the vehicles are self-driving, which is all that should really matter, right?

As long as you’re lying to part of the population that believes you, even if the other part of the population knows you’re full of shit, you still risk killing the part of the population that believed you. And that kind of marketing just doesn’t work in the auto industry. It’s something that we’ve been worried about, and we still see signs that other auto manufacturers are hedging towards it, which certainly makes us fearful.

When we start seeing people use phrases like hands off, eyes off, things like that, in a vehicle that requires your attention, taking any of those steps could be a recipe for disaster. So that’s one thing. But Tesla continues to change what Autopilot is, what Full Self-Driving Capability is.

Functionally, the terms have become somewhat meaningless, because Tesla continues to modify which systems and features [00:21:00] those names apply to. So it’s one reason why it’s sometimes hard to distinguish what’s Autopilot, what’s Full Self-Driving. You’d have to go back in time to look at what it meant at each stage.

So I’m a little surprised that the court didn’t issue a direct order on the Full Self-Driving part of this.

Anthony: So I guess we’ll have an update within the next 60 days, I imagine.

Michael: Possibly. If they go the route of appealing this, it could be much longer than that. It’s really gonna depend on which option Tesla takes, but I believe they have a certain period of time, because I’m assuming that if they appeal this, the DMV’s punishment will also be stayed until that appeal is resolved. But that’s yet to be determined.

Anthony: Fun times.

Fred: Yeah. I’m going to reclaim my option for Gaslight of the Week. Oh, reclaim. Because you just refreshed my memory, Michael, that the order, I think, distinguished between full self-driving and [00:22:00] driver assistance.

And the myth is that all these electronic devices make driving easier. But if you think about it for a moment, all of them require that the person behind the wheel pay full attention to the driving task, ’cause they need to be ready to take over at any time, right? And in any situation.

So you’ve got full attention being paid to the driving task, even if your hands are not on the wheel. But at the same time, you’ve got to monitor all of the responses of the automatic driving system or the driving assistance system, if you’re going to be using it. So you’re actually adding tasks to the driving task, because you’ve gotta do everything to drive except actually steer the wheel.

And then you gotta monitor the performance of the automatic systems, and you’ve gotta monitor the interaction between those automatic systems [00:23:00] and the human driver, in the intellectual space. So I don’t see how it’s possible that any of these can reduce the driving work if you’re going to use them safely, because using them safely requires you to do more than if you were merely driving the car. So I’m gonna use that moment to just say, this whole idea of using driver assistance systems to reduce the workload of the drivers is just wrong. It doesn’t do that. If you are using it safely, you’re doing more work, and if you are reducing the workload of the driving, then you’re not using them safely.

Michael: But you’re saying we’re not gonna be able to get that time back that’s so precious to us. Like the Rivian CEO.

Fred: No, those 32 seconds. Yeah. Yeah. That you could save.

Michael: Yeah, I agree. I think I would be a basket case driving a Tesla around using Full Self-Driving and Autopilot, because the control has been removed from me, and yet I’m still responsible for watching [00:24:00] out, babysitting a robot, which doesn’t sound like a good, fun drive through the country.

I’m sorry.

Fred: No, and the many videos that are out there of the Teslas failing to perform safely all show the drivers going, oh no, oh no, God, look at this, as they grab the wheel and try to steer back into a straight line. There’s a lot of work, a lot of intellectual work, associated with using these things if you’re going to use them safely.

So I just don’t think it’s possible to use them and also reduce the workload of the drivers.

Michael: I’m also interested, really interested actually, in when the new Texas law comes into play in January, the one that requires manufacturers of autonomous vehicles to tighten up their ships, essentially, and how this ruling plays into that.

Does Texas kind of battle California and try to loosen the reins for Tesla a little bit? There’s a lot of BS surrounding Tesla’s deployment of [00:25:00] robotaxis in Austin, and some of the new law is gonna require submissions from Tesla that I don’t know that they’d be willing to make, or could make.

And it’s gonna be really interesting to see how that plays out in Texas, given the state’s kind of bias towards corporations that reside in the state, a la Tesla. But also, a Texas-versus-California battle is always fun to watch.

Fred: Yeah, I’m not a betting man, but I’m gonna put a nickel on the idea that Elon Musk is gonna go visit his buddy in the White House and try to get this smoothed over with a presidential decree.

Anthony: While that happens, I’m more interested in seeing Michael as a basket case behind the wheel of a Tesla. Listeners, if you find that entertaining, go to autosafety.org, click on Donate. If we raise $10,000 in the next hour, we’ll put Michael behind the wheel of a Tesla and watch him lose his mind.

Thanks.

Michael: Someone will [00:26:00] do that.

Anthony: They have to do it in the next hour though.

Michael: Can you bump it up? Can you bump the money up a little?

Anthony: Oh, yeah. I want 50,000.

Michael: I need to check with my life insurance policy first.

Anthony: Okay. Okay. Let’s move on to something positive and interesting.

New Safety Legislation for Self-Driving Cars

Anthony: That’s right. There are some senators who, I guess, have listened to this show, because Senators Markey and Blumenthal introduced new safety legislation for self-driving cars.

Quoting from the press release: Senator Markey, a member of the Commerce, Science, and Transportation Committee, and Senator Richard Blumenthal today introduced the Stay in Your Lane Act, legislation that would prohibit self-driving technology in cars from operating outside the road conditions and environment for which they were designed.

Also, Fred, what else? What’s another term we use for that? No? Operating design domain. What happened?

Fred: Fred’s clearly not listening to you. Yeah, he’s not listening. Sorry, I have bluebirds coming to my bird feeder. Bit of a distraction. Wow. Blame it, blame it on

Anthony: the bluebird. Blame it on the bluebird.

Sounds like a damn [00:27:00] hippie. Yeah, exactly. Yes. This is great. They wanna make law what we’ve talked about, which a number of our guests have talked about, saying, hey, operating design domain, these vehicles should only operate within there. And what’s the big company that doesn’t operate within an operating design domain?

Fred Perkins, for the win.

Fred: Starts with T. I’m gonna go with T-Mobile.

Anthony: Yep, T-Mobile it is. Oh God, he’s still staring at the birds. Tesla.

Michael: Yeah. And to be clear, if you read the press release, they say this applies to self-driving vehicles, but it applies to all vehicles. It’s not just restricted to vehicles that are actually self-driving, ’cause then it wouldn’t apply to Tesla. But it basically means that anytime you have a partially automated vehicle, which is what Teslas are, or a fully automated vehicle, you’ve gotta be clear about what conditions and areas that vehicle can operate in. And if those conditions don’t match up to where that vehicle is, [00:28:00] it cannot function as an automated vehicle, fully or partially. It cannot do it. And if it does, it’s in violation of the declaration that the company has made to NHTSA of what the design parameters are for that vehicle. This is something that, had it been in law a decade ago, could and should have prevented a lot of the crashes we’ve seen in Teslas where they’re being operated in areas not on divided highways, on country roads, and in places where Autopilot or Full Self-Driving should never have been allowed to be turned on.

Anthony: And to remind listeners, partial automation includes things like adaptive cruise control, lane-keeping assist, correct?

Michael: It could be. It’s not just adaptive cruise control and lane keeping. If you go to the definition of it in here, it is a driving automation system: the hardware and software collectively capable of [00:29:00] simultaneously performing all of the lateral and longitudinal vehicle motion control subtasks of the dynamic driving task of a motor vehicle on a sustained basis. So I think what that means in common terms is sustained steering activity.

Anthony: So wouldn’t my lane keeping, lane centering, combined with adaptive cruise control, where essentially I’m just keeping pressure on the wheel? Yeah, could be. Yeah.

Michael: I think that could be.

Anthony: All right. We’ll keep on top of this one and let you know how this develops, but this is a good one.

And hey, we’d like to thank those senators for being fans of the show.

Michael: Yeah. And like virtually every other piece of legislation that comes out that tries to address something in auto safety, individual pieces of legislation simply aren’t making it through the House or Senate anymore.

It was bad before, but now it’s become even worse, where most of these things have to be put into a larger funding bill. And this is one that we really hope makes it into the surface [00:30:00] transportation reauthorization that’s gonna be coming up next year.

Anthony: Okay. Fred, another question for you.

What’s the big threat to America when it comes to our innovation and our progress towards anything? What is it?

Fred: the biggest threat?

Anthony: Yes.

Fred: I’m not sure. I’m gonna have to go with NutraSweet.

Anthony: Oh, aspartame. Nice. It’s just not good for you.

Michael: You’re reaching

Fred: back

Anthony: into the nineties. No, that’s the eighties. Bravo. Bravo. Thank you.

China’s Robo-Taxi Accident

Anthony: But the answer we’re looking for is China. Ah, yeah. So from carnewschina.com, an article, and this is horrible, titled Bad News: Robotaxi Causes Accident in a Central City of China, Leaves Two in Intensive Care. Quoting from the article: according to eyewitness reports and video shared on social media, the self-driving vehicle struck two pedestrians at approximately 9:00 AM on Yin Jang Road.

One victim was trapped beneath the vehicle while another was injured.

Self-Driving Car Incidents

Anthony: Nearby video footage shows the victim [00:31:00] trapped under the vehicle, wearing a helmet, with visible face bleeding, as bystanders attempted to lift the vehicle to rescue them. The only self-driving car in the US that’s pinned somebody underneath, I think, was the GM Cruise.

China and this one, they got two people at once. So they are winning. I’m so sorry.

Michael: Yeah, that was terrible. The only credit I would give here is that it looks like the vehicle did not move any further after the victim was trapped under the vehicle, which Cruise failed at.

But this is a Baidu vehicle. Baidu, but it’s operated by Hello, which is a company that’s using Baidu vehicles in its autonomous operations. Baidu has a separate autonomous vehicle program, which I believe may be one of China’s bigger ones. It’s hard to say, you can’t exactly tell from the article exactly which vehicle or what is [00:32:00] going on here, but it’s certainly a lesson in being cautious. And I think that China only suspended the company’s operational permit in the city in which it took place. So yeah,

Anthony: it’s just another example of, hey, as Mr. Perkins will say, let’s not beta test dangerous munitions on the road, right?

Fred: Yeah. I think it’s a bad idea to put lethal industrial machinery on the highway with no protection for the other highway users. It’s no less important in China than it is in the United States. So I think that the way this is progressing so far in the United States is a recipe for disaster.

Anthony: We’ll see.

Debate on Car Inspections

Anthony: Alright, moving on to something that’s also a recipe for disaster. Let’s talk New Hampshire, from an article in Jalopnik: New Hampshire wants to get rid of car inspections, if it [00:33:00] weren’t for you meddling environmentalists. Now this blows me away. The state of New Hampshire says: we don’t wanna do environmental car inspections to see if you’re rolling coal, or you’ve got a Dieselgate vehicle, or your car’s just missing a catalytic converter, something like that.

What’s surprising to me from this article, though, is that a lot of states don’t have safety inspections at all. So I’m gonna quote from the article: however, nearby Rhode Island only requires inspections every two years, and Connecticut no longer does safety inspections, only emissions. How is this?

Am I just living in my woke-mind-virus New York State, where every year I have to get a safety and emissions inspection? It’s only 25 bucks or something like that. But how do all of these states not even do basic safety inspections? Because as we’ve talked about, the safety inspections are fairly cursory.

Like they’re not intensive, they’re not calibrating cameras on modern [00:34:00] vehicles. They’re not.

Michael: They should be doing that. They

Anthony: should be doing a bunch of stuff. But the fact that so many states don’t even do a basic one, like, eh, your tires are leaking.

Michael: And I think the article even misses the mark there. They say that 27 states currently have no safety inspections.

That’s not even accurate. My count is that there are, for now, only 14 states and the District of Columbia that require a regular safety inspection. So this is something that’s been going on for years. States have been taking their safety inspections out of play because consumers complain about them.

But ultimately it’s a very minimal cost. We need better safety inspections to make sure that the technology that’s being loaded into our vehicles is working. Without frequent and regular calibration of the cameras and other sensors on the vehicle, the automatic emergency braking, your lane [00:35:00] keeping assist, and a lot of the crash prevention systems that are required to operate safely can’t operate effectively, because they are not calibrated and accurate. So if anything, right now I think states should be increasing their requirements for safety inspections, not eliminating them. But this is New Hampshire, where you’re perfectly free to live inspection-free and die.

Fred: I completely agree with you, but I’d also point out that, as cars are evolving with more and more electronic functionality, driver assistance, self-driving, what have you, we as a society are moving more and more safety-critical functions into the realm of logic rather than physical devices.

Originally, when the inspections came out, all of the safety-critical devices on the car were visible, and they were all that you needed to inspect, ’cause there were no [00:36:00] logical components, right? But now, if you’ve got a logical component, like an algorithm that’s trying to keep you safe as you’re approaching an intersection with other vehicles, and it’s faulty, you’re gonna die.

There is no state that I’m aware of, no government entity, that is right now developing a procedure for inspecting, either at delivery or at any time during the operation of the vehicle, the quality and consistency of the logical functions that are simply not visible to a human inspector.

This has gotta happen, because you’ve got hundreds or maybe thousands of critical functions that are now being handed over to computers. There’s no way for a human being to inspect those except to wait for them to fail and have a crash. Yeah, I completely agree. This is not the time to relinquish inspections.

It’s the [00:37:00] time that we have to really expand the scope of the inspections into all of the safety critical functionality that’s being included in the vehicles.
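As an illustration of Fred’s point, here is a purely hypothetical sketch (the module names, firmware strings, and manifest are all invented): one primitive a future inspection regime could use is verifying that each safety-critical module’s installed software matches a manufacturer-attested known-good hash. This doesn’t prove an algorithm is safe, but it would at least confirm the inspected vehicle is running the software that was validated.

```python
import hashlib

# Hypothetical manifest of attested firmware hashes for safety-critical
# modules. In practice this would come, signed, from the manufacturer.
KNOWN_GOOD = {
    "aeb_controller": hashlib.sha256(b"aeb-fw-v1.2").hexdigest(),
    "lane_keep":      hashlib.sha256(b"lk-fw-v3.0").hexdigest(),
}

def inspect(module: str, firmware_image: bytes) -> bool:
    """Pass only if the installed image hashes to the attested value."""
    digest = hashlib.sha256(firmware_image).hexdigest()
    return KNOWN_GOOD.get(module) == digest  # unknown module: fails

print(inspect("aeb_controller", b"aeb-fw-v1.2"))     # True: matches manifest
print(inspect("aeb_controller", b"aeb-fw-hacked"))   # False: fails inspection
```

A real program would also need tamper-resistant readout of the installed image, and as Fred says, neither piece exists in any state inspection today.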

Anthony: Michael, has this really been rejected because consumers fought back? Because for me, it’s just once every October I go to some Sal’s auto body and I’m like, hey, I need an inspection.

Great, have a seat. I’m out in like 20 minutes.

Michael: Yeah, it’s not that, unless your car fails, right? Or unless you go to, and this does happen, there are folks who operate in the inspection area who, every time you go to them, will find a problem, because they want to replace that part on your car and charge you for it.

So some of that, if it’s not cracked down on, can result in consumer dissatisfaction. But I think mostly this is just a political nod to some of the more libertarian voters out there who don’t want to have to interface with the government in any shape or form, unless they want roads to drive on, or blah, [00:38:00] blah, all the things that government actually provides that they like to ignore in their arguments.

But ultimately we’re just headed down a bad road here, because cars are becoming more and more complicated from a safety perspective, and individual owners and consumers are less able to evaluate them. Anyone can evaluate whether there’s a crack in your windshield.

You can evaluate whether or not your brake lights or your horn are working, or some of the simple little safety-inspection things they do. No one can evaluate, by looking at the vehicle, whether your cameras are calibrated. That’s going to require being hooked up to a machine somewhere.

And so this is just a very bad direction.

Anthony: So I think our number one threat to this country then is the state of New Hampshire. That’s right. New Hampshire. All right. This is getting ridiculous.

AV Regulators Checklist

Anthony: Fred, are you ready for continuing with the AV regulators checklist?

Fred: Sure. Put up a few more. There we go. And a note to Junko Yoshida: I [00:39:00] really want you to call me so we can dedicate one of the episodes of your podcast to this whole idea of setting up a set of informational questions for the regulators to use. We’re all ears. Come on, Junko. Give us a call.

Anthony: Who the hell is Junko Yoshida? Was he a pitcher for the Dodgers? I’m sorry, who is Junko Yoshida?

Fred: Oh, shame on you. I’ll educate you later. But she’s a podcaster who’s got a lot of insights into the AV industry. She interviews people frequently. Phil Koopman’s on there a lot. Ah, Mike Nelson’s on there a lot.

Why aren’t you two on there a lot? She’s actually a very good journalist, but so far I haven’t been famous enough to attract her attention. But come on, Junko, give me a call.

Anthony: Maybe you gotta call her.

Fred: Don’t be shy. I’ve tried that. That hasn’t worked. Next up on our checklist. Remember, this is a checklist for use by non-technical people, to start to [00:40:00] force companies to divulge how much safety they’re building into their self-driving vehicles.

Next up is number 24: has the applicant shown that safe ingress, egress, and expedited egress on demand is available regardless of the user’s race, sex, or physical or mental incapacities? So if you’re using one of these vehicles, a robotaxi for hire, are you sure you can get the hell out of it if you have to?

We’ve seen failures of electricity where the doors don’t work in self-driving vehicles. We’ve seen Teslas wedged up against embankments, where the electricity fails and people die from, I guess what they call, thermal injuries. That’s not good, folks. So you need to make sure that if you’re threatened, you can get the hell out of the vehicle and get yourself to safety.

I think that’s fundamental in any safety analysis. Next one is [00:41:00] number 25: has the applicant shown that unaccompanied underage users are prohibited from use? And this goes back to the New York Times article we talked about earlier, where, you know, a lot of the parents in San Francisco are using the Ubers, not the Ubers, the Waymos, to send their kids to and from school, in violation of the terms of service.

Also a really bad idea, particularly because they can sometimes fail spectacularly, and then your kid’s adrift. So next up after that, number 26: has the applicant designed for and shown that the intended computer-driven vehicle user’s destination may be safely and expeditiously modified by its user if, [00:42:00] in their sole determination, the original destination is undesirable or unsafe?

Anthony: what do you mean?

Fred: So, say that you have programmed in that you’re going to 47th and Third Street, and the vehicle takes you to 47th and Third Street, and in front of 47th and Third Street, let’s just say there are some militia members carrying automatic weapons. Do you really want to get out there? You should be able to redirect the vehicle. Or maybe something’s on fire, or there’s some kind of catastrophe, or your plans just changed,

Michael: right?

Fred: Your plans just changed, right? It doesn’t have to be

Michael: Something insane like that. Or the Waymo driving into a shooting gallery,

Anthony: right? Or you donned the wrong outfit for your militia.

Fred: For whatever reason, right? You, the user, should be able to say, nah, I don’t wanna go there anymore.

I want to go somewhere else. I don’t know that that capability is in any of the robotaxis right now. I don’t believe it is. But it should be yours. Again, a fundamental aspect of your control over your own health and safety.

Anthony: I like that one a lot, because a lot of times mapping software is not up to date.

Like, you’re going to some restaurant, and you get there and it just doesn’t exist anymore, or it’s closed today. [00:43:00] You know, that’s a very simple one. But also, when you go to your militia meeting, make sure you’re wearing the correct fatigues for your particular hate group.

Fred: Because it’s embarrassing.

I may have worn that one. I’m not sure. Okay, next one up, number 27: has the applicant proven that computer-driven vehicle commands and redirections are secure, authenticated, and legal? So let’s say that you’re a bad person and you wanna deliver a bad package to a location, right? So you get in the Waymo, and you put your bad package in the Waymo, and you send it on its merry way.

How is that controlled, right? How do you make sure that’s not gonna happen?

Anthony: That’s an interesting idea. Can I actually order a Waymo and just not get in it, but just put a box in it and send it off, a box and my phone? ’Cause it doesn’t know there’s a person in there.

Fred: That’s Nuro’s business, right?

Nuro is in the business of sending packages [00:44:00] unaccompanied by any human being to a predetermined location.

Michael: Look, the same thing goes for the post office, right? As long as you click the box saying this is no liquid, hazardous, perishable, or whatever,

Fred: they’re gonna send it, right? Yeah. And Uber Eats is advertising and publicly declaring that they’re going to use these self-driving vehicles to deliver Uber Eats.

Boy, there’s a lot of room in there for mischief, isn’t there?

Anthony: But getting back to the post office, they actually have massive systems in place to check packages for safety. And how often do you discover a mail truck exploding?

Michael: I mean, there have actually been a lot of mail trucks catching fire in the last few years.

That had nothing to do with the packages, that was a design defect. That’s

Anthony: a very different thing. Okay.

Michael: But how often is the mail used to ship illegal items? All the time.

Anthony: That’s, again, the Mitch Hedberg joke: my postal worker doesn’t know he’s also a drug dealer.

Michael: Yeah.

Anthony: Yeah. But, oh, I wonder, can I just [00:45:00] get a burner phone, put that in my Uber or Waymo, and just send off packages? Oh, look at this, this is great. I think Waymo would

Michael: catch you, right? Because we know they’re monitoring to make sure there’s an actual person in the car all the time.

And if that person’s having a baby, there are even other steps they could

Anthony: take. Oh, what if I put a baby in the box? Alright, Fred, go on. There’s

Fred: other dimensions to this, right? You would want to know that the instructions you’re giving the vehicle are secure. And that they’re authenticated, right?

Because you are giving the commands. You don’t want Anthony to give the commands. You want yourself to give the commands, right? So they need to be authenticated, that the person who is supposed to be controlling it is in fact controlling it. And they need to be secure, because you don’t want somebody who is not intended to control the vehicle jumping in to control the vehicle.

Michael: Yeah. That’s how I saw this, as more of a cyber security thing. You have to show that the commands the computer-driven vehicle is receiving are actually coming [00:46:00] from either the passenger, or the person in charge of their ride through the app, or the remote operations center, or whoever is at the Waymo headquarters who has some type of command authority, and not from an outside party.

Fred: And if the vehicle is configured so that the remote operations center can override the commands given by the occupant of the vehicle, this is a real problem, because somebody else can jump into the communication link and override the commands of the person who’s in the vehicle. So I think this is a sleeper issue, but it’s very important.
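A minimal sketch of the authentication idea Fred and Michael are describing, assuming a shared-key scheme (the sender names, keys, and command format here are hypothetical, not any vendor’s actual protocol): the vehicle accepts a command only if it carries a valid message authentication code from a known party.

```python
import hmac
import hashlib

# Hypothetical: each authorized commander (rider app, remote operations
# center) shares a secret key with the vehicle.
KEYS = {"rider_app": b"rider-secret", "remote_ops": b"ops-secret"}

def sign(sender: str, command: bytes) -> bytes:
    """Tag a command with an HMAC-SHA256 under the sender's key."""
    return hmac.new(KEYS[sender], command, hashlib.sha256).digest()

def accept(sender: str, command: bytes, tag: bytes) -> bool:
    """Vehicle side: accept only commands that verify under a known key."""
    if sender not in KEYS:
        return False  # unknown party: reject outright
    expected = hmac.new(KEYS[sender], command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)  # constant-time compare

cmd = b"reroute:PigglyWiggly"
print(accept("rider_app", cmd, sign("rider_app", cmd)))  # True: authentic
print(accept("rider_app", cmd, b"forged-tag"))           # False: rejected
```

A real system would layer on encryption, replay protection, and a policy for who may override whom, which is exactly the sleeper issue Fred flags when a remote center can outrank the occupant.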

Anthony: This is a good point because say I get in my Waymo and I want to go to the Piggly Wiggly, but somebody takes over the center and next thing I’m at Safeway.

Fred: That’s right. You don’t want that to happen. No, Piggly Wiggly has gotta be secure. Alright, and the last one for today, number 28: has the applicant shown that the computer-driven vehicle occupants are adequately protected from harassment, [00:47:00] entrapment, and threatening actions by unauthorized, unwelcome, or threatening third parties?

So how secure are you in the vehicle? We’ve seen reports of people being harassed by other people outside of the vehicle, putting a cone in front of a vehicle to disable it while there are people inside who don’t care to be trapped that way. We just talked about how communications can be intercepted by remote users to control the progress of the vehicle.

So we think it’s a minimum that the person who is asking for permission to use this in public should have to show that the occupants are adequately protected against plausible threats. We don’t know of that happening, so I’m gonna leave it there.

Michael: And adequately is probably the big word there because, there’s obviously some threats that, we can’t protect ourselves from even in cars.

Sure. But there are [00:48:00] situations we’ve seen, the guy in the trunk of the Waymo, we’ve seen a number of them where people were standing in front of the cars to harass female passengers. Who knows what else. But that’s certainly a problem. It’s where the vehicle’s safety consideration not to run over a person puts the passenger in a bad situation, because the person that the vehicle isn’t running over is the one doing the harassment or the entrapment.

So it’s a very sticky situation. And in that case, adequately protecting the passenger ultimately, I think, means preventing unauthorized access by that threatening party, and contacting the police. I think the vehicle should have a hotline for that type of thing, so they can immediately contact law enforcement when those types of situations occur.

Fred: That’s another good idea. Junko, give me a call to have this on your podcast. And Anthony, how can people find this list?

Anthony: They can go to autosafety.org [00:49:00] and click on, is it AV Checklist? I gotta look it up, I gotta remember. I think it’s AV Checklist. Yeah, it’s AV Checklist.

Michael: Or just autonomous-vehicle-checklist. Yeah.

Anthony: In the URL, or use our search function. It’s wonderful. You could get it printed on a t-shirt. We don’t sell that, but I’m sure you can just copy and paste it. Where’s our

Michael: merch?

Anthony: We don’t have merch we should have merch.

Hey, listeners, if you wanna get auto safety merch, send us a message. I’ve got some great ideas for it. I saw, years ago, Axl Rose wearing a t-shirt with the Unsafe at Any Speed book cover on it. And I was like, damn. Yeah, I know. I reached out to the designer and was like, hey, I know a guy who wants to say thank you.

Never got back to me. Hey listeners, before we jump into recalls: it’s that time of year, that fundraising-push time of year. So if you’d like to go to autosafety.org and click on donate, that would be great. And if you don’t wanna do that, no matter what, listeners, if you’re on our email list, you’re gonna get bombarded with emails asking for donations.

And look, if you can [00:50:00] give, great. If you can’t, just click delete. No need to unsubscribe, because starting in January we’re not gonna hit you with these day after day after day. We’re required by law to bombard your email box with these, starting a couple days from now.

Michael: What law? The law of nature?

Anthony: The law of we-have-to-pay-bills. Yeah. Okay.

Recall Announcements

Anthony: And with that, listeners, let’s jump into recalls. The first one is almost gaslight-adjacent. This is great. This one hasn’t even made it to the NHTSA website yet. It’s Zoox. That’s right. Zoox recalled certain ADS-equipped vehicles with software versions released before December 19th,

adding that at or near intersections, Zoox vehicles may cross the yellow center line and drive into or stop in front of oncoming traffic, increasing the risk of crash. My favorite part, though, is the next line. This is from a Reuters article: the company has updated the ADS software free of charge, it added.

That is the dumbest thing I’ve [00:51:00] ever seen. First of all, yeah, it’s a recall. It sounds like they,

Michael: they took a talking point from your average recall, which just means the owners get the recall repair free of charge, and stuck it in here where it makes no sense, because there is no charge to Zoox customers for repairs or recalls.

Yeah, exactly.

Fred: Also, there are no customers, right? Yeah. All these Zooxes are owned by the company, so how much is it gonna charge itself?

Anthony: That’s a good point. With Jeff Bezos behind the wheel, the financial chicanery could be impressive.

Michael: Yeah. So basically, instead of taking a right turn into your right lane, the vehicle is crossing the yellow line, or whatever lines you have where you are, into the opposite lane,

and then stopping in front of oncoming traffic. So a really bad thing, and something that has to be fixed quickly. But we’re still concerned with Zoox, because they’re getting exemptions, and, I think it was about a week ago, they said they’re gonna start charging customers in [00:52:00] January, which means they are anticipating that the Department of Transportation is going to give them their Part 555 exemption,

which means they’ve bought into the argument that these vehicles are as safe or safer than vehicles that can be evaluated under the Federal Motor Vehicle Safety Standards. I don’t know where they would’ve gotten that information so early in the process, before that exemption has been decided on, but they seem pretty confident.

So we’ll see what happens in January.

Anthony: Self-certification, it’s an amazing thing. Next up...

Michael: No, this is an exemption, right? They can’t even self-certify. They can’t meet the FMVSS. They tried to do that. That was a big gaslight by them. They basically said, oh, we’re just as safe as the FMVSS, so we’re cool,

and just pretended that they had self-certified their cars. DOT said no and opened an investigation. And now, in the new administration, they’ve just decided to hand them an exemption without any sufficient proof of safety. So we’ll see how that goes.

Anthony: I know how [00:53:00] they did it. They promised that all future Zooxes will be painted gold,

Michael: right?

Or they helped pay for the New East Wing ballroom.

Anthony: Oh, there you go. Next up: BMW, 36,922 vehicles, the 2025-2026 BMW X3. This safety recall involves the steering system software, which may not be sufficiently robust. Ooh. If one of the two channels within the steering torque sensor malfunctions while the vehicle is at a standstill, starting up, or in drive mode but stationary, the software diagnostics may not appropriately detect this condition.

In these rare cases, unintended steering wheel movement may occur. So wait, I’m not moving the car, I’m stationary, and the wheel’s gonna start turning itself?

Michael: Yeah. I’ve never heard of unintended steering before. This is a new thing, and it’s brought to bear by the fact that vehicles now have the ability to steer themselves in many cases. This one [00:54:00] looks like, I’m thinking of it as, you’re sitting at a stoplight, not moving, and due to this defect the vehicle turns your tires.

And so then, when you proceed forward, you’re gonna be turning in whichever direction the vehicle has spontaneously decided to move your tires. So that’s clearly a problem.

Anthony: Oh boy. When will owners get an update?

Michael: It sounds like they’re gonna get it in early February.

It’s gonna be an over-the-air update, since they’re just gonna be updating the steering software. Will it be free? Yes, it’ll be free. And owners who don’t wanna do the over-the-air update have the option to go into their dealership. Excellent.

That’s gonna happen on Groundhog Day.

Anthony: Wonderful. Moving on: Honda, 70,658 vehicles, the 2016 to 2020 Acura ILX. Residual plasticizer in the brake reservoir hose can contaminate the brake fluid, causing the brake [00:55:00] master cylinder’s secondary cup seal to swell and deform. Radiant engine heat further expands the seal, reducing its ability to seal and allowing brake fluid to bypass it during slow brake pedal applications.

So just slam on the brakes, never brake slowly, right?

Michael: No. Obviously, if your brake master cylinder fails, you’re gonna lose braking. Big problem. It’s hard to interpret this Part 573 report. They’re gonna replace the master cylinder with an improved part, but they’ve only scheduled an interim owner notification date, again for Groundhog Day.

But that’s not the final owner notification date. It looks like they’re going to be designing an improved master cylinder, so this one may take a while. Owners should be on the lookout in February for their interim notification, which is basically just gonna tell you you’re gonna wait some more.

We don’t really have a final date for a fix here. So I think it pays these folks to be very aware that they could lose braking during [00:56:00] that period. That’s pretty scary.

Fred: Yeah. Someone in Marysville is not gonna have a Christmas vacation this year.

Anthony: Why is this not a stop-driving-this-car notice?

Michael: I’m assuming because it’s only happened in such rare occurrences that they don’t believe it’s an ongoing threat. They’ve only received a few complaints, and it doesn’t look like they’ve seen any accidents or injuries related to this. Perhaps that’s the issue. Perhaps you retain some of your braking, and this really only affects the stopping distance.

And they note stopping distances increased. So that could be the issue. You still have brakes, but your stopping distance isn’t going to be what you expect.

Anthony: Next up: Ford. Wow, Ford, 272,645 vehicles: the 2025-2026 Ford Maverick, 2022 to 2026 [00:57:00] F-150 Lightning, and 2024 to 2026 Mustang Mach-E. They may have a transmission that may not lock into park in some instances.

Oh. When tested in accordance with, eh, some sort of standard, each vehicle must not move more than 150 millimeters on a 10% grade when the gear selection is locked in park. Binding of the vehicle’s IPM hall against the IPM slider component may impede the slider from returning to its fully engaged park position.

I love that they use millimeters in this. Take that. It’s

Michael: a federal motor vehicle safety standard, so they do use the metric system communists, but that’s what scientists use around the world. And it just makes more sense to do it that way. And I believe most of the, most of in all of the engineering standards are metric as well.

So it makes a lot of sense to be consistent across the board there. I think you could run into some real problems otherwise. What was that we talked about? Was it the Saturn rocket, Fred, where they used the wrong [00:58:00] conversion or something? Oh yeah, metric to our system.

Fred: That was a mission to Mars where they neglected to convert between English and metric units.

And oops, it missed Mars by a few million miles.
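The failure Fred is describing, the Mars Climate Orbiter, came down to one side of an interface reporting thruster impulse in pound-force seconds while the other side read the number as newton-seconds. A sketch of the missing conversion (the numbers here are illustrative, not the mission’s actual values):

```python
# 1 pound-force second = 4.4482216 newton-seconds
LBF_S_TO_N_S = 4.4482216

def impulse_to_newton_seconds(value_lbf_s: float) -> float:
    """Convert an impulse reported in lbf*s into N*s."""
    return value_lbf_s * LBF_S_TO_N_S

reported_lbf_s = 100.0                                   # illustrative ground-software output
correct_n_s = impulse_to_newton_seconds(reported_lbf_s)  # what navigation should have used
buggy_n_s = reported_lbf_s                               # the bug: number read as if already N*s

# The modeled thrust is off by the conversion factor itself, ~4.45x:
print(round(correct_n_s / buggy_n_s, 4))  # 4.4482
```

Agreeing on units at every module boundary, or encoding them in the type system, is the standard defense against exactly this class of mismatch.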

Michael: So it looks like this is just gonna be a software update that can be deployed over the air. The vehicles involved here are, I think, all the relatively new hybrid and electric Fords, so that makes sense. But it doesn’t look like they really know when this one is going to occur.

It looks like they’re gonna start mailing owner notifications February 19th, but they’re not gonna finish till May 12th. So they appear to be struggling with an over-the-air update that takes three months. Sounds a little odd. This is Ford’s 140th vehicle recall of the year.

Anthony: So if I, or any consumer, were to go into a Ford dealership and buy a new Ford, Michael, would your [00:59:00] recommendation be to just buy it and leave it there?

Michael: That’s funny. My recommendation would probably be not to buy it in the first place. Although, who knows? There is a world in which all of this safety recall focus means that for Ford, moving forward, quality may be job one. You never really know in the auto industry.

Anthony: Last recall: Chrysler, 52,565 vehicles, the 2025 Ram 2500 and the 2025 Ram 3500. Some of these vehicles may have been built with an ORC module, which may go into an offline state while driving. What is an ORC module?

Michael: That is the occupant restraint controller, and that controls at least two, and in this case, it looks like three, things that are very important.

It controls airbag deployments. It controls your pretensioners, which make your seatbelt function properly and safely. And in this case it also has something to do with the [01:00:00] electronic stability control. So there are a couple of problems here. They’ve got non-compliances with FMVSS 126, which is electronic stability control, and FMVSS 208, which is occupant crash protection.

And I don’t think any of us want to imagine the situation in which we’re involved in a potential crash and our electronic stability control, airbags, and pretensioners all fail in that moment. That is a very bad thing. This one looks like they’re gonna remedy it pretty quickly, though. You should be able to get a repair by mid-January.

Listener Engagement and Wrap-Up

Anthony: And with that, listeners, if you enjoy this show, or even if you don’t enjoy this show, if you just enjoy safety, if every time you sit down in your car you go, ah, I have an airbag and a seatbelt and electronic stability control, and my Ford’s not gonna roll over on me, and my Goodyear tires won’t explode randomly, then go to autosafety.org and click donate.[01:01:00]

Fred: Hey, Anthony, final question before we go. Yeah. How would somebody give us a message about plaudits, or wry observations about our parentage, or anything else they’d care to discuss with us?

Anthony: I don’t want to talk to you about your parents. You can send an email to contact@autosafety.org.

Yes, our spam filters are aggressive. Michael’s generous.

Michael: Yeah, and I think we also have a feedback page for the podcast. There is a

Anthony: feedback page as well. I didn’t know there’d be a quiz guys,

Michael: I think that goes directly to Anthony

Anthony: Oh, it might, actually. Yes, if you go to autosafety.org, just

Michael: if you just want to get at Anthony, it’s there.

There you go.

Anthony: There you go. I’ve spammed myself with it. You go to autosafety.org, go to the podcast page. At the bottom of that page is a handy-dandy little form, and you can go ahead and send us a message and be like, let’s talk about your parentage.

Michael: Alright folks.

Anthony: Alright, thank you. And we’ll be back next week, which will be the end of the [01:02:00] year, and we’ll do an end-of-the-year wrap-up, maybe.

I don’t know.

Michael: Yeah, we’ll see. We might actually get to our gaslight of illumination roundup

Anthony: next week. Ooh. Alright. Take care everybody. Bye. Alright,

Michael: bye-bye. For more information, visit www.autosafety.org.