What is a safety? Waymo don’t know.

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.

[00:00:00] Introduction and Welcome

[00:00:00] Anthony: You're listening to There Auto Be A Law, the Center for Auto Safety podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Welcome back. It's that time again to pull over on the side of the road, eat your sandwich, drink your coffee, because don't do that while driving. Come on, I've done it, but, you know, I'm not as smart as you. Welcome to another episode of There Auto Be a Law. Good morning, folks.

[00:00:43] Congestion Pricing in New York City

[00:00:43] Anthony: Today, we're going to start off with an issue near and dear to my heart, congestion pricing, because New York City decided to finally implement congestion pricing.

That is, if you go below 60th Street in Manhattan, the city will ding you nine bucks or something like that, which is great because I don't live anywhere near there. And the idea is that this will pay for mass transit funding, clean up pollution, make the streets safer for pedestrians and drivers, and move traffic along.

And so far, I think it's going great. Apparently traffic has dropped, and what's left is moving at a better rate. The Wall Street Journal has an article, and my favorite quote in the article is from Angel Gilliland from West New York, New Jersey, who works in hospitality in Midtown. She's taking the bus now instead of driving.

Now, okay, so West New York, from where I am, I can literally point to it. I can see it. There's this giant bridge right next to it. You literally just get on a bus. I don't know why people would drive that. This is just, this is my favorite part of congestion pricing: people realizing, oh, I have to sit on a bus, and that's actually quicker and cheaper than me driving.

Sorry, that's just me. End of rant.

[00:01:57] Michael: So, how does it work? Is there basically a mapped-out zone, and any vehicle entering that zone has to have an E-ZPass or something that's charged automatically?

[00:02:08] Anthony: This is an excellent question, Mr. Brooks. So, they have these cameras and stuff all over the city, and if you have E-ZPass, they hit that.

If not, they have license plate readers, and they hit that. But, as we know, a lot of people remove their license plates, which is illegal. Or they obscure them, or cover them up, or leave them quote-unquote dirty. Also illegal. So that's how it's going. Hopefully it'll stop the scofflaws.
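
As a rough sketch of that capture flow, transponder first, plate reader as fallback, here's a toy version; the function, the names, and the flat nine-dollar toll are illustrative assumptions, not the MTA's actual system.

```python
# Toy sketch of the tolling flow described above. Purely illustrative:
# not the MTA's actual system, and the $9 flat toll is an assumption.

def charge_vehicle(transponder_id=None, plate_text=None, toll=9.00):
    """Return a description of how (or whether) the toll gets billed."""
    if transponder_id:
        # E-ZPass case: the gantry reads the transponder directly.
        return f"Charged ${toll:.2f} to E-ZPass account {transponder_id}"
    if plate_text:
        # Camera case: a license plate reader triggers a bill by mail.
        return f"Billed ${toll:.2f} by mail to plate {plate_text}"
    # Missing or obscured plate: the scofflaw case Anthony describes.
    return "No transponder, no readable plate: toll goes uncollected"

print(charge_vehicle(transponder_id="EZ123456"))
print(charge_vehicle(plate_text="ABC-1234"))
print(charge_vehicle())
```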

[00:02:35] Fred: I've got a question for you.

I don't know if that's a once-per-day charge, or is it every time you cross the boundary?

[00:02:42] Anthony: You know, I don't know. I think it's a once-per-day charge, but I'm not positive. I won't drive down there. I mean, I'd never have any need to, but I'm not sure. Cause, like, if you live in that zone, when you turn your car on and drive around, do you get hit?

I don't know.

[00:02:59] Michael: Yeah, I've got to think it's once per day. I mean, imagine being a rideshare driver who continually has to cross that line every day. That could get, you know, significantly more expensive than even what you're collecting from fares.

[00:03:14] Anthony: Right. So apparently, rideshares and taxis pay a reduced rate.

And I think for them, it's definitely when they enter and exit the zone. Everybody else, it must be a once-a-day thing. It's an excellent question; I don't know. Listeners, do you know? Somebody out there does.

[00:03:32] Michael: You know, they've got data already showing that there's, you know, less traffic.

You know, I think traffic on some streets has dropped by a third, and some are seeing, you know, 15 to 20 percent drops in travel times. So it looks like it's working.

[00:03:45] Anthony: Yeah, it's great. It's just the state of New Jersey that's very against this, because they want some of the revenue. That's really their complaint. They're like, yeah, you're hurting New Jerseyans who drive into New York City.

There are a ton of mass transit options. I know if you live somewhere else in the world, mass transit is not really a thing, it's a myth. But from New Jersey into New York City and vice versa, there are trains, there are ferries, there are buses. There are a ton of options, and they're all actually run really, really well.

And they’re inexpensive. So the state of New Jersey is just like, Hey, give us some of that sweet money.

[00:04:18] Fred: When I was watching Seinfeld, they also had a rickshaw option. Is that a thing, or is that just for Seinfeld?

[00:04:27] Anthony: I think it was a thing briefly, but then, it's just cruel. So I don't think they do it anymore, because they're strapping in the homeless, right? Yeah. Yeah.

[00:04:37] Michael: Well, it's also, you know, from a safety perspective, it would be interesting to see what happens here, because we've seen a lot of studies that suggest that congestion is actually safer in some ways.

So it makes you wonder if, you know, less traffic means more space for higher speeds and maybe negative safety outcomes. So I guess we'll see how this proceeds.

[00:05:02] Anthony: Yeah. Cause prior to this, New York City had the worst traffic in the entire country, I believe. They would measure, like, how long it would take you to go across town, and basically you start on November 1st and you finish on October 20th.

Like, it's just insane how bad it was. So this has apparently improved that greatly. Hopefully it reduces pollution, makes the streets safer for most New Yorkers, who are walking, and keeps the bus lanes clear. So I'm all for it. Fred, we're going to do it to your city next. I know you live in the country.

[00:05:33] Fred: I'm gonna drive down there sometime just to see how I get pinged.

[00:05:38] Anthony: Okay. I think that's an excellent idea.

[00:05:42] Automated Crash Systems and Pedestrian Safety

[00:05:42] Anthony: Hey, let's go right into the world of ADAS and robocars and all the cool stuff that's coming in the future. The Insurance Institute for Highway Safety came out with a report saying high-visibility clothing makes automated crash systems worse.

So we've talked about this, where automatic emergency braking and these forward-looking camera systems don't see people very well; especially, we've seen they don't see people who are Black or wearing dark clothing very well. And it turns out the same is true here: wearing bright clothing that makes you stand out to humans makes you almost invisible to automated crash systems.

So, maybe let’s take this back and let’s not put people anywhere on the road. Is that the solution now?

[00:06:32] Michael: No. I mean, the solution is figuring out pedestrian automatic emergency braking systems that cannot be confused by reflective clothing. You know, this is an issue we've seen.

I'm assuming this is an issue that's related to the cameras and the sensors that are used to detect pedestrians, and that they're being confused by the reflected light from these, you know, you see them on your jackets or your dog leash or your shoes, you'll have certain reflective parts.

And we've seen, for instance, Teslas become very confused by flashing lights. That's why they've continued to run into emergency responders and police cars and that sort of thing, because they have trouble detecting certain types of reflected light that's coming back to the cameras.

So this isn't totally unexpected, but it is kind of scary if you're a pedestrian. You know, if you're wearing high-visibility, reflective clothing to allow humans to see you better while you're running down the road, then you're also exposing yourself to potentially not being seen by the computer vision that's going on in these cars, and possibly subjecting yourself to being hit by a vehicle that can't detect you.

So this is something that has to be figured out. You know, we've got to figure it out.

[00:07:59] Fred: The article, if I remember correctly, said that the one car they tested that had an active LIDAR passed the test and was not confused by the lights. So I think what we're looking at, basically, is that the processor is looking for an intermediate range of illumination, which is representative of ambient light being reflected off of an object, and they reject anything that's out of range, either too bright or too dark,

as being irrelevant. But the LIDAR has its own light source, so that becomes a different calculation if you've got the LIDAR. Right? Do I remember that right? I think they looked at a Honda, a Toyota, and a Subaru, and only the Subaru had the LIDAR. And so the Subaru passed. Am I mixing things up in my head, or is that right?
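
A toy sketch may help here: a camera pipeline that only trusts an intermediate band of brightness can reject retroreflective strips as glare, while a LIDAR, supplying its own illumination, is unaffected. The thresholds and logic below are illustrative assumptions, not any vendor's actual detection system.

```python
# Illustrative sketch of the failure mode Fred describes: a camera-based
# detector that only trusts pixel returns in an "ambient reflection" band
# can drop pedestrians who are too dark OR too bright (reflective strips).
# The thresholds are invented; no real AEB system is this simple.

LOW, HIGH = 40, 200  # assumed 8-bit brightness band for "normal" objects

def camera_detects(pixel_brightness: int) -> bool:
    # Returns outside the band get rejected as noise or glare.
    return LOW <= pixel_brightness <= HIGH

def lidar_detects(pixel_brightness: int) -> bool:
    # LIDAR supplies its own illumination, so the target's brightness
    # under ambient light doesn't matter for detection.
    return True

for scene, brightness in [("dark clothing at night", 15),
                          ("gray jacket", 120),
                          ("retroreflective strips in headlights", 250)]:
    print(f"{scene}: camera={camera_detects(brightness)}, "
          f"lidar={lidar_detects(brightness)}")
```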

[00:08:50] Michael: No, I think you got it right. Well, it was a Honda, a Mazda, and a Subaru that they looked at. And the Subaru came to a complete stop without hitting the dummy in every trial except for one, and that was when the dummy was wearing reflective strips and the roadway was illuminated. But even then it slowed by about 80 percent, so it did at least detect something.

[00:09:12] Anthony: Yeah, I'm not sure it actually had LIDAR. I think it was just a better system on that car. Maybe it has LIDAR. But so, a lot of these cars, Fred, they have radar and ultrasonic sensors. Does your clothing choice affect radar? I mean, short of wearing, like, some sort of iron oxide clothing.

[00:09:30] Fred: It can, but I don't think it's a big determinant. You know, unless you were enclosing yourself in metal, but that's not going to happen.

[00:09:41] Anthony: So this is a safety feature, like automatic emergency braking, for example. That's been on cars for a number of years now, and they put this out there like, hey, we have cameras and we've got radar and we've got ultrasonic and stuff.

And this is clearly something where they've said, hey, let's put it out without testing it. Because how else would you put this out in the world and it can't see a person? How does that happen?

[00:10:12] Fred: Well, I think you pointed out some of the problems of letting developers self-certify and determine their own acceptable criteria, rather than having government standards.

[00:10:26] Anthony: But no criteria.

[00:10:28] Michael: Like, there are no criteria. I mean, there is a voluntary agreement around automatic emergency braking that the industry, you know, committed to on I think 99 percent of their vehicles, but it's only up to 25 miles per hour, trying to avoid striking the back of a vehicle. So pedestrians weren't even included in that.

It's not going to be until 2029, when the new automatic emergency braking rules hopefully take effect, that we're going to see actual standards come into play that dictate how pedestrians are going to be treated by automatic emergency braking. So, right now, there's nothing there. They can put almost anything on a vehicle and call it pedestrian automatic emergency braking until 2029.

[00:11:12] Fred: Well, I guess I wasn't speaking clearly. The companies have their own criteria. So there are no requirements imposed on them, but they do have their own internal criteria, which can be whatever they want.

[00:11:25] Anthony: And the criteria just seem to be a marketing checkbox.

[00:11:29] Fred: Yes.

[00:11:31] Anthony: Feature. Yeah. Okay. I’m going to jump down a little bit of a rabbit hole.

Cause I just completed some sailing courses, and boats at sea, they all have transponders. Well, they don't all have to, but most of them have transponders saying, hey, this is my boat, this is where I'm at, and this is my course and my speed and my direction. So other boats can say, hey, and you can set alarms saying, oh, this boat's within a mile of me.

I should keep eyes on it. And they also have things like, if you're a smaller vessel, you can put up a thing called a radar reflector, which makes your radar signature bigger. All these kinds of things. Why don't we do this with cars?

[00:12:08] Michael: Well, I mean, that's what vehicle-to-vehicle technology would be, so that cars can speak to each other and warn each other of their presence. You know, I think in boats, even if you're on the autopilot system, you still need a captain paying attention, right, to slow or to change the boat's course when it detects something on radar that might be on a collision course.

Is that right?

Yeah.

[00:12:32] Anthony: I'm not sure what the US laws are, because the US laws are pretty lame and weak, but internationally, yes, you need to have somebody who has eyes on all the time.

[00:12:41] Fred: Unless you're running a Russian freighter, in which case you just turn it off.

[00:12:46] Anthony: Well, yeah, of course

[00:12:47] Fred: you turn

[00:12:49] Anthony: it off

[00:12:49] Michael: and then drag and anchor.

It just seems like they’re I forgot what I was going to say there.

[00:12:55] Anthony: Yeah, well, anyway, okay. Because it seems like relatively simple technology. Hey, I'm here. Hey, I'm here. Hey, I'm here. But what do I know? And you could put that in your pocket so you don't get hit by a car.

[00:13:04] Fred: Well, that's exactly what DSRC was supposed to do.

But you recall the FCC took away the bulk of the bandwidth that was dedicated to DSRC, which was, I can't remember exactly what the acronym stands for, but it was a government-mandated system for anonymous broadcasting of the location and, you know, the velocity of your car, so other people could process that information.

So our friends under Ajit Pai at the FCC got rid of that during the previous Trump administration, and nobody's really stepped up to put a substitute system in place. So, another opportunity lost.
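
For listeners curious what such a broadcast looks like, here's a minimal sketch of the kind of anonymous position-and-velocity message DSRC envisioned; real deployments standardized this as the SAE J2735 Basic Safety Message, but the fields and values below are simplified assumptions for illustration.

```python
# Simplified sketch of a DSRC-style anonymous safety broadcast.
# Field names and cadence are illustrative assumptions, not SAE J2735.

import json, time, uuid
from dataclasses import dataclass, asdict

@dataclass
class BasicSafetyMessage:
    temp_id: str       # rotating random ID, not tied to the owner (anonymity)
    lat: float
    lon: float
    speed_mps: float
    heading_deg: float
    timestamp: float

def broadcast_once() -> str:
    msg = BasicSafetyMessage(
        temp_id=uuid.uuid4().hex[:8],  # rotated regularly in real designs
        lat=40.7580, lon=-73.9855,
        speed_mps=12.3, heading_deg=87.0,
        timestamp=time.time(),
    )
    # In a real system this would go out over the 5.9 GHz radio band
    # the FCC reallocated; here we just serialize it.
    return json.dumps(asdict(msg))

print(broadcast_once())
```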

[00:13:45] Anthony: So listeners, you hear it. There is a very easy solution to this to make everybody’s life safer on the road.

Low cost. Welcome to the next four years.

[00:13:56] Waymo and the Future of Robo-Taxis

[00:13:56] Anthony: Hey, let's go to the Waymo section of this podcast. Does that sound good? Waymo. Waymo, for those of you who just woke up, is a boondoggle happening inside Google. I think maybe the Google executives might start catching on to it, but whoever's running Waymo, they're just setting cash on fire.

And unlike General Motors, who set a bunch of cash on fire, Waymo's parent, Alphabet, Google, has a lot more money to burn. Bloomberg has a great article we're linking to about how robotaxis have taken over San Francisco. It's David Zipper interviewing, oh, who is the guy he interviews? Jeffrey Tumlin, director of the San Francisco Municipal Transportation Agency since 2019.

And so, there's a great question in there: overall, are robotaxis a positive or negative for San Francisco? And the answer: "So far, there is no net positive for the transportation system that we've been able to identify. The robotaxis create greater convenience for the privileged, but they create problems for the efficiency of the transportation system as a whole."

So robotaxis are just for the one percent. Everyone else can suck it. That was my takeaway.

[00:15:08] Fred: Well, I was just going to say, it's long been identified as a solution in search of a problem. And, you know, this is really being driven completely by the need for return on the investment capital that's been put into these things, and not to solve any particular transportation problem.

[00:15:33] Michael: Yeah, and you know, we've talked plenty about where this industry is going and whether it's needed or necessary, and we've struggled to come up with really any compelling argument to have these things around right now, particularly with all the problems we've seen. You know, there's this continual drumbeat of, oh my God, we've got to get these things on the road or China's going to catch us.

And, oh my God, we need these to save lives because humans aren't safe. But we just haven't seen the vehicles that can either drive fast enough, drive at freeway speeds, or be reliable enough to be deployed across the country to address whatever this giant problem is that they're supposed to address.

So it's really interesting. And you see in the article just kind of the attitude of the industry, as expressed by the former Cruise CEO, Kyle, which, Anthony, I might let you take over and play Kyle for his interaction with the SFMTA.

[00:16:40] Anthony: Alright, let me see if I can do this. So my motivation is: I'm Kyle. And Kyle was the CEO of GM Cruise. He sucks. He doesn't work there anymore. Well, GM Cruise doesn't exist anymore, but they got rid of him before they even shut it down. And he was at a meeting, and he bangs his fists on the table and says, you're the most dangerous person for the automated vehicle industry. Like, that's insanity. Who bangs their fists on the table?

[00:17:08] Michael: Yeah. I mean, the guy just comes off as a bully who walked into the office of the director of the San Francisco transport authority and wanted to bully him into letting Cruise do whatever it wants on the streets of America. Which, you know, thankfully we have reasonable people in charge of agencies like this who see through some of the venture capital that's underwriting these businesses.

[00:17:36] Fred: Well, when logic fails, there are always ad hominem attacks available.

[00:17:40] Anthony: I think the next four years is just gonna be a bunch of guys banging their fists on a table saying, your regulators are bad! You're threatening the world! That's going to be, oh, that sound, we're all going to get too used to it.

[00:17:52] Fred: I don't think you have to wait any longer.

[00:17:54] Anthony: Well, hey, look, we can only make these jokes and insults for the next week, and then it's illegal.

[00:18:01] Michael: Yeah. I mean, and look, the article points to what is going to be a major problem in the next few years if we see federal regulation come around on autonomous vehicles. You know, in California, the local government in San Francisco was prohibited from taking direct action against the autonomous vehicle companies, even though they were screwing up traffic and blocking firetrucks and ambulances, and the laundry list of problems that we saw there. Their hands were tied because the regulatory authority rested with the state, and, you know, state lawmakers are highly subject to the whims of lobbyists for large corporations.

And they pass laws that vest the authority in the state, rather than the locality, to regulate this kind of thing. Well, when you get a federal autonomous vehicle bill, you're going to see the same thing happen, but the states aren't even going to be able to control their own roads anymore, because the federal laws are going to be used to preempt state laws that allow for regulation of autonomous vehicles. So it is kind of a premonition of things to come.

[00:19:16] Anthony: Just to remind people, in California it was the California Public Utilities Commission, which is a board appointed by the governor. And one member of the California Public Utilities Commission used to be the general counsel for GM Cruise.

So they're like, yeah, whatever GM Cruise wants, we're gonna pass, bro. Anyway, next Waymo article. This is one from the Washington Post, and I'm gonna quote from the article: "It's so much safer, especially for a woman," said one of these Waymo customers. "You're not getting into the car with some strange man."

Instead, you're getting into the car with software written by a bunch of strange men. The argument is that these driverless vehicles are safer for young women, or women generally, because there's no creepy guy behind the steering wheel. I don't know how much of a problem assaults are in this country.

[00:20:07] Michael: I mean, it's certainly an issue. You know, we saw a lot of Uber sexual harassment victims who at one point Uber was trying to prevent from filing suit in a court of law, trying to make them go to arbitration, you know, the Cruise, sorry, the Uber arbitration court, to hear their claims of sexual harassment.

Ultimately, Uber reversed its position on that, but they're certainly not reversing their position in regards to other problems that might happen when you're in their vehicles. You know, if Uber is partnering with Waymo to put autonomous vehicles on the road and allow people to use the Uber app to get into them, you're going to be subject to a number of end-user agreements with those companies, Uber and Waymo, that will probably require you to give up your right to bring a case in an actual court in America if something bad happens to you while you're in those cars.

[00:21:08] Fred: Right. Just a reminder to our listeners that Uber successfully defended a liability claim in court from people who were injured in an Uber, because somebody in their family, not even them, had used their account once to order on Uber Eats. And the arbitration clause that was in the Uber Eats agreement, the court said, extended to the actual injury that these people suffered when they were using an Uber.

So, yeah, they will use every device possible to avoid their own responsibility.

[00:21:51] Anthony: Yeah, I think it was a teenage member of the family, too. So somebody who was under the age of 18 said, yes, I want Uber Eats, and by doing that, you agree to give up all of your rights related to anything involving riding in an Uber.

[00:22:05] Michael: And then, back to the Waymo article. The Waymo article that we're looking at really talks about women who are in a Waymo, and there have been a number of situations where, you know, a bad actor, whatever their intentions might be, it's hard to know, but people know that if they walk out in front of a Waymo, the vehicle is going to stop and it's not going to move.

So essentially anyone walking down the road can walk out into the road and trap you where you are. You know, I believe the Waymos would be locked, so there's a question as to how much danger you're in immediately, but there are certainly some problems here that need to be worked out. You know, if there's criminal activity going on: essentially, when you're in your own car, versus when you're in a Waymo, you have the ability to make the decision whether to evade or avoid, or to leave the area at all costs, if you see criminal behavior going on.

If you're in a Waymo, you don't have that option. The car is in control of where you're going. You can get out and run, or stay in the car and hope for the best, but there just aren't a lot of options. And that's a very scary situation for people to be in.

[00:23:22] Fred: And it has been documented that men in a car have followed women in a Waymo and essentially threatened them, or at least the women felt threatened, because the women, as Michael said, had no control over what the car was going to do. So it's a great way for a stalker to follow somebody home. And, you know, in that case, they might have really wanted to have a man, or at least another person, in the car to help avoid, you know, the criminals who are chasing them.

[00:23:59] Anthony: Well, I'm not getting in a Waymo this week. Our next Waymo article, also from the Washington Post, covers Waymos not stopping at crosswalks. And this is something we talked about last week with Missy Cummings, where people are at an intersection, at a crosswalk, and the law, I think, in most states, is that you have to yield to pedestrians.

And so pedestrians try to cross. As a pedestrian, you always make a little eye contact with the driver, and you do a little kind of, hey, yeah, you see me, I see you, the driver doesn't run you over. But when there's no driver in the car, the Waymo is just like, I'm gonna keep going, I'm gonna keep going. So maybe you should stop wearing reflective clothing at these intersections.

Is that the takeaway?

[00:24:40] Michael: No, I mean, the takeaway here, you know, this is Jeff Fowler, who's a columnist at the Post. He's in San Francisco and he's got Waymos driving down his street every day. And at the crosswalk, he has noticed that the Waymos are not stopping. They're not yielding to him.

And he's tested this out and videoed it; you know, dozens of times he's walked out and tried to figure out what actually makes the vehicle stop. And he says the only way he was consistently able to get the Waymo to stop at the crosswalk was to dart out into the street, which is obviously not a good idea. But underlying all this is the fact that, you know, Waymos probably should be stopping at crosswalks every time anyway, if there's a human anywhere near the crosswalk. I mean, that's actually the law in a lot of locations. But, you know, the Waymo interpreting the situation, that gets into sketchy ground, and we know that autonomous vehicles struggle to identify and predict human behavior. The absolute safest course here is to stop at a crosswalk when there's a human anywhere near it.

The fact that Waymos are not doing that, you know, the anecdotal evidence from Mr. Fowler's experience that the Waymo would yield for him about three out of every ten times, it's a little scary. Fortunately, I don't believe that anyone has been hit by a Waymo in a crosswalk; I think there may have been one bicyclist who was. But they should be stopping. You know, the Waymo shouldn't be prioritizing speed and efficiency, or getting to its destination, over the possibility that someone might walk out into the road at a crosswalk. They need to be stopping. I mean, that's the rule of the road in a lot of places. And I don't think we want Waymos being trained to act like human drivers. That's kind of the whole point, right? So this is a problem with Waymos and the rules of the road that needs to be resolved.

[00:26:53] Anthony: Fred, would you say a human in a crosswalk, is that considered an edge case scenario?

[00:27:00] Fred: The edge of the crosswalk, yeah. Listen, I'm gonna pump up the Tower of Fred here, because I want to dive into this a little bit. There's a concept that the AV companies like a lot, which is called absence of unreasonable risk.

And so what the AV companies are all saying, and it's been institutionalized now, is that safety means the absence of unreasonable risk. Well, this gives them a lot of wiggle room to do pretty much whatever they want, and so I wanted a little bit more detail on that. I looked at the paper that was published by Waymo called "Building a Credible Case for Safety: Waymo's Approach for the Determination of Absence of Unreasonable Risk."

It's very interesting; we'll post a link to it. And it says risk is a combination of the probability of occurrence of harm and its estimated severity, for unwanted outcomes of interest. Well, that's true. That sounds pretty good, right? If you design to that, then you're in good shape.
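
As a minimal worked illustration of that definition, risk as probability of harm combined with estimated severity, here's a toy calculation; the outcome classes and numbers are invented for illustration and do not come from Waymo's paper.

```python
# Toy illustration of "risk = probability of harm x estimated severity."
# All numbers are invented; real safety cases use far more structure.

outcomes = {
    # outcome: (probability per trip, severity on an arbitrary 0-10 scale)
    "minor injury":   (1e-5, 2),
    "serious injury": (1e-6, 7),
    "fatality":       (1e-7, 10),
}

# Aggregate risk: sum of probability-weighted severities across outcomes.
total_risk = sum(p * severity for p, severity in outcomes.values())
print(f"aggregate risk score per trip: {total_risk:.2e}")

# The hard question the definition leaves open: who decides what
# aggregate score counts as "reasonable"?
```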

They go on to say, and I'm quoting here: "Acceptability of risk can exist and be assessed at both the ADS functionality status dimension and the level of aggregation dimension levels. Waymo's approach calls for a balance of event-level risk assessment and aggregate-level acceptance criteria, which work as overarching indicators of performance and are not necessarily traceable back to individual events."

So that says, basically, that they're all pragmatic decisions. Waymo is not going to trace them back to individual events. And so the question arises: who decides what's reasonable? Waymo's coming up with their own decisions. And my position, I guess, is shouldn't it at least be based on forensic analysis of crashes that have happened, so that, you know, you understand the reasons for the crash and you make sure it's not going to happen again?

The question is, is that too big a burden for developers to bear? Industry really prefers the ambiguity. So, the NTSB investigated the Mountain View accident in California, which occurred on March 23, 2018, and they released their analysis on February 25, 2020, which is 23 months after the incident. Okay, so forensic analysis is not easy. You need to do a lot of work.

You need to do a lot of work. It’s better to wave your hands and say, well, the driver was X, and that’s why it happened. No, that’s not why it happened. The driver was X, yes, but that was only a factor. You’ve also got to consider the software that’s in the car, the hardware, the tires, the environmental conditions.

A lot of things have happened. In fact, in the case of the Uber crash that killed a woman in Arizona, the NTSB came to the conclusion [00:30:00] that it was the corporate culture at Uber that was a major factor in that woman’s death. So, diving into this Highway Accident Report a little bit, remember this is now, seven years ago this happened, seven years ago, Anthony, your, your son in college was in middle school, okay, so this is a long time.

And still, you know, the recommendations that were published in 2020, now five years ago, include, for NHTSA, that they expand New Car Assessment Program testing of forward collision avoidance system performance. Well, they're finally getting around to that, and that's going to happen in another, what, Michael, five years, four years, something like that?

[00:30:42] Michael: You know, the recent NCAP update that they announced should be going into effect over the next couple of years. However, we're just not sure what the trajectory of that is going to be under a new administration.

[00:30:57] Fred: Right. So anyway, maybe 10 to 15 years after this incident, that recommendation, which really defines what's reasonable, in my opinion, has yet to be adopted by the industry or NHTSA. The next recommendation is to evaluate Tesla Autopilot-equipped vehicles to determine if the system's operating limitations, foreseeability of misuse, and the ability to operate the vehicles outside the intended operational design domain pose an unreasonable risk to safety.

No regulations yet coming on that.

[00:31:32] Anthony: They'll be coming shortly. Don't worry.

[00:31:34] Michael: Well, I mean, keep in mind here that that is essentially the standard for when a safety recall has to be performed. And for decades now, you know, we've seen this standard come up in court or elsewhere, and it's never been defined in a way that lends itself to being interpreted by the data, right?

There's no number or threshold. There's no real definition of what that means, and that's problematic.

[00:32:00] Fred: Right. And remember, this is now seven years after that death occurred, and five years after the recommendations came out. Still nothing. The next recommendation is, for vehicles equipped with Level 2 automation, to work with SAE International to develop performance standards for driver monitoring systems that will minimize driver disengagement, prevent automation complacency, and account for foreseeable misuse of the automation.

Zero progress so far. Zero progress. Missy talked about this last week. Next recommendation. Go ahead.

[00:32:35] Anthony: I'm still stuck on this phrase, though, that safety is the absence of unreasonable risk. Right. Like, who came up with that? I actually did a quick search for what that phrase means, because I don't understand it. And it comes up from some door company, it looks like. That makes, like, physical doors.

[00:32:55] Fred: My understanding is that this is a legal concept that's applied in liability cases.

[00:33:02] Michael: It's in the Motor Vehicle Safety Act. You know, unreasonable risk to motor vehicle safety is literally the gatekeeping issue that you have to resolve when you're looking at whether a vehicle needs to be recalled or not.

That's the standard.

[00:33:18] Anthony: I still don't understand it. I mean, this article that I just came across, they said a better way to phrase it might be, and I don't understand this either, safety is an outcome of the absence of unreasonable risk. Now, that doesn't make sense to me either. This is some legalese bullshit, isn't it?

[00:33:34] Fred: Yes, it is, sir. Exactly that. And it's not a design standard. It's not a requirement. It doesn't say, for example, your vehicle has to have a safety outcome no worse than the aggregate human driver safety outcome. Okay, that's a design standard you could apply. There's a number associated with that, but that's not what they do.

Now, there's another recommendation they've got for NHTSA, which is: after developing performance standards for driver monitoring systems, recommended in Safety Recommendation H-20-3, which was a previous recommendation, require that all new passenger vehicles with Level 2 automation be equipped with a driver monitoring system that meets these standards.

That has not happened. It's not happening at all. Companies are putting their own monitoring systems in place in certain cases. GM's got one for their BlueCruise, or no, that's, excuse me,

[00:34:38] GM's Super Cruise and Safety Standards

[00:34:38] Fred: the Super Cruise. I'm not sure what Ford is doing. But again, eight years after that gentleman died and so many people were injured.

Nothing.

[00:34:50] Anthony: But they have standards; they're just internal, and they haven't shared them with anybody. Right.

[00:34:55] Waymo’s Pragmatic Approach to Safety

[00:34:55] Fred: Well, you know, going back to what Waymo was saying, you have to balance, you know, the likelihood of something occurring against the overarching indicators of performance, not necessarily traceable back to individual events.

So basically what they're saying is: screw it, we're going to balance things the way we want to balance them. And what that results in is a Waymo approaching a crosswalk and making its own determination about whether or not it needs to stop, based on the trajectory of objects it determines to be pedestrians in that crosswalk.

And so they're making pragmatic decisions: yeah, this person's going to exit the crosswalk, yea or nay, and we can get by okay. And that's violating the law, which says you need to stop for a person in the crosswalk. The law doesn't say you can skip stopping if you think the person is going to exit the crosswalk before you get to it.

So, you know, that's the pragmatism that's built into the Waymo approach for building a credible case for safety. Interesting reading. It's very complex. That complexity, I think, allows Waymo to hide behind an awful lot of fog, and basically gives them, in their own mind, free rein to do pretty much whatever they want to do.

[00:36:17] Debating the Definition of Safety

[00:36:17] Fred: So, anyway, absence of unreasonable risk: I don't think it's a good idea to use that as safety. I get a lot of pushback when I bring that up in industry meetings. They seem to like it very much, and they seem to be very disinterested in going back to the heart of it, which is the International Standards Organization report number 26262, which defines safety in this way.

Everybody leans on it and says, that's a standard, we're going to go with it. And my opinion is that the standard does not even a little bit confer safety on the cars being built in response to it. End of rant.

[00:37:05] Anthony: I agree. I'm just still stuck, from a grammatical point of view, or a definition point of view, on understanding "safety is the absence of unreasonable risk."

It really, for those of you not paying for the visual experience here, it really makes my brow furrow. This is literally like they're trying to trip up first-year law students. That's what this is.

[00:37:28] Michael: Yeah, I mean, that's the problem. There's a broad way in which you can interpret that statement, because it contains the word "reasonable," which everyone can disagree on. And if you allow, you know, manufacturers to make their own interpretation, well, they're going to interpret it one way, and a way that we might not think is all that great when the vehicle runs over us.

[00:37:51] Anthony: I mean, I see the problem being the words "absence" and "unreasonable risk." I mean, I can just do whatever I want with that.

[00:37:57] Fred: Well, my position is that there should be an affirmative definition of safety. This is what safety is, and you have to design to this. And we've suggested some definitions for that. Absence of unreasonable risk is a property that that affirmative statement of safety should have. And you can test it, right?

So if you have an affirmative statement of safety, this is what the vehicle has to do, then you're going to ask the question: is that definition going to allow the vehicle to safely operate, and does it include absence of unreasonable risk? But it should also include: it's going to stop within certain parameters; it's not going to harm the individuals inside of the vehicle due to its control system strategy; it's not going to harm people outside of the vehicle due to its control system strategy. You can go on with that. But in fact, our Consumer Bill of Rights is basically a statement of what those parameters should be for a vehicle, and all the parameters that we've got in there are supporting of the absence of unreasonable risk, but we try to quantify what that would mean in terms of engineering requirements.

So there's a long way to go in these regulations. And again, until this is resolved, I think you need to continue to think of AVs as a solution in search of a problem, because the problem they're trying to solve has got a lot of ambiguity.

So, if there’s something besides capital preservation that’s underlying these, it would be very nice if someone would let us know what that is.

[00:39:44] Anthony: And that's from Fred Perkins, the second most dangerous man in the world of AV regulations.

[00:39:50] Fred: Oh, you flatter me. Thank you.

[00:39:52] Anthony: I know. Yeah, you can go to autosafety.org and you can read our Consumer AV Bill of Rights. The link's at the top of every page. And after you do that, you can click on the donate button. Oh, boy. I know. We'll like it too.

[00:40:04] Uber and Lyft’s Self-Driving Car Dilemma

[00:40:04] Anthony: Let's continue down the rabbit hole of self-driving nonsense. So, Uber and Lyft. I think Uber tried their own self-driving car thing, and they're like, ha ha, somebody there did basic math, and we're like, that's not gonna happen.

Or basic physics, and said, yeah, let's not do that. But now, Uber and Lyft are like, hey, Waymo, let's use your cars. We're gonna use your cars and get rid of human drivers, because someone at Uber and Lyft hasn't done basic math yet. And we're linking to an article from the Wall Street Journal, and part of it, I'm quoting, is: Uber and Lyft have agreed to maintain these driverless fleets.

They are finding locations to store the cars, equipping them with chargers and high-speed internet, and training workers to maintain the cameras, lidar, and other gadgets that driverless vehicles depend on. My take on this is: dum da da dum dum dum! As we've pointed out numerous times, these vehicles are MASSIVELY expensive to repair.

There are ZERO regulations or specifications on how to maintain lidar, radar, and camera vision systems. Unless Waymo all of a sudden is like, here's our technical specs manual, which, as best as we can tell, they have not handed out to anybody to say, this is how we maintain these vehicles. I can't imagine they're going to say to Uber and Lyft, hey, here's the secret sauce, guys.

Try not to let it slip into our competitors' hands. I think this is Uber and Lyft quickly going, oh, shit! Why didn't we sign up? They're gonna be like Hertz saying, we're gonna rent a bunch of Teslas. Sounds good. Didn't work out.

[00:41:42] Michael: It makes sense from Uber and Lyft's perspective, if they are just arranging for the trips, right?

You open the Uber app, and in addition to your UberX, your Uber Black, whatever all their options are, you've got your Uber AV, which could be a Waymo or, you know, another manufacturer of autonomous vehicles that's deployed in your area. You know, for Uber and Lyft, simply to facilitate that transaction, to take a cut, to be the middleman, makes sense.

But like Anthony says, when you've got Uber or Lyft in charge of maintenance, or any of that stuff that could impact the vehicle's operation, that's when you really have to start questioning whether, you know, Waymo is willing to take that risk. You know, it was an Uber vehicle that killed, well, it's the only AV that has actually killed a human so far, that we know of. And so it really raises a question as to whether companies that are essentially just a middleman between a driver and a passenger should be in charge of the actual technology behind autonomous vehicles.

We’d say no.

[00:42:53] Anthony: So one of the pushes for Uber and Lyft is: hey, we don't have employees, we just have these contract gig workers driving the cars, so we don't have to pay them that much. But now, if they're taking over the fleet and maintaining it, then you need to pay very expensive technicians to do this kind of thing.

And I imagine, like you said, once a technician there calibrates something wrong, and that Uber now goes out and crashes, Waymo and Uber are gonna be like, you did it! No, you did it! And then lawyers will be like, safety is the absence of unreasonable risk. I get 30 percent.

[00:43:25] Fred: Are we going to get to a Gaslight illumination this week? Or... I don't think so. I think I've got a great segue that goes along with this discussion.

[00:43:33] Anthony: Hey, look, now you've ruined your segue, but go for it.

[00:43:35] Fred: All right.

[00:43:37] AV Industry’s Best Practices and Inspections

[00:43:37] Fred: So we're talking about safety, right? So the Autonomous Vehicle Industry Association has put out a best practice for automated driving system dedicated vehicles, okay?

All right: a Safety Inspection Framework. Now, the AVIA is an industry group that allows no outside input into what their best practices say. And so, Anthony, let me ask you: what defines a software-driven vehicle, an AV? What defines that fundamentally?

[00:44:07] Anthony: That the computer is driving the car.

[00:44:10] Fred: Right. That’s software driven.

Yeah.

[00:44:12] Fred: So that shouldn't surprise anybody. So you'd think a safety inspection framework for software-driven vehicles would include the software, particularly the safety-critical aspects of the software. But this best practice that they published, quoting from it, says: inspections of ADS software in this document focus on version verification, correct configuration, and accurate calibration; all other aspects, assessment of capability or performance, verification and validation, et cetera,

are outside of the scope. So basically what the industry is saying, through its industry organization, is that the only thing you need to inspect in a software-driven vehicle is the part of the vehicle that doesn't include what's driving the vehicle. So this, you know, this is very telling. I think they're just not there.

They don't know how to do it. They don't want to do it. They're publishing a lot of nonsense that sure sounds like they're doing a good job. But then, within the fine print, they give you clear notice that, in fact, they're just punting on any type of inspection that would actually reflect on the poor performance of the vehicle, or the lack of safety capabilities the vehicle has. So, end of rant there too.

But that's my gaslight of the week.
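
To make concrete how narrow the quoted scope is, here's a hypothetical sketch of an inspection limited to version, configuration, and calibration. Every field name is invented for illustration; the point is what such a check never exercises: the driving behavior itself.

```python
# Hypothetical sketch of an inspection matching the quoted scope:
# version, configuration, calibration only. All field names invented.

def inspect_ads(vehicle: dict) -> bool:
    version_ok = vehicle["software_version"] == vehicle["approved_version"]
    config_ok = vehicle["config_hash"] == vehicle["approved_config_hash"]
    calibration_ok = all(vehicle["sensor_calibration_ok"].values())
    # Note what is NOT checked, per the best practice: can the software
    # detect a pedestrian, stop at a crosswalk, handle an emergency
    # vehicle? All declared out of scope.
    return version_ok and config_ok and calibration_ok

vehicle = {
    "software_version": "4.2.1", "approved_version": "4.2.1",
    "config_hash": "abc123", "approved_config_hash": "abc123",
    "sensor_calibration_ok": {"lidar": True, "camera": True, "radar": True},
}
print(inspect_ads(vehicle))  # True -- yet it says nothing about driving safety
```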

[00:45:46] Anthony: All right, Michael, do you have a gaslight?

[00:45:49] Gaslight of the Week: AV Safety Data

[00:45:49] Michael: I do. It's going to be the same folks, the Autonomous Vehicle Industry Association, because last week they put out what is essentially their 2025 policy push. They know they've got, you know, a Trump administration and a Congress leaning probably in the favor of industry.

And so, as part of that document, there are some other things in there which we could dive into at another time. But they've taken a view of crash data for AVs that, I have to say, is gaslighting a lot of people, not just states, but also, you know, independent researchers and others.

So here's what they're trying to do, and I'll quote a couple of spots here. You know, they are trying to make a national AV safety data repository. This is what they're pushing for now. But they only want to make that available to relevant state transportation regulatory agencies. And only to regulators, and they're only going to give regulators visibility into incidents that are in their state.

So if you're in, you know, Wyoming, and you don't have any AVs in your state, you don't even get to look into what's happening in other states. Also, you know, obviously, they're very big on confidential business information protections for that. We've seen how those are abused in the NHTSA standing general order data reporting; you know, Tesla routinely redacts all relevant information on crashes, even when a lot of that information is plainly obvious to bystanders or the public who could have been around that vehicle when it crashed on a public road.

There are a lot of problems with that. You know, it prevents descriptions of those accidents from reaching the public domain, and other regulators in other states, under this plan. Also, they are trying to propose a meaningful minimum damage threshold for reportable crashes.

So what they want is to see an actual crash, and a threshold for how much damage was caused by that crash, before it can be included in this safety data repository. Well, that's obviously another step backwards. California is already collecting incidents like this, where the autonomous vehicle screws up badly but no one is killed.

You know, we've seen this on video. We've seen Waymos driving in the wrong lane. We've seen dozens, if not hundreds, of incidents where there's seriously questionable and unsafe behavior going on. And yet none of those events would make it into this data repository to inform anyone, whether it's a regulator or anyone else, about the potential dangers of AVs.

And then finally, they're saying this is enhancing state regulatory and public understanding of incident reporting, by removing the current one-day reporting deadline and instead requiring reporting within 120 hours of an event. Now, I just want to rewind everyone back to October and November 2023, when Cruise had its incident in San Francisco, dragged a woman on the ground, and then proceeded to lie about that to California authorities and federal authorities.

What this is proposing is giving a bad actor like that another four days before it has to report anything to regulators, so it can get its PR bullshit together. That is not an acceptable proposition. You know, these crashes need to be reported as soon as possible, not just for transparency's sake, but so that government investigators can get to the crash scene and evaluate what actually took place.

So I'm giving my Gaslight of the Week to the AVIA's proposal on crash data. It makes crash data far worse than what's currently available, and, you know, essentially they're trying to remove transparency around AV safety with this proposal.

[00:50:03] Fred: Oh, thank you. I want to expand on that just a little bit by talking about the standing general order. The reporting that it requires is not sufficient to allow understanding of what caused the collision. Okay? It's just a statement that something happened, and perhaps you want to look into this. But in terms of forensic analysis, in terms of understanding what actually happened, having the data required to even understand what kind of software is being used, that's not included in the standing general order.

It's just a very cursory look. So it's not in any sense a burden on the participants to produce this minimal information.

[00:50:47] Anthony: You gotta, like, fill out paperwork, man. Like, what if my pen's out of ink? And, like, well, you know, I don't have spell check on paper forms sometimes. Or hand...

[00:50:56] Fred: Hand cramps too, if you're going to be typing.

[00:50:58] Anthony: Yeah. Hand cramps. Hey, who told you about my hand cramps?

[00:51:01] Michael: None of this is done by paper, so, you know, nothing to worry about, right?

[00:51:05] Anthony: Well, you know, I got carpal tunnel from the typing and the pressing of buttons and, oh. Okay, that's pretty good. You both did the same gaslight. My gaslight is the last time we're allowed to gaslight this person before he's inaugurated next week.

[00:51:19] Tesla’s Full Self-Driving Claims

[00:51:19] Anthony: Elon Musk. Elon has been saying that with Tesla Full Self-Driving, each software update is five to ten times better, a massive improvement, that there's less human intervention, and these cars are great and great. And wow, man, that sounds, oh, it's five to ten times better. I mean, that's, you know, basically, man, that's a huge fucking gap, okay? Like, it's a lot.

Like it’s, it’s a lot. But also he’s like, oh yeah, the data, no, you can’t see that. Yeah. We’re not going to share that data at all. Yeah. Instead, people have gone out there and said, Hey, let’s, let’s crowdsource this data and let’s figure out what we can do. And that’s not what the crowdsource data is showing.

The [00:52:00] crowdsource data is showing basically no, no, their, their stuff is not improving that much at all. People still have to intervene with this software regularly. And unfortunately the data only refers to highway miles where, yeah, you know, everyone knows, Hey, you want to buy a car. You want a bunch of highway miles on it cause it’s just a consistent speed.

Traffic seems to be flowing, versus the stop-and-brake of cities. And apparently their software for highway and city is a different software stack. I think we've seen it, because they've actually manually drawn in roads where Elon's like, hey, it keeps crossing this line here. So their quote-unquote artificial intelligence is just somebody manually updating a lookup sheet saying, don't do this.
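
For a sense of what the crowdsourced trackers are computing, here's a minimal sketch of the miles-per-intervention metric; the drive log below is invented for illustration, not real tracker data.

```python
# Sketch of the metric crowdsourced FSD trackers report: miles driven
# between human interventions ("disengagements"). Log entries invented.

drives = [
    # (miles driven, number of human interventions), per logged drive
    (120.0, 1),   # mostly highway
    (8.5, 2),     # city stop-and-go
    (45.0, 0),
    (15.2, 3),
]

total_miles = sum(miles for miles, _ in drives)
total_interventions = sum(count for _, count in drives)
miles_per_intervention = total_miles / max(total_interventions, 1)
print(f"{miles_per_intervention:.1f} miles per intervention")

# A genuine "5 to 10 times better" update should show up here as a
# 5-10x jump in this number, which is what the crowdsourced figures
# reportedly do not show.
```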

But basically, my gaslight is Elon Musk, everything ever out of his face. He's just so gaslighting. I mean, sure, he's adorable. He's the most attractive man on planet Earth. He should be reproducing as much as possible to repopulate the planet. He's a genius engineer who has no background in engineering.

He's an anti-immigrant who is an immigrant. He's, you know, he's a jack of all trades. He's a jack of all asses. So that's my gaslight of the week.

[00:53:22] Fred: With all this gaslight being produced, we should start a utility.

[00:53:28] Anthony: Oh yeah. Ah, okay. Hey, let's continue with my attack on Tesla. This is another one from Electrek, titled "Tesla issues recall on over 200,000 vehicles for self-driving computer failure."

Sure, we'll get to this in the Recall Roundup, but apparently they're having a short circus, short circuit, a short circuit. A short circus is what I do.

[00:53:55] Michael: It's been a long circus with Tesla.

[00:53:57] Anthony: Well, this is a short one. And just, you know, it's a... pygmy cats and whatnots and mice. I'll stop.

[00:54:06] Michael: That was an interesting one. I mean, from Electrek's reporting, it looks like there was a problem with the self-driving computer in a lot of their vehicles, and they'd been hearing about it from owners and reports online for quite some time. And then all of a sudden, I believe last week, Tesla came out with a recall that was specifically for the, you know, the recall we keep seeing popping up every week, which is the rearview camera issue.

And Electrek is claiming, no, this is not just that. It's a short on the car's computer that affects a lot more than rearview camera functionality, and the recall is only applying to that. So it was some good reporting by Electrek. But it also, you know, raises the question of just how quickly and efficiently recalls like this are getting done.

[00:55:03] Fred: Well, it also raises the question of how they're validating their software changes, because clearly, if they had gone through an entire validation process, this would have been uncovered before they fielded it. You know, there are certain electronic devices that start to resist the input of current only after they've been activated; an electric motor is a good example of that. You have to start it in stages so that you don't overload the wiring and burn it up.

This clearly is a validation failure due to their internal processes. So, you know, the only way you can solve it with software is if there was a software validation process problem in the first place that you neglected to understand. So I think this is representative of how the industry basically does a poor job of validating software before they put it out in traffic, and before they've considered all of the safety-critical aspects of it.

[00:56:03] Anthony: Well, for those of you at home, you don’t have to worry about this anymore because starting next week, regulations will no longer exist.

That's right. No need to test your software, your hardware, your cars, your brakes, anything at all, because everything goes in Musk's new America. If you'd like to continue supporting the show, go to autosafety.org, click on donate. Thank you. You know what, I'm going to start backdooring into the Recall Roundup, because there's another CleanTechnica article about a secret Cybertruck recall of their batteries.

CleanTechnica's been getting reports in from people saying things like, quoting: "I dropped off the truck at the service center a few days ago, and they informed me that there's technically nothing wrong with my batteries." Just my decision to buy this dumb car.

However, Tesla Engineering has requested that the battery pack be returned for a teardown and inspection. Apparently units produced around the same time as mine have been experiencing issues. As a result, they’re taking my battery pack and replacing it with a new [00:57:00] one. Yes, there’s nothing wrong with your battery pack.

Like, you know, Fred, imagine going to the Subaru dealer and being like, hey, I'm just coming in for an inspection. And they're like, hey, we're going to take your engine out. There's nothing wrong with your engine, but we're just going to give you a new one. You'd be like, what the, huh? There's nothing wrong with my engine. This is insane.

There's nothing wrong, we just want to take your engine apart and see. Like, because, you know, I lost my wedding band and maybe it's in your engine. That happens all the time. Yeah. So this recall, is this actually just one of their secret service bulletins?

[00:57:38] Michael: Yeah, essentially, that's what this is looking like. It's hard to tell, since we don't really have any information on a safety outcome here. There's clearly a battery problem that they're swapping packs out for, and what they're describing it as is a cell-side dent-induced core collapse.

That sounds safety-related. I'm assuming that a [00:58:00] problem like that could, you know, result in battery overheating and possible fire, or possibly a propulsion failure that leaves you stranded in the middle of the road. So there's almost certainly a safety outcome here, and that means this should be conducted as a safety recall.

So I would lean towards this being, you know, something that should be covered by a recall. And it's something that we've seen in the past. I mean, the Cybertruck's already at, I believe, seven recalls; this would be number eight. So, you know, in the short time it's been out on the road, it is racking up recalls as fast as almost any other vehicle in history.

Maybe not. It just may not be at the top of the pile, but it's close. But yeah, we do need some more information here. Oh, there've been a number of vehicles. I believe there have been vehicles that reached, you know, 20 recalls a couple of years back. But over time, I think we've seen close to [00:59:00] this and probably a few more, because of the Cybertruck's

[00:59:02] Anthony: We're talking one year, it's been out for one year. And I mean, I'm going to challenge you: you come back next episode and you tell me what car has gotten more recalls in one year.

We'll save it for next week, because now we're going to go into recalls. Ready?

[00:59:16] Recall Roundup: Ford and Tesla

[00:59:16] Anthony: First one, Ford Motor Company. Shock of all shocks: 295,449 vehicles, the 2020 to 2022 Ford Super Duty, the F-250 and F-350, but only the diesel vehicles. That is the big,

[00:59:31] Michael: the big. The big trucks, the big heavy trucks, these are not, yeah,

[00:59:36] Anthony: this is not the truck you're taking your kids to soccer practice in. You're probably out there working. And this is a failure of the high-pressure fuel pump that may result in loss of motive power.

Get this. So this is a

[00:59:47] Michael: propulsion failure that a lot of owners have been affected by and complaining about. Essentially, there are deposits from biodiesel that collect inside [01:00:00] the fuel pump and cause wear and premature failure of the pump. So it's interesting: you're taking the vehicle into the dealer, but it looks like the fix for this is not a new fuel pump. Instead, the powertrain control module software is going to be reflashed.

And apparently that's going to allow for more pump cooling and prevent the formation of those deposits that cause the wear. So hopefully this isn't a "free recall," by which I mean software instead of replacing a part is a lot cheaper for the company to do. And hopefully it works; I guess we'll see in the coming months. Owners should be getting notified about this right now, so you should be getting a letter in the mail immediately.
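note: to make Michael's point concrete, here's a speculative sketch of how a software reflash could "allow for more pump cooling." Ford hasn't published the actual control logic, so the function names and temperature thresholds below are invented purely for illustration.

```python
# A speculative sketch: lower the pump temperature at which the
# powertrain controller starts recirculating fuel to cool the
# high-pressure pump. All names and numbers are invented; this is
# not Ford's actual logic.

OLD_COOLING_THRESHOLD_C = 95.0  # assumed pre-recall trigger point
NEW_COOLING_THRESHOLD_C = 80.0  # assumed post-reflash trigger point

def recirculation_duty(pump_temp_c: float, threshold_c: float) -> float:
    """Fraction of fuel recirculated for cooling at a given pump temperature."""
    if pump_temp_c <= threshold_c:
        return 0.0
    # Ramp the duty cycle up to 100% over the 20 C above the threshold.
    return min(1.0, (pump_temp_c - threshold_c) / 20.0)

if __name__ == "__main__":
    # The earlier trigger point means the pump gets cooling sooner,
    # which is the claimed mechanism for preventing deposit formation.
    for temp in (75.0, 85.0, 100.0):
        old = recirculation_duty(temp, OLD_COOLING_THRESHOLD_C)
        new = recirculation_duty(temp, NEW_COOLING_THRESHOLD_C)
        print(f"{temp:.0f} C: old duty {old:.0%}, new duty {new:.0%}")
```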

Also worth noting, you know, that this [01:01:00] was, I believe, Ford's 62nd recall of 2024. They kind of came back right at the end. Stellantis was ahead in vehicle recalls there at the end, and it looked like Stellantis was going to have the most recalls last year.

But as of today, with all of the 2024 recalls looking like they've been submitted, Ford is up 62 to 60 in the 2024 recall sweepstakes. If you count equipment recalls, then Stellantis is the winner, 72 to 67. So Ford and Stellantis are battling it out at, if you want to call it, the top or the bottom, for the most recalls.

[01:01:40] Anthony: All right, everybody, check your bingo cards at home and see what you got. Next up, Ford Motor Company. Oh my God, this is 20,000-plus vehicles: the 2020 to 2024 Ford Escape. High-voltage batteries, this is the issue here. The high-voltage cell separator, it's, you're going to have some [01:02:00] thermal, we're having thermal runaway on this, thermal venting.

[01:02:04] Michael: Thermal venting, I think, is a way to prevent thermal runaway by venting the heat away from the battery. I don't think

[01:02:16] Fred: so. I think thermal venting is industry speak for

[01:02:18] Michael: fire. You know, it sounds like not fire, but maybe it is. But these are hybrid vehicles, and it's the high-voltage battery cell that's used to provide the electric power for the hybrid drive system.

And owners, you'll be hearing about this next week. And it looks like the fix here is that they're just going to have software installed, but that software is dedicated to detecting anomalies in the battery pack that could be indicative of damage. In that case, Ford will replace the [01:03:00] battery pack for owners.
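note: as a rough illustration of the kind of battery monitoring Michael describes, here's a minimal sketch that flags cells whose voltage drifts too far from the pack median, one common sign of internal damage. The threshold and the flagging logic are assumptions for illustration; Ford hasn't published its actual detection algorithm.

```python
# A minimal sketch of pack anomaly detection: flag cells whose voltage
# deviates from the pack median by more than an assumed limit. The
# limit and example values are illustrative, not Ford's real numbers.

from statistics import median

DEVIATION_LIMIT_VOLTS = 0.10  # assumed tolerance for a healthy cell

def flag_anomalous_cells(cell_volts: list[float]) -> list[int]:
    """Return the indices of cells deviating from the pack median."""
    mid = median(cell_volts)
    return [
        i for i, v in enumerate(cell_volts)
        if abs(v - mid) > DEVIATION_LIMIT_VOLTS
    ]

if __name__ == "__main__":
    pack = [3.70, 3.71, 3.69, 3.42, 3.70, 3.71]  # cell 3 is sagging
    bad = flag_anomalous_cells(pack)
    if bad:
        # In the recall's scheme, a hit like this is what would trigger
        # a battery pack replacement rather than just the software fix.
        print(f"possible battery damage: cells {bad} out of range")
```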

[01:03:01] Fred: If our listeners have one of these Ford Escapes or Lincoln Corsairs, do not park it in your garage until this is resolved. Just as a safety hint.

[01:03:12] Anthony: Park it in your neighbor's garage. You know the guy who hasn't returned your lawnmower? Park it in his garage.

[01:03:17] Fred: There’s probably a political sign in your neighborhood that you might like to remove as well.

[01:03:21] Anthony: Right. Park it next to that person.

[01:03:24] Michael: And make sure you get your lawnmower back first so you don’t lose it too.

[01:03:27] Anthony: Okay. Last recall: Tesla. What? Clutches pearls. 239,382 vehicles: the 2024 to 2025 Tesla Model 3, 2024 to 2025 Tesla Model S, 2023 to 2025 Tesla Model X, and 2023 to 2025 Tesla Model Y. This is the rearview camera one.

[01:03:51] Michael: Yes.

[01:03:51] Anthony: Yeah. This is

[01:03:52] Michael: the one we just talked about. And, you know, luckily for most of the owners, the fix is a software update that was put out over the air [01:04:00] right around December 18th, so you've had this fix for almost a month. There are a lot of things controlled by the computer that's shorting out here beyond rearview camera functionality, so watch for any rearview camera problems or other electrical problems.

If you're still experiencing problems like that, that suggests this recall is not working. So make sure to file a complaint, or let us know.

[01:04:24] Conclusion and Upcoming Episodes

[01:04:24] Anthony: Alright, with that, I think that's the end of our show. Thank you, everybody. We've got a ton of news we're still catching up on from the end of the year.

So expect more and more of that in the coming episodes, with special guests coming up soon too.

[01:04:36] Anthony: Oh my. Till next time, everybody. Thanks. Bye. Bye-bye.

[01:04:40] Michael: Thanks everybody. For more information, visit www.

[01:04:44] Fred: autosafety.org.