Autonomous Ambitions or Delusions?

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.

Anthony: You are listening to There Auto Be A Law, the Center for Auto Safety Podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Hey everybody. It's Wednesday, November 19th, the first day I've heard of cow cuddling. Yeah, I don't know. There's some article in the Washington Post about how a way to help your stress is to cuddle with cows. That has nothing to do with this show. Sorry. It's just the last thing I read, and it stuck in my head.

AI in Road Safety

Anthony: Let's get onto the world of autos and safety, and here's actually a good use of AI, I think. Yeah, I think this is pretty good. This is an article from the Associated Press titled "Cities and states are turning to AI to improve road safety." And I know what you're thinking: no, this is ridiculous. But no, according to the article, Hawaii officials, for example, are giving away 1,000 dashboard cameras as they try to reverse a recent spike in traffic fatalities.

The cameras will use AI to automate inspections of guardrails, road signs, and pavement markings, instantly discerning between minor problems and emergencies that warrant sending a maintenance crew. I don't know if it's instant, but I kinda like this. So they're basically using these cameras to be like, hey, where's the damage on these roads?

And if we get enough positive signals, maybe we should send a road crew out there to investigate and/or repair. I like this idea.
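To make that triage idea concrete, here's a minimal sketch of the kind of aggregation logic being described: emergencies dispatch a crew immediately, while minor issues wait for corroborating reports from other cars. The detection format, names, and threshold are hypothetical; the AP article doesn't specify how Hawaii's program works internally.

```python
# Hypothetical sketch of the aggregation step described above.
from collections import defaultdict

DISPATCH_THRESHOLD = 3  # corroborating sightings needed for minor issues (made up)

def triage(detections):
    """detections: iterable of (location_id, severity) pairs, where severity
    is 'minor' or 'emergency' as judged by the onboard AI model."""
    sightings = defaultdict(int)
    dispatch = set()
    for location, severity in detections:
        if severity == "emergency":
            dispatch.add(location)            # e.g., a destroyed guardrail
        else:
            sightings[location] += 1          # e.g., a pothole or faded marking
            if sightings[location] >= DISPATCH_THRESHOLD:
                dispatch.add(location)
    return dispatch

# Three cars flag the same pothole; one guardrail hit goes out immediately.
reports = [("mile_12_pothole", "minor"), ("mile_12_pothole", "minor"),
           ("mile_12_pothole", "minor"), ("mile_40_guardrail", "emergency")]
print(triage(reports))  # {'mile_12_pothole', 'mile_40_guardrail'}
```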

Michael: I really like the idea. It's one of the few things that, at least for me, being a cynic about AI, comes across the table and I'm like, you know what?

Yeah, that's a great idea, because I'm sure it takes a lot of manpower that state transportation authorities don't have, to go out and evaluate the thousands, millions of miles of roads that they might have in their state for problems with the infrastructure. They're deploying dash cameras that they're giving out to citizens.

The citizen gets the dual benefit here of having a dash camera in case something goes wrong, so they have a record of what's occurred, but at the same time it's looking for problems with the infrastructure that the state can then go out and clean up.

Fred: If they're only looking for problems with the infrastructure, it's probably a good idea, but that's a hell of a lot of data that could be used for an awful lot of different purposes.

It'd be interesting to see what kind of restrictions the Hawaiian government's putting on the use of that data.

Anthony: If I had the camera, I would turn it around and just film myself so I could use it for my TikTok channel called Anthony. I like that idea. That’s good. Yeah, it’s a great idea.

Michael: That's just one level of it too. Identifying guardrails that have been damaged and won't be there to protect the next car, potholes, those all make a lot of sense from a maintenance perspective. But the article also notes that there are other ways of going about this.

They cite Cambridge Mobile Telematics' program that uses cell phone data to identify risky driving. I would say that you could do something similar with cameras or with other types of detection systems. But it allows them to pinpoint areas where not only may there be problems with the road system that's creating a lot of braking incidents, but maybe things people aren't seeing. I think they pointed out a part in the article where someone was attending a conference in DC and they saw on the software that there was a lot of aggressive braking on a road nearby the conference. And when they went by to check it out, they saw that a stop sign was basically obstructed by a bush and people weren't seeing it until the last minute.

So there's a lot of things that this could be used for. I'm particularly interested in systems that identify risky driving behavior, because that is something that we've advocated for to combat distracted driving, impaired driving, and other unsafe practices.

Fred: It could conceivably be used for V2X warnings to other drivers of errant behavior and dangerous intersections and impending collisions and all that sort of stuff, if it has suitable identification and suitable software behind it. It's got a lot of potential. We'll see how it works out.

Yeah. And we’ll see if it’s restricted to benign uses.

Michael: Yeah. That's the critical thing: there are a lot of technologies coming out that seem to hold promise in certain areas, but they also have a dark side. I don't know that we'll ever dive into the black hole that Flock cameras are. They could have great uses to stop crime, but they also have significant privacy issues with deployment and with the types of things that they can track, and they have security issues.

Anthony: The article, of course, at the end wraps up with a sentence like this: experts in AI-based road safety techniques say what's being done now is largely just a stepping stone for a time when a large proportion of vehicles on the road will be driverless.

Michael: Yeah. When is that?

Anthony: I don’t know, but I love it.

Experts in AI-based road safety techniques. That's one guy. Yeah. Just say it's Phil, and there's no way Phil said that driverless cars are gonna take over the entire world. Jeez, come on.

Fred: What's the phone number? Did they get a phone number? 'Cause we'd like to have 'em on here.

Anthony: No. Who is this person, and how would we not know them?

Michael: They're a CEO. I think it's the CEO who worked on the driver dash cam program in Hawaii.

Anthony: I don't know. They left this a little vague.

Michael: You know how CEOs like to tout AI technology. Come on. Have you had your eyes open the last five years?

Anthony: So we'll keep an eye on this if we get some more news outta this, but potentially we give it three thumbs up. Yes. All right.

Slow Down, Move Over Laws

Anthony: Moving on. AAA. So I've noticed this driving in New York: whenever there's a car pulled over on the side of the road, no one gets out of the lane and gives space. It's disturbing.

I see people speed up, like, yeah, I can get past this incident faster if I apply the gas. But when I've traveled in, for example, Montana, everybody went way outta their way to make sure we gave a huge berth to whatever was on the side of the road. Yep. Which I thought, culturally, this is great.

I don't know why, but that's what happens. Anyway, AAA has an article titled "Confusion Over Slow Down Move Over Laws Puts Roadside Responders at Risk." Quoting: while every state and the District of Columbia has a slow down, move over law, many drivers don't fully understand what those laws require.

It says it in the name. In a forthcoming AAA FTS national survey, two thirds of drivers said they've heard of slow down, move over laws, but many couldn't say whether their state has one or what the law entails. Again, it's slow down, move over. Drivers are far less likely to move over or slow down for tow trucks or stranded vehicles than for police, revealing a widespread misconception about who the laws protect.

This is the perfect name for a law. It is not the Make America Great and Beautiful and Happy Again law, which doesn't mean anything. It is slow down, move over. Where's the confusion?

Michael: Look, I think the confusion here, and this is a very thorough study by AAA.

I don't even know if I made it all the way through it, because it was about 130 pages long, but I think I got to the most salient points, which are that state laws are really all over the map here. In the states they surveyed, I think there were 13 of them, they actually looked at video of vehicles going down the road.

Only two out of those 13 states, I believe, had the slow down component as part of their state law. I think the majority of states simply have the move over component, which is great, but there's a lot of inconsistency between states. In most of the states these laws are gonna apply anytime you have police officers or fire trucks or emergency crews on the side of the road.

Some states don't protect things like tow truck drivers on the side of the road. And a lot of states don't protect things like a vehicle that's stopped on the side of the road. I think the most negative takeaway from this is that drivers don't know about these laws.

Or even if they do, they're not following them, because they don't think there's any chance of enforcement. If there is a random car parked along the side of the road and you don't move over and there's not a police officer there, there's not gonna be any enforcement. But a lot of the problem with this is that it seems like a lot of people don't even know that these laws exist.

I think we've all seen them on the roads. I would suggest that in places like Montana, or down south where I've spent some time driving, when the roads aren't crowded and you don't have a ton of traffic around you at all times, it's a whole lot easier to slow down and move over, or to move over, when there's no one around your vehicle.

It gets much more difficult when you're in fairly medium to heavy traffic and you've got people around you who aren't interested in letting you in, who aren't paying attention, who don't even see a car or a firefighter or whatever it is ahead on the side of the road. And this situation happens a lot, particularly if you're in a state that requires you to slow down and move over for any vehicle that's stranded on the side of the road, right?

I don't know how many of our listeners use Waze on their drives. I do, because I like the notifications, but the notification you're going to receive the most often is a hazard of a vehicle that's on the side of the road. And most of the time those are gonna be vehicles with no one in them that are simply stopped, that have some kind of problem, that didn't get to where they were going, and they've been abandoned or left there until someone can tow them or get them back.

Those present a safety risk, in a sense, to the driver of the vehicle approaching them. But you don't have that urgent safety issue of having pedestrians and people that are unprotected standing on the side of the road. So there are differences in state law there.

Ultimately, all that said, the study is looking at ways to make people more aware of the laws and also to make state laws more consistent, so that if I'm traveling from here to Alabama and cross through six states, I don't need to be aware of the law of every single state while I'm driving.

Anthony: Michael, is there a windshield wiper law, that is, if it's raining, put on your windshield wipers?

Michael: I think those are all state-based laws, but yes, there are windshield wiper laws. There are gonna be state-based laws for windshield wipers, for your operation of your lights. There are a lot of safety laws on the books in states to ensure that drivers are properly using their vehicles.

Anthony: I figured that was just more common sense, and that's what I see when there's somebody on the side of the road. I can see an abandoned vehicle, there's no one around. But if you see a pedestrian, or there's someone outside the vehicle, my takeaway is, it's common sense, move over. And I think the American public, the American driver, is gaslighting us, saying, I know what to do.

Come on.

Fred: They covered this in the final episode of Seinfeld, where all of the characters were arrested for criminal indifference. I think it would be equally effective to just pass a law saying don't be an ignorant...

Anthony: We should eliminate all the laws and just put that one in: the don't-be-an-asshole law.

I like that. One big, beautiful bill. Listeners, just be safe, be good. We have a ton, a ridiculous amount of Tesla news, but let's skip that for right now and let's go to... where is it? There we go.

California’s Lemon Law Changes

Anthony: Let’s go to something that’s near and dear to the Center for Auto Safety and it’s lemon laws.

The state of California, the great state of California, is deciding, or has decided, to weaken their lemon law. And what is a lemon law? I could try to stumble through it, but I think, Michael, you're probably an expert in the lemon law. Can you quickly summarize for listeners what the lemon law is? And it's not to prevent scurvy, as Fred once told me.

Michael: What the lemon law basically is, just generalizing across all states, is a provision in state law that allows consumers to seek a buyback or repairs. Ultimately, it helps them achieve repairs of vehicles that are basically never going to function properly. So if you buy a car and you continually have problems with it, you might be required to go back into the dealer and have it repaired two, three, four times, depending on the state, or less if there's a significant safety issue.

In some states, you only have to have an attempted repair maybe once, and at that point you can then file with the state. Every state has its own little process. Some states rely heavily on arbitration. Some states have their own arbitration system. Some have independent arbitrators. Quote-unquote independent.

As you'll see from the article, where it cites the fact that the CEO of the American Arbitration Association is married to a top Ford executive. Ford was one of the main parties responsible, maybe the main party responsible, and GM was involved as well, I believe, in pushing this new law through the California legislature last year.

It was done in a very suspect way, using some sneaky provisions that are allowed in California's legislature to get it through without a lot of input from the public. We obviously opposed it, but ultimately what happened there was that manufacturers that sell vehicles in California now really have a choice-of-law situation.

They can choose to use either the new lemon law or the old lemon law, and that creates significant confusion for consumers. The newer law shortens the time period in which you can use the lemon law. Generally, with lemon laws in America, you have anywhere from around a year up to two to three years in some cases, depending on the language of the law,

to return the vehicle if you've experienced all of these problems with it. So California has shortened that. It's also really cracked down on attorneys who are representing clients who have lemon claims, and it makes it harder ultimately for consumers to get an attorney to help them go through the process, which can be daunting to consumers.

It's not super complicated, but it's always more helpful, I think, to get the services of an attorney, particularly in California, where the attorney's fees can be covered by the manufacturer after you get through the process. I don't remember exactly, but I know I've given Gavin Newsom at least one Gaslight of the Week.

This may have been the issue: him signing this law that was put through in a very undemocratic manner, a bill that impacts consumers significantly. We had problems with it then, and we've got problems with it now. Newsom promised to go back and review the situation, and that they would get some more law into place to rectify a lot of the identified problems with it.

We're waiting to see if that happens. I don't believe that's gonna be up for debate until January, when California's legislature comes back. So we're watching to see what happens here. Although I have to say, given all the movement on it last year and the changes involved, I don't know what the likelihood of California revisiting the issue will be.

Anthony: Michael, let's assume that I save up all of my pennies and I go out and I buy a new car in California, and let's say it's recalled seven times within that first year. Would I potentially qualify?

Michael: No. Dammit. A recall on its own won't be grounds, and even seven times in a row would not be grounds, because it's a recall.

They are correcting the safety defect as part of the recall. With lemon laws, most of the time when you're gonna take advantage of one, it's because your vehicle simply isn't functioning to get you from point A to point B, or it's consistently doing something that is highly and obviously unsafe.

And it's basically at a point where either it's too dangerous to drive or it can't be driven. And it happens: there's gonna be a certain, luckily small, percentage of vehicles that come off the line every year that just suck. And they're not gonna be able to be corrected.

It could be due to many different things. It can be damage that occurred during transport that misaligned the frame and screwed things up. It could be all sorts of things, especially nowadays with all the software and computing that we've got in vehicles and the glitches that we see from that.

You'll see a lot of issues that can arise under a lemon law, and the manufacturer is given an opportunity to repair them, right? In some states, multiple, far too many opportunities in my opinion. And so lemon laws essentially work to get those vehicles off the roads. You're not gonna be able to take your Cybertruck in just because it's had 10 recalls in the first year of ownership.

Anthony: But you said it, if it can't be driven. Does shame qualify as a reason that it can't be driven? Shame.

Michael: No, I haven't seen a lemon law statute that incorporates shame in the process. That's a pretty subjective measure there. Sorry for those who bought a Cybertruck.

Some people have no shame, Anthony.

Fred: Yeah, the whole government now is full of shamelessness. That's one of the qualities you need to be in a high government position now.

Anthony: Oh, but hey, if you're free of shame, go to autosafety.org and click on donate. Or if you're full of shame, go to autosafety.org and click donate.

It doesn't matter. We'll take your money either way. Continuing with this article about the weakening of the lemon laws: shame on you, California. Michael mentioned Ford, so a little shout-out to our buddy Jim. Quoting from this article: Ford did not respond to a request for comment, but the firm's CEO has said a shortage of mechanics is affecting the company's ability to make repairs in a timely fashion.

That's my second gaslight of the week. Come on, we don't have enough mechanics? We just keep producing shitty cars. What are those mechanics doing? They're busy making crap vehicles. They've had over a hundred recalls, and they're too busy to fix the crap they did. Oh, no. My head exploded.

Jim Farley, always welcome as a guest. Look, I've thrown out two gaslights, gentlemen. I know, I'm just using this as a delaying tactic before we have to get to the world of Elon. But Michael, do you have your gaslight for the week?

Michael: My gaslight’s gonna take us right into that world you speak of.

Yeah.

Anthony: Let’s hold off on that.

Waymo’s Autonomous Driving Claims

Anthony: Let’s go to the fun world of Fred’s imagination.

Fred: This is not imagination, it's real. But you guys are gonna have to vote on whether this belongs to Waymo or to the New York Times. Oh no, I like this. Get out your gaslight bingo scorecard, so you've got a tool to use to evaluate this.

Anyway, the New York Times published an article that extolled, or at least described, the expansion of Waymo to freeways in California, and the New York Times reporter either intentionally gaslit people or just didn't read the references critically. This gets a little complicated, because the article basically said that a study by Waymo that had been peer reviewed showed there'd be an 80, 90 percent, something like that, reduction in collisions if everything were automated.

Anthony: Fred, before you go on, can I ask, was the New York Times reporter's name Judith Miller?

Fred: I don't know. I don't have that in front of me, because I went and dug into the references that underlay the PR release by Waymo, and I've gotta give Waymo credit for hiring excellent PR people.

It's really astonishing how well they did things. But the underlying study talked about automated driving systems deployed, and compared Waymo's rider-only crash rate to human benchmarks, including disaggregated by crash type. So it's already confusing. That was the objective, and the methods they talked about, again, very confusing.

They compared Waymo crashes extracted from the NHTSA standing general order to other publicly available information. And it goes on, and it says data was examined over 56 million miles. I know, I gotta speed this up. Anyway, they concluded by saying the work should be considered by stakeholders, regulators, and other ADS companies aiming to objectively evaluate the safety impact of ADS technology.

Okay, so what did they neglect? The benchmarks include much more diverse vehicle types than are in the Waymo fleet, most of which do not have modern safety features. Remind our listeners that the average car on the road today is about 12 years old, Michael? Yeah, almost 13. And so Waymo's looking at collisions that are reported.

But most collisions don't involve a reported injury or airbag deployment, so that's a shortcoming. The Waymo speed restrictions may be a significant factor compared to other vehicles; that was not discussed. There's no discussion of the interaction of remote supervisors with the Waymos, their control over impacts or collisions, or any of that.

So another shortcoming: they're really, fundamentally, not fully autonomous. They're proxy autonomous. They've got human supervision. And they did not discriminate between different versions of the computer driver. They aggregated everything, and that can be done, but it's a very complex analysis that they did not include here.

They just assumed every computer-driven car was equal to every other. There's no discussion of other drivers avoiding the Waymos. There's a comparison of police reports to SGO reports, but that's not straightforward, because they're really looking at different things. And they assumed that the routes, travel times of day, and lengths of trips were all comparable for all the data sets.

But I don't think that's supportable either. Some other shortcomings in here: there was no comparison to higher-severity outcomes, such as airbag deployments or serious injuries, even though they claim in the introduction that they compared Waymo's rider-only crash rate to human benchmarks in San Francisco for just under 1 million rider-only miles.

Yet the introduction says they're looking at 37 and a half million miles. So it's internally inconsistent. And they said the Waymo crash rate reported as part of the NHTSA SGO was found to be similar in magnitude to self-reported human transportation network company crashes; in other words, Uber and Lyft and all those kinds of companies.

So that's again inconsistent with what they reported in the introduction, where they say that the Waymo crash rate is much lower; internally, they say the Waymo crash rate's about the same. So clearly this study that underlay the New York Times report has some serious shortcomings in it. There's also some good stuff in there, I think, and I hope it's true that the Waymos are in fact doing much better.

But that conclusion is not supported very well by this study. So I would give it, at best, a B minus if I were going to grade it. But the New York Times accepted it uncritically and used it as the basis of what seemed to be a rah-rah article. So that's my gaslight. And you guys tell me whether that's the New York Times or Waymo.

I don't know which one I can assign this to. It's a combo.

Anthony: I disagree. I don't think either of them is a gaslight. I think it's a lazy journalist distracted by a shiny object. And Waymo's PR, as you mentioned, is outstanding.

Fred: They’re really good at that.

Anthony: Their PR department has no idea what the hell they're talking about.

They sat down with someone who said something, and maybe the engineer said, wait, but there's a caveat, and they just walked outta the room like, hey, this is the information we got. So I don't know if you can say that's a gaslight, because I imagine with gaslighting, there's intention behind it.

Fred: Further to the Waymo side of things, I looked at their website announcing this expansion of the service to freeways, and it is a masterpiece of PR, I've gotta say, because it starts with "the open road symbolizes freedom and unlimited possibility." Wow. How can you argue with that? It says they're expanding their service territory to introduce freeways, built on real-world performance and millions of miles logged on freeways.

That sounds pretty good, except every scholarly article I've read says you need billions of miles on a given software version to authenticate claims of safety based upon the number of miles driven. So they're one tenth of 1 percent of the way to actually validating this on miles driven. But it sounds good.
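For a rough sense of why it takes billions of miles: if crashes are modeled as a Poisson process, then after n crash-free miles the one-sided 95 percent upper bound on the true rate is about 3/n, the "rule of three." A minimal sketch, using a round approximation of the human fatality rate (roughly one per hundred million vehicle miles), not figures from the Waymo study:

```python
# "Rule of three": with zero events in n miles, the 95% upper confidence
# bound on the event rate is roughly 3/n. So to claim a rate at or below a
# target, you need about 3/target crash-free miles on ONE software version.
HUMAN_FATAL_RATE = 1 / 100_000_000   # ~1 fatality per 100M miles (round approximation)

def miles_needed(target_rate, k=3.0):
    """Crash-free miles needed before the 95% upper bound falls to target_rate."""
    return k / target_rate

print(f"{miles_needed(HUMAN_FATAL_RATE):,.0f} miles just to match the human rate")
print(f"{miles_needed(HUMAN_FATAL_RATE / 10):,.0f} miles to show a 10x improvement")
# 300,000,000 and 3,000,000,000 -- millions of freeway miles is a rounding error.
```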

It also says they've closely collaborated with safety officials to seamlessly support this new phase of service. But I know from speaking with some of the Waymo folks that residual risks and mitigations of accepted risks are not revealed to the regulators. "We don't care, we don't have to" is apparently the attitude they've got on that. And they don't talk about fire suppression.

Remember, thousands of gallons of water are needed to suppress a fire in an electric vehicle, but only hundreds of gallons are available in a fire truck. So the fleets of fire trucks necessary to support fire suppression are simply not available. And I think about the Jaguar I-Paces that have the electric door handles too, because they didn't talk about extraction of people. I think they've got the old-school manual handles.

Yeah, I think they've got that. But they didn't talk about extraction after electrical failure, and they didn't talk about any warnings to other motorists and pedestrians based upon any software or hardware faults in the I-Pace that's approaching the airport on the freeway. So I'm not sure what the extent of the collaboration with safety officials was. It's good that it's seamless, but what the hell was in it?

It's not discussed in the article. So I don't know. That kind of pushes the gaslight over towards the Waymo team, but I think the New York Times' responsibility is to thoroughly investigate the articles that they're putting into print. Maybe fact-checking's a last-century thing too. I'm not sure. So I'll leave it there, and you guys tell me which way to go.

Waymo or New York Times.

Anthony: I still don't know, but I think the New York Times has a history with anything that comes out of Silicon Valley: they take it at face value. I remember in the nineties, before I moved to Silicon Valley, the New York Times had this amazing article about how it was such an incredible place, all this thriving stuff.

And then I moved up there and there's nothing. It's strip malls and corporate parks. But they described it as, oh, it's the epicenter. No, it's not a place, it's not a thing. It's a couple of conference rooms. So the New York Times, I think, is just lazy when it comes to Silicon Valley.

Fred: So it could be an early version of Kool-Aid that they’ve drunk. Something like a Zare factor.

Anthony: I think they put the junior reporters on that beat, 'cause they figure it's just fluff. Alright, there it is. That's my nominee. That's pretty good. All right, Michael, you ready? Yeah.

Michael: It may be difficult to parse out who is doing the gaslighting in that situation, but in the case of Tesla's reported safety data, it is not. It is Tesla doing the gaslighting. They have introduced a new version of, basically, their safety report.

Tesla’s Misleading Safety Report

Michael: That report has long been the subject of our criticisms because of its comparisons of highway miles to city miles, and lots of other problems.

And so they've come out with a new methodology, basically a completely new format for their page, promising everything: a seven-times reduction in major and minor collisions, and a five-times reduction in off-highway collisions. But when you look at the data, it becomes pretty easy to figure out how they got there, because there are numerous inconsistencies in the types of data they're reporting. First of all, the seven-times numbers are essentially derived from a comparison to average US vehicles, and also really old Teslas that don't have any active safety features. Essentially what Tesla's doing here is taking its vehicles that are using Full Self-Driving with their most recent software and comparing them to the average vehicle on US roads, which is 13 years old and has very few active safety features.

And doing that comparison, claiming that there is a far lower risk of collision. If you take a 2026 General Motors vehicle that's got Super Cruise on it, maybe you could make a comparison there to a Tesla. But you certainly can't make this comparison when you're taking a 2011 minivan that has no automatic emergency braking and no other safety systems on it and comparing it to the most modern thing on the road today.

That's one of many deficiencies in Tesla's safety report as it's been rejiggered on their website. And I'm sure all the Tesla fans out there are touting this as somehow proof of Elon's mastermind and success. But to us it's just more bullshit coming out of Tesla that can't be supported by responsible use of data.
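A toy illustration of the confounding Michael is describing, with entirely made-up numbers: two fleets with identical per-road crash rates can look wildly different when you pool across a mismatched mix of highway and city miles, which is why an honest comparison has to stratify by road type, vehicle age, and safety equipment.

```python
# Made-up rates (crashes per million miles) that are IDENTICAL for both
# fleets on each road type; only the mileage mix differs.
def pooled_rate(miles, rate):
    """Mileage-weighted average crash rate, in crashes per million miles."""
    total = sum(miles.values())
    return sum(miles[road] * rate[road] for road in miles) / total

rate = {"highway": 0.2, "city": 1.0}                 # same "skill" on both roads

fleet_a = {"highway": 9_000_000, "city": 1_000_000}  # mostly easy highway miles
fleet_b = {"highway": 1_000_000, "city": 9_000_000}  # mostly hard city miles

print(f"{pooled_rate(fleet_a, rate):.2f}")  # 0.28 crashes per million miles
print(f"{pooled_rate(fleet_b, rate):.2f}")  # 0.92 -- "3x worse" with equal skill
```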

It's a clear gaslight, as opposed to Fred's. And at the risk of going on too long about this, I would point everyone to Phil Koopman's Substack entry from Monday that dives down into all of the various issues there are with Tesla's data and its representations of what that data means.

And with that, that's the gaslight of the week for me.

Gaslight of the Week

Anthony: That's pretty good. So I'm gonna do a quick scoring update. Fred, you had a lot of research in yours, very in-depth, liked it, but who was the gaslighter? A little unclear. Minus some points. Michael, I think yours was very straightforward.

Tesla: obvious subject, pretty good, clear gaslighter. But I gotta give myself the win, 'cause Jim Farley's saying we don't have enough mechanics to fix these cars, which is just horseshit. And succinct.

Phil Koopman's Analysis of Tesla's FSD

Anthony: So moving further into the world of Tesla, Michael just mentioned the Phil Koopman Substack. We have a link to it in the show description, and this is great.

Phil really breaks down Tesla's FSD safety report in the way only he can, which is he takes their nonsense and very colloquially explains how it's bullshit without having to say that. It's great. For example, I'll quote from it: no fatality numbers. Fatalities are not broken out as a separate category.

Tesla claims that injury data is unreliable because whether someone dies is not recorded in the autogenerated computer reports. However, you can bet that Tesla has a good idea of how many people have died in crashes associated with Full Self-Driving, if in no other way than by counting up the incoming lawsuits.

Damn, way to go, Phil. Please, I highly recommend reading this. It's so good.

Tesla's Robotaxis and Safety Concerns

Anthony: All right, more in the world of Tesla. We've got three separate links from Electrek where they're talking about how Tesla's "robotaxis," in quotes, keep crashing despite safety monitors.

Now, just to recap for people: Tesla has quote-unquote robotaxis that are not remotely autonomous, 'cause they have a safety driver involved. The state of California has not given them a permit at all to do autonomous vehicles. Waymo has, yeah, like GM Cruise did.

Michael: And not only do they have a safety driver in the vehicle, there's a guy behind a screen with a steering wheel and a pedal that can also jump in to take control, a remote driver back at the headquarters.

So functionally, I would say they have two safety drivers, although one is compromised somewhat by latency and other issues with remote driving.

Anthony: But you're talking about the actual person in the car, who I believe is falling asleep.

Michael: Yeah. There's the one person in the car, and then when that guy falls asleep, they've got a backup in a data center somewhere.

Anthony: Yeah. Former guest of the show Jonathan Gitlin, we have a link to an article from Ars Technica, "Tesla safety driver falls asleep during passenger's robotaxi ride," where it details how this one Tesla robotaxi safety driver has fallen asleep numerous times while supposedly in control of the vehicle.

Michael: Yeah, and I'll point out too that Waymo had noticed similar issues, I believe, almost a decade ago, maybe it was 2017 or so, when they were having problems with their safety drivers.

And at that point, that's when they determined that they were going to start at Level 4 and not build vehicles that rely on any type of human supervision, because of the inherent complications there. Something that Tesla's entire stock value and business model can't do at this point.

Because if they admit that, then they have effectively undermined everything they've been promising people for the last decade.

Anthony: Remember folks, Tesla's not a company, it's a cult. From the Electrek articles, I'm gonna quote from one of them: Tesla's robotaxi currently crashes at a rate of about once every 62,500 miles.

That's with a safety monitor, with a finger on a kill switch, ready to stop the vehicle at all times. Except when they're napping.

Michael: Now, that number they came up with on October 29th, when that article was written. But if you go down,

Anthony: We'll get to the next one, don't worry. Okay. Yeah. So I wanna talk about this: we have no data on how often Tesla safety monitors prevent crashes in its robotaxis.

For comparison, the NHTSA reports list 1,267 crashes involving Waymo vehicles. However, Waymo's robotaxis have covered over 125 million fully driverless miles since inception. That's a crash every 98,600 miles, without an onboard safety monitor.
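The arithmetic behind those two rates, as a quick sketch using the figures quoted from the Electrek article, with all the caveats already discussed (different cities, different mileage, and Tesla's rate includes a monitor intervening):

```python
# Miles-per-crash comparison using the figures as quoted in late October.
tesla_miles_per_crash = 62_500            # robotaxi, WITH an in-car safety monitor

waymo_miles = 125_000_000                 # fully driverless miles since inception
waymo_crashes = 1_267                     # crashes listed in the NHTSA SGO reports
waymo_miles_per_crash = waymo_miles / waymo_crashes

print(f"{waymo_miles_per_crash:,.0f} miles per crash")          # ~98,658, no monitor
print(f"{waymo_miles_per_crash / tesla_miles_per_crash:.1f}x")  # ~1.6x farther between crashes
```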

But like Michael just pointed out, that was from October. Then we jump to November 17th: Tesla has reported three more crashes involving its robotaxis in Austin, bringing the total to seven incidents, despite low mileage and in-car supervisors preventing more accidents. The amazing thing, and this is really a gaslight as well, is quoting further down.

Tesla’s Redacted Crash Reports

Anthony: So you can see Tesla's reports to NHTSA. And according to the article, unlike other companies reporting to NHTSA, Tesla abuses the right to redact data reported through the system.

The automaker redacts the narrative for each reported crash, preventing the public from knowing how the crashes happened and who was responsible. I'll give you an example. A report will have the date, the time, the location, what they crashed into, what the results were, what the vehicle was doing. Then it gets to the narrative part, and every single one of 'em says: redacted, may contain confidential business information. Which is to say, the information is, our stuff is garbage.

Michael: It's complete BS, because that's a crash description, right? This is a crash that happened on a public road, presumably with other witnesses around.

And they're somehow claiming that it contains confidential business information. I've never seen a crash description that contains confidential business information. I'm not sure exactly what they're talking about, but they have redacted every single description from the thousands of crashes they've reported to the SGO ever since it started.

Anthony: How do you get away with that? Is it just that NHTSA won't push back on them for that?

Michael: Essentially, NHTSA is getting that description, right? So NHTSA has that information and can use it in their investigations. This is simply a matter of whether that information makes it to the public.

What that redaction does, though, is it makes NHTSA essentially the decider, and NHTSA the only one that can appropriately and fully evaluate those reports from Tesla. Scientists and academic researchers and public safety groups are all excluded from being able to do that, because we don't have access to crash descriptions, among many other fields in the SGO reporting that Tesla's redacting.

This isn't the only field, but it may be the most important one that Tesla's focused on preventing from reaching the public.

Fred: Do you think it's possible that Elon Musk, with his ability to fire anybody at NHTSA at will, and the people left behind in the government doing the DOGE hatchet work, do you think that may have been a factor in NHTSA's reluctance to push back against Tesla?

Michael: No, because this happened way before DOGE. Tesla was redacting as soon as the standing general order came up, in, I wanna say, 2021 or 2022, whenever that standing general order was put into play. Tesla was already redacting well before there was any twinkle in Donald Trump's eye of Elon running DOGE.

Now, I will say that we were very concerned about the standing general order being completely rescinded at the beginning of this current administration. That didn't happen, although there were some changes made to the system that reduced the number of reports that would be coming in.

And we didn't like that idea at all. We think the standing general order should be expanded and made permanent, if anything, so that it's not subject to political whims every four to eight years. Because this is data that is critical for evaluating how software-defined vehicles are performing on our roads, whether they're partially automated or fully automated.

Anthony: Oh boy. But I think it could be, because Tesla's not really gonna be a car company anymore. It's going to become a robot cop company. Oh God.

Michael: Yeah. I mean, in that Ars Technica article, I would advise our readers to take a diversion from auto safety for a minute and click on, I think it was a story from, was it the Independent in the UK?

Yeah, it's the UK. I don't know much about the Independent or how good their journalism is, but they were using quotes from Elon in their article. Effectively, Elon was questioning why we have jails if we could have a robot that follows criminals around.

And this is a prediction that only ketamine could create. Yes, that's part of it. It's insane. And the one that's even crazier is that they're gonna start uploading basically a recording of human brains into robots via Neuralink in the next 20 years, which is just absolutely absurd.

And I can't believe that someone who has ideas that bad is as rich and powerful as they are in America.

Anthony: Sticking with auto safety issues and Tesla.

Tesla’s Legal Battles Over Autopilot

Anthony: The third article from Electrek. Gonna quote: Tesla has settled another lawsuit involving its Autopilot driver assist system, which is alleged to have caused a crash.

This time, a 2020 Model Y on Autopilot crashed into a stationary police vehicle in Texas. This is Tesla's fourth known settlement in lawsuits involving Autopilot crashes since losing its first trial earlier this year. How many more lawsuits are lined up? Because I remember Elon saying, I don't settle lawsuits even when I know I'm wrong.

I wanna fight this. And I think his lawyers know: dude, you're wrong, man. We settle this, we throw some cash at them, they shut up, it doesn't go to trial.

Michael: Essentially, since the Benavides case was decided in Florida, they have settled four cases in a row. But there remain other Autopilot cases and Full Self-Driving cases.

I'm sure there are cases that are either in the works or coming, and also cases that are active right now. I don't believe Tesla has a blanket policy of settling all of them. But this isn't something that's uncommon when a manufacturer has faced its first large decision, a negative decision for Tesla, a positive for safety.

When that happens, you'll typically see a manufacturer start to capitulate in its other cases, because the tide of public opinion and established judicial decisions are going against them and their arguments. But as we discussed when we talked about that case, different states have different ways of assigning fault and evaluating negligence cases.

Florida happens to be one where you can have a driver who is 60 percent at fault and Tesla who is, what, one-third at fault in that case, and Tesla can still be on the hook for damages. Not all states operate like that. So a lot of Tesla's decision making is going to come down to where the lawsuits are filed.

Fred’s Autonomous Vehicle Safety Checklist

Anthony: Let's move on to the Tao of Fred, as Fred will give us another update on the autonomous vehicle safety checklist, the computer-driven vehicle operators regulatory safety approval checklist, which we've gotten some really good feedback on so far. And people in the industry are saying, hey, I like this, here are my ideas.

Fred: Yeah, I'm pleased by the response. We've still got time for folks to send in additional comments before we publish the new edition, so if you've got any thoughts about this, please take a look and send us a note. So I'm gonna go to the next one on my list. Remember, this is designed for people who are reviewing applications to operate self-driving vehicles, or computer-driven vehicles, in their district, whatever their responsibility happens to be. The next question they should ask is: has the applicant shown that the subject computer-driven vehicle will promptly and safely comply with emergency modifications of, or restrictions to, the approved operational design domain?

This is oriented towards things like fire events, where you've got to block off a highway or restrict passage of vehicles, or police events, or concert crowds being released, all those kinds of things where you need to send out a notice: stay away from this area. Some don't. There've been a lot of problems with Waymos converging on an area and blocking traffic, to the point of people frequently calling them in; with Teslas, of course, just 'cause they don't care. Anyway, this should be part of the application process.

And anyway, this should be part of the application process. Next one is has the applicant crash tested the subject, computer driven vehicle to prove that post crash safety of its occupants? Now this goes back to the [00:45:00] F-M-V-S-S for crash testing of vehicles. There should be no exception for. Computer driven vehicles looking at you zoo’s on your oddly shaped vehicle and the poor restrictions on people’s sitting.

So moving on. Yeah, go ahead.

Michael: That's an interesting one, 'cause they have the side-facing seats and these new kinds of airbags to protect passengers there. And that's something that is fairly novel, and something that, in the area of buses with side-facing seats, I think NHTSA has struggled with in the past, with the crash testing for this.

Anthony: For the computer-driven vehicle, I wonder, is there some mechanism to ensure that the computer driver survives the crash, so we can retrieve data from it?

Michael: Yeah.

Fred: A black box.

Anthony: Like a black box, exactly.

Fred: Yeah, we cover that in one of these items that's coming out. Excellent. Okay, let's see. Check that one off.

Now we're gonna do it again: has the applicant disclosed plausible, significant computer-driven vehicle operational risks to occupants, vulnerable road users, or property that it has not mitigated? So what's behind this? The definition of safety that all of the manufacturers are using is that safety means the absence of unreasonable risks.

So that's pretty nebulous, but it has a couple of implications. One is that risks exist, right? So people have to acknowledge that risks exist. And if you acknowledge that risks exist, then you have to sort them into risks you consider to be unreasonable and risks that are reasonable to consider. And the ones that are reasonable to consider, you have to decide whether they are going to be accepted

And the ones that are reasonable to consider, you have to decide whether they are going to be accepted. As a residual risk that you’re just going to say this is, life is [00:47:00] hard. We just have to do this. Or whether or not the risk has been mitigated by saying we know how to do this and here’s the engineering application we’ve put in place to make sure this risk doesn’t impact people.

So there's three categories you have to sort the risks into. Now, the current practice is that nobody reveals the risk profile or the residual risk to regulators, because they don't have to. For an airplane, the risk profile is no more than one crash per billion operational hours for any vehicle, and they have to establish that to the satisfaction of the government regulators.
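To put that aviation figure in perspective, here's a rough worked example, assuming, purely illustratively, the one-per-billion-hours budget Fred cites applied to cars; the fleet exposure numbers are round public approximations, not anything from the show:

```python
# What an aviation-style risk budget would imply for the whole US fleet.
RISK_BUDGET = 1e-9          # catastrophic events per operational hour (aviation-style)

US_VMT = 3.2e12             # ~3.2 trillion vehicle miles traveled per year (rough)
AVG_SPEED_MPH = 30          # crude average across city and highway driving

fleet_hours = US_VMT / AVG_SPEED_MPH          # ~1.1e11 vehicle hours per year
expected_events = fleet_hours * RISK_BUDGET   # events per year at that budget

print(f"{expected_events:.0f} catastrophic events per year at the aviation budget")
# ~107 per year, versus roughly 40,000 US road deaths today -- a huge gap.
```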

You'll remember the 737 MAX 8 crashes, where things didn't go well. There is no similar requirement for vehicles at the federal level, so state regulators, local regulators, should ask the developers the question: what is the risk profile, and what risks have you accepted? What do you consider to be reasonable?

And what risks do you consider to be unreasonable, such that you haven't tried to mitigate them? Moving on: has the applicant disclosed and shown evidence of risk mitigation for the risks it has determined to be plausible, acceptable, and reasonable for occupants of the subject computer-driven vehicle? So this covers things like vehicles approaching an airport, right?

We've got a lot of pedestrians, got a lot of traffic. There are certain risks involved in that. And in conventional industrial machinery development, you put up physical barriers as appropriate to keep the dangerous machinery away from vulnerable people. Computer-driven vehicles don't do that, so the applicant really needs to disclose the risks, how it manages contact between the hard vehicle and the soft human bodies, and where that's going to end up.

And why have you decided that it's okay just to zoom in with your Waymo into an area where hundreds of people are standing around and do your thing, drop people off, pick people up? To me, that seems like an inherently risky procedure, but these things are not often discussed in terms of regulation.

The final one for today, and this is related: has the applicant disclosed and shown evidence of risk mitigation for the risks it has determined to be plausible, acceptable, and reasonable for other motorists, public safety officials, emergency responders, vulnerable road users, and property outside of the subject computer-driven vehicle?

It's a hazardous device. It's got a lot of kinetic energy, a lot of potential for injury and damage to other property. Has the applicant actually discussed with the regulators what the consequences could be, and how they've developed technology and standards so that this is not something that is excessively hazardous in the operation of this computer-driven vehicle under license?

Is that enough for today, folks? Or do you wanna keep going?

Anthony: Yeah, I think that's a good little taste of this as we go on through the weeks. Listeners, please send in more feedback. We've been getting some great interest, great feedback, incredibly helpful. Apologies if you don't get a response from us right away, but I promise you the three of us are like, ooh, this is great.

And maybe that's just how it goes in my head. I know Fred gets excited about it.

Fred: Absolutely. So pick up your phone when you're at the Piggly Wiggly, email it in to contact@autosafety.org, and we'll be sure to add it to the list. And we thank you for your contribution.

Anthony: Exactly.

Automotive Recalls: Chrysler, GM, Toyota, and Honda

Anthony: Let's jump on to the world of recalls.

First up: Chrysler, 112,859 vehicles, the 2023 to 2025 Jeep Grand Cherokee 4xe and 2024 Jeep Wrangler 4xe. These are plug-in hybrid vehicles. Sand from the casting process can contaminate internal engine components, leading to a catastrophic engine failure, which can result in a vehicle fire or unexpected and unrecoverable loss of propulsion.

Wait, they're still sand casting the engines? I didn't think that was a technique still used in high-end manufacturing.

Fred: Yeah, it still is. It's not a bad technique.

Anthony: I just thought they moved on to something else.

Michael: It works great, as long as you make sure it's all cleaned out at the end, which it looks like is the problem here.

Anthony: Ah, okay. So what do the owners of these vehicles do, Michael?

Michael: I assume they're gonna go in and have their engines inspected to see if there is sand remaining in the engine. I don't know how you would do that, though, once all sorts of other fluids and things have mixed in there, but I'm sure they're smart enough and have figured out a way.

But it may be a minute before people are able to do this. It's gonna be at least a month and a half, it looks like, before owners even receive an interim notification. They're still developing the remedy here. So we're not really sure exactly how they're gonna address it, but I am guessing they're not just going to go out and repair and replace 113,000 engines.

I'm sure they're going to try to find a way to inspect those engines to determine if there's any residual sand from the casting process. And at that point, if they do find it, are they going to replace your engine, because it might already have been damaged by that sand? Or are they going to clean out the engine that you have and try to move you forward with that engine?

We don't know yet. Hopefully by the end of December, when owners get that interim notification, there'll be a little more information available.

Fred: If the sand is gonna show up in the filters, unfortunately it will have already gone through the bearings and other moving parts, and potentially have done permanent damage to the engine. So I hope the inspection includes not only the actual physical presence of the sand, but the components that may have been damaged by the circulation of the sand through the engine before it gets caught by the filters.

Michael: So if they find sand in the filter, they should be replacing the engine. That's what our line would be there, right?

Anthony: I think that's right. Yeah. If not, lemon juice, and it becomes a lemon law thing. Never mind. Moving on. Next up: General Motors, 2,099 vehicles, the 2014 Buick Verano and 2014 Chevrolet Cruze. GM has decided that a defect that relates to motor vehicle safety may exist in certain of these vehicles.

The left and right side roof rail airbag inflators, which are located in the roof rails above the headliner, may contain a manufacturing defect that could result in rupture at the inflator weld joint. Oh no, is this another one of these things where the inflators rupture and shoot shrapnel?

Michael: Yeah, this one looks like a Takata-type failure. And in this case, you're talking about a roof rail, so they're up near your head. I don't know that that's any more or less dangerous than them being right in front of your chest like the Takata ones were.

But it's certainly a danger. These vehicles are over 10 years old at this point, and it looks like it's a function of corrosion over time, which suggests that the older these vehicles get, the more dangerous they will get. So if you own one of these vehicles, you need to really be on the lookout for your notification that's coming.

It looks like just before Christmas, so you should be able to start planning your visit to the dealer to get a replacement right around Christmas.

Anthony: And in the time being?

Michael: In the time being, wear a helmet. Like we told a lot of people in the Takata recall, this is a very personal decision.

Some people were absolutely petrified of driving those vehicles, for good reason. Others were looking at the data and saying, this is a one-in-a-million chance, I'll take the risk. It is a very personal decision, and a lot comes down to how much money you have, what kind of resources: do you have the ability to rent a car?

Do you have the ability to take a rideshare, or do you have alternate means of transportation? Some folks are stuck with one car and have to be at work every day, or take their kids places, and they don't have a lot of choices there. So it's a very personal decision as to whether you're going to drive a car with an active recall and play the numbers game.

Anthony: I'll give you an update on my airbag. So I got it repaired, and it was 600-and-some-odd dollars for them basically to say there was a loose cable. And I'm like, how is it that much money? They're like, nothing was broken, it was just somehow the cable got undone, but we had to retest and rediagnose the systems afterwards.

And of course I had them fix it, because I don't wanna deal with the shame of you two's looks.

Michael: We're glad you're safe.

Anthony: Expensive for a cable. Anyway, next up: Toyota, 126,691 vehicles, the 2024 Lexus GX and the 2022 to 2024 Lexus LX. These are equipped with a specific V35A engine that contains crankshaft main bearings, which allow the crankshaft to rotate within the engine assembly while running. During specific production periods, there's a possibility that engine machining debris of a particular size and amount may not have been cleared from the engine.

Oh, it's not sand, but this could be a bad thing. This can lead to potential engine knocking, rough engine running, engine no-start, and/or engine stall. That's not a good one.

Michael: Nope. You don't have the fire risk you saw in the Jeep recall with the sand, but you've got the engine stall while driving, the loss of motive power.

That can certainly increase your risk of crashing. And it looks like they're going to be putting out a final remedy and owner notifications right around the start of next year, so owners should be on the lookout for that.

Anthony: All right, last recall. And I'm gonna be honest with you guys, I'm a little nervous to read this one, 'cause I haven't read it in advance and I know what Michael does to me. So: Honda, 256,603 vehicles, the 2023 to 2025 Honda Accord Hybrid. Due to improper software programming, the integrated control module can reset while the vehicle is in operation. There's nothing to do with a rearview camera!

Michael: Yeah, I think this is our first week in a while where there wasn't one. Happy birthday, Anthony. This one is basically a computer that's sensing certain communications as abnormalities and essentially restarting itself. But this can occur while you're in the middle of driving, which is a very bad thing.

You can lose motive power. Similar to the last recall, loss of motive power is something that can be very dangerous, particularly at speed and on highways, but dangerous generally. With this one, it looks like folks are gonna be getting notifications around the first week of January, and they're gonna reprogram the CPU that's having these problems.

Fred: It's just a reminder: we've been advocating for an inspection regime for heavy trucks that includes the functional capabilities of the computer. This is a great example of safety-critical or life-critical faults that are purely logical, but can affect the life and wellbeing of the driver and the people around the vehicle, because of its behavior.

So really, the inspection process needs to be expanded to include the safety-critical functions that are inside the computer and not visible to inspectors. End of rant. Thank you.

Anthony: I don't even think that was a rant. I think that was just common sense.

Conclusion and Final Thoughts

Anthony: And with that, ladies and gentlemen, that's our show. I hope if you've learned anything this week, it's this: if Elon said it, it's probably a lie. Till next week. Bye-bye. Thanks, everybody.

Fred: Bye-bye. Thanks for listening.

For more information, visit www.autosafety.org.