The decline of Tesla? And Happy New Year


note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.

Anthony: You’re listening to There Auto Be A Law, the Center for Auto Safety podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Yes, sir. Great. Hey, Happy New Year, listeners. Welcome to 2024, the year of the flying car. Wow. Is that coming? I don’t know if it’s coming. It was supposed to when I was a kid. Supposed to have flying cars and highways everywhere. And then we realized that highways everywhere is a dumb idea. That’s been the

Fred: technology of the future ever since, ever since I can remember, and probably always will be.

Anthony: But that’s not what we’re here to talk about. Let’s start off talking about, it’s a new year, and there’s a new movement I see of people being less afraid of Tesla, and they’re willing to point out that, hey, it’s not as good as it claims. Which is, in years past, people were like, no, don’t say that.

There’s still a bunch of people who say, no, don’t say that, we worship at the altar of Elon. But more and more people, including the Washington Post, are saying, nah, it’s not that good. The Washington Post has an article, we’ve got a link to it, that starts off: Last weekend, my Tesla Model Y received an over-the-air update to make its driver assistance software safer.

In my first test drive of the updated Tesla, it blew through two stop signs without even slowing down. Which sounds bad, but I’m slightly confused actually by this article, because, and this is honestly just a Tesla marketing problem, I think the guy just had Autopilot. He doesn’t have Full Self-Driving.

Now, Autopilot, you’d think, would be auto driving, but it’s not. It’s basically just like what my Toyota Corolla has, which is like, hey, we’ll keep you in the center of the lane and we’ll do automatic cruise control. Is that what you guys read in this article, that he just has that, versus the Full Self-Driving, which is, hey, we’re going to make unprotected left turns and crash?

Fred: I don’t know, but if he’s confused about which one he’s using, that points out one of the problems associated with that technology, doesn’t it?

Anthony: I’m more confused by his writing.

Michael: He is looking specifically at Autopilot and not Full Self-Driving. You can enable Full Self-Driving on city streets; Autopilot and Autosteer functions should only be available on controlled or limited-access highways, but people are being allowed to turn them on wherever they want because Tesla hasn’t put safeguards around it. That article is very similar, brings up a lot of the same points that we saw last week from some of the Consumer Reports folks, who were saying you can cover the driver monitoring that makes sure the driver’s paying attention, and nothing happens. You can continue to use it, which is clearly a problem.

And also the steering wheel nags, or whatever they’re calling it, basically the system Tesla uses to make sure the driver has their hands on the steering wheel. The Post columnist said that he removed his hands from the steering wheel for 60 seconds or more without any intervention by the vehicle to say, hey, are you there?

So between this and the Consumer Reports articles on this, it’s pretty clear that the Tesla recall is not addressing at all the problems that are inherent in the system. They’re not putting more controls and safeguards around where the technology can be enabled, where you can turn on Autosteer and Autopilot, and that’s critical.

That’s critical to any of these advanced driver assistance systems is that they be controlled and safeguards placed around where you can activate them. So you aren’t able to activate them in the middle of a city where they aren’t designed to perform.

Anthony: But a lot of it also, there seems to be really confusing nomenclature around Autopilot and Full Self-Driving. I’m still very confused by this, and we talk about this all the time.

Michael: Just basically, for our listeners, the Autopilot and Autosteer functions are more like what you have in your vehicle, and they may be a little more advanced.

They now use cameras to look at speed limit signs and other things, to basically try to provide the driver with assistance and to make the drive a little easier. What we’ve seen from that is that it makes people get distracted, people zone out, people aren’t paying attention to the driving task when the car is doing these things for them.

And that leads to problems when that person needs to take over. Autopilot is obviously a misleading term. This is not autopilot. It is an advanced form of what a lot of people already have on their cars. But Full Self-Driving is a different animal, one that not only costs a lot more for Tesla owners, but also can be enabled in a lot more areas.

And we’re also questioning the safety of Full Self-Driving at this point as well, so they’re both under question from a safety perspective. The NHTSA investigation that the recall came out of was tuned towards the Autopilot and Autosteer functions, not Full Self-Driving, which I hope they’re evaluating as well.

Anthony: Okay, Full Self-Driving. There’s a great article in TechCrunch, and normally TechCrunch is just, hey, anything out of Silicon Valley we love. Our friend Kyle would regularly turn to TechCrunch, and they would just believe whatever he said about what GM Cruise is doing. But they had a long article where they evaluated all of these, I don’t even know what to call them anymore, but basically Ford’s BlueCruise, GM’s Super Cruise, the Mercedes Level 3 system and their advanced Level 2 systems, and Tesla’s Full Self-Driving.

And they’re like, hey, which of these driver assistance systems is the best? And I’ll just go right to the end of the article. Quoting from the article: The intent of this roundup wasn’t really to pick a winner, but it’s still pretty easy to choose a loser.

Three years after its initial beta release, Tesla’s supposed Full Self-Driving still doesn’t live up to its name. As the most expensive option here, it is easy to single out. That’s harsh. And it also pointed out that, yeah, people pay 200 a month for beta software. They complain in the article that, oh, Ford charges 75 a month for its system, and about Ford’s system they’re like, we really enjoy this system.

This is actually really well done. Fascinating. Hey.

Michael: Yeah, I don’t think there’s anything in the article that’s really news to anyone that’s been listening to us: that Tesla is really focused on getting this stuff out and selling it to consumers, and selling people on the idea of what the system may enable you to do one day.

But right now, if you hop in a Tesla and turn on these systems, you’re not getting, well, maybe some people will, but a lot of people aren’t getting an experience that makes them feel safe and taken care of, and that the system is going to be there to protect them when something goes wrong.

I know a lot of people who own Teslas, and to a man, or to a woman in many cases, almost every one of them says they don’t turn those features on, because those features freak them out and make them feel like they’re not safe. Now, we haven’t heard too many complaints about the Chevy Super Cruise or the Ford BlueCruise. The Mercedes and the BMW highway assistants are probably not out there quite as much yet. And the Mercedes system is very limited in scope as to where it can be used, California, Nevada, and very specific areas and conditions. But BlueCruise and Super Cruise have been activated and geofenced on controlled-access highways for at least a year or two now, and we’re not seeing the type of incidents being reported to the government or being reported in the news that we’re seeing with Tesla.

Fred: What’s new for me is the mechanism of squandering your money. I thought that the Full Self-Driving was a lump sum, $12,000. This article suggests that you can also squander your money 200 bucks at a time per month.

I guess they’re, I don’t know, maximizing the options for squandering. I like that though. It’s nice.

Anthony: Freedom’s good. It’s their business model. They’re the only large corporation I can think of that basically regularly does a GoFundMe, where they’re like, hey, you want a Cybertruck?

Give us 200 bucks and you’ll be on the waiting list, and in five years we’ll deliver this thing, at least 10 of them. But that’s neither here nor there. It was pretty neat, ’cause, obviously, I only drive my car, and I’m used to what my car does. For its automated lane-keeping assist thing, you have to have your hands on the wheel, and it has a torque sensor.

So you’ve got to put some weight on the steering wheel for it to know that, hey, you’re still there. And even then, sometimes it wants you to tap the wheel a little bit. But these other systems, their steering wheels are capacitive, so you can just rest your hands there and it can feel, oh yeah, we got some sort of galvanic response.

Look at that, Fred, I’m trying to impress you with some tech words.

Fred: Sorry to interrupt, I can’t remember if you have a Maserati or a Lamborghini these days.

Anthony: Hey, which one runs off of meatball sauce? That was a callback to episode one.

Michael: That’s gotta be more expensive than gasoline.

Anthony: Yeah, but the sounds it makes are incredible.

So continuing with this Tesla thread, CNN has an op-ed piece from William Wallace, who’s the associate director of safety policy for Consumer Reports, and he has this great analogy about a fictitious lawn mower manufacturer. He says, imagine this: basically, this lawn mower goes out there and starts killing people, and the manufacturer doesn’t make any changes to fix the design. In fact, the company doubles down on its flawed design that kills people.

Years go by, hundreds of similar incidents occur, and more and more people die, including users of the lawn mower and people who just happen to be walking nearby. I think this is a pretty good metaphor, because people would lose their minds, they’d freak out. But if you call yourself Tesla, people just go along with it.

I think it was a good way to point out Tesla’s issues, where not only are you paying 200 a month to be a beta tester, you’re making everybody else on the road an unwitting beta tester. But at least they’re not paying 200 a month for it.

Michael: Because they’re dead.

Tesla should be paying us to test these systems and to take on the risk that’s involved, but that is certainly not the case. And people are throwing their money at the idea that one day, I keep seeing, and I don’t even know who it is, it’s some big Wall Street firm that continues to this day, years after the promise of Teslas becoming robotaxis has been fleshed out to the point where I don’t think any reasonable person thinks that’s happening anytime soon, they continue to tell people, hey, go throw all your money at Tesla because they’re going to have robotaxis.

And they’re going to have them before everybody else. Which is just absolutely insane to me when you consider everything that we’ve looked at that Tesla’s done over the past couple of years, when we just don’t see that type of progress. We don’t see them putting the technology into their cars that needs to go in there.

And in this case, in the article that we’re talking about from Will, we don’t see them making even a remote attempt to correct the issues that NHTSA has identified in Autopilot and Autosteer. They’re just saying, oh yeah, we’ll put out a patch and fix that. They didn’t. They didn’t fix it at all.

They didn’t do a lot of the things they said they were going to do in the recall. And, like they call for in the op-ed, we need NHTSA to be a stronger legal backstop here. We need them to go after Tesla in this case, because clearly they haven’t done that strongly enough.

So far, they’ve allowed this Autopilot mess to continue. We’re seven years after we started seeing this problem, and there’s simply not been enough done by NHTSA to fight back against Tesla’s mound of attorneys and whatever else they’re using, attorneys and PR, to try to prevent this fix from happening, to prevent these vehicles from being geofenced, from ensuring that they’re operating where they’re supposed to be operated.

There are very simple solutions there that could be implemented, and Tesla’s simply not doing it, and NHTSA’s not forcing them to. And I think at this point, we’ve seen this recall, we see that it’s not working, and Tesla really hasn’t done anything to counter some of the issues that were brought up in the investigation.

There needs to be stronger legal action. There needs to be something along the lines of forcing a recall, like what we’ve seen with the airbag situation.

Anthony: How does that work? NHTSA says, hey, you’ve got to recall this for problem X, let’s say something with a seatbelt restraint system. The manufacturer says, okay, we recall this, we fix it.

Yeah, this is what we’re doing. And NHTSA puts a

Michael: rubber stamp on that every time.

Anthony: So then the manufacturer just gets caught again for not doing it?

Michael: What NHTSA has is a choice to become more involved in that process and say, hey, we’re not going to accept this as a recall fix.

This isn’t good enough. It’s clearly not good enough, or you’ve already tried this and it’s not good enough, or there are more vehicles that this applies to. There are a lot of things NHTSA can do. They’re just not doing them. They’re just

Anthony: not doing them. Okay. In the past, has it done that?

Has it been a recall on top of the recall, like?

Michael: Yeah, we’ve seen it in some other situations. When we go through the recall roundup or the investigation sections, we talk about audit queries and other things; there are various ways NHTSA can go back into a recall and look at the fix or the remedy, see how it’s performed, see if it’s working, and if it’s not working, go back and order the manufacturer to do something else.

In this case, I think they should just stop playing the game with Tesla and just order them to put in a fix, put in a driver-monitoring fix, make sure that the driver-monitoring camera is not able to be defeated by a sticker, right? Make sure that the steering wheel sensing that’s supposed to be monitoring whether the driver’s holding the steering wheel and paying attention to the road, make sure that’s actually accurate and working.

And then it’s an effective way of monitoring whether a driver’s in control. Ultimately here, you just want to make sure that Tesla’s not just blowing smoke up your ass every time they announce a recall and a software update that does nothing, which is basically what happened with this last Autopilot recall.

It’s gotten a lot of press coverage. A lot of people were like, oh, they finally got Tesla. Did they? I mean, Tesla put in a software update that took them who knows how long to design, maybe 10 minutes. It doesn’t look like something that’s been in the works for a long time. It looks like something they threw together, pushed out in a software update, dusted their hands off, and said, okay, we’ve done the recall.

But when it comes to how that recall actually impacts the safety of the vehicles it applied to, it really doesn’t do a whole lot. And it certainly doesn’t prevent the activation of these systems in areas where they aren’t supposed to be activated. And I think that’s the area where NHTSA really needs to dial in here and make sure that

No Tesla features, whether they’re autopilot, autosteer, full self driving, or whatever bullshit they come out with next, that none of those features are able to be activated in areas where they haven’t been tested and validated to ensure that they’re safe.

Anthony: Are there any sort of penalties given to these manufacturers when they put out recalls that aren’t correct?

So when it’s a physical recall, when you have to actually bring it in, there’s a cost to the dealer, because they have to physically replace a piece, something like that, whereas with a software update, that’s just, we’re going to pay these engineers anyway. There’s no real cost to their business of doing this.

It’s not like they had to hire new software engineers. They’re not paying like some ethernet fee or,

Michael: Wi-Fi fee. I guess they have to divert the resources those engineers would be working on at the time over to fixing a previous problem. But there are no parts, you’re not paying dealers to do this, consumers aren’t having to get on the roads and come into dealerships to get the recall done.

So there are far fewer moving parts in an over-the-air update, which is why we’re seeing a lot of manufacturers prefer them in circumstances where there may be a physical defect that needs a physical fix, but an over-the-air update is cheaper, even if it’s less effective. This is a problem that we’ve been dealing with ever since we saw software updates start to come into play in vehicles. If a manufacturer sees that a software update is going to cost them a million bucks to push out, where a physical fix to the defect is going to cost them a billion dollars, what do you think they’re going to choose every time? And if the watchdog, NHTSA, is saying you can do whatever recall you want, just call it a recall and put it out there, and we’ll dust our hands off and call this thing done, without probing deeply into that recall to figure out whether it actually works, what are you going to do?

Anthony: But is there any mechanism for NHTSA to apply fines to a company for doing a BS recall or a BS fix?

Michael: There is. Now, whether or not they pursue that, or whether or not they have the wherewithal or the legal and technical support at the agency to actually pursue that type of thing, is the other question.

It’s probably easier when you talk about a physical defect, where you can see whether or not that fix has been applied to the vehicle and whether or not it works, whereas software is a little harder for NHTSA and for all of us to evaluate. And Tesla knows this. That’s why most of their recalls are OTAs.

It’s rare that you see a Tesla recall that’s actually fixing a physical problem with a physical repair.

Fred: Michael, what you just said suggests that Tesla should only be allowed to use its automatic driving assistance systems in areas where they’re proven to be safe. That’s nowhere. They’re not safe anywhere, based on the data that we’ve got. Is that NHTSA’s dilemma, that actually enforcing that would put Tesla out of business?

Michael: I don’t know about that. But let’s think about one specific problem that’s led to a couple of crashes here: the tractor-trailers that pull across the road. The driver’s on Autopilot or Full Self-Driving, whatever it might be, and the vehicle simply doesn’t see the tractor-trailer.

Either it’s a camera issue or programming or whatever reason, but this has happened a couple of times, and even if the car alerts the driver, it’s not in time or not strong enough to, you know, wake the driver up or pull the driver away from whatever other task they were focused on in the moment when they missed the truck. And you don’t have big semi trucks pulling across the middle of the road on controlled-access highways and interstates.

I guess you could say there’s a problem here in that Tesla’s allowing that technology to be turned on in areas where that situation can occur. If Tesla took actual steps to limit or geofence these vehicles, specifically so that technology can only be used on limited-access highways, where that crossing problem for semi trucks doesn’t exist, then you’ve functionally fixed the problem.

But that hasn’t even happened here. And I don’t know if NHTSA has an appetite to require it. General Motors and Ford are already doing this on their advanced driver assistance systems. Why not go ahead and say, hey, Tesla, you really need to geofence these controls? Clearly your cars are not able to figure out when they’re in a safe place.

And there’s a way we can do this, and we’ve got other manufacturers already doing this, and it’s obvious that this is a solution here, since we know where interstates and controlled-access highways are. If you put in a software fix or a remedy that completely prevents the use of this feature in areas where you’re off the controlled-access highway, I think that’s eminently doable. Whether or not NHTSA is willing to take that stance is the question.

Fred: I think that I’d go one step further. There are a number of circumstances now that have resulted in human deaths associated with automatic driving assistance systems or AV systems malfunctioning.

And I think it’s probably incumbent on NHTSA to set up test requirements for these systems that replicate those circumstances and make sure that the vehicles it allows to be used on the roads are able to navigate those circumstances without killing anybody. It seems like a step that would be pretty easy to take, and it’s within their capabilities.

It would be really useful to the public to know that, yeah, it’s tragic that these events have occurred, but because of the diligence of the government, we know that these events will not occur in the future for vehicles with similar automatic driving features.

Anthony: It’s got my vote.

Fred: Yeah I don’t know why this is hard to do.

Engineering systems all over the world do this after catastrophes: drilling systems, offshore drilling rigs, boats, airplanes. Why are automobiles unique in the combination of lethality and lack of regulation?

Michael: Yeah, look, the whole thing is crazy to me. I was playing golf with my dad about three weeks ago.

I’m not a big golfer. I play probably a couple times a year with my dad, and I was driving a golf cart in the rough, in an area where the cart wasn’t necessarily supposed to be. And it stopped me and told me I had to reverse out of the area. It knew where I was and wouldn’t even let me drive forward anymore, and made me reverse away from the area I was in, looking for a ball that I had hit way off the beaten path.

And it just struck me that we’ve got this technology that’s geofenced. We’ve got this technology on a golf cart, on a random golf course in Alabama. Why aren’t we able to get it into vehicles that are on the roads, that are a threat to pedestrians and occupants and drivers of other vehicles?

Anthony: I think you’ve got to rephrase that. It’s not why can’t Tesla get this into their vehicles, it’s that they don’t want to.

Michael: The answer is perfect. They just don’t want to. Because it can’t be that expensive. They’re connected vehicles. They have GPS. They know where they are at all times. And it’s not that hard to fold in a map of the United States that knows where the interstates are.

They just don’t want to.
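For listeners curious what this would look like in practice, here’s a minimal sketch of the kind of geofence check being described: the car knows its GPS position, and the feature is only allowed near mapped controlled-access highway points. The coordinates, threshold, and function names are made-up for illustration; this is not any real vehicle's software.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical sample points along a controlled-access highway.
HIGHWAY_POINTS = [(38.90, -77.04), (38.95, -77.10), (39.00, -77.15)]

def assist_allowed(lat, lon, max_dist_m=50):
    """Allow the driver-assist feature only near a mapped highway point."""
    return any(haversine_m(lat, lon, p_lat, p_lon) <= max_dist_m
               for p_lat, p_lon in HIGHWAY_POINTS)
```

A production system would check against real map geometry rather than sample points, but the principle is the same: GPS position in, yes/no activation decision out.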

Anthony: No, though, the problem, about a month ago, it came down to a lack of knowledge on their part, where Tesla’s CTO, their chief technology officer, said he wasn’t familiar with the term operational design domain. Which is insane. Operational design domain is, hey, where is this thing supposed to operate? Where can it not operate?

Where can it not operate? And he, maybe he was being facetious, but, I mean I sincerely hope he

Michael: was. I hope so, too. If not, they should just shut the company down tomorrow.

Anthony: But maybe not, because, okay, so they keep claiming they’re going to have self-driving robotaxis and autonomous vehicles. Now, Fred, you’re a member of SAE, correct?

Fred: I am.

Anthony: Okay, and SAE works on autonomous driving systems and setting up standards for this, correct?

Fred: No. They’re not setting up standards.

Anthony: They, you guys have a lot of discussions about what does this mean, for like level zero to level five, right?

Fred: They’re setting up definitions. Definitions, yes. Not standards.

Anthony: Okay, great. And how many Tesla engineers have you spoken to, or been on any of these SAE meetings?

Fred: Let me think. Zero.

Anthony: Zero? Oh! Okay, so there’s this large body of engineers from every auto manufacturer coming together to work on these things, and Tesla is not represented at all.

I don’t know. Call me crazy. I don’t think Tesla’s actually working on autonomous vehicles.

Fred: I can’t verify that they’re not working anywhere in SAE. I can only verify that I’ve never seen them.

Anthony: And I think we had a guest on a couple of weeks ago that basically told us the exact same thing and said, it’d be great if there was a Tesla engineer there.

We’d want to have a conversation.

Anthony: But, hey, we can talk about Tesla all day, and we might, because Mr. Elon Musk got a letter from the U.S. Senate that starts off: Dear Mr. Musk, We write with extreme concern following recent reporting about Tesla’s knowledge of safety flaws in its vehicles and concealment of the causes of these flaws from NHTSA.

This reporting puts your statement from January that, quote, Teslas are the safest car on the road, end quote, in stark contrast with reality. We understand you don’t understand reality, that’s why you spent 48 billion dollars on a stupid, no, that part’s not in there, I just made that part up.

Michael: What were they spending 48 billion dollars on?

Anthony: To buy Twitter. And this is a letter from Senator Richard Blumenthal and Senator Edward Markey. It’s not a good day, I imagine, when you get a letter from two sitting senators that says your statements are in direct conflict with reality.

Michael: Once it happens for the fifth or sixth time, you just start ignoring them like I’m sure he is continuing to do.

This is certainly not the first letter that Blumenthal has sent Tesla’s way, maybe not all directed specifically to Mr. Musk. But this issue had a lot to do with what we talked about, I think a couple of weeks ago, about the Tesla suspension problems, where it was very clear that Tesla knew there were some issues with their suspensions, and rather than

take accountability for that problem and fix it, they blamed it on customers, which is just such a similar tack to how they approach a lot of safety issues, where they continue to say, oh, we’re not at fault here. It’s the drivers. The drivers aren’t using Autopilot properly. The drivers are not paying attention to traffic and they’re causing these problems.

Not the fact that we’ve basically told people we’re building robotaxis that will drive themselves at some point, and sold them to people who don’t understand the difference between a robotaxi and an advanced driving assistance system, or any of the high-level technological terms that people simply don’t learn about before they buy one of these vehicles and start hitting buttons.

So Tesla’s approach to the suspension problem is very similar to their approach to everything safety-related, which is: we’ve got it right. We’re going to produce some BS data. Well, we’re not even going to produce the data. We’re going to put statements out that say we’re the safest car on earth, but we’re not going to back them up with any data.

And then we’re just going to ignore whatever you guys are saying and keep doing what we do. We may release an OTA update that doesn’t do a whole lot, but when it comes to actually fixing these safety problems we see on the road every day, good luck with those.

Anthony: Alright, for you Tesla fanboys hate-listening to this, only one last Tesla story, ’cause I’m getting tired of it.

The Cybertruck had its first crash on the road, and this was up on a lovely set of roads, Skyline Drive, I believe, near Redwood City. I used to live around there, beautiful roads, I don’t know why anyone’s allowed to drive on them. But the thing out of this is, thankfully, no one was hurt, but you can see the damage done to the side of the Cybertruck. And I guess if you’re going to spend a hundred grand on a car, what’s it cost to just spend another hundred grand to get another one?

But it’s got some good scratches and scrapes, and the plastic around the wheel well, that just looks like garbage. Thoughts?

Michael: Yeah, there wasn’t much to that crash. I don’t believe it was the fault of the Cybertruck. It looks like it did a pretty good number.

I believe it was a Toyota Corolla, or a Camry, a Corolla I think, that was involved in the crash, and everyone was okay. The Corolla definitely looked like it took the brunt of the collision, which you would expect given the weight of the Cybertruck and the stiffness and all the things we’ve talked about before. But hopefully every Cybertruck crash that’s going to happen, because it’s somewhat inevitable that cars are going to crash, works out this way, where everybody’s okay.

Fred: I’ve been predicting that these would be very expensive to repair. So if anybody happens to know how much it costs to fix this little fender bender, I’m all ears.

Send us a note.

Anthony: Yeah, and the article from Jalopnik, which we’re linking to, quoting: The Cybertruck has yet to receive a crash test rating from either NHTSA or from the Insurance Institute for Highway Safety, but at least in this crash everyone made it out okay. As for saving this particular truck, repair costs and insurance rates are still big questions, so I’m sure engineers will do all they can to rescue this one. Yeah, if you work for an insurance company, it would be great if you just send us a little note, contact@autosafety.org, and let us know, like, how expensive would it be to insure one of these silly cars?

Michael: I wonder, are insurance companies even willing to insure these things, given the potentially astronomical repair costs?

We already know that insurance companies are refusing to insure a lot of Teslas, certainly not at the rates that you would expect a car to be charged for insurance, and that’s one of the reasons why Tesla started its own little insurance company for its owners.

I would wonder whether insurers are even willing to insure the Cybertruck at this point.

Anthony: All right, let’s move away from the world of Tesla. How does that sound? Want to take a little break from it? Yes, thank you. All right, Tesla fanboys, you’re done. Thanks for coming.

Thanks for those hateful comments. You’re the greatest. Go get a hug from your mom. Let’s move on to the Tao of Fred this week, okay? Because we’re in a new year, and Mr. Fred Perkins has some New Year’s engineering resolutions for the industry, which we’re excited about.

Fred: You’ve now entered the Tao of Fred.

Happy New Year, everybody. Yeah, there’s a few items here, maybe a few hundred, so we’ll put this out on the website to make it available to you, but I’m just going to blast through them here. Some of the things that we’re looking at, some of the things we’re hoping for, and some of the things we’ll be monitoring as the year proceeds.

And in my arrogance, I’ve got suggestions for just about everybody here. So we’ll start with NHTSA.

Anthony: Oh, thank God. I thought it was going to be me.

Fred: We’d like to see vehicle adaptive speed control with speed limit exceedance warnings and speed limiters, because that’s what’s killing most people on the roads these days.
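The speed control item above can be sketched as a small piece of escalation logic. This is a minimal illustration only: the function name, the margins, and the action labels are all invented here for the sketch, not drawn from any regulation or real vehicle system.

```python
def speed_control_action(speed_mph, limit_mph, warn_margin_mph=3, cap_margin_mph=10):
    """Pick an assistance action for the current speed versus the posted limit.

    Escalates from nothing, to an exceedance warning as the limit is
    approached or passed, to an active limiter well past it.
    All thresholds are illustrative.
    """
    if speed_mph >= limit_mph + cap_margin_mph:
        return "limit"  # actively cap the vehicle's speed
    if speed_mph >= limit_mph - warn_margin_mph:
        return "warn"   # audible/visual exceedance warning
    return "none"
```

A real system would additionally need a reliable source for the posted limit (map data or sign recognition), which is where much of the engineering difficulty lives.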

Second to that is automatic mitigation of drunk, inattentive, reckless, or otherwise dangerous driving behavior. This appears to be within the technology available. It would save lots and lots of lives if this were deployed in cars.

Anthony: This is the one we’ve talked about before where there’s going to be sensors that can detect if the driver has been drinking too much just from analyzing the air.

I think this one is great, and NHTSA was looking into it, so it might be a couple of years off. Or do we have hopes for this?

Fred: I think there is some progress on this, yeah. But why wait? The next one would be an FMVSS for protection of children, animals, and incapacitated people in overheated cars.

This is something that's been killing people every year. We read the sad headlines every summer about children who've been left in cars, animals left in cars. This, again, is something that is well within the technology that's available, something that could be deployed immediately.

Anthony: Walk me through this one.

So this is, I get it, the interior temperature of the car gets too hot, and then what would happen? It would roll down windows automatically or kick on the AC or something?

Michael: There's probably some combination of those things they could do. Yeah. Warning people that are nearby, warning the owner via apps, or calling the dealership or the police.

We already have automatic crash notification for after there's a crash. So if you have crash notification that calls the police or emergency responders when it senses a crash, you would effectively just plug the system into that when it's sensing a child in a hot car.

Anthony: I love this idea. This is great.
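The escalation chain Michael describes might be sketched like this. Everything here is hypothetical: the temperature thresholds, timings, and action names are invented for illustration, and a real system would hang these off actual occupant sensors and the automatic crash notification pipeline.

```python
def hot_car_response(cabin_temp_f, occupant_detected, elapsed_s):
    """Escalating response once an occupant is sensed in a hot, parked cabin.

    Mirrors the ideas discussed: notify the owner first, then use the car's
    own hardware, then the same pipeline that automatic crash notification
    already uses to summon responders. All thresholds are invented.
    """
    actions = []
    if not occupant_detected or cabin_temp_f < 90:
        return actions
    actions.append("notify_owner_app")
    if cabin_temp_f >= 100 or elapsed_s > 120:
        actions.append("vent_windows_and_run_ac")
        actions.append("sound_horn")
    if cabin_temp_f >= 110 or elapsed_s > 300:
        actions.append("call_emergency_services")  # reuse the ACN pipeline
    return actions
```

The design point is the ladder itself: cheap, reversible actions first, with the emergency call reserved for sustained or extreme conditions to limit false alarms.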

Fred: Now, moving over to some more aspirational objectives, we would like to see an objective safety standard or a type approval process for ADAS and AV operations on public roads. An objective safety standard if self-certification is going to be continued, or a type approval process if objective safety standards are unavailable for whatever reason.

And we'd like to see them be applicable at the end of service life, not just on the day that the car leaves the showroom, but all the way to the scrapyard. It should be safe throughout its operating life, and in all the operating regimes for which it's approved.

Anthony: Can you extend it to airbags as well?

Michael: That's what I was going to say.

Fred: I'm sorry, say that again?

Anthony: Can you extend that to airbags as well?

Fred: Airbags, oh sure, of course. In fact, my next item is airbag inflator qualification and lot acceptance test standards at least as stringent as the military and industrial standards for similar components not used in a vehicular context.

So why should an EED, an electro-explosive device, being used in a missile be inherently much more reliable than an EED that's used in a vehicle in millions of instances? It makes no sense to me.

Michael: So in a missile, people think about missiles as just being, you hit something, it blows up. But a lot of what goes into a missile is making sure that it only blows up when it hits an intended target, right? Which is very similar to how you'd want an airbag to function: it's only going to inflate under the conditions that have been specified by the manufacturer, not in other situations.

Fred: You want it to go off when it’s supposed to and never go off when it’s not supposed to.

And when it goes off, you want to make sure it does the thing it’s supposed to do and not something else, right?

Anthony: So we talked about this before, and I don't think any of us have figured out the answer, but Oshkosh, I think through their subsidiary Perkins, they make a lot of military vehicles and fire trucks and whatnot.

They have airbags inside these military vehicles. Do we know if they're operating at a different safety level and margin?

Fred: I don't know the answer to that. I don't know that they have airbags in them either.

Anthony: No, they do.

Fred: They may have different, completely different standards.

Anthony: Yeah, I just watched some video on them.

It was a video about fire trucks, and apparently they put airbags in military vehicles quite a while ago. Okay.

Fred: But they're used in lots of circumstances. They were used, for example, on the Mars mission that landed the rover. They had airbags on it to soften the landing, right?

Those airbags are the same technology that could be used in people's cars, but the qualification and lot acceptance testing for the airbag inflators that went into the Mars lander was a lot more stringent than for what's going into cars that people use every day. Why? That makes no sense to me. You've got a much wider base over which you can amortize the qualification costs in a vehicle application than you've got in a one-off space vehicle.

This is an inversion of responsibility, in my opinion. Anyhow, moving on. Driver notification when vehicles approach or exceed safe operational limits. Every car can be relatively safe to operate at very low speed. At five miles an hour, you can crank the wheel over as fast and as hard as you want, and the car is not going to flip.

At 80 miles an hour, if you crank the wheel as hard and as far as you possibly can, you're in a world of hurt, right? So somewhere between 5 miles an hour and 85 miles an hour, there's an operational limit. Now, for normal human beings driving a vehicle that has a sophisticated computer, why can't it warn the person when it's approaching the operational limits that are safe for that vehicle?

Why are people suddenly surprised when their car flips? You know that a lot of simulations have been done, a lot of calculations have been done for that car. It's a little visionary, but why doesn't the car give a beep when the speed of the car approaches the speed at which the car is inevitably going to go out of control?
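One way to frame Fred's 5 mph versus 80 mph example is with the static stability factor. The sketch below is a simplified steady-turn model, not a real vehicle-dynamics implementation: the function name, parameter values, and warning fraction are all assumptions made for illustration.

```python
def handling_limit_warning(speed_mps, turn_radius_m, track_width_m, cg_height_m,
                           warn_fraction=0.8):
    """Warn as demanded lateral acceleration nears the static rollover threshold.

    The static stability factor, SSF = track / (2 * h_cg), times g approximates
    the lateral acceleration at which an untripped rollover begins; the demanded
    lateral acceleration in a steady turn is v^2 / r. Values are illustrative.
    """
    G = 9.81  # m/s^2
    demanded = speed_mps ** 2 / turn_radius_m
    rollover_limit = (track_width_m / (2.0 * cg_height_m)) * G
    if demanded >= rollover_limit:
        return "limit_exceeded"
    if demanded >= warn_fraction * rollover_limit:
        return "warn"
    return "ok"
```

With a 1.5 m track and 0.5 m center-of-gravity height, a hard 50 m radius turn is harmless at parking-lot speed but exceeds the limit at highway speed, which is exactly the gap the warning is meant to cover.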

Anthony: I'm gonna one-up you on that one. Why does the car even allow you to do that maneuver when you hit those speeds?

Fred: That happens to be my next item, which is, I don't know, somewhere down in this list. But anyway, my next item is adaptation of NCAP tests to include diverse body types and back-seat occupants.

Long overdue, and something that IIHS is doing very successfully, identifying previously unknown safety hazards in the cars where it's doing the tests. That should be immediately incorporated into the NCAP tests as well. Related to NCAP again, the next item is normalization of NCAP star ratings to incentivize continuous safety design improvements. When an NCAP four- or five-star rating is merely a participation trophy,

nobody's working hard. Nobody has an incentive to work hard to make the cars even safer. The only system that's driving people to do that is Euro NCAP, which does in fact renormalize the ratings for vehicles, and it does in fact incentivize manufacturers to do a better job. And guess what?

The European manufacturers are doing a better job. So aren’t Americans worth as much as Europeans? I think we are. I think I am.

Anthony: I don't know. Have you tried our healthcare system here?

Fred: Yeah, when you get specific on that, maybe not so much. But anyway, next up would be a nationwide duty of care for ADAS- and AV-equipped vehicles. We've discussed this on the show before, and a lot of other people have brought it up as well. It's something that would go a long way towards providing a detailed rationale for how the vehicles have to be designed, and for how responsibility for a vehicle's defects is apportioned: not to the unfortunate victim who is in the car, but to the person or entity that designed the car and implemented the automatic controls that caused a person to lose their life or get injured.


Anthony: Fred, out of your list so far, or out of your list in total, what's your number one, the one where you go, I wish this would happen by the end of 2024?

Fred: Number one would be, actually, two number ones: adaptive speed control and the automatic mitigation of drunk driving.

Those have got to be at the top of the list, because those are the things that are killing most people.

Anthony: And those are ones that we have solutions for today.

Fred: Yes, there are demonstrated solutions for both. Whether they're the optimum solutions, I don't know. But certainly there's no reason to delay proceeding with the engineering development and implementation.

So I've got a few others. I'm just going to blast through them. For the FCC: recovery of the 5.9 gigahertz bandwidth for automotive safety use, prompt approval of DSRC-based V2X applications, and disapproval of cellular V2X applications for privacy and cost reasons.

For the states: determine the AV parameters that need to be organically collected for safe operation verification of individual vehicles; progressive licensing programs for AV designs; and recover the marginal cost of capital investment in private-transportation-related electricity distribution from EV owners and users.

Now, that's a mouthful, but basically, people who don't have EVs shouldn't have to pay for putting up more transmission wires and more transmission towers and more substations to support EVs. There should be a way of apportioning those costs onto the actual users, rather than, again, using public subsidies for the privileged few with these EVs.

Next, recover vehicle-mass-based state and federal energy taxes (mass meaning the size of the vehicle) equivalent to the similar taxes that fund highway development and are now included in liquid fuel retail prices. Right now, in most states, EV users are scamming the rest of us, because they're using roads for free that the rest of us are paying for.

That's simply not right. Legislate a duty of care at the state level if there's no federal legislation or rules. Establish rules, procedures, and mechanisms for periodic and on-demand police safety inspections of ADAS- or AV-equipped safety-critical features for all vehicle classes. So where does this come from?

A state trooper can pull over a tractor trailer, and they can inspect the log of the driver, and they can walk around the vehicle and determine that it's got all the safety features in operating condition. You can't do that with an AV, because everything's hidden. All the safety-critical features are buried inside the computer somewhere.

Unless the states force the manufacturers to make these safety-critical features transparent to and inspectable by the state troopers, these heavy trucks will be running amok on the highways. And yeah, it's fun for the states to say, we're the leader in autonomous trucks, with these 20-ton, 40-ton vehicles rolling down the road.

Hey, I don't have any people on board, but that's okay, the company said they're safe. We need to do better. We need to have some means for people to inspect them, for third parties to inspect them. Next, promote highway design and transportation planning that minimize the use of cars and maximize public transportation.

The fewer cars on the road, the fewer people get killed, period. Prohibit AV operation unless safety-qualified by third parties. This is something states can do as part of the licensing procedure. And finally, couple vehicle liability insurance to vehicle mass. It's very clear that a vehicle that weighs five tons is more hazardous to pedestrians and other vehicles than a vehicle that weighs one ton.

It's about five times as hazardous, right? So to me it's unconscionable that, once again, people who have light, small, efficient vehicles are being asked to subsidize the hazardous vehicles that wealthy people are using on their roads. This is just not right. Now, for industry. Okay, this is a subtle one, but we've talked about it.

Rejection of the current SAE J3016 definition of minimal risk condition in safety standards as an acceptable safe harbor for errant AV behavior. Note that an injured woman was dragged 20 feet under a Cruise vehicle that had maneuvered to a minimal risk condition before and during the dragging event, and it ended up parked with a wheel on top of her leg once it finally did stop.

This was a horrible incident, but it's even more horrible to me that the safety standards being developed around the world, including by ISO and SAE, all treat that as perfectly acceptable if they accept the minimal risk condition currently stated in SAE J3016 as some kind of safe harbor for fallback of automated driving systems to a purported safe state.

Elimination of proprietary data formats and industry restrictions on access to vehicle operational data. This will help people understand what's going on and help them determine what needs to be improved in these vehicles, somebody besides the manufacturer, who is reluctant to release the information.

This one's going to seem like inside baseball, but transparency of simulation models, suitability of purpose, and evidence of ground truth. Simulations are doomed to succeed. And there's a real problem with the simulations, which is that vehicles all have time-based events going on, right? So you've got the radar scanning every X milliseconds, you've got the vision system updating every Y milliseconds, but the simulations you're running are all event-based, which means that they run as fast as they can.

Now, Anthony, if you put a faster computer in your car, how much faster is your car going to drive? Zero. Because it's not event-based, right? The controls are time-based for the most part. But if you step up the speed of the simulation using event-based software, you can say, I've run a million simulations, and this car is really safe.

Eh, wrong, because you've never done a ground truth, and you've never really established that it's okay to bypass the time-based elements of your simulation. Missiles have failed because of this. Okay, this is an important but subtle thing. Anyway, it's obscure. I'm happy to discuss it in infinite detail.
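Fred's timing point can be shown with a toy braking scenario. This is a deliberately simplified model, not any real simulator: the numbers and the worst-case one-frame latency assumption are illustrative, but it captures why a simulation that skips sensor timing looks safer than the vehicle actually is.

```python
def stopping_distance_m(speed_mps, decel_mps2, sensor_period_s):
    """Stopping distance after an obstacle appears, honoring sensor timing.

    Worst case, the obstacle appears just after a sensor frame, so detection
    lags by up to one full sensor period before braking begins. An event-based
    simulation that skips that latency reports an optimistic distance.
    """
    latency_travel = speed_mps * sensor_period_s   # travel before braking starts
    braking = speed_mps ** 2 / (2.0 * decel_mps2)  # v^2 / (2a)
    return latency_travel + braking

# 30 m/s (~67 mph), 8 m/s^2 braking: an "instantaneous" event-based model
# versus a 100 ms radar frame honored by a time-based simulation.
optimistic = stopping_distance_m(30.0, 8.0, 0.0)  # 56.25 m
realistic = stopping_distance_m(30.0, 8.0, 0.1)   # ~59.25 m
```

Three extra meters at highway speed is the difference between a near miss and a collision, and the event-based run never sees it.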

We're almost done here. Bidirectional validation of safety-critical requirements. We've talked about this. Things should only happen that are supposed to happen, and everything that happens should happen because it was supposed to happen. You should be able to look at it both ways.
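The "both ways" check reduces to simple set logic over requirements and observed behaviors. The behavior names below are illustrative placeholders, and the real difficulty is entirely in enumerating the observed set, not in the comparison itself.

```python
def bidirectional_check(required_behaviors, observed_behaviors):
    """Validate safety-critical behavior traceability in both directions.

    Forward: everything that was supposed to happen did happen.
    Backward: everything that happened traces back to a requirement.
    """
    required = set(required_behaviors)
    observed = set(observed_behaviors)
    missing = required - observed      # required but never demonstrated
    unexplained = observed - required  # demonstrated but never required
    return missing, unexplained
```

With an opaque machine-learned controller, the observed-behavior set can't be exhaustively enumerated, which is exactly the gap Fred is pointing at.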

You cannot with neural networks and machine learning in the cars. Next, replication of known failures resulting in death or serious injury for AVs and ADS; we talked about that earlier. Rear seat belt reminders: we know how to do that, we just have to have the will to do it. Improved seat back strength.

This is a perennial problem that's killing people for no good reason, to save a couple of bucks per car. The last one I've got is rear seat safety design improvements, motivated by data from the rear seat test dummies that IIHS is now using, and that NHTSA should be using in its NCAP tests. And last is eat better and exercise more.

That's for me and you, Michael. That's not for...

Michael: The automotive

Fred: industry. So that's a lot. I'm sorry, though. I'm just going through it, but

Anthony: I'm done now, folks. I'm at the end of the list. I'm at the end of the list. You lied to us, okay? So my New Year's resolution: I'm going to cut you off much earlier.

I tried to do it. I was very polite about it. You were like, I'm almost done. That was 10 minutes ago. Okay, all right. And my other New Year's resolution for the industry is no more tech bro nonsense. Okay? Tech bro nonsense is destroying the world. Michael, do you have a quick New Year's resolution?

Michael: I... nope, you don't. Sorry. Yeah, I agree with the tech bro nonsense. I'm sick of that myself, and I need to walk the dog more.

Anthony: Okay, we've got time just to run through these quick recall roundups. We've got Hyundai Motor, almost 11,000 vehicles. This is the 2024 Kona: an electrical short from the 12-volt positive battery cable contacting a bracket, increasing the risk of a post-crash engine compartment fire.

These Hondas and Hyundais and the fires, it's not good. It just never ends.

Michael: Yeah, that one's the Kona, but it's not the EV version, it's the gas version. No, their EVs seem good. It's basically a misrouted cable, or a cable that was designed to be routed in the wrong spot, that's creating an electrical short and causing fires.

I think the remedy is going to be out around late February. So if you own one of these and you're looking for the repair, it's probably going to be about two months from now. Okay.

Anthony: And our last one is Kia, almost 80,000 vehicles.

This is the 2011 Kia Sorento: all 2011 model year Sorento vehicles manufactured at the Kia Georgia plant from October 24th, 2009 through June 24th, 2011. They're manufactured by children. Oh wait, that's not what it says. They were equipped with Theta II multi-point injection engines; this is basically their engine and their knock sensors.

Michael: And the Theta II engine was basically the bad guy in a lot of the Hyundai-Kia fire recalls that took place starting around 2019. I think we petitioned the agency around 2018 on those. So this is just the latest iteration of that, the latest recall that's come out because of it.

This is an engine that's slightly different. The Theta II engine, the one that was put in these Sorentos, is an MPI engine that's slightly different from the ones that were having all the problems. And so the people with these MPI engines were seeing similar problems, but Kia wasn't doing anything to help them out.

Now we're seeing, these are what, 12-, 13-year-old vehicles, and they're getting a 15-year, 150,000-mile extended warranty, because Kia finally determined that they need to be covered under similar recalls. And what they're doing here is not fixing the defect. I was just talking earlier about software versus physical recalls.

What they're doing here is, you're going to bring the vehicle in, and they're going to give you a knock sensor detection system software update. Basically, they're going to install software that alerts you when your engine's knocking and can take your car into limp-home mode, preventing you from driving it, ultimately to prevent the engine condition from occurring that leads to these fires.

It's not a fix for the problem. You're not going to be made whole and have a great working engine. You're just getting a knock sensor detection unit, which is a problem. You're not getting a physical fix for the engine problem that's happening in these vehicles.

Anthony: If the driver switched to leaded gasoline, would that help reduce their knocks?

Michael: They might feel them less as their nervous system deteriorates, but no, I don't think so.

Anthony: Okay. Hey, and with that, listeners, thanks so much for listening and being part of this. I hope your new year is going well, I hope you're all walking the dog, being healthy, and realizing that any time you see a tech bro talking, it's probably a lie.

Hey, we'll be back next week. We're finally going to get to some IIHS rear seat protection and emergency braking stuff we've wanted to talk about for a while. And I think I finally have approval: we're going to release our list of the most complained-about vehicles, built from complaints we've gotten from listeners like you. Almost 3,000 complaints came in last year.

Holy cow. Okay, with that, I thought Michael was going to say something. Alright, goodbye.

Michael: Goodbye. Happy New Year. Happy New Year, everybody. Bye.


Fred: For more information, visit www.autosafety.org.