Automatic Emergency Braking and Tesla continues to disappoint

NHTSA has finally released a rule for Automatic Emergency Braking (AEB). We’ve been asking for one since 2015 so… 3 cheers for NHTSA? But the rules don’t go into effect until 2029. Ugh. On the plus side, IIHS updated their AEB tests and car makers have a lot of work to do. We dive deep into AEB. And will 2024 be the year of the Tesla lawsuits? Maybe they shouldn’t sell things with aspirational names like Autopilot when they keep crashing and killing people? Ford Blue Cruise is under investigation, Waymo says the bad driving of its robotaxi was on purpose, and we cover the latest recalls.

This week's links:

Subscribe using your favorite podcast service:


note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.

Anthony: Hey listeners. Michael can think of one reason not to record this episode. Thank you. But hey, too bad, we're here. That's how we're gonna do

Fred: it. Yes, good morning, world. Good morning.

Anthony: Hey,

Fred: let’s

Michael: Good morning, everyone.

Anthony: Let’s now you’re saying good morning? You didn’t want to record? No.

Michael: I’m just a little slow. I’m catching up on coffee here.

Anthony: Okay. Let’s talk automatic emergency braking. We’ve talked about it many a time on this show. And now I hear some consumer advocacy groups, such as Consumer Watchdog, Joan Claybrook, and the Center for Auto Safety are urging, saying, hey, let’s get this in there.

Michael: Well, that's what we did in 2015, right? In 2015, we petitioned NHTSA to require manufacturers to start installing AEB. One of the reasons for that was that AEB had been around for approximately 10 years at that point. I think Honda introduced it in some form in vehicles in 2003, and heavy truck manufacturers like Volvo were introducing it into their vehicles.

And as we approached 2015, other manufacturers were starting to put it into vehicles, but it was almost always included as a luxury item. You're only going to get it as an optional package that the consumer ends up paying a few extra thousand dollars for. Our idea was: this needs to go into every vehicle right now.

This shouldn't be a luxury. This is something that, if it is in every vehicle, could save a lot of lives, prevent tens of thousands of injuries per year, and really eliminate one of the major types of crashes that we've seen that causes deaths, which are crashes where there's a rear intrusion into a fuel tank and you end up with a lot of occupants who are injured or killed by fires. Rear crashes into some of the vehicles we were looking at at the time, like the Jeep Grand Cherokee, were a big problem. And AEB promised in many ways to eliminate that. What ended up happening: we took that petition all the way to court and ultimately ended up losing.

And what happened later in 2015, maybe 2016, was that NHTSA and, I believe, IIHS, Consumer Reports, and the auto industry all got together and said, we're going to come up with this voluntary commitment that manufacturers have to meet, where they're going to put AEB into all of their vehicles by 2023, I believe was the ultimate date.

And that's something that the industry loves. They love a voluntary program where they're not held to any standards. And you know what that program got us? We do have AEB in almost every vehicle that is now sold in the United States. However, those systems vary very significantly in effectiveness, and we've talked about a lot of those issues.

At nighttime, there are a lot of problems. There are problems detecting pedestrians. There are a number of issues with low speeds; that's one of our main concerns, because the automatic emergency braking that was installed under that commitment was really only good up to about the high 20s, 25 to 30 miles per hour, somewhere in there.

Which is not where we see a lot of the really deadly crashes. That didn't work. But then, finally, after years of rulemaking, and after Congress essentially required NHTSA to do rulemaking, they came out with the final rule yesterday that really does a lot of great things. It increases the speeds at which these systems are going to work.

In the case of vehicles, I believe up to even 90 miles an hour in some situations. It's going to require it to detect pedestrians up to 45 miles per hour, which is significantly higher. I mean, right now, not all vehicles even have pedestrian automatic emergency braking. It's going to require it, and require it to work up to pretty significant speeds. 45 miles per hour is the speed that you're going to see on a lot of those arterial roads where we see a lot of bad pedestrian incidents.

And hopefully that will help decrease pedestrian crashes. NHTSA is estimating that the rule will save, I think it's around a little over 300 lives per year, and there are going to be tens of thousands of injuries prevented every year. Ultimately, once these systems are installed and working well, the numbers could be even higher than that.

And that doesn't even include the heavy trucks and semi trucks that are going to be getting this technology as well; that's being done in a separate rulemaking. That's to prevent some of those really bad collisions that we see where a trucker doesn't see stopped traffic in front and runs into the back of some cars, similar to what happened to Tracy Morgan from Saturday Night Live years ago, when a Walmart truck hit the van he was traveling in.

There are a lot of good things about what's happened with the AEB rule that's come out, and the industry is predictably silent. They've been throwing up objections to it for years now, trying to make the rule as cheap and as easy to meet as possible. And I think what this has done is given them a real challenge.

They're going to have to start protecting people at higher speeds and in the dark, and they're going to have to start detecting pedestrians. And so overall, we're very pleased with the result here. We obviously wish that something could have been done quicker, that we could have had a compliance date with maybe a lesser type of AEB system that could already be met by the industry and could already be out there saving lives.

But ultimately this is going to be a game changer for safety in many ways, probably not to the level of seat belts, but certainly a really good thing.

Anthony: So, two questions for you: when does this go into effect, and will it lower my insurance rates by having a car with this?

Michael: The compliance date is going to be 2029.

That was lengthened from, I believe it was three or four, I think it was four years, in the previous version of the rule that was proposed about a year ago.

Anthony: Originally it was supposed to be 2023, and then these guys are like, now we need some more time?

Michael: The voluntary agreement was for 2023.

And the voluntary agreement that was reached really did not ask for much. That's kindergarten. I like the analogy of school and progressing through school: for AEB, that's kindergarten-level AEB. It was only going to protect up to like 25 miles per hour, so above that, you're not going to have AEB protection.

When you look at the types of crashes we have in America that cause injuries and deaths, crashes at 25 miles per hour and under are not really one of the main sources. So that entire voluntary agreement, while arguably it got the industry off to a start on AEB technology,

it didn't mandate that anyone really do anything, and it didn't really push manufacturers to make the technology better and to try to develop it even further than it had been developed at the time. I think that the new NHTSA rule really does that. Like the Alliance for Automotive Innovation said in response to it:

There's not a vehicle right now that's sold in America that can meet the standards that are in the AEB final rule. And I say, great, that means you guys need to get to work putting some more safety into your cars. And it's probably something you should have been doing since 2015, when you entered a voluntary agreement that really didn't make you do that much work.

So it’s a good thing overall, for sure. Yeah.

Fred: Now, I want to explain to the listeners that there's a group called the Alliance for Automotive Innovation, which is an industry group whose membership includes all the major manufacturers. It's actually the Alliance for Prevention of Automotive Innovation, and they tend to be very regressive.

I also want to point out to you guys that there's really a Trojan horse in here for our side, which is very interesting. One of the places in here, Anthony, you've read your owner's manual, right? Yes. Okay, so one of the statements in here is that the Alliance for Automotive Innovation stated that a no-contact performance requirement is not practicable and increases the potential for unintended consequences, such as inducing unstable vehicle dynamics, removing the driver's authority.

Anthony, what was the entry in your owner’s manual that said, this is the speed you should not exceed because you have a potential to lose control of the vehicle.

Anthony: It didn’t say that, but it did say that it will not work above 25 miles per hour. Well, that’s a different, that’s a different

Fred: issue.

That’s the AEB, but we talk about the performance envelope, right?

They do not talk about a performance envelope. I think the cars should alert owners when they're in a situation the car is not designed for. That should be easy to do, and this is one of those situations. If you have hard braking, what the Alliance says here is there is a certain speed where you will lose control of the vehicle.

You will not have authority over the vehicle's motion. That's really the implication of what they're stating here. So I went to my Subaru and looked in the owner's manual for the speed which I should not exceed because hard braking will cause me to lose control of the vehicle. And, surprisingly, there's nothing in the owner's manual about it.

But this is an admission by industry that there is such a speed, that they know what it is, and that they are refusing to alert owners when they approach or exceed the operational safe limits for their vehicle. I think this is huge. This rule is actually wonderful. It's fascinating.

Unfortunately, it leaves out motorcycles, because they said the technology just isn't ready for that, and that's motorcycles and bicycles. But hopefully they'll catch that in the future, in a revision and update to the rule.

Anthony: Fred, what you're saying is there must be some sort of known quantity saying, hey, we're going to sell you a car, and at some certain point, if you hit the brakes, which are designed to slow the car down, we can't guarantee the car will be under human control anymore?

Fred: That is correct. That's exactly what it means. Wait, really?

Michael: Well, look, everyone needs to realize this is an automation, right? No matter what speed you're traveling at, when AEB engages, you're losing control. I think the concern they're expressing is that there's a certain speed, maybe 90, 95 miles per hour, where hard braking of the type that would be needed to stop the car before it hits another car could, and there's a good chance it's going to, significantly impact the ability of the driver to control the vehicle by steering or any other manner. I mean, essentially at that point, you've slammed on the brakes and you have to control the vehicle from there, even though you personally didn't slam on the brakes; the car slammed them. The driver is still marginally in control of the vehicle through steering.

It's not a point being made by the industry that we think is completely false. Obviously, heavy braking at high speeds is going to have an impact. But I guess the issue is, first of all, why are you traveling at those high speeds in the first place and putting other people in danger, requiring your car to have to brake that hard because you haven't detected a potential traffic accident in front of you?

I mean, there's a lot more going on there than just a car hitting the brakes hard. The driver at that point has put the vehicle in a situation where it might hit and kill other people, pedestrians or another vehicle. Maybe there should be, in the manual, some type of acknowledgement that, if you are traveling at certain high speeds, the automatic emergency braking system will take away your control if it detects a crash, and that could impact your ability to control the vehicle. I don't know. There are always tradeoffs in almost every safety rule. Take airbags.

The airbags are going to save a thousand lives a year, and 20 people may be killed by aggressive deployments by a Takata inflator. In this case, AEB is predicted to save a few hundred lives a year and prevent tens of thousands of injuries. I certainly am not so idealistic that I think no one is ever going to be killed or injured in an accident related to a situation where there's high-speed braking and the vehicle loses control.

There's a million things that can happen at that point. The car can leave the road and kill the driver of that vehicle. But ultimately, like I say, it's a trade-off, and it's one that I think we can accept given the significant gains that we're going to see in safety. There's always going to be a small percentage of people who are negatively impacted by what is intended to be a positive safety rule.

Fred: Oh, that's absolutely right. I think this issue is a little different, though, Michael. This just refers to the underlying dynamics of the vehicle, not the impact of the braking. And what it says is that at a certain speed, when you put the brakes on hard, you as a human being will be unable to control the vehicle.

It becomes unstable and it's going to do erratic things. So that's what I was talking about with the operating envelope: people should be alerted to the fact that there are performance boundaries. Remember the old days of the Jeep Wrangler that would flip over at low speed? Okay. So that was an example.

There should have been a placard on it that says, don't drive this thing above two miles an hour, because it can flip. Right. And there should be a placard on the vehicle that says, don't drive above, I'll make up a number, 87 miles per hour, because if you slam on the brakes, your car is going to swerve and go out of control.

So that's what I'm talking about. It's not to do with the AEB. AEB is an unalloyed benefit. There's no doubt about that. Anyway, I just wanted to bring that up and

Michael: Yeah, and intelligent speed assistance is something that we also think NHTSA should be working on.

It has a lot more political hangups than automatic emergency braking, but intelligent speed assistance is something that kind of does what you're talking about, in a way. When the vehicle reaches certain speeds at which the operator probably shouldn't be operating, intelligent speed assistance would be designed to either alert the driver,

and I don't know how effective that's going to be, or slow the vehicle down to the recommended speeds. And that might be something that helps out in that scenario.

Fred: Yeah, that would be a good example of a virtual guardrail that would prevent operators from exceeding the safe operational limits of the vehicle.

Anthony: I mean, this may upset the fellow members of my Maserati owners club, but maybe we shouldn't sell cars that go that fast. Maybe we don't need vehicles that go 90 miles per hour.

Michael: I think a lot of us could get on board with that. All the guys out there who like to use the phrase nanny state are not on board.

Fred: Well, but Anthony, that brings up a problem, though. If that's true, how would insecure people attract dates?

Anthony: With my American Express Black Card. This episode brought to you by American Express. No, but if you have an American Express Black Card, you know what that means? You've got more than enough disposable income to donate to the Center for Auto Safety.

Go to autosafety.org and click donate. Use your American Express Black Card. Ah. There's one more thing.

Michael: One more thing I want to talk about on AEB. A really good thing. Okay, good. Well, something that's really cool that NHTSA did here is, rather than simply require that these vehicles slow down a certain amount prior to a collision, NHTSA's rule essentially says you have to avoid the collision.

That's something the manufacturers didn't like a lot either, because it's another area where they use the term impracticable. They don't believe that's something they can guarantee, and they believe that achieving that benefit is going to cost them a whole lot more than what is estimated. But I think that is a very good thing, particularly in regards to pedestrian automatic emergency braking: guaranteeing that no contact is made virtually eliminates the chance for an injury or a death.

So I'd give three cheers to NHTSA on that, for continuing to keep that in the rule over the objection of the manufacturers.

Fred: And my final shot is that there are cars on the road today that comply with these requirements. And they demonstrated that, or they’ve documented that in this report.

The whining about how difficult it is should fall on deaf ears.

Anthony: One of those cars that passed, I believe, is a Subaru Forester, right? Because our friends at the Insurance Institute for Highway Safety have a great video where they test out AEB. They tested it at, I could get the numbers wrong, 32 miles per hour, 37 miles per hour, and 41 miles per hour?

Where they have a fake car, a motorcycle, and a semi truck, and test, hey, when do these things warn the driver, do they stop in time? And they go through and test a lot of SUVs, small SUVs, the first few times through. And the Subaru Forester was the only small SUV to earn a good rating in the updated test.

The Honda CR-V and Toyota RAV4 are rated acceptable. Everything else is just rated poorly. But what I like about this is they test the motorcycle, which is an issue we've seen repeatedly, with a number of companies not seeing motorcycles and running into them. But what I want to ask about is: who actually makes the software and hardware combination these companies use?

Because I imagine they're all kind of buying this component, or series of components and software, from some third party. They're not all out there developing their own camera, radar, and software array for this, are they?

Michael: Some are developing their own, some are, you know, pooling their resources, and some are using a third-party supplier. There's all sorts of ways to get it done.

Ultimately, it sounds like they may want to pool their resources together, if they don't think they can meet this rule, to figure out how they can. But it's the law now.

Fred: Yeah, that was actually part of the problem, Anthony, because there was no standard test that looks at all of these AEB systems and says, this one's good, that one's bad.

So people could basically throw out whatever they wanted and say, well, that's AEB, we're good citizens. Now I think the sources will probably start to collapse into the good ones, now that there's going to be a requirement out there, an actual test, to verify that these things work.

Michael: And that's a great video from IIHS.

It shows you some of the complications here in this new AEB rule. The old IIHS test was run at 12 miles per hour and 25 miles per hour. That's essentially the standard that the manufacturers had to meet under the voluntary agreement by 2023. So at this point, every vehicle is coming with AEB that can work up to 25 miles per hour.

And our general point about that is that's way too low to prevent serious crashes. And so what IIHS is now doing is increasing their test speeds to 31, 37, and 43 miles per hour. I expect that in the future, given that NHTSA is requiring AEB to work at greater speeds than even that, IIHS will continue to raise those targets to make manufacturers build better AEB systems.
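A rough back-of-the-envelope sketch of why those higher test speeds are so much more demanding: ideal stopping distance grows with the square of speed. The 0.8 g deceleration below is our own illustrative assumption, not a figure from the NHTSA rule or the IIHS protocol.

```python
# Idealized stopping distance d = v^2 / (2*a), assuming a constant
# deceleration of 0.8 g on dry pavement (an assumed figure, not
# anything specified in the rule or test protocol).
G = 9.81                 # gravitational acceleration, m/s^2
DECEL = 0.8 * G          # assumed braking deceleration, m/s^2
MPH_TO_MS = 0.44704      # miles per hour -> meters per second

for mph in (12, 25, 43, 62, 90):
    v = mph * MPH_TO_MS
    meters = v ** 2 / (2 * DECEL)
    print(f"{mph:2d} mph -> about {meters:5.1f} m to stop")
```

Doubling the speed roughly quadruples the distance the system needs, which is why an AEB tuned for 25 mph doesn't simply scale up to 43 or 90 mph.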

Fred: This might be a little confusing because we’ve talked about two things that kind of run into each other, but the rule that came out from NHTSA does not include motorcycles. The IIHS testing, which is what we’re talking about now, does include motorcycles. So the result that Anthony just quoted is for an IIHS test, not for compliance with the NHTSA rule.

And IIHS is, I think, anticipating what the rule will ultimately have to do. They're including avoidance of trucks, avoidance of pedestrians, and avoidance of motorcycles, as well as cars. The national standard isn't up to that yet, but they'll probably get there over time.

IIHS is leading the charge right now, and good for them.

Michael: I was going to say, one thing also that we haven't mentioned yet, but that's pretty important, is that there are going to be growing pains. We're already seeing a lot of NHTSA investigations and some recalls around false activation of the emergency braking systems, also known as phantom braking; there are probably a few other names out there for it as well.

But essentially, that's when the vehicle thinks there's something in the road and applies the brakes when there's truly nothing there. That's a problem that we just saw a couple of weeks ago, when NHTSA upgraded an investigation into Honda that's been going on for some time around this issue. And we've seen recalls from Tesla and a number of other manufacturers around this type of issue, where the AEB is detecting something that's not there and slamming on the brakes, and that can create crashes.

That's a concern. And what NHTSA has done is include a couple of tests in the new rule that vehicles are going to have to pass. They've got a steel plate test and another test that are essentially base-level tests to try to ensure that no type of phantom braking occurs. But we ultimately think this is going to continue to be an issue that is primarily addressed by the Office of Defects Investigation, NHTSA's enforcement wing. I don't believe those two tests are going to be able to capture every scenario under which phantom braking occurs.

And so that’s something to keep an eye on long term.

Anthony: Alright, well, I think we've covered enough AEB for one day. I mean, I do have more AEB questions, but, limited time in our lives. Let's move on to Ford Blue Cruise. We've talked about this before. This is their automated level 2 plus driving system that works on predetermined, pre-mapped roads. On paper, it sounds like a really good idea.

It will engage itself on highways that are pre-mapped and say, hey, you can take your hands off the wheel, we got this. But you have to keep paying attention. They have cameras inside watching you, and they'll turn it off and say, hey, you're not paying attention, you're reading a book. Now, unfortunately, NHTSA is investigating them.

From an article in the AP: The agency's initial investigation of the crashes, which killed three people, determined that Blue Cruise was in use just before the collisions. One of the crashes occurred in February in San Antonio, Texas, killing one person, while the other happened in Philadelphia in March, in which two people died.

The agency says the investigation will evaluate how Blue Cruise performs driving tasks, as well as its camera-based driver monitoring system. Not great news for people who like cars that quote unquote drive themselves. We've seen this with Tesla, and now Ford has joined the club.

Fred: No, and the bad news here is that no matter how well you map the highways where you're planning to use your vehicles, there's a lot of stuff on the highway that's not being mapped.

You’ve got cars broken down by the side of the road, you have animals crossing the road, you’ve got debris in the road, you’ve got construction barriers that pop up overnight. There are limits on how much safety you can build into a vehicle by investing in better maps. And I think that’s what Blue Cruise ran into.

Michael: Yeah, and this highlights that. But also, these collisions were both with a car that had stopped in a travel lane at night. So essentially a vehicle stalled in the road at night. And so

Anthony: in one case, the hazards were not working in the car that was stalled. So it’s a dark box on the road.

Michael: Right. And there are obviously some questions there, a couple of things: why the driver wasn't alerted in time to avoid the crash, or was the driver alerted in time to avoid the crash but unable to take control. And importantly here, we just saw NHTSA close an investigation of Tesla's Autopilot where,

ultimately, they seemed to focus more on the driver's ability to take over the vehicle versus the vehicle's failure to detect and respond to an incident in the road ahead. In the Tesla situation, the agency was essentially saying, you need to do a recall to make sure that Autopilot is only turned on in areas where it should be, and that it is able to give the operator enough warning so that the operator can take over the vehicle in time to avoid the collision. Here we're talking about a vehicle that has Blue Cruise, which can do a lot of things; it can detect, and should be able to avoid, or at least begin the process of avoiding, a vehicle located in a travel lane right in front of it.

In fact, the AEB rule would probably apply to this situation as well. In five years, Ford's going to have to meet that; most vehicles will have to meet the standard of being able to detect and slow down in this situation where there's a stationary vehicle in front of you on the road.

It's going to be interesting to see where NHTSA falls on this as compared to where they fell on the Tesla situation, where we're still waiting for answers on why Teslas can't see and avoid fire trucks, emergency personnel, and other objects, and why, in all those cases, we're looking at the human under a microscope versus looking at the systems Tesla has in place.

Fred: You mentioned takeover. So I want to jump in here for just a second. There was a study that I recently read that talked about takeover ability based upon studies in Sweden of 30 professionals who actually worked in the automotive industry. And they found something very interesting. They said, even when a car is issuing an alert, there’s a problem in the road.

The first thing that people do is not look at the road to take over. The first thing they do is look at the instrument cluster to find out what the hell is going on, and only after they've scanned the instrument cluster, or the video display or whatever they've got, do they then return their attention to the road.

And as a consequence, it takes a long time for people to actually focus back on the road. By their measurement of the eye motion of the people, comparing it to the eye motion of people who are actually driving the car, particularly when they're distracted, they found that it's actually 15 seconds before a typical person has assumed control of the car again after the alert.

15 seconds, how long is that? We talked about this last quarter

Michael: A quarter mile at that

Fred: speed, right? The largest aircraft carrier in the world is the USS Ronald Reagan. Its length is 1,092 feet. 15 seconds at 60 miles an hour is 1,320 feet. You would be more than 200 feet past the end of the aircraft carrier's flight deck, in the water,

Before you actually have assumed control of your car again, once you get the alert, if you’re driving 60 miles an hour.
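Fred's arithmetic can be sanity-checked in a few lines. The 15-second takeover time is the figure from the study he cites; the speeds other than 60 mph are added here for illustration.

```python
# Distance traveled during a 15-second takeover delay, compared with
# the ~1,092-foot flight deck Fred uses as a yardstick.
FEET_PER_MILE = 5280
TAKEOVER_SECONDS = 15        # per the Swedish takeover study cited
FLIGHT_DECK_FEET = 1092      # USS Ronald Reagan, per the episode

for mph in (25, 45, 60, 90):
    feet_per_second = mph * FEET_PER_MILE / 3600
    distance = feet_per_second * TAKEOVER_SECONDS
    overshoot = distance - FLIGHT_DECK_FEET
    print(f"{mph} mph -> {distance:,.0f} ft traveled "
          f"({overshoot:+,.0f} ft vs. the flight deck)")
```

At 60 mph the car covers 1,320 feet in those 15 seconds, overshooting the 1,092-foot deck by about 228 feet; at 90 mph the overshoot nearly quadruples.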

Anthony: But Fred, how often are people driving on an aircraft carrier? This is ridiculous.

Fred: Oh, you’d be surprised. It’s classified. I’d like to tell you, but we just can’t go there. But anyway, that’s just a measure because you’ve been to the pier and you’ve walked on the aircraft carrier down there.


Anthony: really

Fred: big. They're really big. The interesting part of this to me is that none of the standards that we're looking at, including the AEB standard that came out, references this reaction time of the people. And so the whole idea of people taking over from automated controls, particularly when they're distracted, is completely outside the envelope of what the standards are providing for automatic intervention to save lives. It's an interesting disconnect there, because the industry is saying, well, it's fine as long as the human is taking over from the automated control. And in fact, we're even guilty of saying this: that you need human drivers behind the automated controls before you should allow these vehicles on the road.

Even in that case, there's a really long time between the alert the person receives and the time when they're actually able to take over control of the car. My opinion is that this will never be a safe procedure. There will never be a safe process for people taking over from automated control, even after they get an alert.

It just takes too long.

Michael: Yeah. I mean, functionally, if you turn that around and think about the burden that has to be met by someone making a vehicle: you need to be able to detect a potential crash and start alerting the driver when you're still, if you're traveling 60 miles an hour, a quarter mile from the point of impact. Which makes you wonder, is the technology even available to provide sufficient warning to drivers?

Fred: Well, we know it's not, because in a lot of cases you'd have to look around corners and behind opaque barriers in order to see what's going on in the road. So that simply will never happen.

Michael: It just can’t happen. And so is this technology really just, is it impossible to work? Is it going to be more dangerous than worthwhile?

Fred: I already know the answer to that, I think.

Anthony: I bet you do. But Michael, you mentioned Tesla, and so you've opened up a whole can of worms. From PBS.org: the U.S. government's auto safety agency is investigating whether last year's recall of Tesla's Autopilot driving system did enough to make sure drivers pay attention to the road.

Short answer: it didn't. It didn't at all. But wait, I gotta jump back, sorry. The Blue Cruise incidents at night, I know I'm jumping now, where they've run into cars and unfortunately killed people. I get it's dark out, our eyes don't work that well in the dark, sure.

But these cars have radar systems. Now, radar is not affected by darkness, is it? No. Fred? No. He's shaking his head. No. So that's one of the things I don't understand, how that failed. I'm curious to see how that comes out. Because in the case of Tesla, I understand how their cameras don't work at night, because they don't have radar in their systems.

We've got a whole bunch of articles on Tesla and Autopilot systems. There's one in MSN about a North Carolina student who was stepping off a school bus when he was struck by a Tesla Model Y traveling at highway speeds. Basically, the investigation, after examining hundreds of similar crashes, found a pattern of driver inattention combined with the shortcomings of Tesla's technology, resulting in hundreds of injuries and dozens of deaths.

Basically, drivers using Autopilot or its advanced sibling, Full Self-Driving, were not sufficiently engaged in the driving task. The drivers did not adequately ensure, sorry, Tesla's technology did not adequately ensure that drivers maintained their attention on the driving task.

So just like what you guys were talking about: hey, we have these automated systems. I mean, yes, they're partially automated systems, but they've got fancy names, man. They sound great. I'm living in the future. And people engage them. They assume something called Autopilot, or Full Self-Driving, is going to take care of everything.

And then they’re no longer engaged and smash, crash, bash, people are dead. What do we do?

Fred: I don’t know. I mean, I think, the answer, part of the answer has got to be that this requirement that people take over exceeds human capabilities. That’s just not something we can do. It’s not a matter of free will. It’s not a matter of intention. That’s just something humans are incapable of doing.

So that’s a fact and facts are friendly, but I think the industry needs to deal with that.

Anthony: As someone once said, that’s your version of the facts.

Michael: I think ultimately here you need to stop allowing manufacturers to set traps for people. And when I say trap, I mean exactly what Tesla has done with Autopilot and full self driving.

Selling a vehicle that promises capabilities that aren't there yet, but which consumers are pretty easily led by the nose into believing exist in the vehicle. Essentially, people are being told, or come to believe, that these vehicles have more capabilities than they do.

They rely on them, and when the inevitable happens, a situation where the human needs to take over and respond to a danger, they're not able to. It just appears to us, through everything we've been talking about in this podcast as far as takeover and timing, that humans don't respond very well there, and it's looking like we're not going to be able to do this. And so my question becomes what Anthony's asking: why aren't these vehicles with this advanced technology doing a better job of detection? What is NHTSA focusing on in the Tesla investigation right now?

They've basically said, we're going to close our original investigation into Autopilot, which they did, they closed that and opened a recall query into what Tesla has done with Autopilot to make it better around human takeover. And it seems backwards in a way to me. It seems like NHTSA should be resolving why the Tesla vehicles aren't detecting and responding, whether it's automatic emergency braking or some other means, to emergency vehicles, to the motorcycles, to a lot of the things that were being looked at in that investigation. That may be the real problem here.

Because we know humans aren't good at taking over. We know that even if you have the best driver monitoring out there and other systems in play, the human is still going to be a failure point. So you've got to focus on why the vehicle's unable to detect these problems in the first place. Why doesn't the Tesla recognize that there's an emergency vehicle?

And why doesn't the Tesla realize that the driver's incapable of taking over the vehicle fast enough to avoid a collision, and then deploy whatever emergency feature it can to stop that event from happening, whether it's bringing the vehicle to a halt, switching lanes, getting away from a potential collision.

There are a lot more questions that we had about the Tesla crashes, and there are about 40 open special crash investigations into them that haven't been made public yet, so we're still somewhat in the dark about each individual crash. But there's a lot more to the question of why those instances are happening than driver alertness and takeover. There remains the issue of why this awesome technology that Elon can't shut up about can't detect these things.

Anthony: One of our astute listeners pointed out that Autopilot, which is one of Tesla's favorite names, people hear that and they think Autopilot must be like planes. But our astute listener pointed out that autopilot in planes has two highly trained pilots on board and is also being monitored by air traffic control.

Your car, if it's called Autopilot, does not have two drivers, and you're not being monitored by traffic control. Continuing with Tesla, from an article in the Washington Post: Tesla maintains that it's not liable for the crashes because the driver is ultimately in control of the vehicle. Even though they sell something called Full Self-Driving.

Which, if you act now, you can get a 30-day free trial of Full Self-Driving. May become Full Self-Crashing at any time. Who knows? Fred's looking at me with just disappointment, and I don't think it's aimed at me so much. Unfortunately, the future is not going to be as good as one would hope. But hey, Tesla has another solution for that.

Michael: I mean, we're going to see this year, lawyers are going to be litigating, I think, at least eight lawsuits and maybe more involving Autopilot. We just saw Tesla, we discussed this a couple of weeks ago, settle the Huang case in California, where the vehicle on Autopilot drove someone into a cement barrier.

And we're going to see a lot of different scenarios where drivers are relying on the technology to their detriment. I think that the result of a lot of these lawsuits may ultimately be what determines the trajectory of Autopilot and Full Self-Driving and the way that Tesla's deployed them, because so far the federal government, at least, has done very little to stop these incidents from happening.

Whether it's the FTC sitting on its hands when Tesla's pretty clearly using deceptive marketing and lots of other deceptive means to make people rely on this technology, or NHTSA not seeming quite willing to dive all the way into the nuts and bolts to figure out why the Tesla technology is failing. It's a big year for decisions around Tesla Autopilot, and hopefully something comes of it that makes Tesla protect its customers better.

Anthony: Oh, well, don't worry. Soon enough, Tesla will be saying, here's our RoboTaxi. That's right, you've paid $60,000 for our car, and while you go to sleep, your car will just go out and pick up strangers. And it'll drive around all night, and you'll make money on this. That's right, with a new RoboTaxi service. And then you wake up in the morning, you get in your car, and you're like, oh, my God.

Why are my seats sticky?

Michael: Yeah, no one’s gonna do that. The whole thing is just nonsense.

Anthony: Why is my battery depleted? Why is my car in Milwaukee? What happened here? This is a dumb idea brought to you by a drug addled brain. Not my brain, but hey, if you’re on drugs, donate to the Center for Auto Safety. Weird plug.

I don’t know, one last, okay, sorry, one last autonomous vehicle story, because this one’s entertaining. It’s entertaining because no one got hurt. From Futurism, we have a self driving Waymo on the wrong side of the road. There’s a great video. You can see the Waymo has crossed the double yellow line.

It's heading down the wrong way, and you've got a whole bunch of dorks. I'm gonna call them dorks. They're on these unicycles. They've got lights and weird stuff, but the dorks are corralling the Waymo back into the correct lane. They're like, Waymo, you're going the wrong way! They're literally yelling that, which is funny, but one of them gets in front of it to basically steer it into the correct lane of traffic.

And Michael, what was Waymo’s response to this?

Michael: They said, they said that its robotaxi did the right thing. It detected maybe a risk from a person in this crowd of, what'd you call them?

Dorks. On unicycles. These are unicycles, don't forget that part. Motorized, flashing unicycles.

It's like a horde of unicyclists taking over a city, but there were what, like maybe 10 unicyclists driving in a pack on the one side of the street, the

Fred: meeting to another zombies are real. Is that what you’re saying?

Michael: Yes, they are. So the Waymo vehicle, instead of stopping behind the group and letting them get themselves together, crosses the yellow line, essentially to pass them, which would get any of us pulled over in a second. And then it continues in that lane for a significant period of time. I believe at one point there was another car approaching it that had to move into the far right lane to avoid a collision.

And Waymo comes out, I hate to say this, Waymo, but you sound a lot like Kyle, defending what the vehicle did. And I don't think it's defensible. I don't think there's any circumstance in which a human would be able to drive a block and a half, or however far it was, on the wrong side of the road in the middle of the city and claim they were just avoiding a unicyclist. I think the smart thing to do there is to slow down, let the unicyclists have their fun in front of you, and if they're in your way on that route, take a different road. There's no need to drive into opposing traffic to get around unicyclists.

Anthony: But Michael, as you pointed out, if one of us was driving the car and we crossed that double yellow line, we would have gotten a ticket.

But not if you're a self-driving car, because California law only hands out traffic tickets to drivers that are human. There's no driver in an autonomous car, so they can do whatever the hell they want, and no one will get a ticket. But thankfully, California has introduced two bills that would allow California cities to write their own individual regulations related to autonomous vehicles.

Which sounds ridiculous, because there are already laws on the books. Why should each city do their own thing?

Michael: Well, here's the thing. It's because of situations like we've seen in Seattle and San Francisco, where you essentially have folks in the capital of the state dictating to localities what their local policy is going to be, and not doing a very good job of it.

It's that tension that we've discussed between San Francisco and the state of California back when Cruise was running rampant around the streets of San Francisco, blocking traffic and all these other issues. San Francisco had their hands tied because the California Public Utilities Commission and the DMV were in charge of what goes on their streets and where GM Cruise was going to be allowed to operate. And so what this bill does, and there are a lot of other good provisions in the bill, it's certainly worth a look for those who haven't seen it, is essentially restore some of that authority back to places like San Francisco, where there are some things they could have done to ensure these vehicles were operated more safely. Just providing an additional layer of oversight is important.

And so that's pretty critical going forward, I think: to ensure that localities have control over some of the issues that arise when autonomous vehicles are running around the streets. It's a very interesting clash between state and local law.

Anthony: Speaking of autonomous vehicles, Fred, do you mind giving us a little Tao update on some of the changes that have been made to the Consumer AV Bill of Rights?

Fred: You've now entered the Tao of Fred. Well, thank you, Anthony.

I was hoping you’d ask. A lot of people in the Piggly Wiggly parking lot would say, Oh, you guys, you’re young and you’re beautiful and you’re happening, but all you do is complain. We wanted to address that, and what we’ve done to help the industry, really, is to put together a set of requirements that they may have somehow overlooked, and we’re calling that the Consumer’s AV Bill of Rights.

I can tell the industry that if you build your vehicles in conformity with this Bill of Rights, they'll be a lot safer. And they'll also give you a competitive advantage against the no doubt questionable quality of the Chinese imports that are coming after your industry. We have them on our website.

You can find them by searching for AV Bill of Rights. Yep, that's right. You'll see the link at the top of the page, top of every page. And we have updated this with things we've learned over the last year, so I want to just give you a couple of highlights from that. The first one, and the kind of governing rule, is that AVs shall not increase the risk of injury or death to any person inside or outside of an AV, compared with comparable conventional vehicles.

So inquiring minds have said, well, that's good, you guys are young and happening, but how are you actually going to do that? The way we're advocating for that is that there are two different approaches you could take. One is, before deploying the AVs, you accumulate enough driving experience in enough varied circumstances over your entire proposed operational design domain that people can look at the statistics on the behavior and say, well, these statistics are actually superior to what human beings would do. We've got the data, we've got the confidence in the data, there's no reason for us to doubt that. Unfortunately, that would take you probably thousands of years to do, so we've also proposed an alternative approach, which is a comprehensive, neutral, third-party safety case analysis.
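Fred's "thousands of years" point can be roughed out with the statistical "rule of three." This is a back-of-envelope sketch; the human fatality rate and the miles-per-year figure below are approximate assumptions for illustration, not numbers from the Bill of Rights:

```python
# With zero fatalities observed over N miles, the ~95% upper confidence
# bound on the fatality rate is roughly 3 / N (the "rule of three").
# To show an AV is no worse than humans, solve 3 / N <= human_rate for N.
human_rate = 1.1e-8                         # ~1.1 fatalities per 100M miles (approx.)

miles_needed = 3 / human_rate               # fatality-free miles required
years_one_vehicle = miles_needed / 10_000   # assuming 10,000 miles per year

print(f"{miles_needed:.1e} miles, or ~{years_one_vehicle:,.0f} vehicle-years")
```

That works out to roughly 270 million fatality-free miles, tens of thousands of years for a single vehicle, so a purely statistical demonstration only becomes practical with a very large fleet, which is why the third-party safety-case route matters.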

For example, you could use the Underwriters Laboratories UL 4600 standard as the basis of that analysis of a proposed configuration of the vehicle software, firmware, and hardware for an approved operational design domain. How would this work? Well, basically, you would present your design for a particular configuration to a group of experts who understand what the hell is going on, and use the framework of UL 4600, or something like that, to examine every nit and grit associated with your vehicle and pass judgment.

The benefit of this, of course, is that it'll be a lot safer than without this analysis. It would also be unquestionably as safe as humanly possible, and you could do it within a person's lifetime. It shouldn't take that long. It's not a design basis; it's a performance-based set of requirements.

But if you go through this and the other set of requirements, you'll find many helpful hints on how you could avoid killing human beings. I encourage people to take a look at this. There's another thing that we've done here, which is we have segregated the requirements into ten handy engineering requirements that should be transparent and accessible to everybody who's designing these vehicles, as well as to regulators whose job it is to determine whether or not they're safe. We've also set out some legal requirements, for example, conforming to the duty of care, protecting persons' personal information, reporting operational data, and assuring that the occupants of an AV never waive access to the courts through binding arbitration agreements or any other way.

There are two aspects to this: the technical, which is for my nerd brothers and sisters in arms, and then the legal requirements for the legal eagles, who are after a safe operating environment, a safe social environment, as well as a safe technical environment.

So we offer this in the interest of helping the industry, so that they will not put out these bass-ackwards things that they're doing now, which are questionable and slaughtering people. We'll only have to go down this road once; it's easier to do it once the right way than to do it over the wrong way several times.

We invite our listeners to ask questions about this. We've had some in the past, and we've incorporated those into the update, and in another year we'll probably have another update as well. But we invite you to look at the Consumer's AV Bill of Rights on our website and send in your comments.

Anthony and Michael, anything you’d like to add?

Anthony: I love the Consumer AV Bill of Rights. That's what I have to add. I wish we had a better name. It's pretty descriptive of what it is, but that's really my only complaint.

Michael: Well, I love the CAVE BEAR acronym. That's the best part.

CAVE BOAR. Cave boar. Yeah. Alright.

Anthony: Well that’s that.

Fred: I’m an engineer so I’m more cave boar than a cave bear.

Anthony: Ah, and there's a difference between a dork and a nerd. Let's go into some recalls. How's that sound? That sounds great. Okay, let's start with Hyundai: potentially 31,440 vehicles. These are the 2022-2023 Genesis GV70, Genesis GV80, Genesis G80, and G90.

The low-pressure fuel pump assemblies may contain impellers that can deform under certain environmental conditions, such as high ambient temperature. Now, for those of you who don't know what an impeller is, it's like a little propeller, but it impels instead of propels. That's right, it pulls in. It sucks in, and it'll pull in your fuel and send it up, and apparently they didn't realize that these things would operate in warm conditions.

Michael: Yeah, that sounds right. I can't tell exactly what the failure is. It looks like they melt or are damaged somehow when temperatures get high, and that shuts down the fuel pump; your car can't move anymore and you're stalled in the middle of the road.

It looks like they're inspecting and replacing the fuel pump assembly. So that recall is going to be about a month and a half away for owners of Genesis vehicles out there.

Anthony: All right. Next up we have BMW: 5,761 vehicles, the BMW 2 Series Coupe. Unfortunately, with this one, driving at speed, your toupee will fly off.

No, that's not true. It's actually a bunch of them, including the gas and diesel models of their 3 Series sedans, which have a bunch of very unfortunate names, like 320i, 320i xDrive, numbers.

Michael: That's an issue we need to cover on another podcast. Sometimes the numbers, just give me a Mustang.

Anthony: Yeah, so basically this is the head airbag, which may not have been produced by the supplier according to specifications. Contamination within the inflator can lead to corrosion. So this sounds very Takata/ARC-like. Who was the manufacturer of this one?

Michael: Oh, it's not clear. I don't believe they list that. Oh yeah, they do. It's Autoliv. I'm never quite sure how to pronounce that, but they produce a lot of airbags, just like Takata, just like ARC. They haven't quite run into the same difficulties that those two companies have run into in the last few years.

But in this case, it looks to be somewhat different. This isn't like an inflator explosion that sends out shrapnel that can hurt the driver. I believe this just causes stored gas to leak from the inflator, which means, if you have a crash, your airbag will no longer have the material left inside of it to be able to fire and protect you appropriately, is how this one sounds.

Anthony: And as we know, the propellant, that stored gas, can be anything. So if they're using potassium cyanide, you're driving along and you're dead. Right?

Michael: I don’t think they’re using potassium cyanide, Anthony, that would be a pretty significant issue. There is no regulation.

Fred: The stored gas is usually nitrogen, and this kind of inflator has two-stage activation depending upon circumstances.

So in the first stage, it just releases the nitrogen to blow up the airbag. And then if the collision is more severe, it will ignite the propellant, and that will produce additional gas, or produce gas quicker. So that's why they're talking about the stored gas.
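The two-stage behavior Fred describes can be sketched as simple logic. This is purely illustrative; the function name and severity threshold are invented for the example, not taken from any airbag specification:

```python
def dual_stage_inflator(crash_severity: float, stage2_threshold: float = 0.6) -> list:
    """Sketch of a dual-stage inflator's logic (illustrative only)."""
    actions = ["release stored nitrogen to inflate the bag"]   # stage 1: always fires
    if crash_severity >= stage2_threshold:                     # stage 2: severe crashes only
        actions.append("ignite propellant for additional, faster gas")
    return actions

print(dual_stage_inflator(0.3))   # mild crash: stored gas only
print(dual_stage_inflator(0.9))   # severe crash: both stages fire
```

The point of the leak defect described above is that stage 1 depends entirely on that stored nitrogen being present, so a slow leak quietly disables the bag.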

Anthony: But I just come up with a great James Bond Mission Impossible type villain thing.

We replace his airbag inflators with potassium cyanide. Let’s ram him.

Michael: That’s interesting. I wonder if anybody’s come up with that. I’ll have to go back and watch thousands of episodes.

Fred: I think that’s what you’d find in a Russian airbag, I believe.

Anthony: Last one. Let’s do an investigation. How’s that sound?

That sounds great. General Motors, the 2023 Cadillac Lyriq. Now, the Cadillac Lyriq, isn't this the super expensive, like, $100,000-plus car? Am I right on that?

Michael: I’ve never shopped for a Cadillac. I’m gonna guess, yes, this is the one you’re talking about. It looks pretty fancy.

Fred: Well, yeah, it’s the all electric whiz bang.

Anthony: Right. It looks like, I mean, this is not a bad description of it, I'm not trying to be facetious or smart-alecky or whatnot, but it looks like one of those things from an auto show that you saw in the 90s, and you're like, that's cool, they're never gonna make that. And it looks like they actually made this. Yeah. So what's going on here, Michael?

Michael: So they're having anti-lock braking system problems. This is not super advanced technology, anti-lock braking has been around for a while, and it's not related to automatic emergency braking.

There's a spindle inside the electronic brake control module that fractures during an anti-lock braking event, meaning you lose assist with your brakes. So you lose your anti-lock braking system, essentially. But what was interesting is they put in a new material in May of 2023, and developed software as well to cycle the brake booster at startup, and those both apparently fix the condition. But it looks like they didn't put out a recall at the time, which is bad news, because essentially they were trying to correct this potential safety issue and put out a fix for it.

They made the change in production, but at the same time never notified owners, never filed with NHTSA that a recall was happening, and essentially conducted the software update under the table with its owners. Which is a problem. NHTSA's looking into that, and it looks like they could have done a lot better here.

And so there may be potential civil penalties or otherwise for delaying a recall, but we'll see how that works out.

Anthony: I just pulled up a picture of the Cadillac Lyriq. It is not the vehicle I had in mind. It just looks like a fancy SUV. They have some other electric vehicle, maybe, that's better. A prototype, but it looks super cool. Anyway, hey listeners, I hope you've learned something today. I hope you've learned all you've ever wanted to know about automatic emergency braking, and how both Michael and Fred are convinced nothing will ever be better in the future when it comes to partial automation of cars.

Which is probably good to know. So if you have partial automation, or if you think it's full automation: PAY ATTENTION! And, uh, hey, no judgment of your lifestyle, if you'd like to ride a powered unicycle, to each their own.

Michael: Yeah, we don’t really think you’re a dork. Yeah.

Anthony: One of my friends has a bunch of unicycles, and I'll let him know. Good for him.

Michael: I'm scared of them. So there you go. He's a bigger fan than me.

Anthony: Fred's saying something very interesting, but he's muted. Why would anybody have more than one unicycle? This is curious. I will ask him. I have no idea.

Michael: I would need four and attach them together to feel safe.

Fred: Maybe some of them are self driving. Who knows?

Anthony: There you go. Alright, thank you so much, listeners. Go to autosafety.org: subscribe, donate, click like, click five stars, tell all your friends. Put in a comment on Spotify, and eventually someone will log into the Spotify account and see if anyone would like to do that.

Fred: Well, bye bye. For more information, visit www.autosafety.org.

