Tesla and stunning negligence

Tesla settles an Autopilot case before the trial starts, which makes sense since Elon said they never fight true claims. Another Tesla case, involving a 2-year-old who managed to start the car and crash into his mom, has begun, and Elon says his robotaxis will be announced real soon. Our expert opinion is that Tesla won’t produce a robotaxi within the next 5 years. Don’t get high and drive, because it’s dangerous and the sobriety checks are so poor that you might wind up in jail for a really long time even if you’re sober. Fred covers reaction time and we cover some recalls. Enjoy.

This week’s links:

Subscribe using your favorite podcast service:


note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.

Hi, my name’s Anthony and it’s been at least three weeks since I’ve wagged my finger at another driver. Welcome to the show. Welcome to the show. Good

Michael: morning. Good morning. Congratulations, Anthony. I know that you’ve been working really hard on that.

Anthony: No one’s honked at me obnoxiously for no reason.

That’s always helpful. Speaking of honking obnoxiously, let’s talk about Elon Musk. There’s this big trial, we’ve talked about it in the past, about, I want to say, Walter Huang. Walter Huang, his family suing Tesla after his Model X on Autopilot crashed. Horrific, a person died, and this was going to trial.

This has been in the works since 2019. The crash happened in March 2018. They filed suit against Tesla saying, hey, this is negligence on your part. And there’s this great quote in the Wall Street Journal piece we’re linking to. Elon Musk has said Tesla’s policy is to never give in to false claims, even if we would lose, and to never fight true claims, even if we would win.

Tesla settled this case before it went to trial. Based on Elon’s own words, I’m gonna assume they knew they were full of it. They knew they were gonna get ripped apart over their Autopilot, Full Self-Driving nonsense. Especially now, as Elon is saying, hey, everybody, come on.

You’re gonna get one free month of this bad stuff. That’s my take. Michael, from a lawyer’s point of view, I imagine every lawyer who’s been going after Tesla for essentially selling a faulty product is slightly frustrated that this didn’t go to trial?

Michael: Possibly. It’s hard to say exactly what took place here.

There was a lot of movement in the week before trial. Tesla was coming in late and trying to get evidence that Mr. Huang was playing a video game at the time of the crash, and it’s unclear how that played into the settlement. You never really know what goes on behind the scenes, and there are going to be nondisclosure agreements involved that prevent most of us from ever figuring out what happened.

Here, I don’t know. I just think, from a corporate standpoint, it’s more bad strategy from Tesla. You’re going to get bad press, and your arguments are somewhat weak. You’re enticing people into the exact trap that Mr.

Huang fell into. You’re saying, hey, your vehicle is going to drive itself. And he was using the system regularly, to the extent that he had even noted to family members that he’d encountered problems at this exact spot, where a concrete highway median began at a barrier. The Tesla veered out of its lane directly into the path of a frontal collision with that barrier, which completely ripped the front of the car apart. It was a horrific crash, and the NTSB looked into it; this crash has been very publicized.

So I don’t understand, from a strategic perspective, what Tesla is doing here. Autopilot drove the guy into the barrier. That’s pretty clear. And to blame the guy, and to go after him using secret assistance from Apple, shows you what the plaintiffs are up against when it comes to these cases.

Tesla was hoping to push back any settlement money they would have to expend, or any potential punitive damages that might have come out of a trial, as far as possible. This crash happened in March 2018, so it has been six years since this family lost their

husband and father, and they have been waiting to be made whole by the company that put out this technology that claims to be Autopilot but drives you into concrete barriers. It’s an odd strategy from Tesla, and it’s one they’ve deployed in almost all their cases, regardless of the facts.

So that statement you referenced earlier, Anthony, is just bluster. It’s simply not true. They’re not basing their legal efforts on the facts of the case; they’re basing them on their financial considerations. They’re pushing a ton of defect cases back as far as possible and delaying them as much as they can.

And at some point, I hope one of them hits trial and punitive damages come into play, because that’s where we might actually see the pile of cash that Elon is sitting on threatened to some extent. And settling before trial, while it may be great for the family, like you say, means there are probably a lot of legal folks who were interested in how this trial was going to proceed, since it would essentially

have laid out in public the arguments that we and others have been making for years now about why Autopilot isn’t safe. A verdict for the plaintiff in this case would have gone a long way toward helping get justice for all the other plaintiffs out there who have been deceived by Tesla and its technology.

Fred: I think that a huge factor has got to be the stock price.

They’re under a lot of pressure right now, and this would have been very bad advertising. The amount of money that would be lost to continued depreciation of the stock dwarfs any settlement. I think that drove him to settle, just to get this over with. I do think it opens the door for a lot of other litigators, though, who are confronted with the same set of facts and now see a strategy to move ahead.

Anthony: I look at this as the victim’s family saying, hey, you sold a faulty product. And then again, using Elon’s own words, we will never fight a true claim. So basically, I think by settling this, they’re admitting: yep, we sell a faulty product. Autopilot, for listeners who are new to the show, is basically a Level 2 driving system.

It does lane centering and lane keeping, and that’s it. My car has a version of that. I think both of your cars have a version of that; pretty much every car since 2020 has some version of it. There’s nothing wrong with those systems. They can be very handy. What the problem with Tesla comes down to is that they call it Autopilot, and they advertised it for years as the car driving itself.

Fred: Well, there’s another problem, which is that Autopilot does not have an apparent time limit associated with it. The systems in the other vehicles will all disengage or pop up warnings after just a few seconds. In my car, it’s 15 seconds if you don’t have your hands on the wheel. Autopilot doesn’t apparently have, or at least at the time didn’t have, any such time limits.

So people could just not pay attention to the driving at all for a very long time, and there was no warning or attempt by the car to re-engage the driver in the driving task. I think that was the fundamental difference between Autopilot and the other Level 2 systems that are out there.

Anthony: And my car actually, recently, it’s really annoying.

It keeps telling me, oh, I have to have my hands on the wheel. I’m like, my hands are on the wheel. So I’ve got to do a little shaky motion, like, hey, remember me here? I don’t know if a sensor is failing, or, but

Michael: your car just really wants some of that human touch.

Anthony: It could be.

Hey, I’m here. I’m bored. I’m driving. Yeah. Just give me a little shaky.

Michael: Yeah, in Knight Rider, I always found something creepy about KITT talking to David Hasselhoff. There’s something weird there. I don’t think you want your car talking back to you.

Anthony: I mean anything with David Hasselhoff is slightly creepy.

So Autopilot, as we’ve talked about before, is slightly different from Full Self-Driving. Full Self-Driving, I will repeat a thousand times, is not full, it’s not self, and it’s not driving. It’s not those things at all, okay? This is just, again, a Level 2 system, like you can get in other cars, that will help you do what they call Autopilot on city streets.

My car does this. But again, my car won’t allow you to take your hands off the wheel and be like, eh, we’re good to go. Buyer beware, be afraid. A question that Michael asked in our week’s notes: are Autopilot and Full Self-Driving the modern equivalent of lawn darts? Now, I’m old enough to remember lawn darts.

God, they were fun. Not every kid in the neighborhood made it. For those of you who don’t know, lawn darts weighed about a pound each. They were a piece of pointy steel with wings on it, like a giant dart, and you’d throw them. And they would land down, and you’re like, look how far I threw that in the neighbor’s lawn.

Oh my god, their dog’s dead. Oh my god, I killed Fluffy. You can’t buy these anymore. But they didn’t claim they made life safer. Just more fun.

Michael: They probably claimed they made life more fun, but it’s a very similar thing here. You have a product that is inherently going to encourage you to take risks that you might not otherwise take, and people end up dying.

And in the case of the lawn dart, the Consumer Product Safety Commission said no more. They banned them completely, to my chagrin; I think I was 14 at the time, right in the middle of prime lawn dart age, and they banned them.

Anthony: Wait, it took them that long to ban them?

No, cause I think we had to stop using them like in 1980. Things were slower in the South.

Michael: Yeah, I don’t know that the full ban came through until 1988. But yes, things do hit the South a little slower, thankfully.

Anthony: But not a lawn dart or a Tesla driving on autopilot. In other Tesla lawsuit news, this is a horrible story.

This is, we’re linking to this from KRON4 from the article, Mallory Harcourt was eight months pregnant when her son crawled into the driver’s seat on December 27th, 2018. He somehow turned on the Tesla, accelerated up their home driveway into the garage, and struck Harcourt, according to a lawsuit filed on behalf of Harcourt and her toddler.

Fred: It’s important to note that the child was two years old.

Anthony: This is not

Fred: an adolescent.

Anthony: No. So basically, somehow the toddler got into the footwell. The kid didn’t have a key or a key fob, and, it seems, by depressing the accelerator pedal, the car turned itself on. Is that the correct understanding?

Michael: Yeah, and going back even further, the car was in park, I believe off. This is the Model X, so those falcon-wing doors are up and it’s in the driveway. For all intents and purposes, that’s a vehicle that shouldn’t be going anywhere, especially with those doors up; allowing people to drive with those doors deployed

out and open creates an enormous safety hazard in so many different ways. So there’s a lot of things going on here. The mother had gotten out of the vehicle to put the groceries in the house, the vehicle sitting there off. The toddler found a way into the footwell, somehow turned on the vehicle, and hit the pedals in the proper sequence while shifting the transmission into drive, causing the incident.

So there’s a lot of things that went wrong there. From the start, I don’t understand why a Tesla Model X with those Falcon wing doors or any vehicle that has those automatic fold out doors is allowed to move while the doors are open. That seems like there should be restrictions around that.

And Tesla came back and disputed it. They basically said, we have a system in our vehicles that requires a PIN to drive, and the mother should have enabled that system. She should have gone through a thousand pages of the owner’s manual and dug deep into our electronics to figure out that PIN system.

And that could have prevented this accident. I guess our question there is, and it’s something that Emily from Consumer Reports brought up last week in our conversation regarding child heatstroke prevention: why aren’t these systems enabled at the time of purchase?

Why is it something a parent needs to dig through a library’s worth of information to find, this one thing that might help save a child’s life, instead of simply making the system active when the owner buys the car? And beyond that, Tesla highlights this PIN system and says it’s unique.

We don’t think it should be unique. A PIN system like that, if it had been installed on every Hyundai built from 2010 through the present, could have saved dozens of lives, if not hundreds, and prevented the Hyundai-Kia theft problem. So I’m a big fan of putting PINs on cars.

We all have them on our phones, and they prevent unauthorized use of that 12-ounce piece of metal in our pockets. Why aren’t we putting them to use to stop unauthorized use of the 3,000-pound vehicle in our driveway?

Fred: Those are great points. I do want to add a detail, though, that listeners might find interesting.

The woman in front of the car had her pelvis crushed. She was eight months pregnant. Subsequently, she delivered that infant successfully. I’ve never been a woman, but my understanding is that childbirth is not that pleasant a process, and to do it with a fractured pelvis strikes me as even less enjoyable than the normal circumstance.

The only good news I can find in all of this is that it was not a Cybertruck, because the Cybertruck has a wedge built into the front of it that probably would have killed the poor woman.

Michael: Yeah, both of these cases that we talked about raise this question: should we be requiring owners to do something before operating the vehicle to ensure that they are

familiar with vehicle features? Right now, as we discussed a couple of weeks ago, Tesla is essentially making their dealers activate the demo version of Full Self-Driving and take buyers around the block under Full Self-Driving, mainly just to market the system to them.

Why aren’t they taking similar time to make sure their owners have at least a rudimentary understanding of the safety features a vehicle has and how to access them? That seems like time that is much better used, both for the consumers and for Tesla. They could potentially be heading off

lawsuits like this by making sure the moms who are buying Teslas are aware of these features and have activated them before they’ve left the dealership, and by making sure that Mr. Huang is aware that Autopilot is not going to work in all circumstances and that it could drive you into a barrier and kill you if you’re not paying attention.

That just seems like a better use of that time, since Tesla seems to be somehow dedicated to re-educating its customers.

Anthony: I think Tesla needs to start running TV ads similar to drug ads. Drive a Tesla. Some features may not work in all situations. Autopilot is a lie. Full self driving is not really true.

If we get into a lawsuit and you crash and someone’s injured, we will blame you. They literally need that level of disclaimer on all of their products. And I’d like to highlight the last thing I said there, because this is Tesla’s M.O.: hey, if something goes wrong, it’s not our fault, it’s yours.

And I think that’s how Elon runs his personal relationships as well. Just from seeing how many baby mamas he has, but that’s not related to auto safety, is it? No.

Michael: He’s got a lot, he has a lot of children to protect.

Anthony: Yeah, how many car seats has he had? And I bet he’s checked out Consumer Reports car seat safety.

Nah, I’m kidding. He’s had somebody on his staff do it and then said, I don’t need car seats. Enough of this. Let’s talk robotaxis. You guys remember robotaxis? Oh, still Tesla-related. Elon opened his face the other day and said something. Let’s step back to, I think it was like 2014, when Elon said Tesla is essentially worth zero unless we get self-driving cars and robotaxis working correctly.

And his pitch in 2014, roughly, was: you buy this car, and then we’re gonna install some magic software into it. That magic software does not currently exist, and there is no potential for it to exist within the next decade. And while you sleep, your car will go out and pick up fares, and don’t you want that?

Don’t you want your car to be a taxi at night and you sleep during the day? I don’t know who wants this. It sounds creepy as hell. Imagine getting into your car for your morning commute and be like, Oh, what happened to my seats? Oh, God. RoboTaxi. Elon’s been talking about this, Eh, it’s coming every six months.

And what’s the definition of insanity? Anyway, he just said a couple of days ago that, hey, we’re announcing, not releasing, announcing the robotaxi on August 8th. And the date, August 8th, this is just priceless, is 8/8. Now, Elon, in other parts of his life, has been accused of being a Nazi sympathizer.

The number 88, for those of you familiar with the white supremacist oeuvre, is a big one for them; they’re big fans of it. Coincidence? I don’t know. Fuck this guy. Elon’s saying, yeah, robotaxis, yeah, we’ve got it. I can’t stand this, man.

Michael: Yeah, nothing has really changed here; it’s just more claims being made.

They’re gonna unveil this robotaxi. We’ve seen multiple companies unveil a robotaxi. The funny thing here is that back, eight years ago, when we were looking at some of the first problems around this issue, the Tesla website said, we have the hardware capable of full self-driving.

And now here we are, looking at a picture of a completely different car. So apparently they didn’t have the hardware capable of full self-driving. We know they’ve had some issues around sensors; Tesla seemed to believe that cameras alone and machine learning were going to be sufficient to

reach Level 4 autonomy, or full autonomy. We’ve pushed back against that. We think you’re going to need multiple sensors. You’re going to need lidar. You’re going to need some other things to ensure that the system is safe. And I think Tesla is probably going to have to come around to that idea eventually.

I’m interested to see if lidar is part of their package on August 8th. But the fact is, the hardware necessary for full self-driving was never there in 2016, 2017, and going forward, and it’s still not there in production Teslas. Yes, this has all been a lie. It’s pretty simple.

It is a lie, and it has been constructed to pull the wool over the eyes of the average consumer and get their money.

Anthony: We’re linking to an article in Reuters. There’s a great quote in it: “Everyone else has found out that what they thought was a 2 or 3 year project turns out to be a 10 or 20 year project,” says Philip Koopman, a Carnegie Mellon University professor working on autonomous vehicle safety.

Tesla’s found that out too. Philip Koopman, also known as a fan of the Center for Auto Safety podcast. Hi, Phil! I think he sums it up perfectly: this is a much, much harder problem than anybody thought. He’s talked on previous episodes about how, in the 90s, we were at 98 percent of having this problem solved.

That last 2 percent is really hard. But Elon’s gonna Elon. What can you do? There was a great comment I saw online about this unveiling: someone said what they’re gonna do is just have a guy dressed in a car suit dance on stage,

referencing when Tesla unveiled their robot, which was a guy in a robot costume dancing. But hey, people keep, I don’t know. Hey,

Fred: Michael, I got a question for you. Yeah. The lawn darts were banned. The FTC statement said that about three people per year were being killed by these things. Why is it that, with a much larger number of people being killed by the Tesla driving features, the full self-driving features, they can’t be banned?

Does the FTC have no jurisdiction?

Michael: The CPSC was behind the lawn dart ban, from what I can tell. In the case of lawn darts, children were primarily the ones being killed. They were the ones accessing the lawn darts while their parents weren’t watching, throwing them into the air, not knowing where they’d come down.

And they were puncturing skulls and causing serious brain injuries and other types of injuries. Maybe our inclination as a society to protect children more than adults plays a role in that. But the fact is that these systems Tesla’s put out, claiming they’re something that they’re not, have killed more people than were ever killed or injured by lawn darts.

Anthony: Oh boy. More robotaxi news. In Arizona, GM’s Cruise is putting their robotaxis back on the road. I don’t know if I sent this link to you guys, but it’s hilarious, because they’re not going to operate autonomously at all; there’s going to be a human driver in charge of the vehicles.

But hey, congrats to GM Cruise for realizing maybe we need humans to drive these cars rather than an autonomous vehicle that doesn’t work.

Michael: Going back to the crash that caused all the issues for them, a human driver in that case would have known they were on top of another human before they tried to pull to the side of the road, further injuring that person.

So it’s a no-brainer from that perspective. In a nutshell, it’s something we think is critical: until autonomous vehicles have shown at least some greater ability to navigate city streets and highways, until the data is in, and until there’s a point where Americans, consumer advocates, and the government are comfortable with it happening, you’re going to need protections in place to make sure there’s a human who can take over when there’s a problem.

Fred: Yeah, but realize that this is a band-aid that’s already been tried and proven to be fatal, right? So people have heard

Michael: that. Yeah.

Fred: So there is no prospect, in our lifetimes, and even you younger people, in any of our listeners’ lifetimes, that there will be enough driving data to assure the safety of these vehicles in reasonable circumstances.

The only prospect for doing that is to set up careful tests on test tracks that compare the behavior of people in critical driving situations with the behavior of self-driving vehicles in the same critical driving situations. Driving down the road on clear, sunny days with no traffic is never going to verify that a self-driving car will operate safely in a school district at the moment kids are being let out of school on a nice spring day.

It’s just, it’s not going to happen. And this is all a distraction because the industry does not want to set up the tests and government, for whatever reason, does not want to set up the tests that would allow a reasonable basis for saying that these things are okay to use. It’s stunning negligence, in my opinion.

Anthony: You know who’s doing a good job on this? Kentucky Governor Andy Beshear has vetoed a bill that would have allowed fully autonomous vehicles on the state’s roads, citing concerns about safety and security. Opening Kentucky’s highways and roads to fully autonomous vehicles should occur ONLY after careful study and consideration and an extensive testing period with a licensed human being behind the wheel, which is what other states have done before passing such laws.

So that sounds to me like a step in the right direction.

Fred: It is a step in the right direction, but how did they get there? They got there because the industry was successful in convincing the two houses of the Kentucky legislature, by repeating their standard talking points, that this was okay to go ahead with; stunning ignorance on the part of the legislators.

To quote one of them, let’s see here, Representative Bray said, let the market make these decisions rather than a heavy-handed government bureaucracy. It strikes me that legislating approval is exactly that heavy-handed government bureaucracy. So there’s a little bit of a tautology there.

He also said that it’s been proven that self-driving cars are 100 percent safer than human drivers. Thank you very much. I don’t think that’s the case. Now, my understanding is that 100 percent safer would mean there are absolutely no safety hazards. But perhaps my understanding is wrong.

Anthony: You’ve got to have lobbyists for these companies take you out to The Palm and get you a nice steak dinner.

And that’s how you come to that 100 percent effective number.

Fred: The creamed spinach is disgusting, by the way.

Michael: I don’t know where Representative Bray got those numbers. I simply haven’t been able to find a source. But yes, this is simply the same junk that’s been trotted up to Capitol Hill and back and all around the states for years now, where they’re saying things like, oh, computer drivers don’t speed.

They don’t text, they don’t drive impaired, they don’t drive fatigued, while human drivers do all those things. So human drivers are the problem; we need these machines to take the place of the humans. They fail to cite the litany of areas where computer drivers have come up short and have shown that they’re not as safe

as humans. They also don’t talk about the inherent failures: the blue screens of death, the software issues, the bugs, the glitches, everything that can go wrong with a computer that humans aren’t subject to. So there is no evidence. I don’t know where the hundred percent number came from.

Representative Bray is a beef farmer, so he’s used to shoveling bullshit. Probably the source of a lot of that came directly from the Kentucky Chamber of Commerce and the Autonomous Vehicle Industry Association blowing smoke up the beef farmer’s ass.

Anthony: Listeners, if any of you think that computers are better, safer drivers than humans, I want you to do yourself a favor.

Look at the inside of your windshield, and now go look at your computer screen. Which has more spittle on it, okay? What are you cursing at more, other drivers or your computer? If you’re like me, it’s going to be your computer screen, because the thing is, oh, this broken update, now my audio doesn’t work.

What’s happening here? And you want to put that into a 3,000-plus-pound car traveling at 70 miles per hour down the road. But in New York City, the mayor, who’s not the smartest person in the world, is going to bring autonomous vehicles onto the streets of Manhattan, which is insane, because the only way to navigate the streets of Manhattan is to regularly break traffic laws.

That’s honestly the only way traveling around Manhattan works. We do that, and we all make eye contact with each other, like, yeah, I know I’m stuck out in the middle of the intersection here, I know the light turned red, and I know you’ve got to let me go, because if not, I’m screwing up everybody else. You look at each other, maybe give a slightly dirty look, not so much hand gestures, but that’s how you interact with other drivers. It’s constant eye contact and saying, yeah, you’re gonna go,

I’m gonna go. I don’t see how this works with an autonomous vehicle in Manhattan. The traffic jams will be outrageous.

Michael: Yeah, that’s going to be a tough one. I don’t know. The mayor has more authority than he has exercised to oversee the AVs coming to New York City. I would urge you, as a citizen of the city, to let him know your thoughts.

Anthony: Oh, yeah. But I appreciate that he takes his paychecks in Bitcoin and is hence making less money than if he took fiat currency. Yeah. Hey, Fred, how are you feeling about Tao time? Oh, Tao time! I don’t know why I said it like that. We’re

Fred: ready to go. I, I wonder if those Kentucky problems are caused by Piggly Wiggly somehow.

They’ve got penetration in Kentucky, right?

Anthony: Fred, we know that your retirement is secretly funded by the Piggly Wiggly Corporation of America. Okay. So let’s talk about

Fred: that Tesla. They bought me,

Anthony: huh? Yeah, exactly. But it’s wrapped in foam speed and reaction time.

Fred: You’ve now entered. Okay. Michael pointed out that there’s a kind of standard that people should be one car length behind for every 10 miles per hour

of speed they’ve got. And I was wondering about that, so I looked into it a little bit. The average car is about 16 feet long or so, and at 60 miles an hour, that means, if you follow that standard, you’d be 96 feet behind the car ahead of you. I don’t think anybody actually does that, because if you do, three other cars are going to slip in and you’ll end up back where you started.

But anyway, that’s the standard. If you look into it a little bit, at 60 miles an hour you’re traveling 88 feet per second. So basically that gives you about a one-second interval between the time the car ahead of you does something and the time you actually reach that point. Of course, that something is typically not a dead stop, because the car ahead of you is going to slow down.

But it does give you an extra second of time to think about things, prepare your response, and start to act. So it seems like a good idea. The one second is what sticks in your head, and since the gap and the speed scale together, it works out the same at lower and higher speeds as well. So you’re looking at around one second.
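Fred’s arithmetic can be sketched as a quick Python check; the 16-foot average car length is the episode’s assumption, and the roughly one-second figure falls out of the math at any speed:

```python
# One-car-length-per-10-mph rule of thumb, as described in the episode.
CAR_LENGTH_FT = 16.0  # average car length assumed in the episode

def gap_distance_ft(speed_mph: float) -> float:
    """Following gap: one car length for every 10 mph of speed."""
    return (speed_mph / 10.0) * CAR_LENGTH_FT

def feet_per_second(speed_mph: float) -> float:
    """Convert mph to ft/s (1 mile = 5280 ft, 1 hour = 3600 s)."""
    return speed_mph * 5280.0 / 3600.0

def time_headway_s(speed_mph: float) -> float:
    """Seconds of travel time the gap buys you at that speed."""
    return gap_distance_ft(speed_mph) / feet_per_second(speed_mph)

# At 60 mph: a 96 ft gap, covered at 88 ft/s, is about 1.09 seconds.
print(gap_distance_ft(60))           # 96.0
print(feet_per_second(60))           # 88.0
print(round(time_headway_s(60), 2))  # 1.09
```

Because both the gap and the speed scale linearly with mph, the headway comes out the same (about 1.1 seconds) at any speed, which is why the one-second figure sticks.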

So that brings up the question of takeover

from autonomous and self-driving cars, right? Because the standard for most of those cars is that you have to be ready and able to take over control of the car under any circumstances at any time. People have looked into that: how well can you do it? There’s a paper by Eriksson and Stanton, and we’ll provide the link for people.

There are really two issues with the transfer of control from the self-driving car to the human driver. The first is how fast does it happen? The second is how well does it happen? It doesn’t do you any good to take over really quickly unless you’re going to effectively control the transition and also make sure that when you do take control, you’re doing it safely.

You’re not swerving, you’re not running into the cars beside you. There’s a lot to think about when you’re doing the transition from automatic controls to human controls. We won’t get into all of that in detail, but there are some studies, again by Eriksson and Stanton, that looked at actual human beings

and what they can do in controlled circumstances. And what they found, in round numbers, is that for about 10 percent of the people they looked at, the median time for the transition from automatic to human control was less than five seconds. They had 26 subjects, and only two of them were able to do it, on the median, in less than five seconds.

Similarly, it took a couple of people more than ten seconds, again on the median, to transition to human control. But this is on a simulator, and I think there is no naturalistic data supporting this, because it’s simply too dangerous to do it on the road.
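To make the shape of those simulator results concrete, here is a hypothetical sketch. The times below are invented to match the description (26 subjects, two with medians under five seconds, a few over ten); they are not Eriksson and Stanton’s actual data:

```python
# Illustrative median takeover times (seconds) for 26 hypothetical drivers,
# shaped like the study described above: two fast responders under 5 s,
# most in between, and a few over 10 s.
takeover_times_s = [3.1, 4.6, 5.2, 5.8, 6.0, 6.3, 6.7, 7.0, 7.2, 7.5,
                    7.8, 8.0, 8.2, 8.5, 8.7, 9.0, 9.1, 9.3, 9.5, 9.6,
                    9.8, 9.9, 10.0, 10.2, 10.6, 11.4]

def fraction_missing_deadline(times_s, deadline_s):
    """Fraction of drivers whose takeover time exceeds the deadline."""
    return sum(t > deadline_s for t in times_s) / len(times_s)

print(fraction_missing_deadline(takeover_times_s, 10.0))  # ~0.115, roughly the 10 percent discussed
```

The point of the sketch is only that setting a takeover deadline implicitly picks a fraction of drivers who will miss it, which is the question Fred raises next.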

It brings up the question of: if it’s too dangerous to test in a controlled situation, why is it okay to do it on the highway, in an uncontrolled situation, with untrained people who’ve never encountered it before?

Anthony: Why do you hate American capitalism? I’m sorry, continue.

Fred: We should let the market decide. That’s right. Exactly. And if the market decides, then you’ve got to pick a number for what is a reasonable standard for legislators to use for transition from automatic control to human control because this is a traffic safety issue that’s going to arise.

Now, we had Phil Koopman and Bill Widen on a couple of weeks ago. They were talking about duty of care and the need for a duty of care associated with self-driving vehicles, the extension of that duty to self-driving vehicles in the same way that a human being has a duty of care when they get behind the wheel of a car.

They have proposed that the vehicle has to assume a 10-second time for the transition from automatic control to human control, during which the car still has the duty of care for the safety of the people in the car and the people around the car. Is 10 seconds a reasonable number? Again, there’s not lots and lots of data, virtually none in a naturalistic setting, but based on these simulator studies, 10 percent of the people, round numbers, would be unable to meet that standard and take over the car in 10 seconds.

Is 10 percent of people failing that test a reasonable number? I don’t know. Michael, what do you think?

Michael: That’s 10 percent that are going to be left behind in the safety calculations, and you’re exposing that 10 percent to additional risk if you base your designs on that standard.

Anthony: And you’re saying 10 seconds. So I’m traveling down the highway at 60 miles per hour, and my level three system essentially says, hey, you’ve got to take over. One Mississippi, two Mississippi, three Mississippi... and that’s before I hit the brakes.

Michael: Yeah, you could have saved your video game and taken over the steering wheel by that point. Depending on how fast you are, or depending on where you are, I think, if you’re asleep, it might be different.

Anthony: But even, the car can’t see the future, like the car can’t see ten seconds ahead.

Michael: Ten seconds is an eternity.

And how are cars even going to be able to detect a crash situation in that window? I don’t know if it’s even possible.

Fred: That’s roughly the length of an aircraft carrier deck. So that’s a long way to go at 60 miles per hour.

Anthony: Really?

Fred: Yeah. Holy, they’re huge. That’s huge. Yeah. That’s a long way.
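Fred’s carrier comparison is roughly right as arithmetic: at 60 mph a car covers 88 feet per second, so a 10-second takeover window spans 880 feet, in the neighborhood of a flight deck (assuming a roughly 1,000-foot deck). A quick sketch:

```python
# Distance covered while waiting out a takeover window at highway speed.
def distance_ft(speed_mph: float, seconds: float) -> float:
    return speed_mph * 5280 / 3600 * seconds  # mph -> ft/s, times the window

print(distance_ft(60, 10))  # 880.0 ft for a 10-second window
print(distance_ft(60, 1))   # 88.0 ft -- even one second is a long way
```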

All of these issues are being skirted by the industry advocates who are pushing this bullshit at the state legislative level; the Kentucky legislation is a great example of that. These are fundamental safety issues that are simply not being addressed by the industry, nor by the legislators acting on the industry’s behalf.

We need to work these issues.

Michael: If you look at the case we were just discussing, the Tesla and Wong case, where there’s a barrier that Autopilot steered into, that was almost immediate. You’re not getting 10 seconds. You may not be getting one. You may be getting a small fraction of a second to react to that. So I don’t know. What’s allowed on the roads and what the science says should be allowed are very different things, to your point, Fred.

Fred: And it’s really unconscionable on the part of the industry to move forward, driving this to some kind of imperative, for safety when they haven’t even addressed the fundamental issue of whether or not there can be a safe transition from automatic control to human control in the natural environment, where there are other people around you, other cars around you, and hazards around you. Putting the standards together for what should happen when you’re testing automated vehicles on the road is something I’m participating in.

One of the issues we’ve brought up is that you need to include the size of the shoulder available for a car to pull over when you’re mapping the highways, right? Because if you’re on a bridge and the car says, oh, I’ve got to pull over now, you’re running into the bridge. That’s, generally speaking, a bad idea.

Until these issues are settled out, this simply shouldn’t be happening. It’s a catastrophe waiting to happen. And yes, Elon Musk may be the most benevolent person in the world, there are varying opinions on that, but to take all of those vehicles, wave a magic wand, and say they’re now self-driving automobiles and robotaxis is not going to end well.


Anthony: word.

Fred: Any questions about that? We can put up some of these links on the on the website, right? I’ll send them to you, Anthony. Yeah.

Anthony: So, what you were starting off with. When I was first learning how to drive, I was taught it was a three-second rule. It was like, all right, make sure you’re not traveling too close to the car in front of you.

Find an object, and when you see the car in front of you pass it, you start counting how long it takes for you to get there. So it’d be like a road sign or something like that. So if I see the car in front of me pass a... a yield sign, no, not a yield sign, that’s a bad example.

Like on a highway, you see them pass a mile marker sign, and then you count: one Mississippi, two Mississippi, hey, three Mississippi, I’m at it. So I was taught that was a safe distance to be from that car. Okay, so that’s the general rule. Am I the only person doing this?
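Anthony’s three-second count gives a substantially bigger cushion than the one-car-length-per-10-mph rule, because the gap scales with speed. A quick sketch of the distances it implies:

```python
# Gap implied by counting N seconds behind the car ahead.
def gap_ft(speed_mph: float, count_s: float) -> float:
    return speed_mph * 5280 * count_s / 3600  # mph -> ft/s, times the count

for mph in (30, 60, 70):
    print(mph, gap_ft(mph, 3))  # 30 -> 132.0 ft, 60 -> 264.0 ft, 70 -> 308.0 ft
```

At 60 mph that is 264 feet, nearly triple the 96 feet the car-length rule gives, which is part of why the three-second habit is hard to keep once other drivers start filling the gap.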

Fred: It’s difficult to actually do that.

Anthony: Yeah, I totally do that. I have people regularly cutting in front of me. I’ll have the adaptive cruise control on, set to the three-car-length setting. And let’s pretend I’m going above the speed limit, like I’m matching the flow of traffic.

Michael: You just deal with the fact that people are ultimately going to cut you off and cut down the distance between you and the car in front of you, and that you’re going to have to continue to adjust your distance to all these other cars that are cutting in front of you.

Anthony: Yeah, I let the computer and you

Michael: just let them go and you say screw it. I’m Anthony.

I’m having a good day. I’m just getting from point A to point B.

Anthony: Exactly.

Michael: I think that’s a good attitude. I wish everyone drove like you.

Anthony: Can you call? I’m going to give you my wife’s phone number right now. Can you do me a favor and give her a call? Just tell her that.

Fred: You are a paragon of virtue, Anthony, and I think that those famous hand gestures that people are misinterpreting are simply your way of saying, yes, I’m putting you first.

Anthony: Yes. I’m number one.

Michael: Now you need to, you’re telling them they need one second for, for cartilage purposes.

Anthony: Okay.

Fred: Just checking. Curious that people misinterpret that, but

Anthony: What can you do? Okay, let’s jump over to a more fun topic: marijuana. Before we do recalls, let’s talk about weed, man.

There’s an article we’re linking to in... Benzinga? That’s, I’m not making that up, that’s a real thing. From the article: researchers affiliated with the University of California, Davis affirm that there is no direct relationship between impairment and THC concentrations in subjects’ bodily fluids.

The investigators further acknowledge that current methods focused on THC and/or metabolite concentrations in blood, saliva, urine, or exhaled breath can lead to false positive results for recent use, due to the persistence of THC well outside of the typical 3-4 hour window of potential impairment following cannabis inhalation.

Basically, there’s no good way right now to check if somebody’s too high to drive.

Michael: And there is no good way right now. And even despite that, we see that there are ten states that have a zero-tolerance policy, where even without displaying the characteristics or symptoms of being high on marijuana, if your blood tests positive for marijuana in these states, you can be arrested for operating under the influence.

There are five other states that have certain limits for the detected amounts of THC in your blood, but there are ten states with a no-tolerance policy. And obviously, at the Center here, we are very against driving high, driving drunk, driving sleepy, driving any way that limits your ability to function from a cognitive, behavioral, and safety perspective. But there are some fundamental fairness issues here, and it points to the real need for better research, better detection methods, and a better connection between the amount of a drug in your blood and whether or not you can drive.

Apparently, from what we’ve seen, you can smoke weed and still have a detectable amount in your blood over 24 hours later in the case of chronic users, and chronic here means frequent, a lot, not the slang sense.

It could be days, even weeks, before your blood tests would show no THC in the blood. So effectively, whether you smoked five hours before or 20 hours before, you can be taken to jail for that amount of THC in your blood. And this goes for other drugs as well.

If you had Xanax or any other type of prescription medication that can be detected in your blood days after you’ve used it, and there’s a no-tolerance policy, you can be taken to jail for that. As I say, we don’t want people driving high, we don’t want people on drugs driving, but from a fundamental fairness perspective, we also don’t want drivers who are not actively high, drunk, or sleepy punished for that behavior either.

So what all this points to, essentially, is that we need better testing. We need a better way to connect impairment to measurable blood or other levels. And if we don’t get that, we’re going to have functionally innocent people sitting in jail, having to pay for experts in toxicology to testify for them at trial, and left in really bad situations, which doesn’t seem to be a good outcome, either for safety or for the justice system in general.

There needs to be a lot more done here to make sure that states aren’t imposing incorrect or undeserved sentences on folks, as well as to make sure that accurate detection is there, so that when someone is actually operating under the influence, the police have solid evidence to convict and to hopefully rehabilitate that person so they don’t do the same thing again.

We’re looking at this mainly just to point out to listeners that there are some real problems in this area of drug detection, and in how the amount of drugs that may be in one’s system at one time correlates to the operation of a vehicle. It’s a really difficult thing to do. And given some of the research restrictions around marijuana over the years, I don’t believe we’re as far along as we could be in this area. More needs to be done.

Anthony: Michael, after you get high, how long do you wait before you drive a car?

Michael: At least 17 days.

Fred: It’s been about 50

years for me, and maybe you could by now. I think you’re safe. Probably, okay. How could that possibly work anyway? Cops don’t carry around blood test kits in their cars. You’ve got to draw blood, send it to a lab somewhere before you get results. That takes time and energy.

Michael: I’ve seen some types of tests where they can test saliva, an instant test for marijuana, maybe not instant, but within a couple of minutes, and that’s being used in some places. I don’t know if it’s used to establish enough probable cause that the person can then be arrested and given a proper blood test, similar to what they would do with a DUI offender.

Fred: You’re not supposed to spit at police officers. That’s generally a bad strategy, isn’t it?

Am I missing something?

Michael: That is a terrible strategy. However if they asked to swab the inside of your mouth then,

Anthony: yeah, exactly. That you say, ah, You say, ah, okay, people do not drive drunk. Do not drive high. If you’re going to get high, just wait till the next day. Is that fair? Hey, it’s Friday night.

You’re going to get blasted? Don’t drive till Saturday afternoon. Don’t follow my advice, but you know, that makes me feel vaguely safer. Yeah. Okay. That sounds fair enough. But more importantly, whether you’re drunk or you’re high or you’re sober, go to autosafety.org and click donate. Especially if you’re high, just keep clicking that donate button.

No, don’t click the donate button just once, fill out your information multiple... no, just donate. That’s all I’m asking. And now, with that, it’s time for the Recall Roundup. And let’s start with something called rear view camera fun. Okay. Hyundai Motor Corp., potentially 18,206 vehicles. These are the 2024 Hyundai Santa Fes.

The rear view camera can, when you shift into reverse, be obstructed. What? It will be obstructed by the trailer park assist message, reducing rearward visibility and increasing the risk of crash or injury.

Michael: Oh look, this is a perfect example of a system that is mixing vehicle creature features, like trailer parking assist, with a safety system, which is the rear view camera system.

So apparently there’s a communication error that happens in the computer network for the vehicle, and it’s obstructing the rear view image with this trailer parking assist message. So you’re backing up, and you’re not parking a trailer, you’re just backing up, and some text or whatever comes up on the screen, a message from the trailer parking assistant, blocks your ability to see what’s behind you.

That is the simple explanation of what’s going on here. There’s a much larger explanation that has to do with software and logic and a vehicle CAN communication error resulting in a TPA pop-up message. But I’ll leave that for later.

Fred: Great. Good. We’ve talked a lot about software validation in earlier episodes, and this is evidence of a classic failure of software validation.

This should never have happened.

Anthony: Agreed. Chrysler, 26,776 vehicles. This is the 2024 Chrysler Pacifica. It is a van. Let’s see what happened here. The traction control system uses the accelerator pedal position instead of the calculated throttle position as an input for system control activation. When the cruise control is activated, the traction control system does not function in vehicles with the suspect throttle calibration, because the accelerator pedal is not being depressed. If it takes some SSRIs, maybe it will feel happier.

Michael: What do you think? That seems similar to the last one, a validation issue. Like, how do you not figure that out beforehand?

Yeah, when you’re on cruise control, there’s not a human pressing the accelerator pedal, yet we have the accelerator pedal logic tied into the traction control system. Seems like they should have caught that one, just like Hyundai should have caught the rear view camera issue, in some sort of testing prior to deploying these vehicles.

Anthony: This is a scarier one from Subaru, potentially 118,723 vehicles, the 2020 to 2022 Subaru Outback. It looks like moisture can get into your airbag system and cause a short circuit, and your airbag warning system lamp will illuminate. Yeah, and this can mean your airbag’s off.

Michael: Wait, Fred, do you have one of these vehicles, or is yours an earlier Outback?

Fred: No, I’ve got a 2020 Outback.

Michael: So this just may affect your vehicle.

Fred: I thought it wasn’t an Outback. I thought it was, I thought I read that it was a... no, a Subaru Legacy. It’s

Anthony: Outbacks. Outbacks.

Fred: In any event, I have not received an alert for it, but I’m also a misanthrope, so I never have anybody in the passenger seat.

Anthony: The owner notification date is May 21st, so you got some time. The dealers were notified on March 27th.

Fred: All right, if I should suddenly begin having friends between now and then, I’ll warn them. Thank you.

Michael: And basically, we covered a similar issue before, where apparently there was a plant somewhere on earth, I haven’t been able to figure out where, and a natural disaster destroyed an electronics plant that some automakers rely on. So they switched production to a different supplier in the interim period, while the original plant was coming back online, and the occupant detection systems that were built in this new plant aren’t working properly.

That was essentially the cause of this recall. But essentially, if the occupant detection sensor in the front seat is not detecting a human, it’s going to leave the airbag off in a crash. And that’s not what you want if there actually is a human in the front seat.

Anthony: Maybe Fred does because he doesn’t have friends. And if there’s somebody in that front seat, they could be his enemy. Yeah, no,

Fred: I just pulled this up, by the way. It is the Legacy, not the Outback. I don’t have to get any friends between now and then, so I’m good. Thank you. Oh, it’s the Outback, according to... no, it’s the Outback.

I’m looking at the

Anthony: NHTSA report right now. Yeah. I don’t know what I’m looking at, then. I don’t know. We’re at part four, whatever. Moving on. Kia, 427,000-plus vehicles. The 2020 to 2024 Kia Telluride. It will roll away when it’s in park. What? Yeah. The electronic parking brake... you say it’s in park, but the car’s like, nah, let’s move.

Which, Hey, that sounds fun.

Michael: This was a drive shaft that was assembled improperly by their supplier, according to Kia. It’s a lot of vehicles. This is a relatively new model, it’s been out since 2020, and it looks like it’s every Telluride that’s been manufactured since the beginning of production.

So it’s a large recall. Basically, drive shaft damage that occurs over time can allow the vehicle to slip out of park if you don’t have your parking brake engaged. So if you own one of these vehicles, it’s really important, between now and late May, when it looks like you’ll be able to get the recall performed, to use your parking brake anytime you are in park.

Anthony: Here’s a dumb question. I remember back when there used to be that parking brake you’d pull up, it was a lever. My car doesn’t have that. So I just put the car in park wherever I am, and it parks. But there’s a little button, like a little toggle thing. What the hell does that do? Do I ever need that?

Michael: That’s what I’ve got, a button. Typically, for instance, if I open my door while I’m in... I think it’s while I’m in drive, or while I’m in park, I can’t remember. But if I open my door after the vehicle’s been started, sometimes it will automatically put on the parking brake. I think that has something to do with the door logic.

But yeah,

Anthony: I’m just like, do I ever need to use this?

Fred: In the old days, you used to have a mechanical backup for the brakes, which was the lever that you pulled up. With a dual-diagonal braking system, that’s no longer a requirement. The parking brake that you’ve got now locks the transmission.

It doesn’t really, in most cases, or it could lock the brakes, but it’s only intended to be used as a backup for parking, rather than a backup for the brakes when you’re moving the car. And by the way, I checked that notice. There are two cars involved, both the Legacy and the Outback. So that’s the source of my confusion.

Michael: And so this is park-by-wire, in a way. In the past, you had to pull up firmly on a little handle that was to your right, or you had to push on what was like a pedal that you had to depress, which I believe physically engaged the parking brake mechanism, whereas now it’s electronic.

Yeah, that’s correct.

Anthony: I’m just making sure that it does it all itself, and that I haven’t been missing out by not touching this little button. There’s a number of buttons in my car I’ve just never touched. I’m like, I don’t know what that does. The car’s always in econ mode. It works for me.

I like economics. Last recall, for those playing the home game: Volkswagen, potentially 1,000-plus vehicles. This is varieties of the Audi e-tron vehicles, the 2022 to 2024 model years. And let’s see: a short circuit in the high-voltage battery module can increase the risk of a thermal event or fire. Do we park outside?

Michael: No, there are no warnings on this one, I don’t believe. Essentially, I’m not sure if this is one that occurs while driving or not, and I don’t believe there have been any injuries or crashes or fires on this issue. They simply discovered through their analysis that there’s a problem here.

And I’m not sure if they’ve fully developed a remedy yet. It looks like owners are going to get notices in late May, so perhaps they have. But I’m assuming they’re going to be replacing battery modules for all of these customers. It’s about a thousand vehicles involved, which is not their full fleet of e-trons.

But you know, it’s good that they’ve identified this before anything’s happened.

Anthony: Excellent. And with that, listeners, thanks for joining us. Come back next week. We’ll give you more updates on whether Fred has a friend in the front seat.

Fred: It’s not going to happen.

Anthony: All right. Thank you. Bye.

Fred: Bye, everyone. For more information, visit www.autosafety.org.

