Tesla lies, coverups and expanded nonsense

We start with the Tesla court case revealing the company’s attempts to withhold crucial Autopilot crash data. The discussion expands to Tesla’s hazardous software designs and Elon Musk’s outrageous claims about future features. Additional topics cover the rising issue of sexual misconduct in ride-sharing services, the shocking inefficacy of autonomous vehicles in improving safety, and sobering comparisons of annual road deaths to historical tragedies. Finally, a rundown of the latest car recalls rounds out the intense conversation on auto safety and corporate ethics.

Support the show!

Links:

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.

Introduction and Warm-Up

Anthony: You are listening to There Auto Be A Law, the Center for Auto Safety Podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Fred: Do your stretching exercises, stretch,

Anthony: everybody. All right. Touch your toes. Welcome to August 6th, 2025. Fred’s stretching his face. It’s very disturbing to me. Welcome to another episode of the Tesla News Hour.

Fred: Ugh.

Anthony: Yeah, you have to. Look: auto safety and Tesla, oil and water, something like that.

Tesla’s Legal Troubles: Key Largo Crash

Anthony: Let’s start off. Last week a jury found Tesla partially liable for a fatal 2019 crash in Key Largo, Florida, and slapped the company with $243 million in damages. Now, this is unique because Tesla’s been taken to court before, and in quite a few of those cases they settled before trial, whereas this one went to trial.

And it is fascinating on so many levels. We’ve talked numerous times on the show about how we don’t really know: was Autopilot or Full Self-Driving engaged, or does it just look like it was disengaged? It’s been really hard for the public to get this data on what’s going on.

Tesla’s Evidence Withholding Tactics

Anthony: But now, thanks to the magic of witchcraft and this trial, we discovered that Tesla, and I’m quoting from an Electrek article, went as far as to actively withhold critical evidence that explained Autopilot’s performance around the crash. Within about three minutes of the crash, the Model S uploaded a collision snapshot (video, CAN bus streams, EDR data, et cetera) to Tesla’s servers, the mothership, and received an acknowledgement.

The vehicle then deleted its local copy, resulting in Tesla being the only entity having access. Having a software engineering background, I am blown away by the sociopathic behavior of the people who actively wrote this system. It makes no sense except to cover up a crime. That’s it.
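To make concrete what that design does, here’s a minimal sketch in Python of the upload-then-delete pattern described in the Electrek piece. Every name here is hypothetical; this is an illustration of the described behavior, not Tesla’s actual code.

```python
# Minimal sketch of the upload-then-delete pattern described above.
# All names are hypothetical; this illustrates the behavior, nothing more.
import os

def upload(path: str, server_url: str) -> bool:
    """Stand-in for a real upload; returns the server's acknowledgement."""
    print(f"uploading {path} ({os.path.getsize(path)} bytes) to {server_url}")
    return True

def handle_collision_snapshot(path: str, server_url: str) -> None:
    # Ordinary disaster-recovery practice is to keep every copy you have.
    # The pattern described at trial does the opposite:
    if upload(path, server_url):
        os.remove(path)  # local copy gone; the server now holds the only one

if __name__ == "__main__":
    with open("snapshot.bin", "wb") as f:
        f.write(b"\x00" * 1024)  # fake collision-snapshot payload
    handle_collision_snapshot("snapshot.bin", "https://example.invalid/upload")
    print("local copy still exists:", os.path.exists("snapshot.bin"))  # False
```

The single os.remove() call is the whole story: one line turns "back up to the server" into "the server holds the only copy."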

Fred: Yeah. We’ve got a government that’s run by criminals who hired a criminal to make sure that criminals wouldn’t be prosecuted.

So why is this surprising?

Anthony: I mean, from an engineering point of view, it makes no sense to do this. For disaster recovery you’d want as many copies around as possible, unless you realize, oh, we’re selling death.

Michael: Yeah. This is just scary. The auto industry has a long, rich history of trying to hide evidence.

I have heard many credible allegations of automakers who find cars after they’ve been in a crash and send someone in to remove data from the vehicle, or pieces of the vehicle, that might lend credibility to arguments that there’s a defect in the vehicle.

But this almost goes beyond the pale: physically designing a system in the vehicle that deletes the vehicle’s local copy of data and sends that copy to Tesla. Meanwhile, don’t forget that’s not all that’s going on here. Tesla is then telling police, and potentially telling regulators. We didn’t find out from the trial how, or whether, they’re lying to NHTSA about this type of data.

I would suspect they are, given some of the brazenness with which they’ve carried this out. But they’re telling attorneys for plaintiffs, and courts, that they don’t have this data, when they do have the data. There should be more substantial fines, frankly.

Jury’s Verdict and Tesla’s Liability

Michael: The jury found that Tesla was 33% liable in the situation, because the driver’s actions certainly contributed to the crash. But despite that finding, the jury awarded $200 million in punitive damages against Tesla. They didn’t take long to do that either; they reached that verdict very quickly, and gave a substantial amount in compensatory damages.

So the jury was not happy at all with Tesla in this circumstance. This is really the first major Tesla Autopilot case that’s reached a jury verdict. And when you break down all of Tesla’s outright lying and the bad faith in which they conduct their operations legally, the jury heard all of that, and they were pissed.

I don’t think there’s any other way to put it. They thought that Tesla deserved a substantial penalty here. And Tesla is virtually designing a system to prevent disclosure of crash data in fatal crashes, in injury crashes, in lots of crashes. There is a system that detects a crash and deletes the data from your car, if you own a Tesla.

Whether that’s still in place, we don’t know. This crash was six years ago; it took six years for this justice to happen. And it’s a shame in many respects that Tesla’s been able to delay this for so long. But part of the reason they were able to delay it is ’cause they were hiding the evidence so well, and there were a lot of motions back and forth between the parties to try to get this evidence. Like I said...

Fred: Hey, Michael, can I interrupt you for a second?

Explaining Compensatory vs. Punitive Damages

Fred: Some of our listeners may not understand the difference between compensatory and punitive damages. Can you explain that a little bit?

Michael: Compensatory damages are essentially the damages that the plaintiff alleged occurred due to the event: they compensate the victim’s family, or an injured party, for the injury they received in the crash. Punitive damages are essentially the jury saying this is borderline criminal, reckless conduct and should not take place in America.

So we’re gonna send the company a signal. Punitive damages go beyond simply the losses one party in the case may have taken, and essentially serve as an additional penalty to the corporation or the individual who committed that reckless conduct.

Anthony: So this is fascinating.

Thank you. This article on Electrek talks about how the local police officer involved said, hey Tesla, do you have any data that we can get access to? And the Tesla attorney is like, yeah, let me tell you exactly what to put in the letter to request this data from us. And the local cop’s thinking, they’re being helpful, they’re being my friend.

But no, they were not: the letter was designed to limit what Tesla would expose. And then years and years go by, where eventually they had to hire a firmware expert to get physical access to the crashed vehicle to extract data and be like, ah, Tesla, you sons of bitches, you’re hiding this stuff.

Tesla’s Ethical and Legal Violations

Anthony: But I wanna know, Michael, from an ethical, American Bar Association member point of view: were Tesla’s lawyers breaking some sort of ethical guidelines?

Michael: Depending on what their knowledge was. If they knew that Tesla was taking this, what they’re calling a collision snapshot, all of the data on the crash, off the vehicle and hiding it on Tesla’s servers, and then the attorneys are going out saying, we don’t have this... Are they receiving that information from Tesla with no reason to question it, or do they actually know Tesla has a copy?

Obviously they found that out at some point. But whether or not those lawyers are subject to sanctions is really going to depend on their knowledge of what was going on at Tesla.

Fred: Hey, there’s something about this I don’t understand, sorry for interrupting. They stated that they sent a command to the Tesla to delete the data, yet when the forensic engineer came in, he found the data was still there. That’s something I don’t understand.

Anthony: No, he went through and he found a reference to a file, and the exact location on the remote servers. The file itself was gone, but he saw that there was a log saying, hey, we sent this file to this server. Ah, okay. So in terms of obfuscation, their software engineers are, one, sociopathic assholes, but, two, not very good at covering their tracks.
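What the firmware expert is described as finding can be pictured like this: a hedged sketch, with an invented log format (not Tesla’s), of recovering proof that a file existed from an upload log whose target file is gone.

```python
# Sketch of the forensic technique described: find log entries recording a
# successful upload whose local file no longer exists. Log format invented.
import os

def find_orphaned_uploads(log_path: str) -> list[tuple[str, str]]:
    """Return (local_path, remote_url) pairs logged as uploaded but now missing."""
    orphans = []
    with open(log_path) as log:
        for line in log:
            # hypothetical format: "UPLOAD_OK /data/snap.bin -> https://host/xyz"
            if not line.startswith("UPLOAD_OK"):
                continue
            _, local_path, _, remote_url = line.split()
            if not os.path.exists(local_path):
                # The file is gone, but the log proves it existed and names
                # the exact remote location it was sent to.
                orphans.append((local_path, remote_url))
    return orphans
```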

Michael: And now they’ve designed a way to delete that log entry as well.

Anthony: Oh, I’m sure.

Michael: Yeah.

Anthony: It’s insane. If you’re a Tesla software engineer working on this: go fuck yourself. Yeah, that’s my message of the day. Don’t be a bad person. It’s that simple.

Michael: Yeah, and it raises kind of another question, in line with the name of our podcast: there ought to be a law. Shouldn’t there be a law against removing vehicle data remotely without an owner’s consent, or removing crash data from a vehicle in order to essentially prevent crash investigators from accessing it? Isn’t there, Michael? I think that’s called destruction of evidence. It’s borderline criminal behavior. I just don’t know how the destruction-of-evidence laws work: does it become evidence at the point it’s requested by the authorities, so that if you delete it beforehand... I don’t know. It seems like there should be a little more oversight here to prevent this kind of behavior, particularly as we enter an era where virtually everything in our lives is defined by software.

If my refrigerator, which is connected over the air to the main company, as a lot of refrigerators are now, explodes, and they delete whatever monitoring system data they have on the refrigerator prior to that... This could go a lot of different places, and it seems like there needs to be a lot more legal oversight of these types of software programs.

Fred: Oh, I’m sure the criminals in charge of the federal government will get right on that.

Anthony: But I imagine it’s like all software: when you agree to purchase your Tesla, you don’t own the software, you don’t own all of the data it’s recording. You don’t own any of that. You just own a license to use it.

And I’m sure hidden in that legalese is Tesla saying, we’re gonna make discovery really difficult, ’cause you have to know exactly the file name and path of where we hid it. Is that what’s going on? Because that’s what happened in this case.

Michael: Yeah. It gives me a really bad feeling thinking that this type of behavior is even remotely allowed.

Not just in this context either. For instance, NHTSA’s investigating Tesla in a number of areas, right? And we don’t see a lot of, or anything, that Tesla submits to NHTSA, because they’re claiming confidentiality for all of it. So on the multiple issues NHTSA’s investigated Tesla for over the past six, seven years, we have no idea whether what Tesla is submitting to the government is factual or complete.

And there’s no way to know that. I think in light of this verdict, the investigators at the Office of Defects Investigation at NHTSA should go back and review every investigation and every request for data they’ve made to Tesla, to ensure that they don’t see very similar types of behavior in Tesla’s responses to those questionnaires.

There are a number of situations in which data has been requested from Tesla: on Autopilot, on Full Self-Driving, on Summon. There was a sudden acceleration investigation where they found that every single driver who claimed sudden acceleration was hitting the gas instead of the brake.

This is the kind of verdict, and the kind of information that comes out of a trial, that raises a lot of questions about the way Tesla’s been behaving over the past few years. Obviously, we’ve been raising those questions on this podcast. But up until now, there’s really not been a lot of concrete evidence that Tesla was truly violating the law or trying to circumvent the legal system.

And now the evidence is out in the open and it really should make the National Highway Traffic Safety Administration, the NTSB, and frankly every other government agency that’s been looking into Tesla for the last decade go back and check their files and make sure that they’ve been getting accurate information because there’s clearly a pattern of deception on the part of Tesla.

Anthony: So Elon’s been claiming forever that, hey, these cars are safer than anything; a rocket ship could land in the middle of the road and it won’t crash into it. So when the plaintiffs got access to this data, they could see what the Tesla system was seeing, and it could see the car that it crashed into.

It identified it as a car, it identified the T junction it was coming up to, and it just decided not to do anything.

Michael: Yeah, and I think importantly, Tesla’s claim was that the driver had overridden the system in the seconds prior to the crash. However, the data said that Autosteer, Autopilot, was still fully in control of the vehicle at the time of the crash.

So that was just a clear falsification on Tesla’s part. To do what? To blame the driver. And that’s where all this is going.

Anthony: That’s why all of this...

Michael: All of this is intended to prevent Tesla from looking like they have any responsibility in the crash, by pulling all that data away so they can just point at the driver.

Anthony: So Tesla had full access to all this data, yet they went into a court of law and made the argument that the opposite was true.

Michael: Yeah. Essentially their argument was that Autopilot turned off around two seconds before the crash, and that it was the driver’s responsibility at that point.

And we’ve seen a number of other cases where Tesla has claimed that Autopilot was deactivated in the seconds before the crash, when there’s probably not enough time for the human to do anything about it. And that’s troubling, I think. Any case in which Tesla has alleged that should also be looked at.

Fred: So the question is, there have been some criminal trials involving Teslas, and Tesla has won, right? In California, where a very similar collision occurred. We now know that Tesla has a loose relationship with the truth. Do restrictions against double jeopardy protect them from having to go back and revisit that trial data? Or are they somehow immune?

Michael: You mean criminal trials?

Michael: When you said criminal trials, I was thinking of people who have been driving Teslas and were prosecuted criminally for a crash where they might have killed someone. I don’t think Tesla itself has...

Fred: I thought there was one in California, where a Tesla exited a highway and ran into a car at a T intersection, and that went to trial.

Michael: And the human was prosecuted for that, yeah. That would absolutely be grounds for an appeal, or however they do it after someone’s already been convicted. I don’t know if you’d call it an appeal at that point; some kind of habeas petition or something. My legal knowledge doesn’t extend that far. But yeah, absolutely.

If you have exculpatory evidence coming out years later, and we see this frequently with murderers on death row who are exonerated by DNA decades later, that’s certainly going to be admissible to help prove that the person was not as responsible as they may have seemed at the time.

However, in many of these circumstances, just like this one, the jury found Tesla 33% liable. In fact, this verdict could have been different in a number of states that don’t have the same type of negligence law that Florida does, right?

In some states, if the driver is 66% or 67% responsible for the crash, Tesla is off the hook. While this case worked in Florida, it wouldn’t have worked in a number of other states with different laws. It’s a wild nation we live in.

Anthony: I’m betting that there’s been a busy week at Tesla’s mothership, quote unquote accidentally deleting evidence.

We were just doing a backup, and oh, things got deleted; dunno how that happened. Ah, moving on with more Tesla crap.

Tesla’s Autonomous Vehicle Challenges

Anthony: Tesla. Elon’s like, hey, we released Full Self-Driving robotaxis in Austin, in just a few-blocks area, and it’s just not that good. It’s dropped people off in the middle of the road, it can’t make left turns; it’s not very good.

But hey, we’re releasing it in California. And as I think we mentioned last week, California said uh-uh, I don’t think so, ’cause there’s no there there. From Politico: the California DMV and another oversight body publicly warned the company against an unauthorized robotaxi rollout in the San Francisco Bay Area.

The term robotaxi was never used in obtained emails, letters and meeting invites exchanged between the company and the DMV’s autonomous branch over the past year. "Please know that Tesla is aware of the various permitting requirements," as one of Tesla’s lawyers put it. "We understand that driver and driverless autonomous rideshare operations will require attaining additional permits from both the DMV and the California Public Utilities Commission."

So Elon in public is saying, hey, everything’s magic. And the lawyers, of course, behind the scenes are saying, no, we know it’s not magic; he’s on ketamine again, and it’s all bullshit, don’t listen to him. We’re gonna have a driver in the car. We’re not gonna call it driverless. It’s not a thing.

Fred: Worth noting that Elon Musk got a $29 billion paycheck this week for engaging in this behavior. They say crime doesn’t pay, but it seems to be working out for him.

Michael: This is just another level of bullshit. We’re still not sure what the robotaxi in Austin is doing. That’s where they’re deploying, in Texas, where Texas doesn’t have a lot of laws regulating autonomous vehicles, and so Tesla can virtually put a ham sandwich on the road there and call it autonomous. You can’t do that in California. There are regulations for testing autonomous vehicles, and then there’s another set of regulations for deploying autonomous vehicles.

And Waymo and a number of other companies are making use of those regulations to test and to deploy autonomous vehicles on the roads. Tellingly, Tesla can’t even qualify to do any of that. They certainly can’t qualify to deploy; they can’t even qualify to test, because, essentially, they have to have a driver or some type of safety monitor in the vehicle.

In California, they’ve got a driver in the driver’s seat. In Austin, they’ve got, I think, a guy with a red button in the passenger seat who can stop the vehicle. And on top of all of this, they’ve got people in remote setups with steering wheels who can apparently take the vehicle over at any time. You really have to ask yourself:

Is that autonomous? It doesn’t seem like it to me. Functionally, you’ve got two drivers for every one of these vehicles. Obviously that can’t scale; you’re gonna need more drivers than we have now in taxi cabs or in rideshare vehicles. Functionally, what Tesla’s doing in California is pretending that it’s launching a robotaxi or autonomous vehicle service to pacify investors, when they’re just operating a taxi fleet. And it’s nonsense. To those of you that follow us, it’s hilarious. But to many people who can’t distinguish between a human operating Full Self-Driving in a Tesla versus what Waymo is doing with nobody in the vehicle, that distinction is harder to grasp, for investors and for the general public.

And so they think Tesla’s moving its robotaxi operations to California, when that couldn’t be further from the truth.

Fred: Can I go into my Gaslight of the Week now?

Anthony: You can go. You get a plus-one point for asking politely.

Fred: Thank you very much. I picked Tesla this week, which is unusual; I’m not going with Waymo. But on a different subject: in Texas, they have passed a law saying that as of September 1st, people who are operating autonomous vehicles have to get special permits for those autonomous vehicles. So Tesla announced that they’ve done the world’s first autonomous car delivery directly from their factory to a paying customer in Texas.

They used what they call a mid-size SUV, but it’s actually the Model Y. I’m not sure why they call it an SUV except for marketing purposes, because it sure looks like a sedan to me. Anyway, this came from their website: they put up a video showing the transit from the factory to the new owner.

And in that video, it shows the Tesla passing other vehicles on the right and encountering hundreds of vehicles, but with no notice to other vehicles of its experimental status, and no indication of permission from those other operators exposed to the hazard. It encounters pedestrians, but again, there’s no notice to the pedestrians that this is an unusual vehicle.

It drives on divided highways, on suburban streets, inside a townhouse development. And it’s my understanding that Full Self-Driving, or an autonomous version of Full Self-Driving, is not developed for, or not qualified to drive on, all of those streets. This is the same issue that actually came up with the court decision in Florida: there was notice inside the computer that the Tesla was being operated in an area for which it was not qualified, yet it continued to operate. So this may be similar. And at the conclusion of that video, they show the Tesla parking itself in a restricted area, right where the curbs are painted red.

You’re not supposed to park there; it’s a fire hazard. And that’s exactly where the Tesla parked itself. So that was interesting, to the applause of the new owners and apparently some Tesla employees.

Michael: Not the Tesla employees that were remote controlling that vehicle from afar, though, right?

Fred: Probably not. But it’s interesting that it ended up in a place where it shouldn’t have been. Texas has passed a law requiring permits for all self-driving cars, but it starts in September, so Tesla was able to squeak this in under the deadline. Interestingly, with the Model Y, there was an almost concurrent test: on June 15th of this year, a test was done which showed the same Model Y speeding past a stopped school bus and hitting dummies representing kids that were crossing the street in front of the school bus.

This was a test done by the Dawn Project, with partners Tesla Takedown and ResistAustin, and we’ll have the link for this up. So it suggests there may have been a bias in the test; we don’t know the details of that. But it clearly shows an oncoming Model Y using the FSD Supervised technology passing the parked school bus, which had displayed a stop sign with flashing red lights, with the testers observing what the autonomous driving feature would do in an event like this.

Quoting from the article: however, instead of slowing down and stopping as the school bus stop sign lit up, the Model Y using FSD Supervised sped past, and it eventually hit a dummy model about the height of a kid that was attempting to cross the road. So this is an interesting juxtaposition with Tesla applauding their use of autonomy to deliver a vehicle.

I guess this is all just extensive gaslighting. So that’s my nominee for this week: good old Tesla.

Anthony: Okay. Remember, you started off with that plus one there, but that ending, that’s a little weak. I don’t know. Michael, what do you got?

Gaslight of the Week: Tesla’s Misleading Claims

Michael: I’m going with Tesla as well, mainly because of their reaction to the verdict from the jury.

Their reaction was that, obviously, they think the verdict is wrong, but they say that it works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement lifesaving technology. Look, maybe there are circumstances in which Tesla has created some lifesaving technology, right?

I’m sure there are circumstances where Autopilot or Full Self-Driving might turn what could have been a crash into an event where the vehicle intervenes and saves someone’s life. That’s what we’re putting into vehicles now: automatic emergency braking and other things. That’s something we certainly support.

However, at the same time, you can’t claim it’s lifesaving technology if it’s also killing people, and we know that’s happening as well. The difference here is that Tesla, in this statement, is trying to lump themselves in with the rest of the industry. And frankly, Tesla sticks out like a sore thumb here.

No other manufacturer is deploying semi-autonomous technology as recklessly as Tesla. Look at the rest of the industry: Ford’s BlueCruise, GM’s Super Cruise. They’re using map-based geofencing to make sure that their vehicles aren’t operating outside of their operational design domain, aren’t operating on rural roads like the road where the crash that resulted in this verdict took place. And they’re also using what we think, so far, look like effective driver monitoring systems. Tesla didn’t really have a great driver monitoring system at all. They had this nag system on the steering wheel that could be easily defeated, and I think it can still be easily defeated.

They kind of retrofitted a camera-based driver monitoring system into their vehicles that you can fool by putting a teddy bear in the front seat. It’s not capable of actually monitoring the driver the way drivers need to be monitored in these situations. And this entire statement from Tesla is obviously a reaction to the fact that they got caught in this case with their pants down, and now they’re running for cover to the industry.

And it’s a question we’ve asked often here: why does the industry not speak out against Tesla? I don’t know. Tesla is now running for that cover to try to pretend that this verdict is somehow an anti-safety verdict. I’m here to assure you that it’s not. Finding out that companies are hiding crash data, perhaps illegally, and preventing justice from occurring is not in the best interest of safety in any respect whatsoever.

It’s a tactic that has been used by automakers for decades, going back to before probably any of us were born. Safety gets better when there’s full transparency about the reasons behind deadly crashes, and Tesla is committed to the opposite.

And for that, I’m gonna give them my Gaslight of the Week.

Anthony: That is a powerful ending. I don’t know. Okay, so Fred chooses Tesla, Michael chooses Tesla. I’m not choosing Tesla; I’m choosing Elon Musk. Quoting from Teslarati, for all your Tesla fanboy needs: after a video of someone playing Grand Theft Auto in their Cybertruck while operating Full Self-Driving was shared on the social media platform X, Musk said this capability would be available in "probably three to six months," depending on regulatory approval in your city and state.

No. This is pure bullshit, and dangerous. If you do this, you are gonna die. You’re playing Russian roulette, not just with your life, but with everybody around you. That’s my gaslight.

Michael: And I like that one, because Elon is adding "pending regulatory approval" to everything he says now.

And in fact, there are no states that would ever approve someone playing a video game while they’re driving a vehicle, and certainly not in three to six months. So that is one of the most out-there statements I’ve seen, even for him.

Anthony: So I’m gonna change things up in terms of scoring this week.

Fred, you didn’t win this week. I’m sorry, it’s nothing personal. But I’m gonna let you decide who won: Michael or I.

He’s muted himself. He’s cursing; I don’t hear him saying words. He’s still muted. Why are you muted? Am I the only one who can hear? Okay. No, I can’t hear. Okay, great. Michael is celebrating; I was like, wait, what’s going on here? Okay, so Fred, you get to decide who won this week’s gaslight.

Fred: Me.

Anthony: Ah, no, that’s not...

Fred: No. The reason it’s me is because I went into a different dimension and referenced an actual test, which showed that the claims of safety associated with the Tesla are completely unsupportable based upon the physical evidence. So therefore, I am the winner this week, whereas you guys are talking about legal mumbo jumbo and stuff that might happen in the future, right?

Michael: There you go. In six months, Anthony’s gonna write a program that automatically deletes your win from the podcast.

Anthony: No, Fred actually won this week, because he’s learned how to score this game. You win? You score it? You win.

Well done, Fred Perkins.

Fred: I’m taking some lessons from the management of the federal government right now.

Anthony: Oh boy.

Fred: Just, if you don’t like the data, you get rid of it.

Anthony: Ladies and gentlemen, I know you’re like, is it more Tesla? It is. But first: if you go to autosafety.org and click on donate, we’ll keep talking about Tesla as long as they’re dangerous.

That’s right: autosafety.org, click on donate. Do it once, do it twice, do it thrice.

Tesla Shareholders Sue Over Autonomy Claims

Anthony: Anyway, from the Guardian: shareholders accused Musk and his electric vehicle maker of repeatedly overstating the effectiveness of, and prospects for, their autonomous driving technology, inflating Tesla’s financial prospects and stock price.

This is a proposed class action lawsuit from Tesla shareholders against Tesla. Ah, that’s great. I think this is amazing. There are even Tesla shareholders going, wait a second, you’re gaslighting us.

Has he been lying to us the whole time? Did we buy into a scam? Is this Beanie Babies? Should I put all my money into crypto?

Michael: Yeah. If you wanna see how a giant line of bullshit develops over time, you can go read that complaint, because they’ve tracked it pretty well. Essentially, it’s a lawsuit saying: as investors, we’re being lied to about autonomy by Tesla. And I couldn’t agree more.

Anthony: Alright, I’m gonna switch into a little darker dimension. This is an article that just came out this morning that we glanced at. Are we ready for this? Sure. First of all, I’m gonna quote from TheStreet.com, a little puff piece, and the only part I’m gonna quote says: thanks to the months of safe testing, Waymo One users in San Francisco get the added privacy of having a truly autonomous riding experience without another human present. I’ve talked repeatedly on the show about how they keep trying to sell this: oh, without a human driver, we don’t need a human driver, it’s private. And they always give the example of someone who’s elderly and can’t get around.

And I’m like, you kinda want somebody to help them get around: in and out of the vehicle, help with groceries and packages.

And I’m like you kinda want somebody to help them get around, like in and out of the vehicle, help with groceries packages.

Uber’s Sexual Assault Problem

Anthony: But unfortunately, this morning in the New York Times, they have an article titled "Uber’s Festering Sexual Assault Problem." Quoting from it: Uber received a report of sexual assault or sexual misconduct in the United States almost every eight minutes on average between 2017 and 2022, sealed court records show, a level far more pervasive than what the company has disclosed. While we were on the show, I was trying to do some math to figure out: is this an all-taxi, ride-hailing-service thing, or is this an Uber problem?

From the data I could pull, it looked like in 2015 or 2016, New York City cabs had roughly 146 million taxi rides that year, and 21,000 complaints of sexual harassment. And these consisted of things like saying, hey, I like the way you look. Not assault.

And New York City put out these massive fines, saying: you do that, that’s a thousand dollars, that’s points on your license, a 30-day suspension of your taxi cab license. Because in New York, to drive a cab you need a hack license. It’s a job; you need to pay for it. You can’t just say, hey, I drive for Uber, which seems a little more lax.

Which it seems a little more lax there. So that 21,000 complaints out of. Hundred 46. Oh no. 101 hundred 46 million, sorry. 146 million taxi cab rides, 21,000 complaints. That’s roughly 0.0143%. So very low percentage wise. Still very horrific. You’re talking 21,000 of [00:35:00] these complaints. It seems that Uber is at a much higher rate than just the standard.

Michael: I know Uber does about 11.3 billion rides per year. Oh, maybe it’s not that. I would expect that the level of creepy assholes doing things to women, or possibly men, in vehicles is rather steady, regardless of who they’re operating for. Although maybe some companies, maybe Lyft, screen people better. But if someone doesn’t come in with a criminal record of sexual assault in some form, there’s really no way to screen for creepy. And a lot of these complaints are creeps, but some of them are physical sexual assaults. And to me, this is one of the very few arguments for autonomous vehicles out there: avoiding creepy drivers.

Anthony: But the Uber stat you said, the billions of rides, that’s globally.

Michael: Yeah, it probably is. I’m not sure how much of it is America.

Anthony: Yeah, I can’t figure that one out.

Michael: We should probably dig into this and talk about it once we’ve had time to digest it. I just can’t imagine that the people driving for Uber are any more or less inclined to be creeps, or worse, than people driving for any other company.

Anthony: Yeah. I’m curious; we gotta dig into the data there to figure out what the background check on getting a license is.

Michael: But I think that the real issue here is that if you’re a rider in an Uber, you’ve signed a user agreement that typically will include a forced arbitration clause.

So if you are sexually assaulted in an Uber by someone who Uber didn’t appropriately check out, you’re not gonna be able to just take Uber to court for that. You’re gonna have to go through the paid arbitrator from Uber, and who knows what their decision is gonna be. I certainly don’t want my judge being paid by the company I’m suing, and I think most folks would agree.

Fred: And hearkening back to the New Jersey court case: you can lose your ability to sue Uber if you’ve ever ordered a bag of potato chips over Uber Eats, at least in New Jersey. I don’t think that’s a nationwide standard, but legally, you’re putting yourself in tremendous jeopardy when you’re using Uber, by signing and agreeing to a user agreement with forced or compulsory arbitration. Michael, you’re the lawyer here.

Michael: Yeah. Uber is saying that they give survivors of these events a choice: to mediate, to arbitrate, or to go to open court. They’re not providing any evidence of that on a case-by-case basis; I’d be interested to see it. But ultimately, I think the overarching point of this article in the New York Times is that these are numbers we’ve never seen before.

These are incredibly high numbers, and Uber has been actively trying to prevent this information from reaching the public until it was uncovered in this lawsuit, which sounds very similar to other corporate behaviors we’ve been discussing today.

Anthony: I don’t use Uber; I use a different ride-sharing app on my phone. And if the driver deviates from the route that’s supposed to take me somewhere, I immediately get pinged saying, is everything okay? We’ve noticed you’ve deviated from the course; let us know if you need help. Does Uber do that? Do either of you know?

Fred: The article noted that Uber does do that. But in one of the cases they cited, the passenger was drunk and passed out in the backseat, so the pings went unnoticed. She woke up in a terrible situation. Horrible.

Michael: I don’t understand it. If I was running Uber, in that insane world where I’m running a company of that scale, I would require a system like the insurance companies have, where they’re monitoring your drives, possibly even with video, so that as a company I have video evidence of what’s going on under my watch.

Now, that might violate some of the other things that Uber’s trying to keep in place, like the status of these drivers as independent contractors, a relationship they’ve been really pressed to preserve versus the drivers being considered employees. I guess the camera would add a level of monitoring that might tend to make someone think that they were employees.

But a driver who knows that their behavior is being video recorded and monitored in that way is far less likely to commit that type of act. It’s not going to eliminate it, but I could certainly see it reducing those numbers significantly. And it’s something that Uber could probably implement rather cheaply, but they don’t want to, because it would probably threaten their ability to treat these folks as independent contractors. So in the end, it’s all about money.

Anthony: All right, listeners, we’ll try and dig more into this issue and bring some more insight on future episodes. Now I’m gonna jump to a different story before we jump into the Tao.

Volvo Recall and Software Issues

Anthony: This is about a Volvo recall, and we’ve talked about things like this happening before. A guy named Peter Rothschild was driving home down a steep single-lane road in Northern California when suddenly his Volvo started accelerating out of control and his brakes wouldn’t respond.

And this was because his car had a recall and they made a fix, I believe to a rear camera problem.

Michael: Yeah, that was it.

Anthony: And that fix introduced a problem that broke his brakes. We’ve talked repeatedly about how it seems the auto industry hasn’t learned the basics of software engineering, regression testing, anything like that, ’cause it goes against their bottom line. Thankfully this guy survived, and he owns a number of Volvos.

Michael: Yeah, this is a weird one. We talked about this a couple of weeks ago, after we saw the recall and the warning from NHTSA. But you get a little more information here, and it is an odd sort of defect, right?

They did a software recall repair that was supposed to fix a problem with the rearview cameras. And I suppose it fixed the rearview camera problem, but it created another, separate defect in the regenerative braking system: if you’re coasting down a hill using regenerative braking for at least one minute and 40 seconds, you lose all braking functionality without any warning.

So it shows you some of the odd things that can pop up in software, things that we never really saw in the past, right? Traditionally, we don’t see a repair to an axle affecting your rearview mirror, or things like that. Completely unrelated safety systems in the car are not going to be impacted by repairs to another system. Software-defined vehicles change that game enormously, and it is a big reason why companies need to validate these remedies before they put them into the real world.
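The kind of validation Michael is asking for is, at minimum, ordinary regression testing: after an over-the-air fix to one module, re-run the safety assertions for every other module. A toy sketch follows; the 100-second threshold comes from the reported defect, while the vehicle model and function names are invented.

```python
# Toy regression test illustrating the point; the vehicle model is invented.
# Reported failure mode: >= 1 min 40 s of regenerative-braking coasting
# downhill leaves the car with no brakes until a power cycle.

def brakes_respond(camera_fw: str, regen_coast_seconds: float) -> bool:
    """Invented stand-in for a hardware-in-the-loop braking check."""
    if camera_fw == "rear-camera-fix-1.1" and regen_coast_seconds >= 100:
        return False  # the regression the recall fix introduced
    return True

def test_camera_fix_does_not_break_braking():
    # Re-run braking checks against the *new* firmware across the envelope.
    for coast in (0, 30, 99, 100, 600):
        assert brakes_respond("rear-camera-fix-1.1", coast), \
            f"brakes unavailable after {coast}s of regen coasting"

# Run with pytest: this fails at coast=100, flagging the defect before the
# update ever reaches a customer's car.
```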

Anthony: Fred, what’s the over-under on these companies validating these things before they put ’em in the real world?

Fred: There’s a mathematical impossibility of these companies validating software for all possible combinations of failures in the safety-critical systems. It just cannot happen; it would take too many universe lifetimes to get it all done. The complexity of the systems and the number of safety-critical features that are in software simply make it impossible to check them all.

So this is a problem that will never be fixed unless somebody figures out a way to make these control systems work safely with far fewer safety-critical functions, and I don’t think that’s gonna happen.
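Fred’s impossibility claim is easy to quantify. Even under absurdly generous assumptions (the counts below are invented for illustration), exhaustive fault-combination testing doesn’t fit in the lifetime of the universe:

```python
# Quantifying the combinatorial explosion; all inputs are illustrative.
N = 100                       # safety-critical software functions (modest)
combos = 2 ** N               # each merely "working" or "failed"
tests_per_second = 1_000_000  # wildly optimistic validation throughput
seconds_per_year = 31_557_600

years = combos / tests_per_second / seconds_per_year
print(f"{combos:.2e} fault combinations -> {years:.2e} years of testing")
# ~1.27e30 combinations -> ~4.0e16 years, roughly three million times the
# age of the universe. And two states per function understates reality.
```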

Michael: Is there no way to isolate systems, isolating safety systems, for instance? When we talk about these rearview camera recalls interacting with infotainment system bugs, I constantly harp on infotainment creating safety problems. Is there no way within the vehicles to air gap the regenerative braking module from the rearview camera module, to ensure that repairs to one don’t affect the other?

Fred: Oh, sure. That’s just basic software design, and dumbass engineers who configure it so that unrelated systems can interact catastrophically.

But I was talking about something else: even behind a firewall that you’ve put in place, you’ve got all these different safety systems working with each other, and they all have to work. If you wanted to validate them, you would have to inject faults into the various software modules, see what happens, and then test them on a track.

That’s just never gonna happen. Nobody’s got the time and money to do that even if they wanted to; it would absorb an infinite amount of money.

US Road Safety Crisis

Fred: So, as I will talk about in the Tao, there’s just a real disregard for safety inherent in these self-driving vehicles.

Anthony: Before we get into the Tao: because your Tao is gonna be on road deaths, I’m gonna do a positive spin first. We have an article from Jalopnik, and I’ll just quote from it: the US is the only developed country in the world where roads have gotten more dangerous in recent years, not less.

Finland, for example, just went an entire year without a single road death in the capital of Helsinki. That is amazing. How do they do it? And they’ve apparently not covered all Helsinki residents in giant piles of foam so they can bounce like ping pong balls off of cars. They’ve just made some basic, simple changes that we can do too.

Michael: We could, if Americans were collectively a little more willing to accept lower speed limits, smarter road designs, public transportation. Those are a lot of the reasons this succeeds in Helsinki. We’ve also talked about Norway, and places like Hoboken, where they’ve been able to get lower speed limits through, and that’s a huge part of it.

And in Europe now, we’re seeing required intelligent speed assistance going into vehicles, and things like that. But also, obviously, public transportation is far better in most European cities than it is in most places in America. And we’re still way behind on road designs.

Our infrastructure needs a lot of help. We need a lot of help with protecting bike lanes and preventing interactions between pedestrians and other vulnerable road users and the heavy vehicles that are riding on the roads. We need to separate them. That makes a lot of sense, but it costs a lot of money, and it’s hard to get programs like that through when communities have other pressing issues that are sucking up their resources, and when federal funding for transportation and infrastructure is in jeopardy, like it is at the moment.

So it’s going to be very tough for America to get to where we need to be on safety if there aren’t proper funds for upgrading our infrastructure, and if Americans can’t accept the fact that they need to leave their house on time and not speed to get to where they’re going. Speeding plays a major role in this. And this whole notion that people have a freedom to drive and a freedom to speed, despite it being a privilege, needs to be eliminated, and people need to be smarter.

Anthony: You heard it here first, folks: people need to be smarter. They do. I love it; that’s a good solution to this problem. Now, Fred, let’s jump into your Tao.

Fred: The Tao this week is in memory of the atomic bomb attack on Hiroshima, 80 years ago today.

Anthony: Oh my god. How uplifting.

Fred: Yeah, it’s cheerful, right? But there it is; the fact is that it happened. Quoting from one source that I looked at: the United States military estimated that around 70,000 people died at Hiroshima.

Later independent estimates argued that the actual number was 140,000 dead. In both cases, the majority of the deaths occurred on the day of the bombing itself, with nearly all of them taking place by the end of 1945. There are still people dying from the effects of radiation, but that’s not the point I wanted to make.

The point I wanted to make is that every two or three, maybe four years, the United States accumulates deaths from traffic equal to the deaths associated with the atomic bombing of Hiroshima; at roughly 40,000 road deaths a year, three and a half years gets you to about 140,000, the higher Hiroshima estimate. Now, if people know one fact about World War II, that’s probably it, right? That the atomic bombing ended the war. It was a horrific war, and it was a horrific event. And without regard for whether or not it had to happen, it did happen. My question, or the thought experiment, is this: if we had an atomic bombing every three years in the United States that killed that many people, what would we do?

What would we do about the source of that atomic bomb? It’s highly likely that we would marshal all of the nation’s resources to defend against those horrific deaths. But we have the exact equivalent of those deaths in the lackadaisical approach to highway safety that’s been in place in the United States for a long time.

As Michael just said, or maybe Anthony, maybe you said it: this is the only developed country that is experiencing increasing deaths annually from road hazards rather than decreasing deaths. Our trajectory is headed in the wrong direction, and I think it’s appalling that the scale of these deaths isn’t motivating government action to defend us against this industry, to defend us against this slaughter that’s occurring on the roads.

Think of it, folks: this is the equivalent of an atomic bombing every three years. We had an attack from a foreign entity on September 11th, 2001, that launched a war that lasted over 20 years. The number of people who died in those attacks was horrible, but it was a small fraction of the number dying every year, and in fact every couple of months, in the United States from the kind of hazards that we accept.

And how do we treat it? We treat GM’s bad decisions with deferred prosecution agreements, right? We argue it down to the point where they pay no fine, and they are not prosecuted as criminals for defects in the cars that they built knowing full well that they were defective, knowing full well that they were going to cause a large number of deaths in the United States. And we’re seeing evidence now that they knew full well they were allowing people to operate dangerously in their vehicles, even though the vehicles themselves knew that they were operating outside of their approved operating regime, that it was hazardous, and people are dying as a result.

Yet we let it go, and it continues. Note that the court in Florida probably didn’t have the authority to, and certainly did not, force Tesla to fix the damn problem. So we don’t know if any fix has gone into later releases of the software that would keep people from dying from known defects in that software.

Instead, we find a defensive crouch from a company that just does everything to increase the profits and the profitability of the company. I’m not sure what the solution is in simple terms, but in general, we’ve gotta stop treating people as disposable commodities at the behest of stockholders in companies producing this dangerous industrial machinery.

We need to be aware of that, and frankly, we need to do something. I have children; I have people in my family that I love. I suspect many of the people in these companies do also. Why is it that the history in the United States of treating people as disposable commodities wins out over the idea that we can get rid of these deaths that are occurring in huge numbers?

Remember: an atomic bombing every three or four years. Is that something we would accept in the United States? I don’t think so. But if it’s put in terms of a car or a vehicle or an EV or an AV or something with a technological luster, it becomes perfectly acceptable. So, end of rant for this week.

Thank you.

Anthony: I think we should take over the National Mall and just put forty-something thousand tombstones on it, to highlight that this is how many people die every year from this. That’s an effective form of protest. And then the following week, the people marking the 40,000 killed every year by guns can do the same thing.

Michael: I think we should mandate real intelligent speed assistance, which would be in-car speed enforcement, and get vehicle-to-vehicle and vehicle-to-infrastructure communications up and running, so that crashes can be prevented before they occur.

Anthony: Alright, that’s a good solution as well. But yeah, Fred, if it all happened at once, maybe people would notice. Even then, though, I look at school shootings, and people just go, ah, now’s not the time to talk about it.

Fred: Yeah. Thoughts and prayers instead of action.

Anthony: Exactly. And that’s okay.

Aurora’s Nighttime Autonomous Trucking

Anthony: ’Cause Aurora is gonna start doing driverless semi-truck driving at night. That’s right: Aurora’s expansion includes growing its driverless fleet to three trucks and surpassing 20,000 driverless miles at the end of June. Ooh, 20,000 driverless miles of an 18-wheeler coming at you. Aurora notes that the unlocking of nighttime autonomous operations can also improve road safety; this could have been a gaslight. It cited a 2021 Federal Motor Carrier Safety Administration report on large truck and bus crashes that noted a disproportionate 37% of fatal crashes involving large trucks occurred at night, and this comes despite trucks traveling fewer miles during those hours. Get rid of those humans, because they’re sleepy.

Michael: That’s par for the course. Most fatal crashes occur at night, despite nighttime hours over the course of the year being, I think, fewer than daytime hours. I think it comes down to poor visibility, and to some extent the fact that people tend to use substances that impair their driving abilities later in the day.

Ultimately, if you’re going to be operating a trucking fleet autonomously, you’re going to have to do it at night as well as during the day, and that’s what Aurora’s headed towards. I think we’ve talked about how they had to put a safety driver back in the truck, which of course we are fans of, because it certainly helps keep things a little safer when you’re not completely relying on the autonomous computer that’s in the vehicle.

Because it certainly helps keep things a little safer when you’re not completely relying on the autonomous computer that’s in the vehicle. But. This is, yet another announcement from an autonomous vehicle company that they’re doing something new that’s going to have a [00:56:00] great outcome and blah, blah, blah, blah, blah.

So not too much new to see here, although, I guess it’s not, it’s notable that auroras finally moving into nighttime operations, which proved to be a little more difficult for autonomous vehicles because a lot of the cameras that they use aren’t gonna work as well at night.

And, the challenges at night with human drivers around you, who may be in all sorts of states of consciousness are a little tougher.

Fred: It’s worth reminding people that there is no data, none, zero, zilch, nada, showing that autonomous vehicles improve highway safety. There is just no data. This is just an aspirational claim, and whenever you see it associated with an announcement of AV capability, it’s purely bullshit. There is no data.

Anthony: Purely Bullshit: the name of our other podcast.

Ford and Kia Recalls

Anthony: Now let’s jump into recalls. Who’s first up? Ford? Yeah, Ford, keeping its winning streak going: 312,000-plus vehicles, the 2025 Lincoln Navigator, 2025 Ford Ranger, Ford Bronco, Ford Expedition and Ford F-150.

The electronic brake booster module may enter a faulted state while driving, resulting in a loss of brake boost until the electronic brake booster module completes a sleep cycle when the vehicle is off. So this is literally unplug it and plug it back in to fix the situation? Is that what it is? That’s what it sounds like: you’ve gotta complete a sleep cycle, meaning you turn the vehicle off.

Michael: And turn it back on, I’m sure. Okay, so there’s a software update that’s coming; that’s the actual remedy for the problem. And this one’s a little bit scary, because the automatic emergency braking system in these vehicles relies on the brake booster to perform the braking operation when the human’s not braking. Functionally, if you have this problem in your vehicle, you’re also not going to be driving with automatic emergency braking. So all of you owners who have these vehicles should get this in and have it fixed immediately. And you should be hearing about it in about two weeks, August 25th.
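The failure mode, as described, behaves like a latched fault that only a power-down clears. A minimal sketch, with invented states and transitions; this is not Ford’s implementation:

```python
# Sketch of a latched-fault state machine like the one the recall describes.
from enum import Enum, auto

class BoosterState(Enum):
    NORMAL = auto()
    FAULTED = auto()  # no brake boost; AEB that depends on it is lost too

class BrakeBooster:
    def __init__(self) -> None:
        self.state = BoosterState.NORMAL

    def fault(self) -> None:
        self.state = BoosterState.FAULTED  # latches while driving

    def sleep_cycle(self) -> None:
        # The only path back to NORMAL: vehicle off, module completes sleep.
        self.state = BoosterState.NORMAL

booster = BrakeBooster()
booster.fault()
print(booster.state)   # BoosterState.FAULTED, and it stays latched mid-drive
booster.sleep_cycle()  # the "unplug it and plug it back in" step
print(booster.state)   # BoosterState.NORMAL
```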

Anthony: Alright, next up on the countdown: Jim Farley’s favorite auto company, Ford. 56,073 vehicles: the 2025 Ford F-150 hybrid, where a missing audible seatbelt chime can result in decreased seatbelt usage. People, you don’t need a chime to remind you to put on your seatbelt. Just put on your seatbelt. Click it or ticket, as they say.

Michael: Yeah, and you know what? Even if you aren’t as smart as Anthony, you should know that you need to put your seatbelt on when you get in the car; I think most of us know that. But there are some people who might not do it if the car doesn’t annoy the hell out of them with some type of chime. So for them, we want this recall to occur.

This is also a violation of the Federal Motor Vehicle Safety Standard that requires the warning. It looks like notification to dealers should have already happened, and owners are gonna receive something on the 25th of this month. You could probably go ahead and call your dealer and schedule an appointment.

Anthony: Yeah, the owners will be notified by text message: ding, ding. Next up, Kia: 100,000-plus vehicles, the 2023 to 2025 Kia K5. Ooh: due to a supplier quality issue, the C-pillar garnish face plate may progressively delaminate and become loose from the base of the molding.

The face plate may eventually fall off. What is a...

Michael: This recall, and the next recall on the list, are both Kia recalls for,

Anthony: Oh, did you ruin the surprise? I don’t know what’s next.

Michael: essentially, parts falling off the vehicle. We’ve seen trim detachments and things like this before, and this one is a C-pillar garnish face plate that delaminates, becomes unglued, gets loose, and flies off the vehicle, where it’s a risk to other people on the road and to vulnerable road users who might be in the area.

All sorts of people could be hit by this thing flying off your car. The interesting thing is that, between the two different recalls occurring here, I think Kia had 12,000 of these failures reported before they even decided to do the recall. That suggests there was possibly some pressure from NHTSA to make sure they went through with it.

That is a lot of failures: in a population of a hundred thousand vehicles, more than one out of every ten had already failed before they did the recall. That’s a very high failure rate. And owners are gonna hear about this later in September.

Anthony: Yeah, if you’ve listened this far, you might have also become unglued. But we only have one recall left from Kia, as Michael ruined the surprise: 201,149 vehicles, the 2023 to 2025 Kia Telluride. And that’s it: insufficient application of the adhesive layer between the door belt molding face plate and the base of the molding. Yeah, this is more of just things falling off the car.

Michael: Yep, same thing, and similar for owners: expect to hear about this later in September. This one is a door belt molding assembly that’s going to be replaced, and you’re gonna get a new adhesive layer to stick it onto your car. Hopefully it won’t come off this time.

Anthony: It’s just gonna be some bubblegum.

Conclusion and Call to Action

Anthony: And with that, listeners, thank you so much for sticking around. We know this one went long; your love for us is... I’ll stop. Autosafety.org, click on donate, give us five stars, tell all your friends, like, subscribe, hashtag at something.

Bye-bye. Thanks everybody.

Fred: Bye-bye.

For more information, visit www.autosafety.org.