Robo Rides and Recall Realities
Big thank you to Janette Fennell from Kids and Cars Safety for sending us some cool swag.
This week's topics include:
- Tesla’s controversial full self-driving claims;
- Regulatory issues surrounding autonomous vehicles;
- Self-certification in vehicle crash testing;
- A Fisker recall update;
- When/why do vehicles get crash tested;
- The New York legal loopholes in drug-impaired driving;
- Recalls and more…
Links:
- https://www.iihs.org/news/detail/fewer-drivers-are-opting-out-of-lane-departure-prevention
- https://www.reuters.com/business/autos-transportation/uber-lyft-drivers-use-teslas-makeshift-robotaxis-raising-safety-concerns-2024-10-03/
- https://electrek.co/2024/10/02/elon-musk-celebrates-winning-lawsuit-tesla-self-driving-claims-embarrassing-defense/
- https://www.consumerreports.org/cars/car-safety/some-cars-will-never-be-crash-tested-crash-test-ratings-a9250800738/
- https://philkoopman.substack.com/p/remote-assistants-for-autonomous
- https://techcrunch.com/2024/10/04/gm-is-working-on-an-eyes-off-hands-off-driving-system/
- https://www.msn.com/en-us/news/us/prosecutors-federal-officials-push-to-close-loophole-in-new-york-law-that-allows-drugged-driving/ar-AA1rO5zj?ocid=BingNewsVerp
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V720-8602.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V729-8495.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24V718-2751.PDF
- https://static.nhtsa.gov/odi/rcl/2024/RCLRPT-24E081-6474.PDF
Subscribe using your favorite podcast service:
Transcript
Note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.
[00:00:00] Introduction and Podcast Anecdotes
[00:00:00] Anthony: You’re listening to There Auto Be A Law, the Center for Auto Safety podcast, with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer. Hello,
my fellow patriots. Wait, wrong podcast. Sorry. Hey guys, do you ever have one of those days where you get an alert on your calendar and it’s, hey, the podcast starts in 15 minutes, and you’re like, oh shit, I don’t have any pants on? Ah, because that’s how the day started off for me. But, know what started off better for me?
Fan of the show, former guest on the show, Janette Fennell. I know I just mangled that last name, I’m so sorry. Oh my god. Anyway, she sent us this package, I assume you guys got this as well? Yes. Yes. Okay, so [00:01:00] we got this awesome letter from her. It said, to my favorite podcasters, I hope this note finds you deep in an epic debate on how to seamlessly weave Piggly Wiggly references into your next podcast episode.
To aid you in your quest for the ultimate shopping experience, I’m sending each of you a Piggly Wiggly reusable shopping bag. Think of it as your official podcasting accessory for those emergency grocery runs. Because who doesn’t need more snacks during recording? This is awesome. So we get a piggly wiggly shopping bag, but more cool than that.
We get this kidsandcars.org window breaker and seatbelt cutter. It is so cool. And how-to-escape-from-a-sinking-vehicle-in-one-minute directions on that. This is the coolest thing in the world. Thank you, Janette. Go to kidsandcars.org and support them. And I never say to support any other group, but they’re awesome.
That’s the coolest thing, but I have to point out that none of us should be having snacks during [00:02:00] recording, because it just doesn’t sound good.
[00:02:02] Fred: That’s a good point. And Jeanette, we love you, but you are our second favorite podcast fan, behind my sister. Let’s be real.
[00:02:13] Anthony: But anyway, none of this, wait, that actually does have stuff to do with auto safety, but let’s get into the deep dive.
The beginning. Are we ready for it?
[00:02:23] Elon Musk’s Robo Taxi Announcement
[00:02:23] Anthony: Let’s see, by the time you’re listening to this, this will come out on Thursday, October 10th. A day that will live in infamy. This is the day that Elon “I’m so full of shit I can’t see straight” Musk is going to unveil his robotaxi. I don’t think it’s going to be a real thing.
I think it’s going to be a guy dressed in a robot costume, crawling around on all fours. No, it’s going to be a taxi that they’ve had mapping out the Warner Brothers lot in Hollywood so that anybody could drive that area. Who knows what happens there? [00:03:00] Maybe he’s not full of shit.
Pardon me,
[00:03:05] Michael: but go ahead. We don’t really know what’s going to happen Thursday, but I’m expecting a lot of pomp and a big show. They’re putting out this robotaxi, or they’re saying they’re going to, and there’s not a lot of specs on the thing.
We don’t know if it’s going to have LIDAR. We don’t know if they’re still continuing to try to promote the fallacy that a vision-only system is going to work in autonomy. So really the only thing I’m looking for on Thursday out of that whole mess is: is Tesla finally going to make some type of admission that a vision-only system is not good enough?
And that they’re going to have to put LIDAR on the robotaxi? I don’t know if they’re ever going to say that, because it undermines what they’ve been telling the owners of all their other vehicles for the last decade. Ultimately I expect a lot of robots, and there’s a reason they’re holding [00:04:00] it at a major movie studio.
And that’s because it’s mostly fiction.
[00:04:05] Fred: You’ve got to admire his inventiveness, though, because most of the people that we honor with bullshit representations are strictly one-dimensional. He is really the master of multidimensional bullshit. He’s built an entire five-dimensional empire and created great wealth on the basis of multidimensional bullshit.
It’s like you’ve gone from Excel spreadsheets to MATLAB multidimensional calculations in one fell swoop. Hat’s off to him. My, my MAGA hat, no, my, my hat is off to him.
[00:04:44] Anthony: It’s true, Fred is not currently wearing a hat. So why would we make such claims about Elon? Because it’s true.
[00:04:52] Tesla’s Legal Battles and Misleading Claims
[00:04:52] Anthony: Also Elon Musk is celebrating winning a lawsuit over his misleading claims regarding Tesla’s self driving program.
[00:05:00] So I’m going to jump down to the fun part of this. Basically his lawyers say no reasonable investor would rely on the many misleading statements from Elon Musk around self-driving because they’re mere puffery. They are “vague, generalized assertions of corporate optimism or statements of mere puffing” and are not actionable material representations under federal securities laws, because no reasonable investor would rely on such statements.
To remind you, Elon Musk says, don’t invest in this company if you don’t think we’ve solved automation and autonomous vehicles. I’m confused.
[00:05:38] Michael: Yeah, it’s, this is a shareholder lawsuit, so it’s not as though there’s a, someone who was injured in a crash who’s bringing a claim over those misleading claims.
There’s, there may be, I think there’s a different legal standard involved here, but ultimately Tesla made the argument that their terms like Autopilot and Full Self-Driving, and this [00:06:00] entire fiction that’s been created that Tesla’s camera-based systems are going to somehow use machine learning to evolve into safe autonomy,
it’s not going to happen. Tesla made the argument that it was just puffery. We’re not actually doing that, we’re just saying we’re doing that. And if we attract billions of dollars in investment money over the course of 10 years doing that, then that’s just a risk
these investors took in the market. So while Tesla, and Musk himself on Twitter, celebrated this as a victory, it’s basically a stamp of approval by the court saying, you’ve been lying to consumers for the last decade. So it’s not really something to be celebrated. And frankly, I don’t think any of us here have a whole lot of sympathy for the shareholders who bought into this, because it’s been pretty obvious for a long time now that these vehicles were never going to be [00:07:00] robotaxis.
And also, you have to question the ruling in some ways. Is calling a system Full Self-Driving, or calling a system Autopilot, really puffery? You’re naming the system something that it’s not. I would say that goes a little further than corporate puffery, which might be protected, in that it’s a falsehood, a deliberate falsehood intended to attract investments, primarily because it doesn’t look like they’re going to be achieving autonomy anytime soon.
[00:07:27] Anthony: Hey, my friend Matt would engage in the same level of puffery on his dating profile. He would list that he was six foot tall, fully employed. Nah, he was like 5’7”, borderline homeless. I didn’t give him any money either. You guys? Fred, you’re leaning in. No, he’s got nothing.
[00:07:47] Fred: I got nothing except I was going to tell you about a date I had once, but maybe that’s too long.
[00:07:52] Anthony: Okay, maybe. We have a link to an article from Reuters that starts off, and this is an amazing sentence.
[00:07:58] Self-Driving Cars and Regulatory Concerns
[00:07:58] Anthony: A self-driving Tesla carrying a [00:08:00] passenger for Uber rammed into an SUV at an intersection in suburban Las Vegas in April, an accident that sparked new concerns that a growing stable of self-styled, quote unquote, robotaxis is exploiting a regulatory gray area in U.S. cities, putting lives at risk. Yeah. I don’t know what else to say about that. We’ve discussed ad nauseam how the regulators are just like, yeah, go for it. Why are we still allowing this? Is there anybody saying, hey, let’s pump the brakes, or is it just people like us and fans of the show?
[00:08:37] Fred: I think attentive listeners will know that one of the problems is that there’s really no set of safety requirements that any of these self-driving vehicles need to subscribe to. Just a reminder: we have helpfully put such a list of requirements on our website, which readers, listeners, can find as our AV Bill of Rights, and that would go a long way towards [00:09:00] solving some of these problems associated with reckless AV use on our highways.
[00:09:06] Michael: Another thing to look at here: if you are using a ride share, Uber, Lyft, whatever you use, and you see a Tesla pull up, you need to tell the driver that it’s not acceptable for them to use Full Self-Driving, and that you’re not going to ride with them if they do, because you’re the one with the choice here.
It looks like this is a burgeoning trend, and there are going to be people in Teslas working with Uber, working with Lyft, and using Tesla Full Self-Driving. And you have a choice in that: either don’t get in the vehicle, or be very clear with the driver that you’re not willing to ride in the vehicle if they’re going to use the Full Self-Driving or Autopilot features.
And another thing, if it’s an Uber and there’s a crash here, you’re not going to have a [00:10:00] claim in court because you’ve agreed to their terms of service, which forced you into arbitration. You’re handing over your safety to Tesla and your potential for, civil justice to Uber if you’re taking that chance.
[00:10:14] Fred: Michael, what’s wrong with compulsory arbitration? Why wouldn’t people want to do that? It seems like an expeditious way to get their problems settled.
[00:10:23] Michael: It’s an expeditious way for corporations to set up kangaroo courts so that they don’t have to pay the legal fees and go through the court system and all of the great things that make the civil justice system in America what it is.
Corporations set up these arbitration schemes in order to make their lives easier, and typically an arbitration is going to result in lower recoveries for plaintiffs, lower recoveries for people who are injured or killed in accidents, and at the [00:11:00] same time you completely lose any chance of punishing bad corporate behavior.
They’re not going to be ruling for punitive damages in arbitration. It’s a great thing for corporations, and it can be a really bad thing for consumers. And so we think that consumers should have the option. If someone wants to go through arbitration lickety-split, that’s their option.
We wouldn’t recommend it, but you should always have the option to go to court if arbitration doesn’t suit you. And it probably shouldn’t.
[00:11:35] Anthony: I wish there was some way for us consumers, when we have one of these contracts of adhesion, where you have to accept it all or nothing, that we could just negotiate the finer points in there.
Cause I remember when I worked at a very large software company, they had this standard contract that all third-party developers would have to sign. And I’m like, really, is Adobe signing this? Is Microsoft signing this? And they’re telling me, no, they’re not signing it. They get their own custom [00:12:00] contracts.
They get a whole bunch of things that the average Joe just won’t get. Maybe we can all band together and become our own corporate citizen, the citizenry of America. I go to autosafety.org, click on donate, and I’ll spend your donation money on coffee and elocution glasses. You know what I’m saying?
No nonsense.
[00:12:23] Fred: Isn’t it?
[00:12:24] Anthony: There’s an idea in here, man.
[00:12:27] Lane Departure and Speed Limit Technologies
[00:12:27] Anthony: Here’s another idea from IIHS, titled Fewer Drivers Are Opting Out of Lane Departure Prevention. Almost 9 out of 10 drivers of vehicles equipped with lane departure warning and prevention systems now keep them switched on, a new study from IIHS shows.
7 out of 10 drivers of vehicles that give visual alerts when they exceed the speed limit also keep that feature running. I love this, because I always keep the lane departure on even when it’s annoying. If I’m driving, it’s on, my wife’s in the passenger seat, and it’ll go bing at times [00:13:00] when
a lane ends or merges. She hates it. She turns it off immediately when she’s driving. I don’t know why, but I’m a big fan of lane departure warning, even when it’s wrong. You guys? Fans?
[00:13:13] Michael: I don’t actually have it in my vehicle. I don’t have a lot of experience with lane departure warning.
I do sympathize with some of the folks who had the earlier versions of lane departure warning, which apparently were very reliant on audible warnings instead of some of the other methods, haptic alerts, the vibrations in your steering wheel or your seat. And so I think a lot of folks turned off those systems because of that.
And also, I think it takes a lot of time to get adjusted to the fact that your car at some points can steer for you, and to accept that somewhat of a loss of control. The study, I think, came down to two factors for why we’ve seen this improvement in the rate of use.
One of them was that [00:14:00] manufacturers have been migrating the controls for turning the systems off and on into the vehicle menu. So you have to go into a settings menu to deactivate the feature, which is a lot harder than just clicking a button. One thing that may have resulted in more people using it was just the fact that it’s a little harder to turn off.
And then the other thing that has been changed is that a lot of manufacturers have migrated from the audible warnings, which consumers find really annoying, to more of the haptic alerts, the vibrations of the steering wheel or the seat. And apparently drivers find that a lot more acceptable.
And as vehicles come out with more of those haptic-type alerts and fewer of the audible alerts, we see more people willing to keep this feature on and not turn it off.
[00:14:55] Fred: I’m sorry, on my car they’re decoupled. So [00:15:00] you are able to have the lane keeping assist system operating, or have just the warnings operating.
That may be part of this as well, because I don’t like to have the lane keeping system operating, because it’s, well, not very good, and also really annoying. But the warnings themselves are pretty good. They’ll let you know when you’re drifting off and crossing a line. And so the ability to decouple the warnings from the active system that controls your steering
may be part of that evolution as well. So I find that to be a very attractive feature of my car.
[00:15:40] Anthony: What I found fascinating with this was it said seven out of 10 drivers keep the visual alerts on for when they exceed the speed limit. Which is fascinating to me because my car has a camera that will read speed limit signs, and it’s 70 percent accurate.
It misses construction speed zone signs. Can’t read them [00:16:00] at all. If there’s a car that passes in front of its field of view, blocking it, it gets it wrong. So I’m wondering though, I don’t have this in my car, where it’ll warn me for going over the speed limit. So I wonder, how well do these things work?
Have they gotten better? Can they see through that semi truck that’s blocking the speed limit sign? Like, how are they doing it? Any ideas?
[00:16:21] Michael: I think there’s a couple of ways to do it. There are cameras that recognize speed limit signs, and a better way, I think, is mapping, where functionally your vehicle knows the speed limit wherever you are at any one time using GPS.
[00:16:38] Fred: Yeah, my car uses mapping exclusively. So it doesn’t read the signs. I know that because it changes without having any signs available.
[00:16:48] Anthony: Because I’ll use Google Maps, and I know it’ll change, but even it gets it wrong. The other day I was driving and it said the speed limit was 55, and I’m like, oh, I get to speed up. And then, no, the speed limit’s 25.
Why does it think this?
[00:16:59] Fred: [00:17:00] Sorry. I’ve tested it to see how much it conforms to the actual change of signs, and within about a hundred feet or so is when my car will actually change the notification of the posted speed limit. So it’s pretty good, but it’s not instantly matched to the location of the signs.
[00:17:24] Michael: And you’ll have circumstances like road work, where there’s a temporary speed limit put in place that mapping is not always going to pick up immediately, if at all. So you can’t totally rely on that feature to be accurate all the time.
[00:17:39] Crash Testing and Vehicle Safety Standards
[00:17:39] Anthony: So we’ve all seen the Cybertruck.
We’ve all talked about it, and everybody I mention it to thinks it just looks incredibly dangerous. It’s all sharp edges and sharp angles. This is the reason we removed tail fins from cars. Sharp angles lead to death and dismemberment. And bouffant hairstyles. [00:18:00] But Consumer Reports has this article talking about why some of these cars will never be crash tested, leaving consumers in the dark.
So, reminding listeners that in the United States, auto manufacturers self-certify that their cars are crashworthy. That’s right. They say, yeah, Mr. Government Man, my car is safe. Prove me wrong, bro. Bro, prove me wrong. And I guess that happens sometimes, but this article from Consumer Reports talks about the Jaguar F-Pace, Land Rover Discovery, Porsche Macan, and the Tesla Cybertruck.
How these will never sell enough to warrant testing. But it makes me wonder, the Cybertruck’s already had, what, 13,000 recalls? Thirteen.
[00:18:47] Michael: I just got number five this week.
[00:18:49] Anthony: No, what I mean is, they’ve recalled every car they’ve made and sold. So what’s the limit, what’s the threshold where they’ll crash test the vehicle?
[00:18:59] Michael: You’re [00:19:00] not going to see NHTSA crash testing too many vehicles in NCAP, which is where they’re testing most vehicles. You’re not going to see a lot of luxury cars there simply because NHTSA has a budget and some of these very high end luxury vehicles aren’t going to be bought en masse.
And so to get their best ratings on the largest number of vehicles, it makes a lot of sense for them to be testing the less expensive models. And the same thing goes for the Insurance Institute. They don’t have an unlimited supply of money, and it doesn’t make a lot of sense for them to test vehicles that are going to be produced in low numbers.
They’re going to get the best bang for their buck testing the type of cars that most of us drive. That’s really what it comes down to.
[00:19:47] Anthony: But do we have any idea what the limit is? What’s the point where they’re like, oh, we’ve sold 12,000 of these, we should crash one?
[00:19:56] Michael: I would assume they’ve got a sliding scale. They’re going to [00:20:00] test, I think, one vehicle from, you might see five different types of Honda Accords on the market, with lots of different engine types and other things, but they might only test one of those as an exemplar for the Honda Accords for that year.
So I’m sure that there’s probably a money number that they’re never willing to cross. I don’t think we’re going to see a lot of vehicles that are over $100,000 being tested. Although a lot of, you know, big pickup trucks that are bought in fairly high numbers are reaching that price today.
Like a Ford F-150 with all the specs and all the options and leather seats and your own chinchilla in the back seat. Those can reach very high prices. And so the agency in those situations probably opts to test a lower-range, more economy model
that’s got pleather seats and [00:21:00] doesn’t have all the bells and whistles of the other ones, just to keep their budget straight. It doesn’t make a lot of sense to test the really stacked models.
[00:21:08] Fred: I think there’s another factor at work here, which is that it’s a reasonable assumption that a car company would test all of its cars to certify their crashworthiness the same way.
Since the cars that both IIHS and NHTSA test are picked at random, it would make sense for them to concentrate on cars that come out of manufacturers with lots of models and just test one, and assume that all the other models are being tested by the manufacturer the same way. So that’s another way of economizing, with the assumption that
the manufacturers are certifying all of their different models the same way. It would also be a way to save money and just limit the number of models that they actually have to test. So with Tesla, which doesn’t have a lot of [00:22:00] models, it probably makes sense for them to assume that there aren’t different test procedures for the different models they’ve got.
So testing one would represent all of them.
[00:22:15] Anthony: It’s like, the Cybertruck is such a different vehicle. So I’m wondering, to self-certify this, does Tesla, and every other manufacturer, do they have to take a Cybertruck and a Porsche 911 and smash them into a barrier? Okay. So how do they certify that?
Yeah, we’ve crash tested it, man.
[00:22:34] Michael: And there’s a number of ways. I think they’re just certifying that it would meet federal safety standards. They don’t have to physically demonstrate that to the agency. They just have to have within their files some type of proof that the vehicle meets those standards.
It could even be simulation testing, which I think Fred might get into a little more later in the podcast. But it could be a lot of things. It doesn’t have to [00:23:00] be literally putting a vehicle on a sled and crashing it. There are a lot of ways that they can back up their certification.
[00:23:07] Anthony: So this self-certification basically says, hey, if we were going to crash test it, it would pass. That’s insanity. That seems totally unreasonable.
[00:23:19] Fred: Not totally unreasonable, sorry, I’ve got to differ with that. I’d have to get into the nuts and grits of how simulations are done, but you can validate a simulation and assume that, once you validate the simulation, it’s reliable.
But again, I would have to get into the mind-numbing details of how the simulations are done. So let me just leave that for now.
[00:23:42] Michael: And also, if you haven’t changed the basics of a model for five or six years, often you’ll see runs of vehicles that don’t change a lot within a few years, and you can certainly use your crash testing from year one [00:24:00] to year two
to support similar conclusions for the model years coming after that. So there’s a lot of ways it can be done. Ultimately, I think what these manufacturers are hoping for is that NHTSA doesn’t perform a compliance test, which they do occasionally on vehicles, and discover that they actually fail.
That’s when you see it: NHTSA conducts a crash test, or conducts one of the tests specified in the motor vehicle safety standards, and the vehicle fails. That’s when we typically see a pretty quick recall.
[00:24:32] Anthony: So I think you guys are missing what I’m really trying to get at here. Yeah, I get, previous model years of a Honda Accord from five years ago to now.
Yeah, you crash it, it’s going to be more or less the same. What I’m looking for is video of a Cybertruck being smashed into a wall. Because I don’t believe whatever Tesla’s selling. And hopefully NHTSA will be like, let’s do a compliance test here, a conformance test, or just, we’ll smash it.
Because they claim it’s bulletproof. It’s not. [00:25:00]
[00:25:00] Michael: We almost got that. I almost got that yesterday. I was driving back to Alexandria from D.C. on I-295 going south, and the traffic was incredibly slow up ahead of me. And so when I looked about 200 yards up the road, I saw a Cybertruck prowling the road in front of me. First, there was a pickup truck in front of it,
and I thought the pickup truck driver was screwing around with the Cybertruck, brake-checking him, doing something to slow his roll. And then as things shook out and I got closer, I never caught the Cybertruck, by the way, it jetted off. There was another Tesla riding on, I don’t know if it was Full Self-Driving or Autopilot, it was one of the two, because the person inside the Tesla had pinned a document to their sun visor and had it right in front of their face where they couldn’t see the road, and was reading and just relying on Full Self-Driving or Autopilot, whatever they were [00:26:00] using, to drive them down 295 at
obscenely low speeds. By the way, they were not speeding, but they were in the left lane and slowing up everyone on their way home from work. So it wasn’t the Cybertruck that was the villain there, but it was yet another stupid Tesla technology and a really stupid driver
[00:26:20] Anthony: and their license plate number is
[00:26:23] Michael: We aren’t giving that out. But I’ve already sent, I’ve already sent a, no, I didn’t do anything.
I just talked about it on the podcast.
[00:26:33] Anthony: All right. Last thing on Cybertruck, and then I’m done with it for the day, I think.
[00:26:40] Cybertruck Insurance and Repair Challenges
[00:26:40] Anthony: Rumor has it that GEICO is refusing to insure Cybertrucks now. There’s some guy who posted online saying, hey, I got a letter from GEICO saying, yeah, we’re dropping your coverage. And the theory is that trying to repair these things is just so outrageously expensive,
GEICO is just, yeah, we’re treating your Cybertruck like we treat the [00:27:00] state of Florida. No bueno. But then GEICO is also claiming that, no, we’ll insure it, so who knows? Hey, if you have a Cybertruck and you listen to this podcast, what’s wrong with you? Is this some sort of self-flagellation? But let us know if GEICO will insure you.
[00:27:16] Michael: Yeah, that was a weird one. Because from the letter that consumer received from GEICO, it sounded like they were killing coverage on those types of vehicles, but then GEICO comes out and says, oh no, we’ve got lots of coverage for those Cybertrucks. I don’t know what happened there.
I don’t know if there’s some type of local GEICO office that changed their mind on his specific truck or what happened. But, as we’ve discussed ad nauseam, Cybertrucks are going to cost a lot to keep on the road.
[00:27:45] Fred: I did notice that this Cybertruck was his eighth vehicle. So they’ve got some limits on the number of vehicles they can carry under one policy.
But this person apparently really loves vehicles.
[00:27:58] Anthony: They must have the world’s [00:28:00] biggest driveway. I don’t know. Hey, speaking of a gaslighting-type product, haha, the Cybertruck, let’s go into Gaslight Illumination. Let’s start off with Mr. Fred Perkins. He’s got some good technical reasons for why we’re being gaslit and by whom.
[00:28:16] Fred: Oh, thank you.
[00:28:17] Autonomous Vehicle Safety and Liability Issues
[00:28:17] Fred: This may be a repeat winner, or repeat contender anyway, but there’s a group called the AVSC, which is the Automated Vehicle Safety Consortium. So they start off with bullshit, because they say it’s best practice for autonomous vehicles, yet their best practice contains no representation from the public, no representation from consumer advocates, no representation from anybody who might differ from their perspective.
So I think perhaps it’s misnamed, and maybe it’s Autonomous Vehicle Secures Capital, I’m not sure. But anyway, they issued a document at the [00:29:00] end of November, which was, interestingly, one month after the crash in San Francisco of the Cruise vehicle. The Cruise vehicle that was dragging a woman underneath it.
So one month later, they came out with a document that basically says that’s no problem, because they were talking about the remote assistance available for cars. And what they said in the document, which they want accepted as an industry standard, is that for remote assistance, the remote assistance does not include real-time driving or fallback performance by a remote driver.
Rather, the ADS, the automated driving system, performs the complete DDT and/or fallback. That sounds interesting. Okay. It says even when it’s assisted by a remotely located human, it goes on to [00:30:00] say, it does so without assuming direct control of the vehicle, and the remote assistant cannot perform safety-critical functions. And it goes on to talk about communication system failures for real-time data exchange.
It says low latency is preferred, but not required, for the communications. So basically, this is set up to say that the human being at remote assistance is not responsible. And even if the person could have been responsible, high latency, which means a long time between a request for action and the action being delivered, is okay.
Now, this is exactly what happened in San Francisco, because the vehicle hit the woman, then pulled over to the side of the road. Some time later, several seconds later, the remote assistance person recognized that there was a problem, but this was after the vehicle [00:31:00] itself had done everything in its power to pull over, and in the process dragged the woman underneath it.
So this is all actually set up to make sure that what happened in San Francisco is okay. And it’s completely consistent with industry best practices for remote assistance, and also it indemnifies the subscribers to this from any responsibility for the human being who’s providing remote assistance to the damage that actually occurs by the car if it has a computer driver.
We’ve talked before about AV industry efforts to indemnify their hazardous vehicles and operations at the state level. So states that absorb this, states that endorse this AVSC nonsense, are very far down the road towards saying that there’s no human responsibility and no liability associated with damages to people that are caused by the AVs. And this has been [00:32:00] written into law in the state of Oklahoma:
only claims against the computer itself can be supported in state tort litigation. Apparently they’re moving out and saying, if you want to sue us, even if you can sue us, which is not possible in some states, the only thing you can sue is the actual computer that’s sitting in the car.
And just as a reminder to our listeners, those computers don’t have any bank accounts, and they’re not human beings. Really, this is working very hard to provide a complete, open landscape for the AV people to operate their vehicles without regard to the danger that they present to human beings, and without any recourse for human beings who are injured
to recover damages. So under the innocuous title of AVSC Best Practice for ADS Remote Assistance Use Case, [00:33:00] this is complete bullshit and really a very dangerous document to have out in public without regard for whether there’s any safety basis for it.
[00:33:13] Anthony: Okay. So I’m in some robo taxi.
It runs over 18 children. The families of those children, they can’t sue me, but they can sue the computer? They can’t sue the company that made the cars, the auto manufacturer?
[00:33:30] Fred: And they cannot sue the human being who provided remote assistance, who said, after the fact, whatever happened, happened. And yeah, they can’t sue the company. Basically, this document is set up so that they cannot sue any entity that has money that would allow people to recover damages.
For more information on this, actually, you can go to a Substack article by Phil Koopman. We’ll provide this reference on our website. He goes into a lot more [00:34:00] detail than time available allows right now.
[00:34:03] Anthony: That’s crazy. Because my gaslight is a piece from CNBC entitled The Year of the Robotaxi. This is Waymo, as I’ve mentioned the last couple weeks.
Their PR marketing machine is in full gear, and this is nothing more than an infomercial for Waymo. It’s ridiculous. They don’t bring on a single consumer advocate, a single expert in safety, a single concerned person. They ask people on the street who are like, yeah, I trust computers because they never get distracted, they never drive drunk, my best friend is a computer.
Like, clearly, I hate that “computers never get distracted” thing. Cause yeah, none of us have ever used a computer and watched that little spinning wheel as it thinks and thinks, and you’re like, what happened? I’m just trying to watch this [00:35:00] Waymo infomercial. Why is my computer frozen?
It’s ridiculous. They have people who are like, the most dangerous part of a taxi ride is the driver himself, and if you’ve eliminated that, I feel safer. Don’t leave your house. That’s what I want to say to that person. It’s not the driver of the vehicle. You’re afraid of people.
Okay? Don’t go outside. Stay inside. Or don’t get so drunk in public that you’re like, everybody’s a threat to me. It’s ridiculous.
[00:35:30] Critique of CNBC Robotaxi Infomercial
[00:35:30] Anthony: I’m not even sure if I want to put up a link to this, cause it honestly feels like a Ginsu knife commercial from the eighties.
The music, the production value, it is 100 percent an infomercial, and they gloss over the fact that, hey, robotaxis would be great until there’s a dangerous incident. And then they gloss over how GM Cruise hit a woman. That’s what they say. Hit a woman. Not drove over her and dragged her. And Waymo accidentally ran into a telephone pole, but hey, there’s no problems with these things.
It’s the [00:36:00] future. They bring on some expert who says one day this will be a multi-trillion-dollar industry. And they talk about how many billions and billions Google has spent on Waymo. And they keep spending billions. And I still invite any member of the world to show me the math where any of these will ever break even.
I guess if they gouge the customers dramatically and a five-block robotaxi ride costs you $300. So my gaslight is CNBC’s infomercial BS division. Thank you. Good night. Michael Brooks.
[00:36:39] Michael: All right.
[00:36:40] Gaslighting in Autonomous Vehicle Terminology
[00:36:40] Michael: This week, I’m not exactly sure who’s gaslighting us here, but the quote comes from Dave Richardson, who is Senior Vice President of Software and Services Engineering at GM.
His statement, they’re talking about Super Cruise and how they’re progressing to Level 3 right now. And [00:37:00] he describes Super Cruise as an industry-leading L2 solution for hands-off, eyes-on, and then says, we’re looking aggressively to make that a Level 3 solution where you don’t even have to look at the road anymore.
The gaslighting here is in that terminology: hands off, eyes on; hands off, eyes off. Anyone who has driven a vehicle probably understands, and we have probably talked about dozens of examples of this: when you’re driving down the road, not only do you have to make sure that you are staying in your lane and doing appropriate things, but there is always a chance
of a split-second crash caused by another human, a couch falling onto the road out of a car. I could sit here for probably two hours and list all the scenarios that happen so quickly on the road that you simply cannot respond to [00:38:00] them quickly enough if you’re not focused on the road.
If you’re not looking up, watching the road, watching your environment around you, and you’re not prepared to take control of the vehicle, then that presents a pretty big safety issue. And so I’m very uncomfortable with companies advertising these systems as eyes-off systems. Even at Level 3, if you look at the SAE definitions, which aren’t all that great on their own, the driver has to be ready to take over at any time.
And if that driver’s eyes are not on the road and they’re watching the latest episode of Taylor Swift and Travis Kelce’s Love Island or whatever’s on TV these days, they’re not going to be able to respond appropriately to a driver coming at them the wrong way in the oncoming lane, or a tire coming off of a [00:39:00] vehicle in front of them.
Ultimately, this week I’m nominating this one for a gaslight because I think this is the terminology we need to get away from. We’re encouraging people to overtrust these systems by using language like this. It doesn’t quite rise to the level of calling things Autopilot and Full Self-Driving when they’re not, but leading consumers to believe that they should have their eyes off the road
at any point, when they are ultimately responsible for responding to threats to that vehicle, is just a bad approach. It’s a bad safety approach, and it’s not a great way to educate consumers on how to use these systems. So either Dave Richardson at GM, or just anyone in the industry who is calling these systems hands-off, eyes-off: those would be my gaslight nominations for the week.
Thanks.
[00:39:55] The Hand Grenade Safety Analogy
[00:39:55] Fred: I want to remind our listeners that when you are in your car driving down the [00:40:00] road at highway speeds, the kinetic energy in your car is about equal to the energy of a hand grenade. You’re essentially driving down the road with a hand grenade whose pin has been pulled, and you should pay attention to the car’s operation the way you would if you were walking around with a live hand grenade in your hand.
You should never ever lose that perspective, and you should certainly never ever cede control of that to any device unless you’ve got absolute assurance that the device will pay at least as much attention to the safety of the driving as you would, being aware of that dangerous amount of kinetic energy.
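Fred’s comparison checks out on a back-of-envelope basis. A minimal sketch, assuming a 1,800 kg car at 65 mph and a grenade with roughly 180 g of explosive at a TNT-like energy density of about 4.2 MJ/kg; all of those figures are our assumptions, not numbers from the episode:

```python
# Back-of-envelope comparison; the mass, speed, and energy density
# below are assumed round numbers, not measured values.
def kinetic_energy_j(mass_kg: float, speed_mph: float) -> float:
    """Kinetic energy in joules: KE = 1/2 * m * v^2."""
    v = speed_mph * 0.44704  # mph -> m/s
    return 0.5 * mass_kg * v * v

car_kj = kinetic_energy_j(1800, 65) / 1000   # assumed midsize car at highway speed
grenade_kj = 0.180 * 4.2e6 / 1000            # assumed 180 g fill at ~4.2 MJ/kg

print(f"car at 65 mph:    ~{car_kj:.0f} kJ")
print(f"grenade (approx): ~{grenade_kj:.0f} kJ")
```

Both land in the neighborhood of three-quarters of a megajoule, which is the point of the analogy.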
[00:40:45] Anthony: And that brings us to our new segment called Hand Grenade Safety Education. Our view is that this should be mandatory from kindergarten on. Everyone should know how to safely remove a pin and throw a hand [00:41:00] grenade. Wait, no. No, going back to Michael’s point. So the article that you’re referencing is from TechCrunch, we have a link to it, and Dave Richardson, the Senior Vice President of Software and Services Engineering at GM, he didn’t provide a timeline for when his L3 system might become publicly available.
And he played coy as to exactly how far the company had progressed in this mission. L3, look, I’m not quite full Elon-level bullshit, but we had Kyle working for us for a bit, so I’m kinda just dipping my toe in the bullshit exaggeration waters.
Hey, look, you quoted me in an article. Oh, it’s just a website? I thought it was on TV. What’s going on? Not even a podcast? This is ridiculous. I like podcasts. My favorite is the Center for Auto Safety’s There Auto Be A Law. I regularly go to autosafety.org and click on the donate button. I’m too nervous to fill in my credit card details.
But you shouldn’t be too nervous because everything’s secure. Secure as driving in a car with a real driver. No automated drivers for us. Yeah. Oh boy.
[00:41:59] Listener Mail: Self-Driving Cars Debate
[00:41:59] Anthony: [00:42:00] Here’s a, I’m going to jump into a listener mail piece. I don’t know if you guys saw this, I think you did. We had listener mail from Anthony.
It’s not me, just somebody else. And it was about self-driving cars. He said: there already exist aircraft flying in three dimensions autonomously. That’s the only way flying cars can ever succeed; the general public can’t all get certified as pilots. Cars driving in 2D on existing roads can be made to self-drive safely and better than humans.
Human reaction times are limited to around 300 milliseconds. Self-driving systems can process data and react in microseconds. The automatic systems are not affected by drowsiness, alcohol, drugs, or stupidity. The Tesla cars are very close to full autonomy today. Okay, by Anthony, I mean this was clearly written by Elon.
I’m gonna let Fred take a stab at this, but my quick summation is no. Fred, do you agree with my summation of no?
[00:42:59] Fred: Oh, I always [00:43:00] agree with you, Anthony , I think he said nanoseconds rather than microseconds, but
[00:43:05] Anthony: no, it’s a, it says, he says it’s 300 milliseconds and computers can react in microseconds.
[00:43:10] Fred: Microseconds. Okay. So it’s absolutely true that computers can respond in microseconds. However, a full response probably takes thousands or millions of microseconds. It just depends on how you enumerate it. And if you’ve done any computer programming, one tick of the computer clock allows you to make one step through a program.
These programs have hundreds of thousands or perhaps millions of lines of code, so it’s not an instantaneous response. It’s true that if you are going to have flying cars, not everybody’s going to be able to get a license for the flying cars. I’m not sure why that’s important, though.
[00:43:50] Anthony: And
[00:43:53] Fred: I’m not sure why that is support for the idea that self driving cars can never master [00:44:00] two dimensions safely.
Thank you for the letter, and thank you for the thoughtful questions. I’m not sure that it really holds up under technical scrutiny.
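Fred’s distinction between per-instruction speed and whole-system response can be illustrated with a toy latency budget. The stage names and millisecond figures below are assumptions for illustration only, not measurements of any real automated-driving stack:

```python
# Toy sense-plan-act latency budget (assumed figures). Each CPU
# instruction takes nanoseconds to microseconds, but one full pass
# through the pipeline adds up to hundreds of milliseconds.
pipeline_ms = {
    "camera exposure + frame readout": 33,   # one frame at ~30 fps
    "perception inference": 50,
    "tracking + prediction": 20,
    "motion planning": 30,
    "actuator lag (brake/steer)": 150,
}
total_ms = sum(pipeline_ms.values())
human_reaction_ms = 300  # the figure quoted in the listener letter

print(f"assumed AV cycle: {total_ms} ms")
print(f"human reaction:   {human_reaction_ms} ms")
# Same order of magnitude -- "microseconds" per instruction does not
# mean a microsecond response from the whole system.
```

Under these assumed numbers, the machine’s end-to-end cycle is in the same ballpark as the letter writer’s 300 ms human reaction time, which is Fred’s point.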
[00:44:11] Anthony: But thank you for sending that information. Anyone can send us a letter at contact@autosafety.org. And if I’m feeling like it, I’ll read it on the podcast. But now I think it’s time for
[00:44:23] Fred: you’ve now entered the Tao of
[00:44:25] Michael: Fred.
[00:44:27] Fred: Thank you. Always my favorite part.
[00:44:30] Simulation vs. Reality in Autonomous Vehicles
[00:44:30] Fred: I’m going to talk about micro versus macro and how that relates to simulation failures. There are a lot of people who think that simulations are the equivalent of actual experiments and tests that you can do on cars. That’s not really the case. So let me give you this example
From a reference in the British Isles. Let’s see if you can recognize this. She looked at me and I could see that before too long, I had [00:45:00] fallen in love with her. Sounds familiar? Yeah, there you go. This is definitely a micro attraction. Think of yourself in that situation. You’ve got a population in the room that you can only identify as she and all the cohorts of she, and then you somehow pick out one from that population and you say that, that’s the one that I’m going to fall in love with.
That’s the difference between the micro and the macro. The macro would simply look at all the shes and assume that they’re the same, whereas the micro says, here’s the one, based upon microscopic cues and the alignment of eyes and how you are going to pick this one out. Clearly a simulation that looks at all the shes as equivalent is not going to be appropriate for what you need to do to direct your own life.
Then they go on to say, my heart went boom when I crossed that room. That’s really an [00:46:00] automatic escalation of the risk attached to this one individual, who’s otherwise statistically indistinguishable from background entities, based upon microfactors. Okay, so this is an example of how a human being
will focus in on the important things that are happening, whereas a computer or a simulation will look at the broad background and try to extract information that’s generalized, looking at statistical responses of what those idealized objects might do, etc. A model of a pedestrian in an AV simulation is definitely macro.
You’ve got a car driving down the street. There’s several hundred pedestrians. They can’t pick out the individual characteristics that you as a human being would pick out. Notice when you go to a stop sign and there’s multiple people there, you don’t watch the cars so much as you’re watching the individuals who are driving the cars.
Are they looking at you? Do you have eye [00:47:00] contact? Are the subtle gestures of the head suggesting that they’re going to move one way or another? And Anthony I know you’re in New York, but other people deal with subtleties and driving, but
[00:47:11] Anthony: talking about
[00:47:13] Fred: Slow down in the presence of small children, even if statistically the children aren’t going to run out in front of the car very often.
We still slow down because we recognize the escalation of consequences associated with each individual’s microscopic characteristics, something that computers will never do. And until simulations are able to replicate those micro influences and the responses of the humans based on our deep seated biological and ethical responses, they’re really unsuitable for validation of safety critical or life critical software.
This is why we can never rely on computers to do the validation of safety [00:48:00] criteria. It’s also why we need those safety criteria for cars in the first place, which are lacking. Basically, when simulations are able to replicate both the sources and the responses of real-world micro influences on human driving behavior, they might be ready for a second look.
But please don’t be misled by the fact that a company will say we’ve simulated X billions of miles of driving. Those simulations they’ve got are not like a human being responding to microscopic influences to extract the information they need from a diverse statistical background. That sounds like a lot of word salad, I know; it’s just important for people to remember that you need real data.
You need real responses, and you need real safety standards that are validated in real-world circumstances, before you can rely on these [00:49:00] simulations as evidence of autonomous vehicle safety.
[00:49:05] Anthony: What if we’re living in a simulation now, man?
[00:49:09] Fred: That’s a different kind of problem.
[00:49:11] Anthony: Yeah. I had a perfect robo taxi fail situation the other day.
I’m driving in upstate New York, and I come to a main road. I think it was actually called Main. Lots of traffic, and I come to a stop sign, and I want to cross, but it was one of these things where I got there and I’m like, traffic’s never going to open up. Never gonna happen, because there’s no traffic lights.
I’m just stuck at the crossing, and there’s a stop sign there. And thankfully somebody came along who clearly must have been local, and he’s like, yep, I’ve been in that situation. Stopped, waved me through. Robotaxis would never do that. People from outside of that town wouldn’t do that. I don’t know what the point of this story was, but I’ll say it was that robotaxis wouldn’t handle this.
There we go. Maybe.
[00:49:55] New York’s Drug Driving Loophole
[00:49:55] Anthony: Okay, let’s get on to where my brain really is at. And [00:50:00] there’s some New York State loophole that allows you to do drugs while driving and get away with it, as long as you don’t go ahead and admit to what drugs you’re on. So there was an article from MSN, from CBS New York, about a loophole in New York that allows drugged driving.
They break down a scenario, something that actually happened, where an officer pulls somebody over and says, hey, what are you on? Did a full sobriety test. The person says, nothing, I’m fine, whatever. The officer gives them Narcan, which is an opioid overdose medicine. And they arrest the person, but then a few months later, charges are all dropped.
Which is insane. Michael, am I reading this right? Is this,
[00:50:47] Michael: Yeah, essentially, in the way they summarized it in the article, and this was an issue I wasn’t even aware of: if a driver refuses to name what drug they’re on, and they refuse a [00:51:00] toxicology test, they cannot be charged with drugged driving.
And there’s apparently a list. So this is even weirder, since we know that there are a lot of drugs out there, illegal drugs, and ways for people to get high are being invented all the time. If the drug is not on the list that’s put out by New York State, it doesn’t matter how messed up a driver is, how much drugs they’ve done, they cannot be charged with drugged driving.
And that’s a huge problem and a huge loophole, because essentially anyone who is doing drugs can just not talk to the cop and refuse a field sobriety test, and they’re not going to be charged. That’s a terrible way to address any type of impaired driving. And not only that, it’s a little scary.
I think that this isn’t just New York. This is also Alaska, Florida, and, Fred, Massachusetts. So not my fault. There’s a few states out there [00:52:00] that apparently have similar laws. And it’s a problem. It’s such a problem that Jennifer Homendy, the board chair of the NTSB, is speaking out about it.
And it’s something that I think should be changed in every one of these states immediately in their next legislative session. Because it has to be awful for the families of people who are killed or maimed by impaired drivers to have to figure out that there’s not going to be a punishment for that person.
[00:52:35] Anthony: Because in New York, and I believe in a lot of other states, if you refuse a breathalyzer test, you lose your driver’s license. Yeah. It’s an automatic admission of “I’m drunk,” and we learned that recently in New York State in the case of Justin Timberlake versus alcohol and the Hamptons.
Like he refused a breathalyzer and they’re like, yeah, you don’t have a driver’s license anymore. Have a nice day.
[00:52:58] Michael: Yeah, that’s an example of how really [00:53:00] good lawyers can get you off anything.
[00:53:01] Anthony: Yeah. Oh, he got away with it?
[00:53:04] Michael: Yeah, he did. I think he had to pay a $500 fine or something and didn’t get punished otherwise.
He pled down to something.
[00:53:10] Anthony: Boo.
[00:53:11] Michael: Disappointing. Disappointing.
[00:53:13] Anthony: But you can literally, I can’t understand how this loophole even started. What legislator is like, wait a second, I don’t drink and drive, but needles and driving are fun? Like, how else did this loophole get in there?
[00:53:28] Fred: I’m not sure. I think it goes back to probable cause, doesn’t it, Michael?
If you can’t identify the chemical agent that you think somebody is using, how do you charge them for it? Is it the behavior that you can make illegal or do you have to trace it back to a particular cause?
[00:53:48] Anthony: But like alcohol, the same thing, like you can’t identify hey, wait, did you have an old fashioned or are you drinking a Manhattan?
You were drinking alcohol.
[00:53:55] Fred: Alcohol is easy to chemically identify. There’s A lot of ways to do that. [00:54:00]
[00:54:00] Michael: Also, with alcohol there’s a very distinct connection between your blood alcohol level and impairment that’s similar enough from person to person to be convincing in court. With drugs, marijuana is a great example there.
You can test someone’s saliva or blood or whatever test they have available at this point, but there’s really no good way to connect that to impairment on a scientific level. You can,
[00:54:30] Anthony: I disagree. I think the easiest way to test for marijuana impairment is to offer the driver a bag of Funyuns.
And if you want to eat a Funyun, you are high as a kite. That’s good. That’s a good point.
[00:54:42] Fred: That’s good. Yeah.
[00:54:43] Anthony: Yeah. You’re impaired. Officer goes, Funyun? Yeah, man. Get out of the car. Get out of the car right now.
[00:54:52] Fred: I’ve heard that the number of brownies you can consume is also related to that.
[00:54:57] Anthony: Hey, but look, I’ll eat a brownie sober. Brownies are [00:55:00] delicious. Okay? Funyuns are disgusting, okay? It’s
[00:55:04] Fred: true, they are.
[00:55:05] Anthony: It’s like Cool Ranch Doritos or, yeah, anyway. Hey, let’s jump into recalls.
[00:55:11] Fisker’s Recall Controversy
[00:55:11] Anthony: Actually, wait, before we jump into recalls, do we have any update on our friends at Fisker? And what’s going on where they claimed, hey, the car’s under recall, but you gotta pay for it.
Oh, okay, we’ll pay for the parts, but you gotta pay to have it fixed, man. And Michael Brooks, our dear leader, is like, no. No. No. And it seems that the DOJ’s listening to the show and saying, wait a second.
[00:55:37] Michael: Yeah, that was some really bad lawyering by Fisker who thought they were somehow going to weasel their way out of recalls on their vehicles while they were in the process of bankruptcy.
Essentially, yesterday the Department of Justice, writing for NHTSA, who is the heavy behind all this and who probably referred this to the Department of Justice, objected [00:56:00] to Fisker’s bankruptcy plan. Essentially, they said the scheme that Fisker created around their recall repairs violates the Safety Act, which I think we said a few weeks back anyway.
So the DOJ is objecting to the settlement plan, and it looks like Fisker is going to have to go back in and rejigger their entire bankruptcy plan to accommodate the owners. So if you’re a Fisker owner who was worried about having to pay for one of these recall repairs, or even finding a place to get them done, it looks like you might have some good news coming your way.
For Fisker, the news doesn’t look too good. They are in dire straits. And if you do own a Fisker, I would advise that you get these recalls done as soon as possible, while repairs are available, and then figure out how you’re going to repair that vehicle after Fisker goes under, because it’s going to be really difficult to do that in the next few years. You’re not going to have the manufacturer [00:57:00] support; your vehicle is a software-based vehicle, and you’re not going to have updates or security patches or any of that
coming for your vehicle. There’s a real question as to how much longer your vehicle is going to be viable as an actual car that works on roads.
[00:57:18] Anthony: But computers don’t get distracted or drowsy or do drugs. They’ll be fine. Forever. Open the Podbay doors.
[00:57:27] Fred: I was out on a bicycle ride a couple weeks ago, and I saw a Swatch car that somebody had put into their front yard and turned it into a planter.
It had a maple tree growing out of the driver’s seat, so I wonder if we’ll be seeing some Fisker planters.
[00:57:42] Anthony: Hey, send us your photos of your Fisker planters as soon as you have it. We suggest a maple tree or a peach tree. And if you do a peach tree, make a pie. And if you make a pie, send it to us.
That was a lot.
[00:57:56] Automotive Recalls Update
[00:57:56] Anthony: Okay, recalls. Chrysler: [00:58:00] 154,032 vehicles, the 2020 to 2024 Jeep Wrangler. These are the plug-in hybrids. They may have been built with battery packs containing cells that are susceptible to separator damage. This is a park-outside warning, and this could be very bad.
[00:58:21] Michael: And if you had listened to the podcast a few weeks ago when NHTSA opened the investigation on it, you would know that we already issued our own park-outside warning for these vehicles from the Center, because at the time when NHTSA opened the investigation, I think there were 13 fires that had happened, and yet Chrysler was apparently not convinced about doing a recall.
I think it probably took a month before NHTSA convinced them that was the way to go. So now we have an official recall on those vehicles we spoke about, and owners should be hearing about a repair by the end of the month, it looks like. [00:59:00] It looks like there’s going to be a software flash followed by an HV battery replacement if needed.
So they’re going to be evaluating your battery to see if it needs to be replaced.
[00:59:14] Anthony: Yeah. And a warning to listeners: Chrysler is not quite at the Fisker level of financial doom, but things aren’t looking great there these days. And next up, next recall, Chrysler! 129,313 vehicles, the 2023 to 2024 Ram 1500 pickup truck!
They might have been built with a steering column control module where mechanisms within the turn signal lever experience an interference condition. I had that once after eating a spicy meal, and I got through it with some antacids. Will that work in this case? Maybe. Okay. You’ll be notified hopefully this week and then, actually no, [01:00:00] you’ll be notified at the end of October, early November.
[01:00:02] Fred: No, I like fart jokes. Why not?
[01:00:04] Michael: Hey, I didn’t say it was a fart joke. In any case, this one is basically: when you take a right turn and use your signal, when your steering wheel returns back to a certain point, your turn signal is turned off automatically. That’s part of Federal Motor Vehicle Safety Standard
108. And what happens in these vehicles is the turn signal is not self-canceling, and it stays on, just like your grandmother’s, for miles down the highway. And that in turn will confuse other drivers. They think you’re changing lanes or turning when you’re not, and that can cause safety problems.
So I think this should be a relatively simple fix. I’m assuming it’s going to be a software repair of some sort, or maybe a new turn signal lever, who knows. The last date owners can be [01:01:00] expected to be notified should be right around the start of November.
[01:01:02] Anthony: Okay, next recall. Tesla: 27,185 2024 Tesla Cybertrucks. Okay, so that means, again, they sold over 27,000 of these things. They’re on the road. When can we crash test them? No? The vehicle system, I mean, go ahead.
[01:01:22] Michael: If you really want to crash test a Cybertruck, you should, go ahead and plunk down your deposit and get in line.
[01:01:29] Anthony: Oh, no. In certain vehicles under certain conditions, the rear-view image may not complete a shutdown process before the system is commanded to boot up. If the driver starts a backing event before the vehicle system completes its shutdown and boot-up, the rear-view image may not display within two seconds of placing the vehicle in reverse, as required by FMVSS.
Which one, Michael? Come on, what’s it?
[01:01:53] Michael: One eleven.
[01:01:54] Anthony: Ah, there you go. This is more rear view camera stuff. I don’t,
[01:01:58] Michael: Yeah. There’s [01:02:00] nothing special here other than the fact that every manufacturer is struggling with rear-view camera recalls. I think we’ve seen a recall on that from every manufacturer, multiple, in fact. It’s been one of the leading problems in vehicles in the last three or four years.
And the standard, I believe, came into force in 2018, so there are growing pains here. Also, there’s a problem where a lot of manufacturers are integrating all of their electronics into one system, and you see the rear-view camera system being integrated with stereos and other entertainment parts of the vehicle, which poses a problem. But this one, I’m going to assume, since it’s Tesla, let’s see, it’s going to be an over-the-air software update remedy.
And it’s hard to tell when it’s going to start, because Tesla doesn’t get into this. The owners are going to be notified late in November, but it doesn’t exactly specify when or if [01:03:00] you might receive the over-the-air update that actually fixes the condition.
[01:03:04] Fred: There was a report that owners thought their cameras were defective because whenever they saw people in the rearview camera, the people were always laughing at the Cybertruck, so I don’t know if that bears on that at all.
[01:03:21] Anthony: Cybertruck, the only car with a built in laugh track. The last recall from Evigect Inc. Oh, 8, 900 things. Cause I don’t know if these are shards. These aren’t
[01:03:33] Michael: shards, these are these are chargers and they, it looks like a charger that yeah, it’s designed so that I think you can just put it in.
Pull off without unplugging it somehow. I don’t know. It’s called an EV jet. So there’s something, there’s an eject portion of this. But this one was interesting. We don’t do a lot of equipment recalls, but this was interesting because. EVJEC, the company, [01:04:00] actually got a letter from Tesla saying that they think that the product was unsafe because it didn’t have thermal protection built into it.
And Tesla apparently stayed on top of this company until they issued a voluntary recall. I wish we could find a way to make Tesla respond in a similar way on a lot of the junk that they’ve done. But apparently Tesla is very interested in this independent charging company’s products, because I believe they were concerned that if fires occurred due to these chargers,
the Teslas were going to be blamed and they’d have to deal with that headache. So Tesla bullying smaller battery-charging companies is basically what this recall comes down to. I’m assuming owners are going to be getting a replacement, and it looks like that’s already in the works.
The notification should have been started a couple of days ago.
[01:04:57] Anthony: All right.
[01:04:57] Conclusion and Safety Reminder
[01:04:57] Anthony: That’s the end of our show. Remember, listeners: [01:05:00] buckle up, make sure your car is recall-free, and if not, get it fixed. And send us a picture of what you’re doing with your Fisker now.
[01:05:11] Fred: Thank you folks. Thanks for listening.
For more information, visit www.autosafety.org.