Breaking the law… and the AVs that get away with it.
Waymo ignores school bus stop signs and warnings, Tesla releases something called “Mad Max Mode” because they are sociopaths, and there is little oversight and regulation for these vehicles. IIHS explores the crash data around headlight glare, and we dive into recalls.
- https://www.msn.com/en-us/money/companies/us-investigates-waymo-robotaxis-over-safety-around-school-buses/ar-AA1OP2BD
- https://www.wusa9.com/article/news/local/dc/waymo-autonomous-vehicles-washington-dc-council-charles-allen-ddot-safety-study/65-470f58be-8c9d-4b5b-96fe-d831a9b1db50
- https://jacobin.com/2025/10/self-driving-cars-morrison-nhtsa
- https://www.iihs.org/news/detail/headlight-complaints-abound-but-glare-related-crashes-havent-increased
- https://www.youtube.com/watch?app=desktop&v=d_apmcDJGS8
- https://www.benzinga.com/markets/tech/25/10/48268811/elon-musks-tesla-risks-nhtsa-scrutiny-with-fsd-v14-1-2s-mad-max-update-that-ignores-speed-limits
- https://www.washingtonpost.com/nation/2025/10/16/commercial-trucker-drivers-english-proficiency-trump/
- https://static.nhtsa.gov/odi/rcl/2025/RCLRPT-25V676-8306.pdf
- https://static.nhtsa.gov/odi/rcl/2025/RCLRPT-25V678-9029.pdf
- https://static.nhtsa.gov/odi/rcl/2025/RCLRPT-25V685-7473.pdf
- https://static.nhtsa.gov/odi/rcl/2025/RCLRPT-25V686-8702.pdf
- https://static.nhtsa.gov/odi/rcl/2025/RCLRPT-25E068-2476.pdf
Subscribe using your favorite podcast service:
Transcript
note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.
Hey everybody. Welcome back to another episode. Today is Wednesday, October 22nd, and this is another episode of Waymo ha Wmo w woo.
Waymo’s School Bus Incident
Anthony: Anyway, we got a lot of Waymo stories, starting with one from Reuters: US investigates Waymo robotaxis over safety around school buses. So they say that autonomous vehicles, they're gonna get 'em to drive like us, but better.
And these Waymos unfortunately drive like the assholes among us. Quoting from the article: NHTSA said the Office of Defects Investigation opened the review after flagging a media [00:01:00] report describing an incident in which a Waymo autonomous vehicle did not remain stationary when approaching a school bus with its red lights flashing, stop arm deployed, and crossing control arm extended. The report said the Waymo vehicle initially stopped beside the bus, then maneuvered around its front, passing the extended stop arm and crossing control arm while students were disembarking.
So that’s a ticket.
Fred: Not much of an endorsement for AI-driven vehicles, is it?
Michael: Yeah, I think this came out of Atlanta, from the Waymos that are being tested there. I'm not sure, it's hard keeping track of everywhere these vehicles are. I'm not sure if they're driving without a safety driver in Atlanta at this point.
But from what it looked like, there was a partial video of the incident that really just captured the Waymo kind of pulling away from the school bus. It looked like there was a side road or a driveway [00:02:00] essentially perpendicular to the road that the school bus was on. The school bus that stopped was letting off passengers in the general area of that side road that the Waymo was on, and the Waymo didn't appear to recognize that there was a school bus there. Maybe it saw a school bus, but it certainly didn't see, or couldn't respond to, the stop arm, the flashing lights of the school bus, the crossing arm, all those things that, when a human sees a school bus in that position, most of us are very aware that we need to stop and just hold our position until the crossing arm goes away, the flashing lights stop, and the school bus pulls off.
The Waymo instead maneuvered and pulled out in front of the bus and around it in order to make a left turn and then drive on down the road. Apparently that's, I think, a thousand-dollar fine or something in Georgia.
Autonomous Vehicle Regulations and Liability
Michael: And here we have another [00:03:00] situation where not only is an autonomous vehicle making some seriously questionable decisions that could result in a safety problem, but we also have an inability for state authorities to do anything to regulate that behavior. And even if they did, giving Waymo a $1,000 fine is less than peanuts.
A lot of problems here. NHTSA's looking into it, and what I fully expect to happen next is Waymo saying, oh, we already changed our software to fix this, blah, blah, blah, this is done, and NHTSA closes the investigation. So I sometimes wonder whether NHTSA is looking deeply enough into some of these violations on the part of not just Waymo, but other autonomous vehicle companies.
Anthony: So if I do this, I get a fine plus points on my license. Does Waymo get points on their license? 'Cause it's the Waymo driver, or the Aurora driver, or their software. Yeah, they don't have a
Michael: [00:04:00] license.
Anthony: So
Michael: Yeah, there's nothing to put points on.
Anthony: Yeah. So this is the problem, because if I do this, my insurance skyrockets.
In New York State, after a certain number of points on your license, they take your license away. And as far as I can tell, Waymo doesn't have a license for a reason, 'cause it would've had way too many points on its license.
Michael: Yeah. I guess this is the problem with a lot of states' autonomous vehicle legislation: a lot of that legislation was passed based on principles that were handed to them by the industry itself, where they basically have to acquire insurance and check a few boxes saying they're gonna be safe.
But what the states are not doing, in this rush to get these vehicles on the streets, is assigning liability for the computer driver, essentially treating that computer driver like a licensed driver and [00:05:00] giving it penalty points. There is no enforcement system around autonomous driving at the moment in almost any state.
I guess you could argue that California has done that with some recent laws they've passed, but still, that's incomplete and hasn't been put into effect yet. So it's a huge problem, and I think you'd need to see some significant fines in these situations. A thousand dollars isn't going to deter a multi-billion-dollar manufacturer.
You're going to need to see significant penalties and punishments. If you make this type of movement on a public road in Georgia in an autonomous vehicle, why are you still allowed to operate? Is my question. Shouldn't you be suspending the operation until they've rectified this problem and proven to authorities that this isn't something that's gonna happen again, and things like this won't happen again?
Other than this rush to [00:06:00] innovate that you see capture legislators in passing these laws, I don't see any reason why these systems aren't being set up to ensure that enforcement is taking place.
Fred: So I've got a question for you related to that.
Most of the public relations announcements from Waymo will say that Waymo is five times safer, or has five times fewer collisions, than human drivers. What does that even mean? Does that mean that if Waymo has one bad collision, they get five good collisions? I just don't understand.
It almost leaves me speechless, which is unusual for me, but it's just crazy. How does that work? If you've violated one safety warning from a school bus and you drive past it and you kill a kid, does that mean you've got five more to go before you get [00:07:00] credit for that one kid you've killed?
What does this even mean? We're in a strange universe where language is meaningless.
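For context: claims like "five times safer" are usually framed as crash rates per million miles measured against a human benchmark. The sketch below is a minimal, hypothetical illustration of that rate arithmetic, with placeholder numbers rather than Waymo data; it only shows that the ratio is a statement about aggregate rates, which is exactly the missing context Fred is pointing at.

```python
# A minimal, hypothetical sketch of the per-mile rate arithmetic behind claims
# like "five times fewer collisions." Numbers are placeholders, not Waymo data.
def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Crash rate normalized per one million miles driven."""
    return crashes / (miles / 1_000_000)

human_rate = crashes_per_million_miles(crashes=4, miles=1_000_000)  # placeholder benchmark
av_rate = crashes_per_million_miles(crashes=2, miles=2_500_000)     # placeholder AV fleet

# A "five times safer" claim typically means human_rate / av_rate == 5.
# It describes aggregate rates; it is not a credit against any single event,
# like passing a school bus stop arm.
print(human_rate, av_rate, human_rate / av_rate)  # 4.0 0.8 5.0
```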
Michael: I will note just one correction. I said I wasn't aware if there's a safety operator in the vehicle. There was no safety operator in the vehicle, according to NHTSA. That's part of the investigation resume.
So apparently Waymo is testing without safety operators in Georgia at this point.
Anthony: So wait. Waymo operates in California, and California law requires all motor vehicles, or all drivers, to have car insurance. Does Waymo have car insurance, or are they allowed to bypass that, or has Alphabet decided to start their own insurance company to insure their own fleet?
Michael: The way I believe it's written in most states is that manufacturers basically, if you're testing AVs in California, have to put up like a $5 million insurance bond, [00:08:00] essentially in total, not per vehicle, which is not even enough, technically, to cover what NHTSA says is the cost of one life.
That's problematic for a lot of reasons. We think that insurance bond should be significantly higher, and it costs these companies next to nothing to get that. And I assume that most of these manufacturers are self-insuring there, but who knows? Maybe they're buying a policy, but it wouldn't be a very large one.
Five million dollars is something you might commonly see on an individual human's umbrella policy for their home insurance or their car, something like that. So that's not a lot, and that's certainly not gonna cover an incident where an autonomous vehicle crashes into a group of people. Five million dollars is not a lot at all for these companies to have to put up.
So even in California, where [00:09:00] they have probably the most restrictive regulations in the country around autonomous vehicles, you're seeing a severe lack of enforcement, or in this case of surety provided by the manufacturer that, if there is a problem that takes place, they're going to be able to at least compensate victims or their families.
Fred: Yeah, but isn't it true, Michael, that even if there is insurance, that's no guarantee somebody's going to get paid for an injury caused by one of these devices? You've still gotta go through lots of hoops and hurdles. You know this.
Michael: Right? Getting around the mandatory arbitration clauses in the contract you sign with Waymo when you get in the car, right?
Fred: Yeah, little things like that. Also the fact that the school children you're going to run over have not signed the agreements that allow them to get compensated for whatever injuries they get out of a vehicle, right?
It's gonna take a whole lot of legal activity for any of them to get paid. Isn't that so? So the insurance, the fact that they posted a bond for x millions of dollars, to me seems pretty irrelevant to the human [00:10:00] circumstances of somebody being injured by one of these deadly vehicles.
Anthony: The reason I ask the insurance question is 'cause that's part of what keeps human drivers, in theory at least, and auto manufacturers trying to be as honest and safe as possible. Because if your car fails because of some issue with the manufacturer of the car, the insurance premiums on that type of car will be more expensive than other ones.
If you get into an accident, speeding tickets and whatnot, your insurance premiums go up. If Waymo and these other companies don't have that oversight from the insurance industry, whatever effect that has, that's just one less piece of what every other road user, every other driver, deals with. That's one less layer of oversight.
I couldn't imagine Geico being like, yeah, go ahead, there's no human in the car, we'll insure it. That sounds great. We have a lizard as our mascot.
Michael: Yeah, I guess you could see that as one less layer of protection. Right now we [00:11:00] have zero federal standards around autonomous vehicles.
The state standards are clearly lacking and insufficient, and insurers aren't heavyweights here, because the AV manufacturers aren't using the major car insurers, and they're not gonna be punished by those insurers for violating traffic laws or for getting into crashes and other things.
The AV industry has exactly what they want in that regard. And they're surging toward trying to get more federally, in Congress, by essentially getting preemption of state laws that could tackle some of these problems. So quite literally, we are in the Wild West with this stuff.
And I think we're there in a lot of areas involving AI and other things. But when it comes to cars, there is no question that there simply aren't enough state and federal regulations around these companies to help ensure safety for any of us. It's not just the people in Teslas that are [00:12:00] at risk, it's everyone on the roads, and Waymos are essentially a computer that could threaten us all if they're not operating properly. And certainly in this case in Georgia, they posed a threat to the children exiting that school bus.
Anthony: Oh, where's the nearest insurance lobbyist when you need one?
Waymo’s Expansion Plans in DC
Anthony: Continuing with Waymo, let's go to the District of Columbia. So Waymo wants to put its autonomous vehicles out on the roads of DC, but then this big orange smear is like, we're shutting the government down.
Part of that is cutting off funding to the District of Columbia that was going to fund a safety study around how autonomous technology would work on DC streets. Now, it's good that they wanted to do this study, because how do autonomous vehicles deal with, I don't know, a presidential motorcade? 'Cause they can't handle a flashing stop arm on a school bus. [00:13:00] I imagine the Secret Service would be like, oh, we need these pieces of crap out of our way, or some other dignitary, something like that. So unfortunately, it looks like, unfortunately, I kid, it looks like the Waymo deployment in DC will be delayed, unless Mr. Brooks is gonna tell me there's some loophole around that.
Michael: What, did I cut off your joke?
Anthony: No, it was not a joke.
Michael: Some of the folks in the article appear to be leading us to the conclusion that this is directly related to the government shutdown, or to budget cuts.
But to me it looks like this study never really started and may have been delayed anyway, regardless of who became president in January. It's not clear that this study was something that was going to happen on Waymo's timeline. Although I [00:14:00] personally have been interested in seeing Waymo's launch in DC, because that's the closest city to me where I could observe their behavior.
Anthony: I think Fred and I, and you as well, Michael, have pointed out the absurdity of some DC streets. I would love to see an autonomous vehicle navigate streets that run parallel to themselves in the same direction, a block apart from each other.
Michael: DC has a lot of interesting roads and streets. I don't know how unique it is; I know there are a lot more circles and diagonal-type roads running around DC that make it somewhat unique. But I don't see it as the type of unique environment that AVs would struggle with the way I do someplace like New York City, where you have just this crush of humanity that's unlike anywhere else in the United States.
The district, although it has some interesting things going on in terms of transportation, Congress, a lot of different police forces operating, lots of parks and [00:15:00] weird spots and traffic, I don't see it as a driving environment that poses any peculiar challenges in the way that places with a lot of different weather patterns, or places with a lot more elevation and elevation changes, would pose for driverless vehicles.
Fred: I don't know, I don't think the logic is gonna handle a cabinet secretary swimming in sewage very well. That was on the Rock Creek Parkway, so there's gotta be some special accommodations for Washington.
Anthony: Yeah, only jeans, no shirts. Continuing in the world of autonomous vehicles: hey guys, you'd never think this was possible.
I'm talking to listeners. The person overseeing autonomous vehicles is somebody who came from the autonomous vehicle world. We've talked about Mr. Morrison before. Is he now the full head of NHTSA, Michael? Is he?
Michael: Yes, he is confirmed, and he is running the agency.
Anthony: Yeah. So before this, he [00:16:00] was a lawyer working for a fruit company out in California called Apple, and he was working on their self-driving vehicle.
And this is funny. From an article in Jacobin, I'm gonna quote: when confronted by lawmakers at his nomination hearing, he declined to discuss the work he did for Apple, refusing to answer questions about whether he was involved in Project Titan, their self-driving car project. I love this. Like, you are getting nominated for public office.
Yeah, I'm not answering that. No.
Michael: Yeah, that's really my biggest problem there. You're not even willing to disclose whether or not you worked for an autonomous vehicle company. Yeah, I worked for Apple, but I can't tell you if I worked on the car.
Anthony: I was in the t-shirt department. Come on.
Michael: Yeah, it's pretty obvious he did. I think he came from NHTSA to Apple, and at the time they were trying to launch a self-driving car. I don't think there's any question that's what happened there. But while Morrison was awaiting confirmation and wasn't in the building [00:17:00] at NHTSA, Peter Simshauser, their chief counsel, was effectively acting as the administrator of the agency.
And he came from Motional, which is also an autonomous vehicle company. It's pretty clear how the administration is approaching safety at NHTSA: by putting people from industry into positions regulating safety, or not regulating safety, as the case may be.
Anthony: This episode brought to you by Fox and Henhouse, the new animated series explaining how our government works: the fox in the henhouse.
Michael: And this is nothing new, right? We've had a revolving door between industry and the National Highway Traffic Safety Administration, amongst many other government agencies, probably since before at least two of us were alive, maybe before even Fred was around on this Earth.
We haven't ever seen them directly nominated to the position of administrator, but we have seen a lot of folks moving [00:18:00] back and forth between industry and the chief counsel's office, and into other positions in research or in enforcement.
I'm struggling to remember if there was an actual administrator that came from the industry. The closest thing I think we saw to that was Andy Card being George W. Bush's chief of staff. He wasn't at NHTSA, but the president's chief of staff also being from General Motors is obviously gonna have some impact there.
So this is somewhat of a new thing, with an NHTSA administrator coming directly from an autonomous vehicle company. But we can't confirm that, since Morrison's unwilling to break the non-disclosure agreement. And, I think tellingly, he wasn't required to sell the Apple stock he accumulated during that time period in order to take a job regulating the industry he was in. So there's obviously all sorts of conflicts there.
Anthony: Wonderful civil servants. Alright, let's get onto something that really burns my eyes.
Headlight Glare and Safety
Anthony: [00:19:00] Headlights. The Insurance Institute for Highway Safety has finally listened to my complaints, which I bring up all the time, about the goddamn headlights blinding me all the time, increasing glare. But they got it wrong, 'cause their article's titled Headlight complaints abound, but glare-related crashes haven't increased.
Sure, I have not crashed, but I feel like crashing every time I'm blinded. Quoting from this: although it can certainly be uncomfortable, headlight glare contributes to far fewer crashes than insufficient visibility, IIHS President David Harkey said, but that doesn't mean reducing glare isn't an important goal, one that we've long focused on at IIHS in addition to improving illumination.
Now, I think that's great, and I'm frankly shocked that this has not led to more crashes.
Michael: Yeah, that's one of the biggest issues that I've always wondered about here. We've heard a lot in the last few years, particularly from folks who are saying that glare is a big safety issue, but when you go and look at the [00:20:00] data and try to find incidents of glare contributing to crashes, they're very difficult to find.
And IIHS did exactly that. They went back and looked at about the last 10 years, I think 2015 to 2023, so nine years, and they took data from 11 states. I think there were millions of crashes involved here, and they found, I think, one, maybe two crashes related to glare out of every thousand of those crashes.
But I think, significantly, and this is the point that David Harkey makes in that quote, bright lights on a vehicle are necessary to provide drivers with visibility at night, and lack of visibility is much, much more likely to cause a crash than glare.
And when you look at it that way, and everything we see in the industry and in auto safety, there are some trade-offs, right? We talk about them all the time. There's a trade-off between the types of [00:21:00] windows on the side of your car. You're not gonna be able to escape through a window that is made of safety glass, but at the same time, it's gonna keep you in the vehicle in a crash and prevent you from being ejected. And the numbers essentially show that there's a low chance of being trapped in a vehicle and having to escape through the glass, versus a very high chance of people being ejected through that glass.
And so in order to meet side impact protection standards, manufacturers have been moving from tempered glass to glass that is essentially much harder to break in a collision, and also hard to break when you need to escape the vehicle. But overall, at the higher levels, looking down at safety, you'll see that's going to ultimately save more lives, even though some people aren't going to be able to escape the vehicle.
I think there are probably some ways we could improve that, and egress from a car is very important [00:22:00] to us, so we have a problem with the way that's gone. But on the other side here with lights, you see that a lot of deadly crashes happen at night.
Even though there isn't as much night in the United States as there is daytime over the course of a year, we see, I think, 51 percent or more of deadly crashes happen at night, and a lot of that is due to poor visibility. So lights are incredibly important.
And IIHS's study is showing that glare is going down, according to their study, going down significantly over the last 10 years, and there's no change in how often glare is mentioned in crash reports. It may just be that a lot of us, Anthony, have problems with glare.
As we age, macular degeneration happens, our eyes...
Anthony: I see what you did there.
Michael: Our eyes start to be more sensitive to light, more sensitive to these types of things. And they [00:23:00] say in the article that most of the complaints about glare are coming from older adults.
Anthony: And he's gaslighting me.
Michael: Look, a lot of the complaints about glare are coming from people who aren't involved in a crash either. The glare blocks their visibility briefly and they're complaining about it, even in the absence of a negative safety impact.
So I'm not trying to pooh-pooh it too much. It's certainly a factor in some crashes. But I think having lights that are bright enough to show the road matters, and on top of this, we know that advanced driving beams, or adaptive driving beams, are coming to America at some point.
They've been in Europe for a decade. Hopefully they'll get here at some point. We've had some issues due to our archaic rules, due to NHTSA's inability to get rulemakings out in a timely manner, frankly. But I think adaptive driving beams are going to rectify a lot of this. I don't think we'll ever get to a point where all vehicle headlights are perfect and no one is going to be affected by [00:24:00] glare.
But I think we're moving in the right direction here, and the IIHS study confirms to a pretty large extent that glare is not a massive factor in crashes and is far outweighed by the need for visibility at night.
Fred: Two things about that. First is that the IIHS study did not consider the impact of daytime running lights.
The fact that the lights are bright makes cars more visible during the day, so that may well be a compensating factor as well. And second: Anthony, do you have a keep-off-my-damn-lawn sign up in front of your apartment building?
Anthony: I live in an apartment building. I don't even have a lawn. Keep off my damn...
Yeah, this show's over.
Michael: Yeah, that's an interesting point, I guess, for another reason. I think IIHS was looking at nighttime only in their study, but daytime running lights are required in Canada to make cars more visible. The United States has refused to do that, and [00:25:00] I think there are numerous studies showing that daytime lighting on vehicles really helps with daytime visibility and can prevent crashes.
And a lot of cars already have them. It's not like it's a significant financial commitment by manufacturers to get them on cars. I'm yet again wondering what the holdup is on requiring those in American cars.
Anthony: I think my issue, and there are many issues I have, but this one, and Fred, you've pointed this out in the past, is the angle these headlights are aimed at.
They're aimed not necessarily at the road, but inside my skull. And that's the part I don't enjoy, whether the vehicle's behind me and all of a sudden I feel like I'm being abducted by aliens, or they're coming at me and I think, good thing I'm not a moth today, 'cause I would die.
Michael: Yeah, there are just so many things that could be the problem, right? There's incompatibility between vehicle heights, so all of us have seen big [00:26:00] pickup trucks that are shining directly into our eyes, right? There are people approaching you, or behind you, who have their brights on and who just simply don't give a damn about everybody else on the road, or are clueless, one or the other.
And even if there's a crash, and IIHS points this out in the study, the car with the bright lights that caused the glare and potentially caused the crash is gone. They're not stopping.
They might not even be aware there was a crash, because they couldn't see it, just nothing but white light in front. There's no proof there: were their headlights not properly calibrated? Were their headlights not aimed properly? Were their brights on? We don't have a lot of answers there, because these crashes are very difficult to study.
Anthony: With that, gentlemen, I think it's time for... actually, before that: if you agree with Michael and this isn't a big deal, go to autosafety.org, click on donate, and donate an amount that ends in zero. If you think I'm right and you want to join me in a revolution to put [00:27:00] signs on our front lawns, donate an amount that ends in a five.
Michael: What I think you should be doing is pushing for the United States to write a regulation that allows adaptive driving beams into vehicles in America. That's really the problem: we have an archaic regulatory scheme around headlights at this point that's not allowing for innovative designs that can make visibility at night better by seeing around curves and other things, but also automatically dimming bright lights when there are other cars around, and things like that.
The potential for a lot of really good things happening with adaptive driving beams is there; they just have to be allowed on United States roads.
Anthony: I had a Kia rental car that had the adaptive lights that would turn the brights on and off, and I loved it. Yeah, that's pretty great. It was cool.
Gaslight of the Week
Anthony: But anyway, let's jump into Gaslight, 'cause I feel like I was just gaslighted personally, but that's okay, I am getting over it. Fred, what's your [00:28:00] gaslight? Hit the mute button.
Fred: It's on now, Anthony. That's not a gaslight, that's an on button. Okay. So my friends at Kodiak have announced, Kodiak is a company that's making computer-driven trucks, heavy trucks,
and they've announced, based on analysis by Nauto, that they are a winner in, let's see, what is this category? The Kodiak Driver, Kodiak's AI-driven autonomous system, earned a VERA score of 98, tying for the top spot among the more than 1,000 commercial fleets in Nauto's network. All right, that's very interesting.
So what the hell is Nauto?
Anthony: Yeah, that's what I
Fred: wanna know. It turns out that Nauto is, surprisingly, a predictive AI company that is selling predictive AI attached to a camera that they clip onto the rear view mirror of these trucks. So the predictive [00:29:00] AI is predicting that the predictive AI of Kodiak is better than all the other trucks around that do not use predictive AI, because there just aren't a lot of fleets doing that.
These are 18-wheelers, right? What could possibly go wrong with this? Lots of things. They do not go on to specify exactly who is included in the study, whether or not they're comparing apples and oranges, or computers to human beings, or potatoes to cucumbers. Who the hell knows what they're doing, but they're very sure that they're really good, and they're very happy
to be able to tell people exactly how good they are. So based upon the fact that there's absolutely no data presented to show what the hell they're doing or who they're doing it with, I'm going to give them my Gaslight nomination this week.
Anthony: Excellent. Well done. Mr. Brooks?
Michael: This week, again, I'm following [00:30:00] Anthony's lead, since Anthony has continually, I guess, lambasted asset managers and investor advisors. I've got another one this week, and this one is wild in that, I don't know, it's a gaslight. He's gaslighting himself, he's gaslighting us, I'm not sure. But it's Gene Munster of Deepwater Asset Management, talking about the investigation that NHTSA opened into Tesla Full Self-Driving a couple of weeks back.
He says the NHTSA probe is actually good for Tesla, because it could be a testament to Tesla's efforts in developing autonomous driving technology. And he says Tesla just stands alone in this category of consumer autonomy. With that quote, I completely agree. Tesla does just stand alone in this category of consumer autonomy, because they're the only manufacturer deploying it recklessly
and without regard for a lot of the science behind how humans interact with partially autonomous vehicles. They're [00:31:00] the only manufacturer that is reporting thousands of crashes to NHTSA related to Level 2 vehicles. And quite frankly, it's obvious, yes, they do stand alone.
They are the least safe company touting autonomy, by a country mile. And so even though that's not how Mr. Munster intended for me to interpret his statement, I'm going to interpret it that way, because to interpret it otherwise is simply nonsense. And for that, Gene gets my Gaslight of the Week.
Anthony: Oh, well done.
I don't even have to go, Michael, you already won. Gene Munster, come on, right off the bat. But I'm gonna go anyway. Same article. This is from Benzinga. It's a safe website to go to, don't worry. It talks about how Tesla's rolled out something called Mad Max Mode for their Full Self-Driving system.
So in their latest release, 14.1.2, they introduced a new speed profile, Mad Max, which comes with higher speeds [00:32:00] and more frequent lane changes than Hurry. Wait, what? So first of all, they had a mode called Hurry, and now they have one called Mad Max. And this is a company saying, we're the safest vehicles on the road.
Tesla’s Controversial Driving Modes
Anthony: What is happening here? There's definitely cognitive dissonance. Should we all just be on ketamine? Is that the only way this makes sense?
Michael: It doesn't make sense to me. You continually see Tesla under fire from NHTSA; they've had to do recalls on a number of occasions for violating traffic laws.
I think the California rolling stop was one of the first times, a few years back, when they were called before NHTSA on a problem like that. And here they are with a Mad Max mode that apparently allows vehicles to violate speed limits and essentially weave in and out of traffic.
I think I saw a video on this where the Tesla was passing someone on the right, which is a problem, especially when you're well over the speed limit. [00:33:00] But Tesla's essentially programming vehicles not to be good drivers or responsible drivers, and that's a huge problem.
But sorry, I interrupted your Gaslight.
Anthony: No, you're absolutely right. Apparently Tesla's next release is gonna be called Masshole: every time it goes to turn left, it'll put on the right blinker. That's our gaslights for this week.
Media’s Role in Shaping Public Perception
Fred: I'm just astonished to be living in a time when people who control access to information choose to lie and present a completely false image of reality in order to promote their own products that are killing people. It's just, I don't know, it's like Soylent Green. I'm left speechless.
I don't know how to deal with this. And they're axing any independent news sources. National Public Radio, they're axing. Public [00:34:00] TV. They're even axing private news companies that choose to deliver unbiased facts. Sorry, I'm ranting. But I'm astonished that I'm living in this age.
Anthony: We all are.
Tesla’s Full Self-Driving in Australia
Anthony: Anyway, speaking of that news agency stuff, and continuing with another Tesla piece: 60 Minutes Australia, which is similar to the US-based news show, except that their pieces are much longer. It seems like in the US version of the show each segment's seven minutes, and three of that is just playing the ghost of Morley Safer.
Whereas this one, they did about a half-hour segment on Tesla's Full Self-Driving coming to Australia, and they really do a pretty good deep dive into this on Australian roads. We don't have to go too far into it. It's fun, there's a link to it. The host is in the car with some guy who works in the auto industry.
He's a fanboy, or, I'm not really sure, fanboy sounds rude, he's excited [00:35:00] for this stuff to come, and the Tesla drives off the highway past signs that say do not enter, construction zone, and keeps going. Hey, I only need cameras to see where I'm going, but I just can't interpret what I'm seeing.
Tesla.
Michael: Yeah, there's a problem there. And Australia is interesting. In the story they said that currently Australian law doesn't allow vehicles that are higher than SAE Level 2 to operate, and so Full Self-Driving is only a Level 2 system, according to Musk, in Australia.
It's interesting for a few other reasons as well. You'll see in the story they talk a lot about the case that just wrapped up in Florida, or the trial portion of the case that wrapped up in Florida. I did an interview with the young man who was injured in that crash. His [00:36:00] girlfriend was killed.
And you know what they went through as just innocent parties, sitting at a three-way stop looking at the stars, and all of a sudden a car comes out of the darkness and kills her and significantly injures him. A lot of good commentary in the story from Missy Cummings, who's a former guest of the show, and from Brett Schreiber, who was the attorney who handled that case in Florida.
But ultimately, there are a lot of questions that are going to be raised in Australia around these vehicles, just like they're being raised in the United States around these vehicles right now. And I just wonder why some countries that have seen what has happened in the United States with Tesla are not more leery of allowing those vehicles to be imported into their countries.
I think Australia had a chance to say, hey, we don't want this shit in our country, and they missed their opportunity. I think that's something that's going to have to happen as more of these whizzbang [00:37:00] tech features are sold, drive companies' stock prices, and become more and more commonplace.
Anthony: Yeah, it's funny. You can see the US 60 Minutes puff piece on Full Self-Driving from like 2018, and Elon's driving around hands-free and the US host is just, this is amazing, the world we live in.
And you see the Australian version, like, this is insanity, why are we allowing this? So just for that comparison, it's great. Lesley Stahl, please donate.
Michael: Ha, they already gave. That's another difference between Aussie 60 Minutes and the United States version: their parent company isn't capitulating to the president.
Anthony: True. Yeah.
Regulatory Challenges and Safety Standards
Anthony: Continuing with that political thread, the Washington Post had this long article that maybe we can get into, I don't know how deeply, but it basically talks about English proficiency testing for commercial truck drivers. The Trump administration and their people are basically trying to pull over immigrants [00:38:00] and non-native English speakers and be like, ah, you can't drive here because your English isn't proficient enough.
And reading this, my reaction is, yeah, if you're driving an 18-wheeler, I want you to be able to at least read road signs where you're going, right? And all sorts of random signs that pop up, like temporary construction zone and different warnings like that. That makes sense to me; that I'm on board with.
But it also seems that it's arbitrary how they decide whose English is proficient enough, and that it's left up to highway patrol. And some states like Texas might have a couple of racists on the police force. I don't know.
Michael: Look, you see something like this, and to me it immediately suggests that there's another motive here.
I think most reasonable people can agree that the Trump administration is not out there super concerned about the safety of Americans, otherwise they'd be doing a lot more about other issues. They keep highlighting this one crash that happened in August in [00:39:00] Florida, where a guy, I think from India, who failed an English language proficiency test, made an illegal U-turn and killed three people.
So, yeah, I agree. I think that commercial truck drivers operating these massive vehicles should have to read and understand English road signs. But at the same time, the way this is being enforced and the way it was rolled out, one of the first things that the Federal Motor Carrier Safety Administration, and Sean Duffy, did after getting in place is to go after immigrants,
which has been a drumbeat for the whole current administration. Yes, maybe this helps safety to some extent, but I'm very skeptical of the underpinnings of why they're doing this.
Anthony: Fascinating article. If you have access, please read it. And for now, let's jump into the dulcet tones of our own Frederick [00:40:00] Perkins.
Fred: I'm sorry, what?
Anthony: Is it Frederick on your birth certificate?
Fred: It is. There we go. Yeah. Tao time.
Checklist for Regulators
Fred: All right. So what we talked about last week, a little bit, is a checklist for regulators, and I just wanna talk about why this is necessary.
It's all about risk, okay? And risk is related to safety. All of the developers will say, we conform to all applicable safety standards. True, it is. So what are those safety standards? In the US Code, 49 U.S.C. 30102(a)(9) defines motor vehicle safety as the performance of a motor vehicle or motor vehicle equipment in a way that protects the public against unreasonable risk of accidents occurring because of the design, construction, et cetera.
[00:41:00] That means basically you're relying on what's reasonable, and who's to say, right? So if you look at other standards, there's something called ISO 26262, which defines safety as the absence of unreasonable risk. All right, that doesn't give you a lot of new information. So let's go to the United Nations, UNECE.
They don't even define safety. What they do say is that safety case means the structured argument, supported by a body of evidence, to provide a compelling, comprehensive, and valid case that the ADS is or will be free from unreasonable risk for a given application. So in this case, you don't even have to be reasonably safe, as long as you say you're going to be in the future.
There's a lot of ambiguity here, and they all rely upon reasonableness without saying what that means. So when people say they comply with all regulations, basically they're saying, we think what we're doing is [00:42:00] reasonable. Yet the developers won't reveal what they think reasonable means. NHTSA should do this.
NHTSA's done that with the Federal Motor Vehicle Safety Standards for conventional cars. They refuse to adapt those standards to computer-driven vehicles. So that leaves the developers themselves, the courts via tort law settlements, and state regulators. The developers won't do it, because what they think it means is: we don't have to do anything,
we don't have to tell you what we're doing, and so screw yourself. The courts can only act after tragedies happen, right? If nothing happens, there's nothing to litigate. So that leaves these regulators, especially state permitting agencies, to evaluate the risk profile of the computer-driven vehicles they're asked to license.
There is a way of doing that, called comprehensive safety case analysis, using standards like UL [00:43:00] 4600. But that takes a lot of time and energy even to read; it's hundreds of pages and it requires a lot of effort. Legislators are overworked. They don't have time to dive into a multi-hundred-page document to consider, while the developers are whispering in their ear that it's completely safe.
Nothing to see here, folks, let's move on. So while this is not perfect, we developed a concise checklist for regulator use that identifies key risk factors that safe computer-driven vehicles have to consider in the interest of public safety. And of course, we're happy to share that with people. It's still under development here, but let's go into a few of the topics.
So these are all yes-or-no questions, binary answers. And the first couple are: has the applicant followed the technology readiness level validation process to assure safe and reliable [00:44:00] operation? A technology readiness level validation process has been developed over 20 years for various government programs.
DOT has signed onto this. Waymo has signed onto this, in a document produced by the Government Accountability Office. So it's a reasonable question to ask. If you do conform to it, you have very high confidence that the vehicle will be appropriate and safe. So, have they done that? Yes or no. Simple answer.
Next one is: has the applicant developed and validated compliance with a comprehensive safety case analysis for its computer-driven vehicle offering? We talked about that a little bit earlier. Comprehensive safety case analysis is available if you do something like comply with Underwriters Laboratories' UL 4600, which involves independent verification of how the vehicle has been developed and produced and [00:45:00] what standards it conforms to.
So for the legislator, the regulator, a simple question: has the applicant done that or not? Next one is: has the applicant shown its computer-driven vehicle offering has passed a safety case audit by competent and qualified independent auditors? Most people think that's an absolute requirement for a safe computer-driven vehicle offering. And I say computer-driven vehicle, rather than Level 1, Level 2, Level X, Level Y, or AV, because all of the computer-driven vehicles that allow for sustained operation for any length of time
without human inputs into the control system have similar safety profiles and similar safety consequences if a developer has not done their job properly. So this applies to a broad set, and I didn't want it to be dependent upon the language that the industry [00:46:00] itself came up with for what qualifies and what doesn't.
Clearly, Teslas operating in Mad Max mode are subject to the restrictions that computer-driven vehicles should embrace.
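To make the shape of this concrete, here is a minimal, hypothetical Python sketch of a binary checklist along the lines Fred describes; the item wording is paraphrased from the conversation above, and the pass/fail logic is an assumption for illustration, not the Center's actual checklist.

```python
# A minimal, hypothetical sketch of a binary regulator checklist. Item wording is
# paraphrased from the discussion above; the scoring logic is assumed for
# illustration and is not the Center's actual checklist.
CHECKLIST = [
    "Followed a technology readiness level (TRL) validation process?",
    "Developed and validated a comprehensive safety case analysis (e.g., UL 4600)?",
    "Passed a safety case audit by competent, qualified independent auditors?",
]

def open_items(answers: dict[str, bool]) -> list[str]:
    """Return every checklist question the applicant has not answered 'yes' to."""
    return [question for question in CHECKLIST if not answers.get(question, False)]

# Example: an applicant that only documents TRL validation.
applicant = {CHECKLIST[0]: True}
for item in open_items(applicant):
    print("Open item:", item)
```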
Anthony: Fred, I just wanna check on that. So my vehicle, which has adaptive cruise control and lane centering, when I have it in that mode, would qualify as a computer-driven vehicle?
Fred: Yes, it would. And why is that? Because any vehicle that drives itself for long enough to kill you or to kill people outside of the vehicle has got to conform to these safety guidelines, right? That's what it's all about: protecting you and protecting the public.
Anthony: Okay. So this covers my 4-year-old car all the way up to the Mercedes that will drive itself on city streets, and Ford BlueCruise, and all of Tesla's nonsense.
So it's a whole wide swath of vehicles. [00:47:00] I mean, you can almost say that it's more or less every vehicle sold in the last couple of years, 'cause they all have some level of this. I think so. I believe that's
Fred: right. Yeah. Again, my standard, which is not the industry standard, is whether you can kill somebody or injure somebody if your design doesn't work perfectly.
To me, that seems reasonable. Okay, now, what are the reasonable standards? I don't know. What does the legislator think is reasonable? How many of these do you have to embrace, and which of them are reasonable? I don't know. I don't know what the answer to that is.
Anthony: But I think very simply, we can say if you sell brakes and the brakes fail, that's unreasonable.
The reasonable thing should be that the brakes work when you apply pressure.
Fred: Right? Simply, okay. Again, the standard is: does a defect in this particular characteristic enable you to kill somebody? [00:48:00] Okay.
Anthony: Okay.
Fred: Or to injure somebody. And how often it does that is part of the risk.
Remember, risk profiles have two parameters: one is the consequence of the defect, and the other is the probability of it occurring. So what the companies don't like to say is, yeah, how often is an AV likely to encounter a school bus from the side, with its lights flashing, when it comes to a T intersection?
The answer would probably be: it's not gonna happen very often. So the developer could say, ah, this is low risk, never gonna happen. Yet.
Anthony: But I'm thinking, what you're saying is it has to be perfect. They already have engineering calculations for how often a brake failure would occur or steering wheel inputs would fail.
So they know this. And when people hear computer-driven or autonomous, their brains turn off and they think: there's a computer, it's much smarter than us, there's no way to calculate it, this is so much [00:49:00] better. But really what you're saying is no, we can track this the same way we can track mechanical linkage failures.
Fred: And so the purpose of this, again, is just to give regulators a handle on the hazards and the risks that are involved in safe operation of these computer-driven vehicles. And remember, there are two aspects of it: there's hazard and there's frequency, right? So a regulator should consider both. For example, how likely is it that a computer-driven vehicle is going to plow into a group of school children walking hand in hand behind their kindergarten teacher crossing the street?
That's a hazard that you probably don't want to involve in a question of frequency, right? You wanna know that's just really never gonna happen. Even though it's a very low probability, you really don't want that to happen. So regulators really [00:50:00] need to consider both the risk and the hazard, and certain hazards should just be unacceptable at whatever frequency they might occur.
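As a rough illustration of the two-parameter risk idea Fred is describing, here is a minimal, hypothetical Python sketch. The scales, units, and threshold are made up for the example rather than taken from any real checklist or standard; the point it demonstrates is the override Fred calls for, where a hazard flagged as intolerable fails regardless of how rare it is.

```python
# A minimal, hypothetical sketch of the two-parameter risk idea described above:
# every hazard carries a consequence and a frequency, and some hazards are treated
# as intolerable no matter how rare they are. Scales, units, and the threshold are
# assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    consequence: int    # assumed scale: 1 = minor ... 5 = fatality
    frequency: float    # assumed unit: expected events per million miles
    intolerable: bool   # e.g., striking children at a school bus stop

def risk_score(h: Hazard) -> float:
    """Classic consequence-times-likelihood product."""
    return h.consequence * h.frequency

def acceptable(h: Hazard, threshold: float = 1.0) -> bool:
    """A hazard flagged intolerable fails regardless of how low its frequency is."""
    if h.intolerable:
        return False
    return risk_score(h) < threshold

bus_pass = Hazard("passes a school bus stop arm", consequence=5,
                  frequency=0.05, intolerable=True)
print(risk_score(bus_pass), acceptable(bus_pass))  # 0.25 False: rare, still unacceptable
```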
Anthony: This is great. I’m looking forward to us getting this online in the next couple weeks.
Fred: Yeah, I think we've got enough for now, but I wanted to set the context. And if anyone's got suggestions, if any of our listeners have suggestions on how to put safety considerations for computer-driven vehicles into a binary context, yes or no, go ahead and send them on to us.
We'd love to get your inputs as well.
Anthony: contact@autosafety.org, unless you happen to personally know Mr. Michael or Mr. Fred and you can just email them directly, or you can just meet them in your local Piggly Wiggly parking lot. They'll be there all weekend.
Vehicle Recalls and Safety Issues
Anthony: Anyway, let's go into recalls. First up: Nissan, whoa, 173,301 vehicles, and of course the first one listed is the Chevrolet City Express.
So confusing. And the Nissan NV200 [00:51:00] taxi; this is the 2015 to 2018, 2019 models. These are vehicles I've never heard of, I don't know what's going on, but a blown fuel pump fuse can interrupt fuel delivery to the engine, potentially causing a loss of motive power. LOMP, oh, loss of motive power is the acronym.
Or LOMP preventing the vehicle from restarting, increasing the risk of a crash. First, Mr. Brooks, huh, what are these vehicles, the NV200?
Michael: They're mostly passenger vans, cargo van types. If you see a smallish minivan-looking vehicle running around, a plumber might use them, a bike or tech repair guy, or a small taxi service.
Anthony: Got it. Okay. Moving on to the next one, then.
Ford’s Recall Details
Anthony: Ford. Yeah, Ford. Only 500... 405 vehicles. Come on, Ford, you can do better than that. This is the 2025 Ford Ranger: non-conforming instrument panel topper [00:52:00] components may alter the intended passenger airbag cushion opening and inflation characteristics. Oh, that's a bad one.
Oh. So the instrument panel was installed incorrectly. Is that what we're seeing here?
Michael: I think this is similar to what we've seen on a few other recalls, where the instrument panel has to have a particular pattern etched into it, they call it scoring, so that the airbag can deploy, essentially, through the plastic on the instrument panel.
In this case, the airbags are not able to deploy properly, because there was a change in that scoring pattern that was made by the supplier. I don't think they meet motor vehicle safety standards because of that, and so this is a compliance recall.
Essentially, [00:53:00] Ford's recalling these vehicles. They're relatively new; it looks like this only impacted vehicles that were made on September 9th and 10th of this year. I don't know how many of these have actually made it to consumers at the moment, but you can locate your date of manufacture on your vehicle and figure it out from there.
Or, since this recall's up on the website, go ahead and check your VIN to see if you qualify for this recall, because this is a pretty critical problem, and it's something you want to get fixed immediately. Because if you don't have... I don't know where I was going with that, but it is the passenger airbag, and it looks like owners are gonna be notified about this later this week. So you should be able to get a fix, it looks like, sometime in later November. It doesn't look like they have a perfect remedy for this yet, although you would think that would just be a new instrument panel
or dashboard that doesn't have this problem, or that was manufactured [00:54:00] to their older specs. But we will see.
Anthony: So you're not recommending people take a razor blade and create their own scoring marks?
Michael: I don't think that's ideal. No.
Anthony: What if you don't get it repaired, 'cause you're like, I really don't like my passenger and I wanna commit the perfect crime?
Michael: You should be in jail for murder.
Anthony: Okay, fair enough. Moving on. Next up, Ford, 59,006 vehicles. That's right, that sounds more like Ford. The 2020 to 2023 Ford Explorer, the 2021 to 2024 Ford Bronco, the 2016 to 2018 Explorer, 2022 to 2024, blah blah, so many. There's the Escape, the Bronco Sport, the Fusion, the Lincoln Corsair, the Lincoln MK-something.
In the affected vehicles, the engine block heater may develop a coolant leak through its element pins, which would cause a resistive short circuit when the engine block heater system is plugged in. What is an engine block heater?
Michael: It is basically a heater that warms [00:55:00] the engine up prior to you starting it, to help it start easier.
That would be in cold weather. So people who live in really cold environments can plug in a block heater before they start their car, so that they don't have any problems starting the car. On this one, there's obviously a short, and it could cause an underhood fire.
And there are a lot of factors here: whether you're plugged into a 110-volt electrical supply, whether you have a GFCI power outlet. But I think the easy thing to do right now, if you're a consumer and have one of these vehicles, is just not plug in the engine block heater at all
and wait for the fix, which looks like it's gonna be a few months, so this engine block heater may not be helping you out a whole lot this winter. It looks like the remedy owner notification's gonna take place in early February. Not a lot of solutions for you folks who are living in extremely cold environments on this one for the next few months,
but they [00:56:00] are instructing owners not to plug in the block heater, so keep that in mind.
Anthony: I will not plug in my block heater. All right, next up, Ford. Oh, wow, 291,901 vehicles, eh? Yeah, we're getting in our groove now. Ford 2020 to 2022 F-450 SD, F-250 SD, F-350 SD: the 360-degree camera system. A substantial difference in lighting, come on. When one or more of the cameras is exposed to significantly different light than the other cameras, the system's automatic exposure compensation, which attempts to balance the overall scene, may cause the camera image to appear severely underexposed or overexposed. Okay, it's not quite the rear view camera. Okay.
They've got some fancy multi-lens system going on here.
Michael: Look at the description of the safety risk, right? I know.
Anthony: Oh no.
Michael: The customer may not be able to clearly view the area behind the vehicle when using the rear view camera. We're all reversing underexposed. I apologize, Anthony, I didn't mean to do this this week.
I even left out [00:57:00] a very large, 1.3 million I think, vehicle Ford recall on rear view cameras this week, 'cause I didn't wanna trigger you, and here we are.
Anthony: No, it's okay. This is a much more complex-sounding system than the other rear view cameras we've run across. So I'm gonna give them a little bit of latitude here. Mr. Farley, come on the show. Let's talk about it.
Michael: Yeah, and this one's got a rather long remedy period as well. It looks like they've got to create some updated image processing software, and owners are probably not going to be seeing a fix for this until April of next year at the earliest, since they're saying the planned owner notification's gonna be March 31st, 2026.
Anthony: People aren't driving these cars until April anyway, 'cause their engine block heater doesn't work. All right.
Aftermarket Accessories and Safety Concerns
Anthony: Last recall: Ford, 1,048 vehicles. The 2024 to 2025 Ford [00:58:00] Mustang. Supercharger upgrade kits that were sold as aftermarket accessories for 2024 to 2025 model year 5.0-liter Mustang vehicles included an updated powertrain control module calibration that incorrectly disabled Level 2 functional safety features in the PCM.
Okay, this... huh.
Michael: At a loss for words there. That's a lot of gobbledygook. Yeah, and this is basically an aftermarket kit. It's a supercharger upgrade, and it's for your Mustang that you wanna go wild in, but it actually has a powertrain control module calibration that screws up your Level 2 functional safety features.
So I would assume that's your adaptive cruise control, that type of thing. And it says it could result in sudden unintended acceleration or unintended vehicle movement. Big problem. And this is an equipment recall, [00:59:00] folks. So this is for people who've got a 2024 or 2025 Mustang.
You're going into your dealer and saying, hey, I wanna upgrade this bad boy, and you're buying a supercharger upgrade kit. So if you bought that kit and you're using Level 2 features, Ford calls them functional safety features, although there's a big question as to whether Level 2 features have anything to do with safety at this point.
You don't want to use this software. You wanna uninstall it, you want to get rid of it. It looks like they're going to offer a fix at some point in the future, and you'll be hearing about it in the next couple of weeks. I would say just throw away the supercharger upgrade and drive like a normal human.
Conclusion and Farewell
Anthony: No, we don't do that. Anyway, with that, folks, that's another hour and change of your life spent with us in your ears. Till next week, bye.
Fred: Thanks, everybody. Bye bye. See you at the Piggly Wiggly.
For more information, visit www.autosafety.org. [01:00:00]