The Big Beautiful Bill and why are AVs on the road?

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.


[00:00:00]

Introduction and Welcome Back

Anthony: You are listening to There Auto Be A Law, the Center for Auto Safety Podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Hey listeners, wait, what happened? Yeah, we took a week off, but now we’re back. That’s right.

The Big Beautiful Bill: AI and State Regulations

Anthony: We took a week off to have a nice little break and relax, and now we’re gonna talk to you about the Big Beautiful Bill Act. Why would we start this way? My God.

Michael: Yeah, it’s, I’m sorry. I’m sorry about that, but this one’s kind of important.

Anthony: Yeah, this is fascinating, ’cause, so, from techpolicy.press. Also from lawfaremedia.org, everyone’s go-to publication. I’m gonna quote from techpolicy [00:01:00] press: part of this bill is that states would be unable to enforce any non-criminal law in situations where someone used a computer, which constitutes most of modern life and commerce, and driving any car, I assume.

’cause every car made in the last, I don’t know, at least decade, if not more, has a computer, right?

Michael: Yeah, I don’t know that it would apply that broadly, but it is certainly going to apply to almost anything that a manufacturer calls AI, whether it’s AI or not. We’ve talked many times about the phrase AI and how it can apply to almost anything you want to call AI.

So just from that standpoint the bill is problematic in itself.

Anthony: From lawfaremedia.org, quoting: officially the One Big Beautiful Bill Act, it contains a provision, Section 43201(c), imposing a 10-year bar [00:02:00] on state and local governments enforcing any law or regulation concerning artificial intelligence.

The provision does include an exemption for criminal law, and some of the provision’s supporters claim it exempts generally applicable laws. But this claim does not appear to be supported by either the text that passed the House or the earlier versions that passed out of committee. So, we talk a lot about robotaxis and AI vehicles.

And we’ve talked about how California gives exemptions to these things, so they violate a bunch of traffic laws and there are really no consequences at this point. I think California’s slowly putting in some sort of consequence, which is adorable, I think, on the scale of enforcement. So this seems to give every state a blanket pass to do whatever the hell they want with a robotaxi, with an AI autonomous vehicle. Or am I misreading this?

Michael: I think you’re misreading it a bit. Virtually, it says you can’t do anything if you’re a state, and that federal regulations are the [00:03:00] only thing that are going to be able to be enforced over companies that use AI. Typically, to get nerdy, this is called field preemption, where the federal government is saying no states can regulate in this area.

And it’s problematic because, A, there are already states enforcing laws in this area. I think there’s a law in Tennessee, appropriately nicknamed the ELVIS Act or something like that, that protects music and musicians from AI stealing their music and using it. There are laws in states that prevent deepfake pornography that wouldn’t be able to be enforced under this provision.

And frankly, this provision has passed the House of Representatives already, and it’s supposedly headed to the Senate. So it’s problematic from a lot of perspectives. Our focus is automotive, but it’s written so broadly that virtually [00:04:00] anything a corporation calls AI would be exempt from state regulation.

Hugely problematic. The last couple of days suggest that even some of the representatives in the House weren’t aware that this was there. This is one of the few situations where I think we find ourselves agreeing with people like Marjorie Taylor Greene, who didn’t even read the bill and now is saying that if it comes back to the House after the compromise in the Senate, she won’t vote for it because of this provision.

It’s,

Fred: Oh yeah, and we know that Marjorie Taylor Greene has a lot of integrity, and that’s exactly what she will do.

Michael: But here’s the thing, just from a bigger perspective: when field preemption is put into place in a situation like this, it’s usually because the federal government has taken a firm stance on regulation and has put laws into place to protect the public from threats from this type of thing. [00:05:00] Here,

They’re trying to preempt states from acting to rein in rogue AI, but the federal government hasn’t done anything in that area. They haven’t done anything to identify the threats from artificial intelligence and protect the public from them. We know, we talk about it virtually every week on this podcast.

Tesla’s AI and Safety Concerns

Michael: Tesla AI is killing people. There are irresponsible drivers sometimes associated with that, who may be a little AI-infatuated, but it’s killing not only those drivers, but also innocent passengers and pedestrians. States wouldn’t be able to touch that kind of behavior.

If this provision passes as part of the bill, it’s an absurdly broad standard to be setting. It really just allows corporations producing any type of technology to call it AI and basically opt into a free pass from state regulation. Hey, Michael.

Michael. It’s a tech bro’s dream,

Fred: Right? Yeah, but [00:06:00] Michael, as you read this bill, would it eliminate the possibility of state police pulling over an AV that is speeding?

Michael: It doesn’t extend to criminal law. States still have the authority to regulate AI criminally, but they wouldn’t have any ability to regulate it on the civil side of the law.

So that’s where we see a lot of it. What NHTSA’s doing in regards to Tesla, if they want to force Tesla to do a recall, that would be a civil type thing that hasn’t really reached the criminal domain yet. So it’s that kind of behavior: what California’s been doing with autonomous vehicles, what Tennessee’s doing with music, what other states are doing with deepfake pornography. States wouldn’t have the ability here to go after companies using an AI that is publishing disinformation and that type of thing.

[00:07:00] This is such a broad provision, and the way it just applies to AI generally, that’s what concerns me the most here. We’ve seen this from the auto and tech industries for the last 10 years when it comes to autonomous vehicles. They’ve been trying to put this same type of thing into law in regards to autonomous vehicles.

They don’t want states regulating autonomous vehicles for a certain period of time, but at the same time, the federal government still hasn’t written the rules on them. So it creates a legal vacuum, which I’m sure we’ve discussed before, where the autonomous vehicle companies essentially are making the laws: there’s no federal regulation, and states are prohibited from regulating.

And so they virtually get to do whatever they want. This provision would do basically the same thing, but extend it to every system operated by AI that we know of, which is so absurdly broad. This is one of the [00:08:00] bills that just really got my hackles up as soon as I saw it, because it’s just scary to think about what would happen if this comes into law.

Anthony: So let me try to understand a little bit more. For example, there are some towns around the country, like one in New Jersey, that are dry towns, where alcohol is not allowed to be sold in any form. Say a town like that says, hey, we ban autonomous vehicles in our local town.

This law would preempt that. The Feds say, oh, really? So much for states’ rights.

Michael: Yeah, this law is completely anti-federalist.

Anthony: Wow.

Michael: No question about it.

Anthony: They’re such a confusing bunch of people.

Michael: It’s odd, and it’s essentially giving a blank check to AI companies, which I don’t think any of us think are all trustworthy.

I’ll leave it at that. But it gives them a blank check to avoid regulation by states, and then all they need to do is lobby our federal representatives to prevent regulation from happening [00:09:00] to them. It’s ultimately giving our will as a society over to robots that we don’t even know yet, and the CEOs that unfortunately we do know a little more about.

But it’s something that should give everyone pause. And fortunately, I think in the last 24 hours, besides Marjorie Taylor Greene, there have been some Republican senators who have expressed some serious concerns about this. First of all, this shouldn’t even be included in the Big Beautiful Bill.

That’s a budgetary bill, and this is a non-budgetary item. It’s not supposed to be included. That’s a longstanding rule in the Senate. These days, as most of you are probably aware, we can’t really rely on the rule of law, or on the rules that legislative bodies might have, as much as we may have hoped to in the past.

But we’ll be watching to see if it passes. I think that’s the Byrd rule in the Senate that it has to get past. That’s where we stand at the moment. We’re gonna be [00:10:00] watching this one really closely, because it does affect cars, but it could affect everything else even more in many ways.

Anthony: I do see one positive side to this, for Fred and I at least. Fred, for our autonomous vehicle company, we just saved a ton of money on our lobbying expenses, ’cause we don’t have to lobby states anymore.

Fred: Absolutely. And what’s great about this bill is that it illustrates that it’s really unreasonable to expect legislators to actually read the bills they’re voting on, especially when they’re going to touch every person’s life every day in multiple instances.

I think it’s important that we realize that legislators don’t actually read the stuff they vote on. And I guess that’s just the world we’re living in now.

Anthony: Yeah. I’m too busy dialing for dollars to read this stuff. And [00:11:00] I hired my nephew’s former roommate’s brother as my legislative director, and he reads at a third-grade level.

Hey, speaking of people who might read at a third-grade level: Elon Musk. Ah, let’s talk Tesla. Electrek.co, do I say that? Yeah.

Tesla’s Declining Sales in Europe

Anthony: Electrek.co has an interesting article about how Tesla sales continue to crash in Europe. Basically no one’s buying Teslas anymore, except in Norway and Austria, where sales haven’t declined.

Ah, I wonder historically why those two countries might be interested in a guy who does an unfortunate gesture.

Michael: Yeah. Any ideas? Austria, something came to mind with Austria, but Norway also has a pretty extensive EV infrastructure, and I think we’ve discussed before that the penetration of electric vehicles in Norway is number one in the world, I believe.

And their infrastructure is very well set up for electric [00:12:00] vehicles, and maybe Teslas are one of the only vehicles they have access to there; that might explain that. Austria I’m still a little confused about, but the rest of Europe seems to be going the way the United States is with respect to Tesla purchases.

Fred: Dig a little deeper and what you find is that the increase in sales in Norway is year over year. But the base, then, is last year, and last year had a big dip in Tesla sales. So if you looked at it on a multi-year basis, you’d find that the sales are steadily going down, even in Norway.

So the jump in sales in Norway is just a statistical anomaly, based upon their results last year being lower than normal. So there’s really no there there.

Anthony: Alright. I like that interpretation. It’s a little more positive than [00:13:00] mine.

New Jersey Ends Contract with Tesla

Anthony: Continuing with the world of Tesla, the New York Times had an interesting piece today (hey listeners, we’re recording this on Wednesday, June 4th, 2025) about the New Jersey Turnpike.

They had a multi-year contract with Tesla to provide Supercharger stations at their rest stops. They all have really interesting names; a lot of them are named after scientists, which is pretty cool. And the contract’s up, and New Jersey says, yeah, we don’t wanna work with you again. We’re done working with Tesla.

Which is bizarre, because Tesla’s Supercharger network was, by far, leaps and bounds ahead of everybody else in terms of availability and reliability. And New Jersey just says no, we’re gonna go with somebody else, because more and more cars are using the North American Charging Standard and we’re gonna make it easier for everybody else.

And of course, Elon Musk says, that sounds like corruption. And, he would know.

Michael: Look, [00:14:00] Elon saying anything about corruption is laughable. But the North American Charging Standard, doesn’t it actually incorporate the Tesla charger itself? It’s

Anthony: based on the Tesla charging standard, yeah. And is Tesla

Michael: giving that away for free, or is it licensing it? It seems Tesla stands to gain either way, no matter which charger New Jersey picks; maybe Elon’s not making as much if they’re not buying Tesla-branded chargers versus using the North American Charging Standard that he’s licensing out. I’m not sure about that completely, but yeah, it’s all

Anthony: unclear.

Michael: Yeah, I don’t know. Elon called it corruption, which, you know, is funny to me for a number of reasons, given his behavior over the last nine months or so. But it’s also funny because the contract was up and New Jersey chose a different path.

And there’s a lot of people leaning away from Tesla these days. Why would you expect New Jersey of all places to be [00:15:00] any different?

Anthony: I would not at all.

Michael: Although, making an accusation of corruption towards New Jersey. Anthony, what do you think about that? Is that fair? Hey,

Anthony: OHEY. Hey, yoy Ohey.

Michael: Oh, sorry. Maybe I watched too much Sopranos

Anthony: I know. Look, it’s the Garden State. There’s huge sections of it that are beautiful, and then there’s parts of it that Standard Oil set on fire and that are still burning today. Alright, look, we started with Tesla. I’m just gonna continue, I’m gonna plow through all the Tesla bullshit. Nonsense.

Michael: Good luck.

Anthony: I know. But hey, there’s one in Bloomberg. Michael, we wanna get your opinion on this one, this big long piece in Bloomberg about how

Tesla’s Robo Taxis in Austin

Anthony: Robotaxis are coming. What’s the date, June 17th? Is that what he’s looking for? June

Michael: 12th. It’s next week. We’ve got less than a week before.

Oh boy. The robotaxi made by Tesla starts running around the streets of Austin.

Anthony: Yeah. Lock your kids up please.

Michael: Lock a lot of things up.

Anthony: Yeah. I’m gonna quote from this piece in Bloomberg. We would love [00:16:00] to link to it, but it’s paywalled. And here’s my quote, and I’m quoting a quote: Elon Musk has bet the entire company on this philosophy that current Tesla vehicles are capable of being a robotaxi, said someone who’s really attractive, executive director of the Center for Auto Safety, a Washington-based advocacy group. We know that the full self-driving system is camera-based, and sun glare can inhibit camera-based operations. Yeah. Michael Brooks, you cynic, you.

Michael: And it goes way beyond sun glare. This article was specifically looking at a crash in Arizona where someone was hit by a Tesla in glare conditions.

Elon is claiming that Teslas have solved the sun glare problem. I would challenge that. I would say that they haven’t solved any of the issues with the video cameras they’re using on their vehicles. They’re not working at night properly.

They’re obviously not working in reference to flashing lights and emergency vehicles; they’re having [00:17:00] trouble detecting those. Weather, sun glare: there’s all these situations where the eyes behind Full Self-Driving and Autopilot and this robotaxi they’re deploying in Austin simply don’t work. I’m still interested to see if they’re operating in Austin on days when there is bad weather, on days when there’s fog, at nighttime, just to see how much they’re trying to game the system, because I don’t see how they can’t be aware right now that the fatal flaw in this entire enterprise, and in what they’ve been promising forever, is the fact that they’re not willing to use better sensors that can get around the problems that a camera-only system would have.

Virtually everyone is saying the same thing, and yet Tesla still continues to parrot the belief that their system is going to work. I think we’re going to see it fail, hopefully not [00:18:00] dramatically, and hopefully before too many people are injured or killed in Austin. And we’re going to be keeping an extraordinarily close eye on that rollout.

And I believe, as evidenced by the letter they sent to NHTSA a couple of weeks ago, the National Highway Traffic Safety Administration will be doing that as well. And hopefully some of Tesla’s competitors, who are producing technology that seems to be far more advanced and works a lot better for safety.

Waymo, I’m looking at you. Hopefully some of these companies will speak up about this problem, because you’ve got dozens of articles coming out touting this Tesla autonomy and comparing them to Waymo, comparing them to companies that are using a range of technologies to detect objects that might be unsafe in the road ahead and to detect potential accident scenarios. Tesla’s not doing a lot of that. They’re using cameras and putting all their chips on the table, on their [00:19:00] cameras. And we think it’s a terrible move for safety. And beyond the NHTSAs and the Center for Auto Safeties and the professors out there who are studying this, we hope that the industry at some point will step in and say, look, Tesla’s doing a poor job here.

This isn’t going to work. Again, Waymo, we’re looking at you for something. Defend yourself, please.

Anthony: We don’t need Waymo to defend themselves, ’cause we’ve got Tesla’s head of, what is it, their head of self-driving, admitting that they’re lagging a couple years behind Waymo.

Oh, look, if you don’t believe me, believe Tesla.

Fred: I gotta say, we shouldn’t be seeing any of these stories. This entire world of stories about Waymo, Tesla, robotaxis, all that stuff should not be in the press, because there should be a disciplined process for qualifying these vehicles, for validating their safety and for testing them to say, [00:20:00] okay, you’ve passed this safety criteria.

You’ve passed this review process. You’ve passed a disciplined process by competent people who understand how this technology works to validate your safety claims before you can put them on the damn highway. There is no other industrial discipline that I’m aware of that allows lethal technology, lethal moving industrial technology to be put in the public with no safeguards to keep the public away from this dangerous technology.

This is just absolutely freaking crazy. I don’t understand how we’ve ever gotten to this stage. I don’t understand how NHTSA, through several administrations, has said that we just don’t care enough to actually put regulations in place, or even a process whereby companies can self-validate [00:21:00] that their technology is safe enough to use, that it meets the standards of every other piece of lethal industrial technology that the public might be exposed to.

You go past a skyscraper under construction, there’s a fence to keep you the hell out, right? You go past a cement mixer on the highway pouring cement, there’s guards, flashing lights, barriers up to keep people away from the truck. When you go past a robotaxi in San Francisco, there is nothing, not a single molecule of air, that separates you from the lethal kinetic energy that’s contained in these vehicles.

This whole situation is just an absolute disgrace and a failure of the government. I hate to use words that orange idiot would use, but this is truly a disgrace. Sorry, I’m not even ranting here. I’m merely observing.

Michael: Yeah. [00:22:00] And you’re right to observe, Fred, because I believe you were one of the proponents, I think it was seven years ago, when we proposed to NHTSA that they need some type of independent gated certification process to evaluate these vehicles.

And nothing has happened in that time.

Fred: I can’t say nothing has happened, because we’ve got UL 4600, which is an absolutely adequate vehicle for structuring this gated certification process. Hats off to Underwriters Laboratories and Phil Koopman for putting that together. And hats off to the people at NHTSA who’ve turned their backs to it and dropped their drawers and said, no, we don’t have to do this.

It’s not a standard. Make it a standard, or make a process a standard. Protect the people in the public whose lives you are paid to protect. This is just an embarrassment. I’m sorry.

Anthony: You’re absolutely right. And listeners, [00:23:00] if you think Fred’s absolutely wrong, write to us at contact@autosafety.org.

But while you’re doing that, go to autosafety.org and click on Donate. And if you disagree with Fred so much, well, if you donate enough, we’ll remove Fred from this show. His starting price is $10,000.

Michael: Wow. That’s cheap.

Fred: Wait, I’m worth more than that.

Michael: Yeah, I agree, Fred.

Fred: Come on, Elon. Open up that corruption wallet and buy us off.

Come on, man.

Michael: But back to that article. I think it’s Ashok Elluswamy, I’m sorry if I pronounce your name wrong. He was asked about the difference between Tesla and Waymo on self-driving, and he said essentially that Tesla’s approach is much cheaper. Yeah, you’re only using one sensor, and you’re still trying to pretend that the cars you made eight years ago are going to be capable of autonomous driving, which they’re not.

Anthony: I don’t know if they’re actually cheaper, and I don’t think he knows either, because yes, the physical car will be cheaper. What they’re making

Michael: is cheaper

Anthony: than what Waymo is making. But with Tesla in [00:24:00] Austin, from what we know, they’ll have a human supervisor, essentially a kid with a remote control, monitoring them one-to-one when they first roll out.

Fred: Yeah. There’s one fewer 71-year-old grandmother in Arizona this week than there was last week. Who would argue that? Who the fuck cares? Fix the problem.

Michael: And what he said was, the host asked him, did you mean it’s less expensive, but of equal quality?

And he says: equal quality, technically Waymo is already performing, we are maybe lagging behind by a couple of years. Now, to me, that’s like saying I’m only a couple hours behind someone when I’m traveling in a completely different direction, right? You can’t have equal quality to Waymo when you’re not actually sensing all the potential dangers in the path of your vehicle that they are. You can’t have equal quality to Waymo [00:25:00] when you’ve spent the last 10 years building a supervised automation that completely relies on humans.

Whereas Waymo has been doing the opposite. They’ve had a safety driver, but they’ve taken completely different paths toward autonomy. And I don’t think Waymo is perfect by any means, but they are far ahead, more than a couple of years ahead of Tesla. This is just more nonsense spilling out of Tesla.

In fact, it is my gaslight of the week, from Ashok Elluswamy, because Tesla simply cannot perform safely in conditions where cameras can’t detect the threats, whereas Waymo has radar, lidar, other sensors that are perfectly capable of seeing in those conditions. And to even compare the two systems, as the Tesla technical director is doing, is absurd.

And I think I’m beyond words for this entire comparison at this point.

Anthony: Wow. I’m gonna do my gaslight [00:26:00] then. My gaslight is also gonna be Tesla. Let me jump in really quick: also Cathie Wood at ARK Invest, ’cause whatever Elon Musk says, she repeats. You love her. You too love her.

No, she’s my Kyle. Electrek has an article about a Tesla on Full Self-Driving veering off the road and flipping the car. So this guy, oh yeah, he says every day he would use Full Self-Driving. Quoting from this article: I used FSD, Full Self-Driving, every chance I could actually get. I actually watched YouTube videos to tailor my settings and experience.

I was happy: it could drive me to a Waffle House, and I could just sit back and relax while it drove me on my morning commute to work. Two months ago he was driving to work on Tesla Full Self-Driving when his car suddenly swerved off the road. It flipped over. Yeah, because it’s, it’s not, you can watch

Michael: the video.

He’s got it.

Anthony: Yeah. If you wanna watch the video, we have a link to this article. It’s crazy. But this is not even my full gaslight. My gaslight is Tesla rolling out robotaxis, because they’re like, look, we have full self-driving robotaxis on a handful of streets [00:27:00] in Austin that have been detail-mapped.

And their operational design domain is like my bedroom. My bedroom.

Michael: If you look at that crash, do you think that a remote operator is going to be able to take over control and prevent it in the time it took to happen? Absolutely not. There is zero possibility.

Anthony: No. Is the remote operator Big Balls, who works at DOGE? Because I hear his gaming skills are elite.

Michael: It doesn’t matter, because he won’t have the opportunity, given latency, to respond in time. There’s just no way that remote operators are going to be able to make these vehicles safe in Austin, and that’s why we’re concerned about what’s happening next week.

No matter how many remote operators you have, even if you have one remote operator for every vehicle and that’s the requirement going forward, why don’t you just have a driver? Are these people paying a virtual driver to [00:28:00] operate their vehicle somewhat less safely than they could themselves?

None of this makes sense to me. But, welcome to Tesla Autonomy.

Anthony: Hey, Michael, whose gaslight was this? You just stepped right in. Come on, man.

Michael: Sorry about that. I was just trying to support you, man.

Anthony: No, it was okay. It was great. I loved it. I loved it.

Anthony: Fred, you got a gaslight?

Fred: I do. And I’m gonna switch gears a little bit and mix it up.

Anthony: Good.

Zoox and Autonomous Vehicle Safety

Fred: My subject is Zoox, which is a company that is promoting really ugly vehicles in the San Francisco area. It’s owned by Amazon, if I remember right. And what happened a couple weeks ago, the actual date is uncertain because their press release doesn’t give it.

What happened is that a person on a scooter ran into the Zoox robotaxi and was knocked over, and was lying in the street next to the Zoox robotaxi, and the Zoox robotaxi then [00:29:00] continued to move. It said it completed the turn. And in fact, lemme just read the first paragraph from the press release. It starts: On Thursday.

Again, they don’t say which Thursday. We submitted a voluntary software recall notice to NHTSA. We chose to do this voluntarily following our internal review of an incident that took place in San Francisco, California on May 8th. Oh, I guess I stand corrected, they do have the date. Involving a Zoox robotaxi and an e-scooter.

After evaluating our internal data, Zoox opted to proactively submit a voluntary software recall to NHTSA as we continue our transparent and open communication around our safety practices and operations. The updated software has been deployed to all impacted Zoox vehicles operating on public roads. Okay, so they go on to say the e-scooter rider fell to the ground directly next to the vehicle.

The robotaxi then began to [00:30:00] move and stopped after completing the turn. Happily, it did not make further contact with the scooter rider. Zoox operations personnel met the individual quickly, not saying how quickly, on the scene to exchange information and offer medical attention, which was declined. We take an incident involving another road user seriously. Following this incident, we were transparent in sharing both information and video with regulators as we implemented our safety review and reporting process. It goes on and on; I invite people to read it. So why is this my gaslight? Take it apart, and they say the traditional idea of a recall is a bit of a misnomer.

No, it’s not. A recall is a recall, and they’re trying to say that we don’t have to do a recall because it’s only a software update. They go on to tout delivering their voluntary report on a Thursday, but it’s not a voluntary report. They’re required to report this under the Standing General Order.

They’re saying the voluntary report is taking a transparent and responsible approach, but it’s really not voluntary transparency. They never say how long it was before assistance arrived. They don’t talk about the root cause. They don’t talk about how they arrived at the root cause determination. I could go on and on, because I actually understand what you should do when you have a crash.

They didn’t do it. Worse is that the absolute minimum the AV industry should be offering to the public is that lethal and dangerous incidents that have occurred in the past should be understood and should be built into their test and development procedures. They should not be repeated.

This incident is very similar to what put Cruise out of business. Okay? After a crash, you have a person lying in the street next to the vehicle. The vehicle continues to move before any human being has a chance to lay their eyes on the scene or do anything. [00:32:00] Luckily for this bicyclist, or scooter rider, it didn’t run over him and he was able to get out of it alive.

But why is it that the regulators in California allowed this vehicle to get back on the road when it didn’t replicate, didn’t test, didn’t evaluate, didn’t validate its safety against the same incident, the same set of circumstances that grievously injured a person and put Cruise out of business? This is unconscionable on the part of both Zoox’s developers and also the regulators who allowed this to happen.

I don’t think they care. People are dying out there.

Understanding Modern Recalls

Fred: When they say the traditional idea of a recall is a bit of a misnomer: no, it’s not. Michael, what does a recall mean?

Michael: In the traditional [00:33:00] mindset, a recall is you’re pulling all these products back to the factory, right? As defined by NHTSA’s regulations, that could be what happens, but it’s very rare that you see an actual buyback. Recall is a term that, when it comes to cars, applies to a lot of things. You could have a buyback, but typically the car’s only gonna be recalled to the dealership for a fix. Now that we’ve entered an era of connected vehicles and software updates, a lot of recalls can be carried out over the air.

So it’s really just a kind of new definition of what a recall is, and the people that get their hackles raised by calling something a recall just aren’t quite with it. I don’t know. They seem to be objecting to the word more than they are to the fact that this car has a problem that needs to be repaired.

However we repair it, whether it comes back to the factory, the dealership, or over the air, we’re gonna call it a recall. That’s [00:34:00] what these situations have been called going back to the start of NHTSA in the 1960s. So that’s not going away, people, and you can object to it all you want, but at any rate, you’re gonna have to file a safety defect report when there’s a defect in your vehicle.

And when that number’s posted to NHTSA’s website, the media’s gonna call it a recall. Pull up your big boy pants and get used to it. This is just such a tired objection that comes up over and over. The Elon worshipers are the ones who started it, I think, a few years back, because Tesla was on the cutting edge of over-the-air updates. But it’s become just a tired objection in situations where clearly one of the parties has a safety issue, and they’re usually trying to deflect from the issue they’ve had by saying, oh, this shouldn’t be a recall.

So it’s an old tired excuse.

Fred: And even an update to the text in a user’s manual is [00:35:00] a recall. So any modification that goes along with this is a recall. I could dive in a little deeper, but only for just a moment.

The Debate on Autonomous Vehicle Safety

Fred: How in the world can you, overnight, do a software update that affects the operation of a vehicle in the vicinity of pedestrians lying in the street next to your car?

Validate it. Do regression testing, do the validation of the software, test it to make sure that it actually works. How in the world do you do that in a day? And why do the regulators in California allow them to get away with something that is technically completely unacceptable and would immediately get you fired if you were working in any other industry?

Developing technology that exposes the public to danger? It’s a consistent theme today. Okay, so I’ll let it go. I’ve given it to Zoox because they claim transparency when there’s no transparency. [00:36:00] They claim voluntary disclosure when it’s in fact compulsory. They try to redefine themselves away from any responsibility for having been involved in a collision that exposed the public to lethal danger.

By saying this is just what we do. And in fact, they say they’ve evaluated it and this is the expected behavior of the vehicle. If this is the expected behavior, you guys are just putting dangerous vehicles on the street with the expectation that they’re going to knock people off their vehicles and then run over them.

I’m not often speechless, but, yeah.

Michael: What if somebody walks into court with that excuse after they move and run over a pedestrian after a crash? It’s not gonna work on the judge, is it? Oh, this is expected human behavior, to pull [00:37:00] away from the scene of the accident and hit a pedestrian for the second time.

It’s so absurd that no judge could ever fall for it. But when it comes to autonomous vehicles, somehow, I don’t know why, but somehow they think they’re getting a pass in that circumstance.

Fred: Quoting again from their press release, it says, after review of the incident, we found that the robotaxi was operating in a manner consistent with expectations when it was struck by the scooter.

Michael: So that’s blame-the-victim. They made very clear that the scooter struck the Zoox robotaxi, and that probably is what happened, right? I don’t think the Zoox was responsible for the initial collision, but the big problem is moving after any type of collision.

So I guess the question here is, A, they’re not detecting humans close to the vehicle, which is one problem, but B, the other problem is they’re not detecting a human being on an e-scooter crashing into the vehicle in the first place. ’Cause it doesn’t appear that the vehicle was [00:38:00] aware that there was a collision, and that’s maybe why it continued to move.

Anthony: So again, Michael, I gotta ask: if I’m the human and I hit Fred on his scooter, is there criminal liability there?

Michael: Sure. Depending on the circumstances.

Anthony: Okay. And imagine I was cloned 500 times. Would people want me on the street with my clones?

No. No, of course not. Hey, welcome to the autonomous future. So I’m gonna do the scoring now for the whistleblower, sorry, the Gaslight. The correct answer was Tesla this week. But Fred, your passion, you’re clearly the winner. Zoox is, per Fred, the biggest piece of shit in the world. Fred?

Fred: It’s about damn time. I keep losing.

Anthony: You did have the wrong answer, but man, you got it right. That was something. So yeah, if you’re playing at home, listeners: as a human, if you violate traffic laws, you get punished. But if you’re in an autonomous vehicle, eh, we’ll just update the software.[00:39:00]

Oh boy.

Regulatory Changes and Their Impacts

Anthony: Before we jump into the Tao, let’s go into the latest update from Mr. Sean P. Duffy. P. Duffy. Given the current court case related to someone with a similar name, maybe he should drop the P from his name or something. Secretary of Transportation Sean P.

Duffy has announced 52 deregulatory actions across the Federal Highway Administration, the Federal Motor Carrier Safety Administration, and NHTSA. Oh, I’m sure there are maybe some weird things that need to be rescinded, but I’m also sure that these assholes never read anything. They’re just like, we’re cutting things.

Michael. Are they just cutting things for the sake of cutting things?

Michael: Yeah, I mean, I think NHTSA had about 18 of these, somewhere around that, when it came out last week. I think I choked on my waffle for a minute ’cause there were so many coming out at once. But looking into them more deeply,

The vast [00:40:00] majority of these actions were, I wouldn’t call them deregulatory. They’re gonna tout it as deregulatory because they’re essentially cutting out obsolete sections of the law. These are things like in 2007 you had to report X, Y, and Z to the government because there was a new motor vehicle safety standard that applied to you and it no longer applies ’cause it’s no longer 2007.

So we can cut that out of the regulations because no one’s using it anymore, and we can claim that it was deregulatory and that we’re cleaning up big government. It’s a lot of political posturing. But there were a couple of things in there that were actually substantive that were snuck in.

One of them: NHTSA recently announced a desire or intention to regulate novelty motorcycle helmets. This is part of a lot of the counterfeiting that’s going on, where people are buying these helmets on eBay or on [00:41:00] Amazon or in other, I don’t even know the names of all these places.

Temu, is that one of them? People are buying a lot of these novelty motorcycle helmets that don’t meet federal standards, and NHTSA announced that it was going to do some regulation in that area and conduct a rulemaking. They withdrew that rulemaking. So if you’re into buying novelty helmets on the internet and trusting that they’ve been certified, you’re not gonna be protected in the future.

It doesn’t look like NHTSA’s going to continue looking at that area very deeply. But the more concerning one may be that they are going to eliminate FMVSS 204. Now, FMVSS 204 essentially prevents displacement of the steering wheel rearward into the passenger compartment in a crash.

So it’s functionally what required collapsible steering columns, to prevent the steering column from intruding into the passenger [00:42:00] compartment and impaling or otherwise forcefully injuring the driver. And that’s a really important standard, particularly at the time.

However, in the years since, Federal Motor Vehicle Safety Standard 208, which applies to airbags, has come into play. And supposedly, and this is something Honda has filed petitions on, and it seems like it’s fairly well grounded in fact, if you fail FMVSS 208 or even come close to failing FMVSS 208, you’ve already failed FMVSS 204. And so there’s really no reason to have both standards in play, because in the frontal collision tests that apply to airbags, to make sure they’re working properly, if you fail it, you would’ve failed 204.

If you’ve even come close to failing 204, you’re gonna fail 208 anyway. So 208 is in some ways [00:43:00] even more protective in this area than 204 is on its own. Honda’s claim was denied, I think, a decade or more ago, when they tried to get FMVSS 204 taken out of the rules.

Their claim was that essentially FMVSS 208 is protecting us from the same danger to an even greater extent than FMVSS 204 already is. We’re still looking into that to make sure that argument is justified, particularly since collapsible steering columns are a fairly significant milestone in automotive safety history, and we wanna make sure that’s a proper way to go about it. And that could qualify as a “we already have this one test that’s doing the same thing, there’s some duplication here, and we wanna eliminate this excessive regulation, blah, blah, blah” type of thing for the current administration.

And so they’re going to eliminate 204. Like I said, we are still looking into that, and if anybody else has thoughts on [00:44:00] that, please let us know.

Fred: How does Donald Trump monetize getting rid of FMVSS 204? That’s what’s puzzling me.

Michael: I don’t know, is there some kind of cryptocurrency that could be made around that?

Anthony: This is horrible, but the early collapsible steering column tests, I’m not even going into it. It’s horrible. Don’t look it up. Don’t look up how they did the initial collapsible steering column tests in the sixties. Idiots. What were they using, cadavers? No, they were using hogs.

Michael: Oh yeah.

Anthony: Oh God. Horrific. Isn’t

Michael: that a video? There’s a video of it.

Anthony: Don’t look it up. Don’t look it up. Don’t look it up. Instead, let’s go to the Tao of Fred. How’s this? We’re gonna do that as a transition. Let’s go to something light and fun and giggly.

Tesla’s Robotaxis: What We Know

Anthony: Tesla’s robotaxis bullshit

Fred: from squished hogs to me.

Thank you for that segue. I appreciate that. So I’m gonna talk about the Tesla robotaxis emerging next week. We start with [00:45:00] what do we really know? And the answer is not very damn much. I checked the Tesla website. They’ve got nothing listed. So the only sources that seem to be available are news reports.

So, based on the news reports I’ve read, and that many of you have read as well, they’ve not published answers to the NHTSA query, which is asking them a whole lot of questions about how the FSD is gonna work. And by the way, the FSD version that they’re talking about in the vehicle is FSD Unsupervised.

It’s like when people release a new thing, they say new, redesigned, and then they use the same name as they used before: new, redesigned Chevy Impala. It’s a completely different car, but it hangs onto the name. This is probably another instance of that with FSD Unsupervised, but it’s not yet released and there’s no public information [00:46:00] about it.

So who knows? It’s not clear if a human safety driver will be in the taxis. Elon Musk stated that there will be no one in the driver’s seat, but there could well be somebody else in the car who’s got access to a brake pedal or some other means of redirecting the vehicle if there’s an emergency.

So we just don’t know that yet. We of course recommend that there is a human being there who can save the life of the passenger in the back during the inevitable growing pains and learning that’s going to go on. Again, as we stated earlier, we don’t believe that unguided lethal missiles should be presented to the public without protection.

Anthony: Didn’t you used to work on missile systems, Fred? Just checking.

Fred: I did. Got some experience with that. Staffing levels for remote drivers are unknown. They’ve stated that there will be remote drivers, but they haven’t said [00:47:00] how many, or what the ratio of remote drivers to vehicles will be, nor how quickly and in what situations the remote drivers will be able to respond with any level of safety.

So that’s a complete void. We know that latency, just to refresh your memory, is the time between a command to a digital system and the response that digital system makes. Basically, all digital systems have a finite latency associated with their response. We don’t know what their latency design requirements are.
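Fred’s definition can be made concrete with a tiny sketch (not from the show; the toy `send_command` helper and the 10 ms delay are invented purely for illustration):

```python
import time

def send_command(system, command):
    # Hypothetical stand-in for dispatching a command to a digital system.
    return system(command)

def measure_latency(system, command):
    """Return the elapsed seconds between issuing a command and
    receiving the system's response -- Fred's definition of latency."""
    start = time.perf_counter()
    send_command(system, command)
    return time.perf_counter() - start

# A toy "system" that takes roughly 10 ms to respond.
slow_system = lambda cmd: time.sleep(0.010) or cmd

latency = measure_latency(slow_system, "brake")
assert latency >= 0.010  # every digital system has some finite latency
```

For a remote driver, that round trip would also include network transit in both directions, which is why staffing levels and latency requirements matter together.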

We do know that FSD occasionally pops in hallucinations, like every other AI system. In fact, we talked about one earlier, I can’t remember where it was, Texas, where a car using FSD went off the road, ended up going between two trees on the left-hand side of the road, and flipped over. Happily, the [00:48:00] owner, or the driver, wasn’t killed.

So they were able to report what had happened. The same thing is going to happen in any other vehicle that has an AI implementation, sooner or later, because that’s inherent in AI. AI deals in basically bullshit in the sense of Harry Frankfurt, and again, his book On Bullshit is recommended reading for everybody.

The software stability requirements of the vehicle are unknown: what the margins are, how close they are to the edge of their capabilities in any kind of situation. Let’s see. We don’t know what terms Tesla will impose on its customers, whether it’s going to force them to sign a waiver or submit to forced arbitration, whether it will sell its customers’ personal information.

’Cause every customer’s gotta present some personal information, credit card, validation, all that sort of stuff. [00:49:00] What’s Tesla going to do with that information? We don’t know. If there’s a collision and somebody’s injured, is Tesla going to do what it has done in the past, which is force people into product liability litigation, or is it going to accept some kind of responsibility? I’m blanking on the term, Michael. Duty of care.

We simply don’t know. And these are important questions that have not even been discussed, much less answered. What else have we got? Computer-driven Teslas have caused documented fatalities and dangerous interactions. In many cases, Teslas approaching stationary or slowly moving tractor-trailer combinations crossing highways have killed drivers, some who were sober and some who were drunk.

People whose only fault was being in the path of a computer-driven Tesla have died, including roadside firefighters and police, a couple who were returning from a date, a 71-year-old grandmother standing next to the road [00:50:00] in Arizona as the sun was going down, so she could be safe.

Ironically, the glare from the highway caused her to stop, and then the Tesla ran into her. And again, as I stated earlier, in my opinion the absolute minimum that the AV industry owes the public is verification that known critical events that occurred while consumers were operating or were passengers in the computer-driven vehicles have been identified and mitigated, and the mitigations thoroughly tested, before an AV or AV transportation services are made available to the public.

Tesla has not done this bare minimum safety design validation. There doesn’t seem to be any requirement, and Tesla is skirting through the requirements, doing the minimum possible.

In my mind, it’s kinda like the difference between organic vegetables and conventional produce. With conventional produce, you go right up to the limit because you wanna maximize your productivity, put [00:51:00] on whatever skirts the limits offered by the FDA, fewer every day, by the way.

Whereas with organic food, they do the best they possibly can. And we need the organic food approach to AVs. We don’t need the pesticide approach. So anyway, concluding: we don’t think FSD in any of its releases is safe, or that Tesla has any intention to make it safe, either for the physical safety of passengers, other motorists, or vulnerable road users. Neither do we think it adequately protects its customers’ personal information or provides adequate access to the courts for recovery of personal or property losses.

Tesla seems to be doing the bare minimum to skate by and let the chips fall where they may, mostly on you, the public. Good luck in Austin.

Anthony: Yeah, good luck. Good luck in Austin.

Recent Recall Announcements

Anthony: Yeah, and now it’s time for some recalls. [00:52:00] Ah, first off, NHTSA has changed the display of how they do recalls in their PDFs.

All right.

Michael: Don’t get me started on what’s going on with NHTSA and recalls.

Anthony: Okay, nevermind. First,

Michael: There has been a huge problem the last two or three weeks with NHTSA’s recall reporting and that whole system. I’m not quite sure what’s going on. The Part 573 safety recall reports are now blue and a little prettier.

But there’s been a lot of delays and issues with proper posting of documents. Even for one of the recalls we’re gonna talk about today, we don’t have the Part 573 recall report yet, so I can’t tell you owners when Volkswagen’s gonna get you that notification about a repair.

Anthony: Wow. It’s Michael Brooks putting in a recall to the recall system.

There you go. We’re gonna start off with Zoox. We’ve previously talked about this: 270 vehicles. Does anyone know how many vehicles they have on the road? My guess is 270 vehicles. 270, there we go. At speeds of less than 0.5 meters a second. Wait, are they actually using meters per second, or is that miles per second?

That’s gotta be, it should be [00:53:00] meters, yeah. Wow, the US using meters. What kind of world are we living in?

Michael: I just wanna go on record as saying I’m an advocate that we go ahead and switch to the metric system.
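For anyone who thinks in miles per hour, that 0.5 meters-per-second threshold converts like this (a trivial sketch added for illustration; the conversion is standard arithmetic, not part of the recall notice):

```python
def mps_to_mph(mps: float) -> float:
    """Convert meters per second to miles per hour.
    1 mile = 1609.344 meters; 1 hour = 3600 seconds."""
    return mps * 3600.0 / 1609.344

# Zoox's reported threshold: below 0.5 m/s the prone-pedestrian
# detection gap applies.
print(round(mps_to_mph(0.5), 2))  # prints 1.12 -- about a slow walking pace
```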

Anthony: Hey, I agree. At speeds less than half a meter per second, the Zoox vehicle may not detect the presence of a prone VRU, for those not playing the home game.

VRU stands for vulnerable road user. For those who need more definition of that, it means anybody on the road not in a car. Okay, if you’re not in a car, if you’re on your scooter, your bicycle, you’re walking, you’re on your skateboard, you’re on a unicycle, you’re a vulnerable road user.

It may not detect the presence of a prone person at certain locations immediately adjacent to the vehicle. Basically, that’s them saying Zoox has blind spots: we can’t see. Aren’t computers great? Aren’t computers supposed to eliminate these problems and do all these things that humans can’t do? Nah. Why?

Maybe ’cause they’re built by humans. That’s that,

Michael: Yeah. And all those were supposedly already [00:54:00] fixed with an OTA update. We’ll of course keep watching to make sure that autonomous vehicles can actually see humans.

Anthony: They’re better than humans. They’re computer humans. God duh. Have you ever gotten your credit report?

It’s done by computers. Yes. Perfect. Also, for the people at the credit agencies: my last name is not Pierce, I was not born in 1985, and I don’t work in a casino. Anyway. Next up, Ford Motor Company. What’s surprising? Over a million vehicles: the 2022 to 2024 Lincoln Navigator, the 2021 to 2023 Lincoln MKX, the Lincoln Corsair, the Ford Mustang.

Oh, that’s a weird, it’s

Michael: virtually every car Ford’s made the past few years. Oh, yeah.

Anthony: Oh my God. And it keeps going and going. Oh my, this is just pages and pages.

The Rearview Camera Issue

Anthony: Oh, son of a, of course: the center infotainment screen may freeze, followed by a black screen and a system reboot.

If this occurs during a backing event, the rearview [00:55:00] image may be frozen, missing, or delayed. I really want to talk to an engineer at one of these auto companies and figure out why the rearview camera is so difficult. I don’t get it. Why can none of these companies do it?

Michael: It’s crazy to me. And in this case, it falls back to one of the problems we see over and over again. They’re integrating a safety rearview camera in with the Ford SYNC system, which also controls navigation and entertainment and other things. And when they have problems within the SYNC system, it can affect the safety performance of the rearview camera.

And I have continually asked companies to somehow firewall off the safety features of the rearview cameras. I know they have to use the same screen in a lot of cases. But can you give the rearview safety camera precedence over everything else?

Or somehow [00:56:00] make sure that your continual updates to infotainment aren’t screwing up the rearview camera system? I think there need to be maybe some changes to Federal Motor Vehicle Safety Standard 218, I think it is, maybe 211. I’m a little off this morning.

But the one that applies to rearview systems, to make sure that the safety rearview camera is given priority over infotainment or navigation or other things.

Fred: We’ve talked about software stress testing in the past. This is a great example of a failure of software stress testing, ’cause they should be testing this system at its maximum capabilities.

Not at some nominal level. If they had tested it at full capacity in all of its respects, this wouldn’t be happening, because they would’ve detected this systematic defect in the software. So software stress testing is very important, and this is what happens when it’s [00:57:00] not done. And by the way, we haven’t mentioned Piggly Wiggly, so ah, dammit.
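Fred’s distinction between nominal-load testing and stress testing can be sketched like this (a hypothetical example; the `render_rearview` function and its queue limit are invented for illustration, not Ford’s actual design):

```python
def render_rearview(frames_queued, max_queue=60):
    """Hypothetical rearview pipeline: it 'freezes' if the frame queue
    overflows. Invented purely to illustrate Fred's point."""
    if frames_queued > max_queue:
        raise RuntimeError("screen freeze: frame queue overflow")
    return "image displayed"

def nominal_test():
    # Testing only a typical load never exercises the failure path.
    return render_rearview(frames_queued=30)

def stress_test():
    # Stress testing drives the system to and past its stated limits,
    # which is where a latent freeze defect would actually show up.
    results = []
    for load in (0, 59, 60, 61, 120):
        try:
            results.append((load, render_rearview(load)))
        except RuntimeError as exc:
            results.append((load, str(exc)))
    return results

assert nominal_test() == "image displayed"           # nominal test passes...
assert any("freeze" in r for _, r in stress_test())  # ...stress test finds the bug
```

The nominal test happily passes; only the run at and beyond full capacity exposes the defect, which is Fred’s argument for testing at maximum capability rather than some nominal level.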

Anthony: If you work at one of these auto companies or one of the suppliers who supply these systems, you can contact us anonymously. Just send an email to contact@autosafety.org. We’ll keep your information private. We’re not trying to get you to blow the whistle or anything like that. I really just wanna know, from an engineering standpoint, why is this such a hard problem?

Because if you can’t get this right, yeah, autonomous vehicles, they’re coming. Anyway, next up. As Michael said, there’s no recall notice yet, but guess what it is? Volkswagen, distorted rearview camera image.

Michael: Blah. Yeah, we can’t tell you a lot about this one. I can tell you, because I found it in here, that owner notifications are expected to be mailed July 18th.

But beyond that, we know that it is a software error that distorts the image that the rearview camera displays. So you may get an image, it’s just not gonna be accurate. [00:58:00] So that violates FMVSS 111, which is the FMVSS I couldn’t accurately name a couple of minutes ago.

Anthony: Is it 111 or 211?

Michael: It’s 111. At least it is in this one. Okay, so I was way off.

Anthony: Yeah. I’ve never seen you be that off.

Michael: Yeah. Not enough coffee yet. I get you.

Anthony: Next up, Chrysler, 235,640 vehicles. Oh dear. The 2022 to 2025 Ram ProMaster. Ah, guess what it is? Rearview camera: a rearview image that does not display.

A driver will notice that the rearview image is not displayed when the vehicle is placed in reverse. Moving on. All right.

Michael: Yeah. This is the same problem: you place it in reverse, it’s not showing up. Yeah, let’s move on to the next.

Owners can expect this one to be implemented in mid-June.

Anthony: Oh, okay. Here’s the one. Hey, it’s not a rearview [00:59:00] camera. Thank God. Final recall: Nissan, 79,755 vehicles. It turns out you might have the former CEO hidden in your trunk. Now I’m kidding. The 2025 Nissan Kicks, the 2025 Nissan Frontier.

You like that, little Carlos? No?

Michael: I like the fact that you’re walking into a trap here.

Anthony: Yeah, oh no, I am. I thought it was, it’s a spark ignition fuel thing. Yeah. Come on, God damnit. If the rearview image is not available. Ah, rearview image again.

Michael: Yeah, this one’s a software logic error, a communication error with the in-vehicle infotainment module, right?

So, more problems involving infotainment and safety systems being used inside the same system. There’s not much more to say on that, since I’ve already said it. But basically they’re gonna update your software on this one, and it’s gonna happen probably in July, around July 4th. You’ll see your notification if you own one of these [01:00:00] vehicles.

Anthony: The only lesson I can learn from all these rearview cameras is to buy a Ferrari, ’cause they haven’t had a rearview camera image problem yet. Yeah, I think Ferrari is like the only manufacturer that hasn’t had a recall around the rearview camera. Everybody else, I would…

Michael: I would say, rather than buy a Ferrari, learn how to use your mirrors

Anthony: and turn around, but still be very,

Michael: be careful.

Anthony: The rearview camera with the guidelines, that’s really helpful when you live in a city and you have to back into a really tight spot. It’s very helpful.

Michael: Yeah, I love the rearview cameras. I love them, the forward view, I love these 360-degree views when they work. All of those are awesome, particularly for parents with children running around, people with pets.

There’s just an endless number of advantages to having cameras on the outside of your car, seeing places that you can’t see.

Anthony: Yeah.

Conclusion and Sign-Off

Anthony: If you drive a Tesla, you can see people from all different directions flipping you off. Hey, with that, listeners, thanks for joining us for another hour and [01:01:00] couple of minutes of your time.

And we’ll be back next week.

Michael: Bye-bye. Thanks everybody.

Fred: Bye-bye.

For more information, visit www.autosafety.org.