Pin the tail on the Bullshit

Subscribe using your favorite podcast service:

Transcript

note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.

Introduction and Hosts

Anthony: You are listening to There Auto Be A Law. The Center for Auto Safety Podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.

Hey, listeners, today is Tuesday, December 16th. I hope it’s nice and toasty where you are. ’cause I’m freezing and based off of Fred’s outfits, he is living in a polar ice cave.

Fred: We’re just waiting for the glaciers here.

Anthony: Okay, let’s start off with something moving at a glacial space. No, pace. No, it’s moving faster than that.

Discussion on NHTSA’s Research on Automated Vehicles

Anthony: NHTSA has released a multi-year research project on modernizing safety standards for automated vehicles. What does that mean? I don’t know. But part of it, and I’m gonna quote, is from NHTSA administrator Jonathan Morrison, formerly of Apple. [00:01:00] The future of automated vehicles is right around the corner.

That’s as far as I can get with the quote. ’cause what does that mean? How big is this corner?

Michael: That’s meaningless because people have been saying that forever, right?

Anthony: Yeah, right around the corner. I heard flying cars were right around the corner too. I remember hearing that in the eighties and the nineties.

Fred: So well, it’s the technology of the future and it likely always will be.

Anthony: Exactly. But this, I

Michael: I mean, this is a project that NHTSA has been conducting since, I mean, they started it in probably around 2015 or so. They had their Volpe Center for Research take a look at the Federal Motor Vehicle Safety Standards and say, okay.

We designed all these safety standards with the assumption that there’s a human in the driver’s seat. Now we’re seeing, at that time Google Waymo was already fairly heavily involved in building an autonomous vehicle, and Tesla had started parroting its [00:02:00] nonsense about autonomy.

And so they were taking a look at the Federal Motor Vehicle Safety Standards to say, obviously they need to be modernized. I don’t think that anyone disagrees with that, no matter what side of the discussion you’re on. But this was essentially, they contracted, I think in 2017 or so, maybe late 2016, with the Virginia Tech Transportation Institute to do a study of the Federal Motor Vehicle Safety Standards to see what needed to be changed to allow for autonomous vehicles to be safely integrated into the transportation ecosystem. Fred, I believe you participated in that effort, did have a little bit to do with that.

Fred: Yeah. Only a little.

Michael: Yeah. And so they did it in four parts. They basically broke the FMVSS up into certain categories that they thought they could tackle at once because they bore some similarities.

And so this is basically their release of volume four of that research, and NHTSA, in conjunction with this, [00:03:00] released a request for comment to get public input on it. Frankly, the request for comment seemed to be directed less at safety groups and individuals and more at the industry, asking for industry changes, product plans, new concepts or emerging issues, none of which the industry is telling any of the public about.

Most of that stuff is kept as trade secrets, and who knows how much they’re gonna release to NHTSA as part of this request. But I’m a little, I guess I’m getting cynical again about NHTSA’s approach here.

Fred: Let me just expand that a little bit. Not the cynicism, the history. At the time, the urban myth was being developed that

if only we can get rid of humans, we can reduce the number of collisions in cars by 94%, because the computers are gonna be perfect and the world will be wonderful. So at the time, NHTSA bought that, the Department of Transportation [00:04:00] bought that approach. And so implicitly, anything you could do to promote the proliferation of autonomous vehicles would enhance highway safety.

So the only thing they looked at is getting rid of regulations, rather than using any technology, or a more holistic approach that would use that technology to enhance highway safety. Because of the prejudices and because of the urban myths, that was all they really looked at.

And Michael, you’re a philosophy major, so isn’t that a good example of deductive reasoning?

Michael: Yeah. Yeah. I’ve always wondered if there’s not a better path here.

Challenges and Considerations for Autonomous Vehicle Safety

Michael: If autonomous vehicles are so different, do we need a separate set of safety standards that apply to autonomous vehicles? Like a lot of the crashworthiness and a lot of this stuff they’re talking about in this release.

There probably are tweaks that need to be made here and there to ensure that [00:05:00] computer-driven vehicles can be tested and certified to the FMVSS. But there are also specific issues with autonomous vehicles, related to perception and path planning and all the things we talk about safety-wise that are only related to autonomous vehicles.

That makes me think we need a kind of a subset, or a new set of safety standards, that would apply to those vehicles. And specifically, some things that come to mind are that manufacturers keep proposing all of these weird seating arrangements, with side-facing seats or seats that revolve, and things that have never really been put into your average car.

Never really been truly vetted for safety, I think. There are a lot of challenges there protecting people in seats that face different ways and are movable. And beyond that, there’s also a lot of challenges in making sure that these systems can detect problems when they’re [00:06:00] encountering them.

These computers need to be self-aware, or introspective, in a way that allows them to discover their own faults before they commit safety violations or before they crash. So there’s a lot of things involved here that don’t ever come up under the traditional motor vehicle safety standards, that don’t necessarily involve a lot of the mechanical engineering aspects that the older standards do.

There’s certainly something that needs to be done here. I guess we question just, does it need to happen immediately? Does it need to happen? This is presented as a multi-year continued effort, and we haven’t seen any proposals as far as actual changes to the Federal Motor Vehicle Safety Standards

at this point. We did see a proposal, I think at the end of the previous Trump administration, that attempted to address some of these things, but that one was wiped out by the Biden administration. And so at this [00:07:00] point there effectively aren’t any planned changes that are going into effect in this area.

And so it continues to be an almost decade-long research project with no real end in sight.

Fred: And the problem underlying this is that people anthropomorphize the computers and say they’re just like people, only better. But any rational approach to regulating AVs has to recognize the nuances and the new issues and the new potential faults associated with computer control of these lethal machines.

Fundamentally, they are industrial machinery, which is capable of killing anybody in their immediate environment. So people need to look at it that way, rather than saying, this is the Jetsons, and the sooner the better. I’ve noticed, and this is the subject of my gaslight this week, that there’s a shading of the urban myth towards a new urban myth.

Mythology is very strong, very [00:08:00] powerful, and people buy into it quite readily. But more on that when we get to the gaslight.

Anthony: So I like what you said there, Michael. These artificially intelligent cars, they need to have, I guess, artificial introspection in order to understand the environment.

Michael: Yeah, that’s something I’m really concerned about. Not just in autonomy, but also in automatic emergency braking sensors and things like that. There are points at which either the software’s not working properly, or the conditions don’t fit what the vehicle is designed for.

There are just so many of these situations that the vehicle can encounter where it needs to be able to figure out that it’s got a problem, which is, frankly, incredibly difficult for humans to do in the first place. I’ve got to imagine it’s certainly not easy to program that into the computer systems they’re using.

Anthony: That’s computer-driven cars, but unfortunately people still drive cars. And what do people like to do when they drive cars? Drive really [00:09:00] fast.

Speed Limiting Devices Proposal in Arizona

Anthony: If only there were some way we could deal with speeding drivers. The great state of Arizona, a representative there, is proposing something. That’s right: Arizona to consider speed limiting devices to improve road safety. Clutches pearls. Quoting from an article from the Arizona Capitol Times: the representative, I apologize for totally garbling that name, a Republican from Prescott Valley, said on October 27th that he’s planning on introducing a bill next session that would allow courts to order a speed inhibitor device to be installed in the vehicles of motorists who frequently or excessively violate speeding laws.

I love this. I think this is a simple solution. Like, you keep speeding? Here’s a little device that prevents your car from speeding. What am I missing here, besides freedom burgers and all that nonsense?

Michael: I don’t think you’re missing much here. We are completely in support of these types of laws.

I know that right now, I think [00:10:00] Washington State, Virginia and DC all have something similar. Arizona’s is different, I believe, than the one we’ve looked at in Virginia, in that in Virginia the judge has the authority to order that this technology be installed in the vehicle. In Arizona, it’s an option for the driver, who is trying to get a lesser sentence. Essentially, in this case you might have the potential to have your license suspended for up to a year or so, and in lieu of having that suspension take place, you would agree to have technology installed on your vehicle that prevents you from speeding.

That could be a successful approach as well. I think relying on judges to order this in every case, versus relying on people to use their own self-interest to decide, hey, I’ve still gotta drive and the only way I’m gonna be able to do that is to agree to have this [00:11:00] technology placed in my vehicle.

I think that could work as well. But I think we’re not really sure how this is going to be put into place yet. Is this going to apply to everyone who gets a reckless or a bad speeding ticket? Say, if you’re going 15 miles per hour or more over the speed limit, are you automatically facing a situation where you have to do this?

Is it only going to address people who have gotten 10 tickets like that? So we’re not really sure what the thresholds are at this point. And that’s important because, frankly, we want similar technology to go into virtually every car in America. Because we think it’ll prevent some of the, I guess there are at least 10,000, maybe around 20,000 deaths every year

related to speeding. And we think this technology, we don’t think humans are really ever going to quite be perfect. We’re not all going to change and become non-speeders. So really the only way to do this, outside of, you know, [00:12:00] increasing speeding enforcement to insane levels, would be to use this type of technology.

And you also have the option of doing more speed detection and cameras and that sort of thing. But the way those systems are implemented varies greatly from state to state, and there’s a lot of pushback against those systems. At least here you are addressing the worst of the worst and getting some of them off the road, because they’re probably going to be responsible for a significant percentage of the crashes that occur due to speeding every year.

Anthony: That kind of ties right into what I’m gonna do for my gaslight. You’ve inspired me. I had a just tough decision to make, which way to go. And you’ve pushed me towards my gaslight. Now my gaslight is the American public. That’s right. So this is related to Vision Zero.

Vision Zero and Public Resistance

Anthony: Now, Vision Zero is a great idea to, hey, we can put in all of these, basically, road safety techniques and whatnot to make roads safer for drivers, for pedestrians, [00:13:00] and everything.

And I believe this is from an article, unfortunately, that we can’t share ’cause it’s paywalled, but from the Washington Post, I believe. I’ll just quote from it: eight years ago, as part of a national initiative to stem traffic deaths called Vision Zero, the city of Los Angeles shrank the number of lanes on the road Vista Del Mar and several connecting streets in the shoreside community just south of Venice.

But they restored it to four lanes after an uproar by drivers, among them Octavia something, who railed against a city official in a 2017 Facebook post, you should take that seriously, stating he was stuck on one of those intersecting roads in the traffic hell you created. So you will not find any US citizen anywhere in the world who’ll be like, I don’t want streets to be safer.

No, we’re all like, safer streets, this is great. But don’t you dare do it to me. This is like, not on my road. So I’ve created a new thing called Nomism. It’s related tangentially to NIMBYism. But the American people, [00:14:00] you’re gaslighting us and yourselves. If you want safety, you gotta stop being a dick.

Michael: Yeah. Look, that’s a big part of the issue, right? We’ve heard people who are against having alcohol detection devices made standard in vehicles. Bernie Moreno comes to mind. We have people who are against intelligent speed assistance. They feel like it’s the government telling them how they should drive. And we have people who are against improvements that save the lives of pedestrians in the midst of the biggest pedestrian death crisis that we’ve probably ever seen in this country.

So people will say they like things safe, but exactly, when it comes to them and causes them a 12-second delay, it’s a problem.

Anthony: The ironic thing is the person who complained on Facebook, not much later they [00:15:00] got into a car accident at a crosswalk, where they said, hey, we need four lanes back.

So that’s my gaslight. Fred, you were already prepping your gaslight and you’ve muted yourself, but we enjoy looking at your hat.

Fred: I have not muted myself, Anthony.

Anthony: Oh, you have. You were just lip syncing without words.

Fred: I just pushed the wrong damn button. It’s different.

Anthony: Is that your gaslight? Is that mute button that’s gaslighting us?

Fred: Yeah, I’ve gotten good at it.

Gaslight of the Week: Autonomous Vehicle Industry

Fred: So there’s an organization called the Autonomous Vehicle Industry Association, which has declared that everything is great. And so what I’ve noticed is that people are moving on from the urban myth of humans are the problem, and if we just put in computers, everything will be great, to saying, now we’ve done it, we’ve put in computers, and everything is great. And this is all based upon studies being done by Waymo, [00:16:00] oddly enough, that show that, based upon insurance claims, everything is pretty ducky with the autonomous vehicles.

So what’s the problem with this? The problem is that if you base things on insurance claims and the number of at-fault incidents, what you’re really doing is removing the analysis from the facts of the injuries to the legal liability of the people who are injured. If you have a hurricane sweeping through somewhere and there’s a lot of damage, you would probably say, what was the maximum wind speed?

And what are the conditions that led to all this damage? And then you would go in and you’d look at the building code and maybe make some modifications to say, the wind speed exceeded the previously predicted wind speed by, make up a number, 20 miles per hour. So we need to make things stronger.

That would only make sense. You could alternatively say, let’s look at the number of claims made [00:17:00] by people who are insured by Allstate. And we’ll look at that and we’ll say, if there’s a lot of claims, then we’ll go ahead and try to make things stronger. But how would you know how much stronger to make it?

How would you know? You’ve moved things from the engineering domain into the legal domain. This makes no sense from the perspective of collecting data to assess the hazard associated with something. So this is what they’ve done, basically, for all these studies showing dramatic improvements in safety.

What would be the easiest way to skew the data based upon insurance claims to show that the system is really safe? The easiest way to do that would be to just pay the claims, or pay for the liability, without making an insurance claim. So that would mean that all the claims would go away. The number of claims adjudicated through the legal system would go way down, because you, the [00:18:00] company, have paid them off, and everything will look pretty good.

Now, these studies also neglect all of the collisions reported in the Standing General Order. So they start with rejecting all of the known data associated with the collisions. Then they look at insurance claims, which are certainly several steps away from the actual damage. Then they go ahead and recycle that over and over again.

So the AVIA used this Waymo study to say that everything is ducky, and because this study says that, everything is ducky for all the other AV manufacturers too. So we’re ready to go and make the roads safer. So it’s a cascade of bullshit. And in this game of pin the tail on the bullshit, we are all lining up to say that this is all we need to do. [00:19:00]

So I’m giving this award or my nomination to our friends at the Autonomous Vehicle Industry Association.

Anthony: I like it. That’s it. If you’d like to hire Fred Perkins for your children’s party, he’ll definitely bring his fan favorite game, Pin the Tail on the Bullshit. It’s messy, some of the kids might get Hep C, but hey, it’s a good lesson for all kids like that.

Yeah. Great. Michael, what’s your Gaslight of the week?

Gaslight of the Week: Trump’s AI Preemption Order

Michael: My Gaslight of the Week was brought to us by President Trump this week. Aw, boo. You’ve lost already. Congress has thankfully, so far, refused to pass a broad preemption order, basically, that would direct states to stand down in regulating AI.

Anthony: And Trump never passes any broads. Hey.

Michael: Ah, come on. That’s good. And that’s been sketchy. The autonomous vehicle [00:20:00] industry for a decade now has been trying to preempt states from regulating the autonomous vehicle industry under a basically similar notion, that all 50 states are passing all these different laws and the industry just, it’s impossible for us to keep up with all that and to produce autonomous vehicles in every state.

They refer to this patchwork of state laws all the time, but most of those state laws are very directed at safety, and at preventing vehicles from doing things on the roads that are going to injure the residents of that state. California is probably most notable here, in that they collect data from autonomous vehicle companies.

And any day now we’re going to hear about the California DMV’s decision on Tesla. And Tesla stands to lose a lot of things if it loses that case, and probably deservedly. But the Trump AI order is basically, essentially, preemption works when Congress passes a law that, [00:21:00] under the Supremacy Clause of the Constitution, prevents states from doing something different, or from having a separate standard, or from having a heightened standard in some circumstances.

But that’s a process that only takes place when Congress passes a law. The president can’t wave his hands and eliminate states’ ability to regulate these things. And that’s exactly what they’re trying to do here. Given the failures to get this type of legislation through the House and Senate, President Trump has said, we’re going to, well, they’re casting this in terms of state laws that obstruct our national artificial intelligence policy.

Most of the laws that have been created so far have been done in response to things like deepfake technology, areas where AI is used in very nefarious ways, essentially actions that are already illegal for humans to take [00:22:00] part in. And so there’s a lot of nonsense in the executive order.

My gaslight is focusing on one very specific part that kind of turns the Federal Trade Commission Act on its head. The Federal Trade Commission has the authority to go after deceptive trade practices by companies. And we’ve written the Federal Trade Commission many times about Tesla, to try to get them to tell Tesla, hey, you can’t use the terms full self-driving or autopilot to describe a system that requires humans to take over at any time.

That’s not full self-driving, that’s not autopilot, and it’s going to get people killed. They didn’t take that up, unfortunately. But in this circumstance, President Trump is directing the FTC to go after states that pass laws requiring alterations to, now listen to this, this is a quote, the truthful outputs of AI [00:23:00] models.

Fred, what do you think about the truthful outputs of AI models?

Fred: That’s another version of pin the tail on the bullshit, because it’s been proven, and inquiring minds have recognized, that all AI results are by definition bullshit, using the definition of Harry Frankfurt, a learned philosophy professor at Princeton University.

They’re indifferent to the truth. They just spit out whatever seems to rise to the top of a probabilistic algorithm, without regard to whether or not it has in fact any factual basis. Yeah. I also wanna point out that, since we always want to help the president in every possible way we can, we’ve helpfully put together a list of questions that regulators should be asking of any AV company that’s looking for permission to use its roads, and this is available on our website. Just [00:24:00] go to the website and look for the checklist. It’s very helpful. I’m sure that will be incorporated in total by NHTSA as soon as they get around to it. They’re very busy.

Michael: And back to the Gaslight.

So essentially what they’re setting up is a lawsuit machine. They’re gonna use the Department of Justice to go after states, and use Section 5 of the FTC Act, which in my understanding only applies to deceptive practices by corporations, not by states. But this is 2025, and the federal government is engaged in turning the law on its head in a number of areas.

And here’s another example of that, where they’re directing the FTC to make sure that states aren’t being deceptive by questioning the truthful outputs of AI models. So trust the corporations go after the states who are trying to protect their citizens. That is a perfect [00:25:00] gaslight of the week.

Anthony: That’s pretty good.

And now I really like how you two teamed up on that one. That was really good. But hey, no surprise, I won this week, okay? For creating the acronym Nomism. Okay, but hey, look. If you wanna feel like a real winner too, and you don’t trust the corporations, go to autosafety.org and click on donate.

Tell your friends, Hey, I don’t need any gifts for this holiday season. Instead, make a donation to the Center for Auto Safety, and they’ll be like, I think you’ve been pinned the tail on that bullshit. No, not at all. Speaking of pinning the tail on the bullshit,

Fred: Wait a minute, wait a minute, before you do that, let me jump in. What has any of this got to do with Piggly Wiggly?

Anthony: Ah, well played. Second place for gaslight. You’re a distant runner-up.

Michael: I have an argument that Piggly Wiggly is deceptive, in the fact that if you go in there, none of the pigs are wiggling. That’s all I’ve got.

Fred: That’s a [00:26:00] good one, Michael. I like that. You’re a very perceptive guy.

Anthony: But nope, Michael’s in second place now. Again, sorry. I’m not even sure that such a place exists.

Tesla’s Controversial AI Chatbot

Anthony: So, Tesla. Speaking of AI and nonsense, part of little Elon’s empire is their little AI chatbot called Grok. And Grok likes to spew things like Hitler was pretty cool, and all other sorts of ridiculous things.

And now they’ve incorporated Grok into your Tesla. That’s right. ’Cause you’re driving along and you’re like, why do I hate the Jews? Grok, remind me. And the car tells you, reminds you why. No, that’s not what it does. Well, it possibly does that. From an article in USA Today, quoting: users can adjust Grok’s voice and personality, choosing from a variety of profiles, including unhinged. Because that’s what you want when driving down the street.

Hey, where’s the nearest gas station? Fuck you, man, we don’t need no gas station. That’s the [00:27:00] unhinged voice. Sorry. Is this too much?

Michael: Yeah, this whole thing is just nonsense. We know that even talking to a passenger in your car is distracting.

Anthony: It’s not a passenger. It’s my spirit animal.

Michael: I don’t know why Tesla and Elon Musk continue to insist, on top of last week, where they announced that they were going to be essentially allowing people to text and drive, and promoting that behavior in their vehicles, when it remains illegal in 49 states and the District of Columbia.

Now, on top of that, they’re adding their wacky chatbot as an option for people to talk to and distract themselves even further while they’re driving. Tesla just doesn’t take safety seriously. Certainly not as seriously as they take releasing new gizmos and gadgets and features every week to boost their stock price, regardless of whether those new gizmos and features are in any way safe or in any way add anything to our lives.

So [00:28:00] it’s, once again, disappointment in a Tesla announcement.

Anthony: Surprising, yeah. Tangentially, maybe the bizarro version of Tesla. Let’s talk about Rivian. Now, Rivian is founded and run by a guy who’s a legitimate engineer. I don’t think anyone has any question about that. RJ, whatever his name is, Scaringe. Yeah.

He’s a very smart guy. He doesn’t strike me as a con artist like Elon. But I guess he’s got to juice his stock price a little bit, ’cause now they’re coming out saying, hey, we’re going all in on full self-driving and enhanced things. And from an article in the Washington Post, I’ll quote: the most precious thing that we all have is time, Scaringe told the gathered reporters. I think what autonomy does is really give you some of that time back. But you’re still sitting in the goddamn car. Like, what? Yeah, exactly. How’s it giving you time back?

Rivian’s Controversial Autonomy Plus Package

Anthony: Because now you’ve got your time back, and you can watch a movie, read a book, put yourself and your [00:29:00] loved ones and every other road user in serious danger. Because, before the show, I wanna quote, Michael Brooks said, I want to like Rivian, but I can’t. And I agree, I can’t, because of nonsense like this.

Michael: Yeah, that’s exactly it. They’re not really announcing anything here. They’re saying, we’re gonna make a push to make our vehicles capable of doing X, Y, and Z.

Which is something we hear all the time from the auto industry and the AI industry when they actually don’t have a product ready to go and they’re just looking for some other benefit in the short term. The best I can say about this announcement is that Scaringe has gone all in on lidar, right?

They’re saying, essentially, they’re following Tesla in a way, and I think they’ve been accused of this in other areas, of copying some of the same practices that Tesla does. But they have been, I would argue, much safer in some of [00:30:00] their introduction of partial automation and other things.

But in this case, despite the lidar, I just wonder. It seems like they’re sticking to the same path as Tesla, which is, we’re going to develop this as a system that relies on humans to take over until we’re not, right? And I just don’t know. We’ve probably talked ad nauseum about how Waymo abandoned that approach almost a decade ago, because it’s simply unworkable and is unsafe.

And here we have Rivian kind of alluding that they’re gonna be doing the same thing as Tesla. And they’re calling it an Autonomy Plus package they’re gonna sell for $50 a month, and start with hands-free driving, which you already know is a safety risk, ’cause as soon as a driver takes their hands off the wheel, they’re less engaged in the driving task.

These are all things that make us very [00:31:00] uncomfortable and make us wonder, if Rivian has a potential solution for autonomy, why don’t they just deploy their level four autonomy and work on it the way some other companies are doing, rather than take the Tesla approach and pretend that this is an area where it can be.

I just continually have questions about, what is that point? And with Tesla, I have this very similar question now that they’re saying they’re removing safety drivers from robotaxis in Austin. What is the point at which you can say it is safe to move from a vehicle that’s monitored by a human who can intervene, to where you just remove that completely and say, we’re fully autonomous?

I think that’s an incredibly difficult decision to make, and I don’t think that you can build the data upon which to make that decision. Certainly not within any reasonable timeframe.

Fred: I think it’s also important to challenge the logic of a company leader who thinks the most valuable thing in the [00:32:00] world is time.

And many of us would think that the most valuable things are associated with not killing people, and keeping your children alive, and little things like that. All of the AV companies, and all those aspiring to be AV companies, will say that safety is the highest priority, but none of them actually exhibit that.

Waymo’s Safety Concerns and School Bus Incidents

Fred: We’ve recently seen Waymo running past school buses with stop signs deployed and red lights flashing, in Austin, Texas, and Atlanta, Georgia. Okay, so that was observed, and they were maybe taking corrective action, they were updating software, but they refused, despite the request by the Texas school board, to stand down while they were making those changes.

So Waymo refused to acknowledge that, yes, there’s a safety hazard with our vehicles zooming past school buses that are [00:33:00] letting children off and on. In fact, zooming past the school buses while the children are getting outta the school bus, in one case. And clearly the operations have a much higher priority than the safety of the school children.

And so there’s no data associated with Rivian saying they’re gonna do the same thing, or with Waymo and Tesla removing safety drivers from the vehicles without any kind of substantial database to say that it would be safe to do that. Clearly safety is not the highest priority. Clearly operations are a much higher priority, and probably the highest priority of all is public relations and the image of safety, because they want to expand their operations base into more and more cities as quickly as possible.

Safety be damned

Anthony: Well said. I like it. And related to this, let’s talk about your favorite [00:34:00] subject, Fred: Waymo. That’s right. What you’re talking about there with the school bus situation, from an article in Futurism: Austin Independent School District Police Chief Wayne Snead told CNN that Waymo “did not agree with our risk assessment and respectfully declined to stop operating.”

’cause the city of Austin said, hey, this is a problem. You guys said you fixed it. You haven’t fixed it. We’re still seeing this. So can you guys stop running this around our school bus routes? And Waymo said, no man, we’re the future. You just don’t get it. Waymo actually didn’t respond to questions.

Quoting: the dispute comes as Waymo has begun reprogramming its vehicles to drive more aggressively. Those updates are intended to avoid passive or defensive driving. Though the company has yet to cause the death of a human, it’s already run over numerous beloved pets and has become involved in a growing number of bizarre incidents, like when a Waymo drove its passenger through an armed police standoff.

Waymo, take your life in your own [00:35:00] hands and I really look forward to a future episode where we go deep dive into the terms of service, which are amazing.

Waymo’s Privacy Issues and Odd Incidents

Anthony: Michael.

Michael: There’s going to be a lot of things like that. We saw a couple of things this week in our Waymo Follies collection that I think were interesting,

Fred: great

Michael: ones.

The first was there was a woman in San Francisco who gave birth to a baby in a Waymo. I think she actually gave birth to the baby, wasn’t just in labor, which was really interesting. And I think Waymo put out a post about that that was very “aren’t we great?” Yeah, it was an “aren’t we great” and everything.

But our question is, how did they know that a woman was giving birth inside the vehicle, and how do they monitor people inside the vehicles? Because presumably at some point, your behavior changes from “I’m sitting in the back of the Waymo as a normal passenger” to

[00:36:00] “I’m lying down and actually giving birth to a child.” How did they figure that out? There was no evidence that the woman had reached out or contacted Waymo customer service. From all the statements I saw, at least, it looks like they discovered it somehow, and the only way they could have done that is by directly monitoring the passenger in the vehicle live.

So that raises some questions about privacy in Waymo vehicles. But otherwise, I’m glad that woman was okay and the child was okay and everything turned out hunky dory. But it raised some questions, because the second Waymo folly I noticed this week was from LA, where a mom and a daughter are entering their Waymo, and there’s a guy in the trunk, which is not really a trunk, it’s more like the

back of a hatchback compartment kind of area. And the Waymo didn’t detect that there was a non-paying passenger lying in wait in the vehicle prior to this mother and her daughter. I don’t know if it was just the daughter who was taking the ride or the mom or what, but prior to that. So that just raises [00:37:00] questions: they’re monitoring, but are they doing a very good job of monitoring what’s going on here?

So this is going to be, I think, and we’ve seen, I called this Waymo Follies Volume 46 because there have just been so many situations where we’ve seen Waymo vehicles involved in odd situations. And this one is something that could happen to any of us, right?

Any of us could get into our car and there could be something in the vehicle that we don’t want there. My aunt, a couple years back, got into her car and there was a bear inside of it. Things happen that are odd, and it was hard to tell from the video and from, oh,

Anthony: No.

Your story stops right now. No, you’re not glancing over the fact that your aunt got in her car and there was a bear inside. We’re not walking past that.

Michael: She got into her vehicle one morning and immediately got out when she noticed there was a bear in the vehicle. They don’t really know if the bear opened the door on its own, or if they left the door open and the bear got in and then somehow got shut in the car and didn’t know how to open the doors. But yes, there [00:38:00] was a bear inside of the car.

Fred: The battery in the bear’s fob had probably run out. But I wanna point out that question number 28 on the checklist is: has the applicant shown that the CDV occupants are adequately protected from harassment, entrapment, and threatening actions by unauthorized, unwelcome, or threatening third parties, including bears?

Use the checklist folks. Bears are not on here. We can always update that.

Anthony: Okay. We’re not gonna add bears, but yeah. So the story with the woman in Los Angeles: there was somebody in the, quote unquote, as they call it in the story, trunk or back of a Waymo car. This is a weird one ’cause it doesn’t say the age, but it says a mother was ordering the Waymo for her daughter.

I have no idea the age range, but we’ve heard in San Francisco of parents, these lovely, wonderful parents, ordering Waymos for their children, for 10 year olds. ’Cause I don’t wanna drive you to school or have you get on that gross public transit; here’s a driverless car to take you.

That’s [00:39:00] pretty...

Fred: Number 25: has the applicant shown that unaccompanied underage users are prohibited from use?

Michael: Waymo has a program, they call it Waymo Teen, that they announced, I think, in July. I think it was operating in Phoenix. I’m not sure if it’s moved to Los Angeles, where this took place, yet, but it allows, I think, 14-and-over teens to ride without a parent.

Anthony: Waymo Teen. Oh, this is awful.

Michael: Yeah.

Anthony: But I’m biting my tongue so hard,

Michael: And then I think they require, if you’re under eight, you have to have an adult with you. But nine year olds, you’re fine. It’s weird. And Uber for years was just very, they wouldn’t say anything about this subject, ’cause they wanted to make the money from these rides, but they didn’t want to explicitly give permission for people to use them as a way to usher their kids around the city.

So it’s an area I’m sure we will get [00:40:00] to again, when we look at the Waymo terms of service.

Anthony: Yeah. But getting back to that Waymo pregnancy story, from the Guardian, quoting the company: Waymo said its rider support team detected unusual activity inside the vehicle and called to check on the rider as well as alerted 911.

So, a preview to the terms of service episode, this is great. It says the microphones inside the car are only on during voice calls with rider support and when you actively choose to enable microphones inside the car. There’s no indication that this pregnant woman called rider support.

Actually, they say they did not call rider support or choose to enable the microphones. From their terms of service: “Occasionally in more urgent circumstances, support may access live video during a trip.” And a question for our terms of service episode is, what does “more urgent circumstances” mean, and

Michael: Well, and how do you determine there are more urgent circumstances in the first place?

Anthony: Exactly. Then your ride is being actively monitored, and I’m sure there’s, yeah.

Michael: Just admit what’s going on, [00:41:00] right? Yeah. Don’t shade it. Say, we’re watching you all the time to make sure you don’t do something crazy in our Waymo.

Anthony: Yeah. I’m looking forward to the dark website that has all the live video feeds that you’re watching.

Michael: Yeah. Where did all those images go, right? Are they thrown away, or are they stored on a server somewhere that Anonymous might put out into the public anytime soon? Or are they on the dark web? Who knows? I don’t know. And that kind of thing is, I haven’t ridden in a Waymo, but if I did, I don’t think I’d be doing anything too crazy.

But I’m just not thrilled at the idea of Big Brother watching me everywhere I go, and I don’t think anyone is.

Anthony: No.

Ford’s Stolen Vehicle Services and Safety

Anthony: But before we go into the Tao of Fred this week, let’s go into one with Ford. This is an interesting one. Ford has something called Stolen Vehicle Services, which launched in the 2024 F-150 model year and added a start inhibit feature that allows owners to disable an F-150’s engine from a smartphone.

And so I guess F-150s are stolen a lot [00:42:00] because

Michael: Yeah, they’re the number one selling vehicle in America, I think. So statistically

Anthony: they’re gonna get stolen more than vehicles that sell less, and

Michael: They’re really in demand. I think in certain parts of the country, like Texas, where you have a lot of F-150s, they are probably by far and away the number one stolen vehicle.

Anthony: Yeah. So this is a neat thing. So you can get a little warning saying, hey, your car door is open, somebody’s in your car, and you can be like, wait a second, don’t start the car. Which is neat. And they only charge $7.99 a month for this. And so this is interesting and frightening, quoting from this article on AP News:

Ford’s call center coordinates efforts with police to use start inhibit to shut down the engine and to pinpoint where the pickup stops. So you get an alert that your F-150 is stolen and it’s out on the road. And I guess Ford and the police decide, while this car’s in use, let’s turn it off.

Which sounds incredibly dangerous to me. Isn’t it better, hey, [00:43:00] we know where this car is, let’s maybe not turn it off? ’Cause you don’t have any data as to what else is going on around it. Oh, you’re in the middle of the highway. Let’s turn it off.

Michael: Yeah, I mean it depends there. If it’s a stolen vehicle and the police have it in their sight and they know it’s in an area where it’s safe to turn it off,

that’s one thing. I don’t know how they do start inhibit. But it’s weird; I think an owner should just be able to, why don’t you just put start inhibit on whenever the owner’s not in the vehicle, to prevent the theft in the first place?

Going after the theft after the fact, and then having to confirm that with the police in order to use that feature, seems a little cumbersome. And it makes me wonder just how well the system’s gonna work.

Anthony: If we do it your way, Michael, though, we’ll never be able to charge $7.99 a month for an enhanced service.

Michael: It just seems like it’s so much more [00:44:00] effective to have what is effectively a kill switch that is actuated whenever the owner or driver doesn’t want the car moving. Makes sense to me. Yeah. Versus dealing with this other issue of stopping vehicles in the middle of highways. Which, I think when we talked to the officer from Massachusetts a few years back, we discussed the idea of a kill switch.

We were talking about police chases. And there are a lot of things out there that I think could be useful in police chases. We’ve talked about the grappler. I love the grappler. We’ve talked about the use of drones in certain circumstances. But there are circumstances where police really have to stop that car.

There could be children inside, there could be a bear or a mass murderer fleeing, right? And the car’s gotta be stopped. And a remote stop button that could be deployed remotely could be one of the safest ways to provide that, if the police have the vehicle within sight. And [00:45:00] maybe even if it’s not within sight, if there are certain circumstances that are threatening to the public where the vehicle must be stopped at all costs, I could see that being usable as well.

I think we’re totally in support of technology that can prevent theft and prevent criminal actions using cars. I’m just not sure that this one is as effective as some of the others we’ve seen.

Anthony: I don’t know either.

AV Checklist and Forced Arbitration Clauses

Anthony: But let’s move on to the wonderful world of the AV checklist.

Fred, can you give us your next two on the AV checklist?

Fred: Sure can. I can also give you the next 25 if you want.

Anthony: We like to tease it out,

Fred: Alright. So the checklist, again, is intended to be a technical resource for people who need additional depth into AV safety that they cannot get from the manufacturers.

So regulators would be interested in this, and journalists would probably be [00:46:00] interested in this, and the general public who wants more depth but doesn’t care to get a graduate degree in engineering. It also includes a lot of references. We can flesh it out a little bit if you’re not satisfied with just the top level. Anyway.

Number 21: has the applicant established that any paying for-hire customer must affirmatively opt in to forced mediation or arbitration, as opposed to selection of forced arbitration by a default clause, if included in terms of service or equivalent or otherwise proposed by the applicant or its agents?

This is not necessarily safety, but it’s of vital interest to consumers, and we will be having a session in the future, an episode dedicated to the terms of service from Waymo, which is a great example of how hideous these terms of service can be.

Michael: And that one, I would say, is safety, because they’re trying to take away your civil justice rights in the event [00:47:00] of a potential crash.

It could incentivize safety: if these companies are required to not use these types of clauses that force customers into arbitration, and that makes them subject to more actions in the civil justice system, that can have the effect of making them ensure that their vehicles are safer before they put them out there.

So I think there is a safety tangent there. The proliferation of forced arbitration clauses across the technology industry has migrated into automobiles. It’s not something that was traditionally used by auto companies, but the more tech they put in their vehicles, the more we’re going to see forced arbitration clauses, including in your terms of service agreements.

Fred: I was just gonna say, I’m the only person I know of who actually read the terms of service and decided to not use the Waymo based upon the [00:48:00] terms of service. So if there’s anybody else out there who’s in a similar situation, please let us know, ’cause I’d love to have an affinity group starting. I’m lonely.

Anthony: If you’re in that same group, I suggest adding more fiber to your diet. No, just to help.

Fred: If I add more fiber, I’m gonna turn into cardboard.

Anthony: Hey, next up is, yeah, go ahead. I forgot my thought. Never mind.

Fred: Next up is number 22: has the applicant provided adequate insurance for both occupants and potential external collision victims?

And we’ve seen states require companies to put up a bond for, what, three or four million dollars, Michael? Yes. Very low, to cover all damages. But this is clearly inadequate insurance, and clearly it’s not insurance if it’s a bond. So there’s a lot of problems here. And if you are injured, you’re gonna be in, forgive the pun, a world of hurt.

Got time for one more? One more, Anthony. One more. Moving on. Number 23: has the applicant shown that [00:49:00] the subject computer-driven vehicle cannot be used to deliver dangerous or illicit goods when unaccompanied by a live customer?

Anthony: I love this one, so I can’t use my Waymo or self-driving car taxi thing to ship my dynamite across state lines anymore.

Michael: You’d have a hard time getting a Waymo across state lines at this point, right? Oh,

Anthony: that’s a good point.

Fred: I’ve got to apologize. My dog has discovered a wasp that’s woken up in the sunlight, so that’s the barking. I don’t even hear the barking anymore. You hear it?

Anthony: And, but that was a great haiku. My dog has woken up in the sunlight.

Michael: This is something that, I think, inner city drug deals, there are a lot of things here that aren’t even necessarily safety related. Obviously the dynamite would be. And having dangerous goods, like for instance, someone uses a Waymo to deliver, I don’t know if they ship fentanyl in kilos, but a kilo of fentanyl to, don’t pretend like you don’t know, somewhere in the city.

[00:50:00] And some of that fentanyl spills into the Waymo, and the next passenger who gets in the car faces a pretty significant danger. So that’s where this is going. And it’s an area where, frankly, how else are the applicants or the autonomous vehicle companies here going to be able to figure that out without monitoring the vehicle live?

So maybe we need to loosen our idea of live monitoring.

It is a, it’s a

Fred: sticky issue.

Anthony: Yeah. It’s like, what is the Mitch Hedberg joke? My mailman doesn’t know he’s also a drug dealer.

Michael: Yeah. Oh yeah. Fortunately, if you accompany your drugs or dynamite in the vehicle, you haven’t violated this provision.

Right.

Anthony: It’s perfect. Okay.

Waymo and Chrysler Recalls

Anthony: And with that, we have to move to recalls. Are we ready? We only have a couple recalls this week. Let’s start off with Waymo: 3,067 vehicles, the fifth [00:51:00] generation ADS. And this is: in certain circumstances, Waymo vehicles that were stopped or stopping for a school bus with its red lights flashing and/or the stop arm extended would proceed again before the school bus had deactivated its flashing lights.

And Waymo claims that, yeah, we fixed this, and City of Austin says, nah, you haven’t. That’s my summation of this one.

Michael: Yeah. And this isn’t something that Waymos have always been doing. They’re claiming in the recall report that this started with their new software release on August 20th, which was deployed across the fleet, it looks like, in mid-September, which is when I think we started seeing some of these incidents.

So I think they’ve made at least one or two software changes so far that didn’t work, so we’re hoping this one sticks. It looks like this recall is [00:52:00] addressing the fixes which occurred between November 5th and November 17th. So that would’ve been two fixes that were applied there.

We’re not sure if this recall, their “fix” in quotes, is actually working at this point, or if they’re still struggling with school buses. So I’m sure NHTSA and the media will be monitoring this one.

Anthony: I know Fred’s dying to jump in on this, but before you do, Fred, ’cause this ties into your whole theory here.

So prior to this software release, Waymos didn’t do this, and after the software release, the Waymos are causing a problem. How many million miles have Waymos driven safely, Fred?

Fred: Yeah, there’s a manifold problem associated with it. Waymo claims that safety is the first priority, right? So they’ll always say that’s the highest priority. Except, when they’ve been notified by the authorities that there’s a condition with their vehicles that is endangering school children [00:53:00] at their most vulnerable moment, and Waymo refuses to do anything except wait, then you know you got a problem, and you can really challenge the assumption that safety is the highest priority, or in fact even a high priority.

But pushing past that a little bit, it’s apparent that this problem has persisted throughout the first hundred-some-odd million miles that Waymo has been accumulating, using its various versions of the software and hardware. So yes, this is an immediate problem for Austin, but I think it also speaks to the corporate culture at Waymo.

It just frankly doesn’t give a shit

Anthony: They should put that on a t-shirt: Waymo, we don’t give a shit, we’ve got Alphabet money. Last recall: Chrysler, 5,974 vehicles, the 2024 Alfa Romeo Tonale and, [00:54:00] I guess also related to this, the Dodge Hornets. They may have been built with a brake pedal assembly that may collapse while braking.

That is horrible. Oh man.

Michael: What happened there?

Anthony: What happened there is the brake pedal collapses. And that’s something we don’t,

Michael: we don’t see a lot of. I’ll see a lot of pedal-to-the-floor braking and things like that where the system doesn’t seem to respond, but the actual brake pedal collapsing isn’t something we see a whole lot of.

Anthony: Is this the brake pedal? ’Cause now it’s saying further down it’s the inability to activate the service brakes. Why is that?

Michael: That would be because your brake pedal has collapsed. You can’t activate your service brake.

Anthony: What’s a service brake?

Michael: That’s just the general name for your brake system.

that’s the term used there. But that’s a somewhat odd recall, and it makes you question the engineering behind that brake pedal assembly. I also saw that they give some tips to owners if you have this [00:55:00] happen. It says if it happens, you use the electronic parking brake on the center console: you pull it and hold it while driving to slow the vehicle to a controlled stop.

Also, your automatic emergency braking system will intervene without further action from the driver when it detects a collision is imminent. So I’ve never seen a manufacturer cite its automatic emergency braking as potentially helping out in a situation where a defect is actively occurring, but there’s always a first for everything.

Fred: So did the manufacturer tell the owners to not use these cars? Because, you know, safety is the first priority.

Michael: No, and that’s somewhat rare. We’ll see “do not park” recalls; we’ll rarely see the types of recalls that are “do not drive” recalls, where the manufacturer believes that the defect is so dangerous that the vehicles shouldn’t even be used.

But no, they did not [00:56:00] do that here, and at least they didn’t check the “do not drive” box on their Part 573.

Anthony: No, the remedy will be to reinforce the brake pedal arm by adding a bolt and a nut.

Michael: Alright. That’s gonna happen in mid-January if you’ve got a Hornet or a Tonale.

Conclusion and Final Thoughts

Anthony: And with that listeners, I hope you’ve learned some things and if you have, let us know what they are.

But first, go to autosafety.org and click on Donate. And that’s the end of the show. Bye-bye.

Michael: Happy birthday, mom. For more information, visit www.autosafety.org.