Right to Repair or pleasure to be charged
Your car is a computer on wheels that the auto manufacturers won’t let you repair. Apple co-founder says Tesla is an example of AI that is trying to kill you. Waymo doesn’t understand road flares, Cruise blames their safety driver, something called a Fred light plus Recall Roundup and the Consumer Autonomous Vehicle Bill of Rights # 12.
Subscribe using your favorite podcast service:
note: this is a machine-generated transcript and may not be completely accurate. This is provided for convenience and should not be used for attribution.
Fred: Say good morning to our listeners. Good morning. Good morning. Whatever time it is.
Anthony: Good afternoon, good evening, Guten Tag, listeners. Are you ready for another exciting episode of It Came from the Center of Auto Safety? That wasn’t great. I think we are. All right. Good. So this week we’re gonna start off talking about the right to repair.
And some of you may have heard about this before. The simplest example is your cell phone. Hey, know how your cell phone, everything’s sealed in there and you’re like, Hey, I need to replace the battery, but you can’t open it and get in there to actually replace the battery. I’m old enough to remember when you can actually replace batteries on cell phones.
I can’t do that anymore. So the auto industry has said, hey, let’s also design in planned obsolescence features, where as consumers you can no longer really repair your car. Hey, there’s a tiny little part, but if you wanna get in there to repair it, we’re gonna void all warranty, all coverage, everything possible.
We’re gonna cause you to have gastrointestinal problems. You’re gonna get skin conditions. All sorts of horrible things will happen to you because you can’t repair things on your own. And I know our friend Mr. Michael Brooks has a lot of thoughts on this issue.
Michael: There have been little issues with right to repair over the years.
For instance, being able to access your engine oil system to do oil changes and that type of thing. I’ve definitely heard complaints from people over the years who have seen new designs that seem to make it unreasonably hard to get at those parts and do the kind of maintenance that people do at home.
But now, you know, cars are becoming computers on wheels. They’ve got software embedded in them everywhere, and what that software is allowing manufacturers to do is essentially shut off access to certain vehicle systems that they don’t want anybody else to touch. And there’s been a years-long battle, disagreement, however you want to describe it, between the independent repair shops in America and automakers around some of these newer vehicles with software on them, which the independent repair shops don’t have the ability to repair because the manufacturers aren’t sharing that information with them.
And there are lots of other reasons, but from a consumer standpoint, even ignoring the independent repair shop issue: it’s like with the iPhone, or what we see with farmers, where they’re not allowed to repair their tractors and are forced to wait weeks or months for a repair so they can get back to their work.
Consumers are having that same problem. With software, none of us really knows what’s going on. We’ve talked about this before on here: you can look at a vehicle, inspect it, and see a lot of the safety issues that are occurring from a mechanical standpoint. But the software is a black hole to most of us.
We’re not gonna be able to go in and interpret all the code that’s running all the different systems in the car, and that virtually eliminates the ability for consumers to perform vehicle repairs the way they have in the past. And for those that are able to, going in and messing with your vehicle software is probably going to void your warranty.
So there are a lot of changes here that consumers are facing in their relationship with their car. It’s probably worth mentioning here that what automakers are doing right now is basically putting software into every part of your car so that they can ultimately make you subscribe to different features in the vehicle, so that they have a continuous money source.
So it’s in their interest not to give out information to independent repair shops or to consumers or to anyone about how to repair their cars and to how to, manipulate their vehicles because that’s what the manufacturers want to do, and they want to do it for a price. So this is an issue that’s not going away.
Massachusetts passed a law attempting to address it a couple years back. But at this point, automakers really have the upper hand here, because they’re designing newfangled, complex things every week and month that the rest of the world’s trying to keep up with. And I’m not really sure how this is gonna play out.
What do you think, Fred?
Fred: I think that astute readers of our AV Consumer Bill of Rights will notice that we’ve included a provision in there for disclosure and monitoring of safety-critical functions that are not accessible by a visual inspection. This is a closely related issue, because there’s a lot of stuff buried in your car you need to know about that they’re not telling you about.
Among them are the kinds and types of software that’s included in the vehicle. Small probably small compensation for a lot of people, but maybe it’s reassuring to know that the people who build the cars aren’t really sure what all that software is doing either. So we’re all in this together. But a lot of this, as Michael said, grew out of the agricultural community, which is spending hundreds of thousands of dollars on sophisticated equipment from John Deere and other suppliers of agricultural equipment that they can’t fix.
Farmers are a very pragmatic bunch. They’ve all got their welding tools, they’ve got a lot of sophisticated hammers and sickles, and they really need this in order to maintain their harvest schedules and their planting schedules. So the vehicles that we drive are actually kind of the tail end of the dog.
This is a systemic problem with all the modern, sophisticated equipment and something that, that really needs to be solved. So
Anthony: when I buy a car today, do I actually own the car? Is this a kind of like with my cell phone where I own maybe the physical device, but I actually don’t own anything that runs on it, so I’m Yes, that
Fred: is correct.
Yep. You only borrow the software for as long as the owner of the software happens to license it to you.
Anthony: And often they decide to make updates to it, and we see Google regularly announces a product and then 18 months later cancels it. Will that start happening? Are there any regulations around my car? Every time I ask the OEMs, it’s “we love you, Anthony.”
VO: Are there any regulations?
Anthony: They just look at me and laugh: oh, subscribe for the premium edition of this. I’ll send you pictures of the two of them laughing at me all the time. But seriously, this is a legit question, because Google’s notorious for this, where they say, hey, here’s a new feature.
You subscribe to it, and then 18 months later they’re like, hey, yeah, that’s gone. You don’t have access to your data. Thanks for your money, have a nice day. With Google running a lot of auto software now, and a lot of OEMs turning to Google to run their software, what’s to stop them? You’re driving down the road and it’s like, oh yeah, you wanted air conditioning? Gone.
We’ve decided to no longer support that feature, or we’re no longer doing ADAS.
Fred: There’s nothing to stop them. And as Michael pointed out several episodes ago, they’re actually now getting into subscription programs for features that you wouldn’t expect to be part of an optional operation of your vehicle, like seat heaters.
And there’s nothing to restrict them from attaching a subscription business model to any of the other software features that are in the vehicle. It’s a closely related issue to their control of the software.
Anthony: Okay. So here’s my radical idea. Ready? I bought the car, I own the car.
And you say I don’t own the software, I own a license to the software. But I own my computer, and it came with an operating system, and I can erase that whole thing and put some open source software on it. And while I’m not gonna inspect the software myself, there’s a whole community of hundreds of thousands of people who inspect it and make it better.
Is there a situation where, or is anybody working on, open source auto software? So I can be like, hey, I don’t want GM or Tesla’s nonsense software, I want to use my own open source software.
Michael: You used a bad example there, because GM is the one who, a couple weeks ago, came out and said they were supporting an open source software protocol that they wanted to share within the auto industry. I think that’s a great concept. They need to share a lot more things in the auto industry, because they’re so focused on profit that their operations are siloed off, and it seems they want all the profits for themselves. So there’s some issues there. Look, when we say cars are turning into computers on wheels, it’s not just cuz they’re getting loaded with whizbang gizmos that are based on computers; it’s because the entire model is basically being converted to what we see on our computers, where you get Adobe Acrobat Pro for one year and you’ve gotta pay a fee every year.
In cars, the seat heaters are a great example, cuz that’s a subscription model that exists right now. And it looks like they’re ultimately gonna be putting a lot of fancy features in vehicles, creature comforts, convenience features, that are going to be based on subscriptions. You’re just gonna have to get used to paying for them, in-app or otherwise.
That’s fine by me. I’m happy rolling my window up with a handle and unlocking my door by hand. But I don’t think it’s fine with a lot of people, since we also saw this week that the average cost of a vehicle is now about $50,000. I don’t know how many people are willing to pay that and subscribe to the tune of a couple hundred dollars a year.
It’s like an annoying condo fee or something. It’s going to annoy consumers significantly to have to pay fees on top of already having a monthly auto payment.
Anthony: But the example of the seat heaters, so the manufacturer, they design the car, they have the heating coils and cooling coils inside the seat.
And so I bought that. Again, we’ve agreed that I own this physical object. So why can’t I just delete their software and be like, hey, I’ll turn on the seat heaters without paying them $5 a month or whatever?
Michael: I think you’re perfectly okay doing that, but you lose your warranty.
So that’s a risk, if that’s a risk you’re willing to take. You can modify the vehicle however you like after you get it. You surely see a lot of vehicles running around that people have done that to.
Anthony: Aren’t there laws around how much I can modify my vehicle before it voids the warranty?
Michael: Maybe, but not really, particularly in this area of software, if you’re wiping software from the vehicle. We’ve seen consumer complaints that span the range of things. Some dealers and manufacturers look for ways not to honor warranties. I think we talked about that in our episode with Joanna Johnson, on Hyundai and Kia oil pan drain plugs falling out. When a manufacturer, or even a dealer, is looking into a situation and sees an easy way to avoid a warranty, that means they’re gonna get paid from that point on. That’s something that happens frequently. I would expect there to be a way in the more modern cars for them to tell whether there’s been tampering with software and that type of thing.
Fred: I also wanna point out that quality control of open source software is a continuous and real problem. When your cousin Lenny sends you a software module that changes the gain on your steering control, for example, saying this’ll make you really zippy as you’re going around corners, that’s fine. But has your cousin Lenny done all of the work that’s required, through regression testing and other means of validating the software, to make sure it’s not impacting some other unseen part of the control system that you need to stay alive?
Open source has a lot of virtues, but it also has a lot of intrinsic problems: you’re losing control over key functions and key features of your vehicle if you put it into the operating system of the car. Not a simple solution.
Michael: Yeah. And then on the other issues here, when you ask about ADAS: first of all, if there’s a minimum standard, like there’s about to be for automatic emergency braking in vehicles, they’re not going to be able to put that out as a subscription, because it’s mandated to be equipped in every vehicle that’s built.
You can’t charge for it after you sell someone the car. The Center and I are not going to be very happy at all if we start to see safety features monetized as subscriptions, which is basically just inequity on its face. You’re basically telling Americans that they’re gonna have to pay to achieve a higher level of safety than the next guy.
That’s not a system I think is gonna fly right now; maybe 30 years ago. It’s just so unfair. And getting safety features out onto every vehicle in the fleet as soon as possible is really the best way to ensure that we’re lowering all these very large fatality numbers we’re seeing on the roads right now.
Anthony: Right to repair: we’ll keep you updated as things progress on that. In other news, Steve Wozniak. He is the technical founder behind Apple; he’s the guy whose ideas Steve Jobs claimed as his own. But Steve Wozniak, a very smart man, brilliant engineer, was also a huge fan of Tesla back in the day.
He was like, oh my God, I’m getting a Tesla Model S, and I’m gonna get another one. This is the greatest thing in the world. Because like most people, he believed the hype. He believed, oh, this is full self-driving, this will take care of itself, this is unbelievable, we’re living in the future. And now Steve Wozniak says if you want an example of artificial intelligence trying to kill you, buy a Tesla.
Yeah. Now that he’s had a couple of Teslas, he’s basically saying: “I actually believed those things,” referring to Full Self-Driving and Autopilot, “and it’s not even close to reality. And boy, if you want a study of artificial intelligence gone wrong, and taking a lot of claims and trying to kill you every chance it can get, go get a Tesla.”
Michael: Elon just said this week that Teslas are coming out with end-to-end AI, whatever the hell that means. It just sounds like another corny P.T. Barnum phrase that he’s trying to use to sell these things, when we know they’re not going to achieve the type of performance that’s gonna allow them all to be robotaxis across America.
I don’t think anyone that has a brain believes that anymore. But for whatever reason, Musk and Tesla keep pushing that idea, and the reason is to sell more cars. But the Woz is right, and this is an example. In the last week or two, or month, ever since that Google engineer resigned, everybody’s screaming about the danger of AI.
And to me, we’re already surrounded by it: things like Teslas, and you’re driving past Cruise and BlueCruise and Waymo vehicles and all these others, depending on where you are. AI is around and it can impact you now. It’s not some future thing; it’s going on. So yes.
Full Self-Driving Tesla AI could be trying to kill you. I think that’s a fair assessment, and one that a lot of people who have been in crashes in these vehicles, or motorcyclists who have been hit by Teslas on Autopilot, would agree with. The other thing is, AI is not inherently a bad thing.
A lot of the focus recently seems to have been on the threats of AI, versus the simple fact that we as humans, if we’re going to use “AI,” need to figure out which components of AI are good, beneficial to society, and can be protected from being used by bad actors, and which are clearly bad and need to be eliminated.
I think that there are probably a lot of things that fall on that good side, and so this kind of AI fear hype that’s being created is somewhat unjustified. It gives the five million different things that could be called AI a bad name. It paints them with a broad brush. And we’d like to see the news media and other people be a little more specific about what they mean when they say AI; we talked about this in our AI episode.
But to just say AI is going to come kill us, or AI is threatening, is a functionally meaningless statement to me, when I know there are thousands of different types of AI out there doing very different things. You just can’t paint them all with that broad brush. It’s not helpful to talk about it like that.
And I think the media and Twitter and all these limited space areas where we get our information don’t really do a good job of, filtering out all the information and talking to us about what’s really going on.
Anthony: This entire episode is generated by AI. We’re nothing but a deepfake. I hope you’ve subscribed to our deepfaking subscription service.
No, really, subscribe, and go to autosafety.org and donate. And now, Mr. Fred Perkins.
Fred: I just wanted to start with, AI is actually a misnomer, and what we’re really talking about is automated correlation between data sets that have been validated by human beings and an input that’s provided by another human being.
And then there’s a mysterious mathematical box in between that says there’s a very high probability that the input you’re providing is associated with this vast array of data that some human being has already gone through and pared and adjusted so that it can be part of a consolidated data set.
The AI that your Roomba has in it is quite different, as Michael said, from the AI that your car might be using as it’s driving down the road. AI is really good at sorting information from a vast array of data. It’s really poor at, and in fact doesn’t at all, anticipating an unknown situation and making a judgment as to the best way to navigate it.
That’s just completely absent from AI implementations. So when you’re driving, there are a lot of unknown circumstances you encounter. The analog world is very complex. The digital world that the AI is using, and that the simulations are using to test whether or not this AI is okay to drive the car, is very limited.
There are gillions of degrees of freedom that are not included in the simulation. Every simulation is necessarily an abstraction, and every simulation uses some form of AI to figure out what’s going on. But these are always abstractions; they don’t represent the real world. And the question is, how close does it come to representing the real world before you can say that it’s a safe approach to driving?
The answer is that they’re not there yet, because these vehicles keep killing people and they’re very erratic. And it would be useful if people would, as Michael said, stop thinking that AI is some thing, and start thinking about AI as a process of managing data sets that some human being has put into place.
So, Steve Wozniak: is AI trying to kill you? It’s not trying to kill you, but it’s completely indifferent to whether or not you’re alive.
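note: Fred’s description of AI as automated correlation against a human-curated data set is essentially how a nearest-neighbor matcher works. Here is a toy sketch in Python; all features and labels are invented for illustration and are not from the show.

```python
# Toy "automated correlation": match a new input against a small
# human-curated data set and return the closest example's label.
# Features and labels here are invented for illustration.

labeled = [
    ((0.9, 0.1), "stop sign"),
    ((0.2, 0.8), "pedestrian"),
    ((0.5, 0.5), "traffic cone"),
]

def classify(features):
    # "Correlation" here is just squared distance to each curated example.
    def sq_dist(item):
        return sum((a - b) ** 2 for a, b in zip(features, item[0]))
    return min(labeled, key=sq_dist)[1]

print(classify((0.85, 0.15)))  # nearest curated example: "stop sign"
```

Note that nothing in this matcher can notice when an input resembles none of its stored examples, which is the gap Fred describes around unknown situations.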
Anthony: It’s more, I think marketing is trying to kill you.
Michael: So you’re saying AI is like a lot of drivers out there.
Fred: They’re like that. Yeah. Except worse.
Yeah. Because drivers essentially are forced to live in an analog world, because that’s the way we function. The system that’s inside the car is trying to operate in a digital world to represent the real world. So let me give you an analogy, okay? There’s a fraction called one third, right? If you divide something into three parts. If you try to represent that digitally with one decimal point, you get 0.3.
Right? If you try to represent it with two, you get 0.33, and so on. There is a limit, though, to how close that’s going to get. No matter how far you carry out the series, it’s never gonna be exactly one third. So you’re incrementally approaching a limit, a fixed value that you understand intuitively from your analog world, but that the digital world of this numerical series never quite gets to. So that’s the basic problem with AI. It doesn’t have a reference to say: this is the reality, this is what you are approaching. So you never know how far you are from a safe state, which is the analog reality of driving that doesn’t kill you, using an AI simulation that can only numerically represent parts of that universe as being close enough.
Is that geeky enough to be confusing, or have I...
Michael: I think that makes sense. So the digital is like a mathematical approximation of the analog reality.
Fred: Right. The analog versus the digital is the real world versus some poor programmer’s attempt to make a living and pay for braces for their kids.
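note: Fred’s one-third analogy can be checked numerically. Each added decimal digit shrinks the error, but no finite digital representation ever reaches the value; this is a toy illustration, not anything from the show.

```python
# Truncating 1/3 to more and more decimal digits: each step gets
# closer, but the truncated value never equals the target.
target = 1 / 3

for digits in range(1, 8):
    approx = int(target * 10**digits) / 10**digits  # 0.3, 0.33, 0.333, ...
    print(f"{digits} digits: {approx}  error: {target - approx:.1e}")
```

The error only approaches zero, just as Fred says the digital series only approaches the analog value it is trying to represent.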
Anthony: I don’t think there are poor programmers working on these things. I think they’re compensated quite well, perhaps too much. In fact, there’s a great video we’re gonna put up of, I think it was a Waymo, where you’d figure if they’d read the AV Bill of Rights, they’d understand these vehicles need to stop for police and Good Samaritan commands.
This Waymo was going straight for a fully charged fire hose, gonna go over it, and that’s bad news. It wouldn’t stop. So the cop had to stand in front of it and yell stop. It stopped because he was a human, not because he was a cop. And then he’s like, I’ll just open up a road flare.
It must be able to respond to fire and smoke, right? No, the car’s just like, I’m gonna drive over an open flame. Cars are dumb, that’s what I’m trying to say right now. Eh, they’re not dumb, they’re children. There you go, artificial intelligence: they’re children that need a lot of handholding.
Michael: They’re toddlers. They’re 5,000-pound toddlers.
Anthony: I saw one the other day,
Fred: 5,000 pounds with the kinetic energy of a hand grenade.
Anthony: Yeah. I think that this kid was hopped up on some crazy amount of sugar. Anyway...
Fred: Excuse me, the other thing interesting about that video is it showed that there were six cops tied up for 20 minutes just trying to manage this vehicle, this labor-saving vehicle, so that it wouldn’t interrupt emergency services trying to put out a fire.
So you’ve got six cops per vehicle for about a half an hour, until the maintenance people showed up who could actually get this automatic machine out of harm’s way, and put the cops out of harm’s way.
Anthony: Yeah. So basically, hey, taxpayer in San Francisco, cause that’s where this happened:
you’re subsidizing the development of a private corporation’s vehicle. What will you get for it? Nothing. You’ll get nothing. No, you’ll get traffic. Yeah, you’ll get traffic. You’ll get the police taking longer to respond to your desperate need, because instead they’re going: how come this car won’t respond to me?
Fred: then there are probably other things those six cops could have been doing while they were babysitting the vehicle
Anthony: perhaps. Jelly Donut. Custard. Donut. Oh, sorry.
Michael: Yeah. I wouldn’t want to be anywhere around that thing when it was in that state, right?
Those cops are brave. They’re literally jumping in front of it with a flare while it’s inching forward. You have no idea if that thing’s gonna take off any second. You can’t see a driver. You have zero indication of what’s going on inside of that vehicle. Which is, to me... I don’t wanna be anywhere near it.
Fred: And one of the things that we’ve advocated, and you’ll find this, astute readers or astute listeners, in our AV Consumer Bill of Rights, is that there needs to be a way for emergency personnel to affirmatively disable vehicles when they encounter them for law enforcement purposes. This is a great example of how the police should have been able to approach that vehicle, flip a switch, do something, call it in on their cell phone, and immobilize the vehicle so it didn’t continue to endanger the people around it.
This is a sleeper issue, but it’s very important.
Anthony: Personally, I was so surprised that it didn’t respond to a flare, cuz that’s not some edge-case scenario. We’ve all been on the road at some point where there’s flares in the road, and you don’t think, hey, let me drive over that flare.
But that’s what this thing wanted to do.
Michael: Usually behind a flare is a disabled vehicle with people possibly around it. So that’s obviously a bad situation,
Fred: Right. I’m gonna go to my betting limit, which is 25 cents, and suggest that there’s not enough examples of this situation in the AI database for the car to respond appropriately to it.
You’d need to have several hundred examples of this exact situation, or something very close to it, for the vehicle to recognize what’s going on and take appropriate action. It’s just not there. You’re never gonna have that kind of database integrity and that database breadth to allow vehicles, in the analog world in which we all live, to appropriately respond to every critical situation.
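note: Fred’s point about database breadth can be sketched as a matcher that only knows its stored examples. Without enough coverage, an unfamiliar scene like a road flare either gets silently mislabeled or, at best, rejected as unknown. Features, labels, and the threshold below are invented for illustration.

```python
# A matcher can only answer in terms of its stored examples. With no
# "road flare" examples in the data set, a flare is mapped to whatever
# it vaguely resembles -- unless a distance threshold lets the system
# admit it doesn't know. All values here are invented for illustration.
import math

examples = {"plastic bag": (0.1, 0.9), "traffic cone": (0.8, 0.7)}

def classify(features, reject_above=0.5):
    name = min(examples, key=lambda k: math.dist(features, examples[k]))
    if math.dist(features, examples[name]) > reject_above:
        return "unknown"  # nothing in the database is close enough
    return name

print(classify((0.75, 0.72)))  # near a stored example: "traffic cone"
print(classify((0.9, 0.1)))    # unlike anything stored: "unknown"
```

A real perception stack is vastly more complex, but the breadth problem is the same: the system can only respond in terms of what its curated data set already contains.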
Michael: Why do we keep seeing fire hoses? We’ve seen this for months now, right? Cruise, I believe, had the first problems that we saw with it a few months back. But why aren’t they taking cues? This is what machine learning is.
This is where that AI comes in. You see a fire hose, don’t run over it. Why is that not getting programmed back into these systems?
Fred: No two fire hoses laid on the road have the same pattern. They’re all different, right? They’re all significantly different. Sometimes they put the fire hoses through the windows of cars that are parked illegally in front of buildings.
It is really a very difficult thing to have some representation that says: this is the generic fire hose. Because every array of fire hose near a fire is quite different from every other array.
Michael: So this goes back to that database problem, and that’s a real problem. These vehicles need a representation of every possible fire hose layout that there is in order to recognize them. So it’s a hard problem.
Anthony: Yeah. I remember in, was it the eighties, early nineties, the US Navy spent a ton of money doing artificial intelligence work on submarines to identify different objects.
And at one point it’d be like, hey, this is a rock, and the next moment they’re like, hey, that’s a bomb. And it’s incredibly hard, because of every different possible angle where they’re seeing things. It’s like people seeing faces on the moon; things will look different from any angle. But hey, enough of submarines. At some point we’ll hear about how Fred invented submarines, or disabled a submarine, or was chased by a submarine.
Hey, did you listen to last week’s episode where Fred told us about being chased by a tank? Oh, you should check that out after this episode, and after you’ve subscribed. Continuing with Tesla news: Tesla, if you listen to Elon and you believe the Book of Elon, makes the safest vehicles on planet Earth, and they’re the greatest things in the world.
And if you buy one, you’re an amazing person. It turns out some members of the US Congress disagree. They sent him a letter, I believe it was five senators that signed it, or maybe six, and they said: hey, we have problems with not only your workplace discrimination issues, because you’re kind of a racist organization and you’re harassing people,
but your safety record, that’s a sham. Actually, they said, quote: “Let us be clear, your safety review is a sham.” Now, we’ve pointed this out on many episodes. Sure, you can have a Tesla, and we learned you can drive off a cliff, fall 200 feet, and the car amazingly is fine. But their cars are not safer than any other vehicle on the road, especially when they have Autopilot full-self-nonsense engaged.
In fact, they’re less safe.
Michael: So when you buy a Tesla, you’re going to be forced to sign an arbitration agreement that requires you to take any legal complaint you might have around your car to one of Tesla’s kangaroo courts before you can even think of getting justice. That’s a huge problem.
It’s something we’re really worried about as AVs come on the road. And you’re signing an agreement when you enter these vehicles, like you do with Uber: you agree to some sort of user agreement that contains provisions that basically say, you don’t get to go to court if we screw up or if we kill you. You have to go to one of our little special courts that we’re paying for, that we’ve set up so that we can basically prevent you from getting justice, or woo you with a low offer so that you’re not made whole.
And we lose less money. That’s a great thing for a company, but these provisions can prevent access to court. They interrupt civil justice. They’re not fair. I don’t think that any consumer groups out there agree that you’re going to get a better deal in arbitration than you are from a jury of your peers or a judge.
We’ve seen so many lemon law complainants go through the arbitration process over the years, the private arbitrations, not state arbitrations, the private arbitration processes set up by manufacturers, and get nowhere, waste their time, and be forced to do that before they can go to court and get their money back.
So it’s a way that automakers use to delay people, to limit the number of claims they face, and to basically make it harder for you to be reimbursed when something happens that’s their fault, that costs you money, or worse, injures or kills someone. That was kind of the overriding point of the senators’ letter, in which they asked for basically very detailed data on all of Tesla’s safety complaints.
Basically, they asked Tesla for every safety complaint they’ve received, and for a lot of background and data on those complaints. So that’s a really good thing. Now, I don’t know that Elon is going to lift a finger to answer it, given his track record; I doubt he does. I’d like to see him subpoenaed to testify before Congress on this issue at some point.
And maybe that’ll happen in the next few months, if the Department of Justice or someone starts seeing this the way we do, which is as a possible criminal case.
Fred: I just wanna correct a slip of the tongue Michael made. He said the Center’s letter. It’s not the Center’s letter, it is from the US Senate.
Anthony: Excellent correction there. But clearly members of the US Senate are fans of this podcast, like you should be. Have you subscribed? Stop it. Alright. Also related to my favorite self-driving company: my buddy Kyle over at GM Cruise. GM Cruise, their car hit a guy on a scooter back in 2019, and GM Cruise says, we didn’t do that.
It was our safety driver who did it. So they’re saying it wasn’t us, it was somebody else. That guy did it, because they hired a company to provide the drivers. So GM Cruise, they built the cars, they sent ’em out, they had the license, but they’re like, hey, we don’t employ the drivers, man. We hired another company.
They had the drivers. They did it, not our self-driving car. Very strange. Maybe the whole thing was deepfaked and it never happened.
Michael: Yeah, I just find it really strange that they’re doing this, simply from a PR perspective. You would think they would not want this kind of thing to get out, because it shows that their operations are a little messy. Either they had a safety driver who wasn’t trained and wasn’t doing his job, or their car is screwing up and not recognizing pedestrians on scooters.
It could be both. So there’s obviously an issue here. Trained safety drivers are super important. We think they should probably be in every one of these vehicles that’s on the road now. For the reasons we discussed just now: with a trained safety driver in the vehicle, there’s an easy way to disable the vehicle and not hit fire hoses.
If you have a trained safety driver, you’re not gonna hit a bus like they did in San Francisco. You’re not going to kidnap a passenger and take him around to his destination three separate times in one drive. There are a lot of these issues that we’re seeing that these cars are causing that would be virtually eliminated by the presence of a trained safety driver.
So we are strong advocates for that. We think it’s important. If these ADSs are going to work someday, and if consumers are going to accept them, you’re going to have far fewer incidents with trained safety drivers there to take over if necessary.
And it's going to help in the long run. When you're talking about cities, states, consumers accepting these vehicles, they're not going to accept traffic jams, fire hoses getting run over, and emergency vehicles being stopped from proceeding because your car is stalled in the middle of the road for no reason.
And that's why, right now, we think these vehicles really aren't helping us at all on the roads. They're just adding problems, and they're doing it to fulfill something that may not even ever happen: this idea that America has a big desire or need for driverless passenger vehicles carting people around.
Fred: I wanna be clear about how unusual the circumstances were for this particular event. One of the unusual parts of it was that the scooter had properly stopped at a red light and was waiting for the red light to turn green. And when the scooter’s red light turned green, the scooter advanced into the intersection, at which point the self-driving vehicle decided to ignore the red light that it was facing and accelerated into the intersection and hit the anomalously responsible scooter driver.
There's a lot going on here. Again, getting back to correlation versus intelligence, there is probably no database in the AV inventory of scooters actually stopping at red lights and doing what they're supposed to do. So maybe that's the cause of this particular collision.
But the circumstances were that the car was clearly at fault and the scooter was clearly not at fault. And yet a serious injury happened.
Anthony: Kyle says it was the human safety driver, and I believe Kyle. Always.
Michael: I'm glad you two guys have made up.
Anthony: Ah, speaking of Fred, let's discuss the Fred Light.
So the Canadian city of Brossard, located right across the St. Lawrence River from Montreal, has installed a new traffic light in a school zone that only turns green for safe drivers. I love this idea. The light stays red unless you go below the speed limit. If you go below the speed limit, green light. And it is called the Fred Light, named after Fred Perkins.
It is so crazy that they did this. They were gonna call it the Perkins Light, but they decided that's too long. I didn't take French, so I won't attempt the French name.
I'm gonna guess Michael didn't take French either.
Michael: I did one year in high school, but it did not stick. No.
Anthony: This is a great idea, and there's a place, I think in the US too, where they've talked about doing this. This goes back to our earlier conversation about the infrastructure talking to vehicles.
Was that V2X? Is that right? No? Vehicle-to-something?
Michael: No, it's really just the infrastructure monitoring vehicle speeds. Essentially it's the equivalent of a speed camera that's attached to a red light. And I think it's a great idea, particularly for things like school zones where you wanna make sure people are going a certain speed.
Look, this is the kind of technology that could be used in cities and urban areas to slow people down when they're going too fast. If you're going the speed limit, you have a line of green lights five miles long, but if you're speeding, those lights start to turn red ahead of you to slow you down.
Now, that's probably far more expensive than getting intelligent speed assistance, or an even better system, into vehicles to ensure people won't speed. But it's something that could work, and there are certain use cases where it makes a lot of sense right now, particularly the school zone use case they were talking about in the article.
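The speed-responsive signal Michael describes can be reduced to a simple rule: hold red by default, and show green only for traffic measured at or below the posted limit. The sketch below is purely illustrative (the function names, the corridor idea, and the thresholds are assumptions, not Brossard's actual controller logic):

```python
def signal_state(measured_speed_kmh: float, limit_kmh: float) -> str:
    """Hypothetical 'Fred light' rule: the signal rests on red and
    only turns green for a vehicle measured at or below the limit."""
    return "green" if measured_speed_kmh <= limit_kmh else "red"


def green_wave(speeds_kmh: list[float], limit_kmh: float) -> list[str]:
    """Extend the idea to a corridor: each signal reacts to the speed
    measured at the previous one, so a compliant driver sees a run of
    greens while a speeder sees reds start to appear ahead."""
    return [signal_state(s, limit_kmh) for s in speeds_kmh]


# A driver under a 30 km/h school-zone limit gets green; a speeder is held.
print(signal_state(28.0, 30.0))  # green
print(signal_state(41.0, 30.0))  # red
```

In practice the measurement side would be the same hardware as a speed camera, which is why Michael compares the two.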
Fred: Two Freds are better than one, so I'm gonna go with that. There could be a fault here: the assumption is that unsafe drivers are going to stop for a red light, and I don't think that's a given. But it should certainly cause most drivers to notice what's going on and approach with more caution.
So I think it's a fabulous idea. Fabulous idea. Love the name.
Anthony: All right, I think that's three thumbs up for the Fred Light. So Michael, you just mentioned intelligent speed assist, right? And this is a pretty neat idea too. There was a webinar a couple weeks ago saying that, hey, we can have fleet vehicles, city and municipal fleet vehicles, and we can have them automatically stay at or below the speed limit.
They'll never be speeding. And we can start getting people used to it this way. And we've talked about this with automated vehicles: automated vehicles shouldn't disobey the law. If the speed limit's 50 miles per hour, or in Montana 150 miles per hour, they will not go 151 miles per hour.
I don't want an autonomous vehicle doing 151 miles per hour, cuz that's just too slow for me. So how exactly does this work? Is this more of the infrastructure talking to these vehicles? What's the mechanism in this?
Michael: In the intelligent speed assist case, there is no infrastructure. It's like how Tesla sees speed limit signs.
The vehicle's cameras are capturing speed limit signs. And this could be done in a better way, I think. I don't think cameras are the ultimate solution. I think that ultimately vehicles with GPS will know what the speed limit is in the area they're in, which is probably going to be more accurate than signs that can be obstructed or hard to see in certain weather conditions.
Fred: You actually see this on... I'm sorry, Michael, go ahead.
Michael: And then, in time, it basically gives the driver a warning, an escalating warning possibly. And then ultimately, the systems that we would encourage can slow vehicles down. Basically, if you're going to ignore the fact that you're speeding and you refuse to slow down, we're gonna do it for you.
We're not gonna allow you to kill people with speed, which is something Americans don't like. And that's exactly why they're putting it into government vehicles first, because you can pass a law. It makes sense for governments and fleets, even, I think, to use this technology, cuz it lowers their liability.
But also, in this case, government workers frankly don't really have a choice here. They're gonna be forced to drive the speed limit by their employers. The rest of us aren't going to be forced by anyone. I think we've got a long, hard road to getting speed assistance into vehicles, mainly because so many people seem to think that speeding and going as fast as they want is some kind of basic human right, which is absurd.
But if you look at some of the groups out there that are supporting motorist rights, they think that these speed limiting devices are the equivalent of Satan. So it's a really polarizing topic around freedoms: the freedom to speed, or the freedom to kill, as I would call it. And intelligent speed assistance is gonna be mandatory in Europe in a few years.
At some point it's going to be mandatory in America. If we could all just accept it now and get it into cars, we could save thousands of lives between now and the time we finally do have it put into vehicles. I wish it was an issue that wasn't quite as fraught with political considerations as it is.
We're seeing speed kill on our roads more and more every day, the last three to four years. And when you're putting faster vehicles on the road (electric vehicles are gonna be faster, they're going to have better acceleration) and you're not putting anything out there to stop the inevitable consequences of that, then we're going to see problems.
Fred: Are you talking about gun ownership now, or are you talking about cars?
Michael: No, because your car isn't a gun.
Fred: I guess I digressed a little bit. But I've seen bumper stickers stating that certain commercial vehicles are already using this, and it's GPS-based for the most part, I assume, because I don't think the technology is there to do it visually.
Another thing that is important to note, excuse me, is that all of these intelligent speed assistance systems have a bypass option. If you stomp on the accelerator, you will be able to exceed the speed limit. It just requires conscious action on the part of the driver to go ahead and exceed the speed limit.
I'm not sure how this would work with an automated driving system, since it's hard for one to know exactly when to violate the speed limit in order to achieve some kind of safe state. But anyway, back to the point: it is an option, and it is something that can be overridden by the driver if and when necessary.
Michael: Which, when that rule comes up, we will be firmly against the idea that it's optional. It makes absolutely no sense to spend all this money putting speed assistance into vehicles that just warns people that they're going over the speed limit.
The worst speeders out there already know this. They already know they're going too fast. They know the speed they're going, they know what the speed limit is, they know they're violating it. Having the car say, oh, I'm gonna slow you down, and then having the option to say, oh no, you're not,
just doesn't work. Ultimately that's not going to address the hardcore speeders that we see causing a lot of the carnage on our roads. It's unfortunate that's the case, but even in Europe, that's the system they're going with. Ultimately, I would hope that we would have vehicles that are simply restricted by technology to the speed limit in the area, and that you don't have this option anymore.
Speeding isn't a human right. Get up earlier.
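The warn-then-limit behavior Fred and Michael describe, with a deliberate accelerator "kickdown" as the conscious override, can be sketched roughly as follows. The class name, pedal threshold, and return values are illustrative assumptions, not any manufacturer's or regulator's actual design:

```python
class SpeedAssist:
    """Sketch of an intelligent speed assistance controller: allow
    requests under the limit, cap speed otherwise, unless the driver
    deliberately overrides by flooring the accelerator (kickdown)."""

    KICKDOWN_THRESHOLD = 0.95  # pedal position treated as a conscious override

    def __init__(self, limit_kmh: float):
        # In a real system the limit would come from GPS map data
        # or sign-reading cameras, as discussed in the episode.
        self.limit_kmh = limit_kmh

    def command(self, requested_kmh: float, pedal_position: float):
        if requested_kmh <= self.limit_kmh:
            return requested_kmh, None      # within limit: no intervention
        if pedal_position >= self.KICKDOWN_THRESHOLD:
            return requested_kmh, "warn"    # conscious override: warn only
        return self.limit_kmh, "limit"      # otherwise cap at the limit


isa = SpeedAssist(limit_kmh=50)
print(isa.command(45, 0.3))  # (45, None)    - under the limit
print(isa.command(70, 0.5))  # (50, 'limit') - speed capped
print(isa.command(70, 1.0))  # (70, 'warn')  - kickdown override
```

Michael's objection is to the third branch: a mandatory system, in his view, would drop the kickdown override entirely.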
Anthony: Michael, you're obviously avoiding the all-too-common scenario of the rabid bear strapped with a gun on jet-powered roller skates that's chasing you. You're driving along on a nice Sunday doing 30 miles per hour in a 35, and then this bear is coming at you on jet-powered roller skates.
You gotta speed up. Okay? This happens.
Michael: When roller-skating bears start killing 10,000 people a year, we can talk about it.
Anthony: They're getting up there. I read about this. It was on...
Fred: Fake news. I don't think it happens nearly as often as Wile E. Coyote.
Anthony: He's more of a failure than anything else.
Let's not get into that. On a serious note, this sounds like something the insurance companies would actually really incentivize. Hey, you have this in your car, we're gonna drop your rates some significant amount. I'd sign up for that, because auto insurance in Manhattan is really expensive.
So please, my auto insurance company, please go ahead and offer me this. I will do it.
Fred: Some insurance companies do that already. You can put a dongle in your car and use that to monitor your behavior, and they'll drop your rates by 59 cents.
Anthony: I think this is a family show. Let's not talk about putting a dongle in my car.
Speaking of dongles and cars, I think it's time for the Tao of Fred. We've had the Fred Light; now we have the Tao of Fred. We have so many things that are Fred-related. This is, again, the Consumer Autonomous Vehicle Bill of Rights, number 12. Oh my word. How many do we have? Are they gonna go on forever?
Is this the penultimate one?
Fred: A few more. I think we're getting down to penultimacy. Yeah. You've now entered the Tao of Fred. Moving ahead here, number 12 is: the AV shall collect and report operational data to support research and development to improve safety, performance, and reliability.
And thanks to Michael for the initiative of putting this in. As the quid pro quo for operating on public roads to evaluate their AVs, there should be a corresponding requirement that AV companies release their data to the public, so that people can evaluate how safely these vehicles are operating and, more importantly, the circumstances in which they're not operating safely.
This gets back to a lot of the other issues we've talked about: release of data and the accessibility of performance data. But this one stands alone, because the AV maker simply must expedite and make it easier for a responsible third party to request, recover, and interpret the data that supports investigations of failures, crashes, or cybersecurity violations, all of which can induce safety-critical and life-critical situations in an autonomously driven vehicle.
And when I say AV, I'm of course being expansive and including SAE Level 2 cars, SAE Level 3 cars, and Level 4. Anytime you're in a situation where the driver, or the driver's representative, is allowed to take their hands off the wheel and have the vehicle drive itself for any extended length of time, meaning multiple seconds, it is in fact subject to this requirement: all the data associated with a crash needs to be accessible to anybody who is authorized to look at it.
Open standards should be used so that there are no proprietary barriers to interpreting the data that's available in the car. And the reportable data must include unfettered access to relevant geographical and vehicular motion dynamics parameters, software, firmware, configuration, video, and other vehicle- or event-specific data relevant to an investigation.
In particular, as we've mentioned a couple of times, you need assessment of uninspectable safety- and life-critical features included in the reportable data. The implication is that manufacturers must design in built-in test capability and built-in diagnostic capability, so the vehicle is able to understand when it is operating outside of its design operating envelope,
and also to inform anybody who's in the vehicle when it has exceeded the safety limits that have been designed into it. So there's a lot that needs to happen in order to support this particular requirement. Reminding the listeners: if you want to review these in detail, you can go to autosafety.org/av hyphen bill hyphen...
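The kinds of reportable elements Fred lists (location, motion dynamics parameters, software and firmware configuration, video) could live in an open, non-proprietary format that any authorized investigator can read. A minimal JSON-style sketch follows; every field name here is an illustrative assumption, not an actual standard:

```python
import json

# Hypothetical open-format event record; field names are assumptions
# for illustration only, not a real reporting schema.
event_record = {
    "event_id": "example-0001",
    "timestamp_utc": "2023-06-01T12:00:00Z",
    "location": {"lat": 37.7749, "lon": -122.4194},
    "motion_dynamics": {"speed_mps": 11.2, "yaw_rate_dps": 0.4, "brake_pct": 0},
    "software": {"stack_version": "1.2.3", "firmware": "fw-9.8"},
    "media": ["camera_front.mp4"],  # references to stored video, not embedded
}

# Serializing to JSON keeps the record readable without proprietary tools,
# which is the point of the open-standards requirement.
print(json.dumps(event_record, indent=2))
```

The design choice is the one Fred argues for: a plain-text, self-describing format means no manufacturer-specific decoder stands between an investigator and the data.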
Anthony: I'll come up with a better URL, cuz that's...
Fred: That's two more, right?
All right. Yeah, we're gonna have to clean that up a little bit, but it's AV hyphen bill hyphen of hyphen rights. And that's oh so easy; it rolls right off your tongue. Come on.
Michael: I love that. We need open standards for our URLs. Come on.
Anthony: I think it's, isn't it in the main navigation? It is. I go to autosafety.org and it's the third link up top: AV Bill of Rights.
Fred: Anthony's getting a little defensive about his website design, but that's okay. We'll go with that.
Anthony: No, I appreciate that you said forward slash instead of backslash. That drives me nuts. People say backslash; there's no backslash in there. Yeah.
Fred: Getting back to the fundamental issue: we, the public, paid for the damn highways.
We, the public, have paid for the infrastructure around the highways. It should not be a free asset for the AV manufacturers to use, turning us all into crash test dummies and using the highways for free without giving anything back to the public. End of rant.
Michael: Agreed. We see this problem taking place in the non-AV world a lot, involving some of the Tesla crashes that have been investigated by NTSB and NHTSA, where Tesla's uncooperative in providing all of the data they're collecting.
I know that in some cases they've only provided EDR data, which is very limited, when they are literally holding thousands of other elements of data, down to the millisecond, on virtually every crash that takes place in a Tesla in America. So they've been really non-cooperative in a lot of federal investigations around the Autopilot and Full Self-Driving issue.
And moving to vehicles that can actually drive themselves, in the AV context, we just wanna make sure that the feds and state and local crash investigators are able to figure out what's happening. And they're not gonna be able to do that with the stuff that's on an EDR, which is an ancient dinosaur of a data recorder that you have in your vehicle unless you own a Porsche.
So that's why we think open standards are necessary here, so that manufacturers can't hide crash data behind the cloak of proprietary secrets. These are things that are taking place on public roads. They're impacting Americans. Give us the data so we can figure out what's going on.
Fred: I like to refresh people's memory: EDR is event data recorder, and it only has a very limited set of data, a few parameters, from about five seconds before the crash until about 10 seconds after the crash. Something like that.
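Fred's description of the EDR's narrow window (roughly five seconds before a trigger to about ten seconds after) maps onto a simple ring-buffer design. This is a schematic sketch of that idea only; the class, sample rates, and method names are assumptions, not any real recorder's implementation:

```python
from collections import deque

class EventDataRecorder:
    """Schematic EDR: continuously buffers the last pre_s seconds of
    samples and, on a trigger, keeps recording for post_s more seconds.
    Everything outside that window is discarded, which is why Fred
    calls the data set 'very limited'."""

    def __init__(self, sample_hz: int = 10, pre_s: int = 5, post_s: int = 10):
        self.buffer = deque(maxlen=sample_hz * pre_s)  # rolling pre-crash window
        self.post_samples = sample_hz * post_s
        self.captured = None

    def sample(self, reading):
        if self.captured is not None and self.post_samples > 0:
            self.captured.append(reading)   # post-trigger recording
            self.post_samples -= 1
        else:
            self.buffer.append(reading)     # normal rolling buffer

    def trigger(self):
        """Freeze the pre-crash window and start the post-crash capture."""
        self.captured = list(self.buffer)

edr = EventDataRecorder(sample_hz=1, pre_s=5, post_s=10)
for t in range(20):          # 20 seconds of ordinary driving
    edr.sample(t)
edr.trigger()                # crash at t = 20
for t in range(20, 30):      # 10 seconds after
    edr.sample(t)
print(len(edr.captured))     # 15 samples: 5 before + 10 after
```

Contrast this with the thousands of data elements Michael says Tesla retains per crash: the gap between the two is the argument for richer, openly reportable AV data.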
Anthony: It's limited. So speaking of bad companies and companies doing bad things,
let's go into Recall Roundup. We got a couple minutes left.
VO: Strap in, time for the Recall Roundup.
Anthony: And we're gonna start off with a company doing a good thing. Mercedes has a recall around their battery management system, and this is for their EQS series of vehicles, which, if you can afford one of those, you should be donating to the Center for Auto Safety, because those are beautiful cars and they're really expensive. Anyway,
Mercedes is being overly careful again, believing that Federal Motor Vehicle Safety Standard 305 requires a battery management system. Just take it away, Michael.
Michael: It's Federal Motor Vehicle Safety Standard 305. And we've looked at it a good bit, because we talk about battery fires a lot, and that's a standard that NHTSA put into place.
I know we talked about this once before, cuz I remember saying, Anthony, here's an area where NHTSA actually put a regulation out. They put it out almost 20 years ago, and it's preventing battery electrolyte spillage and preventing some first responder issues that happen when these vehicles are in crashes.
So here, Mercedes said, we're gonna recall these vehicles because the battery management system may not be working properly. 305, I don't believe, even requires a battery management system. So there was certainly a big question here as to whether Mercedes needed to pursue this as a recall, but they did, and we applaud them for that.
It's good to see that you get treated well as an owner when you spend that much on a car.
Anthony: Yeah. If you spend a little less money and you bought a Ford: on 2004 to 2006 Ford Rangers, it looks like they might have installed some Takata airbags backwards. Now, for those of you who haven't listened to this show before: if you have a Takata airbag, get it replaced immediately. Do it.
They're bombs. But Ford put some of them in backwards, upside down. And I'm not sure why they're not just recalling these entirely and getting rid of them.
Michael: I'm sure. These were some of the earlier Takata bags that were very dangerous. You're probably better off having an airbag installed backwards in your vehicle in this recall than you are leaving one of the old Takata bags in your vehicle,
cause these are really old. I think these are 2004 to 2006 Rangers, so they're right near the top, the most dangerous priority groups in the recall. I'm trying to remember exactly what happened; basically, here they just installed the airbags, and I don't know if it was backwards or otherwise improperly.
And they're going back to do a check to make sure that, out of this big group of airbags they installed, they did them all right.
Anthony: I'm sure they're just replacing them, like, hey, we put in your airbag backwards, but it's also a ticking time bomb, so here's the replacement.
Michael: These were already replacements.
This may even be the second round of replacements; I didn't see. But on some of these vehicles, they had an initial recall where they basically put in the same airbag that has the same problems. But because it was 10 years newer, it wasn't yet subject to the humidity conditions and the time factor that caused the airbag explosions, the inflator ruptures.
So you were getting a new bad airbag that ultimately might have the same problem, and then they were coming in later with an even better airbag and replacing those. So I'm not sure at what point in the process this happened, but any of the airbags, even wrongly installed, were probably better than the first airbags these cars had.
Anthony: All right. And for our final one, for all of our listeners who own Bentleys and paid attention to last week's episode, where we discussed securing items in cars, cuz in the event of a crash everything is a projectile: Bentley's rear entertainment screen brackets may fail. Bentley's recalling certain 2021 to 2023 Flying Spur vehicles.
I don't know how many vehicles that would be, probably like eight. The rear entertainment screen retention bracket, located on the rear of the front seats, may have been incorrectly installed. So in a crash, this rear entertainment screen may go flying around as a projectile. If you have this, put it in your trunk.
Michael: That's not where... you can't watch video in your trunk. You need to go get it fixed so that it doesn't fly around your car.
Anthony: I'm putting it in the trunk, because obviously if I'm driving a Bentley, I've kidnapped somebody who owned a Bentley, and I've put them in there.
See if they can watch it. Give them something to do. I don't wanna be a total jerk to them.
Fred: Yeah, you still don't want to be rude after you've stolen their car.
Anthony: I don't. Rude? Their trunk is gonna be nicer than my car's front seats. Let's be honest, it's a Bentley. Come on. Probably like baby yak skin in there.
Michael: An even better one on this issue came out this morning too. Jaguar recalled about 12,000 of its vehicles, Land Rover and Range Rover vehicles, because a second-row seat armrest storage compartment latch fails, which allows objects in the armrest storage compartment to be unsecured in the event of a crash.
That's a pretty solid recall, but how big are these armrest storage compartments, and what can you fit in there that would injure you in a crash? That's an interesting one as well.
Anthony: That's where I put all my knives.
Michael: Oh, that makes sense.
Anthony: Yeah. That's how you gotta do it.
Hey, listeners, this is just one of many reasons that you subscribe and donate and tell all your friends: we'll talk to you about recalls in absurd and fantastical ways. Not even really that absurd. If you have a Takata airbag, that's a pretty serious thing. Get that fixed. Where can they go to find out if they still have a Takata airbag?
Is it recalls.gov? No, there's a separate airbag site, isn't there?
Michael: I think it's safeairbags.com. Yeah. And that's where you can go search your VIN, see if you qualify, and then get in touch directly with your manufacturer immediately. They sometimes can even come out to your house and repair it right there.
Yes. So everyone do that.
Anthony: Absolutely do that. For everybody else: I feel bad about your Bentley. I will let you out of the trunk, because I'm gonna go to the grocery store and pick up some things.
Michael: They've got a Federal Motor Vehicle Safety Standard for that, Anthony. We talked to Jeanette about that,
remember? They can hop outta your trunk anytime they want.
Anthony: I don't know if they're listening. Okay. Come on. Yeah. So if you're in the trunk, yeah, pull the release handle. Thanks for listening again. Please go to autosafety.org and become a monthly donor. We'll have a new Fred story.
Oh, how about the story of Fred's experience at Woodstock? How's that one? Okay, five new monthly donors, and Fred will reveal to you his latest felony. It's past any sort of statute of limitations, and this does not involve drugs. We'll share that story. We need five new monthly donors.
You can do it. I know you can spare five bucks a month. That's it. 60 bucks a year. Easy peasy. You spend more than that on trunk release handles.
Fred: I think it's only a misdemeanor, but who's gonna quibble?
Anthony: Hey, good thing there’s a lawyer on the show. Thanks for listening. Thanks for subscribing.