The Ultimate Gaslight: Sudden Unintended Acceleration
Phil Koopman joins us to win the Lifetime Achievement Award for Gaslight Nominee with Sudden Unintended Acceleration. From the Audi 5000 to Tesla, Phil walks us through the nonsense that it's always the human's fault and instead explains how bad hardware and software engineering has led to many of these dangerous situations.
- https://users.ece.cmu.edu/~koopman/pubs/koopman18_safecomp.pdf
- https://www.youtube.com/watch?v=DKHa7rxkvK8
- https://www.safetyresearch.net/how-ford-concealed-evidence-of-electronically-caused-ua-and-what-it-means-today/
- https://www.autoevolution.com/news/nhtsa-has-yet-to-answer-the-petition-about-tesla-s-sua-events-but-where-are-they-251095.html
Transcript
note: this is a machine-generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.
Anthony: You are listening to There Auto Be A Law, the Center for Auto Safety Podcast with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.
Hey everybody. It's Wednesday, November 5th, after election day. I hope you've already voted, because it's too late now, right? Joining us this week again:
Special Guest: Phil Koopman on Sudden Unintended Acceleration
Anthony: We have the man who doesn't need an introduction, Phil Koopman, and he's here to talk to us about the world's longest ongoing gaslight. I think we're approaching 40 years — the gaslight of all time, sudden unintended acceleration.
Welcome, Phil.
Phil Koopman: Thanks for having me on. Great to see you guys.
Anthony: Absolutely.
Understanding Sudden Unintended Acceleration (SUA)
Anthony: Let's talk sudden unintended acceleration, and let's treat this as you [00:01:00] did rear view cameras — the whole be-all and end-all of it. Okay, here's why. Starting back in the seventies, drivers are out on the road, they're going along, and all of a sudden the car's speeding up, and they're like, what happened?
My car's the problem. They report this. The manufacturer says: no, you, driver, you are the problem. And Volkswagen must —
Phil Koopman: — do this. You've got your foot on the wrong pedal. That's right, exactly. Yeah. If you listen to them, it's always the driver's fault. And there are crashes. There are alleged deaths.
There’s all this stuff going on and it has been going on for decades.
Anthony: Yeah. And Ford, Volkswagen, Toyota, also, I believe.
Phil Koopman: Toyota, definitely.
Anthony: Yeah. Yeah.
Phil Koopman: That's the poster child, right?
Yeah.
Anthony: And so this is an ongoing thing, so tell us. Why are drivers so bad they can’t tell the difference between the brake and the accelerator?
Phil Koopman: There's actually no evidence to support that at all, but that's part of the story. Okay, [00:02:00] so let's go there. Or I shouldn't say "not at all" — the evidence is widely misstated and doesn't mean what you think it means. But let's be methodical. Let's try and go through this. And I'm not selling anything.
I've got no ax to grind other than I don't like being gaslit. Okay, perfect. So that's why I'm here today. If you don't mind, let's start with a little bit of terminology, because there are a lot of lawyers wrapped up in this topic, and they go after the superficialities and the terminology to try and twist and deflect things away from an honest discussion of what's going on.
And they'll say "that wasn't sudden unintended acceleration because of the definition" — but that's not relevant. So SUA, sudden unintended acceleration, is what it's often called, but that's not what it really is. A better term is uncommanded acceleration. The reason it's SUA is historical. But fundamentally — and you want it like we did with the cameras — I'm gonna go all the way down.
All right.
Technical Aspects of Uncommanded Acceleration
Phil Koopman: Back in the bad old days, when I learned to [00:03:00] drive on a 66 Mustang with a stick shift and an inline six — that was actually after I'd already learned to drive tractors, but we're not gonna go there today — there's an accelerator pedal. Some people would still call it a throttle, or accelerator, gas pedal, whatever.
The pedal on the right that makes the car go — the go pedal, right? It used to be that pushing the pedal down would increase the volume of fuel-air mix being fed to your engine. It didn't actually control the gas at all; it controlled the air. And there's this thing called the carburetor, but now I'm just acting like I'm really old, so we're gonna move on.
Then it switched to fuel injection, and now it's the amount of electrons, all this stuff. But originally it was the amount of fuel-air mix that was combusted, and then, when they got capable computers inside the car, they switched it to instead be torque on a newer car.
It's not actually controlling the energy flow; it's controlling the torque. More torque means more push to get the car moving [00:04:00] faster. Okay, so the baseline case of uncommanded acceleration is your foot's not on the pedal and the car decides to go anyway.
There are more subtle ones, such as your foot's resting gently on the pedal but it commands full acceleration. So uncommanded acceleration is the car being commanded to go faster when you are not doing that. I didn't say "doing it on purpose," which is where it gets interesting.
If you were not telling it to go faster, it's not supposed to go faster. When it does, that's uncommanded acceleration. Now, there are ways this can happen. There are documented ways this can happen — absolutely documented, no question. There have been numerous recalls. I've seen a lot of other stuff I can't talk about specifically, but you can have problems in the pedal sending the wrong electrical signal to the engine.
You can have problems in the hardware design of the engine controller that misreads the pedal signals. You can have problems in the software that [00:05:00] computes the wrong amount of acceleration. You can have problems in the circuitry that controls the throttle angle or the fuel. And you can have floor mats that hold the pedal down.
You can have a person put their foot on the pedal thinking it's the brake and press on it when it's really the gas pedal. All those things are possible. And with very high likelihood, or with certainty, all of those things have happened. And I have video of some of these things happening.
So all of these things have happened. There have been recalls. To be clear, I never said drivers never press the gas by mistake. I didn't say that. It's possible for drivers to press the gas by mistake — the data I've seen suggests that's a very rare occurrence, but I'm not gonna say it's zero.
That can be true. So why is it people have trouble believing the computer can cause the problem, especially when we have many recalls for exactly that [00:06:00] having happened, documented and fixed? Why is it people refuse to believe the computer could be the problem, and always blame the person?
And that's the discussion. So that's my intro — let's make it a little more interactive. But the tee-up on the gaslight is that for decades, literally decades, the car industry and NHTSA have both acted in a way to blame the human rather than consider the possibility of software.
And I’ll be happy to dive into that in detail. It’s really quite a story, but that’s what’s been going on.
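To make concrete the distinction Phil keeps drawing — the computer acts on what it thinks the pedal says, not on where the driver's foot actually is — here is a purely illustrative Python sketch of a throttle-by-wire path. It is not from the episode and not any manufacturer's design; the voltages, scaling, and fault modes are invented for the example.

```python
# Illustrative sketch only: a toy throttle-by-wire pipeline.
# The driver's foot sets a physical pedal angle, but the torque command the
# engine sees is the output of sensing + conversion + software. A fault at
# any stage produces "commanded acceleration" with no driver input.

def pedal_voltage(pedal_angle_pct, wiring_fault=False):
    """Pedal sensor: angle (0-100%) -> voltage (0.5-4.5 V)."""
    v = 0.5 + 4.0 * pedal_angle_pct / 100.0
    if wiring_fault:           # e.g., resistive ground fault or EMI coupling
        v = 4.5                # sensor output pegged high regardless of the foot
    return v

def adc_read(voltage, stuck_bit=None):
    """Engine controller ADC: voltage -> 10-bit count."""
    count = int((voltage / 5.0) * 1023)
    if stuck_bit is not None:  # e.g., hardware defect or single-event upset
        count |= (1 << stuck_bit)
    return count

def torque_request(adc_count):
    """Control software: ADC count -> torque request (0-100%)."""
    return 100.0 * adc_count / 1023

# The driver's foot is completely off the pedal in all three cases:
print(torque_request(adc_read(pedal_voltage(0))))                     # ~10%: near idle
print(torque_request(adc_read(pedal_voltage(0, wiring_fault=True))))  # ~90% with no foot input
print(torque_request(adc_read(pedal_voltage(0), stuck_bit=9)))        # large request from one bad bit
```

In a pipeline like this, the event data recorder would log the torque request and the sensor reading, not a photograph of the driver's foot — which is exactly the gap Phil returns to later.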
Anthony: Yeah, that’s fascinating.
Legal and Historical Cases of SUA
Anthony: I've never personally experienced sudden acceleration. I know one of my brothers did when he was 16, learning how to drive, and he hit the accelerator by accident, ran over a woman's bushes, and she was very nice and said, oh, don't worry, that happens all the time.
Okay. Which I thought was so strange, but okay,
Phil Koopman: So let's go with that. People do hit the wrong pedal, but it's rare, and when they hit it, they usually recognize it. And the difference between generic un[00:07:00]commanded acceleration and sudden unintended acceleration is that if you hit the wrong pedal by mistake, with high probability you notice it pretty quickly and you take your foot off the pedal.
It's very rare to have someone with their foot stuck on the wrong pedal. There'll be some expert witnesses who say, oh yeah, that's a thing. There's a difference between it being a one-in-a-million thing and it being responsible for every single possible crash. The first one?
Yeah. The second one, no, not so much. But if you're in a parking lot and you've got a few feet, by the time your perception-response time kicks in and you say, oops, wrong pedal — crash. You don't have time to take your foot off. And so in some of the early cases, like the Audi 5000s, which I suspect we'll get to, there were a lot of parking lot crashes and not so many highway crashes.
Why would that be? If you make a mistake, you don't have time to recover before you hit something.
Anthony: Okay.
Phil Koopman: Not all of them are like that. Some of them are long duration highway events. We can get to all of these.
Anthony: Yeah.
The Role of Computers and Software in SUA
Anthony: So I [00:08:00] want to jump in, 'cause you hit on one of my favorite topics: people don't question computers.
I don't understand this. It is bizarre. I think we've all run into this at some point in our lives, where you come across some bureaucrat: "It says it here on the computer." I'm not actually a five-foot-two Malaysian woman — I'm standing right in front of you. "But the computer says you are."
"I can't help you."
Phil Koopman: Apparently the worst thing that can happen to you with a computer is the computer thinks you’re dead and you’re not. Apparently. That is just a nightmare to deal with.
And this happens on a regular basis. Yeah.
Anthony: Yeah. So these cases have gone to trial and whatnot, and what happens?
Do juries get misled, confused? Or is it just some sort of weird human behavior where we all think, well, computers are this magical thing that no one really understands, so it must be correct and it has to be a human that made the mistake — skipping the fact that humans built and [00:09:00] programmed the computers.
Phil Koopman: That question, like most of this area, is pretty tangled up. So that's why you're —
Anthony: here.
Phil Koopman: There's the history of NHTSA investigations, which I'll get to, but we're not gonna go there now. Let me just hit the question. What we'll see on a lot of these crashes is the computer says that the foot was on the gas pedal, not on the brake.
And they say: therefore the driver pressed the wrong pedal. And there is a presumption in law — I am not a lawyer, IANAL, whatever, but I've certainly been involved in this area in a legal context quite a lot — and my understanding is there's a presumption:
the computer is correct and you have to prove otherwise. So unlike innocent until proven guilty, the computer is presumed correct unless you can prove it's wrong. And if the computer says your foot's on the gas, that makes you guilty, and the onus is upon you to prove you're innocent.
And I say guilty because there have [00:10:00] been cases where a driver has gone to jail — they alleged uncommanded acceleration, the computer said their foot was on the gas, and they killed someone or hurt someone. This has happened more than once: drivers have gone to jail, or been on trial and had to figure a way out.
But the catch is that computers can lie. Computers have faults in them. It's a thing. Anyone who thinks a computer's perfect has not been paying attention. I listed all the things that can go wrong. And then there's single event upset, where cosmic rays come down and flip bits.
No, I'm not making that up. That's a thing. It's documented. Data centers pay extra for error-detecting memory because it's such a problem for them. There was a bunch of Cisco routers that got recalled because it was causing router failures. It's a thing, right? And when you have one car, yeah, probably not — but when you have millions of cars, this is gonna happen.
It's easily a one-in-a-million kind of thing. So when a computer gives you an answer, [00:11:00] it's usually the right answer, but not always. And that's part of it. When the computer says your foot was on the gas pedal, it doesn't mean your foot was on the gas pedal.
It means the computer thinks your foot was on the gas pedal. Maybe the computer suffered a failure or had a design defect — like the gas pedal was reading the wrong voltage, and your foot wasn't on it but it was saying your foot was, which is a documented phenomenon that has happened for sure.
Okay. Then your foot wasn't on it, but the computer says it was, 'cause the computer doesn't know what your foot's doing. It knows what the electrical signals are. Or a software error messed up a calculation and said "this means your foot's on the gas pedal" even though it's not, or there's a wiring fault, and so on.
You get the picture. So the computer could be wrong, but we presumptively trust it.
Michael: Got it.
Fred: Cool. You said that a cosmic ray could flip a bit.
Phil Koopman: Yep.
Fred: [00:12:00] What is a cosmic ray? Would you explain that to people in 10 words and without relativistic references?
Phil Koopman: Sure. High-energy particles from outer space come down into the atmosphere.
They knock into molecules of air, which knock into silicon chips, and that breaks up an atom in the silicon chip, dividing it into positive and negative charged particles. And when negatively charged particles move through silicon, that's called electricity. So it sends a little electric pulse that flips a memory bit.
This is a well-documented phenomenon.
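For anyone who wants to see what a flipped bit actually does, here is a tiny illustrative example — invented numbers, no real hardware: a single-event upset in a stored pedal value turns a near-idle reading into roughly half throttle, and even one parity bit, the simplest form of the error-detecting memory Phil mentions, at least reveals that something changed.

```python
# Illustrative only: a single flipped bit in a stored 8-bit pedal value.
stored = 0b0000_0101             # pedal at 5/255 of full scale (foot nearly off)
flipped = stored ^ (1 << 7)      # a single-event upset flips the top bit
print(stored, flipped)           # 5 -> 133: roughly half throttle from one bit

# Why safety people ask for error-detecting memory: even one parity bit
# reveals that a single-bit change happened (though not which bit).
def parity(x):
    return bin(x).count("1") % 2

stored_parity = parity(stored)
print(parity(flipped) == stored_parity)   # False -> corruption detected
```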
Fred: That sounds bad. Are you sure you’re not making this up?
Phil Koopman: I am sure there’s been data since the seventies.
Anthony: Yep.
Fred: Thank you.
Anthony: Jewish space lasers, right? That’s what it is. No. All right, fine.
Phil Koopman: Not touching that one, Anthony.
Anthony: Okay.
Phil Koopman: Because this one’s real.
Anthony: Yeah. Okay.
Phil Koopman: It's rare. It doesn't happen all the time, but when you have millions of things — this is why safety people say, [00:13:00] hey, I want error-correcting memory; hey, I want two computers that check each other. And most, but not all, electronic throttle controls have two computers that check each other, because these kinds of things happen, and that's why you design it that way.
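A hedged sketch of the "two computers that check each other" idea Phil describes — not any vendor's actual architecture; the tolerance and the fail-safe behavior are assumptions made up for the example. A monitor channel independently recomputes the torque request and forces a safe state when the two channels disagree.

```python
# Illustrative sketch of a main/monitor cross-check, not a real ECU design.
def main_channel(adc_count):
    return 100.0 * adc_count / 1023           # torque request, percent

def monitor_channel(adc_count):
    return 100.0 * adc_count / 1023           # independent recomputation

def arbitrate(main_adc, monitor_adc, tolerance_pct=5.0):
    t_main = main_channel(main_adc)
    t_mon = monitor_channel(monitor_adc)
    if abs(t_main - t_mon) > tolerance_pct:   # channels disagree
        return 0.0                            # fail safe: cut torque (limp home)
    return t_main

print(arbitrate(100, 102))   # agreement -> torque request passes through
print(arbitrate(100, 900))   # disagreement (fault on one input) -> 0.0
```

The catch, which comes back with the Toyota pedal design later in the episode, is that the cross-check only helps if the channels are truly independent; if both read the same faulted input, they agree on the wrong answer.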
Anthony: Okay. So Michael sent around this thing that we're posting in the show notes, from safetyresearch.net, titled How Ford Concealed Evidence of Electronically Caused UA and What it Means Today. And this is fascinating — it goes back to the seventies. A Ford engineer, William Fullmer, warned about the risk posed by electromagnetic interference and cautioned that to avoid disaster,
it was imperative to incorporate fail-safe protection against EMI in the system design. Ford obtained a patent on how to deal with this. But later on, a different division inside Ford said, nah, this isn't a problem. And of course Ford puts this out in their cruise control systems and it becomes a real problem.
And then it becomes a worse problem, 'cause what does Ford do? They're [00:14:00] like, let's not tell NHTSA what we know. Let's hide all of this information. And is this —
Phil Koopman: — to be clear, Anthony, this is pre-software, right? This is not software. This is just —
Anthony: — electronics, printed circuit boards. Yeah.
Phil Koopman: It's electronics — it's resistors, capacitors, inductors.
It’s not software. Okay.
Anthony: No, this is not — this was on the road until what, Michael? The mid-eighties? He's thinking. He's muted, but he's still thinking.
Phil Koopman: There were computers then. But I don’t think this recall was a computer
Anthony: — thing. Yeah, I don't think this was software.
This was circuit boards — printed circuit boards that controlled their cruise control. Basically, similar to what you were talking about with cosmic rays, this is just electromagnetic interference, which is even more localized. You don't have to have some random electron from space.
Phil Koopman: When you send electrons down a wire, they set up a magnetic field, and if you're not careful, that can induce a current in a different wire and you get interference.
Anthony: Yeah. So this is interesting, 'cause it was what, 10 years later, NHTSA opens an investigation. Ford knows the root [00:15:00] cause of this, but they're like, let's just keep this on the DL, folks. Let's not tell anybody. I still don't understand why corporations haven't learned that they always lose when they do this.
The coverups worsen the crime.
Michael: Yeah. I wonder if they always lose when they do this.
Phil Koopman: I think it is not a true statement that they always lose. And that's why there's an incentive to try. That's the problem.
Anthony: I’m
Michael: optimistic. As an example,
Phil Koopman: I know for sure they don’t always lose yeah.
Anthony: Thankfully.
Michael: Yeah. As a possible example of that: NHTSA opened an investigation into Tesla's sudden unintended acceleration a few years back, and when NHTSA reviewed Tesla's data on the matter, Tesla's data said that a hundred percent of the events were caused by pedal misapplication.
I don't know if NHTSA —
Phil Koopman: Mike, let me unpack what that means. When Tesla says a hundred percent of them were caused by pedal misapplication, more rigorously what they mean is the data [00:16:00] shows commanded acceleration. That does not mean the person's foot was on the accelerator.
It means the computer decided the person wanted acceleration, which is not the same thing if it misread the pedal or there's a computational error. And especially when you have cruise control: cruise control can say "I want acceleration," which is not the same as the pedal saying acceleration. Usually there's a readout saying here's the pedal position, and it means the readout for the pedal position was a high number instead of a low number.
But that could be the person's foot on the pedal. It could be a computational error. Could be electromagnetic interference. Could be a lot of things.
Anthony: Okay.
The Audi 5000 Case Study
Anthony: So the first real case of this was the Audi 5000. Is that right? Back in the seventies.
Phil Koopman: That is the big one — yeah, that's the big poster child.
So let me quickly go through what happened with the Audi 5000. If you read the main report — this is all in a government report — what it said [00:17:00] was: people are in parking lots and they press the wrong pedal, and so the car would hit something in the parking lot before they could take their foot off the wrong pedal.
And so it's the driver's fault. Last I checked, if you go to Urban Dictionary even, it'll say that's what happened.
That's not what happened. If you go back to the engineering analysis of the same report — also part of the government report — it says no, that's not what happened. What happened was there was a design defect, a reproducible design defect in the engine controller, that would cause 0.3 g acceleration for no reason at all, without driver input.
So the car would just decide to do 0.3 g of acceleration. Now, that was written, I think, to understate it. I went and looked up the zero-to-60 time for that year and model, and the acceleration that implies is about 0.29 g. So basically it goes wide open throttle.
VO: Wow.
Phil Koopman: It just goes wide open throttle. Yes, it's a wimpy engine, but this is [00:18:00] a long time ago.
Okay. So it goes wide open throttle with no driver input. Documented — the report says that. Right then, the driver, scared out of their wits that their car has decided to go crazy, jabs at the brake pedal, and a small fraction of the time they hit the wrong pedal, because they're like, I'm gonna die, slam a pedal.
And a small fraction of the time they hit the wrong pedal and it takes 'em too long to recover before they hit something. This is all in the report. This is what the report says. Alright, what's the conclusion? The official conclusion is the driver caused the crash. What? No. Objectively, what happened is the car went crazy.
The driver was tasked with stopping the craziness, and if they're imperfect doing so, somehow they caused it. This makes no sense. So that's the Audi 5000. All right. And later on they [00:19:00] concentrated on: are the pedals too small? Are the pedals in the wrong position? No.
What's happening is the car tried to kill you, and people aren't great at recovering from that. Yeah, sure, go ahead and increase the probability that the person will successfully recover from the car trying to kill them. But that's not the root cause.
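A quick check of the arithmetic behind Phil's point that 0.3 g was essentially wide-open throttle for that car. Assuming a zero-to-sixty time of roughly nine to ten seconds — a plausible figure for a mid-1980s Audi 5000, used here only as an assumption — the average acceleration works out to about 0.3 g:

$$ a = \frac{\Delta v}{\Delta t} \approx \frac{26.8\ \text{m/s}}{9.4\ \text{s}} \approx 2.85\ \text{m/s}^2 \approx 0.29\,g $$

So an uncommanded 0.3 g is not a gentle creep; it is roughly as hard as that engine could accelerate the car.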
Anthony: So you're talking about the pedals part, 'cause that's what Audi used as a defense for a while, and I believe Toyota did that too: oh, the pedals were in the wrong place, or people weren't used to that?
Phil Koopman: The NHTSA report enabled them to concentrate on the driver and the pedals instead of talking about the fact that the car tried to kill them.
That’s what happened.
Anthony: That’s amazing.
Phil Koopman: Okay. But that’s only step one.
Anthony: Oh
Phil Koopman: no, it's worse from there. When I was working on the Toyota unintended acceleration stuff, I was in a particular venue in which it was being argued that — so I'll simplify it —
the idea was that every NHTSA report ever has proved it's always the driver's fault. Always, [00:20:00] right? And so I started going through every single report that had been named in this proceeding. Every report referred to a previous report. Every single report that the manufacturer was arguing was relevant — all of them — goes back to the Audi report, which doesn't say that at all.
Anthony: That’s
Phil Koopman: pretty common. So there's this whole house of cards about "it's always the driver," and then when the Toyota unintended acceleration came out, there's a Wired magazine article saying we've known forever it's the driver's fault. It's not true. I have never seen data showing it's always the driver's fault.
I've never even seen data showing it's usually the driver's fault. All the data I've seen points back to this Audi 5000 report, which shows the most you can do is blame the driver for not reacting properly when the car, all on its own, tries to kill them. That's it. That's the data. Now, there's another piece of data I unearthed when I did this, which was an NHTSA survey of [00:21:00] the things that go wrong in cars that lead to crashes.
It was 996 or something, but let's call it about a thousand events. And they said there's one out of a thousand events where we think it was pedal misapplication, and there were 10 or 20 times as many that were other electronic failures, not pedal misapplication.
So even NHTSA's own data shows it's much more likely to be something else than pedal misapplication. Yet we still have this narrative that it's always the driver's fault, and I have never seen data to support that — it's basically a pointer to non-existent findings.
All the conclusions I've ever seen point to non-existent findings. That's where we are now. Did I say it's never the driver's fault? I did not. But the argument "the data shows it's usually the driver's fault" — which is a false statement — "therefore it's always the driver's fault" — which is a logical fallacy — is where we always end up in these discussions.
Anthony: Is Volkswagen still standing behind "it's the [00:22:00] driver's fault"? No, 'cause they went ahead and fixed the software on this. But are they trying to take both sides still?
Phil Koopman: Which one are we talking about?
Anthony: The Audi 5000.
Phil Koopman: You're saying the Audi 5000? Oh.
Anthony: yeah.
Phil Koopman: No — they blamed the drivers, and they did a lot of theater to fix the pedals to reinforce that, and to this day everyone blames the drivers, but that's not even what the government report actually said.
Anthony: But they still fixed the software behind it or the hardware.
Phil Koopman: I don’t know what happened to that.
Anthony: Okay.
Michael: Yeah, I don’t think there was software in those.
Anthony: So it was just hardware?
Phil Koopman: No, I think those were still electronics. Yeah.
Anthony: Okay. Because what I'm getting at is, I'm thinking this is Tesla's business model,
right? Always blame the driver. It is never our fault.
Phil Koopman: And since then, we've seen many companies — this happened with Toyota; they used the same playbook. There was a big Ford case. There have been cases with other companies that I've been involved with that are less newsy.
There's the Honda and Kia stuff going on — that was a big thing for a while. And then the Tesla cases, [00:23:00] and in every case there is a presumptive blame of the driver with no evidence to support it.
That’s where we still are.
Anthony: Let’s jump into the Toyota case.
Toyota’s Throttle by Wire Issues
Anthony: ’cause that was late eighties, early nineties.
Michael: Toyota's was — I don't think it was resolved until the mid-2010s, 2011 through 2014 or so. Yeah.
Anthony: But the reports started — I remember, was it the early nineties?
Michael: Early two thousands, I think — that late-model Camry.
Phil Koopman: Yeah. So what we saw with Toyota and some other companies is they all switched to throttle by wire,
Anthony: right?
Phil Koopman: There's no longer a mechanical linkage. The accelerator pedal's connected to the computer, which reads it as digital data, computes on it, and then commands engine power. And almost all the companies, between about 1998–1999 and 2010, converted their fleets over to that technology.
Anthony: Okay. So before we [00:24:00] jump into unintended acceleration — the benefit of removing that mechanical linkage, is it weight savings, cost savings?
Phil Koopman: There's weight, there's cost, but there's also fuel economy and fuel efficiency.
Anthony: Okay?
Phil Koopman: So if you're controlling airflow into an engine, it's hard to do really sophisticated control optimization.
If it's all just a digital computer, you can be smoother. You can play games too: they're going down a hill, so we can pull all the fuel out and just run air through the engine — there's all sorts of things. And you can optimize the combustion stoichiometry of the fuel-air mixture — especially if you're using fuel injectors, you can do really amazing stuff to improve fuel economy.
And remember, this is when the CAFE limits were coming in, so there was a huge incentive to make engines more efficient. And the software can be really good at that.
Anthony: Okay, so from an engineering point of view and from a customer point of view, there’s a ton of potential benefits to this, but
Phil Koopman: oh, there are actual benefits [00:25:00] for sure.
Going to computers provided real benefits at the fleet level and to the individual driving experience. You could do torque-based control, which is more like: the harder you press the pedal, the harder it pushes — which is not really what happens when you're just controlling fuel and airflow. And so that's great.
Those are all great benefits, but you have to get it right.
Anthony: Yeah, so that's something I think we've talked about on this show before, and I think most people have experienced in their own lives: it takes a really long time to get software right, 'cause you put software out in the world and there's a whole bunch of unknowns you haven't even thought about that are gonna come up over time.
And that's why your phone and your desktop computer constantly have all of these minor updates and whatnot, because software's really hard to do well. Is your impression of what happened in this case that they put this stuff out there and then it's a decade before you really iron out the bugs?
Phil Koopman: That’s half of it.
Okay. And so the software that went out early was [00:26:00] pretty janky, some of it. And even in the companies that had two computers, there was the main software, which was janky, but their point was: we have a backup, we have a safety computer. And the safety computers were a day late and a dollar short.
Anthony: But they’re running the same software.
Phil Koopman: No, it's completely different software.
Anthony: Oh.
Phil Koopman: But that software also had problems, and the hardware had problems. Improper management of redundancy, a bunch of other things.
Anthony: That just sounds dangerous.
Phil Koopman: There's another line of thinking: well, two computers is better, 'cause they both have to be happy for the engine to go.
Anthony: Okay.
Phil Koopman: And the theory is the safety computer will be unhappy more easily. Except if you don't design it right, there are these failure modes that sort of slip through, like holes in Swiss cheese. But there's another perspective, which is that it takes a while to mature the technology.
I get that. But the other thing was, at the same time, the knowledge of safety in the car business was maturing. So towards the end of the two thousands, especially in some companies, I saw dramatic improvements in the quality of the safety software — really bad at the start, a lot [00:27:00] better later. And that "lot better" coincided with the ISO 26262 safety standard coming out and being issued.
And I have every reason to believe that the people who were working on that standard got educated more about safety with material from other areas. And so safety practices started coming into the supply chain towards the end of the two thousands. And that's why we saw a big spate of this stuff in the two thousands.
Your foot on the gas pedal is merely a suggestion; the computer's gonna do what it's gonna do. And the safety stuff didn't really come into play until maybe a decade after the technology happened. And that standard comes in right around 2010. So that's another explanation for why all this stuff happened in that decade.
The Importance of Safety Standards in Automotive Software
Anthony: Do you think that's what we're gonna see more and more of as more and more software gets into cars? That it comes out early and the industry takes X amount of time to catch up to all of these safety implications and [00:28:00] scenarios?
Phil Koopman: We shouldn't — there's no reason to see it.
Around 2000, there was a thing called the MISRA software guidelines, but they were seen as niche, from the UK, and they weren't an ISO standard. By 2010 there's this ISO standard out. So when you have a new car company — the new cool kids, "we're gonna do the software-based vehicle and we're gonna start from scratch because we're so smart" — and the standard's there, but they choose to ignore it, that's a problem.
Now, does Tesla follow that standard? I don't know. Does anyone outside Tesla know? I don't know. But there's a tendency, when you're trying to get product to market, to skimp on the safety standards just to get the functionality out. Until you go back and say, oh wait a minute, there's a lot of engineering in safety —
it isn't just "seems to work"; there's actually a methodology you have to follow, a process you have to follow, standards you have to follow — until you get that level of maturity, you can expect to see dangerous software. That's how it works in every industry.
Anthony: Are you seeing more of [00:29:00] that safety mindset and more of that standards mindset?
Or do you think we’re seeing less?
Phil Koopman: In regular automotive we saw that take hold between 2010 and 2020. What we're seeing now is a lot of thrashing on standards for self-driving. The standards have been out at this point for five years or more, depending on which standard you're talking about.
But we're seeing a lot of pushback on adoption, 'cause they're all in a hurry. But to get back to unintended acceleration — we don't need to go down the robotaxi side road in this talk — what we saw was multiple companies having these struggles in the 2000s and 2010s, and their playbook was to blame the driver.
Anthony: Oh.
Closing Remarks and Call to Action
Anthony: We don't blame you, listeners. Go to autosafety.org, click on donate, and then we'll never blame you again. And if you donate a thousand dollars, Fred will come and mow your lawn. Fred? He's nodding his head in agreement. It's confusing.
Fred: If the lawn happens to be near a Piggly Wiggly, then yeah, of course I'd be happy to.
Anthony: Well done. Well done.
Driver Fault and Hardware Failures
Anthony: Okay, so the driver's totally at fault, 'cause you [00:30:00] don't know where your pedals are, you're doing the wrong thing. You're pointing out to us that pressing the accelerator is just a suggestion, which I love. And so we don't know what's happening in these black boxes with modern systems.
Okay. We understand the hardware failures with earlier systems. Is this still going on or is Toyota the end of this with sudden unintended acceleration?
Phil Koopman: There’ve been recalls in the last year for this, right?
Michael: Yep.
Phil Koopman: So yeah, I think there's one —
Michael: A couple weeks ago.
Phil Koopman: Yeah. Yeah.
There are continuous recalls for this stuff. Usually they're low severity — they aren't "car tries to kill you as hard as it can" — but yeah, this is going on and there's no reason to believe it's gonna stop. Something else I want to talk about in this context is: what is going on? Why is it that people are blamed so easily and no one says anything?
Expert Witness Process
Phil Koopman: To do that, let me pretend I'm being an expert witness — which, by the way, I haven't taken an expert witness gig in years, and I hope I never have to testify [00:31:00] again. But putting on that hat: when I went through the process, what I was told I had to do is a differential diagnosis process.
So if you wanna say it's the computer's fault, there's a process to get there. As an expert witness, it's not my job to say it's the computer's fault. It's my job to say, here's what I think happened — and it's the computer, or it's not, as the case may be. You first rule in all the possibilities. It could be the driver pressed the wrong pedal.
It could be the computer sent an acceleration command despite the driver not pressing the pedal. It could be a stuck pedal. It could be, who knows, there are all sorts of things. It could be that the driver was pressing the brake, but it was icy and the vehicle didn't stop.
So you introduce all the possibilities. Then other people rule out things like icy roads — the accident reconstruction folks take care of stuff like that. And then at the end you say: we ruled in all the possibilities, and we then ruled out the ones we [00:32:00] have
evidence are not the case. No, it wasn't an icy road. Okay. And then, whatever's left standing, the jury gets to decide which one they want to win. And so, as an expert on sudden uncommanded acceleration, I'm always gonna rule in the possibility of driver pedal misapplication.
I'm gonna rule in the possibility of computer malfunction, and others. I can rely on the car companies to testify that in their opinion it's definitely the person pressing the wrong pedal. I will point out any flaws in their testimony, and I will see if it makes sense that it could be the computer — and in many cases it does. To do that,
Toyota’s Design Errors
Phil Koopman: you go look at the design. And I'm gonna stick here to the Toyota case — I have a one-hour video you can put in the show notes where I go through this. But there are two different signals from the pedal, and they both come into the same converter on the same chip.
So if there's a problem in that analog-to-digital converter on that chip, it's gonna read both pedal signals [00:33:00] wrong, and who knows what's gonna happen, right? And I'll say, alright, there are all these opportunities for the computer to get it wrong by failing to follow basic safety-critical design practices. So I can rule in the computer, 'cause it has these design errors, and I'm not gonna rule it out, because I haven't seen any evidence to say those design errors were not in play.
But the event data recorder says the driver's foot was on the pedal? No. It says it was reading a command from the driver, but I just told you a bunch of reasons why that command might have been created out of a computer fault and not actually reflect where the driver's foot was.
So it gets left ruled in, and the jury gets to decide.
Anthony: So for the jurors it's almost —
Fred: Hey, can I —
Anthony: Yeah, go ahead, Fred.
Fred: — jump in with a couple of thoughts.
Single Point Failures and Redundancy
Fred: I think what you just described, Phil, is single point failures that occur in the design due to insufficient consideration of safety. Am I right?
Phil Koopman: One of the issues I've seen [00:34:00] is single point failures in the inputs, but also single point failures in the outputs. There's a device that actually commands the throttle angle — back on the older engines — and Toyota got this right, but not every company did. There are some companies where that thing could command a wide open throttle independent of the software, in a failure mode.
So single point failures all the way up and down the chain are possible. Also common cause failures: you could have a ground wire to the accelerator assembly, and if that ground wire gets a resistive fault, it'll misread all the channels.
Fred: For listeners, I want to remind them that a single point failure can be cured by proper design, including redundancy and also the use of high-reliability components. There are various ways of doing that, but there's no requirement for any of the automobile manufacturers to do it. Is that correct?
Phil Koopman: That's correct. And it's even [00:35:00] worse: in most, if not all, of the cases I've seen, the redundancy was available, but the engineers did not use it properly — because of a lack of skills and knowledge, near as I can tell. So in the Toyota there were two chips, and each one could have taken a different signal.
There was a two-track accelerator pedal, with two different analog signals coming out of it. And there was an older Jaguar computer built by the same company, with very similar components, if not the same components. That older Jaguar computer had each of the two tracks going into different chips, but for some reason the Toyota design had both tracks going into the same chip, even though the other chip had an unused converter.
And that was just a design error. They didn't do the redundancy properly, so —
Fred: we should all buy Jaguars instead of Toyotas. Is that the conclusion?
Phil Koopman: In that era, the Jaguar engineers knew their safety pretty well. DENSO was apparently capable of [00:36:00] producing safe designs, 'cause they actually produced the Jaguar design, but they also produced the Toyota design,
which was not like that. And the insight I have heard is: well, you only get safety if you pay for it. And I'm talking about — if you want somebody to staff and execute safety engineering processes, the OEM has to pay the supplier to staff and execute safety processes.
And if you don't pay for it, you're not gonna get it. That's the kind of dynamic in the car industry.
Fred: That's discouraging. So you'd think there'd be some nonprofit somewhere that would try to oversee some of these safety conclusions.
Phil Koopman: Or such an organization could try and advocate for safety.
Yeah. And I've seen other electronic engine controllers from that era that only had one computer instead of two, because that's cheaper.
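To make the redundancy point Phil and Fred are discussing concrete — two pedal tracks only protect you if they are read through independent paths — here is an illustrative Python sketch. The voltages, the 2:1 track scaling, and the fault model are assumptions made up for the example, not Toyota's or DENSO's actual design.

```python
# Illustrative only. Two pedal tracks: track B nominally reads half of track A.
def plausibility_check(track_a_v, track_b_v, tol=0.2):
    """Accept the reading only if the two tracks agree within tolerance."""
    return abs(track_a_v - 2.0 * track_b_v) <= tol

# Independent converters: a fault that hits only one path is caught.
a_volts, b_volts = 0.8, 0.4          # foot nearly off the pedal
print(plausibility_check(a_volts, b_volts))     # True: tracks consistent
print(plausibility_check(4.5, b_volts))         # False: one path faulted -> detected

# Shared converter (the design error described above): a single converter
# fault scales BOTH readings the same way, so they still "agree" and the
# bad value sails through the check.
gain_fault = 4.0
print(plausibility_check(a_volts * gain_fault, b_volts * gain_fault))  # True: undetected
```

That is the difference between the older design Phil describes, with each track going into a different chip, and routing both tracks through the same converter, where a single fault shifts both readings together.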
Toyota Legal Outcomes
Anthony: What was the outcome of the Toyota cases? Did the humans take the fall?
Phil Koopman: What happened with Toyota [00:37:00] got really complicated, because it got political, right?
You had a Secretary of Transportation saying stuff. The outcome was, there were three kinds of cases. There was a class action for economic loss — and I'm oversimplifying here, but: hey, nobody wants to buy my used Toyota because it's so dangerous, read the news, right? So, Toyota, give me back some money, 'cause you sold me a defective, less valuable product.
And that settled for a billion and change. There were individual death and injury cases that went through a multi-district litigation and some other state cases — there were hundreds. What was it, Michael? Five, six hundred cases settled at some point? I remember hearing 500-something, and it was still going.
Yeah. And those are all secret, but it takes no imagination to think they're millions of dollars apiece for the settlement. But it all depends — some of them maybe had no merit, who knows? But it was hundreds, and for sure some of them had merit. And then there was a criminal fine of more than a [00:38:00] billion dollars for lying to NHTSA — for withholding information about the floor mats and the sticky throttles, which was what Toyota tried to tell everyone it was.
So Toyota never admitted there was a software problem. They never admitted there was a hardware redundancy problem. They went with floor mats and sticky throttles, and then paid the fine to NHTSA for withholding that. That's how it turned out.
Anthony: Yeah. The floor mats is the one I remember hearing about the most.
And this was that the floor mats would get jammed on top of the accelerator pedal or
Phil Koopman: If the pedal hinges at the bottom, or the pedal gets too close, the floor mat — like the heavy rubber winter floor mats — can get up on top of the pedal. That's happened with other cars as well.
Anthony: But again, this is "it's not our hardware, it's not our software, it's something —"
Phil Koopman: "— else." And the fallacy is: if it happens once, therefore all of them must be that. No — there's room for it to be more than one possibility.
Anthony: And I’m just thinking from a human juror point of view, I can understand, oh, beyond a reasonable doubt type thing.
Phil Koopman: Oh, it's not "beyond a reasonable doubt." These are civil cases.
Anthony: Oh, these are civil. All right. [00:39:00] Okay.
Phil Koopman: It's 51%. Or, to put it another way: I don't have to get to the end zone, I just have to get to the 51-yard line and I win the case. That's an approximate quote.
Michael: Yeah. I think it's always fun to mention too that NASA was brought in by NHTSA on this, which is something I don't think has happened since or before.
And NASA wasn't able to figure it out either.
Phil Koopman: Whoa, be careful, Michael.
Michael: Okay. Okay.
Phil Koopman: NASA was brought in, and they said: my goodness, this stuff just reeks to high heaven. Okay? We're not able to pin it down to any one thing that we can reproduce, but we're not saying it's good.
We're saying it's terrible, but we didn't find the smoking gun. And by the way, there are indications that they would've liked to work longer and were told to stop. Okay? And then the head of the Department of Transportation says: the jury's in, NASA said it's not that. No — they never said that.[00:40:00]
They never said "that's not it." They said they couldn't find it. And I have every reason to believe they had limited time and resources and could have done more, but weren't able to. And that's not the same as saying it wasn't there.
Tesla and Software Issues
Anthony: And this is still going on, blaming the driver, with Tesla. We have a link to autoevolution.com where Tesla owners in China apparently have put cameras in their footwells to show that, hey, it's not me pressing the wrong pedal.
Phil Koopman: That's a smart idea. It turns out, yeah, that's a smart idea. It also seems a little crazy, but it's the only way you could do it. Why is it smart? Basic safety: that's an independent measurement. If the car computer misreads the pedal, it doesn't have the ability to trick your camera into also misreading the pedal.
That's why it makes sense — it's redundancy.
Anthony: Yeah. It’s,
Phil Koopman: Lemme tell you, [00:41:00] those cases are so hard. Those cases are hard because — remember I talked about the rule-in and rule-out stuff?
When was the last time a police officer ruled in the possibility of software failure when they determined the cause of a crash?
Michael: Excellent point. That's not really within their purview, right?
Phil Koopman: It's not in their purview. And all the NHTSA investigations say: we know, because the last 17 papers in our reference chain — I'm exaggerating, right? — the last papers in our reference chain said it's always the person.
So we know it’s always the person. So there’s no point even looking at the software.
Michael: Yeah.
Phil Koopman: So NHTSA habitually does not rule in software unless the manufacturer cops a plea that it was software. If the manufacturer doesn't say software, NHTSA does not rule it in, habitually. I hope that's changing these days as they get more sophisticated.
But historically, NHTSA never ruled in software. They always ruled in the person. And the only two things left standing as possible explanations are software and person — but software we're ignoring, therefore it's the person. Of course it's the person. There's a word for that. There's a phrase for that.
It’s [00:42:00] called self-fulfilling prophecy.
Anthony: With cars being,
Phil Koopman: And so with Tesla: it could be software, it could be the person — we're ignoring software, therefore it's the person. I mean, that's the dynamic playing out right now.
Anthony: But would you suggest that all Tesla owners put cameras in their footwells? Or just do that for every car — you buy a car, you get a little GoPro, you film your feet.
If you don't get into a crash, you put that on footfetish.com, make some money.
Fred: Or maybe we should put ratings on the stickers on the cars now, like the movies — PG would be perpetual gaslight.
Anthony: Yeah,
Fred: I think
Phil Koopman: it's tough. And again, I never said it's never the person putting their foot on the wrong pedal. I never said that.
Anthony: Totally understand.
Phil Koopman: It's possible. I have to say this like 20 times, and even then people will say, "but he said it's always the computer, right?"
I'm not saying that. But let's talk a little more about what's going on with Tesla specifically. You have very complex software. Do they follow ISO 26262, which is a functional safety standard? They've never said they do. There's no [00:43:00] reason to believe they do. I'd love for them to say they follow it.
But it's very complicated software. You're going through machine learning — does the throttle go through the machine learning, or does the manual driving signal go around it? I have seen car companies make the mistake that the accelerator goes through the machine learning. Oh, I've seen that.
Okay, wow. I have seen that on a low-volume production vehicle, but I've seen it. So who knows if they got the design right? I hope they got it right. They have a brake pedal, but in an electric vehicle, depending on the design — and I know for sure in some hybrids — the brake pedal's also just a suggestion.
The hydraulics are isolated unless the computer allows the braking through, and there's a way for that to go wrong. Okay. So you have a gas pedal, you have a brake pedal, it's all by wire. Who knows what happens in there? And if it goes crazy, it could be the driver pressed the wrong pedal by accident and kept their foot down, in which case that's the driver's fault.
I get it. [00:44:00] And I can believe that's happened. It could be the driver pressed the pedal momentarily, realized they made a mistake, and the torque on those Teslas is so high they had no time to recover — and that is a possible explanation for a lot of the crashing-into-buildings things we see. But it could be that they did not put their foot on the pedal, or they gently tapped the pedal, and the signal got mangled by some software issue or some hardware issue or some combination.
And the car said, I'm gonna take off, even though the driver did not command it to take off. But the data recorder isn't recording a photograph of the foot on the pedal. It's recording the end result of all these computations, and if they're wrong, it's not recording what the driver did, it's recording what the computer thought the driver did, which are somewhat different things.
Michael: You think there’s that,
Fred: There's another thing I wanna bring up. You mentioned ISO 26262 several times as the standard that's relevant. And [00:45:00] at the heart of that is the definition of safety, which is that it's an absence of unreasonable risk. So a company can be completely compliant with ISO 26262 as long as, in their opinion, whatever risks are associated with it are reasonable.
There's no official arbiter of what's reasonable and what's not. So even referring to that means you're depending on the courts to give you a favorable ruling on what "reasonable" actually means.
Phil Koopman: The way I look at it, there's a little more anchoring, but not a lot more. If you look at the standard, what does unreasonable mean?
I should have memorized the wording for this, but I didn't — it's something like "according to relevant societal moral values," which is pretty squishy. As you say, Fred, it's pretty squishy in practice. Absence of unreasonable risk means NHTSA hasn't decided to recall it, [00:46:00] because the reason for the safety recalls NHTSA does is that they think the defect presents an unreasonable risk.
So car companies can do whatever they want, 'cause there's no federal motor vehicle safety standard about software quality. Car companies can do whatever they want. They have to be cognizant of the fact that if they have a pattern of crashes, NHTSA might recall them — and that is, by definition, presence of unreasonable risk.
They may also have to worry about families or victims suing them for a crash. And the thing in their favor is there's a presumption right now that the computer is correct, even though the computer might be the source of the problem. You've got a guy with a baseball bat standing next to a broken store window, and the cop walks up and says, did you do it?
The guy says, nope. Okay. If you're suspecting that maybe the computer made a mistake, why would you believe the computer when it says, no, I didn't make a mistake, not me, that [00:47:00] wasn't me? Why would you believe that?
Fred: And there's at least one company that is going after the data that's in the car and altering it after tragic circumstances have taken place.
Phil Koopman: Of course there's an incentive to —
Fred: — so you'll compensate for that.
Phil Koopman: Of course, there's an incentive to do that. One hopes there's forensic validity to the data. With Tesla, the data gets reported from the car up to their servers. And Tesla's been in a situation where they couldn't find the data, they said, until someone pointed out where to look.
And they said, oh, look at that, we have the data after all. That was the Benavides case. And there's an opportunity for hanky-panky to happen there. Do I know whether it's happening or not? No, I don't. But I'd love to see cryptographic protection on the data so it's tamper-evident, for example. But we don't see that happening.
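Phil's wish for "cryptographic protection on the data so it's tamper-evident" can be sketched in a few lines of Python using an HMAC. This is only an illustration of the idea, not a claim about how any vehicle or EDR actually works; the key, record fields, and numbers are made up, and key management — the hard part — is not shown.

```python
# Illustrative sketch: tamper-evident logging of an EDR record with an HMAC.
import hmac, hashlib, json

SECRET_KEY = b"device-unique key, provisioned at build time (illustrative)"

def sign_record(record: dict) -> str:
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify_record(record: dict, tag: str) -> bool:
    return hmac.compare_digest(sign_record(record), tag)

record = {"t": 12.40, "pedal_pct": 4, "brake": True, "speed_mph": 31}
tag = sign_record(record)

print(verify_record(record, tag))      # True: record is as originally written
record["pedal_pct"] = 96               # someone edits the stored data later
print(verify_record(record, tag))      # False: tampering is evident
```

With something like this, altering the stored record after a crash would at least be detectable, as long as the signing key itself is protected.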
Anthony: So with cars becoming more and more computers, I don't imagine this problem will just magically disappear. It's a software problem, it's a hardware problem, it's at times a human factors problem. But with more and more cars becoming [00:48:00] basically software on wheels, how do we track this stuff going forward?
Regulation and Software Quality
Anthony: Do the risks increase? You mentioned there's no FMVSS around software quality, which blows my mind, 'cause that seems like the future of what NHTSA should really be focused on: oh yeah, this is a computer on wheels, maybe we should learn how to regulate this and make it safer.
Phil Koopman: So I have a model for this and it actually drives what I’ve been doing the last decade or more at this point. The model is that if you believe it is more likely that it’s almost always the human and sometimes the computer, then you can get away with always blaming the human,
Anthony: but we’re eliminating the human.
Phil Koopman: That's the problem, isn't it? Now, I suspect that it is more often the computer than people are willing to admit, because the regulators and the industry have doubled down on this: it's always the human's [00:49:00] fault no matter what, we're just gonna ignore the possibility of computers. And if it's almost always the human, or mostly the human — nobody really knows what the fraction is, but if it's 60% the human — you can get away with it.
If it's 10% the human, probably you can't. I'm sure there's a lot of human error involved — never said otherwise — but there's some amount of computer error. It's absolutely crystal clear. I have a bunch of experiences I can't talk about that, if you knew about them, you'd go, wow, that's just horrific that the computers are making that many mistakes — thousands of potential incidents.
There's one situation where there were thousands of reports, and people saying, yeah, I get that you're supposed to say it was my fault, but look, it wasn't — so how can we fix this? There's stuff like this going on, right?
Anthony: Just as an aside, I have a 2020 Toyota Corolla.
Should I never drive it again?
Phil Koopman: We're not going there.
Anthony: Alright,
Phil Koopman: By 2020 you should be good. Toyota learned some important lessons, and by 2020 they got it together. [00:50:00]
Anthony: Alright?
Phil Koopman: But I’m not gonna give you an exact cutoff here. So this stuff has been going on at a much greater scale than people appreciate.
Not zero, not a hundred percent, but somewhere in between. And adherence to safety standards has helped some, for the car companies who want to do it, but it's not required. Not everyone does. People cut corners. And as you pointed out, the epiphany for some folks I was talking to came when Waymo took the driver out of the car for the first time.
That entire playbook of blaming the driver goes away when there's no driver. So what's your plan?
Anthony: Yeah. What is the plan? I don’t know. ’cause
Phil Koopman: Well — right about then, you saw NHTSA get a lot more serious about getting computer skills, because NHTSA's plan for decades had been: since it's never software, we don't have to worry about software. That may not be entirely fair.
And I know there are smart people in NHTSA who care a lot, but from an organizational point of view, they didn't. So if it's [00:51:00] never software, you don't have to staff up in software. And since you have nobody in software, you can't blame the software. So it's never software, so everything's good.
That strategic plan worked great until the person went away, and now they're like, we're caught, we got nothing. And so you saw them pivot into staffing up on software. My personal belief is that's the mechanism at work. Just my opinion — it's my conspiracy theory. I have no idea how true it is or not, but observing all the moving pieces, I think maybe that's what was going on.
Anthony: Michael, have you seen any sort of regulations moving forward? Anything around software safety?
Michael: No.
Phil Koopman: No.
Anthony: See, Phil made me hopeful and now you can
Phil Koopman: crush it. Hiring a few people who have a clue about software is different than regulating on software. But it’s the first
Anthony: step. It’s
Phil Koopman: Except all those guys are the new guys, so most of them got laid off, and that’s a whole different mess.
But that didn’t happen in Europe. So in Europe we’re seeing people get smarter about software and regulation. It’s slow. It’s a [00:52:00] big thing to get moving. We’re seeing regulators get smarter about computers. Computers get faster and more capable, and the technology is still outrunning the regulators, but we’re seeing some motion on regulation.
But yeah, at some point the sun is starting to set on pedal misapplication, ’cause there aren’t any human drivers anymore. Or because we’re worried not about pedal misapplication, we’re worried about Autopilot and other self-driving steering mistakes that the driver’s expected to jump in and save, which is different than pressing the wrong pedal.
Anthony: And now we’re getting the new gaslight around this, which is when Waymo or a Tesla or something does something wrong, they’re like, oh, we fixed it. That was just a minor bug. It’s just a minor thing. We fix it over the air, so it’s
Phil Koopman: not a problem. Yeah. The whole idea that an over-the-air update means it’s not a safety problem.
It was always a safety problem. It was probably out there for months before you found out about it. The fact you can fix a multi-year-long problem overnight doesn’t make it an overnight problem. That’s right.
Anthony: And then we’re also skeptical that it’s actually been fully fixed.
Phil Koopman: We’ve seen cases where that was not really what happened.[00:53:00]
It was not really fully fixed.
Anthony: Oh my. So it’s,
Phil Koopman: So what else about unintended acceleration? There have been all these lawsuits, all sorts of craziness associated with it. There’s a lot of astroturfing, incredible amounts of astroturfing, where the industry spent a lot of time and energy making sure everyone knows it’s always the driver’s fault, whereas the reality is a lot more subtle and these crashes are still happening.
People are still getting hurt. They’re still potentially dying from situations where it might or might not, but could, be a computer design defect where they got the safety and redundancy wrong. That’s still happening. It’s gonna keep happening for a long time, but the default is always to blame the driver.
That’s where we are.
Anthony: I look forward to the next round of lawsuits where there is no driver, the computer’s the driver, and they figure out how to blame the passenger. You said you wanted to get there quickly.
Phil Koopman: Oh, that, oh, that’s coming. You had a stop button in the back. Why didn’t you stop the vehicle?
That’s coming.
Anthony: I was playing a video game, watching a movie, reading a book, taking a nap.
Phil Koopman: I was in the front seat and [00:54:00] the manufacturer told me I didn’t have to pay attention. And yeah, this is the new world; the brave new world is more complicated. A lot of the same stuff comes into play.
You agree? I do wanna make a minor plug. I have a paper on this from SafeComp years ago that documents this historical arc and says, here’s where you look, and here’s where NHTSA said the event data recorder didn’t show your foot on the gas pedal, but since it’s the driver’s fault, your foot must have been on the gas pedal anyway, even though the computer didn’t show it.
It just gets crazy. They just make stuff up at some point,
Fred: We’d like to see that. Can you send us the link, please? And we
Phil Koopman: can
Fred: post that?
Phil Koopman: Yeah. So you have the link to the paper.
Fred: Yeah.
Phil Koopman: And, let’s see, there’s a lot of really weird stuff that’s happened around it.
Just crazy stuff has also happened around it. People sending emails to the university trying to get me fired ’cause they didn’t want me to be able to qualify to testify, trying to dissuade me from testifying. And there was the Russian bot cyber attack on a [00:55:00] webinar and... Right.
Anthony: You sent us a screenshot of that, and I’m looking at this going, wait, what?
Phil Koopman: Yeah, this carried over to my work on UL 4600. When I was working on that standard, as the originator of the standard, there was the first webinar for non-technical stakeholders. And so this is gonna be, the regulators are gonna show up, right?
And as we’re trying to set up the webinar, UL, Underwriters Laboratories, ul.org, the nonprofit side, is trying to set up this webinar, and they got a big pulse of registrations, like 3,000 registrations. I said, wow, that’s a lot of regulators. And they dug into the logs, and I had them send me a log.
So you have a picture right from the logs, where they sent me the spreadsheet and there’s 3,000 plus that all showed up in a few minutes, and they’re all in Russian,
and I think the word is ciri, for congratulations in Russian. So somebody apparently repurposed a form-filling spam bot. And near as I could tell, [00:56:00] most webinars have a cap on registrants, because you have to pay above a certain number, and most of them are configured to turn off registration when they hit the cap.
But the nice folks at UL who set this up misconfigured that. So they didn’t have a cap, and it just kept eating registrations, 3,000 of them, until they got tired of it. And one of the poor folks at UL had to manually delete all 3,000 registrations to not have to pay for the bigger webinar.
What a mess.
Anthony: Oh my god. And you show up and you’re like, this is gonna be 3000 people and it’s just
Phil Koopman: people. No, I knew what had happened. Yeah, it was crazy. So there’s all sorts of, you’re talking billions of dollars at stake. When there’s billions of dollars at stake,
Weird things start happening,
Anthony: But it still just makes me jump back to the Ford engineers, in their case, pointing out before they put this product to market, hey, here’s this problem, and hey, we have the solution to fix it. And then they just say, nah, let’s wait a [00:57:00] decade and lie about it.
Phil Koopman: And that happened with the GM ignition as well, didn’t it?
Anthony: Yep. It’s, I
Phil Koopman: don’t get it. This is a recurring pattern, and it’s happened before. These kinds of issues happen more often than the general public knows. Most of them never see the light of day. I’m not saying the whole car industry is like this all the time, but it only takes a few with wide-scale harm to be a problem.
Anthony: Bizarre. Ford could have licensed their patent, ’cause everyone else probably had the same problem. They could have made money instead of losing money to lawsuits, but hey, that’s Ford. Hey Jim Farley, whenever you’re ready, come on board.
Phil Koopman: Let me put in another word there. There are a lot of hard-working engineers who think they’re doing safety and honestly believe they’re doing safety.
And some of them are being gaslighted by the companies too. So it’s really, you have some folks, I had a recent discussion, not gonna be more specific, where this guy said, we thought we were doing the right thing, and the more he learns of some of this stuff, the more he’s not so sure they were doing the right thing.
Not gonna say which company, not gonna say who it is, but [00:58:00] it’s really tough for everyone. And it’s all about the money,
VO: right?
Phil Koopman: It’s all about the money. It’s all about them trying to maximize profit and making decisions like back to the Ford Pinto, where that one is often not as black and white as some people would like it to be, in Ford’s favor.
It’s not quite as black and white, but if it’s cheaper to kill people than to fix the problem, there’s a strong incentive to not fix the problem. That’s the business. It’s tough.
Anthony: Yeah.
Conclusion and Final Thoughts
Anthony: That’s been our never-ending gaslight. Yeah. Thanks, Phil, for, yeah,
Fred: Thank you for that,
Anthony: that super deep
Fred: dive. Thanks. I was just gonna say thank you for that cheerful postscript, Phil.
Phil Koopman: Yeah. It’s
Fred: a pleasure to have you on today.
Phil Koopman: I’m happy to be here, and I’m sure when it’s a hundred percent computer drivers, all these dysfunctional incentives will go away.
Anthony: When it’s all computer drivers, no one will ever crash. The roads will be safe, the children will be good looking.
Phil Koopman: And all the drivers will be above average.
Anthony: Exactly. Exactly. There we go. Hey, with that, thank [00:59:00] you listeners. I hope you learned a lot from it. I did. I will still get upset at rearview camera recalls no matter what.
But I understand them more and I understand sudden acceleration more and I’ll still get upset at those too. Till next week. Thank you. Bye
Phil Koopman: bye. Thanks. Bye everybody. Bye-bye.
VO: For more information, visit www.autosafety.org.