Phil Koopman, Autonomous Vehicle Safety and Tesla’s hidden door handles
Phil Koopman comes back to the show for a two-part episode. In this first one we discuss the dangers of Tesla's electronic door latches, the lack of federal regulations covering these safety gaps, and the broader implications for emerging vehicle technologies.
Then: the SAE automation levels… Phil summarizes them nicely.
This week's links:
- https://philkoopman.substack.com/p/people-are-still-being-burned-alive
- https://safeautonomy.blogspot.com/2021/11/regulating-AVs-with-human-drivers.html
- https://philkoopman.substack.com/p/time-to-formally-define-level-2-vehicle-196
- https://www.jurist.org/commentary/2023/08/widen-koopman-automated-vehicles-criminal-law/
Subscribe using your favorite podcast service:
Transcript
note: this is a machine generated transcript and may not be completely accurate. It is provided for convenience and should not be used for attribution.
[00:00:00] Introduction and Welcome
[00:00:00] Anthony: You're listening to There Auto Be A Law, the Center for Auto Safety podcast, with executive director Michael Brooks, chief engineer Fred Perkins, and hosted by me, Anthony Cimino. For over 50 years, the Center for Auto Safety has worked to make cars safer.
Hey listeners, welcome back to another episode of There Auto Be A Law.
[00:00:30] Guest Introduction: Phil Koopman
[00:00:30] Anthony: This week, actually, and next week, we have special guest Phil Koopman, a professor at Carnegie Mellon University, and a person who's probably been working on autonomous vehicles longer than anybody alive. I'm gonna say that, because that sounds, you know, about thirty...
[00:00:46] Phil Koopman: Not entirely true, Anthony, but not so, yeah, not so wrong. About, what, thirty years? Yeah, there's a crew from almost 30 years ago who was doing it before I was doing it, and I'm one of the first couple safety [00:01:00] folks. There's another one in Europe who I think beat me by a few months, but there's only a couple of us who've been doing safety for that long.
[00:01:07] Anthony: Okay.
[00:01:07] Tesla’s Safety Concerns
[00:01:07] Anthony: Well, great to have you back. Looking forward to getting into some very in-depth conversations. So let's start off: on our last episode, or two episodes ago, I guess, we were talking again about the issue with Teslas, and what happens when their electrical system dies: you're trapped inside, and the door handles no longer work.
We've pointed out repeatedly that in some cars, oh, you remove a speaker grill to get a handle, or you, you know, pop some random part of the door off.
[00:01:39] Human Factors in Engineering
[00:01:39] Anthony: So, Phil, from your perspective as a human factors expert, how does this happen from an engineering point of view? Like, why do you think, hey, let's make it really hard for our customers to stay alive?
[00:01:53] Phil Koopman: So I'm a computer guy who's spent a lot of time in my life learning about human factors, and I've even taught [00:02:00] some about human factors, to be clear. It just boggles my mind that you can have a consumer product that requires instructions that are not readily evident to save your life. I just can't imagine how someone would build something like that.
I've had all these experiences that teach me that an ordinary person in the back of a Tesla, trying to use the arcane emergency escape systems they may or may not even have, some cars don't even have them, cannot be expected to get that stuff right. So, you know, it's scary as anything.
[00:02:38] Anthony: And Michael, how are there no
[00:02:40] Federal Regulations and Safety Standards
[00:02:40] Anthony: federal regulations around this that you know, there, there’s crash testing regulations. There’s requirements for seat belts and airbags. Is it just something that people thought was too obvious? Hey, you should be able to open the doors in any situation.
[00:02:55] Michael: I think it’s because electronic door latches are becoming more [00:03:00] and more common now due to electric vehicles.
There have been a couple, maybe three or four, vehicles that came out in the past 20 years that had electronic door latches, but one of those was a General Motors vehicle, I think a Cadillac, that had a human factors approach to the problem. The GM approach is: if your electronic door latch fails, you can pull on your door handle twice, which is what I think we expect humans to do when they're trapped in a vehicle and trying to exit quickly.
And because of that, we just haven't seen this issue rise to the level where there are federal regulations that require manufacturers to label the manual switch or manual lever, whatever it is, and also to standardize its placement. Placement, I think, is important, because if you standardize the placement of a manual door release, then anyone who gets into a [00:04:00] car only has to learn where it is once and it'll be there in every vehicle, versus people having to figure out, based on which Tesla model they're in, exactly how they're going to respond if they become trapped in the vehicle.
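note: to make the double-pull design concrete, here is a minimal sketch (in Python) of latch logic that fails toward the occupant's instinct, so that "pull again, pull harder" always reaches an exit. It is an illustration of the principle being discussed, not GM's actual implementation; in a real car the backup path is typically a purely mechanical linkage on the second or deeper handle pull, precisely so it needs no software or power.

```python
# Hypothetical sketch of a door latch whose fallback matches driver instinct:
# the same handle that normally fires the electronic release also reaches a
# mechanical release, so "pull again / pull harder" always works.

class DoorLatch:
    def __init__(self, power_ok=True):
        self.power_ok = power_ok      # 12 V auxiliary power available?
        self.recent_pulls = 0         # pulls within a short time window

    def handle_pulled(self, full_travel=False):
        """Called whenever the occupant pulls the interior handle."""
        self.recent_pulls += 1

        # Normal case: first pull, power available, electronic release fires.
        if self.power_ok and self.recent_pulls == 1:
            return "door open (electronic)"

        # A second pull in quick succession, or pulling the handle through its
        # full travel, engages the mechanical path. In a real design this path
        # must not depend on software or power; it is modeled here only to
        # illustrate the behavior an occupant experiences.
        if full_travel or self.recent_pulls >= 2:
            return "door open (mechanical)"

        return "no action"


# A trapped occupant who just keeps pulling still gets out:
latch = DoorLatch(power_ok=False)
print(latch.handle_pulled())   # first pull: electronic release is dead
print(latch.handle_pulled())   # second pull: mechanical release opens the door
```

The design choice worth noticing is that the fallback is the same control, exercised the same way a panicked person naturally would use it.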
[00:04:13] Fred: Well, Michael, isn't there a systematic failure here at the initial level? Because the only systematic review of the safety requirements that I'm aware of was to try to eliminate as many as possible to get in front of the AI developments, right? So all of them are kind of frozen in time. They all have implicit human drivers.
They're all based upon the designs that were available back, oh, I don't know, 20 or 25 years ago, when these standards were being developed.
[00:04:46] Michael: Yeah, and NHTSA is not really good at doing systematic reviews of safety and figuring out which parts, or which potentially dangerous things in the vehicle, need to be regulated.
Typically, as we've seen over the last 20 years, the [00:05:00] agency usually has to be forced by Congress to write a motor vehicle safety standard. And so this issue has really not come to the fore until we've seen a lot of problems with Tesla and with other electric vehicles and egress.
So I don't expect anything to happen in the next few years, but at some point there definitely needs to be a motor vehicle safety standard that covers this area, because with EVs, however fast they begin taking over the roads in America, they're all going to have electronic door latches, it appears, and there are going to have to be safety standards around that to allow for proper egress.
[00:05:41] Phil Koopman: So, trying to look at this from a higher-level point of view: the approach we take in this country is that there's no point having a safety standard if it's not a problem, right? Yeah. And I understand the logic of that, but a completely inherent property of that system is that there have to be a lot of problems before you have a standard, [00:06:00] right?
I mean, it just follows: if you're going to wait until there are a lot of problems before you have a standard, that means you're going to have to see a lot of problems first. And sadly, in this case, problems means people dying. And that's the way our system works. So it's not unexpected we have this.
What’s disappointing is the manufacturers have the option to not have a problem in the first place by just designing it in a way that’s going to be safe. And then you never need the standard, but that’s not where we are.
[00:06:26] Owner’s Manual and Consumer Awareness
[00:06:26] Anthony: All of us have bought a new car at some point, right? Fair enough? Yes? Oh, yeah.
[00:06:31] Phil Koopman: Not many, but yes.
Sure.
[00:06:33] Michael: As few as possible. Okay. Exactly.
[00:06:36] Anthony: So, I mean, we're going to be a unique group to ask this question of: how many of you have read your owner's manual cover to cover?
[00:06:44] Phil Koopman: No. Whoa, really? I read one cover to cover my first car. Okay. And then after that I skim because it’s like, yeah, I’ve already read this section on the last car.
[00:06:54] Anthony: Okay.
[00:06:55] Phil Koopman: But even then, it doesn't mean that eight years later it's still in my head. I still have to pull out the [00:07:00] manual and say: now, where was that? Where was the hood release? I don't remember where the hood release was.
[00:07:04] Fred: I’ve pretty much gone through it on my Subaru. I did not memorize the index, but I think I got the high points out of it.
[00:07:12] Anthony: Yeah,
[00:07:12] Michael: I read it as necessary. If I want to know my tire inflation pressure, or how to change my clock during daylight saving time, things like that, where typically it's related to the electronics: how to make sure that my blind spot warnings are turned on, that kind of thing.
But beyond that, it's fairly dry reading and it's probably not going to hold the attention span of most Americans, I would guess.
[00:07:40] Fred: All right, well, I'm going to take this opportunity to do a spot quiz for the professor here. So, Phil, what is the recommended pressure for the spare tire in your car?
[00:07:51] Phil Koopman: I go to the label on the side of my door.
[00:07:55] Fred: Nope, that won’t get you there because the spare tire is not there.
[00:07:59] Phil Koopman: Yeah, so... I've only done that once: I look it up in my [00:08:00] manual.
[00:08:02] Michael: Yep. Yep. Yep. Michael? I don't have a spare tire. Oh. That's a simple solution. It was not included with my vehicle, because the companies want to lighten the vehicles to meet fuel economy regulations.
And so what they started doing was taking spare tires out of your car. So I have a can of Fix-A-Flat back there.
[00:08:23] Anthony: There you go. Your turn, Anthony. 24 PSI. Nah, you’re not even close. Nah, I’m just making numbers up.
[00:08:30] Phil Koopman: Yeah, probably up above 40 for an SUV. That’s 50 for the donut.
[00:08:36] Fred: 50 for the donut tire in my car.
[00:08:38] Phil Koopman: There you go. Let me point out something we just saw, even with a sample size of 4: the more of an engineer you are on the spectrum, I'm having fun there, the more of your manual you've read. But most people are not on that spectrum. So,
[00:08:52] Emergency Egress Challenges
[00:08:52] Anthony: the reason I ask that question, we have links to a bunch of articles that Phil, you’ve written on your sub stack and in the, this one here about titled [00:09:00] people are still being burned alive in Teslas you point out where kind of these emergency releases are in them.
And for the Model 3 rear door, it says not equipped with manual release. Right. And in the manual, only the front doors are equipped with a manual door release. So yeah, that’s right, and I’ve heard that there’s
[00:09:18] Phil Koopman: even more nuances beyond just the manuals that some, some other cars don’t have it. Yeah, there’s no, there’s no manual release.
So when you get into a Model 3 robot I’m sorry, when you get into a Model 3 Tesla ride, hail, you know, Uber, Lyft, whatever you get into the back of a Model 3 ride, hail, and if you have to get out, the manual releases up front. You don’t have one. You don’t get one.
[00:09:41] Anthony: What if I pay for an Uber XL? Do I get more?
No? Depends what car they send. That's insane. I don't understand the engineering mindset behind this. Is it just thinking, ah, yeah, we won't crash?
[00:09:57] Michael: I don't think there is one, right? Isn't it just [00:10:00] missing? They're missing human factors. I mean, I don't believe it's just in this instance where you see that problem with Tesla.
I think it's certainly arguable that the way they've approached Full Self-Driving and Autopilot shows a pretty significant lack of human factors considerations.
[00:10:16] Phil Koopman: Sure, sure. There's an overall... in fact, there was a NHTSA inquiry where, decoding the polite wording, it's like: do you even have any human factors engineers?
It didn't quite say that in the letter, but that's what it said to me when I read it. Well, let's try to be specific, because there's the human factors of everyday use and not making mistakes, and then it's somewhat different in an emergency. The way people behave in a crisis is a little bit different, and we're talking cars on fire, have to get out; that's a crisis. So it's a little different than, say, is it too easy to accidentally disable auto steer? We're not talking about that. We're talking about: hey, I gotta get out of the car, how do I do it? And [00:11:00] the designs in Teslas are really bad from that point of view, even the ones where there is a release. But maybe we should spend a little time chewing on that, because that's not something most people spend a lot of time with.
[00:11:13] Military Training and Crisis Management
[00:11:13] Phil Koopman: I did spend a fair amount of time with that, because I was in the military. And in the military you get exposure to crisis training, and the occasional actual crisis. And it's tough. What they say in the military is: you fight how you train. And that applies to everything, right?
You fight how you train. And so they make sure you're training on everything. I used to drive submarines for a living. You know, seal yourself inside a giant steel tank and go several hundred feet underwater for however many weeks it is. And by the way, your best buddies are a ginormous lead-acid battery and a nuclear power plant, and some torpedoes and some other special weapons I can't talk about, and some other [00:12:00] things that may or may not have been in my particular submarine on any particular day, and so on.
Oh yeah, and the air you're breathing is coming from machines, right? So it's kind of a hazardous environment, and so you spend a lot of time on this. Six days out of seven, day seven we cleaned and had off, in the morning all of a sudden some of the various alarms will ring, and there will be an emergency, and you're going to have to go through the motions, and the actual actions, of how do you deal with this. And so multiple times a week we'd be putting on breathing masks with a hose hooked up to an air plenum up above.
If you want to walk down the hall, you have to pull the hose off, take a few steps, plug it back in, take another breath. All right? This is the real deal. There's no difference between what's going on in these drills and what you would actually have to do. And you do that every single day, six days a week.
And even then, when the real thing hits, you get this huge hit of adrenaline, and you have to stop, take a deep breath, center yourself and say: okay, what's the procedure? All right, I'm going to follow [00:13:00] the procedure. That's the only way you make it through this stuff. Oh, and the training.
The training: I've been to firefighting school three times, I think, where you get to use a fire hose and walk into a burning area and put out fires. And then there's damage control training, where they have a big room. Basically it's sealed, so it's like a tank, okay? There's fresh air, that's about it.
Full of pipes. The pipes have holes in them. They give you a damage control kit with some plugs and some rubber and some ties and all this kind of stuff. And they send a group in, and they slam the door shut, and they turn on the water, and they say: have fun fixing holes.
Oh my god. And they turn it off when the water gets about four feet deep.
[00:13:44] Anthony: Oh, wow. Okay. But how many times have you had to train, like, that specific exercise? Three times. Three days. Okay. Okay.
[00:13:53] Fred: Okay. Okay, Phil. But did you do it in the dark? Did you do it in the dark?
[00:13:58] Phil Koopman: I don’t think they turned off the lights [00:14:00] for us.
That's probably advanced training.
[00:14:02] Fred: Yeah, they were taking it easy on you.
[00:14:03] Phil Koopman: But, you know, there are battle lanterns and flashlights. I certainly have done things where the only thing you had was a battle lantern. That's part of the training too. Not underwater, but you get the idea.
Okay. So now, I consider myself... that's kind of average. I'm nothing special there. Most people probably only go through firefighting school once or twice, but I'm pretty average. That's just what you do when you're in the submarine force. Okay, so everyone, all my colleagues, all my shipmates, they all have that training.
And still, when things get hard, it's really hard to maintain possession of your wits and do the thing you need to do. When the real thing happens, you can follow your training, but not everyone does, and certainly no one's perfect. I wasn't perfect; nobody's perfect.
Okay, now: if you put me in a burning car, and I have trained, by doing it at least once per week for the last three [00:15:00] years, on how to get out of a burning car, I'll probably succeed, okay? Now, let's compare and contrast this to: I just got in the backseat of a ride hail. And I didn't even know it was a Tesla until the app told me, oh, by the way, it's a Tesla, or whatever the car is. Teslas are the ones we're talking about here.
This is not unique to Tesla; it's true of any car that has bad emergency egress, to be clear. We're going to use Tesla as an example. And we're all kind of sophisticated; we know about this problem. But this is a person who has zero engineering; they're at zero on the engineering spectrum, right?
That's not how they think, that's not how they've been educated; they don't know any of that stuff. They've never read the manual, because they don't own the car. They've never been in a Tesla before, or maybe once or twice and they didn't even know it. They don't even care what kind of car they're in; they're just not that kind of person.
They don't know that there are emergency handles, because the door handles are [00:16:00] supposed to work, right? They've never been through training. No one's ever told them: the car's on fire, you have to get out, let's practice it. And they're in a car that has... what, let me see the ones I can remember. One is: there's nothing. Another one is: you have to pull up the floor mat and maybe use a screwdriver to pry it out of the floor. Okay?
Another one is that it's in the bottom of the door well, like the map holder, the drink holder in the back, right? I think that's the Cybertruck. And you have to pull up the rubber mat on the bottom, if you even know, and then there's a little tiny loop you have to grab. Okay? And another one is you have to pry off the speaker cover with the screwdriver you don't have.
Okay? And you're expecting that person to be able to pull that off and get out, after they've just been through a crash and are traumatized? Yeah, I'm astounded anyone pulls that off, even if they had training, and they don't.
[00:16:56] Fred: We forgot to add that there’s poisonous gas coming into the cabin at the [00:17:00] same time.
[00:17:01] Phil Koopman: Well, potentially poisonous gas, potentially battery fire, potentially smoke so you can't see, so now we're back to having to do it by feel, right? That's just not going to happen.
[00:17:12] Anthony: No. And with the Model 3 you're talking about, there's no manual release in the backseat. So, I get into a crash, I've got some kids in the backseat... that sounded weird.
But anyway, I get into a crash and I manage to get out the front doors. And now there's no way for me, outside the vehicle, to open the back doors either?
[00:17:30] Phil Koopman: So I haven’t tried that.
[00:17:32] Anthony: Okay, I’m curious.
[00:17:33] Phil Koopman: It sounds like the answer is no, right? Because they're electronic door latches, right?
So if you've lost... and I think it's 12-volt power, not the main battery power... if you've lost the auxiliary power, you can't open the door, so now you have to reach in from the front seat and get your kids out, or something like that. This is not a good situation. Now again, this is going to be true of any car with electronic door locks.
So the question is: how do you make sure you can open the doors when you lose power? [00:18:00] And it's got to be straightforward. It has to be obvious.
[00:18:03] Anthony: Right. I mean, we've talked on the show about... one, I don't understand the benefit of actually doing these electronic releases, because you need a manual override anyway.
So are you really saving any money? Is there any cost savings here? And I think Fred or Michael pointed out in the past, I think you were mentioning the Cadillac one, where if you pull twice on it in succession, it goes to manual mode to open the door.
[00:18:25] Phil Koopman: Well, so let’s go there. So what do people do?
When there's a crisis, what do people do? They do what they've been trained to do. You fight like you train. So, the last thousand times you got out of a car, you pulled the pull-out door lever. And by the way, I think I was actually in a Tesla ride hail once where the driver had to explain to me how to open the door normally.
Right?
[00:18:45] Anthony: Yeah. Okay. I’ve had friends who can’t open the door. They’re like, what’s going on? I’m like, bye, Liz.
[00:18:50] Phil Koopman: So you see, you're going to do it the normal way. Anytime a door opening inside any car is not what people have been trained to [00:19:00] expect... and there's lots of variety, I mean placement's a little different, but it's like: there's a thing you pull, got it, and if I pull that thing, the door opens. If it isn't that, you can expect them to get it wrong in normal life, and in a crisis it's pretty much guaranteed they're going to get it wrong, okay? Because they're going to be panicked. It's very hard to think clearly when you're in a crisis like that.
You do what you're trained to do. So, there's the door handle, and it's a crisis, and you pull it, and nothing happens. What do you do? You pull it again. Start pulling up carpets, looking for an emergency release? No! You pull it again! Except this time you pull it a lot harder, as hard as you possibly can, and you keep beating on it, hoping it opens. And with good probability, you fixate on beating on it, because we know things aren't perfectly reliable, so we just try again.
If it doesn't work, what do you do? First thing you do is try it again. Button doesn't work? What do you do? First thing you do is press it again, right? And you don't have a lot of time of pulling on the handle again before things get so out of control that you're done.
So in my mind, anything that is not [00:20:00] "use the standard control that you've been taught to use," or "pull it twice," or "pull it harder," anything other than that, and there's going to be a high fraction of people who just don't succeed. And we're talking about people dying if they don't succeed, so this is a big deal.
[00:20:18] Anthony: Right. And this is the thing: I don't understand the point of these electronic systems when you still need that manual fallback. You're not saving any money. If anything, you're making it cost more.
[00:20:29] Phil Koopman: Well, there are potential problems. I'm not going to argue that it's a good idea to put them in.
You know... if it sells cars, then it's a good idea, because they're in business to make money. I get that. Right.
[00:20:40] Anthony: Yeah. But I'm wondering, from a customer point of view, how does this help me as a customer? I don't know.
[00:20:46] Phil Koopman: Well, so there's a minivan. Let's say you have a minivan.
And one of the features on a minivan is the sliding doors can be activated by the driver to open and close. Okay, that's an advantage, that's clear. I had kids in a minivan; that's clearly an advantage. There's no question that that's an advantage. All [00:21:00] right, I get that. Okay. And there are some taxi cabs where they have a remote thing, which may be mechanical,
I don't know how that works, that pops the door when they want to let you in. That kind of stuff, right? So, you know, if you want to be able to open one door for one person at a particular time, yeah, there's some sense to it. But designing it so that you now need an obscure mechanical workaround for a power failure that's going to happen... it's guaranteed to happen at pretty much the worst time, when you're in a crash and have to get out.
It's just making it so people are likely to be hurt.
[00:21:38] Anthony: If only we had a regulation around this. Come on, we'll have one soon with this new administration, right? I don't know.
[00:21:46] Child Locks vs. Electronic Door Latches
[00:21:46] Fred: But let me ask the question: how is this different from the child lock that almost everybody's got on the back seat, right?
So that the kids can't open the door into traffic. How is this fundamentally different [00:22:00] from that?
[00:22:00] Phil Koopman: Well, there's a couple things going on there, so that's a good question, Fred. One is, the child locks were created back when door handles were still mechanical, right? And so there may be some interactions that have gone past what people intended.
But let's go back to: you have an old car with a child lock, and it used to be a little mechanical thing that just sort of disabled the mechanism on the inside, right? A little flip; you open the door and there's this little switch. Okay, what's the presumption? If there's a child in the back, that must mean there's an adult in the front.
You're not supposed to leave your kid in your car, okay? So if there's a crash, you're not assuming that your kid can get themselves out of the straps in the child seat anyway. If they could, it would not be an effective child seat, right? If you have a little kid, you don't want them being able to undo the straps, right?
So they're stuck in the child seat no matter what. And the presumption is the adult is going to be responsible for opening the door from the outside, which can be done because it's mechanical, and getting them out, right? So there's an inherent assumption of adult rescue. And you're not supposed to have the [00:23:00] child lock activated unless there's someone helpless in the back seat who can't rescue themselves.
If you activate the child lock and there's actually a capable person in the back, that's actually dangerous. You should not be doing that.
[00:23:12] Anthony: Right. But even then you get somebody who’s in the driver’s side or the front seats, they get outside, they can open those doors for them.
[00:23:18] Phil Koopman: Right. Well, the risk is the driver's incapacitated and you can't get out.
If you're a kid, it doesn't matter, because the kid couldn't have gotten out anyway, right? So that's a trade-off: you're willing to say it's too risky to have a kid able to get out, compared to the probability of a driver being incapacitated. But for an adult in the back, that risk equation goes away, because you're not worried about them opening the door.
[00:23:39] Fred: Well, I think the other difference is the agency, right? The person driving the car has got the agency whether or not to elect that system, and presumably, if they've got the agency to do that, they also accept the responsibility for the consequences of the restriction they're putting on there. There's nothing like that in the Tesla [00:24:00] environment.
[00:24:00] Phil Koopman: Well, and also, if you're an adult who's been child-locked into the back, you're going to complain loudly about it, and the driver is not going to do it again. So yeah, it's not only agency of the driver to do that only when it makes sense; it's also agency of the passenger to complain about it when it's inappropriate.
But if that's the design, and there's nothing you can do, you're right, Fred, the agency's been removed.
[00:24:23] Anthony: Yeah, my son's friend turned all of those on, and I was in the backseat, and I couldn't get out, and I was like, I hate this kid. This kid's the worst.
[00:24:32] Fred: Maybe he just didn't like you.
[00:24:34] Anthony: The feeling was mutual.
It's okay. You know, you love your kid, but not all their friends. Okay, so this is clearly a big problem. Tesla probably won't, I'm going to make the assumption, do anything about it, because, well, why do they have to? Regulators are useless for the next few years. There's got to be a way to inform the consumer: hey, this is a giant danger.
[00:24:59] Phil Koopman: [00:25:00] Michael, I think it's true that even if there were rules, it would not be retroactive, right?
[00:25:06] Michael: Right. It’s very, very rare that a rule would come out and then require manufacturers to retrofit their vehicles with a safer system. So, so yeah,
[00:25:16] Phil Koopman: The next time I'm in California getting a ride hail and it's a Tesla coming for me... you know, honestly, I have more than once gotten the higher level of ride hail so that I would not have to be in a Tesla, because I worry about stuff like this. I guess that's good for the ride-hail business; they get more money out of me, right? But I mean, why wouldn't I? And there's also the fact that in these cars there might be a higher risk of problems when you do crash, right?
So that's just,
[00:25:47] Anthony: Oh, that's pretty clever. All right, listeners, if you're using a ride hail and it says a Tesla is coming, you push that away and wait for the Toyota to come up. There you go. That's the advice.
[00:25:57] Phil Koopman: Anthony's baiting me, but I'm not going to take it.
[00:25:59] Anthony: I know, I [00:26:00] know.
[00:26:02] Fred: Phil, just before we leave this topic: how much benefit could be derived from effective labeling? Like, you know, in a submarine or an aircraft in the military, you're going to have rescue placards sitting on the side of the vehicle.
You're going to have "emergency exit, pull here." You're going to have a lot of visual cues for what you should do in an emergency. Is that an approach that would be useful for the AV industry in particular?
[00:26:37] Phil Koopman: Yeah, that would help some. It’s as much muscle memory as it is the marking, to be honest. And the marking triggers the muscle memory, okay?
And so if you haven't practiced, it helps a little. I'd still rather have the usual mechanism be the emergency mechanism.
[00:26:54] Anthony: Right. So if you're on a submarine, do you know where the manual release is?
[00:27:00] Phil Koopman: You mean, do you know where the emergency blow valve levers are?
[00:27:05] Anthony: Oh my God. Wait, did you have to train on those too? And do that?
[00:27:07] Phil Koopman: Not everyone got to play with them, but we all knew where they were and we all knew what they did.
[00:27:12] Anthony: Oh my God. Okay. This is totally off topic, but I'm just dying to know. Okay. So you're down... I don't know how deep you're going. 5,000 feet, we'll say.
Is that fair?
[00:27:20] Phil Koopman: Well, let's see. That's a long way down. Okay, that's too extreme. I can neither confirm nor deny the accuracy of the Wikipedia page for my class of submarine, but the Wikipedia page says 1,300 feet.
[00:27:32] Anthony: Okay, so you're 1,300 feet down and you blow those valves. Like, that's gotta be a fun ride.
[00:27:39] Phil Koopman: No one's really stupid enough to fill the tanks all the way down there, 'cause that's a heck of a ride. There's a video that was around of a Los Angeles class doing that, probably during the initial trials, just to make sure they worked one time, right? And it just pops out of the water and comes crashing back down.
Yeah. Yeah. Heck of a ride.
[00:27:58] Anthony: [00:28:00] Oh boy. All right.
[00:28:01] Autonomous Vehicles Discussion
[00:28:01] Anthony: So let's switch gears a little bit... that was a little on the nose... and talk autonomous vehicles.
[00:28:07] Introduction to Regulating Automated Vehicles
[00:28:07] Anthony: Shall we get into how great the future will be when we get rid of those dirty, smelly, distracted humans? We're gonna start off with another one of your posts, from your Safe Autonomy blog: regulated, regulating autonomy...
Let's try that again: "Regulating Automated Vehicles with Human Drivers."
[00:28:29] Challenges with SAE Levels
[00:28:29] Anthony: And so, what I take away from a lot of this is: a lot of the AV industry will be like, hey, we're following some standard, a standard the public doesn't really know anything about and doesn't really know what it says. And from talking to Fred, even people in the SAE don't fully agree on what these things say.
There's a lot of... it's almost like they're written by lawyers. So in this article you start talking about, and you've talked about on the show in the past, the whole SAE levels 1, 2, 3, 4, 5. We've pointed out numerous times [00:29:00] they're useless, because most vehicles are all level 2, 2+, 2++, and SAE is like, you can't do that! No!
And it's very, very strange. So walk us through, and you don't have to go in depth, because you've done that in the past and I think our listeners understand the levels well enough by now: what's the issue with level 2 and level 2 plus and plus plus and whatnot?
Is this an engineering issue? Is this a marketing issue? Is it both?
[00:29:31] Phil Koopman: Well, there's a couple issues. One is that, let's say that actual engineers building cars want to use the levels, and that provides value to them. That's fine. That's not what we're talking about here. We're talking about ordinary folks and regulators, and other purposes.
So, the levels... and I have three law journal papers, which is three more than I thought I would ever have, co-authored with an actual law professor, [00:30:00] so I've dug deep into this. And the conclusion we came to was that the levels don't work for regulation, because they're too squishy, too easy to game, too subjective in a certain sense.
Just as an example: what's the level? Ask the manufacturer. They tell you a number. That's the level. It has nothing to do with what the car actually is. It's whatever they say it is.
[00:30:24] Regulatory Incentives and Loopholes
[00:30:24] Phil Koopman: Well, if you have a regulation based on what the manufacturer says the level is, and at level 3 and above it's regulated, and at level 2 and below it's not, guess what answer they're going to give if they don't want to be regulated?
Level 2. They're going to say 2. All right. So that's one of the many reasons why the levels just don't work for things where there's an incentive to game them. For engineers, you could hope there's no incentive to game; but for regulations, there absolutely is an incentive, right? If you say level 3 and above is regulated with special regulations, and level 2 and below is not, [00:31:00] that's about the biggest incentive you could possibly have.
So that leads us to: the only difference that really matters is regulated versus unregulated. And the regulated bin tends to be things where the SAE levels say they can drive themselves under normal circumstances. And so levels 3, 4, and 5 are all the same in that, while you're on the road and nothing's happening, they all do the same thing: they all drive themselves.
Now, the differences are: if something breaks, does the human have to jump back in? Or if it hits a geofence boundary or something else, does it keep going or not? Those are the differences between 3, 4, and 5. But for our purposes, it's all one lump.
[00:31:36] Defining Autonomous and Level 2 Vehicles
[00:31:36] Phil Koopman: I call it autonomous. They're autonomous. Okay, the other bin is: well, no, there's a human driver who's supposed to be paying attention while it's driving under normal circumstances, right? And that's typically called level 2. But the issue you have, because of this regulatory incentive... so right now, for example, in California, level 2 is not regulated at all, [00:32:00] other than as a regular car.
The federal reporting standards, and we're going to see if those stay around, start at level 3. All the laws are written starting at either level 3 or level 4. There are no robotaxi or autonomous-vehicle-specific laws at level 2. So, if you're a manufacturer and you don't want to be regulated, you call it level 2 no matter what it is.
Now, okay, so that's the divide. Here's the tricky bit. When somebody says "level 2 plus"... and everyone, Fred and I are both on that committee, right, so everyone gets upset when you say "level 2 plus," even though SAE has a publication saying "level 2 plus." We're not going further on that one.
What do we mean by that? Well, let's get rid of the plus, because that just gets people upset, and that isn't really the point here. I'm not advocating to put a new point on the SAE scale, because it won't fix all the other issues. What you have is cars that do not have continuous steering, right?
They have adaptive cruise control, but you have to steer all the time. And we're setting aside momentary [00:33:00] collision avoidance steering; we're talking about sustained steering, right? You have cars that do not have sustained automated steering. Let's call those just regular cars.
[00:33:10] Anthony: sustained steering, does that mean, so my car it’s not fancy, but it will have lane centering, adaptive cruise control, and will steer itself.
[00:33:18] Phil Koopman: Lane centering and sustained steering. That’s right.
[00:33:20] Anthony: That’s right. So that even counts because my car will say, Hey, every 30 seconds I’m not feeling any torque on the wheel. So yeah, yeah, yeah,
[00:33:27] Phil Koopman: yeah, yeah, yeah, that, that, yeah, most, if not all of the probably I have to say most, cause there’s always exceptions.
Most of the high-end cars today have this. Yes, absolutely. So most of them are not ordinary conventional cars; most of them are what gets called level 2. And the thing that makes them level 2 in practice... and I'm intentionally simplifying; I have a whole webpage about the ins and outs of the standard, how crazy things get.
If you really want to know, you can put a pointer in there, more than you ever wanted to know about the standard. But I'm keeping things simple. [00:34:00] For practical purposes: if it can steer itself for more than, say, a count of five, without your hands on the steering wheel, and it's not just because you're on a straight road with some grooves and the mechanics are doing it...
You know, if there's a computer actually keeping you in your lane for more than a count of five, that's level 2. That's it. That's what it takes to be level 2. So now you have the problem that level 2 is this huge sweep, starting with cars that, let's call them cruise control plus plus, right? Ah, I see.
Not only does it do speed keeping, it also does distance keeping with the car in front of you. And it also keeps you between the lines in your lane. Isn't that really just cruise control, just steering cruise control, right? Old speed cruise control: you want to be at 55, I'll be at 55.
I'll drive right into that stopped truck at 55, because I just know how to be at 55, right? Yeah. And then smart cruise control is like, hmm, I think we should not hit that truck, I'm going to slow you down, right? And then for lane keeping it's the same thing. There's [00:35:00] lane keeping assistance, which is like putting up bumpers in a bowling alley to bounce the ball so it doesn't go in the gutter, right?
It bounces you off the lane boundaries. And then there's lane centering, which is: no, no, really, I'm gonna keep you between the lines all on my own. I want your hands on the wheel, whatever, but my job is keeping you in the lines, and that's it, that's what I do. Okay, so if you want to call those... I can't use "Super Cruise" because somebody already used that, but if you want, call those "cruise control plus plus": it's cruise control for speed and for lanes.
That's baseline level 2. Lots and lots of cars have that.
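note: here is a minimal sketch of the rule of thumb Phil just described, the "count of five" test and the "cruise control plus plus" bucket. The function name and threshold are illustrative assumptions drawn from this discussion, not anything taken from SAE J3016.

```python
# Illustrative sketch of the rule of thumb above; names and thresholds are
# assumptions for this episode's discussion, not the J3016 standard.

def practical_bucket(sustained_steering_secs: float,
                     has_adaptive_cruise: bool) -> str:
    """Classify a car the way the discussion does, not the way J3016 does."""
    # "Count of five": a computer holding your lane for more than ~5 seconds
    # without your hands on the wheel counts as sustained automated steering.
    if sustained_steering_secs <= 5:
        return "regular car (no sustained automated steering)"
    if has_adaptive_cruise:
        # Speed keeping + distance keeping + lane centering:
        # "cruise control plus plus," i.e., baseline level 2.
        return "baseline level 2 (cruise control plus plus)"
    return "level 2 (sustained steering only)"

print(practical_bucket(sustained_steering_secs=0, has_adaptive_cruise=True))
print(practical_bucket(sustained_steering_secs=60, has_adaptive_cruise=True))
```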
[00:35:32] Human Factors and Overtrust in Automation
[00:35:32] Phil Koopman: Now, there's still an issue, because we found out in the 90s that people stop paying attention when you take away continuous steering. There are still some serious human factors issues, but for our discussion, let's say that driver monitoring is going to solve that well enough... and it's not.
That's not the battle we're here to fight today. So let's just set that aside and say, for now, until we have more data, let's not worry about that one, because that's not the house that's on fire. Okay? But what about things that [00:36:00] say: well, you know, I'm a robotaxi, but you have to jump in if I try and kill you?
What do you do about those? Those are way more than just going down your own lane in traffic, maybe just on a highway, right? And at some point, the problem is that people get comfortable: if it does it five or six or seven times, they assume it's going to do it forever. They overtrust too much.
And so if it stops at red lights, and it stops at a hundred red lights in a row, how many people, even engineers, can honestly say they're going to be looking at the red light the hundred and first time, when it doesn't stop? Somewhere around red light 5 to 15, everyone stops paying attention to red lights, because that's how people are wired.
And so those systems are the ones that are a problem, because they're guaranteed to make you not pay attention.
[00:36:46] Anthony: So, yeah. I look at this as: level 3 is telling you, the human, you don't have to pay attention anymore. Is that correct?
[00:36:53] Phil Koopman: So, level 3 is...
[00:36:56] Anthony: Yeah.
[00:36:56] Phil Koopman: For our purposes, I have to resist the temptation to get technical [00:37:00] and quote all that.
No, that's fine. That's fine. So yeah, for practical purposes, level 3 is: you don't have to pay attention unless the car tells you to pay attention. And the only time it tells you to pay attention is when it's time to take over.
[00:37:12] Anthony: Right. And so I look at the analogy of autopilot on an airplane or autopilot on a boat.
Like, you're supposed to be
[00:37:21] Phil Koopman: paying attention all the time. All the time. That's what I'm saying: pay attention. Yeah.
[00:37:23] Anthony: Yeah. Which is surprising to me, because there's a lot of training that goes into all of those things. Like, I just completed this sailing course where the regulations say someone must be keeping watch 24 hours a day.
You always have to have this.
[00:37:40] Phil Koopman: Well, yeah, I used to do that job.
[00:37:40] Anthony: Yeah. Yeah. Right. So on an ocean, like you’ve got, you know, you’ve got time to react like you’re generally not in close quarters unless you’re entering a harbor or marina, but you’re out, you know, out
[00:37:51] Phil Koopman: in an ocean, like, and there’s 10 pairs of eyeballs on you if you’re on an expensive ship.
Yeah. Right. Aircraft, the same thing. You have plenty of time unless you’re landing. That’s right. Right.
[00:37:59] Anthony: Whereas with [00:38:00] cars, you’re on a highway and there’s someone just next to you, you know, a foot away. You have no time to react.
[00:38:05] Phil Koopman: Yeah, it's maximum risk the entire time you're in the car, as opposed to landing, taking off, pulling into port, pulling out of port, whatever.
Okay,
[00:38:12] Anthony: Just want to make sure that that's right. This is saying: hey, you have no training, and you don't have to pay attention anymore.
[00:38:20] Phil Koopman: You have training, but not training on automation, just training on normal driving. Those aren't the same thing.
[00:38:23] Anthony: Well, I'm going to argue you don't necessarily have training on driving.
Wow, you managed to get a driver's license in the United States, which is...
[00:38:30] Phil Koopman: If you got your driver's license during COVID, where they skipped the driving test, you're correct.
[00:38:35] Anthony: Even depending on the state you're in, you know, the driving test is like: all right, you're good. Okay, so sorry, continue.
[00:38:42] Phil Koopman: Okay. So the issue is, I think we need to distinguish between plain old lane centering and things that are so capable that people will naturally treat them as if they're fully level 3, level 4 robotaxis, even though they're [00:39:00] not.
And this is human nature; denying human nature is a good way to get people killed. You just can't do it, right? So we need to have a distinction between things that are plain old lane centering, where we can presume that people know they're going to die if they don't pay attention, so they're going to pay attention.
There's nothing like self-preservation as a motivation. And people who are trained through experience, because, as you said earlier, most people probably don't read the manual much, right? That is normal human nature, especially if they don't need to. If you sit in and it seems to work, you go: oh, I see how this works.
I'm good to go. Engineers are as guilty of that as anyone, believe me. I see how it works; it's gonna work; you treat it as if that's how it works. And in the fine print: by the way, it doesn't work sometimes, for reasons a normal person cannot understand or detect or see a pattern to. This traffic light it didn't work on; the rest it did. I can't tell why. Well, that's not actionable. What are you supposed to do?
So we need to [00:40:00] distinguish them for regulatory purposes. And my proposal is... setting aside that regular level 2 has some issues, let's just let that one go.
[00:40:07] Proposals for Better Regulation
[00:40:07] Phil Koopman: It's not the house that's on fire, okay? The proposal is: if ordinary folks, when they're driving it, feel like they're getting a robotaxi experience, it should be regulated as a robotaxi, no matter what the manufacturer says.
Okay, I know that's a little squishy, right? Yeah. But intuitively, hopefully that makes sense. You know, you get in and say: oh, this is a robotaxi. Yeah, it says level 2 on the sticker, but that's just to keep the lawyers happy. Who cares about those stinking lawyers, right? Sorry, Michael.
[00:40:38] Michael: Yeah, so anything that makes the driver feel like the car is going to take care of them, at any point, should be regulated as an autonomous vehicle.
[00:40:50] Phil Koopman: If you're selling something to a customer that says, you don't have to worry about driving, we're going to deal with it...
then regardless of the fine print, if that's their takeaway, and they take [00:41:00] action on that, and they act like that, and they truly believe it, not because they're being jerks and abusing it, but because they truly believe it is going to take care of them, then it should be regulated as something that is supposed to take care of them.
And if the manufacturer doesn't like that, they have to be a little more vigorous in explaining to people: no, no, it's not going to take care of you, instead of putting on social media, oh yeah, it's safer than a human driver, et cetera, et cetera, right? So my approach to this is that the level 3 regulations should reach down and scoop up the robotaxi wannabes.
The things that people perceive as: might as well be a robotaxi. Now, if you want to be technical... the technical proposal is hard, because you need a bright line, and that perception is kind of subjective, like, how do you do that? So the technical proposal I have is: if it can depart its current lane of travel intentionally, then it should be regulated as level 3.
[00:41:54] Anthony: That sounds great to me. That sounds pretty clear.
[00:41:56] Phil Koopman: I mean, there are loopholes to think about, you know, and some manufacturers may [00:42:00] not be able to offer a simple lane change feature because of that. But the amount of chaos it'll bring to the market is near zero, because most lane centering doesn't do that, right?
So all those guys are out of it, and all the ones that change lanes are on the path to being wannabe robotaxis, right? And if somebody has a smarter way to make a bright line, I'm fine with any bright line, but it's got to be easy to enforce and it has to not be gameable. The one I use is: if it can change lanes on its own, then you're up in robotaxi land.
Get rid of...
[00:42:33] Michael: So it’s got to be really bright to prevent the level two loophole from occurring, essentially.
[00:42:38] Phil Koopman: So you said "level two loophole." I say that a lot. I don't know who invented it, but I say that a lot. The level two loophole... let me give you an example of how bad this can get, okay? This is hypothetical, but it illustrates why the SAE levels are unsuitable for regulation.
Let's say that I build the following. I'm [00:43:00] going to build a level 2 car, so I'm not regulated. I'm going to make it so it drives completely autonomously in all circumstances, except when it sees a circus elephant in the middle of the road. It does not know how to recognize circus elephants. Now, I saw circus elephants in the road when I was a kid, but I don't think they do that anymore.
But my ODD is: I drive everywhere there aren't circus elephants on the pavement. Okay? And I rely on the human driver to complete the object and event detection and response (OEDR) subtask by recognizing circus elephants. By all measures, that makes me level 2 according to the strict letter of the...
I'm sorry, the strict letter of the standard, which is also baked into regulations, because I rely on the driver to detect circus elephants. That makes me a legit, straight-up level 2. But in practice, it's a robotaxi, because there aren't any circus elephants. Okay? That's how easy it is to game the standard.
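note: to pin the proposal down, here is a minimal sketch of the bright-line test applied to the circus elephant hypothetical. The names are illustrative assumptions; the point is that the test keys off what the vehicle can actually do, not the level the manufacturer declares.

```python
# Sketch of the proposed bright line: regulate on capability, not on the
# manufacturer's self-declared SAE level. Names here are illustrative.

def regulated_as_autonomous(can_intentionally_change_lanes: bool,
                            declared_level: int) -> bool:
    """True if the vehicle should fall under level 3+ (robotaxi) rules."""
    # The declared level is ignored on purpose: it's whatever the
    # manufacturer says it is, so it can always be gamed down to 2.
    _ = declared_level
    return can_intentionally_change_lanes

# The "circus elephant" car: declared level 2 because the driver is nominally
# responsible for spotting elephants, yet it drives itself everywhere and
# changes lanes. The bright line still catches it.
print(regulated_as_autonomous(can_intentionally_change_lanes=True,
                              declared_level=2))   # True: robotaxi rules apply

# Plain lane centering that never leaves its lane stays outside those rules.
print(regulated_as_autonomous(can_intentionally_change_lanes=False,
                              declared_level=2))   # False
```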
[00:43:57] Anthony: I am so looking forward to your autonomous [00:44:00] vehicle company. It's going to be great. I mean, the mascot's going to be incredible, and you get to self-certify. It's genius. Yeah.
[00:44:09] Phil Koopman: So if I do a book, do I have to put a circus elephant on it now? Is that the...
[00:44:13] Anthony: Absolutely, come on. Please call it, you know, "Dumbo and the Level Two Loophole."
[00:44:20] Phil Koopman: So that's why I say it's a loophole: you can drive a fully automated big rig through it.
[00:44:27] Fred: Yeah.
[00:44:28] Tesla and Regulatory Actions
[00:44:28] Fred: I'm just going to refer back to the NTSB, the National Transportation Safety Board, recommendations after the Mountain View accident, which is now, what, five years ago, something like that, in which a Tesla on Autopilot crashed into a barrier on the side of the road and killed the driver and injured other people as well. One of the recommendations to the National Highway Traffic Safety Administration was to evaluate Tesla Autopilot-equipped vehicles to [00:45:00] determine if the system's operating limitations, foreseeability of driver misuse, and the ability to operate the vehicles outside the intended operational design domain pose an unreasonable risk to safety, and, if safety defects are identified, to use applicable enforcement authority to ensure that Tesla Inc. takes corrective action. So this is back in 2020, and I don't think any action's been taken since.
[00:45:25] Phil Koopman: Well, there has been, Fred, but let's talk about that for a minute, because it's important. So back with one of the early Tesla crashes... NTSB, the National Transportation Safety Board, is not a regulator.
They're just an investigator, and they just make recommendations. People conflate them with the National Highway Traffic Safety Administration, NHTSA. NTSB investigates, NHTSA regulates, and they're different; they even have different bosses, so they're completely different. And NTSB is just the gold standard for what happened and what should be done about it. They're great folks.
They make recommendations, including one to NHTSA, and what they said, summarizing back then, was: there are two problems that you have to fix. There were many, but the two that we're talking about: one is that if you let people stop paying attention, they will stop paying attention, so you need good driver monitoring. And the other one is, you have to not have this stuff turn on in situations it's not supposed to work in.
You have to enforce the limits, the operational limits, the ODD. And that fell on deaf ears for years and years and years. There was a recent Tesla Autopilot recall where NHTSA actually pulled out those two things. They said: by far the majority of the Autopilot crashes were happening on roads it's not supposed to be used on, roads with cross traffic.
It's not supposed to be used on roads with cross traffic, but they let it be turned on there. All these cross-traffic crashes, including the three, count them, three tractor-trailer underrun [00:47:00] crashes, were all with cross traffic. So: you need to enforce your ODD. And the other one is, people are gaming the driver monitoring.
You need to do decent driver monitoring. So NHTSA actually told Tesla they had to fix it. And Tesla came out with a recall, a remedy, that waved its hands; it didn't really fix it. And now NHTSA has launched an investigation. So it took those four and a half years, Fred, for NHTSA to do anything about it.
And then the remedy was ineffective, and now there's an investigation, and who knows how many years that's going to take. So that's kind of where we are. But for these cars, driver monitoring and operational limits enforcement are the two things that you need to do before anything else. And some companies just don't want to do it.
[00:47:45] Fred: No, thanks for that. I think this plays right into one of the other topics you have often talked about, which is the ability to game the system by running out the clock: talking about things that you're going to do, [00:48:00] requiring litigation, and all of these devices that are available to well-funded manufacturers to simply push the horizon of regulations into the future and let them basically run on the roads as much as they want until all of these issues are resolved.
[00:48:22] Phil Koopman: Well, we’ve certainly seen that with automatic emergency braking, which I’m not a policy expert on; Michael does more of that than I do. But they had an agreement, and then they sort of phoned it in on the agreement. Now, some companies do great and other companies do the minimum, right?
Some do maybe as little of the minimum as they can get away with. But years and years passed, and now there’s a rule saying you have to have, no, seriously, you have to have real AEB. And now the manufacturers are saying, oh, but there’s no time, because we didn’t work on it earlier. And yeah, that’s how this stuff plays out, right?
How did I do, Michael? Is that about it? I’ve heard you go on. That’s about it. Okay. So apparently I’ve been listening to your [00:49:00] podcast well enough.
[00:49:02] Michael: When they sued NHTSA to invalidate the rule, they were touting that they had invested a billion dollars as a group into AEB over the past 10 years, which, when you compare it to what they’re investing in autonomy and partial autonomy, is a pittance.
[00:49:19] Phil Koopman: That’s like a year, one year’s burn rate at one of the robotaxi companies, right?
[00:49:26] Anthony: That’s Waymo’s lunch bill. Like, come on. Okay.
[00:49:29] Mercedes and Liability Issues
[00:49:29] Anthony: Before we wrap up this one real quick: so we’re talking level three, and, you know, we’ve got BlueCruise, and Mercedes is really the big one kind of touting this.
They’re doing it on city streets. And I kind of like Mercedes’ approach, in that they are illuminating the outside of the car in that kind of teal color, warning other drivers: there is no one behind the wheel, no one is paying attention.
[00:49:54] Phil Koopman: Stay away from me, I’ve got this light on. Yeah,
[00:49:56] Anthony: yeah, yeah.
[00:49:58] Phil Koopman: I mean, I’m glad, [00:50:00] I’m glad to see that someone’s trying to think about that.
You know, caution: this car may not behave the way a person would, because it’s not a person. That’s kind of interesting. I don’t know what will happen; the human factors stuff gets really tricky in a hurry. Are people going to taunt it? Are they going to mess with it?
I don’t, I don’t know. But I’m glad to see that someone’s trying to address that. Absolutely.
[00:50:21] Anthony: I love that. I think that should be regulated. I mean, it’s like a student driver on board sign. It’s that giant sign.
[00:50:27] Phil Koopman: And I think some people would treat it like your interpretation; other people would say other things. But fair enough.
[00:50:31] Anthony: Well, I’m being an optimist.
[00:50:33] Phil Koopman: Okay. Yeah,
[00:50:34] Michael: I can, I can cut this car off anytime I want. That’s the other thing that sign says.
[00:50:40] Anthony: Yeah. But people don’t do that with student drivers. You don’t, you know, you’re not, you know, you sit back and go.
[00:50:46] Phil Koopman: But the thing is, it’s a person you’re cutting off instead of a machine, and who cares about, you know, the machine’s…
[00:50:51] Michael: Machine’s feelings.
[00:50:53] Phil Koopman: There are no, there are no feelings, even if it’s running ChatGPT, but that’s for a whole different time. Machines have no feelings. [00:51:00] But there’s a dark side to the Drive Pilot stuff that’s important, because I still hear people, journalists even, not really getting this. So with Mercedes, when you have a level three vehicle, the idea is that you do not have to pay attention under normal circumstances.
You just have to take over when the car asks. Let’s set aside takeover. We’re normally driving down the road, okay? The car is driving and you’re watching a movie on the dashboard or playing Tetris. Those are all things they tell you you can do. Okay, fine as far as it goes. The question is, if there’s a crash, who’s on the hook?
And Mercedes has said so many times, we accept liability, but it’s not true. Okay, Michael’s pointed that out right
[00:51:42] Anthony: away
[00:51:42] Phil Koopman: When you press them on it, they don’t accept liability. Sadly, this is where my work with William Widen, my law professor co-author, comes in. Shout out to William. Shout out to Bill.
All right, so there’s product liability, which not only do they accept, they don’t have a choice. Of course they accept it; they don’t have a [00:52:00] choice. So, oh, we accept product liability. Well, good for you, you were going to have it anyway, right? That’s not what people are talking about. They’re worried about tort law.
They’re worried about wrongful death. They’re worried about criminal law, being thrown in jail. And Mercedes has gone on the record multiple times for multiple journalists, some of which may have been at my instigation, you know, saying, no, no, no, we mean product liability; tort liability we don’t accept.
And criminal liability you can’t accept, because it doesn’t work that way. You know, you can’t say, it’s okay to kill this guy, I promise. It doesn’t work that way, right? No contract can fix that. So, the last I’ve heard: I’ve watched a Mercedes representative on a public stage when asked a direct question.
Junko Yoshida was running a session, and she asked the Mercedes guy: so, what’s the deal with tort liability? And he eventually worked around to, no, they do not accept tort liability, and of course they can’t accept criminal liability. So if you’re in your Mercedes running down the road and you run over [00:53:00] someone, and there’s a negligent homicide charge or whatever the state charge is coming your way, or a wrongful death suit, Mercedes has said they’re not going to protect you from that.
Now, there may be some states, like California, which has a level three law, I have a feeling that would get messy, but in some states there is no level three law. And so you’re playing Tetris, you’re watching a movie, and your car hits someone. You’d better be ready to defend yourself, because you’re probably on the hook for at least being exposed to trial, and maybe on the hook for actually being just as guilty as if you had been driving.
[00:53:36] Anthony: But if you can afford a Mercedes, your pockets are deep enough that you can sue Mercedes, right? Good luck with that. All right. And they’re going to say, well,
[00:53:46] Phil Koopman: we said, we only said product liability, prove it’s defective. And what if you have a product that sees 95 percent of pedestrians, and that’s the best that current technology can do?
I made up that number; hopefully it’s higher, right? [00:54:00] That could be ruled not a defective product, because cost-effective products just aren’t perfect, deal with it, right? And you were a fool for believing them when they said you could watch a movie. Maybe, maybe, I don’t know how it ends up, but I don’t want to be that driver.
[00:54:16] Anthony: No, and no listener to this show should be that driver either. If you’re behind the wheel of a car, you’re driving it. I don’t care what you’re doing; stop playing Tetris, drive the car.
[00:54:27] Phil Koopman: Until this legal stuff gets sorted out, no matter what the manufacturer tells you, if there’s a steering wheel in front of you, you had better be paying attention and better accept responsibility.
And every time you trust the car to drive you, you’re taking a chance with your financial freedom and with your liberty. You know, you may end up in jail. That’s how you have to look at it.
[00:54:49] Anthony: Please don’t end up in jail.
[00:54:50] Conclusion and Next Episode Teaser
[00:54:50] Anthony: Hey, we’re going to wrap up this episode, but we’re going to come back with next week’s episode.
We’re going to start off with unregulated Cybercabs. [00:55:00] Hey, listeners, if you haven’t subscribed, subscribe already. If you haven’t donated, go to autosafety.org, click on that big red button. Donate! There should be big red buttons in these cars too, maybe, as an emergency thing. That’s not really ever gonna happen.
Until next time listeners, bye bye. Bye bye. Bye.
[00:55:16] Fred: For more information, visit www.autosafety.org.