I mean, the way to solve this is to prioritize waiting for a turn over taking the penalty of missing it.
You would have the same issue if you missed the last exit in San Francisco and got stuck going all the way across the Bay Bridge (can easily hit 2 hours trying to go over and back in traffic both ways)
Sort of. The issue is that it's not just "waiting for a turn", it's often "start changing lanes into a small gap because you know the human drivers will make space and let you in". In which case, what you're saying is "drive more aggressively if there's a large time penalty for not doing so", which is at least a mildly uncomfortable criterion to put into a computer algorithm.
>a mildly uncomfortable criterion to put into a computer algorithm
I suspect this is going to be a fundamental issue with AI. Far from some idealized Three Laws of Robotics, AIs will need to behave like humans to fit into our society. And that will force us to confront the ways in which we don't follow our own rules - indeed can't follow our own rules, the rules being impractical but a convenient fiction that allows us to feel better about ourselves.
The driverless car is definitely not liable, but the reason it's getting hit is that it's violating an expectation that people will bend the rules in this circumstance.
I think it's the same criterion you apply as a human. If you need to get into a lane and make a turn, you will slow down, block your lane, and inch in until someone lets you in.
I do think there has to be an aggressiveness level in making maneuvers for an autonomous car. It doesn't mean it's unsafe; it just means it could be MORE safe if the time penalty isn't big (a decision that regular drivers have a hard time evaluating, since we don't measure our own maneuvers' safety accurately).
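Roughly the kind of tradeoff I have in mind, as a toy sketch (not how any real planner works; the gap sizes and the 30-minute scaling are numbers I made up):

```python
def acceptable_gap_seconds(missed_turn_penalty_minutes):
    """Toy policy: the bigger the cost of missing the turn, the smaller
    the gap we're willing to merge into (down to a hard safety floor)."""
    relaxed_gap = 4.0   # seconds of headway we'd like when there's no rush
    minimum_gap = 1.5   # never accept less than this, whatever the penalty
    # Shrink the required gap as the penalty grows, capped at the floor.
    shrink = min(missed_turn_penalty_minutes / 30.0, 1.0)
    return max(minimum_gap, relaxed_gap - shrink * (relaxed_gap - minimum_gap))

print(acceptable_gap_seconds(2))    # small detour -> stay polite (~3.8 s)
print(acceptable_gap_seconds(120))  # stuck-on-the-Bay-Bridge territory -> 1.5 s floor
```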
Actually, it was a wrong turn in an unclearly marked construction zone. I was using Google Maps at the time, but the road change was recent enough (perhaps that same morning) that the big G was wrong.
Then you'll just have a car stopped in a lane until someone takes pity on it. That isn't an acceptable solution either.
The solution is to figure out what the traffic rule is based on what the other traffic is doing. But that introduces other pitfalls if you don't do it right.
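As a purely hypothetical illustration of what "figure out the rule from traffic" could mean (speed is the simplest case; the median and the +5 mph margin are arbitrary choices):

```python
import statistics

def effective_speed_mph(observed_speeds_mph, posted_limit_mph):
    """Toy heuristic: treat the median of nearby traffic as the de facto rule,
    but never exceed the posted limit by more than a small margin."""
    prevailing = statistics.median(observed_speeds_mph)
    return min(prevailing, posted_limit_mph + 5)

# Posted 55, everyone is doing ~67: drive 60, not 55 and not 67.
print(effective_speed_mph([66, 68, 67, 65, 70], posted_limit_mph=55))
```

The pitfall is exactly the one mentioned: if every car infers the rule from the cars around it, nothing anchors the behavior, and it can drift.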
I mean, this is what happens to folks as they try to get into a one-lane exit: someone has to let you in.
I guess one benefit of more autonomous vehicles might be cooperation between vehicles to greatly reduce traffic and congestion in these sorts of situations
The big issue with lane merging is that you have to be assertive and risky in cities or you are just going to be trapped and no one will let you in. You have to almost dare cars to hit you in order to force yourself to have space. I can't imagine a self-driving car ever doing that well. It's an entire dance.
It’s quite a safe assumption that no one wants to get into a crash if they can possibly avoid it, so perhaps all that’s necessary to calculate when a car can “safely” be cut off is whether they will have time to stop, even if it’s an abrupt and unpleasant stop? It’s not pretty, but it’s the same calculation human drivers make all the time in these situations.
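For what it's worth, the "do they have time to stop" check is basic kinematics; something like this back-of-the-envelope version (the reaction time and braking deceleration are assumed values, not anything a real system uses):

```python
def can_brake_in_time(gap_m, closing_speed_mps, reaction_time_s=1.0, max_decel_mps2=6.0):
    """Rough check: can the car behind shed the closing speed within the gap,
    allowing for reaction time? 6 m/s^2 is hard but survivable braking."""
    reaction_distance = closing_speed_mps * reaction_time_s
    braking_distance = closing_speed_mps ** 2 / (2 * max_decel_mps2)
    return reaction_distance + braking_distance <= gap_m

# Cutting in 25 m ahead of a car closing at 10 m/s (~22 mph faster):
print(can_brake_in_time(gap_m=25, closing_speed_mps=10))  # True: ~18.3 m needed
```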
I mean, humans do this, and probably a high majority of the time they make the merge with both parties unscathed. I also do not agree that you are REALLY paying much attention to who you are cutting off. Generally the process is: look for a gap sufficiently close to your exit, swoop in; if it's too tight, keep going... till eventually you just stop and wait until you get the gap.