We Don't Need No Level 5
Stupid customers are one thing; the other thing is investment fund managers who cannot tell the grain from the chaff. Based on the (correct) assumption that convenience sells, they have lured crowds into putting money in their funds without doing proper due diligence. The investment thesis is "AI will solve it, because AI solves every problem, e.g., it can play chess or go". But there is a fundamental difference between a game of chess or go and a self-driving car: the rules of these games are finite, and whatever happens, happens within those rules. You never have to deal with an opponent who suddenly starts with an extra bishop or nine pawns. When driving a car, the rules are open-ended and the possibilities of what can happen are infinite.
In other words, there is an infinite number of corner cases. And as @LeeRatliff points out: "This article is all about the complexity of resolving corner cases in game design. Read it, then tell me how long it will take to design self-driving cars when an overlooked corner case could easily be fatal." Indeed, Level 5 may simply never happen.
But turning the problem upside down, we may not need Level 5 in the first place. Self-driving within a geofenced infrastructure, when conditions permit, is probably sufficient. There are vehicles doing that today, like the automated trains connecting airport terminals, or even the metro in some cities. They of course run on dedicated tracks in tunnels, with no intersections, shielded from other road occupants, so their job is easy, but they serve well as examples of a Level 4 system. And it would not be a huge stretch to imagine a freeway equipped with infrastructure supporting L4 for cars. Then you could enter such a freeway, sleep, and take back control when approaching your exit.
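To make the idea concrete, here is a minimal sketch of what the engagement check for such a geofenced L4 zone could look like. The polygon coordinates, condition flags, and function names are all hypothetical, invented for this illustration; a production system would rely on HD maps, V2X infrastructure, and live weather data rather than a toy point-in-polygon test.

```python
# Hypothetical sketch: allow hands-off L4 driving only inside a geofenced
# freeway segment and only when conditions permit.

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside a polygon of (lat, lon) vertices?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        crosses = (lat_i > lat) != (lat_j > lat)
        # Short-circuit avoids division by zero on horizontal edges.
        if crosses and lon < (lon_j - lon_i) * (lat - lat_i) / (lat_j - lat_i) + lon_i:
            inside = not inside
        j = i
    return inside

# Made-up geofence around an L4-enabled freeway segment.
L4_FREEWAY_ZONE = [(52.10, 4.30), (52.10, 4.40), (52.20, 4.40), (52.20, 4.30)]

def l4_engagement_allowed(lat, lon, visibility_m, road_infra_ok):
    """Hands-off driving only inside the geofence, and only in good conditions."""
    in_zone = point_in_polygon(lat, lon, L4_FREEWAY_ZONE)
    conditions_ok = visibility_m > 200 and road_infra_ok
    return in_zone and conditions_ok
```

The key design point is that the system never claims capability everywhere; it claims it only inside a zone whose infrastructure and conditions it can verify, which is exactly what makes L4 tractable where L5 is not.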
Similarly, the whole concept of the lower levels, especially 2 and 3, which partially offload the driver, seems to be wrong. The tricky point is the moment the system gives up and the human driver has to take over. This is simply impossible in practice: such situations usually demand an immediate takeover, and the better the system drives, the less attention the human pays, which leads to a loss of orientation and judgement and, very likely, a bad decision ending in an accident.
Again, if we formulated the role of an L2 or L3 system as the opposite, the human drives and the system takes over only when it is capable, we would be in a much better (and safer) position. Some subsystems have been doing exactly that for years: ABS takes care of pulse braking, and stability control straightens a skidding car.
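Here is a minimal sketch of that "human drives, system intervenes when capable" pattern, using an ABS-like brake modulator as the example. The slip threshold, signal names, and pressure-release factor are simplified assumptions made up for this illustration; a real ABS runs on a hard real-time ECU with per-wheel hydraulic valve control.

```python
# Hypothetical sketch: the driver's input passes through untouched, and the
# system overrides only in the narrow regime where it is provably better,
# namely when a wheel is locking up under braking.

SLIP_THRESHOLD = 0.2  # assumed slip ratio above which the wheel is locking

def wheel_slip(vehicle_speed, wheel_speed):
    """Slip ratio: 0.0 means the wheel rolls freely, 1.0 means fully locked."""
    if vehicle_speed <= 0.0:
        return 0.0
    return max(0.0, (vehicle_speed - wheel_speed) / vehicle_speed)

def brake_command(driver_pressure, vehicle_speed, wheel_speed):
    """Forward the driver's brake input; intervene only on wheel lock-up."""
    if wheel_slip(vehicle_speed, wheel_speed) > SLIP_THRESHOLD:
        # The system knows it can do better here: release pressure to regain
        # grip, mimicking the pulse braking a skilled driver would perform.
        return driver_pressure * 0.5
    # Otherwise the human stays in full control.
    return driver_pressure
```

Note how the responsibility never flips: the human is always driving, and the machine's intervention is bounded, momentary, and well defined, which is the inverse of the L2/L3 handover problem described above.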