Google's self-driving car is seeming more and more human. And like the rest of us, it's subject to traffic stops.
The head of Google's rapid rollout lab, David Weekly, tweeted a photo Thursday of the prototype car stopped by a motorcycle officer. Apparently, the vehicle was going too slowly in a 35 mph zone, causing traffic to back up behind it.
That's because the cars' speed is capped at 25 mph, a move Google says makes them less of a mystery and more friendly and approachable, rather than zooming scarily through neighborhood streets, according to the project's blog.
The company was quick to point out that the car has never received a ticket, despite logging 1.2 million miles of autonomous driving. That's the human equivalent of 90 years of driving — and you thought logging driver's ed miles with your teenager was a lot.
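The arithmetic behind that comparison is simple. Assuming a typical driver covers roughly 13,300 miles per year (an assumed average here, close to U.S. federal estimates), 1.2 million miles works out to about 90 years behind the wheel:

```python
autonomous_miles = 1.2e6
typical_annual_miles = 13_300  # assumed average annual mileage per driver

years_equivalent = autonomous_miles / typical_annual_miles
print(round(years_equivalent))  # roughly 90
```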
Google's little run-in with the police comes amid a host of open questions surrounding self-driving cars.
Sometimes, obeying the law can actually be the wrong choice when it comes to defensive driving; good judgment can mean acting illegally.
Imagine, for example, a deer is standing in your lane. No cars are approaching. A defensive driver would probably slow down and go around it, encroaching into the other lane. But an automated car, following the law to a T, might come to a full stop rather than cross a double-yellow line. Envision the pileup that might ensue.
Instead of programming millions of directions for every specific situation, however wacky, Google is teaching cars how to handle the more fundamental aspects of unpredictable driving so they can respond to a wide variety of situations accordingly.
Even worse: What about the case of an unavoidable accident? Cars could be programmed either to a) minimize the loss of life, even if it means sacrificing the car's own riders, or b) protect riders at all costs. Would the answer be the same every time, or would it be random?
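As a toy illustration (not anything Google has described), the two policies amount to minimizing different quantities. The action names and casualty estimates below are entirely hypothetical, chosen only to show how the two rules can diverge on the same facts:

```python
def choose_action(actions, policy):
    """Pick among candidate maneuvers under one of two toy policies.

    Each action carries hypothetical casualty estimates: total across
    everyone involved, and for the car's own riders specifically.
    """
    if policy == "minimize_total_loss":
        return min(actions, key=lambda a: a["total_casualties"])
    if policy == "protect_riders":
        return min(actions, key=lambda a: a["rider_casualties"])
    raise ValueError(f"unknown policy: {policy}")

# Hypothetical dilemma: swerving harms a rider, braking harms two others.
swerve = {"name": "swerve", "total_casualties": 1, "rider_casualties": 1}
brake = {"name": "brake", "total_casualties": 2, "rider_casualties": 0}

print(choose_action([swerve, brake], "minimize_total_loss")["name"])  # swerve
print(choose_action([swerve, brake], "protect_riders")["name"])       # brake
```

The point of the sketch is that neither rule is obviously right; the same situation yields opposite maneuvers depending on which objective is encoded.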
Information about the car's surroundings would come from a laser-based remote sensing system called Lidar. Using illumination invisible to the human eye, it captures information and converts it into a 3-D model of the scene. But despite Lidar's extreme accuracy, there will always be room for error.
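The 3-D model is built from individual laser returns: each pulse travels out along a known beam direction and reports a distance, and converting many such range-and-angle measurements to Cartesian points produces the point cloud. A minimal sketch of that conversion, with illustrative function and parameter names of our own choosing:

```python
import math

def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (range plus beam angles) to an (x, y, z) point.

    azimuth is the horizontal beam angle, elevation the vertical one;
    a scene's point cloud is just many of these points accumulated.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A return 10 m away, dead ahead and level, lands at (10, 0, 0):
print(lidar_return_to_xyz(10.0, 0.0, 0.0))
```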
Back to the traffic stop. How do you program cars to respond appropriately to the all-too-familiar flashing lights in the rearview mirror? It makes us wonder: Could self-driving cars be programmed, in some way, to avoid traffic infractions — or even the police? What if self-driving cars were pulled over ... by self-driving police cars?
In the meantime, the matter of ticketing itself is still an open question. Who, or what, gets the ticket? Google has said that if one of its cars breaks a law it'll foot the bill, but sometimes state law isn't quite as clear — especially if there isn't anyone in the driver's seat.
Given the plethora of unanswered questions surrounding self-driving cars, there's still a lot of thinking, and programming, to be done. And with the Institute of Electrical and Electronics Engineers predicting that these vehicles could make up as much as 75 percent of traffic by 2040, we'll have to make some of these decisions soon.
Copyright 2020 NPR. To see more, visit https://www.npr.org.