The idea that a vehicle could drive itself without a human operator behind the wheel was nothing more than science fiction 10 years ago. However, technology and artificial intelligence have come a long way in recent years. Self-driving cars are already being tested and used on Washington roads. The question is: Are they safe?
A recent accident caused by a Tesla, which was being driven by its “Autopilot” feature at the time of the crash, has raised new questions about whether the world, human drivers and the technology itself are ready for mass implementation. According to reports about the accident, the Tesla drove into the back of a fire truck that was stopped at the scene of another crash. Miraculously, no one was hurt.
The driver told police that he had activated the Tesla Autopilot system and the car was, for all intents and purposes, driving itself at the moment of the crash. As such, one might think that the vehicle should have been able to stop for the fire truck, or at least navigate around it. However, Tesla has since issued a statement that its Autopilot feature may only be used by attentive drivers.
Authorities are still investigating this crash, and it may turn out that, given the circumstances of the accident, even a fully attentive driver could not have avoided the collision. We may never know for sure.
In the meantime, we have to wonder whether the roads, drivers and the legal system are truly ready for driverless cars. It may take some time to reap the potential benefits of doing away with the drunk drivers, speeders, inattentive drivers and human errors that cause avoidable car crashes every day. It also remains to be seen how the legal system will sort out issues of liability in driverless car collisions.
Source: channelnewsasia.com, “Self-driving Tesla crashes into fire engine in California,” Jan. 25, 2018