Whether you call them self-driving cars, driverless cars, or autonomous vehicles, these robot-, chip-, and algorithm-operated vehicles are growing in popularity by the day. Their safety, and the impact they will have on accident rates, remains a pressing and unanswered question.
The idea that the human driver is not in control of the vehicle is incredibly freeing in the best of scenarios and terribly frightening in the worst. Interestingly, automation in automobiles is not new. Features such as cruise control and anti-lock braking systems (ABS) have become so trusted and so standard that drivers hardly think of them as automation at all.
So-called driverless cars, however, will be equipped with a much wider range of automated features, such as lane keeping, adaptive cruise control, and pedestrian detection. The safety and effectiveness of these features have yet to be proven to consumers, and it is unlikely that drivers will immediately place their trust in them, especially in light of recent accidents involving these types of cars.
In California in late March of this year, a Tesla Inc. Model X collided with a highway barrier while traveling at high speed in semi-autonomous mode. In another accident, on March 18th in Tempe, Arizona, outside of Phoenix, a self-driving SUV struck and killed a woman; according to investigators, the car did not even attempt to brake before the fatal collision.
For drivers, pedestrians, and legislators, these recent accidents again call into question the safety of these systems, the responsibility of the drivers, and the legislation that currently exists or needs to be implemented to regulate this emerging industry.
It appears that self-driving vehicles may give drivers a false sense of security, making them feel as if it is safe to take their attention away from the road.
As Mainor Wirth Injury Lawyers explains, “Technology is not flawless. Computer chips fail, circuits break, and algorithms can derive the wrong solution. Moreover, when the autonomous systems fail, it reduces the amount of time that a driver has to react in order to avoid a collision with another vehicle, pedestrian, or fixed object.”
There are as yet no federal laws governing the testing and safety of self-driving cars. Instead, individual states have enacted their own regulations, although Congress is attempting to pass federal legislation.