According to Alert Driving, 90% of auto accidents are caused by human error, meaning the driver of one of the vehicles involved is responsible for the collision. However, modern technology now allows a car to be driven autonomously, without the driver in control. If an accident occurs in that situation, would the cause still be considered human error? Who is responsible for an accident caused by a self-driving car?
In October 2014, automaker Tesla began equipping Model S vehicles with hardware to allow for the incremental introduction of self-driving technology. An October 2015 software update enhanced the Autopilot capabilities, allowing the Model S to steer within a lane, change lanes when the driver taps the turn signal, and manage speed using traffic-aware cruise control. Digital control of the motors, brakes and steering would help avoid collisions and prevent the vehicle from wandering off the road, Tesla reported in a blog post.
However, these self-driving capabilities may not be living up to the company’s expectation of safety. At least two fatal self-driving car accidents have been reported within the last year, and crashes causing serious injuries occur even more often, Toronto-based brain injury lawyers report. On May 7, 2016, 40-year-old Joshua Brown of Ohio was killed in Williston, Florida, when his self-driving Tesla crashed into a truck. The Model S failed to distinguish the tractor trailer against a bright sky, according to The Guardian.
Brown was believed to be the first autonomous vehicle fatality until recent reports suggested that the first self-driving vehicle death occurred in China in January 2016. The 23-year-old victim was also driving a Model S when his vehicle crashed into the back of a street-sweeping truck. There was no evidence to suggest that the vehicle attempted to stop prior to the collision, Money reported.
Money also reports that the victim’s family filed a wrongful death lawsuit against Tesla (source: Guajardo & Marks, LLP), but the automaker claims it had no way of knowing whether Autopilot was engaged at the time of the crash.
“The Tesla vehicles with autopilots are vehicles waiting for a crash to happen — and it did in Florida,” Clarence Ditlow, executive director of the Center for Auto Safety, told the LA Times. Ditlow called for the company to issue a recall and disable the Autopilot function until the NHTSA issues safety guidelines.
In response to the May 2016 crash, Tesla stated that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle.”
While that statement suggests the operator is responsible for the vehicle and any crashes that may occur, others disagree. Scientific American suggests: “When a computerized driver replaces a human one, experts say the companies behind the software and hardware sit in the legal liability chain—not the car owner or the person’s insurance company. Eventually, and inevitably, the car-makers will have to take the blame.”