Self-driving technology is being actively tested all over the world, shaping the future of transportation. Major automakers and technology companies, including General Motors, Tesla, Uber, and Google, have already begun to implement this revolutionary technology. But for all the confidence surrounding self-driving cars, there is an equal amount of skepticism and concern. Regulatory barriers are hampering growth because, as with anything that becomes digitized, the cybersecurity threat increases rapidly.
Currently, there are no legally operating, fully autonomous vehicles in the U.S. There are, however, partially autonomous cars and trucks with varying degrees of automation. The automotive industry is among the furthest along in digitalization, and as the technology matures, so does the need to regulate these vehicles and address the legal implications of their use.
In 2018, federal legislation was introduced in the U.S. to create a baseline for the testing and operation of highly automated vehicles. Although the statutes in Title 49, Subtitle VI of the U.S. Code and the regulations in Title 49, Subtitle B, Chapter V of the Code of Federal Regulations fill entire pages with motor vehicle standards, many of those standards were drafted with human drivers in mind.
Today, more than 25 states have enacted legislation related to autonomous vehicles and their systems. To encourage technology and motor vehicle companies to create testing programs, some state governments have enacted permissive regulations for autonomous motor vehicles. Even so, these statutes leave unclear exactly how many of a vehicle's tasks must be performed by the driver and which can be delegated to the vehicle. They simply require the presence of a licensed driver who is able to take control of the car in an emergency.
Florida, a leader in welcoming self-driving technology, has declared a legislative intent to take any steps necessary to support the safe development, testing, and operation of motor vehicles with autonomous technology on the state's public roads. The statute encourages manufacturers developing autonomous cars to test their products all over the Sunshine State. Under state law, an original manufacturer's liability is limited when an accident or injury involves a vehicle retrofitted with aftermarket autonomous technology; in that case, the party that installed the technology bears liability instead. Taking autonomous cars to a new level and out of the legal gray zone, the laws enacted in Florida, California, and Nevada even include provisions that anticipate autonomous vehicles operating with no human driver inside.
Interpreting the new legislation on self-driving cars raises many questions that must be answered. The most pressing include: Who is liable for an accident? What share of the responsibility do they bear? Who must provide compensation? The differences between semi-autonomous and fully autonomous vehicles raise further questions about liability when an accident or injury occurs.
One of the most important distinctions to understand is the difference between semi-autonomous and fully autonomous vehicles. In a semi-autonomous vehicle, the human driver is always expected to share responsibility for any incident that occurs, while in a fully autonomous vehicle the self-driving software is in control and the human at the wheel is considered an emergency backup. Who is liable when a person is injured by one of these vehicles? In cases involving self-driving vehicles, liability could rest with many parties: the driver, the vehicle's manufacturer, the self-driving software developer, or the suppliers of individual parts. Investigators in these cases will have new types of information available to determine which of these parties is responsible, including video, GPS coordinates, and sensor readings recorded by the vehicle itself.
Accidents that occur due to unforeseen events, such as a deer running into the road or severe weather, will be covered by liability laws as they have been so far. Much of this body of law comes from rulings issued by courts, including the Supreme Court, in cases that have already been decided.
The first recorded pedestrian fatality involving Uber's self-driving car, in Tempe, Arizona, demonstrated that these vehicles can be involved in accidents just as human-driven vehicles can, shattering some perceptions about the safety of the software controlling the vehicle. The vehicle involved, a Volvo XC90 sport utility vehicle, was traveling at 40 mph at the time of the accident and did not appear to have slowed down or detected the woman, even though she was visible in front of the car prior to the impact. If the victim's family were to pursue a civil case, attorneys could potentially make a range of negligence claims. Anyone from the driver to Uber to Volvo to the manufacturer of individual parts could potentially have been held liable had litigation followed the Tempe accident. Earlier this year, prosecutors ruled that the ride-hailing company Uber will not be criminally charged for the fatal self-driving crash. However, the National Transportation Safety Board and the National Highway Traffic Safety Administration are still investigating. The "safety" driver, who was behind the wheel but appears not to have been operating the vehicle, could still face criminal charges. Had there been passengers in the Uber car who were injured in the accident, they would have been entitled to file personal injury claims against the responsible driver.
Nine months after the tragic crash, Uber officially resumed testing its self-driving cars on public roads. Questions about liability in the event of a death involving a self-driving car, however, remain completely unresolved.