HOT TOPICS – Baby You Can Drive My Car: Product Liability Exposure in the Age of Self-Driving Vehicles

By Kenneth A. Krajewski and Neil L. Sambursky.

“The intersection of law, politics, and technology is going to force a lot of good thinking.”

– Bill Gates.

As the world continues to make technological leaps, the law must grapple with the effects of innovation. New technology enabling cars to exercise judgment and control will force vehicle manufacturers to consider its impact on their exposure to product liability lawsuits. As self-driving vehicles become more accessible to the average consumer, liability questions will increasingly turn on autonomous vehicles’ hidden defects.

Self-driving vehicles have been involved in crashes since the release of early models in 2016. While product liability for vehicles is already an established area of law, the manufacturing and design of automated vehicles expose manufacturers to product liability claims more broadly than non-automated vehicles have in the past. Where the primary legal concern in a car accident was once the driver’s negligence, with the introduction of self-driving vehicles onto our roads, we may now find ourselves asking car manufacturers to bear the blame for crashes.

Though there have not been many reported crashes involving self-driving vehicles, there are still potential legal risks that manufacturers and sellers may face. As we inevitably continue to shift towards automation, courts will need to adapt to a new wave of product liability claims in car accidents stemming from software and manufacturing defects of the intricate systems used to power self-driving vehicles.

In the past, most product liability lawsuits involving vehicles arose out of defects in airbags, batteries, and brakes; now the concern extends to the vehicles’ software and hardware systems.

In April, the family of a pedestrian killed by an automated vehicle filed a lawsuit in federal court in California. The lawsuit alleges that the manufacturer is liable for “defective design, failing to warn of the alleged defects of its autopilot technology and driver assistance features, and negligence.” In previous lawsuits, manufacturers have deflected blame to the vehicle’s driver, arguing that the driver bears full responsibility for controlling the vehicle even when the autopilot is engaged. The manufacturer will likely raise the same argument here. Regardless of the manufacturer’s position, the autopilot system will be carefully scrutinized to determine whether a defect or deficiency in its capabilities prevented it from driving the vehicle safely.

The heart of the self-driving vehicle’s technology is its ability to “see” its surroundings, mainly through its LiDAR system. LiDAR, which stands for light detection and ranging, uses infrared laser pulses to build a three-dimensional model of the environment surrounding the car. LiDAR and the vehicle’s other control systems, including radar, sensors, software, and onboard cameras, are all coordinated by onboard computers that make the driving decisions.1

With automation playing a larger role in the “decision-making,” the focus will shift to the software controlling the vehicles. The novelty of this application of LiDAR lends itself to scrutiny.2 A driver’s dependence on these systems for safe driving, collision avoidance, and sensing sudden risks increases litigation risk. Many hope that extensive testing and data analysis have weeded out any defects the system may suffer.

Automation also raises the concern of data security. There have been a few notable reports of hackers gaining control of non-self-driving vehicles and shutting them down while passengers were in the car. Although no such incident involving a self-driving car has yet been reported, the technology used by self-driving vehicles lends itself to the same threat and exposure, especially when vehicles share the same network. A shared network allows city-wide traffic management to direct driverless cars on which routes to take to manage traffic flow across a city. In 2019, Georgia Tech and Multiscale Systems, Inc. investigated the ‘cyber-physical’ risks of hacked Internet-connected vehicles.3 By studying traffic patterns in New York City, they found that even a small-scale hack affecting 10-20% of vehicles could immobilize half the city. One mitigation for this threat is smaller networks: while smaller networks in theory reduce efficiency, they also reduce the ease with which a hack can occur.

Case law and consumer analysis are limited in the area of automated vehicle liability. Still, we can expect that many lawsuits will center on the defects and risks associated with the use of smart technologies. As companies become more innovative with the technology they use, they must also consider the consumer risks associated with building a self-controlling vehicle.

1. LiDAR is only one piece of the puzzle. If it were only LiDAR, it would be easy. Radar and cameras run most modern cars’ emergency braking, pedestrian detection, and collision avoidance systems. Adaptive cruise control is typically run by radar. Blind-spot monitoring – the light indicator on your side mirror – that’s radar too. Park assist – that’s cameras and ultrasonic sensors. Tesla’s “Autopilot”™ uses eight cameras to provide 360-degree visibility, while twelve ultrasonic sensors and a front-facing radar work to analyze the vehicle’s surroundings.

2. LiDAR itself is decades-old technology. State troopers have long used LiDAR alongside radar to measure speed. It is this automotive application of LiDAR that is new.

3. This is your meat and potatoes. This limitation is why driverless cars aren’t being mass-produced. Mass production of driverless cars will need 5G technology, because cars are going to need to talk to each other. Think of it this way – if I’m driving a tractor-trailer at 65 mph on the Thruway, and I see a deer about to jump in front of me, LiDAR makes me swerve to get out of its way. You are following me – at 65 mph. That’s about 95 feet per second. My tractor is about 20 feet long. My trailer is 55 feet long. I swerve – you have less than one second to react and swerve too. Even LiDAR can’t do that safely. So my tractor and your car have to communicate: your vehicle has to know what my tractor is seeing, so it has time to react safely. The absence of this communication is part of why Tesla’s Autopilot system is alleged to be responsible for the accident in Mountain View, CA, another in Delray Beach, FL, another in Connecticut, and the latest in Taiwan. It isn’t just that the sensors didn’t “see” what the car hit; it’s that onboard sensing alone isn’t enough. Cars have to talk to each other, so my car can know what your car is sensing and vice versa. Autonomous vehicles won’t be for real until then.
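The one-second figure in the footnote above can be checked with a short calculation. This is a sketch only: the truck’s dimensions and speed come from the footnote, while the 20-foot following gap is an assumed value added for illustration.

```python
# Convert the footnote's highway speed to feet per second.
MPH_TO_FPS = 5280 / 3600          # feet per mile divided by seconds per hour
speed_fps = 65 * MPH_TO_FPS       # roughly 95.3 ft/s, matching the footnote

# Distance available to the trailing car once the truck swerves:
# the truck's length plus the following gap (the gap is an assumption).
tractor_ft = 20                   # tractor length from the footnote
trailer_ft = 55                   # trailer length from the footnote
following_gap_ft = 20             # hypothetical close-following gap
distance_ft = tractor_ft + trailer_ft + following_gap_ft

reaction_window_s = distance_ft / speed_fps
print(f"{speed_fps:.1f} ft/s leaves {reaction_window_s:.2f} s to react")
```

Under these assumptions the trailing driver has just under one second to perceive and swerve, less than the human perception-reaction times commonly cited in highway design, which is the footnote’s point: onboard sensing alone leaves no margin unless the vehicles share what they see.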

Should you have any questions, please call our office at (914) 703-6300 or contact:

Jeffrey T. Miller, Executive Partner

Kenneth A. Krajewski, Partner

Neil L. Sambursky, Partner