Self-Driving Accident: Who Is Liable?
Autonomous vehicle accidents in NC raise complex liability questions involving product liability, software defects, contributory negligence, and data preservation.
The Bottom Line
If you are involved in an accident with a self-driving car in NC, the liability question is far more complex than a typical crash. North Carolina has no specific autonomous vehicle legislation, so existing negligence and product liability laws apply -- but they were not written for this technology. Multiple parties may be liable, including the manufacturer, software developer, and vehicle owner. Preserving the vehicle's electronic data is critical, and you should consult an attorney experienced in both personal injury and product liability.
A Legal Framework That Has Not Caught Up
Autonomous vehicles are already on North Carolina roads. Tesla vehicles with Autopilot and Full Self-Driving (FSD) capabilities are common across the state. Other companies are testing and deploying increasingly autonomous systems. But NC's legal framework has not kept pace.
As of 2026, North Carolina has no specific autonomous vehicle legislation. The NC General Assembly has considered multiple bills addressing self-driving vehicles, but none have been enacted. This means that when an autonomous vehicle causes an accident in NC, the case is analyzed under laws written decades ago for human drivers -- laws that do not address critical questions about algorithmic decision-making, sensor failures, and software updates.
This legal gap creates uncertainty for everyone involved -- accident victims, vehicle owners, and manufacturers alike.
Who Could Be Liable?
Unlike a typical car accident where fault usually falls on one or two human drivers, an autonomous vehicle accident can involve multiple potential defendants.
The Vehicle Manufacturer
If the autonomous driving system had a defect that caused or contributed to the accident, the manufacturer may be liable under NC product liability law. There are three theories:
- Design defect -- the autonomous system was inherently dangerous in its design, even when working as intended. For example, if the system cannot reliably detect pedestrians in certain lighting conditions, that is a design problem.
- Manufacturing defect -- a specific vehicle's system deviated from the manufacturer's design. A faulty sensor, improperly calibrated camera, or defective processor could cause the system to malfunction in a way the design did not intend.
- Failure to warn -- the manufacturer did not adequately warn owners about the system's limitations. If a manufacturer markets a system as "Full Self-Driving" but the system cannot handle certain road conditions, failing to clearly communicate those limitations could be actionable.
The Software Developer
In some cases, the software powering the autonomous system is developed by a company other than the vehicle manufacturer. If the driving algorithm made a flawed decision -- such as failing to brake for a stopped vehicle or misinterpreting a traffic signal -- the software developer may bear liability independent of the vehicle manufacturer.
The Vehicle Owner or Operator
If the vehicle owner was supposed to be monitoring the autonomous system and failed to intervene when the system needed human input, the owner may be partially or fully liable. This is especially relevant for SAE Level 2 systems (like Tesla's Autopilot), which require the human driver to remain attentive and ready to take over at all times.
The "Safety Driver"
Some autonomous vehicle testing programs use safety drivers -- human operators who sit behind the wheel and are supposed to intervene if the system fails. If a safety driver was not paying attention or failed to react when the system encountered a situation it could not handle, the safety driver (and their employer) may be liable.
NC's Contributory Negligence Problem
North Carolina's contributory negligence rule creates unique challenges in autonomous vehicle cases.
If you were hit by an autonomous vehicle: Contributory negligence still applies. If you were jaywalking, crossing against a signal, distracted on your phone, or otherwise partially at fault when the self-driving car struck you, your claim could be completely barred -- regardless of how badly the autonomous system malfunctioned.
If you were inside the autonomous vehicle: If you were the operator of a Level 2 system and failed to take over when the system alerted you (or when a reasonable driver would have recognized the danger), the manufacturer may argue you were contributorily negligent. Your failure to monitor the system could bar your claim against the manufacturer.
Data Is Everything
Autonomous vehicles generate and store massive amounts of data. A single Tesla, for example, continuously records camera feeds, radar and ultrasonic sensor readings, GPS coordinates, vehicle speed, steering inputs, brake applications, and the decisions the Autopilot system was making in real time.
This data is the most important evidence in an autonomous vehicle accident case. It can show:
- What the system "saw" in the moments before the crash
- What decisions the software made and why
- Whether the system alerted the driver to take over
- Whether the driver responded to alerts
- The exact speed, steering, and braking inputs at the time of the accident
Preserving the Data
The critical step after an autonomous vehicle accident is preserving this data before it is overwritten or deleted. Your attorney should immediately send a spoliation letter to:
- The vehicle manufacturer (Tesla, Waymo, GM/Cruise, etc.)
- The vehicle owner
- Any third-party software provider
A spoliation letter is a formal demand to preserve all evidence related to the accident, including all electronic data from the vehicle. If a party destroys evidence after receiving a spoliation letter, courts can impose severe sanctions -- including adverse inference instructions that tell the jury to assume the destroyed evidence was unfavorable.
Insurance: Whose Policy Covers the Crash?
Insurance coverage in autonomous vehicle accidents depends on the specific circumstances.
Personally owned autonomous vehicle (e.g., Tesla with Autopilot): The owner's personal auto insurance policy responds first, just as it would in any other car accident. The owner's liability coverage pays for damage they cause to others. If the autonomous system malfunctioned, the owner may then pursue a product liability claim against the manufacturer -- but the owner's insurance is the initial coverage.
Commercially operated autonomous vehicle (e.g., Waymo, Cruise): These companies carry commercial auto insurance with high liability limits -- typically in the millions. The commercial policy is the primary coverage for accidents involving their vehicles.
Product liability insurance: If the manufacturer is found liable for a defective autonomous system, the manufacturer's product liability insurance responds. This is separate from auto insurance and is relevant when the claim is based on a defect in the vehicle rather than driver negligence.
Federal vs. State Regulation
The regulatory landscape for autonomous vehicles is split between federal and state authority:
- Federal (NHTSA) regulates vehicle safety standards -- including the safety of autonomous driving systems, recalls, and crash reporting requirements
- State (NC) regulates drivers, driver licensing, insurance requirements, and traffic laws
Because NC has not passed autonomous vehicle legislation, there is no state-level framework specifically addressing how self-driving cars should be registered, insured, or operated. Federal safety recalls and investigations by NHTSA (such as those into Tesla Autopilot crashes) can provide evidence for individual accident claims in NC, but they do not change the state-level legal analysis.
This Area of Law Is Evolving Rapidly
Autonomous vehicle accident law is one of the fastest-evolving areas of personal injury and product liability law in the country. Courts across the US are handling these cases with increasing frequency, and each decision shapes the legal landscape for future claims.
If you are involved in an accident with a self-driving car in NC -- whether you were hit by one, were a passenger in one, or were the operator -- the complexity of the legal issues demands an attorney experienced in both personal injury and product liability. Standard car accident attorneys may not have the technical knowledge to handle the data preservation, engineering analysis, and product defect arguments that these cases require.
Frequently Asked Questions
Does North Carolina have specific autonomous vehicle laws?
As of 2026, North Carolina does not have specific legislation governing autonomous vehicles. The NC General Assembly has considered bills addressing self-driving cars, but none have been enacted into law. This means NC's existing negligence, product liability, and insurance laws apply to autonomous vehicle accidents -- laws that were written for human drivers and do not neatly address the unique issues these vehicles present.
Who is liable when a self-driving car causes an accident in NC?
Potentially multiple parties. The vehicle manufacturer may be liable under product liability if the autonomous system had a design defect, manufacturing defect, or inadequate warnings. The software developer may be liable if the driving algorithm failed. The vehicle owner may be liable if they were supposed to be monitoring the system and failed to intervene. The specific liability depends on the level of autonomy, what went wrong, and who had control at the time of the crash.
Does contributory negligence apply to autonomous vehicle accidents in NC?
Yes. NC's strict contributory negligence rule applies to all negligence claims, including those involving autonomous vehicles. If you were hit by a self-driving car but were jaywalking, distracted, or otherwise partially at fault, your claim could be completely barred. If you were in a self-driving car and failed to take over when the system alerted you, that could also be used against you.
How do I preserve evidence after an autonomous vehicle accident in NC?
Autonomous vehicles record massive amounts of data -- camera feeds, sensor logs, radar and lidar readings, and the decisions the system's software was making at the time of the crash. This data is critical evidence. Your attorney should send a spoliation letter immediately to the vehicle manufacturer and owner, demanding they preserve all electronic data from the vehicle. If this data is deleted or overwritten, it may be impossible to prove what the autonomous system did or failed to do.