NC Accident Help

Tesla Autopilot & ADAS Accidents in NC

ADAS accidents involving Tesla Autopilot, Ford BlueCruise, and GM Super Cruise in NC -- who is liable, how fault works, and what data matters for your case.

9 min read

The Bottom Line

ADAS (Advanced Driver Assistance Systems) like Tesla Autopilot, Ford BlueCruise, and GM Super Cruise are involved in a growing number of accidents in North Carolina. Under NC law, the driver is ALWAYS legally responsible -- these are Level 2 systems that require constant human supervision, and no currently available system shifts legal liability from the driver to the manufacturer. However, if the system itself was defective -- phantom braking, failure to detect a stopped vehicle, or misleading marketing -- you may also have a product liability claim against the manufacturer.

What Is ADAS?

Advanced Driver Assistance Systems (ADAS) are technology packages built into modern vehicles that assist with driving tasks like steering, braking, and lane-keeping. The most well-known systems include:

  • Tesla Autopilot / Full Self-Driving (FSD) -- uses cameras and neural networks to steer, accelerate, and brake
  • Ford BlueCruise -- hands-free highway driving on pre-mapped roads
  • GM Super Cruise -- hands-free highway driving with driver attention monitoring
  • Honda Sensing -- adaptive cruise control, lane-keeping assist, collision mitigation
  • Toyota Safety Sense -- pre-collision system, lane departure alert, adaptive cruise control

Despite names like "Autopilot" and "Full Self-Driving," every one of these systems is classified as SAE Level 2 autonomy. That classification means one critical thing: the human driver must remain in control at all times.

Level 2 Autonomy: The Driver Is Always Responsible

The Society of Automotive Engineers (SAE) defines six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation with no human input needed). Virtually every ADAS system currently sold to consumers in the United States operates at Level 2 -- meaning the system can assist with steering and speed, but the driver must:

  • Keep hands on the steering wheel (or be ready to take over immediately)
  • Keep eyes on the road at all times
  • Be prepared to intervene if the system fails or encounters a situation it cannot handle

Under North Carolina law, the driver is responsible for operating the vehicle safely. There is no exception for using an ADAS feature. If you are driving with Tesla Autopilot engaged and cause an accident, you are liable under NC's standard negligence analysis -- the same as if you had been driving manually.

How ADAS Systems Fail

ADAS technology is not perfect. These systems fail in predictable and well-documented ways:

Phantom Braking

Phantom braking occurs when the system suddenly applies the brakes even though there is no actual obstacle ahead. The sensors or cameras misinterpret shadows, overpasses, road signs, or changes in lighting as an imminent hazard. At highway speeds, this sudden, unexpected deceleration can cause the vehicle behind you to rear-end you.

NHTSA has received thousands of complaints about phantom braking in Tesla vehicles specifically. This is one of the most common ADAS failures and one of the most dangerous.

Failure to Detect Stopped Vehicles

Multiple ADAS systems -- particularly Tesla Autopilot -- have been documented failing to detect stationary vehicles, emergency vehicles with lights flashing, or construction equipment on the road. The system may be designed primarily to track moving objects and can struggle with vehicles that are stopped or moving very slowly. Rear-end collisions with stopped emergency vehicles on highway shoulders have been a recurring pattern in NHTSA investigations.

Lane Departure Into Oncoming Traffic

ADAS lane-keeping systems can fail at curves, lane merges, unclear lane markings, or construction zones where lane lines are missing or contradictory. When the system loses track of lane boundaries, the vehicle may drift across the center line into oncoming traffic -- one of the most dangerous collision types.

Failure to See Pedestrians and Cyclists

ADAS systems have documented difficulty detecting pedestrians, cyclists, and motorcyclists -- particularly at night, in poor weather, or when the person is not in a crosswalk. The system may not recognize a human body as an obstacle, or may detect them too late to stop.

Who Is Liable for an ADAS Accident in NC?

ADAS accidents can involve multiple potentially liable parties:

The Driver (Always, Under NC Law)

Under NC's existing legal framework, the driver is responsible for the safe operation of the vehicle. Using Autopilot or any other ADAS feature does not shift that responsibility. If you caused the accident while relying on ADAS, you are liable to the other driver.

The Manufacturer (Product Liability)

If the ADAS system itself was defective, the vehicle manufacturer may be liable under NC product liability law. There are three theories:

  • Design defect -- the ADAS system was inherently unsafe as designed. For example, if the system's architecture makes phantom braking inevitable, or if it cannot reliably detect stopped vehicles at highway speeds, that is a design defect
  • Failure to warn -- the manufacturer did not adequately warn drivers about the system's limitations. If the owner's manual buries critical warnings about situations where ADAS will fail, or if the dashboard warnings are insufficient, this may be a failure-to-warn claim
  • Marketing defect -- the manufacturer's advertising or naming implied capabilities the system does not have. Calling a Level 2 system "Full Self-Driving" or "Autopilot" may create expectations that the car drives itself -- expectations the manufacturer knew were false

NHTSA Investigations as Evidence

Tesla Autopilot has been the subject of multiple federal investigations and recall orders by the National Highway Traffic Safety Administration. NHTSA investigated Tesla over concerns about crashes into emergency vehicles, phantom braking, and the effectiveness of driver monitoring systems. In 2023, NHTSA required Tesla to recall over 2 million vehicles to update Autopilot's driver attention monitoring.

These investigations and recalls can be powerful evidence in your case. If NHTSA found that the system had safety defects, that finding supports a product liability claim against the manufacturer.

Product liability claims in North Carolina are governed by N.C. Gen. Stat. § 99B-1 et seq.

The "Over-Reliance" Defense

When you file a product liability claim against an ADAS manufacturer, expect this argument: "You should not have relied on the system."

The manufacturer and its insurance company will argue that the owner's manual clearly states the system requires driver supervision, that dashboard warnings told you to keep your hands on the wheel, and that you chose to rely on a system you knew had limitations.

This is a real defense, and it has merit in some cases. But there is a powerful counter-argument: the manufacturer sold you the system. They marketed it as making driving safer. They named it "Autopilot" or "Full Self-Driving." They designed the user experience to encourage reliance -- the car steers itself, brakes itself, and changes lanes itself. If the manufacturer creates a system that encourages over-reliance through its design and marketing, and then blames the driver for relying on it, that is itself a product liability issue.

Dashcam and Vehicle Data: Critical Evidence

Tesla vehicles -- and increasingly, other modern vehicles with ADAS -- record extensive data that becomes critical evidence in accident cases:

  • Speed, acceleration, and braking data -- exactly how fast you were going and when the brakes were applied
  • Steering input -- whether the driver or the ADAS system was controlling the vehicle
  • Autopilot/ADAS status -- whether the system was engaged, disengaging, or requesting driver takeover
  • Camera feeds -- Tesla vehicles have multiple cameras recording the road and the cabin
  • Driver attention data -- whether the driver's hands were on the wheel and (in some systems) whether the driver was looking at the road

This data can prove what the ADAS system was doing at the moment of the crash, whether it failed to detect a hazard, and whether the driver was paying attention. Send a spoliation letter to the manufacturer immediately to prevent this data from being overwritten or deleted.

NC Contributory Negligence and ADAS Accidents

NC's contributory negligence rule creates a particular risk in ADAS accident cases. If you were not paying attention while Autopilot was engaged -- checking your phone, looking at the touchscreen, or simply not watching the road -- the other driver's insurance company (or the manufacturer in a product liability case) will argue you were contributorily negligent.

Under NC law, if you were even 1% at fault, your entire claim can be barred. In an ADAS accident, the question is whether a reasonably prudent driver would have been monitoring the road and ready to take over. If the evidence shows you were distracted, that can destroy your claim -- even if the ADAS system also failed.

The last clear chance doctrine may provide an exception in some cases. If the other driver saw you coming and had the opportunity to avoid the collision but did not, they may still be liable despite your contributory negligence.

The Future: More ADAS, More Cases

ADAS technology is becoming standard equipment in new vehicles. Every major manufacturer now offers some level of driver assistance, and the technology is rapidly expanding from highways to city streets. As more vehicles with ADAS hit NC roads, accidents involving these systems will become increasingly common.

The legal framework will eventually catch up. NC may pass specific autonomous vehicle legislation. Federal regulations may evolve. But for now, the rules are clear: the driver is responsible, the manufacturer may also be liable if the system was defective, and NC's contributory negligence rule applies to both sides of the case.

Frequently Asked Questions

Am I still liable if Tesla Autopilot was driving when the accident happened in NC?

Yes. North Carolina law requires the driver to maintain control of the vehicle at all times. Tesla Autopilot, Ford BlueCruise, GM Super Cruise, and virtually every other system currently available to consumers are classified as Level 2 autonomy -- meaning the driver must keep hands on the wheel and eyes on the road. If you cause an accident while using any ADAS feature, you are legally responsible under NC law. You may have a separate product liability claim against the manufacturer if the system malfunctioned, but that does not remove your liability to the other driver.

What is phantom braking and can it cause an accident in NC?

Phantom braking occurs when an ADAS system -- most commonly Tesla Autopilot -- suddenly applies the brakes when there is no actual obstacle ahead. The system's sensors or cameras misinterpret shadows, overpasses, road signs, or other visual inputs as an imminent collision. This sudden, unexpected deceleration at highway speeds can cause the vehicle behind you to rear-end you. NHTSA has investigated thousands of phantom braking complaints, and this data may support a product liability claim if phantom braking caused your accident.

Can I sue Tesla if Autopilot caused my accident in NC?

Potentially yes. If Tesla Autopilot had a design defect, failed to detect a hazard it should have detected, or if Tesla's marketing led you to over-rely on the system, you may have a product liability claim under NC law. However, NC uses strict contributory negligence -- if Tesla can argue you were even 1% at fault (for example, by not paying attention while Autopilot was engaged), they may try to defeat your claim entirely. These cases are complex and typically require expert analysis of the vehicle's data logs.

How do I get the data from my Tesla after an accident in NC?

Tesla vehicles record extensive data including speed, steering input, Autopilot status, brake application, camera feeds, and whether the driver's hands were on the wheel. You or your attorney should send a spoliation letter to Tesla immediately after the accident, demanding they preserve all data from your vehicle. Tesla can remotely access this data, and it may be overwritten if not preserved quickly. You can also request your vehicle's event data recorder (EDR) data through a qualified technician with the appropriate tools.