What Happens if a Driverless Car Hits You in Charlotte? | Negligence Law in NC

Charlotte is no longer watching the driverless car debate from a distance. What happens if someone gets hurt in an accident? Do the ordinary negligence laws in North Carolina apply? If a “driverless” vehicle hurts or kills someone, who is legally responsible? Who does the plaintiff sue?
Automated cars are officially on the move in Charlotte! As of late February 2026, the city has entered a new phase of high-profile testing.
While the “futuristic” era feels like it just arrived, there have actually been several waves of testing in the Queen City over the last few years. Here is the current status:
1. Waymo (Alphabet/Google) – Current Status: Active Testing
In February 2026, Waymo officially began deploying a fleet of its autonomous vehicles in Charlotte.
- What you’ll see: White Jaguar I-PACE SUVs equipped with spinning sensors (LiDAR).
- The Phase: Currently, they are in the “mapping and data collection” phase. This means there are still human “safety drivers” behind the wheel. They are learning Charlotte’s unique road quirks, like the complex merges in Uptown and “Charlotte’s Most Hated Interchange,” Exit 3A on I-277, before they transition to fully driverless rides.
2. Beep & CASSI Shuttles – Current Status: Pilot Completed
You might remember seeing boxy, slow-moving shuttles at UNC Charlotte.
- The Project: From July to December 2023, NCDOT ran a pilot program called CASSI (Connected Autonomous Shuttle Supporting Innovation).
- The Result: The shuttles operated on a 2.2-mile loop connecting the light rail to campus. While successful as a research project, a 2024 report noted that the technology wasn’t yet “mature” enough to replace standard city buses due to their low speeds.
3. Cruise (General Motors) – Current Status: Paused
Cruise began testing in Charlotte back in 2023. However, after a high-profile incident in San Francisco and subsequent safety reviews, the company paused its operations nationwide. While they have expressed interest in returning, Waymo is currently the primary player on Charlotte streets.
Why Charlotte?
North Carolina is a “friendly” state for this tech because of a 2017 state law, North Carolina General Statute Chapter 20, Article 18, that prevents local cities from banning autonomous vehicles. Companies like Waymo choose Charlotte specifically to test how the cars handle:
- Heavy Southeast traffic patterns.
- The city’s rapid growth and shifting construction zones.
- Regional weather (though they are also testing in Chicago right now to master the snow).
Waymo announced on February 25, 2026, that it intends to bring fully autonomous taxi service to Charlotte. Unless you’ve already seen a car going down the road without a driver, the sight will most certainly unsettle some people. It can frankly take a minute to get used to. If you’ve spent time in San Francisco, after a while, you barely notice. Folks out there have been used to seeing experimental Silicon Valley tech on streets, sidewalks, and in public areas for years. That’s not necessarily the case in a conservative, Southern town like Charlotte, North Carolina.
The rollout is expected to begin with manually driven mapping and data collection before broader autonomous deployment. That makes this a live Charlotte legal issue, not a California thought experiment and not a distant policy debate.
That single development turns an abstract technology discussion into a Mecklenburg County civil liability question. It places a new kind of vehicle on a collision course with old North Carolina doctrines that were built for flesh-and-blood drivers, not software, sensors, remote oversight, and a front seat that may not matter very much at all.
What’s the Law in North Carolina?
North Carolina is not starting from zero.
Since 2017, Article 18 of Chapter 20 has allowed fully autonomous vehicles on public roads if the vehicle complies with federal law, can achieve a minimal risk condition, can satisfy the statute’s post-crash requirements, carries liability insurance meeting North Carolina financial responsibility requirements, and is properly registered as a fully autonomous vehicle. The statute also defines a fully autonomous vehicle as one that will not require an occupant to perform any portion of the dynamic driving task while the automated driving system is engaged. If equipment exists that would allow an occupant to perform that task, the equipment must be stowed or made unusable so the occupant cannot assume control while the system is engaged.
That statutory framework answers one set of questions and leaves the hard ones untouched. Yes, North Carolina permits the technology. Yes, the vehicle must be insured. Yes, the law provides a substitute compliance model for crash reporting and remaining at the scene. But Article 18 does not decide negligence. It does not tell a Mecklenburg County jury who breached a duty of care when software makes the driving decisions. It does not tell the court how contributory negligence and last clear chance fit a vehicle that may have no steering wheel, no pedals, and no licensed human driver.
Driverless Car Accident Charlotte | Not Like Ordinary Wreck Lawsuits
A conventional wreck lawsuit seeking personal injury damages usually begins with a familiar question. What did the driver do wrong? Did the driver speed, fail to yield, text, drift left of center, miss a stopped car, or misjudge a pedestrian in the roadway?
A true Article 18 autonomous vehicle may very well disrupt that model. North Carolina broadly defines the dynamic driving task. It includes steering, acceleration, and deceleration, monitoring the driving environment, object and event detection, classification, response preparation, response execution, maneuver planning, and related control functions.
If the system, rather than the occupant, performs those tasks, the lawsuit starts to migrate away from the human seated inside the vehicle and toward the entity that deployed, programmed, maintained, monitored, or owned the machine.
That does not convert every case into a classic products lawsuit.
A Charlotte plaintiff firm such as Powers Law Firm may still pursue ordinary negligence theories grounded in deployment decisions, maintenance practices, sensor performance, remote oversight, mapping limits, or operation outside the vehicle’s design domain.
The point is simpler. Once the statute says the occupant does not perform the dynamic driving task, the defense cannot casually pretend the rider was just another driver who happened to let the car steer itself.
Driverless Car Accident Charlotte Litigation | Contributory Negligence and Last Clear Chance?
North Carolina still applies contributory negligence. That means a plaintiff’s own negligence can bar recovery. Driverless technology does not appear to repeal that overall doctrine. If a pedestrian steps directly into the roadway, if a cyclist creates an avoidable hazard, or if another motorist cuts across traffic, the defense will still argue plaintiff fault, that is, “contrib.” The automated car (and associated computer program) does not erase the plaintiff’s duty to use reasonable care.
But North Carolina also recognizes something known as “last clear chance,” and that doctrine could become unusually interesting when the defendant is a vehicle marketed as safer, faster, and better at hazard detection than a human driver.
North Carolina courts have long viewed the “last clear chance” rule as a standard part of negligence law. The law requires a last clear chance, not a last possible chance, to avoid the injury.
That distinction might prove decisive, because autonomous vehicle litigation could produce unusually detailed proof about when the system first saw the hazard, how it classified the hazard, whether it predicted the hazard’s path, and whether braking or evasive action remained available. Of course, that assumes the vehicle manufacturer (the defendant) voluntarily turns over such materials and doesn’t allege a “proprietary interest” precluding the disclosure of such information.
A human driver might say, “I never saw the plaintiff until impact.” A fully autonomous vehicle in many, if not most instances, has lidar returns, camera frames, radar data, event logs, object labels, path prediction records, braking commands, and internal timestamps. In the right case, that evidence could sharpen the last-clear-chance analysis in a way that old-fashioned wreck litigation never could.
Everyone knows computers make decisions in thousandths of a second, far more efficiently than humans. Will automated cars be held to human standards or to those of computers?
This is where the beloved, common law prose of the past meets the cold, hard data of the future. It’s the exact friction point between human fallibility and digital perfection.
The legal shift potentially moves from “he said, she said” to “the log files said.” Here is how the standard is likely to evolve:
1. The Death of the “Inattentive” Defense?
In traditional “last clear chance” (LCC) litigation, a defendant often attempts to escape liability by claiming they honestly didn’t see the plaintiff until it was too late.
- The Human Standard: Evaluates what a “reasonable person” would have seen.
- The Computer Standard: If the LiDAR “saw” the pedestrian 4.2 seconds before impact, but the braking command wasn’t sent until 0.5 seconds before impact, the “last clear chance” is no longer an abstract theory. It is a mathematical certainty.
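The arithmetic behind that “mathematical certainty” is worth making concrete. The sketch below uses the hypothetical numbers from the example above (4.2 seconds of detection, a braking command at 0.5 seconds, and an assumed speed of roughly 30 mph); the function names, the 7 m/s² deceleration figure, and the speed are illustrative assumptions, not values from any real vehicle log.

```python
# Illustrative only: how logged sensor and actuation timestamps could
# quantify a "last clear chance" window. All numbers are hypothetical.

def reaction_window(detection_s_before_impact, command_s_before_impact):
    """Seconds between first hazard detection and the braking command."""
    return detection_s_before_impact - command_s_before_impact

def stopping_distance_m(speed_mps, decel_mps2=7.0):
    """Approximate stopping distance assuming constant deceleration
    (7 m/s^2 is a commonly cited dry-pavement figure)."""
    return speed_mps ** 2 / (2 * decel_mps2)

detected = 4.2   # LiDAR classified "pedestrian" 4.2 s before impact
braked = 0.5     # braking command issued only 0.5 s before impact
speed = 13.4     # ~30 mph expressed in meters per second

window = reaction_window(detected, braked)     # unused reaction time, in seconds
travel = speed * detected                      # distance covered after detection
needed = stopping_distance_m(speed)            # distance required to stop

print(f"Unused reaction window: {window:.1f} s")
print(f"Distance traveled after detection: {travel:.1f} m")
print(f"Distance needed to stop: {needed:.1f} m")
```

On those assumed facts, the vehicle had roughly 3.7 seconds and more than four times the distance it needed to stop, which is the kind of simple calculation a jury could follow.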
2. “Reasonable Person” vs. “Reasonable Robot”
Courts will be called to decide whether to apply Product Liability (the car failed to perform) or Negligence (the car failed to act as a “reasonable driver”).
- If we hold a computer to a Human Standard, do we give it a “pass” for slow reactions that it is physically incapable of having?
- If we hold it to a Computer Standard, we are essentially demanding strict liability. If the sensor data shows a path to avoid the crash existed and the computer didn’t take it, the “chance” was there. Does the “last clear chance” doctrine kick in?
| Feature | Old-Fashioned Wreck | Autonomous Vehicle Litigation |
| --- | --- | --- |
| Reaction Time | Estimated (approx. 1.5 s) | Logged to the microsecond |
| Visibility | Witness testimony/lighting | LiDAR intensity & camera frames |
| Intent | Unknown | Object labels (e.g., “Pedestrian: 98% confidence”) |
| Last Clear Chance Analysis | “Should he have seen them?” | “The sensor did see them; why didn’t it brake?” |
The Verdict?
We could be moving toward a “Best-in-Class” Standard. If a human driver is expected to be “reasonable,” an automated car very well might be expected to perform as well as the most capable version of its software allows. A computer can’t claim it “panicked” or “got distracted by a billboard” or “was tired.”
A jury may be asked not just whether the defendant should have seen the danger, but whether the machine in fact saw it, recognized it, calculated it, and still failed to avoid it.
That does not mean every plaintiff reaches the jury on a last clear chance legal theory. Physics still matters. Visibility still matters. Road conditions still matter. Reaction windows still matter. But the doctrine may take on fresh force in a driverless case because the evidence may show far more about perception and response than you would get from a human memory and a few skid marks.
The Occupant Is Not Automatically the Driver
One of the more thought-provoking aspects of Article 18 is its handling of the occupant. The statute says an operator of a fully autonomous vehicle with the system engaged does not need a driver’s license. It also says an operator does not include an occupant performing solely strategic driving functions, such as trip scheduling or destination selection. That language likely matters because it separates choosing where the vehicle goes from performing the acts that actually drive it.
That does not answer every case. Article 18 also has a companion provision, G.S. 20-402, for vehicles that can perform the entire dynamic driving task but still expect a human operator to respond to a request to intervene. That type of vehicle looks different. If the system expects a human takeover and the safety operator ignores the request, the lawsuit may look much closer to a traditional negligence case.
The first practical question in any Charlotte autonomous vehicle case may therefore be basic but decisive. Was this a true fully autonomous vehicle under Article 18, or a vehicle that still relied on a human operator to rescue the system?
In a true fully autonomous vehicle, Article 18 cuts against the lazy argument that the rider should have grabbed the wheel or stomped the brakes.
The statute says the vehicle cannot require an occupant to perform any portion of the dynamic driving task, and if manual controls exist, they must be stowed or made unusable while the system is engaged.
That does not create blanket immunity for everyone seated inside. A rider who tampers with equipment, obstructs sensors, disables safety functions, or interferes with vehicle operation may still create a negligence issue. But the statute does not support treating the ordinary rider as the default tortfeasor merely because a seat was occupied.
“A Person May Operate a Fully Autonomous Vehicle” Is More Interesting Than It Looks
Article 18 says, “A person may operate a fully autonomous vehicle” if statutory conditions are met, and the definition section says an operator “is a person as defined in G.S. 20-4.01.”
That wording invites a real drafting question. It may have been written to accommodate a corporate fleet model. It may have been written with a testing-stage operator in mind. It may reflect a background definitional assumption. But the text does not cleanly resolve that issue on its face, as a careful litigator would want.
That uncertainty matters because civil litigation turns on who owed the relevant duty and who had operational responsibility.
The same statute places responsibility for moving violations on the registered owner of the fully autonomous vehicle. It also provides that, after a crash, “the vehicle or the operator” may promptly contact law enforcement and seek medical assistance, and the vehicle may remain at the scene until registration and insurance information is provided. That language suggests the General Assembly understood that responsibility might sit with something other than a licensed human sitting behind a wheel. It still stopped short of writing a complete tort map.
Insurance Exists, but Insurance Does Not Answer Fault
Article 18 requires a motor vehicle liability policy meeting N.C.G.S. 20-279.21. That helps with collectability and road legality. It does not answer breach, causation, or defenses. Insurance is a funding mechanism. Negligence doctrine still decides who pays and why.
That is why this subject matters for a Charlotte personal injury practice. A driverless vehicle case will not stop at “there was coverage.” The fight will likely move into operational logs, remote supervision, mapping history, software updates, event data, vehicle classification, human-intervention expectations, and the practical question whether this collision was avoidable under North Carolina negligence law. The existence of a policy gets you into the building. It does not win the case.
Suing a Large Company Is Less Mystical Than People Think, but Proving the Case Is Hard
If a company is properly registered to do business here, service of process ordinarily begins with the registered agent. North Carolina law also provides a backup path. If the entity fails to maintain a registered agent, the agent cannot with due diligence be found, or the Secretary of State revokes the relevant authority, the Secretary of State becomes the entity’s agent for service. So service of process is not the part of the case that should intimidate plaintiff’s counsel.
The hard part is elsewhere. Identifying the correct defendant or defendants can be complicated. Preserving telemetry and backend data will likely become standard practice, with counsel issuing a “spoliation letter” early on. Removal fights and forum disputes may appear quickly. Protective-order battles over proprietary software can become expensive. Experts will matter. Discovery will matter. The proof problem, not the summons, is where these cases will likely become demanding.
Charlotte Automated Car Accident FAQ
Who can be sued if a driverless car causes an accident in Charlotte?
Under Article 18 of Chapter 20 of the North Carolina General Statutes, the registered owner of a fully autonomous vehicle is generally responsible for moving violations. However, a civil lawsuit for damages might also name the software developer, the manufacturer, or the company overseeing remote fleet operations. Determining the correct defendant deserves a detailed review of the vehicle’s operational logs.
What if I was partly at fault for the accident?
North Carolina follows a contributory negligence model. If you are even 1% at fault for the accident, you could be barred from recovery. That is why the “last clear chance” doctrine might become important in automated vehicle accident lawsuits in North Carolina. If the vehicle’s sensors detected you but the software failed to respond, the vehicle’s own data could show that the machine, not you, had the last clear chance to avoid the collision.
How is evidence preserved after an automated car accident?
Autonomous vehicles may record a substantial amount of telemetry, including LiDAR returns and path prediction logs. Preserving that evidence after an automated car accident in North Carolina may require an immediate legal “spoliation letter” to help ensure the company does not overwrite or delete the records of the crash.
What should I do immediately after a wreck with an automated vehicle?
The immediate steps after an accident with an automated vehicle are similar to any other wreck. You should call 911 and ensure a police report is filed. North Carolina law requires the “operator” or the vehicle itself to remain at the scene and exchange insurance information. In these types of insurance claims, preserving the digital evidence may become important. Your accident lawyer may believe it a good idea to send a letter to the fleet owner requesting that the LiDAR and camera data from the moments before the crash not be deleted.
Is the person riding in the vehicle considered the driver?
Under G.S. 20-401, an occupant of a fully autonomous vehicle is generally not considered the driver and is not required to have a license if the system is fully engaged. However, if the vehicle requires a human to “intervene” or take over in an emergency, the occupant could still face legal liability. The specific classification of the vehicle under North Carolina law could help determine who is responsible for the crash.
Can a pedestrian hit by a driverless car recover damages?
North Carolina is a contributory negligence state. If a pedestrian is found even slightly at fault, they may be barred from recovering damages. This is where the data from the vehicle could become central to the case. If the car’s sensors detected the pedestrian well in advance but the software failed to brake, the doctrine of last clear chance might allow the injured plaintiff to recover damages despite any initial fault.
Do driverless cars carry insurance in North Carolina?
Under North Carolina law, Chapter 20, Article 18 requires every fully autonomous vehicle to carry a liability policy that meets North Carolina financial responsibility requirements. That insurance covers the vehicle regardless of whether a human was behind the wheel. The policy exists to provide a path for recovery, but it does not necessarily settle the question of who was at fault for the accident.
How is the evidence different from a typical car accident?
In a typical car accident, we traditionally rely on memory and skid marks. In an autonomous vehicle crash, plaintiffs may have access to the vehicle’s “perception.” We may be able to see exactly what the car “saw” and how it “labeled” objects in the road. That type of evidence at trial could serve to show that the vehicle had a clear opportunity to avoid a collision, even if a human driver might have missed it. This will most certainly be a developing area of personal injury law and litigation in North Carolina. Have questions? Call Bill Powers at Powers Law Firm.
Charlotte Automated Car Litigation | A Developing Legal Landscape
The arrival of automated vehicles in Charlotte presents a new set of questions for our courts. While the technology is advanced, the legal framework remains unsettled. We are moving into a period where digital logs may supplement or even replace traditional witness testimony.
North Carolina follows the rule of contributory negligence. This means that if a plaintiff is found even slightly at fault, they may be unable to recover damages. The last clear chance doctrine has long served as a check on that outcome. In cases involving automated systems, this doctrine will likely focus on what the vehicle sensors actually detected. A computer does not have the same reaction time delays as a person. It processes data in milliseconds. This might change how a jury looks at whether a crash could have been avoided.
At Powers Law Firm PA, we focus on the facts of each case. We know that the law in this area is not yet decided. It will take time for the courts to determine how old rules apply to new machines. We take a measured approach to these cases because the details matter.
If you have questions about an accident involving an automated vehicle, it would be an honor to help guide you through the claims process. We help people work through difficult legal situations one step at a time.
Call Powers Law Firm at 704-342-4357 to speak with a Charlotte accident lawyer about the specifics of your legal matter and potential insurance claim for damages.
