A mass shooting. A driverless car blocking an ambulance. And a city left scrambling to answer a question that legal and safety frameworks were simply never built to address: who is responsible when autonomous technology gets in the way of saving lives?
That question took center stage this week at an Austin City Council committee meeting, where Council Member Zo Qadri presented bystander video from the March 1 shooting at Buford’s Backyard Beer Garden on West Sixth Street — an incident that killed three people and injured more than a dozen others. The footage showed a Waymo autonomous vehicle blocking an ambulance responding to the scene. An Austin Police Department officer had to manually move the car out of the way, costing precious minutes at a mass casualty event.
Officials said the delay did not critically compromise the response — Austin Police Chief Lisa Davis confirmed officers and paramedics arrived within 57 seconds of the call — but the incident exposed a gap that attorneys, lawmakers, and safety advocates are only beginning to reckon with.
A City Used as a Testing Ground
Austin has become one of the most active autonomous vehicle testing corridors in the country. Companies including Waymo, Tesla, and Avride operate self-driving vehicles on city streets, often without a human safety driver behind the wheel. Texas law has broadly permitted autonomous vehicle testing and commercial operation since 2017, making the state one of the least restrictive regulatory environments in the nation.
That permissiveness has invited innovation — and complications. At Wednesday’s council meeting, Austin first responders raised several distinct problems they’ve encountered with AVs in the field: autonomous vehicles driving through flooded roadways where human judgment would override sensors, cars crossing railroad barricades, and most critically, vehicles that disregard hand signals from officers directing traffic. These are not edge cases. They are documented, recurring failures of current autonomous systems to operate safely in dynamic, human-directed emergency environments.
Waymo representatives did not appear at the meeting. The company issued a written statement saying that safety is foundational to its work and pledging continued collaboration with Austin leadership and first responders. A spokesperson said the company had already provided city and state officials with a detailed — and confidential — account of the Buford’s incident and of planned improvements to its emergency response protocols.
“We want to make sure innovation does not come at the expense of safety,” Qadri said. “And we want the community to understand what is happening on our roads.”
The Legal Landscape Hasn’t Kept Up
Here is where the story becomes more than a local news item. The Buford’s incident is a vivid example of a broader and growing legal problem: autonomous vehicles create accident and liability scenarios that existing law was never designed to handle.
Traditional vehicle accident law is built around human negligence. A driver makes a bad decision, causes harm, and bears responsibility — along with their insurer and, in commercial contexts, their employer. Autonomous vehicles scramble that framework entirely. RAND Corporation researchers have documented that determining fault in AV-related incidents requires navigating overlapping responsibilities among the vehicle manufacturer, the software developer, the fleet operator, and sometimes the municipality that permitted the operation.
The National Highway Traffic Safety Administration (NHTSA) has been developing guidance on AV safety and incident reporting since 2016, but federal regulations specifically governing liability in AV-involved crashes remain largely unresolved. Texas has moved faster than most states to permit AV operation, but it has not moved nearly as fast to define what happens when an AV causes harm — directly or indirectly, as in the Buford’s case, where obstruction rather than collision was the issue.
That distinction matters. If a Waymo vehicle had struck the ambulance, liability would be clearer — a product liability claim, likely against the manufacturer or operator. But when an autonomous vehicle simply fails to yield, fails to read an officer’s hand signals, or freezes in a position that delays emergency care, the legal path is murkier. Is it negligence? Product liability? A regulatory failure by the city that permitted the operation? All three?
Key Legal Questions Raised by AV Emergency Obstruction:
Who bears liability when an autonomous vehicle — with no human operator to sue — obstructs emergency medical care?
Can victims of delayed emergency response due to AV obstruction bring a civil claim, and against whom?
Does Texas law adequately define the duty of care owed by AV operators in emergency scenarios?
New Vehicle Classes, New Accident Types
The Waymo incident is not isolated. Across the country, autonomous and semi-autonomous delivery vehicles, rideshare robots, and logistics drones are operating in urban environments in ways that generate entirely new categories of accidents and injury. NHTSA’s Standing General Order data has documented hundreds of AV-involved crashes since mandatory reporting began, including incidents involving Waymo, Tesla Autopilot, and other systems — many of which involved emergency vehicle interactions.
Autonomous delivery vehicles present a particularly acute version of this problem. Companies like Amazon, FedEx, and a growing number of startups now operate robotic delivery vans and sidewalk robots that share space with pedestrians, cyclists, and traditional traffic. The Insurance Institute for Highway Safety (IIHS) has noted that while AV technology may reduce certain crash types, it introduces new vulnerabilities — particularly in complex, unscripted scenarios like active crime scenes, natural disasters, or medical emergencies where human adaptability is essential.
For anyone injured in or around an autonomous or semi-autonomous vehicle incident in Texas, the question of who to hold accountable is genuinely complicated. An experienced Austin Delivery Vehicle Accident Attorney who understands the intersection of product liability, commercial vehicle law, and emerging AV regulations can be essential to navigating a claim that falls outside the bounds of a traditional auto accident.
What Austin’s Situation Reveals About Regulatory Gaps
Council Member Qadri made clear that this is not purely a Waymo problem. In the days before Wednesday’s meeting, he reached out to the other autonomous vehicle companies operating in Austin, urging all of them to incorporate lessons from the Buford’s incident into their emergency response planning. His broader message — that public safety cannot be sacrificed to the pace of private innovation — reflects a sentiment growing among city governments nationwide.
San Francisco experienced its own high-profile AV emergency response failures in 2023 and 2024, and after an autonomous Cruise vehicle struck and dragged a pedestrian, California regulators significantly restricted the company’s operating permits. The California Public Utilities Commission now requires AV operators to maintain 24/7 contact with a remote operations team capable of overriding vehicles in emergencies. Texas has no equivalent requirement.
That regulatory asymmetry is a legal problem waiting to happen. When a state permits broad autonomous vehicle operation without requiring robust emergency override protocols, and an incident causes harm, questions of governmental liability and AV operator negligence can overlap in ways courts have not yet fully addressed.
The Liability Framework Texas Needs
Legal scholars and transportation safety advocates have been calling for clearer AV liability frameworks for years. The Brookings Institution has argued that manufacturers and fleet operators — not individual human users — should bear primary liability in AV incidents, given that the technology, not human judgment, is making the driving decisions. This would represent a significant shift from traditional negligence-based auto liability and would require legislative action in Texas.
Until that framework exists, injured parties face an uphill battle. Victims may have valid claims under product liability theories — arguing that the AV’s failure to yield to emergency vehicles constitutes a design defect — or under negligence theories targeting the operator’s failure to ensure adequate emergency override capability. Either path requires sophisticated legal analysis and a clear understanding of how autonomous systems work.
Texas Transportation Code Chapter 545 governs right-of-way and emergency vehicle protocols for drivers. Whether and how those obligations apply to autonomous systems with no human occupant is a legal question that Texas courts will eventually need to answer — and incidents like the one outside Buford’s are exactly the kind of cases that will force those answers.
A City at the Intersection of Innovation and Accountability
Austin has long embraced its identity as a technology hub. That identity comes with real benefits — economic growth, early access to transformative technologies, and the kind of civic energy that keeps the city part of a national conversation. But it also comes with obligations. Testing new vehicle technologies on public streets means accepting that those technologies will sometimes fail, and that real people may bear the cost of those failures.
Three people died at Buford’s. A Waymo vehicle sat in the path of an ambulance responding to that scene. And the legal system — at both the state and federal level — does not yet have clear answers about what accountability looks like when the vehicle doing the blocking has no driver to hold responsible.
That is a gap that lawmakers, regulators, and the courts will need to close. For victims in the meantime, the path forward runs through attorneys who understand the emerging complexity of autonomous and commercial vehicle claims — and who can hold the right parties accountable even when the law has not yet caught up to the technology.