Mercedes-Benz addresses Level 3 legalities; lawyers say uncertainty lingers

Mercedes-Benz says existing laws and regulations are sufficient to determine the automaker’s liability for crashes and incidents that may occur when its groundbreaking Drive Pilot system is engaged, but legal experts aren’t so sure.

New rules and laws should make clear whether motorists or automakers will be held at fault for everything from a speeding citation to a major crash when computers do the driving, said William Widen, a University of Miami law professor.

“The law should demand the same driving performance of a computer driver as it requires of a human driver,” he said. “Running a red light is running a red light.”

Uncertainty exists over whether a human driver would get a ticket for such a traffic offense while using Drive Pilot. Bryant Walker Smith, a University of South Carolina law and engineering professor, said the answer may vary depending on the state.

“In a lot of states, the human person has the overall authority,” he said. “But in others that expressly state the vehicle is driving, then legally that ticket may go to the company.”

Time is running short to provide firm answers. Mercedes-Benz intends to launch the system on the 2024 S-Class and EQS sedans within months.

Drive Pilot allows a motorist to “take their mind off the traffic,” according to Mercedes-Benz.

Assuming the company’s launch plans remain on track, Mercedes-Benz would be the first automaker to deploy what’s known as a Level 3 automated driving system. That’s one in which the human is not considered the driver when the system is engaged, though they must be available to retake control should the system prompt them, according to the Levels of Driving Automation established by standards organization SAE International.

But Mercedes-Benz has been tight-lipped on some safety-critical aspects of Drive Pilot. The company declined to say whether a motorist can read a book or watch TV when the system is active. Nor would it say whether humans can remove their eyes from the road or hands from the wheel. Mercedes-Benz also declined to say how long motorists might have to respond to a prompt to retake control.

A company spokesperson said “a more detailed tech update” that clarifies those ambiguities may be forthcoming closer to Drive Pilot’s launch.

In the meantime, Mercedes-Benz issued a response in late June that addresses one of the lingering questions: how the company views its liability for crashes or incidents that may occur when Drive Pilot is active.

In California, Nevada and Germany, the first three locations where Mercedes-Benz intends to launch Drive Pilot, "there are well-established legal systems for determining responsibility and liability on roads and highways," the company told Automotive News in a written statement.

“While they might differ between jurisdictions, they still provide the legal foundation that is the basis of the respective tasks and duties,” the company said.

That’s both nebulous and inadequate, Widen said.

Because the technology is new, the status quo does not necessarily delineate responsibility between computer and human. He cautioned that motorists should not assume they have been legally absolved when Level 3 systems are active, nor should they feel reassured by manufacturers' statements.

Without legal clarity, “then the whole line about relaxing and taking your time back is nothing but air,” he said.

Few precedents exist for how courts might treat cases that arise from Drive Pilot crashes, and the ones that do exist are imprecise comparisons:

  • A Tesla owner awaits trial on a vehicular manslaughter charge in California related to a fatal crash during which his Autopilot feature was engaged. But Autopilot is considered a Level 2 driver-assist system. With those systems, humans always remain responsible for vehicle operations even when the system is engaged.
  • General Motors settled a lawsuit that alleged a vehicle from its Cruise autonomous vehicle subsidiary knocked a motorcyclist to the street in San Francisco, causing injuries. But that involved a Level 4 self-driving test vehicle. With Level 4, human motorists have no role in the driving process.
  • The most direct precedent may arise from Brouse v. United States, a case stemming from a 1947 midair collision between a U.S. Army fighter plane and a small plane over Ohio. Although the fighter was under the control of an autopilot system, a U.S. district court ruled that its human pilot still had an obligation to keep "a proper and constant lookout."

Motorists face similar exposure when using Level 3 systems unless new laws are written, said Widen, who co-authored a paper with Carnegie Mellon University professor Phil Koopman that proposes rules for attributing liability when computers and humans share control.

“They need a shield law for owners who engage Level 3 automated driving systems,” he said. “You at least want an interim period where the company is on the hook because you have no evidence that warrants a belief that these systems are safer than a human driver.”

Not everyone is so sure that laws must be rewritten or that motorists are at risk.

Motor vehicle laws and the principles that underpin them have evolved over a century, and that evolution should continue, Smith said.

“We don’t need to throw everything out and start over,” he said.

Drivers who use a Level 3 system within the bounds of the manufacturer's directions should not be held liable for the system's mistakes, he said.

If a vehicle with Drive Pilot engaged strikes and kills a pedestrian, the human motorist using the system as directed “would not have the legal culpability to be charged with a crime,” Smith said. “At the same time, it is not clear Mercedes would be charged with manslaughter either.”

Mercedes-Benz did not address its potential criminal liability in such cases but acknowledged its overall responsibility “expands as the vehicle assumes more of the dynamic driving task.”

“In the context of Drive Pilot, this means that if a customer uses the system as intended and instructed and the system fails to perform as designed, we stand behind our product,” the company said.

But the company stopped short of acknowledging a “duty of care,” the legal obligation to act as a reasonable person or entity would in particular circumstances. Courts can use duty-of-care standards to determine negligence.

Motorists should not expect the company's assurances either to supersede state laws or to exonerate them in court, Widen said. They should wait to use such automated driving systems until laws clarify their role in the driving process.

“The law is simply not clear on these points, and it should be clear,” he said.
