Uber should pay more attention to industry standards for self-driving technology
"Until the company can show that it has met the appropriate standards for its technology, it should keep its self-driving operations off the road," Allen Patatanyan of West Coast Trial Lawyers writes.
Editor's Note: The following is a guest post from Allen Patatanyan, a personal injury and transportation attorney with West Coast Trial Lawyers. The opinions represented in this piece are independent of Smart Cities Dive's views.
Uber has become the poster child for corporate irresponsibility. From sexual assaults by drivers to the well-publicized death of a pedestrian struck by a self-driving car this past March, the company has been straddling the line between transportation and technology. As it positions itself to compete in the technology space, it can no longer operate as an upstart.
Uber was quick to tout the safety benefits of self-driving cars at the very same time that it was rushing to beat the competition to market. In the mad dash to be first off the starting block, Uber appears to have skimped on fundamentals, such as running simulations with its self-driving vehicles.
Simulations involve putting cars through multiple drives using software, rather than gasoline and rubber. Uber’s competitors, in contrast, invested significant resources in simulation before putting their cars on the street. For example, Waymo, Alphabet's self-driving unit, reported simulating more than 5 billion driving miles as of a few months ago. In the absence of established regulatory standards, it could be argued that Uber’s competitors had set a de facto industry standard for autonomous vehicles.
What qualifies as an industry standard? Standards are the criteria governing how operations within an industry are carried out — the generally accepted requirements followed by members of that industry. The law recognizes different levels of standards. Consensus-based standards are published guidelines that have been reviewed and approved by national or international standards organizations. The U.S. Food and Drug Administration, for example, provides a mechanism for recognizing voluntary standards as consensus-based standards. De facto standards, in contrast, are standards that have been widely accepted and used within an industry but have never received formal review and approval from a standards body.
One purveyor of simulation software states the following on its website: "You can't put a car on the road, a drone in the air or a robot in a warehouse without satisfying the strictest of safety standards. For an autonomous car, for example, this means driving billions of miles — a practical impossibility when time to market is critical."
The fact that Uber’s competitors expended significant time and money on simulation confirms its importance. Just as with drug companies and medical device manufacturers, developers of self-driving vehicles must do a careful cost-benefit analysis. Drug companies look at the ratio of risks and side effects to health benefits when they conduct drug trials: The greater the risk, the higher their investment in testing. They rely on these analyses to support their applications for FDA approval of pharmaceuticals.
The National Highway Traffic Safety Administration, a division of the U.S. Department of Transportation, also relies on manufacturers of autonomous vehicles to go through a careful analysis of risks and benefits in building their cars. These expectations appear in “Automated Driving Systems 2.0: A Vision for Safety”:
“The design and validation process should also consider including a hazard analysis and safety risk assessment for ADSs, for the overall vehicle design into which it is being integrated, and when applicable, for the broader transportation ecosystem…. Ideally, the process should place significant emphasis on software development, verification, and validation. The software development process is one that should be well-planned, well-controlled, and well-documented to detect and correct unexpected results from software updates. Thorough and measurable software testing should complement a structured and documented software development and change management process and should be part of each software version release....Design decisions should be linked to the assessed risks that could impact safety-critical system functionality.”
For autonomous vehicles, the cost of getting it wrong — running over an innocent pedestrian — is sufficiently great that it warrants the highest level of testing. That other companies conducted extensive simulation underscores the seriousness of the potential risk. An expert witness for plaintiffs could make a strong showing that the design of Uber’s self-driving vehicle was defective because it failed to conform to industry standards. That expert would likely cite this language from the NHTSA guidance:
"Entities are encouraged to follow a robust design and validation process based on a systems-engineering approach with the goal of designing ADSs free of unreasonable safety risks. The overall process should adopt and follow industry standards, such as the functional safety process standard for road vehicles, and collectively cover the entire operational design domain (i.e., operating parameters and limitations) of the system. Entities are encouraged to adopt voluntary guidance, best practices, design principles, and standards developed by established and accredited standards-developing organizations...as well as standards and processes available from other industries such as aviation, space, and the military and other applicable standards or internal company processes as they are relevant and applicable."
Although Uber’s sparse investment in simulation software may not actually have caused the Tempe, AZ crash, it could end up becoming an issue in a products liability lawsuit. Tort law defines a duty of care as a legal obligation requiring adherence to a standard of reasonable care while performing any acts that could foreseeably harm others. The public is held to the reasonable person standard, but members of an industry are generally held to that industry’s standards of care. If an industry has no established standards, then a court will apply the reasonable person test, along with any standards that the business itself may have held out to its customers.
In many industries, the standard of care will be determined by the standard that would be exercised by the reasonably prudent manufacturer of a product, or the reasonably prudent professional in that line of work. Failing to meet industry or self-imposed standards could be interpreted as a breach of the defendant's duty to the plaintiff, giving rise to a finding of negligence.
Uber, which halted all tests of its self-driving cars following the crash, is reintroducing them to the streets of Pittsburgh — this time, with a driver manually controlling the car. These tests will help improve maps and simulation software. Until the company can show that it has met the appropriate standards for its technology, it should keep its self-driving operations off the road.