For some safety experts, Uber’s self-driving taxi test isn’t something to hail
Uber’s decision to bring self-driving taxis to the streets of Pittsburgh on Wednesday is raising alarms among a swath of safety experts who say that the technology is not nearly ready for prime time.
The experiment will launch even though Pennsylvania has yet to pass basic laws that permit the testing of self-driving cars or rules that would govern what would happen in a crash. Uber is also not required to pass along any data from its vehicles to regulators.
Meanwhile, researchers note, autonomous cars have been thrown off by bridges, a particular problem in Pittsburgh, which has more bridges than any other major US city.
“They are essentially making the commuters the guinea pigs,” said Joan Claybrook, a consumer-protection advocate and former head of the National Highway Traffic Safety Administration. “Of course there are going to be crashes. You can do the exact same tests without having average citizens in your car.”
But advocates of autonomous vehicles say that the technology might never have happened if companies had to wait for governments to pass rules first. With nearly 1.3 million people dying in road crashes worldwide every year, largely due to driver error, technologists have stressed the critical need to push forward on the testing of driverless cars on actual roads.
In many ways, these competing views, brought into stark relief by Uber’s Pittsburgh project, reflect the wider tension over how major innovation in America should take place.
Pittsburgh might be the exact environment that innovators love to leap into - a legal void that can be defined by technologists, not bureaucrats. The question is how fast, and under what conditions, the testing of a life-changing technology should occur. While many companies, including Google and General Motors, are conducting trials of autonomous vehicles on public roads, Uber is the first to bring everyday commuters along for the ride.
“We’ve seen that this is coming - faster than anyone had imagined,” said Roger Cohen, policy director for Pennsylvania’s Department of Transportation, who said that Uber was not legally required to ask for regulators’ advance permission before its launch. “Current law, in its silence, is permitting it by not prohibiting it.”
Uber’s Pittsburgh project isn’t only the most high-stakes test of a promising, nascent technology. It also reflects a belief that runs deep in the Silicon Valley DNA. That ethos holds as an article of faith that innovation will always be far ahead of the rules. It sees the world as a laboratory in which life can be made better when innovators are afforded the freedom to experiment.
In a talk two years ago at a Washington Post forum, Chris Urmson, the former Google executive who once led the company’s self-driving car project, said “one of the great things about American innovation” was that if the law “doesn’t say you can’t do it, then you can.”
Justin Kan, a partner at the Silicon Valley start-up incubator and seed fund Y Combinator, said that for all its fights with regulators, including scathing battles with taxi commissions in cities across the world, Uber has helped to pioneer and legitimise the world’s most advanced taxi service in an arena where the law was quiet.
To be sure, the Pittsburgh project does not reflect the hard-charging Uber of its early days - back then, the start-up stormed into cities and lacerated taxi commissions for bilking consumers and preventing private citizens from making money by picking up passengers in their own cars. Though Uber was not legally required to do so, the company worked closely with Pennsylvania regulators in the rollout of the project. That strategy allowed the ride-hailing giant to get buy-in from officials, Cohen said, and exploit existing loopholes with fewer obstacles.
Uber’s vehicles will include two trained safety drivers who can take over the wheel in an emergency.
Cohen and Bryant Walker Smith, an autonomous-vehicle expert at the Centre for Internet and Society at Stanford Law School, are both comfortable with the tests because of the safety drivers. Still, they acknowledged that this does not mean the testing will be collision-free. “You’re not going to have perfection. There is going to be trial and error and it’s not going to be problem free,” Cohen said.
Even so, the effort is sparking alarm from safety experts who say the technology has major limitations that can be very dangerous. Self-driving cars have trouble seeing in bad weather. Sudden downpours, snow and especially puddles make it difficult for autonomous vehicles to detect lane markings on the pavement and thereby stay in their lane.
Walker Smith added that self-driving cars have often mistaken bridges for other obstacles.
The vehicles also have difficulty understanding human gestures - for example, a crossing guard directing traffic in front of a local elementary school may not be understood, said Mary Cummings, director of Duke University’s Humans and Autonomy Lab, at a Senate hearing in March. She recommended that the vehicles not be allowed to operate near schools.
Then there’s the human factor: Researchers have shown that people like to test and prank robots. Today, a GPS jammer, which some people keep in their trunks to block police from tracking them, will easily throw off a self-driving car’s ability to sense where it is, Cummings said.