Autonomous cars are on the way, with some experts forecasting that 33 million will hit the road by 2040. But public acceptance hasn’t caught up yet.
That’s according to a new survey conducted by PSB Research and commissioned by Intel, which found that only 21 percent of Americans would trade their cars for driverless vehicles today. Moreover, many respondents said they’re wary of autonomous vehicles: 43 percent don’t feel safe around them.
That’s despite the fact that about 94 percent of car crashes are caused by human error and that in 2016 the top three causes of traffic fatalities were distracted driving, drunk driving, and speeding. According to the National Safety Council, Americans’ odds of dying in a car crash are one in 114.
At the same time, there’s palpable enthusiasm about the technology. About 63 percent of people surveyed expect self-driving cars to be the norm in 50 years, and more than half said they “looked forward to the day when they [wouldn’t] have to drive.”
Here’s what people plan to do in self-driving cars, according to PSB and Intel:
- Consume entertainment (58 percent)
- Socialize (57 percent)
- Work (56 percent)
- Host meetings (33 percent)
- Groom (26 percent)
- Exercise (14 percent)
“We must bridge the gap between acceptance of today’s automated driving assist features and full autonomy,” Jack Weast, Intel senior principal engineer and vice president of AV Standards at Mobileye, said. “Today, passengers are asked to blindly trust a manufacturer’s ‘black box’ safety approach. What is needed is for the industry and policymakers to rally around a transparent safety model that builds trust between humans and machines.”
Skepticism about autonomous cars isn’t a new phenomenon.
Three separate studies this summer — by the Brookings Institution, infrastructure firm HNTB, and the Advocates for Highway and Auto Safety (AHAS) — found that a majority of people aren’t convinced of driverless cars’ safety. More than 60 percent said they were “not inclined” to ride in self-driving cars, and almost 70 percent expressed “concerns” about sharing the road with them.
High-profile accidents haven’t helped instill much confidence.
In March of this year, Uber suspended testing of its autonomous Volvo XC90 fleet after one of its cars struck and killed a pedestrian in Tempe, Arizona. Separately, Tesla’s Autopilot was found to have been engaged in the moments leading up to a fatal Model X collision this spring — the second fatality involving Autopilot since a crash in May 2016.
Critics contend that the autonomous car industry lacks an empirical, agreed-upon method of gauging in-vehicle safety. Earlier this month, the RAND Corporation published an Uber-commissioned report — “Measuring Automated Vehicle Safety: Forging a Framework” — that laid bare the challenges ahead. It suggested that local DMVs play a larger role in formalizing the demonstration process and proposed that companies and governments engage in data-sharing.
Tel Aviv, Israel-based Mobileye, which Intel acquired in a $15.3 billion deal last April, proposed a solution — Responsibility-Sensitive Safety (RSS) — last October at the World Knowledge Forum in Seoul, South Korea. An accompanying whitepaper describes it as a “deterministic … formula” with “logically provable” rules of the road intended to prevent self-driving vehicles from causing accidents.
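The whitepaper’s best-known rule is a minimum safe longitudinal following distance: a gap large enough that, even if the lead car brakes as hard as physically possible, a reasonably attentive rear car can always stop in time. Here is a minimal Python sketch of that rule; the parameter values (reaction time, braking and acceleration limits) are illustrative assumptions, not Mobileye’s calibrated numbers.

```python
def rss_safe_distance(v_rear, v_front, rho=1.0,
                      a_max_accel=3.0, a_min_brake=4.0, a_max_brake=8.0):
    """Minimum gap (meters) per the RSS longitudinal rule.

    Worst case assumed: during reaction time rho the rear car accelerates
    at a_max_accel, then brakes gently at a_min_brake, while the front car
    brakes as hard as possible at a_max_brake.

    v_rear, v_front in m/s; rho in s; accelerations in m/s^2.
    """
    v_rho = v_rear + rho * a_max_accel           # rear speed after reaction time
    d = (v_rear * rho
         + 0.5 * a_max_accel * rho ** 2          # distance covered while reacting
         + v_rho ** 2 / (2 * a_min_brake)        # rear car's gentle-braking distance
         - v_front ** 2 / (2 * a_max_brake))     # front car's hard-braking distance
    return max(d, 0.0)                           # the rule clips at zero

# Example: both cars traveling at 25 m/s (~56 mph)
print(round(rss_safe_distance(25.0, 25.0), 1))   # prints 85.4
```

The point of the deterministic form is accountability: if the rear car maintained at least this distance and a collision still occurred, fault can be assigned to the other party by the formula rather than by a manufacturer’s opaque judgment.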
More recently, in October, Arizona and Intel partnered to found the Institute for Automated Mobility (IAM), a brain trust of enterprises, government agencies, and universities that will collaborate on autonomous vehicle testing in Arizona.
“The ability to assign fault is the key. Just like the best human drivers in the world, self-driving cars cannot avoid accidents due to actions beyond their control,” Amnon Shashua, Mobileye CEO and Intel senior vice president, said in a statement last year. “But the most responsible, aware, and cautious driver is very unlikely to cause an accident of his or her own fault, particularly if they had 360-degree vision and lightning-fast reaction times like autonomous vehicles will.”