What It’s Like To Ride The Streets of Silicon Valley In Nissan’s Autonomous Electric Car

The Japanese car giant is teaching its vehicles to deal with the most challenging road hazards of all—such as those pesky pedestrians.

Ordinarily, the most notable thing about Nissan’s Leaf hatchback is that it’s the world’s most popular all-electric car. But on Thursday afternoon, I went on brief excursions around Sunnyvale, California, in a couple of Leafs that were anything but ordinary. Both were test vehicles modified by the Nissan Research Center to do almost all the driving themselves.

I was able to be chauffeured by autonomous Nissans because I wasn’t attending CES in Las Vegas, where self-driving cars were one of the hottest topics. One of the vehicles at the show, an Audi A7 sedan, even drove itself all the way from Palo Alto, California, to the show. That journey, well chronicled by Wired's Alex Davies, involved 500 miles of highway driving.

My two trips only added up to around 20 minutes, in an area full of office parks and other business establishments; about the most interesting thing we saw along the way was the headquarters of Nissan’s Silicon Valley neighbor Yahoo. But surface roads present driving challenges that highways generally do not. The Leafs needed to be able to handle intersections, with and without stop signs or traffic lights. They had to understand the concept of railroad crossings. They couldn’t be fazed by pedestrians, joggers, bicyclists, or skateboarders—all of which we encountered on our quick jaunts around Sunnyvale.

Leaf #1 uses a roof-mounted Velodyne Lidar unit to see the world around it

“For us, autonomous highway driving is not research anymore,” says Maarten Sierhuis, the director of the Nissan Research Center Silicon Valley and a veteran of both NASA and Xerox PARC. He points out that the company’s Infiniti Q50 is already available with adaptive cruise control and active lane control, which take much of the work out of highway motoring. It’s city streets which are still a major project for Sierhuis and his colleagues, including AI experts in the Sunnyvale office and automotive engineers at Nissan’s Japanese Research Center.

“What we’re working on is 2019, 2020—what the company is thinking of bringing out in a product,” Sierhuis says. Some of that work will be done in collaboration with his old employer, NASA: Nissan just announced a five-year partnership with the U.S. space agency to collaborate on autonomous-vehicle development.

(Almost) Autonomous Driving

The two Leafs I rode in had much in common: machine-vision cameras; a Linux-based system with a flat-screen display that visualizes the road, vehicles, pedestrians, and other items; and electronic componentry stuffed into every available nook and cranny, including the glove compartment and armrest cubbyhole. The big difference was that one of the cars, dubbed Leaf #1, had a giant spinning Velodyne Lidar unit mounted on its roof, giving it the widest, most detailed possible view of the world around it. Leaf #2 lacked that $70,000 gadget, and therefore required the person at the wheel to check for oncoming traffic at stop signs, then push a button on the steering wheel when it was safe to proceed.

Leaf #1 and Leaf #2
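
For a rough sense of what that difference means in practice, here is a minimal sketch of how an intersection routine might branch on sensor coverage. None of these names or structures come from Nissan's actual software; they are invented purely to illustrate the behavior described above.

```python
# Hypothetical sketch (not Nissan's code) of sensor-dependent stop-sign behavior.
from dataclasses import dataclass
from typing import Callable

@dataclass
class SensorSuite:
    has_lidar: bool    # Leaf #1 carries the roof-mounted Velodyne unit; Leaf #2 does not
    has_cameras: bool  # machine-vision cameras are fitted to both test cars

def clear_to_proceed(sensors: SensorSuite,
                     lidar_sees_cross_traffic: Callable[[], bool],
                     driver_pressed_go_button: Callable[[], bool]) -> bool:
    """Decide whether the car may pull away from a stop sign.

    Illustrative logic only: with lidar, the car judges oncoming traffic itself;
    without it, a human must confirm by pressing a steering-wheel button.
    """
    if sensors.has_lidar:
        return not lidar_sees_cross_traffic()
    return driver_pressed_go_button()

# Example: Leaf #2 (no lidar) waits for the button press before proceeding.
leaf_2 = SensorSuite(has_lidar=False, has_cameras=True)
print(clear_to_proceed(leaf_2, lambda: False, lambda: True))  # -> True
```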

I sat in the front passenger seat—which, since these very special Leafs were based on models designed for the Japanese market, was on the left side—while a Nissan engineer took the driver’s seat. In many ways, the trips we took felt like my time in a Google self-driving car last May. For all the little indications that a computer was using math to make driving decisions, the most remarkable thing about the overall experience was how unremarkable it was.

True, there were moments when the Leafs seemed to exhibit a Spock-like rationality that few human drivers do. At one stoplight, we were behind a BMW and a Yahoo employee bus. When they crept up impatiently in anticipation of the light turning green, we stayed put. At another point, one of the Leafs barreled down a Sunnyvale street in a way which, though I’m sure it was well within the speed limit, showed none of the tentativeness a flesh-and-blood motorist might have demonstrated in the same situation.

Mostly, though, the cars seemed to do pretty much what you or I would do when taking the same route. I never felt endangered or even unnerved, and sometimes had to peek to confirm that the Nissan employee wasn’t actually driving.

Google Maps vs. Sparse Maps

Though the end result was similar to my trip in a Google car, it’s important to note that Nissan and Google are applying distinctly different strategies to teaching cars to drive.

Google has been showing off a cartoony mini-car prototype that the company wants to be so truly self-driving that it doesn’t even have a steering wheel. It’s not talking about who will manufacture these cars or where or when anyone will be able to buy one.

Nissan, by contrast, wants to offer something it calls Autonomous Drive in what are otherwise typical Nissan, Infiniti, and Renault vehicles by the year 2020. It’s okay if Autonomous Drive does most, but not all, of the heavy lifting of driving, which is why Nissan draws a distinction between autonomous vehicles (which it’s working on) and self-driving ones (like Google’s prototype).

The Google car I rode in was able to navigate its way around the company’s hometown of Mountain View so confidently in part because it could rely on obsessively detailed maps which Google has created. It knew the area cold, down to the locations of curbs and driveways. How Google cars will handle streets in cities which the company hasn’t mapped to a fare-thee-well remains unclear.

“Basically, Google builds a 3-D world wherever they drive,” says Sierhuis. “They know how high the traffic light is off the ground.” Nissan, however, uses what it calls sparse maps. They’re less detailed, and based on third-party data. “Part of the research is to figure out how sparse the sparse map can be, and what information needs to be in it,” Sierhuis says.

“I don’t have a strong negative opinion about Google’s approach. If they can make it work, who am I to say it’s a stupid approach? But that company is trying to solve a vastly different problem than we’re solving. They’re really a data company. They might have other corporate reasons to have the whole world digitized.”
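
As a rough illustration of the contrast Sierhuis is drawing, a densely mapped intersection might pin down exact 3-D geometry for every fixture, while a sparse map might record only what the car needs to localize itself and obey the rules of the road. The records below are hypothetical; neither reflects Nissan’s or Google’s actual map schema, and the values are placeholders.

```python
# Hypothetical map records contrasting "dense" and "sparse" levels of detail.
dense_intersection = {
    # "They know how high the traffic light is off the ground."
    "traffic_light": {"lat": 37.3894, "lon": -122.0308, "height_m": 5.2},
    "curb_geometry": [],    # full 3-D curb and driveway outlines would go here
    "lane_boundaries": [],  # centimeter-level lane-edge polylines
}

sparse_intersection = {
    # The light's existence is mapped; its exact position is sensed live.
    "has_traffic_light": True,
    "lane_count": 2,
    "speed_limit_mph": 35,
}
```

Part of the research Sierhuis describes is exactly this question: which of those fields can be dropped before the car can no longer drive safely.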

Work In Progress

Tooling around Sunnyvale in an autonomous Leaf, it’s easy to get giddy: If I’d been able to put a down payment on one on the spot, I’d have been sorely tempted. But if the technology were ready for consumers, it would already be here. Sierhuis says there’s a lot of work left to be done, especially when it comes to handling less-than-optimal situations.

“If you don’t have a perfect map, how do you know where you are on the road? Once you have the right information, the actual calculations are not that hard. But you need centimeter-level accuracy. If your steering command is half a millimeter off, you might end up in the other lane.”
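
To see how quickly small errors compound, consider a simplified version of that point, swapping a constant heading error for the steering command Sierhuis mentions. Held for a couple hundred meters, half a degree of heading error drifts the car nearly half a lane sideways. The numbers and the substitution are illustrative, not figures from Nissan.

```python
import math

def lateral_drift(heading_error_deg: float, distance_m: float) -> float:
    """Sideways offset accumulated by driving distance_m with a constant heading error."""
    return distance_m * math.sin(math.radians(heading_error_deg))

# Half a degree of heading error held for 200 m:
print(f"{lateral_drift(0.5, 200.0):.2f} m")  # ~1.75 m, nearly half of a standard 3.7 m lane
```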

And it’s not just about machine vision and math. Autonomous cars need to be trained to make decisions that make sense to everybody on the road. Sierhuis gives an example: “If you see a car in the middle of the road standing still, how long do you wait before you go around it? If you do it at a traffic stop, people might start yelling at you.”
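
A toy version of that judgment call might look like the heuristic below. The thresholds and context flags are invented for illustration; this is not Nissan’s decision logic.

```python
def should_go_around(seconds_waited: float, waiting_at_light: bool,
                     oncoming_lane_clear: bool) -> bool:
    """Toy heuristic for passing a car stopped in the middle of the road.

    Never overtake while queued at a traffic stop (the stopped car is just
    waiting, and going around would baffle everyone else on the road);
    otherwise, go around after a short wait if the opposing lane is clear.
    """
    if waiting_at_light:
        return False
    return seconds_waited > 5.0 and oncoming_lane_clear
```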

For all the pleasure that AI and robotics researchers take in cracking the code of making cars autonomous, some of the questions Nissan is grappling with—such as the basic value proposition of such vehicles for average consumers—are impossible to answer in the lab. “We don’t know what people want, because we don’t have that many people experiencing it yet,” Sierhuis says. “It’s part of the reason why it’s good to bring this technology out in phases, to learn.”

“I know it’s hard to build autonomous vehicles. It isn’t easy, but it’s happening anyway, whether we like it or not. I personally think it’s the greatest challenge I can help the world achieve. One of the greatest challenges—I’m not that narcissistic.”

[Photos: Ian Merritt]

