Jason Eichenholz at FiO 2017.

One of the four core themes of the 2017 Frontiers in Optics meeting—optics and photonics in the automotive sector—was fully on display on the Tuesday morning of the conference.

Kicking off the day in a plenary talk, Jason Eichenholz, the cofounder and CTO of Luminar Technologies, USA, took audience members on a tour of the advanced lidar technologies necessary to make the autonomous-vehicle dream a reality. Eichenholz then joined two other attendees, Anthony McDaniel, a member of the technical staff at Sandia National Laboratories, and John D’Ambrosia, the chair of The Ethernet Alliance, in a spirited discussion of the promise of, and challenges for, cutting-edge automotive technology.

From “feet-off” to “mind-off” systems

In his plenary talk, Eichenholz noted that the answer to the question “when will we have self-driving cars?” depends on the specifics of the question. “Self-driving” can mean anything from “feet-off” vehicles, in which the car’s systems handle basic acceleration and braking without human intervention, all the way to fully automated (“mind-off”) systems or even completely autonomous vehicles with no driver at all. Eichenholz’s company, Luminar, is, he said, focusing on the latter two schemes, evoking a vision of future vehicles that can, by any reasonable measure, completely drive themselves at least as well as humans can—and eventually far better.

Eichenholz framed the case for such autonomous vehicles in stark terms. “Every year, 30,000 people in the U.S. die in traffic accidents,” he said. That’s equivalent, he noted, to a passenger-loaded Boeing 737 dropping out of the sky every working day. “Can you imagine what would happen if a 737 dropped out of the sky every day?” he asked. “There’d be an uproar.”
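The 737 comparison checks out as rough arithmetic. A minimal sketch (the working-day count and seat count are assumptions for illustration, not figures from the talk):

```python
# Rough arithmetic behind the 737 comparison.
# Assumptions (not from the talk): ~250 working days per year,
# and a typical Boeing 737 seating on the order of 130-180 passengers.
annual_traffic_deaths = 30_000
working_days_per_year = 250  # assumption

deaths_per_working_day = annual_traffic_deaths / working_days_per_year
print(deaths_per_working_day)  # 120.0 -- on the order of one loaded 737
```

At roughly 120 deaths per working day, the toll is indeed comparable to the passenger load of a single-aisle airliner.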

Despite this safety imperative, however, Eichenholz pointed out that the lidar system used (for example) in Uber’s 2017 self-driving demo has essentially the same technical specifications as the system on the winning vehicle in DARPA’s 2007 autonomous-vehicle grand challenge. “In ten years,” he said, “you have not seen a dramatic improvement in lidar systems to enable fully autonomous driving. There’s been so much progress in computation, so much in machine vision … and yet the technology for the main set of eyes for these cars hasn’t evolved.”

Steep demands

Toward a “mind-off” driving future? [Image: Getty Images]

Eichenholz spent much of his talk describing the requirements needed to take the next step with autonomous vehicles, and some of the decisions his company has made in its own quest to provide workable, cost-effective lidar for self-driving cars.

On the requirements side, the array of demands is sobering: a 200-m range, to give the vehicle’s passengers a minimum of seven seconds of reaction time in an emergency; laser eye safety; the ability to capture millions of points per second while maintaining a 10-fps frame rate; and the ability to handle fog and other low-visibility conditions.
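The 200-m/seven-second pairing implies an assumed closing speed. A back-of-the-envelope check, assuming a typical highway speed of about 65 mph (the speed is an assumption for illustration; the talk quoted only the range and reaction time):

```python
# Back-of-the-envelope check of the 200-m, seven-second figure.
# Assumption (not from the talk): highway speed of ~65 mph.
range_m = 200.0
speed_mph = 65.0  # assumption

speed_m_per_s = speed_mph * 1609.344 / 3600  # mph -> m/s (~29.1 m/s)
reaction_time_s = range_m / speed_m_per_s
print(round(reaction_time_s, 1))  # ~6.9 s, consistent with "seven seconds"
```

The same arithmetic also shows why the range requirement is so unforgiving: at higher speeds, every lost meter of range comes straight out of the reaction-time budget.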

But Eichenholz also stressed that an autonomous vehicle on the road operates in a “target-rich” environment, with hundreds of other autonomous vehicles shooting out their own laser signals. That environment, he said, creates huge challenges of background noise and interference. And he noted some of the same issues with supply chain, cost control, and zero error tolerance that formed the focus of an FiO presentation on Monday morning.

“If you look at all these things, the handcuffs go on in how you’re going to design your system,” Eichenholz said. “We joke, having worked on this for five years, that we’ve discovered two thousand ways not to build a lidar system.”

Toward “enabling autonomy”

Eichenholz outlined some of the approaches and technical steps that Luminar has adopted in its path to meet those many requirements in autonomous-vehicle lidar. One step, he said, was the choice of a 1550-nm, InGaAs laser, which allows both eye safety and a good photon budget. Another was the use of an InGaAs linear avalanche photodiode detector rather than single-photon counting, and scanning the laser signal for field coverage rather than using a detector array. The latter two decisions, he said, substantially reduce problems of background noise and interference. “This is a huge part of our architecture.”

Eichenholz closed his presentation with some impressive views of the performance that Luminar has achieved thus far in its lidar systems. He noted that lidar isn’t a “one size fits all” technology, and that other onboard systems, such as cameras, have to be part of the ecosystem. But “when you make technology like this,” he said, “you’re enabling autonomy.”

Changing infrastructure

At the FiO 2017 edition of “Speakeasy Science,” John D’Ambrosia, Anthony McDaniel and Jason Eichenholz traded views on the promise and challenges of autonomous vehicles.

After the plenary session, discussion continued in a “Speakeasy Science” session including Eichenholz, McDaniel, and D’Ambrosia. Part of the discussion focused on the infrastructure changes in roads and urban services that will both advance the revolution in automotive technology and be necessitated by it.

That potential revolution goes well past autonomous vehicles. McDaniel, for example, works on the U.S. government’s hydrogen fuels program, and has seen firsthand the importance of government incentives to encourage technology development. He cited in particular the efforts of the state of California to build fueling stations in an effort to jump-start hydrogen-fueled vehicles, in keeping with the state’s mission to move away from fossil fuels and “decarbonize” its economy.

Similarly, Eichenholz added, the actions of specific cities and municipalities—such as London, U.K., which Eichenholz said has “done a lot” in terms of limiting traffic in cities—can set up progress in self-driving vehicles. “Twenty or thirty years out, there will still be cars, but there will be fewer and fewer roads to drive on,” he said. “The infrastructure for cars will completely change.”

Network challenges

D’Ambrosia, meanwhile, looked at autonomous-vehicle requirements from a rather different perspective from that of Eichenholz: the perspective of the wireless and optical networks needed to support them. And he added a palpable tone of skepticism to the discussion. “I work a lot on bandwidth,” he said, “and the plain and simple truth is that, while we tend to just throw more bandwidth at our problems, we really don’t know what the bandwidth is for self-driving cars.”

In response to an audience question, D’Ambrosia acknowledged that current plans for autonomous vehicles envision much of the data load being handled on the vehicle itself, rather than in vehicle-to-cloud communications that could cripple wireless networks. But he stressed that autonomous vehicles actually involve three separate data streams—within-vehicle communications, vehicle-to-vehicle communications, and communications between the vehicle and the larger network. And in none of these cases, he suggested, have the challenges of transferring, managing and storing the vast volumes of data implied in an autonomous-vehicle future been adequately explored.

“This is a lesson we really need to take to heart,” D’Ambrosia said, “and start having these honest conversations to get to your utopia.”