figure[Velodyne]

The race to develop safe, inexpensive, fully autonomous vehicles is proceeding at full throttle, and photonics is destined to play a central role. In addition to radar and sonar, self-driving cars will undoubtedly rely heavily on photonics technologies found in new cars today, including cameras, LEDs, diodes, displays and optical sensors that help drivers park, stay in their lane and maintain speed.

Now, a new generation of photonics technologies is almost ready to take the lead in the suite of sensors required to replace humans at the wheel. Many expect light detection and ranging (lidar) technology, the remote-sensing approach that bounces laser signals off surfaces to calculate distances and map the environment, to become the key sensor for reliable operation across a wide variety of applications and conditions.

Lidar promises to “see” where humans can’t—in shadows and in darkness; some systems offer additional perspective through foliage, fog, rain and dust. Lidar sensing technologies are primarily used to measure distances to surrounding objects for navigation and mapping. In the complex, futuristic world of robotic transportation, systems must also gather and analyze information on direction of motion, relative instantaneous velocity and reflectivity, all from a moving vehicle. For more than a decade, the commercial leader in that space has been Velodyne (San Jose, Calif., USA), whose large, complex spinning sensors adorned the top of many early autonomous vehicles. These first-generation lidar systems are limited in range, which in turn limits vehicle speed, and they carry exceptionally high costs, in the tens of thousands of dollars.
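
The ranging arithmetic at the heart of such pulsed, time-of-flight systems is worth making concrete. Below is a minimal sketch in Python; the function name and example values are illustrative, not drawn from any vendor’s code:

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target from a laser pulse's measured round-trip time."""
    return C * round_trip_s / 2.0  # the light travels out and back, so halve it

print(tof_distance_m(1.0e-6))  # an echo after ~1 microsecond -> ~150 m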

Next-gen lidar technologies are pushing the bounds of physics to create 3-D point clouds (x, y, z) over time, maximizing range, field of view (FOV) and resolution at the fastest possible frame rates. Autonomous driving is the most extreme application propelling next-gen lidar development, because the system must instantly discern subtle differences and act on them in real time—distinguishing, for example, a white sky from the side of a white diesel truck, or black ice from the black rubber of a tire in the road. The stakes are considerably higher than they are for, say, the basic lidar system supporting a robot vacuum. Mistakes are unacceptable.
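
Each ranged return, combined with the beam’s pointing angles at that instant, becomes one point in the cloud. A hypothetical conversion from a single return to Cartesian coordinates, assuming conventional spherical geometry:

import math

def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return (range plus beam angles) to an (x, y, z) point."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward of the sensor
    y = range_m * math.cos(el) * math.sin(az)  # left of the sensor
    z = range_m * math.sin(el)                 # up
    return x, y, z

# A return at 80 m, 10 degrees left of boresight, 2 degrees above the horizon:
print(lidar_return_to_xyz(80.0, 10.0, 2.0))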

Autonomous driving is expected to greatly improve safety and convenience in transportation, and ultimately to save money by eliminating paid human drivers in ride-sharing and commercial trucking. Optimistically, robotic vehicles will enable a world in which anyone—children, elderly, physically or mentally disabled—can safely “drive” without licensing or the need to own, maintain or operate vehicles. Time spent commuting can be put to more efficient use. Real estate used for parking lots and garages could be leveraged for other purposes.

In April, French market analyst firm Yole Développement estimated that the total global lidar market in 2018 was valued at US$1.3 billion. The company’s LiDAR for Automotive and Industrial Applications 2019 report predicts that in just five years, the market could grow nearly five-fold, to US$6.4 billion by 2024. The report expects the biggest segment of the market to be in robotic, fully driverless vehicles, with an impressive compound annual growth rate (CAGR) of 55%, reaching US$2.8 billion in 2024. The lidar market for vehicles with advanced driver-assistance systems (ADAS) could exhibit twice that growth, an astounding 113% CAGR, garnering US$1.4 billion by that time. Other applications expected to benefit include drones, security, 3-D terrestrial and aerial topography, industrial automation, mining, tracking, wind energy and agriculture.
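
As a back-of-the-envelope check on such projections, the compound annual growth rate follows directly from the endpoint values. A quick sketch using the overall-market figures cited above:

def cagr(start_value, end_value, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end_value / start_value) ** (1 / years) - 1

# Overall lidar market: US$1.3 billion (2018) to US$6.4 billion (2024)
print(f"{cagr(1.3, 6.4, 6):.1%}")  # roughly 30% per year, nearly five-fold overall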

Tesla vs. the world?

Despite lidar’s projected growth and a beehive of focused development, Tesla’s Elon Musk sparked headlines in April with his pronouncement that “lidar is a fool’s errand” on the road toward fully autonomous vehicles. In fact, the eccentric billionaire dismissed the technology as “expensive” and “unnecessary,” adding that “anyone relying on lidar is doomed. Doomed.”

Musk espouses the idea that—with the help of eight strategically placed, inexpensive cameras, a front-facing radar and ultrasound sensors—the data trove collected over millions of miles of driving will eventually enable artificial intelligence (AI) systems, neural networks and machine learning to make driving decisions as well as humans. Musk has even suggested that Tesla may launch “full self-driving” features, including responsiveness to traffic lights and stop signs on urban streets, by the end of this year via a remote software update.

Musk’s version of full self-driving, though, comes up short of current industry standards. And Tesla’s “Autopilot” system—a somewhat misleading name, given that the system is only semi-automated and requires driver awareness—has been linked to several high-profile fatal accidents.

The vast majority of car companies don’t seem to share Musk’s views on lidar, and are instead well along in the development of ADAS, incorporating lidar as a critical part of a fail-safe system in which no failures can be tolerated. Even if it takes decades for regulations, infrastructure and other sensor modalities to evolve to accommodate fully robotic vehicles, many automakers have targeted 2021 for self-driving debuts. In spite of the daunting task, the transportation transformation is already in full swing.

figureLuminar’s lidar system acquires a 3-D point cloud (250 points per square degree) in real time via a 1550-nm amplified fiber laser and a custom InGaAs chip. [Luminar]

Multiple players

Autonomous Intelligent Driving (AID; Munich, Germany), a subsidiary of automaker Volkswagen’s Audi AG, has collaborated in the past year with several lidar developers on ADAS and autonomous vehicles, including Luminar (Palo Alto, Calif., USA), Valeo (Paris, France) and Aeva (Mountain View, Calif., USA). In an industry first, Audi announced in late 2017 the incorporation of lidar into the front bumper of its high-end 2019 Model A8, in addition to radar and sonar sensors, an infrared camera, and front and rear cameras that provide a 360-degree view of the nearby surroundings.

The multiple sensors are integrated and controlled by the company’s zFAS processor, the effective nerve center that leverages AI and machine learning for perception and computer interfacing. The technology enables a much-touted Traffic Jam Pilot (TJP), the first feet-, hands-, and eyes-off “Level 3” (L3) autonomous system (see “Levels of driving automation,” below) that can handle in-lane braking, steering and accelerating from standstill at speeds up to 60 kph (37 mph). While Germany approved on-road testing of L3 vehicles in 2017, and other EU regions are working on it, a lack of coherent federal regulations caused Audi to defer rollout of the TJP option in the U.K. and the U.S.

figureLevels of driving automation.

U.S. drivers of the 2019 Audi A8 will still see the first lidar available in a commercial vehicle, albeit in the form of an L2 (feet-off, hands-off) adaptive cruise-control system that steers, accelerates and fully brakes while the driver watches. Audi plans to offer the A8 technology as a future option in its Q8, A7 and A6 models, and has announced long-term plans to incorporate a lidar-enabled system for L5 full autonomy by 2021. Other vehicle brands planning to incorporate lidar in upcoming models include BMW, Ford, GM Cruise, Honda, Hyundai, Nissan, Mercedes, Porsche, Toyota, Volkswagen and Volvo.

figureAudi’s laser scanner is built into the front bumper of the 2019 Model A8. The system uses a mirror rotating at 750 rpm to scan an eye-safe 1550-nm beam over a 145-degree FOV out to 80 m. The results are fused with those of other sensors on the car to measure distance to objects around the vehicle. [Audi AG]

Some companies—Ford, Jaguar and Volvo among them—aim to skip L3 automation altogether, in which a human remains at the wheel and is ultimately responsible for re-engagement. Skipping straight to L4 (mind-off), in which the driver can focus on other tasks until the vehicle exits its autonomous route, would decrease the risk inherent in a distracted driver. Even so, analysts estimate that fully autonomous L5 vehicles (in which drivers can essentially take a nap) may not be commonplace on the roads for 15 to 20 years.

In April 2017, Google’s self-driving car affiliate, Waymo (Mountain View, Calif., USA), began limited testing of a self-driving taxi service in the metro Phoenix, Ariz., USA, area. In May of this year, the company announced a partnership with the ride-sharing company Lyft, under which a small number of Lyft users can select a Waymo self-driving vehicle from the Lyft app for eligible rides. Notably, ride-sharing companies Uber and Lyft have yet to turn a profit, and their long-term success depends on eliminating paid human drivers.

Waymo is also testing a fleet of lidar-enabled autonomous commercial Class 8 trucks in California, Georgia and Arizona. A fleet of 600 Waymo vehicles is now in operation nationwide, incorporating Waymo’s Laser Bear Honeycomb sensor, a large car-top scanning lidar system with a 360×95-degree FOV. Uber and Baidu also plan to launch robotic-vehicle programs by 2021.

figureWaymo has begun testing of a lidar-equipped driverless ride-sharing service in Phoenix, Ariz., USA (left), and of automated Class 8 trucks (with drivers on standby in the truck) on highways in California and Arizona. [Waymo]

Next-gen lidar mileposts

In pursuit of the automated car, dozens of start-ups have recently emerged, each with a different sensing approach, to develop 3-D (point cloud) lidar sensors with a smaller footprint, lower cost, longer range, lower noise and better resolution than today’s commercially established lidar systems. Industry analysts estimate that some 70 next-gen lidar startups currently exist.

Many companies on the scene are keeping so tight a lid on what they’re doing that it’s difficult to glean the progress of the various approaches, much less the actual performance specs. News reports reveal that numerous upstarts are in prototype stage, quite a few are venture-backed and pairing up with Tier 1 partners, and a handful are beginning to usher products into production.

Self-driving startup Aurora (Palo Alto, Calif., USA) emerged from stealth mode in early 2018, when Hyundai and Volkswagen announced a collaboration with the company. A year later, the company received US$530 million in Series B funding. In May, it announced plans to acquire lidar manufacturer Blackmore (Bozeman, Mont., USA), whose lidar technology uses 1550-nm eye-safe lasers, frequency-modulated continuous-wave (FMCW) distance measurement and mechanical scanning.
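
Unlike pulsed time-of-flight sensing, FMCW lidar chirps the laser frequency linearly and mixes the returning light with a copy of the outgoing light; the resulting beat frequency encodes range, and its Doppler shift encodes relative velocity. A minimal sketch of the range relationship, with illustrative chirp parameters not specific to Blackmore’s design:

C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_hz, chirp_bw_hz, chirp_time_s):
    """Range from the beat frequency of a linear-chirp FMCW lidar.

    beat = slope * delay, where slope = bandwidth / chirp time and
    delay = 2R / c, so R = c * beat * chirp_time / (2 * bandwidth).
    """
    return C * beat_hz * chirp_time_s / (2.0 * chirp_bw_hz)

# Illustrative chirp: a 1-GHz sweep over 10 microseconds; a 66.7-MHz beat
# then corresponds to a target roughly 100 m away.
print(fmcw_range_m(66.7e6, 1.0e9, 10.0e-6))  # ~100 m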

In March, Lumotive (Bellevue, Wash., USA), a company backed by Bill Gates and Intellectual Ventures, surfaced from secrecy with a novel lidar design featuring 905-nm lasers and a silicon-based chip incorporating waveguides, phase shifters and tunable liquid-crystal metamaterial antennas. The company is currently working on the prototype and plans to ship samples by the end of 2019.

Sense Photonics (Durham, N.C., USA) stepped out of the shadows in June with US$26 million in funding to build a prototype solid-state lidar emitter using flash illumination and a large, convex, flexible array of thousands of tiny lasers. The company also announced a partnership with Infineon and an unnamed Tier 1 automaker.

Lidar startup Ouster (San Francisco, Calif., USA) received US$60 million in March and announced plans to start shipping by year-end its mechanically scanning system, which pairs multi-beam flash-type illumination with two custom CMOS chips—one with 64 vertical-cavity surface-emitting lasers at 850 nm and the other with 64 single-photon avalanche detectors (SPADs). In May, the company announced a two-week lead-time guarantee for delivery of the first two sensors of any customer’s order.

Another lidar startup, Innoviz (Rosh Ha’ayin, Israel), announced in April 2018 a partnership with BMW to produce solid-state lidar for the German luxury vehicles to reach autonomy by 2021. In June, the company closed a Series C round of venture funding worth US$170 million.

The renowned Nanophotonics Group of OSA Fellow Michal Lipson at Columbia University, N.Y., USA, has spun out a silicon photonics lidar company called Voyant Photonics, backed by a US$4.3 million venture round along with continued funding from the Defense Advanced Research Projects Agency (DARPA) under the Modular Optical Aperture Building Blocks (MOABB) program. The Voyant approach uses FMCW and an optical phased array (OPA) for on-chip beam steering.

The company’s primary objective is to shrink lidar onto a chip that fits on a fingertip and can scale to affordable volume production. “Lidar systems today are large and expensive,” says Lipson. “But when you integrate things on a chip, the cost and size are drastically reduced.”

What’s the buzzword?

The common theme in all this behind-closed-doors maneuvering is integrated lidar—that is, squeezing all the components into a chip-scale 3-D lidar system with as few moving parts as possible. Whatever the application, the fewer moving parts to manage, the more reliable the system. To adapt systems to the level of cost and reliability expected in the automotive industry, lidar companies including Velodyne are jumping on the integrated-lidar bandwagon. In addition to improved reliability, smaller sensors with fewer moving parts can more easily be fused into a single control system with other sensors, as well as attractively streamlined into a bumper.

Luminar developed its own photonic integrated circuit (PIC) with a 1550-nm eye-safe amplified fiber laser and a tiny custom InGaAs avalanche photodiode (APD) as the receiver. After partnering on the chip design with Black Forest Engineering (Colorado Springs, Colo., USA), Luminar acquired the company in 2018. The packaged chip enables a pixel density of 250 points per square degree down to the horizon, which approaches camera vision. The single laser/receiver architecture greatly improves the potential economics of the system, which could cost an unprecedented US$3 per receiver to build. This cost–performance–scalability trifecta, if realized, would enable lidar to be not only a powerful part of the sensor suite, but the most powerful sensor in the vehicle—in fact, the primary one.

Luminar has recently announced partnerships with Toyota Research Institute, Volvo and Audi to design long-range, chip-based 3-D lidar systems for fully autonomous vehicles by 2021. Currently, AID is conducting tests of an autonomous fleet on roads in Munich with Luminar’s lidar sensors.

If no moving parts is the goal, the ultimate Holy Grail is a fully solid-state, chip-level laser scanner leveraging very low-cost silicon photonics chip fabrication. According to Louay Eldada, CEO of lidar company Quanergy (Sunnyvale, Calif., USA), true solid state means keeping the cost low via well-understood, mature semiconductor fabrication and packaging capabilities. A solid-state lidar system might involve silicon-based chips for the emitters, receivers and processing, each with photonics embedded on the same chip as the application-specific integrated circuits (ASICs).

The disadvantage of forgoing mechanical scanning is that a solid-state lidar’s horizontal FOV is typically 120 degrees or less; aesthetic integration of sensors inside the body of the vehicle further limits the FOV to approximately 100 degrees. But chips are cheap. A solid-state lidar in automotive applications could reach 360 degrees if desired, via multiple sensors on various parts of the car.

A very promising design for a solid-state lidar system involves a technique lifted from radar: the phased array. An OPA starts with a single laser source split into an array of beams that interfere to create a beam-formed signal; adjusting the phases of the individual beams steers that signal. The OPA approach uses a laser source integrated into a PIC for fast scanning of low-divergence, wide-angle beams, and can be fabricated in a CMOS foundry.
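
The steering relationship itself is the one radar engineers have long relied on: for a uniform array with a fixed element pitch and a constant phase step between neighboring emitters, the beam exits at the angle where the per-element path differences cancel. A short sketch with hypothetical parameters:

import math

def opa_steering_angle_deg(wavelength_m, pitch_m, phase_step_rad):
    """Steering angle of a uniform optical phased array:
    sin(theta) = wavelength * phase_step / (2 * pi * pitch)."""
    return math.degrees(math.asin(
        wavelength_m * phase_step_rad / (2.0 * math.pi * pitch_m)))

# Hypothetical values: 905-nm light, 2-micron emitter pitch,
# quarter-cycle (pi/2) phase step between adjacent emitters:
print(opa_steering_angle_deg(905e-9, 2e-6, math.pi / 2))  # ~6.5 degrees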

In November 2018, Quanergy began manufacturing its integrated solid-state lidar sensor, which uses inexpensive 905-nm laser diodes steered via an OPA as the emitter. A SPAD receiver detects the returning signal, and the processor calculates distance to an object from its time of flight (ToF). Quanergy is currently developing multiple grades of solid-state sensors, with ranges from 0 m (no blind spot) out to 150 m. The emitter suffers high absorption loss as the beam passes through the OPA and the chip itself, and a longer range requires a stronger signal to reach the detector; the biggest challenge, then, is minimizing insertion loss to extend the range.

With its current outdoor lidar sensor range of 50 m, Quanergy’s technology may not be prepped to be the primary sensor in autonomous driving. Still, the company is working with Tier 1 automotive partners on short-range applications.

figureQuanergy’s mechanical lidar is used to monitor security, count people and precisely track wait times in airports and in other public spaces with varying levels of illumination across 5,000 sq. ft. (up to 200-m range indoors), with a 360×20-degree FOV. Future lidar systems will be solid-state, on-chip systems. [Quanergy]

Beyond automotive

Even with powerful partnerships, it takes time to earn third-party certifications and regulatory compliance, which can deplete all those hard-earned venture dollars. While next-gen lidar prepares to leap the high bar set for automotive safety, many companies are turning to other, less-demanding lidar applications beyond autonomous vehicles to grow their business.

Citing the need to spur growth and propel business, Waymo announced in March that it wants to make its Laser Bear Honeycomb lidar system available to companies in robotics, security and agriculture. Quanergy, too, has announced several projects with non-automotive partners. The company has partnered with UAV-based 3-D mapping company LiDAR USA (Hartselle, Ala., USA) to develop a mapping system for drones that use lidar to navigate during flight. On the security and surveillance front, Quanergy began testing lidar systems with local law enforcement agencies in Texas and New Hampshire in 2018. Also, the U.S. Department of Homeland Security granted Quanergy US$200,000 to develop lidar to provide security and tracking of wait times in airports.

According to Eldada, Quanergy has always intended to build lidar for applications beyond automotive. “Probably the most exciting application for me beyond automotive is robotics,” said Eldada. “Our customers contact us with new applications almost every week. It’s an exciting, growing space.”

In a February lecture at the Massachusetts Institute of Technology, USA, entitled “Deep Learning for Self-Driving Cars,” Waymo’s director of engineering Sacha Arnoud invoked an adage well known in coding circles, the “90-90 rule”: “When you’re 90% done, you still have 90% of the work left.” According to the saying, when a project appears 90% complete, the last 10% will require 10 times the capabilities, 10 times the team size, 10 times the level of testing and 10 times the overall quality of the system. The adage pertains nicely to the effort of attaining widespread self-driving cars—90% might get you a working demo, but the last 10%, getting an industrialized product on the road, is a major leap.

Even as the sector refines next-generation lidar technology, some of the biggest challenges facing robotic vehicles lie in effectively managing and applying data to develop the AI part of the system. Mastering that task is a major goal for safety, which would help smooth the way to removing regulatory and legal barriers. Smooth or not, the investors and tech players have gone all-in on full autonomy.


Valerie C. Coffey is a freelance science and technology writer and editor based in Palm Desert, Calif., USA.