
Sizing Up Markets for Autonomous Systems


Alexis Debray of Yole Développement at FiO+LS 2019. [Image: Alessia Kirkland]

In a Monday-morning session at Frontiers in Optics+Laser Science 2019, representatives from a market-research firm and an autonomous-systems trade organization offered differing perspectives on the emerging markets for such systems. The two talks agreed on one thing, however: There’s a lot of potential growth out there.

Sensors for self-driving vehicles

The first speaker, Alexis Debray, from the semiconductor technology market-intelligence firm Yole Développement, Lyon, France, addressed the market for sensors in autonomous vehicles.

Debray limited the scope of his talk to four kinds of autonomous systems: advanced driver-assistance system (ADAS) cars, robotic cars, autonomous trucks and flying taxis—leaving out other systems under active development such as delivery robots, autonomous cranes and autonomous marine vessels. In addition, he focused only on four sensor systems bearing on optical technology: lidar, visible-light cameras, far-infrared cameras and inertial measurement units (IMUs) built around gyroscopes—passing over non-optical sensor systems, such as radar, that could also be important in autonomous vehicles.

Two roads to autonomy

Even that limited scope left Debray plenty to talk about, however. He began by noting that four macro-trends in the auto industry—the move toward electric cars; ridesharing; increased vehicle connectivity (particularly in 5G networks); and a tilt toward active-safety systems—all are combining to push autonomous driving forward.

Debray sketched out two paths being pursued to get to that autonomous-vehicle future. One—which he called the “classical” road, largely adopted by traditional car manufacturers—is gradual implementation of ADAS features, moving cars up the chain to full “Level 5” autonomy over time (see “Lidar for Self-Driving Cars,” OPN, January 2018). The other road, followed by companies like Waymo, Uber and Tesla, seeks to leapfrog the intervening steps and move directly to robotic vehicles that can be fully autonomous.

At present, the two approaches represent radically different-sized markets, Debray said—100 million cars per year for ADAS carmakers, versus a few tens of thousands of robotic cars to date. And they also, he suggested, call for very different sensor systems.

Sensor by sensor: Lidar …


[Image: Alessia Kirkland]

Debray next drilled down into the various sensor systems, beginning with lidar, where he noted that “a lot of things are happening today.” These include Audi’s move last year to become the first car manufacturer to include lidar in an ADAS car, and Waymo’s decision this year to form an arm to commercialize the lidar systems it’s developing for its own autonomous vehicles. The ADAS and robotic approaches are pursuing different technologies, with different times to market, according to Debray.

Robotic vehicles are still using mainly mechanical lidars, but solid-state scanning lidar and “flash” lidar that captures a full scene in a single shot could come online in 2020 and 2021. Slightly longer term, a more advanced and potentially promising technology, optical phased arrays, could arrive on the market by 2025, Debray predicted.

For the lidar market as a whole, Yole Développement is looking for a blistering compound annual growth rate of 64%, with lidar revenues moving from US$216 million in 2018 to some US$4.2 billion by 2024. In Yole’s model, the ADAS carmakers will account for the lion’s share of the total revenue, but lidar for robotic cars will still show substantial growth off of a currently relatively small base.
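
For readers who want to see how those figures fit together, the short sketch below shows the standard compound-annual-growth-rate relation applied to the 2018 and 2024 lidar numbers quoted above. It is only an illustrative sanity check of the arithmetic—the function names and the six-year span are assumptions, not part of Yole’s model.

```python
# Illustrative check of the lidar CAGR figures quoted above (not Yole's model).

def cagr(start, end, years):
    """Compound annual growth rate implied by a start value, end value and span in years."""
    return (end / start) ** (1 / years) - 1

def project(start, rate, years):
    """Value reached after compounding `start` at `rate` for `years` years."""
    return start * (1 + rate) ** years

lidar_2018 = 216e6   # US$216 million in 2018
lidar_2024 = 4.2e9   # ~US$4.2 billion forecast for 2024

implied_rate = cagr(lidar_2018, lidar_2024, years=6)
print(f"Implied lidar CAGR, 2018-2024: {implied_rate:.0%}")              # ~64%, matching the quoted rate
print(f"Revenue at 64% CAGR after 6 years: ${project(lidar_2018, 0.64, 6)/1e9:.1f}B")  # ~$4.2B
```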

… and everything else

For visible-light cameras, meanwhile, Yole is looking for slower growth—on the order of a 16% compound annual rate, to US$6 billion by 2024. Much of that expansion, Debray said, may stem from an increase in the number of cameras per vehicle. ADAS cars were featuring five cameras per vehicle by the mid-2010s; Tesla has recently moved the total to eight, and Debray says the number could eventually rise to 11.

“There is a big difference in the camera technology used for ADAS cars versus robotic cars,” Debray added, with the automotive market favoring cameras developed specifically for cars, and the robotic-car makers tilting toward industrial and machine-vision cameras used in smart factories. In the latter approach, he said, the unit cost is higher—but the devices are more advanced and easier to calibrate, and could thus have a long-term edge.

Far-infrared cameras are an interesting wild card in the deck. Previously used only for passive monitoring, these cameras are now increasingly being pressed into service—especially by robotic-car makers—for use in distinguishing living organisms (pedestrians and animals) from inanimate objects. Increased adoption could push this market up at a compound annual rate of 34% over the next five years, in Yole’s model—but here, the total numbers are somewhat smaller, with overall sales expected to reach only US$299 million by 2025.

Finally, IMUs—ranging from silicon-MEMS devices, which currently sport very poor stability and are used mainly in consumer applications, to fiber-optic and ring-laser gyros with excellent stability—could experience 25% compound annual growth, reaching US$892 million by 2024. Debray suggested that the market could increasingly move toward silicon-MEMS platforms as their stability continues to improve.

Air, sea, land


David Klein of AUVSI at FiO+LS 2019. [Image: Alessia Kirkland]

The morning session’s second speaker, David Klein, is a research analyst at the Association for Unmanned Vehicle Systems International (AUVSI). Klein stepped back and took a broader look at the market for autonomous unmanned vehicles generally. These include systems in the air domain (drones and larger unmanned aerial vehicles), the maritime domain (ranging from remotely operated vessels to autonomous underwater and surface vessels), and the ground domain. The latter category, he said, includes mobile robots that do “the dull, dirty and dangerous jobs,” ranging from security to explosives detection to delivering food, as well as more complex systems like autonomous vehicles.

Vehicles in these three domains, Klein said, are sold into five broad markets—academic, civil, commercial, military and consumer—with many, many sub-markets in each. As an example of possible future growth in autonomous vehicles generally, he reviewed a number of use cases, such as e-commerce delivery, infrastructure inspection and humanitarian aid, that together could drive substantial growth of unmanned aerial vehicles in particular. The growth of these systems has also fueled substantial innovation activity; Klein noted that more than 80% of all patents in air-domain autonomous systems were captured in only the last three years.

What about the optics?

In a short Q&A session that followed the talks, Klein fielded a number of questions about what the growth of autonomous systems as a whole means for optical sensors in particular. He observed that sensors, including optical sensors, are one of the things that actually make a drone valuable. On its face, that suggests that optical sensors could capture a lot of the value of drone market growth.

But as the subsequent discussion brought out, the actual price premium that might be paid for the optics is apt to vary widely from application to application. Some high-value-add products could command premium prices for some applications—such as lidar for guidance of remote vehicles inspecting chemical plants, where crashes could prove costly and catastrophic—while behaving like commodity items in other application spaces. That could make the net impact of the growth of autonomous-vehicle platforms on the optics market a tough call. “It truly is on a case-by-case basis,” Klein said.


Publish Date: 16 September 2019
