Computing’s Future, Along Three Paths

[Image: Heike Riel, speaking at FiO+LS 2018.]

In the opening talk at the Tuesday morning plenary session of the 2018 Frontiers in Optics­­/Laser Science meeting in Washington, D.C., the focus was squarely on the future—and, in particular, on the computers that will help drive it. Heike Riel of the IBM Research Frontiers Institute in Switzerland laid out a number of different pathways along which the basic hardware technology of computers continues to evolve, to meet a continually changing suite of needs.

Keeping the performance gains going

Riel—who apologized to the optically predisposed plenary-session audience that her talk would be “more about electrons than photons”—began her look at the future with a glance at the past, and in particular at the jaw-dropping growth in computer power, efficiency and cost-effectiveness made possible by increased integration.

In 1958, she pointed out, the first integrated circuit had two transistors; the IBM POWER9 chip, unveiled last year, sports more than 5 billion transistors. But since 2005, she added, “smaller” no longer necessarily equals “faster and cheaper,” which means that “we have to use a lot of tricks” to get to the next level of performance.

Making those performance leaps, Riel suggested, can happen via a number of avenues. One is moving to hybrid or 3-D architectures on the chip itself—such as FinFET architectures, in which the transistor channel is extended from 2-D into a 3-D, nanometer-scale fin (enabled, Riel noted, by the optical technology of lithography). To get to even greater integration, IBM is looking at stacked silicon nanosheet transistors, with thicknesses of 5 nm and gate lengths of 12 nm. And integrating III-V materials onto silicon substrates will enable even more functionalities, including electro-optical and photonic components on chips.

Neuromorphic architectures—and their demands

Beyond increasingly refined systems using existing technologies, Riel said, continued progress will come with new computing architectures—especially the “neuromorphic” computing architectures that are springing up to empower deep-learning and machine-learning applications.

[Image: computing data center. Getty Images]

These latter architectures, she noted, and the workloads they support, involve some interesting trade-offs relative to more traditional computing workloads. While deep learning deals with tens of petabytes of data, for example, the applications tend to be “noise tolerant,” so they’re potentially amenable to more flexible, low-precision approaches that could reduce overhead. Indeed, reduced-precision or mixed-precision approaches for deep learning can, Riel said, buy tenfold to fiftyfold efficiency gains. Riel and her colleagues are currently working with a variety of approaches, including phase-change analog memory elements, to speed up matrix multiplication, a major bottleneck in these deep-learning systems.
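To give a rough sense of why that noise tolerance matters, here is a minimal NumPy sketch (an illustration of the general idea, not IBM's hardware approach): it compares a half-precision matrix multiply, the operation Riel identifies as the bottleneck, against a single-precision reference. The matrix sizes and data are arbitrary assumptions.

```python
import numpy as np

# Illustrative sketch: how much accuracy does a reduced-precision
# matrix multiply actually give up on random data?
rng = np.random.default_rng(0)
A = rng.standard_normal((512, 512)).astype(np.float32)
B = rng.standard_normal((512, 512)).astype(np.float32)

ref = A @ B                                           # full-precision (float32) reference
low = A.astype(np.float16) @ B.astype(np.float16)     # reduced-precision (float16) product

rel_err = np.linalg.norm(ref - low.astype(np.float32)) / np.linalg.norm(ref)
print(f"relative error from float16 multiply: {rel_err:.2e}")
```

For workloads that can absorb a small relative error of this kind, the payoff is the kind of large efficiency gain Riel describes, since lower-precision arithmetic needs less memory bandwidth, less energy and less silicon per operation.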

Quantum computing, for scientists and the masses

No picture of future computing would be complete, of course, without quantum computing. Riel views quantum as a complementary architecture to the conventional and neuromorphic approaches—and one that might be the only way to get a purchase on “hard” or intractable problems for machine learning and cryptography, optimization problems (such as the proverbially hard “traveling salesman” problem), and simulating dense quantum-mechanical systems.

“Quantum computing is now at a stage where it can solve fundamental problems,” said Riel. “We’ve reached a point where exciting progress is being made.” But it’s important, she added, that people not just focus on the number of qubits that a system can support. More important is the system’s “quantum volume”—which she defined as “the useful amount of computing that can be done with a device before error masks the result.” So achieving long qubit coherence times (on the order of a millisecond), high connectivity, and fast gate speeds will all be important—and these are all things, Riel noted, that IBM is working on.
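Riel's loose definition can be illustrated with a toy calculation. The simple min(width, depth) rule and the device numbers below are assumptions made for illustration, not IBM's formal quantum-volume metric, but they capture the intuition that more qubits only help if the device can also run deep enough circuits on them before errors dominate.

```python
# Toy illustration (assumed model, not IBM's formal metric): a device's useful
# circuit size is limited both by how many qubits it offers and by how many
# gate layers it can execute before error masks the result.
def useful_circuit_size(max_reliable_depth):
    """Take the best min(n, d(n)) over candidate widths n, where d(n) is the
    deepest circuit on n qubits that still yields a trustworthy answer."""
    return max(min(n, d) for n, d in max_reliable_depth.items())

# Hypothetical device: wider circuits decohere sooner, so reliable depth
# drops as width grows.
device = {4: 16, 8: 9, 16: 5}
print("useful circuit size ~", useful_circuit_size(device))  # -> 8, not 16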

IBM is also, Riel added, active in bringing quantum computing to a mass audience, through its IBM Quantum Experience project—which she called “the first quantum computer on the cloud.” The Quantum Experience offers online access to quantum processors with up to 16 qubits—along with a “developer ecosystem” and educational and tutorial materials.
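For readers curious what that developer ecosystem looks like in practice, the open-source Qiskit SDK (the toolkit behind the Quantum Experience, though not named in the talk) lets a user build and run a small circuit in a few lines. The sketch below, written against the Qiskit API of that era, runs a two-qubit Bell-state circuit on a local simulator; running on the hosted hardware additionally requires an IBM Q account and backend selection, omitted here.

```python
from qiskit import QuantumRegister, ClassicalRegister, QuantumCircuit, execute, Aer

# Build a two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT 0->1.
q = QuantumRegister(2)
c = ClassicalRegister(2)
qc = QuantumCircuit(q, c)
qc.h(q[0])
qc.cx(q[0], q[1])
qc.measure(q, c)

# Execute on a local simulator; a cloud backend could be substituted here.
backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1024).result().get_counts(qc)
print(counts)  # roughly half '00' and half '11' outcomes
```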

The Quantum Experience has run continuously since May 2016; Riel said that, within the first two weeks after launch, a paper had been published that used the system. Since then, she added, more than 3 million experiments have run on the system—and more than 70 scientific publications have come out of it.

Publish Date: 18 September 2018
