Nitish Padmanaban at FiO+LS 2019. [Image: Alessia Kirkland]

The technology for augmented vision and virtual reality is undeniably cool. But for some, the question “Yes, but what is it really good for?” always seems to be lurking in the background.

At a Wednesday morning session at Frontiers in Optics+Laser Science 2019, “Applications of AR/VR,” Nitish Padmanaban, a Ph.D. candidate in the Computational Imaging Lab of Gordon Wetzstein at Stanford University, USA, offered one practical answer. He did so via a walk-through of the lab’s work creating “Autofocals”—computer-aided glasses that, the team believes, might someday help vision-impaired humans regain the ability to dynamically focus on objects near and far.

Targeting presbyopia

Padmanaban’s work focuses on presbyopia—the loss of the eye’s ability to accommodate, or refocus, when shifting gaze between points at different distances. The condition, tied to hardening of the crystalline lens with age, affects some 20% of people worldwide—and the solutions tried thus far have tended to be imperfect at best.

Progressive bifocals, for example, enable refocusing across a continuous range of distances. But the glasses take a long time to get used to, require tilting the head to find the lens’ sweet spot for a particular object, and distort the image at the edges of the lens. Another solution, so-called monovision, corrects one eye for far focus and the other for near focus, relying on the brain to fuse the two views—an approach that can fall apart at close range.

Eye tracking, depth sensing and liquid lenses

A better approach, Padmanaban said, would be to “have things work as your eyes are working”—that is, a system that would automatically refocus the view when the viewer switches from looking at near to far objects and back. The Stanford lab’s Autofocals unit attempts to do that through a combination of optical elements and computational power.

The system, currently a rather bulky headset, starts with an eye tracker to dynamically ascertain where the wearer is looking. To this, the team added a RealSense depth camera, whose readings combine with triangulation of the eye tracker’s gaze data to get a fix on the distance to the object in view.
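The triangulation step can be illustrated with a minimal sketch. This is not the team’s actual code—just a hypothetical vergence-based estimate, assuming the eye tracker reports the angle between the two eyes’ gaze rays and that the wearer fixates symmetrically straight ahead:

```python
import math

def fixation_distance_m(ipd_m: float, vergence_rad: float) -> float:
    """Estimate the distance to the fixated object from the vergence
    angle between the two eyes' gaze rays. For symmetric fixation,
    distance ~ (IPD / 2) / tan(vergence / 2)."""
    return (ipd_m / 2.0) / math.tan(vergence_rad / 2.0)

# Example: a 64-mm interpupillary distance and ~3.7 degrees of
# vergence put the fixated object roughly 1 m away.
d = fixation_distance_m(0.064, math.radians(3.66))
```

In the actual device, a depth-camera reading at the gaze point can cross-check or replace this estimate, since vergence-only depth grows noisy at longer distances.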

The next element—“where the magic happens,” Padmanaban said—is a pair of liquid lenses that can be dynamically reshaped based on input data from the sensing systems. Finally, the system includes a coma corrector to offset specific aberrations in the liquid lenses, and slots for offset lens holders to adjust to individual vision variations.
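The control logic that drives such a lens reduces, at its core, to converting fixation distance into optical power. As a hedged sketch (the function name and the 3-diopter lens range are illustrative assumptions, not details from the talk):

```python
def lens_power_diopters(distance_m: float, max_power_d: float = 3.0) -> float:
    """Convert fixation distance to the added optical power a tunable
    lens must supply for an eye that can no longer accommodate:
    P = 1 / d (diopters), clamped to the lens's achievable range."""
    return min(1.0 / distance_m, max_power_d)

# Reading at arm's length (0.5 m) needs +2 D; very near objects
# saturate at the lens's maximum power.
near = lens_power_diopters(0.5)    # 2.0 D
clamped = lens_power_diopters(0.2) # clamped to 3.0 D
```

Any per-user refractive error would be handled separately—which is the role of the offset lens holders in the prototype.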

Dynamic refocusing

Much of the team’s work has involved iteratively solving problems on this hybrid system, to get ever closer to a device that can seamlessly refocus as the wearer’s eyes dart to different parts of the scene. Challenges, according to Padmanaban, included mapping the frame rate of the depth camera to the faster rate of the eye tracker, and compensating for the effects of gravitational settling on the liquid lenses—the reason for the added coma corrector.

The end result, however, appears to be a system with a remarkable ability to refocus almost instantaneously. Padmanaban shared the results of several user studies, including eye-chart tests, contrast-sensitivity tests, and dynamic tests for speed and accuracy of refocusing; in all cases, he reported, the Autofocals performed better than progressive bifocals.

A user wearing the prototype Autofocals device in a test. Team member Nitish Padmanaban expects to soon co-launch a start-up to address size and form-factor issues and move the prototype toward commercialization. [Image: Nitish Padmanaban, Stanford University]

Toward commercialization?

After the talk, Padmanaban told OPN that the work originally grew out of other efforts of the Wetzstein group in virtual reality—particularly efforts to solve the vergence–accommodation conflict, a long-standing problem in VR systems. Some solutions that worked fine for younger subjects, it turned out, did not work for older people, who tended to be presbyopic.

“It kind of naturally evolved from there,” Padmanaban said. “If we’re using optics in VR to solve for a problem that’s oh-so-similar to presbyopia—but kind of flipped—maybe we can turn it around.”

The remaining hurdle for the system (and it’s a big one) appears to be comfort and convenience. The current system is a bulky headset including not only the eyewear but power and processing units, and the tests were conducted with the unit physically linked to a computer—not hugely practical for everyday use. Padmanaban said that, in partnership with a coauthor on the work, Robert Konrad, he expects to soon launch a start-up to work on issues of power, weight and vergence tracking to move the system closer to commercialization.