
University of Illinois researchers, using a camera inspired by the polarization-sensitive eyes of mantis shrimp, have collected subsea information that could lead to a new form of underwater GPS. [Image: University of Illinois/ Viktor Gruev]

Two recent studies led by OSA Member Viktor Gruev, University of Illinois (USA), have put cameras inspired by the eyes of invertebrate animals into service in very different settings. In one study, the research team used a camera based on the polarization-sensitive eyes of the mantis shrimp to develop a potential new approach to underwater geolocation (Sci. Adv., doi: 10.1126/sciadv.aao6841). And, in the other, scientists showed that a multispectral imager based on the intricate compound eye of the blue morpho butterfly could be used to guide cancer surgery in real time (Optica, doi: 10.1364/OPTICA.5.000413).

Creating an underwater GPS

The geolocation study relied on the “Mantis Cam,” an imager developed last year by a team including Gruev, then-grad student Missael Garcia, and several other researchers (Optica, doi: 10.1364/OPTICA.4.001263). The camera—a compact device that can record color and polarization on a single chip—is based on the complex structures in the compound eyes of the mantis shrimp, known for its polarization-sensitive, hyperspectrally tuned vision.

In the course of collaborations with marine biologists using the new camera, Gruev’s team took polarization measurements in a variety of underwater settings worldwide. Back in the lab, the researchers, including grad student and lead author Samuel Powell, ran the numbers and determined that the diverse readings from the polarization camera related systematically to the elevation angle and compass heading of the sun at the measurement location. This raised the prospect that, with the right kind of controls, the camera could act as a sort of underwater GPS, providing location data from any submarine point on Earth.

Global geolocation with 61-km accuracy

To provide those controls, the team hooked up the Mantis Cam to an electronic compass and tilt sensor, and used the device to measure polarization conditions at undersea locations worldwide, across a range of depths, wind conditions and times of day, according to Gruev. The researchers found that the polarization data, controlled for these variables, allowed them to determine their position on the globe to an accuracy of as little as 61 km.
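The inversion step can be illustrated with a minimal sketch — not the team's actual algorithm. It assumes the sun's elevation and azimuth have already been estimated from the underwater polarization pattern, and it recovers location by brute-force search against a coarse textbook solar-position model (Cooper's declination formula, equation of time ignored). The function names `solar_angles` and `locate` are hypothetical:

```python
import math

def solar_angles(lat_deg, lon_deg, day_of_year, utc_hours):
    """Approximate solar elevation and azimuth (degrees, azimuth clockwise
    from north). Uses Cooper's declination formula and ignores the equation
    of time -- degree-level accuracy, enough for a sketch."""
    decl = math.radians(23.44 * math.sin(2 * math.pi * (284 + day_of_year) / 365))
    hour_angle = math.radians(15.0 * (utc_hours + lon_deg / 15.0 - 12.0))
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    el = math.asin(max(-1.0, min(1.0, sin_el)))
    cos_az = ((math.sin(decl) - math.sin(el) * math.sin(lat))
              / max(1e-9, math.cos(el) * math.cos(lat)))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:                     # afternoon: sun is in the west
        az = 360.0 - az
    return math.degrees(el), az

def locate(meas_el, meas_az, day_of_year, utc_hours, step=1):
    """Brute-force inversion: scan a latitude/longitude grid for the point
    whose predicted sun position best matches the measured angles."""
    best, best_err = None, float("inf")
    for lat in range(-60, 61, step):
        for lon in range(-180, 180, step):
            el, az = solar_angles(lat, lon, day_of_year, utc_hours)
            d_az = abs((az - meas_az + 180.0) % 360.0 - 180.0)  # wrap azimuth
            err = (el - meas_el) ** 2 + d_az ** 2
            if err < best_err:
                best, best_err = (lat, lon), err
    return best
```

Because the forward model and its inverse here share the same assumptions, feeding `locate` the angles predicted for a known spot recovers that grid point; real measurements would instead supply sun angles estimated from the polarization camera, with accuracy limited by depth, surface waves and sensor calibration.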

While the accuracy of this proof of concept doesn’t compare with that of the modern terrestrial GPS, the researchers write that the work does establish that “underwater polarization angles can reasonably serve as a solar compass for animals with polarization-sensitive vision and can also be used to determine global location.” Powell, in a press release, suggested that as the accuracy of the system is improved, it could be used in underwater efforts to locate missing aircraft, and even for remote sensing by swarms of undersea robots, fitted with polarization sensors to provide location data.

Inspiration from a butterfly’s eyes


A team of scientists from the University of Illinois and Washington University in St. Louis has developed a chip-based multispectral imager based on the eyes of morpho butterflies, and has shown that the device can be used to provide real-time guidance during cancer surgeries. [Image: Alex Jerez Roman and Jose Luis Vazquez]

Back on land, Gruev, Garcia and colleagues were busy developing another bio-inspired device—a multispectral imager based on the photonic-crystal structures in the eyes of morpho butterflies—and exploring its use in a decidedly non-marine setting, the surgical suite.

The blue morpho is justly celebrated for its wing scales, which give the animal its beautiful iridescence. Less well known is the complexity of its compound eye, tuned over half a billion years of evolution to acquire and integrate multispectral information. The secret of these abilities lies in photonic crystals, known as tapetal filters, within the insect’s eyes: alternating layers of air and cytoplasm that act as interference filters sensitive to different parts of the spectrum.
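The filtering action of such an alternating stack can be sketched with the standard transfer-matrix method for thin films. The parameters below are illustrative assumptions, not measured morpho anatomy: air at n = 1.00, cytoplasm at n ≈ 1.34, and a quarter-wave design tuned to 450 nm:

```python
import cmath

def reflectance(wavelength_nm, layers, n_in=1.34, n_out=1.34):
    """Normal-incidence reflectance of a thin-film stack, computed with the
    standard transfer-matrix method. `layers` is a sequence of
    (refractive_index, thickness_nm) pairs, light entering from n_in."""
    m00, m01, m10, m11 = 1.0, 0.0, 0.0, 1.0
    for n, d in layers:
        delta = 2 * cmath.pi * n * d / wavelength_nm      # phase thickness
        c, s = cmath.cos(delta), cmath.sin(delta)
        a00, a01, a10, a11 = c, 1j * s / n, 1j * n * s, c
        m00, m01, m10, m11 = (m00 * a00 + m01 * a10, m00 * a01 + m01 * a11,
                              m10 * a00 + m11 * a10, m10 * a01 + m11 * a11)
    num = n_in * m00 + n_in * n_out * m01 - m10 - n_out * m11
    den = n_in * m00 + n_in * n_out * m01 + m10 + n_out * m11
    return abs(num / den) ** 2

# Hypothetical quarter-wave stack: 6 air/cytoplasm pairs tuned to 450 nm,
# embedded in cytoplasm on both sides
lam0 = 450.0
stack = [(1.00, lam0 / (4 * 1.00)), (1.34, lam0 / (4 * 1.34))] * 6
```

Even with the modest air/cytoplasm index contrast, a few pairs reflect strongly at the design wavelength while transmitting wavelengths well outside the stop band — the spectral selectivity that makes each tapetal filter sensitive to a different part of the spectrum.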

The Gruev-led team was able to mimic these structures in stacks of dielectric and silicon-based photosensitive materials to create a single-chip multispectral imager. According to the team, the imager can capture both color and near-infrared fluorescence with high accuracy and sensitivity, even under bright ambient lighting; weighs in at only 20 g; and could have a manufacturing cost as low as US$20 apiece.

Detecting tumors during surgery

Those advantages could, according to the team, prove significant in a key application—image-guided surgery. The researchers note that, in breast cancer removal and other procedures, surgeons often rely on sight and touch to determine what to remove, a notoriously unreliable approach that often requires patients to come back for another round of surgery to make sure the whole tumor has been caught. Using fluorescent markers that bind to tumors as a beacon for detecting their location and extent is an emerging alternative. But this requires large, expensive equipment, and the weak fluorescent signal tends to be drowned out in the bright lights required in an operating room.

To test out the morpho-camera alternative, the researchers integrated the device into low-cost surgical goggles, and assessed its ability to pick up fluorescently labeled tumors in mice and sentinel lymph nodes (SLNs) in human breast cancer patients. The bio-inspired imager was reportedly able to identify tumors and SLNs under full surgical illumination, with accuracy comparable to or better than that of much larger, more expensive, non-real-time systems.

“Under bright surgical lights, our instrument was a thousand times more sensitive to fluorescence than the imagers currently approved for infrared image-guided surgery,” Gruev noted in a press release. The researchers—who include scientists from Washington University in St. Louis and the University of Arizona, USA, as well as the University of Illinois—are forming a startup company to commercialize the new imager and move it into clinical trials.