figure[Getty Images]

The International Year of Light and Light-Based Technologies will underscore the role of optics and photonics in improving the quality of life worldwide. And for the delivery of medicine and healthcare, some of those improvements may come in a surprising package: the smartphone that you carry in your pocket or purse. Advances in processing power, connectivity and sensor technology are creating opportunities for smartphones as a cost-effective, portable alternative to expensive diagnostic equipment—one that can transform aspects of medical care in low-resource and field settings [see Viewpoint by Richards-Kortum and Carns]. In this article, we survey a few interesting developments in this smartphone science, particularly those relevant to care of an aging population and to applications in eye care.

The power of smartphones

Apart from their portability, which makes them a natural for medicine in the clinic or the field, four qualities of smartphones have opened up new opportunities in diagnostics and treatment:

Processing power. Today’s smartphones have truly become “the new personal computer,” and are comparable in processing power to the high-performance personal computer systems of only a half-dozen years ago and to the supercomputers of several decades ago.

Smartphones and sensors

Whether for monitoring the health of individuals in daily life or providing aid to people with disabilities, the smartphone gets its power from its ability to link wirelessly to sensors, process their information, and communicate that information through visual, audio or haptic (tactile) feedback. Near-field communication, currently used for payment transactions, is being explored for drug prescriptions and medical-office transactions; for actual connection to external devices and sensors, however, Bluetooth tends to be the channel of choice.

The sensors themselves are myriad, including (usually wearable) devices that measure critical health signals, such as heart, brain and muscle activity. Both Apple and Google have introduced development software (such as Apple’s HealthKit and Google Fit for Android) to facilitate the creation of health applications. Wearables such as the Apple Watch (paired with apps like iHealth) and Google’s Google Glass and Android Wear smartwatches connect their sensors to the central hub of the smartphone, are relatively inexpensive, and link via Bluetooth. Inexpensive sensor kits that can easily interface with a smartphone are also available, including body sensors (Bitalino, TrueSense), body-pressure-sensing pads (Sensing Tex), external cameras (Raspberry Pi Camera) and motion sensors (TI SensorTag).

In addition to these external sensors, smartphones come equipped with a suite of internal sensors: the camera, inertial instruments (accelerometer and gyroscope), GPS and an altimeter. GPS can localize a person anywhere in the world, while the inertial sensors can capture his or her motion and activity. Smartphones are also equipped with output capabilities, including the standard display screen, audio and tactile feedback. External communication modalities can include braille displays (for blind users) or virtual-reality devices such as the head-mounted Oculus Rift.

In addition to software development kits from Apple and Google, open-source software can be leveraged. For example, through the open-source computer vision library OpenCV, the smartphone camera can be used to develop software that processes images and video to build visual maps, localize the camera within a scene, detect motion and compute visual odometry, among other applications.
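As a minimal illustration of the kind of frame-by-frame processing such libraries perform, the sketch below flags motion by differencing consecutive grayscale frames. It is a pure-NumPy stand-in for the sort of pipeline OpenCV provides (e.g., absolute difference plus thresholding); the threshold values are illustrative assumptions, not tuned parameters.

```python
import numpy as np

def detect_motion(prev_frame, frame, threshold=25, min_fraction=0.01):
    """Flag motion when enough pixels change between consecutive grayscale frames.

    A minimal frame-differencing sketch; thresholds here are illustrative
    assumptions, not clinically or empirically tuned values.
    """
    # Widen to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > threshold)
    return changed / diff.size >= min_fraction

# Two synthetic 8-bit grayscale frames: static background, then a bright patch.
prev = np.zeros((120, 160), dtype=np.uint8)
cur = prev.copy()
cur[40:80, 60:100] = 200  # simulated moving object

print(detect_motion(prev, cur))   # motion between the two frames -> True
print(detect_motion(prev, prev))  # identical frames: no motion -> False
```

A real application would read frames from the camera stream and smooth or denoise them first, but the core decision (does enough of the image change between frames?) is the same.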

Ubiquity. By the end of 2014, according to the market research firm eMarketer, there were some 1.75 billion smartphone users worldwide, which creates the potential for profound worldwide penetration at the point of care.

Connectivity. Smartphones can connect with “the rest of the world” in many ways: through mobile/cellular data connections, local Wi-Fi, Bluetooth for communication with other nearby devices, and, most recently, near-field communication (NFC), which operates at a very short range for fast and secure data transmission. These global and local connectivity schemes allow smartphones to tie into the “Internet of Things”—and, in particular, into a range of sensors and output channels that can transmit vital data to and from the smartphone (see sidebar, “Smartphones and sensors,” at right).

Programmability. The final strength of smartphones, of course, lies in the apps—in the ability to use the platform’s programmability to combine commonly used tools for clinical evaluation and education into a single, easy-to-use, portable interface. Applications—which can target health care professionals, medical/nursing students, or the general patient population—have been developed to facilitate evidence-based medicine, mobile clinical communication, patient assessment and education, disease self-management and remote patient monitoring, as well as common administrative and billing tasks in the clinic.

A wide variety of smartphone applications, ranging from simple flashcards to virtual surgery applications, can provide surgical exposure and familiarization with common procedures. These apps are available for both Google Android and Apple iOS, with the latter (on iPhone and iPad platforms) more common in ophthalmological settings.

Serving an aging population

Care of an aging worldwide population has emerged as a significant societal challenge. According to the Population Division of the United Nations Department of Economic and Social Affairs, the global share of persons 60 years of age or older is expected to climb from 11.7 percent in 2013 to some 21.1 percent by 2050, with the absolute number of persons in the 60-or-older demographic more than doubling over that period, from 841 million to more than two billion. Further, this older population will increasingly reside in the less developed regions of the world, according to the U.N. And, of course, as populations age, the prevalence of non-communicable disease and disabilities, and the need for their care, will increase, putting new strains on an already overburdened health care system.

Smartphones and their interfaces can help serve the needs of this growing elderly population. Many operating systems already include assistive options such as larger text, audio volume control, and even text-to-speech and speech-to-text functionality that can help people with certain disabling conditions, such as presbyopia. Applications today can use the phone’s inherent capabilities for routine functions such as reminders to take medications (Pillboxie, MedSafe), tracking pain levels (CatchMyPain), assisting with hearing loss (Hearing Aid with Replay), tracking moods (MoodPanda), learning and communicating with sign language (MobileSign), and tracking blood glucose, carbohydrates and calories for diabetics.

This is only a small sample of applications available today; a wealth of others, yet to be conceptualized, might be envisioned that use the smartphone as the computer engine, both using capabilities within the phone itself and by interfacing with additional sensors and external networks. Fully realizing the potential of this platform for advancing safety, health and wellness of the disabled and elderly community will require careful attention to designing systems that actually address the unique interface needs of this demographic.


figureThe Near Eye Tool for Refractive Assessment (NETRA), used to diagnose ordinary refractive error. [MIT Media Lab]

Smartphones and distributed eye care

Perhaps not surprisingly for a device that prominently features a camera for obtaining visual images and increasingly bright, high-resolution screens for display, ophthalmology is emerging as an important practical area for smartphone science. And the prospects for smartphones in eye care offer an interesting window into the technology’s potential broader impact in other medical areas.

One simple example of that impact lies in the correction of refractive error, which is estimated to afflict two billion people worldwide. Relatively few of those patients have access to quality eye care, as existing solutions require a trained optometrist, expensive equipment, or both. As a result, an estimated 517 million persons have uncorrected near-vision impairment that affects their daily livelihood, and another 153 million have uncorrected far-vision impairment. Uncorrected refractive errors are the second leading cause of blindness (defined as vision worse than 3/60 in the better eye). And, globally, about 87 percent of those affected live in the developing world. The loss of productivity associated with uncorrected refractive error is estimated at US$88.7 billion to US$133 billion worldwide.

Smartphone science apps: Beyond eye care

In addition to ophthalmic applications, other fields of medicine are finding ways to use smartphones to improve and monitor patient health. Here’s a sample:

Colorimetrix: In development at the University of Cambridge, U.K., this app turns a smartphone into a portable spectrophotometer. It uses the camera and an algorithm to convert color-based test results (like those used for HIV, TB and malaria) into numerical values, and enables their transmission from the point of care to centralized labs or health care professionals.

NutriPhone: The NutriPhone prototype uses the camera and paper test strips to detect nutritional deficiencies. A test strip that runs an immunoassay plugs into the phone. The test strip changes color based on vitamin concentration, the camera records it, and an algorithm on the phone converts the data into a number used for diagnosis.

Phone oximeter: This device, currently in development, combines a low-cost finger sensor with a smartphone to provide the same functionality as a conventional pulse oximeter—measuring blood oxygen levels. Many operating rooms in the developing world do not have pulse oximeters, which are valuable in preventing anesthesia deaths.

Droplet lens: A team from the Australian National University uses gravity to make affordable smartphone lenses from transparent silicone. The new method could make microscopy more available to rural and developing areas—the lenses, which can magnify objects 160× with a resolution of 4 µm, cost less than one penny apiece.

iBGStar: Currently on the market, the iBGStar tracks blood glucose levels, as many other diabetes apps do, but it also integrates the actual glucose meter into the phone, allowing people with diabetes to easily communicate blood glucose levels to their health care providers.
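The color-to-number step at the heart of strip-reading apps like Colorimetrix and NutriPhone amounts to a calibration-curve lookup: reference strips of known concentration fix the curve, and an unknown strip is interpolated against it. The sketch below is a hypothetical illustration; the single-green-channel model and the calibration values are assumptions, not the apps' published algorithms.

```python
import numpy as np

def read_concentration(green_value, cal_green, cal_conc):
    """Convert a strip's mean green-channel value into a concentration.

    Assumes a monotonic calibration curve measured from reference strips.
    Illustrative only; real apps use more sophisticated color models.
    """
    # np.interp needs ascending x, so sort the calibration points first.
    order = np.argsort(cal_green)
    return float(np.interp(green_value,
                           np.asarray(cal_green, float)[order],
                           np.asarray(cal_conc, float)[order]))

# Hypothetical calibration: darker strip (lower green) = higher concentration.
cal_green = [200, 160, 120, 80]   # camera green values of reference strips
cal_conc = [0.0, 1.0, 2.0, 3.0]   # known concentrations (arbitrary units)
print(read_concentration(140, cal_green, cal_conc))  # -> 1.5
```

In practice the reading would also be corrected for ambient lighting and camera white balance before the lookup, which is where most of the engineering effort in such apps goes.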

Smartphones offer significant possibilities for ameliorating these avoidable economic and social costs, as they can bring basic diagnostics to areas that lack trained optometrists or expensive tools. One notable example is the Near Eye Tool for Refractive Assessment, or NETRA (Sanskrit for “eye”), which uses a pinhole adapter and software to estimate the refractive error (subjective spherical equivalent). The results from NETRA, moreover, do not show a statistically significant difference from conventional subjective refraction tests.

A project of the Media Lab of the Massachusetts Institute of Technology, USA, NETRA operates as an ersatz version of the Shack-Hartmann wavefront sensor, an adaptive-optics tool used for characterizing corneal aberrations and in other optometric applications. Traditional clinical Shack-Hartmann systems use laser light reflected from the back of the eye; the laser and the sensor are expensive and require careful optical alignment by a trained operator. NETRA, by contrast, uses a clip-on eyepiece and the smartphone display rather than an aimed laser source, which significantly reduces the cost and can even allow self-evaluation, while still providing data comparable to clinical systems.

In this system, the subject looks into the display at a very close range and aligns (overlaps) patterns on the phone display. Since the light rays from these patterns pass through different regions of the visual system, the alignment task gives a measure of the optical distortions of those regions. The subject repeats this procedure for a few meridians with appropriate variation in the patterns, and the system computes the corresponding refractive error for myopia, hyperopia and astigmatism. The clip-on eyepiece costs an estimated US$2.00. Clearly, this solution could prove very useful in remote fieldwork or in rural or low-resource settings where no ophthalmologist is available.
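The per-meridian measurements can be combined by ordinary least squares, assuming the standard sinusoidal model for sphero-cylindrical error, P(θ) = S + C·sin²(θ − axis). The sketch below is an illustrative fit under that assumption, not MIT's published NETRA algorithm; note that it returns the plus-cylinder transposition of the prescription.

```python
import numpy as np

def fit_refraction(meridians_deg, powers_d):
    """Recover sphere S, cylinder C and axis from per-meridian power readings.

    Uses sin^2(x) = (1 - cos 2x)/2 to turn the sinusoidal model into a
    linear least-squares problem in (a0, a1, a2). Illustrative sketch only.
    """
    t = np.radians(meridians_deg)
    A = np.column_stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)])
    a0, a1, a2 = np.linalg.lstsq(A, np.asarray(powers_d, float), rcond=None)[0]
    C = 2.0 * np.hypot(a1, a2)                       # plus-cylinder form
    axis = np.degrees(np.arctan2(-a2, -a1)) / 2.0 % 180
    S = a0 - C / 2.0
    return S, C, axis  # the spherical equivalent is simply a0 = S + C/2

# Synthetic eye: S = -2.0 D, C = -1.0 D, axis 30 deg, sampled at 4 meridians.
thetas = [0, 45, 90, 135]
readings = [-2.0 - np.sin(np.radians(th - 30)) ** 2 for th in thetas]
S, C, axis = fit_refraction(thetas, readings)
# Recovers the same prescription transposed to plus-cylinder notation:
# S = -3.0 D, C = +1.0 D, axis 120 deg (spherical equivalent -2.5 D).
```

Sampling more than the minimum three meridians, as here, lets the least-squares fit average out alignment noise in the subject's responses.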

Other methods for assessing refractive error include eccentric photorefraction, in which refractive correction is estimated from the shape and brightness of the crescent of light reflected from the eye’s interior surface, or fundus; the phone’s camera flash serves as the light source flooding the retina. The technique can quickly assess the refractive state of the eye across large groups of people, such as in screening programs in the developing world, and in patients, such as infants and the elderly, with whom communication may be difficult.

Imaging the eye for ophthalmic care

NETRA is just one of a wide variety of ophthalmological applications: more than 350 dedicated ophthalmic apps are available, along with more than 600 surgical applications, more than 120 of which are ophthalmological. These apps can turn smartphones into sophisticated medical devices. Several applications are available for different ophthalmological examinations that can assess various visual functions such as visual acuity, color vision, astigmatism and pupil size, as well as the Amsler grid test, which is used as a common “first-alert system” for macular degeneration. Application kits for the smartphone, such as the Portable Eye Examination Kit (PEEK), have even been developed to convert the phone into a platform for routine eye exams.

The camera is one of the smartphone’s more powerful built-in sensors, and the use of smartphones for ophthalmic photography has become increasingly popular. Recent smartphones have cameras with resolutions of 8 megapixels and higher, which lets users capture high-quality images. Several photoadapters available for smartphones can make them useful devices for imaging both the anterior and posterior eye segments. These photoadapters require that the smartphone’s camera be aligned with the optical axis and placed sufficiently close to the slit-lamp eyepiece.


figureOphthalmic photography goes mobile. Smartphone attachments to investigate the retinal fundus include the iExaminer Pan Optic Ophthalmoscope from Welch Allyn, Inc. (left), and the Magnifi iPhone adapter from Arcturus Labs (right).

The adapter-equipped smartphone can attach to equipment such as the PanOptic Ophthalmoscope to capture photos of the fundus through an undilated pupil. It is also possible to take quality pictures of the retina using only a smartphone and an indirect lens. The examiner can watch the smartphone display, evaluate real-time images of the anterior and posterior eye segments with other practitioners, and record and share the findings.

The smartphone camera can also be trained on other parts of the body to compute blood pressure, respiration rate or even heart rate. The skin changes color slightly with each heartbeat; with a camera of sufficient spatial and temporal resolution, those color changes can be captured by focusing on the face or a finger. The result is a photoplethysmogram, or optically obtained plethysmogram: a volumetric measure of an organ derived from changes in light absorption.
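Extracting a heart rate from such a camera trace reduces to finding the dominant frequency of the mean skin-pixel brightness over time. The sketch below does this with an FFT on a synthetic trace; the band limits and signal parameters are illustrative assumptions, not a clinical algorithm.

```python
import numpy as np

def heart_rate_bpm(signal, fps):
    """Estimate pulse rate from a photoplethysmographic brightness trace.

    `signal` is the mean skin-pixel intensity per video frame; the dominant
    spectral peak in a plausible pulse band (0.7-4 Hz, i.e. 42-240 bpm) is
    taken as the heart rate. Illustrative sketch only.
    """
    x = np.asarray(signal, float)
    x = x - x.mean()                        # remove the DC (ambient) level
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(x))
    band = (freqs >= 0.7) & (freqs <= 4.0)  # plausible human pulse band
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic trace: 10 s of video at 30 frames/s, pulse at 1.2 Hz (72 bpm).
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
trace = 128 + 0.8 * np.sin(2 * np.pi * 1.2 * t)  # tiny periodic color change
print(round(heart_rate_bpm(trace, fps)))  # -> 72
```

On real video the trace is far noisier, so practical implementations add detrending and bandpass filtering before the spectral peak search, but the principle is the one shown here.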

Rethinking the eye exam

The preceding paragraphs have only scratched the surface of the many ways mobile technology is creating new opportunities in both ophthalmology and medicine more generally. Medical examinations are relatively straightforward in traditional clinical settings in the developed world, but for outpatient consultations, emergency room visits or field use in developing countries, the advantages of smartphone applications can become significant. Numerous patient assessment tools have been developed, such as apps that assess visual acuity using the Snellen test or modern interactive acuity tests suitable for preschool children and the illiterate. Other applications offer tests for color vision, astigmatism, pupil size, the Amsler grid, the oculomotor reflex, the Worth four-dot test (for binocular vision and stereoacuity), accommodation targets, red desaturation, and an optokinetic nystagmus (OKN) drum simulator for testing eye-movement abnormalities.

The increasing use of smartphones does raise some important caveats. When interpreting the examination results, the examiner/ophthalmologist needs to recognize that the testing tools are not ideally standardized and should be supplemented with professional experience and judgment. Smartphones can store a great amount of information, which makes them useful for long-term monitoring—but which also highlights the need for controls to safeguard patient confidentiality.

Taking these caveats into account, smartphones clearly have enormous potential in making medical care more available, at lower cost, and the scope of smartphone science in medicine and health care is expanding accordingly. Given the growing number of smartphones and their ever-expanding functionality, they will likely have an increasing role in patient care, both in the clinic and in the field.

Vasudevan Lakshminarayanan is with the University of Waterloo, Canada, and is also associated with the University of Michigan, USA. John Zelek is with the University of Waterloo, and Annette McBride is an independent consultant in Ann Arbor, Mich., USA.

References and Resources