Over the last four decades, spaceborne lidar instruments have evolved beyond their original application—altimetry—to tracking glacier melting, gauging wind speeds and spotting snow on Mars.
Artist’s conception of the Lunar Reconnaissance Orbiter surveying the Moon.
When laser-ranging technology was first developed in the 1960s, no one could have envisioned that it would become a key tool for exploring Earth and its neighboring planets from space. From hundreds of kilometers away, lidar systems can give scientists a broad overview of features that would have escaped notice from lower altitudes.
Whether the platform is a spacecraft, an aircraft or a base on the ground, the components of lidar systems are the same: a laser and scanning optics, a receiver (often a telescope equipped with a photodetector) and position and navigation systems. As in radar, with its radio-frequency pulses, lidar sends out laser pulses that reflect off a certain target and register as a signal back in the system’s detector. Frequencies range from near-infrared to ultraviolet, depending on the need.
In space, lidar systems have taken careful measurements of other solar system bodies and have even collected data on martian weather. From low Earth orbit, lidar has tracked the shrinkage of glaciers and the movement of clouds, and, in the next few years, the technology will tackle the three-dimensional profiling of wind—a critical missing component from short-term meteorological and long-term climate-change models.
Space lidar started with the ruby laser, but the Nd:YAG laser has become the light source of choice, especially since its fundamental, doubled and tripled frequencies can sense aerosol particles as well as macroscopic surfaces. In meteorology, geology and even forestry, lidar systems flying on airplanes and spacecraft complement each other in terms of ground-coverage time and field of view.
One big difference between spaceborne and airborne (or ground-based) lidar is the expense of the former. An instrument that flies into space must incorporate electronics hardened against radiation and must be “space-qualified” to withstand the rigors of launch, extremes of temperature and the harsh vacuum of space. All that extra testing comes with a hefty price tag.
Space lidar systems must perform better than ground-based and airborne systems because of the difference in altitude. Lidar performance decreases as the inverse square of the distance between the instrument and its target surface. Airborne lidar might fly 10 km off the ground, but spaceborne systems will orbit at altitudes of several hundred to 1,000 km. Given the inverse-square law, space lidar systems must have 1,000 to 10,000 times better performance than airborne ones. Thus, to fly in space, lidars need more powerful laser transmitters or bigger receiver telescopes.
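The altitude arithmetic above is easy to verify. This short sketch (the function name is my own, for illustration) reproduces the quoted 1,000-to-10,000-fold figure directly from the inverse-square law:

```python
# Received lidar signal falls off as 1/R^2, where R is the range to the target.
# Compare an airborne system at 10 km with spaceborne systems at 300-1,000 km.

def required_performance_factor(space_altitude_km, air_altitude_km=10.0):
    """Factor by which a space lidar must outperform an airborne one
    to recover the same signal level from the same target."""
    return (space_altitude_km / air_altitude_km) ** 2

for alt in (300, 1000):
    print(f"{alt:>5} km orbit: {required_performance_factor(alt):,.0f}x")
# A 300-km orbit needs ~900x the performance; a 1,000-km orbit needs
# 10,000x, matching the "1,000 to 10,000 times" range quoted above.
```

In practice that factor is bought with some combination of higher pulse energy, a larger receiver telescope and longer signal averaging.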
A properly working lidar system aboard a spacecraft operates constantly and passes over the same region several times a day, along slightly different tracks. Researchers can thus track how a variable surface, such as a melting glacier, changes height over time. On the other hand, satellites fly along fixed orbital paths and do not capture what is going on between those tracks. Satellite data give scientists broad overall coverage, especially in the interior regions of the ice sheets; aircraft provide more detail and are less expensive to fly. Ultimately, space-based observations, which are global but limited in spatial and temporal resolution, need to be calibrated with and validated against regional observations from the ground or from aircraft.
The laser is still the biggest risk in any spaceborne lidar instrument, said Geary Schwemmer, a former NASA researcher now with Science and Engineering Services Inc. (Columbia, Md., U.S.A.). “You’re concentrating a lot of energy in materials that break down and are damaged as a result of that energy,” he said. With no human technician around to remove contamination from the laser optics, the laser can eventually fail.
(Top) Apollo 15 command module Endeavour, in a photo taken from the lunar module Falcon in lunar orbit, shows the instrument bay containing the lunar mapping camera and ruby lidar. (Bottom) The Apollo lunar mapping camera on a lab bench prior to flight. The lidar occupies the lower left portion of the instrument.
Early lidar experiments
NASA’s last three Apollo missions to the moon each carried laser ranging experiments—possibly the first lidar systems to fly into space.
While his colleagues explored the lunar surface in mid-1971, command module pilot Alfred Worden spent several days alone in orbit, running the Scientific Instrument Module (SIM) in the Apollo 15 command module. Among other experiments, SIM contained a laser altimeter that used a Q-switched ruby laser giving off 200-mJ pulses.
Worden collected altimeter data for approximately four and a half orbits before the laser, which had been operating intermittently, failed altogether. The same type of experiment collected data for seven and a half lunar orbits aboard Apollo 16 in April 1972 and for 12 orbits aboard Apollo 17 in December 1972.
Used together with a mapping camera for calibration, the altimeter measured the height of the command module above the surface of the moon to ±2 m. Previous studies of the lunar cross-section, based on shadows in Earth-based photographs and ground-based radar, had much larger errors, especially near the limb (the apparent visual edge of the lunar disk). Ultimately, the data helped lunar scientists to estimate the relative altitudes of highlands, plains and craters in unexplored regions of the moon, particularly on the far side, and to analyze the ellipsoidal shape of the moon and find its center of gravity.
Lidar experiments on the space shuttle
One of the first lidar systems to examine Earth from space was the Lidar In-space Technology Experiment (LITE), which orbited on the space shuttle Discovery in September 1994. The experiment was designed primarily to prove that modern lidar technology, with Nd:YAG lasers and, increasingly, solid-state photodetectors, could work in space. LITE nevertheless provided new views of the cloud structure of the atmosphere and gathered much information about tropospheric and stratospheric aerosols.
The LITE backscatter lidar, under development since 1988 at NASA Langley Research Center, included two flash-lamp-pumped, Q-switched 1,064-nm Nd:YAG lasers, frequency-doubled and -tripled to 532 and 355 nm. (Two lasers were flown for redundancy; LITE used only one at a time.) To collect the reflected beams, the instrument had a 1-m telescope and beamsplitters to direct the incoming light into the detectors (two photomultiplier tubes and an avalanche photodiode).
LITE operated for more than 220 hours during the space shuttle mission and gathered the first global measurements of the height of the planetary boundary layer. LITE also measured the directional reflectance of the sea surface and its dependence on the speed of surface winds, which in turn drive the ocean’s waves.
A second diode-pumped, Q-switched Nd:YAG lidar system, the Shuttle Laser Altimeter (SLA), was a secondary payload on two space shuttle missions. SLA-I flew aboard Endeavour in January 1996, and SLA-II rode Discovery in August 1997. Operating for about 80 hours at an altitude of 305 km, SLA-I measured Earth’s topography to a vertical resolution of 0.75 m.
Schematic view of a spaceborne lidar. A short laser pulse is emitted towards the atmosphere where air molecules and particles reflect a small portion of the light pulse back to the lidar. A telescope collects the light and directs it to the receiver. The signal is recorded as a function of time to determine the altitude of the scattering layers.
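The timing principle in the schematic reduces to a round-trip time-of-flight calculation: the echo delay gives the range, and subtracting the range from the orbital altitude gives the height of the scattering layer. A minimal sketch (the 400-km altitude is an illustrative assumption, not a mission value):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_echo_time(delta_t_s):
    """Range to a scattering layer from the round-trip echo delay.
    The pulse travels down and back, hence the factor of 1/2."""
    return C * delta_t_s / 2.0

def layer_altitude(spacecraft_altitude_m, delta_t_s):
    """Altitude of the scattering layer below a nadir-pointing lidar."""
    return spacecraft_altitude_m - range_from_echo_time(delta_t_s)

# From a 400-km orbit, an echo from the surface returns after about 2.7 ms;
# an echo arriving a few tens of microseconds earlier comes from a cloud
# or aerosol layer a few kilometers above the ground.
```

Because light covers 150 m of one-way range per microsecond of round-trip delay, meter-scale vertical resolution requires timing the echo to a few nanoseconds.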
SLA-II’s team at NASA Goddard Space Flight Center (Greenbelt, Md., U.S.A.) assembled the instrument from spare parts from the Mars Orbiter Laser Altimeter (MOLA) project. Unlike its predecessor, SLA-II’s receiver contained a variable-gain amplifier, which allowed the experiment’s ground-based operator to avoid the detector saturation that had happened to SLA-I occasionally. NASA has posted data sets from both SLA missions online.
The SLA experiments showed scientists how to analyze on-orbit laser ranging measurements of the Earth’s surface and to perform laser-pulse waveform analysis to assess overall surface roughness and tree heights in forested regions.
Lidar altimetry around the solar system
The unpiloted U.S. lunar spacecraft known as Clementine, which flew in 1994, was most famous for finding evidence of a large water ice deposit in a crater near the moon’s south pole. However, Clementine also carried a lidar altimetry experiment that used the modern Nd:YAG laser technology unavailable during the Apollo era.
Clementine’s lidar transmitter was a frequency-doubled Nd:YAG laser; the spacecraft caught the reflected pulse in its high-resolution camera equipped with a silicon avalanche photodiode detector (APD). The compact (2.37 kg) system collected data for nearly three months at a vertical resolution of about 40 m; the spacecraft ran out of electrical power about five months after launch. Combined with the satellite’s multispectral imaging data, Clementine’s lidar altimetry provided a detailed and uniform topographic map of the moon between latitudes 60 degrees north and 60 degrees south.
Lidar measurements of lunar topography made by the Clementine mission.
Although lidar measured the moon in the 1970s and the 1990s, the quest for more precise data continues. The recent Japanese satellite known as Kaguya or Selenological and Engineering Explorer (SELENE) carried a laser altimeter as well as radar and a terrain camera.
From a polar orbit 100 km above the lunar surface, SELENE’s Nd:YAG laser started gathering topographical data in late 2007. The science team produced a global topographical map with a spatial resolution of better than 0.5 degree and used the data to model the mechanical properties of the lunar lithosphere. The spacecraft also found new features within craters near the lunar polar regions.
In November 1996, NASA launched its first spacecraft to the Red Planet in two decades. The Mars Global Surveyor (MGS) carried MOLA, a lidar system that produced a precise pole-to-pole topographic map of the martian landscape over the course of a Mars year (687 Earth days).
Orbiting at an average altitude of 378 km, MOLA found that the northern hemisphere of Mars is flatter than the southern hemisphere and that the planet’s south pole is about 6 km higher in elevation than its north pole. Despite the “flatness,” the lidar survey revealed that the northern plains have subtle ridges that were probably caused by tectonic activity. The MOLA instrument also measured seasonal variations in the carbon-dioxide “snow” cover in the polar regions.
The Mars Orbiter Laser Altimeter (MOLA) measured the Red Planet’s global dimensions. In this 90-degree 3D view, the large blue spot is the massive Hellas impact basin in Mars’ southern hemisphere.
Part of MOLA’s laser stopped working in mid-2001, but a near-infrared sensor on the instrument continued to study cloud coverage on the Red Planet. Communications between Earth and MGS failed in November 2006 and NASA officially ended the mission two months later.
For NASA’s Phoenix mission, which landed near the north polar cap of Mars in May 2008, the Canadian Space Agency built a meteorological station with a lidar system looking up at the martian atmosphere from ground level. Last fall, the lidar detected snow falling from martian clouds about 4 km above the landing site, although the snow vaporized before it hit the ground.
MESSENGER, the NASA mission en route to the planet Mercury, is carrying a laser altimeter to gather the most precise topography yet of this small, rocky body, which was surveyed only by a flyby spacecraft in 1974 and 1975. MESSENGER will insert itself into orbit around Mercury in March 2011.
ICESat and CALIPSO
With the ICESat satellite, launched in 2003, NASA deployed lidar technology for environmental studies of Earth, rather than simple altimetry. ICESat (or the Ice, Cloud and land Elevation Satellite) is part of the U.S. space agency’s Earth Observing System mission. Designed and built at NASA Goddard, the Geoscience Laser Altimeter System (GLAS) aboard ICESat measures ice-sheet topography and changes in cloud and atmospheric properties.
Not long after reaching Earth orbit, though, GLAS started experiencing problems with its three lasers (three had been considered sufficiently redundant for a planned five-year mission). Over the years, each of the lasers has failed at least once, and one of the three is now completely inoperable. As a result, GLAS collects data only during two brief windows per year. The ICESat science team is trying to conserve the spacecraft’s remaining laser lifetime in order to stretch out the time series of its ice-sheet measurements as far as possible.
As far as engineers on Earth can tell, the GLAS laser troubles were caused by diode pump modules that failed due to contamination problems. In the cold vacuum of space, indium solder in the device grew “whiskers” that in turn led to electrical shorting.
The GLAS instrument was also the first spaceborne lidar to have a large, fixed-focus, meter-class beryllium telescope coupled to a complex detector package. Initial results, before the laser problems revealed themselves, demonstrated that photon-counting detection at 532 nm could be an effective way to study the atmosphere from space.
ICESat was the first mission that was specifically designed for measuring ice sheets, said Robert H. Thomas, a glaciologist at NASA’s Wallops Flight Facility (Wallops Island, Va., U.S.A.). With ICESat, scientists have been able to measure the inland retreat of the so-called grounding line that marks where an ice sheet slides off land and flows horizontally over the ocean.
Greenland’s ice sheet is thinning quite rapidly in places, but ICESat misses some of those basins. “So if you only have ICESat data, you don’t know those areas are thinning,” Thomas said. Aircraft have the ability to fly at specific times and altitudes and can measure ice-sheet thickness as well as elevation. Some of these glaciers are thinning fast—by several tens of meters per year—and glaciologists want to understand why. Thomas uses both aircraft and spacecraft data in his research.
(Top) The flight model of the Phoenix Mars Lander’s lidar just before it was placed on the spacecraft during assembly operations. (Bottom) The lidar experiment for the Lunar Reconnaissance Orbiter (LRO).
In 2007, the National Research Council’s survey of the Earth-science community ranked a follow-on ICESat-II mission as one of its highest priorities for the next decade. NASA’s Web site for science missions says that the project is expected to launch in 2015. According to Thomas, the future lidar would have a multi-beam laser that would make more accurate measurements of sloping surfaces. The GLAS beam footprint—70 to 100 m across—caused larger than desired errors in measurements of slopes of more than a few degrees.
In April 2006, NASA launched the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) spacecraft. CALIPSO is a joint mission of NASA and the French space agency, Centre National d’Études Spatiales (CNES), but NASA Langley built the laser for its lidar experiment. The lidar system provides high-resolution vertical profiles of aerosols and clouds.
Currently CALIPSO has the highest-power laser in space, said Upendra N. Singh, chief technologist of the systems engineering directorate at NASA Langley Research Center (Hampton, Va., U.S.A.). It performed well until March, when ground crews detected a small pressure leak in the container holding the instrument’s primary laser. The lidar was shut down while the crew switched the instrument over to its backup laser, but it was expected to operate again by late April.
Wind measurements from space
The next frontier of lidar is getting a better picture of terrestrial wind patterns, especially in places where scientists can’t easily place an anemometer, such as the middle of the ocean.
“Wind is one of the most critical things that hasn’t been measured from space,” Singh said. Today’s meteorologists can predict two or three days out reasonably well, but forecasts of five to 10 days in advance are less certain. Weather scientists would like to improve the accuracy of their five- to 10-day forecasts to the current level of the two- to three-day forecasts, but one of the critical missing components is a three-dimensional profile of wind.
Even across the United States, government meteorologists have numerous weather stations but sparse coverage of the vertical wind field and thus wind gradients with altitude. Over the Pacific Ocean, in the polar regions and throughout the southern hemisphere, huge regions of the Earth’s atmosphere have no wind-field coverage. And, even though climate models have a longer-term focus than the “operational” weather forecasts, they too depend on understanding wind-field patterns at high horizontal and vertical resolutions.
Another key issue with wind monitoring is timeliness. Wind fields are quite ephemeral, and scientists need to capture them globally quite often. A satellite in low Earth orbit will complete 15 circles of the planet every day.
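The orbit count quoted above follows from Kepler's third law for a circular low Earth orbit. A quick check, assuming a representative 500-km altitude (the text quotes orbits of several hundred to 1,000 km; this is not a specific mission's orbit):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0      # mean Earth radius, m

def orbits_per_day(altitude_m):
    """Orbits completed per day for a circular orbit at the given altitude,
    using Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m  # semi-major axis of the circular orbit
    period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
    return 86_400.0 / period_s

print(f"{orbits_per_day(500e3):.1f} orbits per day")  # roughly 15
```

The period works out to about 95 minutes, so a lidar satellite revisits each latitude band many times a day but, with a narrow beam footprint, still samples only thin slices of the global wind field.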
According to Singh, there are two ways to measure wind speeds: with aerosols and with air molecules. Sensing backscatter from aerosols works in Earth’s atmosphere below altitudes of 8 or 9 km. Above that altitude, one can use shorter wavelengths to scatter laser light off the actual molecules in the air to get the wind speed. In meteorological circles, this is called direct detection, Singh said.
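The reason shorter wavelengths favor the molecular approach is that Rayleigh (molecular) backscatter strengthens steeply as wavelength drops, scaling roughly as the inverse fourth power. A sketch of that scaling only, across the three Nd:YAG lines mentioned earlier (detector response and atmospheric transmission are ignored here):

```python
def rayleigh_relative(wavelength_nm, reference_nm=1064.0):
    """Relative strength of Rayleigh (molecular) backscatter versus a
    reference wavelength; molecular scattering scales as wavelength^-4."""
    return (reference_nm / wavelength_nm) ** 4

for wl in (355.0, 532.0, 1064.0):
    print(f"{wl:6.0f} nm: {rayleigh_relative(wl):5.1f}x")
# The tripled line at 355 nm scatters roughly 80 times more strongly off
# air molecules than the 1,064-nm fundamental, which is why direct-detection
# wind lidars such as ADM-Aeolus operate in the ultraviolet.
```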
The European Space Agency (ESA) has adopted the direct detection strategy for the Atmospheric Dynamics Mission (ADM-Aeolus), a spacecraft that had been planned to launch in 2008 but is now scheduled for 2010. The ESA experiment will use a 355-nm Nd:YAG laser to measure wind at the molecular level. It will operate in “burst mode,” in which the laser will make one complete observation consisting of 700 shots over 7 s, followed by 21 s of downtime. The satellite will travel 50 km during each observation.
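The burst-mode numbers quoted above are internally consistent, as a few lines of arithmetic show (all figures come from the text):

```python
shots = 700             # laser shots per observation
observation_s = 7.0     # duration of one burst
downtime_s = 21.0       # quiet interval between bursts
track_per_obs_km = 50.0 # ground track covered during one observation

rep_rate_hz = shots / observation_s  # repetition rate during a burst
duty_cycle = observation_s / (observation_s + downtime_s)
ground_speed_km_s = track_per_obs_km / observation_s

print(f"{rep_rate_hz:.0f} Hz repetition rate")   # 100 Hz
print(f"{duty_cycle:.0%} duty cycle")            # 25%: one burst per 28 s
print(f"{ground_speed_km_s:.1f} km/s ground speed")
```

The implied ground speed of about 7.1 km/s matches what a satellite in low Earth orbit actually does, a useful sanity check on the mission figures.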
The ESA satellite’s high-repetition-rate laser will be at least five times more powerful than the one on CALIPSO, but the ultraviolet light is very damaging to optical components, and contamination inside the laser itself could be a problem. According to Singh, the team was awarded the ADM-Aeolus mission first and only then started developing the laser technology. Now the rest of the mission is ready to fly, but it still awaits completion of the laser. Thus, Singh advocates keeping in-house laser-development capability within NASA because of the space agency’s unique requirements.
“I’m not saying that NASA should be building all the lasers, but we should have the knowledge to make space lasers or how to make the laser appropriate for the space-based environment,” Singh said.
Singh cited the fate of the Space Readiness Coherent Lidar Experiment (SPARCLE), a planned shuttle payload that was supposed to test the technology for detecting tropospheric winds. The project was canceled in its second year (1999) because the company that was supposed to build the laser could not do so, he said.
In 2001, Singh started NASA’s Laser Risk Reduction Program, which has been developing better spaceborne-laser technologies as well as methods for testing them prior to launch. According to Singh, NASA Langley has been developing the 2-µm laser—both new laser crystals for high power and new laser diodes for long lifetimes—for using aerosols as a space-lidar target. Singh spearheaded an agreement to pursue a hybrid approach: a spacecraft that will use 2-µm lidar to investigate the troposphere, up to about 8 km, and 1,064-nm lidar to study wind movement from 4 km up to 15 or 20 km. The project, to be called 3D-Winds, is slated for the 2016-2020 timeframe in the U.S. earth science decadal survey.
Beyond ADM-Aeolus and 3D-Winds, Europe and Japan are collaborating on another cloud- and aerosol-monitoring spacecraft called Earth Clouds, Aerosols and Radiation Explorer (EarthCARE), scheduled for launch in 2013. Plans call for the satellite to carry both active sensors (backscatter lidar and radar sounders) and passive detectors (a multi-spectral imager and a broadband radiometer). From 450 km above the home planet, EarthCARE’s lidar will use its high spectral resolution at 355 nm to discriminate between Rayleigh scattering from molecules and Mie scattering from aerosols and cloud particles in the lowest 20 km of the atmosphere.
This spring, NASA is launching the Lunar Reconnaissance Orbiter (LRO) with a lidar instrument that will take a high-resolution map of the moon’s entire surface with the specific intent of identifying sites for future human landings and lunar bases. LRO will also search for laser pulse reflections from the possible water ice deposits in the deep polar craters; large stores of water, if they exist, would be a huge boon to human space exploration.
LRO’s laser pulses will make a new multiple-spot pattern on the lunar surface, thanks to a diffractive optical element that splits the single laser beam into multiple beams directed downward. This paves the way for surface lidar to start imaging topography, instead of profiling topography and reconstructing it later from the one-dimensional scans.
Undoubtedly, other lidar-wielding spacecraft will continue to explore Earth and its sister planets in the years and decades to come.
Patricia Daukantas is the senior writer/editor of Optics & Photonics News.
References and Resources
>> W.M. Kaula et al. Proceedings of the Third Lunar Science Conference (Supplement 3, Geochimica et Cosmochimica Acta), 3, 2189-204 (MIT Press, 1972).
>> W.L. Sjogren and W.R. Wollenhaupt. Science 179, 275 (1973).
>> S. Nozette et al. Science 266, 1835 (1994).
>> J. Bufton et al. NASA 1995 Shuttle Small Payloads Symposium, p. 83-90 (SEE N96-13754 02-12).
>> D.M. Winker et al. Proc. IEEE 84, 164-80 (1996).
>> Y.Y. Gu et al. Appl. Opt. 36, 5148 (1997).
>> R.T. Menzies et al. Appl. Opt. 37, 5550 (1998).
>> P. Withers and G.A. Neumann. Nature 410, 651 (2001).
>> U.N. Singh and M.J. Kavaya. Proceedings of the 22nd International Laser Radar Conference (ILRC 2004), 12-16 July 2004, Matera, Italy, ed. G. Pappalardo and A. Amodeo (Paris: European Space Agency, ESA SP-561, 2004), p. 49.
>> H. Araki et al. Science 323, 897 (2009).