Figure: Apollo 11 astronaut Buzz Aldrin assembles a seismic experiment on the moon. The surface stereo close-up camera is behind Aldrin. [NASA]

Fifty years ago this month, U.S. astronaut Edward White stepped out of his spacecraft, Gemini IV, to take the first “spacewalk” by an American, an event captured in some famous photos by his crewmate, James McDivitt. The Gemini program—which explored astronaut endurance on long missions, docking and navigational techniques, and other fundamentals for long-term space travel—marked a key proving ground for the techniques and equipment used in the Apollo program, which achieved the first manned lunar landing only a few years later. And, while rocket fuel and liquid oxygen may have propelled the crews to the moon, optics formed an integral part of the great adventure.

The pilots of the Apollo command and lunar modules had sextants and telescopes for checking and correcting the inertial measurements of the spacecraft’s positions. Lasers measured the contours of the moon and the distance from the Earth to the moon, while cameras mapped the lunar surface in exquisite detail. Television cameras brought hundreds of millions of spectators along for the ride.

NASA divided up the responsibilities for the various optical technologies among several branches of the Manned Spacecraft Center (now known as the Johnson Space Center) in Houston, Texas. The center had a photographic technology division that supplied the astronauts with cameras, processed the film and occasionally salvaged some of the imaging mistakes. The guidance and control division designed the sextant-telescope combinations for navigation, and the biomedical laboratories tested materials to protect the astronauts’ vision from the harsh, unfiltered sunlight outside Earth’s atmosphere. The space agency worked closely with its contractors in the optics and electronics industries to produce instruments suitable for space travel.

Early space photography: Mercury, Gemini

As the United States prepared to send its first astronauts into space, photography was a low priority. The designers of the single-seat Mercury capsule, flown from 1961 to 1963, initially did not even want to install a window for the pilot to look out, but added one at the astronauts’ insistence.

John Glenn, who became the first American to orbit the Earth on 20 February 1962, completing three orbits, was also the first U.S. astronaut to bring a camera into space. Since weight-conscious NASA officials objected to adding a camera to the spacecraft, Glenn carried along an Ansco Autoset 35-mm rangefinder camera as part of his personal equipment allowance, and he took about 70 photographs during his nearly five-hour mission. His thick pressurized gloves made operating the camera difficult, according to Richard W. Underwood, the NASA employee who served as the Apollo astronauts’ photography consultant.

Two Mercury flights later, in October 1962, astronaut Walter Schirra wanted to bring his Hasselblad 500C camera along on his six-orbit trip, according to Underwood. Hasselblads are medium-format cameras; their film frames are several times the area of a 35-mm frame, which yields correspondingly sharper images. To save every bit of weight, NASA removed the trendy leatherette trim as well as the internal mirror, focusing screen and viewfinder hood.

Although the astronauts took an active role in shaping the Mercury program, companies like Hasselblad had to be willing to participate in the space race as well, says Jennifer Levasseur, a space history curator at the National Air and Space Museum (NASM) in Washington, D.C. For adaptations of commercial equipment, NASA officials typically developed a set of specifications, looked for equipment that met most of the specifications, then worked with the manufacturer under contract.

Hasselblad cameras flew on several of the two-man Gemini missions in 1965 and 1966. At least one of the cameras stayed behind in space—during a Gemini X spacewalk (also known as extravehicular activity, or EVA) in which astronaut Michael Collins retrieved a cosmic dust-collecting panel from an unmanned craft, the Hasselblad floated away. Gemini was considered an experimental program, and behind the scenes NASA engineers were furiously developing the technology that would take humans all the way to the moon.

 

Figure: Apollo 13 astronaut Fred Haise uses a 16-mm data acquisition camera during a training session on Earth. Note the reflection of NASA personnel in his partially deployed sun visor. [NASA]

Apollo’s film cameras and accessories

By the time Apollo began flying, imaging was an integral part of NASA’s public-information mission. When the astronauts had questions about the photographic equipment, they went to Underwood, who in 1964 had transferred from NASA’s photogrammetry lab in Maryland to the Manned Spacecraft Center in Houston.

Underwood, who died in 2010, once recounted an incident in which he helped salvage, on the ground, a mistake made in space. While the Apollo 8 astronauts made the first human trip around the moon in December 1968, astronaut William Anders overexposed a roll of film by 10 stops, a factor of roughly 1,000 in exposure. Since Anders had been trying to photograph features that are invisible from Earth, he had Mission Control roust Underwood from bed in the middle of the night. Underwood and his NASA colleagues dug up papers that Kodak research pioneer (and OSA Honorary Member) C.E. Kenneth Mees had published back in the 1920s and ran experiments on film they deliberately overexposed, so that by the time the astronauts returned home, the photo team had figured out how to tease some grainy images out of the precious film.

“A man from his grave rescued us at that period of time,” Underwood said of Mees, who died in 1960. “So you don’t want to turn your back on your predecessors in your profession. They’ve done some remarkable things.” In 1970, the International Astronomical Union named craters on the far side of the moon for Anders, his crewmates—and Mees.

Apollo 11’s Camera Equipment

Hasselblad 70-mm camera

Features: Motor drive powered by two NiCd batteries; 80- and 250-mm lenses

Purpose: General still photography

Hasselblad super-wide Electric Data Camera

Features: Manual film advance; Reseau plate for photogrammetry; Polarizing filter

Purpose: Still photography on the lunar surface

Hasselblad film magazine

Features: 200 frames of thin-base b&w film, 160 frames of thin-base color film, or 100 frames of standard-base color film

Purpose: Protect film from light exposure and return film to Earth without the camera

J.A. Maurer 16-mm data acquisition camera

Features: Frame rate from 1 to 24 fps; Internal heater to protect film from extreme cold

Purpose: Recording sequential engineering data, including the Apollo 11 moon landing

Normal and auxiliary 16-mm film magazines

Features: 39.6 m and 122 m of 16-mm film, respectively

Purpose: Holding and protecting film

[Source: Opt. Spectra, September/October (1969)]

The Apollo 11 crew carried a substantial arsenal of film cameras. NASA and Hasselblad worked together to customize Apollo’s still cameras. According to Levasseur, the space agency needed to minimize friction inside the cameras because, in a high-oxygen environment, any kind of spark can be dangerous, as the fatal Apollo 1 capsule fire in January 1967 demonstrated.

Glass items were prohibited inside the spacecraft because broken shards could injure the crew, but NASA made an exception for camera lenses because, as Levasseur says, “you can’t really have a lens without glass.” The transition from a pressurized spacecraft interior to the vacuum of space didn’t affect the cameras, but the models slated for EVA use got a silver-toned, highly reflective exterior coating to reflect solar heat. For the opposite extreme, internal lubricants had to be replaced with substances that remained workable at much lower temperatures.

NASA procured Apollo’s 16-mm motion-picture camera from a New Jersey company called J.A. Maurer. The space agency had some issues with the firm because the cameras often returned to Earth with malfunctions, Levasseur says. Still, NASA flew Maurer cameras from Gemini IV in 1965 through the Skylab missions in the 1970s.

The 16-mm camera was considered a data-acquisition system, not a device for generating Hollywood-style movies. In at least one case, however, its black-and-white footage has become famous. Today’s documentary films about Apollo 11 superimpose the audio communications between Mission Control and the lunar module on the black-and-white movie of the approaching lunar surface as it appeared through the spacecraft’s window. The movie film was actually processed back in Houston; while the Apollo 11 landing was happening, television viewers heard the live audio but saw an animated simulation on their screens.

Perhaps the most unusual film camera aboard any Apollo spacecraft was the Apollo Lunar Surface Closeup Camera, also called the “optical walking stick,” which flew on Apollo 11, 12 and 13. Designed by Eastman Kodak, it was a 35-mm camera outfitted with an electronic flash lamp and fixed-aperture stereo microscopy lenses for capturing high-resolution close-up images of the lunar surface without requiring the astronauts to bend over in their heavy pressurized suits. To take a picture, astronauts held the walking stick up to a rock or clump of lunar regolith (i.e., dirt) and hit the trigger. From the landings of Apollo 11 and 12, the camera gave scientists hundreds of stereoscopic images of undisturbed lunar surface and of rocks that the astronauts didn’t carry back to Earth.

On the last three lunar missions in 1971 and 1972, the Apollo service module carried a camera that garnered much less attention than anything the moonwalkers used. According to Levasseur, one of the command module pilot’s most important jobs was to activate the panoramic camera, which rolled out of its storage bay on rails and took photos that would become intensely detailed maps of the lunar surface.

The panoramic camera, made by Itek’s Optical Systems Division, was equipped with an eight-element Petzval lens and recorded film images an astonishing 1.15 m long and 11.5 cm wide. A ruby-laser altimeter in the instrument bay bounced light pulses off the moon’s surface to measure the height of lunar features, although on Apollo 15 it failed after a few orbits (OPN, June 2009, p. 32). After the crews of Apollo 15, 16 and 17 departed lunar orbit for Earth, the command module pilots made short EVAs to retrieve the rolls of film, because the service module was jettisoned just before the command module re-entered the atmosphere.

 

Figure: Stan Lebar, project manager for Westinghouse’s Apollo television cameras, shows the field-sequential color camera on the left and the monochrome lunar surface camera on the right. [NASA]

Live from the moon: Video cameras

Live television brought the excitement of lunar exploration to millions of homes and classrooms around the world. As with the still cameras, NASA had to alter the bulky video technology of the era (the CCD was invented the same year as Apollo 11) to make television cameras compact, lightweight and easy for astronauts to operate.

Westinghouse Electric Corp.’s aerospace division developed the miniaturized video cameras carried on Apollo 11: a full-color camera for the command module and a black-and-white model for the lunar module. Transmitting monochrome video for the Apollo 11 moonwalk conserved precious bandwidth for voice, medical and other telemetry data, which Mission Control monitored simultaneously. The black-and-white camera ate up only 500 kHz of bandwidth, versus 4.5 MHz for the color camera.

The Apollo 11 command module’s color camera had respectable specifications: a scan rate of 30 frames per second (fps), a minimum signal-to-noise ratio of 32 dB, 425 lines of resolution and a weight of just under 6 kg. The 3.4-kg black-and-white lunar camera imaged moonwalkers Neil Armstrong and Buzz Aldrin at 10 fps and a resolution of 320 lines. The scan rate and resolution were less than standard for broadcast television, so NASA had to convert the slow-scan image by displaying it on a specially designed monitor on Earth and re-filming it with a conventional television camera. Despite the ghostly image quality resulting from the kludged conversion, the first EVA on the moon was one of the most watched TV programs ever.
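NASA’s conversion was done optically, by re-filming a slow-scan monitor, but the resampling problem it solved can be illustrated in a few lines of modern code. The sketch below is purely illustrative, not a model of the actual hardware chain: it repeats each 10-fps frame three times and linearly interpolates 320 scan lines up to a nominal 525-line broadcast raster, with hypothetical frame dimensions.

```python
# Illustrative sketch only (NASA's conversion was optical, not digital):
# upsample 10-fps, 320-line slow-scan frames to a 30-fps, 525-line raster
# by repeating frames and linearly interpolating scan lines.
import numpy as np

def upconvert(frames_10fps, out_lines=525):
    """frames_10fps: list of 2-D arrays (lines x pixels); returns a 30-fps list."""
    converted = []
    for frame in frames_10fps:
        in_lines = frame.shape[0]
        src = np.linspace(0, in_lines - 1, out_lines)   # fractional source lines
        lo = np.floor(src).astype(int)
        hi = np.minimum(lo + 1, in_lines - 1)
        frac = (src - lo)[:, None]
        resampled = (1 - frac) * frame[lo, :] + frac * frame[hi, :]
        converted.extend([resampled] * 3)               # 10 fps -> 30 fps
    return converted

# Example with made-up data: three 320 x 400 frames become nine 525 x 400 frames.
demo = upconvert([np.random.rand(320, 400) for _ in range(3)])
print(len(demo), demo[0].shape)   # 9 (525, 400)
```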

The Westinghouse color cameras flown on Apollo 10 and 11 were designed for use inside the spacecraft: for surveying the moon’s surface from lunar orbit and for live broadcasts to Earth from inside the command module. Both Apollo 11 cameras contained a secondary electron conduction (SEC) camera tube, which was reasonably state-of-the-art for the late 1960s.

Anyone who has ever fumbled around with a camera knows that photography requires a certain amount of dexterity. The multilayered, pressurized gloves worn on spacewalks and moonwalks greatly limited that dexterity, so NASA and its contractor designed the Apollo 11 cameras to demand little more than moving the devices into position, changing lenses and, if necessary, switching scan modes. Setting up the TV camera was relatively easy: as astronaut Neil Armstrong climbed down the lunar module’s ladder, he pulled a D-ring that unfolded a storage module on the side of the spacecraft and started up the camera. To minimize weight, the camera was outfitted with four interchangeable fixed-focus lenses, with fields of view ranging from 7 to 80 degrees.

The Apollo 12 crew used the color camera during their moonwalk, but lunar module pilot Alan Bean accidentally pointed the device directly at the sun while trying to mount it on a tripod. The intense sunlight burned out the video pickup tube, and the astronauts brought the camera home with them so that Westinghouse could evaluate it. In subsequent lunar excursions, astronauts were told to cap the lens before repositioning the camera.

During Apollo 14’s lunar surface activity, mission controllers observed that the camera’s gain control caused the astronauts’ white spacesuits to be overexposed in television images. For the last three moon landings, NASA put a newer color camera on the lunar surface. The RCA camera featured a zoom lens, a rate of 30 fps and up to 5 MHz bandwidth, and TV viewers noticed a distinct improvement in picture quality.

 

Figure: Navigating by the Stars: The Apollo spacecraft’s navigational gyroscopes and accelerometers accumulated a slight drift over the course of the 384,000-km trip. To keep Apollo on course, astronauts used a sextant, a telescope and a star chart; although the equipment was updated for space travel, the basic concepts behind these ancient sailing tools remained the same. For celestial navigation, Apollo 11 astronauts entered code numbers for selected stars from the chart above into the guidance computer while taking sightings with the sextant, whose readings were fed into the computer automatically. [National Air and Space Museum, Smithsonian Institution (USA)]

Optics and navigation

Just as Renaissance explorers steered by the stars, the Apollo spacecraft relied on sightings of stars and lunar features for proper navigation. Two optical navigational instruments complemented the Inertial Measurement Unit, which contained accelerometers and gyroscopes.

Aboard the Apollo command module, the Optical Unit Assembly (OUA) consisted of a sextant and telescope with more than 50 optical elements. Its counterpart on the lunar module, the Alignment Optical Telescope (AOT), had 23 optical elements. Each instrument “had to be reliable and accurate and very precise and rugged, and it had to be operated by someone wearing spacesuit gloves—not always, but in case of an emergency they would be wearing their spacesuits,” says Paul Ceruzzi, NASM’s curator of guidance and navigation systems.

Although the gyroscopes and accelerometers could track the spacecraft’s direction and velocity on their own, they drifted slightly over time, according to Allan Needell, NASM’s curator of Apollo spaceflight. Radio signals from Earth could have provided the necessary corrections, Needell says, but NASA did not want to depend solely on them in case the Soviet Union jammed those signals (the space race, after all, was still on). The optical instruments let the astronauts correct the guidance system against bright guide stars.

The AOT was a telescope in name only, Needell says; its magnification was just 1×. It was fixed at a 45-degree angle to the lunar surface and could be rotated to one of six preset positions around the lunar module. The astronaut using the AOT simply identified a star in the viewfinder, measured how far the star lay from the center of the instrument’s field of view, and keyed those parameters into the spacecraft’s computer.
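The alignment arithmetic itself ran in the onboard guidance computer, but the underlying idea, recovering the platform’s orientation from two star sightings, can be sketched with the classic TRIAD construction. The snippet below is a hypothetical illustration, not Apollo flight software: meas_1 and meas_2 stand for unit vectors to two stars as measured in the spacecraft frame, and cat_1 and cat_2 for the same stars’ catalog directions in the reference frame.

```python
# Hypothetical sketch (not the Apollo Guidance Computer's routine): the TRIAD
# method builds a rotation matrix from two star directions seen in both the
# spacecraft frame and a reference (catalog) frame.
import numpy as np

def triad(meas_1, meas_2, cat_1, cat_2):
    """Return the matrix that rotates reference-frame vectors into the spacecraft frame."""
    def frame(a, b):
        t1 = a / np.linalg.norm(a)
        t2 = np.cross(a, b)
        t2 /= np.linalg.norm(t2)
        return np.column_stack((t1, t2, np.cross(t1, t2)))
    body = frame(np.asarray(meas_1, float), np.asarray(meas_2, float))
    ref = frame(np.asarray(cat_1, float), np.asarray(cat_2, float))
    return body @ ref.T

# Check with made-up sightings: a 10-degree rotation about the z-axis is recovered.
theta = np.radians(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
s1, s2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.2])
print(np.allclose(triad(R_true @ s1, R_true @ s2, s1, s2), R_true))   # True
```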

The Kollsman Instrument Corp. built the OUA, the AOT and a 21-optical-element rangefinder for the command module. The company created the rangefinder, which was used for rendezvous and docking between the command and lunar modules, in just 53 days so that it could fly on Apollo 9, the March 1969 test flight of the lunar module in Earth orbit. The specially designed rangefinder used the fixed size of the lunar module to gauge its distance from the command module during critical maneuvers.
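The principle behind such a rangefinder is ordinary stadiametric ranging: a target of known size subtends a smaller angle the farther away it is, so measuring the angle gives the range. The few lines below are a back-of-the-envelope illustration of that geometry, not Kollsman’s design; the 4.3-m target width is an assumed figure for the example.

```python
# Back-of-the-envelope stadiametric ranging (an illustration, not the Kollsman
# instrument): a target of known width W that subtends an angle theta lies at
# range R = W / (2 * tan(theta / 2)).
import math

def stadiametric_range(known_width_m, subtended_angle_deg):
    theta = math.radians(subtended_angle_deg)
    return known_width_m / (2.0 * math.tan(theta / 2.0))

# An assumed 4.3-m-wide target subtending half a degree is roughly 490 m away.
print(f"{stadiametric_range(4.3, 0.5):.0f} m")   # ~493 m
```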

Visors, lunar experiments and ground-based optics

Without the protection of Earth’s atmosphere, astronauts on EVAs would expose their eyes to dangerous levels of ultraviolet light and risk photokeratitis, also known as welder’s burn or snow blindness. The helmet visors also had to preserve the astronauts’ ability to distinguish colors. The Lunar Extravehicular Visor Assembly, as it was dubbed in NASA-speak, included a gold-plated sun visor made of polysulfone; its reflective surface is familiar to anyone who has seen photos of the moonwalks.

On Apollo 11, 14 and 15, astronauts left behind a retro-reflector experiment on the lunar surface. The device, built by Bendix with optical expertise from Arthur D. Little Inc. and Perkin-Elmer Corp., consisted of an array of high-precision corner reflectors designed to reflect light back along the same path with as little angular spread as possible. Scientists built the corner reflectors out of synthetic fused silica that could withstand both temperature changes from –150 °C to +130 °C and bombardment from cosmic and ultraviolet rays. The retro-reflectors remain on the moon to this day, enabling precision measurements of the Earth-moon distance limited only by atmospheric scatter and instrumental uncertainties.
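The ranging measurement itself rests on simple arithmetic: the one-way Earth-moon distance is the speed of light times half the round-trip time of a laser pulse. The snippet below shows only that relation, using a representative 2.56-s round-trip time; real lunar-laser-ranging analyses also correct for atmospheric delay, relativistic effects and the positions of the telescope and reflector.

```python
# Arithmetic only: one-way distance = c * (round-trip time) / 2.  The 2.56-s
# round-trip value is representative; real analyses apply many corrections.
C_M_PER_S = 299_792_458.0

def one_way_distance_km(round_trip_s):
    return C_M_PER_S * round_trip_s / 2.0 / 1000.0

print(f"{one_way_distance_km(2.56):,.0f} km")   # about 383,734 km
```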

During and after Project Apollo, other optical technologies played key roles back on Earth. For instance, since 1970, a hydrogen maser based at Goldstone, California, has served as the stable frequency source for the master atomic clock of NASA’s Deep Space Network. And although the agency’s Manned Space Flight Network was generally sufficient for communications with Apollo spacecraft, the Deep Space Network’s bigger dishes were called on to track Apollo 13 during its life-threatening emergency.

Legacy of Apollo’s optical technology

Some of the gear manufactured for the Apollo program, but not used due to the cancellation of the last three moon missions, flew on the three Skylab missions and 1975’s Apollo-Soyuz Test Project. In particular, the Apollo Telescope Mount (ATM), a solar observatory that flew on the Skylab space station, grew out of efforts starting in the mid-1960s to find additional uses for the special technology developed for the lunar program. In 1977, Applied Optics devoted a large part of one issue to the calibration of the ATM and the ultraviolet spectroscopic experiments carried out with the instrument.

Earthbound scientists still bounce laser pulses off the retro-reflectors that the Apollo astronauts left behind; the results have provided tests of general relativity and placed tighter constraints on variations in Newton’s gravitational constant, G, over time.

Thanks to the miniaturization of electronics and the advent of digital photography, images from the International Space Station, the Mars rovers and other spacecraft exploring the solar system look better than ever. Whenever the next humans travel to another lonely rock orbiting the sun, they will use the lessons from their predecessors to send back ever more stunning pictures.


Patricia Daukantas is a freelance writer specializing in optics and photonics.