This month’s edition of the journal Applied Optics is a special issue devoted to the contributions of the late Emmett Leith. As a tribute to his dear friend, the author discusses the significance of Emmett’s work and his own role in inventing computer-generated holography.
Adolf Lohmann

Emmett Leith
In the 60 years since Gabor’s invention, holography has become much more than a way to capture and visualize 3D information. The tools of holography have been applied to many fields of optics and photonics, including optical data and image processing, invariant pattern recognition, optical fuzzy-logic control, super-resolution and imaging.
Despite the fact that Gabor won a Nobel Prize, for many years holography languished as a field of research. One of the reasons was Gabor’s inability to solve the so-called “twin image” problem—i.e., how to eradicate the image formed by the phase conjugate of the object recorded. Gabor’s original proposal for holography caused the two images to fall on top of one another. He struggled to solve this problem in his subsequent work but could not. Why?
In my view, Gabor’s papers were examples of physical optics, and the tools he used in his unsuccessful attempt to kill the twin image were physical tools, such as beam splitters. By contrast, Emmett and I considered holography to be an enterprise in optical information processing. The titles of our two main papers support this view: “Wavefront reconstruction and communication theory” (J. Opt. Soc. Am. 52, 1123) and “Optische einseitenbandübertragung angewandt auf das Gabor-mikroskop (Optical single sideband modulation applied to the Gabor microscope)” (Optica Acta 3, 97-100).
In our work, we considered images as information, and we applied notions about carriers from communications and information theory to separate the twin image from the desired one. In other words, our approach represented a paradigm shift from physical optics to optical information processing. This is what enabled us to revive holography.
The earliest holograms of Gabor and Leith were photographic recordings of an interference pattern. Emmett introduced his spatial carrier directly using an off-axis optical beam. I approached this problem from a different perspective. Computers were just beginning to appear in research and industrial settings, and it occurred to me that, if one modeled the interference recording, it would not be necessary to perform the experiment. However, it would still be imperative to produce some physical output to function as the hologram.
It was probably in 1964 when my optics group at IBM in San Jose hosted an ambitious young summer student from Berkeley named Byron Brown. Byron wanted to learn everything there was to know about computers and holography—in two months. As I considered projects that he could sink his teeth into, I remembered that, deep down in my desk, in a folder titled “Proposals for Later,” there was the sketch of an idea: “Making a Hologram by Computer.” Byron jumped at the opportunity. At that time, we had an IBM 7094 and a Calcomp plotter available to us. Our technicians Harold Werlich and Antonio Hafarate, as well as our computer expert Dieter Paris, helped Byron to get going.
My approach to modeling was inspired by the manner in which Lord Rayleigh fabricated his gratings—by scratching a lead-based plate at equal distances. The quality of the gratings was dependent upon the quality of a large screw called the spindle, which was rotated in small steps—say, one degree per step and 360 degrees for one “macro periodicity.” If the spindle was a bit elliptic, the grating period would be modulated slightly and generate ghost lines close to the desired lines. The “ghost” was sometimes misinterpreted as a new isotope in an emission or absorption spectrum. Thus, I chose to model the hologram as a modulated grating.
Although the modeling was easy, producing the output was not. Output devices at the time were as crude as input devices. The problem was figuring out how to take a gray-scale image and encode it in such a way that a device capable of only binary output could produce a reasonable facsimile. Fortunately, history provided antecedents for the solution.
St. Jerome in His Study by Albrecht Dürer, a native of Nuremberg, is an example of the gray scales possible using only binary output.
In medieval times, graphic artisans also were limited to binary outputs. They drew black lines with macroscopic curvatures. The overall picture would look fairly dark where the adjacent lines were close together, and light where they were farther apart. The structures were rather fine, so the human eye would perceive the local average in different shades of gray. In signal processing language, a binary fine structure simulates a continuous-tone coarse pattern with analogue amplitude. “Pulse frequency modulation” is the proper term, on loan from the signal processing community to the graphic arts community.
Talbot, the clever inventor of photography (in spite of Daguerre) and of “self-imaging,” modified the graphic arts approach so that the printing industry could benefit from the “micro-binary” to the “macro-analogue” (or gray-tone) convention. He laid out his 2D array of black spots in Cartesian fashion. That made newspaper printing cheap and convenient.
Alec Reeves invented pulse-code modulation in 1938. A decade later, the subject received a brilliantly written exposition by Oliver, Pierce and Shannon (Proc. IRE 36, 1324-31, 1948). Even though the signals they considered were, of course, embedded in time, the step from time to space was a small one to take. We could apply pulse modulation in two spatial dimensions to produce our output.
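The micro-binary/macro-analogue convention described above is easy to demonstrate on a modern computer. The sketch below (not period code; the 4 x 4 cell size and the simple raster-scan threshold matrix are my own assumptions) turns each gray pixel into a cell of binary dots, with the number of black dots growing with the local darkness, so that the eye averages the fine structure back into gray tones:

```python
import numpy as np

def pulse_width_halftone(gray, cell=4):
    """Render a gray-scale image on a binary device by pulse modulation
    in two spatial dimensions: each pixel becomes a `cell` x `cell`
    block of binary dots, and the count of black dots tracks darkness."""
    gray = np.clip(np.asarray(gray, dtype=float), 0.0, 1.0)  # 0 = white, 1 = black
    # thresholds at which the sub-pixels of one cell turn black, in raster order
    order = np.arange(cell * cell).reshape(cell, cell) / (cell * cell)
    thresholds = np.tile(order, gray.shape)            # one threshold cell per pixel
    enlarged = np.kron(gray, np.ones((cell, cell)))    # each pixel -> cell x cell block
    return (enlarged > thresholds).astype(np.uint8)    # 1 = black dot
```

A darkness of 0.5 switches on half of the dots in a cell; 0 or 1 gives an all-white or all-black cell. A real halftone screen orders the thresholds more cleverly (e.g., a Bayer matrix) to avoid visible raster artifacts, but the principle is the same.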
With a binary output device, one has the choice of placing either white objects on a black background or black objects on a white background. Our choice was influenced by the work of Kastler and Wolter (Ann. Phys. 10, 94). In emission spectroscopy, one usually looks at bright lines on a dark background. It is known that the accuracy of the wavelength measurement can be improved by converting the bright lines, which have a sinc² profile, into dark lines, whose profile follows 1 − sinc².
The detour phase idea: illustration from D. Hauck and A.W. Lohmann, “Minimumstrahlkennzeichnung bei Gitterspektrographen,” Optik 15, 275-7 (1958). In this instance, a shift by half the grating period imposes a 180-degree phase delay in the wavefront.
Dark lines appear much finer than white ones, due to the Weber-Fechner law of logarithmic perception. A. Kastler and (independently) H. Wolter converted the bright line into a dark one by imparting a π-phase shift over half of a grating. The phase shift of such a dielectric shifter varies with the wavelength and with the dispersion of the thin-film material. That chromatic defect vanishes completely if, instead, the grating is cut in half parallel to the grooves and the grooves of one half are shifted by half a period. A sketch of the rays from the source through the grating reveals that this phase shift is purely geometric, not refractive. We called it a “detour phase” and applied it to our method (Optik 15, 275-7).
Our basic approach to computer-generated hologram (CGH) design was to modulate each period of a binary grating with small shifts in the position of the grating line and small changes in the grating line width. The shifts were determined from the Fourier transform of the desired object to be reconstructed, since the reconstruction of the hologram occurred in a Fourier transform plane. The reconstruction was obtained in the first diffraction order of the grating.
In our basic approach, the shift of the local grating line within each period (Pnm) is proportional to the phase, while the width of the line (Wnm) is related to the amplitude of that reconstructed pixel:
Wnm = arcsin(Anm)/πN ,  Pnm = Φnm/2πN . (1)
Anm and Φnm are the amplitude and the phase, respectively, of the reconstruction matrix in its (n,m) pixel. N is defined as follows:
N = x0δν , (2)
where x0 is the spatial distance of the first order of diffraction (where the reconstruction is obtained) and δν is the resolution of the reconstruction. Note that N functions effectively as the frequency of the spatial carrier. Various modifications of the basic approach by us and others changed the relations between the width and position of the line in a certain period to improve the reconstruction.
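The encoding of Eq. (1) can be sketched in a few lines of modern code. This is a minimal numpy illustration, not our original IBM 7094 program: the cell size, the wrapping of shifted slits within a cell, and the amplitude normalization are my assumptions. Each Fourier coefficient of the desired object becomes one cell containing a single slit, whose width carries the amplitude (arcsin(A)/π as a fraction of the cell) and whose lateral shift carries the phase (Φ/2π of a cell, the detour phase):

```python
import numpy as np

def lohmann_cgh(target, cell=16):
    """Binary detour-phase hologram in the spirit of Eq. (1).
    `target` is the desired reconstruction; the result is a 0/1 array
    to be photoreduced (or, today, sent to a lithography tool)."""
    F = np.fft.fftshift(np.fft.fft2(target))   # hologram lives in the Fourier plane
    A = np.abs(F) / np.abs(F).max()            # amplitudes normalized to [0, 1]
    phi = np.angle(F)                          # phases in (-pi, pi]

    rows, cols = target.shape
    holo = np.zeros((rows * cell, cols * cell), dtype=np.uint8)
    for n in range(rows):
        for m in range(cols):
            w = int(np.round(np.arcsin(A[n, m]) / np.pi * cell))  # slit width
            s = int(np.round(phi[n, m] / (2 * np.pi) * cell))     # detour shift
            start = cell // 2 + s - w // 2
            for k in range(w):  # draw the slit, wrapped inside its own cell
                holo[n * cell:(n + 1) * cell, m * cell + (start + k) % cell] = 1
    return holo
```

Illuminating such a transparency with a plane wave and taking an optical Fourier transform (a lens) yields the target image in the first diffraction order, at a distance set by the carrier frequency N.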
Byron Brown completing one of the first computer-generated holograms. (The disposition of his body is uncannily similar to St. Jerome’s.)
The figure on the right shows Byron Brown filling in by hand the apertures printed by our binary Calcomp plotter. This sheet was photoreduced to produce one of the first CGHs. One of the first Fourier holograms generated a letter “F.” In the spring of 1965, I presented our results at an ICO conference in Paris.
The lack of computer power at the time was also an obstacle. We needed a two-dimensional Fourier transform of an object with 36 x 36 pixels. I had to obtain special permission to access the computer and to pay a considerable amount from my department’s funds. The price was about $20 per CPU millisecond, if I remember correctly. Soon, the Cooley-Tukey algorithm (now called the fast Fourier transform) became available. That made a big difference.
The “light efficiency” of our holograms was another handicap. With a binary amplitude hologram, one may be happy to achieve 10 percent light efficiency. Some lithographic tricks could have been helpful, as we explained in an IBM internal journal (IBM J. Res. Develop. 13, 160-8). However, our ideas to improve efficiency did not come to fruition since our small group (never more than four or five) dissolved for various reasons. Computer-generated holograms and other diffractive elements are now manufactured using lithographic techniques, and many of the light efficiency problems have been solved.
An early CGH and its reconstruction
The CGH is distinguished from its optical cousin primarily by the fact that a computer can design a hologram of a non-existent, synthetic or virtual object. While the applicability and potential of holography were clear from the work of Gabor and Leith, CGHs broadened that applicability into many fields of optics and photonics.
Computer-generated holography now has many successful applications in advanced scientific and technological fields, such as optical lithography and fabrication (as in micro-electronics) and photonic manipulation of particles (optical tweezers). For instance, holography is now commonly used for the manipulation of single micro- and nano-particles for biological and biomedical research.
Emmett Leith was one of the pioneers of applying information and communication theory to an optics problem, and these influences continue as the format in which information itself is stored, transmitted and displayed moves from one dimension to two, and even three. Emmett recounted these contributions in a historical article on the topic (in Trends in Optics, Anna Consortini, ed., Academic Press, San Diego, 1996).
I am proud to have been a part of this revolution in optics and I close with Emmett’s words, which give credit in a gentleman’s style to my early efforts:
In the 1960s, while the Optics Group was busy making its innovations, there was a parallel activity conducted by A. Lohmann, who functioned as a one-man counterpart of our coherent optics group, busily applying communications concepts to optics and thereby developing holography and optical processing. Our conditions were different; we were a large group, extremely well-funded, and with some very specific missions. The price paid for this enviable position was, at least until 1963, difficulty and long delays in publishing our work in the open literature. Despite the differences of operating modes, his and our work had strong parallels. It is interesting to speculate on what might have been accomplished had Lohmann been a member of this group.
[ Adolf Lohmann is a Fellow Emeritus and retired professor at the Universität Erlangen-Nürnberg in Germany. ]
References and Resources
>> G. Rand and C.E. Ferree. “An analysis of the visibility curve in terms of the Weber Fechner law and the least perceptible brightness,” J. Opt. Soc. Am. 6, 408 (1922).
>> D. Gabor. “A new microscopic principle,” Nature 161, 777-8 (1948).
>> B.M. Oliver et al. “The Philosophy of PCM,” Proc. IRE 36, 1324-31 (1948).
>> H. Wolter. Ann. Phys. 10, 94 (1952).
>> A. Lohmann. “Optische Einseitenbandübertragung angewandt auf das Gabor-Mikroskop,” Optica Acta 3, 97-100 (1956).
>> D. Hauck and A.W. Lohmann. “Minimumstrahlkennzeichnung bei Gitterspektrographen,” Optik 15, 275-7 (1958).
>> E.N. Leith and J. Upatnieks. “Wavefront reconstruction and communication theory,” J. Opt. Soc. Am. 52, 1123 (1962).
>> A. W. Lohmann and D.P. Paris. “Binary Fraunhofer holograms, generated by computer,” Appl. Opt. 6, 1739 (1967).
>> A.W. Lohmann and D.P. Paris. “Computer generated spatial filters for coherent optical data processing,” Appl. Opt. 7, 651 (1968).
>> B.R. Brown and A.W. Lohmann. “Computer generated binary holograms,” IBM J. Res. Develop. 13, 160-8 (1969).
>> A. Ashkin et al. “Observation of a single-beam gradient force optical trap for dielectric particles,” Opt. Lett. 11, 288-90 (1986).
>> E.N. Leith. “A short history of the optics group of the Willow Run Laboratories,” in Trends in Optics, Anna Consortini, ed. (Academic Press, San Diego, 1996), pp. 1-26.
>> R. Eriksen et al. “Fully dynamic multiple-beam optical tweezers,” Opt. Exp. 10, 597-602 (2002).
>> S.F. Johnston. Holographic Visions: A History of New Science (Oxford University Press, New York, 2006).
>> Y.–C. Cheng et al. “Extreme ultraviolet holographic lithography: Initial results,” Appl. Phys. Lett. 90, 023116 (2007).