Figure: Wesley Snipes in a scene from the film “Blade,” 1998. [Photo by Amen Ra Films/Getty Images]

“That, biscuit boy, is a UV lamp,” threatens Wesley Snipes as the vampire-hunting hero of the 1998 movie “Blade.” “We’re gonna play a game of 20 questions. Depending on how you answer, you may walk out of here with a tan.” Blade, dressed confidently and coolly all in black, is armed to the teeth in this scene, with a gun hanging from his hip and a sword strapped to his back. Yet he threatens the vampire with ultraviolet (UV) light.

In the past 20 years or so, a pop-culture trope has emerged that identifies UV as the component of sunlight responsible for causing vampires to burn and disintegrate. Now, Hollywood is cutting out the solar middleman and using UV sources as convenient, portable and deadly weapons against the fanged fiends. But why, you may ask, UV?

What is the difference between incandescent light, fluorescent light, candlelight and sunlight? Natural sunlight has a little extra ultraviolet kick. Therefore, the reasoning goes, UV light must be the fatal component. Why else would vampires be able to glide, unsuspected and unharmed, through artificially lit rooms, yet burn in the sun? Since the 1990s, that pseudoscientific idea has been weaponized against vampires, and UV light has taken its place beside the wooden stakes, crucifixes and holy water that fill the vampire hunter’s arsenal.

In previous investigations of spectroscopy and the knowledge of spectra in pop culture, I’ve been surprised at how few instances I could find. Yet here, the concept is rooted in esoteric knowledge of spectra that was nurtured in the mainstream world of entertainment. The real story of how UV sensitivity came to be vampire kryptonite reveals, I think, the way that scientific discovery proceeds in the public arena.

Disintegrating in sunlight

Vampires were not always harmed by sunlight—this notion is a byproduct of the 20th century. We first saw a vampire disintegrating in sunlight in F.W. Murnau’s 1922 silent horror film “Nosferatu.” In the film—an unauthorized adaptation of Bram Stoker’s Dracula—the vampire Count Orlok is closely associated with pestilence and plague. The female protagonist, Ellen Hutter, reads that the vampire can be destroyed if he is detained until sunrise, so she sacrifices her blood to distract Orlok until daybreak.

Figure: German actor Max Schreck, as the vampire Count Orlok, being destroyed by sunlight, in a still from F.W. Murnau’s “Nosferatu,” 1922. [Photo by Hulton Archive/Getty Images]

Too late, Orlok realizes that he has been tricked, and he vanishes in a puff of smoke when sunlight touches him. The way the pestilential vampire is destroyed by sunlight may owe a touch to the notion that sunlight makes the best disinfectant (see “The Best Disinfectant,” Optics & Photonics News, March 2019, p. 24).

Many film historians leave it there, apparently secure that Murnau’s introduction of the concept made it part of the canon. But vampire films are filled with variations on the standard myth, adopted either for the sake of the story or for the beauty of a shot, that never became accepted tenets of vampire lore. Why did this one stick?

Florence (Balcombe) Stoker, Bram Stoker’s widow, was in dire financial straits when the film was released, and was furious that permission had not been sought and no royalties paid. She brought legal action against Prana Film, the makers of “Nosferatu,” and tried to have every copy destroyed. Had “Nosferatu” been the only film using the idea, the trope of the sun as vampires’ Achilles’ heel might never have caught on.

Horror revival

The reason that didn’t happen, I suspect, was World War II. There was a renaissance of horror films at Universal Studios in the 1940s, in which the monsters created in earlier decades were revived and their stories retold. These fantasies provided a diversion from the real-life horrors of war. And so, when it came to killing vampires, the studios may have wanted to get away from the bloody impalings that killed Dracula in 1931 and his daughter in 1936.

When Universal made “Son of Dracula” in 1943, directed by Robert Siodmak and written by his brother Curt, it went in a different direction. The Siodmaks had emigrated from Germany, and Curt was likely familiar with his countryman Murnau’s film. Dracula’s son, played by Lon Chaney, Jr., dies from exposure to sunlight in the movie, giving wartime audiences a bloodless death for the supernatural creature. One year later, Curt Siodmak used the same method to kill off John Carradine’s Count Dracula in “House of Frankenstein.”

Siodmak was not the only one weaponizing sunlight against vampires at the time. The year 1943 also saw the release of “Return of the Vampire” from Universal’s competitor, Columbia Pictures. This one starred Bela Lugosi, Universal’s original Dracula, as Armand Tesla, effectively Dracula by another name. He, too, is destroyed by sunlight at the end of the movie.

Again, the idea might have flamed out, but 15 years later yet another studio, Hammer Films, had Count Dracula (played by Christopher Lee) succumb to sunlight at the conclusion of “Horror of Dracula.” Having been featured in five films from four different studios, the concept took root, and ever after sunlight dissolution rivaled staking as the preferred method of vampire elimination, not only in films but also on television, in novels and in comic books.

We all scream for sunscreen

How do we get from sunlight to UV light specifically? The answer could lie along an unexpected byway—the development of sunblock creams, or sunscreens.

Another outcome of World War II was that many people from northern climates found themselves in places with harsher sunlight and longer days, and they were getting burned. A lot of research went into developing protective creams to prevent sunburn, and this effort continued after the war.

In 1956, the German scientist Rudolf Schulze came up with a way to measure and categorize the effectiveness of different sun-protection products. He evaluated commercially available protective creams by exposing treated skin to light from Osram-Ultra-Vitalux lamps, which duplicated the solar spectrum, and measuring how long it took for the skin to redden. His “Schulze Factor” was the first comparative basis for gauging the effectiveness of sunscreen.

The chemist Franz Greiter extended the work. An avid mountain climber and skier, he was badly sunburned in the mountains, so he prepared his own protective sunscreens. In 1974, he published a paper introducing what became known, in English, as the Sun Protection Factor, or SPF. The SPF was accepted as the basis for rating sunscreens by the United States Food and Drug Administration (FDA) in 1978, and the FDA publicized the idea of the SPF and its value.
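
For readers curious about the arithmetic, the SPF reduces to a simple ratio; this is a gloss on the standard definition rather than a detail drawn from Greiter’s paper itself:

SPF = (minimal UV dose that reddens protected skin) / (minimal UV dose that reddens unprotected skin)

In principle, then, skin that begins to burn after 10 minutes of unprotected sun should hold out for roughly 10 × 15 = 150 minutes under a properly applied SPF 15 cream, though in practice thin application and sweat erode that margin.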

From UV rays being harmful to people to UV rays being harmful to vampires is not a huge leap—nor is the complementary idea that, if sunblock with a high SPF will protect people, it will also protect vampires. In 1996, Paul Barber, author of the excellent book Vampires, Burial, and Death, which exposes some of the roots of vampiric legends and traditions, quoted Stephan Kaplan as saying that “vampires can come out in the daytime; they just need to wear sunblock of 15 or higher.” Kaplan, who founded the Vampire Research Center in 1972, was speaking mainly about humans who believed themselves to be vampires, but the phrase could as easily apply to the supernatural beings. The notion had already appeared a decade earlier, in the July 1986 issue of The Magazine of Fantasy and Science Fiction.

The seed was planted. The concept of effective sunblock was embedded in the collective consciousness, and the idea that vampires could profit from blocking harmful UV rays with adequate SPF inevitably followed. The first fictional instance of vampires using sunblock as protection that I am aware of is in HBO’s “Tales from the Crypt” horror comedy “Bordello of Blood” in 1996. The idea soon showed up in the “Blade” movie trilogy, and after that, it took off.

UV lamps, bullets and bombs

As for the idea that ultraviolet light itself, separate from sunlight, is harmful to vampires, the first expression that I’ve come across was in the Rosicrucian Digest in 1977. By 1985, it was being casually discussed on “The Science Fiction Radio Show,” and it was even mentioned in a couple of books in 1990.

After 2000, the idea spread like wildfire. In the “Underworld” movie series, a clan of werewolves uses “ultraviolet bullets” (which somehow emit UV light after being shot into a vampire) to destroy their vampire enemies. Author Christopher Moore has a vampire-fighting kid outfit his jacket with ultraviolet LEDs in his vampire trilogy, Bloodsucking Fiends, You Suck and Bite Me.

The graphic novel 30 Days of Night, as well as its subsequent movie adaptation, depicts humans fighting off vampires with ultraviolet lamps after being trapped above the Arctic Circle (where night can last for months—a vampire’s ideal location). The movie “Van Helsing” features a sort of flash-bang grenade (probably a magnesium bomb) that eliminates an entire room full of vampires at once. This century has seen ultraviolet light become fully weaponized in pop culture’s anti-vampire arsenal.

Interestingly, the rise of sunblock and the effort to educate the public about the carcinogenic properties of solar ultraviolet light provide a new wrinkle in the technological war between vampires and humans. (Ever since Stoker’s Dracula, at least, vampire stories have been about the supernatural creatures adapting to emerging technologies.) This cultural phenomenon also provides a new explanation for the specific quality that makes sunlight so deadly to vampires: the same thing that makes extended sun exposure harmful to humans, UV rays.


Stephen R. Wilk is with Xenon Corp., Wilmington, Mass., USA.