The Insanely Fascinating History of Hanukkah Light

Dr. David Zvi Kalman is a research fellow in the Shalom Hartman Institute’s Kogod Research Center. He is a scholar, writer, and entrepreneur working at the intersection of technology, religion, and art. In addition to his work at the Shalom Hartman Institute, he has held research and consulting roles at Sinai and Synapses and the Sapir Institute. He is the owner of Print-O-Craft Press, an independent publishing house.

Originally published in The Forward

It is American Hanukkah’s perennial irony that this so-called “Festival of Lights” is both literally and figuratively outshone by the myriad Christmas lights that blanket trees and yards from well before Hanukkah begins to well after it concludes. Though observant viewers might glimpse a handful of candles by the window during the brief time that those candles remain lighted, Hanukkah candles have become, for all intents and purposes, an entirely internal affair — this despite the legal mandate that they be lighted in a place where they can be seen from the street.

It is unusual for a holiday to be so thoroughly outclassed; given the choice, religions would rather not compete so directly at all. In the early church, the date of Easter was set in relation to Judaism’s Passover; breaking the link between the two was an important early marker of Christianity’s independence from its Jewish origins. But light is light, and neither Hanukkah nor Christmas — nor the other winter solstice holidays — would be complete without it. In the bleakest month of the year, competition is almost inevitable.

The comparison, however, rests on a faulty premise. In a candlepower battle between Christmas and Hanukkah, Hanukkah will always be overshadowed. But Hanukkah was never about lighting up the night; it was about burning candles. To understand why the latter would have been so meaningful, we first need to understand a little about artificial lighting.

We in the 21st century are accustomed to radical technological change and to peering into the recent past and reveling in the advances in technology that have been made since our childhoods and those of our parents. When it comes to artificial lighting, though, there are few people alive who remember a time when the technology was inadequate or too expensive. Artificial lighting, in America at least, is a solved problem, and it has been solved for such a long time that most forget it was ever a problem in the first place.

The past century of electric lighting masks Homo sapiens’ long struggle to find sources of lighting that were steady, safe, and cheap. Of these, cost was far and away the greatest impediment. A 1994 paper by Yale economist William Nordhaus estimated that the average worker would have needed more than five hours of labor to buy candles producing roughly as much light as a single 60-watt incandescent bulb emits in an hour; in 1992, the same amount of light could be purchased with less than half a second of work. With the advent of light-emitting diode bulbs, costs continue to plummet.

Lighting was expensive because, until very recently, appropriate fuels were few and far between, and all options had significant drawbacks. Fatty pine shards were short-lived and too smoky for interior use; naphtha, a petroleum product known from antiquity, was considered too volatile for anything but explosives. Tallow (rendered animal fat) was frequently the candle of choice for the working class, but its light was so meager and its smell so unpleasant that the rabbis prohibited its use for Shabbat candles. Beeswax was preferable, but was too expensive for regular use. Even worse: Until the 19th century, all major sources of lighting were simply repurposed foodstuffs. For those with limited resources, every candle lit meant calories uneaten.

The limitations on artificial lighting had a profound influence on society. It was never impossible to light up the night, but light always came with costs, and so the rhythms of society were designed to avoid using it. Whenever possible, one worked by day and slept by night; dinner, from Ancient Greece to medieval Europe, was commonly eaten around the “ninth hour” — that is, nine-twelfths of the way between sunrise and sunset, or sometime in the middle of the afternoon. Public lighting could not be taken for granted; even Rome had none until the end of the second century C.E., and Antioch’s public lighting was so remarkable that travelers wrote home about it well into the fourth century.

The smoke produced by indoor lighting needed attention, too, because of both the fire risk and the constant dusting it required. Marcus Vitruvius Pollio (d. ca. 15 BCE), author of a key work on architecture, warned that complicated moldings should not be used in rooms that would be lit at night, since they would quickly accumulate debris. In fact, it was not until indoor gas lighting had given way to electricity that houses with open floor plans became possible; previously, rooms had been carefully walled off from one another so as to curtail the unwanted spread of odors and soot.

The cost and trouble of indoor lighting didn’t prevent candles from being used, but it meant that you wouldn’t use one unless you had a specific reason. To eat at night was a luxury for those who could afford both candles and servants to watch them; a poor person, the Talmud reports, was more likely to eat well before nightfall. Studying has long been associated with lengthy nighttime candle use: Shakespeare uses “candle-waster” as a demeaning term for a studious individual, and the playwright Ben Jonson insults someone as “a whoreson book-worm, a candle-waster.” Candles were also associated with domestic work; Eshet Hayil, a portion of Proverbs 31 sung in some families on Friday nights, praises a woman who is fastidious in her duties: “Her lamp never goes out at night.”

Beginning with the Bible itself, Judaism has always treated lighting as a serious expense, and its donation as a meaningful gift. The cost of the synagogue’s eternal flame (ner tamid) was so substantial that it needed to be endowed, and records of such endowments can be found in medieval and early modern Jewish wills. In early modern Amsterdam, a special collection box was placed by the synagogue door to fund the large and numerous wax candles used for Yom Kippur. To this very day, the Shabbat liturgy used in Conservative and Orthodox services singles out “those who give candles for illumination.”

It was against this backdrop of lighting’s preciousness that both Christmas and Hanukkah rituals were born. Their development could not have been more different.

Christmas lights started with the tree, then spread to the yard. The origin of the tree itself is uncertain, but by the 18th century the Lichterbaum was firmly established as the prototypical German Protestant custom. The lights were originally made of expensive beeswax and hung from the branches of the tree itself. The lighting revolutions of the 19th century — most notably the introduction of paraffin, the first cheap wax — brought Christmas lighting into easier reach. From its inception, the purpose of the lights was to create an atmosphere of warmth in the depths of winter. Martin Luther, who was imagined to have started the custom, supposedly desired that the stars — including the star of Bethlehem — be brought inside each Christian house, affirming the presence of God. This atmosphere is captured by the poet Goethe, who, in “The Sorrows of Young Werther,” writes of “the joy the little ones would have and of the times when the unexpected opening of the door and the appearance of the marvelous tree with its wax candles, sweets, and apples would put them in heavenly rapture.” To put it differently, the weather outside is frightful, but the fire is so delightful.

It was electricity that finally brought Christmas lights outside for all to see. The earliest bulbs were extremely expensive. When General Electric Co. introduced the first pre-strung lights, in 1903, keeping them lit for a week would have cost a little less than an average week’s wages. So fire-prone were the early bulbs that insurance companies refused to cover the cost of resulting damages. Over the next century, both safety and cost improved dramatically. When Edison first sold electricity, he charged 24 cents per kilowatt-hour — more than $7 in 2017 dollars. Today, after more than a century of inflation, the average American pays just 12 cents per kilowatt-hour. The plummeting price has driven continuous growth in outdoor Christmas lighting, first on large trees and, subsequently, on houses. With the ability to blanket a house with light for just a few cents an hour, it has never been easier to illuminate the night.

At every turn, Christmas lights pushed the bounds of feasible incandescence; the end results are the massive displays that one can now find in every American neighborhood. But Hanukkah’s candles — with the exception of Chabad’s public menorahs — have remained candles. Even electric menorahs, which could in principle be made as bright as one likes, are usually modest affairs.

The tragedy in this is that Hanukkah candles were always intended to be a public spectacle, against which Christmas displays present themselves as nothing but Johnny-come-lately show-stealers. The requirement of pirsumei nisa — that one use the menorah to publicize the miracle of the long-burning oil — dictates many of the particulars of the ritual, including the rule that the candles be visible from the street. Spectacle is, in fact, the only permissible use of the candles; repurposing them for internal use (for example, lighting other candles, providing heat, or simply illuminating a room) is strictly forbidden.

In the dark nights of antiquity, the light of the Hanukkah candles — which the Talmud mandates be burned at the very end of the market day — could conceivably have served as precisely the kind of publicity the Talmud envisions: a small sign of conspicuous consumption. But the same laws that mandated fidelity to the oil of the original miracle pegged the menorah to an old era of lighting, barring Hanukkah from receiving the benefits of the technological revolutions to come. At the same time, European Jews who lacked access to olive oil resorted to wax, a substance of pure convenience, further eroding the historical significance of the oil. The result is what we have today: a candle-lighting ritual unmoored from both the present and the past. If a menorah is no longer good for publicity, then can it be good for anything at all?

To be fair, the purpose of the candles themselves has never been entirely clear. The miracle of oil is first articulated only in the Babylonian Talmud, hundreds of years after Titus Flavius Josephus (d. 100 CE) reported that the holiday was called the “Festival of Lights,” and hundreds of years more before the ritual of lighting candles was first reported. Perhaps, as Shai Secunda has argued, it was the Zoroastrians who got the rabbis excited about a fire-related ritual; if that is true, then Hanukkah’s survival as the most minor of Jewish holidays may be nothing but a bit of unacknowledged syncretism.

Still, the early reports of Hanukkah are bound together by a memory of a moment when sacrifice was briefly restored. Sacrifice is how the Hasmoneans signaled that they had won back the Temple; the miracle of the oil mattered precisely because it allowed for a brief return to the normal rhythms of Temple service in a Hellenistic era of ritual irregularities and rapid religious change. In the context of sacrifice, the oil is not just a signal — it is an element of sacrifice itself.

In the Bible, oil is everywhere: It’s eaten, burned, worn as a cosmetic and applied to leather. Both the Bible and other Near Eastern cultures treated oil as one of life’s basic necessities, together with food and clothing. Oil is what makes a monarch (it still is). A messiah is literally one who is anointed with oil.

Above all the oils is olive oil. Today, this is just another bottle in the pantry; in the Bible, olive oil is one of the few products — together with livestock, fowl and flour — worthy of being offered to God on the Temple altar (not to mention the menorah itself). Pure oil is purifying; no sacrifice calls for more oil than the one for ridding an individual of leprosy.

In their original oily form, the Hanukkah lights — a ritual that reached its full form only long after the Temple was gone — remember all this. They do so not by lighting up the darkness, but by re-creating the oil sacrifices (and sacrifices generally) in miniature, right there in the window for all to see. The point isn’t the light itself; it’s the fuel traveling up the wick. Christmas light illuminates; Hanukkah light offers up.

What might it mean to take seriously the idea that Hanukkah should involve a sacrifice of light?

If Hanukkah is about the conspicuous consumption of light, one might try to relive the original Hanukkah by replicating its expense. This, in the age of electricity, turns out to be effectively impossible.

The Edict on Maximum Prices, issued by Diocletian in the year 301 C.E., gives us a sense of relative costs in the Roman world. According to the edict, a farm laborer’s daily wage would have been sufficient to purchase enough regular-grade olive oil to keep a single light burning for 80 hours. Since a Hanukkah candle must burn for around half an hour, a worker could earn enough to light one with 1/160 of a day’s work.

How much light would a modern worker be able to burn in that amount of time? In America, 1/160th of the median daily wage works out to a little more than a dollar, which, at current electricity prices, would buy 8.3 kWh of electricity. If one used that electricity in modern LED bulbs, one could power about 2,000 bulbs for 30 minutes, far more than all but the most elaborate Christmas displays. Most houses would blow a fuse under this load, but assuming they didn’t, a grid of such bulbs, placed in front of one’s house, would create a display so blinding that it could not be viewed directly. From a passing car, it would appear to be about as bright as an overcast day — and that, dear reader, would be for only the first night of Hanukkah.
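For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The median daily wage, electricity price, and LED bulb wattage below are round-number assumptions chosen for illustration, not figures drawn from the edict or from any source cited above, so the output will differ slightly from the numbers in the text.

```python
# A rough sketch of the article's light-budget comparison.
# All inputs are illustrative assumptions, not sourced data.

# --- Antiquity (Diocletian's Edict, 301 C.E.) ---
burn_hours_per_daily_wage = 80   # hours of olive-oil light one day's farm wage could buy
hanukkah_burn_hours = 0.5        # a Hanukkah light must burn roughly half an hour
fraction_of_workday = hanukkah_burn_hours / burn_hours_per_daily_wage
print(f"Share of a day's work per Hanukkah light: 1/{1 / fraction_of_workday:.0f}")  # -> 1/160

# --- Today (assumed U.S. figures) ---
median_daily_wage_usd = 170          # assumed median daily wage
electricity_usd_per_kwh = 0.12       # assumed residential electricity price
led_bulb_watts = 8                   # a typical 60W-equivalent LED bulb

budget_usd = median_daily_wage_usd * fraction_of_workday    # ~ $1.06
energy_kwh = budget_usd / electricity_usd_per_kwh           # ~ 8.9 kWh
power_kw = energy_kwh / hanukkah_burn_hours                 # ~ 17.7 kW sustained for 30 minutes
bulbs = power_kw * 1000 / led_bulb_watts                    # ~ 2,200 bulbs

print(f"Budget for one light: ${budget_usd:.2f}")
print(f"Electricity it buys: {energy_kwh:.1f} kWh")
print(f"LED bulbs lit for 30 minutes: about {bulbs:,.0f}")
```

With these inputs the sketch lands at roughly 2,200 bulbs, in the same ballpark as the 2,000 cited above; the exact count shifts with whatever wage, rate, and wattage one plugs in.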

More than that: Such a practice would run afoul of the 20th-century rabbis who, understandably, frowned on Hanukkah lights whose fuel source was separated from their flame. Sacrifice is not just about spending money; it is about seeing something of yours being used up, transformed into light, and vanishing into the air.

If a deep appreciation of Hanukkah candles requires an affinity for the Temple, perhaps the lights cannot be saved from obscurity. But perhaps attachment is still possible through the candle itself.

Jews — and Jewish women, in particular — have long had personal relationships with candles through the process of making them, laboriously, by hand. Soul candles have been made for centuries; with each thread, a woman would recite the name of a deceased relative. What remained from these candles after their debut on Yom Kippur eve would be reshaped for menorah use. The experience of seeing such a candle slowly disappear into smoke could not be replicated with a million bulbs.

Hanukkah lights are dim when they are generic; they are brilliant when they are personal. This Hanukkah, I suggest you roll your own.
