
Blog Post | Human Development

Grim Old Days: Emily Cockayne’s Hubbub: Filth, Noise & Stench in England, 1600–1770

The book gives insight into a far crueler and more unpleasant society than people today can easily fathom.

Summary: The realities of life in preindustrial England reveal a world teeming with physical discomforts, social cruelty, and environmental hazards unimaginable to modern sensibilities. England from 1600 to 1770 was plagued by disease, primitive hygiene, adulterated food, and oppressive punishments. Far from the romanticized notions of simpler times, living in this early modern time and place often meant enduring relentless hardship and indignity.


British historian Emily Cockayne’s Hubbub: Filth, Noise & Stench in England, 1600–1770 provides a window into the lives of ordinary people in the preindustrial and early industrial era. A “social history,” the book conveys how the world sounded, smelled, felt, and tasted—a horror show beyond the comprehension of most modern people. The chapters bear titles such as “Itchy,” “Mouldy,” “Grotty,” “Dirty,” and “Gloomy.”

A preindustrial person transported to the present day would be amazed by the current prevalence of relatively smooth, clear skin enabled by better general health in addition to the widespread use of sunscreen, moisturizers, and all manner of modern beauty treatments. In the past, frequent illnesses left victims permanently marked. To be “Pock-broken” or “pock-freckled” was a common descriptor. Skin was often directly disfigured by diseases and further damaged by the compulsive scratching that fleas and then-common medical conditions provoked. “Fleas would have been a common feature of institutions and inns, as well as domestic settings,” proliferating in crammed households, cities, and seaports. The Dutch traveler William Schellinks (1623–1678) found the fleas in one English inn so “aggressive” that he opted to sleep on a hard bench rather than the provided bed. But fleas were far from the sole culprit. “Many conditions would have caused itching, including eczema, impetigo, ‘psorophthalmy’ (eyebrow dandruff), scabies, chilblains, chapped and rough skin, ‘tetters’ (spots and sores), ‘black morphew’ (leprous or scurvy skin) and ringworm. Few citizens [of Britain] enjoyed smooth unblemished skin.”

If you could visit the past, you would be shocked at the commonness not just of pockmarks but also of oozing open sores. “Venereal disease was the secret epidemic that blighted the entire period,” resulting in such outward signs as “weeping sores on the lips” and “pocky” countenances. Many other diseases also produced wounds that festered and exuded foul discharges on the faces of everyday people. “In this pre-antibiotic era, skin eruptions in the forms of bulging pustules, lesions, acne and gout-induced ulcers could all have become infected, causing chronic wounds.” Such skin problems affected all social classes. “In 1761, as an Oxford undergraduate, the parson-in-waiting James Woodforde . . . was plagued by a ‘bad Boyle on my Eye-brow’. This boil reappeared the following year, to be joined by a stye among his lower right eyelashes.”

With so many faces covered in scars, as well as boils and sores emitting blood and infected pus, it is an understatement to say the people of the past were in desperate need of skincare. Sadly, their primitive skincare and makeup regimens made matters even worse. “Caustic and toxic ingredients lurked in many ready-made and home-mixed cosmetics and toiletries. Eliza Smith’s cure for pimples included brimstone (sulphur). Johann Jacob Wecker suggested the use of arsenic and ‘Dogs-turd’ as ingredients for ointments to ‘make the nails fall’. The Duchess of Newcastle warned that the mercury in some cosmetics could cause consumption and oedema. Indeed, some preparations were so toxic that they could ‘take away both the Life and Youth of a Face, which is the greatest Beauty.’ The Countess of Coventry was said to have died from toxic properties in her cosmetics.” That countess, Maria Coventry née Gunning (1732–1760), died at age 27, likely of lead poisoning, as lead was a common ingredient in skin-whitening makeup at the time, despite lead’s propensity to make its wearers ill (or, in Maria’s case, deceased).

Maria, Countess of Coventry, killed by lead in her makeup. Photo credit: Wikimedia.

Even nonlethal makeup was of far poorer quality than today’s cosmetics, frequently dissolving and dripping. Women “shunned hot places for fear of melting visages.” Even royalty, with access to the best cosmetics of the era, fell victim to this tendency of makeup to drip. One observer remarked in his diary after seeing the queen of England at a banquet in 1662 that “her make-up was running down her sweaty face.”

The state of clothing for the masses contributed to skin and health problems. The truly poor bought used garments. “Poorer citizens rarely bought new items of clothing, but made do with second-, third- and fourth-hand clothes. . . . By the time they reached the poorest members of society, garments would be smutted, food-stained, sweat-ridden, pissburnt and might shine with grease. . . . Clothes in such a state would be hard, unyielding and smelly.”

“The second-hand market was a thriving one” in early modern London. “Some specialised in old shoes, or even old boots. [The Dutch-born artist] Marcellus Laroon included an image of a trader who exchanged brooms for cast-off shoes . . . in his Cryes of London (1688). . . . A high demand for second-hand clothing meant that garments constituted a considerable proportion of property that was stolen. Thomas Sevan was apprehended . . . wearing three stolen shirts in 1724. He had left his old ragged shirt behind at the scene of the crime. Elizabeth Pepys’s new farandine waistcoat was snatched from her lap as she sat in traffic in Cheapside. On Easter Monday 1732 John Elliott became the victim of highway robbers who relieved him of his hat, wig, waistcoat and shoes. . . . No item of clothing was immune from theft—even odd shoes and bundles of dirty washing were lifted.”

“Clothes could be taken to a botcher, or a botching tailor, for patching and repair. . . . Old shoes were rejuvenated or modified by cobblers, or ‘translators.’ The subsequent wearers of shoes would have worked their feet into spaces stretched to fit a foreign shape, which might have caused blisters, bunions and corns. . . . Partial unstitching and ‘turning’—the inner parts becoming the new exterior—could prolong the life of coats and other garments. Even the rich eked out the life of their favourite garments by turning, dyeing and scouring. . . . However, clothes could only be refashioned a limited number of times before they became napless, threadbare and tattered. If enough good fabric remained, this could be reused to make a smaller item of clothing, a garment for a child, or a cloth cap. . . . Tired garments were passed down to apprentices or servants.”

The condition of teeth was also disturbingly poor. “Queen Elizabeth sported black teeth. Emetics were popular cure-alls, and these would have hastened tooth decay through the acidic erosion of the enamel. Archaeological surveys suggest that the majority of early modern adults suffered tooth decay.” While they did not meet with much success, the people of the past certainly attempted to keep their teeth from rotting. “There was an array of dentifrice powders and cures on the market. Although most would have had little or no effect on cavities or diseased gums, some of these powders and recipes would have carried away some dirt and plaque from teeth. Powders were concocted from cuttlefish, cream of tartar and sal amoniack (ammonium chloride). These abrasive substances could be rubbed on” teeth, and some recommended “hard rubbing with a dry cloth or sage leaf” to cleanse teeth. The writer Thomas Tryon (1634–1703) recommended swishing river water as a mouthwash. Needless to say, such routines were insufficient. “A lack of adequate tooth cleansing and an inappropriate diet led to bad breath and also caused tooth decay.” Missing teeth were common. “A character in an eighteenth-century play bemoaned the poor dental state of London’s women” by claiming that “not one in ten has a Tooth left.” When those suffering from toothaches sought dental care, what passed for dentistry at the time could make matters even worse. Consider the unfortunate case of the English lawyer and politician Dudley Ryder (1691–1756). “After spending a month in 1715 chewing on just one side of his mouth to avoid the pain of a severely decayed tooth, Dudley Ryder finally summoned up the courage to have it drawn. In the process, a little of his jaw was broken off, but he rallied, claiming it didn’t hurt. Much. By the mid-eighteenth century wealthier citizens would have the option of trying out a transplant, using teeth from a paid donor.”

Tooth and skin problems were visible, but internal ailments that were less apparent also plagued our ancestors. One of the many negative health effects of animals crowding the cities was that parasites from the creatures often spread to humans. “The abundance of dogs and pigs on the city streets provided the perfect breeding ground for a variety of intestinal parasites, many of which wormed their way into humans. Eliza Smith asserted that ‘vast numbers’ were infested. Many bottoms would have itched with discomfort thanks to the presence of thread and tape worms in the digestive system. According to the numerous contemporary adverts, worms created a myriad of physical discomforts, including ‘pinching Pain in the Belly, when hungry, a stinking Breath’, vomiting, nightmares, pallidness, fever and teeth gnashing.” The animals caused other problems as well. “Neighbours near to houses in which beasts were kept or slaughtered would have endured stench and noise.” For example, “those living near Lewis Smart’s huge piggery on London’s Tottenham Court Road described how servants fell sick and resigned on account of the smell, which ‘Drive thro’ the walls of the houses.’ Visitors to the house opposite were forced to hold their noses, and one neighbour explained how the fumes dirtied newly laundered linen and tarnished plate.”

The people of the past often went hungry. “Recording a high rate of corn spoilage in 1693, due to a wet summer season, [the English antiquary] Anthony Wood noted that scarcity pushed prices out of the pockets of the poor, who were forced to ‘eat turnips instead of bread’. During this dearth [the writer] Thomas Tryon outlined a diet for a person on a budget of twopence per day. The recipes are uniformly bland: flour, water, milk and peas, all boiled to differing consistencies.”

Food often spoiled during transport to the market. “Eggs that came to London from Scotland or Ireland were often rotten by the time they arrived.” Food was often adulterated, and some degree of adulteration was considered unavoidable. Malt was only deemed unacceptable if it contained “half a peck of dust or more” per quarter. “Butchers would disguise stale slaughtered birds. [A contemporary account] warns of one such operator who greased the skin and dredged on a fine powder to make the bird strike ‘a fine Colour.’” Butter was frequently adulterated with “tallow and pig’s lard.” “Some fishmongers coated gills with fresh blood, as red gills indicated a recent netting,” to misrepresent stale fish to the unwary buyer. Fish were often wormy and, if not cooked thoroughly, remained so when served. The English diarist and naval administrator Samuel Pepys (1633–1703) once noted his disgust at the sight of a sturgeon dish upon which he observed “very many little worms creeping.”

Bread, the mainstay of most diets, was not immune to contamination. “Some loaves were deliberately adulterated with stones and other items to bulk them up.” In 1642, an unscrupulous Liverpool woman named Alice Gallaway “tried to sell a white loaf that contained a stone, to make up its weight. This sort of practice would have been widespread—the baker could claim that the stone had not been removed in milling, and blamed the miller. Stone, grit and other unwelcome contaminants would have posed dangers to the teeth of the unwary.” Millers also engaged in such unethical behavior as adding “beanmeal, chalk, animal bones and slaked lime” to disguise musty flour. Perhaps it should be no surprise then that London bread was described in 1771 as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.”

There are even accounts of human remains being added to food for sale, resulting in unknowing cannibalism on the part of the buyer. The author of the 1757 public health treatise Poison Detected claimed, “The charnel houses of the dead are raked to add filthiness to the food of the living.” The squalid state of the marketplace further exposed food to pollution or contamination. “The market stalls, and the streets on which they stood, were frequently described as being filthy and strewn with rotting debris.” Flies and other insects swarmed each market. “Hanging meats were vulnerable to attack by hopper-fly, and if they got too warm they would rust and spoil.” The smoke of London’s chimneys was said to fill the air and “so Mummife, drye up, wast and burn [hanging meat in the marketplace], that it suddainly crumbles away, consumes and comes to nothing.”

The population was so accustomed to foul-smelling meat that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.” Between the bugs, the smoke, and the dirt, few groceries reached customers unscathed. One 18th-century writer complained of “pallid contaminated mash, which they call strawberries; soiled and tossed by greasy paws through twenty baskets crusted with dirt.” The state of the marketplace even inspired deprecating lyrics, such as these from 1715, “As thick as Butchers Stalls with Fly-blows [where] every blue-ars’d Insect rambles.” “As the market day progressed, perishables . . . were more likely to be fly-blown or decayed.” Those undesirable leftovers unsold at the end of the market day were often later hawked by street vendors. A letter in The Spectator in 1712 complained that everything sold by such vendors was “perished or putrified.” Recipes took into account the poor quality of available ingredients. “Imparting some dubious tips for restoring rotting larder supplies, [cookbook author] Hannah Glasse’s strategy ‘to save Potted Birds, that begin to be bad’ (indeed, those which ‘smell so bad, that no body [can] . . . bear the Smell for the Rankness of the Butter’) involved dunking the birds in boiling water for thirty seconds, and then merely retopping with new butter.”

Yet those shopping at the marketplace with all its terrors were relatively fortunate compared to others. “Broken victuals, the remnants and scrapings from the more affluent plates, were a perk of service for some servants, and the saviour of many paupers.” One account from 1709 tells of a woman reduced to living off “a Mouldy Cryst [crust] and a Cucumber” while breastfeeding, an activity that greatly increases caloric needs. Desperation sometimes resulted in swallowing nonfood objects, such as wax, to ease hunger pangs. “Witnesses reported that a young London servant girl was so hungry in 1766 that she ate cabbage leaves and candles.” She was far from the first person to use candle wax as a condiment. “The underfed spread butter thickly on bread (this was necessary to facilitate swallowing dark or stale bread). Cheap butter was poor grade, akin to grease . . . a ‘tallowy rancid mass’ made of candle ends and kitchen grease was the worst type” of concoction to pass under the name of butter. Another account of hunger from 1756 relates how a starving woman felt “obliged to eat the cabbage stalks off the dunghill.”

The people of the past also had good reason to wonder whether their homes would collapse around them. “A proverb warned that ‘old buildings may fall in a moment’. So familiar was the sound of collapsing masonry that in 1688 Randle Holme included ‘a crash, a noise proceeding from a breach of a house or wall’ in a list of only nine descriptive sentences to illustrate the ‘Sense of Hearing’. Portmeadow House in Oxford collapsed in the early seventeenth century. Among the casualties recorded in the Bills of Mortality for 1664 was one hapless soul killed by a falling house in St Mary’s Whitechapel . . . Dr Johnson described London of the 1730s as a place where ‘falling Houses thunder on your Head.’ . . . In the 1740s, ‘Props to Houses’ appeared among a list of common items hindering free passage along the pavement in London. A German visitor wondered if he should go into the street in 1775 during a violent storm, ‘lest the house should fall in, which is no rare occurrence in London.’” “Thomas Atwood, a Bath plumber and property developer, died in 1775 when the floor of an old house gave way.” Regulations sometimes made matters worse, preventing the tearing down of homes on the verge of collapse. One account notes that homes in disrepair became “the rendezvous of thieves; and at last . . . fall of themselves, to the great distress of whole neighborhoods, and sometimes to bury passengers in their ruins.” Windy days could knock down homes. “Gales swept [London] in 1690, leaving ‘very many houses shattered, chimneys blowne down.’”

Inside, homes were often filled with smoke from fireplaces. “With open fires providing most of the heating, filthy discharges of soot and smut clung to interiors.” Even with regular chimney sweepings, clogged chimney pots and soot deluges could and did occur. One writer railed against the “pernicious smoke . . . superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.” Interior smoke disturbed the air of the humblest homes and the grandest palaces alike. The German consul Zacharias Conrad von Uffenbach (1683–1734) complained that the Painted Chamber of London’s Westminster Hall could “scarce be seen for the smoke” that filled the interior; in the Upper Chamber he similarly noted that the tapestries were “so wretched and tarnished with smoke that neither gold nor silver, colours or figures can be recognized.”

“Householders struggled to contain infestations of vermin.” This was a problem even in well-off homes. Samuel Pepys recorded in his diary his multiyear struggle with mice, which “scampered across his desk” with abandon despite his purchase of a cat and deployment of mousetraps. “In 1756 Harrop’s Manchester Mercury ran an advert for a book detailing how to rid houses of all manner of vermin,” including adders, ants, badgers, birds, caterpillars, earwigs, flies, fish, fleas, foxes, frogs, gnats, lice, mice, moles, otters, polecats, rabbits, rats, snakes, scorpions (an invasive species of which had entered England via Italian masonry shipments), snails, spiders, toads, wasps, weasels, and worms.

As if that wasn’t enough to keep people up at night, nighttime was loud. Crying babies and the moaning of the hungry, ill, and dying echoed in the night, as well as the pained wails of women suffering through domestic violence. In London, in 1595, a law was passed to prevent men from beating their wives after 9 p.m. The legislation was not prompted by concern for the wives (after all, wife-beating was generally accepted as normal and morally unproblematic) but by consideration for neighbors trying to sleep through the noise. The law read in part: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” A similar law forbade smiths from using their hammers “after the houre of nyne in the night, nore afore the houre of four in the Morninge.”

The book gives insight into a far crueler and more violent society than our own. Legal punishments could be grotesque and sadistic. For example, in 1611, a woman who had conducted “lewd acts . . . was punished by the Westminster burgesses by being stripped naked from the waist upwards, fastened to a cart, and whipped through the streets on a cold December day.” Women deemed “scolds” were often publicly humiliated in ritual fashion. “Ducking stools or cuckstools were equipment for punishing scolds and were items of town furniture [and] were still used as a deterrent in the eighteenth century. Ducking was a rite of humiliation intended to put the woman in her place and to teach her a lesson.” Many towns took pride in maintaining their ducking stools, and some also kept a device with a similar rationale called a “scold’s bridle,” an iron muzzle that enclosed the head and compressed the tongue to silence the unfortunate wearer.

“Across the country [of England] the civic authorities ensured that their cuckstools were functioning. In 1603 the Southampton authorities complained that ‘the Cuckinge stoole on the Towne ditches is all broken’ and expressed their desire for a new one, to ‘punish the manifold number of scoldinge woemen that be in this Towne’. The following year they wondered whether a stool-on-wheels might be invented. This could be ‘carried from dore to dore as the scolde shall inhabit’. This mobile stool would, it was explained, be ‘a great ease to mr mayor . . . whoe is daylie troubled w[i]th suche brawles’. The Oxford Council erected a cuck stool at the Castle Mills in 1647. The Manchester stool was set up in 1602 ‘for the punyshement of Lewde Wemen and Scoldes’ . . . six scolds were immersed in 1627. A decade later the town added a scold’s bridle to their armoury of reform. A new ducking chair was erected in ‘the usual place’ in 1738. Even as late as 1770 a knot and bridle hung from the door of the stationers, near the Dark Entry in the Market Place ‘as a terror to the scolding huxter-women.’”

Outhouses doubled as dumping grounds for victims of infanticide with shocking frequency. “Much of what we know about London’s privies and houses of ease comes from unpleasant witness statements concerning gruesome discoveries of infants’ corpses found among the filth. In the trial of Mercy Hornby for killing her newborn daughter we find details of the privy into which the child was cast. Newly constructed in the 1730s, it was six foot deep, with just over three feet of soil at the time of the incident.”

And that is only a small slice of the manifold horrors detailed in Cockayne’s book, where practically every page provides fresh fodder for nightmares.

Blog Post | Democracy & Autocracy

Open Societies and Closed Minds | Podcast Highlights

Marian Tupy interviews Matt Johnson about historicism, progress, and how tribalism and the “desire for recognition” are testing the foundations of open societies.

Listen to the podcast or read the full transcript here.

Today, I’m very lucky to speak to Matt Johnson, who recently had a fascinating essay in Quillette titled “The Open Society and Its New Enemies: What Karl Popper’s classic can teach us about the threats facing democracies today.”

So Matt, could you tell us who Karl Popper was and what this big book is about?

Popper is mainly known for his work in the philosophy of science, especially his ideas about falsifiability. He published a book called The Open Society and Its Enemies in 1945. He started writing it right after the Nazi annexation of Austria. It’s a very powerful and clarifying set of principles for anybody interested in liberal democracy and the broader project of building open societies around the world today.

So, why talk about liberal democracies and openness? It is our conjecture here at Human Progress that openness is very important. Have you ever thought or written about the connection between openness, liberal democracy, and the scope and speed of human progress?

That’s been a major theme of my work for a long time. I think there is a strong connection between the development of liberal democracy and open societies throughout the 20th century and human progress. Liberal democracy, unlike its authoritarian rivals, has error correction mechanisms built in. It allows for pluralism in society. It allows people to cooperate without the threat of violence or coercion. There’s also the economic element: Liberal democracy facilitates free trade and open exchange because it’s rule-based and law-bound, which are important conditions for economic development.

Human Progress also assumes that there is some directionality in history. We can say that living in 2025 is better than living in 1025 or 25 AD. But you begin your essay by raising the dangers of what Karl Popper called historicism, or a belief in the inevitability of certain political or economic outcomes. Can you unwind that for us? What is the difference between acknowledging the directionality of human history and historicism?

Popper regarded historicism as extremely dangerous because it treats human beings as a means to an end. If you already know what you’re working toward—a glorious worker state or some other utopia—then it doesn’t matter how much pain you have to inflict in the meantime. You’re not treating your citizens as ends whose rights must be protected; you’re treating them as raw material, as characters in this grand historical story.

The second concern is that historicism is anti-scientific because you can hammer any existing data into a form that fits your historicist prophecy.

Marx wrote that the unfolding of history is inevitable. In his view, leaders were just responsible for making that unavoidable transition easier. That’s the central conceit of historicism. If you take a Popperian view, you’re much more modest. You have to ground every policy in empirical reality. You have to adjust when things don’t work. You’re not just birthing a new paradigm you already know everything about. You don’t know what the future holds.

Stalin would say, anytime there was a setback, that it was all part of the same plan. It was all just globalist saboteurs attacking the Soviet Union, or it was some part of the grand historical unfolding that was moving toward the dictatorship of the proletariat. There’s no sense in which new information can change the course of a government with historicist ideas.

That differs from a general idea of progress. We have a lot of economic data that suggests that people have escaped poverty at an incredible rate since the middle of the 20th century. We’ve seen democratization on a vast scale around the world. We’ve seen interstate relations become much more tranquil and peaceful over the past several decades. I mean, the idea of Germany and France fighting a war now is pretty much inconceivable to most people. That’s a huge historical victory; it’s unprecedented in the history of Western Europe.

So, there are good reasons to believe that we’ve progressed. And that’s the core difference between the observation and acknowledgment of progress and historicism, which is much less grounded in empirical reality.

Right. The way I understand human progress is backward-looking. We can say that we are richer than we were in the past. Fewer women die in childbirth. Fewer infants die. We have fewer casualties in wars, et cetera. But we don’t know where we are going.

Yeah, absolutely. There were moments during the Cold War that could have plunged us into nuclear war. It makes no sense to try to cram every idea into some existing paradigm or prophecy. All we can do is incrementally move toward a better world.

This brings us to another big name in your piece: Francis Fukuyama. Tell me how you read Fukuyama.

Fukuyama is perhaps the most misread political science writer of our time. There are countless lazy journalists who want to add intellectual heft to their article about some new crisis, and they’ll say, “well, it turns out Fukuyama was wrong. There are still bad things happening in the world.” That’s a fundamental misreading of Fukuyama’s argument. He never said that bad things would stop happening. He never said there would be an end to war, poverty, or political upheaval. His argument was that liberal capitalist democracy is the most sustainable political and economic system, that it had proven itself against the great ideological competitors in the 20th century, and that it would continue to do so in the future.

I think it’s still a live thesis; it hasn’t been proven or disproven. I suppose if the entire world collapsed into totalitarianism and remained that way, then yeah, Fukuyama was wrong. But right now, there’s still a vibrant democratic world competing against the authoritarian world, and I think that liberal democracy will continue to outperform.

You use a phrase in the essay I didn’t quite understand: “the desire for recognition.” What does it mean, and why is it important to Fukuyama?

The desire for recognition is the acknowledgment that human desires go beyond material concerns. We want to be treated as individuals with worth and agency, and we are willing to sacrifice ourselves for purely abstract goals. Liberal democracies are the only systems so far that have met the desire for recognition on a vast scale. Liberal democracies treat people as autonomous, rational ends in themselves, unlike dictatorships, which treat people as expendable, and that’s one of the reasons why liberal democracy has lasted as long as it has.

However, there’s a dark side. Because liberal democracy enables pluralism, people can believe whatever they want religiously and go down whatever political rabbit holes they want to. And, oftentimes, when you have the freedom to join these other tribes, you find yourself more committed to those tribes than to the overall society. If you’re a very serious Christian nationalist, you might want society organized along the lines of the Ten Commandments because that, in your view, is the foundation of morality. So, pluralism, which is one of the strengths of liberal democracy, also creates constant threats that liberal democracy has to navigate.

I noticed in your essay that you are not too concerned. You note that democracy is not in full retreat and that, if you look at the numbers, things are not as dire as they seem. What is the argument?

If you just read annual reports from Freedom House, you would think that we’re on our way to global authoritarianism. However, if you take a longer historical view, even just 80 years versus 20 years, the trend line is still dramatically in favor of liberal democracies. It’s still an amazing historical achievement. It’s getting rolled back, but in the grand sweep of history, it’s getting rolled back on the margins.

Still, it’s a dangerous and frightening trend. And you’re in a dangerous place when you see a country like the United States electing a president who is expressly hostile toward the peaceful transfer of power after four years. So, the threats to democracy are real, but we need to have some historical perspective.

So, we are more liberally democratic than we were 40 years ago, but something has happened in the last 15 to 20 years. Some of the trust and belief in liberal democracy has eroded.

How is that connected to the issue of recognition?

In the United States, if you look at just the past five or six years, there has been a dramatic shift toward identity politics, which is a form of the desire for recognition.

On the left, there was an explosion of wokeness, especially in 2020, where there was a lot of authoritarianism. People were shouted down for fairly anodyne comments, and editors were forced out of their roles. And on the right, there’s this sense that native-born Americans are more completely American than other people. All of these things are forms of identity politics, and they privilege one group over another and drive people away from a universal conception of citizenship. That’s one of the big reasons why people have become less committed to pluralism and the classic American idea of E pluribus unum.

Have you ever thought about why, specifically after 2012, there was this massive outpouring of wokeness and identity politics? Some people on the right suggest that this is because America has begun to lose religion, and, as a consequence, people are seeking recognition in politics.

I think it could be a consequence of the decline of religion. I’ve written a lot about what many people regard as a crisis of meaning in Western liberal democracies. I think, to some extent, that crisis is overblown. Many people don’t need to have some sort of superstructure or belief system that goes beyond humanism or their commitment to liberalism or what have you.

However, I also think that we’re inclined toward religious belief. We search for things to worship. People don’t really want to create their own belief systems; they would rather go out there and pick a structure off the shelf. For some, it’s Catholicism or Protestantism, and for others, it’s Wokeism or white identity politics. And there were elements of the woke explosion that seemed deeply religious. People talked about original sin and literally fell on their knees.

We also live in an era that has been, by historical standards, extremely peaceful and prosperous, and I think Fukuyama is right that people search for things to fight over. The more prosperous your society is, the more you’ll be incensed by minor inequalities or slights. The complaints you hear from people today would be baffling to people one hundred years ago.

I also think the desire for recognition gets re-normed all the time. It doesn’t really matter how much your aggregate conditions have improved; when new people come into the world, they have a set of expectations based on their surroundings. And it’s a well-established psychological principle that people are less concerned about their absolute level of well-being than their well-being relative to their neighbors. If you see your neighbor has a bigger house or bigger boat, you feel like you’ve been cheated. And this is also the language that Donald Trump uses. It’s very zero-sum, and he traffics in this idea that everything is horrible.

You raised a subject that I’m very interested in, which is the crisis of meaning. I don’t know what to make of it. Everybody, including people I admire and respect, seems to think there is a crisis of meaning, but I don’t know what that means.

Is there more of a crisis of meaning today than there was 100 years ago or even 50 years ago? And what does it really mean? Have you thought about this issue?

You’re right to question where this claim comes from. How can people who claim there is a crisis of meaning see inside the minds of the people who say that they don’t need religion to live a meaningful life? There’s something extremely presumptuous there, and I’m not sure how it’s supposed to be quantified.

People say, well, look at the explosion of conspiracism and pseudoscience. And there are people who’ve become interested in astrology and things like that. But humanity has been crammed with pseudoscience and superstition for as long as we’ve been around. It’s very difficult to compare Western societies today to the way they were a few hundred years ago when people were killed for blasphemy and witchcraft.

And look at what our societies have accomplished in living memory. Look at the vast increase in material well-being, the vast improvements in life expectancy, literacy, everything you can imagine. I find all that very inspiring. I think if we start talking about democracy and capitalism in that grander historical context, then maybe we can make some inroads against the cynicism and the nihilism that have taken root.

The Human Progress Podcast | Ep. 61

Matt Johnson: Open Societies and Closed Minds

Marian Tupy speaks with writer and political thinker Matt Johnson about historicism, progress, and how tribalism and the “desire for recognition” are testing the foundations of open societies.

Blog Post | Human Development

The Real Threats to Golden Ages Come From Within

History’s high points have been built on openness, Johan Norberg's new book explains.

Summary: Throughout history, golden ages have emerged when societies embraced openness, curiosity, and innovation. In his book Peak Human, Johan Norberg explores how civilizations from Song China to the Dutch Republic rose through trade, intellectual freedom, and cultural exchange—only to decline when fear and control replaced dynamism. He warns that our current prosperity hinges not on external threats but on whether we choose to uphold or abandon the openness that made it possible.


“Every act of major technological innovation … is an act of rebellion not just against conventional wisdom but against existing practices and vested interests,” says economic historian Joel Mokyr. He could have said the same about artistic, business, scientific, intellectual, and other forms of innovation.

Swedish scholar Johan Norberg’s timely new book—Peak Human: What We Can Learn from the Rise and Fall of Golden Ages—surveys historical episodes in which such acts of rebellion produced outstanding civilizations. He highlights what he calls “golden ages” or historical peaks of humanity ranging from ancient Athens and China under the Song dynasty (960–1279 AD) to the Dutch Republic of the 16th and 17th centuries and the current Anglosphere.

What qualifies as a golden age? According to Norberg, societies that are open, especially to trade, people, and intellectual exchange, produce these remarkable periods. These periods are characterized by optimism, economic growth, and achievements in numerous fields that distinguish them from other contemporary societies.

The civilizations that created golden ages imitated and innovated. Ancient Rome appropriated and adapted Greek architecture and philosophy, but it was also relatively inclusive of immigrants and outsiders: being Roman was a political identity, not an ethnic one. The Abbasid Caliphate, which began more than a thousand years ago, was the most prosperous place in the world of its time. It located its capital, Baghdad, at the “center of the universe” and from there promoted intellectual tolerance, learning, and free trade to produce a flourishing of science, knowledge, and the arts that subsequent civilizations built upon.

China under the Song dynasty was especially impressive. “No classic civilization came as close to unleashing an industrial revolution and creating the modern world as Song China,” writes Norberg.

But that episode, like others in the past, did not last: “All these golden ages experienced a ‘death-to-Socrates moment,’” Norberg observes, “when they soured on their previous commitment to open intellectual exchange and abandoned curiosity for control.”

The status quo always pushes back: “Elites who have benefited enough from the innovation that elevated them want to kick away the ladder behind them,” while “groups threatened by change try to fossilize culture into an orthodoxy.” Renaissance Italy, for example, came to an end when Protestants and Catholics of the Counter-Reformation clashed and allied themselves with their respective states, thus facilitating repression.

Today we are living in a golden age that has its origins in 17th-century England, which in turn drew from the golden age of the Dutch Republic. It was in 18th-century England that the Industrial Revolution began, producing an explosion of wealth and an escape from mass poverty in much of Western Europe and its offshoots like the United States.

And it was the United States that, since the last century, has served as the backbone of an international system based on openness and the principles that produced the Anglosphere’s success. As such, most of the world is participating in the current golden age, one of unprecedented global improvements in income and well-being.

Donald Trump says he wants to usher in a golden age and appeals to a supposedly better past in the United States. To achieve his goal, he says the United States does not need other countries and that the protectionism he is imposing on the world is necessary.

Trump has not learned the lessons of Norberg’s book. One of the most important is that the factors that determine the continuation of a golden age are not external, such as a pandemic or a supposed clash of civilizations. Rather, says Norberg, the critical factor is how each civilization deals with its own internal clashes and whether it chooses to remain at its historical peak.

A Spanish-language version of this article was published by El Comercio in Peru on 5/6/2025.

Blog Post | Human Development

Grim Old Days: A. Roger Ekirch’s At Day’s Close, Part 2

What was the world really like when nightfall meant fear, filth, and fire?

Summary: A. Roger Ekirch’s book offers a vivid and unsettling portrait of life after dark before the modern era. In a world lit dimly by candles and haunted by both real and imagined dangers, the setting sun marked the beginning of fear, vulnerability, and isolation. From rampant crime to ghostly superstitions, nocturnal life was fraught with hardship, mystery, and menace that shaped how generations lived.


Read part one of the book review.

The historian A. Roger Ekirch’s book At Day’s Close: Night in Times Past provides a fascinating window into our ancestors’ world. The book provides insight into everything from the nocturnal dangers they faced, such as the threats of crime and fire, to their deeply uncomfortable sleeping arrangements. For excerpts from the book on that last subject, click here.

Nighttime in the past was far darker than today. Lighting was of poor quality and prohibitively expensive. “Preindustrial families were constrained by concerns for both safety and frugality.” Indeed, “even the best-read people remained sparing with candlelight. In his diary for 1743, the Reverend Edward Holyoke, then president of Harvard, noted that on May 22 and 23 his household made 78 pounds of candles. Less than six months later, the diary records in its line-a-day style, ‘Candles all gone.’”

Use of candles during the day was widely considered so extravagantly wasteful that it was avoided even by the wealthy. In 1712, the rich Virginia planter William Byrd II recorded finding an enslaved woman on his plantation named Prue “with a candle by daylight” for which he barbarically “gave her a salute with [his] foot” (in other words, kicked her). Jonathan Swift advised servants to never light candles “until half an hour after it be dark” to avoid facing wrath.

Most people, of course, had no servants (free or enslaved) and even fewer candles to spare. “At all hours of the evening, families often had to navigate their homes in the dark, carefully feeling their way” and relying on familiarity with the house. “Individuals long committed to memory the internal topography of their dwellings, including the exact number of steps in every flight of stairs.” The wood stair railing of a plantation in colonial Maryland features a distinctive notch to alert candle-less climbers of an abrupt turn.

“All would be horror without candles,” noted a 16th-century writer. Yet “light from a single electric bulb is one hundred times stronger than was light from a candle or oil lamp.” Although they were the best form of artificial lighting our ancestors knew, candles created only small and flickering areas of light. Rather than completely filling a room as artificial light does today with the flick of a light switch, candlelight merely “cast a faint presence in the blackness,” not reaching the ceiling or the end of a room and leaving most of one’s surroundings still drenched in darkness. Even objects within the reach of the pitifully small pool of light could appear distorted. A French saying mocking the poor quality of candle illumination stated, “By candle-light a goat is lady-like.”

“Prices fluctuated over time, but never did wax . . . candles become widely accessible. . . . Tallow candles, by contrast, offered a less expensive alternative. The mainstay of many families, their shaft consisted of animal fat, preferably rendered from mutton that was sometimes mixed with beef tallow. (Hog fat, which emitted a thick black smoke, did not burn nearly as well, though early Americans were known to employ bear and deer fat.)” Vermin found such candles delectable. “Tallow candles required careful storage so that they would neither melt nor fall prey to hungry rodents.” Unpleasantly, candles “made from tallow gave off a rancid smell from impurities in the fat. . . . Wicks not only flickered, but also spat, smoked and smelled. . . . Still, despite such drawbacks, even aristocratic households depended upon them for rudimentary needs,” as wax candles were so expensive.

“Only toward the eighteenth century did cities and towns take half-steps to render public spaces accessible at night.” The average person remained indoors after sunset. “For most persons, the customary name for nightfall was ‘shutting-in,’ a time to bar doors and bolt shutters.”

Centuries later, little had changed. “Across the preindustrial countryside, fortified cities and towns announced the advance of darkness by ringing bells, beating or blowing horns from atop watchtowers, ramparts, and church steeples.” As rural peasants retreated into their homes, “townspeople hurried home before massive wooden gates, reinforced by heavy beams, shut for the evening and guards hoisted drawbridges wherever moats and trenches formed natural perimeters.” The writer Jean-Jacques Rousseau wrote of his panic as he rushed toward Geneva’s barred gates: “About half a league from the city, I hear the retreat sounding; I hurry up; I hear the drum being beaten, so I run at full speed: I get there all out of breath, and perspiring; my heart is beating; from far away I see the soldiers from their lookouts; I run, I scream with a choked voice. It was too late.” When the Swiss writer Thomas Platter (1499–1582) found himself locked outside Munich’s city gate, he was reduced to seeking overnight shelter at a “leper-house.” In one French town, when a guard rang the bell signaling the gates were closing a half-hour too early, “Such was the mad crush of panicked crowds as they neared the gate that more than one hundred persons perished, most trampled in the stampede, others pushed from the drawbridge, including a coach and six horses. For his rapacity, the guardsman was broken upon the wheel. . . . Just to approach ramparts without warning at night constituted a crime.”

The time of shutting-in varied with the length of the day. “In winter, when darkness came on quickly, they could shut as early as four o’clock.” Laws even banned leaving one’s home at night. “In 1068, William the Conqueror (ca. 1028–1087) allegedly set a national curfew in England of eight o’clock.” Streets were blockaded to further discourage venturing outside after nightfall. “Lending weight to curfews, massive iron chains, fastened by heavy padlocks, blocked thoroughfares in cities from Copenhagen to Parma . . . Nuremberg alone maintained more than four hundred. In Moscow, instead of chains, logs were laid across lanes to discourage nightwalkers. Paris officials in 1405 set all of the city’s farriers to forging chains to cordon off not just streets but also the Seine.” In the early 1600s, one writer noted of the French town of Saint-Malo: “In the dusk of the evening a bell is rung to warn all that are without the walls to retire into the town: then ye gates are shut, and eight or ten couple of hungry mastiffs turn’d out to range about town all night. . . . Courts everywhere exacted stiffer punishments for nighttime offences” than daytime ones. For example: “For thefts committed after the curfew bell, towns in Sweden decreed the death penalty.”

Toward the end of the Middle Ages, 9 p.m. or 10 p.m. became the standard “hour for withdrawing indoors” in much of Europe.

After nightfall, “for the most part, streets remained dark.” Even where early attempts at street lighting were made, they were seldom adequate. “As late as 1775 a visitor to Paris noted, ‘This town is large, stinking, & ill lighted.’ . . . Lamps in Dublin, as late as 1783, were spaced one hundred yards apart, just enough, complained a visitor, to show the ‘danger of falling into a cellar.’”

Sunsets were seldom considered beautiful. “Rarely did preindustrial folk pause to ponder the beauty of day’s departure.” Instead, most surviving descriptions of sundown were characterized by anxiety. “Begins the night, and warns us home repair,” wrote one Stuart poet.

Most ordinary people feared nighttime. “We lie in the shadow of death at night, our dangers are so great,” noted one English author in 1670. Shakespeare’s Lucrece calls nighttime a “black stage for tragedies and murders” and “vast sin-concealing chaos.” “According to Roman poet Juvenal, pedestrians prowling the streets of early Rome after sunset risked life and limb” because the darkness hid so many threats. Centuries later, similar warnings are recorded: “Except in extreme necessity, take care not to go out at night,” advised the Italian writer Sabba da Castiglione (c. 1480–1554).

Many cultures believed that demons, ghosts, evil spirits, and other supernatural threats would emerge after sundown, hiding in the all-encompassing darkness. “Evil spirits love not the smell of lamps,” noted Plato. “In African cultures like the Yoruba and Ibo peoples of Nigeria and the Ewe of Dahomey and Togoland, spirits assumed the form of witches at night, sowing misfortune and death in their wake.” The most feared time of night was often the “dead of night,” between midnight and the crowing of roosters (roughly 3 a.m.), which the ancient Romans called intempesta, “without time.” The crowing was thought to scare away nocturnal demons.

Hence, “in the centuries preceding the Industrial Revolution, evening appeared fraught with menace. Darkness in the early modern world summoned the worst elements in man, nature, and the cosmos. Murderers and thieves, terrible calamities, and satanic spirits lurked everywhere.”

The night was filled with terrors both real and imagined. Fear of the night was ancient. In Greek mythology, Nyx, the personification of night and daughter of Chaos, counted among her children Disease, Strife, and Doom.

The Talmud, an ancient religious text, warns, “Never greet a stranger in the night, for he may be a demon.” After all, darkness hid “vital aspects of identity in the preindustrial world.” At night, “friends were taken for foes, and shadows for phantoms.” Ghostly nighttime encounters were widely reported throughout the preindustrial age, as widely held superstitions combined with a dearth of proper lighting to create traumatic experiences in the minds of many of our ancestors. “There was not a village in England without a ghost in it, the churchyards were all haunted, every large common had a circle of fairies belonging to it, and there was scarce a shepherd to be met with who had not seen a spirit,” an 18th-century writer in The Spectator claimed. “The late eighteenth-century folklorist Francis Grose estimated that the typical churchyard contained nearly as many ghosts at night as the village had parishioners.” Fear of such folkloric creatures was near-universal. Most ordinary people felt genuine, acute distress regarding the pantheon of evil spirits they feared lurked in the night:

Especially in rural areas, residents were painfully familiar with the wickedness of local spirits, known in England by such names as the “Barguest of York,” “Long Margery,” and “Jinny Green-Teeth.” Among the most common tormenters were fairies. In England, their so-called king was Robin Good-fellow, a trickster. . . . “The honest people,” if we may believe a visitor to Wales, “are terrified about these little fellows,” and in Ireland Thomas Campbell reported in 1777, “The fairy mythology is swallowed with the wide throat of credulity.” . . . Dobbies, who dwelt near towers and bridges, reportedly attacked on horseback. An extremely malicious order of fairies, the duergars, haunted parts of Northumberland in northern England, while a band in Scotland, the kelpies, bedeviled rivers and ferries. Elsewhere, the people of nearly every European culture believed in a similar race of small beings notorious for nocturnal malevolence.

In the minds of our ancestors, every shadow might hide trolls, elves, sprites, goblins, imps, foliots, and more. A favorite prank of young men was to affix “candles onto the backs of animals to give the appearance of ghosts.” The impenetrable darkness of the night before humanity harnessed electricity gave rise to imagined horrors beyond modern comprehension.

Other denizens of the nocturnal world included banshees in Ireland whose dismal cries warned of impending death; the ar cannerez, French washwomen known to drown passersby who refused to assist them; and vampires in Hungary, Silesia, and other parts of Eastern Europe who sucked their victims’ blood. . .  As late as 1755, authorities in a small town in Moravia exhumed the bodies of suspected vampires in order to pierce their hearts and sever their heads before setting the corpses ablaze. During the sixteenth and seventeenth centuries, reports of werewolves pervaded much of Central Europe and sections of France along the Swiss border, notably the Jura and the Franche-Comté. The surgeon Johann Dietz witnessed a crowd of villagers in the northern German town of Itzehoe chase a werewolf with spears and stakes. Even Paris suffered sporadic attacks. In 1683, a werewolf on the Notre-Dame-de-Grâce road supposedly savaged a party that included several priests.

And that is not all that the darkness ostensibly hid. “Known as boggles, boggarts, and wafts, ghosts reportedly resumed their mortal likenesses at night.” It was popularly believed that those who died by suicide were doomed to wander the night for all eternity as ghosts, and such ghosts were sometimes thought to assume the form of animals such as dogs.

Ghosts afflicted numerous communities, often repeatedly, like the Bagbury ghost in Shropshire or Wiltshire’s Wilton dog. Apparitions grew so common in the Durham village of Blackburn, complained Bishop Francis Pilkington in 1564, that none in authority dared to dispute their authenticity. Common abodes included crossroads fouled by daily traffic, which were also a customary burial site for suicides. After the self-inflicted death in 1726 of an Exeter weaver, his apparition appeared to many at a crossroads. “‘Tis certain,” reported a newspaper, “that a young woman of his neighbourhood was so scared and affrighted by his pretended shadow” that she died within two days. Sometimes no spot seemed safe. Even the urbane [English writer Samuel] Pepys feared that his London home might be haunted. The 18th-century folklorist John Brand recalled hearing many stories as a boy of a nightly specter in the form of a fierce mastiff that roamed the streets of Newcastle-upon-Tyne.

Material problems sometimes exacerbated such anxieties. Amid an episode of widespread starvation in Poland, one observer in 1737 opined, “This calamity has sunk the spirits of the people so low, that at [Kamieniec], they imagine they see spectres and apparitions of the dead, in the streets at night, who kill all persons they touch or speak to.”

Such superstitions inspired a feeling of terror that was all too real and could result in actual deaths. Sometimes our forebears literally died of fright, experiencing cardiac arrest from the sheer shock of glimpsing sights in the darkness that they interpreted to be fairies or other such entities. And ordinary people accused of being witches or werewolves could face execution. “In Cumberland, of fifty-five deaths arising from causes other than ‘old age’ reported in the parish register of Lamplugh during a five-year period from 1658 to 1662, as many as seven persons had been ‘bewitched.’ Four more were ‘frighted to death by fairies,’ one was ‘led into a horse pond by a will of the wisp,’ and three ‘old women’ were ‘drownd’ [sic] after being convicted of witchcraft.” (Note that fairies were considered dangerous, not adorable; an 18th-century rebel group of agrarian peasants in Ireland even adopted the moniker of fairies “to intimidate their adversaries”).

Many deaths attributed to legendary beings hiding in the darkness were caused by the darkness itself. Lethal nighttime accidents were common because of the poor state of lighting. “On most streets before the late 1600s, the light from households and pedestrians’ lanterns afforded the sole sources of artificial illumination. Thus the Thames and the Seine claimed numerous lives, owing to falls from wharves and bridges, as did canals like the Leidsegracht in Amsterdam and Venice’s Grand Canal.” Canals, unguarded ditches, ponds, and open pits of varying kinds were far more commonplace in the past, as concern for safety was considerably lower than in the present. “Many people fell into wells, often left unguarded with no wall or railing. If deep enough, it made little difference whether dry”—the fall was sufficient to cause death. Straying from a familiar route could prove lethal. “In Aberdeenshire, a fifteen-year-old girl died in 1739 after straying from her customary path through a churchyard and tumbling into a newly dug grave.”

“Even the brightest torch illuminated but a small radius, permitting one, on a dark night, to see little more than what lay just ahead.” Wind could blow out a torch or lantern in an instant. William Shakespeare described the frequent horror of “night wand’rers” upon seeing their “light blown out in some mistrustful wood” in his poem Venus and Adonis (1593). Traveling when the moon was bright could be the difference between life and death; by the 1660s, one in every three English families bought almanacs forecasting the lunar phases, and in colonial America, such almanacs “represented the most popular publication after the Bible.” In parts of England, the evening star (the planet Venus) was known as the Shepherd’s Lamp for its role in helping the poor navigate the night. An overcast sky could, of course, deprive a traveler of any celestial light from the stars or moon. Spaniards called such occasions noche ciega, blind nights.

Making nocturnal navigation even harder, ordinary people in the past were rarely fully sober. This lack of sobriety, when combined with darkness, could lead to confusion and accidents. “A New England newspaper in 1736 printed a list of more than two hundred synonyms for drunkenness. Included were ‘knows not the way home’ and ‘He sees two moons’ to describe people winding their way in the late evening.” In some cases, intoxication contributed to hallucinations of the supernatural and to deadly accidents. In Derby, England, one preindustrial “inebriated laborer snored so loudly after falling by the side of a road that he was mistaken for a mad dog and shot.” Similarly tragic episodes abounded. “On a winter night in 1725, a drunken man stumbled into a London well, only to die from his injuries after a neighbor ignored his cries for help, fearing instead a demon.”

When natural phenomena illuminated the night unexpectedly, our forebears often reacted with distress. Such sources of illumination included comets, the aurora borealis, and swamp gas lights (a faint glow produced when gases from decaying matter in marshlands oxidize). Many people took swamp gas lights to be a supernatural occurrence, termed will-o’-the-wisps.

All unusual nocturnal lights inspired terror and wonder in the people of the past, who often understood the lights as supernatural signs or portents. A comet in 1719 “struck all that saw it into great terror,” according to an English vicar, who noted that “many” people “fell to [the] ground” and “swooned” in fear. “All my family were up and in tears . . . the heavens flashing in perpetual flames,” wrote George Booth of Chester in 1727, when the aurora borealis, usually only visible farther north, made a rare appearance in England’s night sky and caused panic. Upon seeing an unexpected light overhead (likely a comet), one colonist in Connecticut “reportedly sacrificed his wife,” killing her in the hope that a human sacrifice might appease the heavens. Occasionally, unexpected natural light sources could prove helpful. “Only the flash from a sudden bolt of lightning, one ‘very dark’ August night in 1693, kept the merchant Samuel Jeake from tumbling over a pile of wood in the middle of the road near his Sussex home.” More often, unanticipated lights in the darkness led to tragedy. “‘Pixy led’ was a term reserved in western England . . . for nocturnal misadventures attributed to will-o’-the-wisps.” Many deaths by drowning resulted from our forebears’ rash reactions to the sight of such “pixies” (in actuality, swamp gas).

Other nocturnal dangers were all too human, although they might pretend otherwise. “In Dijon during the fifteenth century, it was common for burglars to impersonate the devil, to the terror of both households and their neighbors. Sheep-stealers in England frightened villagers by masquerading as ghosts.” In 1660, the German legal scholar Jacobus Andreas Crusius claimed, “Experience shows that very often famous thieves are also wizards.” Many criminals indeed attempted to perform magic through grotesque superstitious rituals. “Some murderers hoped to escape capture by consuming a meal from atop their victim’s corpse. In 1574, a man was executed for slaying a miller one night and forcing his wife, whom he first assaulted, to join him in eating fried eggs from the body.” And that was not all.

The most notorious charm, the “thief’s candle,” found ready acceptance in most parts of Europe. The candle was fashioned from either an amputated finger or the fat of a human corpse, leading to the frequent mutilation of executed criminals. Favored, too, were fingers severed from the remains of stillborn infants. . . . To enhance the candle’s potency, the hands of dead criminals, known as Hands of Glory, were sometimes employed as candlesticks. Not unknown were savage attacks on pregnant women whose wombs were cut open to extract their young: In 1574, Nicklauss Stiller of Aydtsfeld was convicted of this on three occasions, for which he was “torn thrice with red-hot tongs” and executed upon the wheel (In Germany, a thief’s candle was called a Diebeherze.). . . . Before entering a home in 1586 a German vagabond ignited the entire hand of a dead infant, believing that the unburned fingers signified the number of persons still awake. Even in the late eighteenth century, four men were charged in Castlelyons, Ireland, with unearthing the recently interred corpse of a woman and removing her fat for a thief’s candle.

Many households also turned to attempts at magic to defend against thieves and monsters, using “amulets, ranging from horse skulls to jugs known as ‘witch-bottles,’ which typically held an assortment of magical items. Contents salvaged from excavated jugs have included pins, nails, human hair, and dried urine.” Some hung wolves’ heads over doors. “To keep demons from descending chimneys, suspending the heart of a bullock or pig over the hearth, preferably stuck with pins and thorns, was a ritual precaution in western England. . . . In Somerset, the shriveled hearts of more than fifty pigs were discovered in a single fireplace.”

Fear not only of evil spirits but also of flesh-and-blood criminals lurking in the darkness kept most people indoors. In 1718, London’s City Marshal noted, “It is the general complaint of the taverns, the coffeehouses, the shopkeepers and others, that their customers are afraid when it is dark to come to their houses and shops for fear that their hats and wigs should be snitched from their heads or their swords taken from their sides, or that they may be blinded, knocked down, cut or stabbed. . . . As late as the mid-eighteenth century, a Londoner complained of the ‘armies of Hell’ that ‘ravage our streets’ and ‘keep possession of the town every night.’” Almost anyone who ventured outside did so armed. “As soon as night falls, you cannot go out without a buckler and a coat of mail,” opined a visitor to Valencia in 1603.

On a night in Venice, a young English lady suddenly heard a scream followed by a “curse, a splash and a gurgle,” as a body was dumped from a gondola into the Grand Canal. “Such midnight assassinations,” her escort explained, “are not uncommon here.” First light in Denmark revealed corpses floating in rivers and canals from the night before, just as bloated bodies littered the Tagus and the Seine. Parisian officials strung nets across the water to retrieve corpses. . . . In Moscow, so numerous were street murders that authorities dragged corpses each morning to the Zemskii Dvor [Zemsky Court] for families to claim. In London . . . Samuel Johnson warned in 1739, “Prepare for death, if here at night you roam, and sign your will before you sup from home.”

“On moonless nights in many Italian cities, young men called ‘Bravos’ prowled as paid assassins.” In some cases, affluent and highborn youths roamed the night looking for a fight: “Some cities saw the rise of nocturnal gangs composed of blades with servants and retainers in tow.” Most ruffians and thieves hiding in the darkness were common people out to commit robbery, not bored young noblemen hoping to enter a swordfight. “During the late sixteenth century, pedestrians in Vienna or Madrid rarely felt safe after dark. Foot-pads [thieves] rendered Paris streets menacing, a visitor discovered in 1620; one hundred years later, a resident wrote that ‘seldom not a night passes but some body is found murdered.’”

In London in 1712, a gang called the Mohocks terrorized the population: “Besides knifing pedestrians in the face, they stood women on their heads, ‘misusing them in a barbarous manner.’” The poet Jonathan Swift so feared that gang that he made a point of coming home early. “They shan’t cut mine [face],” he reasoned.

A lack of proper lighting afforded criminals ample cover to commit crimes. In 1681, the British dramatist John Crowne observed that night is “The time when cities are set on fire; / When robberies and murders are committed.”

Indeed, nocturnal crime was so common that a dictionary in 1585 defined thieves as felons “that sleepeth by day” so that they “may steale by night.” Surviving records suggest most preindustrial crimes occurred at night. “In the eighteenth century, nearly three-quarters of thefts in rural Somerset occurred after dark, as did 60 percent in the Libournais region of France.” “Of Italian peasants, a poem, ‘De Natura Rusticorum,’ railed: ‘At night they make their way, as the owls, / and they steal as robbers.’”

Even indoors, nocturnal thefts were so common as to be unremarkable. In 1666, Samuel Pepys awoke “much frighted” by the noise of a theft, but upon realizing the thief was merely robbing a neighbor and not Pepys’s own home, he went back to sleep feeling relieved. Urban areas were not the only sites of crime. Bands of thieves roamed the countryside. “Bands of a half-dozen or more members were typical, as were violent break-ins. . . . Wooden doors were smashed open with battering rams and shutters bashed apart by staves. Gaping holes were cut through walls of wattle and daub. Nine thieves in 1674 stormed into the Yorkshire home of Samuel Sunderland. After binding every member of the household, they escaped with £2500.” Criminal gangs were more common in some areas than others. “French gangs, known as chauffeurs, grew notorious for torturing families with fire.” Criminals carried either no lights or “dark lanterns,” which emitted light from only one side. (Merely possessing such a lantern constituted a crime in Rome and could lead to imprisonment.)

In preindustrial societies, violence left few realms of daily life unscathed. Wives, children and servants were flogged, bears baited, cats massacred, and dogs hanged like thieves. Swordsmen dueled, peasants brawled, and witches burned. . . . Short tempers and long draughts made for a fiery mix, especially when stoked by the monotony and despair of unremitting poverty. The incidence of murder during the early modern era was anywhere from five to ten times higher than the rate of homicide in England today. Even recent murder rates in the United States fall dramatically below those for European communities during the sixteenth century. While no social rank was spared, the lower orders bore the brunt of the brutality.

The thieves of the past were not picky and would even pry “lead from the roofs of dwellings.” After all:

Economic necessity begot most nocturnal license. With subsistence a never-ending struggle, impoverished households naturally turned to poaching, smuggling, or scavenging food and fuel. “The common people are thieves and beggars,” wrote Tobias Smollett, “and I believe this is always the case with people who are extremely indigent and miserable.”

“The working poor also took precautions, for even the most mundane items—food, clothing, and household goods—attracted thieves.” Each household, however humble, barricaded itself as night fell. “Doors, shutters, and windows were closed tight and latched.” Throughout most of history, locks were feeble and easily picked. “Not until the introduction of the ‘tumbler’ lock in the eighteenth century would keyholes better withstand the prowess of experienced thieves. In the meantime, families resorted to double locks on exterior doors, bolstered from within by padlocks and iron bars. . . . Also common, naturally, for those who could afford the expense, was the practical use of candlelight to ward off thieves. . . . In the Auvergne of France, so alarmed by crime were peasants in the mid-1700s that an official reported, ‘These men keep watch with a lamp burning all night, afraid of the approach of thieves.’”

Darkness caused lethal accidents, offered cover for crime, and terrified our ancestors with fears of supernatural threats, but fire could also kill. Understandable fear of fire motivated brutal punishments for arsonists and would-be arsonists. “A mob in 1680, upon learning that a woman had threatened to burn the town of Wakefield, carried her off to a dung heap, where she lay all night after first being whipped. A worse fate befell a Danish boatman and his wife, upon trying to set the town of Randers ablaze. After being dragged through every street and repeatedly ‘pinched’ with ‘glowing tongs,’ they were burned alive.” A 24-year-old University of Paris student was burned alive for arson in 1557. In Denmark, beheading was the usual punishment for arson. After a Stockholm bellringer failed to sound the alarm when a fire flared in 1504, he “was ordered to be broken on the rack, until pleas for mercy resulted instead in his beheading.”

Candles, hearth flames, and poorly cleaned or designed chimneys all posed constant fire hazards. “Some homes lacked chimneys altogether, to the consternation of anxious neighbors. Complaining that John Taylor, both a brewer and a baker, had twice nearly set his Wiltshire community ablaze from not having a chimney, petitioners in 1624 pleaded that his license be revoked. Of their absence in an Irish village, John Dunton observed, ‘When the fire is lighted, the smoke will come through the thatch, so that you would think the cabin were on fire.’”

The homes of the impoverished masses were typically infested with vermin, and rats and candles proved a highly combustible combination. Flickering candles “made tempting targets for hungry rats and mice. Samuel Sewall of Boston attributed a fire within his closet to a mouse’s taste for tallow.” The Old Farmer’s Almanack advised placing candles “in such a situation as to be out of the way of rats.”

“Despite the introduction of fire engines in cities by the mid-seventeenth century, most firefighting tools were primitive,” the fire engines being mere tubs of water transported by runners on long poles or wheels. Rather than assisting in fighting the flames, neighbors often robbed burning homes. “Fireside thefts were endemic.” In England, “So routine was this form of larceny that Parliament legislated in 1707 against ‘ill-disposed persons’ found ‘stealing and pilfering from the inhabitants’ of burning homes.” “There was much thieving at the fire,” noted the Pennsylvania Gazette of a 1730 Philadelphia blaze.

“Often, barely a year passed before some town or city in England experienced disaster. From 1500 to 1800, at least 421 fires in provincial towns consumed ten or more houses apiece with as many as 46 fires during that period destroying one hundred or more houses each.” England was hardly unique in this regard. Across the preindustrial world, fires raged:

Fires spread terror from Amsterdam to Moscow, where an early morning blaze in 1737 took several thousand lives. Few cities escaped at least one massive disaster. . . . Toulouse was all but consumed in 1463, as was Bourges in 1487, and practically a quarter of Troyes in 1534. The better part of Rennes was destroyed in 1720 during a conflagration that raged for seven days. . . . Boston lost 150 buildings in 1679 after a smaller blaze just three years before. Major fires again broke out in Boston in 1711 and in 1760 when flames devoured nearly 400 homes and commercial buildings. . . . While New York and Philadelphia each suffered minor calamities, a fire gutted much of Charleston in 1740.

Rural areas were not necessarily safer from the threat of fires. The Danish writer Ludvig Holberg (1684–1754) observed, “Villages were laid out with the houses so close together that, when one house burned down, the entire village had to follow suit.” After all, rural construction materials were highly flammable. “Once ignited, a thatch roof, made from reeds or straw, was nearly impossible to save.”