Halloween: More Walking Dead, Fewer Dead Walkers

Blog Post | Health & Medical Care


Today’s trick-or-treaters have far less to fear than past generations.

Summary: Halloween is a celebration of death and fear, but it also reveals how much safer and healthier life has become. This article shows how child mortality, especially from pedestrian accidents, has declined dramatically in recent decades. It also explores how other causes of death, such as disease and violence, have become less common thanks to human progress.


This Halloween, you might see your neighbors’ front yards decorated with faux tombstones and witness several children dressed as ghosts, skeletons, zombies, or other symbols of death. Thankfully, today’s trick-or-treaters can almost all expect to remain among the living until old age. But back when the holiday tradition of children going door-to-door in spooky costumes originated, death was often close at hand, and the young were particularly at risk.

Halloween’s origins are closely linked to concerns about death. The holiday arose out of All Souls’ Day, a Christian commemoration of the deceased falling on November 2 that is also simply called the Day of the Dead. In the Middle Ages, this observance was often fused with another church feast, All Saints’ Day or All Hallows’ Day, on November 1. On the night before, called All Hallows’ Eve (now shortened to Halloween), children and the poor in parts of medieval Britain would visit their wealthier neighbors and receive “soul cakes,” round pastries marked with a cross. In exchange, they promised to pray for the cake-givers’ dead relatives. This practice was called “souling.”

In Ireland and Scotland, Halloween also incorporated some aspects of an old Celtic pagan tradition called Samhain, including bonfires and masquerades. Samhain was also associated with death and sometimes called the feast of the dead. Eventually the traditions of wearing masks and of going door-to-door for treats combined, and young people in Ireland and Scotland took part in a practice called “guising” that we now call trick-or-treating. Dressing as ghouls and other folkloric incarnations of death became popular.

In the 1800s, an influx of Irish immigrants is thought to have popularized this Halloween tradition in the United States. The phrase “trick-or-treating” dates to at least the 1920s, when Halloween pranks or tricks also became a popular pastime. But according to National Geographic, “Trick-or-treating became widespread in the U.S. after World War II, driven by the country’s suburbanization that allowed kids to safely travel door to door seeking candy from their neighbors.”

And just how safe today’s trick-or-treaters are, especially compared to the trick-or-treaters of years past, is underappreciated. Despite the occasional public panic about razor blades in candy, malicious tampering with Halloween treats is remarkably rare, especially given that upward of 70 percent of U.S. households hand out candy on Halloween each year.

The biggest danger to today’s trick-or-treaters is simply crossing streets. But while Halloween is the deadliest night of the year for children being struck by cars, there is heartening news: annual child pedestrian deaths have declined dramatically. The number of pedestrian deaths among children aged 13 or younger fell from 1,632 in 1975 to 144 in 2020, a decline of more than 90 percent. The drop is even more impressive when one considers that it occurred while the total number of people and cars in the country increased substantially.

Today’s children are thus safer as they venture out on Halloween than the last few generations of trick-or-treaters were. And, of course, when compared to the world of the very first children to celebrate Halloween, the modern age is by many measures less dangerous, especially for the young. In medieval England, when “souling” began, the typical life expectancy for ducal families was merely 24 years for men and 33 for women. While data from the era is sparse, among non-noble families in Ireland and Scotland, where “guising” began, living conditions and mortality rates may have been far worse.

It is estimated that between 30 and 50 percent of medieval children did not survive infancy, let alone childhood, with many dying from diseases that are easily preventable or treatable today. Given that context, the medieval preoccupation with death that helped give rise to traditions like Halloween is quite understandable. Life expectancy was lower for everyone, even adult royalty: the mean life expectancy of the kings of Scotland and England who reigned between the years 1000 and 1600 was 51 and 48 years, respectively. Before the discovery of the germ theory of disease, the wealthy, along with “physicians and their kids lived the same amount of time as everybody else,” according to Nobel laureate Angus Deaton.

In 1850, during the wave of Irish immigration to the United States that popularized Halloween, little progress had been made for the masses: white Americans could expect to live only 25.5 years—similar to what a medieval ducal family could expect. (And for African Americans, life expectancy was just 21.4 years.)

But the wealth explosion after the Industrial Revolution soon funded widespread progress in sanitation. That reduced the spread of diarrheal diseases, a major killer of infants—and one of the top causes of death in 1850—improving children’s survival odds and lengthening lifespans. By 1927, the year when the term “trick-or-treating” first appeared in print, there had been clear progress: U.S. life expectancy was 59 years for men and 62 years for women. The public was soon treated to some innovative new medical tricks: the following year, antibiotics were discovered, and the ensuing decades saw the introduction of several new vaccines.

In 2021, U.S. life expectancy was 79.1 years for women and 73 years for men. That’s slightly down from recent years but still decades longer than life expectancy for the aforementioned medieval kings who ruled during Halloween’s origins. Life expectancy has risen for all age groups, but especially for children, thanks to incremental progress in everything from infant care to better car-seat design.

So as you enjoy the spooky festivities this Halloween, take a moment to appreciate that today’s trick-or-treaters inhabit a world that is in many ways less frightening than when Halloween originated.

Reasons to be Cheerful | Child Abuse & Bullying

Moldova Is Making Orphanages Obsolete

“Moldova, like many post-Soviet nations, inherited a system heavily reliant on institutional child care. Prior to 2000, the country had over 17,000 children living in orphanages. Known in the country as residential institutions, they generally had austere conditions and provided a basic level of care and education…

But over the past two decades, the Moldovan government has been dismantling this legacy of institutional care, working with non-profits and UNICEF to prevent family separation and reform the child care system. Closing orphanages has given way to building new social support systems for disadvantaged families and single mothers, with the goal of keeping children with their birth families whenever possible. Introducing inclusive education for children with special needs has also been key, destigmatizing what it means to have a child with a disability. Developing a network of compassionate foster families has been at the heart of this shift.

The launch in 2007 of Moldova’s National Strategy to reform its residential childcare system aimed to deinstitutionalize 50 percent of children housed in orphanages as the country began focusing on raising social standards to align with the rest of Europe, all in preparation for EU membership, which it is still negotiating. Today, only around 700 children remain in Moldova’s orphanages. By 2027, the goal is to have none.”

From Reasons to be Cheerful.

Girls Not Brides | Child Abuse & Bullying

Burkina Faso Raises the Legal Age for Marriage to 18 Years Old

“Burkina Faso has adopted the bill for the new Personal and Family Code (CPF), changing the minimum legal age for marriage to 18 years old for both girls and boys.

Previously, the minimum age of marriage was 17 years old for girls and 20 years old for boys. However, girls could marry as young as 15 and boys at 18 if authorised by the courts.

This new bill harmonises the legal age of marriage at 18 for both girls and boys. It remains unclear if a judge can still grant exceptions for marriage at the age of 16 in some circumstances.”

From Girls Not Brides.

Save the Children | Child Abuse & Bullying

Bolivia Bans Child Marriage

“Bolivia has become the 14th country in Latin America to ban child marriage after girls across the country and Save the Children joined a campaign to criminalize the practice.

Bolivia’s parliament this week passed legislation banning marriages and civil unions with children following a four-year campaign by Save the Children, IPAS Bolivia, Coordinadora de la Mujer and other local NGOs.  

Under the previous law, children aged 16 and 17 could marry if they had authorization from parents or guardians. The new bill ends this legal exception.”

From Save the Children.

Blog Post | Human Development

Grim Old Days: Peter Laslett’s The World We Have Lost

Poverty and hardship long predated the factory age.

Summary: Before the Industrial Revolution, life in England was marked by widespread poverty, illiteracy, and relentless labor. Even children worked from as young as three. Most people lacked education, political voice, and basic comforts, enduring hunger, disease, and harsh living conditions that kept them in constant proximity to hardship and death. Peter Laslett’s The World We Have Lost reveals that the deprivations often blamed on early industrialization were in fact the norm long before factories and industry.


Peter Laslett’s book The World We Have Lost is an influential history of what life was like in England before the Industrial Revolution. Laslett makes clear that the infamous problems of the industrial era were preexisting, not innovations that first arose with the construction of factories: “The coming of industry cannot be shown to have brought economic oppression and exploitation along with it. It was there already.” His book brings into focus the poverty and hardship faced by preindustrial people and the fact that “we now inhabit a world wealthy on a scale quite unknown before industrialization.”

Laslett describes the dearth of schooling, observing that neither Isaac Newton’s nor William Shakespeare’s parents could read. Inventories from Kentish towns between the 1560s and 1630s show book ownership rising steadily from a fifth or less of households to nearly a quarter, although such inventories were recorded only for prosperous households and thus probably overestimate the extent of book ownership. Leicestershire wills from the 1620s to 1640s show that only 17 percent of people with wills bequeathed books to their heirs, and even among the gentry that figure was only 50 percent.

The “inability to share in literate life cut most men off from even contemplating a share in political power.” And the idea of women attaining a political voice was more absurd still. Even James Tyrrell—an associate of John Locke, a critic of absolutism, and a believer in limited political authority—noted in 1681, “There never was any government where all the promiscuous rabble of women and children had votes.”

Illiteracy not only limited women’s ability to engage with society but also increased their vulnerability. “An illiterate maidservant whose place was five or ten miles from home was cut off from her parents and her brothers and sisters,” effectively unable to send them messages and alert them if her employer physically abused her or sexually assaulted her (as was, sadly, common).

Instead of learning to read, many children began work at shockingly young ages. Laslett informs the reader that, as John Locke noted in 1697, poor children were expected to start working at age three, contributing in what capacity they could, often through apprenticeships. The apprentice’s contract typically went thus: “He shall not absent himself by night or by day without his master’s leave.” Some apprentices “stayed subordinate to a master in a master’s house for the whole of their lives,” far beyond the initial terms of their contract.

Not only could children start work at age three, but by age 12 they were considered old enough to help run businesses. In 1699, at an alehouse in Harefield, Middlesex, run by Catherine and John Baily, 6 of their 10 children still living at home “were above the age of twelve, . . . old enough to help run the family establishment.”

In England grooms could legally be as young as 14 and brides as young as 12, although Laslett notes that thankfully that was relatively rare in practice. Early marriages did occur, though. In 1623, a London parish clerk wrote disapprovingly of the wedding of a 17-year-old boy working as a threadmaker to the 14-year-old daughter of a porter, calling them a “couple of young Fooles.”

A rather offensive (to modern sensibilities) form of divorce known as “wife-selling” sometimes occurred among those who could not afford a formal dissolution of marriage. The Ipswich Journal records such a sale occurring in 1789:

Oct. 29, Samuel Balls sold his wife to Abraham Rade in the parish of Blythburgh in his county for 1 [shilling]. A halter was put around her neck and she was resigned up to this Abraham Rade.

Such bizarre episodes “reveal something of the slightly quizzical attitude of ordinary people to the official marriage code,” with local customs and practices varying wildly. Upon settling down, a man typically tilled land with the aid of his wife and children. Picture the “hard-working, needy, half-starved labourers of pre-industrial times,” who toiled nonstop yet never produced enough to live comfortably.

Here was an economy conspicuously lacking in those devices for the saving of exertion which are so marked a feature of our own everyday life. The simplest operation needed effort; drawing the water from the well, striking steel on flint to catch the tinder alight, cutting goose-feather quills to make a pen, they all took time, trouble and energy. The working of the land, the labour in the craftsmen’s shop, were infinitely taxing. [The peasantry would] shock us with their worn hands and faces, their immeasurable fatigue.

Those who didn’t work in agriculture were often servants. The share of workers employed as servants varied from as low as 4 percent to as high as a third of the population in relatively wealthy times and places, such as London and parts of Norwich in the 1690s. “Everywhere work of all kinds varied alarmingly with the state of the weather and of trade, so that hunger was not very far away.” Many had no employment and begged. “Wandering beggars . . . were . . . a feature of the countryside at all times.”

Any increase in the cost of food staples could prompt social discord. “Right up to the time of the French Revolution and beyond, in Europe the threat of high prices for food was the commonest and most potent cause of public disorder.” Public panic about food was often warranted, as the threat of hunger was all too real. In 1698 in Scotland, contemporary accounts say, “[m]any have died for want of bread, and have been necessitate to make use of wild-runches draff and the like for the support of nature.” A runch is a common weed.

Laslett makes clear that England, being wealthier than much of Europe, saw relatively few famines by the late early modern period. Still, England’s harvest year of 1623–1624 was devastating, and in some locations, such as Ashton, the number of recorded burials was over two-and-a-half times the typical level. Numerous burials record the cause of death as starvation. The deaths recorded in the Register of Greystoke in England, in 1623, put names to some of these victims of starvation, including “A poor hungerstarved beggar child, Dorothy,” and “Thomas Simpson, a poor hungerstarved beggar boy,” as well as “Leonard . . . which child died for want of food,” and 4-year-old “John, son of John Lancaster, late of Greystoke, a waller by trade, which child died for want of food and means.”

Preindustrial people also froze. Indeed, in cold climates such as those of northern and western Europe, “the necessity of gathering round fires and sharing beds, make it obvious that the privacy now regarded as indispensable, almost as a human right,” was once rare, with the masses forced to sleep next to each other and their farm animals for body heat.

If there was one thing that was better about the past, it was perhaps that people were—by necessity—tougher. London’s suicide rate circa 1660 is estimated as somewhere between 2.5 and 5 per 100,000 people, low by modern standards.1 But on the whole, what Laslett calls “the world we have lost” is not a world we’d want back.

  1. According to the most recent data from Britain’s Office for National Statistics, London’s suicide rate now stands at 7.3 per 100,000 people, while England and Wales have a suicide rate of 17.4 per 100,000. According to the most recent year of OECD data, only one OECD country has a suicide rate of under 5 per 100,000: Turkey, at 4.8 per 100,000. (In recent years, only two or three OECD countries typically manage to keep suicides below the upper bound of the estimated level seen in 17th-century London.)