
Blog Post | Human Development

Grim Old Days: Fernand Braudel’s Structures of Everyday Life

“The poor in the towns and countryside of the West lived in a state of almost complete deprivation.”

Summary: Fernand Braudel’s book offers a vivid exploration of the harsh realities of preindustrial life, from deadly wildlife encounters and frequent famines to crude table manners. Braudel delves into the grim aspects of daily existence, highlighting the pervasive dangers and struggles that defined life before modern advancements. His work paints a stark picture of a world where survival was precarious and comfort was a rare and costly luxury.


The French historian Fernand Braudel’s ambitious book, The Structures of Everyday Life: Civilization and Capitalism, 15th–18th Century, paints a sweeping and vivid picture of many aspects of life in the past, from the shocking frequency of deadly wildlife encounters and famines to plagues and violence.

“There is a drawing which shows Maximilian of Austria at a table, in about 1513: he is putting his hand into a dish. Two centuries or so later, the Princess Palatine tells how Louis XIV, when he allowed his children to sit up to table for the first time, forbade them to eat differently from him, and in particular to eat with a fork as an over-zealous tutor had taught them.” Even the Queen of France from 1615 to 1643 kept the habit: “Anne of Austria ate her meat with her fingers all her life. And so did the Court of Vienna until at least 1651.”

“The individual fork dates from about the sixteenth century. . . . We know that Montaigne did not use a fork, since he accuses himself of eating too quickly so that ‘I sometimes bite my fingers in my haste.’”

How the table settings in various artists’ depictions of The Last Supper (a frequent subject in Western art) changed over time is revealing: “no fork appears before 1600 and almost no spoons either.”

“The use of a spoon did not become widespread until the sixteenth century, and the custom of providing knives dates from the same time—before that the guests brought along their own. Individual glasses for each guest also appeared at about this time. Courtesy formerly dictated that one emptied the glass and passed it on to one’s neighbour, who did the same.” Even individual plates are relatively recent; previously, a single dish was brought out on a wooden board and “each selected the morsel he wanted and picked it up with his fingers.”

The state of table manners is hinted at in an Austrian ordinance from 1624 instructing young officers on how to behave when dining with an archduke, which specifies: “not to arrive half drunk, not to drink after every mouthful, … not to lick the fingers, not to spit in the plate, not to wipe the nose on the tablecloth, not to gulp drink like animals.” Separate dining rooms were rare until the 16th century, and then only among the rich.

Pepper was a rare luxury. Among 15th-century explorers, “as dear as pepper” was a common expression (“dear” being used in the sense meaning “expensive”).

Salt was consumed in unhealthily large quantities in a desperate attempt to improve the taste of the masses’ monotonous and unappetizing diets. “In the Europe of insipid farinaceous gruels consumption of salt was large . . . twenty grams daily per person,” far exceeding the present-day figure. To put that into perspective, the US Food and Drug Administration currently recommends limiting sodium intake to 2.3 grams per person per day, the amount in roughly 5.8 grams of salt; on average, Americans consume about 3.4 grams of sodium per day (roughly 8.5 grams of salt), overshooting the recommendation but still lagging far behind what was typical in the preindustrial world. Outside of Europe, too, the impoverished majority turned to salt out of desperation to enliven their similarly bland, unvaried staple foods. As an Indian writer put it, “When the palate revolts against the insipidness of rice boiled with no other ingredients, we dream of fat, salt and spices.”
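
To make the comparison explicit, here is a small illustrative calculation in Python using the figures above; the only added assumption is the standard conversion that table salt is roughly 40 percent sodium by weight.

```python
# Illustrative conversion between the sodium and salt figures cited above.
# Assumption: table salt (NaCl) is roughly 40% sodium by weight.
SODIUM_FRACTION_OF_SALT = 0.4

preindustrial_salt_g = 20.0   # Braudel's estimate: grams of salt per person per day
fda_sodium_limit_g = 2.3      # FDA recommended daily sodium limit (grams)
us_avg_sodium_g = 3.4         # average American daily sodium intake (grams)

fda_limit_as_salt_g = fda_sodium_limit_g / SODIUM_FRACTION_OF_SALT   # ~5.8 g of salt
us_avg_as_salt_g = us_avg_sodium_g / SODIUM_FRACTION_OF_SALT         # ~8.5 g of salt

print(f"Preindustrial intake was about {preindustrial_salt_g / us_avg_as_salt_g:.1f} "
      f"times the modern American average, measured as salt.")
```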

“Famine recurred so insistently for centuries on end that it became incorporated into man’s biological regime and built into his daily life. Dearth and penury were continual, and familiar. . . . Two consecutive bad harvests spelt disaster.”

“Any national calculation shows a sad story. France, by any standards a privileged country, is reckoned to have experienced 10 general famines during the tenth century: 26 in the eleventh; 2 in the twelfth; 4 in the fourteenth; 7 in the fifteenth; 13 in the sixteenth; 11 in the seventeenth and 16 in the eighteenth. . . . [Much] the same could be said of any country in Europe.” Braudel relates that a third of Finland’s population is estimated to have died of starvation during a famine from 1696 to 1697. “Florence . . . experienced 111 years when people went hungry, and only sixteen ‘very good’ harvests between 1371 and 1791.” “Near Blois in 1662, a witness reported that [due to famine] the poor were on a diet of ‘cabbage stumps with bran soaked in cod broth.’” A decade earlier, in 1652, a chronicler noted, “the people of Lorraine and other surrounding lands are reduced to such extremities that, like animals, they eat the grass in the meadows.” In 1694 near Meulan, famine again made it so that “large numbers of people lived on grass like animals.” In 1674–76, in southeastern France, people were reportedly reduced to eating “acorns and roots.”

Beyond Europe, famine was also frequent and severe. “In 1555, and again in 1596, violent famine throughout north-west India resulted in scenes of cannibalism, according to contemporary chroniclers. There was another terrible famine, almost everywhere in India, in 1630–31. A Dutch merchant has left us an appalling description of it: ‘Men abandoned towns and villages and wandered helplessly. It was easy to recognize their condition: eyes sunk deep in the head, lips pale and covered with slime, the skin hard, with the bones showing through, the belly nothing but a pouch hanging down empty. . . . One would cry and howl for hunger, while another lay stretched on the ground dying in misery.’” The famine caused “collective suicides. . . . Then came the stage when the starving split open the stomachs of the dead or dying and ‘drew at the entrails to fill their own bellies’. ‘Many hundred thousands of men died of hunger, so that the whole country was covered with corpses lying unburied, which caused such a stench that the whole air was filled and infected with it . . . in the village of Susuntra . . . human flesh was sold in open market.’”

Some Europeans also resorted to cannibalism in times of famine; in 1662 in Burgundy, contemporary accounts say that “famine this year has put an end to over ten thousand families . . . and forced a third of the inhabitants, even in the good towns, to eat wild plants. . . . Some people ate human flesh.”

Even in good times, peasants often subsisted on “gruels, sops and bread” that provided only meager nourishment. “Bread was almost always hard and mouldy.” “Bread was sometimes bread in name alone.” By some estimates, “no more than 4% of the European population ate white bread. Even at the beginning of the eighteenth century, half the rural population fed on non-bread-making cereals and rye, and a lot of bran was left in the mixture of grains that went to make bread for the poor. Wheaten bread and white bread . . . remained a luxury for a long time.” It was only between 1750 and 1850 that white bread became more widely available.

In some societies, meat was so scarce as to be the preserve of the wealthy. “‘One has to be a very great lord in Sumatra,’ said one seventeenth-century traveller, ‘to have a boiled or roast chicken, which moreover has to last for the whole day.’” Today, meanwhile, one can purchase a whole rotisserie chicken from Costco for $4.99 and casually gorge on what was once a delicacy.

Nutritional deficiencies harmed human health and may have even prevented children from reaching their intellectual potential. “The Dictionnaire de Trévoux (1771) says quite bluntly: ‘The peasants are usually so stupid because they only live on coarse foods.’”

Water was also often scarce. “Whole towns—and very wealthy ones at that—were poorly supplied with water.” One example was Venice. “When no rain fell for weeks on end, the cisterns ran dry; this happened when [the 19th-century French writer] Stendhal was staying in the city. If there was a storm they were tainted with salt water.” “In 1770, Thames water ‘which is not good’ was carried to all the houses in London … but this was not what we would usually think of as running water: it was ‘distributed regularly three times a week, according to the amount consumed per household.’”

Even when water was supplied, it was often tainted. The use of lead, a powerful toxin, in piping “is recorded in England in 1236.” Exposure to lead in drinking water is now known to cause many negative health effects, including debilitating lifelong brain damage and stunted growth in children, anemia and cardiovascular problems in adults, and in pregnant women, a heightened risk of miscarriage. In Paris, the main water source was the thick sludge of the Seine: “It was supposed to bear boats well, being muddy and therefore heavy, as a Portuguese envoy reported in 1641—not that this quality would recommend itself to drinkers.” Indeed, the water source doubled as a dumping site for toxic waste. “‘A number of dyers pour their dye three times a week into the branch of the river which washes the Pelletier quay and between the two bridges,’ said an eye witness (1771). ‘The arch which forms the Gêvres quai is a seat of pestilence. All that part of the town drinks infected waters.’”

Such poor diets and tainted water made people more vulnerable to every illness. “The undernourished, unprotected population could offer little resistance to [epidemics, hence] the Tuscan proverb [says] ‘The best remedy against malaria is a well filled pot.’ Undernourishment, on all the evidence, is a ‘multiplying’ factor in the spread of diseases.”

“To mention only smallpox: in 1775, when inoculation was beginning to be discussed, a medical book considered it ‘the most general of all diseases’; ninety-five in every hundred people were affected; one in seven died.” In 1780, a mysterious illness dubbed “purple fever” is said to have killed so many hundreds of Parisians that “the gravediggers’ arms were falling off.” Influenza struck often as well. “In 1588, it laid low (but did not kill) the entire population of Venice, to the point where the Grand Council was empty.” A mysterious “sweating sickness” plagued England from 1486 to 1551, with five major outbreaks, also striking Denmark, the Netherlands, Germany, and Switzerland: “The victims had fits of shivering and sweated profusely and were often dead within hours.” Even the rich could not escape the ravages of disease. “Tuberculosis was also an old scourge of Europe: Francis II (tubercular meningitis), Charles IX (pulmonary tuberculosis) and Louis XIII (intestinal tuberculosis) all fell victim to it (1560, 1574, 1643).”

Even royalty received atrociously ineffective and occasionally deadly medical care. The itinerant physician Arnaud de Villeneuve (c. 1240–c. 1313) claimed, “brandy, aqua vitae, accomplished the miracle of preserving youth, dissipated superfluous body fluids, revived the heart, cured colic, dropsy, paralysis, quartan ague, calmed toothache and gave protection against plague. But his miracle cure brought Charles the Bad, of execrable memory, to a terrible end (1387); doctors had enveloped him in a brandy-soaked sheet sewn up with large stitches for greater efficiency so that it fitted tightly round the patient. A servant held a candle up close to try to break one of the threads, and sheet and invalid went up in flames.”

Outbreaks of deadly disease caused not only death but also social discord among survivors. Braudel quotes the English diarist Samuel Pepys, writing in 1665 during the bubonic plague: “the plague making us cruel, as doggs, one to another.” He also quotes Daniel Defoe’s account of the 1664 plague of London, in which the dead were thrown “for the most part on to a cart like common dung.” The outbreaks were frequent. “Plague occurred in Amsterdam every year from 1622 to 1628 (the toll: 35,000 dead). It struck Paris in 1612, 1619, 1631, 1638, 1662, 1668 (the last).” Those who did survive the poverty, famines, and plagues of the past were often prematurely aged. Braudel quotes one author as noting in 1754 that “the peasants in France . . . begin to decline before they are forty.”

“The poor in the towns and countryside of the West lived in a state of almost complete deprivation. Their furniture consisted of next to nothing, at least before the eighteenth century, when a rudimentary luxury began to spread (chairs, where before people had been content with benches, woollen mattresses, feather beds) … But before the eighteenth century, … inventories mention only a few old clothes, a stool, a table, a bench, the planks of a bed, sacks filled with straw. Official reports for Burgundy between the sixteenth and the eighteenth centuries are full of references to people [sleeping] on straw ‘with no bed or furniture’ who were only separated ‘from the pigs by a screen.’”

“Travellers’ tales are full of savage beasts. One seventeenth-century account describes tigers prowling round Asian villages and towns, and swimming out into the Ganges delta to surprise fishermen asleep in their boats.”

“No one feels safe after nightfall, not even inside a house. One man went out of his hut in a small town near [Guangzhou], where the Jesuit father [Adriano] de Las Cortes and his fellow sufferers were imprisoned (1626), and was carried off by a tiger.”

“The whole of Europe, from the Urals to the Straits of Gibraltar, was the domain of wolves, and bears roamed in all its mountains. The omnipresence of wolves and the attention they aroused make wolf-hunting an index of the health of the countryside, and even of the towns, and of the character of the year gone by. A lapse in vigilance, an economic setback, a rough winter, and they multiplied. In 1420, packs entered Paris through a breach in the ramparts or unguarded gates. They were there again in September 1438, attacking people this time outside the town, between Montmartre and the Saint-Antoine gate. In 1640, wolves entered Besançon by crossing the Doubs near the mills of the town and ‘ate children along the roads.’”

“There was an example of this in Gevaudan ‘where the ravages of the wolves made people believe in the existence of an unnatural monster.’” This creature, so lethal that it became the stuff of legend, was nicknamed the Beast of Gévaudan; it terrorized France between 1764 and 1767, allegedly killing more than 100 victims and maiming many more. The tragic loss of life even prompted King Louis XV to send troops to hunt the predator. Based on conflicting period descriptions of its appearance, some scholars think the infamous man-eater may have been a hyena or a lion escaped from a menagerie, but details from the autopsy performed after the animal was finally slain by a hunter suggest it was a canine, probably an unusually large wolf or wolf-dog hybrid that had developed a marked taste for human flesh.

Knowing firsthand how deadly wolves could be, the French even considered using wolves as a form of biological warfare against the English. “In fact the Députés du Commerce were discussing in 1783 a proposal made several years earlier, to ‘introduce into England a sufficient number of wolves to destroy the greater part of the population.’”

And that is only a small taste of the past from Braudel’s lengthy tome.

Blog Post | Human Development

The Grim Truth About the “Good Old Days”

Preindustrial life wasn’t simple or serene—it was filthy, violent, and short.

Summary: Rose-tinted nostalgia for the preindustrial era has gone viral—some people claim that modernity itself was a mistake and that “progress” is an illusion. This article addresses seven supposed negative effects of the Industrial Revolution. The conclusion is that history bears little resemblance to the sanitized image of preindustrial times in the popular imagination.


When Ted Kaczynski, the Unabomber, declared in 1995 that “the Industrial Revolution and its consequences have been a disaster for the human race,” he was voicing a sentiment that now circulates widely online.

Rose-tinted nostalgia for the preindustrial era has gone viral, strengthened by anxieties about our own digital era. Some are even claiming that modernity itself was a mistake and that “progress” is an illusion. Medieval peasants led happier and more leisurely lives than we do, according to those who pine for the past. “The internet has become strangely nostalgic for life in the Middle Ages,” journalist Amanda Mull wrote in a piece for The Atlantic. Samuel Matlack, managing editor of The New Atlantis, observed that there is currently an “endless debate around whether the preindustrial past was clearly better than what we have now and we must go back to save humanity, or whether modern technological society is unambiguously a forward leap we must forever extend.”

In the popular imagination, the Industrial Revolution was the birth of many evils, a time when smoke-belching factories disrupted humanity’s erstwhile idyllic existence. Economics professor Vincent Geloso’s informal survey of university students found that they believed “living standards did not increase for the poor; only the rich got richer; the cities were dirty and the poor suffered from ill-health.” Pundit Tucker Carlson has even suggested that feudalism was preferable to modern liberal democracy.

Different groups tend to idealize different aspects of the past. Environmentalists might idealize preindustrial harmony with nature, while social traditionalists romanticize our ancestors’ family lives. People from across the political spectrum share the sense that the Industrial Revolution brought little real improvement for ordinary people.

In 2021, History.com published “7 Negative Effects of the Industrial Revolution,” an article reflecting much of the thinking behind the popular impression that industrialization was a step backward for humanity, rather than a period of tremendous progress. But was industrialization really to blame for each of the ills detailed in the article?

“Horrible Living Conditions for Workers”

Were horrible living conditions a result of industrialization? To be sure, industrial-era living conditions did not meet modern standards—but neither did the living conditions that preceded them.

As historian Kirstin Olsen put it in her book, Daily Life in 18th-Century England, “The rural poor . . . crowded together, often in a single room of little more than 100 square feet, sometimes in a single bed, or sometimes in a simple pile of shavings or straw or matted wool on the floor. In the country, the livestock might be brought indoors at night for additional warmth.” In 18th-century Wales, one observer claimed that in the homes of the common people, “every edifice” was practically a miniature “Noah’s Ark” filled with a great variety of animals. One shudders to think of the barnlike smell that bedchambers took on, in addition to the chorus of barnyard sounds that likely filled every night. Our forebears put up with the stench and noise and cuddled up with their livestock, if only to stave off hypothermia.

Homes were often so poorly constructed that they were unstable. The din of collapsing buildings was such a common sound that in 1688, Randle Holme defined a crash as “a noise proceeding from a breach of a house or wall.” The poet Dr. Samuel Johnson wrote that in 1730s London, “falling houses thunder on your head.” In the 1740s, “props to houses” keeping them from collapsing were listed among the most common obstacles that blocked free passage along London’s walkways.

“Poor Nutrition”

What about poor nutrition? From liberal flower children to the “Make America Healthy Again” crowd, fetishizing the supposedly chemical-free, wholesome diets of yore is bipartisan. The truth, however, is stomach-churning.

Our ancestors not only failed to eat well, but they sometimes didn’t eat at all. Historian William Manchester noted that in preindustrial Europe, famines occurred every four years on average. In the lean years, “cannibalism was not unknown. Strangers and travelers were waylaid and killed to be eaten.” Historian Fernand Braudel recorded a 1662 account from Burgundy, France, that lamented that “famine this year has put an end to over ten thousand families . . . and forced a third of the inhabitants, even in the good towns, to eat wild plants. . . . Some people ate human flesh.” A third of Finland’s population is estimated to have died of starvation during a famine in the 1690s.

Even when food was available, it was often far from appetizing. Our forebears lived in a world where adulterated bread and milk, spoiled meat, and vegetables tainted with human waste were everyday occurrences. London bread was described in a 1771 novel as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.” According to historian Emily Cockayne, the 1757 public health treatise Poison Detected noted that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.”

Water was also far from pristine. “For the most part, filth flowed out windows, down the streets, and into the same streams, rivers, and lakes where the city’s inhabitants drew their water,” according to environmental law professor James Salzman. This ensured that each swig included a copious dose of human excreta and noxious bacteria. Waterborne illnesses were frequent.

“A Stressful, Unsatisfying Lifestyle”

Did stressful lifestyles originate with industrialization? Did our preindustrial ancestors generally enjoy a sense of inner peace? Doubtful. Sadly, many of them suffered from what they called melancholia, roughly analogous to the modern concepts of anxiety and depression.

In 1621, the scholar Robert Burton described waking in the night from mental stress as a common symptom of melancholia among the upper classes. An observer said the poor similarly “feel their sleep interrupted by the cold, the filth, the screams and infants’ cries, and by a thousand other anxieties.” Richard Napier, a 17th-century physician, recorded over several decades that some 20 percent of his patients suffered from insomnia. Today, in comparison, 12 percent of Americans say they have been diagnosed with chronic insomnia. Stress is nothing new.

Sky-high preindustrial mortality rates caused profound emotional suffering to those in mourning. Losing a child to death in infancy was once a common—indeed, near-universal—experience among parents, but the loss was no less painful for all its ordinariness. Many surviving testimonies suggest that mothers and fathers felt acute grief with each loss. The 18th-century poem, “To an Infant Expiring the Second Day of Its Birth,” by Mehetabel “Hetty” Wright—who lost several of her own children prematurely—heartrendingly urges her infant to look at her one last time before passing away.

So common were child deaths that practically every major poet explored the subject. Robert Burns wrote “On the Birth of a Posthumous Child.” Percy Bysshe Shelley wrote multiple poems to his deceased son. Consider the pain captured by these lines from William Shakespeare’s play King John, spoken by the character Constance upon her son’s death: “Grief fills the room up of my absent child. . . . O Lord! My boy, my Arthur, my fair son! My life, my joy, my food, my all the world!” Shakespeare’s own son died in 1596, around the time the playwright would have finished writing King John.

Only in the modern world has child loss changed from extraordinarily common to exceedingly rare. As stressful as modern life can be, our ancestors faced forms of heartache that most people today will never endure.

“Dangerous Workplaces” and “Child Labor”

Dangerous workplaces and child labor both predate the Industrial Revolution. In agrarian societies, entire families would labor in fields and pastures, including pregnant women and young children. Many preindustrial children entered the workforce at what today would be considered preschool or kindergarten age.

In poorer families, children were sent to work by age 4 or 5. If children failed to find gainful employment by age 8, even social reformers unusually sympathetic to the plight of the poor would express open disgust at such a lack of industriousness. Jonas Hanway was reportedly “revolted by families who sought charity when they had children aged 8 to 14 earning no wages.”

For most, work was backbreaking and unending. A common myth suggests that preindustrial peasants worked fewer days than modern people do. This misconception originated from an early estimate by historian Gregory Clark, who initially proposed that peasants labored only 150 days a year. He later revised this figure to around 300 days—higher than the modern average of 260 working days, even before factoring in today’s paid holidays and vacation time.

Physically harming one’s employees was once widely accepted, too, and authorities stepped in only when the mistreatment was exceptionally severe. In 1666, one such case occurred in Kittery, in what is now Maine, when Nicholas and Judith Weekes caused the death of a servant. Judith confessed that she cut off the servant’s toes with an axe. The couple, however, was not indicted for murder, merely for cruelty.

“Discrimination Against Women”

The preindustrial world was hardly a model of gender equality—discrimination against women was not an invention of the early industrialists but a long-standing feature of many societies.

Domestic violence was widely tolerated. In London, a 1595 law dictated: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” In other words, no beating your wife after 9:00 p.m. That was a noise regulation. A similar law forbade using a hammer after 9:00 p.m. Beating one’s wife until she screamed was an ordinary and acceptable activity.

Domestic violence was celebrated in popular culture, as in the lively folk song “The Cooper of Fife,” a traditional Scottish tune that inspired a country dance and influenced similar English and American ballads. To modern ears, the contrast between its violent lyrics and upbeat melody is unsettling. The song portrays a husband as entirely justified in his acts of domestic violence, inviting the audience to side with the wifebeater and cheer as he beats his wife into submission for her failure to perform domestic chores to her husband’s satisfaction.

Sexist laws often empowered men to abuse women. In 18th-century Britain, for instance, a husband could legally claim any money his wife earned, at any time; a wife could not enter into contracts, make a will without her husband’s approval, or decide on her children’s education or apprenticeships, and in the event of a separation, she automatically lost custody of the children. Mistreatment of women, in other words, long predated industrialization. Arguably, it was the increase in female labor force participation during the Industrial Revolution that ultimately gave women greater economic independence and strengthened their social bargaining power.

“Environmental Harm”

While many of today’s environmental challenges—such as climate change and plastic pollution—differ from those our forebears faced, environmental degradation is not a recent phenomenon. Worrying about environmental impact, however, is rather new. Indeed, as historian Richard Hoffmann has pointed out, “Medieval writers often articulated an adversarial understanding of nature, a belief that it was not only worthless and unpleasant, but actively hostile to . . . humankind.”

Consider deforestation. The Domesday Survey of 1086 found that trees covered 15 percent of England; by 1340, the share had fallen to 6 percent. France’s forests more than halved from about 30 million hectares in Charlemagne’s time (768–814) to 13 million by Philip IV’s reign (1285–1314).

Europe was hardly the only part of the world to abuse its forests. A 16th-century witness observed that at every proclamation demanding more wood for imperial buildings, the peasants of what are today the Hubei and Sichuan provinces in China “wept with despair until they choked,” for there was scarcely any wood left to be found.

Despeciation is also nothing new. Humans have been exterminating wildlife since prehistory. The past 50,000 years saw about 90 genera of large mammals go extinct, including over 70 percent of America’s large species and over 90 percent of Australia’s.

Exterminations of species occurred throughout the preindustrial era. People first settled in New Zealand in the late 13th century. In only 100 years, humans exterminated 10 species of moa in addition to at least 15 other kinds of native birds, including ducks, geese, pelicans, coots, Haast’s eagle, and an indigenous harrier. Today, few people realize that lions, hyenas, and leopards were once native to Europe, but by the first century, human activity had eliminated them from the continent. The last known aurochs, Europe’s native wild ox, was killed in Poland by a noble hunter in 1627.

Progress Is Real

History bears little resemblance to the sanitized image of preindustrial times in the popular imagination—that is, a beautiful scene of idyllic country villages with pristine air and residents merrily dancing around maypoles. The healthy, peaceful, and prosperous people in this fantasy of pastoral bliss do not realize their contented, leisurely lives will soon be disrupted by the story’s villain: the dark smokestacks of the Industrial Revolution’s “satanic mills.”

Such rose-colored views of the past bear little resemblance to reality. A closer look shatters the illusion. The world most of our ancestors faced was in fact more gruesome than modern minds can fathom. From routine spousal and child abuse to famine-induced cannibalism and streets that doubled as open sewers, practically every aspect of existence was horrific.

A popular saying holds that “the past is a foreign country,” and based on recorded accounts, it is not one where you would wish to vacation. If you could visit the preindustrial past, you would likely give the experience a zero-star rating. Indeed, the trip might leave you permanently scarred, both physically and psychologically. You might long to unsee the horrors encountered on your adventure and to forget the shocking, gory details.

The upside is that the visit would help deromanticize the past and show how far humanity has truly come—emphasizing the utter transformation of everyday lives and the reality of progress.

This article was published at Big Think on 11/19/2025.

Blog Post | Human Development

Discontent in the Age of Plenty | Podcast Highlights

Marian Tupy interviews Brink Lindsey about why unprecedented prosperity has failed to deliver widespread meaning.

Listen to the podcast or read the full transcript here.

Today, I’ll be speaking with Brink Lindsey, an American political writer and Senior Vice President at the Niskanen Center. Previously, he was Cato’s Vice President for Research and a dear colleague. Today, we’ll be discussing his latest book, The Permanent Problem: The Uncertain Transformation from Mass Plenty to Mass Flourishing.

I want to start by congratulating you on your excellent book. It is concise, thoughtful, and beautifully written. As a published author, I’m envious of your style, and I really recommend the book to our listeners.

Let’s start with the most obvious question. What is the permanent problem?

I stole that line from the British economist John Maynard Keynes, who wrote a fascinating essay called “Economic Possibilities for Our Grandchildren.”

That essay came out in 1930 in the depths of the Great Depression, but he was brave enough to argue that this global catastrophe was just a bump in the road in a much longer process of modern economic growth, which he believed would continue until his audience’s grandchildren were grown. By that point, he said that the economic problem, meaning serious material deprivation, would be more or less solved. With that done, he foresaw that humanity’s permanent problem would loom into view: how to live wisely and agreeably and well with the blessings that modern economic growth has bestowed upon us.

He got some specific things wrong. He imagined that by now we’d only be working 15 hours a week, which hasn’t panned out. However, he got the big picture profoundly right, which is that an abundant future was coming, and that moving from tackling the economic problem to the permanent problem would be traumatic for societies. That they would have to unlearn the habits of untold generations.

He imagined that this transition would be, in his words, something like a “general nervous breakdown throughout society.” That phrase struck me as a pretty good description for the predicament that the United States and other advanced democracies have found themselves in. We’re richer, healthier, better educated, and more humanely governed than any people have ever been before, yet economic growth has slowed to a crawl in most advanced economies, class divisions have sparked a global populist uprising against elites and established institutions, personal relationships are fraying, mental health problems are on the rise, faith in democracy is wavering, and widespread pessimism is one of the few things you can get people across the political spectrum to agree on.

So, the thesis of the book is that our predicament amounts to the fact that we are in this no man’s land between mass plenty and mass flourishing. That, having achieved mass plenty, we’ve moved the goalposts of what makes a successful life. It’s no longer just about having food, shelter, and clothing, but meaning, purpose, belonging, and status. While we are providing those conditions for a larger fraction of the population than ever before, for 70 or 80 percent of people, our current way of life is not providing the conditions for flourishing that one would imagine would go with our level of technological and organizational prowess.

So, in America today, things are so good that we are moving to the top of Maslow’s hierarchy, but on the other hand, we have a hysteria where people are saying basic necessities like food and shelter have never been more unaffordable.

Can both be true at the same time?

I think we are absolutely materially richer than any society before. People who are discontent with the status quo grope for something quantifiable that has gone wrong, and so they try to make an argument about material decline that just isn’t consistent with the facts. It is true that we are rich enough to take our basic material needs for granted. Nonetheless, we enjoy these blessings with a kind of asterisk, which is that we get them only by spending the bulk of our waking adult lives working 40-hour weeks.

The blessed 20 or 30 percent at the top have an arena for flourishing. They’ve got intellectually challenging jobs that offer a lot of autonomy and scope for creativity, and social status. The rest are in fairly low-autonomy jobs with a lot of scutwork, and they’re one stroke of bad luck away from losing their job and falling into a serious hole. They’re shadowed by both the precarity of their hold on mass plenty and also by the need to spend a lot of their lives in drudgery to pay the bills.

According to Gallup, life satisfaction in America remained pretty much the same between 1979 and 2025. Roughly 80 percent of Americans say they are either satisfied or very satisfied with their lives, while only 20 percent of Americans believe that America is going in the right direction.

So, how bad is it really, if 80 percent of Americans say that they are satisfied or very satisfied with their lives?

I don’t put much stock in self-assessments of life satisfaction. Psychologically healthy people make the best of things, whatever the circumstances. Plus, happiness and life satisfaction surveys have a lot of cultural variation. Latin Americans seem to report higher life satisfaction given their level of GDP than Scandinavians or Japanese.

What I look at instead is the conditions for a well-lived life. The chances to do work that is challenging, fulfilling, and interesting are very good for a considerable fraction of people, but they’re not so good for the majority. There’s a large divergence there between the well-off and well-educated and everybody else. That’s also translated into diverging odds of even being in the workforce: there’s been a small drop-off in male prime-age labor force participation for college-educated men from the mid-’60s to the present, and a big drop-off in labor force participation for non-college-educated men. There’s been a similar divergence in the odds of getting married and in the odds of growing up in a two-parent home. And finally, in recent years, we’ve seen a divergence in life expectancy. Rather than the poor catching up with the rich over time, they’re now pulling apart.

So, are we doing better than ever before? Sure. But I don’t think that exhausts the inquiry. In a society organized around progress, a purely backward-looking standard of evaluation isn’t dispositive. In some of the more intangible aspects of flourishing, there are warning signs that things are going in the wrong direction.

So, do you have in your mind a sense of what an agreeable life should be?

At least in broad outlines.

In the agrarian age, to quote Hobbes, “Life was poor, nasty, brutish, and short,” but it was not solitary. People were miserable and poor, but they weren’t atomized or alienated. Now, I think it’s a real liberation that we’re not stuck in the same place that we were born, working the same trade as our parents. We can choose our own lives, and that’s a great opportunity. The next question is, “Are we going to develop cultural and institutional supports in these new conditions that will help us to have satisfying lives?”

It’s beyond serious dispute that for most people, the most important determinant of the quality of their life is the quality of their personal relationships. And once upon a time, when the world was poor, your face-to-face relationships with other people filled vital practical functions. Your spouse was a partner in economic co-production. Your kids were economic assets. Your neighbors were an insurance policy. The main source of entertainment was hanging out with your friends and talking.

Over time, as we’ve gotten richer, we’ve outsourced a lot of those functions either to the marketplace or the welfare state. Personal relationships with people have become just one consumption option in a sea of expertly marketed alternatives. Learning to live wisely and agreeably and well amidst riches requires cultural and institutional supports that push us to spend our time on what really matters, which is the people who are close to us. We don’t have those, so we’re seeing fraying human connection.

This is cashing out most fatefully in the declining rate of people getting married and having babies. More than half of people now live in countries where the fertility rate is below replacement. That puts the whole demographic sustainability of liberal, democratic, capitalist, cosmopolitan, affluent civilization in doubt.

I want to ask you about the danger of presentism.

When we see a problem on the front pages of newspapers, we tend to extrapolate from it a broader crisis. In other words, we have trouble separating that which is fundamental to our civilization from that which is just a passing trend.

Let me give you a few examples. You write in the book that “we are getting fatter, dumber, and our mental health is deteriorating.” It certainly feels like it, right? But obesity is already declining in the United States because of Ozempic. Increasingly large numbers of young people are switching off social media. Apparently, Gen Z, the newest ones, are the best at that. Suicide rates are falling in rich countries outside of the United States, meaning this may be a particular American problem, or even simply a problem of measurement, rather than a general problem with modernity.

So, are we underestimating human adaptability and technological innovation?

That’s a very good point. We learn over time that some things that we thought were great turned out to be bad, and we put them behind us. Forty percent of American adults used to smoke, and we covered our walls with lead paint. And yes, we’ve got what looks like a deus ex machina for obesity, but the fact that the obesity wave happened at all is a good example of a more general challenge of being rich.

When we were poor, we developed a scarcity-based morality of self-discipline and self-control and resisting temptation out of necessity, but as those material constraints lessened, there was an inevitable and appropriate loosening. People could indulge their desires more. They could, to a greater extent than in the past, follow an “if it feels good, do it” kind of path. Well, it turns out that those qualities of self-discipline and self-mastery are still extremely helpful today, not for keeping you from falling into horrible poverty, but for keeping you focused on the things that really matter, rather than trivial, distracting desires.

Capitalism gives us what we want, and we don’t yet have the cultural supports that make sure it gives us what we want to want.

One set of problems that you identify has to do with the disintegration of personal bonds and the atomization of society.

Now, if I wanted to make grandparents more reliant on their children, to make neighbors more helpful to each other, and to increase church attendance, I would start by abolishing the welfare state, which I think has eroded the kind of mutual, voluntary reliance that people once had on each other.

This might irritate you, but I see the welfare state as an integral part of modern capitalism. Nowhere do we see a complex, technologically intensive, organizationally intensive division of labor without a strong welfare state. It’s possible to imagine such a thing, but it’s also possible to imagine a human being that’s 100 meters tall. If you actually had a human being that tall, he would collapse under his own weight. Plus, the libertarian movement in the United States has made zero headway in knocking back the welfare state, so I think libertarians need some kind of plan B.

The hopeful future I have in mind is more localistic and involves reimbuing our face-to-face relationships with family and neighbors with practical functions, which will allow people to live without the welfare state to a considerable degree. You can imagine a world of small modular nuclear reactors and 3D printing and vertical farming where small communities, with small divisions of labor, could have a degree of material affluence that today requires large-scale divisions of labor. But even in the here and now, if people are living together in communities, they can reassume duties of care that have been outsourced to private enterprise and the welfare state, such as taking care of little kids and elderly people and educating the young.

I wonder what is going to be more effective at driving culture change: appealing to people, or changing the incentives. When the government says, “We can pay for your child to go to a school,” you can opt out, but you will have to pay twice if you want to send your kids to a private school.

At the very least, I think we agree we will need to have competition. We could give the welfare state to the states and let them play around with it so that different jurisdictions can learn from each other.

Yeah. And, even more importantly, on the regulatory side. This is what I call capitalism’s crisis of inclusion, which is the weakening relationship between growth and widespread good conditions for the good life for people.

Meanwhile, though, we have a crisis of dynamism, a weakening capacity of the system to just keep delivering growth and pushing the technological frontier outward. Mancur Olson identified this problem a long time ago, which is that the richer you get, the more people you have with a stake in the status quo. For those people, the prospect of disruptive change is anxiety-provoking because it could knock them off their privileged perch, so they have an incentive to stop change. Also, the richer you get, the lower communication costs are, and the easier it is to band together with like-minded people and throw sand in the gears of creative destruction.

Meanwhile, the knowledge economy has created this large class of knowledge workers who desire to control and rationalize everything in their grasp. When something isn’t working, the solution is to add another layer of bureaucracy and process. Obviously, we’ve got lots of this kind of dysfunction in the public sector, but I think we also see it in the private sector, with the explosion of administrative staff on campus, the HR-ization of corporate life, and also in personal life, with helicopter parenting. These same professionals, on their off hours, deploy their managerial instincts to squeeze every drop of spontaneity out of childhood in the name of safety.

Those impulses are deep-seated, and they have contributed to an increasing drag on our dynamism.

One of the most effective ways to tackle this is inter-jurisdictional competition, allowing different groups to have different rules to limit the exposure of those different rules. Then, if that different set of rules really is producing better results, they can be emulated elsewhere. Beyond that, we’re just ineradicably culturally pluralistic people, especially under conditions of modernity. People are not going to agree with each other on what the good life is. They’re going to have different values. Having us all crammed together under one set of rules makes those value differences really high stakes and combustible and has produced a lot of the dysfunctional politics we’re experiencing now.

Last question.

My view of what it means to live wisely, agreeably, and well may be very different from that of a guy who is perfectly satisfied living in his basement playing games and smoking a lot of pot. I would find such a life appalling, but who am I to tell this person that they are not living wisely, agreeably, and well?

In other words, aren’t you worried that even if all your hopes come to pass, the future may still contain a lot of people who will not be living wisely, agreeably, and well, just as they are today?

We can talk about flourishing at the individual level and then flourishing at the societal level.

In the book, I talk about projects, relationships, and experiences. Some people are really focused on projects and very light on relationships, and they do fine. Some people are great at cultivating amazing experiences, and they’re not very practical about anything else, but they live well that way. So there are a lot of different ways to have a good life.

At the social level, there’s a little bit less variety. To take one example, you can totally have a flourishing individual life without having children, but you can’t really have a flourishing society unless a certain number of people are having babies. So, I think you can’t have a flourishing society that isn’t a free society where people are the authors of their own lives, but a free society requires the freedom to fail. Some people are just not going to live wisely and agreeably and well.

I think we can create better conditions for people to choose well than we have at present. But that doesn’t mean we need to converge on one way of living well. That would be boring. Getting richer should mean a flowering of variety, not everybody converging on one way of life. And I think a more pluralistic, localistic institutional environment is most conducive to that end.

And it seems to me that living in a pluralistic society doesn’t mean that you are voiceless, that you don’t have a right to express your views about other people’s lives. Pluralism does not require total relativism. I can still say to little Jimmy, “Spend less time playing video games in your room and go out and explore the world.”

Ultimately, if we are going to be living in a pluralistic society where people can choose their values and how they want to live, it should still be possible to persuade them that some ways of living, such as living up to their best potential, are better than wasting their lives.

This is the ultimate challenge for Homo sapiens: are we cut out for freedom? Are we cut out for being allowed to choose the good? Or are we just such a refractory species that we have to be lorded over?

The dystopian novel Brave New World, I think, is a much better fit with the predicament we’re in right now than 1984. The human spirit is being degraded, not by a regime of fear, but by a regime of cheap pleasures. At the end of that book, there’s this long monologue by the head of the society making this argument that human beings just don’t know what’s good for them and need to be taken care of. I don’t believe that. I have faith that there is a human nature that wants the good, that wants to connect to the outside world, and to other people, and figure things out. And we have the great privilege of living in a very rich, technologically advanced world that gives more people opportunities to do those things. We just need to structure things a little bit better to make it easier to make the right choices.

Blog Post | Wealth & Poverty

Dinner With Dickens Was Slim Pickins

Claims that characters in "A Christmas Carol" were better off than modern Americans are pure humbug.

Summary: There have recently been widespread claims that Dickens’s working poor were better off than modern minimum-wage workers. Such comparisons rely on misleading inflation math and selective reading. The severe material deprivation of Victorian life—crowded housing, scarce possessions, and basic sanitation problems—dwarfs today’s standards. Modern Americans, even at the lower end of the income scale, enjoy far greater material comfort than the Cratchits ever did.


Christmas is often a time for nostalgia. We look back on our own childhood holidays. Songs and traditions from the past dominate the culture.

Nostalgia is not without its purposes. But it can also be misleading. Take those who view the material circumstances of Charles Dickens’s “A Christmas Carol” as superior to our own.

Claims that an American today earning the minimum wage is worse off than the working poor of the 19th century have been popular since at least 2021. A recent post with thousands of likes reads:

Time for your annual reminder that, according to A Christmas Carol, Bob Cratchit makes 15 shillings a week. Adjusted for inflation, that’s $530.27/wk, $27,574/yr, or $13.50/hr. Most Americans on minimum wage earn less than a Dickensian allegory for destitution.

This is humbug.

Consider how harsh living conditions were for a Victorian earning 15 shillings a week.

Dickens writes that Mr. Cratchit lives with his wife and six children in a four-room house. It is rare for modern residents of developed nations to crowd eight people into four rooms.

It was common in the Victorian era. According to Britain’s National Archives, a typical home had no more than four rooms. Worse yet, it lacked running water and a toilet. Entire streets (or more) would share a few toilets and a pump with water that was often polluted.

The Cratchit household has few possessions. Their glassware consists of merely “two tumblers, and a custard-cup without a handle.” For Christmas dinner, Mr. Cratchit wears “threadbare clothes” while his wife is “dressed out but poorly in a twice-turned gown.”

People used to turn clothing inside-out and alter the stitching to extend its lifespan. The practice predated the Victorian era, but continued into it. Eventually, clothes would become “napless, threadbare and tattered,” as the historian Emily Cockayne noted.

The Cratchits didn’t out-earn a modern American earning the minimum wage. Mr. Cratchit’s weekly salary of 15 shillings in 1843, the year “A Christmas Carol” was published, is equivalent to almost £122 in 2025. Converted to U.S. dollars, that’s about $160 a week, for an annual salary of $8,320.

The U.S. federal minimum wage is $7.25 per hour or $15,080 per year for a full-time worker. That’s about half of what the meme claims Mr. Cratchit earned. Only 1% of U.S. workers earned the federal minimum wage or less last year. Most states set a higher minimum wage. The average worker earns considerably more. Clerks like Mr. Cratchit now earn an average annual salary of $49,210.
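
For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python using only the figures cited above; it illustrates the comparison rather than performing any independent inflation calculation.

```python
# Back-of-the-envelope comparison of Bob Cratchit's pay with the U.S. federal
# minimum wage, using only the figures cited in the article.
WEEKS_PER_YEAR = 52
HOURS_PER_WEEK = 40      # full-time assumption behind the $15,080 figure

cratchit_weekly_usd = 160  # 15 shillings/week in 1843, stated above as ~$160 in 2025 dollars
cratchit_annual_usd = cratchit_weekly_usd * WEEKS_PER_YEAR                    # 8,320

min_wage_hourly_usd = 7.25
min_wage_annual_usd = min_wage_hourly_usd * HOURS_PER_WEEK * WEEKS_PER_YEAR   # 15,080

print(f"Cratchit, in 2025 dollars: ${cratchit_annual_usd:,.0f} per year")
print(f"Federal minimum wage:      ${min_wage_annual_usd:,.0f} per year")
print(f"The minimum wage is about {min_wage_annual_usd / cratchit_annual_usd:.1f}x Cratchit's pay.")
```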

Mr. Cratchit couldn’t have purchased much of the modern “basket of goods” used in inflation calculations. Many of the basket’s items weren’t available in 1843. The U.K.’s Office of National Statistics recently added virtual reality headsets to it.

Another way to compare the relative situation of Mr. Cratchit and a minimum-wage worker today is to see how long it would take each of them to earn enough to buy something comparable. A BBC article notes that, according to an 1844 theatrical adaptation of “A Christmas Carol,” it would have taken Mr. Cratchit a week’s wages to purchase the trappings of a Christmas feast: “seven shillings for the goose, five for the pudding, and three for the onions, sage and oranges.” Mr. Cratchit opts for a goose for the family’s Christmas meal. A turkey—then a costlier option—was too expensive.

The American Farm Bureau Federation found that the ingredients for a turkey-centered holiday meal serving 10 people cost $55.18 in 2025. At the federal minimum wage, someone would need to work seven hours and 37 minutes to afford that feast.
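
The time-to-afford figure works out as follows; again, this is only a small sketch using the numbers cited above.

```python
# Hours of work at the federal minimum wage needed to afford the Farm Bureau's
# $55.18 turkey dinner for ten, per the figures cited above.
meal_cost_usd = 55.18
hourly_wage_usd = 7.25

hours_needed = meal_cost_usd / hourly_wage_usd        # ~7.61 hours
whole_hours = int(hours_needed)
minutes = round((hours_needed - whole_hours) * 60)

print(f"{whole_hours} hours and {minutes} minutes of work")  # 7 hours and 37 minutes
```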

A minimum-wage worker could earn more than enough in a single workday to purchase a meal far more lavish than the modest Christmas dinner that cost Mr. Cratchit an entire week’s pay. And the amount of time a person needs to work to afford a holiday meal has fallen dramatically for the average blue-collar worker in recent years despite inflation. Wages have grown faster than food prices.

There has been substantial progress in living conditions since the 1840s. We’re much better off than the Cratchits were. In fact, most people today enjoy far greater material comfort than did even Dickens’s rich miser Ebenezer Scrooge.

This article was originally published in the Wall Street Journal on 12/23/2025.