
Blog Post | Human Development

Grim Old Days: Kirstin Olsen’s Daily Life in 18th-Century England

Life just prior to industrialization was more brutal, uncomfortable, and dangerous than most people today care to fathom.

Summary: Kirstin Olsen’s book Daily Life in 18th-Century England captures a period of tremendous change, highlighting the stark differences in living conditions between 1700 and 1800. The 18th century saw advancements like the development of effective steam engines and profound new scientific knowledge, which led to improved comfort even for the poor by 1800. Olsen elucidates the immense hardships commonplace in English society prior to industrialization, from the evolution of marriage and childbirth to the grim realities of public entertainment, criminal justice, and healthcare.


Kirstin Olsen’s book Daily Life in 18th-Century England paints a vivid portrait of a time of immense change. “There were no really effective steam engines in 1700, no awareness that ‘air’ and ‘water’ were divisible into separate elements, no understanding of why things burned, and no knowledge of positive and negative electrical charges. The words ‘mammal’ and ‘Homo sapiens’ did not exist. No one had ever flown, and no one, since prehistory, had discovered a new planet in the sky. Weaving and spinning were still done entirely by hand. By 1800, all this would change.” Living conditions transformed so that even “the poor were much more comfortable in 1800 than in 1700.” This book provides a thorough look into everyday life just prior to the dawn of industrialization as well as during that momentous transition, which began around 1760 in Britain.

In the 18th century, people seldom traveled and lived in hyperlocal worlds. “Weights and measures still varied from one region to another. . . . Cornish was still spoken in parts of the far southwest until about 1780, and Welsh and Gaelic were still in common use in areas outside England. Most residents of the Isle of Man spoke their own language, Manx, as well.”

Given the highly limited pool of marriage partner choices that resulted from this extreme isolation, perhaps it is unsurprising that “much of the satirical literature of the 18th century . . . lampooned marriage as a hell or prison sentence for one or both partners. The most typical attitude toward marriage evinced in 18th-century literature and visual art is a sly, collegial misery.” The poem “Wedlock” by the English poet Mehetabel “Hetty” Wright (1697–1750), herself pressured into a loveless marriage with a plumber (who trekked home grime that may have been responsible for their losing many children to premature death), paints a typical picture:

Thou source of discord, pain and care,
Thou sure forerunner of despair,
Thou scorpion with a double face,
Thou lawful plague of human race,
Thou bane of freedom, ease and mirth, [. . .]
Who hopes for happiness from thee,
May search successfully as well
For truth in whores and ease in hell.

Legally “the groom could be as young as 14 and the bride as young as 12.” Many marriages turned abusive. “Domestic violence was tolerated by the courts so long as it was limited to ‘moderate physical correction,’ and a man could even commit his wife to an insane asylum against her will.” An abused woman’s best hope was often not legal recourse but the possibility that a male relative, neighbor, or sympathetic passerby might notice her plight and take action on her behalf. “Neighbors [sometimes] intervened when men beat their wives, shaming the abusers with public processions and chants, or simply stopping beating, as a saddler did in 1703, telling the abusive husband, ‘You shall not beat your wife.’” Remaining single in the 18th century brought its own challenges: “The life of a spinster could be a difficult one, with extended family using unattached female relatives as temporary live-in housekeepers when a wife died.”

Those who imagine that the people of the past unfailingly adhered to stricter standards of chastity might be alarmed at the frequency of shotgun marriages: “One-third of all brides were pregnant at their weddings.” About 20 percent of first births occurred outside marriage in 1790 in England. Such children were often subject to neglect and even infanticide. In England: “A 1624 statute criminalized concealing the death of a bastard child unless the mother (who in this was presumed guilty) could prove that it had been stillborn.”

“It was common for one parent to die before all the children had grown up.” The 18th-century “Birmingham businessman William Hutton received a straightforward appraisal of his chances when, as a child, he lost both his parents. ‘Don’t cry,’ his nanny told him. ‘You will soon go yourself.’” (He defied this prophecy: after a long life that included beginning work in a mill at age 7, he died at the ripe old age of 91.)

“Childhood ailments claimed a large number of children before their fifth birthdays (60 percent in London in 1764), and those illnesses that failed to kill often scarred or attracted treatments that were even worse. A child might have to survive teething problems, tapeworms, chicken pox, whooping cough, smallpox, lead poisoning, thrush, measles, and mumps, being bled, swaddled, and dosed with belladonna, syrup of poppies (opium), quinine, rum, gin, brandy, laxatives, and patent medicines. Children wore amulets of such ingredients as mistletoe and elk’s horn, had hare’s brains smeared on their gums while teething, and were given enemas for worms. A particularly drastic worm remedy involved inserting a piece of pork on a string into the rectum and drawing it out slowly to lure the worms. Some diseases could be cured, it was thought, by a sudden fright, such as riding on a bear, having a gun fired nearby, or ‘giving the patient a part of some disgraceful animal, as a mouse, etc., to eat, and afterwards informing him of it; and so forth.’”

“Imagine that you are sick in the 18th century. You are running a high fever, feeling light-headed, and beginning to develop blotches on your skin. Your mother has dosed you with some cheap patent medicines. She has tried poultices and some sort of nasty-smelling broth. Time passes, and a man with a cane and a sword feeds you more bad-tasting medicines. You think you hear him say that one is made of spiders. You are dimly aware of warm water and a pain in your arm, and you turn your head to witness the sight of your blood running from a vein in your elbow into a bowl. Ah, good, you think, being an 18th-century person. Everything that can be done is being done.”

In those days, sometimes avoiding doctors altogether was better than receiving what passed for medical treatment. “Needing to do something dramatic, or for lack of anything better to do, or because they really believed it would work, doctors resorted to visible but useless or even harmful measures: bleeding, dosing with dangerous drugs, raising blisters on the skin, and inducing vomiting. [Joseph] Addison, in The Spectator, called physicians ‘a most formidable Body of Men: The Sight of them is enough to make a Man serious, for we may lay it down as a Maxim, that When a Nation abounds in Physicians it grows thin of People.’”

Folk remedies were also usually useless and often dangerous. “They ate soap for stomach troubles, touched hanged men to cure goiter and swollen glands, drank asses’ milk, made charms of babies’ amniotic sacs, drank their own urine for ague or snail tea for a sore chest, rubbed their eyes with black cats’ tails for styes, and ate eye of pike for toothaches, pigeon blood for apoplexy, tortoise blood for epilepsy, cockroach tea for kidney ailments, puppy and owl broth for bronchitis, and spiders for fever.”

Beauty products could be harmful too. “Most cosmetics were made at home” even in the 1700s, with some recipes “containing harmful chemicals like the white lead in face paint or the mercury in some rouges” and others including irritants such as quicklime or even “cat’s dung.” “Some reportedly also wore false eyebrows made of mouse skin that could, in a hot room, begin to slide down an unfortunate woman’s face.”

The state of dentistry was similarly dreadful. “If something went wrong with the teeth, dentists hand-drilled cavities as always, with no anesthetic but alcohol, and filled the resulting holes with molten tin, lead, or gold. Where a dentist was unavailable, one called the farrier (the horse-doctor). False teeth were made of bone, ivory, gold, porcelain, wood, or the purchased teeth of the poor, but such dentures were expensive and, held in place by awkward spring mechanisms, sometimes fell out of the mouth. Tooth problems could also result in infections; 780 Londoners ostensibly died in 1774 from dental problems.”

Standards of sanitation were abysmal as well. London’s streets were “full of sewage and horse dung and butchers’ offal.” “The streets were atrocious in the first half of the [18th] century, full of dust in dry weather and mud in wet. These streams [of mud and filth] were augmented by dirtied water tossed by maids from the upper stories, by gutters that ran directly onto the streets and pavements, and by rainstorms, which carried into them ‘Sweepings from butchers’ stalls, dung, guts, and blood, / Drowned puppies, stinking sprats, all drenched in mud, / Dead cats and turnip tops.’ The streets were dirtied by not only horse manure but also human waste, particularly from beggars and children who urinated and [defecated] next to buildings.” In the 1760s, just as industrialization began, so too did the condition of London’s streets start to improve.

Mental health care was appalling as well. A chief amusement of the pre-industrial world was finding entertainment in the act of gawking at anyone unusual, especially those suffering from bodily abnormalities or mental health problems. “The interior of a madhouse such as London’s Bethlehem Hospital (Bedlam) was a sight to behold, and many did—Bedlam was one of London’s principal tourist attractions, and until 1770, visitors could pay for admission and a tour, during which guards and visitors alike goaded the inmates to view their violent reactions. Nuts, fruit, cheesecakes, and beer were sold to the tune of ‘rattling of Chains, drumming of Doors, Ranting, Hollowing, Singing,’ and the distinctive uproar that spread like a wave through the asylum when the inmates became outraged at the treatment one of their fellows was receiving. Some inmates fought back by hurling the contents of their chamber pots. Bedlam’s occupants were lightly dressed in both summer and winter, in unheated rooms, often with only a pile of straw for a bed.” It was somewhat unusual when London’s rather distastefully named St. Luke’s Hospital for Lunatics, “founded in 1751, explicitly forbade exposing ‘the patients . . . to public view.’”

Executions and other criminal punishments were another popular form of entertainment. There were about 200 capital crimes (for which the punishment was death) in England as late as 1800, including pickpocketing goods over 1 shilling in value, shoplifting 5 shillings’ worth, sheep-stealing, killing a cow, entering land with intent to kill rabbits, “associating with gypsies,” theft of a master’s goods by a servant, and vandalism of fishponds.

Lesser crimes were punished with public shame. “People exposed in the pillory were tormented by the crowd, sometimes for fun and sometimes out of genuine resentment of the crime. It was not unusual for the person pilloried to suffer death or maiming as a result of being pelted with stones, food, dirt, dead animals, and trash. Those not pilloried were sometimes branded, though the brander could be bribed to use a cold iron. Another common punishment was public flogging, and it was a holiday of sorts when women, particularly prostitutes, were flogged. Crowds would gather to see these women stripped to the waist and beaten. The holiday mood only intensified when a hanging was scheduled.” Hence in the 1730s, one writer observed of England, “The Execution of Criminals here is a perfect Shew to the People, by Reason of the Courage with which most of ’em go to the fatal Tree. . . . I lately saw five carried to the Gallows, who were dressed, and seemed to be as well pleased, as if they were going to a Feast.”

A festival-like atmosphere attended public hangings, which were a major source of entertainment. “At Tyburn the crowd either stood or paid for the privilege of sitting in the wooden grandstands, called ‘Mother Proctor’s Pews.’ The cart moved beneath the gallows, and there were final speeches from the condemned, perhaps a last-minute reprieve, prayers from the chaplain, the nooses placed around necks. Then ‘away goes the Cart, and there swing my Gentlemen kicking in the Air.’ Hawkers began selling the alleged dying utterances of the hanged, speedy sale being far more important than factual accuracy. Sufferers from disease snatched at the bodies, believing them to possess magical powers. Entrepreneurs waited for the right moment to make off with the rope, which could be sold in pieces as a souvenir. Friends of the hanged lingered, trying to support them long enough to cut them down (which worked on at least one occasion), to yank their legs to shorten their suffering (since 18th-century hanging had no drop to break the neck, and death was by slow strangulation), and to defend their bodies (sometimes with fierce violence) from the surgeons, who had a right to dissect 10 Tyburn corpses per year and claimed any corpse not purchased by the family. In some cases, the bodies were violated according to the nature of the crime. Jacobites’ heads were, until 1777, severed and displayed on spikes at Temple Bar. Sometimes whole bodies, often shaved, disemboweled, or coated with tar or tallow, were hung in chains near the symbolic scene of their crimes—along roads for highwaymen and near the Thames for pirates, mutineers, and deserters. Far from being shocked by such displays, the crowds positively demanded them. They sometimes rioted if denied a hanging, for example by the suicide of the condemned. In one such case, they seized the dead body and attacked it with such ferocity that virtually all its bones were shattered.”

People also commonly enjoyed violence against animals as entertainment. “The torture and killing of animals and fights between humans were a prime source of entertainment. Thus, in 1730, a showman advertised ‘a mad bull to be dressed up with fireworks and turned loose in the game place, a dog to be dressed up with fireworks over him, a bear to be let loose at the same time, and a cat to be tied to the bull’s tail.’ Some impresarios staged dog fights, or tied an owl to the back of a duck to see the duck dive in fear and half-drown the owl, or hung a goose head-down from a tree or a pair of poles, greased its neck, and gave people turns trying to pull off its head while riding underneath. Children’s games included shooting flies with small guns, sewing a string to a mayfly to keep it on a leash, and ‘conquering,’ or pressing snails against each other till one shell broke.”

“One of the most popular blood sports was cockfighting. Participants of all classes came to the cockpit with sacks holding their prize roosters, whose wings and tails had been clipped and whose legs were fitted with long sharp spurs called gaffles. Amidst a roar of betting, two cocks were placed in the ring and pushed at each other until they began to fight. ‘Then it is amazing,’ wrote one spectator, ‘to see how they peck at each other, and especially how they hack with their spurs. Their combs bleed terribly and they often slit each other’s crop and abdomen with the spurs.’ Battle continued until one of the birds stood crowing on its dead opponent’s body.” One witness to such a battle in 1728 wrote, “Cocks will sometimes fight a whole hour before one or the other is victorious.”

“Another popular spectacle was the ‘baiting’ of an animal by tying it up and sending dogs against it. The most popular animal for such contests was a bull. In fact, in some places, it was illegal for a butcher to slaughter a bull without first making it the subject of such sport.”

Blog Post | Water & Sanitation

If You Think New York City Life Is Bad Now

A grim tour of preindustrial New York

Summary: Many people today feel that life in New York has become uniquely difficult. Some imagine that the city was cleaner, safer, and more livable in the distant past. Historical reality tells a different story: Preindustrial New York was marked by extreme filth, unsafe water, rampant disease, pervasive poverty, and living conditions that made everyday life harsh and dangerous compared to contemporary times.


Discontent fueled the 2025 New York City mayoral election and Zohran Mamdani’s victory. A common theme echoed across the five boroughs: New York is a hard place to live. “We are overwhelmed by housing costs,” said Santiago, a 69-year-old retiree, outside a Mamdani rally. Those opposed to Mamdani had their own complaints. María Moreno, a first-time voter from the Bronx who supported Andrew Cuomo, lamented, “Now everything’s dirty, and our neighborhood does not feel safe.”

Today’s voters have legitimate grievances. The city’s housing costs, quality-of-life issues, and perceptions of disorder weigh heavily on residents’ minds. But it’s important to keep things in perspective. Different voters may romanticize different eras, but many seem to share a sense that if they could travel back far enough in time, they’d find a New York that was once clean, safe, and affordable. When Americans were polled in 2023, almost 20 percent said that it was easier to “have a thriving and fulfilling life” hundreds of years ago. Across the country, as one writer put it, people are engaged in an “endless debate around whether the preindustrial past was clearly better than what we have now.” In fact, Mamdani’s politics are grounded in an ideology that first arose from the frustrations of the early industrial era.

If Americans could go back in time to preindustrial New York City, however, they’d likely be horrified and possibly traumatized. Despite today’s real challenges, most New Yorkers would not trade places with their predecessors.

Long before the rise of factories and industry, New York City was a bustling port, founded by the Dutch in the early seventeenth century as the fur-trading post of New Amsterdam. As early as 1650, local authorities enacted an ordinance against animals roaming the streets to protect local infrastructure—but to no avail. Then, in 1657, according to the Dutch scholar Jaap Harskamp:

New Amsterdam’s council attempted to ban the common practice of throwing rubbish, ashes, oyster-shells or dead animals in the street and leave the filth there to be consumed by droves of pigs on the loose. When the English took over the colony from the Dutch, pigs and goats stayed put. . . . Pollution persisted. The streets of Manhattan were a stinking mass. Inhabitants hurled carcasses and the contents of loaded chamber pots into the street and rivers. Runoff from tanneries where skins were turned into leather flowed into the waters that supplied the shallow wells. The (salty) natural springs and ponds in the region became contaminated with animal and human waste. For some considerable time, access to clean water remained an urgent problem for the city. . . . The penetrating smell of decomposing flesh was everywhere.

Into the early twentieth century, urban living in the United States felt surprisingly rural and agrarian, with an omnipresent reek to match. As late as the mid-nineteenth century, pigs roamed freely through New York City streets, acting as scavengers, and nearly every household maintained a vegetable garden, often fertilized with animal manure.

Indoor air quality was no better. A drawing from Mary L. Booth’s History of the City of New York depicts a seventeenth-century New Amsterdam home with smoke from the fireplace swirling through the room. Indoor air pollution remains a serious problem today in the poorest parts of the world, as smoke from hearths can cause cancer and acute respiratory infections that often prove deadly in children. One preindustrial writer railed against the “pernicious smoke [from fireplaces] superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.”

That said, while inescapable filth coated the interiors of preindustrial homes, the average person owned few possessions for the corrosive hearth smoke and soot to ruin. By modern standards, New Yorkers—like most preindustrial people—were impoverished and lacked even the most basic amenities. According to historian Judith Flanders, in the mid-eighteenth century, “fewer than two households in ten in some counties of New York possessed a fork.” Many were desperately poor even by the standards of the day and could not afford housing. One 1788 account lamented how in New York City, “vagrants multiply on our Hands to an amazing Degree.” Charity records suggest that the “outdoor poor” far outnumbered those in almshouses.

Water quality was infamously awful. In seventeenth-century New Amsterdam, as Benjamin Bullivant observed, “[There are] many publique wells enclosed & Covered in ye Streetes . . . [which are] Nasty & unregarded.” A century later, New York’s water remained as foul as Bullivant had described. Visiting in 1748, the Swedish botanist Peter Kalm noted that the city’s well water was so filthy that horses from out of town refused to drink it. In 1798, the Commercial Advertiser condemned Manhattan’s main well as “a shocking hole, where all impure things center together and engender the worst of unwholesome productions; foul with excrement, frogspawn, and reptiles, that delicate pump system is supplied. The water has grown worse manifestly within a few years. It is time to look out [for] some other supply, and discontinue the use of a water growing less and less wholesome every day. . . . It is so bad . . . as to be very sickly and nauseating; and the larger the city grows the worse this evil will be.”

In 1831, a letter in the New York Evening Journal described the state of the water supply:

I have no doubt that one cause of the numerous stomach affections so common in this city is the impure, I may say poisonous nature of the pernicious Manhattan water which thousands of us daily and constantly use. It is true the unpalatableness of this abominable fluid prevents almost every person from using it as a beverage at the table, but you will know that all the cooking of a very large portion of the community is done through the agency of this common nuisance. Our tea and coffee are made of it, our bread is mixed with it, and our meat and vegetables are boiled in it. Our linen happily escapes the contamination of its touch, “for no two things hold more antipathy” than soap and this vile water.

In 1832, New York experienced a devastating outbreak of cholera, a bacterial disease that typically spread through contaminated water and killed with remarkable speed. A person could wake up feeling well and be dead by nightfall, struck down with agonizing cramps, vomiting, and diarrhea. The epidemic killed about 3,500 New Yorkers.

The initial actions taken to protect city water supplies were often private in nature. In fact, throughout the eighteenth and early nineteenth centuries, private businesses generally supplied urban water infrastructure. Despite such efforts, drinking water remained generally unsafe, even after industrialization, until the chlorination of urban water supplies became widespread.

The pervasive grime took a visible toll on New Yorkers. Between drinking tainted water, eating contaminated food, inhaling smoke-filled air, and living with poor hygiene, the average resident sported visibly rotten teeth. One letter from 1781 described an acquaintance: “Her teeth are beginning to decay, which is the case with most New York girls, after eighteen.”

The dental practices of the time were often as horrifying as the effects of neglect. The medieval method of using arsenic to kill gum tissue, providing pain relief by destroying nerve endings, remained common until the introduction of Novocain in the twentieth century. As late as 1879, the New York Times ran a story with the headline “Fatal Poison in a Tooth; What Caused the Horrible Death of Mr. Gardiner. A Man’s Head Nearly Severed from His Body by Decay Caused by Arsenic Which Had Been Placed in One of His Teeth to Deaden an Aching Nerve—an Extraordinary Case.” The story detailed the gruesome demise of a man in Brooklyn, George Arthur Gardiner, who died “in great agony, after two weeks of indescribable suffering.”

Preindustrial New York City wasn’t uniquely miserable for its time. Life was harsh everywhere, and cities around the world contended with the same foul smells, filth, poor sanitation, and grinding poverty. Rural villages were no better. Peasant families often brought their livestock indoors at night and slept huddled together for warmth. In many cases, rural peasants were even poorer than their urban counterparts and owned fewer possessions. Farm laborers frequently suffered injuries and aged prematurely from backbreaking work, while fertilizing cesspits spread disease and filled the air with an inescapable stench.

Though early New Yorkers may have been slightly better off than their rural counterparts, their struggles are worth remembering. However daunting the problems of today may seem, a proper historical perspective can remind us of how far we’ve come.

This article was originally published in City Journal on 1/13/2026.

Blog Post | Human Development

The Grim Truth About the “Good Old Days”

Preindustrial life wasn’t simple or serene—it was filthy, violent, and short.

Summary: Rose-tinted nostalgia for the preindustrial era has gone viral—some people claim that modernity itself was a mistake and that “progress” is an illusion. This article addresses seven supposed negative effects of the Industrial Revolution. The conclusion is that history bears little resemblance to the sanitized image of preindustrial times in the popular imagination.


When Ted Kaczynski, the Unabomber, declared in 1995 that “the Industrial Revolution and its consequences have been a disaster for the human race,” he was voicing a sentiment that now circulates widely online.

Rose-tinted nostalgia for the preindustrial era has gone viral, strengthened by anxieties about our own digital era. Some are even claiming that modernity itself was a mistake and that “progress” is an illusion. Medieval peasants led happier and more leisurely lives than we do, according to those who pine for the past. “The internet has become strangely nostalgic for life in the Middle Ages,” journalist Amanda Mull wrote in a piece for The Atlantic. Samuel Matlack, managing editor of The New Atlantis, observed that there is currently an “endless debate around whether the preindustrial past was clearly better than what we have now and we must go back to save humanity, or whether modern technological society is unambiguously a forward leap we must forever extend.”

In the popular imagination, the Industrial Revolution was the birth of many evils, a time when smoke-belching factories disrupted humanity’s erstwhile idyllic existence. Economics professor Vincent Geloso’s informal survey of university students found that they believed “living standards did not increase for the poor; only the rich got richer; the cities were dirty and the poor suffered from ill-health.” Pundit Tucker Carlson has even suggested that feudalism was preferable to modern liberal democracy.

Different groups tend to idealize different aspects of the past. Environmentalists might idealize preindustrial harmony with nature, while social traditionalists romanticize our ancestors’ family lives. People from across the political spectrum share the sense that the Industrial Revolution brought little real improvement for ordinary people.

In 2021, History.com published “7 Negative Effects of the Industrial Revolution,” an article reflecting much of the thinking behind the popular impression that industrialization was a step backward for humanity, rather than a period of tremendous progress. But was industrialization really to blame for each of the ills detailed in the article?

“Horrible Living Conditions for Workers”

Were horrible living conditions a result of industrialization? To be sure, industrial-era living conditions did not meet modern standards—but neither did the living conditions that preceded them.

As historian Kirstin Olsen put it in her book, Daily Life in 18th-Century England, “The rural poor . . . crowded together, often in a single room of little more than 100 square feet, sometimes in a single bed, or sometimes in a simple pile of shavings or straw or matted wool on the floor. In the country, the livestock might be brought indoors at night for additional warmth.” In 18th-century Wales, one observer claimed that in the homes of the common people, “every edifice” was practically a miniature “Noah’s Ark” filled with a great variety of animals. One shudders to think of the barnlike smell that bedchambers took on, in addition to the chorus of barnyard sounds that likely filled every night. Our forebears put up with the stench and noise and cuddled up with their livestock, if only to stave off hypothermia.

Homes were often so poorly constructed that they were unstable. The din of collapsing buildings was such a common sound that in 1688, Randle Holme defined a crash as “a noise proceeding from a breach of a house or wall.” The poet Samuel Johnson wrote that in 1730s London, “falling houses thunder on your head.” In the 1740s, “props to houses” keeping them from collapsing were listed among the most common obstacles that blocked free passage along London’s walkways.

“Poor Nutrition”

What about poor nutrition? From liberal flower children to the “Make America Healthy Again” crowd, fetishizing the supposedly chemical-free, wholesome diets of yore is bipartisan. The truth, however, is stomach-churning.

Our ancestors not only failed to eat well, but they sometimes didn’t eat at all. Historian William Manchester noted that in preindustrial Europe, famines occurred every four years on average. In the lean years, “cannibalism was not unknown. Strangers and travelers were waylaid and killed to be eaten.” Historian Fernand Braudel recorded a 1662 account from Burgundy, France, that lamented that “famine this year has put an end to over ten thousand families . . . and forced a third of the inhabitants, even in the good towns, to eat wild plants. . . . Some people ate human flesh.” A third of Finland’s population is estimated to have died of starvation during a famine in the 1690s.

Even when food was available, it was often far from appetizing. Our forebears lived in a world where adulterated bread and milk, spoiled meat, and vegetables tainted with human waste were everyday occurrences. London bread was described in a 1771 novel as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.” According to historian Emily Cockayne, the 1757 public health treatise Poison Detected noted that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.”

Water was also far from pristine. “For the most part, filth flowed out windows, down the streets, and into the same streams, rivers, and lakes where the city’s inhabitants drew their water,” according to environmental law professor James Salzman. This ensured that each swig included a copious dose of human excreta and noxious bacteria. Waterborne illnesses were frequent.

“A Stressful, Unsatisfying Lifestyle”

Did stressful lifestyles originate with industrialization? Did our preindustrial ancestors generally enjoy a sense of inner peace? Doubtful. Sadly, many of them suffered from what they called melancholia, roughly analogous to the modern concepts of anxiety and depression.

In 1621, physician Robert Burton described a common symptom of melancholia as waking in the night due to mental stress among the upper classes. An observer said the poor similarly “feel their sleep interrupted by the cold, the filth, the screams and infants’ cries, and by a thousand other anxieties.” Richard Napier, a 17th-century physician, recorded over several decades that some 20 percent of his patients suffered from insomnia. Today, in comparison, 12 percent of Americans say they have been diagnosed with chronic insomnia. Stress is nothing new.

Sky-high preindustrial mortality rates caused profound emotional suffering to those in mourning. Losing a child to death in infancy was once a common—indeed, near-universal—experience among parents, but the loss was no less painful for all its ordinariness. Many surviving testimonies suggest that mothers and fathers felt acute grief with each loss. The 18th-century poem “To an Infant Expiring the Second Day of Its Birth,” by Mehetabel “Hetty” Wright—who lost several of her own children prematurely—heartrendingly urges her infant to look at her one last time before passing away.

So common were child deaths that practically every major poet explored the subject. Robert Burns wrote “On the Birth of a Posthumous Child.” Percy Bysshe Shelley wrote multiple poems to his deceased son. Consider the pain captured by these lines from William Shakespeare’s play King John, spoken by the character Constance upon her son’s death: “Grief fills the room up of my absent child. . . . O Lord! My boy, my Arthur, my fair son! My life, my joy, my food, my all the world!” Shakespeare’s own son died in 1596, around the time the playwright would have finished writing King John.

Only in the modern world has child loss changed from extraordinarily common to exceedingly rare. As stressful as modern life can be, our ancestors faced forms of heartache that most people today will never endure.

“Dangerous Workplaces” and “Child Labor”

Dangerous workplaces and child labor both predate the Industrial Revolution. In agrarian societies, entire families would labor in fields and pastures, including pregnant women and young children. Many preindustrial children entered the workforce at what today would be considered preschool or kindergarten age.

In poorer families, children were sent to work by age 4 or 5. If children failed to find gainful employment by age 8, even social reformers unusually sympathetic to the plight of the poor would express open disgust at such a lack of industriousness. Jonas Hanway was reportedly “revolted by families who sought charity when they had children aged 8 to 14 earning no wages.”

For most, work was backbreaking and unending. A common myth suggests that preindustrial peasants worked fewer days than modern people do. This misconception originated from an early estimate by historian Gregory Clark, who initially proposed that peasants labored only 150 days a year. He later revised this figure to around 300 days—higher than the modern average of 260 working days, even before factoring in today’s paid holidays and vacation time.

Physically harming one’s employees was once widely accepted, too, and authorities stepped in only when the mistreatment was exceptionally severe. In 1666, one such case occurred in Kittery, in what is now Maine, when Nicholas and Judith Weekes caused the death of a servant. Judith confessed that she cut off the servant’s toes with an axe. The couple, however, was not indicted for murder, merely for cruelty.

“Discrimination Against Women”

The preindustrial world was hardly a model of gender equality—discrimination against women was not an invention of the early industrialists but a long-standing feature of many societies.

Domestic violence was widely tolerated. In London, a 1595 law dictated: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” In other words, no beating your wife after 9:00 p.m. That was a noise regulation. A similar law forbade using a hammer after 9:00 p.m. Beating one’s wife until she screamed was an ordinary and acceptable activity.

Domestic violence was celebrated in popular culture, as in the lively folk song “The Cooper of Fife,” a traditional Scottish tune that inspired a country dance and influenced similar English and American ballads. To modern ears, the contrast between its violent lyrics and upbeat melody is unsettling. The song portrays a husband as entirely justified in his acts of domestic violence, inviting the audience to side with the wifebeater and cheer as he beats his wife into submission for failing to perform domestic chores to his satisfaction.

Sexist laws often empowered men to abuse women. If a woman earned money, her husband could legally claim it at any time. For instance, in 18th-century Britain, a wife could not enter into contracts, make a will without her husband’s approval, or decide on her children’s education or apprenticeships; moreover, in the event of a separation, she automatically lost custody. Mistreatment of women, in other words, long predated industrialization. Arguably, it was the increase in female labor force participation during the Industrial Revolution that ultimately gave women greater economic independence and strengthened their social bargaining power.

“Environmental Harm”

While many of today’s environmental challenges—such as climate change and plastic pollution—differ from those our forebears faced, environmental degradation is not a recent phenomenon. Worrying about environmental impact, however, is rather new. Indeed, as historian Richard Hoffmann has pointed out, “Medieval writers often articulated an adversarial understanding of nature, a belief that it was not only worthless and unpleasant, but actively hostile to . . . humankind.”

Consider deforestation. The Domesday Survey of 1086 found that trees covered 15 percent of England; by 1340, the share had fallen to 6 percent. France’s forests more than halved from about 30 million hectares in Charlemagne’s time (768–814) to 13 million by Philip IV’s reign (1285–1314).

Europe was hardly the only part of the world to abuse its forests. A 16th-century witness observed that at every proclamation demanding more wood for imperial buildings, the peasants of what are today the Hubei and Sichuan provinces in China “wept with despair until they choked,” for there was scarcely any wood left to be found.

Despeciation is also nothing new. Humans have been exterminating wildlife since prehistory. The past 50,000 years saw about 90 genera of large mammals go extinct, amounting to over 70 percent of America’s large species and over 90 percent of Australia’s. 

Exterminations of species occurred throughout the preindustrial era. People first settled in New Zealand in the late 13th century. In only 100 years, humans exterminated 10 species of moa in addition to at least 15 other kinds of native birds, including ducks, geese, pelicans, coots, Haast’s eagle, and an indigenous harrier. Today, few people realize that lions, hyenas, and leopards were once native to Europe, but by the first century, human activity had eliminated them from the continent. The last known aurochs, Europe’s native wild ox, was killed in Poland by a noble hunter in 1627.

Progress Is Real

History bears little resemblance to the sanitized image of preindustrial times in the popular imagination—that is, a beautiful scene of idyllic country villages with pristine air and residents merrily dancing around maypoles. The healthy, peaceful, and prosperous people in this fantasy of pastoral bliss do not realize their contented, leisurely lives will soon be disrupted by the story’s villain: the dark smokestacks of the Industrial Revolution’s “satanic mills.”

A closer look shatters such rose-colored illusions. The world most of our ancestors faced was in fact more gruesome than modern minds can fathom. From routine spousal and child abuse to famine-induced cannibalism and streets that doubled as open sewers, practically every aspect of existence was horrific.

A popular saying holds that “the past is a foreign country,” and based on recorded accounts, it is not one where you would wish to vacation. If you could visit the preindustrial past, you would likely give the experience a zero-star rating. Indeed, the trip might leave you permanently scarred, both physically and psychologically. You might long to unsee the horrors encountered on your adventure and to forget the shocking, gory details.

The upside is that the visit would help deromanticize the past and show how far humanity has truly come—emphasizing the utter transformation of everyday lives and the reality of progress.

This article was published at Big Think on 11/19/2025.


Discontent in the Age of Plenty | Podcast Highlights

Marian Tupy interviews Brink Lindsey about why unprecedented prosperity has failed to deliver widespread meaning.

Listen to the podcast or read the full transcript here.

Today, I’ll be speaking with Brink Lindsey, an American political writer and Senior Vice President at the Niskanen Center. Previously, he was Cato’s Vice President for Research and a dear colleague. Today, we’ll be discussing his latest book, The Permanent Problem: The Uncertain Transformation from Mass Plenty to Mass Flourishing.

I want to start by congratulating you on your excellent book. It is concise, thoughtful, and beautifully written. As a published author, I’m envious of your style, and I really recommend the book to our listeners.

Let’s start with the most obvious question. What is the permanent problem?

I stole that line from the British economist John Maynard Keynes, who wrote a fascinating essay called “Economic Possibilities for Our Grandchildren.”

That essay came out in 1930 in the depths of the Great Depression, but he was brave enough to argue that this global catastrophe was just a bump in the road in a much longer process of modern economic growth, which he believed would continue until his audience’s grandchildren were grown. By that point, he said that the economic problem, meaning serious material deprivation, would be more or less solved. With that done, he foresaw that humanity’s permanent problem would loom into view: how to live wisely and agreeably and well with the blessings that modern economic growth has bestowed upon us.

He got some specific things wrong. He imagined that by now we’d only be working 15 hours a week, which hasn’t panned out. However, he got the big picture profoundly right, which is that an abundant future was coming, and that moving from tackling the economic problem to the permanent problem would be traumatic for societies. That they would have to unlearn the habits of untold generations.

He imagined that this transition would be, in his words, something like a “general nervous breakdown throughout society.” That phrase struck me as a pretty good description for the predicament that the United States and other advanced democracies have found themselves in. We’re richer, healthier, better educated, and more humanely governed than any people have ever been before, yet economic growth has slowed to a crawl in most advanced economies, class divisions have sparked a global populist uprising against elites and established institutions, personal relationships are fraying, mental health problems are on the rise, faith in democracy is wavering, and widespread pessimism is one of the few things you can get people across the political spectrum to agree on.

So, the thesis of the book is that our predicament amounts to the fact that we are in this no man’s land between mass plenty and mass flourishing. That, having achieved mass plenty, we’ve moved the goalposts of what makes a successful life. It’s no longer just about having food, shelter, and clothing, but meaning, purpose, belonging, and status. While we are providing those conditions for a larger fraction of the population than ever before, for 70 or 80 percent of people, our current way of life is not providing the conditions for flourishing that one would imagine would go with our level of technological and organizational prowess.

So, in America today, things are so good that we are moving to the top of Maslow’s hierarchy, but on the other hand, we have a hysteria where people are saying basic necessities like food and shelter have never been more unaffordable.

Can both be true at the same time?

I think we are absolutely materially richer than any society before. People who are discontent with the status quo grope for something quantifiable that has gone wrong, and so they try to make an argument about material decline that just isn’t consistent with the facts. It is true that we are rich enough to take our basic material needs for granted. Nonetheless, we enjoy these blessings with a kind of asterisk, which is that we get them only by spending the bulk of our waking adult lives working 40-hour weeks.

The blessed 20 or 30 percent at the top have an arena for flourishing. They’ve got intellectually challenging jobs that offer a lot of autonomy and scope for creativity, and social status. The rest are in fairly low-autonomy jobs with a lot of scutwork, and they’re one stroke of bad luck away from losing their job and falling into a serious hole. They’re shadowed by both the precarity of their hold on mass plenty and also by the need to spend a lot of their lives in drudgery to pay the bills.

According to Gallup, life satisfaction in America remained pretty much the same between 1979 and 2025. Roughly 80 percent of Americans say they are either satisfied or very satisfied with their lives, while only 20 percent of Americans believe that America is going in the right direction.

So, how bad is it really, if 80 percent of Americans say that they are satisfied or very satisfied with their lives?

I don’t put much stock in self-assessments of life satisfaction. Psychologically healthy people make the best of things, whatever the circumstances. Plus, happiness and life satisfaction surveys have a lot of cultural variation. Latin Americans seem to report higher life satisfaction given their level of GDP than Scandinavians or Japanese.

What I look at instead is the conditions for a well-lived life. The chances to do work that is challenging, fulfilling, and interesting are very good for a considerable fraction of people, but they’re not so good for the majority. There’s a large divergence there between the well-off and well-educated and everybody else. That’s also translated into diverging odds of even being in the workforce: there’s been a small drop-off in male prime-age labor force participation for college-educated men from the mid-’60s to the present, and a big drop-off in labor force participation for non-college-educated men. There’s been a similar divergence in the odds of getting married and in the odds of growing up in a two-parent home. And finally, in recent years, we’ve seen a divergence in life expectancy. Rather than the poor catching up with the rich over time, they’re now pulling apart.

So, are we doing better than ever before? Sure. But I don’t think that exhausts the inquiry. In a society organized around progress, a purely backward-looking standard of evaluation isn’t dispositive. In some of the more intangible aspects of flourishing, there are warning signs that things are going in the wrong direction.

So, do you have in your mind a sense of what an agreeable life should be?

At least in broad outlines.

In the agrarian age, to quote Hobbes, “Life was poor, nasty, brutish, and short,” but it was not solitary. People were miserable and poor, but they weren’t atomized or alienated. Now, I think it’s a real liberation that we’re not stuck in the same place that we were born, working the same trade as our parents. We can choose our own lives, and that’s a great opportunity. The next question is, “Are we going to develop cultural and institutional supports in these new conditions that will help us to have satisfying lives?”

It’s beyond serious dispute that for most people, the most important determinant of the quality of their life is the quality of their personal relationships. And once upon a time, when the world was poor, your face-to-face relationships with other people filled vital practical functions. Your spouse was a partner in economic co-production. Your kids were economic assets. Your neighbors were an insurance policy. The main source of entertainment was hanging out with your friends and talking.

Over time, as we’ve gotten richer, we’ve outsourced a lot of those functions either to the marketplace or the welfare state. Personal relationships with people have become just one consumption option in a sea of expertly marketed alternatives. Learning to live wisely and agreeably and well amidst riches requires cultural and institutional supports that push us to spend our time on what really matters, which is the people who are close to us. We don’t have those, so we’re seeing fraying human connection.

This is cashing out most fatefully in the declining rate of people getting married and having babies. More than half of people now live in countries where the fertility rate is below replacement. That puts the whole demographic sustainability of liberal, democratic, capitalist, cosmopolitan, affluent civilization in doubt.

I want to ask you about the danger of presentism.

When we see a problem on the front pages of newspapers, we tend to extrapolate from it a broader crisis. In other words, we have trouble separating that which is fundamental to our civilization from that which is just a passing trend.

Let me give you a few examples. You write in the book that “we are getting fatter, dumber, and our mental health is deteriorating.” It certainly feels like it, right? But obesity is already declining in the United States because of Ozempic. Increasingly large numbers of young people are switching off social media. Apparently, Gen Z, the newest ones, are the best at that. Suicide rates are falling in rich countries outside of the United States, meaning this may be a particular American problem, or even simply a problem of measurement, rather than a general problem with modernity.

So, are we underestimating human adaptability and technological innovation?

That’s a very good point. We learn over time that some things that we thought were great turned out to be bad, and we put them behind us. Forty percent of American adults used to smoke, and we covered our walls with lead paint. And yes, we’ve got what looks like a deus ex machina for obesity, but the fact that the obesity wave happened at all is a good example of a more general challenge of being rich.

When we were poor, we developed a scarcity-based morality of self-discipline and self-control and resisting temptation out of necessity, but as those material constraints lessened, there was an inevitable and appropriate loosening. People could indulge their desires more. They could, to a greater extent than in the past, follow an “if it feels good, do it” kind of path. Well, it turns out that those qualities of self-discipline and self-mastery are still extremely helpful today, not for keeping you from falling into horrible poverty, but for keeping you focused on the things that really matter, rather than trivial, distracting desires.

Capitalism gives us what we want, and we don’t yet have the cultural supports that make sure it gives us what we want to want.

One set of problems that you identify has to do with the disintegration of personal bonds and the atomization of society.

Now, if I wanted to make grandparents more reliant on their children, to make neighbors more helpful to each other, and to increase church attendance, I would start by abolishing the welfare state, which I think has eroded the kind of mutual, voluntary reliance that people once had on each other.

This might irritate you, but I see the welfare state as an integral part of modern capitalism. Nowhere do we see a complex, technologically intensive, organizationally intensive division of labor without a strong welfare state. It’s possible to imagine such a thing, but it’s also possible to imagine a human being that’s 100 meters tall. If you actually had a human being that tall, he would collapse under his own weight. Plus, the libertarian movement in the United States has made zero headway in knocking back the welfare state, so I think libertarians need some kind of plan B.

The hopeful future I have in mind is more localistic and involves reimbuing our face-to-face relationships with family and neighbors with practical functions, which will allow people to live without the welfare state to a considerable degree. You can imagine a world of small modular nuclear reactors and 3D printing and vertical farming where small communities, with small divisions of labor, could have a degree of material affluence that today requires large-scale divisions of labor. But even in the here and now, if people are living together in communities, they can reassume duties of care that have been outsourced to private enterprise and the welfare state, such as taking care of little kids and elderly people and educating the young.

I wonder what is going to be more effective at driving culture change: appealing to people, or changing the incentives. When the government says, “We can pay for your child to go to a school,” you can opt out, but you will have to pay twice if you want to send your kids to a private school.

At the very least, I think we agree we will need to have competition. We could give the welfare state to the states and let them play around with it so that different jurisdictions can learn from each other.

Yeah. And, even more importantly, on the regulatory side. This is what I call capitalism’s crisis of inclusion, which is the weakening relationship between growth and widespread good conditions for the good life for people.

Meanwhile, though, we have a crisis of dynamism, a weakening capacity of the system to just keep delivering growth and pushing the technological frontier outward. Mancur Olson identified this problem a long time ago, which is that the richer you get, the more people you have with a stake in the status quo. For those people, the prospect of disruptive change is anxiety-provoking because it could knock them off their privileged perch, so they have an incentive to stop change. Also, the richer you get, the lower communication costs are, and the easier it is to band together with like-minded people and throw sand in the gears of creative destruction.

Meanwhile, the knowledge economy has created this large class of knowledge workers who desire to control and rationalize everything in their grasp. When something isn’t working, the solution is to add another layer of bureaucracy and process. Obviously, we’ve got lots of this kind of dysfunction in the public sector, but I think we also see it in the private sector, with the explosion of administrative staff on campus, the HR-ization of corporate life, and also in personal life, with helicopter parenting. These same professionals, on their off hours, deploy their managerial instincts to squeeze every drop of spontaneity out of childhood in the name of safety.

Those impulses are deep-seated, and they have contributed to an increasing drag on our dynamism.

One of the most effective ways to tackle this is inter-jurisdictional competition, allowing different groups to have different rules to limit the exposure of those different rules. Then, if that different set of rules really is producing better results, they can be emulated elsewhere. Beyond that, we’re just ineradicably culturally pluralistic people, especially under conditions of modernity. People are not going to agree with each other on what the good life is. They’re going to have different values. Having us all crammed together under one set of rules makes those value differences really high stakes and combustible and has produced a lot of the dysfunctional politics we’re experiencing now.

Last question.

My view of what it means to live wisely, agreeably, and well may be very different from that of a guy who is perfectly satisfied living in his basement playing games and smoking a lot of pot. I would find such a life appalling, but who am I to tell this person that they are not living wisely, agreeably, and well?

In other words, aren’t you worried that even if all your hopes come to pass, the future may still contain a lot of people who will not be living wisely, agreeably, and well, just as they are today?

We can talk about flourishing at the individual level and then flourishing at the societal level.

In the book, I talk about projects, relationships, and experiences. Some people are really focused on projects and very light on relationships, and they do fine. Some people are great at cultivating amazing experiences, and they’re not very practical about anything else, but they live well that way. So there are a lot of different ways to have a good life.

At the social level, there’s a little bit less variety. To take one example, you can totally have a flourishing individual life without having children, but you can’t really have a flourishing society unless a certain number of people are having babies. So, I think you can’t have a flourishing society that isn’t a free society where people are the authors of their own lives, but a free society requires the freedom to fail. Some people are just not going to live wisely and agreeably and well.

I think we can create better conditions for people to choose well than we have at present. But that doesn’t mean we need to converge on one way of living well. That would be boring. Getting richer should mean a flowering of variety, not everybody converging on one way of life. And I think a more pluralistic, localistic institutional environment is most conducive to that end.

And it seems to me that living in a pluralistic society doesn’t mean that you are voiceless, that you don’t have a right to express your views about other people’s lives. Pluralism does not require total relativism. I can still say to little Jimmy, “Spend less time playing video games in your room and go out and explore the world.”

Ultimately, if we are going to be living in a pluralistic society where people can choose their values and how they want to live, it should still be possible to persuade them that some ways of living, such as living up to one’s best potential, are better than wasting one’s life.

This is the ultimate challenge for Homo sapiens: are we cut out for freedom? Are we cut out for being allowed to choose the good? Or are we just such a refractory species that we have to be lorded over?

The dystopian novel Brave New World, I think, is a much better fit with the predicament we’re in right now than 1984. The human spirit is being degraded, not by a regime of fear, but by a regime of cheap pleasures. At the end of that book, there’s this long monologue by the head of the society making this argument that human beings just don’t know what’s good for them and need to be taken care of. I don’t believe that. I have faith that there is a human nature that wants the good, that wants to connect to the outside world, and to other people, and figure things out. And we have the great privilege of living in a very rich, technologically advanced world that gives more people opportunities to do those things. We just need to structure things a little bit better to make it easier to make the right choices.