Blog Post | Poverty Rates

Modern Freedom Beats Feudal Serfdom

Make the Middle Ages Great Again?

Summary: Some influential voices today romanticize feudalism, but the reality of feudalism was misery for nearly everyone. Life under that system meant hunger, disease, violence, and lives cut brutally short. By contrast, modern societies have lifted billions out of poverty and extended life far beyond what kings and queens once knew. Progress comes from freedom, innovation, and hard work, not a return to the rule of lords and monarchs.


On a recent podcast, Tucker Carlson praised feudalism as “so much better than what we have now” because a ruler is “vested in the prosperity of the people he rules.” This romantic view of medieval hierarchy ignores a brutal reality: For most people, feudalism meant grinding poverty, disease, and early death.

As Gale L. Pooley and I found in our 2022 book Superabundance, society in preindustrial Europe was bifurcated between a small minority of the very rich and the vast majority of the very poor. One 17th-century observer estimated that the French population consisted of “10 percent rich, 50 percent very poor, 30 percent who were nearly beggars, and 10 percent who were actually beggars.” In 16th-century Spain, the Italian historian Francesco Guicciardini wrote, “except for a few Grandees of the Kingdom who live with great sumptuousness … others live in great poverty.”

An account from 18th-century Naples recorded beggars finding “nocturnal asylum in a few caves, stables or ruined houses” where “they are to be seen there lying like filthy animals, with no distinction of age or sex.” Children fared the worst. Paris, according to the French author Louis-Sébastien Mercier, had “7,000 to 8,000 abandoned children out of some 30,000 births around 1780.” These children were then taken—three at a time—to the poor house, with carriers often finding at least “one of them dead” upon arrival.

People were constantly hungry, and starvation was only ever a few bad harvests away. In 1800, even France, one of the world’s richest countries, had an average food supply of only 1,846 calories per person per day. Given that the average person needs about 2,000 calories a day, that means the majority of the population was undernourished. That, in the words of the Italian historian Carlo Cipolla, gave rise to “serious forms of avitaminosis,” or medical conditions resulting from vitamin deficiencies. He also noted the prevalence of intestinal worms, “a slow, disgusting, and debilitating disease that caused a vast amount of human misery and ill health.”

Sanitation was a nightmare. As the English historian Lawrence Stone wrote in his book The Family, Sex and Marriage in England 1500–1800, “city ditches, now often filled with stagnant water, were commonly used as latrines; butchers killed animals in their shops and threw the offal of the carcasses into the streets; dead animals were left to decay and fester where they lay.” London had “poor holes” or “large, deep, open pits in which were laid the bodies of the poor, side by side, row by row.” The stench was overwhelming, for “great quantities of human excrement were cast into the streets.”

The French historian Fernand Braudel found that in 15th-century England, “80 percent of private expenditure was on food, with 20 percent spent on bread alone.” An account of 16th-century life in rural Lombardy noted that peasants lived on wheat alone: Their “expenses for clothing and other needs are practically non-existent.” Per Cipolla, “One of the main preoccupations of hospital administration was to ensure that the clothes of the deceased should not be usurped but should be given to lawful inheritors. During epidemics of plague, the town authorities had to struggle to confiscate the clothes of the dead and to burn them: people waited for others to die so as to take over their clothes.”

Prior to mechanized agriculture, there were no food surpluses to sustain idle hands, not even those of children. And working conditions were brutal. A 16th-century ordinance in Lombardy found that supervisors in rice fields “bring together a large number of children and adolescents, against whom they practice barbarous cruelties … [They] do not provide these poor creatures with the necessary food and make them labor as slaves by beating them and treating them more harshly than galley slaves, so that many of the children die miserably in the farms and neighboring fields.”

Such violence pervaded daily life. Medieval homicide rates reached 150 murders per 100,000 people in 14th-century Florence. In 15th-century England, the rate hovered around 24 per 100,000. (In 2020, the Italian homicide rate was 0.48 per 100,000. It was 0.95 per 100,000 in England and Wales in 2024.) People resolved their disputes through physical violence because no effective legal system existed. The serfs—serfdom in Russia was abolished only in 1861—lived as property, bound to land they could never own, subject to masters who viewed them as assets rather than humans. And between 1500 and the first quarter of the 17th century, Europe’s great powers were at war nearly 100 percent of the time.

Carlson’s nostalgia for feudalism is not unique on the MAGA right. The influential American blogger Curtis Yarvin, for example, attributes to monarchs such as France’s Louis XIV decisive and long-term leadership that modern democracies apparently lack. But less frequently mentioned is how, for example, that same Louis ruined his country during the War of the Spanish Succession. As Winston Churchill wrote in Marlborough: His Life and Times,

After more than sixty years of his reign, more than thirty years of which had been consumed in European war, the Great King saw his people face to face with actual famine. Their sufferings were extreme. In Paris the death-rate doubled. Even before Christmas the market-women had marched to Versailles to proclaim their misery. In the countryside the peasantry subsisted on herbs or roots or flocked in despair into the famishing towns. Brigandage was widespread. Bands of starving men, women, and children roamed about in desperation. Châteaux and convents were attacked; the market-place of Amiens was pillaged; credit failed. From every province and from every class rose the cry for bread and peace.

The Great Enrichment, a phrase coined by my Cato Institute colleague Deirdre McCloskey to describe the past 200 years or so, lifted billions from the misery that defined human existence for millennia. It was driven by market economies and limits on the rulers’ arbitrary power, not feudal hierarchy.

There are many plausible reasons for Carlson’s (and Yarvin’s) openness to giving pre-modern institutions such as feudalism and absolute monarchy a second look. One is a lack of appreciation for the reality of the daily existence of ordinary people whose lives, in the immortal words of the English philosopher Thomas Hobbes, were “poor, nasty, brutish, and short.”

Another is their apparent conviction that the United States is, in the words of President Donald Trump, “a failed nation.” Except that we are nothing of the sort. The United States has plenty of problems, but the lives of ordinary Americans in 2025 are incomparably better than those of the kings and queens of the past. Our standard of living is, in fact, the envy of the world, which is the most parsimonious explanation for millions of people trying to get here.

Solving the problems that remain and will arise in the future will depend on careful evaluation of evidence, historical experience, reason, and hard work. Catastrophism does not help, for it rejects human agency by declaring that the future is already decided. Hunkering down under a protective shield of feudal hierarchy or placing our trust in a modern incarnation of Louis XIV is no guarantee of success. We tried it before, and the results were disastrous.

This article originally appeared in The Dispatch on August 26, 2025.

Blog Post | Mental Health

Psychiatric Overdiagnosis: The Price of Prosperity?

Abundance, loose criteria, and perverse healthcare incentives turned normal struggles into a diagnosable epidemic.

Summary: Rising rates of psychiatric diagnoses in wealthy countries have fueled claims of a growing mental health crisis, but this trend is in large part a byproduct of greater mental health awareness. Subjective psychiatric standards—combined with expanded diagnostic criteria—have blurred the line between normal human struggles and clinical disorders. Healthcare incentives, such as those in the US, often encourage overdiagnosis by rewarding providers for labeling and treating more patients rather than ensuring accurate or necessary care. Overdiagnosis is a problem of prosperity—but preferable to underdiagnosis, and solvable with the right incentives.


According to the World Health Organization, more than 1.1 billion people worldwide are living with a mental disorder. The figure has grown faster than the global population, and the burden falls disproportionately on the world’s wealthiest societies. In the United States, an estimated 49.5 percent of adolescents have met diagnostic criteria for at least one mental disorder at some point in their lifetime. Additionally, about 31 percent of American adults will experience an anxiety disorder at some point in their lives, and 21 percent a mood disorder, according to the National Institute of Mental Health.

In Australia, the National Study of Mental Health and Wellbeing found that 21.5 percent of adults older than 25, and more than 38 percent of young people aged 16 to 24, met criteria for a mental disorder in the previous 12 months. Across OECD nations, one in five adults experiences at least mild depressive symptoms, with over 9 percent of the population reporting clinical depression or anxiety.

These trends have become perhaps the most common objection to the case for human progress: If life is getting better, why are so many people apparently unhappy? Why are hundreds of millions of people across the most prosperous nations on Earth labeled clinically mentally unwell? 

For one, rising mental health diagnoses may themselves be a sign of progress. Psychiatry as a discipline is barely more than a century old, and it was stigmatized and unscientific throughout most of its history. What we now call mental health problems are, in many cases, what our ancestors called the inevitable vicissitudes of life. When survival demanded hard physical labor from dawn to dusk, there was little room for psychoanalysis. Perhaps only in a world of material abundance, safety, and comfort—where mood swings and relationship conflict represent life’s biggest challenges for many otherwise healthy people—do we begin to treat such adversity not as fate but as a problem to be solved. 

That is not to dismiss the problem entirely. Our survival-evolved brains are navigating environments they were never built for. It was adaptive to be vigilant about threats in one’s local environment; there was no possibility of witnessing every catastrophe on Earth in real time. Social media, sedentary lifestyles, weakened community bonds, and the erosion of traditional sources of meaning all represent genuine evolutionary mismatches that plausibly contribute to psychological distress. 

But at least in the United States, there is strong reason to believe that a less-examined driver of the supposed rise in mental illness is the healthcare financing system itself, which pays more when providers diagnose more.

Psychiatric Overdiagnosis in the United States

Psychiatric diagnoses in the United States are rising across virtually every category, in every age group. According to the National Institute of Mental Health, more than one in five U.S. adults—59.3 million people—lived with a mental illness in 2022. By these numbers, mental illness is not a rare affliction but a near-universal feature of American life, prompting some, including former US Surgeon General Vivek Murthy, MD, to declare a mental health epidemic.

The rise is evident across specific conditions as well. The Centers for Disease Control and Prevention (CDC) now places autism prevalence at 1 in 31, a 381 percent increase since 2000. Attention-Deficit/Hyperactivity Disorder (ADHD) diagnoses among American children nearly doubled from 6.1 to 11.4 percent between 1997 and 2022. Among working-age adults, self-reported ADHD diagnosis has more than tripled since 2012, from 4.25 to 13.9 percent. Diagnosed anxiety among children aged 3 to 17 rose from 6.9 to 10.6 percent between 2016 and 2022—a 54 percent increase in just six years. Diagnosed depression among the same age group climbed from 3.1 to 4.6 percent, a 48 percent increase, in the same time period. Among adults, the past-year prevalence of any mental illness rose to 23.1 percent in 2022, with young adults aged 18 to 25 reporting the highest rate of 36.2 percent.
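
To see how the increases above follow from the cited start and end rates, here is a short, purely illustrative sketch that redoes the percentage-change arithmetic using only the figures quoted in this paragraph; it introduces no new data.

```python
# Recompute the percentage increases cited above from the start/end prevalence rates.
def pct_increase(old: float, new: float) -> float:
    """Percentage change from an earlier prevalence rate to a later one."""
    return (new - old) / old * 100

figures = {
    "ADHD, children (1997-2022)":       (6.1, 11.4),
    "ADHD, working-age adults (2012-)": (4.25, 13.9),
    "Anxiety, ages 3-17 (2016-2022)":   (6.9, 10.6),
    "Depression, ages 3-17 (2016-2022)": (3.1, 4.6),
}

for label, (old, new) in figures.items():
    print(f"{label}: {old}% -> {new}%  (+{pct_increase(old, new):.0f}%)")
```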

A surface-level reading of these numbers suggests that America is indeed in the midst of a mental health crisis. But diagnoses can change even when our underlying psychology does not.

Psychiatric diagnoses differ from most of medicine because they rely on subjective mental phenomena and behavioral symptoms instead of physical symptoms or biomarkers. There is no blood test for autism, no imaging scan that confirms ADHD, and no objective test that differentiates clinical anxiety from ordinary worry. Diagnosis depends on clinical judgment about whether a person’s behavior exceeds a threshold established by committee consensus in the Diagnostic and Statistical Manual of Mental Disorders (DSM).

The DSM has progressively broadened the boundaries of major psychiatric categories over successive revisions. The DSM-5, published in 2013, collapsed previously distinct autism categories into a single spectrum, making “on the spectrum” a label elastic enough to encompass both nonverbal children requiring constant care and socially awkward adolescents who prefer solitude. The same revision loosened ADHD criteria, allowing symptoms to appear as late as age 12 rather than requiring onset by age 7, and reducing the symptom threshold for adults. Generalized anxiety disorder requires only that worry be “excessive” and cause “clinically significant distress or impairment,” judgments that depend entirely on a clinician’s interpretation of where normal worry ends and disorder begins.

Defenders of modern psychiatry often claim that expanding diagnostic criteria reflect better screening that captures subtler presentations, and that rising diagnoses reflect more accurate assessments of the true population prevalence of mental illness. But quite apart from the grim implications of a world in which half of all young people have experienced mental illness, there is reason to believe that psychiatric diagnoses have become less precise, not more.

Broad diagnostic criteria often interact with screening instruments that cannot reliably distinguish clinical conditions from normal variation. The CDC’s autism prevalence estimates, for instance, rely on surveys such as the Social Responsiveness Scale, which asks parents to rate statements like “Would rather be alone than with others,” “Has difficulty making friends,” and “Is regarded by other children as odd or weird.” These items describe behavioral traits common to social anxiety, introversion, and ordinary shyness, and cannot reliably distinguish autism from them. Yet researchers routinely use high scores on such instruments as proxies for clinical diagnosis in prevalence studies, including in the CDC’s own data.

The limitations of this approach became especially apparent after the COVID-19 pandemic. CDC autism prevalence surged an additional 40 percent in just four years, from 2018 to 2022—a period during which millions of children experienced prolonged social isolation, disrupted routines, and reduced peer interaction that would predictably elevate scores on parent-reported behavioral surveys measuring social difficulties, whether or not the underlying rate of autism had changed. 

None of this diminishes the reality of autism, ADHD, or anxiety disorders for individuals with significant functional impairment. But when the boundaries of diagnosis are inherently subjective, and when diagnosis is the key that unlocks streams of taxpayer-funded services, the system will predictably expand those boundaries.

How Medicaid and the ACA Reward Diagnostic Expansion

When diagnosis is subjective, and payment depends on diagnosis, the system will reward expanding the definition of illness.

Incentives drive behavior. Psychiatric overdiagnosis would matter less if a diagnosis were merely a label. But in the American healthcare system, diagnoses serve as keys that unlock streams of taxpayer dollars.

The Mental Health Parity and Addiction Equity Act of 2008, extended by the Affordable Care Act (ACA), requires health plans, including Medicaid managed care plans, to cover behavioral health services at parity with medical and surgical services. Parity addressed a real problem: mental health conditions were historically under-covered. But parity also limits the tools that plans can use to manage utilization. Prior authorization requirements, visit caps, and annual spending ceilings can all be challenged on parity grounds. Plans that wish to avoid litigation or regulatory action have a strong reason to approve rather than deny.

Under the fee-for-service payment model within Medicaid, which the 2008 parity provisions dramatically expanded, providers submit a claim to the state Medicaid agency, and the state pays the provider according to the predetermined price of the service, known as the fee schedule. In theory, the fee schedule limits providers’ room for maneuver on payment claims: by binding reimbursement to a prearranged sum, it is designed to prevent undue financial gain and mitigate abuse.

However, the fee schedule governs only the prices to which providers are entitled; it offers no effective mechanism for policing the legitimacy of the services themselves. Because the model fixes prices but not quantities, providers can profit simply by inflating the frequency of services. This creates the conditions for supplier-induced demand.

In practice, therefore, providers are free to manipulate demand by lowering the diagnostic threshold for services. Weak spending constraints across states further subsidize this demand, distorting market forces by allowing providers to expand mental health services beyond the point at which their cost would be acceptable to recipients, especially those with only marginal diagnostic eligibility.
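
To make the incentive concrete, here is a minimal, purely illustrative sketch of supplier-induced demand under fee-for-service. Every number in it (the fee, the course of treatment, the distribution of symptom scores) is hypothetical; the only point it demonstrates is that when the fee schedule fixes price but not volume, billings rise mechanically as the diagnostic threshold falls, even though the underlying population is unchanged.

```python
# Toy model: fee-for-service billings as a function of the diagnostic threshold.
# All values are hypothetical; the fee schedule fixes the price, not the volume.
import random

random.seed(0)
# Symptom-severity scores for a hypothetical population (0 = none, 10 = severe).
population = [random.gauss(4, 2) for _ in range(10_000)]

FEE_PER_SESSION = 120          # assumed fee-schedule rate, in dollars
SESSIONS_PER_DIAGNOSIS = 12    # assumed course of treatment per diagnosed patient

def annual_billings(threshold: float) -> float:
    """Total billings if every patient scoring at or above the threshold is diagnosed and treated."""
    diagnosed = sum(score >= threshold for score in population)
    return diagnosed * SESSIONS_PER_DIAGNOSIS * FEE_PER_SESSION

for threshold in (8, 7, 6, 5, 4):
    print(f"diagnostic threshold {threshold}: billings ${annual_billings(threshold):,.0f}")
# Billings grow as the threshold drops, with no change in the population's actual condition.
```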

Similar risks persist in managed care, which pays per patient rather than per service. While this model improves cost predictability, it does little to ensure services are necessary. Providers still control enrollment, and expanding the number of patients can drive spending just as effectively as increasing the number of services. Changing the payment mechanism does not eliminate the incentive—it simply shifts how it is exploited.

Additionally, under Medicaid’s Early and Periodic Screening, Diagnostic, and Treatment (EPSDT) benefit, states must cover all medically necessary services for children under 21, even services not otherwise included in the state’s Medicaid plan, including mental health services. 

When diagnoses rest on subjective behavioral criteria, and when coverage means open-ended reimbursement for services billed by the hour, the connection between spending and genuine clinical need begins to erode.

Then there is the federal matching structure. Medicaid’s open-ended Federal Medical Assistance Percentage reimburses states for 50 to 83 percent of Medicaid expenditures. When a state spends a dollar on autism services, it pays 17 to 50 cents. Federal taxpayers cover the rest. And because the match is open-ended, more spending automatically brings in more federal dollars. States bear only a fraction of the cost, weakening the fiscal discipline that comes with spending their own money.
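
A rough sketch of that matching arithmetic, using example rates within the 50 to 83 percent statutory range cited above, shows why each additional dollar of state spending draws in one to several federal dollars.

```python
# Split a Medicaid expenditure between state and federal payers at a given FMAP.
# FMAP values below are examples within the 50-83 percent range cited above.
def split_spending(total_dollars: float, fmap: float) -> tuple[float, float]:
    """Return (state_share, federal_share) of a Medicaid expenditure."""
    federal = total_dollars * fmap
    return total_dollars - federal, federal

for fmap in (0.50, 0.66, 0.83):
    state, federal = split_spending(1.00, fmap)
    leverage = federal / state  # federal dollars drawn per state dollar spent
    print(f"FMAP {fmap:.0%}: state pays ${state:.2f}, federal pays ${federal:.2f} "
          f"-> ${leverage:.2f} federal per state dollar")
```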

Once coverage of these therapies became mandatory, states used Medicaid waivers to circumvent standard rules and expand services and eligibility with federal funds. These waivers—and similar authorities—opened the door for providers to significantly increase Medicaid billing.

Enhanced federal matching rates during the COVID-19 public health emergency further reduced the state share, especially during the period when mental health spending grew the fastest. The pandemic significantly increased both the supply of and demand for psychiatric services. Telehealth services for mental health conditions surged 16- to 20-fold during the first year of the pandemic, according to a RAND study of over 5 million commercially insured adults, more than compensating for the drop in in-person care. By August 2022, overall mental health service utilization was 38.8 percent higher than before the pandemic. Mental health and substance use diagnoses grew from 11 percent of telehealth visits in early 2019 to 39 percent by mid-2021. The share of all outpatient visits carrying a mental health or substance use diagnosis doubled from 4 to 8 percent. 

Pandemic emergency waivers and telehealth policies further loosened restrictions on how services could be delivered and reimbursed. States such as Massachusetts, North Carolina, Indiana, and Colorado expanded telehealth eligibility (including audio-only services) and adopted payment parity for telehealth, effectively turning remote services into scalable, high-volume billing opportunities. The result was not just a shift in how care was delivered, but a notable increase in utilization and spending, often in the tens of millions of dollars per state annually, consistent with policy changes that reduced the marginal cost of delivering and billing for services.

A substantial body of research suggests that financial incentives can influence psychiatric diagnosis rates. In the United States, eligibility for school services and insurance coverage often depends on specific diagnostic categories. For example, states offering more autism-specific services tend to report higher autism prevalence, while classifications of other developmental disabilities decline—a pattern consistent with diagnostic substitution. A 2009 study estimated that at least 26 percent of the increase in autism diagnoses in California between 1992 and 2005 could be explained by diagnostic substitution, primarily from children previously classified as having intellectual disability.

A Problem of Prosperity?

In economic terms, what has unfolded in American mental healthcare is supplier-induced demand operating within a system that lacks the price signals, utilization controls, and outcome accountability mechanisms that would normally constrain it. The therapy industry has expanded to absorb the available reimbursement, exactly as economic theory would predict in a fee-for-service system with elastic diagnostic criteria, open-ended coverage mandates, and absent oversight.

That is worth stating clearly, because the rising tide of psychiatric diagnoses is often cited as proof that modernity has failed; that the improvements in life expectancy, poverty reduction, literacy, income, and so forth are hollow, because they mask a deeper spiritual or psychological collapse. That narrative is understandable. It is also incomplete.

The story of mental health in the modern world is not one of pure decline. It is a story of multiple forces operating simultaneously, some genuinely concerning and some artifacts of the very prosperity that makes psychological well-being a priority in the first place. Wealthy societies can afford to screen for, name, and treat conditions that our ancestors endured in silence or never recognized at all. That is a form of progress. But when the systems designed to deliver that care are structured to reward volume over value, diagnosis over outcome, and spending over accountability, the result is predictable: an ever-expanding pool of diagnoses that dilutes resources away from those with the most severe impairment.

There is reason to be optimistic. The fact that societies are wealthy and secure enough to attend to psychological suffering at all—rather than simply enduring it—represents a remarkable achievement. 

But the same ingenuity that produced modern medicine, market economies, and unprecedented material abundance can also produce perverse incentive structures that undermine the goals they were designed to serve. Understanding that human systems, like the humans who design them, are imperfect and responsive to incentives is not an argument against progress. It is a precondition for sustaining it. Progress, as ever, depends on getting the incentives right.

Revolution Medicines | Noncommunicable Disease

A Promising New Drug to Treat Metastatic Pancreatic Cancer

“Revolution Medicines, a late-stage clinical oncology company developing targeted therapies for patients with RAS-addicted cancers, today announced positive topline results from its global, randomized, controlled Phase 3 RASolute 302 clinical trial evaluating daraxonrasib in patients with metastatic pancreatic ductal adenocarcinoma (PDAC) who had been previously treated. Daraxonrasib taken orally once daily demonstrated statistically significant and clinically meaningful improvements in progression-free survival (PFS) and overall survival (OS) compared with standard of care cytotoxic chemotherapy delivered intravenously. In the overall (intent-to-treat) study population, daraxonrasib demonstrated a median OS of 13.2 months versus 6.7 months for chemotherapy, with a hazard ratio of 0.40 (p < 0.0001). Daraxonrasib was generally well tolerated, with a manageable safety profile and with no new safety signals.”

From Revolution Medicines.

Newswise | Vaccination

Experimental Hookworm Vaccine Shows Promising Protection

“Researchers at the George Washington University School of Medicine and Health Sciences in partnership with Baylor College of Medicine report encouraging results from a phase 2 clinical trial evaluating a candidate vaccine to prevent hookworm infection – one of the world’s most common parasitic diseases.

The findings, published in The Lancet Infectious Diseases, show that a formulation of the investigational vaccine significantly reduced the intensity of infection in healthy adult volunteers exposed to the parasite under carefully controlled conditions…

  • Participants who received the Na-GST 1/Al–CpG vaccine showed a dramatically lower intensity of infection after exposure: maximal hookworm egg count was median 0.0 eggs per gram of feces compared with the placebo group (median 66.7 eggs)
  • Peak eosinophil levels – a blood marker linked to parasitic infection – were significantly lower in the Na-GST-1/Al–CpG group of participants.
  • This group of participants also produced the highest levels of anti–Na-GST-1 antibodies, suggesting these antibodies may help protect against infection.”

From Newswise.

Blog Post | Water & Sanitation

If You Think New York City Life Is Bad Now

A grim tour of preindustrial New York

Summary: Many people today feel that life in New York has become uniquely difficult. Some imagine that the city was cleaner, safer, and more livable in the distant past. Historical reality tells a different story: Preindustrial New York was marked by extreme filth, unsafe water, rampant disease, pervasive poverty, and living conditions that made everyday life harsh and dangerous compared to contemporary times.


Discontent fueled the 2025 New York City mayoral election and Zohran Mamdani’s victory. A common theme echoed across the five boroughs: New York is a hard place to live. “We are overwhelmed by housing costs,” said Santiago, a 69-year-old retiree, outside a Mamdani rally. Those opposed to Mamdani had their own complaints. María Moreno, a first-time voter from the Bronx who supported Andrew Cuomo, lamented, “Now everything’s dirty, and our neighborhood does not feel safe.”

Today’s voters have legitimate grievances. The city’s housing costs, quality-of-life issues, and perceptions of disorder weigh heavily on residents’ minds. But it’s important to keep things in perspective. Different voters may romanticize different eras, but many seem to share a sense that if they could travel back far enough in time, they’d find a New York that was once clean, safe, and affordable. When Americans were polled in 2023, almost 20 percent said that it was easier to “have a thriving and fulfilling life” hundreds of years ago. Across the country, as one writer put it, people are engaged in an “endless debate around whether the preindustrial past was clearly better than what we have now.” In fact, Mamdani’s politics are grounded in an ideology that first arose from the frustrations of the early industrial era.

If Americans could go back in time to preindustrial New York City, however, they’d likely be horrified and possibly traumatized. Despite today’s real challenges, most New Yorkers would not trade places with their predecessors.

Long before the rise of factories and industry, New York City was a bustling port, founded by the Dutch in the early seventeenth century as New Amsterdam to trade furs. As early as 1650, local authorities enacted an ordinance against animals roaming the streets to protect local infrastructure—but to no avail. Then, in 1657, according to the Dutch scholar Jaap Harskamp:

New Amsterdam’s council attempted to ban the common practice of throwing rubbish, ashes, oyster-shells or dead animals in the street and leave the filth there to be consumed by droves of pigs on the loose. When the English took over the colony from the Dutch, pigs and goats stayed put. . . . Pollution persisted. The streets of Manhattan were a stinking mass. Inhabitants hurled carcasses and the contents of loaded chamber pots into the street and rivers. Runoff from tanneries where skins were turned into leather flowed into the waters that supplied the shallow wells. The (salty) natural springs and ponds in the region became contaminated with animal and human waste. For some considerable time, access to clean water remained an urgent problem for the city. . . . The penetrating smell of decomposing flesh was everywhere.

Into the early twentieth century, urban living in the United States felt surprisingly rural and agrarian, with an omnipresent reek to match. As late as the mid-nineteenth century, pigs roamed freely through New York City streets, acting as scavengers, and nearly every household maintained a vegetable garden, often fertilized with animal manure.

Indoor air quality was no better. A drawing from Mary L. Booth’s History of the City of New York depicts a seventeenth-century New Amsterdam home with smoke from the fireplace swirling through the room. Indoor air pollution remains a serious problem today in the poorest parts of the world, as smoke from hearths can cause cancer and acute respiratory infections that often prove deadly in children. One preindustrial writer railed against the “pernicious smoke [from fireplaces] superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.”

That said, although inescapable filth coated the interiors of preindustrial homes, the average person owned few possessions for the corrosive hearth smoke and soot to ruin. By modern standards, New Yorkers—like most preindustrial people—were impoverished and lacked even the most basic amenities. According to historian Judith Flanders, in the mid-eighteenth century, “fewer than two households in ten in some counties of New York possessed a fork.” Many were desperately poor even by the standards of the day and could not afford housing. One 1788 account lamented how in New York City, “vagrants multiply on our Hands to an amazing Degree.” Charity records suggest that the “outdoor poor” far outnumbered those in almshouses.

Water quality was infamously awful. In seventeenth-century New Amsterdam, as Benjamin Bullivant observed, “[There are] many publique wells enclosed & Covered in ye Streetes . . . [which are] Nasty & unregarded.” A century later, New York’s water remained as foul as Bullivant had described. Visiting in 1748, the Swedish botanist Peter Kalm noted that the city’s well water was so filthy that horses from out of town refused to drink it. In 1798, the Commercial Advertiser condemned Manhattan’s main well as “a shocking hole, where all impure things center together and engender the worst of unwholesome productions; foul with excrement, frogspawn, and reptiles, that delicate pump system is supplied. The water has grown worse manifestly within a few years. It is time to look out [for] some other supply, and discontinue the use of a water growing less and less wholesome every day. . . . It is so bad . . . as to be very sickly and nauseating; and the larger the city grows the worse this evil will be.”

In 1831, a letter in the New York Evening Journal described the state of the water supply:

I have no doubt that one cause of the numerous stomach affections so common in this city is the impure, I may say poisonous nature of the pernicious Manhattan water which thousands of us daily and constantly use. It is true the unpalatableness of this abominable fluid prevents almost every person from using it as a beverage at the table, but you will know that all the cooking of a very large portion of the community is done through the agency of this common nuisance. Our tea and coffee are made of it, our bread is mixed with it, and our meat and vegetables are boiled in it. Our linen happily escapes the contamination of its touch, “for no two things hold more antipathy” than soap and this vile water.

In 1832, New York experienced a devastating outbreak of cholera, a bacterial disease that typically spread through contaminated water and killed with remarkable speed. A person could wake up feeling well and be dead by nightfall, struck down with agonizing cramps, vomiting, and diarrhea. The epidemic killed about 3,500 New Yorkers.

The initial actions taken to protect city water supplies were often private in nature. In fact, throughout the eighteenth and early nineteenth centuries, private businesses generally supplied urban water infrastructure. Despite such efforts, drinking water remained generally unsafe, even after industrialization, until the chlorination of urban water supplies became widespread.

The pervasive grime took a visible toll on New Yorkers. Between drinking tainted water, eating contaminated food, inhaling smoke-filled air, and living with poor hygiene, the average resident sported visibly rotten teeth. One letter from 1781 described an acquaintance: “Her teeth are beginning to decay, which is the case with most New York girls, after eighteen.”

The dental practices of the time were often as horrifying as the effects of neglect. The medieval method of using arsenic to kill gum tissue, providing pain relief by destroying nerve endings, remained common until the introduction of Novocain in the twentieth century. As late as 1879, the New York Times ran a story with the headline “Fatal Poison in a Tooth; What Caused the Horrible Death of Mr. Gardiner. A Man’s Head Nearly Severed from His Body by Decay Caused by Arsenic Which Had Been Placed in One of His Teeth to Deaden an Aching Nerve—an Extraordinary Case.” The story detailed the gruesome demise of a man in Brooklyn, George Arthur Gardiner, who died “in great agony, after two weeks of indescribable suffering.”

Preindustrial New York City wasn’t uniquely miserable for its time. Life was harsh everywhere, and cities around the world contended with the same foul smells, filth, poor sanitation, and grinding poverty. Rural villages were no better. Peasant families often brought their livestock indoors at night and slept huddled together for warmth. In many cases, rural peasants were even poorer than their urban counterparts and owned fewer possessions. Farm laborers frequently suffered injuries and aged prematurely from backbreaking work, while fertilizing cesspits spread disease and filled the air with an inescapable stench.

Though early New Yorkers may have been slightly better off than their rural counterparts, their struggles are worth remembering. However daunting the problems of today may seem, a proper historical perspective can remind us of how far we’ve come.

This article was originally published in City Journal on January 13, 2026.