
Blog Post | Wealth & Poverty

Dinner With Dickens Was Slim Pickins

Claims that characters in "A Christmas Carol" were better off than modern Americans are pure humbug.

Summary: There have recently been widespread claims that Dickens’s working poor were better off than modern minimum-wage workers. Such comparisons rely on misleading inflation math and selective reading. The severe material deprivation of Victorian life—crowded housing, scarce possessions, and basic sanitation problems—dwarfs today’s standards. Modern Americans, even at the lower end of the income scale, enjoy far greater material comfort than the Cratchits ever did.


Christmas is often a time for nostalgia. We look back on our own childhood holidays. Songs and traditions from the past dominate the culture.

Nostalgia is not without its purposes. But it can also be misleading. Take those who imagine that the material circumstances of Charles Dickens’s “A Christmas Carol” were superior to our own.

Claims that an American today earning the minimum wage is worse off than the working poor of the 19th century have been popular since at least 2021. A recent post with thousands of likes reads:

Time for your annual reminder that, according to A Christmas Carol, Bob Cratchit makes 15 shillings a week. Adjusted for inflation, that’s $530.27/wk, $27,574/yr, or $13.50/hr. Most Americans on minimum wage earn less than a Dickensian allegory for destitution.

This is humbug.

Consider how harsh living conditions were for a Victorian earning 15 shillings a week.

Dickens writes that Mr. Cratchit lives with his wife and six children in a four-room house. It is rare for modern residents of developed nations to crowd eight people into four rooms.

It was common in the Victorian era. According to Britain’s National Archives, a typical home had no more than four rooms. Worse yet, it lacked running water and a toilet. Entire streets (or more) would share a few toilets and a pump with water that was often polluted.

The Cratchit household has few possessions. Their glassware consists of merely “two tumblers, and a custard-cup without a handle.” For Christmas dinner, Mr. Cratchit wears “threadbare clothes” while his wife is “dressed out but poorly in a twice-turned gown.”

People used to turn clothing inside-out and alter the stitching to extend its lifespan. The practice predated the Victorian era, but continued into it. Eventually, clothes would become “napless, threadbare and tattered,” as the historian Emily Cockayne noted.

The Cratchits didn’t out-earn a modern American earning the minimum wage. Mr. Cratchit’s weekly salary of 15 shillings in 1843, the year “A Christmas Carol” was published, is equivalent to almost £122 in 2025. Converted to U.S. dollars, that’s about $160 a week, for an annual salary of $8,320.

The U.S. federal minimum wage is $7.25 per hour or $15,080 per year for a full-time worker. That’s about half of what the meme claims Mr. Cratchit earned. Only 1% of U.S. workers earned the federal minimum wage or less last year. Most states set a higher minimum wage. The average worker earns considerably more. Clerks like Mr. Cratchit now earn an average annual salary of $49,210.
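The wage comparison above boils down to a few lines of arithmetic. The sketch below reproduces it using the article’s own figures (the roughly £122-per-week inflation-adjusted value and the ~$160 dollar conversion are taken from the text, not from any official data source):

```python
# Reproduce the article's wage comparison. The inflation-adjusted figure
# (about 122 pounds/week in 2025 for 15 shillings in 1843) and its dollar
# conversion (~$160/week) come from the article itself; they are
# illustrative, not official statistics.

SHILLINGS_PER_POUND = 20                    # pre-decimal British currency
cratchit_weekly_1843 = 15 / SHILLINGS_PER_POUND   # 0.75 pounds per week

weekly_2025_usd = 160                       # article's dollar conversion
cratchit_annual_usd = weekly_2025_usd * 52  # 8,320 per year

min_wage_hourly = 7.25                      # U.S. federal minimum wage
min_wage_annual = min_wage_hourly * 40 * 52 # 15,080 for full-time work

print(f"Cratchit, annualized: ${cratchit_annual_usd:,}")
print(f"Federal minimum wage, annualized: ${min_wage_annual:,.0f}")
print(f"Ratio: {min_wage_annual / cratchit_annual_usd:.2f}x")
```

Run as written, this confirms the article’s point: a full-time federal-minimum-wage worker earns roughly 1.8 times Mr. Cratchit’s inflation-adjusted salary, the opposite of what the viral post claims.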

Mr. Cratchit couldn’t have purchased much of the modern “basket of goods” used in inflation calculations. Many of the basket’s items weren’t available in 1843. The U.K.’s Office for National Statistics recently added virtual reality headsets to it.

Another way to compare the relative situation of Mr. Cratchit and a minimum-wage worker today is to see how long it would take each of them to earn enough to buy something comparable. A BBC article notes that, according to an 1844 theatrical adaptation of “A Christmas Carol,” it would have taken Mr. Cratchit a week’s wages to purchase the trappings of a Christmas feast: “seven shillings for the goose, five for the pudding, and three for the onions, sage and oranges.” Mr. Cratchit opts for a goose for the family’s Christmas meal. A turkey—then a costlier option—was too expensive.

The American Farm Bureau Federation found that the ingredients for a turkey-centered holiday meal serving 10 people cost $55.18 in 2025. At the federal minimum wage, someone would need to work seven hours and 37 minutes to afford that feast.
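The seven-hours-and-37-minutes figure follows from simple division of the quoted numbers, as this brief sketch shows:

```python
# How long a federal-minimum-wage worker must work to afford the
# $55.18 holiday meal quoted above.
feast_cost = 55.18          # American Farm Bureau Federation, 2025
hourly_wage = 7.25          # U.S. federal minimum wage

hours = feast_cost / hourly_wage            # ~7.61 hours
whole_hours = int(hours)
minutes = round((hours - whole_hours) * 60)
print(f"{whole_hours} hours and {minutes} minutes")  # 7 hours and 37 minutes
```

By contrast, the goose-centered dinner described in the BBC account cost Mr. Cratchit 15 shillings, a full week of work rather than a fraction of one day.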

A minimum-wage worker could earn more than enough in a single workday to purchase a meal far more lavish than the modest Christmas dinner that cost Mr. Cratchit an entire week’s pay. And the amount of time a person needs to work to afford a holiday meal has fallen dramatically for the average blue-collar worker in recent years despite inflation. Wages have grown faster than food prices.

There has been substantial progress in living conditions since the 1840s. We’re much better off than the Cratchits were. In fact, most people today enjoy far greater material comfort than did even Dickens’s rich miser Ebenezer Scrooge.

This article was originally published in the Wall Street Journal on 12/23/2025.

Blog Post | Water & Sanitation

If You Think New York City Life Is Bad Now

A grim tour of preindustrial New York

Summary: Many people today feel that life in New York has become uniquely difficult. Some imagine that the city was cleaner, safer, and more livable in the distant past. Historical reality tells a different story: Preindustrial New York was marked by extreme filth, unsafe water, rampant disease, pervasive poverty, and living conditions that made everyday life harsh and dangerous compared to contemporary times.


Discontent fueled the 2025 New York City mayoral election and Zohran Mamdani’s victory. A common theme echoed across the five boroughs: New York is a hard place to live. “We are overwhelmed by housing costs,” said Santiago, a 69-year-old retiree, outside a Mamdani rally. Those opposed to Mamdani had their own complaints. María Moreno, a first-time voter from the Bronx who supported Andrew Cuomo, lamented, “Now everything’s dirty, and our neighborhood does not feel safe.”

Today’s voters have legitimate grievances. The city’s housing costs, quality-of-life issues, and perceptions of disorder weigh heavily on residents’ minds. But it’s important to keep things in perspective. Different voters may romanticize different eras, but many seem to share a sense that if they could travel back far enough in time, they’d find a New York that was once clean, safe, and affordable. When Americans were polled in 2023, almost 20 percent said that it was easier to “have a thriving and fulfilling life” hundreds of years ago. Across the country, as one writer put it, people are engaged in an “endless debate around whether the preindustrial past was clearly better than what we have now.” In fact, Mamdani’s politics are grounded in an ideology that first arose from the frustrations of the early industrial era.

If Americans could go back in time to preindustrial New York City, however, they’d likely be horrified and possibly traumatized. Despite today’s real challenges, most New Yorkers would not trade places with their predecessors.

Long before the rise of factories and industry, New York City was a bustling port, founded by the Dutch in the early seventeenth century as New Amsterdam, a fur-trading post. As early as 1650, local authorities enacted an ordinance against animals roaming the streets to protect local infrastructure, but to no avail. Then, in 1657, according to the Dutch scholar Jaap Harskamp:

New Amsterdam’s council attempted to ban the common practice of throwing rubbish, ashes, oyster-shells or dead animals in the street and leave the filth there to be consumed by droves of pigs on the loose. When the English took over the colony from the Dutch, pigs and goats stayed put. . . . Pollution persisted. The streets of Manhattan were a stinking mass. Inhabitants hurled carcasses and the contents of loaded chamber pots into the street and rivers. Runoff from tanneries where skins were turned into leather flowed into the waters that supplied the shallow wells. The (salty) natural springs and ponds in the region became contaminated with animal and human waste. For some considerable time, access to clean water remained an urgent problem for the city. . . . The penetrating smell of decomposing flesh was everywhere.

Into the early twentieth century, urban living in the United States felt surprisingly rural and agrarian, with an omnipresent reek to match. As late as the mid-nineteenth century, pigs roamed freely through New York City streets, acting as scavengers, and nearly every household maintained a vegetable garden, often fertilized with animal manure.

Indoor air quality was no better. A drawing from Mary L. Booth’s History of the City of New York depicts a seventeenth-century New Amsterdam home with smoke from the fireplace swirling through the room. Indoor air pollution remains a serious problem today in the poorest parts of the world, as smoke from hearths can cause cancer and acute respiratory infections that often prove deadly in children. One preindustrial writer railed against the “pernicious smoke [from fireplaces] superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.”

Before industrialization, though inescapable filth coated the interiors of homes, the average person owned few possessions for the corrosive hearth smoke and soot to ruin. By modern standards, New Yorkers—like most preindustrial people—were impoverished and lacked even the most basic amenities. According to historian Judith Flanders, in the mid-eighteenth century, “fewer than two households in ten in some counties of New York possessed a fork.” Many were desperately poor even by the standards of the day and could not afford housing. One 1788 account lamented how in New York City, “vagrants multiply on our Hands to an amazing Degree.” Charity records suggest that the “outdoor poor” far outnumbered those in almshouses.

Water quality was infamously awful. In seventeenth-century New Amsterdam, as Benjamin Bullivant observed, “[There are] many publique wells enclosed & Covered in ye Streetes . . . [which are] Nasty & unregarded.” A century later, New York’s water remained as foul as Bullivant had described. Visiting in 1748, the Swedish botanist Peter Kalm noted that the city’s well water was so filthy that horses from out of town refused to drink it. In 1798, the Commercial Advertiser condemned Manhattan’s main well as “a shocking hole, where all impure things center together and engender the worst of unwholesome productions; foul with excrement, frogspawn, and reptiles, that delicate pump system is supplied. The water has grown worse manifestly within a few years. It is time to look out [for] some other supply, and discontinue the use of a water growing less and less wholesome every day. . . . It is so bad . . . as to be very sickly and nauseating; and the larger the city grows the worse this evil will be.”

In 1831, a letter in the New York Evening Journal described the state of the water supply:

I have no doubt that one cause of the numerous stomach affections so common in this city is the impure, I may say poisonous nature of the pernicious Manhattan water which thousands of us daily and constantly use. It is true the unpalatableness of this abominable fluid prevents almost every person from using it as a beverage at the table, but you will know that all the cooking of a very large portion of the community is done through the agency of this common nuisance. Our tea and coffee are made of it, our bread is mixed with it, and our meat and vegetables are boiled in it. Our linen happily escapes the contamination of its touch, “for no two things hold more antipathy” than soap and this vile water.

In 1832, New York experienced a devastating outbreak of cholera, a bacterial disease that typically spread through contaminated water and killed with remarkable speed. A person could wake up feeling well and be dead by nightfall, struck down with agonizing cramps, vomiting, and diarrhea. The epidemic killed about 3,500 New Yorkers.

The initial actions taken to protect city water supplies were often private in nature. In fact, throughout the eighteenth and early nineteenth centuries, private businesses generally supplied urban water infrastructure. Despite such efforts, drinking water remained generally unsafe, even after industrialization, until the chlorination of urban water supplies became widespread.

The pervasive grime took a visible toll on New Yorkers. Between drinking tainted water, eating contaminated food, inhaling smoke-filled air, and living with poor hygiene, the average resident sported visibly rotten teeth. One letter from 1781 described an acquaintance: “Her teeth are beginning to decay, which is the case with most New York girls, after eighteen.”

The dental practices of the time were often as horrifying as the effects of neglect. The medieval method of using arsenic to kill gum tissue, providing pain relief by destroying nerve endings, remained common until the introduction of Novocain in the twentieth century. As late as 1879, the New York Times ran a story with the headline “Fatal Poison in a Tooth; What Caused the Horrible Death of Mr. Gardiner. A Man’s Head Nearly Severed from His Body by Decay Caused by Arsenic Which Had Been Placed in One of His Teeth to Deaden an Aching Nerve—an Extraordinary Case.” The story detailed the gruesome demise of a man in Brooklyn, George Arthur Gardiner, who died “in great agony, after two weeks of indescribable suffering.”

Preindustrial New York City wasn’t uniquely miserable for its time. Life was harsh everywhere, and cities around the world contended with the same foul smells, filth, poor sanitation, and grinding poverty. Rural villages were no better. Peasant families often brought their livestock indoors at night and slept huddled together for warmth. In many cases, rural peasants were even poorer than their urban counterparts and owned fewer possessions. Farm laborers frequently suffered injuries and aged prematurely from backbreaking work, while fertilizing cesspits spread disease and filled the air with an inescapable stench.

Though early New Yorkers may have been slightly better off than their rural counterparts, their struggles are worth remembering. However daunting the problems of today may seem, a proper historical perspective can remind us of how far we’ve come.

This article was originally published in City Journal on 1/13/2026.

Blog Post | Poverty Rates

Modern Freedom Beats Feudal Serfdom

Make the Middle Ages Great Again?

Summary: Some influential voices today romanticize feudalism, but the reality of feudalism was misery for nearly everyone. Life under that system meant hunger, disease, violence, and lives cut brutally short. By contrast, modern societies have lifted billions out of poverty and extended life far beyond what kings and queens once knew. Progress comes from freedom, innovation, and hard work, not a return to the rule of lords and monarchs.


On a recent podcast, Tucker Carlson praised feudalism as “so much better than what we have now” because a ruler is “vested in the prosperity of the people he rules.” This romantic view of medieval hierarchy ignores a brutal reality: For most people, feudalism meant grinding poverty, disease, and early death.

As Gale L. Pooley and I found in our 2022 book Superabundance, society in preindustrial Europe was bifurcated between a small minority of the very rich and the vast majority of the very poor. One 17th-century observer estimated that the French population consisted of “10 percent rich, 50 percent very poor, 30 percent who were nearly beggars, and 10 percent who were actually beggars.” In 16th-century Spain, the Italian historian Francesco Guicciardini wrote, “except for a few Grandees of the Kingdom who live with great sumptuousness … others live in great poverty.”

An account from 18th-century Naples recorded beggars finding “nocturnal asylum in a few caves, stables or ruined houses” where “they are to be seen there lying like filthy animals, with no distinction of age or sex.” Children fared the worst. Paris, according to the French author Louis-Sébastien Mercier, had “7,000 to 8,000 abandoned children out of some 30,000 births around 1780.” These children were then taken—three at a time—to the poor house, with carriers often finding at least “one of them dead” upon arrival.

People were constantly hungry, and starvation was only ever a few bad harvests away. In 1800, even France, one of the world’s richest countries, had an average food supply of only 1,846 calories per person per day. Given that the average person needs about 2,000 calories a day, much of the population was undernourished. That, in the words of the Italian historian Carlo Cipolla, gave rise to “serious forms of avitaminosis,” or medical conditions resulting from vitamin deficiencies. He also noted the prevalence of intestinal worms, “a slow, disgusting, and debilitating disease that caused a vast amount of human misery and ill health.”

Sanitation was a nightmare. As the English historian Lawrence Stone wrote in his book The Family, Sex and Marriage in England 1500–1800, “city ditches, now often filled with stagnant water, were commonly used as latrines; butchers killed animals in their shops and threw the offal of the carcasses into the streets; dead animals were left to decay and fester where they lay.” London had “poor holes” or “large, deep, open pits in which were laid the bodies of the poor, side by side, row by row.” The stench was overwhelming, for “great quantities of human excrement were cast into the streets.”

The French historian Fernand Braudel found that in 15th-century England, “80 percent of private expenditure was on food, with 20 percent spent on bread alone.” An account of 16th-century life in rural Lombardy noted that peasants lived on wheat alone: Their “expenses for clothing and other needs are practically non-existent.” Per Cipolla, “One of the main preoccupations of hospital administration was to ensure that the clothes of the deceased should not be usurped but should be given to lawful inheritors. During epidemics of plague, the town authorities had to struggle to confiscate the clothes of the dead and to burn them: people waited for others to die so as to take over their clothes.”

Prior to mechanized agriculture, there were no food surpluses to sustain idle hands, not even those of children. And working conditions were brutal. A 16th-century ordinance in Lombardy found that supervisors in rice fields “bring together a large number of children and adolescents, against whom they practice barbarous cruelties … [They] do not provide these poor creatures with the necessary food and make them labor as slaves by beating them and treating them more harshly than galley slaves, so that many of the children die miserably in the farms and neighboring fields.”

Such violence pervaded daily life. Medieval homicide rates reached 150 murders per 100,000 people in 14th-century Florence. In 15th-century England, it hovered around 24 per 100,000. (In 2020, the Italian homicide rate was 0.48 per 100,000. It was 0.95 per 100,000 in England and Wales in 2024.) People resolved their disputes through physical violence because no effective legal system existed. The serfs—serfdom in Russia was abolished only in 1861—lived as property, bound to land they could never own, subject to masters who viewed them as assets rather than humans. And between 1500 and the first quarter of the 17th century, Europe’s great powers were at war nearly 100 percent of the time.

Carlson’s nostalgia for feudalism is not unique on the MAGA right. The influential American blogger Curtis Yarvin, for example, attributes to monarchs such as France’s Louis XIV decisive and long-term leadership that modern democracies apparently lack. But less frequently mentioned is how, for example, that same Louis ruined his country during the War of the Spanish Succession. As Winston Churchill wrote in Marlborough: His Life and Times,

After more than sixty years of his reign, more than thirty years of which had been consumed in European war, the Great King saw his people face to face with actual famine. Their sufferings were extreme. In Paris the death-rate doubled. Even before Christmas the market-women had marched to Versailles to proclaim their misery. In the countryside the peasantry subsisted on herbs or roots or flocked in despair into the famishing towns. Brigandage was widespread. Bands of starving men, women, and children roamed about in desperation. Châteaux and convents were attacked; the market-place of Amiens was pillaged; credit failed. From every province and from every class rose the cry for bread and peace.

The Great Enrichment of the past 200 years or so, a phrase coined by my Cato Institute colleague Deirdre McCloskey, lifted billions from the misery that defined human existence for millennia. It was driven by market economies and limits on rulers’ arbitrary power, not feudal hierarchy.

There are many plausible reasons for Carlson’s (and Yarvin’s) openness to giving pre-modern institutions such as feudalism and absolute monarchy a second look. One is a lack of appreciation for the reality of the daily existence of ordinary people whose lives, in the immortal words of the English philosopher Thomas Hobbes, were “poor, nasty, brutish, and short.”

Another is their apparent conviction that the United States is, in the words of President Donald Trump, “a failed nation.” Except that we are nothing of the sort. The United States has plenty of problems, but the lives of ordinary Americans in 2025 are incomparably better than those of the kings and queens of the past. Our standard of living is, in fact, the envy of the world, which is the most parsimonious explanation for millions of people trying to get here.

Solving the problems that remain and will arise in the future will depend on careful evaluation of evidence, historical experience, reason, and hard work. Catastrophism does not help, for it rejects human agency by declaring that the future is already decided. Hunkering down under a protective shield of feudal hierarchy or placing our trust in a modern incarnation of Louis XIV is no guarantee of success. We tried it before, and the results were disastrous.

This article originally appeared in The Dispatch on August 26, 2025.

JMP | Water & Sanitation

A Quarter of the World’s Population Gained Safe Water Since 2000

“Between 2000 and 2024, the global population increased from 6.2 billion to 8.2 billion. Over this period, a quarter of the world’s population (2.2 billion) gained access to safely managed drinking water, and a third (2.8 billion) gained safely managed sanitation. But while billions have gained access to WASH services, progress has been uneven and the total number of people still lacking access has decreased more slowly.

Since the start of the SDG period in 2015, 961 million have gained safely managed drinking water and the number of people still lacking access has decreased by 270 million. Among the 2.1 billion people still lacking access to safely managed drinking water in 2024, two thirds (1.4 billion) had a basic service, 287 million used limited services, 302 million used unimproved sources and 106 million still used surface water (61 million fewer people used surface water than in 2015).

Between 2015 and 2024, 1.2 billion people gained safely managed sanitation, and the number of people without decreased from 3.9 billion to 3.4 billion. In 2024, more than half of these people (1.9 billion) had a basic service, 560 million used limited services, 555 million with unimproved services and 354 million still practised open defecation (the number of people practising open defecation has decreased by 429 million since 2015).

Since 2015, 1.5 billion people have gained access to basic hygiene services and the number of people who are still unserved has fallen by nearly 900 million (from 2.5 billion to 1.7 billion). Among the 1.7 billion people who still lacked basic hygiene services in 2024, two thirds (1 billion) had a limited service and 611 million still had no handwashing facility.”

From JMP.

DD News | Water & Sanitation

Tap Water Coverage Crosses 81 Percent in Rural India

“More than 15.68 crore rural households – 81% of the total 19.36 crore – now have tap water connections under the government’s flagship Jal Jeevan Mission (JJM), Minister of State for Jal Shakti V. Somanna informed the Rajya Sabha on Monday.

At the time of announcement of JJM, 3.23 crore (17%) rural households were reported to have tap water connections.”

From DD News.