
Blog Post | Human Development

The Grim Truth About the “Good Old Days”

Preindustrial life wasn’t simple or serene—it was filthy, violent, and short.

Summary: Rose-tinted nostalgia for the preindustrial era has gone viral—some people claim that modernity itself was a mistake and that “progress” is an illusion. This article addresses seven supposed negative effects of the Industrial Revolution. The conclusion is that history bears little resemblance to the sanitized image of preindustrial times in the popular imagination.


When Ted Kaczynski, the Unabomber, declared in 1995 that “the Industrial Revolution and its consequences have been a disaster for the human race,” he was voicing a sentiment that now circulates widely online.

Rose-tinted nostalgia for the preindustrial era has gone viral, strengthened by anxieties about our own digital era. Some are even claiming that modernity itself was a mistake and that “progress” is an illusion. Medieval peasants led happier and more leisurely lives than we do, according to those who pine for the past. “The internet has become strangely nostalgic for life in the Middle Ages,” journalist Amanda Mull wrote in a piece for The Atlantic. Samuel Matlack, managing editor of The New Atlantis, observed that there is currently an “endless debate around whether the preindustrial past was clearly better than what we have now and we must go back to save humanity, or whether modern technological society is unambiguously a forward leap we must forever extend.”

In the popular imagination, the Industrial Revolution was the birth of many evils, a time when smoke-belching factories disrupted humanity’s erstwhile idyllic existence. Economics professor Vincent Geloso’s informal survey of university students found that they believed “living standards did not increase for the poor; only the rich got richer; the cities were dirty and the poor suffered from ill-health.” Pundit Tucker Carlson has even suggested that feudalism was preferable to modern liberal democracy.

Different groups tend to idealize different aspects of the past. Environmentalists might idealize preindustrial harmony with nature, while social traditionalists romanticize our ancestors’ family lives. People from across the political spectrum share the sense that the Industrial Revolution brought little real improvement for ordinary people.

In 2021, History.com published “7 Negative Effects of the Industrial Revolution,” an article reflecting much of the thinking behind the popular impression that industrialization was a step backward for humanity, rather than a period of tremendous progress. But was industrialization really to blame for each of the ills detailed in the article?

“Horrible Living Conditions for Workers”

Were horrible living conditions a result of industrialization? To be sure, industrial-era living conditions did not meet modern standards—but neither did the living conditions that preceded them.

As historian Kirstin Olsen put it in her book, Daily Life in 18th-Century England, “The rural poor . . . crowded together, often in a single room of little more than 100 square feet, sometimes in a single bed, or sometimes in a simple pile of shavings or straw or matted wool on the floor. In the country, the livestock might be brought indoors at night for additional warmth.” In 18th-century Wales, one observer claimed that in the homes of the common people, “every edifice” was practically a miniature “Noah’s Ark” filled with a great variety of animals. One shudders to think of the barnlike smell that bedchambers took on, in addition to the chorus of barnyard sounds that likely filled every night. Our forebears put up with the stench and noise and cuddled up with their livestock, if only to stave off hypothermia.

Homes were often so poorly constructed that they were unstable. The din of collapsing buildings was such a common sound that in 1688, Randle Holme defined a crash as “a noise proceeding from a breach of a house or wall.” The poet Dr. Samuel Johnson wrote that in 1730s London, “falling houses thunder on your head.” In the 1740s, “props to houses” keeping them from collapsing were listed among the most common obstacles that blocked free passage along London’s walkways.

“Poor Nutrition”

What about poor nutrition? From liberal flower children to the “Make America Healthy Again” crowd, fetishizing the supposedly chemical-free, wholesome diets of yore is bipartisan. The truth, however, is stomach-churning.

Our ancestors not only failed to eat well, but they sometimes didn’t eat at all. Historian William Manchester noted that in preindustrial Europe, famines occurred every four years on average. In the lean years, “cannibalism was not unknown. Strangers and travelers were waylaid and killed to be eaten.” Historian Fernand Braudel recorded a 1662 account from Burgundy, France, that lamented that “famine this year has put an end to over ten thousand families . . . and forced a third of the inhabitants, even in the good towns, to eat wild plants. . . . Some people ate human flesh.” A third of Finland’s population is estimated to have died of starvation during a famine in the 1690s.

Even when food was available, it was often far from appetizing. Our forebears lived in a world where adulterated bread and milk, spoiled meat, and vegetables tainted with human waste were everyday occurrences. London bread was described in a 1771 novel as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.” According to historian Emily Cockayne, the 1757 public health treatise Poison Detected noted that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.”

Water was also far from pristine. “For the most part, filth flowed out windows, down the streets, and into the same streams, rivers, and lakes where the city’s inhabitants drew their water,” according to environmental law professor James Salzman. This ensured that each swig included a copious dose of human excreta and noxious bacteria. Waterborne illnesses were frequent.

“A Stressful, Unsatisfying Lifestyle”

Did stressful lifestyles originate with industrialization? Did our preindustrial ancestors generally enjoy a sense of inner peace? Doubtful. Sadly, many of them suffered from what they called melancholia, roughly analogous to the modern concepts of anxiety and depression.

In 1621, physician Robert Burton described waking in the night from mental stress as a common symptom of melancholia among the upper classes. An observer said the poor similarly “feel their sleep interrupted by the cold, the filth, the screams and infants’ cries, and by a thousand other anxieties.” Richard Napier, a 17th-century physician, recorded over several decades that some 20 percent of his patients suffered from insomnia. Today, by comparison, 12 percent of Americans say they have been diagnosed with chronic insomnia. Stress is nothing new.

Sky-high preindustrial mortality rates caused profound emotional suffering to those in mourning. Losing a child in infancy was once a common—indeed, near-universal—experience among parents, but the loss was no less painful for all its ordinariness. Many surviving testimonies suggest that mothers and fathers felt acute grief with each loss. The 18th-century poem “To an Infant Expiring the Second Day of Its Birth,” by Mehetabel “Hetty” Wright—who lost several of her own children prematurely—heartrendingly urges her infant to look at her one last time before passing away.

So common were child deaths that practically every major poet explored the subject. Robert Burns wrote “On the Birth of a Posthumous Child.” Percy Bysshe Shelley wrote multiple poems to his deceased son. Consider the pain captured by these lines from William Shakespeare’s play King John, spoken by the character Constance upon her son’s death: “Grief fills the room up of my absent child. . . . O Lord! My boy, my Arthur, my fair son! My life, my joy, my food, my all the world!” Shakespeare’s own son died in 1596, around the time the playwright would have finished writing King John.

Only in the modern world has child loss changed from extraordinarily common to exceedingly rare. As stressful as modern life can be, our ancestors faced forms of heartache that most people today will never endure.

“Dangerous Workplaces” and “Child Labor”

Dangerous workplaces and child labor both predate the Industrial Revolution. In agrarian societies, entire families would labor in fields and pastures, including pregnant women and young children. Many preindustrial children entered the workforce at what today would be considered preschool or kindergarten age.

In poorer families, children were sent to work by age 4 or 5. If children failed to find gainful employment by age 8, even social reformers unusually sympathetic to the plight of the poor would express open disgust at such a lack of industriousness. Jonas Hanway was reportedly “revolted by families who sought charity when they had children aged 8 to 14 earning no wages.”

For most, work was backbreaking and unending. A common myth suggests that preindustrial peasants worked fewer days than modern people do. This misconception originated from an early estimate by historian Gregory Clark, who initially proposed that peasants labored only 150 days a year. He later revised this figure to around 300 days—higher than the modern average of 260 working days, even before factoring in today’s paid holidays and vacation time.

Physically harming one’s employees was once widely accepted, too, and authorities stepped in only when the mistreatment was exceptionally severe. In 1666, one such case occurred in Kittery, in what is now Maine, when Nicholas and Judith Weekes caused the death of a servant. Judith confessed that she cut off the servant’s toes with an axe. The couple, however, was not indicted for murder, merely for cruelty.

“Discrimination Against Women”

The preindustrial world was hardly a model of gender equality—discrimination against women was not an invention of the early industrialists but a long-standing feature of many societies.

Domestic violence was widely tolerated. In London, a 1595 law dictated: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” In other words, no beating your wife after 9:00 p.m. That was a noise regulation. A similar law forbade using a hammer after 9:00 p.m. Beating one’s wife until she screamed was an ordinary and acceptable activity.

Domestic violence was celebrated in popular culture, as in the lively folk song “The Cooper of Fife,” a traditional Scottish tune that inspired a country dance and influenced similar English and American ballads. To modern ears, the contrast between its violent lyrics and upbeat melody is unsettling. The song portrays a husband as entirely justified in beating his wife into submission for failing to perform domestic chores to his satisfaction, and it invites the audience to side with the wifebeater and cheer him on.

Sexist laws often empowered men to abuse women. If a woman earned money, her husband could legally claim it at any time. For instance, in 18th-century Britain, a wife could not enter into contracts, make a will without her husband’s approval, or decide on her children’s education or apprenticeships; moreover, in the event of a separation, she automatically lost custody. Mistreatment of women, in other words, long predated industrialization. Arguably, it was the increase in female labor force participation during the Industrial Revolution that ultimately gave women greater economic independence and strengthened their social bargaining power.

“Environmental Harm”

While many of today’s environmental challenges—such as climate change and plastic pollution—differ from those our forebears faced, environmental degradation is not a recent phenomenon. Worrying about environmental impact, however, is rather new. Indeed, as historian Richard Hoffmann has pointed out, “Medieval writers often articulated an adversarial understanding of nature, a belief that it was not only worthless and unpleasant, but actively hostile to . . . humankind.”

Consider deforestation. The Domesday Survey of 1086 found that trees covered 15 percent of England; by 1340, the share had fallen to 6 percent. France’s forests more than halved from about 30 million hectares in Charlemagne’s time (768–814) to 13 million by Philip IV’s reign (1285–1314).

Europe was hardly the only part of the world to abuse its forests. A 16th-century witness observed that at every proclamation demanding more wood for imperial buildings, the peasants of what are today the Hubei and Sichuan provinces in China “wept with despair until they choked,” for there was scarcely any wood left to be found.

Despeciation is also nothing new. Humans have been exterminating wildlife since prehistory. The past 50,000 years saw about 90 genera of large mammals go extinct, amounting to over 70 percent of America’s large species and over 90 percent of Australia’s. 

Exterminations of species occurred throughout the preindustrial era. People first settled in New Zealand in the late 13th century. In only 100 years, humans exterminated 10 species of moa in addition to at least 15 other kinds of native birds, including ducks, geese, pelicans, coots, Haast’s eagle, and an indigenous harrier. Today, few people realize that lions, hyenas, and leopards were once native to Europe, but by the first century, human activity had eliminated them from the continent. The last known aurochs, Europe’s native wild ox, died in Poland in 1627.

Progress Is Real

History bears little resemblance to the sanitized image of preindustrial times in the popular imagination—that is, a beautiful scene of idyllic country villages with pristine air and residents merrily dancing around maypoles. The healthy, peaceful, and prosperous people in this fantasy of pastoral bliss do not realize their contented, leisurely lives will soon be disrupted by the story’s villain: the dark smokestacks of the Industrial Revolution’s “satanic mills.”

A closer look shatters the illusion. The world most of our ancestors faced was in fact more gruesome than modern minds can fathom. From routine spousal and child abuse to famine-induced cannibalism and streets that doubled as open sewers, practically every aspect of existence was horrific.

A popular saying holds that “the past is a foreign country,” and based on recorded accounts, it is not one where you would wish to vacation. If you could visit the preindustrial past, you would likely give the experience a zero-star rating. Indeed, the trip might leave you permanently scarred, both physically and psychologically. You might long to unsee the horrors encountered on your adventure and to forget the shocking, gory details.

The upside is that the visit would help deromanticize the past and show how far humanity has truly come—emphasizing the utter transformation of everyday lives and the reality of progress.

This article was published at Big Think on 11/19/2025.

Blog Post | Housing

The End of the Housing Affordability Crisis

The decline of housing affordability has been a policy choice.

Summary: Americans have enjoyed extraordinary gains in material abundance, yet housing in recent decades stands out as a stubborn exception. Home prices in many parts of the United States have risen faster than incomes, placing growing pressure on renters and first-time buyers. The problem is not an inevitable market failure but the predictable result of supply constraints—especially land-use regulations—that can be reformed to increase affordability.


Americans have seen tremendous advances in the availability and abundance of material goods. As Marian L. Tupy and Gale Pooley from the Cato Institute have shown, the most basic necessity of food became eight times more affordable over the 100 years up to 2019, relative to average wages (the food inflation after 2019 set us back a little bit, but the long-run trends are still quite favorable). This increasing abundance is not limited to food alone, as a wide variety of finished goods have become much more affordable in recent decades.

These positive trends are well known for goods and even some services, such as cosmetic surgeries, but a common objection, both on social media and in real life, is: What about housing? That is a fair question, considering that Americans spend about 25 percent of their pre-tax annual income on housing, which has been a fairly constant share of their income for most of the past 125 years. Given the large share of the budget that housing costs represent, and the failure of housing to decline as a share of the budget as other necessities did, it is worth investigating the problem further.

On housing, the critics do have a point: Housing costs across the US and many other nations have quickly outpaced income growth in recent years. While we shouldn’t be nostalgic for the housing of the 1950s—houses were about half the size of today’s and had fewer amenities we now consider standard, such as air conditioning—nostalgia for the housing of 30 years ago might be justifiable.

Since 1994, two common measures of housing prices, the Case-Shiller Index and the US Department of Housing and Urban Development’s Median Sales Price data, have increased faster than most measures of income, including median family income and average wages. And unlike the change since the 1950s, the recent increase in housing prices can’t be primarily explained by houses getting bigger: The median square footage of new homes sold has increased only 16 percent since 1994 and has even been falling in the past decade.

Worse still, to the extent that housing has become more expensive relative to wages in recent years, the trend could continue over the next 30 years unless we quickly change policy to allow the supply of housing to increase.

It may seem puzzling that housing could remain roughly the same share of income on average in the US, even as housing prices have increased faster than incomes in recent decades. This seeming puzzle can be resolved by thinking about two different kinds of households: renters and homeowners. While renters and homeowners differ in many ways—renters tend to be younger, poorer, and so on—there is a fundamental difference in how they experience increases in the price of housing. Renters are typically subject to new market-rate rents on a regular basis, often annually. Homeowners who remain in the same house, however, are generally insulated from these changes: only insurance and property taxes may rise annually, not the principal and interest on the mortgage.
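A toy calculation makes the divergence concrete. The numbers below are hypothetical, chosen only to illustrate the mechanism: the renter is repriced to market every year, while the homeowner’s principal-and-interest payment never changes.

```python
# Hypothetical illustration: why rising rents squeeze renters while
# fixed-rate mortgages insulate homeowners. All figures are made up.
income = 50_000.0            # starting annual household income
rent = 15_000.0              # starting annual market rent
mortgage_payment = 15_000.0  # fixed annual principal + interest

income_growth = 0.03  # assume incomes grow 3% per year
rent_growth = 0.05    # assume market rents grow 5% per year

for year in (0, 10, 20, 30):
    inc = income * (1 + income_growth) ** year
    rnt = rent * (1 + rent_growth) ** year
    renter_share = rnt / inc              # repriced to market each year
    owner_share = mortgage_payment / inc  # payment never reprices
    print(f"year {year:2d}: renter {renter_share:.0%}, owner {owner_share:.0%}")
```

With these made-up growth rates, the renter’s housing share climbs from 30 percent to over 50 percent across 30 years, while the homeowner’s falls toward 12 percent of a grown income.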

These intuitions are borne out in the data. According to the BLS Consumer Expenditure Survey, in 1984 the share of income that renters spent on housing was about 30.4 percent, which rose over the next four decades to 34.4 percent. Homeowners saw the opposite pattern, with the share of their income spent on housing falling from 27.7 percent in 1984 to 22.6 percent in 2024. The overall average has been fairly stable, but the experience of renters and homeowners has diverged.

The Facts of Housing Unaffordability

Historically, the rule of thumb in the United States has been to spend no more than 30 percent of income on housing—though as we saw above, on average Americans spend less than that. But averages can obscure cost burdens for some households. According to an analysis of the Census Bureau’s American Community Survey data by Harvard’s Joint Center for Housing Studies (JCHS), fully one-third of US households spent over 30 percent of their income on housing in 2024, and 16 percent spent over half. The number of cost-burdened households has been rising steadily in recent years, as the prices of both homes and rentals have increased faster than incomes in most of the US.

We can see the problem of rising home values relative to income by looking at another rule of thumb: Home prices should be in the range of three to five times a household’s annual income. In 1994, out of the United States’ 387 metropolitan statistical areas (MSAs), 263 had median home prices that were less than three times the median household income (the data once again come from Harvard’s JCHS). Only 12 MSAs in 1994—mostly in California and Hawaii—had ratios above 5.0.

Fast-forward to 2024, when there were 114 MSAs above the 5.0 ratio of median home prices to income, and those were scattered all over the country. Instead of being in just California and Hawaii, they were also in previously affordable states such as Montana, Wisconsin, North Carolina, and Arkansas. In 2024, the number of MSAs with price-to-income ratios below 3.0 had dwindled to just 32, many of them in the dying Rust Belt. And you don’t even need to go back to 1994 to see the dramatic change. As late as 2019, there were still well over 100 MSAs with a price-to-income ratio below 3.0.

While the majority (241 MSAs) are still within the suggested range of three to five times a household’s income, many are pushing toward the upper end of that range. Given the trend—the median ratio crept up from 2.65 in 1994 to 4.27 in 2024—it is not unreasonable to expect the ratio to continue to increase, absent any changes in policy.
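That rule of thumb is easy to mechanize. The sketch below classifies a metro area by its price-to-income ratio; the metro figures in the example calls are invented for illustration and are not actual JCHS data.

```python
# Classify a metro area by the price-to-income rule of thumb:
# below 3x income is affordable, 3x-5x is within the suggested
# range, and above 5x is unaffordable.
def affordability(median_price: float, median_income: float) -> str:
    ratio = median_price / median_income
    if ratio < 3.0:
        return "affordable"
    elif ratio <= 5.0:
        return "within range"
    return "unaffordable"

# Hypothetical metro figures, for illustration only:
print(affordability(210_000, 80_000))   # ratio 2.6 -> prints "affordable"
print(affordability(430_000, 100_000))  # ratio 4.3 -> prints "within range"
print(affordability(650_000, 110_000))  # ratio 5.9 -> prints "unaffordable"
```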

The challenge of housing affordability is not unique to the United States. According to the home-price-to-income ratio from the Organisation for Economic Co-operation and Development (OECD), the US has seen home prices increase by 20 percent more than incomes since 1994, meaning that housing is more expensive in real terms. Some other countries are in a much worse situation: Australia, Canada, and the United Kingdom all saw over 80 percent increases in the ratio of housing prices to income. Not every country followed the same pattern, though. In New Zealand, the price-to-income ratio rose by 126 percent between 1994 and 2021, but by 2024 the increase had receded to 80 percent. And Japan’s price-to-income ratio fell by 25 percent from 1994 to 2024. However, even Japan has recently seen a modest uptick, of about 14 percent in the past decade. We’ll look at New Zealand and Japan in more detail below.

The Fix for Housing Affordability

But something can be done. Several political solutions have been proposed, but most focus on the demand side, such as subsidies to homeowners or renters. Those kinds of solutions are suboptimal because they increase demand, which will only further increase prices if supply does not also increase. The real problem is on the supply side: There is not enough new housing being built in the places people want to live and of the size people want. What is preventing additional building? In most of the US, it is land-use restrictions such as zoning and other policies that limit the density of new homes. Australia and countries across Europe implemented similar policies, primarily in the first half of the 20th century. Price increases did not show up immediately, because in most places the restrictions were not binding constraints; there was plenty of land in favorable locations until recent decades.

A major restriction on the supply of housing comes in the form of single-family zoning, which prevents multifamily housing (everything from duplexes to skyscraper apartments) from being built in residential areas. A 2019 analysis by the New York Times found that about 75 percent of residential areas in US cities are reserved for single-family homes. In some cities that figure may reach over 85 percent. Of course, most families probably aspire to eventually own a single-family home, but the zoning laws force most land to be dedicated to this form of housing for everyone. That contributes to making housing unaffordable for many younger families today.

Land-use restrictions limit supply in ways that go beyond merely prescribing that most lots be reserved for single-family homes. For example, regulations will often require lots to be of a minimum size, which is counterproductive because land area is often the most expensive part of the property in urban settings, and the regulation forces families to purchase more land than they want. Regulations also set a maximum amount (a common range is 40–60 percent) of the lot that can be covered by the building itself, essentially forcing homes to have large lawns. Again, many families might want a large lot with a large lawn, but these regulations require it for everyone. The problem is that the less land dedicated to the home itself, the less land there is for other homes in the same area. These rules preclude single-family home types that were common in the past in large American cities, such as row houses or townhouses, which typically occupy most of the small lots they sit on.

Zoning Reforms Work

Would reforming land-use regulations really increase the supply of housing and make it more affordable? The available evidence indeed suggests it would.

One example of reform is New Zealand’s largest city, Auckland, which in 2016 reformed residential zoning to allow for more intensive housing—duplexes, triplexes, townhomes, and the like—on most residential land. This process is referred to as “upzoning.” The results were staggering: As documented in a paper published in the Journal of Urban Economics, construction boomed, with permits doubling in five years. The economists who studied this reform found that rents were 26–33 percent lower than they would have been without it. Rents kept skyrocketing in the rest of New Zealand but stabilized in the parts of Auckland that were upzoned. As mentioned above, New Zealand is notable for seeing its home-price-to-income ratio fall after 2021: As rents stabilized and incomes continued to grow, the ratio declined.

Another example comes from Houston, the fourth-largest city in the US. Houston has long been known as the shining example of a major US city that never adopted citywide zoning, even though some neighborhoods have private deed restrictions that incorporate features similar to zoning. But despite eschewing traditional zoning, Houston still has land-use regulations of various sorts. For example, like most cities, Houston prescribed a minimum lot size of 5,000 square feet. Because people would have been paying for more land than they needed, alternate forms of housing such as townhomes were less likely to be built. First in 1998 and then in 2013, Houston reduced the minimum lot size to just 1,400 square feet in parts of the city. As Mercatus Center economist Emily Hamilton shows, there was a boom in construction following the reforms. Despite adding over 1 million people between 1970 and 2020, Houston still managed to have median home prices below the national average.

If Houston and Auckland demonstrate the power of local reform, Tokyo shows what is possible when a nation treats housing as essential infrastructure rather than a matter set by local competing interest groups. As urban scholar André Sorensen details in The Making of Urban Japan (2002), the country stripped municipalities of the power to block code-compliant projects, effectively turning zoning into a national “right to build” rather than a discretionary local negotiation. The results of this policy choice are astonishing. According to a 2016 analysis by the Financial Times, the city of Tokyo consistently builds more new housing each year than the entire state of California or the whole of England, despite having little empty land to spare. By removing the “veto points” that plague Western cities, Tokyo has achieved the status of a growing, vibrant mega-city where rents have remained flat for decades.

Allowing the Market to Increase Supply Keeps Housing Affordable

As families become richer and the population grows, there is increasing pressure on housing prices in desirable locales. The natural market response to increasing prices is to increase supply. Unfortunately, in much of the US and the rest of the developed world, governments have put artificial barriers in place to prevent this market response. While the housing shortage was created by the political process—through the establishment of zoning and other land-use regulations—the solution does not need to come from governments in the form of subsidizing demand. Instead, to unleash the forces of the market and human initiative, governments need to ease regulations on supply.

Land-use regulations are not the only interference in the market process that makes housing less affordable. Some forms of trade policy and protectionism can also raise home prices. For example, the National Association of Home Builders (NAHB) estimates that recent tariff increases for lumber and other inputs can add at least $10,000 to the average price of a home. Even more costly are building regulations, which the NAHB estimated could exceed $90,000 for a typical home in 2021 and were around 40 percent of the cost of multifamily housing such as apartment buildings. While not all of these regulations could be eliminated immediately, the best thing governments can do to address the affordability issue in housing is to figure out how they can get out of the way.

Mexico News Daily | Wealth & Poverty

Access to Housing, Food and Education Improving in Mexico

“A government study has found that access to education, housing and nutritious food has improved nationwide…

81.4% of the population had access to education in 2024 and the use of basic supplies for studying at home — electricity, television, internet — reached 70.2% of students between 3 and 17 years old. This represented an increase of 33.5 percentage points compared to 2016.

92.1% of Mexico’s population reported access to decent housing without a lack of quality and space, while 85.9% had access to basic services in 2024. On the other hand, access to water within the home fell to 53.4% in 2024, as compared to 54.8% in 2016…

85.6% of the population did not experience a lack of access to nutritious and quality food in 2024, compared to 78.1% in 2016…

The percentage of people without deficiencies in access to health care services decreased from 84.4% in 2016 to 65.8% in 2024.”

From Mexico News Daily.

Blog Post | Wealth & Poverty

Dinner With Dickens Was Slim Pickins

Claims that characters in "A Christmas Carol" were better off than modern Americans are pure humbug.

Summary: There have recently been widespread claims that Dickens’s working poor were better off than modern minimum-wage workers. Such comparisons rely on misleading inflation math and selective reading. The severe material deprivation of Victorian life—crowded housing, scarce possessions, and basic sanitation problems—dwarfs today’s standards. Modern Americans, even at the lower end of the income scale, enjoy far greater material comfort than the Cratchits ever did.


Christmas is often a time for nostalgia. We look back on our own childhood holidays. Songs and traditions from the past dominate the culture.

Nostalgia is not without its purposes. But it can also be misleading. Take those who view the material circumstances of Charles Dickens’s “A Christmas Carol” as superior to our own.

Claims that an American today earning the minimum wage is worse off than the working poor of the 19th century have been popular since at least 2021. A recent post with thousands of likes reads:

Time for your annual reminder that, according to A Christmas Carol, Bob Cratchit makes 15 shillings a week. Adjusted for inflation, that’s $530.27/wk, $27,574/yr, or $13.50/hr. Most Americans on minimum wage earn less than a Dickensian allegory for destitution.

This is humbug.

Consider how harsh living conditions were for a Victorian earning 15 shillings a week.

Dickens writes that Mr. Cratchit lives with his wife and six children in a four-room house. It is rare for modern residents of developed nations to crowd eight people into four rooms.

It was common in the Victorian era. According to Britain’s National Archives, a typical home had no more than four rooms. Worse yet, it lacked running water and a toilet. Entire streets (or more) would share a few toilets and a pump with water that was often polluted.

The Cratchit household has few possessions. Their glassware consists of merely “two tumblers, and a custard-cup without a handle.” For Christmas dinner, Mr. Cratchit wears “threadbare clothes” while his wife is “dressed out but poorly in a twice-turned gown.”

People used to turn clothing inside-out and alter the stitching to extend its lifespan. The practice predated the Victorian era, but continued into it. Eventually, clothes would become “napless, threadbare and tattered,” as the historian Emily Cockayne noted.

The Cratchits didn’t out-earn a modern American earning the minimum wage. Mr. Cratchit’s weekly salary of 15 shillings in 1843, the year “A Christmas Carol” was published, is equivalent to almost £122 in 2025. Converted to U.S. dollars, that’s about $160 a week, for an annual salary of $8,320.

The U.S. federal minimum wage is $7.25 per hour or $15,080 per year for a full-time worker. That’s about half of what the meme claims Mr. Cratchit earned. Only 1% of U.S. workers earned the federal minimum wage or less last year. Most states set a higher minimum wage. The average worker earns considerably more. Clerks like Mr. Cratchit now earn an average annual salary of $49,210.
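The wage comparison above is simple enough to check directly. The sketch below reproduces the arithmetic using the figures cited in this article (Mr. Cratchit’s 15 shillings per week, converted to roughly $160 in 2025 dollars, against the $7.25 federal minimum wage for a 40-hour week):

```python
# Sanity-check the wage comparison: figures come from the article above.

cratchit_weekly_usd = 160            # 15 shillings/week in 1843, in 2025 dollars
cratchit_annual = cratchit_weekly_usd * 52

min_wage_hourly = 7.25               # U.S. federal minimum wage
min_wage_annual = min_wage_hourly * 40 * 52  # full-time, 52 weeks

print(cratchit_annual)               # 8320
print(min_wage_annual)               # 15080.0
print(round(min_wage_annual / cratchit_annual, 2))  # 1.81
```

A full-time federal-minimum-wage worker earns about 1.8 times Mr. Cratchit’s inflation-adjusted salary, the opposite of what the viral post claims.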

Mr. Cratchit couldn’t have purchased much of the modern “basket of goods” used in inflation calculations. Many of the basket’s items weren’t available in 1843. The U.K.’s Office for National Statistics recently added virtual reality headsets to it.

Another way to compare the relative situation of Mr. Cratchit and a minimum-wage worker today is to see how long it would take each of them to earn enough to buy something comparable. A BBC article notes that, according to an 1844 theatrical adaptation of “A Christmas Carol,” it would have taken Mr. Cratchit a week’s wages to purchase the trappings of a Christmas feast: “seven shillings for the goose, five for the pudding, and three for the onions, sage and oranges.” Mr. Cratchit opts for a goose for the family’s Christmas meal. A turkey—then a costlier option—was too expensive.

The American Farm Bureau Federation found that the ingredients for a turkey-centered holiday meal serving 10 people cost $55.18 in 2025. At the federal minimum wage, someone would need to work seven hours and 37 minutes to afford that feast.
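The seven-hours-and-37-minutes figure follows directly from dividing the cost of the meal by the hourly wage, as this short sketch (using the $55.18 and $7.25 figures cited above) confirms:

```python
# Time needed at the federal minimum wage to afford the
# $55.18 holiday meal priced by the American Farm Bureau Federation.

meal_cost = 55.18
min_wage_hourly = 7.25

hours = meal_cost / min_wage_hourly      # ≈ 7.61 hours
whole_hours = int(hours)
minutes = round((hours - whole_hours) * 60)

print(f"{whole_hours} hours and {minutes} minutes")  # 7 hours and 37 minutes
```

Mr. Cratchit, by contrast, needed a full week's wages (15 shillings, at roughly 60 working hours) to cover his far more modest goose dinner.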

A minimum-wage worker could earn more than enough in a single workday to purchase a meal far more lavish than the modest Christmas dinner that cost Mr. Cratchit an entire week’s pay. And the amount of time a person needs to work to afford a holiday meal has fallen dramatically for the average blue-collar worker in recent years despite inflation. Wages have grown faster than food prices.

There has been substantial progress in living conditions since the 1840s. We’re much better off than the Cratchits were. In fact, most people today enjoy far greater material comfort than did even Dickens’s rich miser Ebenezer Scrooge.

This article was originally published in the Wall Street Journal on 12/23/2025.

Bloomberg | Housing

Luxury Apartments Are Bringing Rent Down in Some Big Cities

“Rents got cheaper in several major cities this past year, thanks to an influx of luxury apartment buildings opening their doors and luring tenants to vacate their old homes.

But those looking for bargains will have to be quick, since the available apartments won’t last long, developers say.

The US’s average rental rate fell 0.18 percent in November, the largest monthly drop in more than 15 years, according to real estate research firm CoStar. Driving that decline: lower rents in big cities like Austin, Denver and Phoenix, as well as vacation destinations like Naples, Florida; Asheville, North Carolina; and Myrtle Beach, South Carolina.

New building openings are bringing rents down as wealthy tenants trade up, forcing landlords to drop prices for older apartments. Rents for older units have fallen as much as 11%, and some are now on offer at rates as low as homes that are usually designated as ‘affordable’ and come with restrictions including rent control and rent stabilization.”

From Bloomberg.