
Blog Post | Population Demographics

Grim Old Days: Pat Thane’s History of Old Age

"He who has made himself dependent on his children for bread and suffers from want, he shall be knocked dead by this club."

Summary: Pat Thane’s book explores the harsh realities faced by the elderly in pre-industrial societies, including early aging, high mortality rates, and widespread elder abuse. The book reveals that old age, often accompanied by physical disability and poverty, was generally marked by isolation, familial neglect, and societal contempt. Thane’s volume challenges the romanticized notion that the elderly were once universally respected, showing that industrialization brought both longer lifespans and improved intergenerational relationships.


A History of Old Age, edited by the British historian Pat Thane, features contributions from several scholars exploring old age in different eras, from antiquity to the recent past. The volume reveals that in the pre-industrial era, premature aging, early death, and elder abuse were far more common than today.

In the 17th century, “due to inadequate diet and poor living standards . . . poor women [were considered] to have entered old age around age 50.” “Mother” became an honorary title for women over 50, such as the famously ugly “Mother Shipton” of Yorkshire, who was born toward the end of the 15th century and who, like many old women of the era, was “reputed to be a witch.” In 1754, one author noted that “the peasants in France . . . begin to decline before they are forty.” For ordinary people, the injuries of old age reflected a lifetime of painful toil. There was a “high probability of some physical disability stemming from earlier, work-related injuries.” For example, female lacemakers “suffered debilitating blindness and stiff fingers.” “The ‘Dowager’s Hump’ of osteoporosis was the stereotypical hallmark of the elderly women in the 17th century, as were the broken hips and arms of the aged male.” Given the harsh toll that preindustrial life took on the body and the prevalence of early aging, it is not surprising that fewer people survived to old age.

While a preindustrial adult had a much better shot at reaching old age than did a preindustrial child (owing to the latter group’s horrifically high rate of early death), it was still a long shot relative to today. The 16th-century French philosopher Michel de Montaigne observed, “To die of age, is a rare, singular, and extraordinary death, and so much lesse naturall [sic] than others.” In the preindustrial era, “the elderly generally constituted not more than 8 per cent of the population, and in some regions and periods it was not more than 5.” (Although after outbreaks of bubonic plague, which disproportionately killed off the young, the elderly share of the population temporarily increased). With industrialization, the relative rarity of older adults began to change; “in England and the Low Countries, the numbers of elderly began to increase earlier” than elsewhere.

Even among royalty, living into old age was once relatively rare. “Of all the kings of Europe from the 11th to the beginning of the 15th century, the longest living king was Alfonso VI, king of Castile and León (1030–1109), who reached the age of 79. Of all his predecessors and successors only two made it to their 60s. Only three of the kings of Aragon reached their 60s, and only four of the German emperors. Three of the kings of England reached their 60s, but only one of the Capetian kings of France—Louis VII (1120–80). All other kings, in all European countries, died younger.” That bears repeating: For a king to live past 70 was extraordinary, and most kings did not live to see age 60. Among common peasants, typical lifespans were, of course, shorter still.

In antiquity, old age was also relatively rare. There were, of course, exceptions, such as the famed philosopher Diogenes the Cynic, who lived to be 96, and the Stoic philosopher Chrysippus, who is said to have died around age 80, but such longevity was unusual. In the classical past, most of the population was young. “For example, around 6–8 per cent of the population of the Roman empire in the 1st century AD was over the age of 60.” This had many repercussions, including that fewer people ever knew their grandparents. “By the age of ten years, the average ancient individual had only a one-in-two chance of having any of his grandparents still alive. Fewer than one in a hundred Greeks or Romans of the age of 20 would have had a surviving paternal grandfather.”

Close, long-term relationships between grandchildren and their grandparents were thus relatively rare. “Most adult Greeks or Romans would have had only shadowy memories of their grandparents.” In fact, it was not until industrialization began in parts of Europe in the latter half of the 18th century that close grandparent-grandchild relationships such as those that are typical today started to become more common, as “longer lives meant greater opportunity to play the roles associated with the aged.” The archetypes associated with grandparents are newer than many realize, although they do slightly predate industrialization. “Only at the end of the [17th] century does the social, ‘spoil-the-child’, modern-looking ‘grandparent’ appear.” In other words, “the modern social role of the grandparent was just beginning to develop at the end of the century.” One might imagine that doting grandparents have existed since time immemorial, and some likely did, but high rates of early death and widespread material poverty deprived most ordinary people of the experience prior to the wealth generated by the Industrial Revolution. “A new representation of grandparents can be recognized in French culture in the late 18th century, preparing the way for the great stereotype of 19th-century grandparents spoiling their children’s offspring.” That was a consequence of more grandparents living long enough to form deep bonds with their grandchildren, and greater prosperity enabling the former to lavish gifts on the latter, as wealth and longevity spread: “Old age, traditionally viewed as a period of social isolation, was being experienced by greater numbers.”

Poor people continued working as long as possible—no matter how long they lived. “Bridget Holmes [1591–1691] was a servant in the Stuart royal household who was still working hard at the age of 96.” Beety or Betty Dick, the town crier of Dalkeith in Scotland, continued to work until her death at age 85 in 1778, wandering the town beating a wooden plate with a spoon to draw public attention and make local announcements. This lengthy working life took a heavy toll. “The lifestyle of the poor was physically and mentally demanding even for those in the pink of health” and could be devastating in old age. Nonetheless, working until one’s dying day or the arrival of debilitating infirmity was a common fate among poor people, who once comprised the greater share of humanity.

The idea of a leisurely retirement being within ordinary people’s reach is a modern concept. For most of history, ordinary laborers worked until they became bedridden or died, owing to the extreme poverty of the preindustrial world. “Most of them were unable to save enough for their old age during their working years. They could thus not afford to retire and were obliged to continue working as long as they could.” Old age and poverty were practically synonyms. “As women generally worked in more poorly paid occupations than men, they were even more exposed to dire want in their old age.” By the 17th century, “at a certain stage in his life the peasant handed over his farm to one of his offspring [and] moved from the main room to a back room, or to the attic, or to a spare cottage.” After the handover, he would still assist with farmwork to the extent of his abilities. For women, living with family in old age was less common, at least partly because women who avoided childbirth had better chances of surviving to old age than women who had children.

A common narrative maintains that in the past, the elderly received far better treatment, enjoying greater respect and more familial support than today. “Insofar as old age is thought to have a history, it is presented as a story of decline . . . [in the past, the elderly] were valued, respected, cherished and supported by their families as, it is said, they are not today.” Nowadays, in contrast, the narrative holds that disrespect and loneliness are more likely to characterize the last years of life than in ages past. Yet in reality, “none of [the evidence] suggests that comfortable succour in the household of one’s children was the expected lot of older people in pre-industrial . . . Europe.” The evidence suggests quite the opposite, in fact.

Contrary to popular belief, preindustrial people were far less likely to have any surviving children or grandchildren to care for them in old age than modern people. That is partly because even though birth rates were higher in the past, children died with such horrifying frequency that they often predeceased their parents. “Given the higher rate of death at all ages before the later 20th century, older people could not be sure that their children would outlive them. In the 18th century just one-third of Europeans had a surviving child when they reached their 60th birthday.” Hence, the majority of those who lived to old age had no surviving children. In the modern world, in contrast, that is only the case for a minority. For example, US Census Bureau data suggests that among adults age 55 and older, over 83 percent have living adult children. Despite “today’s pessimistic narrative of old age [that] stresses the increasing loneliness of older people in the modern world,” loneliness was more pervasive in the preindustrial past.

What became of the childless majority of elderly people in the preindustrial era? “If they had no surviving children, they entered hospitals and institutions for the poor, which, throughout pre-industrial Europe and early America, were filled with older people without surviving relatives. Or they died alone.” Conditions in the hospitals were famously unsanitary and overcrowded. “There, sharing a bed with whoever else needed one, the destitute elderly lived out their final years.” Despite the poor conditions, demand for a hospital bed far exceeded the supply. “Seventeenth-century Brunswick had only 23 beds for every 1,000 inhabitants, Rheims had 24.94 for every 1,000; and in Marne, they were particularly scarce, with just 2.77 beds per 1,000. Furthermore, the elderly were only one of many eligible groups vying for accommodation. . . . It has been suggested that 74 per cent of all applications were denied.”

Some were even less fortunate: Older people without relatives also often faced harassment and even accusations of witchcraft. While old men also suffered through such allegations, old women were particularly likely to be targeted. That is at least in part because, then as now, women often outlived men, so there were more elderly women around. (Although in some times and places, men outlived women, such as Quattrocento Venice). “A physician in 17th-century south Germany explained why old women were so often accused of witchcraft: ‘They are so unfairly despised and rejected by everyone, enjoy no-one’s protection, much less their affection and loyalty . . . so that it is no wonder that, on account of poverty and need, misery and faint-heartedness, they often . . . give themselves to the devil and practice witchcraft.’ A 70-year-old woman said at her trial, ‘The children call all old people witches.’”

In other words, many communities violently scapegoated the aging. Any local misfortune, from illness to a house fire, could be blamed on supposed witches, usually impoverished older women without surviving children. Superstitions related to menopause did not help matters. “It was said that a menopausal woman could cause grass to dry up, fruit to wither on the vine, and trees to die. Dogs would become rabid and mirrors crack by her mere presence. Such women, without even trying, could cast the evil eye. With malice and aforethought, the glance of the post-menopausal woman could kill.” In reality, it was the aging women themselves who were killed by such delusions. From the 14th century through the 17th century, between 200,000 and 500,000 alleged witches—over 85 percent of them female and mostly middle-aged or elderly—were executed. Public shaming, harsh interrogations, and torture often preceded witch burnings.

Such violence was enabled by attitudes toward the elderly that were often grotesquely negative. “Literary depictions of old men in epics and romances [show] the old man is an object of contempt.” In the 17th century, “the Italian theatrical genre of Commedia dell’Arte reflected the Europe-wide characterization of old men as objects of mockery and disdain,” featuring a prominent stock character called Pantaloon, who was meant to represent a decrepit and ridiculous old man. “The 17th-century stage, elite literature and the sayings of peasants belittled and mocked the old in ways that few groups are targeted today.”

Old women often fared even worse in the public imagination. “Generally old women were feared or held in contempt.” To give an example, in the allegorical text Le Pèlerinage de la vie humaine, “The Pilgrimage of Human Life,” written in the 14th century by the monk Guillaume de Deguileville, the virtues are all personified by beautiful young women, while ugly old women represent the vices. Even in the 17th century, women “were thought to grow increasingly evil and dangerous as menopause set in.” A literary genre popular from the 13th century onward known as “sermones ad status”—sermons divided according to their audience (i.e., sermons to the nobility, to merchants, and so forth)—reveals how the people of the past viewed different groups. In this classification scheme, “the elderly, like women and children, were represented as a single marginal group irrespective of social stratum, rank, profession or lifestyle. In some texts they were classed with invalids, foreigners or the very poor, the emphasis being on their . . . social inferiority.”

Public ridicule of the elderly was also commonplace and considered an ordinary pastime for children. A description of each decade of life “popular in Germany in the 16th century and probably familiar still in the 17th” describes a man of 90 years as “the scorn of children.” A Viennese woodcut from 1579 depicts a nonagenarian man derided by a young child.

The minority of old people who did have surviving children were not necessarily much better off, as treatment of the elderly was often appalling, even by close family members. One “popular . . . tale, already old in medieval Europe, told of a man who, tired of caring for his old father, starts to carve a trough from which the old man is to eat, instead of sitting at the family table, or, in another version, starts to exchange his father’s bedding for a piece of sacking.” Similar stories abounded that depict cruelty toward the elderly. “In another, bleaker version, the old man is gradually demoted from a place of honour at the head of the table to a bench behind the door, where he dies in misery.” In some areas, this power imbalance was reversed. “In late 17th century Upper Provence, for example, until the death of his father, the heir was ‘completely subservient to his father economically, socially, and legally, just as though he were still a child.’ He could not, without his father’s permission, buy, sell, trade, make a will or complete any legal contract. Trouble arose repeatedly as a consequence.” In most areas, however, elder abuse was likely more frequent than aging parents legally tyrannizing their adult children.

Of course, individuals varied, and many adult children dutifully supported their aging parents and maintained positive relationships with them. But economic stress made it hard even for willing adult children to support their parents, as “the younger generation was typically poor themselves and overburdened by children, leaving little food or money to spare for an aged parent. Barbara Ziegler, from Bächlingen in southwestern Germany, described what the 1620s had been like for her: ‘I stayed with my son for four years, but the food was bad and [he] supported me only with great effort.’” Far from the romantic notion that the past offered greater familial support to older adults, the prevailing attitude toward any older person relying on their adult children was often one of bitterness and disgust.

This was true even in antiquity, despite the “common myth about the classical past . . . that older individuals enjoyed something of a golden age when they were treated with great respect.” The reality was that attitudes toward the elderly were often cruel. Classical literature often depicted the old as “witches or alcoholics.” In Greek and Roman mythology, the personification of old age, Geras or Senectus, is said to be “the offspring of Night, and has siblings Doom, Fate, Death, Sleep, Blame, Nemesis and Deceit, among others.” The philosopher Juncus noted that even to his friends and family, an aging man is nothing but “an oppressive, painful, grievous and decrepit spectacle: in short, an Iliad of woes.” The Greek satirist Lucian, in his work On Grief, points out, albeit jokingly, that one benefit of a man’s untimely demise is that “by dying young, he will not be scorned in old age.” In fact, “it was a common proverb that old age and poverty are both burdensome—in combination they are impossible to bear.” Even when adult children took care of their parents, it was often with great resentment. In the playwright Aristophanes’ comedy Wasps, a son is depicted supporting his father but without any hint of the filial respect often imagined to characterize the past. The son character says with disgust, “I’ll support him, providing everything that’s suitable for an old man: gruel to lick up, a soft thick cloak, a goatskin mantle, a whore to massage his . . . loins.” At the beginning of Plato’s Republic, the elderly Cephalus says this of “old men”: “Most of them are full of woes [and] grumble that their families show no respect for their age.” The old were often despised as “marginal members of society.”

Even in the later 18th century, “the town gates of some cities in Brandenburg hung large clubs with this inscription: ‘He who has made himself dependent on his children for bread and suffers from want, he shall be knocked dead by this club.’”

These facts and more can be found in this fascinating book.

Blog Post | Population Growth

More People Mean More Genius Innovators

With 18 times more people you get 18 times more geniuses. You also get enormous markets that can accommodate complex and expensive fixed-cost products.

Summary: With a global population now 18 times larger than during Leonardo da Vinci’s era, the potential number of geniuses has multiplied. Large populations also enable the creation and affordability of complex, high-cost innovations like the iPhone. That underscores the vital role of economic systems that empower creativity and innovation.


Leonardo da Vinci (1452–1519) is considered by many to be the most brilliant person in history. He lived at a time when the global population was around 450 million. Knowing that IQ is distributed on a normal bell-shaped curve with an average of 100 and a standard deviation of 15, we can estimate that da Vinci, as one in 450 million, had an IQ of around 188.
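
Readers can check that estimate themselves. Below is a minimal Python sketch (the function name is my own; it assumes, as the paragraph does, that IQ is normal with mean 100 and standard deviation 15, and treats “smartest person alive” as a one-in-N tail event):

```python
# Estimate the IQ corresponding to a one-in-n rarity, assuming IQ follows a
# normal distribution with mean 100 and standard deviation 15.
from scipy.stats import norm

def iq_for_rarity(n: float, mean: float = 100.0, sd: float = 15.0) -> float:
    """Return the IQ exceeded by only 1 person in n under the normal model."""
    z = norm.isf(1.0 / n)  # inverse survival function: the upper-tail z-score
    return mean + sd * z

print(round(iq_for_rarity(450e6)))  # ~188, in da Vinci's world of 450 million
print(round(iq_for_rarity(8e9)))    # ~195, in today's world of 8 billion
```

Under the same assumptions, the single smartest person in today’s population of 8 billion would be expected to score around 195.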


With a population of eight billion on the planet, today we have 18 times more people than in da Vinci’s day. That means we should also have 18 times more geniuses. I nominate Elon Musk as one of those. So where are the other 17? Perhaps they aren’t as lucky as Elon, who was able to exit South Africa and make his way to Silicon Valley. (Note: the coauthor of Superabundance left Czechoslovakia and South Africa, landed in America, and was then discovered by me on Twitter. Thank heavens.)

Steve Jobs’s biological father was Syrian. Imagine our world if Steve Jobs had grown up in Damascus instead of Silicon Valley. Or even better, ask yourself: How many Steve Jobses are in places like Syria today? Now imagine our world if everyone had the freedom and opportunity to achieve their potential and create value like Steve Jobs and Elon Musk.

More people also provide enormous markets that can accommodate complex and expensive fixed-cost products. Apple has spent an estimated $100 billion to create the iPhone. We can own one for $799, or about 21 hours’ work for a typical blue-collar worker earning $38 an hour in wages and benefits.

How can iPhones be sold so cheap if they cost so much? Because there are billions of us. $100 billion spread over 8 billion people is $12.50 per person. The parts in the product cost Apple around $416. After paying other expenses, Apple’s profit margin is around 23 percent. We get to enjoy a $100 billion product and Apple makes $184. Behold the fruits of capitalism.
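
Putting those estimates together, the arithmetic is easy to verify (a sketch; all figures are the estimates quoted above, not official Apple numbers):

```python
# Back-of-the-envelope iPhone economics, using the article's estimated figures.
rd_cost    = 100e9  # estimated cost to create the iPhone
population = 8e9    # people on the planet
price      = 799    # retail price of an iPhone
wage       = 38     # blue-collar hourly wages and benefits
margin     = 0.23   # estimated profit margin

print(f"R&D per person: ${rd_cost / population:.2f}")  # $12.50
print(f"Hours of work:  {price / wage:.1f}")           # ~21 hours
print(f"Apple's profit: ${price * margin:.0f}")        # ~$184 per phone
```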

There is a reason the iPhone was created in the United States and not Venezuela or Cuba. As Professor Don Boudreaux notes, “Those who today call for socialism to replace capitalism are ignorant not only of socialism’s well-documented history of failure and tyranny, but also of the enormous benefits that capitalism inspires in creative entrepreneurs to deliver daily, and with disproportionate generosity, to the masses.”

We might believe Rep. Alexandria Ocasio-Cortez and Sen. Bernie Sanders are serious about socialism when they give up their iPhones.

Find more of Gale’s work at his Substack, Gale Winds.

Blog Post | Food Prices

Thanksgiving Dinner Will Be 8.8 Percent Cheaper This Year

Be thankful for the increase in human knowledge that transforms atoms into valuable resources.

Summary: There has been a remarkable decrease in the “time price” of a Thanksgiving dinner over the past 38 years, despite nominal cost increases. Thanks to rising wages and innovation, the time required for a blue-collar worker to afford the meal dropped significantly, making food much more abundant. Population growth and human knowledge drive resource abundance, allowing for greater prosperity and efficiency in providing for more people.


Since 1986, the American Farm Bureau Federation (AFBF) has conducted an annual price survey of the food items that make up a typical Thanksgiving Day dinner. The items on this shopping list are intended to feed a group of 10 people, with plenty of leftovers remaining. The list includes a turkey, a pumpkin pie mix, milk, a vegetable tray, bread rolls, pie shells, green peas, fresh cranberries, whipping cream, cubed stuffing, sweet potatoes, and several miscellaneous ingredients.

So, what has happened to the price of a Thanksgiving Day dinner over the past 38 years? The AFBF reports that in nominal terms, the cost rose from $28.74 in 1986 to $58.08 in 2024. That’s an increase of 102.1 percent.

Since we buy things with money but pay for them with time, we should analyze the cost of a Thanksgiving Day dinner using time prices. To calculate the time price, we divide the nominal price of the meal by the nominal wage rate. That gives us the number of work hours required to earn enough money to feed those 10 guests.

According to the Bureau of Labor Statistics, the blue-collar hourly wage rate increased by 240.2 percent – from $8.96 per hour in October 1986 to $30.48 in October 2024.

Remember that when wages increase faster than prices, time prices decrease. Consequently, we can say that between 1986 and 2024 the time price of the Thanksgiving dinner for a blue-collar worker declined from 3.2 hours to 1.9 hours, or 40.6 percent.

That means that blue-collar workers can buy 1.68 Thanksgiving Day dinners in 2024 for the same number of hours it took to buy one dinner in 1986. We can also say that Thanksgiving dinner became 68 percent more abundant.
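
For anyone who wants to reproduce these figures, here is a minimal sketch of the time-price arithmetic, using the AFBF dinner costs and BLS wage rates quoted above:

```python
# Time price = money price / hourly wage: the hours of work needed to buy the meal.
def time_price(price: float, wage: float) -> float:
    return price / wage

tp_1986 = time_price(28.74, 8.96)   # ~3.21 hours
tp_2024 = time_price(58.08, 30.48)  # ~1.91 hours

decline   = (tp_1986 - tp_2024) / tp_1986  # ~40.6 percent
abundance = tp_1986 / tp_2024              # ~1.68 dinners per 1986 dinner
print(f"Time price fell {decline:.1%}; dinner is {abundance:.2f}x more abundant")
```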

Here is a chart showing the time price trend for the Thanksgiving dinner over the past 38 years:

The figure shows that the time price of a Thanksgiving dinner for a blue-collar worker has declined since 1986, even as the US population, the nominal price of the meal, and hourly earnings have all increased.

The lowest time price for the Thanksgiving dinner was 1.87 hours in 2020, but then COVID-19 policies struck, and the time price jumped to 2.29 hours in 2022.

In 2023, the time price of the Thanksgiving dinner came to 2.09 hours. This year, it came to 1.91 hours – a decline of 8.8 percent. For the time it took to buy Thanksgiving dinner last year, we get 9.6 percent more food this year.

Between 1986 and 2024, the US population rose from 240 million to 337 million – a 40.4 percent increase. Over the same period, the Thanksgiving dinner time price decreased by 40.6 percent. Each one percentage point increase in population corresponded to a one percentage point decrease in the time price.

To get a sense of the relationship between food prices and population growth, imagine providing a Thanksgiving Day dinner for everyone in the United States. If the whole of the United States had consisted of blue-collar workers in 1986, the total Thanksgiving dinner time price would have been 77 million hours. By 2024, the time price fell to 64.2 million hours – a decline of 12.8 million hours or 16.6 percent.
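
That national comparison can be checked the same way (a sketch that assumes, as the thought experiment does, blue-collar wages and one 10-person dinner per 10 residents):

```python
# Total hours of blue-collar work needed to serve Thanksgiving dinner nationwide.
def national_hours(population: float, dinner_price: float, wage: float) -> float:
    dinners = population / 10  # each AFBF shopping list feeds 10 people
    return dinners * (dinner_price / wage)

h_1986 = national_hours(240e6, 28.74, 8.96)   # ~77.0 million hours
h_2024 = national_hours(337e6, 58.08, 30.48)  # ~64.2 million hours
saved  = h_1986 - h_2024                      # ~12.8 million hours
print(f"{saved / 1e6:.1f} million hours saved ({saved / h_1986:.1%})")
```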

Given that the population of the United States increased by 40.4 percent between 1986 and 2024, we can confidently say that more people truly make resources much more abundant.

An earlier version of this article was published at Gale Winds on 11/21/2024.

Blog Post | Economic Growth

What Unifies the Enemies of Civilization?

Socialism, environmentalism, scientism, relativism, dogmatism, and doomerism all have one thing in common.

This article was excerpted from an upcoming documentary.

Summary: Anti-merit, authoritarian, collectivist ideas like socialism, environmental extremism, and doomerism are enemies of human progress because they impede innovation, limit personal freedom, and prevent societal growth. Fostering decentralized creativity, by contrast, preserves human civilization’s ability to keep advancing.


We have enemies.

Our enemies are not bad people—but rather bad ideas.

Our enemy is stagnation.

Our enemy is anti-merit, anti-ambition, anti-striving, anti-achievement, anti-greatness.

Our enemy is statism, authoritarianism, collectivism, central planning, socialism.

Our enemy is bureaucracy, vetocracy, gerontocracy, blind deference to tradition.

Our enemy is corruption, regulatory capture, monopolies, cartels.

Our enemy is institutions that in their youth were vital and energetic and truth-seeking, but are now compromised and corroded . . . blocking progress in increasingly desperate bids for continued relevance, frantically trying to justify their ongoing funding despite spiraling dysfunction and escalating ineptness.

Our enemy is the ivory tower, the know-it-all credentialed expert worldview, indulging in abstract dogmas . . . luxury beliefs, social engineering, disconnected from the real world, delusional, unelected, and unaccountable—playing God with everyone else’s lives, with total insulation from the consequences.

Our enemy is speech control and thought control—the increasing use, in plain sight, of George Orwell’s “1984” as an instruction manual . . .

Our enemy is the Precautionary Principle, which would have prevented virtually all progress since man first harnessed fire. The Precautionary Principle was invented to prevent the large-scale deployment of civilian nuclear power, perhaps the most catastrophic mistake in Western society in my lifetime. The Precautionary Principle continues to inflict enormous unnecessary suffering on our world today. It is deeply immoral, and we must jettison it with extreme prejudice.

Our enemy is deceleration, de-growth, depopulation—the nihilistic wish, so trendy among our elites, for fewer people, less energy, and more suffering and death . . .

We will explain to people captured by these zombie ideas that their fears are unwarranted and the future is bright.

We believe we must help them find their way out of their self-imposed labyrinth of pain.

We invite everyone to join us . . .

The water is warm.

Become our allies in the pursuit of technology, abundance, and life.

—Marc Andreessen, The Techno-Optimist Manifesto

Although our society is becoming more dynamic over time, some creativity-suppressing memes that dominated our static ancestors survive to this day, albeit under different guises. As we saw, those memes ensured that societies like Sparta made practically no progress at all. Thankfully, in our time, such memes don’t stop us from improving our lives and the world more broadly. But they do slow us down, and if left unchecked, they could come to dominate our dynamic society and revert it to the static societies of old. We, therefore, have a duty not only to recognize them for the threat that they are but to do everything in our power to eradicate them entirely.

Socialism advocates for centralized institutions, like States, to take the means of production away from citizens against their will. Socialists falsely assume that States can allocate wealth in the form of consumer goods and services better than the private sector. But in the absence of free markets, States cannot determine prices and so cannot discover how resources can be best allocated. Resources like wood and gold could go toward producing all sorts of consumer goods, and market prices signal to entrepreneurs which resources should go into producing which consumer goods. That is, entrepreneurs use prices to “calculate” whether a particular venture will improve consumers’ lives. For instance, entrepreneurs might want to buy wood to build houses that they wish to sell. But they can only determine whether such a venture is profitable—that is, if it makes people better off—if they know the prices of the wood they’d buy and the houses they’d sell. But centralizing all of society’s resources into the hands of a single institution obliterates the possibility of prices. As economist Ludwig von Mises wrote, “The paradox of ‘planning’ is that it cannot plan, because of the absence of economic calculation. What is called a planned economy is no economy at all. It is just a system of groping about in the dark. There is no question of a rational choice of means for the best possible attainment of the ultimate ends sought. What is called conscious planning is precisely the elimination of conscious purposive action.”
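
A toy example may make the calculation problem concrete. Consider an entrepreneur deciding whether to build a house (all numbers here are hypothetical, purely for illustration):

```python
# Hypothetical economic calculation: compare the market price of the output
# against the market prices of the inputs the venture would consume.
input_prices = {"wood": 30_000, "labor": 120_000, "land": 50_000}
house_price  = 250_000  # what buyers are willing to pay for the finished house

profit = house_price - sum(input_prices.values())  # 50,000
# A positive result signals that buyers value the house more than the other
# goods those same inputs could have produced. Without market prices for the
# inputs and the output, no such comparison is possible; that is Mises's point.
print(f"Expected profit: ${profit:,}")
```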

The impossibility of socialist-style central planning came to light in 1989, when Boris Yeltsin, then a rising Soviet politician who would soon become Russia’s first president, visited a grocery store in the United States. Back in Russia, people waited in line for food and other goods, but in the capitalist United States, Yeltsin could buy as many of the countless items on offer as he wanted, and the lines were nothing like those back home. In recognition of the stark contrast, Yeltsin told some Russians who were with him that if Russians saw what American supermarkets were like, “there would be a revolution.”

Many socialists think that wealth is a fixed pie. They see rich people and poor people and think that such inequality is unfair or unjust. Because they think wealth is fixed, they are sure that the moral thing to do is to forcibly transfer wealth from the rich people to the poor people. They think that the State ought to do such things—hence, they want the State to own the means of production, use them to create goods and services, and allocate them in a fair and just way to the people.

But wealth is not a fixed pie. Mankind was born into utter poverty, and now billions of people are wealthy enough to have the free time to read articles such as this one. So, yes, poverty is a tragedy. But with enough progress, we can all become as wealthy as today’s billionaires—indeed, most modern Westerners are wealthier than the kings of old, who died of diseases we’ve long since cured and who lacked basic comforts such as air conditioning.

The answer to poverty is not socialism, which only makes it more difficult to create more wealth. But trends indicate that young people in the West don’t know that—an Axios poll showed that 41 percent of American adults in 2021 held favorable views toward socialism.

Extreme environmentalism, or the so-called degrowth movement, aims to minimize humanity’s environmental impact by having fewer children, consuming less energy, and releasing less carbon into the atmosphere. As documented in a June 2024 New York Times article, anthropologist and prominent degrowth advocate Jason Hickel once wrote, “Degrowth is about reducing the material and energy throughput of the economy to bring it back into balance with the living world, while distributing income and resources more fairly, liberating people from needless work, and investing in the public goods that people need to thrive.”

The author of the New York Times piece, Jennifer Szalai, further writes, “The distinctive argument that Hickel and other degrowthers make is ultimately a moral one: ‘We have ceded our political agency to the lazy calculus of growth.’”

But there is nothing moral about slowing down growth for the planet’s sake or about rebalancing our relationship with nature. Growth is not some abstract thing that greedy capitalists have made a deity of. Growth means more wealth for people in the form of lifesaving and life-enhancing technologies, from shelter that protects us from the violent forces of the Earth to mass food production that has brought starvation to an all-time low.

Some environmentalists are willing to sacrifice the well-being of humans for the sake of the Earth and its nonhuman inhabitants. But they fail to appreciate that it is only humans who stand a chance at saving the planet and every species in existence! After all, the sun will eventually engulf the Earth, and most species that have ever existed went extinct without any help from humans. But only humans are capable of developing the technology to protect the Earth from the sun’s death and revive any species we so choose. This might sound like science fiction, but already we deflect asteroids from the Earth and create cells with synthetic genomes. The gap between those feats and the ones you think are science fiction is not insurmountable—but human civilization will need to grow to achieve them.

So, even by the environmentalists’ own standards, people are the primary moral agents in the world. Any side effect we cause can, in principle, be reversed in the long run. Incidentally, the primacy of people serves as a devastating criticism of those who advocate that we have fewer children—after all, more people means more creativity and more boundless potential to make progress.

And if something like climate change is judged by its effects on people, things have never been better thanks to growth. The Earth doesn’t care about us—but we care about each other. As philosopher Alex Epstein notes, “If you review the world’s leading source of climate disaster data, you will find that it totally contradicts the moral case for eliminating fossil fuels. Climate-related disaster deaths have plummeted by 98 percent over the last century, as CO2 levels have risen from 280 ppm (parts per million) to 420 ppm (parts per million) and temperatures have risen by 1°C.”

Yes, fossil fuels have changed the Earth. But they’ve also given us enough energy to create solutions to an uncountable number of problems, including developing safe, manmade environments that shield us from Mother Earth’s dangers. Degrowth would rob us of such creations and leave us cold, dark, and vulnerable. “On a human flourishing standard,” Epstein writes, “we want to avoid not ‘climate change’ but ‘climate danger’—and we want to increase ‘climate livability’ by adapting to and mastering climate, not simply refrain from impacting climate.”

You may laugh at those environmentalists who throw paint at art, but they’ve been effective at halting the development of nuclear power, a potential source of abundant energy that we’ve known how to build for decades. We can’t calculate how much suffering could have been ameliorated had we been free to build nuclear power plants across the Earth.

Scientism is the false idea that scientific knowledge trumps all other kinds of knowledge—that science alone can answer all our questions. But moral, economic, political, and philosophical problems can’t be answered by science alone. This is why the phrase “follow the science,” as we heard so often during the 2020 pandemic, doesn’t make sense. Scientific knowledge can inform our choices, but it alone cannot tell us what to do next, either in our personal lives or in politics more widely. For instance, science might offer us an explanation for how and why COVID-19 spreads, the conditions under which masks reduce spread, and the effect of age and body fat percentage on the risk of infection. But science cannot tell us whether the trade-offs associated with government-mandated lockdowns are worth it, whether the government should invest public funds into drug companies for the development of a vaccine, whether all questions pertaining to a pandemic should be left to the most local level of government or to the most global level of government, whether a grandparent ought to risk infection to visit his grandchildren, or whether a businessman should run an underground (and illegal) speakeasy during lockdowns so that he can afford rent. The answers to such questions require more than just scientific knowledge—they require political, economic, and moral knowledge. Knowledge about what one ought to want in life, knowledge about the trade-offs involved in our decisions, knowledge about the intended and unintended consequences of governmental policy, knowledge about legal precedent, and knowledge about what our political institutions are capable of doing. None of this could possibly be found in a science textbook. Those who claim otherwise are guilty of the sins of scientism.

As the Nobel Prize–winning economist F. A. Hayek, inventor of the term “scientism,” wrote, “It seems to me that this failure of the economists to guide policy more successfully is closely connected with their propensity to imitate as closely as possible the procedures of the brilliantly successful physical sciences—an attempt which in our field may lead to outright error. It is an approach which has come to be described as the ‘scientistic’ attitude—an attitude which . . . is decidedly unscientific in the true sense of the word, since it involves a mechanical and uncritical application of habits of thought to fields different from those in which they have been formed.”

But if we cannot acquire moral, economic, or political knowledge via the methods that work so well in physics, how do we get such knowledge? The same way we always do: by conjecture and criticism. We guess what the right policy is, how we ought to act in the world, and how the economy works. And we criticize all those guesses—maybe not with the rigorous experiments we conduct in the physics laboratory, but experimentation is just one way of criticizing ideas.

Ironically, with the staggering advances made in the hard sciences over the past century, scientism has been on the rise. Quite simply, people think that they can take science’s successes and carry them over into every other field of human endeavor. In political and cultural battles, it is often thought that he who knows the most science must be in the right. If only we put the most scientifically minded people in charge of the world, it is thought, then they could solve all our problems from on high. But science alone cannot tell us whether children have a right to take hormone blockers, whether circumcision should be legal, or how long patents should last. That is no reason to despair—with or without the microscope, we can continue to make progress with creative guessing and criticizing.

Relativism comes in many forms, but perhaps the most dangerous is moral relativism—the idea that there is no difference between right and wrong or good and evil. “Who’s to say who is in the wrong?” the relativist ponders high-mindedly. “What Hamas did to Israel on October 7th is barbaric, but we must end this cycle of violence,” a relativist would say, implicating both sides. “Russia may have invaded Ukraine, but Ukraine is conscripting its own citizens. Therefore, both sides have committed wrongdoing.” “If Hitler was a villain for his genocide, then so was Churchill.”

Relativism might seem open-minded and fair, but it is neither. For it is not open to the possibility that one party is in the right and the other in the wrong. It is not open to the idea that one society is open and dynamic and the other closed and static. It is not open to the notion that one country cherishes life while the other worships death. Nor is relativism fair—the relativist does static societies no favors by denying that they could become as prosperous as dynamic ones should they choose to do so. In their own way, relativists trap evil under the weight of their own suppressive culture when they could have cleansed it with the light of better ideas. And the relativist distorts the self-confidence of dynamic, progressive societies by muddying their understanding of why they’re so successful in the first place, undermining their ability to make even further progress and spread the right ideas to static societies. The relativist is no highfalutin hero—he keeps evil on life support long past its expiration date.

Perhaps relativism is thriving in the West right now because people can afford to make such an egregious error. But not forever. For the enemies of the West are the enemies of civilization more broadly. They will not stop their anti-human ambitions, no matter how much relativists deny that that is what they are. Nor will it be relativists who ultimately stand up to them but rather those who distinguish between right and wrong, stasis and progress, victory and defeat.

Dogmatism refers to an idea that is considered, implicitly or explicitly, uncriticizable. The final truth. Known with certainty. Never to be changed. People tend to associate religious doctrines with dogmatism, but the connection is not a necessary one. After all, some religions have evolved to cohabitate with the rapid progress we’ve undergone since the Enlightenment (to be sure, other religions, tragically, have not yet done so—and whenever someone admits to “taking something on faith,” dogmatism is surely at work). But dogma is not confined to the cathedral. For instance, many political ideologies are thought to have perfect foundations by their adherents. And even in science, our best theories could, in principle, spread by dogmatic means. Karl Popper described Sigmund Freud’s psychoanalysis as dogmatic. As philosopher Bryan Magee described psychoanalysts, “We should not . . . systematically evade refutation by continually reformulating either our theory or our evidence in order to keep the two in accord. . . . Thus they are substituting dogmatism for science while claiming to be scientific.” Even in the hard sciences, we could imagine a world in which people are not persuaded that Albert Einstein’s theory of relativity is true but rather are pressured to accept it as an uncriticizable foundation of our scientific worldview.

Because all our ideas contain errors, dogmatism always prevents us from improving on the ideas locked in dogma’s cage. Couple that with the fact that any error, no matter how small, could result in the eventual extinction of the human race, and we have good reason to rid our society of all dogmatic elements.

Doomerism is the idea that humanity has no shot at continuing to make progress, or that our extinction is just around the corner, or that we are uniquely vulnerable to being wiped out today, or that we are just one innovation away from guaranteeing our decline.

This attitude neutralizes the human spirit—after all, if humanity is sunk, why bother trying in the first place?

One of the primary examples of doomerism today is the debate over artificial intelligence. Some think that if we just keep innovating, we will eventually create an entity that is more intelligent and/or powerful than people could ever be and that we will fall to the status of slaves or animals beneath its feet. First, if the machine is not creative, it will be precisely as obedient as our microwaves are. And any unintentional side effects of AI can be accounted for with safety measures, such as those currently being developed for self-driving cars. Second, if we do end up creating a machine that is as alive as we are—a so-called artificial general intelligence, or AGI—it is no more rational to assume that it will pursue our destruction than it is to assume that new humans will do so. New humans—namely children—are raised to adopt the values of the culture around them. Of course, sometimes they rebel, especially when adults force them to do things they don’t want to do. Therefore, the problem of how to integrate an AGI into our society is the same as the problem of how to raise children into happy, productive adults—and we’ve been improving at that for centuries.

Another dangerous effect of doomerism is tyranny, whether through cultural taboos, governmental regulations, or outright bans. They all amount to slowing the growth of knowledge and wealth, and of progress more generally. For if the next innovative step marks our doom, then surely a little—or a lot—of tyranny is justified! But innovation is the very panacea that doomers are worried about. It is stasis, not change, that will mark our end.

Moreover, we might choose to slow ourselves down, but the bad guys won’t. So there’s no world in which AI doesn’t continue to progress. But there is a world in which the bad guys get hold of novel technologies before we do—and with that would come the end of our sustained Enlightenment.

So socialism, environmentalism, scientism, relativism, dogmatism, and doomerism have all earned their bona fides as enemies of civilization. In one way or another, they curb our ability to make progress, a stain on the project that is humanity. But is each stain a unique color, or do they come from the same poisonous ink jar?

Indeed, all memetic enemies of civilization have one thing in common: They slow the growth of knowledge.
