Beyond Zero-Sum Thinking: Love, Like Resources, Does Not Run Out

Blog Post | Culture & Tolerance


A new study shows that the idea of “inexhaustibility” applies just as well to intangible assets as to physical ones.

Summary: A recent study reveals that love and affection are not perceived as finite resources. Warmth and support can grow to include new connections without diminishing existing bonds. This finding parallels the thesis in Superabundance that innovation and collaboration make physical resources practically inexhaustible. Just as technological progress unlocks material abundance, human relationships can foster limitless emotional growth and reciprocity.


A recent study titled “Love Doesn’t Run Out: Children and Adults Do Not View Social Resources as Inherently Zero-Sum” argues that the human capacity for care and affection has no obvious upper limit. When researchers asked children and adults about the distribution of love and kindness, participants largely rejected the notion that warmth and support are finite commodities. Note the similarity between that finding and the broader thesis that Gale L. Pooley and I advance in our 2022 book Superabundance: When people innovate and collaborate, resources become practically inexhaustible.

The study’s most provocative conclusion is that people, even from a young age, do not typically treat love as zero-sum. In other words, caring for one child, spouse, or friend need not diminish affection for another. When subjects were asked whether a person’s love might be “used up” by having multiple recipients, most insisted that affection could stretch to encompass new connections. Participants of different ages concurred that love and supportive emotions are not strictly limited. That contrasts with old clichés about jealousy or parental favorites. The data suggest that most individuals naturally view social bonds as expandable.

As Pooley and I show, innovation, problem-solving, and the cumulative stock of knowledge empower us to multiply what is possible from seemingly limited physical means. Just as a clever engineer finds new oil fields or ways to yield more energy from the same volume of fuel, human relationships appear able to expand indefinitely. A mother of three need not ration her warmth, any more than a market teeming with entrepreneurs must remain locked in a zero-sum scramble for limited wealth. Instead, she can cultivate growing affection for each child, complementing the love her children share among themselves.

This phenomenon can be understood as a “love multiplier,” analogous to the time price of material goods discussed in our research. Pooley and I have shown that as the population rises and ideas proliferate, the abundance of resources often grows faster than demand, driving down real costs relative to time. Similarly, when individuals share attention, empathy, and kindness, these social goods tend to expand in circulation. Humanity can benefit from a continuous, virtuous cycle in which giving fosters more giving.

Skeptics might note that emotional labor can be exhausting. True, humans have limitations in time and energy. However, the study illustrates that we do not perceive love itself as a resource that is drained beyond repair. Individuals still set boundaries to avoid burnout, but few interpret the innate capacity to care as a pie with only so many slices. Love, in this sense, might operate much like knowledge: Transmitting and spreading it need not reduce what the giver retains.

Our experience with technological progress offers a helpful parallel: The spread of the internet did not kill communication among human beings; it accelerated it by creating new networks and allowing people to learn from each other in real time. Social resources follow a comparable logic. Providing attention, affection, and supportive relationships unlocks reciprocal benefits. The mere existence of robust, affectionate families indicates that love is additive rather than depleting.

The study, in other words, suggests that the idea of “inexhaustibility” applies just as well to an intangible asset like love as to a physical one like oil. Yes, it’s encouraging that the real price of natural resources typically declines over time when measured in labor hours, but it’s equally heartening to recognize that human beings naturally resist artificial scarcity in their relationships. Whereas some might assume people cling tightly to love, the new research supports the view that such hoarding instincts are not our default.

Ultimately, recognizing that love need not be a zero-sum game restores faith in humanity’s remarkable capacity to grow both materially and emotionally. If children grasp the idea that caring can be boundless, then surely our societies can foster a broader culture of resource expansion in every sense.

We stand to benefit not only from tangible innovations—cheaper energy, cleaner water, and more advanced medicine—but also from the immeasurable yet equally important domain of human affection. The lesson is clear: Love, like physical resources, need not be rationed.

Blog Post | Wellbeing

Meaning and Morality in the Modern Age | Podcast Highlights

Marian Tupy interviews Steven Pinker about the so-called "crisis of meaning," the decline of religion, and what can give life purpose in a modern, largely secular world.

Listen to the podcast or read the full transcript here.

Today, I’m pleased to have with me Steven Pinker, a world-renowned Harvard University psychologist and author of best-selling books including The Blank Slate, The Better Angels of Our Nature, Enlightenment Now, and of course, most recently, When Everyone Knows That Everyone Knows. Highly recommend all of them.

Let’s start at a high level and look at how Americans think about the country. Gallup shows that 80 percent of Americans are either satisfied or very satisfied with their lives, but only 20 percent are satisfied with the way that America is going. That’s a bit of a discrepancy.

What does a psychologist have to say about that?

It’s a fascinating phenomenon that pollsters have known about for decades. They call it “the optimism gap.” It appears in just about any question.

“What is the quality of education in this country?”

“It’s terrible.”

“What’s the quality of your child’s school?”

“Well, not bad.”

“How safe is the country?”

“Oh, you can’t walk anywhere. You’ll get mugged.”

“How safe is your neighborhood?”

“Oh, I feel perfectly fine.”

Part of it is that, because none of us can experience the entire country ourselves, our opinions are based on media coverage, and the media have a number of negativity biases. The nature of news selects for negative events because it reports what’s new and discrete enough to be a story. New, discrete events are more likely to be bad than good because there are many more ways for things to go wrong than for things to go right. And while bad things, like a terrorist attack or natural disaster, can happen quickly, positive things tend to be things that don’t happen or things that happen gradually, like the long-term decline in extreme poverty, the rise in literacy, and many other trends that you’ve written about.

Editors also feel more responsible if they emphasize negative stories over positive ones. I’ve heard one editor say, “Well, negative news is journalism, and positive news is advertising.” I think it was Stewart Brand who once said, more generally, that a pessimist sounds like he’s trying to help you, while an optimist sounds like he’s trying to sell you something. So, our picture of the country and the world as a whole is distorted both deliberately and accidentally by the very nature of news.

Let me mention one other thing. There really are problems in the world, to put it mildly, and some things have gotten worse in the last 10 or 20 years. But one has to have a quantitative, statistical, probabilistic view of the world to acknowledge the reality that things can get worse while still being better than they were historically, and that some things can get worse while other things are getting better.

You don’t conclude from something that genuinely has gotten worse that everything has gotten worse or that we’re in a worse situation now than we ever have been.

You mentioned literacy. Recently, I’ve been reading about freshmen entering university without basic reading and math skills. People are reading fewer books. Are we getting dumber, and is education an example of something that is worse than it was 40 or 50 years ago?

Yes, and it’s not the only example. The world’s democracy score has gone down in the last couple of decades. War deaths are worse now than they were 20 years ago, although still better than they were in the ’60s, ’70s, ’80s, and most of the ’90s. But yeah, educational scores have gone down. The Flynn effect, by which IQ scores rose by about three points per decade for almost a century, has now gone in the other direction.

Now, that doesn’t mean that we’re back to the level that we were 100 years ago, but there’s been a bit of a droop. It may be that there are pathologies in our educational system, that the drive for equity and especially for equity across all racial groups has led to bringing down the top rather than raising the bottom. It could be that our schools of education have been training teachers to use the wrong methods. There’s also the fact that, while reading and literacy are good things, they are cognitively unnatural. We didn’t evolve with print; it’s a recent invention, and we’ve seen, especially in the last 10 years, that a lot of people prefer listening and watching to reading. Thanks to the massive availability of video, people may no longer be putting the effort into developing literacy, which we have reason to believe was one of the drivers of the Flynn effect and of cognitive sophistication in general.

My understanding is that the decline of reading and math scores is most severe at the low end. The smart students have not declined much, but weaker students have. So, it is a problem, and I think it’s a problem that ought to be addressed.

When it comes to the decline in reading books, there may be one other factor: the optimal length of a work of text may no longer be a book. I have found that, as a curious person, I can get lost in reading about things on Wikipedia like the history of the potato chip or transatlantic travel or planets. There’s just a flood of information out there and it’s all really interesting. And I say this with some embarrassment because I write books, and sometimes very long books, but for some kinds of information, it may be that a book has diminishing returns.

Let’s now look at other criticisms of human progress.

You and I had an article in The Free Press pushing back against the “crisis of meaning.” Have you ever seen any hard evidence suggesting that people’s lives are more meaningless in rich countries versus poor countries or that lives are less meaningful today than they used to be?

No, I haven’t.

We don’t have survey data on “How meaningful do you think life is?”, but meaning and happiness seem to be partially correlated. So, in general, people who are happier say their lives are more meaningful. But some sources of meaning are not the same as sources of happiness, and vice versa. Just to give a couple of examples, if you’re dedicating your life to some cause, there can be setbacks and frustrations that make you less happy, but you say your life is more meaningful compared to a life of pleasure and leisure. Time spent with friends is more pleasurable, while time spent with family is more meaningful. So, meaning and happiness are not perfectly correlated, but they are partially correlated.

Over the course of history, if you look at the whole range of countries, there has been more of an increase in happiness than a decrease. In countries that are very affluent, like the United States, there has not been an increase in happiness. We may be close to the ceiling. But overall, across the world, there’s reason to believe that happiness has increased, so that would suggest but not prove that there has not been a decline in meaningfulness.

Anecdotally, there have been complaints that life is meaningless as far back as you go. Ecclesiastes: “Vanity of vanities, all is vanity.” Henry David Thoreau in 1854: “The mass of men lead lives of quiet desperation.” T.S. Eliot, 1920s: “We are the hollow men, we are the stuffed men.” So, it’s a constant complaint, and the fact that people say it doesn’t necessarily mean it’s true. It’s always tempting to think that life is meaningless. We like to think that there is a plan to the universe, and we get disillusioned when we find out there isn’t one. The laws of nature don’t tell any story with an ending. There are things built into the evolutionary process that guarantee that life is going to appear meaningless. There’s the law of entropy. Things fall apart and decay. We die, we get older, we weaken. Even our closest relationships are never perfect.

Now, I think the answer to that is to focus on human purposes, like not dying young, not getting shot, knowing more, experiencing art and culture, experiencing friendship, and seeing the world. But one has to reorient and realize that those are the goals of life and not expect that the universe itself tells a satisfying story.

People often look at proxies for meaning, such as anxiety and suicide. There seems to be some evidence that rich countries have higher rates of anxiety than poor countries. Of course, definitions can change and expand. Trauma used to mean being bombed by the Germans; today, it may be that you are breaking up with your boyfriend or girlfriend.

Do you have any sense as to how reliable the data on anxiety and trauma is?

There’s certainly been some diagnostic category creep. I’ve seen this in my own students. There’s an eagerness to diagnose oneself, sometimes with bogus diagnoses like autism for introversion. There’s a funny kind of cachet to having a pathology. But looking retrospectively at surveys, I think there probably has also been, on top of that, some increase in anxiety since the late 1950s.

Some of that may be that we’re taking on more responsibilities and adding to our anxiety burden. When I think back to my parents in the 1950s, there were a lot of things that they just never thought about. Are they getting enough exercise? Are they exposing themselves to skin cancer risk by going out in the sun? The state of the climate, inequality. Most people didn’t think about these things.

Jean Twenge and Jon Haidt have been trying to make the case that social media, especially through smartphones, has led to a genuine rise in anxiety, particularly in younger people. There’s some controversy there over cause and effect—maybe anxious and depressed kids turn to social media—but there seems to be at least some evidence that suggests causation.

Let me offer to our listeners what I consider to be the strongest argument in favor of rational optimism.

The clearest sign of unhappiness is when you kill yourself. Here in the United States, we’ve had an increase in suicides, but suicides are dropping in most, if not all, other rich countries. So, it seems there is a particular American pathology rather than a general pathology in prosperous countries. What’s wrong with this argument?

When I report on violence, I usually concentrate on homicide, simply because homicide is the most objective measure of violence. A dead body is hard to argue away, and people record homicides pretty accurately, so it’s the best indicator of violence. By extension, one might think that suicide would be the best indicator of unhappiness. But, partly to my surprise, that doesn’t seem to be right.

There is more ambiguity in how officials record suicide deaths. For example, when there’s a stigma against suicide, they’re often classified as accidents. Also, as best as we can tell, there’s not an excellent correlation between the suicide rate and national unhappiness. There’s even what some researchers call the suicide-unhappiness paradox, which is that countries where people are happier can sometimes have higher suicide rates, partly for the same reason that suicide rates increase around Christmas: if you look around and everyone is happy and you’re not, then you really think you’re a loser.

Suicide rates are also driven by contagion and by how easy it is to commit suicide. I quote the rather macabre poem by Dorothy Parker: “Guns aren’t lawful, nooses give, gas smells awful, you might as well live.” Suicide went way down in Britain when they changed the composition of cooking gas from coal gas to methane, which is not toxic. In developing countries, access to pesticides, a common method of suicide, has a big effect on actual rates. And in the United States, the availability of guns seems to be one of the drivers.

So, there are a lot of puzzles with suicide rates. But generally, I think it’s important to point out, as you do, that suicide rates are actually dropping globally, especially in poorer countries, but also in many rich countries. The United States is something of an anomaly. Since the 1990s, when the Global Burden of Disease project began to collect data, suicide has gone down by about 40 percent. A lot of that is thanks to urbanization. When a woman is put into an arranged marriage and leaves her village for the village of her husband, where she is dominated by her in-laws and has no friends and no way of escaping, that leads to a lot of suicides. In a more modern urban culture where you kind of have more freedom, there’s less desperation. So globally, modernization and urbanization have led to falling suicide rates.

Even in the United States, suicide rates went down until the mid to late 1990s. That was a low point, and they’ve been rising since then, but it’s not as if they’ve been inexorably rising over the last century.

Those are very good caveats, thanks for introducing that nuance.

One thing that you and I discussed in our Free Press article was the criticism that meaninglessness in the West is driven in part by falling religiosity. A defender of religion might say that religion is essentially a cognitive or cultural technology for producing responsibility, happiness, restraint, and gratitude. So, if you remove religion, you may be making people more irresponsible, more unhappy, less restrained, and less grateful.

What do you think about that argument?

There is a need for community institutions and organizations that bring people together, that discuss meaning and morality, and that are a locus for collective action. The problem is that if you bundle that with theology, miracles, scripture, and invisible agents, it just isn’t going to be convincing anymore.

Religion wasn’t taken away from people; people left religion. In every developed country, there’s been a move away from organized religion. The churches are still around, and no one’s stopping people from attending; they just don’t find that religion gives them meaning and purpose. This is partly because the institutions themselves have not been sources of morality or meaning. The Roman Catholic Church with its sex abuse scandals, evangelical Protestantism in the United States with its embrace of far-right politics, the subordinate role of women in the more conservative religions like Orthodox Judaism—these are just turn-offs.

I’m gonna quote G. K. Chesterton, who is supposed to have said that when men stop believing in God, they don’t believe in nothing, they believe in anything. A 2021 national survey found that young Americans are more likely to believe in witchcraft, luck, black magic, and spell casting.

What do you make of the argument that Christianity keeps the belief in black magic and witchcraft at bay?

A few things. The witch hunts of the 16th century were a Christian movement. I mean, “Thou shalt not suffer a witch to live” is in the Christian Bible. I also think Chesterton was wrong: the idea that people who are less religious are more open to astrology, ESP, the paranormal, crystal healing, and other kinds of New Age woo-woo isn’t true as a general correlation.

The data that you cite on openness to paranormal beliefs is interesting. I’ve never reported this, but I’ve looked at trends in the belief in devils, ESP, precognition, curses, and all kinds of paranormal things. As best as I can tell, it’s been pretty flat since the 1970s.

Something to be aware of is that there are different ways in which societies can change, and quantitatively, it’s not always easy to tell them apart. There can be a cohort effect, that is, as one generation replaces another, that generation has beliefs that they carry with them as they age; a period effect, where everyone changes their beliefs; or a life-cycle effect where, as people age, they change their beliefs. As best I can tell, what you cited is largely an age effect. Younger people are more open to woo-woo and magic than older people. So, I think those data are correct, but don’t necessarily mean that societies have become more open to the paranormal.

One way or another, there is a sizable chunk of the population that is attracted to the supernatural or transcendental, the so-called God-shaped hole in the human heart. Critics say that irreligious people are offering a meaningless, cold universe without a purpose, and that people really need some form of transcendence to make sense of their lives.

What do you think of that argument?

I think it’s literally wrong in the sense that people’s craving for meaning and purpose isn’t shaped like a God. In fact, that argument is sometimes used to explain the rise of wokeness, that religion was replaced with the idea that differences between groups are a moral emergency, and you have to find the oppressors responsible and punish them. There’s no God in any of that.

Granted, many people do search for transcendence, but kids like to believe in Santa Claus. That belief doesn’t have to be indulged. Kant’s definition of the Enlightenment was man’s escape from his self-imposed childhood. Part of growing up involves some hard lessons, like the universe is a cold place, and it doesn’t care about you. That does not mean life is meaningless, because the fact that the universe doesn’t care about you doesn’t mean that other humans don’t care about you or that we don’t have to care about other humans. We have a purpose, which is to make people as well off as possible, to increase flourishing, to increase knowledge, life, health, freedom, and safety. These are really meaningful goals that I don’t think should leave you empty.

Without religion, what is the basis of morality? Where does morality come from if not from man being created in the image of God?

Well, man being created in the image of God doesn’t give you a whole lot of morality. If you look at the Old Testament, God is commanding the Israelites to rape, massacre, and mutilate their enemies, while there are religious prescriptions against mixing linen and cotton, lighting a fire on Saturday, and other crazy stuff that has nothing to do with morality as we could argue for it.

Conversely, I think the obvious source of morality is some kind of Golden Rule. The way we teach kids to be moral is we say, “How would you like that if someone did that to you?” The logical basis of morality is that, as long as I’m not the galactic overlord and my fate depends on other people, I’ve got to agree to some sort of social contract that treats us as equivalent. That’s why versions of the Golden Rule have been independently discovered by many different cultures.

Here’s the most common counterargument I hear to that point of view: it is very well for an intelligent professor who reads a lot of books to derive moral principles from reciprocity, reason, and self-interest, but ordinary people don’t think like that.

What’s wrong with just picking an oven-ready set of moral norms off the shelf, like those presented by modern Christianity, which have been made more humane over time? You don’t have to do much thinking, for which you might not have time or ability.

Well, I think that could be a means to an end, but one must keep in mind what the end is, which is humanistic morality that we can justify. As we know, religions can contain off-the-shelf moralities such as “kill anyone who insults the prophet Muhammad,” “execute blasphemers or gay people,” or “thou shalt not suffer a witch to live.”

Now there are religions guided by humanistic, enlightenment, universalist principles, such as some of the liberal Protestant denominations and Reform Judaism. I don’t oppose keeping some symbolism and ritual if the institution has moved in a humanistic direction. Maybe that would be a good thing.

A somewhat different criticism of progress has to do with status competition, essentially the idea that no matter how much things get better, ultimately, as you once again put it in your book, men don’t contend with the dead but with the living.

Are our efforts at Human Progress bound to fail because people care about relative rather than absolute improvements in life?

I love that Hobbes quote. He introduces it by saying there’s a natural reverence for antiquity because men contend with the living, not with the dead. That is, intellectuals and moralists will tend to revere earlier eras and bemoan the present era because complaining about the present is another way of complaining about your contemporaries, who are your rivals. That’s another reason there is a negativity bias.

That’s an aside on elite status competition, but we all compare ourselves to others. So, in that sense, there won’t ever be a utopia. People will always compare themselves to others and be less happy than they ought to be. Still, it’s worth working toward progress. Even if you’re a spoiled first-world brat, it’s still better that you live to 80 instead of 55. It’s still better that your kids don’t die. It’s still better to travel the world instead of being confined to your village.

There’s a quote on my wall from the economist Richard Layard that reads, “One secret of happiness is to ignore comparisons with people who are more successful than you are. Always compare downwards, not upwards.”

How do we go about explaining to people that it’s okay that there is always going to be somebody who is taller, smarter, and more handsome than you are?

You’re right that this is a piece of wisdom we’d be better off having, but it’s not easy to engineer. Some features of culture are very bottom-up. They can be influenced by education and by the messages that we give children, but no one’s really in charge; it’s the result of millions of people interacting with each other every day. However, we shouldn’t abdicate our responsibility for what we teach kids. We can do our part and try to nudge them in the right direction.

The Human Progress Podcast | Ep. 76

Meaning and Morality in the Modern Age

Steven Pinker joins Marian Tupy to discuss the so-called "crisis of meaning," the decline of religion, and what can give life purpose in a modern, largely secular world.

Blog Post | Human Development

The Grim Truth About the “Good Old Days”

Preindustrial life wasn’t simple or serene—it was filthy, violent, and short.

Summary: Rose-tinted nostalgia for the preindustrial era has gone viral—some people claim that modernity itself was a mistake and that “progress” is an illusion. This article addresses seven supposed negative effects of the Industrial Revolution. The conclusion is that history bears little resemblance to the sanitized image of preindustrial times in the popular imagination.


When Ted Kaczynski, the Unabomber, declared in 1995 that “the Industrial Revolution and its consequences have been a disaster for the human race,” he was voicing a sentiment that now circulates widely online.

Rose-tinted nostalgia for the preindustrial era has gone viral, strengthened by anxieties about our own digital era. Some are even claiming that modernity itself was a mistake and that “progress” is an illusion. Medieval peasants led happier and more leisurely lives than we do, according to those who pine for the past. “The internet has become strangely nostalgic for life in the Middle Ages,” journalist Amanda Mull wrote in a piece for The Atlantic. Samuel Matlack, managing editor of The New Atlantis, observed that there is currently an “endless debate around whether the preindustrial past was clearly better than what we have now and we must go back to save humanity, or whether modern technological society is unambiguously a forward leap we must forever extend.”

In the popular imagination, the Industrial Revolution was the birth of many evils, a time when smoke-belching factories disrupted humanity’s erstwhile idyllic existence. Economics professor Vincent Geloso’s informal survey of university students found that they believed “living standards did not increase for the poor; only the rich got richer; the cities were dirty and the poor suffered from ill-health.” Pundit Tucker Carlson has even suggested that feudalism was preferable to modern liberal democracy.

Different groups tend to idealize different aspects of the past. Environmentalists might idealize preindustrial harmony with nature, while social traditionalists romanticize our ancestors’ family lives. People from across the political spectrum share the sense that the Industrial Revolution brought little real improvement for ordinary people.

In 2021, History.com published “7 Negative Effects of the Industrial Revolution,” an article reflecting much of the thinking behind the popular impression that industrialization was a step backward for humanity, rather than a period of tremendous progress. But was industrialization really to blame for each of the ills detailed in the article?

“Horrible Living Conditions for Workers”

Were horrible living conditions a result of industrialization? To be sure, industrial-era living conditions did not meet modern standards—but neither did the living conditions that preceded them.

As historian Kirstin Olsen put it in her book, Daily Life in 18th-Century England, “The rural poor . . . crowded together, often in a single room of little more than 100 square feet, sometimes in a single bed, or sometimes in a simple pile of shavings or straw or matted wool on the floor. In the country, the livestock might be brought indoors at night for additional warmth.” In 18th-century Wales, one observer claimed that in the homes of the common people, “every edifice” was practically a miniature “Noah’s Ark” filled with a great variety of animals. One shudders to think of the barnlike smell that bedchambers took on, in addition to the chorus of barnyard sounds that likely filled every night. Our forebears put up with the stench and noise and cuddled up with their livestock, if only to stave off hypothermia.

Homes were often so poorly constructed that they were unstable. The din of collapsing buildings was such a common sound that in 1688, Randle Holme defined a crash as “a noise proceeding from a breach of a house or wall.” Samuel Johnson wrote that in 1730s London, “falling houses thunder on your head.” In the 1740s, “props to houses” keeping them from collapsing were listed among the most common obstacles that blocked free passage along London’s walkways.

“Poor Nutrition”

What about poor nutrition? From liberal flower children to the “Make America Healthy Again” crowd, fetishizing the supposedly chemical-free, wholesome diets of yore is bipartisan. The truth, however, is stomach-churning.

Our ancestors not only failed to eat well, but they sometimes didn’t eat at all. Historian William Manchester noted that in preindustrial Europe, famines occurred every four years on average. In the lean years, “cannibalism was not unknown. Strangers and travelers were waylaid and killed to be eaten.” Historian Fernand Braudel recorded a 1662 account from Burgundy, France, that lamented that “famine this year has put an end to over ten thousand families . . . and forced a third of the inhabitants, even in the good towns, to eat wild plants. . . . Some people ate human flesh.” A third of Finland’s population is estimated to have died of starvation during a famine in the 1690s.

Even when food was available, it was often far from appetizing. Our forebears lived in a world where adulterated bread and milk, spoiled meat, and vegetables tainted with human waste were everyday occurrences. London bread was described in a 1771 novel as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.” According to historian Emily Cockayne, the 1757 public health treatise Poison Detected noted that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.”

Water was also far from pristine. “For the most part, filth flowed out windows, down the streets, and into the same streams, rivers, and lakes where the city’s inhabitants drew their water,” according to environmental law professor James Salzman. This ensured that each swig included a copious dose of human excreta and noxious bacteria. Waterborne illnesses were frequent.

“A Stressful, Unsatisfying Lifestyle”

Did stressful lifestyles originate with industrialization? Did our preindustrial ancestors generally enjoy a sense of inner peace? Doubtful. Sadly, many of them suffered from what they called melancholia, roughly analogous to the modern concepts of anxiety and depression.

In 1621, the physician Robert Burton described waking in the night from mental distress as a common symptom of melancholia among the upper classes. An observer said the poor similarly “feel their sleep interrupted by the cold, the filth, the screams and infants’ cries, and by a thousand other anxieties.” Richard Napier, a 17th-century physician, recorded over several decades that some 20 percent of his patients suffered from insomnia. Today, by comparison, 12 percent of Americans say they have been diagnosed with chronic insomnia. Stress is nothing new.

Sky-high preindustrial mortality rates caused profound emotional suffering to those in mourning. Losing a child in infancy was once a common—indeed, near-universal—experience among parents, but the loss was no less painful for all its ordinariness. Many surviving testimonies suggest that mothers and fathers felt acute grief with each loss. The 18th-century poem “To an Infant Expiring the Second Day of Its Birth,” by Mehetabel “Hetty” Wright—who lost several of her own children prematurely—heartrendingly urges her infant to look at her one last time before passing away.

So common were child deaths that practically every major poet explored the subject. Robert Burns wrote “On the Birth of a Posthumous Child.” Percy Bysshe Shelley wrote multiple poems to his deceased son. Consider the pain captured by these lines from William Shakespeare’s play King John, spoken by the character Constance upon her son’s death: “Grief fills the room up of my absent child. . . . O Lord! My boy, my Arthur, my fair son! My life, my joy, my food, my all the world!” Shakespeare’s own son died in 1596, around the time the playwright would have finished writing King John.

Only in the modern world has child loss changed from extraordinarily common to exceedingly rare. As stressful as modern life can be, our ancestors faced forms of heartache that most people today will never endure.

“Dangerous Workplaces” and “Child Labor”

Dangerous workplaces and child labor both predate the Industrial Revolution. In agrarian societies, entire families would labor in fields and pastures, including pregnant women and young children. Many preindustrial children entered the workforce at what today would be considered preschool or kindergarten age.

In poorer families, children were sent to work by age 4 or 5. If children failed to find gainful employment by age 8, even social reformers unusually sympathetic to the plight of the poor would express open disgust at such a lack of industriousness. Jonas Hanway was reportedly “revolted by families who sought charity when they had children aged 8 to 14 earning no wages.”

For most, work was backbreaking and unending. A common myth suggests that preindustrial peasants worked fewer days than modern people do. This misconception originated from an early estimate by historian Gregory Clark, who initially proposed that peasants labored only 150 days a year. He later revised this figure to around 300 days—higher than the modern average of 260 working days, even before factoring in today’s paid holidays and vacation time.

Physically harming one’s employees was once widely accepted, too, and authorities stepped in only when the mistreatment was exceptionally severe. In 1666, one such case occurred in Kittery, in what is now Maine, when Nicholas and Judith Weekes caused the death of a servant. Judith confessed that she cut off the servant’s toes with an axe. The couple, however, was not indicted for murder, merely for cruelty.

“Discrimination Against Women”

The preindustrial world was hardly a model of gender equality—discrimination against women was not an invention of the early industrialists but a long-standing feature of many societies.

Domestic violence was widely tolerated. In London, a 1595 law dictated: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” In other words, no beating your wife after 9:00 p.m. That was a noise regulation. A similar law forbade using a hammer after 9:00 p.m. Beating one’s wife until she screamed was an ordinary and acceptable activity.

Domestic violence was celebrated in popular culture, as in the lively folk song “The Cooper of Fife,” a traditional Scottish tune that inspired a country dance and influenced similar English and American ballads. To modern ears, the contrast between its violent lyrics and upbeat melody is unsettling. The song portrays a husband as entirely justified in beating his wife into submission for failing to perform domestic chores to his satisfaction, inviting the audience to side with the wifebeater and cheer him on.

Sexist laws often empowered men to abuse women. If a woman earned money, her husband could legally claim it at any time. For instance, in 18th-century Britain, a wife could not enter into contracts, make a will without her husband’s approval, or decide on her children’s education or apprenticeships; moreover, in the event of a separation, she automatically lost custody. Mistreatment of women, in other words, long predated industrialization. Arguably, it was the increase in female labor force participation during the Industrial Revolution that ultimately gave women greater economic independence and strengthened their social bargaining power.

“Environmental Harm”

While many of today’s environmental challenges—such as climate change and plastic pollution—differ from those our forebears faced, environmental degradation is not a recent phenomenon. Worrying about environmental impact, however, is rather new. Indeed, as historian Richard Hoffmann has pointed out, “Medieval writers often articulated an adversarial understanding of nature, a belief that it was not only worthless and unpleasant, but actively hostile to . . . humankind.”

Consider deforestation. The Domesday Survey of 1086 found that trees covered 15 percent of England; by 1340, the share had fallen to 6 percent. France’s forests more than halved from about 30 million hectares in Charlemagne’s time (768–814) to 13 million by Philip IV’s reign (1285–1314).

Europe was hardly the only part of the world to abuse its forests. A 16th-century witness observed that at every proclamation demanding more wood for imperial buildings, the peasants of what are today the Hubei and Sichuan provinces in China “wept with despair until they choked,” for there was scarcely any wood left to be found.

Despeciation is also nothing new. Humans have been exterminating wildlife since prehistory. The past 50,000 years saw about 90 genera of large mammals go extinct, amounting to over 70 percent of America’s large species and over 90 percent of Australia’s. 

Exterminations of species occurred throughout the preindustrial era. People first settled in New Zealand in the late 13th century. In only 100 years, humans exterminated 10 species of moa in addition to at least 15 other kinds of native birds, including ducks, geese, pelicans, coots, Haast’s eagle, and an indigenous harrier. Today, few people realize that lions, hyenas, and leopards were once native to Europe, but by the first century, human activity had eliminated them from the continent. The last known aurochs, Europe’s native wild ox, was killed in Poland by a noble hunter in 1627.

Progress Is Real

History bears little resemblance to the sanitized image of preindustrial times in the popular imagination—that is, a beautiful scene of idyllic country villages with pristine air and residents merrily dancing around maypoles. The healthy, peaceful, and prosperous people in this fantasy of pastoral bliss do not realize their contented, leisurely lives will soon be disrupted by the story’s villain: the dark smokestacks of the Industrial Revolution’s “satanic mills.”

A closer look shatters such rose-colored illusions. The world most of our ancestors faced was in fact more gruesome than modern minds can fathom. From routine spousal and child abuse to famine-induced cannibalism and streets that doubled as open sewers, practically every aspect of existence was horrific.

A popular saying holds that “the past is a foreign country,” and based on recorded accounts, it is not one where you would wish to vacation. If you could visit the preindustrial past, you would likely give the experience a zero-star rating. Indeed, the trip might leave you permanently scarred, both physically and psychologically. You might long to unsee the horrors encountered on your adventure and to forget the shocking, gory details.

The upside is that the visit would help deromanticize the past and show how far humanity has truly come—emphasizing the utter transformation of everyday lives and the reality of progress.

This article was published at Big Think on 11/19/2025.

Study Finds | Happiness & Satisfaction

Unplugging Is Making Young Americans Happier

“Half of Americans now deliberately spend less time on screens, and the choice is paying off. People who create screen-free windows in their day say they feel more productive, more present with loved ones, and more aware of what’s happening around them. But here’s the kicker: 70% of time spent online actually leaves people feeling disconnected and lonely rather than connected to others.

Gen Z is driving this shift. Despite growing up with smartphones as extensions of their hands, 63% now intentionally unplug. That’s the highest rate of any generation surveyed. Millennials follow at 57%, then Gen X at 42% and baby boomers at 29%. Digital natives, it turns out, are the first to recognize what all that connectivity is costing them.”

From Study Finds.