Open Societies and Closed Minds | Podcast Highlights

Blog Post | Democracy & Autocracy

Marian Tupy interviews Matt Johnson about historicism, progress, and how tribalism and the “desire for recognition” are testing the foundations of open societies.

Listen to the podcast or read the full transcript here.

Today, I’m very lucky to speak to Matt Johnson, who recently had a fascinating essay in Quillette titled “The Open Society and Its New Enemies: What Karl Popper’s classic can teach us about the threats facing democracies today.”

So Matt, could you tell us who Karl Popper was and what this big book is about?

Popper is mainly known for his philosophy of science, especially his ideas around falsifiability. He published a book called The Open Society and Its Enemies in 1945. He started writing it right after the Nazi annexation of Austria. It’s a very powerful and clarifying set of principles for anybody interested in liberal democracy and the broader project of building open societies around the world today.

So, why talk about liberal democracies and openness? It is our conjecture here at Human Progress that openness is very important. Have you ever thought or written about the connection between openness, liberal democracy, and the scope and speed of human progress?

That’s been a major theme of my work for a long time. I think there is a strong connection between the development of liberal democracy and open societies throughout the 20th century and human progress. Liberal democracy, unlike its authoritarian rivals, has error correction mechanisms built in. It allows for pluralism in society. It allows people to cooperate without the threat of violence or coercion. There’s also the economic element: Liberal democracy facilitates free trade and open exchange because it’s rule-based and law-bound, which are important conditions for economic development.

Human Progress also assumes that there is some directionality in history. We can say that living in 2025 is better than living in 1025 or 25 AD. But you begin your essay by raising the dangers of what Karl Popper called historicism, or a belief in the inevitability of certain political or economic outcomes. Can you unwind that for us? What is the difference between acknowledging the directionality of human history and historicism?

Popper regarded historicism as extremely dangerous because it treats human beings as a means to an end. If you already know what you’re working toward—a glorious worker state or some other utopia—then it doesn’t matter how much pain you have to inflict in the meantime. You’re not treating your citizens as ends whose rights must be protected; you’re treating them as raw material, as characters in this grand historical story.

The second concern is that historicism is anti-scientific because you can hammer any existing data into a form that fits your historicist prophecy.

Marx wrote that the unfolding of history is inevitable. In his view, leaders were just responsible for making that unavoidable transition easier. That’s the central conceit of historicism. If you take a Popperian view, you’re much more modest. You have to ground every policy in empirical reality. You have to adjust when things don’t work. You’re not just birthing a new paradigm you already know everything about. You don’t know what the future holds.

Stalin would say, anytime there was a setback, that it was all part of the same plan. It was all just globalist saboteurs attacking the Soviet Union, or it was some part of the grand historical unfolding moving toward the dictatorship of the proletariat. There’s no sense in which new information can change the course of a government with historicist ideas.

That differs from a general idea of progress. We have a lot of economic data that suggests that people have escaped poverty at an incredible rate since the middle of the 20th century. We’ve seen democratization on a vast scale around the world. We’ve seen interstate relations become much more tranquil and peaceful over the past several decades. I mean, the idea of Germany and France fighting a war now is pretty much inconceivable to most people. That’s a huge historical victory; it’s unprecedented in the history of Western Europe.

So, there are good reasons to believe that we’ve progressed. And that’s the core difference between the observation and acknowledgment of progress and historicism, which is much less grounded in empirical reality.

Right. The way I understand human progress is backward-looking. We can say that we are richer than we were in the past. Fewer women die in childbirth. Fewer infants die. We have fewer casualties in wars, et cetera. But we don’t know where we are going.

Yeah, absolutely. There were moments during the Cold War that could have plunged us into nuclear war. It makes no sense to try to cram every idea into some existing paradigm or prophecy. All we can do is incrementally move toward a better world.

This brings us to another big name in your piece: Francis Fukuyama. Tell me how you read Fukuyama.

Fukuyama is perhaps the most misread political science writer of our time. There are countless lazy journalists who want to add intellectual heft to their articles about some new crisis, and they’ll say, “well, it turns out Fukuyama was wrong. There are still bad things happening in the world.” That’s a fundamental misreading of Fukuyama’s argument. He never said that bad things would stop happening. He never said there would be an end to war, poverty, or political upheaval. His argument was that liberal capitalist democracy is the most sustainable political and economic system, that it had proven itself against the great ideological competitors in the 20th century, and that it would continue to do so in the future.

I think it’s still a live thesis; it hasn’t been proven or disproven. I suppose if the entire world collapsed into totalitarianism and remained that way, then yeah, Fukuyama was wrong. But right now, there’s still a vibrant democratic world competing against the authoritarian world, and I think that liberal democracy will continue to outperform.

You use a phrase in the essay I didn’t quite understand: “the desire for recognition.” What does it mean, and why is it important to Fukuyama?

The desire for recognition is the acknowledgment that human desires go beyond material concerns. We want to be treated as individuals with worth and agency, and we are willing to sacrifice ourselves for purely abstract goals. Liberal democracies are the only systems so far that have met the desire for recognition on a vast scale. Liberal democracies treat people as autonomous, rational ends in themselves, unlike dictatorships, which treat people as expendable, and that’s one of the reasons why liberal democracy has lasted as long as it has.

However, there’s a dark side. Because liberal democracy enables pluralism, people can believe whatever they want religiously and go down whatever political rabbit holes they want to. And, oftentimes, when you have the freedom to join these other tribes, you find yourself more committed to those tribes than to the overall society. If you’re a very serious Christian nationalist, you might want society organized along the lines of the Ten Commandments because that, in your view, is the foundation of morality. So, pluralism, which is one of the strengths of liberal democracy, also creates constant threats that liberal democracy has to navigate.

I noticed in your essay that you are not too concerned. You note that democracy is not in full retreat and that, if you look at the numbers, things are not as dire as they seem. What is the argument?

If you just read annual reports from Freedom House, you would think that we’re on our way to global authoritarianism. However, if you take a longer historical view, even just 80 years versus 20 years, the trend line is still dramatically in favor of liberal democracies. It’s still an amazing historical achievement. It’s getting rolled back, but in the grand sweep of history, it’s getting rolled back on the margins.

Still, it’s a dangerous and frightening trend. And you’re in a dangerous place when you see a country like the United States electing a president who is expressly hostile toward the transfer of power after four years. So, the threats to democracy are real, but we need to have some historical perspective.

So, we are more liberally democratic than we were 40 years ago, but something has happened in the last 15 to 20 years. Some of the trust and belief in liberal democracy has eroded.

How is that connected to the issue of recognition?

In the United States, if you look at just the past five or six years, there has been a dramatic shift toward identity politics, which is a form of the desire for recognition.

On the left, there was an explosion of wokeness, especially in 2020, where there was a lot of authoritarianism. People were shouted down for fairly anodyne comments, and editors were driven out of their roles. And on the right, there’s this sense that native-born Americans are more completely American than other people. All of these things are forms of identity politics, and they privilege one group over another and drive people away from a universal conception of citizenship. That’s one of the big reasons why people have become less committed to pluralism and the classic American idea of E pluribus unum.

Have you ever thought about why, specifically after 2012, there was this massive outpouring of wokeness and identity politics? Some people on the right suggest that this is because America has begun to lose religion, and, as a consequence, people are seeking recognition in politics.

I think it could be a consequence of the decline of religion. I’ve written a lot about what many people regard as a crisis of meaning in Western liberal democracies. I think, to some extent, that crisis is overblown. Many people don’t need to have some sort of superstructure or belief system that goes beyond humanism or their commitment to liberalism or what have you.

However, I also think that we’re inclined toward religious belief. We search for things to worship. People don’t really want to create their own belief systems; they would rather go out there and pick a structure off the shelf. For some, it’s Catholicism or Protestantism, and for others, it’s Wokeism or white identity politics. And there were elements of the woke explosion that seemed deeply religious. People talked about original sin and literally fell on their knees.

We also live in an era that has been, by historical standards, extremely peaceful and prosperous, and I think Fukuyama is right that people search for things to fight over. The more prosperous your society is, the more you’ll be incensed by minor inequalities or slights. The complaints you hear from people today would be baffling to people one hundred years ago.

I also think the desire for recognition gets re-normed all the time. It doesn’t really matter how much your aggregate conditions have improved; when new people come into the world, they have a set of expectations based on their surroundings. And it’s a well-established psychological principle that people are less concerned about their absolute level of well-being than their well-being relative to their neighbors. If you see your neighbor has a bigger house or bigger boat, you feel like you’ve been cheated. And this is also the language that Donald Trump uses. It’s very zero-sum, and he traffics in this idea that everything is horrible.

You raised a subject that I’m very interested in, which is the crisis of meaning. I don’t know what to make of it. Everybody, including people I admire and respect, seems to think there is a crisis of meaning, but I don’t know what that means.

Is there more of a crisis of meaning today than there was 100 years ago or even 50 years ago? And what does it really mean? Have you thought about this issue?

You’re right to question where this claim comes from. How can people who claim there is a crisis of meaning see inside the minds of the people who say that they don’t need religion to live a meaningful life? There’s something extremely presumptuous there, and I’m not sure how it’s supposed to be quantified.

People say, well, look at the explosion of conspiracism and pseudoscience. And there are people who’ve become interested in astrology and things like that. But humanity has been crammed with pseudoscience and superstition for as long as we’ve been around. It’s very difficult to compare Western societies today to the way they were a few hundred years ago when people were killed for blasphemy and witchcraft.

And look at what our societies have accomplished in living memory. Look at the vast increase in material well-being, the vast improvements in life expectancy, literacy, everything you can imagine. I find all that very inspiring. I think if we start talking about democracy and capitalism in that grander historical context, then maybe we can make some inroads against the cynicism and the nihilism that have taken root.

Blog Post | Water & Sanitation

If You Think New York City Life Is Bad Now

A grim tour of preindustrial New York

Summary: Many people today feel that life in New York has become uniquely difficult. Some imagine that the city was cleaner, safer, and more livable in the distant past. Historical reality tells a different story: Preindustrial New York was marked by extreme filth, unsafe water, rampant disease, pervasive poverty, and living conditions that made everyday life harsh and dangerous compared to contemporary times.


Discontent fueled the 2025 New York City mayoral election and Zohran Mamdani’s victory. A common theme echoed across the five boroughs: New York is a hard place to live. “We are overwhelmed by housing costs,” said Santiago, a 69-year-old retiree, outside a Mamdani rally. Those opposed to Mamdani had their own complaints. María Moreno, a first-time voter from the Bronx who supported Andrew Cuomo, lamented, “Now everything’s dirty, and our neighborhood does not feel safe.”

Today’s voters have legitimate grievances. The city’s housing costs, quality-of-life issues, and perceptions of disorder weigh heavily on residents’ minds. But it’s important to keep things in perspective. Different voters may romanticize different eras, but many seem to share a sense that if they could travel back far enough in time, they’d find a New York that was clean, safe, and affordable. When Americans were polled in 2023, almost 20 percent said that it was easier to “have a thriving and fulfilling life” hundreds of years ago. Across the country, as one writer put it, people are engaged in an “endless debate around whether the preindustrial past was clearly better than what we have now.” In fact, Mamdani’s politics are grounded in an ideology that first arose from the frustrations of the early industrial era.

If Americans could go back in time to preindustrial New York City, however, they’d likely be horrified and possibly traumatized. Despite today’s real challenges, most New Yorkers would not trade places with their predecessors.

Long before the rise of factories and industry, New York City was a bustling port, founded by the Dutch as New Amsterdam in order to trade furs in the early seventeenth century. As early as 1650, local authorities enacted an ordinance against animals roaming the streets to protect local infrastructure—but to no avail. Then, in 1657, according to the Dutch scholar Jaap Harskamp:

New Amsterdam’s council attempted to ban the common practice of throwing rubbish, ashes, oyster-shells or dead animals in the street and leave the filth there to be consumed by droves of pigs on the loose. When the English took over the colony from the Dutch, pigs and goats stayed put. . . . Pollution persisted. The streets of Manhattan were a stinking mass. Inhabitants hurled carcasses and the contents of loaded chamber pots into the street and rivers. Runoff from tanneries where skins were turned into leather flowed into the waters that supplied the shallow wells. The (salty) natural springs and ponds in the region became contaminated with animal and human waste. For some considerable time, access to clean water remained an urgent problem for the city. . . . The penetrating smell of decomposing flesh was everywhere.

Into the early twentieth century, urban living in the United States felt surprisingly rural and agrarian, with an omnipresent reek to match. As late as the mid-nineteenth century, pigs roamed freely through New York City streets, acting as scavengers, and nearly every household maintained a vegetable garden, often fertilized with animal manure.

Indoor air quality was no better. A drawing from Mary L. Booth’s History of the City of New York depicts a seventeenth-century New Amsterdam home with smoke from the fireplace swirling through the room. Indoor air pollution remains a serious problem today in the poorest parts of the world, as smoke from hearths can cause cancer and acute respiratory infections that often prove deadly in children. One preindustrial writer railed against the “pernicious smoke [from fireplaces] superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.”

That said, although inescapable filth coated the interiors of preindustrial homes, the average person owned few possessions for the corrosive hearth smoke and soot to ruin. By modern standards, New Yorkers—like most preindustrial people—were impoverished and lacked even the most basic amenities. According to historian Judith Flanders, in the mid-eighteenth century, “fewer than two households in ten in some counties of New York possessed a fork.” Many were desperately poor even by the standards of the day and could not afford housing. One 1788 account lamented how in New York City, “vagrants multiply on our Hands to an amazing Degree.” Charity records suggest that the “outdoor poor” far outnumbered those in almshouses.

Water quality was infamously awful. In seventeenth-century New Amsterdam, as Benjamin Bullivant observed, “[There are] many publique wells enclosed & Covered in ye Streetes . . . [which are] Nasty & unregarded.” A century later, New York’s water remained as foul as Bullivant had described. Visiting in 1748, the Swedish botanist Peter Kalm noted that the city’s well water was so filthy that horses from out of town refused to drink it. In 1798, the Commercial Advertiser condemned Manhattan’s main well as “a shocking hole, where all impure things center together and engender the worst of unwholesome productions; foul with excrement, frogspawn, and reptiles, that delicate pump system is supplied. The water has grown worse manifestly within a few years. It is time to look out [for] some other supply, and discontinue the use of a water growing less and less wholesome every day. . . . It is so bad . . . as to be very sickly and nauseating; and the larger the city grows the worse this evil will be.”

In 1831, a letter in the New York Evening Journal described the state of the water supply:

I have no doubt that one cause of the numerous stomach affections so common in this city is the impure, I may say poisonous nature of the pernicious Manhattan water which thousands of us daily and constantly use. It is true the unpalatableness of this abominable fluid prevents almost every person from using it as a beverage at the table, but you will know that all the cooking of a very large portion of the community is done through the agency of this common nuisance. Our tea and coffee are made of it, our bread is mixed with it, and our meat and vegetables are boiled in it. Our linen happily escapes the contamination of its touch, “for no two things hold more antipathy” than soap and this vile water.

In 1832, New York experienced a devastating outbreak of cholera, a bacterial disease that typically spread through contaminated water and killed with remarkable speed. A person could wake up feeling well and be dead by nightfall, struck down with agonizing cramps, vomiting, and diarrhea. The epidemic killed about 3,500 New Yorkers.

The initial actions taken to protect city water supplies were often private in nature. In fact, throughout the eighteenth and early nineteenth centuries, private businesses generally supplied urban water infrastructure. Despite such efforts, drinking water remained generally unsafe, even after industrialization, until the chlorination of urban water supplies became widespread.

The pervasive grime took a visible toll on New Yorkers. Between drinking tainted water, eating contaminated food, inhaling smoke-filled air, and living with poor hygiene, the average resident sported visibly rotten teeth. One letter from 1781 described an acquaintance: “Her teeth are beginning to decay, which is the case with most New York girls, after eighteen.”

The dental practices of the time were often as horrifying as the effects of neglect. The medieval method of using arsenic to kill gum tissue, providing pain relief by destroying nerve endings, remained common until the introduction of Novocain in the twentieth century. As late as 1879, the New York Times ran a story with the headline “Fatal Poison in a Tooth; What Caused the Horrible Death of Mr. Gardiner. A Man’s Head Nearly Severed from His Body by Decay Caused by Arsenic Which Had Been Placed in One of His Teeth to Deaden an Aching Nerve—an Extraordinary Case.” The story detailed the gruesome demise of a man in Brooklyn, George Arthur Gardiner, who died “in great agony, after two weeks of indescribable suffering.”

Preindustrial New York City wasn’t uniquely miserable for its time. Life was harsh everywhere, and cities around the world contended with the same foul smells, filth, poor sanitation, and grinding poverty. Rural villages were no better. Peasant families often brought their livestock indoors at night and slept huddled together for warmth. In many cases, rural peasants were even poorer than their urban counterparts and owned fewer possessions. Farm laborers frequently suffered injuries and aged prematurely from backbreaking work, while fertilizing cesspits spread disease and filled the air with an inescapable stench.

Though early New Yorkers may have been slightly better off than their rural counterparts, their struggles are worth remembering. However daunting the problems of today may seem, a proper historical perspective can remind us of how far we’ve come.

This article was originally published in City Journal on 1/13/2026.

Blog Post | Wellbeing

Meaning and Morality in the Modern Age | Podcast Highlights

Marian Tupy interviews Steven Pinker about the so-called “crisis of meaning,” the decline of religion, and what can give life purpose in a modern, largely secular world.

Listen to the podcast or read the full transcript here.

Today, I’m pleased to have with me Steven Pinker, a world-renowned Harvard University psychologist and author of best-selling books including The Blank Slate, The Better Angels of Our Nature, Enlightenment Now, and of course, most recently, When Everyone Knows That Everyone Knows. Highly recommend all of them.

Let’s start at a high level and look at how Americans think about the country. Gallup shows that 80 percent of Americans are either satisfied or very satisfied with their lives, but only 20 percent are satisfied with the way that America is going. That’s a bit of a discrepancy.

What does a psychologist have to say about that?

It’s a fascinating phenomenon that pollsters have known about for decades. They call it “the optimism gap.” It appears in just about any question.

“What is the quality of education in this country?”

“It’s terrible.”

“What’s the quality of your child’s school?”

“Well, not bad.”

“How safe is the country?”

“Oh, you can’t walk anywhere. You’ll get mugged.”

“How safe is your neighborhood?”

“Oh, I feel perfectly fine.”

Part of it is that, because none of us can experience the entire country ourselves, our opinions are based on media coverage, and the media have a number of negativity biases. The nature of news selects for negative events because it reports what’s new and discrete enough to be a story. New, discrete events are more likely to be bad than good because there are many more ways for things to go wrong than for things to go right. And while bad things, like a terrorist attack or natural disaster, can happen quickly, positive things tend to be things that don’t happen or things that happen gradually, like the long-term decline in extreme poverty, the rise in literacy, and many other trends that you’ve written about.

Editors also feel more responsible if they emphasize negative stories over positive ones. I’ve heard one editor say, “Well, negative news is journalism, and positive news is advertising.” I think it was Stewart Brand who once said, more generally, that a pessimist sounds like he’s trying to help you, while an optimist sounds like he’s trying to sell you something. So, our picture of the country and the world as a whole is distorted both deliberately and accidentally by the very nature of news.

Let me mention one other thing. There really are problems in the world, to put it mildly, and some things have gotten worse in the last 10 or 20 years. But one has to have a quantitative, statistical, probabilistic view of the world to acknowledge the reality that things can get worse while still being better than they were historically, and that some things can get worse while other things are getting better.

You don’t conclude from something that genuinely has gotten worse that everything has gotten worse or that we’re in a worse situation now than we ever have been.

You mentioned literacy. Recently, I’ve been reading about freshmen entering university without basic reading and math skills. People are reading fewer books. Are we getting dumber, and is education an example of something that is worse than it was 40 or 50 years ago?

Yes, and it’s not the only example. The world’s democracy score has gone down in the last couple of decades. War deaths are worse now than they were 20 years ago, although still better than they were in the ’60s, ’70s, ’80s, and most of the ’90s. But yeah, educational scores have gone down. The Flynn effect, by which IQ scores rose by about three points per decade for almost a century, has now gone in the other direction.

Now, that doesn’t mean that we’re back to the level that we were 100 years ago, but there’s been a bit of a droop. It may be that there are pathologies in our educational system, that the drive for equity and especially for equity across all racial groups has led to bringing down the top rather than raising the bottom. It could be that our schools of education have been training teachers to use the wrong methods. There’s also the fact that, while reading and literacy are good things, they are cognitively unnatural. We didn’t evolve with print; it’s a recent invention, and we’ve seen, especially in the last 10 years, that a lot of people prefer listening and watching to reading. Thanks to the massive availability of video, people may no longer be putting the effort into developing literacy, which we have reason to believe was one of the drivers of the Flynn effect and of cognitive sophistication in general.

My understanding is that the decline of reading and math scores is most severe at the low end. The smart students have not declined much, but weaker students have. So, it is a problem, and I think it’s a problem that ought to be addressed.

When it comes to the decline in reading books, there may be one other factor: the optimal length of a work of text may no longer be a book. I have found that, as a curious person, I can get lost in reading about things on Wikipedia like the history of the potato chip or transatlantic travel or planets. There’s just a flood of information out there and it’s all really interesting. And I say this with some embarrassment because I write books, and sometimes very long books, but for some kinds of information, it may be that a book has diminishing returns.

Let’s now look at other criticisms of human progress.

You and I had an article in The Free Press pushing back against the “crisis of meaning.” Have you ever seen any hard evidence suggesting that people’s lives are more meaningless in rich countries versus poor countries or that lives are less meaningful today than they used to be?

No, I haven’t.

We don’t have survey data on “How meaningful do you think life is?”, but meaning and happiness seem to be partially correlated. So, in general, people who are happier say their lives are more meaningful. But some sources of meaning are not the same as sources of happiness, and vice versa. Just to give a couple of examples, if you’re dedicating your life to some cause, there can be setbacks and frustrations that make you less happy, but you say your life is more meaningful compared to a life of pleasure and leisure. Time spent with friends is more pleasurable, while time spent with family is more meaningful. So, meaning and happiness are not perfectly correlated, but they are partially correlated.

Over the course of history, if you look at the whole range of countries, there has been more of an increase in happiness than a decrease. In countries that are very affluent, like the United States, there has not been an increase in happiness. We may be close to the ceiling. But overall, across the world, there’s reason to believe that happiness has increased, so that would suggest but not prove that there has not been a decline in meaningfulness.

Anecdotally, there have been complaints that life is meaningless as far back as you go. Ecclesiastes: “Vanity of vanities, all is vanity.” Henry David Thoreau in 1854: “The mass of men lead lives of quiet desperation.” T.S. Eliot, 1920s: “We are the hollow men, we are the stuffed men.” So, it’s a constant complaint, and the fact that people say it doesn’t necessarily mean it’s true. It’s always tempting to think that life is meaningless. We like to think that there is a plan to the universe, and we get disillusioned when we find out there isn’t one. The laws of nature don’t tell any story with an ending. There are things built into the evolutionary process that guarantee that life is going to appear meaningless. There’s the law of entropy. Things fall apart and decay. We die, we get older, we weaken. Even our closest relationships are never perfect.

Now, I think the answer to that is to focus on human purposes, like not dying young, not getting shot, knowing more, experiencing art and culture, experiencing friendship, and seeing the world. But one has to reorient and realize that those are the goals of life and not expect that the universe itself tells a satisfying story.

People often look at proxies for meaning, such as anxiety and suicide. There seems to be some evidence that rich countries have higher rates of anxiety than poor countries. Of course, definitions can change and expand. Trauma used to mean being bombed by the Germans; today, it may be that you are breaking up with your boyfriend or girlfriend.

Do you have any sense as to how reliable the data on anxiety and trauma is?

There’s certainly been some diagnostic category creep. I’ve seen this in my own students. There’s an eagerness to diagnose oneself, sometimes with bogus diagnoses like autism for introversion. There’s a funny kind of cachet to having a pathology. But looking retrospectively at surveys, I think there probably has also been, on top of that, some increase in anxiety since the late 1950s.

Some of that may be that we’re taking on more responsibilities and adding to our anxiety burden. When I think back to my parents in the 1950s, there were a lot of things that they just never thought about. Are they getting enough exercise? Are they exposing themselves to skin cancer risk by going out in the sun? The state of the climate, inequality. Most people didn’t think about these things.

Jean Twenge and Jon Haidt have been trying to make the case that social media, especially through smartphones, has led to a genuine rise in anxiety, particularly in younger people. There’s some controversy there over cause and effect—maybe anxious and depressed kids turn to social media—but there seems to be at least some evidence that suggests causation.

Let me offer our listeners what I consider to be the strongest argument in favor of rational optimism.

The clearest sign of unhappiness is when you kill yourself. Here in the United States, we’ve had an increase in suicides, but suicides are dropping in most, if not all, other rich countries. So, it seems there is a particular American pathology rather than a general pathology in prosperous countries. What’s wrong with this argument?

When I report on violence, I usually concentrate on homicide, simply because homicide is the most objective measure of violence. A dead body is hard to argue away, and people record homicides pretty accurately, so it’s the best indicator of violence. By extension, one might think that suicide would be the best indicator of unhappiness. But, partly to my surprise, that doesn’t seem to be right.

There is more ambiguity in how officials record suicide deaths. For example, where there's a stigma against suicide, deaths are often classified as accidents. Also, as best we can tell, there's not a strong correlation between the suicide rate and national unhappiness. There's even what some researchers call the suicide-unhappiness paradox: countries where people are happier can sometimes have higher suicide rates, partly for the same reason that suicide rates increase around Christmas: if you look around and everyone is happy and you're not, you really feel like a loser.

Suicide rates are also driven by contagion and by how easy it is to commit suicide. I quote the rather macabre poem by Dorothy Parker: "Guns aren't lawful, nooses give, gas smells awful, you might as well live." Suicide went way down in Britain when they changed the composition of cooking gas from coal gas to methane, which is not toxic. In developing countries, access to pesticides, a common method of suicide, has a big effect on actual rates. And in the United States, the availability of guns seems to be one of the drivers.

So, there are a lot of puzzles with suicide rates. But generally, I think it's important to point out, as you do, that suicide rates are actually dropping globally, especially in poorer countries, but also in many rich countries. The United States is something of an anomaly. Since the 1990s, when the Global Burden of Disease project began to collect data, suicide has gone down by about 40 percent. A lot of that is thanks to urbanization. When a woman is put into an arranged marriage and leaves her village for the village of her husband, where she is dominated by her in-laws and has no friends and no way of escaping, that leads to a lot of suicides. In a more modern urban culture where you have more freedom, there's less desperation. So globally, modernization and urbanization have led to falling suicide rates.

Even in the United States, suicide rates went down until the mid-to-late 1990s. That was a low point, and they've been rising since then, but it's not as if they've been inexorably rising over the last century.

Those are very good caveats, thanks for introducing that nuance.

One thing that you and I discussed in our Free Press article was the criticism that meaninglessness in the West is driven in part by falling religiosity. A defender of religion might say that religion is essentially a cognitive or cultural technology for producing responsibility, happiness, restraint, and gratitude. So, if you remove religion, you may be making people more irresponsible, more unhappy, less restrained, and less grateful.

What do you think about that argument?

There is a need for community institutions and organizations that bring people together, that discuss meaning and morality, and that are a locus for collective action. The problem is that if you bundle that with theology, miracles, scripture, and invisible agents, it just isn’t going to be convincing anymore.

Religion wasn’t taken away from people; people left religion. In every developed country, there’s been a move away from organized religion. The churches are still around, and no one’s stopping people from attending; they just don’t find that religion gives them meaning and purpose. This is partly because the institutions themselves have not been sources of morality or meaning. The Roman Catholic Church with its sex abuse scandals, evangelical Protestantism in the United States with its embrace of far-right politics, the subordinate role of women in the more conservative religions like Orthodox Judaism—these are just turn-offs.

I'm going to quote G. K. Chesterton, who is supposed to have said that when men stop believing in God, they don't believe in nothing; they believe in anything. A 2021 national survey found that young Americans are more likely than older Americans to believe in witchcraft, luck, black magic, and spell casting.

What do you make of the argument that Christianity keeps the belief in black magic and witchcraft at bay?

A few things. The witch hunts of the 16th century were a Christian movement. I mean, "Thou shalt not suffer a witch to live" is in the Christian Bible. I also think Chesterton was wrong about the idea that people who are less religious are more open to astrology, ESP, the paranormal, crystal healing, and other kinds of New Age woo-woo. I don't think it's true as a general correlation.

The data that you cite on openness to paranormal beliefs is interesting. I’ve never reported this, but I’ve looked at trends in the belief in devils, ESP, precognition, curses, and all kinds of paranormal things. As best as I can tell, it’s been pretty flat since the 1970s.

Something to be aware of is that there are different ways in which societies can change, and quantitatively, it's not always easy to tell them apart. There can be a cohort effect, where each generation carries the beliefs it formed in youth as it ages and replaces the generation before it; a period effect, where everyone changes their beliefs at the same time; or a life-cycle effect, where people change their beliefs as they age. As best I can tell, what you cited is largely an age effect. Younger people are more open to woo-woo and magic than older people. So, I think those data are correct but don't necessarily mean that societies have become more open to the paranormal.

One way or another, there is a sizable chunk of the population that is attracted to the supernatural or transcendental, the so-called God-shaped hole in the human heart. Critics say that irreligious people are offering a meaningless, cold universe without a purpose, and that people really need some form of transcendence to make sense of their lives.

What do you think of that argument?

I think it’s literally wrong in the sense that people’s craving for meaning and purpose isn’t shaped like a God. In fact, that argument is sometimes used to explain the rise of wokeness, that religion was replaced with the idea that differences between groups are a moral emergency, and you have to find the oppressors responsible and punish them. There’s no God in any of that.

Granted, many people do search for transcendence, but kids like to believe in Santa Claus, and that belief doesn't have to be indulged. Kant's definition of the Enlightenment was man's emergence from his self-imposed immaturity. Part of growing up involves some hard lessons, like that the universe is a cold place and it doesn't care about you. That does not mean life is meaningless, because the fact that the universe doesn't care about you doesn't mean that other humans don't care about you or that we don't have to care about other humans. We have a purpose, which is to make people as well off as possible: to increase flourishing, to increase knowledge, life, health, freedom, and safety. These are really meaningful goals that I don't think should leave you empty.

Without religion, what is the basis of morality? Where does morality come from if not from man being created in the image of God?

Well, man being created in the image of God doesn't give you a whole lot of morality. If you look at the Old Testament, God commands the Israelites to rape, massacre, and mutilate their enemies, while there are religious prohibitions against mixing wool and linen, lighting a fire on Saturday, and other crazy stuff that has nothing to do with morality as we could argue for it.

Instead, I think the obvious source of morality is some kind of Golden Rule. The way we teach kids to be moral is to say, "How would you like it if someone did that to you?" The logical basis of morality is that, as long as I'm not the galactic overlord and my fate depends on other people, I've got to agree to some sort of social contract that treats us as equivalent. That's why versions of the Golden Rule have been independently discovered by many different cultures.

Here’s the most common counterargument I hear to that point of view: it is very well for an intelligent professor who reads a lot of books to derive moral principles from reciprocity, reason, and self-interest, but ordinary people don’t think like that.

What's wrong with just picking an oven-ready set of moral norms off the shelf, like those presented by modern Christianity, which have been made more humane over time? That way, you don't have to do much thinking, for which you may not have the time or the ability.

Well, I think that could be a means to an end, but one must keep in mind what the end is, which is humanistic morality that we can justify. As we know, religions can contain off-the-shelf moralities such as “kill anyone who insults the prophet Muhammad,” “execute blasphemers or gay people,” or “thou shalt not suffer a witch to live.”

Now there are religions guided by humanistic, Enlightenment, universalist principles, such as some of the liberal Protestant denominations and Reform Judaism. I don't oppose keeping some symbolism and ritual if the institution has moved in a humanistic direction. Maybe that would be a good thing.

A somewhat different criticism of progress has to do with status competition, essentially the idea that no matter how much things get better, ultimately, as you once again put it in your book, men don’t contend with the dead but with the living.

Are our efforts at Human Progress bound to fail because people care about relative rather than absolute improvements in life?

I love that Hobbes quote. He introduces it by saying there’s a natural reverence for antiquity because men contend with the living, not with the dead. That is, intellectuals and moralists will tend to revere earlier eras and bemoan the present era because complaining about the present is another way of complaining about your contemporaries, who are your rivals. That’s another reason there is a negativity bias.

That’s an aside on elite status competition, but we all compare ourselves to others. So, in that sense, there won’t ever be a utopia. People will always compare themselves to others and be less happy than they ought to be. Still, it’s worth working toward progress. Even if you’re a spoiled first-world brat, it’s still better that you live to 80 instead of 55. It’s still better that your kids don’t die. It’s still better to travel the world instead of being confined to your village.

There's a quote on my wall from the economist Richard Layard that reads, "One secret of happiness is to ignore comparisons with people who are more successful than you are. Always compare downwards, not upwards."

How do we go about explaining to people that it’s okay that there is always going to be somebody who is taller, smarter, and more handsome than you are?

You’re right that this is a piece of wisdom we’d be better off having, but it’s not easy to engineer. Some features of culture are very bottom-up. They can be influenced by education and by the messages that we give children, but no one’s really in charge; it’s the result of millions of people interacting with each other every day. However, we shouldn’t abdicate our responsibility for what we teach kids. We can do our part and try to nudge them in the right direction.

The Human Progress Podcast | Ep. 76

Meaning and Morality in the Modern Age

Steven Pinker joins Marian Tupy to discuss the so-called "crisis of meaning," the decline of religion, and what can give life purpose in a modern, largely secular world.
