
The Economist | Happiness & Satisfaction

Young Americans Are Getting Happier

“American youth are in the midst of a mental-illness epidemic. Few know this better than Daniel Eisenberg. In 2007 the UCLA health-policy professor, then at the University of Michigan, sent a mental-health survey to 5,591 college students and found that 22% showed signs of depression. Over the next 15 years as new students were polled this figure grew. In 2022, when more than 95,000 students at 373 universities were surveyed, a staggering 44% displayed symptoms of depression. Then, curiously, the trend reversed. In 2023 41% of students seemed depressed; in 2024, the figure fell again to 38%. Mr Eisenberg is cautiously optimistic. ‘It’s the first time that things are moving in a positive direction.'”

From The Economist.

Blog Post | Communications

We’re Living in a Split-Screen America

The evolution from broadcast news to personalized feeds has fractured how we see the world, but progress is possible.

Summary: Americans once shared a common media landscape, but the rise of personalized digital feeds has splintered that reality into partisan echo chambers. Social platforms now amplify outrage, reinforce tribal instincts, and erode agreement on basic facts. While there is no easy fix, reforms in design, digital literacy, and cultural norms offer hope for a more truthful and united public discourse.


“And that’s the way it is.” At least, that’s the way it was. When Walter Cronkite closed his nightly broadcasts with those words, America was a foreign country. At the height of broadcast news, Americans had differences of opinion but agreed on a basic set of facts about what was going on in the country and the world. Anchors like Cronkite, voted in 1972 by Democrats and Republicans alike as the most trusted man in America, aimed to be impartial and to win bipartisan credibility. But as partisan cable news and talk radio came to prominence in the 1990s, basic agreement on the facts began to erode. And with the rise of social media, it splintered entirely.

Platforms like Facebook, YouTube, TikTok, and Twitter personalize content to maximize engagement (time spent on an app, posts liked and shared), showing you what you want to see. That reinforces users’ existing beliefs and limits exposure to opposing views. Strikingly, a Meta-commissioned study of 208 million users during the 2020 U.S. election cycle showed that liberals and conservatives on Facebook encountered almost entirely non-overlapping news sources. Once a social media user spends time looking at political content on one of these platforms, he or she is fed more and more of the same. Far from the broadcasts of the mid-century, modern news is delivered via increasingly bespoke “narrowcast.”

This political siloing is not trivial. Americans now inhabit split-screen realities. In one 2023 Gallup poll, 90 percent of Republicans believed crime was rising, while 60 percent of Democrats believed it was falling. On climate change, a 2021 survey showed a 56-point partisan gap in beliefs about whether humans have a serious impact on the climate system (compared to a 16-point gap in 2001). In 2024, 44 percent of Democrats rated the national economy as “excellent or good,” compared to only 13 percent of Republicans, despite the same underlying economic conditions. The gap wasn’t driven by personal finances, but by partisan interpretations of identical economic indicators. These are not differences of opinion; they are incommensurable beliefs about the state of the world.

But platforms don’t just feed us headlines that align with our politics. They also bait our strongest emotions. In 2017, Facebook began weighting “angry” reactions five times more heavily than “likes” when floating posts to the top of our feeds. That same year, a study found that each additional moral-emotional word in a tweet (think “shameful,” “detestable,” “evil”) significantly increased the likelihood of it being shared and reshared.
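
To make the effect of that weighting concrete, here is a minimal, purely illustrative sketch in Python of how ranking posts by unevenly weighted reactions can push emotionally charged content to the top of a feed. The weights, field names, and example posts are hypothetical assumptions for illustration, not any platform’s actual ranking code.

```python
# Toy model of reaction-weighted feed ranking (illustrative only; the weights
# and data below are hypothetical, not any platform's real values).
REACTION_WEIGHTS = {"like": 1, "love": 1, "angry": 5}

def rank_score(post):
    """Sum each reaction count times its weight; higher scores surface first."""
    return sum(REACTION_WEIGHTS.get(reaction, 1) * count
               for reaction, count in post["reactions"].items())

posts = [
    {"id": "calm_report",  "reactions": {"like": 900, "angry": 10}},   # score 950
    {"id": "outrage_bait", "reactions": {"like": 200, "angry": 300}},  # score 1700
]

# Despite drawing far fewer total reactions, the angrier post ranks first.
for post in sorted(posts, key=rank_score, reverse=True):
    print(post["id"], rank_score(post))
```

In this toy example, the angrier post outranks one with nearly twice as many total reactions, which is the dynamic described above.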

This platform design calls up ancient instincts. Humans evolved to detect threats to the coalition, to signal our group loyalty, and to rally allies against rivals. A tweet calling someone “abhorrent” isn’t just an opinion; it’s a tribal call to action. And because these platforms so reliably elicit our ire and impel us to spread it to others, they’ve become outrage engines.

They create sealed chambers that echo our anger, where contrary evidence is unlikely to penetrate. Carl Sagan sounds prescient now: in 1995 he warned of a future in which Americans, embedded in an information economy, would become “unable to distinguish between what feels good and what’s true,” leaving society vulnerable to illusion and manipulation.

And the consequences of the outrage engines don’t stop at our borders. In 2016, Russian operatives used fake personas on Facebook and Twitter to spread inflammatory memes targeting both liberals and conservatives. They didn’t need to hack anything. They simply exploited an information ecosystem already optimized for spreading partisan outrage.

What can be done? There is no single fix, but meaningful improvements are possible.

In a randomized study, older adults who received just one hour of digital literacy training from MediaWise improved their ability to tell false headlines from real ones by 21 percentage points. When Twitter added a prompt asking users if they wanted to read an article before retweeting it, people were 40 percent more likely to click through to the article before sharing it impulsively.

Choice helps too. In one study, switching users from a feed that had been personalized by the algorithm to one that showed posts in chronological order measurably increased their exposure to content across the political aisle. While it may not be a silver bullet, giving users the ability to choose their feed structure, including which algorithm to use, allows for opportunities to be exposed to contrary opinions and to peer outside the echo chamber.

But deeper change is cultural. A compelling case has been made that human reasoning evolved not to uncover objective truth, but to persuade others, to justify our own ideas, and to win arguments. That is why the habits of sound reasoning must be cultivated through norms that prize truth over tribal loyalty, deliberation over impulsivity, and the ability to make the best case for opposing views so that they can be contested on their merits.

This isn’t a call for censorship or government control of the news, nor is it a plea to go back to three-network broadcasting. The democratization of media has brought real benefits, including broader participation in public discourse and greater scrutiny of powerful institutions. But it has also made public life more combustible and has manufactured disagreements about factual questions. In a competition for attention, platforms are designed to maximize time spent on them. That means elevating content that provokes strong emotional responses, especially outrage, and targeting it toward the users most likely to react. The more incendiary the content, the more likely it is to hold us captivated.

What we are witnessing is not a failure of the market, but a particularly efficient version of it, albeit one that optimizes for attention, not accuracy. Personalized feeds, algorithmic curation, and viral content are giving people more of what they want. And yet, many Americans say they are dissatisfied with the result. In a 2023 Pew survey, 86 percent of U.S. adults said they believe Democrats and Republicans are more focused on fighting each other than solving real problems, and respondents across party lines cited political polarization as the biggest problem with the political system.

While online outrage bubbles may not qualify as a market failure in the technical sense, they are clearly a civic problem worth confronting. An information ecosystem optimized for attention rather than accuracy will reliably amplify division and distrust, even while giving users more of what they like to see and share. The incentives are working as designed, but the outcome is a fragmented public unable to agree on the real state of the world. If democracy depends on a shared understanding of basic facts of the matter, then reckoning with these tradeoffs is well worth our much-demanded attention.

Blog Post | Democracy & Autocracy

Open Societies and Closed Minds | Podcast Highlights

Marian Tupy interviews Matt Johnson about historicism, progress, and how tribalism and the “desire for recognition” are testing the foundations of open societies.

Listen to the podcast or read the full transcript here.

Today, I’m very lucky to speak to Matt Johnson, who recently had a fascinating essay in Quillette titled “The Open Society and Its New Enemies: What Karl Popper’s classic can teach us about the threats facing democracies today.”

So Matt, could you tell us who Karl Popper was and what this big book is about?

Popper is mainly known for his work in the philosophy of science, especially his ideas about falsifiability. He published a book called The Open Society and Its Enemies in 1945. He started writing it right after the Nazi annexation of Austria. It’s a very powerful and clarifying set of principles for anybody interested in liberal democracy and the broader project of building open societies around the world today.

So, why talk about liberal democracies and openness? It is our conjecture here at Human Progress that openness is very important. Have you ever thought or written about the connection between openness, liberal democracy, and the scope and speed of human progress?

That’s been a major theme of my work for a long time. I think there is a strong connection between the development of liberal democracy and open societies throughout the 20th century and human progress. Liberal democracy, unlike its authoritarian rivals, has error correction mechanisms built in. It allows for pluralism in society. It allows people to cooperate without the threat of violence or coercion. There’s also the economic element: Liberal democracy facilitates free trade and open exchange because it’s rule-based and law-bound, which are important conditions for economic development.

Human Progress also assumes that there is some directionality in history. We can say that living in 2025 is better than living in 1025 or 25 AD. But you begin your essay by raising the dangers of what Karl Popper called historicism, or a belief in the inevitability of certain political or economic outcomes. Can you unwind that for us? What is the difference between acknowledging the directionality of human history and historicism?

Popper regarded historicism as extremely dangerous because it treats human beings as a means to an end. If you already know what you’re working toward—a glorious worker state or some other utopia—then it doesn’t matter how much pain you have to inflict in the meantime. You’re not treating your citizens as ends whose rights must be protected; you’re treating them as raw material, as characters in this grand historical story.

The second concern is that historicism is anti-scientific because you can hammer any existing data into a form that fits your historicist prophecy.

Marx wrote that the unfolding of history is inevitable. In his view, leaders were just responsible for making that unavoidable transition easier. That’s the central conceit of historicism. If you take a Popperian view, you’re much more modest. You have to ground every policy in empirical reality. You have to adjust when things don’t work. You’re not just birthing a new paradigm you already know everything about. You don’t know what the future holds.

Stalin would say, anytime there was a setback, that it was all part of the same plan. It was all just globalist saboteurs attacking the Soviet Union, or it was some part of the grand historical unfolding that was moving toward the dictatorship of the proletariat. There’s no sense in which new information can change the course of a government with historicist ideas.

That differs from a general idea of progress. We have a lot of economic data suggesting that people have escaped poverty at an incredible rate since the middle of the 20th century. We’ve seen democratization on a vast scale around the world. We’ve seen interstate relations become much more tranquil and peaceful over the past several decades. I mean, the idea of Germany and France fighting a war now is pretty much inconceivable to most people. That’s a huge historical victory; it’s unprecedented in the history of Western Europe.

So, there are good reasons to believe that we’ve progressed. And that’s the core difference between the observation and acknowledgment of progress and historicism, which is much less grounded in empirical reality.

Right. The way I understand human progress is backward-looking. We can say that we are richer than we were in the past. Fewer women die in childbirth. Fewer infants die. We have fewer casualties in wars, et cetera. But we don’t know where we are going.

Yeah, absolutely. There were moments during the Cold War that could have plunged us into nuclear war. It makes no sense to try to cram every idea into some existing paradigm or prophecy. All we can do is incrementally move toward a better world.

This brings us to another big name in your piece: Francis Fukuyama. Tell me how you read Fukuyama.

Fukuyama is perhaps the most misread political science writer of our time. There are countless lazy journalists who want to add intellectual heft to their article about some new crisis, and they’ll say, “well, it turns out Fukuyama was wrong. There are still bad things happening in the world.” That’s a fundamental misreading of Fukuyama’s argument. He never said that bad things would stop happening. He never said there would be an end to war, poverty, or political upheaval. His argument was that liberal capitalist democracy is the most sustainable political and economic system, that it had proven itself against the great ideological competitors in the 20th century, and that it would continue to do so in the future.

I think it’s still a live thesis; it hasn’t been proven or disproven. I suppose if the entire world collapsed into totalitarianism and remained that way, then yeah, Fukuyama was wrong. But right now, there’s still a vibrant democratic world competing against the authoritarian world, and I think that liberal democracy will continue to outperform.

You use a phrase in the essay I didn’t quite understand: “the desire for recognition.” What does it mean, and why is it important to Fukuyama?

The desire for recognition is the acknowledgment that human desires go beyond material concerns. We want to be treated as individuals with worth and agency, and we are willing to sacrifice ourselves for purely abstract goals. Liberal democracies are the only systems so far that have met the desire for recognition on a vast scale. Liberal democracies treat people as autonomous, rational ends in themselves, unlike dictatorships, which treat people as expendable, and that’s one of the reasons why liberal democracy has lasted as long as it has.

However, there’s a dark side. Because liberal democracy enables pluralism, people can believe whatever they want religiously and go down whatever political rabbit holes they want to. And, oftentimes, when you have the freedom to join these other tribes, you find yourself more committed to those tribes than to the overall society. If you’re a very serious Christian nationalist, you might want society organized along the lines of the Ten Commandments because that, in your view, is the foundation of morality. So, pluralism, which is one of the strengths of liberal democracy, also creates constant threats that liberal democracy has to navigate.

I noticed in your essay that you are not too concerned. You note that democracy is not in full retreat and that, if you look at the numbers, things are not as dire as they seem. What is the argument?

If you just read annual reports from Freedom House, you would think that we’re on our way to global authoritarianism. However, if you take a longer historical view, even just 80 years versus 20 years, the trend line is still dramatically in favor of liberal democracies. It’s still an amazing historical achievement. It’s getting rolled back, but in the grand sweep of history, it’s getting rolled back on the margins.

Still, it’s a dangerous and frightening trend. And you’re in a dangerous place when you see a country like the United States electing a president who is expressly hostile toward the transfer of power after four years. So, the threats to democracy are real, but we need to have some historical perspective.

So, we are more liberally democratic than we were 40 years ago, but something has happened in the last 15 to 20 years. Some of the trust and belief in liberal democracy has eroded.

How is that connected to the issue of recognition?

In the United States, if you look at just the past five or six years, there has been a dramatic shift toward identity politics, which is a form of the desire for recognition.

On the left, there was an explosion of wokeness, especially in 2020, when there was a lot of authoritarianism. People were shouted down for fairly anodyne comments, and editors were forced out of their roles. And on the right, there’s this sense that native-born Americans are more completely American than other people. All of these things are forms of identity politics, and they privilege one group over another and drive people away from a universal conception of citizenship. That’s one of the big reasons why people have become less committed to pluralism and the classic American idea of E pluribus unum.

Have you ever thought about why, specifically after 2012, there was this massive outpouring of wokeness and identity politics? Some people on the right suggest that this is because America has begun to lose religion, and, as a consequence, people are seeking recognition in politics.

I think it could be a consequence of the decline of religion. I’ve written a lot about what many people regard as a crisis of meaning in Western liberal democracies. I think, to some extent, that crisis is overblown. Many people don’t need to have some sort of superstructure or belief system that goes beyond humanism or their commitment to liberalism or what have you.

However, I also think that we’re inclined toward religious belief. We search for things to worship. People don’t really want to create their own belief systems; they would rather go out there and pick a structure off the shelf. For some, it’s Catholicism or Protestantism, and for others, it’s Wokeism or white identity politics. And there were elements of the woke explosion that seemed deeply religious. People talked about original sin and literally fell on their knees.

We also live in an era that has been, by historical standards, extremely peaceful and prosperous, and I think Fukuyama is right that people search for things to fight over. The more prosperous your society is, the more you’ll be incensed by minor inequalities or slights. The complaints you hear from people today would be baffling to people one hundred years ago.

I also think the desire for recognition gets re-normed all the time. It doesn’t really matter how much your aggregate conditions have improved; when new people come into the world, they have a set of expectations based on their surroundings. And it’s a well-established psychological principle that people are less concerned about their absolute level of well-being than their well-being relative to their neighbors. If you see your neighbor has a bigger house or bigger boat, you feel like you’ve been cheated. And this is also the language that Donald Trump uses. It’s very zero-sum, and he traffics in this idea that everything is horrible.

You raised a subject that I’m very interested in, which is the crisis of meaning. I don’t know what to make of it. Everybody, including people I admire and respect, seems to think there is a crisis of meaning, but I don’t know what that means.

Is there more of a crisis of meaning today than there was 100 years ago or even 50 years ago? And what does it really mean? Have you thought about this issue?

You’re right to question where this claim comes from. How can people who claim there is a crisis of meaning see inside the minds of the people who say that they don’t need religion to live a meaningful life? There’s something extremely presumptuous there, and I’m not sure how it’s supposed to be quantified.

People say, well, look at the explosion of conspiracism and pseudoscience. And there are people who’ve become interested in astrology and things like that. But humanity has been steeped in pseudoscience and superstition for as long as we’ve been around. It’s very difficult to compare Western societies today to the way they were a few hundred years ago, when people were killed for blasphemy and witchcraft.

And look at what our societies have accomplished in living memory. Look at the vast increase in material well-being, the vast improvements in life expectancy, literacy, everything you can imagine. I find all that very inspiring. I think if we start talking about democracy and capitalism in that grander historical context, then maybe we can make some inroads against the cynicism and the nihilism that have taken root.

The Human Progress Podcast | Ep. 61

Matt Johnson: Open Societies and Closed Minds

Marian Tupy speaks with writer and political thinker Matt Johnson about historicism, progress, and how tribalism and the “desire for recognition” are testing the foundations of open societies.

Blog Post | Human Development

The Real Threats to Golden Ages Come From Within

History’s high points have been built on openness, Johan Norberg's new book explains.

Summary: Throughout history, golden ages have emerged when societies embraced openness, curiosity, and innovation. In his book Peak Human, Johan Norberg explores how civilizations from Song China to the Dutch Republic rose through trade, intellectual freedom, and cultural exchange—only to decline when fear and control replaced dynamism. He warns that our current prosperity hinges not on external threats but on whether we choose to uphold or abandon the openness that made it possible.


“Every act of major technological innovation … is an act of rebellion not just against conventional wisdom but against existing practices and vested interests,” says economic historian Joel Mokyr. He could have said the same about artistic, business, scientific, intellectual, and other forms of innovation.

Swedish scholar Johan Norberg’s timely new book—Peak Human: What We Can Learn from the Rise and Fall of Golden Ages—surveys historical episodes in which such acts of rebellion produced outstanding civilizations. He highlights what he calls “golden ages” or historical peaks of humanity ranging from ancient Athens and China under the Song dynasty (960-1279 AD) to the Dutch Republic of the 16th and 17th centuries and the current Anglosphere.

What qualifies as a golden age? According to Norberg, societies that are open, especially to trade, people, and intellectual exchange, produce these remarkable periods. They are characterized by optimism, economic growth, and achievements in numerous fields that distinguish them from other contemporary societies.

The civilizations that created golden ages imitated and innovated. Ancient Rome appropriated and adapted Greek architecture and philosophy, but it was also relatively inclusive of immigrants and outsiders: being Roman was a political identity, not an ethnic one. The Abbasid Caliphate, which began more than a thousand years ago, was the most prosperous place in the world. It located its capital, Baghdad, at the “center of the universe” and from there promoted intellectual tolerance, learning, and free trade to produce a flourishing of science, knowledge, and the arts that subsequent civilizations built upon.

China under the Song dynasty was especially impressive. “No classic civilization came as close to unleashing an industrial revolution and creating the modern world as Song China,” writes Norberg.

But that episode, like others in the past, did not last: “All these golden ages experienced a death-to-Socrates moment,” Norberg observes, “when they soured on their previous commitment to open intellectual exchange and abandoned curiosity for control.”

The status quo is always threatening: the “Elites who have benefited enough from the innovation that elevated them want to kick away the ladder behind them,” while “groups threatened by change try to fossilize culture into an orthodoxy.” Renaissance Italy, for example, came to an end when Protestants and Catholics of the Counter-Reformation clashed and allied themselves with their respective states, thus facilitating repression.

Today we are living in a golden age that has its origins in 17th-century England, which in turn drew from the golden age of the Dutch Republic. It was in 18th-century England that the Industrial Revolution began, producing an explosion of wealth and an escape from mass poverty in much of Western Europe and its offshoots like the United States.

And it was the United States that, since the last century, has served as the backbone of an international system based on openness and the principles that produced the Anglosphere’s success. As such, most of the world is participating in the current golden age, one of unprecedented global improvements in income and well-being.

Donald Trump says he wants to usher in a golden age and appeals to a supposedly better past in the United States. To achieve his goal, he says the United States does not need other countries and that the protectionism he is imposing on the world is necessary.

Trump has not learned the lessons of Norberg’s book. One of the most important is that the factors that determine the continuation of a golden age are not external, such as a pandemic or a supposed clash of civilizations. Rather, says Norberg, the critical factor is how each civilization deals with its own internal clashes, and whether it chooses to remain at its historical peak.

A Spanish-language version of this article was published by El Comercio in Peru on 5/6/2025.