
Blog Post | Population Demographics

Grim Old Days: Pat Thane’s History of Old Age

"He who has made himself dependent on his children for bread and suffers from want, he shall be knocked dead by this club."

Summary: Pat Thane’s book explores the harsh realities faced by the elderly in pre-industrial societies, including early aging, high mortality rates, and widespread elder abuse. The book reveals that old age, often accompanied by physical disability and poverty, was generally marked by isolation, familial neglect, and societal contempt. Thane’s volume challenges the romanticized notion that the elderly were once universally respected, showing that industrialization brought both longer lifespans and improved intergenerational relationships.


A History of Old Age, edited by the British historian Pat Thane, features contributions from several scholars exploring old age in different eras, from antiquity to the recent past. The volume reveals that in the pre-industrial era, premature aging, early death, and elder abuse were far more common than today.

In the 17th century, “due to inadequate diet and poor living standards . . . poor women [were considered] to have entered old age around age 50.” “Mother” became an honorary title for women over 50, such as the famously ugly “Mother Shipton” of Yorkshire, who was born toward the end of the 15th century and, like many old women of the era, was “reputed to be a witch.” In 1754, one author noted that “the peasants in France . . . begin to decline before they are forty.” For ordinary people, the injuries of old age reflected a lifetime of painful toil. There was a “high probability of some physical disability stemming from earlier, work-related injuries.” For example, female lacemakers “suffered debilitating blindness and stiff fingers.” “The ‘Dowager’s Hump’ of osteoporosis was the stereotypical hallmark of the elderly women in the 17th century, as were the broken hips and arms of the aged male.” Given the harsh toll that the challenges of preindustrial life took on the body, and the prevalence of early aging, it is not surprising that fewer people survived to old age.

While a preindustrial adult had a much better shot at reaching old age than did a preindustrial child (owing to the latter group’s horrifically high rate of early death), it was still a long shot relative to today. The 16th-century French philosopher Michel de Montaigne observed, “To die of age, is a rare, singular, and extraordinary death, and so much lesse naturall [sic] than others.” In the preindustrial era, “the elderly generally constituted not more than 8 per cent of the population, and in some regions and periods it was not more than 5.” (Although after outbreaks of bubonic plague, which disproportionately killed off the young, the elderly’s share of the population temporarily increased.) With industrialization, the relative rarity of older adults began to change; “in England and the Low Countries, the numbers of elderly began to increase earlier” than elsewhere.

Even among royalty, living into old age was once relatively rare. “Of all the kings of Europe from the 11th to the beginning of the 15th century, the longest living king was Alfonso VI, king of Castile and León (1030–1109), who reached the age of 79. Of all his predecessors and successors only two made it to their 60s. Only three of the kings of Aragon reached their 60s, and only four of the German emperors. Three of the kings of England reached their 60s, but only one of the Capetian kings of France—Louis VII (1120–80). All other kings, in all European countries, died younger.” That bears repeating: For a king to live past 70 was extraordinary, and most kings did not live to see age 60. Among common peasants, typical lifespans were, of course, shorter still.

In antiquity, old age was also relatively rare. There were, of course, exceptions, such as the famed Cynic philosopher Diogenes, who lived to be 96, and the Stoic philosopher Chrysippus, who is said to have died around age 80, but such longevity was unusual. In the classical past, most of the population was young. “For example, around 6–8 per cent of the population of the Roman empire in the 1st century AD was over the age of 60.” This had many repercussions, including that fewer people ever knew their grandparents. “By the age of ten years, the average ancient individual had only a one-in-two chance of having any of his grandparents still alive. Fewer than one in a hundred Greeks or Romans of the age of 20 would have had a surviving paternal grandfather.”

Close, long-term relationships between grandchildren and their grandparents were thus relatively rare. “Most adult Greeks or Romans would have had only shadowy memories of their grandparents.” In fact, it was not until industrialization began in parts of Europe in the latter half of the 18th century that close grandparent-grandchild relationships such as those that are typical today started to become more common, as “longer lives meant greater opportunity to play the roles associated with the aged.” The archetypes associated with grandparents are newer than many realize, although they do slightly predate industrialization. “Only at the end of the [17th] century does the social, ‘spoil-the-child’, modern-looking ‘grandparent’ appear.” In other words, “the modern social role of the grandparent was just beginning to develop at the end of the century.” One might imagine that doting grandparents have existed since time immemorial, and some likely did, but high rates of early death and widespread material poverty deprived most ordinary people of the experience prior to the wealth generated by the Industrial Revolution. “A new representation of grandparents can be recognized in French culture in the late 18th century, preparing the way for the great stereotype of 19th-century grandparents spoiling their children’s offspring.” That was a consequence of more grandparents living long enough to form deep bonds with their grandchildren, and greater prosperity enabling the former to lavish gifts on the latter, as wealth and longevity spread: “Old age, traditionally viewed as a period of social isolation, was being experienced by greater numbers.”

Poor people continued working as long as possible—no matter how long they lived. “Bridget Holmes [(1591–1691)] was a servant in the Stuart royal household who was still working hard at the age of 96.” Betty Dick, the town crier of Dalkeith in Scotland, continued to work until her death at age 85 in 1778, wandering the town beating a wooden plate with a spoon to draw public attention before making local announcements. This lengthy working life took a heavy toll. “The lifestyle of the poor was physically and mentally demanding even for those in the pink of health” and could be devastating in old age. Nonetheless, working until one’s dying day or the arrival of debilitating infirmity was a common fate among poor people, who once comprised the greater share of humanity.

The idea of a leisurely retirement being within ordinary people’s reach is a modern concept. For most of history, ordinary laborers worked until they became bedridden or died, owing to the extreme poverty of the preindustrial world. “Most of them were unable to save enough for their old age during their working years. They could thus not afford to retire and were obliged to continue working as long as they could.” Old age and poverty were practically synonyms. “As women generally worked in more poorly paid occupations than men, they were even more exposed to dire want in their old age.” By the 17th century, “at a certain stage in his life the peasant handed over his farm to one of his offspring [and] moved from the main room to a back room, or to the attic, or to a spare cottage.” After the handover, he would still assist with farmwork to the extent of his abilities. For women, living with family in old age was less common, at least in part because the women most likely to survive to old age were those who had avoided the dangers of childbirth, and they had no children to take them in.

A common narrative maintains that in the past, the elderly received far better treatment, enjoying greater respect and more familial support than today. “Insofar as old age is thought to have a history, it is presented as a story of decline . . . [in the past, the elderly] were valued, respected, cherished and supported by their families as, it is said, they are not today.” Nowadays, in contrast, the narrative holds that disrespect and loneliness are more likely to characterize the last years of life than in ages past. Yet in reality, “none of [the evidence] suggests that comfortable succour in the household of one’s children was the expected lot of older people in pre-industrial . . . Europe.” The evidence suggests quite the opposite, in fact.

Contrary to popular belief, preindustrial people were far less likely to have any surviving children or grandchildren to care for them in old age than modern people. That is partly because even though birth rates were higher in the past, children died with such horrifying frequency that they often predeceased their parents. “Given the higher rate of death at all ages before the later 20th century, older people could not be sure that their children would outlive them. In the 18th century just one-third of Europeans had a surviving child when they reached their 60th birthday.” Hence, the majority of those who lived to old age had no surviving children. In the modern world, in contrast, that is only the case for a minority. For example, US Census Bureau data suggests that among adults age 55 and older, over 83 percent have living adult children. Despite “today’s pessimistic narrative of old age [that] stresses the increasing loneliness of older people in the modern world,” loneliness was more pervasive in the preindustrial past.

What became of the childless majority of elderly people in the preindustrial era? “If they had no surviving children, they entered hospitals and institutions for the poor, which, throughout pre-industrial Europe and early America, were filled with older people without surviving relatives. Or they died alone.” Conditions in the hospitals were famously unsanitary and overcrowded. “There, sharing a bed with whoever else needed one, the destitute elderly lived out their final years.” Despite the poor conditions, demand for a hospital bed far exceeded the supply. “Seventeenth-century Brunswick had only 23 beds for every 1,000 inhabitants, Rheims had 24.94 for every 1,000; and in Marne, they were particularly scarce, with just 2.77 beds per 1,000. Furthermore, the elderly were only one of many eligible groups vying for accommodation. . . . It has been suggested that 74 per cent of all applications were denied.”

Some were even less fortunate: Older people without relatives also often faced harassment and even accusations of witchcraft. While old men also suffered such allegations, old women were particularly likely to be targeted. That is at least in part because, then as now, women often outlived men, so there were more elderly women around. (Although in some times and places, such as Quattrocento Venice, men outlived women.) “A physician in 17th-century south Germany explained why old women were so often accused of witchcraft: ‘They are so unfairly despised and rejected by everyone, enjoy no-one’s protection, much less their affection and loyalty . . . so that it is no wonder that, on account of poverty and need, misery and faint-heartedness, they often . . . give themselves to the devil and practice witchcraft.’ A 70-year-old woman said at her trial, ‘The children call all old people witches.’”

In other words, many communities violently scapegoated the aging. Any local misfortune, from illness to a house fire, could be blamed on supposed witches, usually impoverished older women without surviving children. Superstitions related to menopause did not help matters. “It was said that a menopausal woman could cause grass to dry up, fruit to wither on the vine, and trees to die. Dogs would become rabid and mirrors crack by her mere presence. Such women, without even trying, could cast the evil eye. With malice aforethought, the glance of the post-menopausal woman could kill.” In reality, it was the aging women themselves who were killed by such delusions. From the 14th century through the 17th century, between 200,000 and 500,000 alleged witches—over 85 percent of them female and mostly middle-aged or elderly—were executed. Public shaming, harsh interrogations, and torture often preceded witch burnings.

Such violence was enabled by attitudes toward the elderly that were often grotesquely negative. “Literary depictions of old men in epics and romances [show] the old man is an object of contempt.” In the 17th century, “the Italian theatrical genre of Commedia dell’Arte reflected the Europe-wide characterization of old men as objects of mockery and disdain,” featuring a prominent stock character called Pantaloon, who was meant to represent a decrepit and ridiculous old man. “The 17th-century stage, elite literature and the sayings of peasants belittled and mocked the old in ways that few groups are targeted today.”

Old women often fared even worse in the public imagination. “Generally old women were feared or held in contempt.” To give an example, in the allegorical text Le Pèlerinage de la vie humaine (“The Pilgrimage of Human Life”), written in the 14th century by the monk Guillaume de Deguileville, the virtues are all personified by beautiful young women, while ugly old women represent the vices. Even in the 17th century, women “were thought to grow increasingly evil and dangerous as menopause set in.” A literary genre popular from the 13th century onward known as “sermones ad status”—sermons divided according to their audience (i.e., sermons to the nobility, to merchants, and so forth)—reveals how the people of the past viewed different groups. In this classification scheme, “the elderly, like women and children, were represented as a single marginal group irrespective of social stratum, rank, profession or lifestyle. In some texts they were classed with invalids, foreigners or the very poor, the emphasis being on their . . . social inferiority.”

Public ridicule of the elderly was also commonplace and considered an ordinary pastime for children. A description of each decade of life “popular in Germany in the 16th century and probably familiar still in the 17th” describes a man of 90 years as “the scorn of children.” A Viennese woodcut from 1579 depicts a nonagenarian man derided by a young child.

The minority of old people who did have surviving children were not necessarily much better off, as treatment of the elderly was often appalling, even by close family members. One “popular . . . tale, already old in medieval Europe, told of a man who, tired of caring for his old father, starts to carve a trough from which the old man is to eat, instead of sitting at the family table, or, in another version, starts to exchange his father’s bedding for a piece of sacking.” Similar stories abounded that depict cruelty toward the elderly. “In another, bleaker version, the old man is gradually demoted from a place of honour at the head of the table to a bench behind the door, where he dies in misery.” In some areas, this power imbalance was reversed. “In late 17th century Upper Provence, for example, until the death of his father, the heir was ‘completely subservient to his father economically, socially, and legally, just as though he were still a child.’ He could not, without his father’s permission, buy, sell, trade, make a will or complete any legal contract. Trouble arose repeatedly as a consequence.” In most areas, however, elder abuse was likely more frequent than aging parents legally tyrannizing their adult children.

Of course, individuals varied, and many adult children dutifully supported their aging parents and maintained positive relationships with them. But economic stress made it hard even for willing adult children to support their parents, as “the younger generation was typically poor themselves and overburdened by children, leaving little food or money to spare for an aged parent. Barbara Ziegler, from Bächlingen in southwestern Germany, described what the 1620s had been like for her: ‘I stayed with my son for four years, but the food was bad and [he] supported me only with great effort.’” Far from the romantic notion that the past offered greater familial support to older adults, the prevailing attitude toward any older person relying on their adult children was often one of bitterness and disgust.

This was true even in antiquity, despite the “common myth about the classical past . . . that older individuals enjoyed something of a golden age when they were treated with great respect.” The reality was that attitudes toward the elderly were often cruel. Classical literature often depicted the old as “witches or alcoholics.” In Greek and Roman mythology, the personification of old age, Geras or Senectus, is said to be “the offspring of Night, and has siblings Doom, Fate, Death, Sleep, Blame, Nemesis and Deceit, among others.” The philosopher Juncus noted that even to his friends and family, an aging man is nothing but “an oppressive, painful, grievous and decrepit spectacle: in short, an Iliad of woes.” The Greek satirist Lucian, in his work On Grief, points out, albeit jokingly, that one benefit of a man’s untimely demise is that “by dying young, he will not be scorned in old age.” In fact, “it was a common proverb that old age and poverty are both burdensome—in combination they are impossible to bear.” Even when adult children took care of their parents, it was often with great resentment. In the playwright Aristophanes’ comedy Wasps, a son is depicted supporting his father but without any hint of the filial respect often imagined to characterize the past. The son says with disgust, “I’ll support him, providing everything that’s suitable for an old man: gruel to lick up, a soft thick cloak, a goatskin mantle, a whore to massage his . . . loins.” At the beginning of Plato’s Republic, the elderly Cephalus says this of “old men”: “Most of them are full of woes [and] grumble that their families show no respect for their age.” The old were often despised as “marginal members of society.”

Even in the later 18th century, “the town gates of some cities in Brandenburg hung large clubs with this inscription: ‘He who has made himself dependent on his children for bread and suffers from want, he shall be knocked dead by this club.’”

These facts and more can be found in this fascinating book.

Blog Post | Culture & Tolerance

The Ancient Roots of Western Self-Criticism

The West’s enduring success is rooted in its awareness of its own faults and constant striving to be better. Far from being a modern phenomenon, the tradition of Western self-criticism began with Homer.

Summary: Western civilization is now often criticized from within for its imperialism, decadence, and moral failings. But the tradition of Western self-criticism is not a modern weakness; it is an ancient strength. The Greeks and Romans consistently questioned their own actions, empathized with their enemies, and questioned their societal norms. This deep-rooted capacity for introspection helped build the resilient, self-correcting culture whose contributions to human flourishing have shaped the world of today.


At a time when Western histories and societies face relentless internal scrutiny—accused of imperialism, cultural arrogance, decadence, and other failings—it is tempting to view this self-criticism as a modern malaise, a sign of weakness. Yet even a cursory look at the literature of ancient Greece and Rome reveals a different story: the West’s tendency to question itself, empathise with its enemies, and confront its own imperfections is not a recent phenomenon. It is age-old and unique. It may even be one of the main sources of Western strength. Far from undermining Western civilisation, this introspective tradition—evident in the works of Homer, Aeschylus, Sophocles, Euripides, Virgil, Tacitus, and others—has catalysed its resilience and moral progress. By holding a mirror to their own flaws and extending sympathy to adversaries, the ancients laid the groundwork for a culture built on self-correction and the pursuit of betterment—traits that continue to define the West’s success.

The ancient Greeks, whose city-states birthed and gave name to democracy, logic, ethics, geography, biology, aesthetics, economics, mathematics, astronomy, physics, history, politics, and philosophy, were no strangers to self-examination, even in times of war. Homer’s Iliad—a foundational text of the Western literary canon, composed in the late eighth century BC—is a masterclass in humanising the enemy. While celebrating Greek heroism, Homer does not vilify the Trojans. Instead, he paints Hector, Troy’s greatest but ultimately doomed warrior, as a devoted husband and father whose heartbreaking farewell to his wife, Andromache, moves readers nearly 3,000 years later. Later, Achilles, the Greek champion, shares a moment of profound empathy with Priam, the Trojan king, as they weep together over their respective losses. This is not mere storytelling; it is a moral stance, urging Greeks to see their enemies as mirrors of themselves, subject to the same cruel fate. Such understanding reflects a culture unafraid to question the glorification of conquest and to seek understanding across battle lines.

This introspective spirit shines even brighter in Greek tragedy. Its best-known playwrights—Aeschylus, Sophocles, and Euripides—are generally rated, along with Shakespeare, as the greatest tragedians of all time; they used the stage to probe their society’s values. In fifth-century BC Athens, tragedies were performed before a mass audience in an open-air theatre at the annual festival of Dionysus, god of wine and fertility. When people today think of plays, they imagine small theatres with audiences whose average level of education and intelligence is much higher than that of the general population. Given the composition of Greek audiences, therefore, the adversarial nature of Attic tragedies—built around the agōn, a formal clash of characters and ideals that let spectators see moral and political questions tested through direct confrontation—is even more remarkable. Let us look at a few examples.

In 472 BC, just eight years after the Greeks repulsed the Persian invasion at Salamis, Aeschylus, reportedly a veteran of the Battle of Marathon, presided over the performance of his play The Persians. It is an extraordinary example of cultural humility. Rather than gloating over a defeated foe, Aeschylus sets his drama in the Persian court, giving voice to Queen Atossa’s grief and Xerxes’ humiliation. The chorus of Persian elders laments the loss of their youth—a universal cry that would resonate with any Athenian who had lost a son in battle. Aeschylus could have written a jingoistic paean to Greek superiority; instead, he penned a tragedy that invited his audience to mourn with their enemies, acknowledging the hubris that threatens all nations.

Sophocles, too, contributes to this tradition in Antigone (c. 441 BC), where the adolescent heroine’s defiance of King Creon’s edict to leave her brother Polynices unburied pits individual conscience against state authority. Polynices, branded a traitor, is the “enemy,” yet Antigone’s loyalty to him is portrayed as noble, and Creon’s eventual regret reveals the folly of his rigid rule. The play’s sympathy for those who challenge the state reflects a Greek willingness to question authority and empathise with outcasts—a precursor to modern debates about justice and dissent.

Finally, we come to the truly remarkable case of Euripides. In Hecuba (424 BC), Trojan Women (415 BC), and Andromache (date disputed), the playwright portrays the savage cruelty inflicted by victorious Greeks on the Trojan women they enslaved. In front of a mass audience—a significant share of which consisted of highly patriarchal Greek men—Euripides bemoans the horrific fate of enemy slave women at the hands of Greek men. By giving voice to the defeated, he challenges the moral certainty of conquest, urging his audience to see their enemies as victims of the same forces that could one day destroy Athens. These plays are not just art; they are acts of cultural self-criticism, exposing the flaws of Greek society—xenophobia, misogyny, hubris, cruelty—while affirming the humanity of those it deemed enemies. How modern.

The Romans were great innovators in jurisprudence, administration, engineering, logistics, urban planning, and politics, bequeathing to the world such words as republic, liberty, and legal—concepts they valued highly. Culturally, however, they were greatly beholden to the Greeks. Virgil’s Aeneid (19 BC) is both a national epic and, by consensus, the greatest work of Latin literature. It narrates how, after the Trojan War, the Trojan prince Aeneas led the remnants of his people to Latium, where they intermarried with the native Italians to become the ancestors of the Romans. The epic’s high point is Aeneas’ interaction with Dido, queen of Rome’s archenemy Carthage. They have an affair, he leaves, and she commits suicide. Her curse on the departing Aeneas foreshadows Carthage’s enmity, yet Virgil portrays her as a noble, broken figure—not a villain. In fact, Virgil focused readers’ attention on Dido so completely that she became the heroine of the Aeneid. In the early fifth century AD, Macrobius, a Roman provincial author, observed, “The story of Dido in love … flies through the attention of everyone to such an extent that painters, sculptors, and embroiderers use this subject as if there were no other … that she committed suicide in order not to endure dishonour.” Virgil’s Carthaginian queen remained the heroine of poetry (Chaucer’s Legend of Good Women), tragedy (Marlowe’s Dido, Queen of Carthage), and opera (Purcell’s Dido and Aeneas).

Tacitus, the greatest Roman historian, was also a senator, praetor, suffect consul, and proconsular governor of the province of Asia. In other words, he was at the very centre of the imperial establishment. Tacitus wrote Agricola (c. AD 98) to honour his eponymous father-in-law by recounting how the latter solidified Roman control over what is now England and Wales. Nevertheless, Tacitus attributes to Agricola’s enemy, the British chieftain Calgacus, a powerful denunciation of the Roman Empire: “Plunder, slaughter, rapine they call by the false name of empire, and where they make a desert, they call it peace.” With that almost certainly invented statement, Tacitus undermined the proudest Roman boast—that empire brought peace (see Aeneid 6.852–53; the Pax Romana; and the Emperor Augustus’ Altar of Peace). Similarly, in Germania (c. AD 98), Tacitus idealises the Germanic tribes’ simplicity and courage, contrasting them with Rome’s supposed decadence. By praising Rome’s enemies, he holds a mirror to what he sees as his own society’s moral decline.

Finally, Lucan’s Pharsalia (c. AD 61–65), an epic of Rome’s civil war, mourns Pompey Magnus, Caesar’s rival, as a tragic figure fighting for the Republic’s lost ideals. His murder in Egypt, lamented by Lucan, evokes sympathy for a defeated enemy whose loss marks Rome’s slide into autocracy. Writing under Emperor Nero, Lucan uses Pompey’s fate to critique tyranny, showing how sympathy for an enemy can serve as a veiled rebuke of one’s own rulers.

The ancient Greeks and Romans waged wars, built empires, and committed atrocities. Yet their literature reveals a unique capacity to question those actions, to see the humanity in their adversaries, and to strive for moral improvement. This mindset formed a cornerstone of Western resilience—a culture that thrives on self-criticism, not self-congratulation, a culture that is alert to its faults and resolute in correcting them. To quote Arthur Schlesinger Jr.’s The Disuniting of America: “No doubt Europe has done terrible things, not least to itself. But what culture has not? … There remains a crucial difference between the Western tradition and the others. The crimes of the West have produced their own antibodies. They have provoked great movements to end slavery, to raise the status of women, to abolish torture, to combat racism, to defend freedom of inquiry.”

Western self-criticism, then, is not new. What is new is the apparent imbalance between recognising Western shortcomings on the one hand and appreciating the West’s magnificent bequests to humanity on the other. That should not be surprising, given that the commanding heights of Western culture—universities, museums, galleries, and theatres—have become dominated by a motley crew of Marxists, Frankfurt-schoolers, post-structuralists, deconstructionists, postcolonialists, de-colonialists and critical race theorists. Despondency over the future of the West, however, would be an over-reaction.

In 184 BC, amidst worry about Rome’s decline, Cato the Elder won election as censor on a platform of a “great purification,” in which he aimed to “cut and sear … the hydra-like luxury and effeminacy of the time.” At that point, Rome controlled Italy, Corsica, southern Spain, and small parts of the Dalmatian Coast. Yet Rome proceeded to grow, and it would not reach its maximum territorial extent, or enjoy its period of greatest prosperity and tranquillity, until three centuries later, under the Nerva-Antonine Dynasty. It would take another three and a half centuries before the Western Empire disintegrated in AD 476.

Its eastern half survived under the leadership of rulers whose title was “Basileus ton Romaion” (King of the Romans) until the sack of Constantinople by the Ottomans in 1453—some 1,600 years after Cato expressed his concern over Rome’s future. Paying homage to the Byzantine custom, Sultan Mehmed II declared himself “Kayser-i Rum” (Caesar of the Romans). By that time, Western Europe was on the mend. The Renaissance was in full swing, and in 1492, Columbus sailed for the New World. The stage was set for the Scientific Revolution, followed by the Enlightenment, the Industrial Revolution, and a half-millennium-long Western preeminence that transformed the globe—largely for the better. The revolutions that originated in Europe brought to all the peoples of the world greater knowledge, prosperity, and control over nature than anyone could previously have imagined possible. Let us, by all means, continue the tradition of self-doubt and self-criticism that has characterised Western civilisation from its beginning. However, now that the West has come under sustained and vitriolic attack from without and within, perhaps we should balance that self-criticism with recognition of Western civilisation’s unmatched contributions to human wellbeing and progress.

This article was published by Quillette on 7/4/2025.

The Economist | Tolerance & Prejudice

The Stunning Decline of the Preference for Having Boys

“Without fanfare, something remarkable has happened. The noxious practice of aborting girls simply for being girls has become dramatically less common. It first became widespread in the late 1980s, as cheap ultrasound machines made it easy to determine the sex of a fetus. Parents who were desperate for a boy but did not want a large family—or, in China, were not allowed one—started routinely terminating females. Globally, among babies born in 2000, a staggering 1.6m girls were missing from the number you would expect, given the natural sex ratio at birth. This year that number is likely to be 200,000—and it is still falling.

The fading of boy preference in regions where it was strongest has been astonishingly rapid. The natural ratio is about 105 boy babies for every 100 girls; because boys are slightly more likely to die young, this leads to rough parity at reproductive age. The sex ratio at birth, once wildly skewed across Asia, has become more even. In China it fell from a peak of 117.8 boys per 100 girls in 2006 to 109.8 last year, and in India from 109.6 in 2010 to 106.8. In South Korea it is now completely back to normal, having been a shocking 115.7 in 1990.”

From The Economist.

Blog Post | Communications

We’re Living in a Split-Screen America

The evolution from broadcast news to personalized feeds has fractured how we see the world, but progress is possible.

Summary: Americans once shared a common media landscape, but the rise of personalized digital feeds has splintered that reality into partisan echo chambers. Social platforms now amplify outrage, reinforce tribal instincts, and erode agreement on basic facts. While there is no easy fix, reforms in design, digital literacy, and cultural norms offer hope for a more truthful and united public discourse.


“And that’s the way it is.” At least, that’s the way it was. When Walter Cronkite closed his nightly broadcasts with those words, America was a foreign country. At the height of broadcast news, Americans had differences of opinion but agreed on a basic set of facts about what was going on in the country and the world. Anchors like Cronkite, voted in 1972 by Democrats and Republicans alike as the most trusted man in America, aimed to be impartial and to win bipartisan credibility. But as partisan cable news and talk radio came to prominence in the 1990s, basic agreement on the facts began to erode. And with the rise of social media, it splintered entirely.

Platforms like Facebook, YouTube, TikTok, and Twitter personalize content to maximize engagement (time spent on an app, posts liked and shared), showing you what you want to see. That reinforces users’ existing beliefs and limits exposure to opposing views. Strikingly, a Meta-commissioned study of 208 million users during the 2020 U.S. election cycle showed that liberals and conservatives on Facebook encountered almost entirely non-overlapping news sources. Once a social media user spends time looking at political content on one of these platforms, he or she is fed more and more of the same. Far from the broadcasts of the mid-century, modern news is delivered via increasingly bespoke “narrowcast.”

This political siloing is not trivial. Americans now inhabit split-screen realities. In one 2023 Gallup poll, 90 percent of Republicans believed crime was rising, while 60 percent of Democrats believed it was falling. On climate change, a 2021 survey showed a 56-point partisan gap in beliefs about whether humans have a serious impact on the climate system (compared to a 16-point gap in 2001). In 2024, 44 percent of Democrats rated the national economy as “excellent or good,” compared to only 13 percent of Republicans, despite the same underlying economic conditions. The gap wasn’t driven by personal finances, but by partisan interpretations of identical economic indicators. These are not differences of opinion; they are incommensurable beliefs about the state of the world.

But platforms don’t just feed us headlines that align with our politics. They also bait our strongest emotions. In 2017, Facebook began weighting “angry” reactions five times more heavily than “likes” when floating posts to the top of our feeds. That same year, a study found that each additional moral-emotional word in a tweet (think “shameful,” “detestable,” “evil”) significantly increased the likelihood of it being shared and reshared.
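To make that ranking mechanism concrete, here is a minimal, purely illustrative sketch in Python. It is not Facebook’s actual system: the posts, field names, and all weights other than the reported five-to-one ratio for “angry” reactions are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical reaction weights; only the 5x "angry" multiplier
# mirrors the reported 2017 change.
REACTION_WEIGHTS = {"like": 1.0, "angry": 5.0, "share": 2.0}

@dataclass
class Post:
    title: str
    reactions: dict  # reaction name -> count

def engagement_score(post: Post) -> float:
    """Sum each reaction count times its weight."""
    return sum(REACTION_WEIGHTS.get(name, 0.0) * count
               for name, count in post.reactions.items())

posts = [
    Post("calm policy explainer", {"like": 900, "share": 20}),
    Post("outrage bait", {"like": 100, "angry": 300, "share": 80}),
]

# Rank the toy feed: the angrier post outranks a post with far
# more total reactions (1,760 vs. 940).
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  {post.title}")
```

Even in this toy version, the dynamic is visible: a post that draws anger needs far less overall engagement to reach the top of the feed.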

This platform design calls up ancient instincts. Humans evolved to detect threats to the coalition, to signal our group loyalty, and to rally allies against rivals. A tweet calling someone “abhorrent” isn’t just an opinion; it’s a tribal call to action. And because these platforms so reliably elicit our ire and impel us to spread it to others, they’ve become outrage engines.

They create sealed chambers that echo our anger, where contrary evidence is unlikely to penetrate. Carl Sagan sounds prescient now: in 1995, he warned of a future where Americans, embedded in an information economy, would become “unable to distinguish between what feels good and what’s true,” leaving society vulnerable to illusion and manipulation.

And the consequences of the outrage engines don’t stop at our borders. In 2016, Russian operatives used fake personas on Facebook and Twitter to spread inflammatory memes targeting both liberals and conservatives. They didn’t need to hack anything. They simply exploited an information ecosystem already optimized for spreading partisan outrage.

What can be done? There is no single fix, but meaningful improvements are possible.

In a randomized study, older adults who received just one hour of digital literacy training from MediaWise improved their ability to tell false headlines from real ones by 21 percentage points. When Twitter added a prompt asking users if they wanted to read an article before retweeting it, people were 40 percent more likely to click through to the article before sharing it.

Choice helps too. In one study, switching users from a feed that had been personalized by the algorithm to one that showed posts in chronological order measurably increased their exposure to content across the political aisle. While it may not be a silver bullet, giving users the ability to choose their feed structure, including which algorithm to use, allows for opportunities to be exposed to contrary opinions and to peer outside the echo chamber.
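As a rough illustration of why the choice of feed ordering matters, here is another hypothetical sketch: the same kind of toy feed sorted two ways. The posts, scores, and timestamps are invented for the example.

```python
from datetime import datetime

# (title, engagement_score, posted_at) -- all values hypothetical
feed = [
    ("outrage bait",          1760.0, datetime(2024, 5, 1, 9, 0)),
    ("calm policy explainer",  940.0, datetime(2024, 5, 2, 8, 0)),
    ("local news roundup",     120.0, datetime(2024, 5, 2, 12, 0)),
]

# Engagement-ranked order: the most provocative post leads.
by_engagement = sorted(feed, key=lambda p: p[1], reverse=True)

# Chronological order: newest first; engagement plays no role.
by_time = sorted(feed, key=lambda p: p[2], reverse=True)

print([title for title, _, _ in by_engagement])  # outrage bait first
print([title for title, _, _ in by_time])        # local news roundup first
```

The content is identical in both cases; only the ordering rule changes, which is why simply offering a chronological option can measurably shift what users see.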

But deeper change is cultural. A compelling case has been made that human reasoning evolved not to uncover objective truth, but to persuade others, to justify our own ideas, and to win arguments. That is why the habits of sound reasoning must be cultivated through norms that prize truth over tribal loyalty, deliberation over impulsivity, and the ability to make the best case for opposing views in order to oppose them on their merits.

This isn’t a call for censorship or government control of the news, nor is it a plea to go back to three-network broadcasting. The democratization of media has brought real benefits, including broader participation in public discourse and greater scrutiny of powerful institutions. But it has also made public life more combustible and has manufactured disagreements about factual questions. In a competition for attention, platforms are designed to maximize time spent on them. That means elevating content that provokes strong emotional responses, especially outrage, and targeting it toward the users most likely to react. The more incendiary the content, the more likely it is to hold us captivated.

What we are witnessing is not a failure of the market, but a particularly efficient version of it, albeit one that optimizes for attention, not accuracy. Personalized feeds, algorithmic curation, and viral content are giving people more of what they want. And yet, many Americans say they are dissatisfied with the result. In a 2023 Pew survey, 86 percent of U.S. adults said they believe Democrats and Republicans are more focused on fighting each other than solving real problems, and respondents across party lines cited political polarization as the biggest problem with the political system.

While online outrage bubbles may not qualify as a market failure in the technical sense, they are clearly a civic problem worth confronting. An information ecosystem optimized for attention rather than accuracy will reliably amplify division and distrust, even while giving users more of what they like to see and share. The incentives are working as designed, but the outcome is a fragmented public unable to agree on the real state of the world. If democracy depends on a shared understanding of basic facts of the matter, then reckoning with these tradeoffs is well worth our much-demanded attention.

CNN | LGBT

Same-Sex Couples Wed as Thailand’s Marriage Bill Takes Effect

“Hundreds of same-sex couples are tying the knot across Thailand on Thursday as the country becomes the first in Southeast Asia to recognize marriage equality…

Under the legislation, passed by Thailand’s parliament and endorsed by the king last year, same-sex couples are able to register their marriages with full legal, financial, and medical rights, as well as adoption and inheritance rights.”

From CNN.