
Blog Post | Water & Sanitation

Grim Old Days: Robert Muchembled’s Cultural History of Odours in Early Modern Times

Putrid smells were a defining feature of the past that is now largely forgotten.

Summary: The preindustrial world was anything but fresh and clean, as historian Robert Muchembled reveals in his cultural history of odors. From the stench of slaughtered animals and overflowing latrines to suffocating city air and dubious perfumes made from animal secretions, early modern life was defined by overwhelming, often toxic smells. Despite their constant exposure, people did not become desensitized to the filth, and their desperate attempts to mask or combat foul odors often led to bizarre, ineffective, or even dangerous solutions.


French historian Robert Muchembled’s Smells: A Cultural History of Odours in Early Modern Times, as translated by Susan Pickford, describes what the preindustrial past smelled like in frightening detail.

Some entertain the idea that before industrialization and its “dark Satanic Mills” belching smoke, the air was sweet and pristine. In reality, an “apocalyptic stench . . . formed the olfactory backdrop to many people’s lives.”

Air pollution is not a modern phenomenon. “The good old days are a myth. The towns and villages of Europe stank horribly in the days of yore.” While certain forms of air pollution are relatively recent, “the foul air of medieval towns” was suffocating.

Animals were being killed nonstop for food, hides, quack medicines, entertainment, and more. Hence “the reek of death constantly hung over towns and cities.” Sometimes the poor air quality even prompted appeals for change: “when the air became too grim to breathe,” outrage followed. In 1363, several scholars and students at the University of Paris complained to the king about how butchers killed animals in their homes:

The blood and waste from the animals is thrown day and night into the Rue Sainte-Geneviève, and on several occasions the waste and blood of the animals was kept in pits and latrines in their houses until it was corrupted and rotten and then thrown into that same street day and night, until the street, Place Maubert and all the surrounding air was corrupted, foul, and reeking.

Muchembled quotes the French historian Henri Sauval (1623–1676) describing Paris as “black, foul-smelling, its stench unbearable for those from elsewhere; it stings the nostrils from three or four leagues distant.” Many causes of foul odors besmirched the air. “The noisy, dirty, crowded streets were home to more and more polluting trades, well before the Industrial Revolution.” For example:

Certain trades were a major source of noxious emissions for their immediate neighbourhood, including butchers, tripe makers, fishmongers, potters (who deliberately left their clay to sour in cellars in Paris and elsewhere), and painters who used pigments made from metal oxides. The worst were tanners, glove- and purse-makers, and fullers, who made abundant use of toxic plant and animal substances as mordants, like alum, tartar and soda, urine (often collected from humans), chicken droppings and dog excrement, which accelerated the process of fermenting and rotting the fibres they worked with.

Poor storage of human waste and the remains of slaughtered animals alike affected the quality of city air. Only in 1760 were the massive sewage dumps in Faubourg Saint-Germain and Faubourg Saint-Marceau moved some two and a half miles outside Paris to combat the “foul air” they caused. Another Parisian dump site, which remained operational until 1781, was infamous: “Its ten hectares of cesspits full of fermenting sewage and its slaughterhouse piled high with rotting carcasses could almost have been something out of Dante’s Inferno.”

Parisians complained that “noxious emissions from the boats [transporting sewage] were tarnishing and bleaching their silverware, gilding and mirrors.” Many homes abutted towering dung heaps, prompting the Italian physician Bernardino Ramazzini (1633–1714) to observe that “the air they live in must be polluted with the foul vapours that rise constantly.” Many prominent minds were deeply troubled by air quality affected by insufficient sanitation systems. “The noxious vapours of excrement were the main concern of hygienists in the reign of Louis XVI.” The smell of sewage was everywhere: “A French royal edict in 1539 complained of the ‘mud, dung, rubble and other rubbish’ piled up outside people’s doors and blocking the streets, despite earlier royal decrees.”

The night soil men who cleaned the sewer dump sites sometimes even died from the smell. “Fatal suffocation was a real risk on opening a latrine. . . . The rotting excrement released a dangerous, fetid sewer gas called ‘mofette’ . . . or ‘plomb’ (the French term for lead, as the symptoms were thought to be similar to those of lead poisoning). . . . Cases of fatal sewer gas poisoning among night soil men remained a cause for medical research throughout the nineteenth century. . . . In 1777, the king [of France] appointed a commission of chemists to study the effects of mephitism, a disease which struck fear into the hearts of night soil men.” The sewer gases could kill directly or through incineration when they burst into flames. “The gas sometimes caught fire, as in Lyon in July 1749.”

Not to mention the stench of human corpses. “Then there was the smell of bodies buried in and around churches, often in shallow graves, before a 1776 decree banished graveyards to outside urban areas” in France. The situation before that decree was a nightmare: one writer complained of the “mephitic vapours” emitted by the country’s cemeteries into the surrounding cities.

Outside of cities, the air was not necessarily better. Indeed, the countryside our ancestors knew has been described as “a concentration of bad smells: sweaty livestock, poultry droppings, rotting rat carcases, bodies living together in a single room, rubbish hidden in dark corners, and combustible fumes steaming from the dung heap outside the door.” Bizarrely, rural people sometimes took pride in the filth and used “the height of dung heaps as a measure of wealth.”

One might imagine that noses were dulled from the constant assault on the senses. Yet despite constant exposure to “the terror of omnipresent putrid smells,” that did not happen: “In the sixteenth and seventeenth centuries, the extraordinary stench in towns and cities and at court did not weaken people’s sense of smell.” On the contrary, “The local population’s sense of smell, long accustomed to the urban fug, was triggered afresh by unusual events such as unexpected flooding from the Isère or Drac rivers, which left behind a tide of ‘stinking mud, a mix of latrines and graves,’ as one observer wrote in 1733.”

Not only were our ancestors surrounded by horrific smells, but they themselves were often rather stinky. The frequency with which most ordinary people today bathe, wash their hands, and engage in other bodily cleansing would utterly bewilder their preindustrial ancestors, who often feared contact with water as a threat to health. In 16th-century France, “the culture set little store by cleanliness, water being considered dangerous.” Daily bathing would have been seen as eccentric and possibly harmful. “The population should be imagined as filthy, crawling with vermin and scabies-ridden.” Muchembled quotes a French writer who described how in 1764 some people bathed just once a year, in accordance with tradition, while others had adopted the more modern habit of bathing once a week, once a fortnight, or once a month. When they did bathe, what passed for soap at the time would not pass muster today: one French work from 1764 contains several pages of “soap recipes liable to lead to rough, even wrinkly skin, being heavy on soda ash, quicklime and olive oil.”

Given their smelly surroundings and lack of basic personal hygiene, preindustrial people were certainly in need of a way to conceal the smell of their unwashed, often diseased bodies and stinking breath. Many turned to what they considered to be perfume, although a person today might not recognize the pungent concoctions as such. Most popular perfumes today smell like flowers, such as rose or jasmine, or other sweet things. The perfumes of the past were rather different and, in many cases, would not readily appeal to modern noses. “Heady perfumes painfully extracted from the sex glands of exotic creatures were used in extravagant quantities to hide the ever-present stink.”

“All sixteenth- and seventeenth-century perfumes were saturated with animal base notes made from glandular secretions.” Consider some of the most popular perfume ingredients. “Ambergris” came from the stomachs of sperm whales, “castoreum” from the abdominal sacs of beavers, “civet” from the anal glands of its namesake wildcat, and “musk” from glands between the navel and genitals of the Asian musk deer. In other words, perfume ingredients were far from sweet-smelling; “civet” in particular carried a fecal odor. “Without the [excrement] of martens, civets, and other animals, would we not be deprived of the strongest and best scents?” asked Sophia, Electress of Hanover (1630–1714), in a letter.

Other ingredients were added to these bases. Some are still beloved today, such as roses; others were less sweet. In 1522, among the perfume ingredients sold by a French apothecary were litharge (a form of lead), verdigris (which is mildly poisonous), asafoetida (colloquially known as “devil’s dung” for its fecal stench), and sulphur, which is commonly considered to smell like rotting eggs. Eau de millefleurs (“water of a thousand flowers”) was a pretty name given to a concoction derived from the urine or dung of a cow; by the late 18th century, a less-repulsive version of this creation, originally “made from cow pats, was later made from musk, ambergris and civet.”

Pigeon blood and goat bile were also acceptable perfume ingredients. Consider a text from 1686 by the French chemist Nicolas Lémery that advised unscrupulous perfumers on how

to make cheap ‘Western’ musk from small quantities of the original product: in the last three days of the moon, feed the blackest rough-footed pigeons you can then find with spike lavender seed and sprinkle them with rose water. Then feed them on beans and pills for fifteen days. Slit their neck on the sixteenth and catch the blood in an earthenware dish standing on hot ashes. Skim off the top, then crown each ounce with a drachma (one-eighth of an ounce, or 576 grains) of genuine oriental musk dissolved in spirits of wine. Add four or five drops of billy goat bile, leave the mixture to steep in good, hot horse manure, and warm it through again.

Such ingredients were common not only in perfume but also in many mainstream beauty treatments. A facial skin treatment promoted by the writer Pierre Erresalde in 1669 “consisted of calves’ feet, river water, white breadcrumbs, fresh butter and egg whites.” Keep in mind that rivers also often functioned as sewers at the time. Nicolas de Blégny, medical adviser to Louis XIV, the Sun King, recommended that court women drink a broth that listed ox bile among its ingredients to improve their complexions. Other ingredients used in his recommendations included crushed snails, pearls dissolved in pork fat, frog sperm, and, of course, lead. Another popular makeup sometimes made with a form of lead was “virgin’s milk,” which corroded the very skin it was meant to improve: “Virgin’s milk, used to whiten the skin, contained litharge, which was very harsh on the skin and deeply toxic.” Other beautifiers containing lead included “ceruse” and “vinegar of Saturn.”

Minor blemishes “were treated with silver sublimate, white lead and vitriol, while litharge was regularly recommended as a skin whitener. . . . One recipe called for a freshly killed white hen whose blood was to be rubbed on spots or freckles and left on to dry.” One French recipe for preventing a suntan (because pale skin was fashionable) called for “half a dozen whelps mixed with calf’s blood, pigeon droppings, a pigeon with its innards stripped out, the ‘blood of a male hare’ mixed with ‘an equal part of the urine from the person who is to use it’, and ox bile.”

The animal-based scents of ambergris, musk, and civet only fell out of favor around the mid-18th century, when flowery and fruit-based fragrances came into vogue. Masking the powerful odor of unbathed bodies with these concoctions of animal secretions, toxins, and stinking ingredients was not perfume’s sole use, however. Perfume was also used to fight the bubonic plague.

Many of our preindustrial forebears thought that the plague spread through foul air. In the town of Arras in France, in 1655, a rule banned feeding pigs, whose foul odor was thought to corrupt the air in a way that spread the plague. In 1604, a French physician complained that some peasants tried to prevent the plague by eating “cheese on an empty stomach.” In contrast, “mainstream advice [recommended that people] keep sniffing at a pomander, sprig of herbs or flowers, or a sponge dipped in vinegar and rose water when out walking.” Many ordinary people used pomanders for this purpose: “The fashion for pomanders was by no means limited to the aristocracy or the wealthy. It was perfectly possible to make one simply by sticking cloves into an orange or lemon or even a ball of clay with various scents kneaded into it.”

Other physicians advised rubbing “genuine scorpion oil” over the body on the theory that “one venom or poison often cures or drives out another.” Foul smells were thus thought to offer protection against the plague, which was itself theorized to derive from putrid air. The cures of physicians often closely resembled folk remedies. “The doctor Jean de Renou wrote in 1624 that his colleagues were using rat droppings to treat kidney stones, dog dirt for throat infections, and peacock droppings for ‘falling sickness’ (epilepsy), while human excrement was ‘marvellously suppurative.’ . . . Madame de Sévigné used spirit of urine against rheumatism and the vapours. Some doctors believed that one cure for airborne contagion was breathing in an even fouler smell. . . . This was a serious medical opinion, not folk wisdom; its popularity among the poorest sections of society was doubtless due to the fact it was free.”

Many people seeking to ward off the plague thus “sniffed rotting cheese, drank their own urine, bred goats to keep their homes safe, and breathed in the air from privies first thing in the morning on an empty stomach. One German doctor was still recommending privy sniffing as late as 1680.” In Poland, some “fought the epidemic by throwing the stinking carcases of dogs, horses, cattle, ewes and wolves into the streets on the grounds that ‘the horrible stench drives out the pestilential air.’”

“Garlic and rue were considered to smell vile” and hence to offer protection against the plague. Rue-based perfume actually may have offered some protection because rue naturally repels fleas (a spreader of bubonic plague), but merely sniffing rue occasionally or placing rue in one’s mouth, as was sometimes advised, likely made no difference. The use of garlic, which does not repel fleas, was just as popular as a preventive measure—some physicians advised washing one’s hands and face with “garlic vinegar or rue.” One common concoction meant to drive plague contagion out of the air was “four thieves vinegar,” which counted garlic and rue among its ingredients, along with onion and the pungent asafoetida.

In 1624, the physician Jean de Renou advised “not only mainstream protections such as . . . scorpion’s oil” but also “unicorn horn, mercury, viper flesh . . . mummia (a medicine made from powdered mummies), the mythical bezoar” and many other bizarre, dangerous, or simply nonexistent things. (A bezoar is a hard mass of undigested matter sometimes found lodged in the gastrointestinal tracts of animals such as oxen and horses; these objects were widely believed to have magical healing properties in much of the preindustrial world, in areas as diverse as China and Europe.) There were worse cures still, such as those offered by the physician Blégny (1652–1722):

Nicolas de Blegny also had several even more astonishing recipes to cure those suffering from the dreadful disease. Take, he wrote, large toads in the hottest July days, hang them upside down by a small fire, then dry them and their vomit in the oven. Grind them to powder to be shaped into small flat medallions. Sprinkle these generously with theriac and apply them over the heart in a pouch. The same result could be obtained by placing large toads in a pot over the fire, dissolving the resulting powder in white wine and drinking the mix in bed in the morning, leading to profuse sweating.

Even royalty were subjected to horrific, unscientific cures. Recall that de Blégny gave medical advice to Louis XIV, and consider another recipe that he recommended:

Dog excrement, ground and soaked in vinegar and plantain water, was, in Blégny’s expert opinion, an excellent remedy for diarrhoea when applied as a hot, if rather smelly, poultice. Nosebleeds needed a liquid blend of donkey droppings that were ground and mixed with plantain syrup, certainly intended to attenuate the taste and smell. Fresh pig’s droppings could also be dried on a fire-shovel, ground, heated and inhaled. It is interesting to think that the king, who hired the imaginative doctor in 1682, might have tried out some of his bold ideas.

Tobacco, with its strong odor, was also considered a miraculous cure for many ailments. Unsurprisingly, then, “tobacco also had a role to play in the fight against the Black Death.” In Europe, tobacco was among the scents frequently sniffed to ward off the bubonic plague and fight the putrid and omnipresent stench of toxic and occasionally lethal sewage gas. “In England, the night soil men described in Daniel Defoe’s 1722 Journal of the Plague Year followed medical advice to the letter, working with garlic and rue in their mouths and smoking scented tobacco.”

Blog Post | Wealth & Poverty

Dinner With Dickens Was Slim Pickins

Claims that characters in "A Christmas Carol" were better off than modern Americans are pure humbug.

Summary: There have recently been widespread claims that Dickens’s working poor were better off than modern minimum-wage workers. Such comparisons rely on misleading inflation math and selective reading. The severe material deprivation of Victorian life—crowded housing, scarce possessions, and basic sanitation problems—dwarfs today’s standards. Modern Americans, even at the lower end of the income scale, enjoy far greater material comfort than the Cratchits ever did.


Christmas is often a time for nostalgia. We look back on our own childhood holidays. Songs and traditions from the past dominate the culture.

Nostalgia is not without its purposes. But it can also be misleading. Take those who view the material circumstances of Charles Dickens’s “A Christmas Carol” as superior to our own.

Claims that an American today earning the minimum wage is worse off than the working poor of the 19th century have been popular since at least 2021. A recent post with thousands of likes reads:

Time for your annual reminder that, according to A Christmas Carol, Bob Cratchit makes 15 shillings a week. Adjusted for inflation, that’s $530.27/wk, $27,574/yr, or $13.50/hr. Most Americans on minimum wage earn less than a Dickensian allegory for destitution.

This is humbug.

Consider how harsh living conditions were for a Victorian earning 15 shillings a week.

Dickens writes that Mr. Cratchit lives with his wife and six children in a four-room house. It is rare for modern residents of developed nations to crowd eight people into four rooms.

It was common in the Victorian era. According to Britain’s National Archives, a typical home had no more than four rooms. Worse yet, it lacked running water and a toilet. Entire streets (or more) would share a few toilets and a pump with water that was often polluted.

The Cratchit household has few possessions. Their glassware consists of merely “two tumblers, and a custard-cup without a handle.” For Christmas dinner, Mr. Cratchit wears “threadbare clothes” while his wife is “dressed out but poorly in a twice-turned gown.”

People used to turn clothing inside-out and alter the stitching to extend its lifespan. The practice predated the Victorian era, but continued into it. Eventually, clothes would become “napless, threadbare and tattered,” as the historian Emily Cockayne noted.

The Cratchits didn’t out-earn a modern American earning the minimum wage. Mr. Cratchit’s weekly salary of 15 shillings in 1843, the year “A Christmas Carol” was published, is equivalent to almost £122 in 2025. Converted to U.S. dollars, that’s about $160 a week, for an annual salary of $8,320.

The U.S. federal minimum wage is $7.25 per hour or $15,080 per year for a full-time worker. That’s about half of what the meme claims Mr. Cratchit earned. Only 1% of U.S. workers earned the federal minimum wage or less last year. Most states set a higher minimum wage. The average worker earns considerably more. Clerks like Mr. Cratchit now earn an average annual salary of $49,210.
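For readers who want to check the comparison, here is a minimal sketch of the arithmetic using only the figures quoted above; the £122-to-$160 conversion and the assumption of 52 full-time weeks come from the article's own numbers, not from official exchange-rate or labor statistics:

```python
# Illustrative back-of-the-envelope arithmetic using the figures quoted in the text.
# The 1843 -> 2025 inflation conversion (15 shillings ~ GBP 122/week) and the
# GBP -> USD conversion (~$160/week) are taken from the article, not recomputed.
cratchit_weekly_usd = 160                      # ~$160 a week, per the text
cratchit_annual_usd = cratchit_weekly_usd * 52

federal_min_wage = 7.25                        # USD per hour
min_wage_annual = federal_min_wage * 40 * 52   # full-time, 52 weeks

print(cratchit_annual_usd)   # 8320
print(min_wage_annual)       # 15080.0
print(round(min_wage_annual / cratchit_annual_usd, 2))  # 1.81
```

The ratio makes the meme's error concrete: $15,080 is roughly half of the $27,574 the meme attributes to Cratchit, but nearly double the roughly $8,320 his wage is actually worth in today's money.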

Mr. Cratchit couldn’t have purchased much of the modern “basket of goods” used in inflation calculations. Many of the basket’s items weren’t available in 1843. The U.K.’s Office for National Statistics recently added virtual reality headsets to it.

Another way to compare the relative situation of Mr. Cratchit and a minimum-wage worker today is to see how long it would take each of them to earn enough to buy something comparable. A BBC article notes that, according to an 1844 theatrical adaptation of “A Christmas Carol,” it would have taken Mr. Cratchit a week’s wages to purchase the trappings of a Christmas feast: “seven shillings for the goose, five for the pudding, and three for the onions, sage and oranges.” Mr. Cratchit opts for a goose for the family’s Christmas meal. A turkey—then a costlier option—was too expensive.

The American Farm Bureau Federation found that the ingredients for a turkey-centered holiday meal serving 10 people cost $55.18 in 2025. At the federal minimum wage, someone would need to work seven hours and 37 minutes to afford that feast.
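The hours-and-minutes figure follows directly from dividing the meal's cost by the hourly wage; a quick sketch of that division, using the article's numbers:

```python
# How long a federal-minimum-wage earner must work to afford the $55.18 meal
# (both figures are taken from the text above).
meal_cost = 55.18
hourly_wage = 7.25

hours = meal_cost / hourly_wage          # ~7.61 hours
whole_hours = int(hours)
minutes = round((hours - whole_hours) * 60)

print(whole_hours, minutes)  # 7 37 -- i.e., seven hours and 37 minutes
```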

A minimum-wage worker could earn more than enough in a single workday to purchase a meal far more lavish than the modest Christmas dinner that cost Mr. Cratchit an entire week’s pay. And the amount of time a person needs to work to afford a holiday meal has fallen dramatically for the average blue-collar worker in recent years despite inflation. Wages have grown faster than food prices.

There has been substantial progress in living conditions since the 1840s. We’re much better off than the Cratchits were. In fact, most people today enjoy far greater material comfort than did even Dickens’s rich miser Ebenezer Scrooge.

This article was originally published in the Wall Street Journal on 12/23/2025.

Blog Post | Poverty Rates

Modern Freedom Beats Feudal Serfdom

Make the Middle Ages Great Again?

Summary: Some influential voices today romanticize feudalism, but the reality of feudalism was misery for nearly everyone. Life under that system meant hunger, disease, violence, and lives cut brutally short. By contrast, modern societies have lifted billions out of poverty and extended life far beyond what kings and queens once knew. Progress comes from freedom, innovation, and hard work, not a return to the rule of lords and monarchs.


On a recent podcast, Tucker Carlson praised feudalism as “so much better than what we have now” because a ruler is “vested in the prosperity of the people he rules.” This romantic view of medieval hierarchy ignores a brutal reality: For most people, feudalism meant grinding poverty, disease, and early death.

As Gale L. Pooley and I found in our 2022 book Superabundance, society in preindustrial Europe was bifurcated between a small minority of the very rich and the vast majority of the very poor. One 17th-century observer estimated that the French population consisted of “10 percent rich, 50 percent very poor, 30 percent who were nearly beggars, and 10 percent who were actually beggars.” In 16th-century Spain, the Italian historian Francesco Guicciardini wrote, “except for a few Grandees of the Kingdom who live with great sumptuousness … others live in great poverty.”

An account from 18th-century Naples recorded beggars finding “nocturnal asylum in a few caves, stables or ruined houses” where “they are to be seen there lying like filthy animals, with no distinction of age or sex.” Children fared the worst. Paris, according to the French author Louis-Sébastien Mercier, had “7,000 to 8,000 abandoned children out of some 30,000 births around 1780.” These children were then taken—three at a time—to the poor house, with carriers often finding at least “one of them dead” upon arrival.

People were constantly hungry, and starvation was only ever a few bad harvests away. In 1800, even France, one of the world’s richest countries, had an average food supply of only 1,846 calories per person per day. Given that the average person needs about 2,000 calories a day, much of the population was undernourished. That, in the words of the Italian historian Carlo Cipolla, gave rise to “serious forms of avitaminosis,” or medical conditions resulting from vitamin deficiencies. There was also, he noted, a prevalence of intestinal worms, which he described as “a slow, disgusting, and debilitating disease that caused a vast amount of human misery and ill health.”

Sanitation was a nightmare. As the English historian Lawrence Stone wrote in his book The Family, Sex and Marriage in England 1500–1800, “city ditches, now often filled with stagnant water, were commonly used as latrines; butchers killed animals in their shops and threw the offal of the carcasses into the streets; dead animals were left to decay and fester where they lay.” London had “poor holes” or “large, deep, open pits in which were laid the bodies of the poor, side by side, row by row.” The stench was overwhelming, for “great quantities of human excrement were cast into the streets.”

The French historian Fernand Braudel found that in 15th-century England, “80 percent of private expenditure was on food, with 20 percent spent on bread alone.” An account of 16th-century life in rural Lombardy noted that peasants lived on wheat alone: Their “expenses for clothing and other needs are practically non-existent.” Per Cipolla, “One of the main preoccupations of hospital administration was to ensure that the clothes of the deceased should not be usurped but should be given to lawful inheritors. During epidemics of plague, the town authorities had to struggle to confiscate the clothes of the dead and to burn them: people waited for others to die so as to take over their clothes.”

Prior to mechanized agriculture, there were no food surpluses to sustain idle hands, not even those of children. And working conditions were brutal. A 16th-century ordinance in Lombardy found that supervisors in rice fields “bring together a large number of children and adolescents, against whom they practice barbarous cruelties … [They] do not provide these poor creatures with the necessary food and make them labor as slaves by beating them and treating them more harshly than galley slaves, so that many of the children die miserably in the farms and neighboring fields.”

Such violence pervaded daily life. Medieval homicide rates reached 150 murders per 100,000 people in 14th-century Florence. In 15th-century England, it hovered around 24 per 100,000. (In 2020, the Italian homicide rate was 0.48 per 100,000. It was 0.95 per 100,000 in England and Wales in 2024.) People resolved their disputes through physical violence because no effective legal system existed. The serfs—serfdom in Russia was abolished only in 1861—lived as property, bound to land they could never own, subject to masters who viewed them as assets rather than humans. And between 1500 and the first quarter of the 17th century, Europe’s great powers were at war nearly 100 percent of the time.

Carlson’s nostalgia for feudalism is not unique on the MAGA right. The influential American blogger Curtis Yarvin, for example, attributes to monarchs such as France’s Louis XIV decisive and long-term leadership that modern democracies apparently lack. But less frequently mentioned is how, for example, that same Louis ruined his country during the War of the Spanish Succession. As Winston Churchill wrote in Marlborough: His Life and Times,

After more than sixty years of his reign, more than thirty years of which had been consumed in European war, the Great King saw his people face to face with actual famine. Their sufferings were extreme. In Paris the death-rate doubled. Even before Christmas the market-women had marched to Versailles to proclaim their misery. In the countryside the peasantry subsisted on herbs or roots or flocked in despair into the famishing towns. Brigandage was widespread. Bands of starving men, women, and children roamed about in desperation. Châteaux and convents were attacked; the market-place of Amiens was pillaged; credit failed. From every province and from every class rose the cry for bread and peace.

The Great Enrichment of the past 200 years or so, a phrase coined by my Cato Institute colleague Deirdre McCloskey, lifted billions from the misery that defined human existence for millennia. It was driven by market economies and limits on rulers’ arbitrary power, not feudal hierarchy.

There are many plausible reasons for Carlson’s (and Yarvin’s) openness to giving pre-modern institutions such as feudalism and absolute monarchy a second look. One is a lack of appreciation for the reality of the daily existence of ordinary people whose lives, in the immortal words of the English philosopher Thomas Hobbes, were “poor, nasty, brutish, and short.”

Another is their apparent conviction that the United States is, in the words of President Donald Trump, “a failed nation.” Except that we are nothing of the sort. The United States has plenty of problems, but the lives of ordinary Americans in 2025 are incomparably better than those of the kings and queens of the past. Our standard of living is, in fact, the envy of the world, which is the most parsimonious explanation for millions of people trying to get here.

Solving the problems that remain and will arise in the future will depend on careful evaluation of evidence, historical experience, reason, and hard work. Catastrophism does not help, for it rejects human agency by declaring that the future is already decided. Hunkering down under a protective shield of feudal hierarchy or placing our trust in a modern incarnation of Louis XIV is no guarantee of success. We tried it before, and the results were disastrous.

This article originally appeared in The Dispatch on August 26, 2025.

JMP | Water & Sanitation

A Quarter of the World Population Gained Safe Water Since 2000

“Between 2000 and 2024, the global population increased from 6.2 billion to 8.2 billion. Over this period, a quarter of the world’s population (2.2 billion) gained access to safely managed drinking water, and a third (2.8 billion) gained safely managed sanitation. But while billions have gained access to WASH services, progress has been uneven and the total number of people still lacking access has decreased more slowly.

Since the start of the SDG period in 2015, 961 million have gained safely managed drinking water and the number of people still lacking access has decreased by 270 million. Among the 2.1 billion people still lacking access to safely managed drinking water in 2024, two thirds (1.4 billion) had a basic service, 287 million used limited services, 302 million used unimproved sources and 106 million still used surface water (61 million fewer people used surface water than in 2015).

Between 2015 and 2024, 1.2 billion people gained safely managed sanitation, and the number of people without decreased from 3.9 billion to 3.4 billion. In 2024, more than half of these people (1.9 billion) had a basic service, 560 million used limited services, 555 million with unimproved services and 354 million still practised open defecation (the number of people practising open defecation has decreased by 429 million since 2015).

Since 2015, 1.5 billion people have gained access to basic hygiene services and the number of people who are still unserved has fallen by nearly 900 million (from 2.5 billion to 1.7 billion). Among the 1.7 billion people who still lacked basic hygiene services in 2024, two thirds (1 billion) had a limited service and 611 million still had no handwashing facility.”

From JMP.

DD News | Water & Sanitation

Tap Water Coverage Crosses 81 Percent in Rural India

“More than 15.68 crore rural households – 81% of the total 19.36 crore – now have tap water connections under the government’s flagship Jal Jeevan Mission (JJM), Minister of State for Jal Shakti V. Somanna informed the Rajya Sabha on Monday.

At the time of announcement of JJM, 3.23 crore (17%) rural households were reported to have tap water connections.”

From DD News.