
Blog Post | Human Development

Grim Old Days: Emily Cockayne’s Hubbub: Filth, Noise & Stench in England, 1600–1770

The book gives insight into a far crueler and more unpleasant society than people today can easily fathom.

Summary: The realities of life in preindustrial England reveal a world teeming with physical discomforts, social cruelty, and environmental hazards unimaginable to modern sensibilities. England from 1600 to 1770 was plagued by disease, primitive hygiene, adulterated food, and oppressive punishments. Far from the romanticized notions of simpler times, living in this early modern time and place often meant enduring relentless hardship and indignity.


British historian Emily Cockayne’s Hubbub: Filth, Noise & Stench in England, 1600–1770 provides a window into the lives of ordinary people in the preindustrial and early industrial era. A “social history,” the book conveys how the world sounded, smelled, felt, and tasted—a horror show beyond the comprehension of most modern people. The chapters bear titles such as “Itchy,” “Mouldy,” “Grotty,” “Dirty,” and “Gloomy.”

A preindustrial person transported to the present day would be amazed by the current prevalence of relatively smooth, clear skin, enabled by better general health as well as the widespread use of sunscreen, moisturizers, and all manner of modern beauty treatments. In the past, frequent illnesses left victims permanently marked. To be “Pock-broken” or “pock-freckled” was a common descriptor. Skin was often directly disfigured by diseases and further damaged by the compulsive scratching provoked by fleas and then-common medical conditions. “Fleas would have been a common feature of institutions and inns, as well as domestic settings,” proliferating in crammed households, cities, and seaports. A Dutch traveler named William Schellinks (1623–1678) found the fleas in one English inn so “aggressive” that he opted to sleep on a hard bench rather than the provided bed. But fleas were far from the sole culprit. “Many conditions would have caused itching, including eczema, impetigo, ‘psorophthalmy’ (eyebrow dandruff), scabies, chilblains, chapped and rough skin, ‘tetters’ (spots and sores), ‘black morphew’ (leprous or scurvy skin) and ringworm. Few citizens [of Britain] enjoyed smooth unblemished skin.”

If you could visit the past, you would be shocked at the commonness not just of pockmarks but also of oozing open sores. “Venereal disease was the secret epidemic that blighted the entire period,” resulting in such outward signs as “weeping sores on the lips” and “pocky” countenances. Many other diseases also produced wounds that festered and exuded foul discharges on the faces of everyday people. “In this pre-antibiotic era, skin eruptions in the forms of bulging pustules, lesions, acne and gout-induced ulcers could all have become infected, causing chronic wounds.” Such skin problems affected all social classes. “In 1761, as an Oxford undergraduate, the parson-in-waiting James Woodforde . . . was plagued by a ‘bad Boyle on my Eye-brow’. This boil reappeared the following year, to be joined by a stye among his lower right eyelashes.”

With so many faces covered in scars, as well as boils and sores emitting blood and infected pus, it is an understatement to say the people of the past were in desperate need of skincare. Sadly, their primitive skincare and makeup regimens made matters even worse. “Caustic and toxic ingredients lurked in many ready-made and home-mixed cosmetics and toiletries. Eliza Smith’s cure for pimples included brimstone (sulphur). Johann Jacob Wecker suggested the use of arsenic and ‘Dogs-turd’ as ingredients for ointments to ‘make the nails fall’. The Duchess of Newcastle warned that the mercury in some cosmetics could cause consumption and oedema. Indeed, some preparations were so toxic that they could ‘take away both the Life and Youth of a Face, which is the greatest Beauty.’ The Countess of Coventry was said to have died from toxic properties in her cosmetics.” That countess, Maria Coventry née Gunning (1732–1760), died at age 27, likely of lead poisoning, as lead was a common ingredient in skin-whitening makeup at the time, despite lead’s propensity to make its wearers ill (or, in Maria’s case, deceased).

Maria, Countess of Coventry, killed by lead in her makeup. Photo credit: Wikimedia.

Even nonlethal makeup was of far poorer quality than today’s cosmetics, frequently dissolving and dripping. Women “shunned hot places for fear of melting visages.” Even royalty, with access to the best cosmetics of the era, fell victim to this tendency of makeup to drip. One observer remarked in his diary after seeing the queen of England at a banquet in 1662 that “her make-up was running down her sweaty face.”

The state of clothing for the masses contributed to skin and health problems. The truly poor bought used garments. “Poorer citizens rarely bought new items of clothing, but made do with second-, third- and fourth-hand clothes. . . . By the time they reached the poorest members of society, garments would be smutted, food-stained, sweat-ridden, pissburnt and might shine with grease. . . . Clothes in such a state would be hard, unyielding and smelly.”

“The second-hand market was a thriving one” in early modern London. “Some specialised in old shoes, or even old boots. [The Dutch-born artist] Marcellus Laroon included an image of a trader who exchanged brooms for cast-off shoes . . . in his Cryes of London (1688). . . . A high demand for second-hand clothing meant that garments constituted a considerable proportion of property that was stolen. Thomas Sevan was apprehended . . . wearing three stolen shirts in 1724. He had left his old ragged shirt behind at the scene of the crime. Elizabeth Pepys’s new farandine waistcoat was snatched from her lap as she sat in traffic in Cheapside. On Easter Monday 1732 John Elliott became the victim of highway robbers who relieved him of his hat, wig, waistcoat and shoes. . . . No item of clothing was immune from theft—even odd shoes and bundles of dirty washing were lifted.”

“Clothes could be taken to a botcher, or a botching tailor, for patching and repair. . . . Old shoes were rejuvenated or modified by cobblers, or ‘translators.’ The subsequent wearers of shoes would have worked their feet into spaces stretched to fit a foreign shape, which might have caused blisters, bunions and corns. . . . Partial unstitching and ‘turning’—the inner parts becoming the new exterior—could prolong the life of coats and other garments. Even the rich eked out the life of their favourite garments by turning, dyeing and scouring. . . . However, clothes could only be refashioned a limited number of times before they became napless, threadbare and tattered. If enough good fabric remained, this could be reused to make a smaller item of clothing, a garment for a child, or a cloth cap. . . . Tired garments were passed down to apprentices or servants.”

The condition of teeth was also disturbingly poor. “Queen Elizabeth sported black teeth. Emetics were popular cure-alls, and these would have hastened tooth decay through the acidic erosion of the enamel. Archaeological surveys suggest that the majority of early modern adults suffered tooth decay.” While they did not meet with much success, the people of the past certainly attempted to keep their teeth from rotting. “There was an array of dentifrice powders and cures on the market. Although most would have had little or no effect on cavities or diseased gums, some of these powders and recipes would have carried away some dirt and plaque from teeth. Powders were concocted from cuttlefish, cream of tartar and sal amoniack (ammonium chloride). These abrasive substances could be rubbed on” teeth, and some recommended “hard rubbing with a dry cloth or sage leaf” to cleanse teeth. The writer Thomas Tryon (1634–1703) recommended swishing river water as a mouthwash. Needless to say, such routines were insufficient. “A lack of adequate tooth cleansing and an inappropriate diet led to bad breath and also caused tooth decay.” Missing teeth were common. “A character in an eighteenth-century play bemoaned the poor dental state of London’s women” by claiming that “not one in ten has a Tooth left.” When those suffering from toothaches sought dental care, what passed for dentistry at the time could make matters even worse. Consider the unfortunate case of the English lawyer and politician Dudley Ryder (1691–1756). “After spending a month in 1715 chewing on just one side of his mouth to avoid the pain of a severely decayed tooth, Dudley Ryder finally summoned up the courage to have it drawn. In the process, a little of his jaw was broken off, but he rallied, claiming it didn’t hurt. Much. By the mid-eighteenth century wealthier citizens would have the option of trying out a transplant, using teeth from a paid donor.”

Tooth and skin problems were visible, but internal ailments that were less apparent also plagued our ancestors. One of the many negative health effects of animals crowding the cities was that parasites from the creatures often spread to humans. “The abundance of dogs and pigs on the city streets provided the perfect breeding ground for a variety of intestinal parasites, many of which wormed their way into humans. Eliza Smith asserted that ‘vast numbers’ were infested. Many bottoms would have itched with discomfort thanks to the presence of thread and tape worms in the digestive system. According to the numerous contemporary adverts, worms created a myriad of physical discomforts, including ‘pinching Pain in the Belly, when hungry, a stinking Breath’, vomiting, nightmares, pallidness, fever and teeth gnashing.” The animals caused other problems as well. “Neighbours near to houses in which beasts were kept or slaughtered would have endured stench and noise.” For example, “those living near Lewis Smart’s huge piggery on London’s Tottenham Court Road described how servants fell sick and resigned on account of the smell, which ‘Drive thro’ the walls of the houses.’ Visitors to the house opposite were forced to hold their noses, and one neighbour explained how the fumes dirtied newly laundered linen and tarnished plate.”

The people of the past often went hungry. “Recording a high rate of corn spoilage in 1693, due to a wet summer season, [the English antiquary] Anthony Wood noted that scarcity pushed prices out of the pockets of the poor, who were forced to ‘eat turnips instead of bread’. During this dearth [the writer] Thomas Tryon outlined a diet for a person on a budget of twopence per day. The recipes are uniformly bland: flour, water, milk and peas, all boiled to differing consistencies.”

Food often spoiled during transport to the market. “Eggs that came to London from Scotland or Ireland were often rotten by the time they arrived.” Food was often adulterated, and some degree of adulteration was considered unavoidable. Malt was only deemed unacceptable if it contained “half a peck of dust or more” per quarter. “Butchers would disguise stale slaughtered birds. [A contemporary account] warns of one such operator who greased the skin and dredged on a fine powder to make the bird strike ‘a fine Colour.’” Butter was frequently adulterated with “tallow and pig’s lard.” “Some fishmongers coated gills with fresh blood, as red gills indicated a recent netting,” to misrepresent stale fish to the unwary buyer. Fish were often wormy and, if not cooked thoroughly, remained so when served. The English diarist Samuel Pepys (1633–1703) once noted his disgust at the sight of a sturgeon dish upon which he observed “very many little worms creeping.”

Bread, the mainstay of most diets, was not immune to contamination. “Some loaves were deliberately adulterated with stones and other items to bulk them up.” In 1642, an unscrupulous Liverpool woman named Alice Gallaway “tried to sell a white loaf that contained a stone, to make up its weight. This sort of practice would have been widespread—the baker could claim that the stone had not been removed in milling, and blame the miller. Stone, grit and other unwelcome contaminants would have posed dangers to the teeth of the unwary.” Millers also engaged in such unethical behavior as adding “beanmeal, chalk, animal bones and slaked lime” to disguise musty flour. Perhaps it should be no surprise, then, that London bread was described in 1771 as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.”

There are even accounts of human remains being added to food for sale, resulting in unknowing cannibalism on the part of the buyer. The author of the 1757 public health treatise Poison Detected claimed, “The charnel houses of the dead are raked to add filthiness to the food of the living.” The squalid state of the marketplace further exposed food to pollution or contamination. “The market stalls, and the streets on which they stood, were frequently described as being filthy and strewn with rotting debris.” Flies and other insects swarmed each market. “Hanging meats were vulnerable to attack by hopper-fly, and if they got too warm they would rust and spoil.” The smoke of London’s chimneys was said to fill the air and “so Mummife, drye up, wast and burn [hanging meat in the marketplace], that it suddainly crumbles away, consumes and comes to nothing.”

The population was so accustomed to foul-smelling meat that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.” Between the bugs, the smoke, and the dirt, few groceries reached customers unscathed. One 18th-century writer complained of “pallid contaminated mash, which they call strawberries; soiled and tossed by greasy paws through twenty baskets crusted with dirt.” The state of the marketplace even inspired deprecating lyrics, such as these from 1715: “As thick as Butchers Stalls with Fly-blows [where] every blue-ars’d Insect rambles.” “As the market day progressed, perishables . . . were more likely to be fly-blown or decayed.” Those undesirable leftovers unsold at the end of the market day were often later hawked by street vendors. A letter in The Spectator in 1712 complained that everything sold by such vendors was “perished or putrified.” Recipes took into account the poor quality of available ingredients. “Imparting some dubious tips for restoring rotting larder supplies, [cookbook author] Hannah Glasse’s strategy ‘to save Potted Birds, that begin to be bad’ (indeed, those which ‘smell so bad, that no body [can] . . . bear the Smell for the Rankness of the Butter’) involved dunking the birds in boiling water for thirty seconds, and then merely retopping with new butter.”

Yet those shopping at the marketplace with all its terrors were relatively fortunate compared to others. “Broken victuals, the remnants and scrapings from the more affluent plates, were a perk of service for some servants, and the saviour of many paupers.” One account from 1709 tells of a woman reduced to living off “a Mouldy Cryst [crust] and a Cucumber” while breastfeeding, an activity that greatly increases caloric needs. Desperation sometimes resulted in swallowing nonfood objects, such as wax, to ease hunger pangs. “Witnesses reported that a young London servant girl was so hungry in 1766 that she ate cabbage leaves and candles.” She was far from the first person to use candle wax as a condiment. “The underfed spread butter thickly on bread (this was necessary to facilitate swallowing dark or stale bread). Cheap butter was poor grade, akin to grease . . . a ‘tallowy rancid mass’ made of candle ends and kitchen grease was the worst type” of concoction to pass under the name of butter. Another account of hunger from 1756 relates how a starving woman felt “obliged to eat the cabbage stalks off the dunghill.”

The people of the past also had good reason to wonder whether their homes would collapse around them. “A proverb warned that ‘old buildings may fall in a moment’. So familiar was the sound of collapsing masonry that in 1688 Randle Holme included ‘a crash, a noise proceeding from a breach of a house or wall’ in a list of only nine descriptive sentences to illustrate the ‘Sense of Hearing’. Portmeadow House in Oxford collapsed in the early seventeenth century. Among the casualties recorded in the Bills of Mortality for 1664 was one hapless soul killed by a falling house in St Mary’s Whitechapel . . . Dr Johnson described London of the 1730s as a place where ‘falling Houses thunder on your Head.’ . . . In the 1740s, ‘Props to Houses’ appeared among a list of common items hindering free passage along the pavement in London. A German visitor wondered if he should go into the street in 1775 during a violent storm, ‘lest the house should fall in, which is no rare occurrence in London.’” “Thomas Atwood, a Bath plumber and property developer, died in 1775 when the floor of an old house gave way.” Regulations sometimes made matters worse, preventing the tearing down of homes on the verge of collapse. One account notes that homes in disrepair became “the rendezvous of thieves; and at last . . . fall of themselves, to the great distress of whole neighborhoods, and sometimes to bury passengers in their ruins.” Windy days could knock down homes. “Gales swept [London] in 1690, leaving ‘very many houses shattered, chimneys blowne down.’”

Inside, homes were often filled with smoke from fireplaces. “With open fires providing most of the heating, filthy discharges of soot and smut clung to interiors.” Even with regular chimney sweepings, clogged chimney pots and soot deluges could and did occur. One writer railed against the “pernicious smoke . . . superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.” Interior smoke disturbed the air of the humblest homes and the grandest palaces alike. The German consul Zacharias Conrad von Uffenbach (1683–1734) complained that the Painted Chamber of London’s Westminster Hall could “scarce be seen for the smoke” that filled the interior; in the Upper Chamber he similarly noted that the tapestries were “so wretched and tarnished with smoke that neither gold nor silver, colours or figures can be recognized.”

“Householders struggled to contain infestations of vermin.” This was a problem even in well-off homes. Samuel Pepys recorded in his diary his multiyear struggle with mice, which “scampered across his desk” with abandon despite his purchase of a cat and deployment of mousetraps. “In 1756 Harrop’s Manchester Mercury ran an advert for a book detailing how to rid houses of all manner of vermin,” including adders, ants, badgers, birds, caterpillars, earwigs, flies, fish, fleas, foxes, frogs, gnats, lice, mice, moles, otters, polecats, rabbits, rats, snakes, scorpions (an invasive species of which had entered England via Italian masonry shipments), snails, spiders, toads, wasps, weasels, and worms.

As if that wasn’t enough to keep people up at night, nighttime was loud. Crying babies and the moaning of the hungry, ill, and dying echoed in the night, as well as the pained wails of women suffering through domestic violence. In London, in 1595, a law was passed to prevent men from beating their wives after 9 p.m. The legislation was not prompted by concern for the wives (after all, wife-beating was generally accepted as normal and morally unproblematic) but by consideration for neighbors trying to sleep through the noise. The law read in part: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” A similar law forbade smiths from using their hammers “after the houre of nyne in the night, nore afore the houre of four in the Morninge.”

Cockayne’s book also reveals a society far crueler and more violent than our own. Legal punishments could be grotesque and sadistic. For example, in 1611, a woman who had conducted “lewd acts . . . was punished by the Westminster burgesses by being stripped naked from the waist upwards, fastened to a cart, and whipped through the streets on a cold December day.” Women deemed “scolds” were often publicly humiliated in ritual fashion. “Ducking stools or cuckstools were equipment for punishing scolds and were items of town furniture [and] were still used as a deterrent in the eighteenth century. Ducking was a rite of humiliation intended to put the woman in her place and to teach her a lesson.” Many towns took pride in the maintenance of their ducking stools, and some also kept a device with a similar rationale called a “scold’s bridle,” an iron muzzle that enclosed the head and compressed the tongue to silence the unfortunate wearer.

“Across the country [of England] the civic authorities ensured that their cuckstools were functioning. In 1603 the Southampton authorities complained that ‘the Cuckinge stoole on the Towne ditches is all broken’ and expressed their desire for a new one, to ‘punish the manifold number of scoldinge woemen that be in this Towne’. The following year they wondered whether a stool-on-wheels might be invented. This could be ‘carried from dore to dore as the scolde shall inhabit’. This mobile stool would, it was explained, be ‘a great ease to mr mayor . . . whoe is daylie troubled w[i]th suche brawles’. The Oxford Council erected a cuck stool at the Castle Mills in 1647. The Manchester stool was set up in 1602 ‘for the punyshement of Lewde Wemen and Scoldes’ . . . six scolds were immersed in 1627. A decade later the town added a scold’s bridle to their armoury of reform. A new ducking chair was erected in ‘the usual place’ in 1738. Even as late as 1770 a knot and bridle hung from the door of the stationers, near the Dark Entry in the Market Place ‘as a terror to the scolding huxter-women.’”

Outhouses doubled as dumping grounds for victims of infanticide with shocking frequency. “Much of what we know about London’s privies and houses of ease comes from unpleasant witness statements concerning gruesome discoveries of infants’ corpses found among the filth. In the trial of Mercy Hornby for killing her newborn daughter we find details of the privy into which the child was cast. Newly constructed in the 1730s, it was six foot deep, with just over three feet of soil at the time of the incident.”

And that is only a small slice of the manifold horrors detailed in Cockayne’s book, where practically every page provides fresh fodder for nightmares.

Blog Post | Food Production

Wheat Superabundance Proves Malthus Wrong

Compared to 1960, we can grow 250 percent more wheat on 9 percent more land, at an 85.7 percent lower time price.

Summary: For centuries, people feared that population growth would outstrip food supply, leading to famine and collapse. Yet wheat tells a different story: production has soared, yields have multiplied, and the cost in human effort has plummeted. Despite wars, droughts, and disruptions, innovation and open markets have made wheat more abundant than ever.


The Reverend Thomas Malthus (1766–1834) got it backwards. In his 1798 Essay on Population he warned that “the power of population is indefinitely greater than the power in the earth to produce subsistence for man. Population, when unchecked, increases in a geometrical ratio. Subsistence increases only in an arithmetical ratio.”

Malthus even added, with no small dose of condescension, that “a slight acquaintance with numbers will shew the immensity of the first power in comparison of the second.”

When Malthus published his essay, the world’s population hovered around 1 billion. By 1960 it had reached 3 billion. Today it stands at roughly 8.2 billion. And yet, instead of mass starvation, food production has outpaced population growth. Consider wheat.

According to the US Department of Agriculture, since 1960 wheat production has surged by 250 percent, while the world’s population grew by only 171 percent. For every 1 percent increase in population, wheat production rose by 1.46 percent. Even more remarkable, this bounty came from just 9 percent more arable land. Wheat yields—the amount harvested per acre—have soared by 271 percent.

But what about the time price? Glad you asked. Since 1960, the time price of wheat has fallen by 85.7 percent.

Put differently, the working time that once bought a single bushel of wheat now buys almost seven bushels.
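The arithmetic behind these figures can be sketched in a few lines of Python. The percentages are the ones quoted above; the variable names are mine:

```python
# Growth figures quoted in the post, expressed as fractions.
production_growth = 2.50   # wheat production up 250 percent since 1960
population_growth = 1.71   # world population up 171 percent since 1960
time_price_drop = 0.857    # time price of wheat down 85.7 percent

# Wheat production growth per 1 percent of population growth.
growth_ratio = production_growth / population_growth
print(f"Production rose {growth_ratio:.2f} percent per 1 percent population growth")

# A bushel now costs only 14.3 percent of the work time it did in 1960,
# so the work that once bought one bushel now buys 1 / 0.143 bushels.
abundance_multiplier = 1 / (1 - time_price_drop)
print(f"The old bushel's worth of work now buys {abundance_multiplier:.2f} bushels")
```

Both results match the article: roughly 1.46 percent of production growth per 1 percent of population growth, and an abundance multiplier just shy of seven.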

Yes, there have been moments when wheat prices spiked—due to droughts, wars, and politics. Yet with fewer conflicts, relentless innovation, and open markets, wheat has only grown more abundant. If Reverend Malthus could see our world today, I suspect he’d be relieved—and perhaps even delighted—that human ingenuity proved him to be so spectacularly wrong.

Find more of Gale’s work at his Substack, Gale Winds.

Blog Post | Environment & Pollution

Climate Litigation Can’t Fix the Past, but It Can Hinder the Future

Dealing with climate change requires technological innovation and economic growth, not legal warfare between nations.

Summary: The International Court of Justice has suggested nations could be held liable for historic greenhouse gas emissions, opening the door to lawsuits over centuries of industrial activity. Yet this approach risks punishing the very innovations that lifted billions out of poverty and advanced human health and flourishing. Lasting progress on climate challenges will come not from courtroom battles, but from technological solutions and continued economic development.


The International Court of Justice’s advisory opinion purporting to establish legal grounds that would allow nations to sue one another over climate damages represents judicial overreach that ignores economic history and threatens global development. While the opinion was undeniably legally adventurous, the framework it envisages would be practically unworkable as well as economically destructive.

The ICJ’s ruling suggests countries can be held liable for historical emissions of planet-warming gases. That creates an accounting nightmare that no legal system can resolve. How does one calculate damages from coal burned in Manchester in 1825 versus emissions from a Beijing power plant in 2025? How does one stack up the harm caused by a warming world against the benefits of industrialization?

Britain began large-scale coal combustion during the Industrial Revolution, when atmospheric CO2 concentrations were 280 parts per million and climate science did not exist. Holding Britain liable for actions taken without knowledge of consequences violates basic principles of jurisprudence. The same applies to the United States, whose early industrialization occurred during an era when maximizing economic output was considered unambiguously beneficial to human welfare.

Critics of historical emissions ignore what those emissions purchased. British coal combustion powered textile mills that clothed much of the world, steam engines that revolutionized transportation, and factories that mass-produced goods previously available only to elites. American industrialization followed, creating assembly lines, electrical grids, and chemical processes that form the backbone of modern civilization.

These developments were not zero-sum exercises in resource extraction. They created knowledge, infrastructure, and institutions that benefited everyone. The steam engine led to internal combustion engines, which enabled mechanized agriculture that now feeds 8 billion people. Coal-powered steel production made possible skyscrapers, bridges, and the infrastructure that supports modern cities, where most humans now live longer, healthier lives than their ancestors.

The data on human welfare improvements since industrialization began are unambiguous. Global life expectancy increased from approximately 29 years in 1800 to 73 years today. Infant mortality rates fell from over 40 percent to under 3 percent. Extreme poverty, defined as living on less than $2.15 per day in purchasing power parity terms, declined from over 80 percent of the global population in 1800 to under 10 percent today.

Nutrition improved dramatically. Caloric availability per person has increased by roughly 40 percent since 1960 alone, while food prices relative to wages fell consistently. Height, a reliable indicator of childhood nutrition, increased significantly across all regions. Educational attainment expanded from literacy rates below 10 percent globally in 1800 to over 85 percent today.

These improvements correlate directly with energy consumption and industrial development. Countries that industrialized earliest experienced these welfare gains first, then transmitted the knowledge and technology globally. The antibiotics developed in American and European laboratories now save lives worldwide. The agricultural techniques pioneered in industrialized nations now feed populations that would otherwise face starvation.

The International Court of Justice’s liability framework threatens to undermine the very mechanisms that created these welfare improvements. Innovation requires investment, which requires confidence in property rights and legal stability. If successful economic development subjects countries to retroactive liability, the incentive structure tilts away from growth and toward stagnation.

Consider current developing nations. Under this legal framework, should India or Nigeria limit their industrial development to avoid future liability? Should they forgo the coal and natural gas that powered Western development? That creates a perverse situation where the legal system penalizes the exact processes that lifted billions from poverty.

The framework also ignores technological solutions. The same innovative capacity that created the Industrial Revolution is now producing renewable energy technologies, carbon capture systems, and efficiency improvements that address climate concerns without sacrificing development. Market incentives and technological progress offer more promise than legal blame assignment.

Which emissions count as legally actionable? A large fraction of anthropogenic CO2 remains in the atmosphere for centuries, making every emission since 1750 potentially relevant. Should liability begin with James Watt’s steam engine improvements in 1769? With the first coal-fired power plant? With Henry Ford’s assembly line? The temporal boundaries are arbitrary and politically motivated rather than scientifically determined.

Similarly, which countries qualify as defendants? The largest current emitters include China and India, whose recent emissions dwarf historical American and British totals. China alone now produces more CO2 annually than the United States and Europe combined. Any coherent liability framework must address current emissions, not just historical ones.

And where would the money go? This aspect of the case was brought up by Vanuatu. If the island nation receives compensation from the UK and the US, should it not be obliged to pay the British and the Americans for a plethora of life-enhancing Western discoveries, including electricity, vaccines, the telephone, radio, aviation, internet, refrigeration, and navigation systems?

Climate adaptation and mitigation require technological innovation and economic growth, not legal warfare between nations. The countries that industrialized first possess the technological capacity and institutional knowledge to develop solutions to today’s problems. Channeling resources toward litigation rather than innovation represents a misallocation that benefits lawyers while harming global welfare.

The ICJ opinion reflects wishful thinking rather than practical policy. Legal frameworks cannot repeal economic reality or reverse the historical processes that created modern prosperity. Instead of seeking retroactive justice for emissions that enabled human flourishing, policymakers should focus on technologies and institutions that sustain development while addressing environmental concerns. The alternative is a world where legal systems punish success and innovation while offering nothing constructive in return.

The original version of this article was published in National Review on 8/12/2025.

Euronews | Nutrition

More Children Are Obese than Underweight in World-First

“More children worldwide are now obese than underweight for the first time, due partly to rampant junk food access, the United Nations said in a new report.

While the share of school-aged children and teenagers who are underweight has fallen since the turn of the century – from 13 per cent to 9.2 per cent – obesity rates have risen from 3 per cent to 9.4 per cent, according to UNICEF, the United Nations Children’s Fund.

The only regions where children are still more likely to be underweight than obese are sub-Saharan Africa and South Asia.

That translates to about 188 million children aged 5 to 19 with obesity globally in 2025, putting them at risk of serious health complications, the report warned. An estimated 184 million children are underweight.”

From Euronews.