
Blog Post | Human Development

Grim Old Days: Emily Cockayne’s Hubbub: Filth, Noise & Stench in England, 1600–1770

The book gives insight into a far crueler and more unpleasant society than people today can easily fathom.

Summary: The realities of life in preindustrial England reveal a world teeming with physical discomforts, social cruelty, and environmental hazards unimaginable to modern sensibilities. England from 1600 to 1770 was plagued by disease, primitive hygiene, adulterated food, and oppressive punishments. Far from romanticized notions of simpler times, life in this early modern time and place often meant enduring relentless hardship and indignity.


British historian Emily Cockayne’s Hubbub: Filth, Noise & Stench in England, 1600–1770 provides a window into the lives of ordinary people in the preindustrial and early industrial era. A “social history,” the book conveys how the world sounded, smelled, felt, and tasted—a horror show beyond the comprehension of most modern people. The chapters bear titles such as “Itchy,” “Mouldy,” “Grotty,” “Dirty,” and “Gloomy.”

A preindustrial person transported to the present day would be amazed by the current prevalence of relatively smooth, clear skin, enabled by better general health as well as the widespread use of sunscreen, moisturizers, and all manner of modern beauty treatments. In the past, frequent illnesses left victims permanently marked; “Pock-broken” and “pock-freckled” were common descriptors. Skin was often disfigured directly by disease and further damaged by the compulsive scratching that fleas and then-common medical conditions provoked. “Fleas would have been a common feature of institutions and inns, as well as domestic settings,” proliferating in crammed households, cities, and seaports. A Dutch traveler named William Schellinks (1623–1678) found the fleas in one English inn so “aggressive” that he opted to sleep on a hard bench rather than the provided bed. But fleas were far from the sole culprit. “Many conditions would have caused itching, including eczema, impetigo, ‘psorophthalmy’ (eyebrow dandruff), scabies, chilblains, chapped and rough skin, ‘tetters’ (spots and sores), ‘black morphew’ (leprous or scurvy skin) and ringworm. Few citizens [of Britain] enjoyed smooth unblemished skin.”

If you could visit the past, you would be shocked at the commonness not just of pockmarks but also of oozing open sores. “Venereal disease was the secret epidemic that blighted the entire period,” resulting in such outward signs as “weeping sores on the lips” and “pocky” countenances. Many other diseases also produced wounds that festered and exuded foul discharges on the faces of everyday people. “In this pre-antibiotic era, skin eruptions in the forms of bulging pustules, lesions, acne and gout-induced ulcers could all have become infected, causing chronic wounds.” Such skin problems affected all social classes. “In 1761, as an Oxford undergraduate, the parson-in-waiting James Woodforde . . . was plagued by a ‘bad Boyle on my Eye-brow’. This boil reappeared the following year, to be joined by a stye among his lower right eyelashes.”

With so many faces covered in scars, as well as boils and sores emitting blood and infected pus, it is an understatement to say the people of the past were in desperate need of skincare. Sadly, their primitive skincare and makeup regimens made matters even worse. “Caustic and toxic ingredients lurked in many ready-made and home-mixed cosmetics and toiletries. Eliza Smith’s cure for pimples included brimstone (sulphur). Johann Jacob Wecker suggested the use of arsenic and ‘Dogs-turd’ as ingredients for ointments to ‘make the nails fall’. The Duchess of Newcastle warned that the mercury in some cosmetics could cause consumption and oedema. Indeed, some preparations were so toxic that they could ‘take away both the Life and Youth of a Face, which is the greatest Beauty.’ The Countess of Coventry was said to have died from toxic properties in her cosmetics.” That countess, Maria Coventry née Gunning (1732–1760), died at age 27, likely of lead poisoning: lead was a common ingredient in the skin-whitening makeup of the era, despite its propensity to sicken (or, in Maria’s case, kill) its wearers.

Maria, Countess of Coventry, killed by lead in her makeup. Photo credit: Wikimedia.

Even nonlethal makeup was of far poorer quality than today’s cosmetics, frequently dissolving and dripping. Women “shunned hot places for fear of melting visages.” Even royalty, with access to the best cosmetics of the era, fell victim to this tendency of makeup to drip. One observer remarked in his diary after seeing the queen of England at a banquet in 1662 that “her make-up was running down her sweaty face.”

The state of clothing for the masses contributed to skin and health problems. The truly poor bought used garments. “Poorer citizens rarely bought new items of clothing, but made do with second-, third- and fourth-hand clothes. . . . By the time they reached the poorest members of society, garments would be smutted, food-stained, sweat-ridden, pissburnt and might shine with grease. . . . Clothes in such a state would be hard, unyielding and smelly.”

“The second-hand market was a thriving one” in early modern London. “Some specialised in old shoes, or even old boots. [The Dutch-born artist] Marcellus Laroon included an image of a trader who exchanged brooms for cast-off shoes . . . in his Cryes of London (1688). . . . A high demand for second-hand clothing meant that garments constituted a considerable proportion of property that was stolen. Thomas Sevan was apprehended . . . wearing three stolen shirts in 1724. He had left his old ragged shirt behind at the scene of the crime. Elizabeth Pepys’s new farandine waistcoat was snatched from her lap as she sat in traffic in Cheapside. On Easter Monday 1732 John Elliott became the victim of highway robbers who relieved him of his hat, wig, waistcoat and shoes. . . . No item of clothing was immune from theft—even odd shoes and bundles of dirty washing were lifted.”

“Clothes could be taken to a botcher, or a botching tailor, for patching and repair. . . . Old shoes were rejuvenated or modified by cobblers, or ‘translators.’ The subsequent wearers of shoes would have worked their feet into spaces stretched to fit a foreign shape, which might have caused blisters, bunions and corns. . . . Partial unstitching and ‘turning’—the inner parts becoming the new exterior—could prolong the life of coats and other garments. Even the rich eked out the life of their favourite garments by turning, dyeing and scouring. . . . However, clothes could only be refashioned a limited number of times before they became napless, threadbare and tattered. If enough good fabric remained, this could be reused to make a smaller item of clothing, a garment for a child, or a cloth cap. . . . Tired garments were passed down to apprentices or servants.”

The condition of teeth was also disturbingly poor. “Queen Elizabeth sported black teeth. Emetics were popular cure-alls, and these would have hastened tooth decay through the acidic erosion of the enamel. Archaeological surveys suggest that the majority of early modern adults suffered tooth decay.” While they did not meet with much success, the people of the past certainly attempted to keep their teeth from rotting. “There was an array of dentifrice powders and cures on the market. Although most would have had little or no effect on cavities or diseased gums, some of these powders and recipes would have carried away some dirt and plaque from teeth. Powders were concocted from cuttlefish, cream of tartar and sal amoniack (ammonium chloride). These abrasive substances could be rubbed on” teeth, and some recommended “hard rubbing with a dry cloth or sage leaf” to cleanse teeth. The writer Thomas Tryon (1634–1703) recommended swishing river water as a mouthwash. Needless to say, such routines were insufficient. “A lack of adequate tooth cleansing and an inappropriate diet led to bad breath and also caused tooth decay.” Missing teeth were common. “A character in an eighteenth-century play bemoaned the poor dental state of London’s women” by claiming that “not one in ten has a Tooth left.” When those suffering from toothaches sought dental care, what passed for dentistry at the time could make matters even worse. Consider the unfortunate case of the English lawyer and politician Dudley Ryder (1691–1756). “After spending a month in 1715 chewing on just one side of his mouth to avoid the pain of a severely decayed tooth, Dudley Ryder finally summoned up the courage to have it drawn. In the process, a little of his jaw was broken off, but he rallied, claiming it didn’t hurt. Much. By the mid-eighteenth century wealthier citizens would have the option of trying out a transplant, using teeth from a paid donor.”

Tooth and skin problems were visible, but internal ailments that were less apparent also plagued our ancestors. One of the many negative health effects of animals crowding the cities was that parasites from the creatures often spread to humans. “The abundance of dogs and pigs on the city streets provided the perfect breeding ground for a variety of intestinal parasites, many of which wormed their way into humans. Eliza Smith asserted that ‘vast numbers’ were infested. Many bottoms would have itched with discomfort thanks to the presence of thread and tape worms in the digestive system. According to the numerous contemporary adverts, worms created a myriad of physical discomforts, including ‘pinching Pain in the Belly, when hungry, a stinking Breath’, vomiting, nightmares, pallidness, fever and teeth gnashing.” The animals caused other problems as well. “Neighbours near to houses in which beasts were kept or slaughtered would have endured stench and noise.” For example, “those living near Lewis Smart’s huge piggery on London’s Tottenham Court Road described how servants fell sick and resigned on account of the smell, which ‘Drive thro’ the walls of the houses.’ Visitors to the house opposite were forced to hold their noses, and one neighbour explained how the fumes dirtied newly laundered linen and tarnished plate.”

The people of the past often went hungry. “Recording a high rate of corn spoilage in 1693, due to a wet summer season, [the English antiquary] Anthony Wood noted that scarcity pushed prices out of the pockets of the poor, who were forced to ‘eat turnips instead of bread’. During this dearth [the writer] Thomas Tryon outlined a diet for a person on a budget of twopence per day. The recipes are uniformly bland: flour, water, milk and peas, all boiled to differing consistencies.”

Food often spoiled during transport to the market. “Eggs that came to London from Scotland or Ireland were often rotten by the time they arrived.” Food was often adulterated, and some degree of adulteration was considered unavoidable. Malt was only deemed unacceptable if it contained “half a peck of dust or more” per quarter. “Butchers would disguise stale slaughtered birds. [A contemporary account] warns of one such operator who greased the skin and dredged on a fine powder to make the bird strike ‘a fine Colour.’” Butter was frequently adulterated with “tallow and pig’s lard.” “Some fishmongers coated gills with fresh blood, as red gills indicated a recent netting,” to misrepresent stale fish to the unwary buyer. Fish were often wormy and, if not cooked thoroughly, remained so at the time of serving. The English diarist and naval administrator Samuel Pepys (1633–1703) once noted his disgust at the sight of a sturgeon dish upon which he observed “very many little worms creeping.”

Bread, the mainstay of most diets, was not immune to contamination. “Some loaves were deliberately adulterated with stones and other items to bulk them up.” In 1642, an unscrupulous Liverpool woman named Alice Gallaway “tried to sell a white loaf that contained a stone, to make up its weight. This sort of practice would have been widespread—the baker could claim that the stone had not been removed in milling, and blamed the miller. Stone, grit and other unwelcome contaminants would have posed dangers to the teeth of the unwary.” Millers also engaged in such unethical behavior as adding “beanmeal, chalk, animal bones and slaked lime” to disguise musty flour. Perhaps it should be no surprise then that London bread was described in 1771 as “a deleterious paste, mixed up with chalk, alum and bone ashes, insipid to the taste and destructive to the constitution.”

There are even accounts of human remains being added to food for sale, resulting in unknowing cannibalism on the part of the buyer. The author of the 1757 public health treatise Poison Detected claimed, “The charnel houses of the dead are raked to add filthiness to the food of the living.” The squalid state of the marketplace further exposed food to pollution or contamination. “The market stalls, and the streets on which they stood, were frequently described as being filthy and strewn with rotting debris.” Flies and other insects swarmed each market. “Hanging meats were vulnerable to attack by hopper-fly, and if they got too warm they would rust and spoil.” The smoke of London’s chimneys was said to fill the air and “so Mummife, drye up, wast and burn [hanging meat in the marketplace], that it suddainly crumbles away, consumes and comes to nothing.”

The population was so accustomed to foul-smelling meat that “in 1736 a bundle of rags that concealed a suffocated newborn baby was mistaken for a joint of meat by its stinking smell.” Between the bugs, the smoke, and the dirt, few groceries reached customers unscathed. One 18th-century writer complained of “pallid contaminated mash, which they call strawberries; soiled and tossed by greasy paws through twenty baskets crusted with dirt.” The state of the marketplace even inspired deprecating lyrics, such as these from 1715: “As thick as Butchers Stalls with Fly-blows [where] every blue-ars’d Insect rambles.” “As the market day progressed, perishables . . . were more likely to be fly-blown or decayed.” Those undesirable leftovers unsold at the end of the market day were often later hawked by street vendors. A letter in The Spectator in 1712 complained that everything sold by such vendors was “perished or putrified.” Recipes took into account the poor quality of available ingredients. “Imparting some dubious tips for restoring rotting larder supplies, [cookbook author] Hannah Glasse’s strategy ‘to save Potted Birds, that begin to be bad’ (indeed, those which ‘smell so bad, that no body [can] . . . bear the Smell for the Rankness of the Butter’) involved dunking the birds in boiling water for thirty seconds, and then merely retopping with new butter.”

Yet those shopping at the marketplace with all its terrors were relatively fortunate compared to others. “Broken victuals, the remnants and scrapings from the more affluent plates, were a perk of service for some servants, and the saviour of many paupers.” One account from 1709 tells of a woman reduced to living off “a Mouldy Cryst [crust] and a Cucumber” while breastfeeding, an activity that greatly increases caloric needs. Desperation sometimes resulted in swallowing nonfood objects, such as wax, to ease hunger pangs. “Witnesses reported that a young London servant girl was so hungry in 1766 that she ate cabbage leaves and candles.” She was far from the first person to use candle wax as a condiment. “The underfed spread butter thickly on bread (this was necessary to facilitate swallowing dark or stale bread). Cheap butter was poor grade, akin to grease . . . a ‘tallowy rancid mass’ made of candle ends and kitchen grease was the worst type” of concoction to pass under the name of butter. Another account of hunger from 1756 relates how a starving woman felt “obliged to eat the cabbage stalks off the dunghill.”

The people of the past also had good reason to wonder whether their homes would collapse around them. “A proverb warned that ‘old buildings may fall in a moment’. So familiar was the sound of collapsing masonry that in 1688 Randle Holme included ‘a crash, a noise proceeding from a breach of a house or wall’ in a list of only nine descriptive sentences to illustrate the ‘Sense of Hearing’. Portmeadow House in Oxford collapsed in the early seventeenth century. Among the casualties recorded in the Bills of Mortality for 1664 was one hapless soul killed by a falling house in St Mary’s Whitechapel . . . Dr Johnson described London of the 1730s as a place where ‘falling Houses thunder on your Head.’ . . . In the 1740s, ‘Props to Houses’ appeared among a list of common items hindering free passage along the pavement in London. A German visitor wondered if he should go into the street in 1775 during a violent storm, ‘lest the house should fall in, which is no rare occurrence in London.’” “Thomas Atwood, a Bath plumber and property developer, died in 1775 when the floor of an old house gave way.” Regulations sometimes made matters worse, preventing the tearing down of homes on the verge of collapse. One account notes that homes in disrepair became “the rendezvous of thieves; and at last . . . fall of themselves, to the great distress of whole neighborhoods, and sometimes to bury passengers in their ruins.” Windy days could knock down homes. “Gales swept [London] in 1690, leaving ‘very many houses shattered, chimneys blowne down.’”

Inside, homes were often filled with smoke from fireplaces. “With open fires providing most of the heating, filthy discharges of soot and smut clung to interiors.” Even with regular chimney sweepings, clogged chimney pots and soot deluges could and did occur. One writer railed against the “pernicious smoke . . . superinducing a sooty Crust or furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and Corroding the very Iron-bars and hardest stone with those piercing and acrimonious Spirits which accompany its Sulphur.” Interior smoke disturbed the air of the humblest homes and the grandest palaces alike. The German consul Zacharias Conrad von Uffenbach (1683–1734) complained that the Painted Chamber of London’s Westminster Hall could “scarce be seen for the smoke” that filled the interior; in the Upper Chamber he similarly noted that the tapestries were “so wretched and tarnished with smoke that neither gold nor silver, colours or figures can be recognized.”

“Householders struggled to contain infestations of vermin.” This was a problem even in well-off homes. Samuel Pepys recorded in his diary his multiyear struggle with mice, which “scampered across his desk” with abandon despite his purchase of a cat and deployment of mousetraps. “In 1756 Harrop’s Manchester Mercury ran an advert for a book detailing how to rid houses of all manner of vermin,” including adders, ants, badgers, birds, caterpillars, earwigs, flies, fish, fleas, foxes, frogs, gnats, lice, mice, moles, otters, polecats, rabbits, rats, snakes, scorpions (an invasive species of which had entered England via Italian masonry shipments), snails, spiders, toads, wasps, weasels, and worms.

As if that were not enough to keep people up at night, nighttime was loud. Crying babies and the moaning of the hungry, ill, and dying echoed in the night, as did the pained wails of women suffering domestic violence. In 1595, London passed a law to prevent men from beating their wives after 9 p.m. The legislation was prompted not by concern for the wives (wife-beating was generally accepted as normal and morally unproblematic) but by consideration for neighbors trying to sleep through the noise. The law read in part: “No man shall after the houre of nine at the Night, keepe any rule whereby any such suddaine out-cry be made in the still of the Night, as making any affray, or beating hys Wife, or servant.” A similar law forbade smiths from using their hammers “after the houre of nyne in the night, nore afore the houre of four in the Morninge.”

The book also documents a far crueler and more violent society than our own. Legal punishments could be grotesque and sadistic. For example, in 1611, a woman who had conducted “lewd acts . . . was punished by the Westminster burgesses by being stripped naked from the waist upwards, fastened to a cart, and whipped through the streets on a cold December day.” Women deemed “scolds” were often publicly humiliated in ritual fashion. “Ducking stools or cuckstools were equipment for punishing scolds and were items of town furniture [and] were still used as a deterrent in the eighteenth century. Ducking was a rite of humiliation intended to put the woman in her place and to teach her a lesson.” Many towns took pride in maintaining their ducking stools, and sometimes a device with a similar rationale called a “scold’s bridle,” an iron muzzle that enclosed the head and compressed the tongue to silence the unfortunate wearer.

“Across the country [of England] the civic authorities ensured that their cuckstools were functioning. In 1603 the Southampton authorities complained that ‘the Cuckinge stoole on the Towne ditches is all broken’ and expressed their desire for a new one, to ‘punish the manifold number of scoldinge woemen that be in this Towne’. The following year they wondered whether a stool-on-wheels might be invented. This could be ‘carried from dore to dore as the scolde shall inhabit’. This mobile stool would, it was explained, be ‘a great ease to mr mayor . . . whoe is daylie troubled w[i]th suche brawles’. The Oxford Council erected a cuck stool at the Castle Mills in 1647. The Manchester stool was set up in 1602 ‘for the punyshement of Lewde Wemen and Scoldes’ . . . six scolds were immersed in 1627. A decade later the town added a scold’s bridle to their armoury of reform. A new ducking chair was erected in ‘the usual place’ in 1738. Even as late as 1770 a knot and bridle hung from the door of the stationers, near the Dark Entry in the Market Place ‘as a terror to the scolding huxter-women.’”

Outhouses doubled as dumping grounds for victims of infanticide with shocking frequency. “Much of what we know about London’s privies and houses of ease comes from unpleasant witness statements concerning gruesome discoveries of infants’ corpses found among the filth. In the trial of Mercy Hornby for killing her newborn daughter we find details of the privy into which the child was cast. Newly constructed in the 1730s, it was six foot deep, with just over three feet of soil at the time of the incident.”

And that is only a small slice of the manifold horrors detailed in Cockayne’s book, where practically every page provides fresh fodder for nightmares.

Blog Post | Population Growth

No, Prosperity Doesn’t Cause Population Collapse

Wealth doesn’t have to mean demographic decline.

Summary: For decades, experts assumed that rising prosperity inevitably led to falling birth rates, fueling concerns about population collapse in wealthy societies. But new data show that this link is weakening or even reversing, with many high-income countries now seeing higher fertility than some middle-income nations. As research reveals that wealth and fertility can rise together, policymakers have an opportunity to rethink outdated assumptions about tradeoffs between prosperity and demographic decline.


For years, it was treated as a demographic law: as countries grow wealthier, they have fewer children. Prosperity, it was believed, inevitably drove birth rates down. This assumption shaped countless forecasts about the future of the global population.

And in many wealthy countries, such as South Korea and Italy, very low fertility rates persist. But a growing body of research is challenging the idea that rising prosperity always suppresses fertility.

University of Pennsylvania economist Jesús Fernández-Villaverde recently observed that middle-income countries are now experiencing lower total fertility rates than many advanced economies ever have. His latest work shows that Thailand and Colombia each have fertility rates around 1.0 births per woman, which is even lower than rates in well-known low-fertility advanced economies such as Japan, Spain and Italy.

“My conjecture is that by 2060 or so, we might see rich economies as a group with higher [total fertility rates] than emerging economies,” Fernández-Villaverde predicts.

This changing relationship between prosperity and fertility is already apparent in Europe. For many years, wealthier European countries tended to have lower birth rates than poorer ones. That pattern weakened around 2017, and by 2021 it had flipped.

This change fits a broader historical pattern. Before the Industrial Revolution, wealthier families generally had more children; the idea that prosperity leads to smaller families is a modern development. Now, in many advanced economies, that modern trend is weakening or reversing, and wealth and family size are no longer pulling in opposite directions.

This shift also calls into question long-standing assumptions about women’s income and fertility. For years, many economists thought that higher salaries discouraged women from having children by raising the opportunity cost of taking time off work. That no longer seems to hold in many countries.

In several high-income nations, rising female earnings are now associated with higher fertility. Studies in Italy and the Netherlands show that couples where both partners earn well are more likely to have children, while low-income couples are the least likely to do so. Similar findings have emerged from Sweden as well. In Norway, too, higher-earning women now tend to have more babies.

This trend is not limited to Europe. In the United States, richer families are also beginning to have more babies than poorer ones, reversing patterns observed in previous decades. A study of seven countries — including the United States, the United Kingdom, Germany and Australia — found that in every case, higher incomes for both men and women increased the chances of having a child.

This growing body of evidence challenges the assumption that prosperity causes people to have fewer children. 

Still, birth rates are falling across much of the world, with many countries now below replacement level. While this trend raises serious concerns, such as the risk of an aging and less innovative population and widening gaps in public pension solvency, it is heartening that it is not driven by prosperity itself. Wealth does not automatically lead to fewer children, and theories blaming consumerism or rising living standards no longer hold up.

Although the recent shift in the relationship between prosperity and fertility is welcome, it is not yet enough to raise fertility to the replacement rate of around 2.1 children per woman — a challenging threshold to reach.

But the growing number of policymakers around the world concerned about falling fertility can consider many simple, freedom-enhancing reforms that lower barriers to raising a family, including reforms to education, housing and childcare. Still, it’s important to challenge the common assumption that prosperity inevitably leads to lower birth rates: Wealth does not always mean fewer children.

This article was published at The Hill on 6/16/2025.

Scoop | Women's Employment

Gender Gap Closes at Fastest Rate Since Pandemic

“The global gender gap has closed to 68.8%, marking the strongest annual advancement since the COVID-19 pandemic. Yet full parity remains 123 years away at current rates, according to the World Economic Forum’s Global Gender Gap Report 2025, released today. Iceland leads the rankings for the 16th year running, followed by Finland, Norway, the United Kingdom and New Zealand.

The 19th edition of the report, which covers 148 economies, reveals both encouraging momentum and persistent structural barriers facing women worldwide. The progress made in this edition was driven primarily by significant strides in political empowerment and economic participation, while educational attainment and health and survival maintained near-parity levels above 95%. However, despite women representing 41.2% of the global workforce, a stark leadership gap persists with women holding only 28.8% of top leadership positions.”

From Scoop.

Blog Post | Manufacturing

Grim Old Days: Virginia Postrel’s Fabric of Civilization

Beneath today’s abundance of clothing lies a long and brutal history.

Summary: Virginia Postrel’s book weaves a sweeping history of textiles as drivers of both innovation and toil. From ancient women spinning for months to make a single garment to brutal sumptuary laws and dye trades steeped in labor and odor, the book reveals how fabric shaped the foundations of human society.


Virginia Postrel’s The Fabric of Civilization: How Textiles Made the World is the riveting story of how humanity’s quest for thread, cloth, and clothing built modern civilization, by motivating achievements from the Neolithic Revolution to the Industrial Revolution and more. While much of the book contains inspiring tales of innovation, artistry, and entrepreneurship, the parts of the book about the preindustrial era also reveal some dark and disturbing facts about the past.

In the preindustrial era, clothing was often painstakingly produced at home. Postrel estimates that, in Roman times, it took a woman about 909 hours—or 114 days, almost 4 months—to spin enough wool into yarn for a single toga. With the later invention of the spinning wheel, the time needed to produce yarn for a similarly sized garment dropped to around 440 hours, or 50 days. Even in the 18th century, on the eve of industrialization, Yorkshire wool spinners using the most advanced treadle spinning wheels of the time would have needed 14 days to produce enough yarn for a single pair of trousers. Today, by contrast, spinning is almost entirely automated, with a single worker overseeing machines that are able to produce 75,000 pounds of yarn a year—enough to knit 18 million T-shirts.

Most preindustrial women devoted enormous amounts of time to producing thread, which they learned how to make during childhood. It is not an exaggeration to say, as Postrel does, “Most preindustrial women spent their lives spinning.” This was true across much of the world. Consider Mesoamerica:

At only four years old, an Aztec girl was introduced to spinning tools. By age six, she was making her first yarn. If she slacked off or spun poorly, her mother punished her by pricking her wrists with thorns, beating her with a stick, or forcing her to inhale chili smoke.

These girls often multitasked while spinning: “preindustrial spinners could work while minding children or tending flocks, gossiping or shopping, or waiting for a pot to boil.” The near-constant nature of the task meant that prior to the Industrial Revolution, “industry’s visual representation was a woman spinning thread: diligent, productive, and absolutely essential” to the functioning of society, and from antiquity onward cloth-making was viewed as a key feminine virtue. Ancient Greek pottery portrays spinning “as both the signature activity of the good housewife and something prostitutes do between clients,” showing that women of different social classes were bound to spend much of their lives engaged in this task.

Women of every background worked day and night, but still, their efforts were never enough. “Throughout most of human history, producing enough yarn to make cloth was so time-consuming that this essential raw material was always in short supply.”

Having sufficient spun yarn or thread was only the beginning; it still had to be transformed into cloth. “It took three days of steady work to weave a single bolt of silk, about thirteen yards long, enough to outfit two women in blouses and trousers,” although silk-weavers themselves could rarely afford to wear silk. According to Postrel, a Chinese poem from the year 1145, paired with a painting of a modestly dressed, barefoot peasant weaving silk, suggests that “the couple in damask silk . . . should think of the one who wears coarse hemp.”

Subdued colors often defined the clothing of the masses. “‘Any weed can be a dye,’ fifteenth-century Florentine dyers used to say. But that’s only if you want yellows, browns, or grays—the colors yielded by the flavonoids and tannins common in shrubs and trees.” Other dye colors were harder to produce.

In antiquity, Tyrian purple was a dye derived from crushed sea snails, and its notoriously laborious and foul-smelling production process made it expensive. As a result, it became a status symbol, despite the repulsive stench that clung to the fabric it colored. According to Postrel, the poet Martial included “a fleece twice drenched in Tyrian dye” in a list of offensive odors, joking that a wealthy woman wore the reeking color to conceal her own body odor. Even the fetor signaled rank: “Even the purple’s notorious stench conveyed prestige, because it proved the shade was the real thing, not an imitation fashioned from cheaper plant dyes.” Despite its name, the color was not purple but a dark hue similar to that of dried blood. Later, during the Renaissance, Italian dyers produced a bright red from crushed cochineal insects imported from the Americas, along with other colors created using acidic bran water that was said to smell “like vomit.”

Numerous laws strictly regulated what people were allowed to wear. Italian city-states issued more than 300 sumptuary laws between 1300 and 1500, motivated in part by revenue-hungry governments’ appetite for fines. For example, in the early 1320s, Florence forbade women from owning more than four outfits that were considered presentable enough to wear outside. Postrel quotes the Florentine sumptuary law official Franco Sacchetti as writing that women often ignored the rules and argued with officials until the latter gave up on enforcement; he ends his exasperated account with the saying, “What woman wants the Lord wants, and what the Lord wants comes to pass.” But enough fines were collected to motivate officials to enact ever more restrictions.

In Ming Dynasty China, punishment for dressing above one’s station could include corporal punishment or penal servitude. Yet, as in Florence, and seemingly everywhere that sumptuary laws were imposed, such regulations were routinely flouted, with violators willing to risk punishment or fines. In 1726, the French authorities stiffened the penalty for trafficking certain restricted cotton fabrics, illegal since 1686, to include death. The French law was not a traditional sumptuary law but an economic protectionist measure intended to insulate the domestic cloth industry from foreign competition. Postrel quotes the French economist André Morellet lamenting the barbarity of this rule, writing in 1758,

Is it not strange that an otherwise respectable order of citizens solicits terrible punishments such as death and the galleys against Frenchmen, and does so for reasons of commercial interest? Will our descendants be able to believe that our nation was truly as enlightened and civilized as we now like to say when they read that in the middle of the eighteenth century a man in France was hanged for buying [banned cloth] to sell in Grenoble for 58 [coins]?

Despite such disproportionate punishments, the textile-smuggling trade continued.

Postrel’s book exposes the brutal realities woven into the history of textiles: stories not just of uplifting innovation, but of relentless toil, repression, and suffering. It fosters a deeper appreciation for the wide range of fabrics and clothes that we now take for granted, and it underscores the human resilience that made such abundance and choice possible.

Girls Not Brides | Women's Empowerment

Kuwait Raises Minimum Legal Age for Marriage to 18 Years Old

“Kuwait has taken a major step to protect the rights of adolescent girls and boys by raising the minimum legal age for marriage to 18 years. The new law, enacted under Decree-Law No. 10 of 2025, came into effect on 16 March 2025.

The law amends Article 26 of Law No. 51/1984 (Personal Status Law), now prohibiting the documentation or ratification of marriage contracts for anyone under 18. It also modifies Article 15 of the Jaafari Personal Status Law No. 124/2019, thus extending the same minimum age across this religious legal framework.

Previously, girls could marry at 15 and boys at 17 with parental or judicial consent.”

From Girls Not Brides.