
Blog Post | Health & Medical Care

Modernization and the Loss of Japan’s Samurai Culture Benefited the Japanese People

Economic, technological, industrial, and other progress radically improved the life of the ordinary Japanese citizen.

Summary: In the mid-19th century, Japan’s feudal society underwent a profound transformation during the Meiji Restoration, embracing Westernization and modernization. The shift from isolationism to openness resulted in rapid industrialization and technological advancements, improving living standards, education, and social mobility for ordinary citizens. This article examines Japan’s journey from a closed society to a prosperous nation, dispelling romanticized notions of the “good old days” and highlighting the benefits of progress and innovation.


Imagine you’re a farmer in Japan in 1850. You pay homage to your feudal lord, wear clothes of plain cotton, eat rice and fish, and are mostly preoccupied with surviving the occasional famine and outbreaks of disease. You likely have no education. Fifty years later, life has changed beyond recognition. Farmers now have an education, have fertilizer to farm with, have access to vaccination, and can use the telegraph and the postal service. They have more money to spend, more leisure time, and access to mass media.

The 2003 movie The Last Samurai portrays Japan during this period of modernization, lamenting the loss of traditional samurai culture amid rising Westernization. The film was inspired by the Satsuma Rebellion, a revolt by samurai disaffected by the loss of their privileged position in society.

Longing for a privileged past is not unique to Japan; many in Europe romanticize the medieval era as one of knightly chivalry. However, such portrayals usually look at history through rose-tinted glasses. The “good old days” is a common fallacy, with facts becoming more distorted the further one looks back in history.

What really happened in the era of The Last Samurai?

The film takes place after the Meiji Restoration, showcasing the Westernization of Japan. Before this period, Japan was ruled by the Tokugawa shogunate, a military dictatorship that had dominated the islands for over 260 years. It imposed the foreign policy of Sakoku—that is, one of extreme isolationism. Aiming to curb the spread of Christianity and cement the power of the shogun, the shogunate closed the islands of Japan to foreigners. No one was allowed to enter or leave Japan, and foreign trade was virtually nonexistent. (Limited trade with the Dutch was permitted through the island of Kyushu, notably in porcelain.) This period was one of peace, which many in Japan welcomed after the Sengoku Jidai, the period of civil war of the 1500s.

Conservatives in Japan welcomed this closing of the country to foreign influence. At the time, Japan was dominated by the samurai class. Samurai, while traditionally warriors, had in peacetime become aristocratic bureaucrats in the service of their daimyo, or feudal lords. Samurai held a monopoly on military force and controlled most of education. Merchants were seen as a lower class, even lower than farmers. Feudalism, a system in which a lord rented out land in return for labor from the peasantry, had ended in parts of Europe around 1500. Whereas competition among European powers had spurred the emergence of a middle class, Japan remained socially, technologically, and militarily stagnant from 1639 onward.

As described by Mitsutomo Yuasa in his study The Scientific Revolution in Nineteenth Century Japan:

The traditional society (feudalism) before the Meiji Restoration, namely the age of Edo of Tokugawa Shogunate, was based on pre-Newtonian science and technology, and on pre-Newtonian attitudes towards the physical world.

In 1853, Japanese isolationism came to an end. In a textbook example of gunboat diplomacy, the arrival of Commodore Matthew Perry forced the opening of Japanese ports to American trade. In the years that followed, Japan established diplomatic relations with the Western great powers, and the ruling Tokugawa shogunate collapsed.

Japan then went through a period of rapid modernization, importing Western technology, ideas, and culture. Ian Inkster describes the impact:

By 1855, Western machinery and factory organization had been introduced at Nagasaki for the maintenance of warships, and a spurt of building began in 1860 under Dutch leadership. It was Englishmen who in 1867 constructed the first steam powered spinning plant, the Kagoshima Spinning Factory. . . . By 1882, the Osaka Spinning Company operated 16 mules, 10,500 spindles and was practically powered by steam. . . . From 1870 to 1872, 245 railway engineers arrived in Japan from Europe. . . . Telegraphic communication was also established by the British from 1871.

The industries revolutionized by foreign influence included iron, mining, railways, electricity, civil engineering, medicine, administration, shipbuilding, porcelain, earthenware, glass, brewing, sugar, chemicals, gunpowder, and cement manufacture. Japan developed its staple industry and export product, silk spinning and manufacturing, under the guidance of a Swedish engineer using Italian methods. The silk industry also employed a large number of women, giving Japan more women in the industrial labor force than any other country in Asia.

Technological innovation improved Japanese industry. Ryoshin Minami showed that the growth in total horsepower between 1891 and 1937 was on the order of 13 percent annually. The figure below shows the growth rate of primary industries between 1887 and 1920, as well as overall economic growth. In many years during that period, growth in private non-primary fixed capital was in the double digits.

By the 1890s, Japanese textiles dominated the home markets and competed successfully with British products in China and India. Japanese shippers were competing with European traders to carry these goods across Asia and even to Europe.

The Satsuma Rebellion occurred in 1877, after the Japanese government restricted the right to carry a katana (long sword) in public. Regardless of one’s thoughts on the right to bear arms, the reduction in the power of the samurai class was a win for ordinary Japanese people. Access to modern medical techniques, transportation, and goods benefited the whole society, rather than just feudal elites. Indeed, many samurai were able to adapt to new roles in a modern Japan, working in business or government. In the 1880s, 23 percent of prominent Japanese businessmen came from the samurai class. By the 1920s, that figure had grown to 35 percent.

By 1925, universal manhood suffrage had been implemented, a stark contrast from the Tokugawa shogunate. The social structure had loosened, allowing societal advancement far more easily than in the feudal era. By 1897, 95 percent of citizens were receiving some form of formal education, in contrast to 3 percent in 1853. With a more educated population, Japan’s industrial sector grew significantly. Of course, the new system still had its problems, such as labor strikes and industrial unrest. However, Westernization brought far more economic freedom to the Japanese people. Attitudes to commerce changed. Merchants rose from being the lowest class to becoming a vital part of the burgeoning middle class.

In Japan, progress was seen in economics, science, technology, education, consumer goods, industry, and social mobility. Society and the traditional order had been uprooted, in an example of Schumpeterian “creative destruction.” The inflow of new ideas, of new ways of doing things, allowed people to become freer, wealthier, healthier, and better educated. The opening of Japan was fundamentally an opening to progress. By isolating itself, Japan fell behind the rest of the world. As it opened itself to competition, it was able to catch up, and in some cases, surpass other countries. And the ordinary citizen of Japan was better for it.

Blog Post | Environment & Pollution

Climate Litigation Can’t Fix the Past, but It Can Hinder the Future

Dealing with climate change requires technological innovation and economic growth, not legal warfare between nations.

Summary: The International Court of Justice has suggested nations could be held liable for historic greenhouse gas emissions, opening the door to lawsuits over centuries of industrial activity. Yet this approach risks punishing the very innovations that lifted billions out of poverty and advanced human health and flourishing. Lasting progress on climate challenges will come not from courtroom battles, but from technological solutions and continued economic development.


The International Court of Justice’s advisory opinion purporting to establish legal grounds that would allow nations to sue one another over climate damages represents judicial overreach that ignores economic history and threatens global development. While the opinion was undeniably legally adventurous, the framework it envisages would be practically unworkable as well as economically destructive.

The ICJ’s ruling suggests countries can be held liable for historical emissions of planet-warming gases. That creates an accounting nightmare that no legal system can resolve. How does one calculate damages from coal burned in Manchester in 1825 versus emissions from a Beijing power plant in 2025? How does one stack up the harm caused by a warming world against the benefits of industrialization?

Britain began large-scale coal combustion during the Industrial Revolution, when atmospheric CO2 concentrations were 280 parts per million and climate science did not exist. Holding Britain liable for actions taken without knowledge of consequences violates basic principles of jurisprudence. The same applies to the United States, whose early industrialization occurred during an era when maximizing economic output was considered unambiguously beneficial to human welfare.

Critics of historical emissions ignore what those emissions purchased. British coal combustion powered textile mills that clothed much of the world, steam engines that revolutionized transportation, and factories that mass-produced goods previously available only to elites. American industrialization followed, creating assembly lines, electrical grids, and chemical processes that form the backbone of modern civilization.

These developments were not zero-sum exercises in resource extraction. They created knowledge, infrastructure, and institutions that benefited everyone. The steam engine led to internal combustion engines, which enabled mechanized agriculture that now feeds 8 billion people. Coal-powered steel production made possible skyscrapers, bridges, and the infrastructure that supports modern cities, where most humans now live longer, healthier lives than their ancestors.

The data on human welfare improvements since industrialization began are unambiguous. Global life expectancy increased from approximately 29 years in 1800 to 73 years today. Infant mortality rates fell from over 40 percent to under 3 percent. Extreme poverty, defined as living on less than $2.15 per day in purchasing power parity terms, declined from over 80 percent of the global population in 1800 to under 10 percent today.

Nutrition improved dramatically. Caloric availability per person has increased by roughly 40 percent since 1960 alone, while food prices relative to wages fell consistently. Height, a reliable indicator of childhood nutrition, increased significantly across all regions. Educational attainment expanded from literacy rates below 10 percent globally in 1800 to over 85 percent today.

These improvements correlate directly with energy consumption and industrial development. Countries that industrialized earliest experienced these welfare gains first, then transmitted the knowledge and technology globally. The antibiotics developed in American and European laboratories now save lives worldwide. The agricultural techniques pioneered in industrialized nations now feed populations that would otherwise face starvation.

The International Court of Justice’s liability framework threatens to undermine the very mechanisms that created these welfare improvements. Innovation requires investment, which requires confidence in property rights and legal stability. If successful economic development subjects countries to retroactive liability, the incentive structure tilts away from growth and toward stagnation.

Consider current developing nations. Under this legal framework, should India or Nigeria limit their industrial development to avoid future liability? Should they forgo the coal and natural gas that powered Western development? That creates a perverse situation where the legal system penalizes the exact processes that lifted billions from poverty.

The framework also ignores technological solutions. The same innovative capacity that created the Industrial Revolution is now producing renewable energy technologies, carbon capture systems, and efficiency improvements that address climate concerns without sacrificing development. Market incentives and technological progress offer more promise than legal blame assignment.

Which emissions count as legally actionable? A portion of anthropogenic CO2 remains in the atmosphere for centuries, making every emission since 1750 potentially relevant. Should liability begin with James Watt’s steam engine improvements in 1769? With the first coal-fired power plant? With Henry Ford’s assembly line? The temporal boundaries are arbitrary and politically motivated rather than scientifically determined.

Similarly, which countries qualify as defendants? The largest current emitters include China and India, whose recent emissions dwarf historical American and British totals. China alone now produces more CO2 annually than the United States and Europe combined. Any coherent liability framework must address current emissions, not just historical ones.

And where would the money go? This aspect of the case was brought up by Vanuatu. If the island nation receives compensation from the UK and the US, should it not be obliged to pay the British and the Americans for a plethora of life-enhancing Western discoveries, including electricity, vaccines, the telephone, radio, aviation, internet, refrigeration, and navigation systems?

Climate adaptation and mitigation require technological innovation and economic growth, not legal warfare between nations. The countries that industrialized first possess the technological capacity and institutional knowledge to develop solutions to today’s problems. Channeling resources toward litigation rather than innovation represents a misallocation that benefits lawyers while harming global welfare.

The ICJ opinion reflects wishful thinking rather than practical policy. Legal frameworks cannot repeal economic reality or reverse the historical processes that created modern prosperity. Instead of seeking retroactive justice for emissions that enabled human flourishing, policymakers should focus on technologies and institutions that sustain development while addressing environmental concerns. The alternative is a world where legal systems punish success and innovation while offering nothing constructive in return.

The original version of this article was published in National Review on 8/12/2025.

Blog Post | Education Spending

Growth Comes From Ideas, Not Degrees | Podcast Highlights

Marian Tupy interviews Bryan Caplan about the relationship between formal education and innovation.

Listen to the podcast or read the full transcript here.

Get The Case Against Education here.

I want to start with a broad question. What is economic growth, and where does it come from?

Economic growth is just change in economic well-being. Usually, we measure it with GDP.

Where does it come from? There are a lot of stories that people tell. Traditionally, people said it comes from capital accumulation and better-quality labor. But when you really go to the numbers, neither of these things can explain anywhere close to the full change, so most growth has got to be from technological progress, broadly defined. That is the main difference between the world of today and the world of 2000 years ago.

In your piece, you distill it to a single word: ideas.

That’s right.

Why is economic growth important?

In any given year, it seems like getting another percentage point of growth couldn’t make much difference. You barely even notice it. And yet, as many people have pointed out, when you compound an extra percentage point of growth per year over the course of 100 years, it’s the difference between poverty and riches. And riches are what allow you to buy free time. Riches are what allow you to buy culture, to save your child from worms.
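The arithmetic behind that compounding is easy to check. Here is a minimal sketch; the growth rates (2 and 3 percent) and the 100-year horizon are hypothetical, chosen only to show how a single extra percentage point accumulates:

```python
# Illustrative only: compare two hypothetical constant growth rates
# compounded over a century.
def compound(rate: float, years: int) -> float:
    """Multiple by which income grows at a constant annual growth rate."""
    return (1 + rate) ** years

low = compound(0.02, 100)   # 2% per year for 100 years
high = compound(0.03, 100)  # 3% per year for 100 years
print(f"2% for 100 years: {low:.1f}x income")
print(f"3% for 100 years: {high:.1f}x income")
```

One extra point of annual growth, held for a century, leaves income roughly two-and-a-half times higher (about 19x versus about 7x), which is exactly the poverty-versus-riches gap described above.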

Right. So economic growth is an increase in wealth, it comes from new ideas, and ultimately, it is highly correlated with things like better infrastructure, better hospitals, and so on.

Absolutely.

What is the purported relationship between education and growth?

The normal view is that education is the crucial determinant of growth, that it turns unskilled humans into the skilled workers of the modern economy. This is an idea not just from politicians, teachers, and the general public, but also from economics. If you take a class in economics, they will constantly talk about how it’s important to have lots of education because that’s how we build human capital.

So, the purported relationship is that education creates human capital, which creates new ideas and thus more growth?

That’s one version. The more common one is simply that education leads to human capital, which immediately leads to growth. The typical college grad isn’t going to invent anything, but they’re capable of being a more valuable cog in the machine.

Right, so the standard inference is that if you have a more educated workforce, they can accomplish more sophisticated tasks. What does the evidence show?

So, I have a book called The Case Against Education, and I’m not going to be coy about this: I expected to find that education was overrated. However, I also expected to find that a lot of other people researching would say they had clear evidence that education raises economic growth.

However, when I read all the mainstream work on education, there was a big debate about “how come we’re not finding what we know to be true, which is that education is the crucial cause of economic growth?” I think that they are finding the truth, which is that education isn’t a factory for building human capital, but a certification machine for stamping people: good worker, great worker, not so great worker. People like to think about education as a way of building skills, but actually, it’s more like a passport to the real training, which happens on the job.

So, by going to university, you are offering your employer a sign that you are intelligent and conscientious enough to do so.

You’re showing intelligence, conscientiousness, and also conformity. There’s no “I” in team. Most jobs require you to follow a chain of command to achieve the goal of the group. While on some level I don’t like conformity, on a deeper level it’s really important for most purposes.

I want to read you something that you wrote. “Contrary to conventional stories about the positive externalities of education, mainstream estimates of education’s national rate of return were consistently below estimates of education’s individual rate of return.”

What does that mean?

Great question.

A rate of return is basically a measure of how good an investment is. So, for example, you might try to calculate the rate of return of putting extra insulation on a house. We can do the same for education and figure out how all the costs of education compare to the payoffs.

When you do this from the point of view of an individual person, it’s pretty common to get a 10 percent inflation-adjusted rate of return. In my book, I say this is probably too high, but you can bring it down to maybe 7 or 8 percent.

We can also think about this at the level of the country. What if we raise the education level of the whole workforce of a country by a year? How much does that enrich the country? What that quote is saying is that even the high estimates of how much a year of education does for a country are typically around half of what it does for an individual. And a lot of the estimates find that sending the whole country to school for an extra year increases national income by 1 or 2 percent.
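To make the “rate of return” idea concrete, here is a minimal sketch with entirely hypothetical numbers: a one-time $50,000 cost for an extra year of schooling (tuition plus forgone wages), followed by a $5,000 annual wage premium over a 40-year career. The helper function is an illustration of the concept, not how the estimates in the literature are actually produced:

```python
# A sketch of the rate-of-return concept with made-up numbers.
def irr(cashflows, lo=0.0, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection: the discount rate at which
    the net present value of the cash flows equals zero."""
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid  # NPV still positive: the break-even rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Year 0: pay the cost; years 1-40: collect the wage premium.
flows = [-50_000] + [5_000] * 40
print(f"Implied individual rate of return: {irr(flows):.1%}")
```

With these assumed numbers the implied return comes out near 10 percent, in the range quoted above for an individual; the point of the passage is that the same exercise done for a whole country yields roughly half that, or less.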

In other words, a stamp is a good way for one person to get ahead in life, but stamping the whole country does not help that country get ahead; it just creates credential inflation. You need more and more degrees in order to get the same job that your parents and grandparents got with fewer.

Let’s talk a little bit about innovation. Where do new ideas come from? Are we talking about a very small group of individuals who share certain characteristics?

It’s an exaggeration to say that innovation only comes from a few people. There are millions of small-scale improvements coming from many different people. Opening a new kind of restaurant is not revolutionary R&D, but so much of the improvement in our living standards comes from these small acts of entrepreneurship. When I was in high school, there were only three kinds of restaurants: American, Italian, and Chinese. Now we have a cornucopia of different cuisines. The same goes for so many other simple products. Dog collars now come in 100 more varieties than they did back when I was growing up in the ’80s.

However, the really revolutionary stuff—new vaccines, new business models, new forms of energy—comes from very special people. I think it’s reasonable to say that almost all the really big ideas come out of the top sliver of the IQ distribution. There was a psychologist in California named Lewis Terman who, I believe in the 1920s, took advantage of a standardized test administered to all the kids in the California school system. He managed to get data on the top hundred scorers in the whole state that year, and he followed them through life. In his honor, these kids are nicknamed the Termites, and there’s been a lot of research on them.

While the vast majority of this group didn’t do anything really impressive, they had many times, maybe a thousand times, the normal rate of stellar success. So, just doing these kinds of tests is a good way of identifying the most promising people. At a minimum, just have a system where you basically let children advance as rapidly as they’re capable of. A lot of very intelligent people feel very isolated from their own age group, and it makes sense just to advance them as far as their talent will take them.

I have a personal view, which is that our society is very open to the idea of the STEM prodigy, but we are very closed to the idea of there being a prodigy in, say, history. And I think that there are history prodigies. I have met kids with not just a broad, but a deep understanding of history by the time they’re 13 or 14. People think it’s crazy to put them in a PhD program in history when they’re 14 years old, but I don’t. Why not skip that kid ahead and let him become a star? Look, maybe he wants to be a regular 12-year-old even though he is a genius, but maybe he doesn’t. Maybe he wants to be with a peer group of geniuses. Let’s pave the way for him if that’s what he wants.

Do you think that AI will allow us to continue innovating if the population starts declining?

There was a long period where people working on AI kept over-promising and under-delivering. I would personally hear extravagant claims and check them out and find that they weren’t true. Finally, about two years ago, they started being correct. I was as shocked as anyone. I actually have a bet out about AI, which I’m probably going to lose. It’s embarrassing because I have otherwise a perfect public betting record.

That said, one incredible achievement does not mean that they’re going to have a whole series of incredible achievements. And there’s a lot to the idea that AI is basically just amazing at compiling what has already been said rather than truly coming up with new stuff. While it’s not impossible for it to get better, a lot better, it’s also not guaranteed.

Another thing worth pointing out is that we’ve had, by many measures, falling rates of innovation despite a rising population. There’s an idea that we’ve already discovered a lot of the low-hanging fruit, and so we need to keep multiplying our efforts to maintain the same rate of growth. Another plausible story is that we have doubled the number of people that we call researchers, but really only the best ones count, and the other ones are kind of fake.

Given that much of the money we spend on education is spent poorly or even counterproductively, what should we do with the money instead?

I’m totally on board with giving it back to the taxpayers or just paying down the national debt. We badly need austerity. We are driving at 100 miles per hour towards a brick wall, but there’s still time to change course and get our foot on the brakes. One of the easiest ways of doing that is by spending less on education.

Is education more useful in the developing world?

Poor countries have a severe problem with teachers even showing up. They, on paper, have many years of education—I think Haiti now is around where France was in 1960—but mostly they are just throwing money at a corrupt system that doesn’t even teach basic literacy and numeracy. The way that people in the third world are learning to use technology is the way that almost all normal people learn anything, which is by doing.

It seems to me that we are doing the exact opposite. We are keeping people in the education system for many years, which could prevent them from starting to work and learning by doing.

Yeah. It would be much better if people started adult life at an earlier age. They’re totally ready for it. There’s no reason why 13- or 14-year-olds should not be working. One of the best ways to get kids to actually learn stuff, especially the kids who hate school, is to make it practical. They need to see concrete results and make money.

If you read biographies or autobiographies of people in earlier eras, it is amazing how far people got at young ages. By the age of 15, Malcolm X had worked four different jobs and been all over the country. Many people listen to me and say, “Oh, that’s so dystopian.” I think the system we have now is dystopian, where someone has to sit in a classroom until they’re 30 listening to some boring windbag talk about things he doesn’t even know how to do.

Blog Post | Human Development

Grim Old Days: Peter Laslett’s The World We Have Lost

Poverty and hardship long predated the factory age.

Summary: Before the Industrial Revolution, life in England was marked by widespread poverty, illiteracy, and relentless labor. Even children worked from as young as three. Most people lacked education, political voice, and basic comforts, enduring hunger, disease, and harsh living conditions that kept them in constant proximity to hardship and death. Peter Laslett’s The World We Have Lost reveals that the deprivations often blamed on early industrialization were in fact the norm long before factories and industry.


Peter Laslett’s book The World We Have Lost is an influential history of what life was like in England before the Industrial Revolution. Laslett makes clear that the infamous problems of the industrial era were preexisting, not innovations that first arose with the construction of factories: “The coming of industry cannot be shown to have brought economic oppression and exploitation along with it. It was there already.” His book brings into focus the poverty and hardship faced by preindustrial people and the fact that “we now inhabit a world wealthy on a scale quite unknown before industrialization.”

Laslett describes the dearth of schooling, observing that neither Isaac Newton’s nor William Shakespeare’s parents could read. Inventories from Kentish towns between the 1560s and 1630s show a steady increase from a fifth or less owning books to nearly a quarter, although such inventories were recorded only for prosperous households and thus probably overestimate the extent of book ownership. Leicestershire wills from the 1620s to 1640s show that only 17 percent of people with wills bequeathed books to their heirs, and even among the gentry that figure was only 50 percent.

The “inability to share in literate life cut most men off from even contemplating a share in political power.” And the idea of women attaining a political voice was more absurd still. Even James Tyrrell—an associate of John Locke, a critic of absolutism, and a believer in limited political authority—noted in 1681, “There never was any government where all the promiscuous rabble of women and children had votes.”

Illiteracy often not only limited women’s ability to engage with society but also increased women’s vulnerability. “An illiterate maidservant whose place was five or ten miles from home was cut off from her parents and her brothers and sisters,” effectively unable to send them messages and alert them if her employer physically abused her or sexually assaulted her (as was, sadly, common).

Instead of learning to read, many children began work at shockingly young ages. Laslett informs the reader that, as John Locke noted in 1697, poor children were expected to start working at age three, contributing in what capacity they could, often through apprenticeships. The apprentice’s contract typically went thus: “He shall not absent himself by night or by day without his master’s leave.” Some apprentices “stayed subordinate to a master in a master’s house for the whole of their lives,” far beyond the initial terms of their contract.

Not only could children start work at age three, but by age 12, they were considered old enough to help run businesses. In 1699, at an alehouse in Harefield, Middlesex, run by Catherine and John Baily, 6 of their 10 children still living at home “were above the age of twelve, . . . old enough to help run the family establishment.”

In England grooms could legally be as young as 14 and brides as young as 12, although Laslett notes that thankfully that was relatively rare in practice. Early marriages did occur, though. In 1623, a London parish clerk wrote disapprovingly of the wedding of a 17-year-old boy working as a threadmaker to the 14-year-old daughter of a porter, calling them a “couple of young Fooles.”

A rather offensive (to modern sensibilities) form of divorce known as “wife-selling” sometimes occurred among those who could not afford a formal dissolution of marriage. The Ipswich Journal records such a sale occurring in 1789:

Oct. 29, Samuel Balls sold his wife to Abraham Rade in the parish of Blythburgh in his county for 1 [shilling]. A halter was put around her neck and she was resigned up to this Abraham Rade.

Such bizarre episodes “reveal something of the slightly quizzical attitude of ordinary people to the official marriage code,” with local customs and practices varying wildly. Upon settling down, a man typically tilled land with the aid of his wife and children. Picture the “hard-working, needy, half-starved labourers of pre-industrial times,” who toiled nonstop yet never produced enough to live comfortably.

Here was an economy conspicuously lacking in those devices for the saving of exertion which are so marked a feature of our own everyday life. The simplest operation needed effort; drawing the water from the well, striking steel on flint to catch the tinder alight, cutting goose-feather quills to make a pen, they all took time, trouble and energy. The working of the land, the labour in the craftsmen’s shop, were infinitely taxing. [The peasantry would] shock us with their worn hands and faces, their immeasurable fatigue.

Those who didn’t work in agriculture were often servants. The share of the population employed as servants varied from as low as 4 percent to as high as a third in relatively wealthy times and places, such as London and parts of Norwich in the 1690s. “Everywhere work of all kinds varied alarmingly with the state of the weather and of trade, so that hunger was not very far away.” Many had no employment and begged. “Wandering beggars . . . were . . . a feature of the countryside at all times.”

Any increase in the cost of food staples could prompt social discord. “Right up to the time of the French Revolution and beyond, in Europe the threat of high prices for food was the commonest and most potent cause of public disorder.” Public panic about food was often warranted, as the threat of hunger was all too real. In 1698 in Scotland, contemporary accounts say, “[m]any have died for want of bread, and have been necessitate to make use of wild-runches draff and the like for the support of nature.” A runch is a common weed.

Laslett makes clear that England, being wealthier than much of Europe, saw relatively few famines by the late early modern period. Still, England’s harvest year of 1623–1624 was devastating, and in some locations, such as Ashton, the number of recorded burials was over two-and-a-half times the typical level. Numerous burials record the cause of death as starvation. The deaths recorded in the Register of Greystoke in England, in 1623, put names to some of these victims of starvation, including, “A poor hungerstarved beggar child, Dorothy,” and “Thomas Simpson, a poor hungerstarved beggar boy,” as well as “Leonard . . . which child died for want of food,” and 4-year-old “John, son of John Lancaster, late of Greystoke, a waller by trade, which child died for want of food and means.”

Preindustrial people also froze. Indeed, in cold climates such as those of northern and western Europe, “the necessity of gathering round fires and sharing beds, make it obvious that the privacy now regarded as indispensable, almost as a human right,” was once rare, with the masses forced to sleep next to each other and their farm animals for body heat.

If there was one thing that was better about the past, it was perhaps that people were—by necessity—tougher. London’s suicide rate circa 1660 is estimated as somewhere between 2.5 and 5 per 100,000 people, low by modern standards.1 But on the whole, what Laslett calls “the world we have lost” is not a world we’d want back.

  1. According to the most recent data from Britain’s Office of National Statistics, London’s suicide rate now stands at 7.3 per 100,000 people, while England and Wales have a suicide rate of 17.4 per 100,000. According to the most recent year of OECD data, only one OECD country has a suicide rate of under 5 per 100,000: Turkey, at 4.8 per 100,000. (In recent years, only two or three OECD countries typically manage to keep suicides below the upper bound of the estimated level seen in 17th-century London).