Blog Post | Adoption of Technology

Cooking: from Full-Time Job to Hobby

The world is spending less and less time in the kitchen.

Last week, a wealthy Manhattan woman died after her clothes caught fire while she was cooking. Her tragic death was unusual, but there was a time when cooking was far more dangerous and time-consuming. Even today, more than 4 million people lacking modern stoves die prematurely each year from breathing in cooking fumes. Not only was cooking once unsafe, it also left time for little else.

As Professor Deirdre McCloskey once noted, “[in] 1900 a typical American household of the middle class would spend 44 hours [a week] in food preparation,” and most of that work fell to women. In other words, back in the days of churning one’s own butter and baking one’s own bread, food preparation took up the same hours as a full-time job. That estimate includes time spent on purchasing, cooking and serving food as well as dishwashing. Keep in mind that in addition to cooking, women were also often responsible for cleaning the home, laundry, mending clothes, and tending to children.

Things started changing quickly. In 1910, U.S. households spent approximately six hours daily cooking meals, including cleanup; by the mid-1960s (when more reliable estimates began), that fell to one and a half hours. By 2008, the average low-income American spent just over an hour on food preparation each day and the average high-income American spent slightly less than an hour on food preparation daily. Disaggregating the data by gender reveals even more progress for women. In the United States, from the mid-1960s to 2008, women more than halved the amount of time they spent on food preparation (whereas men nearly doubled the time they spent on that activity, as household labor distributions became more equitable between the genders).

Mass production of everyday foodstuffs helped transform how women spent their time. In 1890, 90 percent of American women baked their own bread. Factory-baked, pre-sliced bread debuted in 1928. By 1965, 78 out of every 100 pounds of flour a U.S. woman brought into her kitchen came in the form of baked bread or some other ready-prepared good. Today, baking bread is an amusement for foodies, rather than a necessary chore for all women.

Over time, markets brought about and lowered the cost of such innovations as microwaves, convection ovens, ranges, grills, toasters, blenders, food processors, slow cookers, and other labor-saving kitchen devices. Markets have even produced grocery delivery services that bring food to one’s door at the tap of a smartphone app. Market processes also lowered the cost of dining out, and today Americans spend more on dining out than on eating in.

The liberation of women from the kitchen is ongoing, as technological devices and mass-produced goods spread to new parts of the world. Globally, as many as 55 percent of households still cook entirely from raw ingredients at least once a week. A 2015 survey found that average hours spent cooking among those who regularly cook are as high as 13.2 hours per week in India and 8.3 hours in Indonesia, compared with 5.9 hours in the United States.

The gap in time spent on food preparation between rich and poor countries remains wide. But even in India — the poorest country surveyed, and the one with the highest reported average food preparation hours — women devote almost 31 fewer hours to food preparation per week than U.S. households did in 1900. Even allowing for comparability problems when comparing those figures (the estimate for 1900 was for the household and included meal cleanup time), the sheer size of this difference suggests some improvement.

Much room for progress remains. In 2016, only 0.6 percent of Chinese households and 0.1 percent of Indian households had a dishwasher, compared to 67 percent of U.S. households, according to Euromonitor data. In 2016, microwave market penetration was just 23.4 percent of Chinese households and 3.1 percent of Indian households, compared to 91.3 percent of U.S. households. Only 15 percent of Indian households owned a refrigerator in 2006.

If prosperity continues to spread and poverty to decline globally, kitchen appliances and ready-made goods will free up more and more hours of women’s food preparation time around the world. There may always be freak accidents like the one in Manhattan, but there is no reason why innovation cannot lessen the risk by liberating women everywhere from kitchen chores.

This first appeared in the American Spectator.

Bloomberg | Science & Technology

Warehouse Workers Face New Competition: A Humanlike Robot

“Warehouse workers could soon face new competition from robots: GXO Logistics Inc. is testing a humanoid model at its facility in Flowery Branch, Georgia. … Taking into account the price of the robot and its lifespan of about 20,000 hours, the price tag to operate it is about $10 to $12 an hour. With increased production, that cost is expected to fall to $2 or $3 an hour plus overhead for software.”

From Bloomberg.
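For readers who want to check the arithmetic in that excerpt, here is a quick back-of-envelope calculation (a sketch of our own; the implied lifetime price tag is an inference from the quoted figures, not a number Bloomberg reports):

    # Implied all-in cost of the robot from the quoted Bloomberg figures.
    lifespan_hours = 20_000                            # quoted lifespan
    cost_per_hour_low, cost_per_hour_high = 10, 12     # quoted hourly cost
    low = lifespan_hours * cost_per_hour_low
    high = lifespan_hours * cost_per_hour_high
    print(f"implied lifetime cost: ${low:,} to ${high:,}")
    # -> implied lifetime cost: $200,000 to $240,000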

Blog Post | Conservation & Biodiversity

Can Finance Save the Wolves?

Economics informs us that seemingly unsolvable societal disputes don’t have to become political wrestling matches.

Most people have quite a dire view of finance and markets. Unscrupulous bankers and pompous hedge funds place unsound bets on obscure and risky investments; greedy businessmen jack up prices and fire workers at the first sign of recessions caused by their own avarice. Money rules the world, goes the trope. But that also means that financial incentives can align behavior more powerfully than most appeals to morals, kindness, or the good of the community.

Yale University finance professor William Goetzmann opens his book Money Changes Everything: How Finance Made Civilization Possible with the observation that “finance is the story of a technology: a way of doing things. Like other technologies, it developed through innovations that improved efficiency. It is not intrinsically good or bad.” Markets, especially those for financial assets and property, are a way to rearrange reality’s unavoidable risks, benefits, and payoffs; they are “by, for, and about people’s lives.”

One fascinating way that modern financial engineering helps make the world a better place is through counterintuitive payments, such as in global forestry. Making money by chopping down trees is a model that everyone understands—get chainsaws and harvesters, hire some laborers, chop trees, and sell the wood for profit.

Another way is to make money by not chopping down trees, courtesy of resourceful financiers and carbon sequestration markets. In efforts to reduce their carbon emissions, major corporations routinely pay forest owners to keep more trees standing for longer. This “negative logging” is made possible by financial flows from those who want more trees to those who manage them.

In 2021, the World Bank paid nine districts in Mozambique’s Zambézia province for keeping forests intact. When in power, Brazil’s ex-president Jair Bolsonaro routinely tried to shake down the international community for cash payments in exchange for not deforesting the Amazon. Think what you will of this controversial political figure and his policies, but the economic mechanism his government proposed here was sound – rich Westerners want flourishing rainforests and an end to global deforestation, and poor farmers and loggers want to use economically unproductive land to better their standards of living. A deal naturally presented itself.

Well-structured financial payments can also solve another pickle that routinely devolves into political mudslinging: wildlife. City-dwellers often have a romanticized view of nature and ecological systems, including the idea of healthy wolf populations. Ranchers and pastoralists, who bear the visible costs of killed livestock, usually have a different view. Cue seemingly unsolvable political showdowns.

In Sweden, where ecological concerns usually reign supreme, rural constituents and an anti-wolf lobby have recently gotten the upper hand. This summer, the government announced that it wanted to reduce the already inbred and endangered wolf population by half. The policy is based on no scientific evidence whatsoever. It is a political measure to reduce concentrated economic damages among a loud constituency.

It seems that only one group can be satisfied: those who favor more wolves and those who favor fewer can’t both have their way. When management of common-pool resources devolves into political disputes, policy usually pinballs between various interests as they wrestle for control of the political apparatus.

Finance and Markets Can Align Mutually Incompatible Interests

Economics informs us that seemingly unsolvable societal disputes don’t have to become political wrestling matches. Instead, we need financial instruments and payoffs that have city-dwellers paying rural communities for the unavoidable livestock losses that come with thriving predator populations.

If city-dwellers’ desire to have large or growing wolf populations in their countries is genuine, they should be willing to pay extra for cattle meat sourced from wolf territories, where livestock are most at risk of wolf attacks.

Ecological systems, like economic systems, are dynamic – changes to them don’t impact just one thing. When wolves return to areas where they were hunted to extinction during the 20th century, they unfortunately attack livestock and domestic animals. But they also keep populations of boars, deer, and elk in check, which reduces the damage those animals do to agriculture, gardens, cars, and people. Insurance companies could play a role by supporting conservation efforts for large predators—or by offering reduced premiums to customers who do—since more wolves mean fewer and more skittish deer and elk, which dramatically reduces vehicle collisions with wildlife.

Another way to achieve the same reshuffling of economic value is to have (generally wealthier) city-dwellers pay lavishly for ecotourism trips into areas where wolves are plentiful—like the projects in Spain’s Sierra de la Culebra. Some of the revenue should make it back to shepherds losing livestock to attacks or to farmers who can credibly show the presence of wolves on their land (say, through wildlife cameras capturing their movements).

In Scandinavia, these conflicts become overwhelmingly political not only out of a lack of financial engineering but also because most compensation schemes are run by bureaucrats and financed by taxpayers. Vultures circle around political payouts as well as fresh carcasses.

Modeling by Anders Skonhoft at the Norwegian University of Science and Technology suggests that ex-ante payments for predator presence yield better outcomes than ex-post reimbursement of livestock damages. This is the animal husbandry equivalent to paying for not cutting down trees.
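To see the intuition behind that result, consider a toy payoff comparison (our own illustrative sketch, not Skonhoft’s actual model; the herd value, loss rate, and guarding costs are all assumed numbers). Under ex-post reimbursement, every kill the herder prevents also reduces his compensation, so guarding effort is pure cost; under a fixed ex-ante payment, the herder pockets every loss he prevents:

    # Toy comparison of ex-ante vs. ex-post predator-compensation schemes.
    # All numbers are assumptions for illustration only.
    def herder_payoff(effort, scheme):
        herd_value = 100_000                       # value of the herd
        base_loss_rate = 0.20                      # share lost with no guarding
        loss_rate = base_loss_rate * (1 - effort)  # guarding effort cuts losses
        effort_cost = 5_000 * effort               # fencing, dogs, herding labor
        losses = herd_value * loss_rate
        if scheme == "ex_post":                    # reimbursed per documented kill
            compensation = losses
        else:                                      # flat payment for predator presence
            compensation = herd_value * base_loss_rate
        return compensation - losses - effort_cost

    efforts = [e / 10 for e in range(11)]          # effort levels 0.0 .. 1.0
    for scheme in ("ex_post", "ex_ante"):
        best = max(efforts, key=lambda e: herder_payoff(e, scheme))
        print(scheme, "-> optimal guarding effort:", best)
    # ex_post -> optimal guarding effort: 0.0
    # ex_ante -> optimal guarding effort: 1.0

The ranking, not the numbers, is the point: a fixed payment preserves the incentive to protect the herd, while full reimbursement destroys it.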

In the 1990s, the Swedish government introduced such an ex-ante scheme for the Sámi population and the reindeer they manage. Sámi herders routinely lose some 20 percent of their animals to carnivore attacks every year. By tying reimbursement to the presence of lynx and wolverine offspring rather than to documented reindeer kills, the scheme turns those most disposed to disapprove of predators into their greatest defenders.

With the introduction of ecotourism in Africa and the Amazon, the same financial incentives have flipped loggers and poachers into guides, the enemies of predators becoming their greatest protectors. On a larger scale, the right financial structures—payouts, markets, and assets—can align the interests of seemingly irreconcilable political enemies.

Blog Post | Health & Medical Care

COVID-19 Should Make Us Grateful for Technology

Imagine a pre-modern pandemic.

“In a way, everything is technology,” noted one of the world’s greatest economic historians, Fernand Braudel, in his monumental study Civilization and Capitalism. “Not only man’s most strenuous endeavors but also his patient and monotonous efforts to make a mark on the external world; not only the rapid changes . . . but also the slow improvements in processes and tools, and those innumerable actions which may have no immediate innovating significance but which are the fruit of accumulated knowledge,” he continued.

Yes, land, labor, and capital (that’s to say, the factors of production) are important components of economic growth. In the end, however, human progress in general and global enrichment in particular are largely dependent on invention and innovation. That is surely even clearer now that humanity’s hopes for the end of the pandemic and for our liberation from the accompanying lockdown rest on further scientific breakthroughs within the pharmaceutical industry. Let’s take a brief look at the impact of technology on health care, food supply, work, and sociality in the time of COVID-19.

Healthcare

The impact of modern technology is surely most keenly felt and anticipated within the sphere of human health care. Consider some of the worst diseases that humanity has had to face in the past. Smallpox, which is estimated to have killed 300 million people in the 20th century alone, originated in either India or Egypt at least 3,000 years ago. Smallpox variolation, it seems, was practiced in China in the tenth century, but it was not until the late 18th century that Edward Jenner vaccinated his first patient against the disease. Smallpox was fully eradicated only in 1980.

Similar stories could be told about other killer diseases. Polio, which can be seen depicted in Egyptian carvings from the 18th dynasty, is of ancient origin. Yet the disease wasn’t properly analyzed until 1789, the year of the French Revolution, with Jonas Salk’s vaccine appearing only in 1955. Today, polio is close to being eradicated (just 95 cases were reported in 2019).

Malaria, probably humanity’s greatest foe, is at least 30 million years old (the parasite has been found in an amber-encased mosquito from the Paleogene period). It was only after the discovery of the New World that knowledge about the fever-reducing benefits of the bark of the cinchona tree spread to Europe and Asia. Quinine was first isolated in 1820, and chloroquine was introduced in 1946. Artemisinin drugs, which we still use, were discovered in the late 1970s. That’s to say that humanity lived with deadly diseases for millennia without fully knowing what they were, how they were transmitted, and how they could be cured. The fate of humanity, our ancestors thought, fluctuated under the extraneous influence of the “wheel of fortune,” and there was nothing that anyone could do about it. One day you were alive and the next day you were not.

Contrast that glacial pace of progress, and the fatalistic acceptance of disease and death, with our response time to the current pandemic. The Wuhan Municipal Health Commission reported the existence of a cluster of cases of “pneumonia” in Wuhan on December 31, 2019. On January 7 the Chinese identified the pathogen (a novel coronavirus) responsible for the outbreak. On January 11 China sequenced the genetic code of the virus, and the next day it was publicly available. That enabled the rest of the world to start making diagnostic kits to identify the disease.

To take one example, the first COVID-19 infection in South Korea was identified on January 20. On February 4, the first test kit (made by Kogene Biotech) entered production. On February 7, the test kit was available at 50 locations around the country. Other countries followed suit.

The World Health Organization, which declared COVID-19 a global pandemic on March 11, may have acted too late. Still, it is noteworthy that just two months elapsed between the first sign of trouble and the time when the entire world put measures in place to retard the spread of the disease. In the meantime, we have learned a lot about governmental incompetence and regulatory overreach. But we have also learned a great deal about the spread and symptoms of the disease. Instead of starting from scratch, medical specialists in Europe and America can draw on the expertise of their colleagues in the Far East. Before the telegraph appeared midway through the 19th century, it took up to a month for a ship to carry information from London to New York. Today, we learn about the latest COVID-19 news (good and bad) and research in seconds.

By mid-April, thousands of highly educated and well-funded specialists throughout the world were using supercomputers and artificial intelligence to identify promising paths toward victory over the disease. Some 200 different programs are underway to develop therapies and vaccines to combat the pandemic. They include studies of the effectiveness of existing antiviral drugs, such as Gilead’s remdesivir, Ono’s protease inhibitor, and Fujifilm’s favipiravir. The effectiveness of generic drugs, such as hydroxychloroquine and chloroquine, is also being evaluated. Takeda is hard at work on convalescent plasma (TAK-888) in Japan, while Regeneron works on monoclonal antibodies in the United States. New vaccines, such as Moderna’s mRNA-1273, Inovio’s INO-4800, and BioNTech’s BNT162, are under development.

We don’t know which of these treatments (if any) will work, but here is what we can be sure of: There has never been a better time for humans to face and defeat a global pandemic. The world is richer than ever before, and money is what enables us to sustain a massive pharmaceutical industry and pay for highly sophisticated medical research and development.

Coronavirus may be deadly, but it is not the bubonic plague, which had a mortality rate of about 50 percent. Luckily, it is a far milder virus that has reawakened us to the danger posed by communicable diseases. Once the immediate crisis is behind us, researchers will collect billions of data points from dozens of countries and analyze the different governmental responses to the pandemic. That knowledge will be deployed by governments and the private sector to ensure that best practices are adopted, so that next time we are better prepared.

Food

When the Black Death struck Europe in 1347, the disease found the local population ripe for slaughter. Following the close of the Medieval Warm Period at the end of the 13th century, the climate turned cold and rainy. Harvests shrank and famines proliferated. France, for example, saw localized famines in 1304, 1305, 1310, 1315–17, 1330–34, 1349–51, 1358–60, 1371, 1374–75, and 1390. The Europeans, weakened by shortages of food, succumbed to the disease in great numbers.

The people of yore faced at least three interrelated problems. First, the means of transport and the transportation infrastructure were awful. On land, the Europeans used the same haulage methods (carts pulled by donkeys, horses, and oxen) that the ancients had invented. Similarly, much of Europe continued to use roads built by the Romans. Most people never left their native villages or visited the nearest towns. They had no reason to do so, for all that was necessary to sustain their meager day-to-day existence was produced locally.

The second problem was the lack of important information. It could take weeks to raise the alarm about impending food shortages, let alone organize relief for stricken communities.

Third, regional trade was seldom free (France did not have a single internal market until the Revolution) and global trade remained relatively insignificant in economic terms until the second half of the 19th century. Food was both scarce and expensive. In 15th-century England, 80 percent of ordinary people’s private expenditure went for food. Of that amount, 20 percent was spent on bread alone. Under those circumstances, a local crop failure could spell the destruction of an entire community. (Those who think that COVID-19 exposed the fragility of modern society should look up the Great Famine.)

By contrast, by 2013 only 10 percent of private expenditure in the United States went to food, a figure that is itself inflated by the amount Americans typically spend in restaurants. Speaking of restaurants, while most have been forced to close their doors, many restaurateurs now use apps to deliver excellent food at reasonable prices. Moreover, months into the COVID-19 pandemic, the shops are, generally, well stocked and regularly replenished by the largely uninterrupted stream of cargo flights, truck hauling, and commercial shipping. Thanks to the miracle of mobile refrigeration, fresh produce continues to be sourced from different parts of the United States and abroad. Shortly before writing this piece, I was able to buy oranges from California, avocados from Mexico, and grapes from Chile in my local supermarket. Globalization may be under pressure from both the left and the right of the U.S. political spectrum, but should the pandemic impair U.S. agricultural production, many will be forced to acknowledge the benefits of the global food supply and our ability to import food from parts of the world unaffected by COVID-19.

This extensive and, at this point, still sturdy supply chain is, of course, a technological marvel. Computers collate information about items on the shelf that are in short supply, adjust the variety and quantity of items shipped between stores, fill new orders, etc. And so, commerce that’s still allowed to go on goes on. So does charity. Feeding America, a network of more than 200 food banks, feeds tens of millions of people through food pantries, soup kitchens, shelters, etc. Since 2005, the organization has been using a computerized internal market to allocate food more rationally. Feeding America uses its own currency, called “shares,” with which individual food banks can bid on the foods that they need the most. Grocery-delivery services bring food to the doorsteps of those who cannot or do not want to leave their homes. The old and the infirm can also use phones, emails, and apps to call upon volunteers to do their shopping and delivery.
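A minimal sketch of how such an internal market can work (our own illustration of a sealed-bid, highest-bid-wins rule; Feeding America’s actual auction design is more elaborate, and the names and numbers below are invented):

    # Illustrative "shares" market: food banks bid internal currency on loads.
    # The allocation rule here (highest affordable bid wins) is an assumption.
    from dataclasses import dataclass, field

    @dataclass
    class FoodBank:
        name: str
        shares: int                        # internal-currency balance
        won: list = field(default_factory=list)

    def allocate(load, bids):
        """Award a load of food to the highest bid the bidder can afford."""
        valid = [(bank, amount) for bank, amount in bids if amount <= bank.shares]
        if not valid:
            return None
        bank, amount = max(valid, key=lambda pair: pair[1])
        bank.shares -= amount              # the winner pays its bid in shares
        bank.won.append(load)
        return bank

    a = FoodBank("Bank A", shares=500)
    b = FoodBank("Bank B", shares=300)
    winner = allocate("truckload of fresh produce", [(a, 220), (b, 260)])
    print(winner.name, "wins; remaining shares:", winner.shares)
    # -> Bank B wins; remaining shares: 40

The design choice is the point: because bids are paid in a currency that can’t be spent twice, each food bank reveals how badly it needs a given load, and the food flows to where it is valued most.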

Work

The nature of work has changed a lot over the last 200 years or so. Before the industrial revolution, between 85 percent and 90 percent of the people in the Western world were farm laborers. Their work was excruciatingly difficult, as witnessed by one 18th-century Austrian physician who observed that “in many villages [of the Austrian Empire] the dung has to be carried on human backs up high mountains and the soil has to be scraped in a crouching position; this is the reason why most of the young people are deformed and misshapen.” People lived on the edge of starvation, with both the very young and the very old expected to contribute as much as they could to the economic output of the family (most production in the pre-modern era was based on the family unit, hence the Greek term oikonomia, or household management). In those circumstances, sickness was a catastrophe: It reduced the family unit’s production, and therefore its consumption.

The industrial revolution allowed people to move from farms to factories, where work was better paid, more enjoyable, and less strenuous (which is largely why people in poor countries continue to stream from agricultural employment to manufacturing jobs today). Moreover, wealth exploded (real annual income per person in the United States rose from $1,980 in 1800 to $53,018 in 2016). That allowed for ever-increasing specialization, which included a massive expansion of services catering to the desires of an ever-more-prosperous population.

The service sector today consists of jobs in information, investment services, technical and scientific services, health care, and social assistance, as well as in arts, entertainment, and recreation. Most of these jobs are less physically arduous, more intellectually stimulating, and better paid than either agricultural or manufacturing jobs ever were. Crucially, many of these service-sector jobs can be performed remotely. That means that even in the midst of the government-imposed economic shutdown, some work (about a third, estimates suggest) can go on. The economic losses from COVID-19, in other words, will be astronomical, but not total.

My own organization, for example, shut its doors in mid-March. Since then, everyone has been scribbling away at home or appearing on news shows around the world via the Internet. All of us are in regular contact via the phone, Zoom, and Microsoft Teams. Other organizations are doing the same. As we already discussed, a great deal of shopping is taking place online. Shipping and delivery companies are expanding, with Amazon hiring 100,000 additional workers in the United States. Home entertainment, of course, has grown tremendously, with Netflix adding millions of new customers and expanding its offerings with thousands of new films and television shows. With over 30 million American children stuck at home, online learning companies are booming, and educators from high-school teachers to college professors continue to perform their jobs remotely. Telehealth is expanding, allowing patients to see their doctors in a safe and convenient way. Even minor medical procedures, such as eye exams, can be conducted remotely, and multiple companies will deliver your new specs to your front door. Banking and finance are still going on, with many people taking advantage of low interest rates to refinance their mortgages. Finally, the often unfairly maligned pharmaceutical industry is expanding as we all wait and hope for the release of a COVID-19 vaccine or effective therapeutic treatment.

Sociality

Aristotle observed that “man is by nature a social animal” and noted that without friends we would be unhappy. But the role of sociality (that is to say, the tendency to associate in or form social groups) goes much deeper than that. As William von Hippel explained in his 2018 book The Social Leap, sociality is the mechanism by which Homo sapiens came about. When early hominids were forced down from the trees (perhaps as a result of a climatic change that dried up African forests), they became more vulnerable to predators. To cover longer distances between the fast-disappearing trees while maintaining a modicum of protection against other animals, our ancestors developed bipedalism, which allowed them to free their upper body to carry weapons such as sticks and stones.

Even more important was the invention of cooperation. While a stick-wielding ape is slightly better-off than an unarmed one, a group of armed apes is much better at dispatching predators. Individuals in more cooperative bands survived to adulthood and bred more often, resulting in more-cooperative species. Furthermore, since living alone was tantamount to a death sentence, selfish apes who didn’t care about being ostracized for not pulling their weight died off, resulting in a desire for communal cooperation and a deep-rooted fear of rejection by the group.

The early hominids had brains more like those of chimps than those of modern humans. That’s because the evolutionary pressures that shaped those early brains — such as predation and food scarcity — could be overcome without tremendous intelligence. These pressures to survive were part of the physical landscape — a challenging but static environment that didn’t require a lot of cognitive ability to navigate. The environmental pressure that resulted in modern humans was the social system itself. The social landscape is much more dynamic than the physical one. Once they had banded together in groups, our ancestors were forced to forge relationships with, and avoid being exploited by, individuals with divergent and constantly shifting interests. Those who couldn’t keep up with the increasingly complex social game either died or were unable to mate.

This new pressure created a positive evolutionary cycle: Banding together created more complex social systems, which required bigger brains; bigger brains needed to be fed; and the best way to get more food was more cooperation and a more sophisticated social system. The main cognitive capacity to emerge from this cycle is known as the “theory of mind.” In short, the theory of mind is the ability to understand that other minds can have reasoning, knowledge, and desires different from your own. While that seems basic, the theory of mind distinguishes us from all other life on Earth. It allows us to determine whether an affront, for example, was intentional, accidental, or forced. It allows us to feel emotions such as empathy, pride, and guilt — abilities that are keys to a functioning society.

So sociality and human beings are inseparable, as we have all been clearly reminded by the sudden restrictions on our ability to interact with others. As we sit at home, working away on our computers or watching television, most of us feel a tremendous sense of isolation (“social distancing”) from our family, friends, and colleagues. The urge to be around others is innate to us. It is who we are.

Dissatisfied with impersonal modes of communication, such as email and texting, we have rediscovered the need for a face-to-face interaction with our fellow humans. To that end, we utilize digital platforms such as Zoom, Google Hangouts, Facebook Live, and FaceTime to catch up on the latest news in other people’s lives, or simply to complain about the misery of loneliness and the pathetic inadequacy of our public officials (of both parties). Throughout the nation, people engage in virtual happy hours, dinners, book clubs, fitness classes, religious services, and group meditation. As my Cato Institute colleague Chelsea Follett recently wrote, “Technology has made it easier than ever to hold a physically-distanced ‘watch party’ synchronized so that viewers in different locations see the same part of a movie at the same time. For those who like to discuss movies as they watch, technology also enables a running group commentary of each scene in real time.” In the saddest of cases, technology enables people to say goodbye to dying friends and relatives. In a very real sense, therefore, technology keeps us sane (or, at the very least, saner).

Technology, then, allows us to cope with the challenges of the pandemic in ways that our ancestors could not even dream about. More important, technology allows our species to face the virus with grounds for rational optimism. In these dark days, remember all the scientists who are utilizing the accumulated store of human knowledge to defeat COVID-19 in record time and all the marvelous (not to say miraculous) ways the modern world keeps us well fed, psychologically semi-balanced, and (in many cases) productively engaged.

This originally appeared in National Review.