Blog Post | Economic Growth

The Self-Made $4,000 Suit and the Benefits of Exchange

Critics decry increasingly affordable clothing, viewing falling prices as a sign of worker exploitation. They're mistaken.

Last week, I wrote about a man who spent 6 months of his life and $1,500 to make a sandwich entirely from scratch, without the benefits of market exchange. The story illustrates how exchange and trade enrich our lives.   

After making his incredibly costly sandwich, the same man embarked on an even costlier endeavor: making a suit from scratch. He picked cotton from a field, spun the cotton into thread, wove the thread into cloth, sheared wool from a sheep, harvested hemp, raised silkworms for their silk, killed a deer, and tanned its hide to make leather. This process cost him 10 months of work and $4,000.   

At the end of the video documenting how he made the “suit,” he stands in a bizarre-looking outfit with pants that end at his knees and says with regret, “OK, even with all that work, I might have run a little short on material.” Even after 10 months of intense labor, he was unable to come close to matching the quality and price of a product that he could procure through the free market.    

Thanks to market exchange and the division of labor, obtaining new clothes is simple and increasingly affordable. For example, increasing cotton yields have lowered the price of a staple fabric material. 

The real price of a suit, measured in the number of hours it takes an average worker to earn enough to buy one, has declined: a two-piece wool suit cost the average American 12.4 fewer hours of work in 2012 than it did in 1956. (Check out Professor Don Boudreaux’s analysis for further details.)
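
For readers who want to reproduce the arithmetic, the “time price” of a good is simply its money price divided by the average hourly wage, which yields a cost in hours of work. The short sketch below illustrates the calculation with invented prices and wages rather than Boudreaux’s actual figures.

```python
# Minimal "time price" sketch: hours of work needed to afford a good.
# The prices and wages below are hypothetical placeholders, NOT the
# actual 1956/2012 figures from Boudreaux's analysis.

def time_price(money_price: float, hourly_wage: float) -> float:
    """Return the hours an average earner must work to buy the good."""
    return money_price / hourly_wage

suit_1956 = time_price(money_price=80.0, hourly_wage=1.90)    # ~42 hours
suit_2012 = time_price(money_price=550.0, hourly_wage=19.00)  # ~29 hours

print(f"1956: {suit_1956:.1f} h, 2012: {suit_2012:.1f} h, "
      f"saved: {suit_1956 - suit_2012:.1f} h of work")
```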

Critics sometimes decry increasingly affordable clothing, viewing falling prices as a sign of worker exploitation. In 1891, U.S. President Benjamin Harrison summed up this viewpoint when he said, “I pity the man who wants a coat so cheap that the man or woman who produces the cloth or shapes it into a garment will starve in the process.” However, as Johan Norberg pointed out yesterday in the U.K. Huffington Post, far from making people poorer, the garment industry has actually helped to decrease poverty. As he eloquently puts it:

Western activists rail against “sweatshops”, but among researchers and economists from left to right there is a consensus that these jobs are the stepping stones out of poverty.

Take a moment to consider what you are wearing right now, and how much work went into its creation, from the harvesting of its raw materials to the finishing touches. No one person created it—it is the fruit of a complex family tree of mutually beneficial human cooperation through the market.

Wall Street Journal | Health & Medical Care

You Can Now Get Weight-Loss Drug Zepbound through Amazon

“Amazon Pharmacy, which has sold prescription medicines online since 2020, will now handle some of the home delivery of anti-obesity therapy Zepbound and other Eli Lilly drugs that are ordered through the drugmaker’s new direct-to-consumer service, the companies said Wednesday.

The service, called LillyDirect, connects patients with telehealth services specializing in obesity that can write prescriptions for Zepbound or another weight-loss drug. The service also arranges for a prescription to be processed and mailed directly to customers.”

From Wall Street Journal.

Wall Street Journal | Science & Technology

Amazon Introducing Robotics to Speed Deliveries

“Amazon.com is introducing an array of new artificial intelligence and robotics capabilities into its warehouse operations that will reduce delivery times and help identify inventory more quickly.

The revamp will change the way Amazon moves products through its fulfillment centers with new AI-equipped sortation machines and robotic arms. It is also set to alter how many of the company’s vast army of workers do their jobs.”

From Wall Street Journal.

Blog Post | Health & Medical Care

COVID-19 Should Make Us Grateful for Technology

Imagine a pre-modern pandemic.

“In a way, everything is technology,” noted one of the world’s greatest economic historians, Fernand Braudel, in his monumental study Civilization and Capitalism. “Not only man’s most strenuous endeavors but also his patient and monotonous efforts to make a mark on the external world; not only the rapid changes . . . but also the slow improvements in processes and tools, and those innumerable actions which may have no immediate innovating significance but which are the fruit of accumulated knowledge,” he continued.

Yes, land, labor, and capital (that’s to say, the factors of production) are important components of economic growth. In the end, however, human progress in general and global enrichment in particular are largely dependent on invention and innovation. That is surely even clearer now that humanity’s hopes for the end of the pandemic and for our liberation from the accompanying lockdown rest on further scientific breakthroughs within the pharmaceutical industry. Let’s take a brief look at the impact of technology on health care, food supply, work, and sociality in the time of COVID-19.

Health Care

The impact of modern technology is surely most keenly felt and anticipated within the sphere of human health care. Consider some of the worst diseases that humanity has had to face in the past. Smallpox, which is thought to have killed an estimated 300 million people in the 20th century alone, originated in either India or Egypt at least 3,000 years ago. Smallpox variolation, it seems, was practiced in China in the tenth century, but it was not until the late 18th century that Edward Jenner vaccinated his first patient against the disease. Smallpox was fully eradicated only in 1980.

Similar stories could be told about other killer diseases. Polio, which can be seen depicted in Egyptian carvings from the 18th dynasty, is of ancient origin. Yet the disease wasn’t properly analyzed until the year of the French Revolution, with Jonas Salk’s vaccine appearing only in 1955. Today, polio is close to being eradicated (just 95 cases were reported in 2019).

Malaria, probably humanity’s greatest foe, is at least 30 million years old (the parasite has been found in an amber-encased mosquito from the Paleogene period). It was only after the discovery of the New World that knowledge about the fever-reducing benefits of the bark of the cinchona tree spread to Europe and Asia. Quinine was first isolated in 1820, and chloroquine was introduced in 1946. Artemisinin drugs, which we still use, were discovered in the late 1970s. That’s to say that humanity lived with deadly diseases for millennia without fully knowing what they were, how they were transmitted, and how they could be cured. The fate of humanity, our ancestors thought, fluctuated under the extraneous influence of the “wheel of fortune” and there was nothing that anyone could do about it. One day you were alive, and the next day you were not.

Contrast that glacial pace of progress, and the fatalistic acceptance of disease and death, with our response time to the current pandemic. The Wuhan Municipal Health Commission reported the existence of a cluster of cases of “pneumonia” in Wuhan on December 31. On January 7 the Chinese identified the pathogen (novel coronavirus) responsible for the outbreak. On January 11 China sequenced the genetic code of the virus, and the next day it was publicly available. That enabled the rest of the world to start making diagnostic kits to identify the disease.

To take one example, the first COVID-19 infection in South Korea was identified on January 20. On February 4, the first test kit (made by Kogene Biotech) entered production. On February 7, the test kit was available at 50 locations around the country. Other countries followed suit.

The World Health Organization, which declared COVID-19 a global pandemic on March 11, may have acted too late. Still, it is noteworthy that just two months elapsed between the first sign of trouble and the time when the entire world put measures in place to retard the spread of the disease. In the meantime, we have learned a lot about governmental incompetence and regulatory overreach. But we have also learned a great deal about the spread and symptoms of the disease. Instead of starting from scratch, medical specialists in Europe and America can draw on the expertise of their colleagues in the Far East. Before the telegraph appeared midway through the 19th century, it took up to a month for a ship to carry information from London to New York. Today, we learn about the latest COVID-19 news (good and bad) and research in seconds.

By mid-April, thousands of highly educated and well-funded specialists throughout the world were using supercomputers and artificial intelligence to identify promising paths toward victory over the disease. Some 200 different programs are underway to develop therapies and vaccines to combat the pandemic. They include studies of the effectiveness of existing antiviral drugs, such as Gilead’s remdesivir, Ono’s protease inhibitor, and Fujifilm’s favipiravir. The effectiveness of generic drugs, such as hydroxychloroquine and chloroquine, is also being evaluated. Takeda is hard at work on convalescent plasma (TAK-888) in Japan, while Regeneron works on monoclonal antibodies in the United States. New vaccines, such as Moderna’s mRNA-1273, Inovio’s INO-4800, and BioNTech’s BNT162, are under development.

We don’t know which of these treatments (if any) will work, but here is what we can be sure of: There has never been a better time for humans to face and defeat a global pandemic. The world is richer than ever before, and money is what enables us to sustain a massive pharmaceutical industry and pay for highly sophisticated medical research and development.

Coronavirus may be deadly, but it is not the bubonic plague, which had a mortality rate of 50 percent. Luckily, it is a far milder disease, but one that has reawakened us to the danger posed by communicable diseases. Once the immediate crisis is behind us, researchers will collect billions of data points from dozens of countries and analyze the different governmental responses to the pandemic. That knowledge will be deployed by governments and the private sector to ensure that best practices are adopted, so that next time we are better prepared.

Food

When the Black Death struck Europe in 1347, the disease found the local population ripe for slaughter. Following the close of the Medieval Warm Period at the end of the 13th century, the climate turned cold and rainy. Harvests shrank and famines proliferated. France, for example, saw localized famines in 1304, 1305, 1310, 1315–17, 1330–34, 1349–51, 1358–60, 1371, 1374–75, and 1390. The Europeans, weakened by shortages of food, succumbed to the disease in great numbers.

The people of yore faced at least three interrelated problems. First, the means of transport and the transportation infrastructure were awful. On land, the Europeans used the same haulage methods (carts pulled by donkeys, horses, and oxen) that the ancients had invented. Similarly, much of Europe continued to use roads built by the Romans. Most people never left their native villages or visited the nearest towns. They had no reason to do so, for all that was necessary to sustain their meager day-to-day existence was produced locally.

The second problem was the lack of important information. It could take weeks to raise the alarm about impending food shortages, let alone organize relief for stricken communities.

Third, regional trade was seldom free (France did not have a single internal market until the Revolution) and global trade remained relatively insignificant in economic terms until the second half of the 19th century. Food was both scarce and expensive. In 15th-century England, 80 percent of ordinary people’s private expenditure went for food. Of that amount, 20 percent was spent on bread alone. Under those circumstances, a local crop failure could spell the destruction of an entire community. (Those who think that COVID-19 exposed the fragility of modern society should look up the Great Famine.)

By contrast, in 2013 only 10 percent of private expenditure in the United States went to food, a figure that is itself inflated by the amount Americans typically spend in restaurants. Speaking of restaurants, while most have been forced to close their dining rooms, many restaurateurs use apps to deliver excellent food at reasonable prices. Moreover, months into the COVID-19 pandemic, the shops are, generally, well stocked and regularly replenished by the largely uninterrupted stream of cargo flights, truck hauling, and commercial shipping. Due to the miracle of mobile refrigeration, fresh produce continues to be sourced from different parts of the United States and abroad. Shortly before writing this piece, I was able to buy oranges from California, avocados from Mexico, and grapes from Chile in my local supermarket. Globalization may be under pressure from both the left and the right of the U.S. political spectrum, but should the pandemic impair U.S. agricultural production, many will be forced to acknowledge the benefits of the global food supply and our ability to import food from parts of the world unaffected by COVID-19.

This extensive and, at this point, still sturdy supply chain is, of course, a technological marvel. Computers collate information about items on the shelf that are in short supply, adjust the variety and quantity of items shipped between stores, fill new orders, etc. And so, commerce that’s still allowed to go on goes on. So does charity. Feeding America, a network of more than 200 food banks, feeds tens of millions of people through food pantries, soup kitchens, shelters, etc. Since 2005, the organization has been using a computerized internal market to allocate food more rationally. Feeding America uses its own currency, called “shares,” with which individual food banks can bid on the foods that they need the most. Grocery-delivery services bring food to the doorsteps of those who cannot or do not want to leave their homes. The old and the infirm can also use phones, emails, and apps to call upon volunteers to do their shopping and delivery.
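
To get a feel for how a shares-based internal market steers food toward the banks that value it most, here is a deliberately simplified sketch of a single sealed-bid round. The bank names, share balances, and bids are invented, and Feeding America’s real mechanism is considerably more elaborate.

```python
# Toy version of a shares-based internal market: food banks bid their
# "shares" on a load of food, and the highest affordable bid wins.
# Banks, balances, and bids are invented for illustration only.

def allocate(bids, balances):
    """Award the load to the highest bid a bank can cover; deduct its shares."""
    affordable = {bank: bid for bank, bid in bids.items() if bid <= balances.get(bank, 0)}
    if not affordable:
        return None
    winner = max(affordable, key=affordable.get)
    balances[winner] -= affordable[winner]
    return winner

balances = {"Bank A": 1000, "Bank B": 750, "Bank C": 400}
winner = allocate({"Bank A": 300, "Bank B": 450, "Bank C": 500}, balances)
print(winner, balances)  # Bank B wins; Bank C's bid exceeds its balance
```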

Work

The nature of work has changed a lot over the last 200 years or so. Before the industrial revolution, between 85 percent and 90 percent of the people in the Western world were farm laborers. Their work was excruciatingly difficult, as witnessed by one 18th-century Austrian physician who observed that “in many villages [of the Austrian Empire] the dung has to be carried on human backs up high mountains and the soil has to be scraped in a crouching position; this is the reason why most of the young people are deformed and misshapen.” People lived on the edge of starvation, with both the very young and the very old expected to contribute as much as they could to the economic output of the family (most production in the pre-modern era was based on the family unit, hence the Greek term oikonomia, or household management). In those circumstances, sickness was a catastrophe: It reduced the family unit’s production, and therefore its consumption.

The industrial revolution allowed people to move from farms to factories, where work was better paid, more enjoyable, and less strenuous (which is largely why people in poor countries continue to stream from agricultural employment to manufacturing jobs today). Moreover, wealth exploded (real annual income per person in the United States rose from $1,980 in 1800 to $53,018 in 2016). That allowed for ever-increasing specialization, which included a massive expansion of services catering to the desires of an ever-more-prosperous population.
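
A quick back-of-the-envelope calculation shows what those two endpoint figures imply: roughly a 27-fold increase, or an average compound growth rate of about 1.5 percent per year sustained over two centuries.

```python
# Implied average annual growth rate from the two endpoint figures cited
# above: real income per person of $1,980 in 1800 and $53,018 in 2016.
start, end, years = 1980, 53018, 2016 - 1800

growth_factor = end / start              # roughly a 27-fold increase
cagr = growth_factor ** (1 / years) - 1  # about 1.5 percent per year

print(f"{growth_factor:.1f}x over {years} years ≈ {cagr:.2%} per year")
```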

The service sector today consists of jobs in the information sector, investment services, technical and scientific services, health care, and social-assistance services, as well as in arts, entertainment, and recreation. Most of these jobs are less physically arduous, more intellectually stimulating, and better paid than either agricultural or manufacturing jobs ever were. Crucially, many of these service-sector jobs can be performed remotely. That means that even in the midst of the government-imposed economic shutdown, some work (about a third, estimates suggest) can go on. The economic losses from COVID-19, in other words, will be astronomical, but not total.

My own organization, for example, shut its doors in mid-March. Since then, everyone has been scribbling away at home or appearing on news shows around the world via the Internet. All of us are in regular contact via the phone, Zoom, and Microsoft Teams. Other organizations are doing the same. As we already discussed, a great deal of shopping is taking place online. Shipping and delivery companies are expanding, with Amazon hiring 100,000 additional workers in the United States. Home entertainment, of course, has grown tremendously, with Netflix adding millions of new customers and expanding its offerings with thousands of new films and television shows. With over 30 million American children stuck at home, online learning companies are booming, and educators from high-school teachers to college professors continue to perform their jobs remotely. Telehealth is expanding, allowing patients to see their doctors in a safe and convenient way. Even some routine medical services, such as eye exams, can be conducted remotely, and multiple companies will deliver your new specs to your front door. Banking and finance are still going on, with many people taking advantage of low interest rates to refinance their mortgages. Finally, the often unfairly maligned pharmaceutical industry is expanding as we all wait and hope for the release of a COVID-19 vaccine or effective therapeutic treatment.

Sociality

Aristotle observed that “man is by nature a social animal” and noted that without friends we would be unhappy. But the role of sociality (that is to say, the tendency to associate in or form social groups) goes much deeper than that. As William von Hippel explained in his 2018 book The Social Leap, sociality is the mechanism by which Homo sapiens came about. When early hominids were forced down from the trees (perhaps as a result of a climatic change that dried up African forests), they became more vulnerable to predators. To cover longer distances between the fast-disappearing trees while maintaining a modicum of protection against other animals, our ancestors developed bipedalism, which allowed them to free their upper body to carry weapons such as sticks and stones.

Even more important was the invention of cooperation. While a stick-wielding ape is slightly better-off than an unarmed one, a group of armed apes is much better at dispatching predators. Individuals in more cooperative bands survived to adulthood and bred more often, resulting in more-cooperative species. Furthermore, since living alone was tantamount to a death sentence, selfish apes who didn’t care about being ostracized for not pulling their weight died off, resulting in a desire for communal cooperation and a deep-rooted fear of rejection by the group.

The early hominids had brains more like those of chimps than those of modern humans. That’s because the evolutionary pressures that shaped those early brains — such as predation and food scarcity — could be overcome without tremendous intelligence. These pressures to survive were part of the physical landscape — a challenging but static environment that didn’t require a lot of cognitive ability to navigate. The environmental pressure that resulted in modern humans was the social system itself. The social landscape is much more dynamic than the physical one. Once they had banded together in groups, our ancestors were forced to forge relationships with, and avoid being exploited by, individuals with divergent and constantly shifting interests. Those who couldn’t keep up with the increasingly complex social game either died or were unable to mate.

This new pressure created a positive evolutionary cycle: Banding together created more complex social systems, which required bigger brains; bigger brains needed to be fed; and the best way to get more food was more cooperation and a more sophisticated social system. The main cognitive development that evolved from this evolutionary cycle is known as the “theory of mind.” In short, the theory of mind is the ability to understand that other minds can have different reasoning, knowledge, and desires from your own. While that seems basic, the theory of mind distinguishes us from all other life on Earth. It allows us to determine whether an affront, for example, was intentional, accidental, or forced. It allows us to feel emotions such as empathy, pride, and guilt — abilities that are keys to a functioning society.

So sociality and human beings are inseparable, as we have all been clearly reminded by the sudden restrictions on our ability to interact with others. As we sit at home, working away on our computers or watching television, most of us feel a tremendous sense of isolation (“social distancing”) from our family, friends, and colleagues. The urge to be around others is innate to us. It is who we are.

Dissatisfied with impersonal modes of communication, such as email and texting, we have rediscovered the need for face-to-face interaction with our fellow humans. To that end, we utilize digital platforms such as Zoom, Google Hangouts, Facebook Live, and FaceTime to catch up on the latest news in other people’s lives, or simply to complain about the misery of loneliness and the pathetic inadequacy of our public officials (of both parties). Throughout the nation, people engage in virtual happy hours, dinners, book clubs, fitness classes, religious services, and group meditation. As my Cato Institute colleague Chelsea Follett recently wrote, “Technology has made it easier than ever to hold a physically-distanced ‘watch party’ synchronized so that viewers in different locations see the same part of a movie at the same time. For those who like to discuss movies as they watch, technology also enables a running group commentary of each scene in real time.” In the saddest of cases, technology enables people to say goodbye to dying friends and relatives. In a very real sense, therefore, technology keeps us sane (or, at the very least, saner).

Technology, then, allows us to cope with the challenges of the pandemic in ways that our ancestors could not even dream about. More important, technology allows our species to face the virus with grounds for rational optimism. In these dark days, remember all the scientists who are utilizing the accumulated store of human knowledge to defeat COVID-19 in record time and all the marvelous (not to say miraculous) ways the modern world keeps us well fed, psychologically semi-balanced, and (in many cases) productively engaged.

This originally appeared in National Review.

Blog Post | Health & Medical Care

Technology Makes Social Distancing Easier

It has become increasingly clear that social distancing should more aptly be called physical distancing, because those practicing it can still be social.

Not long ago, many people decried screen time as an epidemic. But now that humanity finds itself in the midst of an actual disease pandemic, screens are proving to be a boon to the species. Progress in digital technology has perhaps never been more evident than in this moment of widespread social distancing measures.

Without today’s technology, “social distancing” would have meant isolation. From work, education and errands to leisure activities and socializing, technology is making “social distancing” possible with minimal sacrifice compared to what previous generations would have had to endure to achieve the same degree of physical separation.

It is of course true that looking at screens for prolonged periods has its downsides and that moderation is important. But the use of technology to help people stay connected and keep society running smoothly during this pandemic is turning the narrative that digital technology threatens human interaction and happiness upside down.

Widespread reports have emerged of virtual dinner parties (warranting coverage in The Washington Post) and other virtual gatherings. It has become increasingly clear that social distancing should more aptly be called physical distancing — because those practicing it can still be social.

As bars temporarily shut down to prevent potential virus transmission, virtual cocktail parties and happy hours are taking off, meriting recent articles in The New York Times and The Wall Street Journal covering the phenomenon. Happy hour gatherings, those fixtures of many young professionals’ lives, have transformed into digital social events involving split-screen video chats between participants as they each raise a glass from their respective locations.

Virtual gatherings, enabled by digital platforms like Zoom, Google Hangouts, Facebook Live, FaceTime and others, are helping socially-distanced people across the world to engage with one another and socialize.

Activities that normally involve congregations of people, ranging from book clubs and fitness classes to religious services and group meditation, are going online.

Physical distancing also does not mean cultural deprivation. Many of the world’s museums, including the British Museum in London, the Guggenheim Museum in New York and the Louvre in Paris, offer virtual online tours. For those who prefer the presence of a tour guide, it is now even possible to take a live guided virtual tour at some museums (such as Monticello, the historic home of the third U.S. president, Thomas Jefferson), asking your guide questions and receiving answers in real time as you tour.

Unable to hold live concerts, musicians ranging from pop star Miley Cyrus to country singer Willie Nelson are holding virtual concerts. In a similar vein, theater-streaming services are stepping in to offer plays, ballets and Broadway performances online. New York’s Metropolitan Opera House now offers “Nightly Met Opera Streams” of past performances, set to continue for the duration of the opera house’s pandemic-induced closure.

And of course movie streaming services can bring the magic of the cinema into your home. Technology has made it easier than ever to hold a physically-distanced “watch party” synchronized so that viewers in different locations see the same part of a movie at the same time. For those who like to discuss movies as they watch, technology also enables a running group commentary of each scene in real time.

If you miss traveling, know that Google has created an online experience whereby five U.S. National Parks can be toured virtually. Without leaving home, birdwatching enthusiasts can enjoy a live view of the birds of the Panamanian rainforest thanks to Cornell University’s Lab of Ornithology or watch puffins off the coast of Maine, courtesy of the private non-profit National Audubon Society. Similarly, live zoo webcams can bring the fun of observing nature’s creatures, from majestic lions to playful sea otters, into your living room.

What about errands? Shopping at home is easier than ever, and now that regulations on the production of hand sanitizer have loosened, perhaps it will even become available again soon. For those who prefer to try clothes on before they buy, many retailers now offer a free trial period for clothing purchased online and delivered to the customer.

Telehealth is being utilized on a scale never seen before, allowing patients to connect with medical professionals without leaving home. It may soon be possible to order a COVID-19 test online, with a medical professional remotely reviewing your symptoms, as some companies have already promised. (The FDA has just announced that it has moved to ban in-home tests, but hopefully it will reverse that decision given the testing shortage.) The internet can also help with more mundane health concerns. For example, it is now possible to take an online eye exam to update your glasses or contact lens prescription, and multiple companies will ship sample frames to you to try on at home.

And, of course, online learning platforms let students learn without risking their health, while remote work similarly allows employees to keep being productive while slowing the spread of the pandemic. Even internships can be conducted remotely.

Some recent changes, like greater workplace flexibility toward remote work and improved accessibility of telehealth services, may prove enduring. “This is an inflection point, and we’re going to look back and realize this is where it all changed,” Jared Spataro, a Microsoft executive, opined in an online press briefing, referring to more organizations shifting toward openness to remote work amid the pandemic. “We’re never going to go back to working the way that we did,” he predicted. Whether he is right or not, it is clear that the pandemic has pushed humanity to use technology in innovative new ways, and that technology has made severe social distancing measures much more bearable.