Blog Post | Innovation

Ridley: Bureaucracies Stifle Innovation and Progress

We need to promote a new regulatory culture based on permissionless innovation.

While the world economy continues to grow at more than 3 per cent a year, mature economies, from Europe to Japan, are coagulating, unable to lift economic growth above a sluggish pace. The reason is that we have more and more vested interests against innovation in the private as well as the public sector.

Continuing prosperity depends on enough people putting money and effort into what the economist Joseph Schumpeter called creative destruction. The normal state of human affairs is what the jurist Sir Henry Maine called a “status” society, in which income is assigned to individuals by authority. The shift to a “contract” society, in which people negotiate their own rewards, was an aberration and it’s fading. I am writing this from Amsterdam and am reminded that we caught the idea off the Dutch, whose impudent prosperity so annoyed the ultimate status king, Louis XIV.

In most western economies, it is once again more rewarding to invest your time and effort in extracting nuggets of status wealth rather than creating new contract wealth, and it has got worse since the Great Recession, as zombie firms kept alive by low interest rates prevent the recycling of capital into new ideas. A new book by two economists, Brink Lindsey and Steven Teles, called The Captured Economy: How the Powerful Enrich Themselves, Slow Down Growth, and Increase Inequality, argues that “rent-seeking” behaviour — the technical term for extracting nuggets — explains the slow growth and rising inequality in the US.

They make the case that in four areas (financial services, intellectual property, occupational licensing and land use planning) there is ever more opportunity to live off “rents” from artificial scarcity created by government regulation: “The rents enjoyed through government favouritism not only misallocate resources in the short term but they also discourage dynamism and growth over the long term.”

Here, too, hidden subsidies ensure that financial services are a lucrative closed shop; patents and copyrights reward the entertainment and pharmaceutical industries with monopolies known as blockbusters; occupational licensing gives those with requisite letters after their name ever more monopoly earning power; and planning laws drive up the prices of properties. Such rent seeking redistributes wealth regressively — that is to say, upwards — by creating barriers to entry and rewarding the haves at the expense of the have-nots. True, the tax and benefit system then redistributes income back downwards just enough to prevent post-tax income inequality from rising. But government is taking back from the rich in tax that which it has given to them in monopoly.

As an author, my future grandchildren will earn (modest) royalties from my books thanks to lobbying by American corporations to extend copyright to an absurd 70 years after I am dead. Yet there is no evidence that patents and copyrights incentivise innovation, except in a very few cases. Indeed, say Lindsey and Teles, the evidence suggests that “rents that now accrue to movie studios, record companies, software producers, pharmaceutical firms, and other [intellectual property] holders amount to a significant drag on innovation and growth, the very opposite of IP law’s stated purpose.”

[Thomas Babington Macaulay MP summarised an early attempt to extend copyright in a debate thus: “The principle of copyright is this. It is a tax on readers for the purpose of giving a bounty to writers. The tax is an exceedingly bad one; it is a tax on one of the most innocent and most salutary of human pleasures; and never let us forget, that a tax on innocent pleasures is a premium on vicious pleasures.” A correspondent sends me the following details of this appalling saga: “Someone noted that there is a divergence in copyright term in the European Union. All the then member states protect works for the life of the author plus fifty years while West Germany alone protects works for the life of the author plus seventy years. Immediately the copyright publishers suggested this as something in need of harmonisation. But instead of harmonising down to the norm, all the member states were lobbied to harmonise up to the unique German standard. As a result, Adolf Hitler’s ‘Mein Kampf’, which was going out of copyright in 1995, was suddenly revived and protected as a copyrighted work throughout the European Union. Gilbert and Sullivan operettas whose copyright had been controlled by the stultifying hand of the D’Oyly Carte Opera Company found themselves in a position to once again stop anyone else performing Gilbert and Sullivan works or creating anything based upon them. It is not surprising that, following a brief flowering of new creativity when the Gilbert and Sullivan copyrights initially expired (e.g. Joseph Papp’s production of Pirates on Broadway and the West End stage), since their revival by the European Union harmonisation legislation their use has become effectively moribund. A generation of young people are growing up without knowing anything about Gilbert and Sullivan – an art form which, it can be argued, gave birth to the modern American and British musical theatre.”]

As for occupational licensing, Professor Len Shackleton of the University of Buckingham argues that it is mostly a racket to exploit consumers. After centuries of farriers shoeing horses, a private member’s bill in 1975 (a requirement unique in Europe) gave the Farriers Registration Council the right to prosecute those who shod horses without its qualification.

Then there are energy prices. Lobbying by renewable energy interests has resulted in a system in which hefty additions are made to people’s energy bills to reward investors in wind, solar and even carbon dioxide-belching biomass plants. The rewards go mostly to the rich; the costs fall disproportionately on the poor, for whom energy bills are a big part of their budgets.

An example of how crony capitalism stifles innovation: Dyson found that the EU energy-label standards for vacuum cleaners were rigged in favour of German manufacturers. The European courts initially rebuffed Dyson’s attempts to challenge the rules, but Dyson won on appeal and then used freedom of information requests to uncover correspondence between a group of German manufacturers and the EU, while representations by European consumer groups were ignored.

So deeply have most businesses become embedded in government cronyism that it is hard to draw the line between private, public and charitable entities these days. Is BAE Systems or Carillion really a private enterprise any more than Oxford University, Oxfam, Oxfordshire county council or the NHS? All are heavily dependent on government contracts, favours or subsidies; all are closely regulated; all have well-paid senior managers extracting rent with little risk, and thickets of middle-ranking bureaucrats incentivised to resist change. Disruptive start-ups are as rare as pandas; the vast majority of people work for corporate brontosaurs.

Capitalism and the free market are opposites, not synonyms. Some in the Tory party grasp this. Launching Freer, a new initiative to remind the party of the importance of freedom, two new MPs, Luke Graham and Lee Rowley, not only lambast fossilised socialism and anachronistic unions, but also boardrooms “peppered with oligarchical and monopolist cartels”.

One of the most insightful books of recent years was The Innovation Illusion by Fredrik Erixon and Björn Weigel, which argues that big companies increasingly spend their profits not on innovation but on share buybacks and other “rents”. Far from swashbuckling enterprise, much big business is “increasingly hesitant to invest and innovate”. Like Kodak and Nokia, such firms resist having to reinvent themselves, even unto death. Microsoft “was too afraid of destroying the value of Windows” to go where software was heading.

As a result, globalisation, far from being a spur to change, is an increasingly conservative force. “In several sectors, the growing influence of large and global firms has increasingly had the effect of slowing down market dynamism and reducing the spirit of corporate experimentation”.

The real cause of Trump-Brexit disaffection is not too much change, but too little. We need to “radically reduce the restrictive effect of precautionary regulation” and promote a new regulatory culture based on permissionless innovation, Erixon and Weigel say. “Western economies have developed a near obsession with precautions that simply cannot be married to a culture of experimentation”. Amen.

This first appeared in The Times.  

Blog Post | Economic Growth

Economic Growth Is More Important than You Think

Growth is a saving grace for the world's poorest people, and also has a major impact on the daily lives of Americans and the rest of the developed world.

This article first appeared in CapX.

What is economic growth, and why should it matter to ordinary people? Those questions are hard to answer in a hysterical world where once-dry academic matters are now politicized without fail. Recently, commentators from all sides have taken to dismissing growth as a golden idol of narrow-minded capitalists. Likewise, many people see the pursuit of growth as an alternative, not a complement, to the pursuit of social needs like public health and sustainability.

These narratives are understandable, considering the misinformed and tone-deaf ways in which many public figures have attempted to advocate the importance of growth and economic activity, particularly during the current pandemic. But the narratives themselves could not be more misleading. Economic growth affects the lives of ordinary people in many crucial ways, not just in the West, but importantly in countless developing nations too. In fact, growth is generally the greatest source of improvement in global living standards.

If we visualize the economy as a pie, then growth can be visualized as the pie getting bigger. Most economists measure growth using a metric called Gross Domestic Product (GDP), which defines the pie’s “ingredients” as consumption, investment, government spending and net exports. In developing countries, growth is largely driven by investment, while wealthier countries tend to rely on innovation to continue growing.
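For readers who want the “pie” in symbols, the standard textbook expenditure identity behind this description (a conventional formulation, not one spelled out in the article) is:

$$
\text{GDP} = C + I + G + (X - M)
$$

where \(C\) is consumption, \(I\) is investment, \(G\) is government spending, and \(X - M\) is net exports (exports minus imports). Growth, in turn, is simply the percentage change in this total from one period to the next, usually adjusted for inflation.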

These working definitions, while highly simplified, are better than nothing. They are important because they can make it easier to understand how GDP correlates with countless key metrics of living standards.

In sub-Saharan Africa, for instance, real average GDP per capita grew by 42 percent between 1990 and 2018. That growth corresponded to major decreases in extreme poverty, infant mortality, and undernourishment.

Growth also increases access to resources that make people safer and healthier. A 2019 paper shows that, while disaster-related fatality rates fell for all global income groups between 1980 and 2016, developing countries in the early stages of growth experienced the greatest improvements. That is because those countries made the greatest relative advances in infrastructure and safety measures—advances facilitated by growth.

Growth is a saving grace for the world’s poorest people, but it also has a major impact on the daily lives of Americans and the rest of the developed world, and that impact is especially important in the age of coronavirus. For example, continuous growth has led to lifesaving breakthroughs in medical technology and research, which has allowed humanity to fight COVID-19 more quickly and effectively than we ever could have in the past. Vaccines for certain ailments took decades to develop as late as the mid-20th century, but it is quite possible that a vaccine for COVID-19 will be widely available just one year after the virus’s initial outbreak.

To many supposedly environmentally conscious critics, it seems intuitive that growth is not sustainable. However, sustainability-based criticisms of growth tend to ignore the reality that growth leads to green innovations that help the planet. Labor-augmenting technologies allow us to produce more while conserving resources and protecting the environment. Moreover, wealthier countries are better equipped to develop and adopt green technologies.

MIT scientist Andrew McAfee has documented many of the concrete environmental benefits of growth in his recent book, More From Less. McAfee notes that increases in America’s population and productive activity in recent decades have coincided with significant decreases in air and water pollution, along with gross reductions in the uses of water, fertilizer, minerals and other resources—all because economic growth and market coordination led to improvements in manufacturing and technology. For facilitating this process, which McAfee calls “dematerialization,” growth should be seen as a key to sustainability, not a barrier.

In a broader sense, growth has made our lives more convenient, dynamic and entertaining via developments in consumer technologies and other innovations. Imagine quarantining for five months (and counting) without the internet, PCs or smartphones. Many people would have no way of doing their jobs. Even for those who could, life would be much more difficult, not to mention dull.

Indeed, if one thing could be said to summarize the impact of growth around the world, it would be that growth makes everyone’s life easier. For instance, the amount of labor needed for average workers to purchase countless basic goods and services is at an all-time low and decreasing, largely because supply chains have grown and become more efficient. The result is that ordinary people, especially those in lower income groups with relatively greater reliance on basic goods, are better off.
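One simple way to make the “amount of labor” point concrete is to express a good’s cost in hours of work rather than dollars. The figures below are purely illustrative (they are not from the article):

$$
\text{time price} = \frac{\text{money price}}{\text{hourly wage}}
$$

If a basket of staples costs \$20 and the average hourly wage rises from \$10 to \$25, the time price of that basket falls from two hours of work to 48 minutes, even if its dollar price never changes. Falling time prices are what “makes everyone’s life easier” looks like in practice.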

The story of economic growth is in many ways the story of how cooperation and exchange can defeat poverty and scarcity. The better we understand that, the more likely we will be to support policies which allow resources to flow into areas that need them the most. Broadly speaking, no political idea has been more effective in this regard than free trade.

Knowing the importance of innovation to human well-being should also encourage us to embrace new technology instead of fearing it. We must therefore be wary of overbearing regulations and fiscal policies that prevent ideas from flourishing.

Most importantly, we should not listen to those who claim that economic growth is a pointless, abstract goal that only benefits the rich and leaves ordinary people behind. Growth is a vital driver of progress in modern society and should be taken seriously for the sake of humanity and the planet.

Blog Post | Science & Technology

Vasquez Reviews Ridley's “How Innovation Works”

Innovation requires trial and error. It requires the possibility to experiment and to fail. Only then can innovation provide the path to success and human progress.

The dog is an innovation. It took place at least 20,000 years ago, when a group of humans domesticated wolves and subsequently began to develop different breeds. The light bulb, wheeled baggage, and the computer are also innovations.

Since prehistoric times, innovation has changed the course of our lives and is, according to science writer Matt Ridley, “the most important fact about the modern world, but one of the least well understood.” Ridley is the author of the new eye-opening book, How Innovation Works: And Why It Flourishes in Freedom. In it, he tells the story of dozens of innovations to illustrate how that phenomenon is the main cause of the enormous progress humanity has seen in the past few centuries. He derives the following lessons from his study.

Innovation almost always happens gradually and not suddenly. It is “not an individual phenomenon, but a collective, incremental and messy network phenomenon.” Ridley asks, “Who invented the computer?” To answer the question, we would have to go back more than 200 years to the Jacquard loom and then review countless succeeding contributions and innovators. The same can be said about the car or virtually all other innovations, even though we sometimes identify them with individuals like Henry Ford who discovered a way of making the automobile widely accessible to the public.

Innovation is thus collaborative, and the same ideas often independently occur to different individuals at the same time. The telegraph, the thermometer, photography, and the hypodermic needle are examples of simultaneous inventions. Twenty-one people invented the light bulb at about the same time. If Thomas Edison or the Wright brothers had never existed, we would still enjoy artificial light and the wonder of airplane travel.

Innovation is not the same thing as invention. One can invent something novel, but the people who make a difference are the innovators who figure out how that new idea can be useful to society, typically by improving on it and lowering its costs. Innovations exist due to our growing knowledge and a demand for the innovative product. As Ridley observes, “The light bulb emerged inexorably from the combined technologies of the day. It was bound to appear when it did, given the progress of other technologies.”

It’s not possible to plan innovation. Not even the innovators can do so. Innovations more often than not come about because of chance events or unexpected discoveries. Sheer luck explains Alexander Fleming’s discovery of penicillin. The founders of Google did not “set out in search of search engines. The founders of Instagram were trying to make a gaming app. The founders of Twitter were trying to invent a way for people to find podcasts.” Innovation is unpredictable.

That unpredictability also helps explain why the so-called “entrepreneurial state” has not effectively promoted innovation and why we can’t expect it to do so. For example, Ridley explains that contrary to those who advocate in favor of publicly funded innovation, the U.S. government did not intend to create a global internet. Only when the internet “escaped the clutches” of the government – that is, when it was essentially privatized in the 1990s – did the private sector and universities begin to transform the internet into what we use today.

Ridley observes that the widely held view that science leads to technology and innovation—frequently canvassed to justify public subsidies for science—is only partially correct. It is equally true that scientific knowledge is the product of technological improvements and attempts to understand the latter. The first inoculations were conducted without a good understanding as to how and why they worked. Attempts to resolve problems in the yogurt industry contributed to the development of the revolutionary gene-editing method known as CRISPR (which may yet help us find a treatment for COVID-19).

Innovation requires trial and error. It requires the possibility to experiment and to fail. Only then can innovation provide the path to success and human progress. Or, as Ridley puts it, “Innovation is the child of freedom and the parent of prosperity.”

Blog Post | Science & Technology

Our Technological Renaissance

Claims of stagnation are not persuasive.

I put on a record today.

Well, I didn’t put on a record, so much as I put on a . . . well, a what? It wasn’t a vinyl plate or a spool of tape or even a piece of shiny circular plastic. Indeed, whatever physical medium was being used to store the music I was listening to wasn’t available to me at all. It simply came in through the air—like lightning. From the comfort of my chair, I picked up my iPhone, chose the album I wanted from the million-strong list that loaded instantly before my eyes, and directed the sound to the speakers in my vicinity, all of which started to play my choice within a few milliseconds. And then, when I tired of it, I shushed it with my voice.

I think about this sometimes when I hear people complain that the bright technological future we were all promised has steadfastly failed to appear. How, I wonder, would I even begin to explain Spotify and Sonos to my grandfather, who died in 1994? A compact disc could be comprehended by the elderly as a better vinyl record, much as the Space Shuttle could be comprehended as a faster airplane. But streaming? If my grandfather came back today, where would I start?

“Okay, so I’m using my telephone, which isn’t really a telephone so much as a supercomputer-cum-Library-of-Alexandria-cum-high-definition-movie-studio, to send a wireless signal to the magical speakers in my home, which, upon my request, will contact a set of servers 3,000 miles away in San Francisco, and request instant access to the closest digital copy of—”

“Wait, what’s a server?”

“—hold on—to the closest digital copy of one of millions of high-quality songs to which I have full and unlimited access, but neither own nor have to store, and—”

It boggles the mind.

It may be tempting to regard this example as a mere bauble or trinket, or even as a sign of decadence. But to do so would represent a disastrous miscalculation of its significance. It is true that some of our advances have slowed since the 1970s. We do not go to the moon on a regular basis, despite the promises of the Apollo program; transatlantic travel has become slower, rather than faster—R.I.P. Concorde; our cars essentially still use the same engines as they always have; and life expectancy is no longer leaping forward. But it is also true that, unlike then, we now enjoy a magnificent worldwide communications network that offers the sum of human knowledge in the blink of an eye and is open to anybody who wishes to join it. If that is “all” we’ve done in the last four decades, I think we should congratulate ourselves rather heartily.

Forget my grandfather for a moment and imagine explaining that to almost any literate person in human history. What do we imagine his reaction would have been? Do we think he would have said, “That sounds like stagnation to me”? Or do we think he would have said, “It sounds as if you have reached the promised land. I hope you are extremely grateful for the bounties you have inherited”? If not the latter, he’d be a fool.

From the desk on which I am writing these words, I have access to all of the great works in history: every song, every play, every book, every poem, every movie, every pamphlet, every piece of art. I can find every translation of the Bible that has ever been compiled and put them side by side for comparison. I can read the missives that were sent during the American Revolution, and examine the patents for the first steam engine, and listen to all of Winston Churchill’s speeches between 1939 and 1945. The world’s recipes are available to me without exception, and, if I desire, I can watch a cornucopia of free-to-use instructional videos in which experts show me how to cook them. At no cost or inconvenience, I can learn how to fix my sink or change my car’s tires or troubleshoot my dishwasher. If I want to know where the “panda ant” lives (Chile), to which genus it belongs (Euspinolia), how long it is (up to 8 millimeters), and whether it’s actually an ant (it’s not, it’s a wasp), I can find this information in seconds. What was on the front page of the Key West Citizen on June 2, 1943? Easy: “City Council Takes Up Incinerator Project with Representative of FWA.” Nearly 2,000 years ago, Pliny the Elder wondered if it might be a good idea to collect all of human knowledge in one place, available to all. That dream has become a reality—and we got to live when it happened. I’d say that’s pretty darn good.

The airplane annihilated distance; the smartphone has annihilated geography altogether. Provided that I have a stable connection to the Internet, it takes me the same amount of time to send a digital photograph to Delhi as it does for me to send it to a person in the house next door. On Saturday mornings I can sit and watch the same soccer games, broadcast live from England, that my dad is watching in England and text him about the developments in real time, as if I were sitting next to him. If I need to keep an eye on the news, it makes no difference whether I am sitting in the headquarters of Reuters or on a beach in Australia. Wherever I am, the information flow is the same. Except by design, there is no longer any such thing as “out of the loop.” As an achievement, this is monumental.

The “Spaceship Earth” attraction at Disney’s Experimental Prototype Community of Tomorrow tells the story of human communication from the days of the Neanderthal to the invention of the computer. I have wondered at times what Disney will substantively add to this story when it comes time to update the show, and I have come to conclude that the answer is almost certainly nothing. One cannot improve on instant worldwide communication that is accessible to every person and in every place. One can tinker around the edges to upgrade its speed, its reliability, its quality, and its durability, one can add some security into the mix for good measure, but, give or take, this is a problem that has now been solved. As the Phoenicians solved the alphabet problem, so have our contemporary engineers solved the transmission problem. The dream has arrived.

Not everyone appreciates this, of course, which is why it is customary for the complaint I am addressing to be amended slightly, from “technology has stagnated” to “technology is frivolously used and may even be bad for us.” But, while the latter proposition is arguably true, it concedes my premise that something dramatic has changed in the way in which we live. It is indeed entirely possible that the volume and speed of information that the I.T. revolution has ushered in have had a destructive effect on individuals or on society. It is possible, too, that, while the benefits are immense, most people choose not to take advantage of them. I would not be the first to lament that the first thing users seem to do with their access to the Internet is to begin arguing with strangers. And yet to contend that the abuse of the personal computer in some way undermines the value of the personal computer would be equivalent to contending that the use of the airplane for bombing renders the significance of its invention questionable.

I suspect that some of our disappointment is the fault of comic books. Riffle through any Bumper Sci-Fi Book for Boys!–style volume that was published between the 1920s and the 1960s and you will see that the physical breakthroughs that were anticipated—spacesuits, rocket ships, jetpacks, flying cars, laser guns, etc.—are featured prominently and enthusiastically, while the less tangible mass communications that were anticipated are set quietly in the background, as if they are inevitable. In story after story, the astronauts communicate from the planet Zog in an instant using video chat, and yet that, evidently, is not the exciting part. The exciting part is that they are on Zog.

I must confess that I do not understand why, for it is not at all obvious to me that exploring Zog is more useful than inventing Wikipedia, or that the ability to get to Zog would represent a greater leap forward than the ability to talk to our friends from it. Certainly, Zog may have some interesting rocks, and the technical feat of sending men there and returning them safely to Earth would be worth celebrating. (I do tend to tear up watching the original Moon landing.) But in comparison to a breakthrough that allows me to enjoy the words, faces, music, food, counsel, art, and research of every other human being on Earth, whether living or dead, it would pale. I have that. In my pocket.

Stagnation? Nope. Renaissance, more like.

This originally appeared in National Review. 

Blog Post | Health & Medical Care

COVID-19 Should Make Us Grateful for Technology

Imagine a pre-modern pandemic.

“In a way, everything is technology,” noted one of the world’s greatest economic historians, Fernand Braudel, in his monumental study Civilization and Capitalism. “Not only man’s most strenuous endeavors but also his patient and monotonous efforts to make a mark on the external world; not only the rapid changes . . . but also the slow improvements in processes and tools, and those innumerable actions which may have no immediate innovating significance but which are the fruit of accumulated knowledge,” he continued.

Yes, land, labor, and capital (that’s to say, the factors of production) are important components of economic growth. In the end, however, human progress in general and global enrichment in particular are largely dependent on invention and innovation. That is surely even clearer now that humanity’s hopes for the end of the pandemic and for our liberation from the accompanying lockdown rest on further scientific breakthroughs within the pharmaceutical industry. Let’s take a brief look at the impact of technology on health care, food supply, work, and sociality in the time of COVID-19.

Healthcare

The impact of modern technology is surely most keenly felt and anticipated within the sphere of human health care. Consider some of the worst diseases that humanity has had to face in the past. Smallpox, which is thought to have killed an estimated 300 million people in the 20th century alone, originated in either India or Egypt at least 3,000 years ago. Smallpox variolation, it seems, was practiced in China in the tenth century, but it was not until the late 18th century that Edward Jenner vaccinated his first patient against the disease. Smallpox was fully eradicated only in 1980.

Similar stories could be told about other killer diseases. Polio, which can be seen depicted in Egyptian carvings from the 18th dynasty, is of ancient origin. Yet the disease wasn’t properly analyzed until the year of the French Revolution, with Jonas Salk’s vaccine appearing only in 1955. Today, polio is close to being eradicated (just 95 cases were reported in 2019).

Malaria, probably humanity’s greatest foe, is at least 30 million years old (the parasite has been found in an amber-encased mosquito from the Paleogene period). It was only after the discovery of the New World that knowledge about the fever-reducing benefits of the bark of the cinchona tree spread to Europe and Asia. Quinine was first isolated in 1820, and chloroquine was introduced in 1946. Artemisinin drugs, which we still use, were discovered in the late 1970s. That’s to say that humanity lived with deadly diseases for millennia without fully knowing what they were, how they were transmitted, and how they could be cured. The fate of humanity, our ancestors thought, fluctuated under the extraneous influence of the “wheel of fortune,” and there was nothing that anyone could do about it. One day you were alive and the next day you were not.

Contrast that glacial pace of progress, and the fatalistic acceptance of disease and death, with our response time to the current pandemic. The Wuhan Municipal Health Commission reported the existence of a cluster of cases of “pneumonia” in Wuhan on December 31. On January 7 the Chinese identified the pathogen (novel coronavirus) responsible for the outbreak. On January 11 China sequenced the genetic code of the virus, and the next day it was publicly available. That enabled the rest of the world to start making diagnostic kits to identify the disease.

To take one example, the first COVID-19 infection in South Korea was identified on January 20. On February 4, the first test kit (made by Kogene Biotech) entered production. On February 7, the test kit was available at 50 locations around the country. Other countries followed suit.

The World Health Organization, which declared COVID-19 a global pandemic on March 11, may have acted too late. Still, it is noteworthy that just two months elapsed between the first sign of trouble and the time when the entire world put measures in place to retard the spread of the disease. In the meantime, we have learned a lot about governmental incompetence and regulatory overreach. But we have also learned a great deal about the spread and symptoms of the disease. Instead of starting from scratch, medical specialists in Europe and America can draw on the expertise of their colleagues in the Far East. Before the telegraph appeared midway through the 19th century, it took up to a month for a ship to carry information from London to New York. Today, we learn about the latest COVID-19 news (good and bad) and research in seconds.

By mid-April, thousands of highly educated and well-funded specialists throughout the world were using supercomputers and artificial intelligence to identify promising paths toward victory over the disease. Some 200 different programs are underway to develop therapies and vaccines to combat the pandemic. They include studies of the effectiveness of existing antiviral drugs, such as Gilead’s remdesivir, Ono’s protease inhibitor, and Fujifilm’s favipiravir. The effectiveness of generic drugs, such as hydroxychloroquine and chloroquine, is also being evaluated. Takeda is hard at work on convalescent plasma (TAK-888) in Japan, while Regeneron works on monoclonal antibodies in the United States. New vaccines, such as Moderna’s mRNA-1273, Inovio’s INO-4800, and BioNTech’s BNT162, are under development.

We don’t know which of these treatments (if any) will work, but here is what we can be sure of: There has never been a better time for humans to face and defeat a global pandemic. The world is richer than ever before, and money is what enables us to sustain a massive pharmaceutical industry and pay for highly sophisticated medical research and development.

Coronavirus may be deadly, but it is not the bubonic plague, which had a mortality rate of 50 percent. Luckily, it is a far milder virus, yet one that has reawakened us to the danger posed by communicable diseases. Once the immediate crisis is behind us, researchers will collect billions of data points from dozens of countries and analyze the different governmental responses to the pandemic. That knowledge will be deployed by governments and the private sector to ensure that best practices are adopted, so that next time we are better prepared.

Food

When the Black Death struck Europe in 1347, the disease found the local population ripe for slaughter. Following the close of the Medieval Warm Period at the end of the 13th century, the climate turned cold and rainy. Harvests shrank and famines proliferated. France, for example, saw localized famines in 1304, 1305, 1310, 1315–17, 1330–34, 1349–51, 1358–60, 1371, 1374–75, and 1390. The Europeans, weakened by shortages of food, succumbed to the disease in great numbers.

The people of yore faced at least three interrelated problems. First, the means of transport and the transportation infrastructure were awful. On land, the Europeans used the same haulage methods (carts pulled by donkeys, horses, and oxen) that the ancients had invented. Similarly, much of Europe continued to use roads built by the Romans. Most people never left their native villages or visited the nearest towns. They had no reason to do so, for all that was necessary to sustain their meager day-to-day existence was produced locally.

The second problem was the lack of important information. It could take weeks to raise the alarm about impending food shortages, let alone organize relief for stricken communities.

Third, regional trade was seldom free (France did not have a single internal market until the Revolution) and global trade remained relatively insignificant in economic terms until the second half of the 19th century. Food was both scarce and expensive. In 15th-century England, 80 percent of ordinary people’s private expenditure went for food. Of that amount, 20 percent was spent on bread alone. Under those circumstances, a local crop failure could spell the destruction of an entire community. (Those who think that COVID-19 exposed the fragility of modern society should look up the Great Famine.)

By comparison, by 2013 only 10 percent of private expenditure in the United States was spent on food, a figure that is itself inflated by the amount Americans typically spend in restaurants. Speaking of restaurants, while most have been forced to close their doors, the restaurateurs use apps to deliver excellent food at reasonable prices. Moreover, months into the COVID-19 pandemic, the shops are, generally, well stocked and regularly replenished by the largely uninterrupted stream of cargo flights, truck hauling, and commercial shipping. Due to the miracle of mobile refrigeration, fresh produce continues to be sourced from different parts of the United States and abroad. Shortly before writing this piece, I was able to buy oranges from California, avocados from Mexico, and grapes from Chile in my local supermarket. Globalization may be under pressure from both the left and the right of the U.S. political spectrum, but should the pandemic impair U.S. agricultural production, many will be forced to acknowledge the benefits of the global food supply and our ability to import food from COVID-19-unaffected parts of the world.

This extensive and, at this point, still sturdy supply chain is, of course, a technological marvel. Computers collate information about items on the shelf that are in short supply, adjust the variety and quantity of items shipped between stores, fill new orders, etc. And so, commerce that’s still allowed to go on goes on. So does charity. Feeding America, a network of more than 200 food banks, feeds tens of millions of people through food pantries, soup kitchens, shelters, etc. Since 2005, the organization has been using a computerized internal market to allocate food more rationally. Feeding America uses its own currency, called “shares,” with which individual food banks can bid on the foods that they need the most. Grocery-delivery services bring food to the doorsteps of those who cannot or do not want to leave their homes. The old and the infirm can also use phones, emails, and apps to call upon volunteers to do their shopping and delivery.
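The article does not describe how the “shares” market works internally, but the core idea of a bid-based allocation is easy to sketch. The snippet below is a hypothetical, highly simplified illustration (names, numbers, and rules invented for this example; it is not Feeding America’s actual system): each food bank bids shares on a truckload, the highest bidder that can cover its bid wins, and its share balance is debited.

```python
# Hypothetical sketch of a bid-based allocation, loosely inspired by the
# "shares" market described above. Not Feeding America's actual system.

def allocate_truckload(item, bids, balances):
    """Award one truckload of `item` to the highest affordable bid.

    bids:     mapping of food-bank name -> shares offered for this item
    balances: mapping of food-bank name -> shares available to spend
    Returns the winning food bank, or None if nobody can cover their bid.
    """
    # Consider only bids the bidder can actually pay for.
    affordable = {bank: bid for bank, bid in bids.items() if bid <= balances[bank]}
    if not affordable:
        return None

    winner = max(affordable, key=affordable.get)   # highest bid wins
    balances[winner] -= affordable[winner]         # debit the winner's shares
    return winner


if __name__ == "__main__":
    balances = {"north_bank": 120, "river_bank": 80, "hill_bank": 200}
    bids = {"north_bank": 90, "river_bank": 85, "hill_bank": 60}
    winner = allocate_truckload("fresh produce", bids, balances)
    print(winner, balances)  # -> north_bank {'north_bank': 30, 'river_bank': 80, 'hill_bank': 200}
```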

Work

The nature of work has changed a lot over the last 200 years or so. Before the industrial revolution, between 85 percent and 90 percent of the people in the Western world were farm laborers. Their work was excruciatingly difficult, as witnessed by one 18th-century Austrian physician who observed that “in many villages [of the Austrian Empire] the dung has to be carried on human backs up high mountains and the soil has to be scraped in a crouching position; this is the reason why most of the young people are deformed and misshapen.” People lived on the edge of starvation, with both the very young and the very old expected to contribute as much as they could to the economic output of the family (most production in the pre-modern era was based on the family unit, hence the Greek term oikonomia, or household management). In those circumstances, sickness was a catastrophe: It reduced the family unit’s production, and therefore its consumption.

The industrial revolution allowed people to move from farms to factories, where work was better paid, more enjoyable, and less strenuous (which is largely why people in poor countries continue to stream from agricultural employment to manufacturing jobs today). Moreover, wealth exploded (real annual income per person in the United States rose from $1,980 in 1800 to $53,018 in 2016). That allowed for ever-increasing specialization, which included a massive expansion of services catering to the desires of an ever-more-prosperous population.

The service sector today consists of jobs in the information sector, investment services, technical and scientific services, health care, and social-assistance services, as well as in arts, entertainment, and recreation. Most of these jobs are less physically arduous, more intellectually stimulating, and better paid than either agricultural or manufacturing jobs ever were. Crucially, many of these service-sector jobs can be performed remotely. That means that even in the midst of the government-imposed economic shutdown, some work (about a third, estimates suggest) can go on. The economic losses from COVID-19, in other words, will be astronomical, but not total.

My own organization, for example, shut its doors in mid-March. Since then, everyone has been scribbling away at home or appearing on news shows around the world via the Internet. All of us are in regular contact via the phone, Zoom, and Microsoft Teams. Other organizations are doing the same. As we already discussed, a great deal of shopping is taking place online. Shipping and delivery companies are expanding, with Amazon hiring 100,000 additional workers in the United States. Home entertainment, of course, has grown tremendously, with Netflix adding millions of new customers and expanding its offerings with thousands of new films and television shows. With over 30 million American children stuck at home, online learning companies are booming, and educators from high-school teachers to college professors continue to perform their jobs remotely. Telehealth is expanding, allowing patients to see their doctors in a safe and convenient way. Even minor medical procedures, such as eye exams, can be conducted remotely, and multiple companies will deliver your new specs to your front door. Banking and finance are still going on, with many people taking advantage of low interest rates to refinance their mortgages. Finally, the often unfairly maligned pharmaceutical industry is expanding as we all wait and hope for the release of a COVID-19 vaccine or effective therapeutic treatment.

Sociality

Aristotle observed that “man is by nature a social animal” and noted that without friends we would be unhappy. But the role of sociality (that is to say, the tendency to associate in or form social groups) goes much deeper than that. As William von Hippel explained in his 2018 book The Social Leap, sociality is the mechanism by which Homo sapiens came about. When early hominids were forced down from the trees (perhaps as a result of a climatic change that dried up African forests), they became more vulnerable to predators. To cover longer distances between the fast-disappearing trees while maintaining a modicum of protection against other animals, our ancestors developed bipedalism, which allowed them to free their upper body to carry weapons such as sticks and stones.

Even more important was the invention of cooperation. While a stick-wielding ape is slightly better-off than an unarmed one, a group of armed apes is much better at dispatching predators. Individuals in more cooperative bands survived to adulthood and bred more often, resulting in more-cooperative species. Furthermore, since living alone was tantamount to a death sentence, selfish apes who didn’t care about being ostracized for not pulling their weight died off, resulting in a desire for communal cooperation and a deep-rooted fear of rejection by the group.

The early hominids had brains more like those of chimps than those of modern humans. That’s because the evolutionary pressures that shaped those early brains — such as predation and food scarcity — could be overcome without tremendous intelligence. These pressures to survive were part of the physical landscape — a challenging but static environment that didn’t require a lot of cognitive ability to navigate. The environmental pressure that resulted in modern humans was the social system itself. The social landscape is much more dynamic than the physical one. Once they had banded together in groups, our ancestors were forced to forge relationships with, and avoid being exploited by, individuals with divergent and constantly shifting interests. Those who couldn’t keep up with the increasingly complex social game either died or were unable to mate.

This new pressure created a positive evolutionary cycle: Banding together created more complex social systems, which required bigger brains; bigger brains needed to be fed; and the best way to get more food was more cooperation and a more sophisticated social system. The main cognitive development that evolved from this evolutionary cycle is known as the “theory of mind.” In short, the theory of mind is the ability to understand that other minds can have different reasoning, knowledge, and desires from your own. While that seems basic, the theory of mind distinguishes us from all other life on Earth. It allows us to determine whether an affront, for example, was intentional, accidental, or forced. It allows us to feel emotions such as empathy, pride, and guilt — abilities that are keys to a functioning society.

So sociality and human beings are inseparable, as we have all been clearly reminded by the sudden restrictions on our ability to interact with others. As we sit at home, working away on our computers or watching television, most of us feel a tremendous sense of isolation (“social distancing”) from our family, friends, and colleagues. The urge to be around others is innate to us. It is who we are.

Dissatisfied with impersonal modes of communication, such as email and texting, we have rediscovered the need for a face-to-face interaction with our fellow humans. To that end, we utilize digital platforms such as Zoom, Google Hangouts, Facebook Live, and FaceTime to catch up on the latest news in other people’s lives, or simply to complain about the misery of loneliness and the pathetic inadequacy of our public officials (of both parties). Throughout the nation, people engage in virtual happy hours, dinners, book clubs, fitness classes, religious services, and group meditation. As my Cato Institute colleague Chelsea Follett recently wrote, “Technology has made it easier than ever to hold a physically-distanced ‘watch party’ synchronized so that viewers in different locations see the same part of a movie at the same time. For those who like to discuss movies as they watch, technology also enables a running group commentary of each scene in real time.” In the saddest of cases, technology enables people to say goodbye to dying friends and relatives. In a very real sense, therefore, technology keeps us sane (or, at the very least, saner).

Technology, then, allows us to cope with the challenges of the pandemic in ways that our ancestors could not even dream about. More important, technology allows our species to face the virus with grounds for rational optimism. In these dark days, remember all the scientists who are utilizing the accumulated store of human knowledge to defeat COVID-19 in record time and all the marvelous (not to say miraculous) ways the modern world keeps us well fed, psychologically semi-balanced, and (in many cases) productively engaged.

This originally appeared in National Review.