
Blog Post | Health & Medical Care

The World's Getting Better All the Time

Here is a taste of the progress humanity made in 2017

The end of 2017 is barely a week away. So now is the perfect time to reflect upon the positive difference humanity has made to the world over the past 12 months. How have we advanced as a species? We often underestimate the progress we make because it is incremental: an algorithm here, a genetic tweak there… But all these things combine to improve our future. As Kevin Kelly from Wired wrote, “Ever since the Enlightenment and the invention of Science, we’ve managed to create a tiny bit more than we’ve destroyed each year… That few percent positive difference is compounded over decades into what we might call civilisation … [Progress] is a self-cloaking action seen only in retrospect.” My website, Human Progress, tracks technological, medical and scientific improvements that make the lives of ordinary people better. Here are the ones that caught our attention in 2017, doubtless only a tiny fraction of all the advances humanity has made in the past 12 months.

  1. February 6: The age of the bionic body – from robot hands controlled by your mind to electronic eyeballs, experts reveal 6 medical marvels introduced by new technology.
  2. February 6: Biologists help deaf mice hear again by inserting healthy genes into their ears – the work shows an ‘unprecedented recovery of inner ear function’ and could be used in humans.
  3. February 10: Computers turn medical sleuths and identify skin cancer – algorithm works as reliably as board-certified dermatologists, study shows.
  4. February 15: For the blind, an actual-reality headset – not just Star Trek fiction, a new visor from eSight is a lightweight, high-contrast vision system for legally blind people.
  5. February 17: Wyoming man receives ‘miracle’ face transplant 10 years after suicide attempt.
  6. February 18: Smartphones to become pocket doctors after scientists discover camera flash and microphone can be used to diagnose illness.
  7. February 20: Hope for millions as scientists discover multiple sclerosis treatment that can slow its progression.
  8. February 22: Life expectancy to break 90 barrier by 2030.
  9. March 2: Teenager’s sickle cell reversed with world-first therapy.
  10. March 3: Terminal cancer patients go into complete remission after groundbreaking gene therapy.
  11. March 3: UTA professor invents breath monitor to detect flu.
  12. March 6: Google’s artificial intelligence can diagnose cancer faster than human doctors – the system is able to scan samples to determine whether or not tissues are cancerous.
  13. March 8: The robot will see you now! Chat-bots that monitor symptoms will become more accurate and quicker at spotting illness than doctors.
  14. March 8: Scientists discover new state of matter called ‘Time Crystals’ – time crystals seemingly break the rules of normal time-keeping and potentially pave the way for quantum computers and quantum sensors.
  15. March 10: Scientists make progress toward engineering synthetic yeast – the work brings to six the number of yeast chromosomes that now can be synthesized, according to new research.
  16. March 13: More people could benefit from BRCA breast cancer drugs.
  17. March 14: Swedish men on target to be first to completely stub out smoking.
  18. March 15: Startup serves up chicken produced from cells in lab – ‘clean meat’ developers say it avoids towering costs of feeding, caring for livestock.
  19. March 15: The next innovation in shipping – Wind Power; Maersk launches trial run of tanker using rotating cylinders that can function as high-tech sails.
  20. March 18: An insect’s eye inspires a new camera for smartphones – a series of eyelets can make cameras much smaller.
  21. March 23: New rotavirus vaccine could prevent thousands of childhood deaths.
  22. March 24: Machines which detect cancer symptoms could be out in a year.
  23. March 27: Can tech speed up emergency room care? A New York hospital system tests a new way to use telemedicine, where E.R. doctors examine patients without being in the same room.
  24. April 2: Plastic-eating fungus may solve garbage problem.
  25. April 6: To handle electronic waste, freeze it and pulverize it – scientists say a new technique can make it more profitable to harvest metals and other materials from circuit boards in old TVs, computers and more.
  26. April 26: Artificial ‘brain in a dish’ is created in a world first: Breakthrough could shed light on conditions such as Alzheimer’s.
  27. May 2: CRISPR eliminates HIV in live animals.
  28. May 4: Scientists engineer baker’s yeast to produce penicillin molecules.
  29. May 5: Iceland drills 4.7 km down into volcano to tap clean energy.
  30. May 11: Scientists have created an exoskeleton to stop elderly people from falling.
  31. May 11: HIV life expectancy ‘near normal’ thanks to new drugs.
  32. May 17: Lab-grown blood stem cells produced at last – two research teams cook up recipe to make long-sought cells in mice and people.
  33. May 18: Antibodies from a human survivor define sites of vulnerability for broad protection against ebola viruses.
  34. May 18: Scientists look to skies to improve tsunami detection.
  35. May 22: World’s largest aircraft completes successful test flight.
  36. May 23: MIT researchers develop a moisture-responsive workout suit with live cells.
  37. June 1: ‘Instantly rechargeable’ battery could change the future of electric and hybrid automobiles.
  38. June 1: Extinct species of Galapagos giant tortoise may be resurrected.
  39. June 3: SpaceX successfully launches reused Dragon spacecraft for ISS resupply.
  40. June 3: ‘Liquid Biopsy’ passes early test in quest to find cancer in blood.
  41. June 13: Live antibiotics use bacteria to kill bacteria.
  42. July 3: This Silicon Valley company wants to ‘make better humans’ through biohacking.
  43. July 6: DNA from sharks that can live up to 400 years could hold secret to a longer life.
  44. July 6: Robot wars – knee surgery marks new battleground for companies.
  45. July 9: Banks deploy AI to cut off terrorists’ funding.
  46. July 11: Scientists design solar cell that captures nearly all energy of solar spectrum.
  47. July 17: Bananas – scientists create vitamin A-rich fruit that could save hundreds of thousands of children’s lives.
  48. July 22: This new material could let phones and electric cars charge in seconds.
  49. July 26: New artificial spider silk: stronger than steel and 98 percent water.
  50. July 28: Melanoma – new, more effective drug steps closer.
  51. July 28: Could stem cells reverse the aging process?
  52. July 28: Juvenescence AI to develop first compounds generated by Insilico’s deep-learned drug discovery engines.
  53. August 4: IBM storage breakthrough paves way for 330TB tape cartridges.
  54. August 28: Anti-inflammatory drug ‘cuts heart attack risk’.
  55. August 30: Trial raises Parkinson’s therapy hope.
  56. August 30: FDA clears first gene-altering therapy — ‘a living drug’ — for childhood leukemia.
  57. September 7: Shropshire farm completes harvest with nothing but robots; a world-first in automation.
  58. September 12: U.S. middle-class incomes reached highest-ever level in 2016, Census Bureau says.
  59. September 23: Newly engineered antibody could kill off 99 per cent of HIV strains.
  60. September 28: DNA surgery on embryos removes disease.
  61. October 9: Google is going to use experimental high-altitude balloons to provide internet in Puerto Rico.
  62. October 18: Dyslexia link to eye spots confusing brain, say scientists.
  63. October 29: Saudi Arabia to allow women into sports stadiums.
  64. November 13: Brain implant boosts human memory by mimicking how we learn.
  65. November 14: ‘Better than Concorde’ supersonic 1,687mph airliner to ‘revolutionize’ air travel by 2025.
  66. November 15: The firm that can 3D print human body parts.
  67. November 15: US scientists try 1st gene editing in the body.
  68. November 15: US biotech unicorn steps up competition for BioNTech’s mRNA personalized cancer vaccine.
  69. November 24: Cancer breakthrough: Potential cure could be ready as early as next year.
  70. November 26: Diabetes drug ‘could be used to end agony of transplant rejection’.
  71. November 29: Expanding DNA’s alphabet lets cells produce novel proteins.
  72. November 30: Designer proteins – the new generation of HIV vaccines being put to the test.
  73. November 30: Trees are covering more of the land in rich countries – the spread of forests is not always popular. But it is sure to continue.
  74. December 1: HIV breakthrough as cancer drug could hold secret to curing the virus.
  75. December 1: In Rwanda, drones deliver medical supplies to remote areas – such services help people in isolated regions—and could yield lessons for making shipments elsewhere.
  76. December 4: China rules aquaculture as fish output triples in decade.
  77. December 7: Bumper crops boost global cereal supplies in 2017/18.
  78. December 10: Democracy is far from dead – in 12 years, the share of the world’s people who live in ‘free’ countries has risen.
  79. December 11: Huntington’s breakthrough may stop disease.
  80. December 11: Tasmanian tigers aren’t extinct (or at least they won’t be for long!) – scientists unlock the mysterious creature’s DNA – and plan to clone it and bring the beast back to Australia.
  81. December 13: Streetlights could be replaced by glowing trees, after scientists make plants shine in the dark.
  82. December 14: Haemophilia A trial results ‘mind-blowing’.

This first appeared in CapX. 

Blog Post | Economic Growth

Economic Growth Is More Important than You Think

Growth is a saving grace for the world's poorest people, and also has a major impact on the daily lives of Americans and the rest of the developed world.

This article first appeared in CapX.

What is economic growth, and why should it matter to ordinary people? Those questions are hard to answer in a hysterical world where once-dry academic matters are now politicized without fail. Recently, commentators from all sides have taken to dismissing growth as a golden idol of narrow-minded capitalists. Likewise, many people see the pursuit of growth as an alternative, not a complement, to the pursuit of social needs like public health and sustainability.

These narratives are understandable, considering the misinformed and tone-deaf ways in which many public figures have attempted to advocate the importance of growth and economic activity, particularly during the current pandemic. But the narratives themselves could not be more misleading. Economic growth affects the lives of ordinary people in many crucial ways, not just in the West, but importantly in countless developing nations too. In fact, growth is generally the greatest source of improvement in global living standards.

If we visualize the economy as a pie, then growth can be visualized as the pie getting bigger. Most economists measure growth using a metric called Gross Domestic Product (GDP), which defines the pie’s “ingredients” as consumption, investment, government spending and net exports. In developing countries, growth is largely driven by investment, while wealthier countries tend to rely on innovation to continue growing.

These working definitions, while highly simplified, are better than nothing. They are important because they can make it easier to understand how GDP correlates with countless key metrics of living standards.
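The expenditure approach described above can be sketched in a few lines of code. The figures below are made-up placeholder values chosen purely for illustration, not real data for any country:

```python
# Illustrative expenditure-approach GDP calculation.
# All numbers are hypothetical placeholders, not real statistics.

def gdp(consumption, investment, government, exports, imports):
    """GDP = C + I + G + (X - M): the sum of the pie's 'ingredients'."""
    return consumption + investment + government + (exports - imports)

this_year = gdp(consumption=500, investment=200, government=150,
                exports=120, imports=100)
last_year = gdp(consumption=480, investment=190, government=145,
                exports=110, imports=95)

# Growth is simply the pie getting bigger from one period to the next.
growth_rate = (this_year - last_year) / last_year
print(f"GDP: {this_year}, growth: {growth_rate:.1%}")
```

Note that imports are subtracted: GDP measures domestic production, so spending on foreign-made goods has to be netted out of the total.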

In sub-Saharan Africa, for instance, real average GDP per capita grew by 42% between 1990 and 2018. That growth corresponded to major decreases in extreme poverty, infant mortality, and undernourishment.
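A 42% cumulative rise over 28 years sounds dramatic, but it comes from a modest annual rate compounding year after year, which is exactly why incremental progress is so easy to underestimate. A quick back-of-the-envelope check:

```python
# Convert a cumulative growth figure into an implied annual rate.
# 42% growth over 1990-2018 (28 years), from the sub-Saharan Africa example.

total_growth = 1.42          # 42% cumulative increase
years = 2018 - 1990          # 28 years

# Compound annual growth rate: the constant yearly rate that,
# applied 28 times, multiplies the starting value by 1.42.
annual_rate = total_growth ** (1 / years) - 1
print(f"implied annual growth: {annual_rate:.2%}")
```

The answer is only about 1.3% per year: a "few percent positive difference" that, compounded over decades, adds up to a much larger transformation.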

Growth also increases access to resources that make people safer and healthier. A 2019 paper shows that, while disaster-related fatality rates fell for all global income groups between 1980 and 2016, developing countries in the early stages of growth experienced the greatest improvements. That is because those countries made the greatest relative advances in infrastructure and safety measures—advances facilitated by growth.

Growth is a saving grace for the world’s poorest people, but it also has a major impact on the daily lives of Americans and the rest of the developed world, and that impact is especially important in the age of coronavirus. For example, continuous growth has led to lifesaving breakthroughs in medical technology and research, which has allowed humanity to fight COVID-19 more quickly and effectively than we ever could have in the past. Vaccines for certain ailments took decades to develop as late as the mid-20th century, but it is quite possible that a vaccine for COVID-19 will be widely available just one year after the virus’s initial outbreak.

To many supposedly environmentally conscious critics, it seems intuitive that growth is not sustainable. However, sustainability-based criticisms of growth tend to ignore the reality that growth leads to green innovations that help the planet. Labor-augmenting technologies allow us to produce more while conserving resources and protecting the environment. Moreover, wealthier countries are better equipped to develop and adopt green technologies.

MIT scientist Andrew McAfee has documented many of the concrete environmental benefits of growth in his recent book, More From Less. McAfee notes that increases in America’s population and productive activity in recent decades have coincided with significant decreases in air and water pollution, along with gross reductions in the uses of water, fertilizer, minerals and other resources—all because economic growth and market coordination led to improvements in manufacturing and technology. For facilitating this process, which McAfee calls “dematerialization,” growth should be seen as a key to sustainability, not a barrier.

In a broader sense, growth has made our lives more convenient, dynamic and entertaining via developments in consumer technologies and other innovations. Imagine quarantining for five months (and counting) without the internet, PCs or smartphones. Many people would have no way of doing their jobs. Even for those that could, life would be much more difficult, not to mention dull.

Indeed, if one thing could be said to summarize the impact of growth around the world, it would be that growth makes everyone’s life easier. For instance, the amount of labor needed for average workers to purchase countless basic goods and services is at an all-time low and decreasing, largely because supply chains have grown and become more efficient. The result is that ordinary people, especially those in lower income groups with relatively greater reliance on basic goods, are better off.

The story of economic growth is in many ways the story of how cooperation and exchange can defeat poverty and scarcity. The better we understand that, the more likely we will be to support policies which allow resources to flow into areas that need them the most. Broadly speaking, no political idea has been more effective in this regard than free trade.

Knowing the importance of innovation to human well-being should also encourage us to embrace new technology instead of fearing it. We must therefore be wary of overbearing regulations and fiscal policies that prevent ideas from flourishing.

Most importantly, we should not listen to those who claim that economic growth is a pointless, abstract goal that only benefits the rich and leaves ordinary people behind. Growth is a vital driver of progress in modern society and should be taken seriously for the sake of humanity and the planet.

Blog Post | Science & Technology

Vasquez Reviews Ridley's “How Innovation Works”

Innovation requires trial and error. It requires the possibility to experiment and to fail. Only then can innovation provide the path to success and human progress.

The dog is an innovation. It took place at least 20,000 years ago, when a group of humans domesticated wolves and subsequently began to develop different breeds. The light bulb, wheeled baggage, and the computer are also innovations.

Since prehistoric times, innovation has changed the course of our lives and is, according to science writer Matt Ridley, “The most important fact about the modern world, but one of the least well understood.” Ridley is the author of the new eye-opening book, How Innovation Works: And Why It Flourishes in Freedom. In it, he tells the story of dozens of innovations, to illustrate how that phenomenon is the main cause of the enormous progress humanity has seen in the past few centuries. He derives the following lessons from his study.

Innovation almost always happens gradually and not suddenly. It is “not an individual phenomenon, but a collective, incremental and messy network phenomenon.” Ridley asks, “Who invented the computer?” To answer the question, we would have to go back more than 200 years to the Jacquard loom and then review countless succeeding contributions and innovators. The same can be said about the car or virtually all other innovations, even though we sometimes identify them with individuals like Henry Ford who discovered a way of making the automobile widely accessible to the public.

Innovation is thus collaborative, and the same ideas often independently occur to different individuals at the same time. The telegraph, the thermometer, photography, and the hypodermic needle are examples of simultaneous inventions. Twenty-one people invented the light bulb at about the same time. If Thomas Edison or the Wright brothers had never existed, we would still enjoy artificial light and the wonder of airplane travel.

Innovation is not the same thing as invention. One can invent something novel, but the people who make a difference are the innovators who figure out the way that that new idea can be useful to society, typically by improving on it and lowering its costs. Innovations exist due to our growing knowledge and a demand for the innovative product. As Ridley observes, “The light bulb emerged inexorably from the combined technologies of the day. It was bound to appear when it did, given the progress of other technologies.”

It’s not possible to plan innovation. Not even the innovators can do so. Innovations more often than not come about because of chance events or unexpected discoveries. Sheer luck explains Alexander Fleming’s discovery of penicillin. The founders of Google did not “set out in search of search engines. The founders of Instagram were trying to make a gaming app. The founders of Twitter were trying to invent a way for people to find podcasts.” Innovation is unpredictable.

That unpredictability also helps explain why the so-called “entrepreneurial state” has not effectively promoted innovation and why we can’t expect it to do so. For example, Ridley explains that contrary to those who advocate in favor of publicly funded innovation, the U.S. government did not intend to create a global internet. Only when the internet “escaped the clutches” of the government – that is, when it was essentially privatized in the 1990s – did the private sector and universities begin to transform the internet into what we use today.

Ridley observes that the widely held view that science leads to technology and innovation—frequently canvassed to justify public subsidies for science—is only partially correct. It is equally true that scientific knowledge is the product of technological improvements and attempts to understand the latter. The first inoculations were conducted without a good understanding as to how and why they worked. Attempts to resolve problems in the yogurt industry contributed to the development of the revolutionary gene-editing method known as CRISPR (which may yet help us find a treatment for COVID-19).

Innovation requires trial and error. It requires the possibility to experiment and to fail. Only then can innovation provide the path to success and human progress. Or, as Ridley puts it, “Innovation is the child of freedom and the parent of prosperity.”

Blog Post | Science & Technology

Our Technological Renaissance

Claims of stagnation are not persuasive.

I put on a record today.

Well, I didn’t put on a record, so much as I put on a . . . well, a what? It wasn’t a vinyl plate or a spool of tape or even a piece of shiny circular plastic. Indeed, whatever physical medium was being used to store the music I was listening to wasn’t available to me at all. It simply came in through the air—like lightning. From the comfort of my chair, I picked up my iPhone, chose the album I wanted from the million-strong list that loaded instantly before my eyes, and directed the sound to the speakers in my vicinity, all of which started to play my choice within a few milliseconds. And then, when I tired of it, I shushed it with my voice.

I think about this sometimes when I hear people complain that the bright technological future we were all promised has steadfastly failed to appear. How, I wonder, would I even begin to explain Spotify and Sonos to my grandfather, who died in 1994? A compact disc could be comprehended by the elderly as a better vinyl record, much as the Space Shuttle could be comprehended as a faster airplane. But streaming? If my grandfather came back today, where would I start?

“Okay, so I’m using my telephone, which isn’t really a telephone so much as a supercomputer-cum-Library-of-Alexandria-cum-high-definition-movie-studio, to send a wireless signal to the magical speakers in my home, which, upon my request, will contact a set of servers 3,000 miles away in San Francisco, and request instant access to the closest digital copy of—”

“Wait, what’s a server?”

“—hold on—to the closest digital copy of one of millions of high-quality songs to which I have full and unlimited access, but neither own nor have to store, and—”

It boggles the mind.

It may be tempting to regard this example as a mere bauble or trinket, or even as a sign of decadence. But to do so would represent a disastrous miscalculation of its significance. It is true that some of our advances have slowed since the 1970s. We do not go to the moon on a regular basis, despite the promises of the Apollo program; transatlantic travel has become slower, rather than faster—R.I.P. Concorde; our cars essentially still use the same engines as they always have; and life expectancy is no longer leaping forward. But it is also true that, unlike then, we now enjoy a magnificent worldwide communications network that offers the sum of human knowledge in the blink of an eye and is open to anybody who wishes to join it. If that is “all” we’ve done in the last four decades, I think we should congratulate ourselves rather heartily.

Forget my grandfather for a moment and imagine explaining that to almost any literate person in human history. What do we imagine his reaction would have been? Do we think he would have said, “That sounds like stagnation to me”? Or do we think he would have said, “It sounds as if you have reached the promised land. I hope you are extremely grateful for the bounties you have inherited”? If not the latter, he’d be a fool.

From the desk on which I am writing these words, I have access to all of the great works in history: every song, every play, every book, every poem, every movie, every pamphlet, every piece of art. I can find every translation of the Bible that has ever been compiled and put them side by side for comparison. I can read the missives that were sent during the American Revolution, and examine the patents for the first steam engine, and listen to all of Winston Churchill’s speeches between 1939 and 1945. The world’s recipes are available to me without exception, and, if I desire, I can watch a cornucopia of free-to-use instructional videos in which experts show me how to cook them. At no cost or inconvenience, I can learn how to fix my sink or change my car’s tires or troubleshoot my dishwasher. If I want to know where the “panda ant” lives (Chile), to which genus it belongs (Euspinolia), how long it is (up to 8 millimeters), and whether it’s actually an ant (it’s not, it’s a wasp), I can find this information in seconds. What was on the front page of the Key West Citizen on June 2, 1943? Easy: “City Council Takes Up Incinerator Project with Representative of FWA.” Nearly 2,000 years ago, Pliny the Elder wondered if it might be a good idea to collect all of human knowledge in one place, available to all. That dream has become a reality—and we got to live when it happened. I’d say that’s pretty darn good.

The airplane annihilated distance; the smartphone has annihilated geography altogether. Provided that I have a stable connection to the Internet, it takes me the same amount of time to send a digital photograph to Delhi as it does for me to send it to a person in the house next door. On Saturday mornings I can sit and watch the same soccer games, broadcast live from England, that my dad is watching in England and text him about the developments in real time, as if I were sitting next to him. If I need to keep an eye on the news, it makes no difference whether I am sitting in the headquarters of Reuters or on a beach in Australia. Wherever I am, the information flow is the same. Except by design, there is no longer any such thing as “out of the loop.” As an achievement, this is monumental.

The “Spaceship Earth” attraction at Disney’s Experimental Prototype Community of Tomorrow tells the story of human communication from the days of the Neanderthal to the invention of the computer. I have wondered at times what Disney will substantively add to this story when it comes time to update the show, and I have come to conclude that the answer is almost certainly nothing. One cannot improve on instant worldwide communication that is accessible to every person and in every place. One can tinker around the edges to upgrade its speed, its reliability, its quality, and its durability, one can add some security into the mix for good measure, but, give or take, this is a problem that has now been solved. As the Phoenicians solved the alphabet problem, so have our contemporary engineers solved the transmission problem. The dream has arrived.

Not everyone appreciates this, of course, which is why it is customary for the complaint I am addressing to be amended slightly, from “technology has stagnated” to “technology is frivolously used and may even be bad for us.” But, while the latter proposition is arguably true, it concedes my premise that something dramatic has changed in the way in which we live. It is indeed entirely possible that the volume and speed of information that the I.T. revolution has ushered in have had a destructive effect on individuals or on society. It is possible, too, that, while the benefits are immense, most people choose not to take advantage of them. I would not be the first to lament that the first thing users seem to do with their access to the Internet is to begin arguing with strangers. And yet to contend that the abuse of the personal computer in some way undermines the value of the personal computer would be equivalent to contending that the use of the airplane for bombing renders the significance of its invention questionable.

I suspect that some of our disappointment is the fault of comic books. Riffle through any Bumper Sci-Fi Book for Boys!–style volume that was published between the 1920s and the 1960s and you will see that the physical breakthroughs that were anticipated—spacesuits, rocket ships, jetpacks, flying cars, laser guns, etc.—are featured prominently and enthusiastically, while the less tangible mass communications that were anticipated are set quietly in the background, as if they are inevitable. In story after story, the astronauts communicate from the planet Zog in an instant using video chat, and yet that, evidently, is not the exciting part. The exciting part is that they are on Zog.

I must confess that I do not understand why, for it is not at all obvious to me that exploring Zog is more useful than inventing Wikipedia, or that the ability to get to Zog would represent a greater leap forward than the ability to talk to our friends from it. Certainly, Zog may have some interesting rocks, and the technical feat of sending men there and returning them safely to Earth would be worth celebrating. (I do tend to tear up watching the original Moon landing.) But in comparison to a breakthrough that allows me to enjoy the words, faces, music, food, counsel, art, and research of every other human being on Earth, whether living or dead, it would pale. I have that. In my pocket.

Stagnation? Nope. Renaissance, more like.

This originally appeared in National Review. 

Blog Post | Health & Medical Care

COVID-19 Should Make Us Grateful for Technology

Imagine a pre-modern pandemic.

“In a way, everything is technology,” noted one of the world’s greatest economic historians, Fernand Braudel, in his monumental study Civilization and Capitalism. “Not only man’s most strenuous endeavors but also his patient and monotonous efforts to make a mark on the external world; not only the rapid changes . . . but also the slow improvements in processes and tools, and those innumerable actions which may have no immediate innovating significance but which are the fruit of accumulated knowledge,” he continued.

Yes, land, labor, and capital (that’s to say, the factors of production) are important components of economic growth. In the end, however, human progress in general and global enrichment in particular are largely dependent on invention and innovation. That is surely even clearer now that humanity’s hopes for the end of the pandemic and for our liberation from the accompanying lockdown rest on further scientific breakthroughs within the pharmaceutical industry. Let’s take a brief look at the impact of technology on health care, food supply, work, and sociality in the time of COVID-19.


The impact of modern technology is surely most keenly felt and anticipated within the sphere of human health care. Consider some of the worst diseases that humanity has had to face in the past. Smallpox, which is thought to have killed an estimated 300 million people in the 20th century alone, originated in either India or Egypt at least 3,000 years ago. Smallpox variolation, it seems, was practiced in China in the tenth century, but it was not until the late 18th century that Edward Jenner vaccinated his first patient against the disease. Smallpox was fully eradicated only in 1980.

Similar stories could be told about other killer diseases. Polio, which can be seen depicted in Egyptian carvings from the 18th dynasty, is of ancient origin. Yet the disease wasn’t properly analyzed until the year of the French Revolution, with Jonas Salk’s vaccine appearing only in 1955. Today, polio is close to being eradicated (just 95 cases were reported in 2019).

Malaria, probably humanity’s greatest foe, is at least 30 million years old (the parasite has been found in an amber-encased mosquito from the Paleogene period). It was only after the discovery of the New World that knowledge about the fever-reducing benefits of the bark of the cinchona tree spread to Europe and Asia. Quinine was first isolated in 1820, and chloroquine was introduced in 1946. Artemisinin drugs, which we still use, were discovered in the late 1970s. That’s to say that humanity lived with deadly diseases for millennia without fully knowing what they were, how they were transmitted, and how they could be cured. The fate of humanity, our ancestors thought, fluctuated under the extraneous influence of the “wheel of fortune,” and there was nothing that anyone could do about it. One day you were alive, and the next day you were not.

Contrast that glacial pace of progress, and the fatalistic acceptance of disease and death, with our response time to the current pandemic. The Wuhan Municipal Health Commission reported a cluster of cases of “pneumonia” in Wuhan on December 31, 2019. On January 7, the Chinese authorities identified the pathogen, a novel coronavirus, responsible for the outbreak. On January 11, China sequenced the virus’s genome, and the next day it was publicly available. That enabled the rest of the world to start making diagnostic kits to identify the disease.

To take one example, the first COVID-19 infection in South Korea was identified on January 20. On February 4, the first test kit (made by Kogene Biotech) entered production. On February 7, the test kit was available at 50 locations around the country. Other countries followed suit.

The World Health Organization, which declared COVID-19 a global pandemic on March 11, may have acted too late. Still, it is noteworthy that just two months elapsed between the first sign of trouble and the time when the entire world had put measures in place to retard the spread of the disease. In the meantime, we have learned a lot about governmental incompetence and regulatory overreach. But we have also learned a great deal about the spread and symptoms of the disease. Instead of starting from scratch, medical specialists in Europe and America can draw on the expertise of their colleagues in the Far East. Before the telegraph appeared midway through the 19th century, it took up to a month for a ship to carry information from London to New York. Today, we learn about the latest COVID-19 news (good and bad) and research in seconds.

By mid-April, thousands of highly educated and well-funded specialists throughout the world were using supercomputers and artificial intelligence to identify promising paths toward victory over the disease. Some 200 different programs are underway to develop therapies and vaccines to combat the pandemic. They include studies of the effectiveness of existing antiviral drugs, such as Gilead’s remdesivir, Ono’s protease inhibitor, and Fujifilm’s favipiravir. The effectiveness of generic drugs, such as hydroxychloroquine and chloroquine, is also being evaluated. Takeda is hard at work on convalescent plasma (TAK-888) in Japan, while Regeneron works on monoclonal antibodies in the United States. New vaccines, such as Moderna’s mRNA-1273, Inovio’s INO-4800, and BioNTech’s BNT162, are under development.

We don’t know which of these treatments (if any) will work, but here is what we can be sure of: There has never been a better time for humans to face and defeat a global pandemic. The world is richer than ever before, and money is what enables us to sustain a massive pharmaceutical industry and pay for highly sophisticated medical research and development.

The coronavirus may be deadly, but it is not the bubonic plague, which had a mortality rate of about 50 percent. Luckily, this far milder virus has reawakened us to the danger posed by communicable diseases. Once the immediate crisis is behind us, researchers will collect billions of data points from dozens of countries and analyze the different governmental responses to the pandemic. That knowledge will be deployed by governments and the private sector to ensure that best practices are adopted, so that next time we are better prepared.


Food Supply

When the Black Death struck Europe in 1347, the disease found the local population ripe for slaughter. Following the close of the Medieval Warm Period at the end of the 13th century, the climate turned cold and rainy. Harvests shrank and famines proliferated. France, for example, saw localized famines in 1304, 1305, 1310, 1315–17, 1330–34, 1349–51, 1358–60, 1371, 1374–75, and 1390. The Europeans, weakened by shortages of food, succumbed to the disease in great numbers.

The people of yore faced at least three interrelated problems. First, the means of transport and the transportation infrastructure were awful. On land, the Europeans used the same haulage methods (carts pulled by donkeys, horses, and oxen) that the ancients had invented. Similarly, much of Europe continued to use roads built by the Romans. Most people never left their native villages or visited the nearest towns. They had no reason to do so, for all that was necessary to sustain their meager day-to-day existence was produced locally.

The second problem was the lack of important information. It could take weeks to raise the alarm about impending food shortages, let alone organize relief for stricken communities.

Third, regional trade was seldom free (France did not have a single internal market until the Revolution), and global trade remained economically insignificant until the second half of the 19th century. Food was both scarce and expensive. In 15th-century England, 80 percent of ordinary people’s private expenditure went to food. Of that amount, 20 percent was spent on bread alone. Under those circumstances, a local crop failure could spell the destruction of an entire community. (Those who think that COVID-19 exposed the fragility of modern society should look up the Great Famine.)

By 2013, in contrast, only 10 percent of private expenditure in the United States went to food, a figure that is itself inflated by the amount Americans typically spend in restaurants. Speaking of restaurants: while most have been forced to close their doors, many restaurateurs use apps to deliver excellent food at reasonable prices. Moreover, months into the COVID-19 pandemic, the shops are generally well stocked and regularly replenished by a largely uninterrupted stream of cargo flights, trucking, and commercial shipping. Thanks to the miracle of mobile refrigeration, fresh produce continues to be sourced from different parts of the United States and abroad. Globalization may be under pressure from both the left and the right of the U.S. political spectrum, but should the pandemic impair U.S. agricultural production, many will be forced to acknowledge the benefits of a global food supply and of our ability to import food from parts of the world unaffected by COVID-19. Shortly before writing this piece, I was able to buy oranges from California, avocados from Mexico, and grapes from Chile in my local supermarket.

This extensive and, at this point, still sturdy supply chain is, of course, a technological marvel. Computers collate information about items on the shelf that are in short supply, adjust the variety and quantity of items shipped between stores, fill new orders, etc. And so, commerce that’s still allowed to go on goes on. So does charity. Feeding America, a network of more than 200 food banks, feeds tens of millions of people through food pantries, soup kitchens, shelters, etc. Since 2005, the organization has been using a computerized internal market to allocate food more rationally. Feeding America uses its own currency, called “shares,” with which individual food banks can bid on the foods that they need the most. Grocery-delivery services bring food to the doorsteps of those who cannot or do not want to leave their homes. The old and the infirm can also use phones, emails, and apps to call upon volunteers to do their shopping and delivery.


Work

The nature of work has changed a great deal over the last 200 years or so. Before the industrial revolution, between 85 and 90 percent of the people in the Western world were farm laborers. Their work was excruciatingly difficult, as one 18th-century Austrian physician observed: “in many villages [of the Austrian Empire] the dung has to be carried on human backs up high mountains and the soil has to be scraped in a crouching position; this is the reason why most of the young people are deformed and misshapen.” People lived on the edge of starvation, with both the very young and the very old expected to contribute as much as they could to the economic output of the family (most production in the pre-modern era was based on the family unit, hence the Greek term oikonomia, or household management). In those circumstances, sickness was a catastrophe: It reduced the family unit’s production, and therefore its consumption.

The industrial revolution allowed people to move from farms to factories, where work was better paid, more enjoyable, and less strenuous (which is largely why people in poor countries continue to stream from agricultural employment to manufacturing jobs today). Moreover, wealth exploded (real annual income per person in the United States rose from $1,980 in 1800 to $53,018 in 2016). That allowed for ever-increasing specialization, which included a massive expansion of services catering to the desires of an ever-more-prosperous population.

The service sector today consists of jobs in the information sector, investment services, technical and scientific services, health care, and social-assistance services, as well as in arts, entertainment, and recreation. Most of these jobs are less physically arduous, more intellectually stimulating, and better paid than either agricultural or manufacturing jobs ever were. Crucially, many of these service-sector jobs can be performed remotely. That means that even in the midst of the government-imposed economic shutdown, some work (about a third, estimates suggest) can go on. The economic losses from COVID-19, in other words, will be astronomical, but not total.

My own organization, for example, shut its doors in mid-March. Since then, everyone has been scribbling away at home or appearing on news shows around the world via the Internet. All of us are in regular contact by phone, Zoom, and Microsoft Teams. Other organizations are doing the same. As we already discussed, a great deal of shopping is taking place online. Shipping and delivery companies are expanding, with Amazon hiring 100,000 additional workers in the United States. Home entertainment, of course, has grown tremendously, with Netflix adding millions of new customers and expanding its offerings with thousands of new films and television shows. With over 30 million American children stuck at home, online learning companies are booming, and educators from high-school teachers to college professors continue to perform their jobs remotely. Telehealth is expanding, allowing patients to see their doctors in a safe and convenient way. Even minor medical procedures, such as eye exams, can be conducted remotely, and multiple companies will deliver your new specs to your front door. Banking and finance are still going on, with many people taking advantage of low interest rates to refinance their mortgages. Finally, the often unfairly maligned pharmaceutical industry is expanding as we all wait and hope for the release of a COVID-19 vaccine or effective therapeutic treatment.


Sociality

Aristotle observed that “man is by nature a social animal” and noted that without friends we would be unhappy. But the role of sociality (that is to say, the tendency to form or associate in social groups) goes much deeper than that. As William von Hippel explained in his 2018 book The Social Leap, sociality is the mechanism by which Homo sapiens came about. When early hominids were forced down from the trees (perhaps as a result of a climatic change that dried up African forests), they became more vulnerable to predators. To cover the longer distances between the fast-disappearing trees while maintaining a modicum of protection against other animals, our ancestors developed bipedalism, which freed their upper bodies to carry weapons such as sticks and stones.

Even more important was the invention of cooperation. While a stick-wielding ape is slightly better off than an unarmed one, a group of armed apes is much better at dispatching predators. Individuals in more cooperative bands survived to adulthood and bred more often, resulting in an ever more cooperative species. Furthermore, since living alone was tantamount to a death sentence, selfish apes who didn’t care about being ostracized for not pulling their weight died off, leaving behind a desire for communal cooperation and a deep-rooted fear of rejection by the group.

The early hominids had brains more like those of chimps than those of modern humans. That is because the evolutionary pressures that shaped those early brains, such as predation and food scarcity, could be overcome without tremendous intelligence. Those pressures were part of the physical landscape, a challenging but static environment that did not require much cognitive ability to navigate. The environmental pressure that produced modern humans was the social system itself. The social landscape is much more dynamic than the physical one. Once they had banded together in groups, our ancestors were forced to forge relationships with, and avoid being exploited by, individuals with divergent and constantly shifting interests. Those who couldn’t keep up with the increasingly complex social game either died or were unable to mate.

This new pressure created a positive evolutionary cycle: Banding together created more complex social systems, which required bigger brains; bigger brains needed to be fed; and the best way to get more food was more cooperation and a more sophisticated social system. The main cognitive development that evolved from this evolutionary cycle is known as the “theory of mind.” In short, the theory of mind is the ability to understand that other minds can have different reasoning, knowledge, and desires from your own. While that seems basic, the theory of mind distinguishes us from all other life on Earth. It allows us to determine whether an affront, for example, was intentional, accidental, or forced. It allows us to feel emotions such as empathy, pride, and guilt — abilities that are keys to a functioning society.

So sociality and human beings are inseparable, as the sudden restrictions on our ability to interact with others have clearly reminded us. As we sit at home, working away on our computers or watching television, most of us, for all our dutiful “social distancing,” feel a tremendous sense of isolation from our family, friends, and colleagues. The urge to be around others is innate to us. It is who we are.

Dissatisfied with impersonal modes of communication, such as email and texting, we have rediscovered the need for face-to-face interaction with our fellow humans. To that end, we utilize digital platforms such as Zoom, Google Hangouts, Facebook Live, and FaceTime to catch up on the latest news in other people’s lives, or simply to complain about the misery of loneliness and the pathetic inadequacy of our public officials (of both parties). Throughout the nation, people engage in virtual happy hours, dinners, book clubs, fitness classes, religious services, and group meditation. As my Cato Institute colleague Chelsea Follett recently wrote, “Technology has made it easier than ever to hold a physically-distanced ‘watch party’ synchronized so that viewers in different locations see the same part of a movie at the same time. For those who like to discuss movies as they watch, technology also enables a running group commentary of each scene in real time.” In the saddest of cases, technology enables people to say goodbye to dying friends and relatives. In a very real sense, therefore, technology keeps us sane (or, at the very least, saner).

Technology, then, allows us to cope with the challenges of the pandemic in ways that our ancestors could not even dream about. More important, technology allows our species to face the virus with grounds for rational optimism. In these dark days, remember all the scientists who are utilizing the accumulated store of human knowledge to defeat COVID-19 in record time and all the marvelous (not to say miraculous) ways the modern world keeps us well fed, psychologically semi-balanced, and (in many cases) productively engaged.

This originally appeared in National Review.