Blog Post | Science & Technology

Centers of Progress, Pt. 38: Cambridge (Physics)

The ideas generated in Cambridge transformed humanity’s understanding of the natural world.

Today marks the thirty-eighth installment in a series of articles by HumanProgress.org called Centers of Progress. Where does progress happen? The story of civilization is in many ways the story of the city. It is the city that has helped to create and define the modern world. This bi-weekly column will give a short overview of urban centers that were the sites of pivotal advances in culture, economics, politics, technology, etc.

Our thirty-eighth Center of Progress is Cambridge during the Scientific Revolution. The 16th and 17th centuries constituted a period of drastic change in the way humanity conceptualized and sought to understand the world. Scholars made massive leaps in fields such as mathematics, astronomy, chemistry, and, perhaps most notably, physics. Arguably, no city contributed more profoundly to that new understanding than Cambridge.

Today, Cambridge is a picturesque and walkable university city filled with stunning architecture, cozy pubs, and brilliant minds. “Cambridge is heaven … As you walk round, most people look incredibly bright, as if they are probably off to win a Nobel Prize,” says author Sophie Hannah. Indeed, if Cambridge were a country, it would rank fourth among countries in the number of Nobel Prize winners it has produced. Cambridge is nicknamed the “city of perspiring dreams” as a nod to its scholars’ tireless dedication, in contrast to rival university city Oxford’s older nickname, “the city of dreaming spires.”

The great minds that have defined Cambridge are reflected in its architecture and artworks. Architectural highlights include the Gothic-style King’s College chapel, featuring the world’s largest fan vault, and the Mathematical Bridge, designed in 1749 with the technique of tangential radial trussing, which creates the illusion of an arch although the bridge is built only of straight timbers. The conduit gutters, or runnels, lining many of the city’s ancient streets and university buildings owe their construction to none other than Thomas Hobson, the successful stable-owner-turned-benefactor from whom we get the term Hobson’s choice (i.e., “take it or leave it”). The city also boasts fascinating artworks such as the Corpus Chronophage Clock, a massive “inside-out” electromechanical timepiece that lets onlookers view the usually hidden grasshopper escapement mechanism. The clock is topped with a moving statue called the Chronophage (“time eater”), a grasshopper-like entity constructed from stainless steel, gold, and enamel. According to the artist, the horologist and Cambridge alumnus John Taylor (b. 1936), “the time is exactly correct every fifth minute to one hundredth of a second.”

Cambridge is also known for the Fitzwilliam Museum, the city’s lively market square, and the pastime popular among tourists and students alike of punting (a method of boating in shallow water) on the River Cam—the natural feature around which the city was built. The waterway has made the area an attractive farmsteading site since the Iron Age, and has allowed Cambridge to serve as a trading center throughout the ages, including for the Romans (who called the city Duroliponte, meaning “the fort at the bridge”), the Vikings, and the Saxons. 

But the true significance of Cambridge began with the founding of Cambridge University, which started with a murder mystery. In 1209, a woman was found killed in Oxford, and her death caused an uproar that would alter the course of academic history. She was a local and her fellow townspeople blamed outsiders drawn to their city to study and teach at Oxford University. At the time, most of the university’s students were teenage clerics, also called clerks. The primary suspect was a liberal arts clerk, who promptly fled his rented home. 

The townspeople resented the clerks’ special legal privileges, relative wealth, and reputation for drinking and fighting—tense town versus gown relations are nothing new. The killing took on an urgent significance in the broader conflict between the townspeople and the university, and a mob of riotous locals soon imprisoned the suspect’s roommates. This occurred in the midst of a power struggle between the Church and the crown. The excommunicated King John is said to have personally ordered the hanging of the imprisoned clerks “in contempt of the rights of the church.” The other pupils and instructors at the university fled, fearing further executions. To this day, the crime has not been solved—some say the killing was an accident, while others claim it was murder.

The dispersed scholars, possibly including the alleged killer, continued their studies elsewhere. What would become Cambridge University started as “no more than a bunch of scholars who had fled from Oxford and who had started to teach their students in rented houses in the neighborhood around St. Mary’s Church.” St. Mary’s Church today marks the center of Cambridge. In 1214, when the king and the Church had reconciled, Oxford’s townsfolk were made to welcome back scholars and offer them reduced rents. But tensions remained high in Oxford (boiling over periodically, such as in the St. Scholastica’s Day riot), and many ex-Oxonians chose to remain in Cambridge.

Soon Cambridge University was an intellectual powerhouse in its own right, where great thinkers took human understanding of the world to new heights. “I find Cambridge an asylum, in every sense of the word,” the English poet A. E. Housman once quipped. And indeed the city birthed ideas so groundbreaking that many of them may have sounded mad when they were first expressed.

Cambridge has nurtured great minds in many areas of achievement. Consider the arts. Cambridge’s streets have been trod throughout the centuries by literary and poetic geniuses, including Edmund Spenser (1552–1599), Christopher Marlowe (1564–1593), John Milton (1608–1674), William Wordsworth (1770–1850), Lord Byron (1788–1824), Alfred Tennyson (1809–1892), William Thackeray (1811–1863), A. A. Milne (1882–1956), C. S. Lewis (1898–1963), Vladimir Nabokov (1899–1977), Sylvia Plath (1932–1963), and Douglas Adams (1952–2001). Famous Cambridge alumni include comedians, such as John Cleese (b. 1939), Eric Idle (b. 1943), Sacha Baron Cohen (b. 1971), and John Oliver (b. 1977), as well as award-winning actors such as Emma Thompson (b. 1959) and Hugh Laurie (b. 1959). And Cambridge gave the world musical feats ranging from the comedic Always Look on the Bright Side of Life to the well-known 1980s hit Walking on Sunshine.

Next consider philosophy and economics. Cambridge educated the celebrated Catholic theologian, humanist philosopher, and pioneer of religious tolerance Erasmus (1466–1536). Other noted philosophers who were Cantabrigians include Bertrand Russell (1872–1970) and Ludwig Wittgenstein (1889–1951). Cambridge was also the alma mater of influential economists, such as the overpopulation alarmist Thomas Malthus (1766–1834), the father of the beleaguered field of macroeconomics John Maynard Keynes (1883–1946), and the Nobel Prize winner Angus Deaton (b. 1945). Another Nobel laureate, Milton Friedman (1912–2006), spent a year at Cambridge as a visiting lecturer.

But Cambridge’s greatest contributions to human progress arguably came in the natural and physical sciences. William Harvey (1578–1657), the physician and anatomist who first detailed the human blood circulatory system, studied at Cambridge. Francis Bacon (1561–1626), the father of empiricism and one of the founders of the scientific method, studied at Cambridge and represented Cambridge University (which for a time was a parliamentary constituency with its own representatives) in the English Parliament in 1614.

Most historians consider the Scientific Revolution to have begun with the insight of the Polish astronomer Nicolaus Copernicus (who studied in Bologna, another Center of Progress, and in Padua) that the Earth orbits the sun rather than the sun orbiting the Earth. However, the revolution culminated in the quiet university city of Cambridge with the writing of Newton’s Philosophiæ Naturalis Principia Mathematica (the Principia, published in 1687), a groundbreaking work that advanced mankind’s knowledge of physics and cosmology. To this day, Cambridge University Library retains Isaac Newton’s (1642–1727) own first-edition copy of the book, with his handwritten notes for the second edition scrawled across its pages.

If Newton was the father of modern physics, then Cambridge was arguably the field’s birthplace. The course of Newton’s life revolved around Cambridge; one might say the city’s intellectual gravity kept him in its orbit and he could not resist its pull. He received both his bachelor’s and master’s degrees from Cambridge University. Like Bacon, Newton briefly served as a member of Parliament for the University of Cambridge (in 1689–1690 and 1701–1702). In 1669, only a year after completing his master’s degree, Newton was appointed to the Lucasian Chair of Mathematics, now among the most prestigious professorships in the world, and he remained in that position until 1702.

The professorship was made possible thanks to private funding from a benefactor named Henry Lucas (c. 1610–1663). He was a clergyman, politician, and Cambridge alumnus who also generously bequeathed a collection of some four thousand books to the University Library at Cambridge. Other famous Lucasian professors include the mathematician Charles Babbage (1791–1871), often called the “father of computing” for conceiving the first automatic digital computer; and the theoretical physicist Stephen Hawking (1942–2018), who, despite severe health challenges from amyotrophic lateral sclerosis (ALS is a progressive motor-neuron illness), made several notable contributions to his field, including conceptualizing Hawking radiation. The Lucasian Chair has even attracted the attention of popular culture: in the well-known science-fiction franchise Star Trek, a central character named Data is said to hold the Lucasian Chair in the late 24th century.

The privately funded professorship allowed Newton to make several breakthroughs in the fields of mathematics, optics, and physics, such as developing the first reflecting telescope. Generous private funding also made publication of the Principia possible. The astronomer and physicist Edmond Halley (1656–1742), the namesake of Halley’s Comet and heir to a soap-making fortune, traveled to Cambridge to encourage, edit, and fund the publication of Newton’s Principia. In his book, Newton demonstrated how the planets orbit the sun, controlled by gravity. A popular legend holds that Newton first formulated the theory of gravity in the mid-1660s after watching an apple fall from a tree. The precise apple tree often said to have inspired him is, remarkably, still alive—it stands around 70 miles northwest of Cambridge at Newton’s family home, Woolsthorpe Manor. Grafted from that storied tree, another “Newton’s apple tree” can now be viewed in Cambridge. Whether Newton ultimately did his most momentous thinking in his birthplace or in his intellectual home at Cambridge, one thing is certain: the Principia took the world by storm. Its publication is often said to have laid the foundation of modern physics.

After the Scientific Revolution, Cambridge continued to produce world-changing thinkers, such as Henry Cavendish (1731–1810), the discoverer of hydrogen (which he termed “inflammable air”). Later, Cambridge’s Cavendish Laboratory was home to major discoveries, including that of the electron in 1897, the neutron in 1932, and the structure of DNA in 1953. The latter came about thanks to the work of physicist Francis Crick (1916–2004) and biologist James Watson (b. 1928), who built on findings by other Cantabrigians, including chemist Rosalind Franklin (1920–1958). The physicist Niels Bohr (1885–1962), who developed the Bohr model of the atom, also studied at Cambridge. The university city was the site of other groundbreaking moments in scientific history, too, such as the invention of in vitro fertilization technology (1968–78), the first identification of stem cells (1981), and the earliest eye-recognition technology (1991).

It also must be mentioned that Cambridge educated the founder of evolutionary biology, Charles Darwin (1809–1882). He forever altered understanding of living things by positing the fundamental scientific concepts of animal and human evolution and natural selection. Along with Newton, he is probably the most influential figure in scientific history to emerge from Cambridge’s classrooms—and one of the most influential men in history, period.

Cambridge grew from its unconventional murder-mystery origins into an intellectual center that played a pivotal role in the Scientific Revolution, which is often said to have been made complete with the publication of the Principia. Thanks to the rigorous culture of Cambridge’s academic community and funding from generous benefactors, the city has often served as the headquarters of mankind’s quest for truth and understanding. Many scholars believe that the new way of thinking that emerged during the Scientific Revolution directly led to the Enlightenment movement in the 17th and 18th centuries. The innovations of the Scientific Revolution continue to form the basis for humanity’s present understanding of the natural world, including modern physics. It is for these reasons that we must gravitate toward Cambridge as our 38th Center of Progress. 

Blog Post | Economic Growth

The Human Meaning of Economic Growth

Misunderstandings of the relationship between wealth and flourishing have obscured the anti-human implications of slowing growth rates.

Summary: Economic growth has been a driving force behind the dramatic improvements in human wellbeing over the past few centuries. This growth has resulted from the Enlightenment, the Industrial Revolution and capitalism. Criticisms of growth stem in large part from misunderstandings of the relationship between economics and human values.


Why is the world as prosperous a place as it is? And why isn’t it much more prosperous? These questions are broad enough to admit countless answers, but as good an answer as any is the economic growth rate.

You might have heard that economic growth is overrated, that it’s a fine idea, but unsustainable, or even that it’s entirely counterproductive because it puts profits above people and the economy above the planet. These narratives have been widespread in recent years. They’re also based on a fundamental misconception of the nature of wealth and what a growing economy means for humanity.

Properly conceived, wealth is the actualization of human values in the real world. Economic growth is the upward trajectory of human achievement. The forms of prosperity that most of humanity strives for, such as health, knowledge, pleasure, safety, professional and personal freedom, and so many others, were vastly scarcer throughout most of human history—and would be orders of magnitude more abundant today if economic policies had been slightly different. That is the power of economic growth, and it is within our power to influence the world of future generations for better or worse.

The History of Economic Growth

Virtually everywhere and always throughout human history, economic growth was nonexistent. While pockets of momentary economic progress took place in certain instances, the overall trend was one of perpetual stagnation. But just a few hundred years ago, with the advent of the Enlightenment, the Industrial Revolution, and capitalism, that all began to change.

When the conceptual tools of science became widely applied to create the technological advancements of the Industrial Revolution, they brought an unprecedented optimism about the capacity for investment in new discoveries and inventions to reliably uncover useful knowledge of the natural world. This change inspired the broad transformation of mere wealth (resources hidden away in vaults and treasure chests) into capital (resources invested in new inventions and discoveries).

By the time Friedrich Engels and Karl Marx wrote their Communist Manifesto in 1848, the optimism of investment had already transformed Western Europe. As Engels and Marx saw it, “The bourgeoisie [capitalist class], during its rule of scarce one hundred years, has created more massive and more colossal productive forces than have all preceding generations together. Subjection of Nature’s forces to man, machinery, application of chemistry to industry and agriculture, steam-navigation, railways, electric telegraphs, clearing of whole continents for cultivation, canalisation of rivers, whole populations conjured out of the ground — what earlier century had even a presentiment that such productive forces slumbered in the lap of social labour?”

Marx and Engels misunderstood the complex reasons for increased productivity (attributing it to untapped “social labour”), but the quotation is significant because, despite their sympathy for state centralization of the economy, they could not ignore the success of capitalism.

While no year before 1700 saw a gross world product of more than $643 billion (in international inflation-adjusted 2011 dollars), by 1820 global GDP reached $1 trillion. By 1940 the number had passed $7 trillion, and by 2015 it had passed $108 trillion.

Contrary to the popular misconception that capitalism has made the rich richer and the poor poorer, this new wealth contributed to growing the economies of every world region while outpacing population growth. As the world’s extreme poor have become wealthier, so too have all other economic classes.

What’s So Great about Growth?

A growing economy isn’t about stacks of paper money getting taller, or digits being added to the spreadsheets of bank ledgers. These things may be indicators of growth, but the growth itself is composed of goods and services becoming more abundant: farms and factories producing more and better consumption goods; engineers creating better machines and materials; clean water reaching more communities; sick people receiving better healthcare; scientists running more experiments; poets writing more poems; education becoming more broadly accessible; and so on, for whatever other forms of value people choose to exchange their savings and labor.

Gross domestic product or GDP (called gross world product or world GDP when applied at the global level) is an imperfect but useful and widely employed measure of economic growth, and its reflection in the real world takes such forms as rising life expectancy, nutrition, literacy, safety from natural disaster, and virtually every other measure of human flourishing. This is because, at the most fundamental level, “economic growth” means the transformation and rearrangement of the physical environment into more useful forms that people value more.

Before the year 1820, human life expectancy had always been approximately 30–35 years. But with the great decline in poverty and rise of capital investment in technology and medicine, global life expectancy has roughly doubled in every geographic region in the last century. Similar trends have occurred in global nourishment, infant survival, literacy, access to clean water, and countless other crucial indicators of wellbeing. While these trends are bound to take the occasional momentary downturn because of life’s uncertainties and hardships, the unidirectional accumulation of technological and scientific knowledge since the Age of Enlightenment gives the forward march of progress an asymmetric advantage. For example, the COVID-19 pandemic and lockdowns resulted in a brief and tragic decline in life expectancy, but the number has since risen to an all-time high of 73.36 years as of 2023.

What is the direct causal connection between economic growth and these improvements to human wellbeing? Consider the example of deaths by natural disaster, which have fallen in the last century from about 26.5 per 100,000 people to 0.51 per 100,000 people. More wealth means buildings can be constructed from stronger materials and with better climate controls. And when those protections aren’t enough, a wealthier community can afford better infrastructure such as roads and vehicles to efficiently get sick or injured people to the hospital. When those injured end up in the hospital, a wealthier society’s medical facilities will be equipped with more advanced equipment, cleaner sanitation, and better-trained doctors who will provide higher-quality medical attention. These are just a few examples of how wealth allows humans to transform their world into a more hospitable place to live and face the inevitable challenges of life.

The benefits of economic growth go far beyond the maximization of health and safety for their own sake. If what you value in life is the contemplation of great art, the exaltation of your favorite deity, or time spent with your loved ones, wealth is what affords you the freedom to sustainably pursue those values rather than tilling the fields for 16 hours per day and dying in your 30s. Wealth is what provides you access to an ever-improving share of the world’s culture by increasing the abundance and accessibility of printed, recorded, and digital materials. Wealth is what provides you with the leisure time and transportation technology to travel the world and experience distant wonders, remote holy sites, and people whose personal or professional significance to you would otherwise dwell beyond your reach.

As the Harvard University cognitive scientist Steven Pinker demonstrates in his popular book Enlightenment Now, “Though it’s easy to sneer at national income as a shallow and materialistic measure, it correlates with every indicator of human flourishing, as we will repeatedly see in the chapters to come.”

The Long-Term Future of Growth

Human psychology is ill-equipped to comprehend large numbers, especially as they relate to the profound numerical implications of exponentiation. If it sounds insignificant when politicians and journalists refer to a 1 or 2 percentage-point increase or decrease in the annual growth rate, then, like most people, you’re being deceived by a quirk of human intuition. While small changes to the economic growth rate may not have noticeable effects in the short term, their long-term implications are absolutely astonishing.

Economist Tyler Cowen has pointed out in a Foreign Affairs article, “In the medium to long term, even small changes in growth rates have significant consequences for living standards. An economy that grows at one percent doubles its average income approximately every 70 years, whereas an economy that grows at three percent doubles its average income about every 23 years—which, over time, makes a big difference in people’s lives.” In his book Stubborn Attachments, Cowen offers a thought experiment to illustrate the real-world implications of such “small changes” to the growth rate: “Redo U.S. history, but assume the country’s economy had grown one percentage point less each year between 1870 and 1990. In that scenario, the United States of 1990 would be no richer than the Mexico of 1990.”

Cowen gave the negative scenario, in which the growth rate was 1 percentage point slower: US citizens would have drastically shorter lifespans, less education, less healthcare, less safety from violence, more susceptibility to disease and natural disaster, fewer career choices, and so on. Now imagine the opposite scenario, in which US economic policy had allowed just 1 additional percentage point of growth each year. The average American today would in all probability be living much longer, having much nicer housing, choosing from far more career opportunities, and enjoying more advanced technology.
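To make the arithmetic behind this compounding concrete, here is a minimal Python sketch. The doubling-time formula reproduces the roughly 70-year and 23-year figures Cowen cites; the 120-year comparison at 2 versus 3 percent is an illustrative set of numbers chosen for this sketch, not data from his book.

```python
import math

def doubling_time(growth_rate: float) -> float:
    """Years needed for income to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + growth_rate)

def income_after(initial: float, growth_rate: float, years: int) -> float:
    """Income after compounding a constant annual growth rate for the given number of years."""
    return initial * (1 + growth_rate) ** years

# The rule-of-70 figures fall straight out of the formula.
print(f"Doubling time at 1% growth: {doubling_time(0.01):.0f} years")  # ~70 years
print(f"Doubling time at 3% growth: {doubling_time(0.03):.0f} years")  # ~23 years

# Over a 120-year span similar to Cowen's 1870-1990 thought experiment,
# a single extra percentage point of growth (3% vs. 2%, illustrative rates)
# compounds into a roughly 3.2-fold difference in final income.
gap = income_after(1.0, 0.03, 120) / income_after(1.0, 0.02, 120)
print(f"Income ratio after 120 years (3% vs. 2% growth): {gap:.1f}x")
```

The point of the exercise is simply that a difference too small to notice in any single year multiplies into an enormous gap over a lifetime or two.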

Just imagine your income doubling, and what you could do for yourself, your family, or the charity of your choice with all that extra wealth. Something along those lines could have happened to most Americans. But instead, growth has been significantly slowed in the United States because taxes and regulations have constantly disincentivized and disallowed new innovations.

At the margins, many dying of preventable diseases could have been cured, many who spiraled into homelessness could have accessed the employment opportunities or mental health treatment they needed, and so on. While economic fortune seems like a luxury to those who already enjoy material comfort, there are always many at the margin for whom the health of the economy is the difference between life and death.

These are among the reasons that Harvard University economist Gregory Mankiw concludes in his commonly used college textbook Macroeconomics that “Long-run economic growth is the single most important determinant of the economic well-being of a nation’s citizens. Everything else that macroeconomists study — unemployment, inflation, trade deficits, and so on — pales in comparison.”

When we think of the future our children or grandchildren will live in, depending on our choices between even slightly more or less restrictive economic policies today, we could plausibly be looking at a future of widespread and affordable space travel, life-changing education and remote work opportunities in the metaverse, new sustainable energy innovations, a biotechnological revolution in the human capacity for medical and psychological flourishing, genome projects and conservation investments to revive extinct and protect endangered species, and countless other improvements to the human condition. Or we could be looking at a drawn-out stagnation in poverty alleviation, technological advancement, and environmental progress. The difference may well hinge on what looks today like a tiny change in the rate of compounding growth.

At the broadest level, more wealth in the hands of the human species represents a greater capacity of humans to chart their course through life and into the future in accordance with their values. Like all profound and far-reaching forms of change, economic growth has a wide range of consequences, some intended and others unintended, many desirable and many others undesirable. But it is not a random process. It is directed by the choices of individuals and by their drive to devote more resources and investment to those things they view as worthwhile. Ever since the Scientific Revolution, the Enlightenment, and the Industrial Revolution, investment in human values has been on balance a positive-sum game, in which one group’s gains do not have to come in the form of another group’s losses. This is demonstrated by the upward trends in human flourishing since the global rise in exponential economic growth. Indeed, it is intrinsic to the fundamental difference between a growing and a shrinking or stagnant economy: In a growing economy, everyone can win.

This article was published at Libertarianism.org on 11/17/2023.

Blog Post | Science & Education

AI in the Classroom Can Make Higher Education Much More Accessible

For some school subjects, artificial intelligence can transform the landscape of tutoring accessibility.

Summary: GPT-4 has outperformed most human test-takers on various student exams, revealing its potential to support academic learning and improve educational outcomes, particularly in test preparation. With its accessibility and affordability compared to traditional tutoring services, AI tutoring can help address the increasing demand for academic support, especially as universities begin to reinstate standardized testing requirements.


In 2023, OpenAI shook the foundation of the education system by releasing GPT-4. The previous model, GPT-3.5, had already disrupted classrooms in K–12 and beyond by offering a free academic tool capable of writing essays and answering exam questions. Teachers struggled with the idea that widely accessible artificial intelligence (AI) technology could meet the demands of most traditional classroom work and academic skills. GPT-3.5 was far from perfect, though, and lacked creativity, nuance, and reliability. However, reports showed that GPT-4 could score better than 90 percent of participants on the bar exam, the LSAT, the SAT’s reading and writing and math sections, and several Advanced Placement (AP) exams. This was a significant improvement over GPT-3.5, which had struggled to score as well as 50 percent of participants.

This marked a major shift in the role of AI, from it being an easy way out of busy work to a tool that could improve your chances of getting into college. The US Department of Education published a report noting several areas where AI could support teacher instruction and student learning. Among the top examples were intelligent tutoring systems. Early models of these systems showed that an AI tutor could not only recognize when a student was right or wrong in a mathematical problem but also identify the steps a student took and guide them through an explanation of the process.

The role of tutoring in education has grown in significance as more and more high school students have gone to college. Private tutoring is now a booming industry. Tutors often charge up to $80 for test preparation, and there is no shortage of eager parents willing to pay for their services. Tutoring has been a go-to solution for students to improve their grades outside the classroom. But more importantly, it has been a solution to improve their chances of getting into college, with many private tutoring services focusing on AP and SAT exams. This connection between college admission success and private tutoring has been a problem for parents who cannot afford the costs.

GPT-4 is available through ChatGPT for $20 a month. Although the program itself can be used to answer questions and provide academic support, dedicated education websites have begun incorporating AI tutors to help with test prep. Khan Academy provides free courses on AP content and SAT exams and offers an AI-powered tutor for these subjects at $4 a month. Duolingo, a popular language learning app that offers university-recognized language exams, offers Duolingo Max at $14 a month. These tutoring services are accessible at your fingertips at any time. There is no need to schedule video conferencing calls, do background checks on tutors, or pay extra costs. Quality individualized academic support is available at a moment’s notice.

The availability of AI tutoring services is occurring at a crucial moment in education. As students become accustomed to post-pandemic life, student achievement across the nation still has not returned to where it once was. Despite that, many universities have begun reversing test-optional policies that had allowed students to avoid taking standardized tests such as the SAT. The demand for tutoring has skyrocketed as many new high school seniors struggle to meet the old standards of college admissions. Many school tutoring programs have not been able to provide the support students need, and private tutoring costs are only increasing.

AI has the potential to provide cheap and effective tutoring for these exams while being easily accessible. A Harvard computer science course has been able to incorporate ChatGPT to great success, using it to provide continuous and customized technical support and allowing professors to focus more on pedagogy. As technology improves, students will have more support for academic pursuits, opening an easier path to higher education but also allowing students to more easily explore academic interests beyond rigid classroom instruction.

Blog Post | Science & Technology

AI Is a Great Equalizer That Will Change the World

A positive revolution from AI is already unfolding in the global East and South.

Summary: Concerns over potential negative impacts of AI have dominated headlines, particularly regarding its threat to employment. However, a closer examination reveals AI’s immense potential to provide more equal, higher-quality access to necessities such as education and healthcare, particularly in regions with limited resources. From India’s agricultural advancements to Kenya’s educational support, AI initiatives are already transforming lives and addressing societal needs.


The latest technology panic is over artificial intelligence (AI). The media is focused on the negatives of AI, making many assumptions about how AI will doom us all. One concern is that AI tools will replace workers and cause mass unemployment. This is likely overblown—although some jobs will be lost to AI, if history is any guide, new jobs will be created. Furthermore, AI’s ability to replace skilled labor is also one of its greatest potential benefits.

Think of all the regions of the world where children lack access to education, where schoolteachers are scarce and opportunities for adult learning are scant.

Think of the preventable diseases that are untreated due to a lack of information, the dearth of health care providers, and how many lives could be improved and saved by overcoming these challenges.

In many ways, AI will be a revolutionary equalizer for poorer countries where education and health care have historically faced many challenges. In fact, a positive revolution from AI is already unfolding in the global East and South.

Improving Equality through Education and Health Care

In India, agricultural technology startup Saagu Baagu is already improving lives. This initiative allows farmers to increase crop yield through AI-based solutions. A chatbot provides farmers with the information they need to farm more effectively (e.g., through mapping the maturity stages of their crops and testing soil so that AI can make recommendations on which fertilizers to use depending on the type of soil). Saagu Baagu has been successful in the trial region and is now being expanded. This AI initiative is likely to revolutionize agriculture globally.

Combining large language models with speech-recognition software is helping Indian farmers in other ways. For example, Indian global impact initiative Karya is working on helping rural Indians, who speak many different languages, to overcome language barriers. Karya is collecting data on tuberculosis, a mostly curable and preventable disease that kills roughly 200,000 Indians every year. By collecting voice recordings of 10 different dialects of Kannada, Karya is training an AI speech model to communicate with local people. Tuberculosis carries much stigma in India, so people are often reluctant to ask for help. AI will allow Indians to reduce the spread of the disease and give them access to reliable information.

In Kenya, where students are leading in AI use, the technology is aiding the spread of information by allowing pupils to ask a chatbot questions about their homework.

Throughout the world, there are many challenges pertaining to health care, including increasing costs and staff shortages. As developed economies now have rapidly growing elderly populations and shrinking workforces, the problem is set to worsen. In Japan, where the population is aging rapidly, a shortage of care workers is being remedied by robots that patrol care homes, monitor patients, and alert staff when something is wrong. These bots use AI to detect abnormalities, assist in infection countermeasures by disinfecting commonly touched places, provide conversation, and carry people from wheelchairs to beds and bathing areas, which means less physical exertion and fewer injuries for staff members.

In Brazil, researchers used AI models to predict HER2-subtype breast cancer from the imaging scans of 311 women and to predict the patients’ responses to treatment. In addition, AI can help make health resource allocation more efficient and support tasks such as preparing for public health crises like pandemics. At the individual level, the use of this technology in wearables, such as smartwatches, can encourage patient adherence to treatments, help prevent illnesses, and collect data more frequently.

Biometric data gathered from wearable devices could also be a game-changer. This technology can detect cancers early, monitor infectious diseases and general health issues, and give patients more agency over their health where access to health care is limited or expensive.

Education and health care in the West could also benefit from AI. In the United States, text synthesis machines could help to address the lack of teachers in K–12 education and the inaccessibility of health care for low-income people.

Predicting the Future

AI is already playing a role in helping humanity tackle natural disasters (e.g., by predicting how many earthquake aftershocks will strike and their strength). These models, which have been trained on large data sets of seismic events, have been found to estimate the number of aftershocks better than conventional (non-AI) models do.

Forecasting models can also help to predict other natural disasters like severe storms, floods, hurricanes, and wildfires. Machine learning uses algorithms to reduce the time required to make forecasts and increase model accuracy, which again is superior to the non-AI models that are used for this purpose. These improvements could have a massive impact on people in poor countries, who currently lack access to reliable forecasts and tend to be employed in agriculture, which is highly dependent on the weather.

A Case for Optimism

Much of the fear regarding AI in the West concerns the rapid speed at which it is being implemented, but for many countries, this speed is a boon.

Take the mobile phone. In 2000, only 4 percent of people in developing countries had access to mobile phones. By 2015, 94 percent of the population had such access, including in sub-Saharan Africa.

The benefits were enormous, as billions gained access to online banking, educational opportunities, and more reliable communication. One study found that almost 1 in 10 Kenyan families living in extreme poverty were able to lift their incomes above the poverty line by using the banking app M-Pesa. In rural Peru, household consumption rose by 11 percent with access to phones, while extreme poverty fell 5.4 percent. Some 24 percent of people in developing countries now use the mobile internet for educational purposes, compared with only 12 percent in the richest countries. In lower-income countries, access to mobile phones and apps is life-changing.

AI, which only requires access to a mobile phone to use, is likely to spread even faster in the countries that need the technology the most.

This is what we should be talking about: not a technology panic but a technology revolution for greater equality in well-being.

Blog Post | Education & Literacy

How to Combat Gloom and Pessimism

Given the inhospitable world we have evolved in, humans have learned to prioritize the bad news.

Summary: Optimism flourishes more in rapidly growing countries, fueled by the promise of improvements in living standards, a phenomenon less evident in relatively developed nations like the US. Human nature, predisposed to focus on negative news, collides with media outlets’ profit-driven emphasis on sensationalism, perpetuating a cycle of pessimism. Understanding our negativity bias and learning probabilistic reasoning skills can help navigate the deluge of alarming headlines, while seeking out sources of positive news can provide a more balanced perspective.


Surveys show that optimism is highest in rapidly growing countries that are catching up with the developed world. High growth rates allow the citizens of those nations to experience massive year-on-year increases in standards of living – something that, in the absence of an AI-led revolution in productivity, is unlikely to occur in already developed countries. Slow and steady progress, such as that currently underway in the United States, does not seem sufficient to inspire widespread optimism about the future.

The problem of incrementalism is compounded by the interaction between human nature and the media. Given the inhospitable world we have evolved in, humans have learned to prioritize the bad news. Consequently, the media has embraced the “if it bleeds, it leads” business model. Worse still, growing competition between television, newspapers, and websites has significantly increased negative content over time. The inclusion of an additional negative word in a headline, for example, leads to 2.3 percent more clicks, according to a recent study.

Unfortunately, most people are unaware of our innate negativity bias. It may be helpful to include basic human psychology in high-school curricula. While we may not be able to purge the negativity bias from our brains, understanding how and why we react to a ceaseless barrage of terrifying headlines may help us gain a proper perspective on the world around us.

Another way to get around the apocalyptic headlines and focus on the largely positive trendlines is to develop a more sophisticated understanding of statistical probabilities. While evidence suggests that humans have an innate capacity for probabilistic reasoning, the formal application of Bayesian inference – which is to say, adjustment of our beliefs or guesses about something as we learn more information – is a learned skill. Infants and untrained adults show abilities that align with Bayesian principles on a basic level, indicating an intuitive understanding of probability and uncertainty. However, the precise and formal application of Bayesian reasoning requires education, especially in complex scenarios.
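For readers who want to see the mechanics, here is a minimal sketch of a single Bayesian update in Python. The headline-reading scenario and every probability in it are hypothetical, chosen only to illustrate how the formula shifts a belief when new information arrives.

```python
def bayes_update(prior: float, likelihood: float, false_alarm_rate: float) -> float:
    """Posterior probability of a hypothesis given one piece of evidence,
    via Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    evidence = likelihood * prior + false_alarm_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: how much should one alarming headline shift a belief
# that a rare catastrophe is imminent?
prior = 0.01             # starting belief: a 1% chance the catastrophe is real
likelihood = 0.90        # chance of seeing such headlines if it were real
false_alarm_rate = 0.30  # chance of seeing such headlines even if it were not

posterior = bayes_update(prior, likelihood, false_alarm_rate)
print(f"Belief after the headline: {posterior:.1%}")  # roughly 2.9%
```

The takeaway from this toy calculation is that evidence which is nearly as common in ordinary times as in genuine crises should move a well-calibrated belief only modestly, however alarming the headline feels.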

Finally, humans can choose what kind of information to consume. Knowing that traditional media does not offer a realistic picture of the world, people can sign up for services – such as the Human Progress weekly newsletter – that collate the positive happenings ignored by mainstream media outlets.