Centers of Progress, Pt. 38: Cambridge (Physics)

Blog Post | Science & Education


The ideas generated in Cambridge transformed humanity’s understanding of the natural world.

Today marks the thirty-eighth installment in a series of articles by HumanProgress.org called Centers of Progress. Where does progress happen? The story of civilization is in many ways the story of the city. It is the city that has helped to create and define the modern world. This bi-weekly column will give a short overview of urban centers that were the sites of pivotal advances in culture, economics, politics, technology, etc.

Our thirty-eighth Center of Progress is Cambridge during the Scientific Revolution. The 16th and 17th centuries constituted a period of drastic change in the way humanity conceptualized and sought to understand the world. Scholars made massive leaps in fields such as mathematics, astronomy, chemistry, and, perhaps most notably, physics. Arguably, no city contributed more profoundly to that new understanding than Cambridge.

Today, Cambridge is a picturesque and walkable university city filled with stunning architecture, cozy pubs, and brilliant minds. “Cambridge is heaven … As you walk round, most people look incredibly bright, as if they are probably off to win a Nobel Prize,” says author Sophie Hannah. Indeed, if Cambridge were a country, it would rank fourth among countries by number of Nobel Prize winners. Cambridge is nicknamed the “city of perspiring dreams” as a nod to its scholars’ tireless dedication, in contrast to rival university city Oxford’s older nickname, “the city of dreaming spires.”

The great minds that have defined Cambridge are reflected in its architecture and artworks. Architectural highlights include the Gothic-style King’s College chapel, featuring the world’s largest fan vault, and the Mathematical Bridge, designed in 1749 with the technique of tangential radial trussing, which creates the illusion of an arch although the bridge is built only of straight timbers. The conduit gutters, or runnels, lining many of the city’s ancient streets and university buildings owe their construction to none other than Thomas Hobson, the successful stable-owner-turned-benefactor from whom we get the term Hobson’s choice (i.e., “take it or leave it”). The city also boasts fascinating artworks such as the Corpus Chronophage Clock, a massive “inside-out” electromechanical timepiece that lets onlookers view the usually hidden grasshopper escapement mechanism. The clock is topped with a moving statue called the Chronophage (“time eater”), a grasshopper-like entity constructed from stainless steel, gold, and enamel. According to the artist, the horologist and Cambridge alumnus John Taylor (b. 1936), “the time is exactly correct every fifth minute to one hundredth of a second.”

Cambridge is also known for the Fitzwilliam Museum, the city’s lively market square, and the pastime popular among tourists and students alike of punting (a method of boating in shallow water) on the River Cam—the natural feature around which the city was built. The waterway has made the area an attractive site for settlement and farming since the Iron Age, and has allowed Cambridge to serve as a trading center throughout the ages, including for the Romans (who called the city Duroliponte, meaning “the fort at the bridge”), the Vikings, and the Saxons.

But the true significance of Cambridge began with the founding of Cambridge University, which started with a murder mystery. In 1209, a woman was found killed in Oxford, and her death caused an uproar that would alter the course of academic history. She was a local and her fellow townspeople blamed outsiders drawn to their city to study and teach at Oxford University. At the time, most of the university’s students were teenage clerics, also called clerks. The primary suspect was a liberal arts clerk, who promptly fled his rented home. 

The townspeople resented the clerks’ special legal privileges, relative wealth, and reputation for drinking and fighting—tense town versus gown relations are nothing new. The killing took on an urgent significance in the broader conflict between the townspeople and the university, and a mob of riotous locals soon imprisoned the suspect’s roommates. This occurred in the midst of a power struggle between the Church and the crown. The excommunicated King John is said to have personally ordered the hanging of the imprisoned clerks “in contempt of the rights of the church.” The other pupils and instructors at the university fled, fearing further executions. To this day, the crime has not been solved—some say the killing was an accident, while others claim it was murder.

The dispersed scholars, possibly including the alleged killer, continued their studies elsewhere. What would become Cambridge University started as “no more than a bunch of scholars who had fled from Oxford and who had started to teach their students in rented houses in the neighborhood around St. Mary’s Church.” St. Mary’s Church today marks the center of Cambridge. In 1214, when the king and the Church had reconciled, Oxford’s townsfolk were made to welcome back scholars and offer them reduced rents. But tensions remained high in Oxford (boiling over periodically, such as in the St. Scholastica’s Day riot), and many ex-Oxonians chose to remain in Cambridge.

Soon Cambridge University was an intellectual powerhouse in its own right, where great thinkers took human understanding of the world to new heights. “I find Cambridge an asylum, in every sense of the word,” the English poet A. E. Housman once quipped. And indeed the city birthed ideas so groundbreaking that many of them may have sounded mad when they were first expressed.

Cambridge has nurtured great minds in many areas of achievement. Consider the arts. Cambridge’s streets have been trod throughout the centuries by literary and poetic geniuses, including Edmund Spenser (1552–1599), Christopher Marlowe (1564–1593), John Milton (1608–1674), William Wordsworth (1770–1850), Lord Byron (1788–1824), Alfred Tennyson (1809–1892), William Thackeray (1811–1863), A. A. Milne (1882–1956), C. S. Lewis (1898–1963), Vladimir Nabokov (1899–1977), Sylvia Plath (1932–1963), and Douglas Adams (1952–2001). Famous Cambridge alumni include comedians, such as John Cleese (b. 1939), Eric Idle (b. 1943), Sacha Baron Cohen (b. 1971), and John Oliver (b. 1977), as well as award-winning actors such as Emma Thompson (b. 1959) and Hugh Laurie (b. 1959). And Cambridge gave the world musical feats ranging from the comedic Always Look on the Bright Side of Life to the well-known 1980s hit Walking on Sunshine.

Next consider philosophy and economics. Cambridge educated the celebrated Catholic theologian, humanist philosopher, and pioneer of religious tolerance Erasmus (1466–1536). Other noted philosophers who were Cantabrigians include Bertrand Russell (1872–1970) and Ludwig Wittgenstein (1889–1951). Cambridge was also the alma mater of influential economists, such as the overpopulation alarmist Thomas Malthus (1766–1834), the father of the beleaguered field of macroeconomics John Maynard Keynes (1883–1946), and the Nobel Prize winner Angus Deaton (b. 1945); another laureate, Milton Friedman (1912–2006), spent a year at Cambridge as a visiting fellow.

But Cambridge’s greatest contributions to human progress arguably came in the natural and physical sciences. William Harvey (1578–1657), the physician and anatomist who first detailed the human blood circulatory system, studied at Cambridge. Francis Bacon (1561–1626), the father of empiricism and one of the founders of the scientific method, studied at Cambridge and represented Cambridge University (which for a time was a parliamentary constituency with its own representatives) in Parliament in 1614.

Most historians consider the Scientific Revolution to have begun with the insight of the Polish astronomer Nicolaus Copernicus (who studied in Bologna, another Center of Progress, and in Padua) that the Earth orbits the sun rather than the sun orbiting the Earth. However, the revolution culminated in the quiet university city of Cambridge with the writing of Newton’s Philosophiæ Naturalis Principia Mathematica (the Principia, published in 1687), a groundbreaking work that advanced mankind’s knowledge of physics and cosmology. To this day, Cambridge University Library retains Isaac Newton’s (1642–1727) own first-edition copy of the book, with his handwritten notes for the second edition scrawled across its pages.

If Newton was the father of modern physics, then Cambridge was arguably the field’s birthplace. The course of Newton’s life revolved around Cambridge; one might say the city’s intellectual gravity kept him in its orbit and he could not resist its pull. He received both his bachelor’s and master’s degrees from Cambridge University. Like Bacon, Newton briefly served as a member of Parliament for the University of Cambridge (1689–1690 and 1701–1702). In 1669, only a year after completing his master’s degree, Newton was appointed to the Lucasian Chair of Mathematics, now among the most prestigious professorships in the world, and remained in that position until 1702.

The professorship was made possible thanks to private funding from a benefactor named Henry Lucas (c. 1610–1663). He was a clergyman, politician, and Cambridge alumnus who also generously bequeathed a collection of some four thousand books to the University Library at Cambridge. Other famous Lucasian professors include the mathematician Charles Babbage (1791–1871), often called the “father of computing” for conceiving the first automatic digital computer; and the theoretical physicist Stephen Hawking (1942–2018), who, despite severe health challenges from amyotrophic lateral sclerosis (ALS is a progressive motor-neuron illness), made several notable contributions to his field, including conceptualizing Hawking radiation. The Lucasian Chair has even attracted the attention of popular culture: in the well-known science-fiction franchise Star Trek, a central character named Data is said to hold the Lucasian Chair in the late 24th century.

The privately funded professorship allowed Newton to make several breakthroughs in the fields of mathematics, optics, and physics, such as developing the first reflecting telescope. Generous private funding also made publication of the Principia possible. The astronomer and physicist Edmond Halley (1656–1742), the namesake of Halley’s Comet and heir to a soap-making fortune, traveled to Cambridge to encourage, edit, and fund the publication of Newton’s Principia. In his book, Newton demonstrated how the planets orbit the sun, governed by gravity. A popular legend holds that Newton first formulated the theory of gravity in the mid-1660s after watching an apple fall from a tree. The apple tree often said to have inspired him is, remarkably, still alive—it stands at Newton’s family home, Woolsthorpe Manor, around 70 miles northwest of Cambridge. Another “Newton’s apple tree,” grafted from that storied one, can now be viewed in Cambridge. Whether Newton ultimately did his most momentous thinking in his birthplace or in his intellectual home at Cambridge, one thing is certain: the Principia took the world by storm. Its publication is often said to have laid the foundation of modern physics.

After the Scientific Revolution, Cambridge continued to produce world-changing thinkers, such as Henry Cavendish (1731–1810), the discoverer of hydrogen (which he termed “inflammable air”). Later, Cambridge’s Cavendish Laboratory was home to major discoveries, including that of the electron in 1897, the neutron in 1932, and the structure of DNA in 1953. The latter came about thanks to the work of physicist Francis Crick (1916–2004) and biologist James Watson (b. 1928), who built on findings by other Cantabrigians, including the X-ray diffraction work of chemist Rosalind Franklin (1920–1958). The physicist Niels Bohr (1885–1962), who developed the Bohr model of the atom, also studied at Cambridge. The university city was the site of other groundbreaking moments in scientific history, too, such as the invention of in vitro fertilization technology (1968–1978), the first identification of stem cells (1981), and the earliest eye-recognition technology (1991).

It also must be mentioned that Cambridge educated the founder of evolutionary biology, Charles Darwin (1809–1882). He forever altered our understanding of living things by positing the theory of evolution by natural selection. Along with Newton, he is probably the most influential figure in scientific history to emerge from Cambridge’s classrooms—and one of the most influential men in history, period.

Church of St Mary the Great, Cambridge

Cambridge grew from its unconventional murder-mystery origins into an intellectual center that played a pivotal role in the Scientific Revolution, which is often said to have been made complete with the publication of the Principia. Thanks to the rigorous culture of Cambridge’s academic community and funding from generous benefactors, the city has often served as the headquarters of mankind’s quest for truth and understanding. Many scholars believe that the new way of thinking that emerged during the Scientific Revolution directly led to the Enlightenment movement in the 17th and 18th centuries. The innovations of the Scientific Revolution continue to form the basis for humanity’s present understanding of the natural world, including modern physics. It is for these reasons that we must gravitate toward Cambridge as our 38th Center of Progress. 

Blog Post | Education Spending

Growth Comes From Ideas, Not Degrees | Podcast Highlights

Marian Tupy interviews Bryan Caplan about the relationship between formal education and innovation.

Listen to the podcast or read the full transcript here.

Get The Case Against Education here.

I want to start with a broad question. What is economic growth, and where does it come from?

Economic growth is just change in economic well-being. Usually, we measure it with GDP.

Where does it come from? There are a lot of stories that people tell. Traditionally, people said it comes from capital accumulation and better-quality labor. But when you really go to the numbers, neither of these things can explain anywhere close to the full change, so most growth has got to be from technological progress, broadly defined. That is the main difference between the world of today and the world of 2000 years ago.

In your piece, you distill it to a single word: ideas.

That’s right.

Why is economic growth important?

In any given year, it seems like getting another percentage point of growth couldn’t make much difference. You barely even notice it. And yet, as many people have pointed out, when you compound an extra percentage point of growth per year over the course of 100 years, it’s the difference between poverty and riches. And riches are what allow you to buy free time. Riches are what allow you to buy culture, to save your child from worms.
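The arithmetic behind this point is easy to check. A minimal sketch, using illustrative growth rates of 2 and 3 percent (the specific numbers are assumptions, not figures from the podcast):

```python
# Illustrative only: how one extra percentage point of annual growth
# compounds over a century. Rates are assumed for the example.
base_rate, extra_rate, years = 0.02, 0.03, 100

base = (1 + base_rate) ** years    # growth factor at 2% per year
extra = (1 + extra_rate) ** years  # growth factor at 3% per year

print(f"2% for {years} years: {base:.1f}x starting income")
print(f"3% for {years} years: {extra:.1f}x starting income")
```

At 2 percent a year, incomes roughly septuple over a century; at 3 percent, they multiply about nineteenfold—nearly triple the outcome from a single extra point of growth.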

Right. So economic growth is an increase in wealth, it comes from new ideas, and ultimately, it is highly correlated with things like better infrastructure, better hospitals, and so on.

Absolutely.

What is the purported relationship between education and growth?

The normal view is that education is the crucial determinant of growth, that it turns unskilled humans into the skilled workers of the modern economy. This is an idea not just from politicians, teachers, and the general public, but also from economics. If you take a class in economics, they will constantly talk about how it’s important to have lots of education because that’s how we build human capital.

So, the purported relationship is that education creates human capital, which creates new ideas and thus more growth?

That’s one version. The more common one is simply that education leads to human capital, which immediately leads to growth. The typical college grad isn’t going to invent anything, but they’re capable of being a more valuable cog in the machine.

Right, so the standard inference is that if you have a more educated workforce, they can accomplish more sophisticated tasks. What does the evidence show?

So, I have a book called The Case Against Education, and I’m not going to be coy about this: I expected to find that education was overrated. However, I also expected to find that a lot of other people researching would say they had clear evidence that education raises economic growth.

However, when I read all the mainstream work on education, there was a big debate about “how come we’re not finding what we know to be true, which is that education is the crucial cause of economic growth?” I think that they are finding the truth, which is that education isn’t a factory for building human capital, but a certification machine for stamping people: good worker, great worker, not so great worker. People like to think about education as a way of building skills, but actually, it’s more like a passport to the real training, which happens on the job.

So, by going to university, you are offering your employer a sign that you are intelligent and conscientious enough to do so.

You’re showing intelligence, conscientiousness, and also conformity. There’s no “I” in team. Most jobs require you to follow a chain of command to achieve the goal of the group. While on some level I don’t like conformity, on a deeper level it’s really important for most purposes.

I want to read you something that you wrote. “Contrary to conventional stories about the positive externalities of education, mainstream estimates of education’s national rate of return were consistently below estimates of education’s individual rate of return.”

What does that mean?

Great question.

A rate of return is basically a measure of how good an investment is. So, for example, you might try to calculate the rate of return of putting extra insulation on a house. We can do the same for education and figure out how all the costs of education compare to the payoffs.

When you do this from the point of view of an individual person, it’s pretty common to get a 10 percent inflation-adjusted rate of return. In my book, I say this is probably too high, but you can bring it down to maybe 7 or 8 percent.

We can also think about this at the level of the country. What if we raise the education level of the whole workforce of a country by a year? How much does that enrich the country? What that quote is saying is that even the high estimates of how much a year of education does for a country are typically around half of what it does for an individual. And a lot of the estimates find that sending the whole country to school for an extra year increases national income by 1 or 2 percent.
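To make the gap concrete, here is a small sketch with hypothetical numbers (the income figure and exact rates are assumptions chosen to match the ranges discussed above, not data from the interview):

```python
# Hypothetical illustration of the individual-vs-national gap:
# schooling can pay off well for one person (via the credential)
# while adding comparatively little to national income.
individual_return = 0.08   # ~8% earnings gain per extra year, for one person
national_return = 0.015    # ~1.5% income gain per extra year, country-wide

income = 50_000  # assumed annual income for the example

print(f"Individual gain: ${income * individual_return:,.0f} per year")
print(f"National gain:   ${income * national_return:,.0f} per person per year")
```

On these assumed numbers, an extra year of school raises one person's earnings by about $4,000 a year, but schooling the whole country for an extra year raises average income by only about $750 per person—the signaling story's explanation for the shortfall.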

In other words, a stamp is a good way for one person to get ahead in life, but stamping the whole country does not help that country get ahead; it just creates credential inflation. You need more and more degrees in order to get the same job that your parents and grandparents got with fewer.

Let’s talk a little bit about innovation. Where do new ideas come from? Are we talking about a very small group of individuals who share certain characteristics?

It’s an exaggeration to say that innovation only comes from a few people. There are millions of small-scale improvements coming from many different people. Opening a new kind of restaurant is not revolutionary R&D, but so much of the improvement in our living standards comes from these small acts of entrepreneurship. When I was in high school, there were only three kinds of restaurants: American, Italian, and Chinese. Now we have a cornucopia of different cuisines. The same goes for so many other simple products. Dog collars now come in 100 more varieties than they did back when I was growing up in the ’80s.

However, the really revolutionary stuff—new vaccines, new business models, new forms of energy—comes from very special people. I think it’s reasonable to say that almost all the really big ideas are coming out of the top sliver of the IQ distribution. There was a psychologist named Lewis Terman in California who, I believe, in the 1920s, noticed that a standardized test was administered to all the kids in California’s state school system. He managed to get data on the top hundred scorers in the whole state of California in that year, and he followed them through life. In his honor, these kids are named the termites, and there’s been a lot of research on them.

While the vast majority of this group didn’t do anything really impressive, they had many times, maybe a thousand times, the normal rate of stellar success. So, just doing these kinds of tests is a good way of identifying the most promising people. At a minimum, just have a system where you basically let children advance as rapidly as they’re capable of. A lot of very intelligent people feel very isolated from their own age group, and it makes sense just to advance them as far as their talent will take them.

I have a personal view, which is that our society is very open to the idea of the STEM prodigy, but we are very closed to the idea of there being a prodigy in, say, history. And I think that there are history prodigies. I have met kids with not just a broad, but a deep understanding of history by the time they’re 13 or 14. People think it’s crazy to put them in a PhD program in history when they’re 14 years old, but I don’t. Why not skip that kid ahead and let him become a star? Look, maybe he wants to be a regular 12-year-old even though he is a genius, but maybe he doesn’t. Maybe he wants to be with a peer group of geniuses. Let’s pave the way for him if that’s what he wants.

Do you think that AI will allow us to continue innovating if the population starts declining?

There was a long period where people working on AI kept over-promising and under-delivering. I would personally hear extravagant claims and check them out and find that they weren’t true. Finally, about two years ago, they started being correct. I was as shocked as anyone. I actually have a bet out about AI, which I’m probably going to lose. It’s embarrassing because I have otherwise a perfect public betting record.

That said, one incredible achievement does not mean that they’re going to have a whole series of incredible achievements. And there’s a lot to the idea that AI is basically just amazing at compiling what has already been said rather than truly coming up with new stuff. While it’s not impossible for it to get better, a lot better, it’s also not guaranteed.

Another thing worth pointing out is that we’ve had, by many measures, falling rates of innovation despite a rising population. There’s an idea that we’ve already discovered a lot of the low-hanging fruit, and so we need to keep multiplying our efforts to maintain the same rate of growth. Another plausible story is that we have doubled the number of people that we call researchers, but really only the best ones count, and the other ones are kind of fake.

Given that much of the money we spend on education is spent poorly or even counterproductively, what should we do with the money instead?

I’m totally on board with giving it back to the taxpayers or just paying down the national debt. We badly need austerity. We are driving at 100 miles per hour towards a brick wall, but there’s still time to change course and get our foot on the brakes. One of the easiest ways of doing that is by spending less on education.

Is education more useful in the developing world?

Poor countries have a severe problem with teachers even showing up. They, on paper, have many years of education—I think Haiti now is around where France was in 1960—but mostly they are just throwing money at a corrupt system that doesn’t even teach basic literacy and numeracy. The way that people in the third world are learning to use technology is the way that almost all normal people learn anything, which is by doing.

It seems to me that we are doing the exact opposite. We are keeping people in the education system for many years, which could prevent them from starting to work and learning by doing.

Yeah. It would be much better if people started adult life at an earlier age. They’re totally ready for it. There’s no reason why 13- or 14-year-olds should not be working. One of the best ways to get kids to actually learn stuff, especially the kids who hate school, is to make it practical. They need to see concrete results and make money.

If you read biographies or autobiographies of people in earlier eras, it is amazing how far people got at young ages. By the age of 15, Malcolm X had worked four different jobs and been all over the country. Many people listen to me and say, “Oh, that’s so dystopian.” I think the system we have now is dystopian, where someone has to sit in a classroom until they’re 30 listening to some boring windbag talk about things he doesn’t even know how to do.

Bloomberg | Tertiary Education

College Is Actually Getting More Affordable

“The decline of the American system of higher education has many causes, several of which I have catalogued over the years, but one of the most popular reasons is overstated: cost. Higher education in America is becoming more affordable, as the laws of supply and demand are turning a crisis into a manageable problem.

As college became more expensive in the decades before and just after the turn of the century, students and their families adjusted. Many opted for a cheaper version of the basic product, such as state schools or junior colleges. Others went to vocational school or did something altogether different. In response to these market pressures, colleges have responded by making their product cheaper, as outlined in a new report from the College Board.

There are a lot of numbers, but here is the comparison I find most impressive: Adjusting for grants, rather than taking sticker prices at face value, the inflation-adjusted tuition cost for an in-state freshman at a four-year public university is $2,480 for this school year. That is a 40% decline from a decade ago.”

From Bloomberg.

Axios | Tertiary Education

AI Tutors Are Already Changing Higher Ed

“Generative AI is already transforming higher ed, giving students more access to professors’ expertise and boosting efficiency for both faculty and students in some fields.

Why it matters: For many college students, the world of ‘personal AI tutors for everyone’ promised by techno-optimists is already here.

The big picture: Computer science professors have had the most success with AI tutors in the classroom so far, mirroring the mass appeal of genAI as a coding assistant. Meanwhile, many educators outside of the STEM fields are more likely to view genAI with suspicion or skepticism.

State of play: In the two years since the release of ChatGPT, the conversations around its use in college classrooms have mostly focused on cheating. But some professors and their students are using it to boost individual learning and make education more equitable.”

From Axios.