More than a hundred million people watched the Patriots' historic Super Bowl comeback on their TVs last night.
Virtually every American could tune in, as 96% of households own a television. In fact, having a TV has become so normal that hardly anyone thinks about the remarkable development this technology has undergone over the last century.
The first commercially available TVs appeared almost 90 years ago. Yet they were of little use in their early days because broadcasting companies were few. The first televised football game, in 1939, was captured by a single camera and reached only about 1,000 television sets.
It wasn’t until the late 1940s that TV channels began transmitting shows on a regular basis. In the 1950s, color television slowly became available, but high prices for electronic color sets prevented large-scale sales until the 1960s. Video recorders, too, were unheard of until the 1970s, so everything had to be watched live.
Nevertheless, the number of TVs in the United States grew steadily, from 6,000 sets in 1946 to 12 million in 1951. By 1955, half of American households had a television, inspiring broadcasting channels to pop up all around the country. Even so, a television long remained a precious luxury good: the average American in 1976 had to work 60 hours to afford a new TV. Today, only 6 hours of work buy a new television. As with many consumer goods, a TV has grown steadily more affordable.
Meanwhile, the average worker today works 300 hours less per year than his counterpart did in 1950. He can now allocate that time elsewhere: to read a book, play with his kids, or watch the Super Bowl.
Human Progress is a project of the Cato Institute that seeks to educate the public on global improvements in wellbeing by providing free empirical data on long-term developments.