Anyone who follows technology is familiar with Moore's Law and its many variations, and has come to expect the price of computing power to halve every 18 months. But many people don't see the true long-term impact of this beyond the need to upgrade their computer every three or four years. To not internalize this more deeply is to miss financial opportunities, grossly mispredict the future, and be utterly unprepared for massive, sweeping changes to human society. Hence, it is time to update the first version of this all-important article that was written on February 21, 2006.
Today, we will introduce another layer to the concept of Moore's Law-type exponential improvement. Consider that on top of the 18-month doubling times of both computational power and storage capacity (which compounds to an annual improvement rate of 2^(12/18) ≈ 1.59, or 59% a year), both of these industries have grown by an average of approximately 12% a year for the last fifty years. Individual years have ranged between +30% and -12%, but let us say that the trend growth of both industries is 12% a year for the next couple of decades.
So, we can conclude that a dollar buys 59% more power each year, and 12% more dollars are absorbed by such exponentially improving technology each year. If we combine the two growth rates, to capture exponential improvement and technology diffusion simultaneously, we get (1.59)(1.12) = 1.78.
The Impact of Computing grows at a scorching pace of 78% a year.
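For readers who like to verify the arithmetic themselves, here is a minimal sketch of the compounding, using only the figures above (the code is illustrative, not a model):

```python
# Back-of-the-envelope arithmetic behind the 78% figure.

# An 18-month doubling time compounds to a ~59% annual gain:
annual_improvement = 2 ** (12 / 18) - 1      # ~0.587, i.e. ~59% per year

# Dollars flowing into these technologies grow ~12% per year:
annual_spending_growth = 0.12

# Total "Impact of Computing" growth multiplies the two rates:
combined = (1 + annual_improvement) * (1 + annual_spending_growth) - 1
print(f"{annual_improvement:.0%} more power per dollar x "
      f"{annual_spending_growth:.0%} more dollars = {combined:.0%} per year")
```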
Sure, this is a very imperfect method of measuring technology diffusion, but many visible examples of this surging wave present themselves. Consider the most popular television shows of the 1970s, where the characters had all the household furnishings and electrical appliances that are common today, except for anything with computational capacity. Yet, economic growth has averaged about 3.5% a year since that time, roughly doubling the real standard of living per person in the United States since 1970. It is obvious what has changed during this period to induce the economic gains.
We can take the concept even closer to the present. Among 1990s sitcoms, how many plot devices would no longer exist in the age of cellphones and Google Maps? Consider the episode of Seinfeld entirely devoted to the characters being unable to find their car, or each other, in a parking structure (1991), or the legendary bit from a 1991 episode set in a Chinese restaurant. These situations are simply obsolete in the era of cellphones. This situation (1996) would be obsolete in the era of digital cameras, while the 'Breakfast at Tiffany's' situation would be obsolete in an era of Netflix and YouTube.
In the 1970s, there was virtually no household product with a semiconductor component. In the 1980s, many people bought basic game consoles like the Atari 2600, had digital calculators, and purchased their first VCR, but only a fraction of the VCR's internals, maybe 20%, consisted of exponentially deflating semiconductors, so VCR prices did not drop much per year. In the early 1990s, many people began to have home PCs. For the first time, a major, essential home device was pegged to the curve of 18-month halvings in cost per unit of power. In the late 1990s, the PC was joined by the Internet connection and the DVD player.
Now, I want everyone reading this to tally up all the items in their home that qualify as 'Impact of Computing' devices, that is, any hardware device for which a much more powerful/capacious version will be available at the same price in 2 years. You will be surprised at how many devices you now own that did not exist in the 80s or even the 90s.
Include: Actively used PCs, LCD/Plasma TVs and monitors, DVD players, game consoles, digital cameras, digital picture frames, home networking devices, laser printers, webcams, TiVos, Slingboxes, Kindles, robotic toys, every mobile phone, every iPod, and every USB flash drive. Count each car as 1 node, even though modern cars may have $4000 of electronics in them.
Do not include: Tube TVs, VCRs, film cameras, individual video games or DVDs, or your washer/dryer/oven/clock radio just for having a digital display, as the product is not improving dramatically each year.
If this doesn't persuade people of the exponentially accelerating penetration of information technology, then nothing can.
To summarize, the number of devices in an average home that are on this curve, by decade:
1960s and earlier: 0
1970s: 0-1
1980s: 1-2
1990s: 3-4
2000s: 6-12
2010s: 15-30
2020s: 40-80
The average home of 2020 will have multiple ultrathin TVs hung like paintings, robots for a variety of simple chores, VR-ready goggles and gloves for advanced gaming experiences, sensors and microchips embedded into clothing, $100 netbooks more powerful than $10,000 workstations of today, surface computers, 3-D printers, intelligent LED lightbulbs with motion-detecting sensors, cars with features that even luxury models of today don't have, and at least 15 nodes on a home network that manages the entertainment, security, and energy infrastructure of the home simultaneously.
At the industrial level, the changes are even greater. Just as telephony, photography, video, and audio did before them, the medicine, energy, and manufacturing industries will become information technology industries, and thus advance at the rate of the Impact of Computing. The economic impact of this is staggering. Refer to the Future Timeline for Economics, particularly the 2014, 2024, and 2034 entries. Deflation has traditionally been a bad thing, but the Impact of Computing has introduced a second form of deflation. A good one.
It is true that from 2001 to 2009, the US economy actually shrank in size, if measured in oil, gold, or Euros. To that, I counter that every major economy in the world, including the US, has grown tremendously if measured in Gigabytes of RAM, Terabytes of storage, or MIPS of processing power, all of which have fallen in price by about 40X during this period. One merely has to select a suitable product, such as a 42-inch plasma TV, to see how quickly purchasing power has risen: what took 500 hours of median wages to purchase in 2002 took just 40 hours of median wages by 2009. Pessimists counter that computing is too small a part of the economy for this to be a significant prosperity elevator. But let's see how much of the global economy is devoted to computing relative to oil (let alone gold).
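For the skeptical, the 40X figure falls directly out of the 18-month halving time, and the plasma TV numbers (cited above) tell a similar story. A quick sketch:

```python
# Sanity check: how far should prices per unit of computing power fall
# from 2001 to 2009, given an 18-month halving time?
years = 2009 - 2001
price_drop = 2 ** (years / 1.5)               # halvings compound: 2^(8/1.5)
print(f"Price per unit of power fell ~{price_drop:.0f}X")    # ~40X

# The 42-inch plasma TV example, measured in hours of median wages:
hours_2002, hours_2009 = 500, 40
print(f"Purchasing power rose {hours_2002 / hours_2009:.1f}X in 7 years")
```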
Oil at $50/barrel amounts to about $1500 Billion per year out of global GDP. When the price of oil rises, demand falls, and we have not seen oil demand sustain itself at a level that elevates annual consumption beyond $2000 Billion per year.
Semiconductors are a $250 Billion industry and storage is a $200 Billion industry. Software, photonics, and biotechnology are deflationary in the same way as semiconductors and storage, and these three industries combined are another $500 Billion in revenue, but their rate of deflation is less clear, so let's take just half of this number ($250 Billion) as suitable for this calculation.
So $250B + $200B + $250B = $700 Billion that is already deflationary under the Impact of Computing. This is about 1.5% of world GDP, and is a little under half the size of global oil revenues.
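The tally itself is simple arithmetic; here it is in a few lines, assuming world GDP of roughly $50 Trillion circa 2009 (an assumption, but one consistent with the ~1.5% figure above):

```python
# Tallying the deflationary sectors against oil (figures in $ Billions, from the text):
semiconductors = 250
storage = 200
software_photonics_biotech = 500 / 2   # take half, since their deflation rate is less clear

deflationary = semiconductors + storage + software_photonics_biotech   # $700B
world_gdp = 50_000    # assumption: ~$50 Trillion world GDP circa 2009
oil_revenue = 1_500   # oil at ~$50/barrel, from above

print(f"Deflationary sectors: ${deflationary:.0f}B "
      f"= {deflationary / world_gdp:.1%} of world GDP, "
      f"{deflationary / oil_revenue:.0%} the size of oil revenues")
```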
The impact is certainly not small, and since the growth rate of these sectors is higher than that of the broader economy, what about when they become 3% of world GDP? 5%? Will this force of good deflation not exert influence on every set of economic data? At the moment, it is all but impossible to get major economics bloggers to even acknowledge this growing force. But over time, it will be accepted as a limitless well of rising prosperity.
12% more dollars spent each year, with each dollar buying 59% more power each year. Combine the two, and the Impact of Computing grows by 78% every year.
Related:
A Future Timeline for Economics
Economic Growth is Exponential and Accelerating
Pre-Singularity Abundance Milestones
The Technological Progression of Video Games