The rate of technological change has been considerably slower than its trendline ever since the start of the 21st century. I wrote about this back in 2008, but at the time I did not have techniques for observing and measuring the gap between the rate of change and the trendline that are as advanced as those I have now.
The dot-com bust coincided with a trend toward lower nominal GDP (since everyone wrongly focuses on 'real' GDP, which has less to do with real-world decisions than nominal GDP), and technological change, despite sporadic bursts, has since generally progressed at only 60-70% of its trendline rate. For this reason, many technologies that seemed just 10 years away in 2000 have still not arrived as of 2014. I will write much more on this at a later date.
But for now, two overdue technologies are finally plodding towards where many observers thought they would have been by 2010. Nonetheless, they are highly disruptive, and will do a great deal to change many industries and societies.
1) Artificial Intelligence :
A superb article by Kevin Kelly in Wired Magazine describes how three simultaneous breakthroughs have greatly accelerated the capabilities of Artificial Intelligence (AI). Most disruptions are the result of two or more seemingly unrelated technologies each crossing certain thresholds, and observers tend to be surprised because each group of observers was following only one of the technologies. For example, the iPod emerged when it did because storage, processing, and the ability to store music as software all reached certain cost, size, and power-consumption thresholds at around the same time.
What is interesting about AI is how greatly it can expand the capabilities of those who know how to incorporate AI with their own intelligence. The greatest chess grandmaster of all time, Magnus Carlsen, became so by training with AI, and it is unclear whether he would have become this great had he lived before such technologies were available.
The recursive learning aspect of AI means that an AI quickly learns more from each new person who uses it, which makes it better still. One very obvious area where this could be used is medicine. Currently, millions of MD general practitioners and pediatricians are seen by billions of patients, mostly for relatively common diagnoses and treatments. If a single AI can learn from enough patient inputs to handle the most common diagnostic work of doctors, that is a huge cost savings to patients and the entire healthcare system. Some doctors will see their employment prospects shrink, but the majority will be free to move up the chain and focus on more serious medical problems and questions.
Another obvious use is in the legal system. On one hand, while medicine is universal, the legal system of each country is different, and lawyers cannot cross borders. On the other hand, the US legal system relies heavily on precedent, and there is too much content for any one lawyer or judge to manage, even with legal databases. An AI can digest all laws and precedents and create a huge increase in efficiency once it learns enough. This can greatly reduce the backlog of cases in the court system, and free up judicial capacity for the most serious cases.
The third obvious application is in self-driving cars. Driving is an activity in which the full range of traffic situations that can arise is not a particularly huge amount of data. Once an AI gets to the point where it has analyzed every possible accident, near-accident, and reported pothole, it can easily make self-driving cars far safer than human drivers. This is already being worked on at Google, and is only a few years away.
Get ready for AI in all its forms. While many jobs will be eliminated, the losses will be exceeded by the opportunity to add AI to your own life and your own capabilities. Make your IQ 40 points higher than it is when you need it most, and your memory three times as deep - all will be possible in the 2020s for those who learn to use these capabilities. In fact, being able to augment your own marketable skills through the use of AI might become one of the most valuable skillsets for the post-2025 workforce.
2) Virtual Reality/Augmented Reality :
Longtime readers recall that in 2006, I correctly predicted that by 2012-13, video games would be a greater source of entertainment than television. Now, we are about to embark on the next phase of this process, as a technology that has had many false starts for over 20 years might finally be approaching reality.
Everyone knows that the Oculus Rift headset will be released to consumers in 2015, and that most who have tried it have had their expectations exceeded. It reportedly corrects many of the problems that have dogged VR/AR developers for two decades, and offers high resolution.
But entertainment is not the only use for a VR/AR headset like the Oculus Rift, for the immersive medium that the device facilitates has tremendous potential for use in education, military training, and all types of product marketing. Entirely new processes and business models will emerge.
One word of caution, however. My decade of direct experience running a large division of a consumer technology company compels me to advise you not to purchase any consumer technology product until it is in its third generation of consumer release, which is usually 24-48 months after the initial release. The reliability and value for money are usually not compelling until the third generation. Do not mistake fractional generations (e.g. 'version 1.1', or 'iPhone 5, 5S, and 5C') for actual generations. The Oculus Rift may be an exception to this norm (as are many Apple products), but in general, don't be an early adopter on the consumer side.
Update (5/27/2016) : The same Kevin Kelly has an equally impressive article about VR/AR.
Combining the Two :
Imagine, if you would, that the immersive movies and video games of the near future are not just fully actualized within the VR of the Oculus Rift, but that the characters of the game adapt via a connection to an AI, so that game characters far too intelligent to be overcome by hacks and cheat codes emerge.
Similarly, imagine if various forms of training and education are not just improved via VR, but augmented via AI, where the program learns exactly where the student is having a problem, and adapts the method accordingly, based on similar difficulties from prior students. Suffice it to say, both VR and AI will transform medicine from its very foundations. Some doctors will be able to greatly expand their practices, while others find themselves relegated to obsolescence.
Two overdue technologies are finally on our doorstep. Make the most of them, because if you don't, someone else surely will.
Related :
The Next Big Thing in Entertainment
The Impact of Computing : 78%/year
Speaking of "overdue", it is a pleasure to have one of your very interesting essays appear at long last, sir.
Many thanks for your very good work.
Posted by: Mal | December 22, 2014 at 06:40 AM
Hi Futurist,
Are you saying in the beginning of this post that slower than expected economic growth has delayed the rollout of new technologies? I think that is the meaning but would like to confirm.
I am very much looking forward to your future posting on changes in expected technological timelines.
Thanks,
Drew
Posted by: Drew | December 27, 2014 at 01:18 PM
At some point VR pornography will be huge. If it gets to where sex with a VR 10 is better than sex with a real-world 7, there will be serious social changes. Where social media greatly increased a woman's value and gave women unlimited access to men, this might bring about the opposite effect. It might make men tune out and force women to make changes in their behavior.
Posted by: Lee | December 27, 2014 at 02:38 PM
Drew,
Are you saying in the beginning of this post that slower than expected economic growth has delayed the rollout of new technologies? I think that is the meaning but would like to confirm.
Sort of. Rather, while 'real' growth has slowed from 3% to 2%, nominal growth (which is more relevant to real-world decisions) has slowed from 6% to just 3%. This means all investments take much longer to break even, and a smaller percentage of tech startups succeed. This slows down the VC cycle, and the problem propagates.
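To put rough, purely illustrative numbers on that (a minimal sketch with hypothetical figures, not data from the post): consider a revenue stream that simply tracks nominal GDP. Halving nominal growth roughly doubles the time needed to reach any given cumulative milestone, such as a doubling of revenue.

```python
import math

# Purely hypothetical illustration: a revenue stream that simply tracks
# nominal GDP. How long until it doubles at different nominal growth rates?
def doubling_time(nominal_growth):
    return math.log(2) / math.log(1 + nominal_growth)

print(round(doubling_time(0.06), 1))  # ~11.9 years at 6% nominal growth
print(round(doubling_time(0.03), 1))  # ~23.4 years at 3% nominal growth
```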
Since 2002 or so, technology has only progressed at 60-70% of its trendline rate. Eventually, technology will move back to its trendline rate, toppling legislation and governments that have been choking it back.
Posted by: The Futurist | December 27, 2014 at 05:48 PM
Thanks for the follow-up. Fascinating that a consequence of the strong deflationary impact of technology may be delays to the roll-out of additional technology.
A secondary deflationary impact of technology may be that it leads to an older population, which both has a deflationary impact and may lead to more conservative investing and slower adoption of new technologies.
Thanks,
Drew
Posted by: Drew | December 28, 2014 at 11:09 AM
Drew,
Yes. But remember that technology follows a trendline that it always reverts to, and this supersedes nation-states. If deflation is a problem, technology will find a way to force central banks to print more money.
Also, if AIs themselves become users of technology, the lower birthrates of humans may also diminish as a factor in keeping technological progress constrained.
The process of technology correcting back to its trendline (it has been well below its trendline for 14 years now, leading to many expected advances being 4-8 years behind schedule) could be volatile.
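As a quick back-of-the-envelope check (a rough sketch using the 60-70%-of-trendline figure from the article above, not a precise model), running at that fraction of the trendline pace for 14 years leaves progress roughly 4-6 years behind schedule, consistent with the lower end of the 4-8 year range:

```python
# Back-of-the-envelope: cumulative "trendline-years" of progress achieved
# when actual progress runs at only a fraction of the trendline pace.
years_elapsed = 14
for fraction in (0.60, 0.70):
    progress = years_elapsed * fraction        # progress, in trendline-years
    years_behind = years_elapsed - progress    # schedule slippage
    print(f"{fraction:.0%} of trendline for {years_elapsed} years: "
          f"~{years_behind:.1f} years behind schedule")
# 60% -> ~5.6 years behind; 70% -> ~4.2 years behind
```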
Posted by: The Futurist | December 29, 2014 at 09:29 AM
I don't think technology will "force central banks to print more money" - that seems a bit too much of an anthropomorphism of technology - but I can see both investors learning how to take deflation into account in their technology investments as well as technological feedback cycles causing faster change with less investment.
Cloud computing would be one example of how one technological/business advance allows for faster developments in other areas. Artificial smartness on tap will be another one.
This could definitely cause a "snap-back" to long term trends.
Thanks,
Drew
Posted by: Drew | December 29, 2014 at 11:17 AM
Technology will cause deflation which causes market corrections and debt crises that force more money-printing by central banks, under duress.
Central banks will not connect the effect to technology, as they are already baffled as to why there is deflation despite over $13T of printing in the last 5 years...
In the old days, job growth caused inflation, but now job growth also causes more consumption of technology, which causes deflation that perhaps offsets wage-pressure inflation.
Posted by: The Futurist | December 29, 2014 at 11:19 AM
The Futurist,
Part of the reason that we don't see much inflation is that the money the banks got was mainly to cover bad bets (debts). In other words, an accounting fiction.
BTW job growth does not cause inflation. Inflation is a monetary phenomenon. It is the money in CIRCULATION that counts. Bad debts don't circulate.
You might also want to look into the Black Economy. A subject well covered by Catherine Austin Fitts ("Narco Dollars"). The Black Economy can cover a LOT of misallocation because its money enters the "real" economy at a steep discount. Something like 30% off.
However, I think the PTTB have decided to cut that off. Newt Gingrich - once an ardent Prohibitionist - supported the decriminalization of heroin and cannabis in California. It passed.
Posted by: MSimon | January 04, 2015 at 07:04 AM
MSimon,
The effect of bad debts on causing deflation has been a traditional factor, but that is now accompanied by the technological factor, which was far too small before 2007 or so, but is now significant.
This is especially true since deflation is picking up even after more and more cumulative printing. If this deflation was due to bad debts alone, the worst of it would have been long past by now.
The world is still printing over $125B/month, and yet there is deflation. In 2010, there was higher inflation with just $75B/month being printed.
Even more importantly, the next crisis (~2017) will be one where the debt level is not much higher than now, but where current QE-style bond-buying is no longer effective, due to not enough diffusion of the printing.
Posted by: The Futurist | January 04, 2015 at 07:24 PM
I've often wondered about two things:
1) Is there a psychological limit to the rate of technological progress? That is to say, we, as humans, can only adapt to change so quickly before we become overwhelmed? Perhaps our ability to absorb and process change is the limiting factor right now. As exhibit A: Uber. Uber allows anyone to set up and run their own cab company, quite successfully, with much less wasted labor and capital. So, naturally, cab companies and cities are fighting them tooth and nail.
Now imagine Uber with driverless cars. Twenty people could run 1,000 cabs in five cities. That is going to be a bomb going off.
As you said, great inventions require the integration of a variety of advancements, but perhaps our ability to understand and integrate so many advancements is the limiting factor, along with our ability to adjust our laws and cultural norms.
2) I assume that as we near the technological singularity there is going to be turbulence. Sudden slowdowns and speed-ups in the rate of change rather than a smooth progression. We are nearing the eye of the storm, where things get violent. Models always have smooth curves, while reality is unfortunately very bumpy.
You talk about self driving cars, but very soon I expect we'll have self driving planes. And the sex industry - it already changed significantly with technology (porn theaters, sex shops, and street walkers are virtually gone from advanced cities). I would not have predicted that, yet in retrospect it is obvious. The point is, the impacts are hard to predict precisely, and occur unevenly, other than in retrospect.
Posted by: Geoman | January 05, 2015 at 04:09 PM
Geoman,
1) Is there a psychological limit to the rate of technological progress?
Maybe, but this can vary a lot by person. Many people are already unable to adjust, whereas others of us may want change to be even faster.
But overall, I don't think resistance can do much to slow technology down. All that happens is that the disruption moves higher up the chain, or to a different country. Even government, when approached for protectionism by an incumbent, will have to choose among the 5, 6, 7, or 10 industries seeking protection. Only the top 1-2 most connected industries will get government intervention, while the rest are left to be disrupted.
What is notable is how many industries are running to the govt. That alone indicates an overwhelming tsunami of disruption.
(porn theaters, sex shops, and street walkers are virtually gone from advanced cities)
I still see them on Broadway St. in SF, and in Las Vegas. But the Oculus Rift might indeed eliminate all that (as well as leave the bottom 80% of women wondering why men don't notice them).
Posted by: The Futurist | January 05, 2015 at 04:36 PM
Wow! A new article. Cool! (Now I'll go back and read it)
Posted by: Zyndryl | January 05, 2015 at 04:47 PM
*smirk*
Good stuff. But more details are needed. For example, 80% of women won't suddenly be ignored. They will suddenly have to work 10x harder just to get the same validation and most of the guys that yesterday carried their stuff for them won't even be interested in answering their txts.
I don't think marriage will survive as an institution. Oh well...
Posted by: ysg | January 07, 2015 at 08:16 PM
Following up on Geoman's point - I think a second brake on technologies and transformation is our current schooling system and related expectations. With college as the new high school - and on its way to being the new 8th grade - people are taking more time before getting out in the world and changing things. Gates and Jobs are called out because they are outliers; more common are Thiel and Musk, with their multiple degrees. In contrast, Carnegie and Edison started working at age 13 and were not outliers.
There are ways to shorten this time period, as you mentioned in the previous article, and we will find a way to get education back to transformation and away from credentialing - which will allow for some more snapback in change when it occurs.
Posted by: Drew | January 16, 2015 at 05:30 AM
Drew,
The most productive people tend to find a way to be productive despite the decline in standards of mass-market education. The smartest still reach the top, while the individual with a 'Women's Studies' degree who consumed $200K of taxpayer money despite having no real education to speak of would not have been productive in any other era either.
Those with the common sense to tie education costs to earnings prospects will rapidly gravitate to MOOCs like the GATech MSCS degree. Those easily derailed by credentials were not likely to add real value anyway.
Also consider rising lifespan as a factor extending the careers of the most productive people.
Posted by: The Futurist | January 18, 2015 at 05:06 PM
The Futurist wrote:
Similarly, imagine if various forms of training and education are not just improved via VR, but augmented via AI, where the program learns exactly where the student is having a problem, and adapts the method accordingly, based on similar difficulties from prior students.
An AI-augmented VR tool that incorporates spaced-repetition for maximum learning speed would be a significant advance. I've seen various spaced repetition applications such as Anki, Mnemosyne, and Supermemo. These are all flashcard systems that are good for semantic learning (memorizing facts and rules) but aren't as useful for procedural forms of learning (learning by doing). Something that is able to optimally schedule reviews for both types of learning would be ideal.
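For reference, here is a minimal sketch of the kind of review scheduling those flashcard systems perform, loosely modeled on the SM-2 family of algorithms; the parameters below are illustrative, not the exact values used by Anki, Mnemosyne, or Supermemo.

```python
# A purely illustrative, simplified SM-2-style scheduler (parameters are
# hypothetical, not those of any specific flashcard application).
def next_review(interval_days, ease, grade):
    """Return (new_interval_days, new_ease) after a review.
    grade: 0-5 self-assessment of recall quality."""
    if grade < 3:                       # failed recall: start the card over
        return 1, ease
    if interval_days == 0:              # first successful review
        new_interval = 1
    elif interval_days == 1:
        new_interval = 6
    else:
        new_interval = round(interval_days * ease)
    # Ease factor drifts with recall quality, floored at 1.3
    new_ease = max(1.3, ease + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
    return new_interval, new_ease

# Example: a card reviewed successfully three times, then forgotten once
interval, ease = 0, 2.5
for grade in (4, 5, 3, 2):
    interval, ease = next_review(interval, ease, grade)
    print(interval, round(ease, 2))
```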
The greatest chess grandmaster of all time, Magnus Carlsen, became so by training with AI, and it is unclear whether he would have become this great had he lived before such technologies were available.
The Wired article also mentions Garry Kasparov, who after his 1997 defeat by Deep Blue came up with the idea of human-plus-machine competitions. In the 2014 championship Freestyle Battle, human + machine competitors (nicknamed "centaurs") won 53 games compared to the pure AI engines, which won 42. This points to the potential of intelligent human/machine couplings.
Posted by: Ray Manta | February 02, 2015 at 03:26 PM
The Futurist wrote:
The rate of technological change has been considerably slower than its trendline ever since the start of the 21st century
Some have come to believe that significant technological progress has ended, or is close to ending. Not that I agree with them, but their arguments are worth addressing.
Examples:
(1) Charles Murray (author of Human Accomplishment). He believes per capita innovation has been slowing down since the late 19th century. Of course, the confounding variables here are that the population is many times larger than in the 19th century and that innovation works like a ratchet - once something is invented, others copy it and it isn't forgotten.
(2) Gunther Stent. Wrote "The Coming of the Golden Age" and "Paradoxes of Progress" where he predicted the end of technological progress.
(3) Jonathan Huebner has stated that the rate of technological innovation peaked in 1873 and has been declining. He forecasts a medieval (per capita) level of innovation around 2038.
http://www.kheper.net/topics/singularity/critiques.html
My own beliefs are as follows:
(1) I agree we've recently emerged from a period of relative stasis.
(2) Technological progress will eventually come to an end, but not yet. The limits will be determined by a combination of physical laws and economic feasibility.
(3) Temporary slowdowns and impasses may confuse the issue. Sometimes before technology A becomes practical, technology B needs to be advanced to a sufficient level to support it. For example, smartphones would not be practical without the battery technology to enable them.
Posted by: Ray Manta | February 02, 2015 at 04:44 PM
Ray Manta,
Thanks. A few thoughts :
(1) Charles Murray (author of Human Accomplishment). He believes per capita innovation has been slowing down since the late 19th century.
This is clearly untrue, since world annual productivity growth is about 2-3%/year, while it was under 1%/year in the 19th century. Almost 2% of world GDP now is comprised of Moore's Law-type rapid deflation and exponential improvement. So it is certainly not slower than the 19th century, and has only briefly been slower than it was in the late 1990s. Even if there was a peak of innovation, it was 1999 rather than 1873 (human life expectancy also rose much more from 1873-present than in the thousands of years before 1873).
(1) I agree we've recently emerged from a period of relative stasis.
Partly, but I think things will still be slow until the next major recession (~2017) after which various structures thwarting progress will start to see cracks in the dam. By being below trendline for 14+ years now, the reversion back could be pretty dramatic. There are some technologies that are very overdue.
Sometimes before technology A becomes practical, technology B needs to be advanced to a sufficient level to support it.
This is why low nominal GDP (due to government obsession with inflation at the expense of real GDP growth) slows technology, since products that require many technologies get further delayed until the last piece of the puzzle is cost-effective.
(2) Technological progress will eventually come to an end, but not yet. The limits will be determined by a combination of physical laws and economic feasibility.
Perhaps. But it could be much later than we think, and even outlast humans altogether.
There are reasons to believe that the end result is that each elementary particle in the entire universe is adapted into a quantum computer, meaning the entire universe is then saturated with intelligence.
Even if it does not go that far, it could still be very distant.
Posted by: The Futurist | February 02, 2015 at 11:12 PM
The Futurist wrote:
(on slowing of progress compared to the 19th century)
This is clearly untrue, since world annual productivity growth is about 2-3%/year, while it was under 1%/year in the 19th century
In John McCarthy's pages on the sustainability of progress he states the belief that earlier inventions had a greater impact on life for the average individual than later ones. Of course, this isn't really the same as GDP growth.
His list of inventions is here:
http://www-formal.stanford.edu/jmc/future/comparison.html Please note the update he made in 2008 (he died in 2011). I think there was the realization that there's more to come.
I think things will still be slow until the next major recession (~2017) after which various structures thwarting progress will start to see cracks in the dam. By being below trendline for 14+ years now, the reversion back could be pretty dramatic.
I'm just hoping that the changes won't cause so much upheaval they screw up the lives of a large number of people.
(about the end of technological progress)
But it could be much later than we think, and even outlast humans altogether.
Yes, it could. We may hit one "plateau" after another, culminating in what you suggested. I'd settle for us colonizing the solar system, and then moving on to the rest of the galaxy.
Posted by: Ray Manta | February 04, 2015 at 08:51 PM
Ray Manta,
One thing about John McCarthy's list is that while many things were technically invented in the 19th century, their diffusion on a worldwide basis was much more recent. For example, electricity reached more people from 1978-present than in the 150+ years before 1978. Same goes for the automobile - more first-time automobile owners worldwide from 1980-present than in the century before 1980.
The second huge omission on McCarthy's part is how much improvement has happened in each invention he lists.
The average 2015 car is, adjusted for PPP, vastly better than the 1985 car and certainly the 1955 car or 1925 car. The cost of telephone calls across oceans went from extremely expensive even in 1975 (let alone 1890) to virtually free in 2015.
That he sees 'inventions' as binary events, and ignores the immense improvement that is ongoing for long after the initial invention, destroys his entire point.
Remember that the US is unusual compared to the rest of the world. For most countries in the world, electricity and cellphones reached the average person only 30-50 years apart (most rural areas in non first-world countries did not have reliable, 24/7 electricity until 1970 or later, and many places still don't)...
Here is a chart of technological diffusion by years since invention (and even this is a US-only chart).
While GDP might be an imperfect metric, the life-expectancy one is pretty close to universal. Things invented in the 19th century are still extending life expectancy today, but things invented more recently have yet to affect the bottom 95% of people.
I'm just hoping that the changes won't cause so much upheaval they screw up the lives of a large number of people.
Unfortunately, it will. Governments only react under duress. They rarely preempt situations proactively.
I'd settle for us colonizing the solar system, and then moving on to the rest of the galaxy.
I am pretty convinced that whatever intelligence from Earth colonizes space, it will not be humans, but rather some robots/AI. An AI does not need water or air (only easier-to-provide electric power), and, most importantly, can withstand a much wider band of temperature, pressure, gravity, and radiation than humans, and is more likely to agree to one-way trips. It is much more 'suited for space' in almost every way.
If the AI/robots later create contained environments for humans to live in (much as we keep fish tanks on land for fish), that is the only way humans will get there. But otherwise, initial exploration and colonization by humans is just not cost-effective relative to an AI (whether in a robotic body or not). The AI might even be contained in a very small computer, no larger than a pea, thus making it far cheaper to launch thousands of them vs. the craft that humans go up in (which weigh thousands of tons, including the booster rockets).
Posted by: The Futurist | February 05, 2015 at 03:38 PM
Colonize the galaxy with pea-brains?
Someone had to say it.
Posted by: Geoman | February 25, 2015 at 02:41 PM
This is absolutely astounding
http://www.ipsoft.com/amelia/
A digital personal assistant that is able to do all that a human can do, and learns to improve itself.
Am I wrong in thinking that this will lead to disproportionate technological unemployment of women? It is surely a threat to the jobs that women tend to aggregate in, such as receptionist, call centre, and PA roles.
Automation has usually been the unemployer of men. An interesting turn of events.
Posted by: Idiocraties | December 04, 2016 at 08:59 PM
Idiocraties,
That could very well be the case in a few years. I think that technology is not yet near mainstream-marketable quality, but it will not take too long to get there.
Of course, an entrepreneur (male or female) can create a new business with this service, eliminating the cost of humans doing that work.
Posted by: Kartik Gada | December 04, 2016 at 09:27 PM
Yes I agree, but in these times a lot can advance in 5 years.
I am aware that the last recession was called the "mancession" as it eliminated a great many male jobs in construction, manufacturing, and engineering, which were never really restored as the economy recovered. A lot of those jobs were permanently lost to automation.
Could it be that the next crash is focused on the automation of the service industry through things like the Amelia bot above? If so, the coming crash might be named the "femcession" - not a good thing, but will help balance the gender pay gap!
Loving your work as ever. Trying to spread the word here as much as I can. It is hard to make the predictions seem plausible to a layman who is not aware of acceleration.
Posted by: Idiocraties | December 05, 2016 at 08:03 AM
Could it be that the next crash is focused on the automation of the service industry through things like the Amelia bot above?
Quite plausibly. The more 'formed' a job is, the more prone it is to automation. Women have more jobs to lose at present from this trend. Plus, while the loss of male jobs was partly due to overt government actions to that end, the losses in the other direction are in fact a pushback against that same social engineering.
Posted by: Kartik Gada | December 05, 2016 at 09:53 AM
This I think is what causes such confusion in people. Everyone can grasp the concept of cars being built by robots, or even google translating language. But when it comes to typically warm, human roles such as a receptionist or secretary, it seems like black magic and will catch people totally unawares.
Posted by: Idiocraties | December 05, 2016 at 02:28 PM
Oh yes. See Chapter 4 of The ATOM : http://atom.singularity2050.com/4-the-overlooked-economics-of-technology.html
Posted by: Kartik Gada | December 05, 2016 at 03:37 PM