

Listed below are links to weblogs that reference Timing the Singularity:

Comments

Sublime Oblivion

I don't agree with your methodology. It assumes that the impact of computing as a percentage of GDP will increase at 6-8% a year, whereas the fact that products to which Moore's Law applies are becoming rapidly cheaper in real terms will increasingly constrain, and eventually reduce, their share of the economy - paradoxically, despite vast, low-cost computing power becoming universally available.

Just as GDP was a pretty useless metric for measuring / classifying pre-industrial societies, it will probably become increasingly inadequate for measuring the expanding "virtual world".

GK

S.O.,

Despite the price declines per unit performance, the dollar revenue of products that exhibit Moore's Law has still risen steadily. This is true of semis, storage, software, etc.

I don't think it will reduce their share of the economy. Rather, they will pervasively diffuse through the whole economy, reaching the 50% tipping point by the 2060s.

Geoman

Hmmmm.

Not a bad way of going about the prediction, but.....

The singularity is likely to be an extremely disruptive event. So disruptive, in fact, that it may either accelerate or decelerate its own appearance. Toward the end, in the last 10 years or so, the changes will be happening so fast that they will be hard to understand or assimilate.

Think of water in a pipe. The flow can increase without a problem. But beyond a certain threshold the flow of the water becomes turbulent. Turbulent flow, even though more energy is being applied, results in a lower throughput than laminar flow. Therefore there is an ideal velocity for water flow in a pipe.

However, there is one way to increase flow. Break the pipe completely. Water flows out in an uncontrolled manner, at a very high rate.

Now imagine society: we are the pipe. The water is change. We can tolerate an acceleration in the rate of change only up to a certain point, before society itself can't handle the rapidity of the change and the assimilation rate actually decreases (turbulent flow). If the rate of change keeps increasing, society may break apart, allowing more change to occur, but in all sorts of crazy directions.

Perhaps as soon as 2030, things are likely to get pretty crazy. Whether this helps or hurts the singularity is hard to predict. We could see either a rapid acceleration or a sudden deceleration in the rate of change.

Heck, who am I kidding? It is already crazy. China and India, after thousands of years, are modernizing in seemingly the blink of an eye. Third world countries are getting nukes. Financial systems are crashing. It is already all around us. The pipe is already showing cracks.

Expect more, much more.

GK

Geoman,

Yes. The point about the prediction wall/horizon at 2040 would support what you are saying. Try as I might, it is very hard to gain visibility beyond 2040, and other futurists are seeing the same wall. The period from 2040-60 could be described as 'turbulent flow'.

That is why I steer clear of any guesses about 'what' it may be like. 'When' is easy, but 'what', by definition, is hard.

Charly

Are you really claiming that the radio (not the wireless telegraph, but sound) and the plane could have been predicted in 1900 for 1930 (or is it 1940? You use both dates)?

250 years ago China was the most developed country in the world

China and the UK have had nukes for the last 50 years, so it isn't really recently that third world countries got nukes.

GK

Charly

1) Yes, futurists of the time predicted these items quite well in advance. Plus, you are cherry-picking anecdotes rather than looking at the entire change in the world from 1900 to 1930 (which was much, much less than from 1979 to 2009). Plus, you ignored the important diffusion point. The diffusion of the plane and AM radio by 1930 was still very minimal.

2) 250 years ago China was the most developed country in the world

Wrong. China had the largest GDP due to having the largest population. But in per-capita terms, it was far behind Britain, France, and Spain.

I see you don't think citing a source is important in making a claim (an easily debunked one, in fact).

JAM

GK:

This is OT, but maybe in a future post you could look at the likely trends in life-extension tech, and the costs thereof, over time. Although Kurzweil might not make it to the Singularity, I'm hoping that someone in good shape in their early 30s with access to a fair amount of discretionary spending might. ;)

GK

JAM,

A couple of older articles here are about that.

Actuarial escape velocity.

JAM

GK,

Thanks for the link. I suppose it's too early to actually predict what-money-buys-what-in-what-year at this point in time. But making lots of money and ensuring that it grows is never a bad idea in a capitalistic society in any case.

Geoman

Charly,

GK already busted your earlier points, but I'd add that the UK is not, and has never been, considered a third world country. I was referring to places like North Korea, Pakistan, and Iran.

China was given the bomb by Russia, and probably had very little to do with the development of the weapon. Korea, Iran, and Pakistan seem to be (largely) home-grown efforts.

GK

Geoman,

Actually, North Korea, Pakistan, and Iran all have nukes due to China.

China gave nuclear weapons to North Korea and Pakistan in order to support China's goals of threatening South Korea, Japan, and India, respectively.

North Korea then proceeded to help Iran (and Saddam's Iraq) gain nuclear weapons, which is what makes North Korea part of the 'Axis of Evil'.

So all three - North Korea, Pakistan, and Iran - are downstream recipients from China. Iraq would have been too if Saddam were still there. Iraq's Osirak nuclear reactor was making weapons, but was bombed by Israel in 1981.

Charly

The only differences between now and 1979 are the fall of the USSR, AIDS, the Internet, mobile phones, and the Walkman.

AIDS is a force of nature and has nothing to do with humans.
The USSR is not technology.
The mobile phone is an idea that everybody has had since the invention of the radio, if not before.
The Walkman (and the subsequent disc, hard disk, and flash music players) came on the market in 1979.
The Internet precursor Minitel went into planning mode in 1977.

Between 1900 and 1930 you had the invention of the airplane, radio, the assembly line, relativity, quantum mechanics, and WWI. Compared with those, the last 30 years look very meagre.

About China: you are talking about PPP GDP per capita. Kuwait's GDP per capita is also higher than South Korea's, but is it more developed? There is also the question of whether PPP GDP can even be calculated meaningfully. There are those, like Bairoch, who claim that China had a higher PPP GDP than Europe.

Geoman, I was joking about the UK. Their nuke is more American than British. Do they even have the launch codes? A better example would be South Africa and the 200 men who made their bomb.

Charly

GK, now Iraq has to buy them, like everybody else, from North Korea. Actually, building nukes isn't that hard; it is getting the nuclear material that is hard. And Pakistan got them from the Dutch, because the Americans wanted them to have it.

GK

And Pakistan got them from the Dutch because the Americans wanted them to have it.

More Charly nonsense. Provide a proper source, otherwise your fantasies are exactly that - fantasies.

Private Joker

Hang on, Star Trek is optimistic?

GK

Hang on, Star Trek is optimistic?

Regarding the future of humanity, yes. Do you have reason to think otherwise?

Given that many other science fiction future-oriented franchises are dystopian, Star Trek stands out.

Dave

The discussions I've seen of the timing of the singularity give scant attention to two of the biggest variables: economics and religion. I don't think you can take economic progress for granted, insofar as much of its performance is based on the political winds. In the decades to come, will we have more, or less, government intervention in the world's economies? And will this help or hinder the singularity? What about religion? One can't help but surmise that most religions will be hostile to the idea of the singularity, as it effectively will render moot one of their most powerful recruiting mechanisms - namely, promises of deliverance from death (in one way or another: reincarnation, heaven, etc.).

GK

Dave,

1) Political interference merely causes economic gains to quickly move to the more favorable economic zone. Look at how the leftward drift of the US has quickly caused a transfer to Asia. The Asian economies have bounced back from this recession very quickly as well. Even within the US, the flow of capital out of high-tax California and into low-tax Nevada, Arizona, and Texas is visible.

2) Religions have already been trying to thwart changes, and have not succeeded. It certainly will be no easier for them now. Furthermore, many religions have their own version of an event analogous to the Singularity. Some Catholics say there will be only 2 more Popes after this one (a few decades more, in other words). Hinduism has elements that are compatible with the Singularity.

Ajay

What does it matter when this "Singularity" happens if you have no idea what it might consist of? Such soothsaying reminds me of cults who always warn of an armageddon or rapture that's 50 years away, ;) as you admit of Catholics and Hindus. Since the charted trend has continued unabated for millennia, why couldn't it keep growing exponentially for millennia more? Given the giant size of the universe, we could just keep expanding with space travel for a long time.

Two further mathematics-related points. First, a trend continues until it doesn't. There is no reason that an exponential curve can't slow down or reverse; merely noticing an exponential pattern in the past is not an argument for why it will continue. There is no mathematical law of historic exponential growth, and only crazies would posit one, because the concept itself is stupid. Second, a singularity is a mathematical construction that has no analog in reality, although some make unsubstantiated claims about black holes, etc. To me, it sounds like the Singularity crowd is merely using the mathematical ignorance of most people to dress up the old armageddon/rapture concept in more modern clothes.

All that said, the pace of change is no doubt accelerating, and extinction or some breakthrough is a real possibility, but trying to apply such mathematical extrapolation to time its arrival is mostly silly - almost as bad as the idiotic "technical analysis" of markets that many use, or astrology.

GK

Ajay,

The prominently displayed chart on paradigm shifts should answer your questions.

There is no reason that an exponential curve can't slow down or reverse,

Why would that happen now, if it has never happened in the past?

You certainly have not given any compelling analysis to support your claims that a Singularity is 'silly', 'stupid', etc., when the article provides a number of different avenues that provide a similar result.

You will have to do better than that.

Ajay

I see, so the chart answers my questions of why an undescribed Singularity would matter and why exponential growth can't continue for millennia more? You must be able to read a lot more into that chart than most. ;) If you really think a trend cannot slow down or stop, you're positing an iron law of exponential growth for human progress, which takes you into the realm of the loons. I wrote a very specific analysis about the incorrect use of mathematical language by you and other Singularists; I can't help it if even such simple mathematical verbiage flew over your head. For example, you do realize that an exponential curve has no singularity? Therefore, charting an exponential curve and then positing a singularity is mathematical twittery at best. The reason I posted my comment is to see if you could counter any of these specific claims, which is what makes it so funny when you can't, and then try to excuse your inability by claiming I haven't made much of an argument. XD

GK

Ajay,

All you said is that an exponential trend 'can stop'. I countered that it has never stopped in 4 billion years, so why is it about to stop now? For this, you have no answer.

An exponential curve has a Singularity based on the limits of human perception, just as human civilization was a singularity that surpassed the comprehension of all non-human creatures. This is clearly explained in the first sentence of the article.

You claim to have offered a 'very specific analysis', when you have done nothing of the sort. In fact, you have contradicted yourself, since your misconception is that space travel is incompatible with a Singularity, even while the article clearly states that it is compatible (giving Star Trek as an example).

There are ways to make a strong counterargument against accelerating change, and I could show you how, but you don't appear to be able to find them yourself.

All you can do is toss out words like 'loon', 'crazy', 'stupid', 'silly', etc. right off the bat, in a forum where cordial respect is the norm. That is classic insecurity and immaturity.

BTW, are you the same Ajay as the 'micropayments Ajay'?

Hervé Musseau

I was wondering what would happen to your calculation if, instead of using the % of world GDP, you used the % of the OECD (alternatively, +BRIC)?

GK

Herve,

It doesn't change much. Note that the consumption of semiconductors in Asia exceeds that of the OECD even now.

Whether regions like Africa can ever modernize or not, however, remains an open question.

Ajay

I don't have to say why an exponential might stop; you have to say why it won't, and simply extrapolating from the past is not good enough. Fundamentally, I'm making the point that the future is too uncertain: perhaps the exponential continues for millennia, perhaps a worldwide virus hits and it's reversed for centuries; we can't know what will happen. Your answer about an exponential having a singularity based on human perception is meaningless - more evidence that the term was chosen for religio-mathematical mumbo jumbo. In fact, if there have been several singularities - which is a new use of the term to me, that there are several of them - why even bother using that term at all? We could have exponential growth for millennia more, dotted with another "singularity" every so often. My specific analysis was based on the misuse of mathematical modeling and terms, because the rest of the argument is obvious: human progress has been accelerating for centuries, with various setbacks along the way, and everybody knew that long before Kurzweil came along. Funny how you claim that the exponential can't stop initially, then say you can make a better case for how it might, :) which is it? As for my use of negative words, I have no problem labeling crazy ideas as such; perhaps you should focus on showing that they aren't, rather than obsessing over that. Yep, I'm the micropayments guy.

Mark Bahner

"What does it matter when this 'Singularity' happens if you have no idea what it might consist of?"

The Singularity consists of machine intelligence that is far beyond human intelligence. This creates change that is extraordinarily rapid, which produces outcomes that can't be foreseen.

"Since the charted trend has continued unabated for millenia, why couldn't it keep growing exponentially for millenia more?"

If the charted trend is "major" technological change, and the rate of change becomes on the order of months, days, hours, minutes, or seconds, then there is no way to predict the future.

For example, suppose in 1900 you had tried to predict what the world would be like in 1902 and in 1950. It would probably have been pretty easy to predict what the world would be like in 1902, but unless you were extraordinarily gifted, there would be no way you could predict that in 1950 nuclear bombs would exist; no way you could predict that airplanes would be flying across the ocean; no way you could predict that automobiles would be so ubiquitous; etc.

Now suppose *all* those technologies came to be in 1901 (instead of being spread out over the 50 years). Then in 1900 you couldn't even predict what 1902 would be like. That's all the Singularity is; it's technological change that's so rapid that even predictions of the near future become impossible.

Geoman

Mark,

You should read the Jules Verne book "Paris in the 20th Century". Written in 1863, it tries to predict the world of 1960. It discusses air conditioning, skyscrapers, gasoline-powered cars, high-speed trains, the Internet, calculators, televisions, elevators, and fax machines. His other works predicted helicopters, airplanes, submarines, and the Apollo program.

HG Wells had similar success in predicting the future.

I think there is more to it than what you are saying. In the past the predictions were all about more...stuff. The hidden assumption was always that people will still be people. That they will have the same wants and desires, the same basic human needs.

Post-singularity, this is no longer the case. Humans will change. We do not know, nor can we predict, what will have value, or even what will or will not constitute a human being. Since we cannot predict the nature of humanity, predictions on the stuff end of things tend to flounder. Spaceships that travel at the speed of light? Sure. But will anyone even want to go that fast?

I'd say science fiction, as a literary genre, is dying right now for this very reason. There is nowhere left for writers to go.

I'd say that, in the nearish future (the next 100 years), everything that can be known about the universe will be known. Every experiment will be complete, every test confirmed.

Every profession will change. What will be the meaning of a journalist, a politician, a professional athlete, a plumber, an actor, a priest, a social worker? Not in 100 years. In 10 years. In 50 years.

Everyone will be an expert at everything, will literally know everything there is to know. What then will be left?

I suspect that there will be vast levels of unemployment, but it won't matter since almost everything you want or need will be free, or as close to free as to be irrelevant.

We shall feast from the horn of plenty until we choke.

Maybe in the future we all blog all day, every day. That, in fact, is my best guess: we have one long, unending discussion.


Mark Bahner

"You should read the Jules Verne book "Paris in the 20th Century". Written in 1863, it tries to predict the world of 1960. It discusses air conditioning, skyscrapers, gasoline-powered cars, high-speed trains, the Internet, calculators, televisions, elevators, and fax machines."

I will get it. It sounds pretty amazing.

"I think there is more to it than what you are saying. In the past the predictions were all about more...stuff. The hidden assumption was always that people will still be people."

Yes, I agree completely. The fundamental aspect of the Singularity is that machine intelligence equals and then exceeds human intelligence.

Never since the Neanderthals have humans shared the planet with beings of equal or very similar intelligence. And back then, technology was changing pretty slowly. ;-)

Based on Ray Kurzweil's calculations of computer intelligence and the number of personal computers sold, I've calculated the number of "human brain equivalents" (HBEs) added to the population each year.

In 2000, the value was only about 10 HBEs. Even in 2010, it's only about 10,000. But in 2020 it's about 50,000,000 (about the same as the number of humans added). Then in 2030, it's 100 billion. And in 2040, it's 1 quadrillion human brain equivalents added:

http://markbahner.typepad.com/random_thoughts/2005/11/why_economic_gr.html#comments

That's why change is likely to be so fast that it will be hard to predict even the near future.
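The shape of that curve can be sketched in a few lines. The parity year, doubling time, and unit-sales figure below are illustrative assumptions, not the actual inputs behind the linked calculation, but they reproduce the same qualitative explosion: tens of HBEs per year around 2000, billions per year by the 2030s.

```python
# Sketch of "human brain equivalents" (HBEs) of hardware added per year.
# Assumptions (illustrative only): PC price-performance doubles every
# ~1.1 years, a $1000 PC reaches one brain's hardware capability around
# 2025, and roughly 300 million PCs ship per year.
def hbes_added(year, parity_year=2025, doubling_years=1.1, pcs_per_year=3e8):
    brains_per_pc = 2.0 ** ((year - parity_year) / doubling_years)
    return pcs_per_year * brains_per_pc

for y in (2000, 2010, 2020, 2030, 2040):
    print(f"{y}: ~{hbes_added(y):.0e} HBEs added")
```

Because unit sales multiply an exponentially improving per-unit capability, the yearly additions sweep from negligible to astronomically large within a few decades, which is the point being made here.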

GK

Ajay,

I don't have to say why an exponential might stop, you have to say why it won't and simply extrapolating from the past is not good enough.

No, I don't. If something has never slowed in 4 billion years, it is not incumbent on me to prove why it would stop now for the first time.

if there have been several singularities, which is a new use of the term to me that there are several of them, why even bother using that term at all?

Because this is the first technological singularity. It is also the first one that the existing species will be aware of.

Funny how you claim that the exponential can't stop initially then say you can make a better case for how it might, :)

There are ways to make a counterargument far better than you are doing. That was the point, which was obvious to anyone else.

Yep, I'm the micropayments guy.

Well, you have a track record of multiple bizarre opinions prior to this thread. Among them, the notion that having a $300M net worth is middle class in Silicon Valley.

GK

Mark Bahner,

When are you going to append the economic growth forecasts with the latest decade of data? Your projections are based on data up to 2000, but not after that.

GK

Geoman,

In 'Paris in the 20th century', Jules Verne correctly predicts the developments of the next 110 years, but not after that. He did not predict 'Moore's Law'-type computational advancement rates, etc. Thus, from 1863 onwards, the prediction wall for even the best acceleration-aware futurists was 110 years. At least this was much longer than a human lifetime, especially at the time.

Similarly, there is a prediction wall today, but it is only 30 years out. All the top futurists in the world have noticed this same prediction wall coming up, and it is within the lifetimes of most people alive today. This is depicted by the lower right-hand corner of the chart in the article. Things get crazy as we get close to that corner.

It is a shame that 'Paris in the 20th Century' is so much less famous than 'Around the World in 80 Days', given how much more profound the former is. Jules Verne was the spiritual predecessor to both Gene Roddenberry and Ray Kurzweil. HG Wells, by contrast, was not a good futurist. A good sci-fi writer, but not a good futurist.

Ajay

GK, you absolutely have to give a reason why the trend will continue, because, as I already said, you're otherwise positing a law of exponential progress for humanity, which only fools would suggest, and you apparently seem to be one. First off, the historical trend is not quite as neat as your graph purports: there have been setbacks where the Black Plague and other diseases slowed down the curve - obstacles which can be obfuscated in that chart in many ways, such as by carefully choosing data points to evade that reality, or because an exponential graph covers so many orders of magnitude that detail is lost. Second, any historical trend can end; you yourself pointed out a couple of posts ago that the aviation trend flattened out 50 years ago. Perhaps the most famous recent exponential trend, Moore's Law, is hitting real limits, such as transistors increasingly approaching the size of molecules and clock speeds having flattened out this decade because of heat problems; Moore's "law" may peter out in the next decade. This decline can be mitigated somewhat by better microarchitectures and by simply building bigger chips, as they're doing now, but it's very likely that the Moore's exponential will flatten out long before your suggested "singularity." The fact that you cannot give any actual reasons why such limits will not apply in the coming century suggests that you're ignorant of the real issues involved with this highly complex topic.

You can talk around the use of the word singularity all you want, but it's clear why it was chosen: to play on the word singular, meaning unrepeatable, and the connotation of infinity, similar to how religions always warn of an upcoming rapture that's always 50 years away. I would have no problem with the singularists if they used a more reasonable term like "phase shift," which is actually reasonable, but that would belie their whole purpose of talking up the well-known acceleration of human progress while combining it with some sort of transcendent quasi-religious rapture, and all the mumbo jumbo that entails.

As for the quality of my arguments: considering that you're incapable of making an argument for your own case, it's pretty funny that you say you can make my case - perhaps because my case is stronger? ;) As for my previous statement about Silicon Valley: in a place where $10 million means you're a nobody, $300 mil is middle class, particularly when Elon has basically shown he was lucky the first time, because he's now running around looking for govt bailouts for his space and car companies.

Jonathan

"One can't help but surmise that most religions will be hostile to the idea of the singularity..."

Actually, I find that the vast majority of religious people really aren't that religious. It is more a social and death-fear thing. When the prospect of true immortality is put in front of the masses, can you guess which choice they will eagerly devour? I know I can: the one that abandons religion. It will be a very good day for humanity.

Mark Bahner

"When are you going to append the economic growth forecasts with the latest decade of data? Your projections are based on data until 2000 but not after that."

Oh brother. Not again! :-(

GK, isn't it obvious that the time to append the economic growth forecasts with the latest decade of data is AFTER the decade finishes???

Specifically, the data I present in my analysis are for the years 1980, 1990, 2000, 2010 etc. Therefore, the EARLIEST that it would be appropriate to "append the economic growth forecasts with the latest decade of data" would be AFTER the 2010 data have been published. That will obviously not be until 2011.

Further, your comment about my projections being based on data until 2000 but not after that indicates that you don’t really understand my projections and the basis for them.

Here were my predictions in 2003 and 2004:

Time..........Annual P/C GDP growth...Annual P/C GDP growth
Period..........Oct 2004 prediction......Dec. 2003 prediction

2000-2010................3.0...........................2.5
2010-2020................3.5...........................3.0
2020-2030................4.5...........................3.5
2030-2040................6.0...........................4.0
2040-2050................8.0...........................4.5
2050-2060...............11.0...........................6.0
2060-2100........the differences keep growing...!

Even if the value from 2000-2010 is less than 2.5 percent per year, I still wouldn't reduce the prediction for 2010 to 2020. In fact, based on my 2005 calculations of the number of human brain equivalents added, I think the "knee" in the curve is going to be even sharper. I wouldn't be surprised if the per capita growth rate for 2020 to 2030 is more like 6 percent per year, rather than the 4.5 percent per year I predicted in 2004.
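Compounding the Oct. 2004 decade-by-decade rates in the table above shows the scale they imply: world per-capita GDP roughly 32 times its 2000 level by 2060. The rates come from the table; the arithmetic is mine.

```python
# Compound the Oct. 2004 predicted per-capita growth rates, decade by decade.
rates = {
    "2000-2010": 0.030,
    "2010-2020": 0.035,
    "2020-2030": 0.045,
    "2030-2040": 0.060,
    "2040-2050": 0.080,
    "2050-2060": 0.110,
}

multiplier = 1.0
for decade, rate in rates.items():
    multiplier *= (1 + rate) ** 10  # ten years of compounding per decade
    print(f"by end of {decade}: {multiplier:.1f}x the 2000 level")
```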

Mark Bahner

"Jules Verne was the spiritual predecessor to both Gene Roddenberry and Ray Kurzweil. HG Wells, by contrast, was not a good Futurist. A good sci-fi writer, but not a good Futurist."

Gene Roddenberry was in no way a "futurist." In Gene Roddenberry's "future," all human beings have hydrocarbon brains and hydrocarbon bodies, even in the 23rd and 24th centuries.

Mark Bahner

"Perhaps the most famous recent exponential trend, Moore's law, is hitting real limits, such as transistors increasingly approaching the size of molecules and clock speeds having flattened out this decade because of heat problems, Moore's "law" may peter out in the next decade."

In 1997, Hans Moravec predicted that $1000 personal computers would have the hardware capability of a human brain (which he estimated at 100 trillion instructions per second) circa 2022:

Moravec's 1997 prediction

In 2005, Ray Kurzweil predicted that $1000 personal computers would have the hardware capability of the human brain (which he estimated at 20 quadrillion instructions per second) circa 2020.

So there isn't much time that Moore's Law needs to last. Even at $1000 per (hardware) human brain equivalent, 10 percent of the world's GDP (which circa 2020 should be about $70 trillion) would be $7 trillion. That would add 7 billion human brain equivalents - as many as the current population - every single year.

That would be a tremendous amount of brainpower attempting to overcome any (hypothetical) slowdown in Moore's Law.
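The arithmetic above can be checked in a few lines; all three inputs are the assumptions stated in the comment, not measured data.

```python
# Back-of-envelope check of the "7 billion HBEs per year" figure.
world_gdp = 70e12        # ~$70 trillion world GDP circa 2020 (assumed)
computing_share = 0.10   # 10% of GDP spent on computing hardware (assumed)
cost_per_hbe = 1000.0    # $1000 per human-brain-equivalent (assumed)

spending = world_gdp * computing_share   # total yearly spending on computing
hbes_per_year = spending / cost_per_hbe  # brain equivalents added per year
print(f"${spending / 1e12:.0f} trillion buys "
      f"{hbes_per_year / 1e9:.0f} billion HBEs per year")
```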

GK

Mark Bahner,

GK, isn't it obvious that the time to append the economic growth forecasts with the latest decade of data is AFTER the decade finishes???

There are 2010 consensus projections from the IMF, World Bank, etc. which at most would vary by far too little to throw off your analysis, and should thus be plugged into your analysis.

The point is, an update once per decade is too infrequent, particularly given that your projections for this decade were far too optimistic.

Further, your comment about my projections being based on data until 2000 but not after that indicates that you don’t really understand my projections and the basis for them.

I understand them all too well. The actual GDP results for this decade were already far lower than your highly optimistic projections, which alone is enough reason for you to do an update.

I wouldn't be surprised if the per capita growth rate for 2020 to 2030 is more like 6 percent per year, rather than the 4.5 percent per year I predicted in 2005.

Now THAT is far too quick. Arnold Kling backed away from his super-optimistic forecasts from before, but you are revising them upwards, when the 2000-2009 actuals (+2010 consensus estimates) indicate the clear need to do the opposite?

So there isn't much time that Moore's Law needs to last. Even at $1000 per (hardware) human brain equivalent, that means that 10 percent of the world's GDP (which circa 2020 should be about 70 trillion) would be $7 trillion.

Where on Earth do you get the notion that semiconductors will be 10% of world GDP in 2020?? So in 11 years, the semiconductor industry will grow from $250B today to $7000B? Absurd. It is 1.5% or so of world GDP today, and will not get to 10% until the 2040s (cracking 50% by the 2060-65 period, as per the table).
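The trajectory described here follows from simple compounding. A quick sketch, where the 1.5% starting share and the ~7%-per-year relative growth rate are the assumptions argued in this thread, not measured data:

```python
import math

# Computing's share of world GDP, assuming it starts at 1.5% (circa 2009)
# and grows ~7% per year relative to the rest of the economy.
START_SHARE = 0.015
GROWTH = 1.07

def year_reaching(target_share, base_year=2009):
    """Year in which the share first compounds up to target_share."""
    return base_year + math.log(target_share / START_SHARE) / math.log(GROWTH)

print(f"10% of GDP around {year_reaching(0.10):.0f}")  # roughly the late 2030s
print(f"50% of GDP around {year_reaching(0.50):.0f}")  # roughly the early 2060s
```

Under these assumptions, the share crosses 10% in the late 2030s and 50% in the early 2060s.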

Mark and Ajay represent opposite extremes - the former assumes that incredibly optimistic upticks in growth are about to happen within a decade, while the latter insists that the rate of acceleration will slow down and that there will be no Singularity. Both are ignoring the basic evidence that shows them to be very far off from the rational mean.

Ajay

Mark, first let's assume AI is even possible, which I don't concede, but I'll assume it for the sake of argument. Then the real problem becomes software. Regardless of how capable the hardware is, it's taken a long time to get even rudimentarily "intelligent" software going, so you underestimate our human stupidity in not being able to figure out how to create anything even approaching AI. Also, you assume that once AI is here, it can somehow find ways around physical limits to the Moore's exponential, like the ones I pointed out; I don't. If AI is possible, progress will be fantastically faster than with human beings, but not infinitely so - just another, much faster speed that is still bounded by the speed of light and other real constraints. The world will just settle into this faster pace, just as when we humans started thinking up stuff faster than any previous mammal. It still took us millennia to get where we are today.

GK, uh, no, that's not what I said. I said that the concept of a singularity is stupid and that it's very hard to predict if we get hit by a world-ending virus or dirty bomb first or if some kind of phase shift happens at some point far into the future.

Mark Bahner
There are 2010 consensus projections from the IMF, World Bank, etc. which at most would vary by far too little to throw off your analysis,...

OK, so why don't you tell us all what the world per-capita GDP, purchasing power parity adjusted, is going to be in the year 2010 (in year 2000 dollars)? And using the same source(s) tell us what the value was in 2000 (again, PPP adjusted, in year 2000 dollars). Then tell us what growth rate you calculate for the period.

The point is, an update once per decade is too infrequent, particularly given that your projections for this decade were far too optimistic.

My projection in 2004 was 2.5 percent per year for 2000-2010, and in 2005 I raised that projection to 3.0 percent. In the 10 years from 1996 to 2005, in how many years did the growth in world per-capita GDP (purchasing power parity adjusted) exceed 2.75 percent per year (the midpoint of the two projections)? Also, what is your calculation for the actual growth rate in per-capita GDP for the period 2000-2010?

The actual GDP results for this decade were already far lower than your highly optimistic projections,…

If you think so, you should provide your calculation for the actual growth rate for 2000-2010, plus your estimate of the number of years in that period that were above my projections versus below my projections.

Now THAT is far too quick.

What rate of increase do YOU project for the world per-capita GDP from 2020 to 2030?

Mark and Ajay represent opposite extremes - the former assumes that incredibly optimistic upticks in growth are about to happen within a decade, while the latter insists that the rate of acceleration will slow down and that there will be no Singularity. Both are ignoring the basic evidence that shows them to be very far off from the rational mean.

GK, you obviously think you’re pretty hot stuff. Luckily, there’s a scientific way to figure out if you’re right. Provide your estimates of world per-capita GDP, purchasing power parity adjusted, in year 2000 dollars, for the years 2000, 2010, 2020, 2030, 2040, and 2050. Or alternatively, provide your estimates for per capita growth rates for each decade to 2050 (again, adjusted for purchasing power parity, in year 2000 dollars). Then we'll see what you know or don't know.

Mark Bahner
Mark, first let's assume AI is even possible, which I don't concede...

Does it take intelligence to: play chess? drive a car? fly an airplane (including taking off and landing)? mow a lawn? build a house? diagnose an illness? play soccer? play the TV game show Jeopardy (against human beings, on TV)? design and build faster computers?

Of those items above that you think require intelligence, which of them do you think a computer will never be able to do?

Also, you assume that once AI is here, they can somehow find ways around physical limits for Moore's exponential like the ones I pointed out,

Of all the brains in the world, integrated circuits are the only ones built in just two dimensions. So going to three dimensions seems logical. As I mentioned, both Hans Moravec and Ray Kurzweil think $1000 computers with human brain hardware capabilities are within 10-15 years of being reality. An analysis from Intel in 2008 estimated they can go to 2029 even with the current (two-dimensional) lithography techniques.

It still took us millennia to get where we are today.

Yes, because the human population didn't even get to 6 billion people until ~2000. What do you think the progress would have been if we'd been adding 6 billion people a year for 50 years? (People who never sleep, and only eat electricity.)


Ajay

Mark, interesting question about what intelligence those various activities require. I'll note that only chess is done well by computers today, and in such a limited way that the "intelligent" software used is not even applicable to the other activities. I'll also note that many of those activities depend on vision recognition and mobility, technology that will have to be invented separately and will take longer. It's not particularly useful to say which of those a computer can't do, nor do I have an argument against any one of them. However, to be a true AI, it'd have to be able to do ALL of those, or at least the intellectual parts.

As for expanding circuits into three dimensions, that's simply the alternative I mentioned earlier of building bigger chips, as we're seeing with larger die sizes and multiple cores already. However, there are real fundamental limits: electron velocities in transistors peak at around 0.01-0.001 times the speed of light, and of course optical transistors would be limited by the speed of light. One can use these fundamental limits, including the thermal limits from clock speeds mentioned earlier (how do you get heat off a 3D chip when we can barely get it off a planar chip today?) and the fundamental size limit of a molecule (1 nm to 1 Å), to extrapolate real limits on the amount of computation possible within a block of material.

Finally, as I mentioned before, software is way behind the hardware, so that's the real bottleneck. As for adding 6 billion AIs, I'll worry about that when we can do one, ;) if it's even possible. I don't think we'll get to it anytime soon, meaning the next couple of decades or more.

GK

Mark Bahner,

What rate of increase do YOU project for the world per-capita GDP from 2020 to 2030?

I have already done this, in very high detail. A tiny amount of effort on your part would have found this prominently displayed article.

Until 2020, not many people will differ - even Arnold Kling and Jesse Ausubel are not that far apart. My problem is with your extremely steep spike after 2020 (particularly since you are talking of making that as high as 6%, with the assumption that 10% of world GDP would be computation, vs. 1% in 2009). Your December 2003 analysis was more reasonable. It is after that where you got too extreme.

The presumption that all the human-level AIs will work towards producing knowledge to boost the human metric of GDP is also a very big assumption.

Then we'll see what you know or don't know.

What makes you a qualified arbiter of this, when you will not even update your own analysis, which was the single big analysis on your whole blog, that is over 5 years old?

Mark Bahner

GK,

I did an analysis in October 2004 that started with a world per capita GDP value in 2000, purchasing power parity adjusted, of $6,539 (in year 1990 dollars). That's from Brad DeLong's work.

Then I assumed that the world per capita GDP would grow by 3.0% per year from 2000 to 2010. That would produce a per-capita GDP, PPP adjusted, of $8,790 (year 1990 dollars).
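The compound-growth arithmetic here can be checked with a few lines of Python (the figures are exactly the ones quoted above):

```python
# Start from the quoted 2000 figure: $6,539 (year-1990 dollars, PPP),
# growing at the assumed 3.0% per year over the decade 2000-2010.
start = 6539.0
rate = 0.03
years = 10
end = start * (1 + rate) ** years
print(round(end))  # ~8788, matching the ~$8,790 quoted above
```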

You, on the other hand, started from 2007 with a world per capita GDP of $10,000 (year 2007 dollars) and assumed a growth rate of 3.5% per year from 2007 to 2010.

So why aren't you pestering yourself to update your analysis for the year 2010? You're obviously going to be too high in 2010.

And for the period 2010 to 2020, we both have the exact same value of 3.5 percent per year. So again, why in the world are you pestering me to update my analysis? You and I are both probably going to be too high in 2010, and we both have the EXACT SAME RATE from 2010 to 2020!

Also, I wrote, "Then we'll see what you know or don't know."

You responded, "What makes you a qualified arbiter of this, when you will not even update your own analysis, which was the single big analysis on your whole blog, that is over 5 years old?"

I wrote, "we," not "I." By "we" I meant that everyone in the world can see whether you know what you're talking about (and whether I know what I'm talking about), simply from observing the actual trend in world per-capita GDP growth (PPP adjusted) over the 21st century.

Here are our predictions (mine in October 2004, yours in July 2007) for world per-capita GDP percentage growth rates, PPP adjusted, in constant dollars:

Years..........MB_rate.....GK_rate
2010-2020......3.5%........3.5%
2020-2030......4.5%........3.75%
2030-2040......6.0%........4.5%
2040-2050......8.0%........5.5%

Obviously, mine starts to be higher than yours beginning in 2020. But we both predict a trend of increasing growth rate as the decades pass.
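Compounding the decade rates in the table shows how far the two projections diverge by 2050; this is just a sketch using the rates above, nothing more:

```python
# Compound each projector's per-decade growth rates to get the
# cumulative multiple on 2010 per-capita GDP by 2050.
mb_rates = [0.035, 0.045, 0.060, 0.080]   # Mark Bahner, by decade
gk_rates = [0.035, 0.0375, 0.045, 0.055]  # GK, same decades

def cumulative_multiple(rates, years_per_decade=10):
    m = 1.0
    for r in rates:
        m *= (1 + r) ** years_per_decade
    return m

print(round(cumulative_multiple(mb_rates), 1))  # ~8.5x by 2050
print(round(cumulative_multiple(gk_rates), 1))  # ~5.4x by 2050
```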

GK

Mark,

Again, I think your December 2003 prediction is the one I essentially agree with (as well as with the unspoken conclusion that after 2060 or so, it is hard to make estimations). The October 2004 upward revision, followed by a still further increase of the 2020s to 6.0% a year, is too much.

The reason it is too much post-2020 is because you are not only estimating human-level AI too soon, but you are assuming that these AIs will happily work towards increasing human wealth. The basis for this belief is unclear.

So both the timing and the utility of these human-level AIs is far too optimistic in your analysis. The notion that 10% of world GDP in 2020 would be in computing is off by a huge amount. It will be 3% at most, by then.

Mino

Bullshit

Mino

It rains axioms.
AI research basically ended in the late '70s. The hardware is becoming faster and faster, enabling things like pattern recognition on normal PCs. But there is really not a single algorithm that mimics human intelligence yet. Only very specific things, like playing chess or ping pong.

I can even admit that a computer with the same raw power as a human brain will appear in the next decades, but that doesn't mean that it can replace a human brain with all its functions.

We don't really know very much about the brain. An autistic brain doesn't differ from a normal brain from a "computational power" point of view, but the results are vastly different.

BTW, don't singularity futurists find this concept a bit frightening?
I mean, imagine a thinking machine, wired through the internet to billions of others. What will happen? Will they produce another culture, or a new religion? Will there be wars between computers, or between computers and humans? Or will computers obey the 3 laws of robotics?

To put it mildly, this singularity thing seems a bit bogus.
BTW, does anyone remember when a weekend on the moon was considered feasible by the year 2000? Yeah, right. Or a man on Mars, nuclear fusion, or a gazillion other things.

Mino


We still rely on fossil fuels, but we can access porn from a smartphone. And we still need to go to work in the morning in a car not dissimilar from that of our fathers.

Really guys, look how our fathers and grandfathers lived. They lived basically like us. Only slower. Grandpa used a car, a tractor, a phone. He didn't have access to internet porn, unfortunately.

The whole singularity thing is based on the assumption that computers will be capable of emulating a human brain, and there is no certainty of that. In the next few years we will have a supercomputer with the same power as our brain. And I could bet my ass that it won't become self-aware.

Now I'll tell you something. The best candidate for a radical change in humanity wouldn't be a computer, but a clean, cheap source of energy. And it would not be solar, wind, or even "hot" nuclear fusion (it's simply too complicated to work).

Mark Bahner
Again, I think your December 2003 prediction is the one I essentially agree with (as well as with the unspoken conclusion that after 2060 or so, it is hard to make estimations). The October 2004 upward revision, followed by a still further increase of the 2020s to 6.0% a year, is too much.

The October 2004 revision upward, and the still further increase to 6.0% per year in the 2020s that I guesstimated a few days ago, are much better supported by the analysis of human brain equivalents that I performed in 2005.

As I noted when I was performing that analysis, the cause of economic growth is (free) human minds. So the more (free) human minds there are, the faster economic growth will be.

My analysis (based on hardware, ignoring software) estimated the number of human brain equivalents (HBEs) added per year as follows:

2015: 1 million
2024: 1 billion
2033: 1 trillion

In my mind it is simply not credible to be talking about adding more than 1 trillion human brain equivalents every year and not have absolutely spectacular economic growth.
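Those milestones imply a steady exponential: a 1000x increase every 9 years. The implied annual growth can be checked in a couple of lines:

```python
import math

# Milestones quoted above: 1e6 HBEs/yr (2015), 1e9 (2024), 1e12 (2033).
# Each step is a 1000x increase over 9 years.
annual_factor = 1000 ** (1 / 9)
doubling_months = math.log(2) / math.log(annual_factor) * 12

print(round(annual_factor, 2))  # ~2.15x per year
print(round(doubling_months))   # ~11 months per doubling
```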

The reason it is too much post-2020 is because you are not only estimating human-level AI too soon, but you are assuming that these AIs will happily work towards increasing human wealth.

1) So what are your estimates for the years in which computers add the equivalent of 1 million, 1 billion, and 1 trillion human brain equivalents per year?

2) By "working happily towards increasing human wealth" do you mean that the computers might actually rebel and attempt to destroy humanity? If so, I've always made it clear in my economic predictions that they did not include the possibility that computers would attempt to hurt humans. That would be an incredibly serious (even apocalyptic) problem, if self-replicating computers decided that they wanted to hurt humans. Otherwise, it seems to be a reasonable assumption that computers will be used to improve human welfare (e.g. drive cars and planes, take and serve fast food orders, build houses and roads, etc.), rather than to harm it.

GK

Mino,

You clearly do not comprehend the accelerating rate of change, and how the next X years will have far more change than the previous X years.

Hence, your comments are not enlightened.

Mark Bahner,

In my mind it is simply not credible to be talking about adding more than 1 trillion human brain equivalents every year and not have absolutely spectacular economic growth.

The big assumption here is that these 'human brain equivalents' will function in the same economic way as present humans. There is no basis to support this assumption.

based on hardware, ignoring software

Aha! But you cannot ignore software. Computer power has risen 100X from 1999 to 2009, but productivity has not risen as much. Why? Software is one of the reasons, and software continues to lag hardware.

you mean that the computers might actually rebel and attempt to destroy humanity?

Maybe not directly, but they could take the world in a direction that precludes human survival. Much like habitat destruction is killing off elephants indirectly. Except that this could happen very quickly (years) with humans being unable to adapt quickly enough to survive.

Again, I am generally in agreement with your December 2003 numbers, so whatever you thought until then has more corroboration than what you think now.


Mino

The problem is not accelerating change. I know how exponentials work.
The problem is that singularity people take strong AI for granted.
And we don't have good speech recognition yet.

I don't know if true AI is possible; in theory it is not impossible. But really, Kurzweil predicts the singularity by 2045.
Ten-year predictions are already hard, and despite what true believers say, Kurzweil fails hard at predicting events, even in a 10-year frame.
So I'm quite confident that his prediction for 2045 is bogus.
Regarding the singularity, to be honest, it scares the hell out of me, even if I'm pretty sure that I won't see it in action.

Maybe it's heaven for the nerds, but people like me don't see how a future where people live as cyborgs or within computers could be good.

Finally, if we could obtain AI smarter than us, the outcome is uncertain. There are many possible scenarios, only one of which is the one described by Kurzweil and co.

Also, if you think about it, the singularity scenario seems mutually exclusive with other civilizations in the Universe.

GK

Mino,

I take the view that the Singularity need not come only through the AI path. It could be an extreme acceleration of human evolution (much like the emergence of successively more advanced humans happened in times that were orders of magnitude shorter than the evolution of, say, bears or mice).

So I'm quite confident that his prediction for 2045 is bogus.

I agree. Note my methodology for the much later 2060-65 prediction.

Also, if you think about it, the singularity scenario seems mutually exclusive with other civilizations in the Universe.

Aha! Then you will really like my SETI and the Singularity article.

ces

GK --

You need to look at RAM. The overall revenue of RAM drops now and then.

The Dec. 16 Gartner report found that the world’s semiconductor companies will collect $219.2 billion in revenue in 2009, a decline of more than 16 percent compared to 2008. Last week, Gartner published another report that found preliminary semiconductor revenue for 2008 is estimated at $261.9 billion, a 4.4 percent decline from 2007.


ces

"""But there is really not a single algorithm that mimics human intelligence, yet"""

Malarkey. Whether or not one mimics a human isn't important. What matters is whether the computer achieves equivalent or better results. Google has at least one human brain equivalent. Go google something. That computer plays "Who Wants to Be a Millionaire" or "Jeopardy" a heck of a lot better than any human.

What we've basically figured out is that statistical analyses of huge quantities of data allow computers to produce human intelligence. It works for web search, it works for chess, it works for language translation.

GK

ces.

The trendlines are important, not the individual spikes of boom and bust years.

sg

What if tech surpasses humans and it isn't a problem or disruptive or anything we are all so anxious about? What if it is anticlimactic? Will it be a letdown?

Zach

GK, I recently came across a video on youtube that I thought was very interesting. I also thought that it would tie in nicely to your blog about the Singularity.

http://www.youtube.com/watch?v=4Q75KhAeqJg&feature=fvw

The world is already starting to change. The path to the singularity is upon us, and it would seem we are in for one hell of a ride! What an interesting time to be alive!

Jean-Louis Trudel

So, the Singularity is due in 2060, but possibly as late as 2075.

When did you say you were born?

Mel

"Part of this is because he insists that computer power per dollar doubles every year, when it actually doubles every 18 months"

How do you know it doubles every 18 months?

GK

Mel,

Surely this is not the first time you have heard about Moore's Law...

Mel

I was actually asking if you have a source that shows that computer power per dollar doubles every 18 months instead of per year like Kurzweil says.

GK

Google 'Moore's Law' for sources. Sometimes they even say a slower doubling, of 2 years.

Also, anecdotally, refer back to every PC you have ever owned, and see how much RAM it came with at purchase. Plot the RAM on one axis, and the year on the other. You will see an 18-month doubling.
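That back-of-the-envelope plot can be sketched in a few lines; the purchase history below is hypothetical, made up purely to illustrate the fit:

```python
import math

# Hypothetical (year, RAM in MB) purchase history -- illustration only.
history = [(1996, 16), (1999, 64), (2002, 256), (2005, 1024), (2008, 4096)]

# Least-squares fit of log2(RAM) against year; the doubling time
# in years is the reciprocal of the slope.
n = len(history)
xs = [year for year, _ in history]
ys = [math.log2(ram) for _, ram in history]
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)

print(round(12 / slope))  # doubling time in months; 18 for this data
```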

Savethemales

I need to chime in GK,

Moore's Law will end by 2020 and begin a new paradigm shift. We will be seeing 3-D chips and a doubling every year. I've read a lot about this amazing futurist Kurzweil and assume you have done the same. He points out that technology is growing at a double exponential, meaning there's exponential growth in the rate of growth itself. In the past, it took 5 years for computers to double in speed; now it's 15-18 months. The doubling will shrink to 9 months, then 8, then 7, then 6, etc.

Ray points out that many people think linearly (10, 20, 30, 40, etc.), while others think computers will never double at a rate faster than 18 months and will soon slow down. They have been wrong before and are wrong now. I think the singularity will happen by 2050. That's also when a $1000 computer will be as smart as all human brains.

I am also optimistic about intelligent life. Gray goo and unfriendly AI are two major concerns. Humans will merge with machines, making unfriendly AI a lot less likely, because we will be partly or completely machine. It's those humans who choose to stay biological that may have to worry. Will they be forced to assimilate, or simply be allowed to go extinct?

As for gray goo, that's the worse of the two concerns. It won't be so bad if it happens after intelligent life has already begun its expansion across the universe. Then all matter in the universe will become smart. If gray goo also expands across the universe, all matter will remain dumb, with little or no chance of ever becoming intelligent unless the gray goo itself evolves.

Anymoose

Savethemales,

Uhh, no. Processing speed has been doubling every 18 months almost completely consistently since integrated circuits were invented in 1958. In fact, after a certain point, the 18-month doubling time revealed in Moore's 1965 paper was shown to be so accurate that it was adopted by the semiconductor industry as a guideline for technology development. That's what the "roadmap" white papers, outlining short-term technology goals for CPU design, are based on. So if Ray said that computer power is increasing at a "double exponential" as you claim, then he was being completely dishonest.
In fact, we're actually starting to reach limits in terms of shrinking lithography, so the processing speed growth rate will start to slow down if we continue using the same planar chip design. Technologies like 3D chips and other improvements will merely ensure the continuation of Moore's Law, or maybe cause a slight improvement in the growth rate. But there won't be any growth in the growth rate, not for any meaningful time anyway.

Grey goo is not a realistic concern.

AI is more of a concern. I don't understand people who claim we won't be able to produce AI in silico. There are only two alternatives to this possibility. The first is that consciousness, or even merely human intelligence, is based on specific quantum mechanical effects, not on our brain's neural networks; in other words, the exact substrate in which the brain is implemented is critical. This view is held by physicist Roger Penrose, but few other people (especially most physicists and neurobiologists) take it seriously, and I myself find it highly doubtful. But even if it is the case, it doesn't necessarily preclude AI; it just greatly complicates things.
The second possibility is that intelligence is rooted in some sort of spiritualistic dualism, i.e. a thinking soul. This crowd doesn't strike me as the type that would believe in this possibility.

If the reality of the mind is neither of these possibilities, then intelligence is the product of neural connections and processing. The actual material implementation is unimportant. In that case, we may be able to simulate a human brain or a greater intelligence simply by coding the neurological network of brains in silico. We understand the basic properties of neurons and synapses, and part of the structure of the brain. Even if we never understand the actual cause of intelligence, we can produce intelligence by copying the structure of the brain.
In the unlikely event that copying the structure of the brain at the cellular/neural level doesn't produce an intelligence, we can accomplish the task by simulating the brain at a molecular level: simulating each molecule of each organelle of each cell, and then each cell, and so on, up to the entire brain macrostructure. I don't see how any possibilities exist outside these cases. Either intelligence is non-material or based on the specific implementation, or we can produce AI by simulating a brain, regardless of whether we know how intelligence actually functions. Of course, if we have to simulate at the molecular level, then we probably won't have AI for many decades or centuries due to the astounding computing power required, and in all likelihood the brains will be less efficient and larger than human brains, but nevertheless.

Anymoose

More important than the increase brought about by Moore's Law, however, is the offset caused by the difficulty of developing more complex software for the faster processors (some people here have already alluded to this). I quote:

In a 2008 article in InfoWorld, Randall C. Kennedy,[32] formerly of Intel, introduces this term [The Great Moore's Law Compensator (TGMLC), or bloat] using successive versions of Microsoft Office between the year 2000 and 2007 as his premise. Despite the gains in computational performance during this time period according to Moore's law, Office 2007 performed the same task at half the speed on a prototypical year 2007 computer as compared to Office 2000 on a year 2000 computer.

Maybe a word processor isn't the best metric for developments in the software field, but it serves as an example that maybe things aren't so simple as "the gazillionth derivative of developments in every technological field everywhere is increasing superfast".

Michael howell

There's more to Singularity studies than Kurzweil though. One of the most interesting takes on it is the "Intelligence Explosion" concept being promoted by

cesium62

There's another approach to Timing the Singularity. You can look at worldwide aggregate processing power, combining human processing (brain cells) and transistors. Moravec provides a rough conversion factor between transistors and brain tissue: about 100 million MIPS per human brain. Hilbert and López surveyed world computational capacity
http://www.uvm.edu/~pdodds/files/papers/others/2011/hilbert2011a.pdf
and arrived at a figure of 1e13 MIPS worldwide, or 1e5 human brains' worth of compute capacity in 2007. They also showed that worldwide computer processing capacity is growing by about 60% per year.

We won't hit a singularity until digital computing capacity is about as large as human computing capacity, and probably not until digital computing capacity dominates. So it should take about 1 million times as much digital computing capacity to hit the singularity: 2037.
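The arithmetic behind that date can be checked directly from the figures quoted above (1e13 MIPS in 2007, ~1e8 MIPS per brain, 60% annual growth, and a ~1e6x gap to close):

```python
import math

# Years for digital capacity to grow by a factor of 1e6 at 60%/yr.
years_needed = math.log(1e6) / math.log(1.6)

print(round(years_needed, 1))      # ~29.4 years
print(2007 + round(years_needed))  # ~2036, close to the 2037 above
```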
