


This is the first step toward thinking machines, toward the "Intelligence Explosion". Human invention and progress will start to follow Moore's Law.

Get creeped out and excited by that.

Interesting. Web 1.0 ran about 12 years. Web 2.0 ran six years. Web 3.0 will run three years (2009 to 2012). Web 4.0 (ubiquitous informational synchronicity) will run just 1.5 years (2012 to 2013). Web 5.0 (direct human computer interface? artificial intelligence?) will happen over just a few months, and then we cross the threshold, whatever that may be.

What will Web 4.0 look like? Ubiquitous informational synchronicity basically means information is available all the time, everywhere, and is continually updated. My car talks to my house, talks to my cell phone, talks to me. Electronic agents take care of much of my personal business - pay my bills, manage my finances, do my shopping. What I want to know, I know, instantly. New information enters the system continuously, and is immediately assimilated.


As someone who has been a professional web developer since 1996, I have to say: you're blowing a lot of smoke. As in 'hype'. Web 2.0 was not that big of a deal, for example.

My retired parents asked me what Twitter was. I said, "Don't bother... it's just a fad, and the vast majority of new sign-ups never return to that site."

You've got to keep things in perspective, folks.



Twitter is hardly representative of Web 2.0. No more than Webvan was representative of Web 1.0.



Google is developing a semantic web product of its own.


"One of the more experimental products was called Google Squared, which will go public in the next month or so. It takes information from the web and displays it in a spreadsheet in "split seconds", something Ms Mayer said would normally take someone half a day to do."

"Different tables, different structures, and then corroborating the evidence around whether or not something is a fact by looking at whether that fact occurs across pages."


I think there is another factor out there that will enable this type of engine to achieve great heights in augmenting human intelligence. The push to put raw data and databases online is growing; couple that with tools to take data to information, and information to intelligence and you've got something truly new.
If Wolfram A strikes out by letting the popular overwhelm the true in it's measures of trust it will be disheartening. That's the real key to the web right now: signal to noise ratio.


/bah "it's" == "its"



I was providing an example of the HYPE. All 'Web 2.0' amounted to was to enable folks like me to say some magic buzzwords during an interview to folks who buy the hype in order to get paid more.

Cynical, but true.

Don't get me wrong. There are real improvements. But the importance of it all is quite overblown relative to all of the money from the suckers, er...'investment' poured in.

And you are hearing this from one who stands to directly gain a lot from these 'hype trends' if I play my cards right, too.


"Answer engine." Hrmph.

The examples he gave looked like the ability to recall and correlate data. Mildly interesting but not beyond the ability of someone who is prepared to understand the answers.

I'm looking forward to the release just to see where it breaks. That's not a slam. All these tools have limits and it's fun to figure out where they are.

Here are a couple of questions I will try out on release. (I don't expect the query language to accept and parse the English but I would expect some form of answer.)

Which proteins when represented in the standard amino acid alphabet have the "words" ELVIS and LIVES in them? (From Wolfram's examples I really expect this to work)
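The ELVIS/LIVES check is easy to script yourself as a baseline for comparison; here's a minimal Python sketch (the sequence below is made up for illustration, not a real protein):

```python
# Check whether a protein sequence, written in the standard one-letter
# amino acid alphabet, contains the "words" ELVIS and LIVES as substrings.
def find_words(sequence, words=("ELVIS", "LIVES")):
    """Return the words that occur in the given sequence."""
    sequence = sequence.upper()
    return [w for w in words if w in sequence]

# Hypothetical example sequence -- not an actual protein.
seq = "MKTELVISAAGHLIVESQRND"
print(find_words(seq))  # -> ['ELVIS', 'LIVES']
```

Running this over a real sequence database (e.g. a FASTA dump) is just a loop over records; the interesting part of the question is whether Alpha has that database wired in.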

What is the object plane diffraction spot size for a camera with aperture 15 mm at range 20 m? What is the image plane spot size? (I think the knowledge base should know the formula, but I left out important information: wavelength, radius versus diameter and, for the second part, focal length.)
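For reference, the textbook Airy-disk estimate is easy to work out once you pick the deliberately omitted numbers; here's a sketch assuming 550 nm (green) light and treating 15 mm as the aperture diameter -- both assumptions, exactly the ambiguities the question leaves open:

```python
def diffraction_spot(wavelength, aperture, distance):
    """Airy-disk spot radius (to the first dark ring) at a given distance:
    r = 1.22 * wavelength * distance / aperture."""
    return 1.22 * wavelength * distance / aperture

# Assumed wavelength 550e-9 m; aperture taken as a 15 mm diameter.
r = diffraction_spot(550e-9, 15e-3, 20.0)
print(f"{r * 1e3:.2f} mm")  # roughly 0.89 mm at 20 m range
```

An engine that knows the formula should either ask for the wavelength or state the one it assumed.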

Which of the United States have a border with fractal dimension less than 1.2? (I have no doubt that polygon models for the states are in the knowledge base; will it know how to apply a possibly unanticipated algorithm to them?)
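A box-counting estimate of that fractal dimension is itself straightforward to sketch; here's one in Python with NumPy, sanity-checked on a straight segment (which should come out near dimension 1) rather than any real border polygon:

```python
import numpy as np

def box_count_dimension(points, scales):
    """Estimate the fractal dimension of a 2-D point set by box counting:
    the slope of log N(eps) versus log(1/eps)."""
    points = np.asarray(points)
    counts = []
    for eps in scales:
        # Count distinct grid cells of side eps that contain a point.
        cells = np.unique(np.floor(points / eps), axis=0)
        counts.append(len(cells))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)
    return slope

# Sanity check on a smooth curve: a densely sampled straight segment.
# A real state border would use its polygon vertices instead.
t = np.linspace(0.0, 1.0, 20000)
line = np.column_stack([t, 0.5 * t])
dim = box_count_dimension(line, scales=[0.1, 0.05, 0.025, 0.0125])
print(round(dim, 2))  # close to 1.0 for a straight segment
```

The hard part isn't the algorithm; it's whether the engine can recognize that this unanticipated computation applies to data it already holds.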

Once it comes out, we'll see what it can do. I will hope for the best but I won't expect anything too great.


For those of you pouting on how this is not that big a deal, consider: the Model T had a maximum speed of 45 mph, had 20 hp, and had to be started with a hand crank. It could not climb a steep hill when the fuel was low, since the fuel was gravity fed. It had a two-speed transmission (three-speed if you included reverse). The wheels had wooden spokes. The headlights were powered by acetylene gas. It came in one color, black.

In short, the Model T was replete with problems, limitations, and disappointments. Yet it was, and is, considered a remarkable technological breakthrough.

Therefore my response is - if Wolfram Alpha can answer a single plain English question I submit by calculating the results, rather than searching the web for sites that might have my answer, I will consider it a major advance in machine intelligence and interaction with human life. It can come in colors other than black some time down the road...


Exactly. Geoman gets it.

Wolfram Alpha, today, is underwhelming. Some might legitimately declare that it 'sucks'. Much like Netscape had limited use to the broader world in 1994.

But after a few years (and perhaps at a different company)......



I've used it and I am not impressed. It's not even that it doesn't work - Aardvark doesn't work all that well and yet I am quite bullish on it. WA is just the latest generation of Lycos and Altavista - it throws massive overhead at an incremental advance over Wikipedia. The problem with Lycos vs. Google wasn't the results, it was the algorithm that relied on cubicle farms of minions indexing websites. WA is basically doing the same thing. I'm not saying it won't morph or lead to something that does most of the work on its own, but this is so far a non-starter.


I have to agree; as a student, W Alpha really does 'suck', sorry. I really thought the thing would be as smart as the main computer on the Enterprise-D :) I hope the IBM/Jeopardy effort has better luck; traditional AI's credibility is at stake as never before.
