
Comments

fatcat

Well, it could be that the change is too fast. But that’s because there are irrational misallocations. I bought a hi-def TV six years ago, and it is still good. In the meantime the 3D fad came and went; 4K TVs and smart TVs are being pushed, but there is still no compelling case to switch (no content, hi-def is already good enough, and the smart TVs/IoT are a security disaster). PCs haven’t improved much, so old laptops and desktops are not obsolete yet. Good enough. On top of that, there are only 24 hours in a day, and far fewer for working people. I barely use my laptop or tablet; probably once a month. I use my desktop once or twice a week. The rest of my screen time is on my cell. So basically, the smartphone has somewhat dematerialized the TV, the desktop, and the laptop/tablet.

fatcat

On top of that, a family of two would pay for two mobile plans and one broadband data plan, which alone can eat into the $2,000 annual budget. Then you have the periodic upgrades. I would say that even a conservative household will go over $2K/year just to replace failing equipment and maintain data plans, with diminishing benefits.

fatcat

And to make a long story short:
• Yes, it costs too much to constantly upgrade and maintain all the subscriptions
• Some home electronics get de-emphasized, if not dematerialized, by ubiquitous smartphones
• Existing household electronic devices are not becoming obsolete fast enough (which means that progress is in fact too slow)

Kartik Gada

fatcat,

Even with delayed upgrades, there is no avoiding the costs. Eventually, an old PC will fail, or become too slow for using the Internet. A TV is one of the few things that can last a very long time.

Data plans, as you point out, are often the largest cost of all.

Products 'not becoming obsolete fast enough' is indeed a problem as well. The reason for this is low Nominal GDP (see Chapter 4 of the ATOM). The rate of technological progress has, for years, been at just 60-70% of the trendline rate. This will snap back eventually (toppling the obstacles that are inhibiting it).

This growing paradox is exactly why ATOM-DUES is precisely the solution that is needed for the modern economy. ATOM-DUES rises at 16-24%/year, creating a far more robust demand base for technology products (and speeding up innovation and product cycles as well).

Geoman

This has always bugged me about the singularity - at some point it predicts that individuals will have infinite consumption. That is obviously not going to happen.

Until recently I used an 8-year-old computer. It was a bit slow, but I had upgraded it, and it still did everything I needed. Better was available, but what I had was good enough. Like the TV example above, there was no compelling reason to switch.

In some ways, the ATOM reduces consumption: better products that last longer will reduce consumption. This will save consumers money, but will eventually deflate the ATOM itself. Think of LEDs: assume we invent something even better, after everyone has made the switch. Given that LEDs last so long, will there be a compelling reason for most people to switch to the new thing?

The ATOM deflates everything, including the ATOM.

Kartik Gada

Geoman,

I believe that a lot more products would arise, generating a lot more demand, if an ATOM-DUES were implemented and the low-NGDP suffocation were corrected.

There is still a vast range of possible products that could be here, if the NGDP tailwind were stronger and the demand base were more robust (through a DUES or equivalent).

For example, VR is years behind, and still costs $1200+ (including PC upgrades) if one wants to get a full Oculus or HTC Vive system. So there is a lot of room there.

Geoman

Well, yes... but... what if the need for VR is obliterated?

In my example, the need for lighting better than LEDs is largely nonexistent. Additional improvements yield very little additional demand.

The problem with VR is that there does not appear to be any overwhelming demand for it. Just as there is not enough demand for cars that go 200 mph, and so such cars remain high-priced luxury items. Or for 3D TV.

Large screen HD is pretty damn good. Strapping on some sort of goofy helmet just doesn't add much to the experience.

I could invent a machine that cuts hair. In fact, it could cut hair so well that it would reduce my time in the barber's chair by 10% and cut costs by 25%. This would be an ATOM-like improvement, saving billions in costs and time over the years. The problem is, I only go to the barber once a month, and it doesn't cost that much anyway. So it is unlikely to catch on.

I think computers for the home are in fact "topping out"; that is, most computers now do most of the things most people might need them to do. They will get better in the future, but people will swap out the old ones when they fail, not because they need something better. That will slow the tech train considerably.

There are still areas of rapid innovation, batteries and solar panels for example. But in other areas (cell phones, computers, VR, TVs) I expect to see only incremental improvement. Not inconsequential, mind you, just not the big surge we have seen in the past.

If the past is any indicator, the big leaps happen in areas where we have lagged behind the furthest. I expect mining is going to change significantly in the near future: robotic mining equipment, subsea mining, asteroid mining. We will have a world with nearly unlimited sources of every element, which opens up strange possibilities for manufacturing. Much of what we make is based on what we have available. Low-cost, high-volume gold or whatnot will change all sorts of things in interesting ways.

Kartik Gada

Geoman,

1) For LEDs, cost per lumen and lumens per watt continue to improve. Many people will upgrade only very slowly, but improvements continue nonetheless, which may create new applications (street lights in more places + brighter = lower crime, etc.).

2) VR demand will certainly arise once the hardware price drops and more content becomes available. In 1976, it appeared that demand for video games was very low. I would say VR is how humans will be satiated, via a simulation of infinite consumption that satisfies their brains.

3) Hair-cutting robot: yes, that would be an improvement that saturates. This is not a problem, because entirely new services and products arise.

4) Computers in the home: yes, but there was a time when an entire family had one. Now a family may have six. Plus, VR is forcing PC upgrades. Technology is forcing upgrades of perfectly drivable 10-year-old cars as well.

5) Big leaps certainly do arrive where the lag is largest. That is one of the core tenets of the ATOM, and enables some degree of estimation of where there is more 'pressure' of disruption being exerted. That is why healthcare, education, and government are under the most pressure, with nascent alternatives 10x to 100x cheaper.

6) Note that the bulk of ATOM diffusion, at present, is still in emerging markets, where they skip several past stages of technology and go directly to the modern version (e.g. going from no telephone to a smartphone).

That said, I do think that when artificial intelligence can truly advance without any human participation, then technologies that improve human living standards will plateau. We are nowhere near that level yet, but it could arrive by mid-century (aka the Singularity).

Geoman

Well, I would say the problem with VR isn't cost (I have seen setups that work with your phone; what could be cheaper?). The problem is that there is not yet a compelling reason to buy them.

AI is key to a lot of advancements. But I suspect AI will creep in slowly, diffusing over time. With LEDs it is simple: you have an inefficient product that fails after 2-3 years and needs replacing anyway, so why not switch to LEDs? Especially if it saves money yet does exactly what the old product did. There is no need for cultural displacement.

I have news for most people: AI is already widely deployed in our society; we just don't know it. If you own any stock at all, you are already interacting with AI programs. The trading floor at the NYSE is almost empty, replaced by AI. 80-90% of all stock trades are handled by computer. I don't mean that computers facilitate the trade (my order passes through a computer); I mean the computer makes the trade based on very sophisticated algorithms.

Much of the internet is run by computers. Phones too. More and more of our power systems.

It is easy to swap like for like (light bulbs), but much harder to change the economic/social paradigm. AI promises to do that across the board, meaning it is likely to do so much more slowly.

fatcat

Hi Geoman,
Algorithmic trading is somewhat sophisticated, but it is not very intelligent. Its main virtue is that it is fast and exploits arbitrage, even the slightest. And even at that level it is incomprehensible to mere humans (quants might be able to make some sense of it). And for sure people cannot keep up with the microsecond-scale algos...

fatcat

Anyway,
Geoman is right that a technology that is not a drop-in replacement takes longer to find its use. For example, every big company is now talking about Big Data, yet has no idea why it needs it. At least they provide funding for R&D.

fatcat

And sometimes finding a good application for a technology is as much an innovation as the technology itself. Putting a screen on my fridge and connecting it to the botnet/IoT is not an innovation. Replacing the need for a fridge is. The latter is much harder.

Geoman

Ha - I remember when lasers were described as "a solution looking for a problem." Hard to imagine those days.

Whether the algorithms and lightning-fast computers used on the stock market are "artificial intelligence" or not is an interesting debate. I would argue that they are: very smart, but very narrowly focused, forms of artificial intelligence (they respond independently to stimuli and make decisions). It is how I would imagine AI evolving: the first applications will be in high-dollar, competitive, and very narrowly focused fields. Like a laser, it is currently a solution looking for a problem.

What we are angling for now is something more broadly applicable: more of a cheap general AI than an expensive focused AI. A general AI could manage my affairs, drive my car, etc.

Kartik Gada

Geoman and fatcat,

I have written an M&A report on AI.

http://www.woodsidecap.com/wcp-issues-ma-report-artificial-intelligence-sector/

The stock algos are definitely AI, but note that the AI sector has been notorious for redefining AI to exclude any form of AI that is currently in use (speech recognition, stock market algos, computer vision, self-driving cars, etc. are all forms of AI that are no longer counted as AI).

AI revenue at present is still only $3 billion, but by many accounts 40% of all jobs can be affected (so easily $20 trillion of disruptive impact worldwide in the medium term).
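As a rough illustration of how a 40%-of-jobs figure maps to an impact on that scale, here is a back-of-envelope sketch (the world-GDP and labor-share figures below are my illustrative assumptions, not sourced data):

```python
# Illustrative back-of-envelope for the "$20 trillion of disruptive impact"
# figure. world_gdp and labor_share are rough assumptions, not sourced data.
world_gdp = 75e12      # approx. world GDP circa 2017, in USD (assumption)
labor_share = 0.60     # typical labor share of GDP (assumption)
jobs_affected = 0.40   # fraction of jobs AI can affect (from the comment)

impact = world_gdp * labor_share * jobs_affected
print(f"${impact / 1e12:.0f} trillion")  # → $18 trillion, the same order as $20T
```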

Drew

I agree with fatcat and Geoman: there hasn't been a need for upgrades lately except for the early adopter. My iPhone 4 serves my current needs, as does my low-end laptop. Maybe some app or game will be good enough to convince me to get a new one, but until then it's like the LEDs: I will replace as needed and enjoy the better version once I've done so, but feel no drive to upgrade. VR still doesn't offer a convincing reason to buy.

I agree that the AI impact is going on in the background, but for now it is not part of the consumer purchase decision, except behind the scenes.

Cheers,
Drew

fatcat

Hi Drew,
For cell phones there are a couple of artificial drivers of upgrades: firmware obsolescence (Apple is actually very good at maintaining old devices)
and battery life, which is good for a year but after 3-4 years becomes a pain to charge.
Additionally, there is the producer's desire to deprecate a perfectly working standard (hello, headphone jack) while pushing something more expensive that is not ready for primetime.

I remember that in the nineties you HAD to upgrade your PC every 2-3 years, and you got double the CPU performance/storage. Now, even after 5 years, you might get an extra core and a cheaper SSD. Doesn't feel like an imminent singularity :)

Kartik Gada

Gentlemen,

While anecdotes like LED bulbs not being upgraded are true, the expenditure deferred there is much smaller than the increased demands of technology purchases elsewhere. The fact that tech products as a percentage of GDP are rising is proof of this. If total dollars spent on technology (new + upgrades) were falling, then it would be falling as a percentage of GDP.

Maybe B2B spending is where the bulk of the increase is. But the general level of technology spending by individuals, as a percentage of total expenditures, is still rising, and many upgrade delays are not by choice but simply due to budget limitations.

Lastly, don't underestimate the new markets created by ever-lower cost thresholds. An example I gave above was that cheaper, brighter LEDs lead to more and brighter street lights and less crime. Ever-cheaper PCs lead to kids having their own PCs, rather than the entire family sharing one. Ditto for smartphones. Cheap monitors have led to ever-larger screens and two-monitor setups. Over the years I have gone from a cube-like 17" CRT monitor consuming half the desk, to a 20" LCD monitor, to two joined 27" monitors. Eventually, maybe three 30" monitors.

pat

You need to upgrade your survey. I recently purchased a system with an NVIDIA Tegra X1 processor in it: 4 CPU cores and 256 GPU cores. And that doesn't even come close to the number of processing cores in current graphics cards. A young man I work with and his wife recently upgraded the graphics cards in their computers; he had 1,280 cores in his card, and she had 759. This information came right from the manufacturer's site on the web, and these cards are not even the top-end models. That is 2,039 cores without even throwing in their phones, tablets, or car.
This doesn't count the AIs: Google Assistant on both phones, Microsoft Cortana, and NVIDIA. If I want, I can go to the web and use Watson.
The WAVE is already here.
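For the record, the core counts mentioned here tally up like this (a quick illustrative sum of only the devices listed above):

```python
# Tallying the processing cores mentioned in the comment above.
cores = {
    "Tegra X1 CPU": 4,
    "Tegra X1 GPU": 256,
    "his graphics card": 1280,
    "her graphics card": 759,
}

two_cards = cores["his graphics card"] + cores["her graphics card"]
print(two_cards)            # → 2039, the figure quoted for the two cards alone
print(sum(cores.values()))  # → 2299 once the Tegra X1 is included
```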

pat

People wonder when to upgrade their computers. A recent "I, Cringely" post mentioned a new company called FRAME, a cloud-computing company serving both corporate and private clients. A private individual can have their own personal cloud computer running their own personal software. No need to upgrade your computer; upgrading is taken care of at FRAME's secure data center. All available for $10 a month. No, I don't own shares.

pat

The graphics cards I mentioned above are just for rendering games at 4K and 60 frames per second at 120 Hz (I think). As for VR, think about remote manufacturing, or manufacturing in dangerous environments, for starters: chemical factories, nuke plants, under the sea, etc. You need high-definition sensory input so you don't cut the wrong thing, misread the label on a cable, or mix the red chemical with the blue. That means a lot of computing power in real time.

I work for a company that does a lot of simulation before implementation. Circuit classes in college use software to simulate circuits before building them. Mechanical engineers use CAD to "see" how things fit and work. As things shrink, the physics and chemical design processes need to take atomic-scale forces into account, down at the 1/10-nanometer scale, which means quantum forces must be considered. This requires an enormous amount of computing power.

Kartik Gada

pat,

All true.
