The Futurist

"We know what we are, but we know not what we may become"

- William Shakespeare

ATOM Award of the Month, January 2021

It is time for another ATOM AotM.  This one goes retro, tying into concepts from the early days of The Futurist, back in 2006.  This award also dispels the myth that 'we can no longer send a man to the Moon like we did in 1969-72' or, even worse, that 'human progress peaked in 1969'.  If anything, progress in space-related advancement has been tracking a steady exponential trendline.  

The first man-made object placed into orbit that sent back information was Sputnik, launched in 1957.  Since that time, satellites have risen in number and complexity.  But the primary cost of a satellite is not the hardware itself, but rather the cost of getting it to orbit in the first place.  While the early data is sparse and the trendline was not easy to discern, we are now at an inflection point of this trajectory, enabling a variety of entities far smaller than governments to launch objects into orbit.  If the trendline of a 10x cost reduction per decade holds, then a number of entirely new industries will emerge in short order.

(image from https://www.futuretimeline.net)
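A 10x-per-decade decline compounds quickly; here is a minimal sketch of the arithmetic (the $10,000/kg starting figure is an illustrative round number, not sourced data) :

```python
# Illustrative compounding of a 10x-per-decade decline in launch cost.
# The $10,000/kg starting value is a hypothetical round number, not a measurement.
def projected_cost_per_kg(start_cost, years, reduction_per_decade=10):
    """Cost after `years`, assuming a steady `reduction_per_decade`x drop every 10 years."""
    return start_cost / reduction_per_decade ** (years / 10)

# If launch cost were $10,000/kg in 2020, the same trend implies:
for year in (2020, 2030, 2040):
    print(year, round(projected_cost_per_kg(10_000, year - 2020)))
```

Two decades at this rate is a 100-fold reduction, which is why entirely new industries become viable in short order.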

The emergence of private enterprises that can create profitable businesses from space is an aspect of the 21st century that is entirely different from the capital-intensive government space programs of the second half of the 20th century.  From geospatial data to satellite-derived high-speed Internet, the era of commercial space is here. 

SpaceX has already begun the Starlink program, which advertises 1 Gbps Internet access for rural customers.  It is not yet apparent how SpaceX will upgrade the hardware of its satellites over time, but if the 1 Gbps speed is a reality, this will break the cartel of existing land-based ISPs (such as Comcast), where the gross margin they earn on existing customers is as high as 97%.  Needless to say, high-speed access available to the backwaters of the world will boost their economic productivity.  

Other efficiencies are on the horizon.  3D Printing in space is very pragmatic, as only the filament has to be replenished from Earth, and finished objects are simply printed in orbit.  Since filament packs densely and never has an awkward shape, it is far less expensive to send unprinted filament to an orbiting 3D printer than to launch finished hardware.  Asteroid mining is another such efficiency, and is an extension of the fundamental ATOM principle that technology always increases the supply of, or alternatives to, any commodity.  The prices of precious metals on Earth could collapse when asteroid mining reaches fruition, to a much greater extent than oil prices plunged from hydraulic fracturing.  

But the falling cost of launch per unit weight is only half of the story.  To see the second exponential, we go all the way back to an article from April 22, 2006, titled 'Milli, Micro, Nano, Pico'.  The point is that the ability to engineer at smaller and smaller dimensions (integrated circuits now have 5 nm transistors), at greater and greater scale, comprises a double exponential of technological intricacy and integration.  Surely, this has to result in a modernization of the electronics sent up into space.  

Consider the major unmanned spacecraft that NASA has launched, such as the Pioneer, Voyager, and Cassini probes.  These were built with 1970s electronics, and the designs have scarcely been updated to this day, given that the New Horizons probe (launched in 2006) was still the same size.  We know that an electronics design, from 1975 to 2020, would be expected to shrink in both size and cost by a factor of over 1 million.  If a supercomputer the size of an entire room in 1975 is less powerful than a 200-gram Raspberry Pi system in 2021, then why is NASA still launching one-ton devices that have incorporated none of the advances in electronics of the last 45 years?  The camera and transmitter on Voyager 2 are surely far less powerful than what exists in 2021 smartphones.  
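The 'factor of over 1 million' follows from ordinary Moore's-Law compounding; a sketch assuming one doubling every 24 months (the doubling cadence is the assumption here) :

```python
# Transistor-density doubling compounded over 45 years (1975-2020).
# The 24-month doubling period is an assumed Moore's-Law cadence.
def density_gain(years, doubling_period_years=2):
    return 2 ** (years / doubling_period_years)

gain = density_gain(2020 - 1975)
print(f"{gain:,.0f}x")  # roughly 5.9 million-fold, i.e. 'over 1 million'
```

Even a slower 30-month cadence still yields a gain of over 250,000x, so the conclusion is not sensitive to the exact doubling period.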

Given the continued shrinkage in electronics and decline in launch costs, it is long past time for thousands of Voyager-type probes, each the size of a smartphone, to be launched in all directions.  Every significant body in the Solar System should have a probe around it taking pictures and other readings, and the number of images available on the Internet should be hundreds of times greater than it is now.  This will happen once someone with the appropriate capabilities notices how far behind the electronics of NASA and other space agencies are.  

Hence, this ATOM AotM makes use of up to three exponential trends at once.  But the decline in launch costs per unit weight alone has immense implications.  

This will be the final ATOM AotM posted on this website as an article.  Future instances will be on my new YouTube channel, which I hope to inaugurate in February.  

 

Related ATOM Chapters :

3.  Technological Disruption is Pervasive and Deepening

12. The ATOM's Effect on the Final Frontier

 

 

January 10, 2021 in Accelerating Change, ATOM AotM, Space Exploration, The ATOM | Permalink | Comments (20)


ATOM Award of the Month, November 2019

It is time for another ATOM AotM.  This month's award has a major overlap with the November 2017 award, where we identified that telescopic power has been computerized, and as a result was rising at 26%/yr.  That in turn built on a much older article, from September 2006, where I first identified that telescopic power was improving at that rate.  

But how do better telescopes improve your life?  Learning about exoplanets and seeing better images of stars are fun, but have no immediate relevance to our individual daily challenges.  If you are not interested in astronomy, why should you care?  Well, there is one area where this advancement has already improved millions and possibly billions of lives : we have now mapped nearly all of the Near Earth Objects (NEOs) large enough to cause a major disaster if any of them were to strike the Earth.  Remember that such an object may have a mass of billions of tons and travel at about 30 km/sec (image from sciencenews.org), and there are many thousands of them, each having already orbited the sun over 4 billion times.  
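The energy scale of such an impact can be sanity-checked with basic kinetic-energy arithmetic; the one-billion-ton mass below is an illustrative figure at the low end of the 'billions of tons' range above :

```python
# Kinetic energy of a one-billion-metric-ton object moving at ~30 km/s,
# expressed in megatons of TNT (1 Mt TNT = 4.184e15 J).
mass_kg = 1e9 * 1000           # one billion metric tons, in kilograms
velocity_m_s = 30_000          # ~30 km/s
energy_j = 0.5 * mass_kg * velocity_m_s ** 2
megatons = energy_j / 4.184e15
print(f"{megatons:,.0f} Mt TNT")  # on the order of 100,000 megatons
```

That is thousands of times the yield of the largest nuclear weapon ever tested, which is why mapping these objects matters so much.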

All of us recall how, in the 1990s, there were a number of films portraying how such a disaster might manifest (image from Wikimedia).  In the 1990s, we had little awareness of which objects were nearby at what time, and so there really was a risk that a large asteroid could hit us with little or no warning.  However, as telescopes improved, 26%/yr growth (the square root of Moore's Law, since pixel count increases as the square of linear resolution) got to work on this problem.  Now, all asteroids larger than 1 km are mapped, and almost all of the thousands larger than 140m (the size above which an object would actually hit the surface, rather than burn up in the atmosphere) are mapped as well (chart from Wikipedia).  We have identified which object might be an impact risk in which year.  In case you are wondering, there is a 370m asteroid that will pass very near (but not hit) the Earth in 2036.  Of course, by 2036, we will have mapped everything with far more precision, at this rate of improvement.  In other words, don't worry about an asteroid impact in the near future, as none of significance are anticipated in the next 17 years, and probably not for much longer than that.  Comets are a different matter, as we have not mapped most of them (and cannot, as of yet), but large ones impact too infrequently to worry about.  

Hence, the risk of an impact event, and mitigation thereof, is no longer a technological problem.  It is merely a political one.  Will the governments of the world work to divert asteroids before one hits, or will they only react after one hits in order to prevent the next impact?  These questions are complicated, as this problem is completely borderless.  Why should the United States pay the majority of the expense for a borderless problem, particularly one that has a 71% chance of hitting an ocean?  At any rate, this is another problem that went from deadly to merely one of fiscal prioritization, on account of ATOM progress.  

More interestingly, within this problem is another major business opportunity that we have discussed here in the past.  Asteroid mining is a potential industry that goes hand in hand with asteroid diversion, as pulverizing an asteroid may waste precious metals that could otherwise be captured.  Many asteroids have a much greater proportion of precious metals than the Earth's surface does : 'precious' metals are heavy, and most of the Earth's share sank to the core while the planet was forming, whereas an asteroid, with far lower gravity, has its precious metals distributed evenly throughout its structure.  There are already asteroids identified that contain hundreds of tons of gold and platinum.  Accessing these asteroids will, of course, crush the prices of these metals as traded on Earth (another ATOM effect we have seen in other commodities), and may reduce gold to an industrial metal used in much the way copper is.  This, in turn, may enable new applications that are not cost-effective at current prices of gold, platinum, palladium, etc.  But that is a topic for another time.  

 

Related :

ATOM AotM, November 2017

SETI and the Singularity

Telescope Power - Yet Another Accelerating Technology

 

November 01, 2019 in Accelerating Change, ATOM AotM, Space Exploration, The ATOM | Permalink | Comments (53)


ATOM Award of the Month, November 2017

For this month, the ATOM AotM goes outward.  Much like the September ATOM AotM, this is another dimension of imaging.  But this time, we focus on the final frontier.  Few have noticed that the rate of improvement of astronomical discovery is now on an ATOM-worthy trajectory, such that it merited an entire chapter in the ATOM publication.  

Here at The Futurist, we have been examining telescopic progress for over a decade.  In September of 2006, I estimated that telescope power was rising at a compounding rate of 26%/year, and that this trend had been ongoing for decades.  26%/year happens to be the square root of Moore's Law, which is precisely what is to be expected : to double linear resolution by halving the size of a pixel, each pixel has to be divided into four.  This is also why video game and CGI resolution rises at 26%/year.  
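The square-root relationship is easy to verify numerically, assuming Moore's Law as one doubling every 18 months (~59%/yr growth in pixel count; the 18-month cadence is the assumption here) :

```python
# If pixel count doubles every 18 months (Moore's Law), linear resolution
# grows as the square root of that rate, since pixels scale as resolution^2.
annual_pixel_growth = 2 ** (1 / 1.5)               # ~1.587, i.e. ~59%/yr
annual_resolution_growth = annual_pixel_growth ** 0.5
print(f"{(annual_resolution_growth - 1) * 100:.0f}%/yr")  # ~26%/yr
```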

Rising telescope resolution enabled the first exoplanet to be discovered in 1995, and then a steady stream after 2005.  This estimated rate led me to correctly predict that the first Earth-like planets would be discovered by 2010-11, and that happened right on schedule.  But as with many such thresholds, after the initial fanfare, the new status quo manifests and people forget what life was like before.  This leads to a continual underestimation of the rate of change by the average person.

Then, in May 2009, I published one of the most important articles ever written on The Futurist : SETI and the Singularity.  At that time, only 347 exoplanets were known, almost all of which were gas giants much larger than the Earth.  That number has grown to 3693 today, over ten times as many.  Note the familiar exponential curve inherent to every aspect of the ATOM.  Now, even finding Earth-like planets in the 'life zone' is no longer remarkable, which illustrates another aspect of human psychology towards the ATOM : a highly anticipated and wondrous advance quickly becomes a normalized status quo, and most people forget all the previous excitement.   
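The implied growth rate over that span can be computed directly from the two counts (using an 8.5-year gap between May 2009 and late 2017) :

```python
# Compound annual growth of confirmed exoplanets: 347 (May 2009) -> 3693 (late 2017).
count_2009, count_2017 = 347, 3693
years = 8.5
cagr = (count_2017 / count_2009) ** (1 / years) - 1
print(f"{cagr * 100:.0f}%/yr")  # ~32%/yr -- a clean exponential
```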

1280px-KeplerHabitableZonePlanets-20170616The rate of discovery may soon accelerate further as key process components collapse in cost.  Recent computer vision algorithms have proven themselves to be millions of times faster than human examiners.  A large part of the cost of exoplanet discovery instruments like the Kepler Space Observatory is the 12-18 month manual analysis period.  If computer vision can perform this task in seconds, the cost of comparable future projects plummets, and new exoplanets are confirmed almost immediately rather than every other year.  This is another massive ATOM productivity jump that removes a major bottleneck in an existing process structure.  A new mission like Kepler would cost dramatically less than the previous one, and will be able to publish results far more rapidly.  

Given the 26%/year trendline, the future of telescopic discovery becomes easier to predict.  In the same article, I made a dramatic prediction about SETI and the prospects of finding extraterrestrial intelligence.  Many 'enlightened' people are certain that there are numerous extraterrestrial civilizations.  While I too believed this for years (from age 6 to about 35), as I studied the accelerating rate of change, I began to notice that within the context of the Drake equation, any civilization even slightly more advanced than us would be dramatically more advanced.  While such a civilization's current activities might very well be indistinguishable from nature to us, its past activities might still be visible as evidence of its existence at that time.  This led me to realize that while there could very well be thousands of planets in our own galaxy that are slightly less advanced than us, it becomes increasingly difficult for there to be one more advanced than us that still manages to avoid detection.  Other galaxies are a different story, simply because the distance between galaxies is itself 10-20 times the diameter of the typical galaxy.  Our telescopic capacity is rising 26%/year after all, and the final variable of the Drake equation, fL, has risen from just 42 years at the time of Carl Sagan's famous clip in 1980, to 79 years now, or almost twice as long.  

Hence, the proclamation I had set in 2009 about the 2030 deadline (21 years away at the time) can be re-affirmed, as the 2030 deadline is now only 13 years away.  

2030

Despite the enormity of our galaxy and the wide range of signals that may exist, even this is eventually superseded by exponential detection capabilities.  At least our half of the galaxy will have received a substantial examination of signal traces by 2030.  While a deadline 13 years away seems near, remember that the extent of examination that happens 2017-30 will be more than in all the 400+ years since Galileo, for Moore's Law reasons alone.  The jury is out until then.  

(all images from Wikipedia or Wikimedia).  

 

Related Articles :

New Telescopes to Reveal Untold Wonders

SETI and the Singularity

Telescope Power - Yet Another Accelerating Technology

 

Related ATOM Chapters :

12. The ATOM's Effect on the Final Frontier

  

 

November 20, 2017 in Accelerating Change, ATOM AotM, Space Exploration, The Singularity | Permalink | Comments (122)


New Telescopes to Reveal Untold Wonders

A number of new telescopes will soon enter service, all of which are far more powerful than their predecessors.  This is fully expected by any longtime reader of The Futurist, for space-related articles have been a favorite theme here.  

To begin, refer to the vintage 2006 article where I estimated telescope power to be rising at a compound annual rate of approximately 26%/year, although that is a trendline of a staircase with very large steps.  This, coincidentally, is exactly the same rate at which computer graphics technology advances, which also happens to be the square root of Moore's Law's rate of progress.  According to this timeline, a wave of powerful telescopes arriving now happens to be right on schedule.  Secondly, refer to one of the very best articles on The Futurist, titled 'SETI and the Singularity', where the impact of increasing telescopic power is examined.  The exponential increase in the detection of exoplanets (chart from Wikipedia), and the implications for the Drake Equation, are measured, with a major prediction about extraterrestrial life contained therein.  

Building on that, in the ATOM e-book, I detail how accelerating technological progress has a major impact on space exploration.  Contrary to the widely-repeated belief that space exploration has plateaued since the Apollo program, technology has ensured that quite the opposite is true.  Exoplanet detection is now in the hundreds per year (and soon to be in the thousands), even as technologies such as 3D Printing in space and asteroid mining are poised to generate great wealth here on Earth.  With space innovation no longer exclusively the domain of the US, costs have been lowered through competition.  India has launched a successful Mars orbiter, in operation for two years now, at 1/10th the cost of equivalent US or Russian programs.  

Related ATOM Chapters :

3. Technological Disruption is Pervasive and Deepening

12. The ATOM's Effect on the Final Frontier

 

 

August 28, 2016 in Accelerating Change, Space Exploration, The ATOM | Permalink | Comments (6)


SETI and the Singularity

The Search for Extra-Terrestrial Intelligence (SETI) seeks to answer one of the most basic questions of human identity - whether we are alone in the universe, or merely one civilization among many.  It is perhaps the biggest question that any human can ponder.  

The Drake Equation, created by astronomer Frank Drake in 1960, estimates the number of advanced extra-terrestrial civilizations in the Milky Way galaxy in existence at this time.  Watch this 8-minute clip of Carl Sagan in 1980 walking the audience through the parameters of the Drake Equation.  The Drake equation manages to educate people on the deductive steps needed to understand the basic probability of finding another civilization in the galaxy, but as the final result varies so greatly based on even slight adjustments to the parameters, it is hard to make a strong argument for or against the existence of extra-terrestrial intelligence via the Drake equation.  The most speculative parameter is the last one, fL, which is an estimation of the total lifespan of an advanced civilization.  Again, this video clip is from 1980, and thus only 42 years after the advent of radio astronomy in 1938.  Another 29 years, or 70%, have since been added to the age of our radio-astronomy capabilities, and the prospect of nuclear annihilation of our civilization is far lower today than it was in 1980.  No matter how ambitious or conservative a stance you take on the other parameters, the value of fL, in terms of our own civilization, continues to rise.  This leads us to our first postulate :

The expected lifespan of an intelligent civilization is rising.       
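The Drake equation itself is just a product of factors; here is a sketch in which every parameter value is an illustrative placeholder, chosen only to show how strongly the result swings with small adjustments :

```python
# Drake equation: N = R* x fp x ne x fl x fi x fc x L
# All parameter values below are illustrative placeholders, not estimates;
# the point is the sensitivity of N, especially to L (civilization lifespan).
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifespan_years):
    return r_star * f_p * n_e * f_l * f_i * f_c * lifespan_years

# One optimistic and one pessimistic toy scenario:
optimistic = drake(10, 0.5, 2, 0.5, 0.1, 0.1, 10_000)
pessimistic = drake(1, 0.2, 0.1, 0.05, 0.01, 0.01, 100)
print(optimistic, pessimistic)  # hundreds of civilizations vs. effectively alone
```

Identical deductive steps, wildly different answers, which is exactly why the equation educates but cannot settle the debate.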

Carl Sagan himself believed that in such a vast cosmos, intelligent life would have to emerge in multiple locations, and that the cosmos was thus 'brimming over' with intelligent life.  On the other side are various explanations for why intelligent life should be rare.  The Rare Earth Hypothesis argues that the combination of conditions that enabled life to emerge on Earth is extremely rare.  The Fermi Paradox, originating back in 1950, questions the contradiction between the supposedly high incidence of intelligent life and the continued lack of evidence of it.  The Great Filter theory suggests that many intelligent civilizations self-destruct at some point, explaining their apparent scarcity.  This leads to the conclusion that the easier it is for civilization to advance to our present stage, the bleaker our prospects for long-term survival, since the 'filter' that other civilizations collide with has yet to confront us.  A contrarian case can thus be made that the longer we go without detecting another civilization, the better. 

But one dimension that is conspicuously absent from all of these theories is an accounting for the accelerating rate of change.  I have previously provided evidence that telescopic power is also an accelerating technology.  After Galileo first turned a telescope to the sky in 1609, major discoveries used to be several decades apart, but now are separated by only years.  An extrapolation of various discoveries enabled me to crudely estimate that our observational power is currently rising at 26% per year, even though the first 300 years of the telescope saw improvement of only 1% a year.  At the time of the 1980 Cosmos television series, it was not remotely possible to confirm the existence of any extrasolar planet, or to resolve any star aside from the Sun into a disk.  Yet both were accomplished by the mid-1990s.  As of May 2009, we have confirmed a total of 347 extrasolar planets, with the rate of discovery rising quickly.  While the first confirmation did not come until 1995, we are now discovering new planets at a rate of 1 per week.  With a number of new telescope programs being launched, this rate will rise further still.  Furthermore, most of the planets we have found so far are large.  Soon, we will be able to detect much smaller planets, including Earth-sized ones.  This leads us to our second postulate :

Telescopic power is rising quickly, possibly at 26% a year.  

This Jet Propulsion Laboratory chart of exoplanet discoveries through 2004 is very overdue for an update, but is still instructive.  The x-axis is the distance of the planet from the star, and the y-axis is the mass of the planet.  All blue, red, and yellow dots are exoplanets, while the larger circles with letters in them are our own local planets, with the 'E' being Earth.  Most exoplanet discoveries up to that time were of Jupiter-sized planets that were closer to their stars than Jupiter is to the sun.  The green zone, or 'life zone', is the area within which a planet is a candidate to support life within our current understanding of what life is.  Even then, this chart does not capture the full possibilities for life, as a gas giant like Jupiter or Saturn, at the correct distance from a Sun-type star, might have rocky satellites that would thus also be in the life zone.  In other words, if Saturn were as close to the Sun as Earth is, Titan would also be in the life zone, and thus the green area should extend vertically higher to capture the possibility of such large satellites of gas giants.  The chart shows that telescopes commissioned in the near future will enable the detection of planets in the life zone.  If this chart were updated, a few would already be recorded here.  Some of the missions and telescopes that will soon be sending over a torrent of new discoveries are :

Kepler Mission : Launched in March 2009, the Kepler Mission will continuously monitor a field of 100,000 stars for the transit of planets in front of them.  This method has a far higher chance of detecting Earth-sized planets than prior methods, and we will see many discovered by 2010-11.

COROT : This European mission was launched in December 2006, and uses a similar method as the Kepler Mission, but is not as powerful.  COROT has discovered a handful of planets thus far. 

New Worlds Mission : This 2013 mission will build a large sunflower-shaped occulter in space to block the light of nearby stars to aid the observation of extrasolar planets.  A large number of planets close to their stars will become visible through this method. 

Allen Telescope Array : Funded by Microsoft co-founder Paul Allen, the ATA will survey 1,000,000 stars for radio astronomy evidence of intelligent life.  The ATA is sensitive enough to discover a large radio telescope such as the Arecibo Observatory up to a distance of 1000 light years.  Many of the ATA components are electronics that decline in price in accordance with Moore's Law, which will subsequently lead to the development of the... 

Square Kilometer Array : Far larger and more powerful than the Allen Telescope Array, the SKA will be in full operation by 2020, and will be the most sensitive radio telescope ever.  The continual decline in the price of processing technology will enable the SKA to scour the sky thousands of times faster than existing radio telescopes. 

These are merely the missions that are already under development or even under operation.  Several others are in the conceptual phase, and could be launched within the next 15 years.  So many methods of observation used at once, combined with the cost improvements of Moore's Law, leads us to our third postulate, which few would have agreed with at the time of 'Cosmos' in 1980 :

Thousands of planets in the 'life zone' will be confirmed by 2025. 

Now, we will revisit the under-discussed factor of accelerating change.  Out of 4.5 billion years of Earth's existence, it has only hosted a civilization capable of radio astronomy for 71 years. But as our own technology is advancing on a multitude of fronts, through the accelerating rate of change and the Impact of Computing, each year, the power of our telescopes increases and the signals of intelligence (radio and TV) emitted from Earth move out one more light year.  Thus, the probability for us to detect someone, and for us to be detected by them, however small, is now rising quickly.  Our civilization gained far more in both detectability, and detection-capability, in the 30 years between 1980 and 2010, relative to the 30 years between 1610 and 1640, when Galileo was persecuted for his discoveries and support of heliocentrism, and certainly relative to the 30 years between 70,000,030 and 70,000,000 BC, when no advanced civilization existed on Earth, and the dominant life form was Tyrannosaurus. 

Nikolai Kardashev devised a scale to measure the level of advancement that a technological civilization has achieved, based on its energy technology.  This simple scale can be summarized as follows :

Type I : A civilization capable of harnessing all the energy available on their planet.

Type II : A civilization capable of harnessing all the energy available from their star.

Type III : A civilization capable of harnessing all the energy available in their galaxy.

The scale is logarithmic, and our civilization currently would receive a Kardashev score of 0.72.  We could potentially achieve full Type I status by the mid-21st century due to a technological singularity.  Some have estimated that our exponential growth could elevate us to Type II status by the late 22nd century.  
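The 0.72 figure comes from Sagan's logarithmic interpolation of the scale, K = (log10 P - 6) / 10 with P in watts; the ~1.6e13 W value used below for humanity's power consumption is an assumed round number :

```python
import math

# Sagan's continuous Kardashev rating: K = (log10(P) - 6) / 10, with P in watts.
# The ~1.6e13 W figure for humanity's total power use is an assumed round number.
def kardashev(power_watts):
    return (math.log10(power_watts) - 6) / 10

print(round(kardashev(1.6e13), 2))  # ~0.72
print(round(kardashev(1e16), 2))    # Type I: all power reaching the planet
```

On this interpolation, each full Kardashev type corresponds to a 10-billion-fold jump in harnessed power, which is why the climb from 0.72 to 1.0 is still enormous.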

This has given rise to another faction in the speculative debate on extra-terrestrial intelligence, a view held by Ray Kurzweil, among others.  The theory is that it takes such a short time (a few hundred years) for a civilization to go from the earliest mechanical technology to reach a technological singularity where artificial intelligence saturates surrounding matter, relative to the lifetime of the home planet (a few billion years), that we are the first civilization to come this far.  Given the rate of advancement, a civilization would have to be just 100 years ahead of us to be so advanced that they would be easy to detect within 100 light years, despite 100 years being such a short fraction of a planet's life.  In other words, where a 19th century Earth would be undetectable to us today, an Earth of the 22nd century would be extremely conspicuous to us from 100 light years away, emitting countless signals across a variety of mediums. 

A Type I civilization within 100 light years would be readily detected by our instruments today.  A Type II civilization within 1000 light years will be visible to the Allen or the Square Kilometer Array.  A Type III would be the only type of civilization that we probably could not detect, as we might have already been within one all along.  We do not have a way of knowing if the current structure of the Milky Way galaxy is artificially designed by a Type III civilization.  Thus, the fourth and final postulate becomes :

A civilization slightly more advanced than us will soon be easy for us to detect.

The Carl Sagan view of plentiful advanced civilizations is the generally accepted wisdom, and a view that I held for a long time.  On the other hand, the Kurzweil view is understood by very few, for even in the SETI community, not that many participants are truly acceleration aware.  The accelerating nature of progress, which existed long before humans even evolved, as shown in Carl Sagan's cosmic calendar concept, also from the 1980 'Cosmos' series, simply has to be considered as one of the most critical forces in any estimation of extra-terrestrial life.  I have not yet migrated fully to the Kurzweil view, but let us list our four postulates out all at once :

The expected lifespan of an intelligent civilization is rising.  

Telescopic power is rising quickly, possibly at 26% a year. 

 

Thousands of planets in the 'life zone' will be confirmed by 2025. 

A civilization slightly more advanced than us will soon be easy for us to detect.

As the Impact of Computing will ensure that computational power rises 16,000X between 2009 and 2030, and that our radio astronomy experience will be 92 years old by 2030, there are just too many forces that are increasing our probabilities of finding a civilization if one does indeed exist nearby.  It is one thing to know of no extrasolar planets, or of any civilizations.  It is quite another to know about thousands of planets, yet still not detect any civilizations after years of searching.  This would greatly strengthen the case against the existence of such civilizations, and the case would grow stronger by the year.  Thus, these four postulates in combination lead me to conclude that :

 

2030
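The 16,000X computational gain cited in the reasoning above is consistent with one Moore's-Law doubling every 18 months over 2009-2030 (the 18-month cadence is the assumption here) :

```python
# Computational power gain 2009 -> 2030, assuming one doubling every 18 months.
doublings = (2030 - 2009) / 1.5   # 14 doublings in 21 years
gain = 2 ** doublings
print(f"{gain:,.0f}x")  # 16,384x -- the '16,000X' figure in the text
```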

 

 

 

 

Most of the 'realistic' science fiction regarding first contact with another extra-terrestrial civilization portrays that civilization being domiciled relatively nearby.  In Carl Sagan's 'Contact', the civilization was from the Vega star system, just 26 light years away.  In the film 'Star Trek : First Contact', humans come in contact with Vulcans in 2063, but the Vulcan homeworld is also just 16 light years from Earth.  The possibility of any civilization this near to us would be effectively ruled out by 2030 if we do not find any favorable evidence.  SETI should still be given the highest priority, of course, as the lack of a discovery is just as important as making a discovery of extra-terrestrial intelligence. 

If we do detect evidence of an extra-terrestrial civilization, everything about life on Earth will change.  Both 'Contact' and 'Star Trek : First Contact' depicted how an unprecedented wave of human unity swept across the globe upon evidence that humans were, after all, one intelligent species among many.  In Star Trek, this led to what essentially became a techno-economic singularity for the human race.  As shown in 'Contact', many of the world's religions were turned upside down upon this discovery, and had to revise their doctrines accordingly.  Various new cults devoted to the worship of the new civilization formed almost immediately. 

If, however, we are alone, then according to many Singularitarians, we will be the ones to determine the destiny of the cosmos.  After a technological singularity in the mid-21st century that merges our biology with our technology, we would proceed to convert all matter into artificial intelligence, make use of all the elementary particles in our vicinity, and expand outward at speeds that eventually exceed the speed of light, ultimately saturating the entire universe with our intelligence in just a few centuries.  That, however, is a topic for another day.   

May 23, 2009 in Accelerating Change, Core Articles, Space Exploration, The Singularity

Liquid Mirrors May Boost Future Telescopes

On September 28, 2006, I made the case that telescopic power is indeed an accelerating technology, set to improve at an estimated rate of 26% a year for the next 30 years.  I believe that increasingly more powerful telescopes will ensure that we discover the first genuinely Earth-like planet in another star system by 2011, and that by 2025, we will have discovered thousands of such planets. 

In support of this thesis of accelerating telescope improvement, I want to bring attention to one particular prospective technology that greatly increases the chances of this predicted rate of improvement holding true: liquid mirrors that could at some point replace glass in the largest telescopes (from MIT Technology Review). 

The mirror is a pool of salt-based liquids that freeze only at very low temperatures, coated with a silver film.  While practical usage is at least 20 years away, the details reveal a technology that is brilliantly simple, yet tantalizingly capable of addressing almost all of the problems facing the construction of giant telescopes.  Glass mirrors are exceedingly difficult to scale to larger sizes, and even the most minor defect can render a mirror useless.  Reflective liquid, by contrast, can be scaled up almost indefinitely, limited only by the perimeter of the enclosure it is placed in.  External blows that would crack or scratch a glass mirror would have no effect on a liquid, which could quickly return to its original shape.

I don't expect updates on this technology in the near future, but the next logical step would be a demonstration of the technology on a smaller telescope.  If that succeeds, the ultimate goal would be, by 2030, a massive telescope more than 200 meters in diameter placed on the Moon, where the sky is free of atmospheric distortion and the ground is nearly free of seismic shaking.  This would enable us to observe Earth-like planets at a distance of up to 100 light years, as well as observe individual stars near the center of the Milky Way galaxy (30,000 light years away). 
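As a rough sense of the scale involved: a telescope's light-gathering power grows with the square of its mirror diameter, so a 200-meter lunar mirror would dwarf the Hubble Space Telescope's 2.4-meter mirror (the Hubble figure is my addition for comparison, not from the original post):

```python
# Rough light-gathering comparison: a 200 m lunar liquid-mirror telescope
# vs. Hubble's 2.4 m glass mirror.  Collecting area scales with the
# square of the mirror diameter.

def area_ratio(d1_m: float, d2_m: float) -> float:
    """Ratio of collecting areas for two circular mirrors of the given diameters."""
    return (d1_m / d2_m) ** 2

print(f"A 200 m mirror gathers ~{area_ratio(200, 2.4):,.0f}x as much light as Hubble")
```

Nearly 7,000 times Hubble's collecting area goes a long way toward explaining how such an instrument could resolve Earth-like planets 100 light years away.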

Related:

Are You Acceleration Aware?

Finding Earth-like Planets Will Soon be Possible

June 23, 2007 in Accelerating Change, Science, Space Exploration, Technology

New Astronomy Images, and Evidence of Telescopic Advances

I happened to come across this post, which displays the author's selections of the top astronomical photographs of 2006.  The one I am particularly stunned by is #5, the transit of the Space Shuttle and International Space Station in front of the Sun.  The precise timing needed to execute this image is mind-boggling; the odds of capturing it by chance are probably less than one in a million.  The photographer, Thierry Legault, had to 1) know when the shuttle was approaching the ISS, 2) know when both of them would be in front of the Sun relative to his location in France, which was a zone of observation only 7.4 km wide, and 3) capture the image during the 0.6 seconds of the transit.
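The sub-second transit window can be sanity-checked with basic geometry.  The orbital speed, slant range, and solar diameter below are my assumed round numbers, not figures from the post:

```python
import math

# How fast does the ISS sweep across the Sun's disc as seen from the ground?
# Assumed round figures (not from the post): ISS orbital speed ~7.7 km/s,
# slant range ~400 km for a high overhead pass, solar diameter ~0.53 degrees.

iss_speed_km_s = 7.7
slant_range_km = 400.0
sun_diameter_deg = 0.53

# Angular rate (deg/s) = linear speed / distance, converted from radians
angular_rate_deg_s = math.degrees(iss_speed_km_s / slant_range_km)
transit_s = sun_diameter_deg / angular_rate_deg_s

print(f"ISS sweeps ~{angular_rate_deg_s:.2f} deg/s; transit lasts ~{transit_s:.1f} s")
```

This yields roughly half a second for an overhead pass, consistent with the 0.6-second window Legault actually had (a lower elevation angle lengthens the slant range and the transit slightly).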

Not all of these ten photographs are exclusively the result of instruments and technologies that did not exist a few years ago, but 3 to 4 of them are.  As we have discussed before, telescopic power is also an accelerating technology, and increasingly impressive images will continue to emerge as new telescopes and supporting resources become operational. 

Little can match an astronomical discovery's ability to generate wonder, optimism, and just a general good mood.  We shall see, within just the next couple of decades, images that even the late Carl Sagan would have been in awe of. 

January 07, 2007 in Accelerating Change, Science, Space Exploration, Technology

Telescope Power - Yet Another Accelerating Technology

Earlier, we had an article about how our advancing capability to observe the universe would soon enable the detection of Earth-like planets in distant star systems.  Today, I present a complementary article, in which we will examine the progression in telescopic power, why the rate of improvement is so much faster than it was just a few decades ago, and why amazing astronomical discoveries will be made much sooner than the public is prepared for. 

The first telescope used for astronomical purposes was built by Galileo Galilei in 1609, after which he discovered the four large moons of Jupiter.  The rings of Saturn were discovered by Christiaan Huygens in 1655, with a telescope more powerful than Galileo's.  Consider that the planet Uranus was not detected until 1781, and the similar-sized Neptune was not detected until 1846.  Pluto was not observed until 1930.  That these discoveries were decades apart indicates how slow the rate of progress was in the 17th, 18th, 19th, and early 20th centuries. 

The first extrasolar planet was not detected until 1995, but since then, hundreds more with varying characteristics have been found.  In fact, some of the extrasolar planets detected are even the same size as Neptune.  So while an object of Neptune's size in our own solar system (4 light-hours away) could remain undetected from Earth until 1846, we are now finding comparable bodies in star systems 100 light years away.  This wonderful, if slightly outdated, chart provides details of extrasolar planet discoveries. 

The same goes for observing stars themselves.  Many would be surprised to know that humanity had never observed a star (other than the Sun) as a disc rather than a mere point of light, until the Hubble Space Telescope imaged Betelgeuse in the mid-1990s.  Since then, several other stars have been resolved into discs, with details of their surfaces now apparent.

So is there a way to string these historical examples into a trend that projects the future of what telescopes will be able to observe?  The extrasolar planet chart above seems to suggest that in some cases, the next 5 years will have a 10x improvement in this particular capacity - a rate comparable to Moore's Law.  But is this just a coincidence or is there some genuine influence exerted on modern telescopes by the Impact of Computing? 

Many advanced telescopes, both orbital and ground-based, are in the works as we speak.  Among them are the Kepler Space Observatory, the James Webb Space Telescope, and the Giant Magellan Telescope, all of which will greatly exceed the power of current instruments.  Slightly further in the future is the Overwhelmingly Large Telescope (OWL).  The OWL will have the ability to see celestial objects 1000 times as dim as what the Hubble Space Telescope (HST) can observe, and 5 trillion times as faint as what the naked eye can see.

The HST launched in 1990, and the OWL is destined for completion around 2020 (for the moment, we shall ignore the fact that the OWL actually costs less than the HST).  This improvement factor of 1000 over 30 years can be crudely annualized into a 26% compound growth rate.  This is much slower than the rate suggested in the extrasolar planet chart, however, indicating that the rate of improvement in one aspect of astronomical observation does not automatically scale to others.  Still, approximately 26% a year is hugely faster than progress was when it took 65 years after the discovery of Uranus to find Neptune, a body with half the brightness.  65 years for a doubling is a little over 1% a year of improvement between 1781 and 1846.  We have gone from one major discovery per century to multiple new discoveries per decade - that is quite an accelerating curve. 
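Both annualized rates quoted above can be verified with a couple of lines of arithmetic (a quick sketch using only the figures already in the text):

```python
# Annualizing telescope-sensitivity improvements into compound growth rates:
# OWL is ~1000x Hubble over ~30 years (1990 -> ~2020), and Neptune, at half
# Uranus's brightness, was found 65 years after Uranus (1781 -> 1846).

def annual_rate(improvement_factor: float, years: float) -> float:
    """Compound annual growth rate implied by a total improvement over a span."""
    return improvement_factor ** (1.0 / years) - 1.0

owl_rate = annual_rate(1000, 30)    # HST -> OWL era
uranus_rate = annual_rate(2, 65)    # Uranus -> Neptune era

print(f"Modern era:  {owl_rate:.1%} per year")
print(f"1800s era:   {uranus_rate:.1%} per year")
```

The first figure comes out to about 26% per year and the second to about 1% per year, matching the rates derived in the paragraph above.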

We can thus predict with considerable confidence that the first Earth-like planet will make headlines in 2010 or 2011, and by 2023, we will have discovered thousands of such planets.  This means that by 2025, a very important question will receive considerable fuel on at least one side of the debate...

Image attribution : Courtesy NASA/JPL-Caltech

September 28, 2006 in Accelerating Change, Science, Space Exploration, Technology, The Singularity

Finding Earth-like Planets Will Soon be Possible

First, the Earth (whether flat or spherical) was considered to be the center of the universe.  Then, the Sun was considered to be the center of the universe.  Eventually, mankind came to realize that the Sun is just one of 200 to 400 billion stars within the Milky Way galaxy, which itself is just one among hundreds of billions of galaxies in the known universe, and there may even be other universes.

Astronomers have long believed that many stars would have planets around them, including some Earth-like planets.  Carl Sagan wrote and spoke extensively about this in the 1970s and 80s, but we did not have the technology to detect such planets at the time, so the discussions remained theoretical.  There were no datapoints by which to estimate what percentage of stars had what number of planets, of which what fraction were Earth-like. 

The first confirmed extrasolar planet was discovered in 1995.  Since then, continually improving technology has yielded discovery of more than one per month, for a grand total of about 176 to date.  So far, most known extrasolar planets have been Jupiter-sized or larger, with the detection of Earth-sized planets beyond our current technology. 
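Extrapolating that discovery rate forward gives a sense of where it leads.  This is a naive projection from just two datapoints in the text (one planet in 1995, ~176 in 2006), assuming the cumulative count keeps growing at the same exponential rate:

```python
# Project the cumulative extrasolar-planet count forward, assuming the
# exponential growth rate implied by 1 planet (1995) -> ~176 planets (2006)
# continues unchanged.

def project(count_start, year_start, count_now, year_now, year_future):
    """Extrapolate a cumulative count exponentially from two observed datapoints."""
    rate = (count_now / count_start) ** (1 / (year_now - year_start))
    return count_now * rate ** (year_future - year_now)

projected_2015 = project(1, 1995, 176, 2006, 2015)
print(f"Naive trendline implies ~{projected_2015:,.0f} known planets by 2015")
```

The implied growth rate is roughly 60% per year, and the trendline lands well into the thousands by 2015 - consistent with the projection in the next paragraph.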

But the Impact of Computing is finding its way here as well, and new instruments will continue to deliver an exponentially growing ability to detect smaller and more distant planets.  Mere projection of the rate of discovery since 1995 predicts that thousands of planets, some of them Earth-sized, will be discovered by 2015.  To comfortably expect this, we just need to examine whether advances in astronomical observation are keeping up with this trend.  Let's take a detailed look at the chart below from a Jet Propulsion Laboratory publication, which has a lot of information.

[Chart: known extrasolar planets by mass and orbital distance - NASA/JPL-Caltech]

The bottom horizontal axis is the distance from the star, and the top horizontal axis is the orbital period (the top and bottom can contradict each other for stars of different mass, but let's put that aside for now).  The right vertical axis is the mass as a multiple of the Earth's mass.  The left vertical axis is the same thing, merely in Jupiter masses (318 times that of the Earth). 

Current detection capability represents the area above the purple and first two blue lines, and the blue, red, and yellow dots represent known extrasolar planets.  Planets less massive than Saturn have been detected only when they are very close to their stars.  The green band represents the zone on the chart where an Earth-like planet, with similar mass and distance from its star as our Earth, would reside.  Such a planet would be a candidate for life. 

The Kepler Space Observatory will launch in mid-2008, and by 2010-11 will be able to detect planets in the green zone around stars as far as 1000 light years away.  It is set to examine 100,000 different stars, so it would be very surprising if the KSO didn't find dozens of planets in the green zone. 
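Why "dozens" is a plausible expectation: a transit is only visible when the planet's orbit is nearly edge-on to us, and for an Earth-Sun analogue that geometric probability is roughly the Sun's radius divided by 1 AU.  The 10% Earth-like fraction below is purely an illustrative assumption, not a figure from this post:

```python
# Back-of-envelope Kepler yield: geometric transit probability for an
# Earth-Sun analogue is approximately R_star / a (solar radius / 1 AU).

R_SUN_KM = 696_000
AU_KM = 149_600_000

transit_prob = R_SUN_KM / AU_KM      # ~0.5% chance of an edge-on alignment
stars_surveyed = 100_000
earthlike_fraction = 0.10            # illustrative assumption only

expected = stars_surveyed * earthlike_fraction * transit_prob
print(f"Alignment odds: {transit_prob:.2%}; expected green-zone detections: ~{expected:.0f}")
```

Even with only one star in ten hosting an Earth-like planet, the survey size overwhelms the unfavorable alignment odds and yields several dozen detections.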

After 2015, instruments up to 1000 times more advanced than those today, such as the Overwhelmingly Large Telescope and others, will enable us to conduct more detailed observations of the hundreds of green-zone planets that will be identified by then.  We will begin to get an idea of their color (and thus the presence of oceans) and atmospheric composition.  From there, we will have a distinct list of candidate planets that could support Earth-like life. 

This will be a fun one to watch over the next decade.  Wait for the first headline of 'Earth-like planet discovered' in 2010 or 2011.  

Image Attribution: Courtesy NASA/JPL-Caltech

March 26, 2006 in Accelerating Change, Space Exploration, Technology

Elevators into Space - Yes, Really.

Most popular science fiction is still not all that ambitious in what it expects the audience to accept.  The basics, such as the assumptions that space will be explored by large spaceships and that faster-than-light travel will be achieved before human near-immortality, are taken as given.  Yet neither is a probable outcome within current trends of technological progress, and such assumptions represent an unwillingness to challenge many basic beliefs about our technology and existence. 

Reality can be far more exotic than the science fiction of earlier generations, because not only are the wrong trends extrapolated in science fiction, but linear, rather than exponential, thinking is applied. 

People are working to build a functioning space elevator by 2018 - just 12 years from now.  It would consist of a carbon nanotube ribbon extending into space, able to carry 100 tons up at a time to a height of at least 65 miles.  NASA has a long-term goal of extending an elevator all the way up to 62,000 miles in height, or one-fourth of the distance to the Moon. 

Beyond absurd, you say?

Material strength, at least, is not going to be a problem.  Carbon nanotubes can form materials strong enough and light enough to handle this.  Nanotubes were priced at $230,000 per pound in 2000, but the price is dropping exponentially, and even a 65-mile ribbon would not be tremendously expensive by 2018.
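To see what "dropping exponentially" could mean in practice, here is a quick projection from the $230,000-per-pound price in 2000.  The decline rates are illustrative assumptions, since the post does not specify one:

```python
# Project the nanotube price under a constant annual exponential decline,
# starting from $230,000/lb in 2000.  The decline rates are assumptions
# chosen to bracket "dropping exponentially".

def price(start_price, start_year, year, annual_decline):
    """Price in a future year under a fixed annual fractional decline."""
    return start_price * (1 - annual_decline) ** (year - start_year)

for decline in (0.30, 0.50):
    p = price(230_000, 2000, 2018, decline)
    print(f"At {decline:.0%}/yr decline: ${p:,.2f}/lb in 2018")
```

Even the milder 30%-per-year decline brings the price down by three orders of magnitude over 18 years, which is why a 65-mile ribbon need not be tremendously expensive by 2018.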

My opinion on whether this goal is possible?  It is difficult, and 2018 might be a decade too soon, even if the effort ultimately succeeds.  But a voyage to the Moon would have appeared difficult to Thomas Jefferson, and the accelerating rate of progress continues to shorten the interval between major innovations.  The teams involved already advanced from 300 m to 1600 m in just a few months.  What if they got to, say, 10 miles by 2016?  Would people take notice?

This will be a fun one to watch over the next few years.  Stay tuned for updates.

February 16, 2006 in Space Exploration, Technology

Exponential, Accelerating Growth in Transportation Speed

In the modern world, few people truly understand that the world is progressing at an exponential and accelerating rate.  This is the most critical and fundamental aspect of making any attempt to understand and predict the future.  Without a deep appreciation for this, no predictions of the intermediate and distant future are credible.

Read Ray Kurzweil's essay on this topic for an introduction.

Among the many examples of accelerating progress, one of the easiest to historically track and grasp is the rate of advancement in transportation technology.  Consider the chart below :

Speed_1

For thousands of years, humans could move no faster than the pace of a horse.  Then came the knee of the curve: the invention of the steam locomotive in the early 19th century enabled sustained speeds of 60 mph or more.  After that came the automobile, airplane, and supersonic jet.  In 1957, humans launched the first unmanned vehicle into orbit, and by 1959 unmanned probes had achieved escape velocity of 25,000 mph.  In 1977, the Voyager 1 and 2 spacecraft were launched on an interplanetary mission, reaching peak speeds of 55,000 mph.  However, in the 29 years since, we have not launched a vehicle that has exceeded this speed. 

Given these datapoints, what trajectory of progress can we extrapolate for the future?  Will we ever reach the speed of light, and if so, under what circumstances?

Depending on how you project the trendline, the speed of light may be reached by Earth-derived life-forms anywhere between 2075 and 2500.  How would this be possible?

Certainly, achieving the speed of light would be extremely difficult, just like a journey to the Moon might have appeared extremely difficult to the Wright brothers.  However, after the 1000-fold increase in maximum speed achieved during the 20th century, a mere repeat of the same magnitude of improvement would get us there.
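One illustrative way to project that trendline: assume the 20th century's pace (a 1000-fold increase per century) simply continues from Voyager's 55,000 mph.  This is a sketch of a single projection, not the full range of trendlines mentioned above:

```python
import math

# If maximum achieved speed kept improving at the 20th century's pace
# (1000x in 100 years, ~7.2% per year), when would the trendline cross
# the speed of light, starting from Voyager's 55,000 mph in 1977?

SPEED_OF_LIGHT_MPH = 670_616_629

rate = 1000 ** (1 / 100)                       # 20th-century annual growth factor
years_needed = math.log(SPEED_OF_LIGHT_MPH / 55_000) / math.log(rate)
crossing_year = 1977 + years_needed

print(f"Trendline crosses the speed of light around {crossing_year:.0f}")
```

This particular assumption lands in the early 22nd century, comfortably inside the 2075-2500 window; steeper or shallower trendlines move the crossing date toward either end of that range.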

But what of various limits on the human body, Einstein's Theory of Relativity, the amount of energy needed to propel a vehicle at this speed, or a host of other unforeseen problems that could arise as we get closer to light-speed transportation?  Well, why assume that the trip will be made by humans in their current form at all?

Many top futurists believe that the accelerating rate of change will become human-surpassing by the mid-21st century, in an event known as the Singularity.  Among other things, this predicts a merger between biology and technology, to the extent that a human's 'software' can be downloaded and backed up outside of his 'hardware'. 

Such a human mind could be stored in a tiny computer that would not require air or water, and might be smaller than a grain of sand.  This would remove many of the perceived limitations on light-speed travel, and may in fact be precisely the path we are on. 

I will explain this in much more detail in the near future.  In the meantime, read more about why this is possible.

February 07, 2006 in Accelerating Change, Space Exploration, Technology, The Singularity

© The Futurist