Comments on: Poll: Top 10 Existential Risks
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks
Safeguarding Humanity

By: Willard Wells
Fri, 10 Sep 2010 04:45:02 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-69130

We can be sure that natural hazards are insignificant because humankind has survived 200,000 years of exposure. By contrast, we’ve had only about 60 years of adaptation to man-made existential threats, and new ones appear continually.
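The survival-time comparison can be made concrete with a simple Laplace rule-of-succession estimate; the sketch below is only an illustration of that reasoning (the 200,000-year and 60-year figures come from the comment, everything else is assumed):

    # Rough sketch, assuming a Laplace rule-of-succession estimate: after
    # surviving T years with no extinction, the chance that the next year
    # ends in extinction is roughly 1 / (T + 2).
    def laplace_annual_risk(years_survived):
        return 1.0 / (years_survived + 2)

    natural = laplace_annual_risk(200_000)   # ~200,000 years of natural hazards
    man_made = laplace_annual_risk(60)       # ~60 years of man-made hazards

    print(f"natural hazards:  ~{natural:.1e} per year")   # ~5.0e-06
    print(f"man-made hazards: ~{man_made:.1e} per year")  # ~1.6e-02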

Nuclear holocaust does not threaten extinction because it would happen in the Northern Hemisphere. New Zealand’s South Island and Tierra del Fuego are insulated from its effects by at least one complete Hadley cell of atmospheric circulation.

The biggest risks are uncategorized surprises. For example, a mad billionaire (trillionaire?) ‘hears’ orders from God to exterminate the human race. Or pollutants in the ocean produce mutant phytoplankton that emit poison gas.

By: DPirate
Mon, 18 Jan 2010 08:12:08 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-43246

“The third hard ingredient is the ability to effect sane social systems which bring about the blooming of individual potential, and which is chronically blunted when the fundamentals of the first two elements are inadequate.”

Sounds like Lenin, lol.

By: DPirate
Mon, 18 Jan 2010 08:09:10 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-43245

Our only existential threat is humanity. Soon enough, what little there is to eat will be full of poisons. Maybe some elite cadre will be able to consolidate enough power to save us from ourselves, but who wants to live like that? Better that the whole thing comes down and we start over in a few thousand years. False start.

By: A.H. Jessup
Mon, 06 Jul 2009 18:09:33 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-35918

There are two major vectors–energy and water–which define the complexities of most of these symptoms. The social inversions that lead to genocide and government abuse tend to go away where a sufficiency of these two elements is available, combined with enough education to use them positively.

Global warming is an energy dysfunction; most of the political complexities are water-driven and energy-driven. Virus control is a lot easier, as well, when these are resolved.

The third hard ingredient is the ability to effect sane social systems which bring about the blooming of individual potential, and which is chronically blunted when the fundamentals of the first two elements are inadequate.

By: Bon Davis
Wed, 31 Dec 2008 10:12:46 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-29459

I will go out on a different limb here. I would allocate $50 million to unfriendly AI. For most of the space-borne threats, there isn’t really much we could honestly do, especially near-term. I agree with earlier comments: space invasion, government abuse and simulation shutdown don’t deserve to be mentioned (how do you fund against a government abuse of power?). AI is a more imminent threat and different from the others: it is the one technology that could actively turn against us, or pose a threat through indifference rather than hostility (see Yudkowsky), whereas the others require a human to push the button or open the vial. Still: $20 million to nanotechnology, $10 million to nuclear, $10 million to biological (for both of these, life would persist even without help, if in limited form) and $10 million to environmental threats.

By: Press to Digitate
Tue, 19 Aug 2008 04:47:56 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-25989

$20M Superintelligent AI un-friendly…
$20M Nanotechnology gray goo…
These threats deserve special consideration out of respect for their sheer inevitability. We *KNOW* that they will occur, that they will occur within the next 20 years, and that it is only a question of where and when. There are too many nucleation sites from which Strong AI or Carbon-fixing Nanoreplicators might emerge, and too many ‘good’ motivations prompting them to be developed for well-intentioned purposes. Every year that passes, the tools and enabling technologies for each become cheaper, more powerful and more readily available. These are not occasional rarities like a Big Space Rock or Coronal Mass Ejection; these will happen ONCE, and then it’s all over, unless we can somehow prepare ahead of time.

$10M Biological viruses…
The technology for genetic manipulation and synthetic life also becomes exponentially cheaper, more powerful and more readily available year by year, and it is likely inevitable that someone will create a virulent new synthetic pathogenic organism that defies common biological defenses. In this case, however, the tools to counter it also grow more potent with the passage of time. We live in an age when the DNA of any new organism can be sequenced in a matter of hours, a cure modelled by computer within days, and industrial quantities synthesized in biorefineries within a month. Nevertheless, its inevitability places it high on the list of priorities.

$10M Space Threats asteroids…
The Big Space Rock has to be on the list because it’s happened before, and will happen again. We have many ways of dealing with the problem should it occur, provided we are prepared and a large enough detection effort finds it in time.

$10M Environmental global warming…
The technologies to solve the environmental energy issue already exist, but if they are not disseminated and implemented in time on a broad enough scale, the Clathrate Gun (methane released from the melting permafrost) will render the planet uninhabitable within the century. 4,000 billion tonnes of methane (>20x as potent a GHG as CO2) will dwarf anything man can produce as a driver of Climate Change; we must mine the atmosphere for the Carbon we need. Atmospheric Carbon Capture coupled with Algae-Based Biofuels can fix this problem.
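For rough scale, a back-of-the-envelope check of those figures (the 4,000 billion tonnes and the 20x potency are from the comment above; the annual-emissions comparison is an assumed round number):

    # Illustrative arithmetic only, using the figures quoted in the comment.
    methane_gt = 4000            # billion tonnes (Gt) of methane, per the comment
    potency_vs_co2 = 20          # ">20x as potent a GHG as CO2", taken as a lower bound
    human_co2_gt_per_year = 30   # assumed rough annual human CO2 output, in Gt

    co2_equivalent_gt = methane_gt * potency_vs_co2
    years_of_emissions = co2_equivalent_gt / human_co2_gt_per_year

    print(f"CO2-equivalent of the clathrate methane: ~{co2_equivalent_gt:,} Gt")
    print(f"Comparable to ~{years_of_emissions:,.0f} years of human CO2 emissions")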

$10M Extraterrestrial invasion…
The million-plus annual UFO sightings reported globally, hundreds of thousands of abductees, and hundreds of physical trace cases make an absolutely convincing case that “They” are real, and they are Here. Even the scientific mainstream has been forced to recognize that planets with water and conditions suitable for life are not only ‘not rare’, but are probably quite common in the universe. However, the apparent evidence indicates that this threat has already happened, with no, or only very subtle, negative effects from it. Even so, it is worthy of serious research and analysis, sooner rather than later.

$05M Governments abusive power…
The 2008 FISA revision failed to insert the Constitution between our government and our personal electronics, which are about to become far more ‘personal’ with the advent of Brain/Computer Interface technologies. Neurochips are a reality today, with more than 200,000 already implanted for Cochlear hearing, artificial retinas, and brain pacemakers for Parkinsonism, Epilepsy, OCD, Obesity, and Depression. Microwave-based Voice-to-Skull technology (“Government Mind-Control Rays”) is now also an acknowledged reality. Orwell was right, just 30 years or so off in his timing.

$05M Nuclear holocaust…
This ranks low because of its extremely low probability at the Existential Risk level. Israel/Iran or India/Pakistan nuclear exchanges would not pose such a risk, though they would be damned inconvenient for the participants and neighboring bystanders in the region. However, better dissemination of Anti-Ballistic Missile technology and methods for screening ships and containers at sea for onboard WMD could substantially eliminate this threat category.

$00M Simulation Shut Down if we live in one…
This should not be on the list, not because it’s impossible, but because, even if true, there is nothing that money could be spent on which would make any difference. What would you do? Build an electronic “prayer machine”, in hopes of contacting the Simulators directly, through the vacuum aether? If we live in a simulation, that aether itself is suspect, and they are reading these texts as we type them anyway.

$10M Other
(1) A Temporal Modem is presently under construction at the University of Connecticut by Dr. Ronald Mallett. Unless the prevailing models of Relativity and Quantum Mechanics are fundamentally WRONG, this device cannot fail to work. It will become cheap to replicate; the knowledge to do so is already in the public domain. It has the potential to destroy causality and trigger chronoplexy on a planetary scale. This cannot be stopped — even if Mallett were inhibited by whatever means, the design is already widely distributed, and some grad student, sooner or later, will inevitably build one that works. So, the best we can do, as with Strong AI and Grey Goo and Synthetic Plagues, is to prepare to somehow detect and deal with its consequences.

(2) Just because the Relativistic Heavy Ion Collider didn’t create black holes, strangelets, or vacuum instability does not mean the Large Hadron Collider will not. Or the next biggest machine after that, or the next one, etc. Sooner or later, one of these Big Science Machines is going to produce something unexpected, at energies where “unexpected” most likely equals “dangerous”. Thus far, no research has been devoted to mitigation strategies, techniques, and technologies, which is odd given the very high probability of something eventually going wrong.

(3) When (not ‘If’) room temperature superconductors are commercially introduced, one unexpected result will be the rapid development of a Thanatronic Interface: a device which perfects Electronic Voice Phenomena, enabling reliable, high-fidelity Instrumental Transcommunication with the “Dead”. In every other instance in scientific equipment where a Germanium Diode (used as a detector of subtle EM fields) has been replaced with a Superconducting Diode, a sensitivity increase and an improvement in signal-to-noise ratio of Three Orders of Magnitude have been observed. There is no reason to think that EVP detection will be any different. While this may not pose an existential threat, the inevitable advent of reliable electronic communication with the dead would change human society as profoundly as anything else one can imagine. The implications and consequences of its development are worth serious study.

Of course, if a wavefront from the Gamma Ray Burst of Wolf-Rayet 104 strikes Earth in December, 2012, as is indicated, we may have bigger and more immediate problems to contend with than much of the above.

By: 100 millones para salvar al planeta - textualmenteactivo.com
Tue, 26 Feb 2008 05:26:38 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-16513

[…] a hypothetical $100 million to counter 10 fatal risks to the planet? The question was posed through a poll in order to gauge people’s perception of the dangers facing […]

By: Frank Sudia
Wed, 13 Feb 2008 21:58:14 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-16087

$20 Biological viruses, ease of creation
Not an existential risk, but not enough work is being done, and 30% of us could die in a few months.
$20 Nuclear holocaust
Nuclear winter IS an existential risk, and could actually happen tomorrow.
$20 Space Threats, asteroids, gamma burst, etc.
$20 Nanotechnology gray goo…
$10 Other: Earth-swallowing micro black holes from new particle accelerators. Maybe this is why SETI finds nothing.
$10 Extraterrestrial invasion…
Might be worth taking a sober look at our survival options.

$0 Superintelligent AI un-friendly…
This is a legal and political issue, not a technology problem. AIShield risks being total Luddite obstructionism! (Work on curbing Microsoft.)
$0 Environmental global warming…
Not an existential risk; we’ll survive.
$0 Governments abusive power
Not an existential risk; we’ll survive.
$0 Simulation Shut Down if we live in one…
You won’t care if the simulation ends, will you?

Post the runner-up ideas in a second ranked list near the top, so future commenters can review and be inspired by them, not just your chosen few categories.

By: Robert Hunt Robinson
Wed, 13 Feb 2008 20:06:38 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-16084

I could put these threats in any haphazard order and the list’s logic would be valid, but it still wouldn’t put humanity back squarely on the tracks. The number one existential threat is ignorance. Let’s put $100M into educating the population of the planet, and I’m sure we’ll then be able to face and solve any problem that confronts us.

By: Edward Greisch
Wed, 13 Feb 2008 04:40:15 +0000
https://russian.lifeboat.com/blog/2008/01/poll-top-10-existential-risks#comment-16070

Hydrogen Sulfide gas will kill all people. Homo Sap will go EXTINCT unless drastic action is taken.

October 2006 Scientific American

“EARTH SCIENCE
Impact from the Deep
Strangling heat and gases emanating from the earth and sea, not asteroids, most likely caused several ancient mass extinctions. Could the same killer-greenhouse conditions build once again?
By Peter D. Ward
downloaded from: http://www.sciam.com/article.cfm?articleID=00037A5D-A938-150E-A93883414B7F0000&sc=I100322
[… most of the article omitted …]
But with atmospheric carbon climbing at an annual rate of 2 ppm and expected to accelerate to 3 ppm, levels could approach 900 ppm by the end of the next century, and conditions that bring about the beginnings of ocean anoxia may be in place. How soon after that could there be a new greenhouse extinction? That is something our society should never find out.”
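A quick check of the quoted projection, assuming a mid-2000s starting level of about 380 ppm and reading “end of the next century” as roughly the year 2200 (both assumptions are mine; the 2 and 3 ppm per year rates are from the article):

    # Illustrative projection; the starting level and the time horizon are assumptions.
    start_ppm = 380          # approximate atmospheric CO2 in the mid-2000s
    years = 2200 - 2006      # "end of the next century", read as about the year 2200

    low = start_ppm + 2 * years    # constant growth of 2 ppm per year
    high = start_ppm + 3 * years   # constant growth of 3 ppm per year

    print(f"CO2 by ~2200: roughly {low} to {high} ppm")   # about 770 to 960 ppm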

Press Release
Pennsylvania State University
FOR IMMEDIATE RELEASE
Monday, Nov. 3, 2003
downloaded from:
http://www.geosociety.org/meetings/2003/prPennStateKump.htm
“In the end-Permian, as the levels of atmospheric oxygen fell and the levels of hydrogen sulfide and carbon dioxide rose, the upper levels of the oceans could have become rich in hydrogen sulfide catastrophically. This would kill most of the oceanic plants and animals. The hydrogen sulfide dispersing in the atmosphere would kill most terrestrial life.”

http://www.astrobio.net is a NASA web zine. See:

http://www.astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=672

http://www.astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=1535

http://www.astrobio.net/news/article2509.html

http://astrobio.net/news/modules.php?op=modload&name=News&file=article&sid=2429&mode=thread&order=0&thold=0

These articles agree with the first two. They all say 6 degrees C or 1,000 parts per million CO2 is the extinction point.

The global warming is already 1 degree Fahrenheit. 11 degrees Fahrenheit is about 6 degrees Celsius. The book “Six Degrees” by Mark Lynas agrees. If the global warming reaches 6 degrees centigrade, we humans go extinct. See:
http://www.marklynas.org/2007/4/23/six-steps-to-hell-summary-of-six-degrees-as-published-in-the-guardian

“Under a Green Sky” by Peter D. Ward, Ph.D., 2007. Paleontologist discusses mass extinctions of the past and the one we are doing to ourselves.

ALL COAL FIRED POWER PLANTS MUST BE CONVERTED TO NUCLEAR IMMEDIATELY TO AVOID THE EXTINCTION OF US HUMANS. 32 countries have nuclear power plants. Only 9 have the bomb. The top 3 producers of CO2 all have nuclear power plants, coal fired power plants and nuclear bombs. They are the USA, China and India. Reducing CO2 production by 90% by 2050 requires drastic action in the USA, China and India. King Coal has to be demoted to a commoner. Coal must be left in the earth. If you own any coal stock, NOW is the time to dump it, regardless of loss, because it will soon be worthless.

$100 million to teach people that nuclear power is safe and the only thing that works.
