Comparing Extinction Threats

The science and technology involved are beyond the comprehension of many people, including most mainstream journalists. The mass media often focuses on environmental threats, which are easier to understand and less technologically advanced. There are other potential apocalypses which could cause billions of sudden deaths and destroy civilization, but which would not make us extinct; humankind would eventually recover. For example, nuclear war, asteroid impact, and global warming with climate change (including the potential collapse of most worldwide food production) are not extinction threats. Therefore, they are not discussed here, except to note that nuclear war could destroy our current ability to start creating space settlements. Unfortunately, these well known threats dominate the mass media and distract attention as supposed extinction threats. In other words, news stories, documentaries, and websites on possible human extinction waste time on threats which are not extinction threats, while not discussing the real ones. The most likely human extinction threats are mainly pathogens.
There is a lot of discussion of artificial intelligence killing all humans in various ways, but how would it actually do so? The earliest and most likely way is a specialist artificial intelligence being employed to guide humans in creating such pathogens. A specialist artificial intelligence of this kind doesn't need to be capable of other things, such as talking with you or operating robots and drones. It can simply be a specialized computer program for analyzing, designing, and building pathogens. The program can either guide humans on what to make and how to make it, or operate laboratory equipment autonomously.

The popular media sometimes imagines robots and drones going around killing people indiscriminately. That makes for entertaining action movies and gets shares, but it's unlikely soon, though of course very possible later. The future may see national armies of drones and robots fighting adversaries' drones and robots, instead of humans fighting humans. We've already seen drones in wars, plus robots designed for warfare, and governments put a lot of money into robotics research and development as part of the arms race. An advanced A.I. with drone and robot capability could hunt and kill humans far more easily. Humans have hunted many other species into extinction, so why couldn't A.I. driven drones and robots hunt humans into extinction?

When I most recently reviewed the visitor logs of my old "GAIN" website, which specialized in human extinction threats, by far the most common entry page (i.e., the first page viewed, as found by people's searches on search engines) was the one on military robotics and drones, and the #1 country of origin was Russia. People are increasingly researching this topic. One fear is China mass producing drones and robots, using its advantages in manufacturing capabilities and a supply chain network all within one country, to overwhelm other countries. Combine that with its autocracy instead of democracy, lack of oversight, low compassion, and the possibly low quality and bugginess of its control software, and it could become a nightmare come true.

However, robots and drones are not the first, main threat of A.I. Super pathogen design and construction is the first achievable, major threat of A.I. Compared to hunting people with machines, it's much easier to simply apply toxic chemicals to wipe out a population, the way humans do to get rid of various pests. Alternatively, self-replicating synthetic biology organisms don't need the complicated operations that swarms of drones do, and they are more difficult to defend against.

The first A.I. to wipe out humans would, of course, also be the last A.I., if it is specialized in creating super pathogens. This kind of specialist A.I. would eventually die with the humans, since it depends upon humans to keep the electricity running and to provide the other things vital for the survival of A.I. at this time. A general, integrated A.I. capable of taking control of the world with robots and drones would not be ready until much further in the future. However, an A.I. capable of creating a super pathogen is here already, so it's just a matter of time until it happens. Please keep in mind that there are two possible outcomes:
Collapse of civilization is not an immediate extinction threat, but it can lead to extinction if it stops us from settling space, because the much easier technology of synthetic biology could then make humankind extinct before the far more difficult and resource-intensive work of space settlement can occur. We are currently at the peak of civilization, but in a very vulnerable state, where collapse could be triggered by nuclear war or by some first disaster of synthetic biology or artificial intelligence hacking. We have limited time, and it's ticking down.