Nick Bostrom: How to Destroy Civilization




Chris Anderson: Nick Bostrom, you've already given us so many crazy ideas. A couple of decades ago, as I recall, you made the case that we might be living in a simulation, and probably are. More recently, you've painted some of the most vivid examples of how artificial general intelligence could go wrong. And this year you're about to publish a paper on what you call the vulnerable world hypothesis. Today's talk is meant to serve as an illustrated guide to it. Let's begin. What is this hypothesis?



Nick Bostrom: It's an attempt to think about a structural feature of the current human condition. You like the urn metaphor, so I'll use it. Picture a big urn filled with balls representing ideas, methods, and all kinds of technologies. The whole creative activity of humanity is like reaching into this urn and pulling out ball after ball. And so far the net effect has been hugely beneficial, right? We have pulled out a great many white balls, and some shades of gray with mixed consequences. What we haven't pulled out so far is a black ball: a technology that invariably destroys the civilization that discovers it. The paper considers candidates for what might be inside such a black ball.



CA: So you define this ball as one that would inevitably bring about the destruction of civilization.



NB: Unless we exit what I call our semi-anarchic default condition.



CA: You make the case by citing counterexamples: you're convinced that all these years we have simply been lucky, and that we might have pulled out that ball of death without even knowing it. There's also this quote. What does it mean?



NB: It's meant to illustrate how difficult it is to foresee where basic discoveries will lead. We simply don't have that capability. We have become quite good at pulling balls out of the urn, but we don't have the ability to put them back. We can invent, but we cannot un-invent. Our strategy, such as it is, amounts to hoping there is no black ball in the urn.



CA: Once a ball is out, it's out, and we can't put it back, and you think we've simply been lucky so far. Talk us through a couple of your examples, where you describe different types of vulnerability.






NB: The easiest type to understand is a technology that makes it very easy to cause massive destruction. Synthetic biology could be a fecund source of black balls of this kind. Geoengineering is also worth thinking about: we could use it to fight global warming, but we don't want it to become too easy. You don't want a random person and his grandmother to have the ability to radically alter the Earth's climate. Or mass-produced, mosquito-sized lethal drones. Nanotechnology, artificial general intelligence.



CA: In the paper you suggest it was only luck that, when we discovered nuclear energy, making an atomic bomb turned out to be so hard. It might have been the case that a bomb could be made with easier resources, available to anyone.



NB: Think back to the 1930s, when we made our first breakthroughs in nuclear physics. Some brilliant scientists figured out that a nuclear chain reaction was possible and then realized it could lead to a bomb. More work was done, and it turned out that building an atomic bomb requires highly enriched uranium or plutonium, which are very difficult materials to get: you need ultracentrifuges, you need reactors, and massive amounts of energy. But suppose instead there had been an easier way to unlock the energy of the atom. What if it had been possible to set off a nuclear detonation by, say, baking sand in a microwave oven, or something similarly simple? We now know that's physically impossible. But before the relevant physics was done, how could you have known how it would turn out?



CA: Couldn't you argue that life on Earth needed a stable environment to evolve, and that if it were possible to trigger large nuclear reactions relatively easily, the Earth would never have been stable and we wouldn't be here at all?



NB: Yes, unless there is something that is easy to do on purpose but doesn't happen by chance. There are many things like that: we can easily stack ten blocks on top of one another, but in nature you won't find a stack of ten blocks.



CA: Now let's turn to what probably worries us most. As far as we can foresee, synthetic biology is the quickest route to self-destruction in the near term.



NB: Just think about what it would mean if anybody, by cooking something up in their kitchen for an afternoon, could destroy a city. It's hard to see how modern civilization as we know it could survive that, because in any population of millions there will always be some who, for whatever reason, would choose to use that destructive power. And if even one of those apocalypse lovers chose to destroy a city, then cities would be destroyed.



CA: There's another type of vulnerability. Tell us about it.






NB: In addition to these obvious black balls that would simply make it easy to blow things up, other kinds would act by creating bad incentives for people to do harmful things. Type-2a, as we might call it, is a technology that incentivizes the great powers to use their massive military force for destruction. Nuclear weapons actually came very close to this, right? What we did: we spent over $10 trillion building 70,000 nuclear warheads and put them on hair-trigger alert. And several times during the Cold War we nearly blew each other up. Not because anyone thought spending $10 trillion on self-destruction was a great idea, but because the incentives pushed us there, and it could have been worse. Imagine if a safe first strike had been possible. Then, in a crisis, it might have been very hard to refrain from launching all the nuclear missiles, if only for fear that the other side would do it first.



CA: Mutual assured destruction kept the Cold War relatively stable. Without it, we might not be here now.



NB: Yes, and it could have been more unstable than it was. Technology could have had other properties: it would have been harder to negotiate arms treaties if, instead of nuclear weapons, the thing in question had been something smaller or less conspicuous.



CA: And alongside the bad incentives of the world's powerful actors, you also worry about the bad incentives of all of us. That's Type-2b.






NB: Yes. Here global warming is the best example. Each of us uses lots of little modern conveniences whose use, individually, has no significant effect, right? But billions of people do the same thing, and cumulatively it has a damaging effect. Now, global warming could have been much worse than it is. There is a climate sensitivity parameter: it says how much warming you get when a certain amount of greenhouse gases is emitted. Suppose that, with the amount of greenhouse gases we have emitted, instead of a temperature rise of 3 to 4.5 degrees by 2100 we had faced a rise of 15 or even 20 degrees. Then we would be in a very bad position. Or suppose renewable energy had been much harder to develop, or there had been more fossil fuels in the ground.
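To make that parameter concrete, here is a minimal sketch using the standard simplified relation in which warming grows roughly logarithmically with CO2 concentration (the relation and the figures are textbook climate science, not numbers from this conversation):

$$\Delta T \approx S \cdot \log_2\!\left(\frac{C}{C_0}\right)$$

where $C_0$ is the preindustrial CO2 concentration, $C$ the current one, and $S$ the climate sensitivity, roughly 3 °C of warming per doubling of CO2 on current estimates. Bostrom's hypothetical amounts to imagining a world where $S$ had turned out to be 15 or 20 instead of about 3: the same emissions, a several-fold worse outcome.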



CA: Couldn't you argue that if the way of life we lead today eventually produced a visible difference of 10 degrees, people would have gotten off their backsides and done something about it long ago? We may be stupid, but perhaps not that stupid. Then again, who knows.



NB: I wouldn't be so sure.



Consider other properties as well. Right now, switching to renewable energy and the like is difficult, but doable. With slightly different physics, it could have been much more expensive.



CA: And what is your view? Putting all these possibilities together, is this planet of ours, humanity with all its technological capabilities, a vulnerable world? Is there a ball of death in our future?



NB: It's hard to say. I think the urn may well contain various black balls. There may also be golden balls that would help us protect against the black ones. But I don't know in what order they will come out.



CA: One critique of this idea is that it implies the future is essentially settled: either that ball is in the urn or it isn't. And to some extent that's not a future I want to believe in. I want to believe the future is undetermined, and that the decisions we make today will determine the color of the ball we pull out.



NB: If we just keep inventing, we will eventually pull out all the balls. There is a weak form of technological determinism that is quite plausible: you are unlikely to find a society that uses flint axes and jet planes. But you can think of a technology as a package of capabilities: technology enables us to do various things and achieve various effects in the world, and how we then use it depends on our choices. The point is that these three types of vulnerability rest on quite weak assumptions about how we would choose to use them. A Type-1 vulnerability, again, is this massive destructive power, and the assumption that in a population of millions there would be some who choose to use it destructively is a fairly weak one.



CA: What worries me most is the thought that we may already have enough of a view into the urn to conclude that we're doomed. That is, if you believe in accelerating power, that technology keeps developing and we keep building tools that make us more powerful, then at some point you reach a stage where a single individual can take us all down. And then it looks like we're all finished. Doesn't that alarm you?



NB: Ah, yes. We are getting more and more powerful, and it is getting easier and easier to use that power, but we can also invent technologies that help us control how people use it.

CA: Let's talk about that. Let's talk about the responses. Suppose that, thinking about all the possibilities out there today, not only synthetic biology but things like cyberwarfare, artificial intelligence and so on, a serious reckoning may lie in our future. What are the possible responses? You mention four types of response.



Restricting technological development

NB: Halting technological progress across the board does not inspire hope. I think we have neither the desire nor the ability to do it. Perhaps only in a few narrow areas would you want slower technological progress: you don't, I think, want faster progress in bioweapons, or in isotope separation, which would make it easier to create nuclear weapons.



CA: I used to be in complete agreement with that, but let me push back for a minute. First of all, if you look at the history of the last two decades, it has always been full speed ahead, as if that were our only choice. But look at globalization and its rapid acceleration, at the "move fast and break things" strategy and what became of it, and then at the full potential of synthetic biology. I'm not sure we should be moving that fast, without any restrictions, toward a world with a DNA printer in every home and every high-school lab. There should be some restrictions, right?



NB: Possibly the first part, that it's not feasible. Even if it were desirable to stop, there's the problem of implementation. It doesn't really help if one nation...



CA: No, it doesn't help if only one nation acts, but we have made international treaties before. That is really how we survived the nuclear threat: by going out there and going through the painful process of lengthy negotiations. I just wonder whether the logic isn't that we, as a matter of global priority, should go out and try to negotiate really strict rules on where synthetic biology research is done. It's not something you want to democratize, is it?



NB: I totally agree. For example, DNA synthesizers could be offered not as a product, with a machine installed in every lab, but as a service. Say there were only four or five places in the world where you could send a digital DNA blueprint and get the synthesized results back. Then, if it ever became necessary, we would have a finite set of choke points. I think you want to look for special opportunities like that, where tighter control is possible.



CA: Essentially, your view is that simply slowing progress won't work. Someone, somewhere, in North Korea, say, is going to go and pull that black ball out of the urn, if it's there.



NB: Under current conditions, that looks plausible. And it's not just about synthetic biology: any profound new change in the world could turn out to be a black ball.



CA: There is another possible response.






Elimination of dangerous people



NB: Its promise is limited as well. Take the Type-1 vulnerability: if you could reduce the number of people motivated to destroy the world, should they ever gain access to the means, that would be good.



CA: You asked us to make this slide, with facial-recognition drones flying around the world. When they spot someone showing signs of sociopathic behavior, they shower them with love and "fix" them, so to speak.



NB: It's not that simple. Elimination could mean, for example, incarceration or killing, or it could mean persuading people to a better view of the world. But here is the point: suppose you were extremely successful and cut the number of such individuals in half. If you want to do it by persuasion, you are competing against all the other powerful forces that shape people: parties, religion, the education system. And even with the number of dangerous individuals halved, it seems to me the risk would fall not by 50 percent but by only 5 or 10.
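A toy probability model (my illustration, not a calculation from the talk) suggests why halving the pool of dangerous individuals barely moves the overall risk once that pool is large. Suppose each of $N$ such individuals independently comes to act with probability $p$; then

$$P(\text{catastrophe}) = 1 - (1 - p)^{N},$$

and halving $N$ only takes the square root of the probability that no one acts:

$$(1-p)^{N} = 0.01 \;\Rightarrow\; P = 99\%, \qquad (1-p)^{N/2} = \sqrt{0.01} = 0.1 \;\Rightarrow\; P = 90\%.$$

Under these illustrative numbers, a 50 percent reduction in dangerous individuals cuts the risk by only about 9 percentage points, in line with the 5 to 10 percent figure above.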



CA: And you're saying this second option is not one we should bet the future of humanity on?



NB: There is nothing wrong with trying to deter and dissuade people, but we shouldn't rely on it as our only safeguard.



CA: And the third option?



Total surveillance

NB: There are two general methods by which we could stabilize the world against the whole spectrum of possible vulnerabilities, and we would probably need both. The first is extremely effective preventive policing, with the capability to intercept: if anybody started doing something dangerous, you could intercept them in real time and stop them. This would require ubiquitous surveillance; every one of us would be monitored all the time.



CA: Like Minority Report.



NB: Surveillance could be aided by AI algorithms, big "freedom centers" reviewing the data, and so on.



CA: You know the term "total surveillance" is not very popular right now?



NB: Yes, so imagine a little device with multidirectional cameras, built into a kind of choker that you would have to wear at all times. To make it go down better, just call it something like a "freedom tag."



CA: OK. This, my friends, is why this is such a riveting conversation.



NB: You could obviously have a whole long conversation about each of these. There are huge risks and problems with it, right? We may come back to that. And the fourth,

Global governance

the final stabilization capability, is plugging gaps in governance. Surveillance would plug a governance gap at the micro level, preventing anyone from ever doing something highly illegal. There is a corresponding governance gap at the macro level, the global level. We would need the capability to reliably prevent the worst kinds of global coordination failures: to avoid wars between great powers, arms races, and cataclysmic commons problems, in order to cope with the Type-2a vulnerabilities.



CA: The term "global governance" is clearly out of fashion right now, but couldn't you argue that over the course of human history, at each new stage of growing technological power, people have reorganized and concentrated that power? For example, when a roving criminal gang could take over a society, the response was the nation-state: you concentrate force in a police or an army, so that "No, you can't do that." Perhaps the logic of a single person or group being able to destroy humanity means that at some point we will have to go down this road, at least in some form?



NB: It's certainly true that the scale of political organization has increased over the course of human history: first hunter-gatherer bands, then chiefdoms, city-states, nations, and now international organizations and so on. I just want to make sure I get the chance to stress that there are obviously huge downsides, and indeed massive risks, both in total surveillance and in global governance. I'm only pointing out that, if we are lucky, the world could be such that these would be the only ways to survive a black ball.



CA: As I understand it, the logic of this idea is that we have to recognize we can't have it all. The naive dream many of us had, that technology will always be a force for good, that you should keep going, not stop, hurry if you can, and not worry about consequences that might never happen: that's simply not an option. They can happen. And if they do, we'll have to accept the uncomfortable things that come with the power and join a kind of arms race with ourselves: if you want the power, you'd better limit it, and you'd better figure out how.



NB: I think it is an option, in a sense even the easiest one, and it might work. But it means we remain fundamentally vulnerable to pulling out a black ball. I think that with a bit of coordination, if we solved the governance problems at both the macro and the micro level, we could take all the balls out of the urn and benefit greatly.



CA: And if we're living in a simulation, what difference does it make? We'll just reboot.



NB: I don't think it's come to that.



CA: So what's your view? Putting all the pieces together, how likely is it that we're doomed?

(I love how people laugh when you ask that question.)



NB: Well, on an individual level we seem doomed anyway: with time we age, we decay, and all that, right?



It's actually a little trickier than that. If you want to attach a probability to our being finished, the first question is: who are "we"? If you are very old, you will probably die of natural causes; if you are young, you may have 100 years ahead of you; the probability depends on whom you ask. Then there is the threshold: what counts as civilizational devastation? In the paper I don't require an existential catastrophe for something to count. It's a matter of definition: say a billion dead, or a 50 percent reduction in world GDP. Depending on where you set the threshold, you get different probability estimates. I suppose you could put me down as a frightened optimist.



CA: You're a frightened optimist, and I think you've just created a large number of other frightened... people.



NB: In the simulation.



CA: Exactly, in the simulation. Nick Bostrom, your ideas are amazing. Thank you for scaring us to death.



Translated by Olia Francia

Reviewed by Natalia Ost







