Richard Dawkins and Jon Stewart discussed existential risk on the Sept. 24, 2013 edition of The Daily Show. Here’s how it went down:
STEWART: Here’s my proposal… for the discussion tonight. Do you believe that the end of our civilization will be through religious strife or scientific advancement? What do you think in the long run will be more damaging to our prospects as a human race?
In reply, Dawkins said that Martin Rees (of CSER) thinks humanity has a 50% chance of surviving the 21st century, and one cause for such worry is that powerful technologies could get into the hands of religious fanatics. Stewart replied:
STEWART: …[But] isn’t there a strong probability that we are not necessarily in control of the unintended consequences of our scientific advancement?… Don’t you think it’s even more likely that we will create something [for which] the unintended consequence… is worldwide catastrophe?
DAWKINS: That is possible. It’s something we have to worry about… Science is the most powerful way to do whatever you want to do. If you want to do good, it’s the most powerful way to do good. If you want to do evil, it’s the most powerful way to do evil.
STEWART: …You have nuclear energy and you go this way and you can light the world, but you go this [other] way, and you can blow up the world. It seems like we always try [the blow up the world path] first.
DAWKINS: There is a suggestion that one of the reasons that we don’t detect extraterrestrial civilizations is that when a civilization reaches the point where it could broadcast radio waves that we could pick up, there’s only a brief window before it blows itself up… It takes many billions of years for evolution to reach the point where technology takes off, but once technology takes off, it’s then an eye-blink — by the standards of geological time — before…
STEWART: …It’s very easy to look at the dark side of fundamentalism… [but] sometimes I think we have to look at the dark side of achievement… because I believe the final words that man utters on this Earth will be: “It worked!” It’ll be an experiment that isn’t misused, but will be a rolling catastrophe.
DAWKINS: It’s a possibility, and I can’t deny it. I’m more optimistic than that.
STEWART: … [I think] curiosity killed the cat, and the cat never saw it coming… So how do we put the brakes on our ability to achieve, or our curiosity?
DAWKINS: I don’t think you can ever really stop the march of science in the sense of saying “You’re forbidden to exercise your natural curiosity in science.” You can certainly put the brakes on certain applications. You could stop manufacturing certain weapons. You could have… international agreements not to manufacture certain types of weapons…
- Müller & Bostrom, Future progress in artificial intelligence: a poll among experts.
- Ord, The timing of labour aimed at reducing existential risk.
- In the USA, avionics software must be certified by designated specialists accountable to regulators such as the FAA. But when it comes to software for self-driving cars, Google is pushing hard for a system of self-certification.
- So apparently there was a hit TV show in the late ’90s, with more viewers than Game of Thrones, about a naked guy in a small room who had to survive for more than a year entirely on sweepstakes winnings (e.g. dog food), and who didn’t know he was on TV the whole time. Obviously, this happened in Japan.