Stewart and Dawkins on the unintended consequences of powerful technologies

Richard Dawkins and Jon Stewart discussed existential risk on the Sept. 24, 2013 edition of The Daily Show. Here’s how it went down:

STEWART: Here’s my proposal… for the discussion tonight. Do you believe that the end of our civilization will be through religious strife or scientific advancement? What do you think in the long run will be more damaging to our prospects as a human race?

In reply, Dawkins said that Martin Rees (of CSER) thinks humanity has a 50% chance of surviving the 21st century, and one cause for such worry is that powerful technologies could get into the hands of religious fanatics. Stewart replied:

STEWART: … [But] isn’t there a strong probability that we are not necessarily in control of the unintended consequences of our scientific advancement?… Don’t you think it’s even more likely that we will create something [for which] the unintended consequence… is worldwide catastrophe?

DAWKINS: That is possible. It’s something we have to worry about… Science is the most powerful way to do whatever you want to do. If you want to do good, it’s the most powerful way to do good. If you want to do evil, it’s the most powerful way to do evil.

STEWART: … You have nuclear energy and you go this way and you can light the world, but you go this [other] way, and you can blow up the world. It seems like we always try [the blow up the world path] first.

DAWKINS: There is a suggestion that one of the reasons that we don’t detect extraterrestrial civilizations is that when a civilization reaches the point where it could broadcast radio waves that we could pick up, there’s only a brief window before it blows itself up… It takes many billions of years for evolution to reach the point where technology takes off, but once technology takes off, it’s then an eye-blink — by the standards of geological time — before…

STEWART: … It’s very easy to look at the dark side of fundamentalism… [but] sometimes I think we have to look at the dark side of achievement… because I believe the final words that man utters on this Earth will be: “It worked!” It’ll be an experiment that isn’t misused, but will be a rolling catastrophe.

DAWKINS: It’s a possibility, and I can’t deny it. I’m more optimistic than that.

STEWART: … [I think] curiosity killed the cat, and the cat never saw it coming… So how do we put the brakes on our ability to achieve, or our curiosity?

DAWKINS: I don’t think you can ever really stop the march of science in the sense of saying “You’re forbidden to exercise your natural curiosity in science.” You can certainly put the brakes on certain applications. You could stop manufacturing certain weapons. You could have… international agreements not to manufacture certain types of weapons…


  1. Sean O hEigeartaigh says

    Thanks, I missed this! These are actually pretty good sound bites for two people who don’t normally talk about this topic.
