The only argument he gives in any detail has to do with AGI timing:
Given current power consumption by electronic computers, a computer with the storage and processing capability of the human mind would require in excess of 10 Terawatts of power, within a factor of two of the current power consumption of all of humanity. However, the human brain uses about 10 watts of power. This means a mismatch of a factor of 10^12, or a million million. Over the past decade the doubling time for Megaflops/watt has been about 3 years. Even assuming Moore’s Law continues unabated, this means it will take about 40 doubling times, or about 120 years, to reach a comparable power dissipation. Moreover, each doubling in efficiency requires a relatively radical change in technology, and it is extremely unlikely that 40 such doublings could be achieved without essentially changing the way computers compute.
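Before examining the premises, it's worth checking that Krauss's arithmetic is at least internally consistent. A quick back-of-the-envelope sketch, using only the figures from the quoted passage:

```python
import math

# Figures taken from the quoted passage (Krauss's own numbers).
brain_watts = 10            # human brain's power consumption
computer_watts = 10e12      # ~10 terawatts for a brain-equivalent computer

mismatch = computer_watts / brain_watts   # 1e12, "a million million"
doubling_time_years = 3                   # doubling time for Megaflops/watt

doublings_needed = math.log2(mismatch)            # ~39.9, i.e. "about 40"
years = doublings_needed * doubling_time_years    # ~120 years

print(round(doublings_needed), round(years))  # prints: 40 120
```

So the internal arithmetic checks out: closing a 10^12 efficiency gap at one doubling per 3 years takes roughly 120 years. The contentious parts are the premises, not the multiplication.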
Krauss doesn’t say where he got his numbers for the power requirements of “a computer with the storage and processing capability of the human mind,” but there are a few things I can say even leaving that aside.
First, few AI scientists think AGI will be built so similarly to the human brain that having “the storage and processing capability of the human mind” is all that relevant. We didn’t build planes like birds.
Second, Krauss warns that “each doubling in efficiency requires a relatively radical change in technology…” But Koomey’s law — the Moore’s law of computing power efficiency — has been stable since about 1946, which runs through several radical changes in computing technology. Somehow we manage, when there is tremendous economic incentive to do so.
Third, just because the human brain achieves general intelligence with ~10 watts of power doesn’t mean a computer has to. A machine superintelligence the size of a warehouse is still a challenge to be reckoned with!
Added 08-28-15: Also see Anders Sandberg’s comments on Krauss’ calculations.
Added 02-18-16: Sandberg wrote a version of his comments for arXiv, here.