I would give the book two stars. Having read only through p. 60, I've already noticed the following dubious things:
1. Miller fails to consider that Moore's Law might grind to a halt well short of its physical limits because of political constraints. As Peter Thiel has argued, most forms of engineering progress since 1970 have become effectively illegal because of environmental ideology or irrational risk aversion. The Singularitarians, with their doomsday predictions, could incite political restrictions on computing as well.
2. He never addresses the fact that living standards have in many ways stagnated since 1970, an inconvenient truth that blows up all the acceleration-speak promoted by singularity cultists.
3. He believes in nanotech genies.
4. He likes the idea of filling the world with more people as smart as John von Neumann, without considering that we already have that many comparably smart people around now, yet progress still seems to have bogged down anyway despite their freedom to think about hard problems. He also fails to consider how many high-IQ people these days go into rent-seeking professions like law, administration, or finance instead of doing things that push the frontiers of knowledge or increase wealth.
5. He fails to consider the advantages of raising the intelligence of the billions of the world's dumbasses to just ordinary levels. That could revolutionize our world for the better by pushing a lot of people above the threshold where they can start to support themselves without the need for zookeeping; become more educable and employable; become more law-abiding; stop pumping out bastard kids they can't support; save money and plan for the future; and so forth.
6. He over-relies on Ray Kurzweil.
7. He over-relies on Robin Hanson and Eliezer Yudkowsky.
8. He makes his imaginary "jailed" ultra-AI sound like Hannibal Lecter as it plays mind games with its captors to find their weaknesses and trick them into releasing it to take over the universe.
9. He says we could become "immortal" by 2049, a nonsensical claim because you would need longitudinal studies lasting far longer than current life expectancies to see whether anti-aging and life-extension therapies actually work.
10. He basically wrote this book as an advertisement for the Singularity Institute and Eliezer Yudkowsky's greatness: so please donate to fund Yudkowsky's humanity-saving mission. He mentions that the Singularity Institute's director, Michael Vassar, would like $50 million toward that end, but that they could get started with $10 million.
The next section, about human intelligence and its enhancement, I found defensible because it has a basis in, you know, _reality_. We know that human intelligence exists and we know what it can do, unlike the case with Yudkowsky's science-fiction fantasies about sociopathic AIs. Only about a third of the book didn't feel like a ripoff, so I wasted about $8 buying it.