



on August 4, 2013
First, the Good: The author introduces a few ideas that are tasty, like the idea of evolution as computation. This notion suggests evolution as a phenomenon in which Nature (to anthropomorphize) explores possibility space. Then there is the introduction of a few challenging (to be charitable) notions--evolution as actually goal-directed (in a way), evolution NOT acting on populations, etc. There is also a nice description of P/NP problems (indeed, the first part of the book is strongest). Finally, there is a confrontation with that great bugaboo of the philosophy of science, the Problem of Induction--even more important in the age of Big Data (a phenomenon now in its "Screw causation, all is correlation! Yippee!" adolescent phase of the Hype Cycle. Sigh.)

The Bad: None of these ideas are really developed, much less justified. From an evolutionary science POV, what he is saying is rather provocative (one thinks..see below) but never defended. Contrast with Dawkins' fantastically lucid descriptions of evolutionary mechanisms--this author's do not compare.

The Ugly: After a while, the prose is simply unreadable. The effect is a little hard to describe, but it seems that the author can't find his theme (or cannot show it to us), and cannot BUILD his ideas. In other words, he doesn't take a central idea, build it up, repeat the essentials (to keep us oriented) and push those elements out into concrete examples for illustration. Even worse, in trying to straddle some path between using math to demonstrate and avoiding math so as not to spook the reader, the book has both too much and too little math.

Worst of all: There is no clear, explicit definition of what a PAC algorithm is--there is a very light introduction to venerable machine learning algorithms (e.g. the Perceptron), and then suddenly there are references to the PAC algorithm. Is PAC just machine learning? What the hell is he talking about?! I began to wonder if this was just another kind of crackpottery, old (but very cool, useful and provocative) ideas tarted up in new language, like Wolfram's book...
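(For reference, and from the wider literature rather than from this book: a PAC learner is one that, with probability at least 1 - delta, outputs a hypothesis whose error is at most epsilon, using time and examples polynomial in 1/epsilon, 1/delta and the problem size. The Perceptron mentioned above is also simple enough to sketch; the following is an illustrative toy version on made-up data, not the book's presentation.)

```python
# Illustrative sketch of the classic Perceptron update rule (Rosenblatt),
# on hypothetical toy data; not code from the book.
import random

def train_perceptron(examples, epochs=50, lr=1.0):
    """examples: list of (feature_tuple, label) with label in {-1, +1}."""
    n = len(examples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        random.shuffle(examples)
        for x, y in examples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:   # misclassified: nudge the separating boundary
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Hypothetical data: label is +1 exactly when x1 + x2 > 1
data = [((0.2, 0.1), -1), ((0.9, 0.8), +1), ((0.4, 0.3), -1), ((1.0, 0.5), +1)]
print(train_perceptron(data))
```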

In the end, this book got the Dorothy Parker review in my house--not a book to be tossed aside lightly, but to be hurled with great force! I actually threw it across the room...
67 of 80 people found this helpful.
on July 20, 2013
In short: whether you're a computer scientist familiar with machine learning algorithms, or whether you don't know much about artificial intelligence, this book has profound and novel insights to offer. I've been a practitioner of machine learning for a long time, and yet the book's framework relating machine learning to evolution gave me a whole bunch of "aha" moments. So pick it up and give it a read.

The book's thesis in a few words: cognitive concepts are computational, and they are acquired by a learning process, before and after birth. Nature, the grand designer, uses ecorithms to guide this process - systems whose functioning and whose parameters are learned and evolved, as opposed to written down once (as a conventional algorithm would be). The processes of learning, evolution and reasoning are the building blocks of ecorithms.
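A toy contrast may make the distinction concrete (my own illustration, with invented names and numbers, not the book's): a conventional routine behaves the same way forever, while an ecorithm-style routine adjusts its parameters in response to feedback from its environment.

```python
# Toy contrast: a fixed rule vs. a rule whose parameter is tuned by
# environmental feedback. Purely illustrative.

def fixed_rule(x):
    # behaviour written down once and never changed
    return x > 0.5

class AdaptiveRule:
    def __init__(self, threshold=0.5, step=0.05):
        self.threshold = threshold
        self.step = step

    def decide(self, x):
        return x > self.threshold

    def feedback(self, x, correct_answer):
        # nudge the parameter toward whatever the environment rewarded
        if self.decide(x) != correct_answer:
            self.threshold += -self.step if correct_answer else self.step

rule = AdaptiveRule()
for x, truth in [(0.45, True), (0.42, True), (0.30, False)]:
    rule.feedback(x, truth)
print(rule.threshold)   # the decision boundary has drifted downward
```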

This, in and of itself, is not a new framework. Open any artificial intelligence textbook, and the table of contents will be organized into algorithms for "learning" and "reasoning". So nothing new there. But then, the book launches into an excellent, simple and mind-blowing thought experiment: what if nature were simply relying on the same simple learning algorithms that we as humans have been researching, with the same constraints - and evolution is just that formal learning process in action? And then: given all we know about the parameters of these learning algorithms, would evolution have been mathematically possible?

To answer that, the author goes into some detail on computational complexity theory. Computer science has shown that there are many seemingly simple problems that are believed not to be solvable in polynomial time - meaning, if you make them big enough, solving them will take longer than the universe has existed. The question of the shortest overall route visiting all cities in a particular geography is such a problem. So if it is so easy to show that so many really simple problems aren't solvable in the time the universe has existed, how would it even be remotely possible that evolution could build something as complex as the human brain in an even shorter time frame?
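The arithmetic behind that claim is easy to check for the shortest-route (traveling salesman) example mentioned above: the number of distinct tours grows factorially with the number of cities, so exhaustive search becomes hopeless long before the problem looks "big". A back-of-the-envelope sketch (my numbers and machine-speed assumption, not the book's):

```python
# Back-of-the-envelope: brute-force traveling salesman vs. the age of the universe.
# Distinct tours over n cities (fixed start, direction ignored): (n - 1)! / 2.
from math import factorial

SECONDS_PER_YEAR = 3.15e7
AGE_OF_UNIVERSE_YEARS = 1.38e10          # ~13.8 billion years
TOURS_PER_SECOND = 1e9                   # assume a billion tours checked per second

for n in (10, 20, 30, 40):
    tours = factorial(n - 1) // 2
    years = tours / TOURS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{n} cities: {tours:.2e} tours, ~{years:.2e} years of brute force")

# Already at 30 cities the brute-force time dwarfs the ~1.38e10-year age of the universe.
```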

The book then essentially explores areas of machine learning just deep enough to show that it probably would be possible. There are enough real-world functions in the "probably approximately correct"-learnable class - functions learnable in polynomial time, by algorithms we already know (and use) today - that it's imaginable that nature relies on variants of those. The book has some strong tidbits it throws out in the course of discussing this. For example, it turns out that parity functions (deciding, without counting, whether something is odd or even) aren't PAC-learnable. So far, so satisfying a read.
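To give a flavour of what "probably approximately correct" buys you, a standard textbook bound (from the wider PAC literature, not specific to this book) says that for a finite hypothesis class H, in the noise-free case, about (1/epsilon) * (ln|H| + ln(1/delta)) examples suffice to be "probably" (confidence 1 - delta) "approximately" (error at most epsilon) correct. A quick illustrative calculation:

```python
# Standard realizable-case PAC sample bound for a finite hypothesis class H:
#   m >= (1/eps) * (ln|H| + ln(1/delta))
# gives error <= eps with probability >= 1 - delta. Numbers are illustrative only.
from math import log, ceil

def pac_sample_bound(hypothesis_count, eps, delta):
    return ceil((log(hypothesis_count) + log(1.0 / delta)) / eps)

# Example: conjunctions of boolean literals over 100 variables, roughly 3**100 hypotheses.
print(pac_sample_bound(3**100, eps=0.05, delta=0.01))   # about 2,300 examples
```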

One of the book's drawbacks is that a lot of the details are left open. In the author's thesis, the genome and our protein networks somehow encode the parameters of the learning algorithms nature uses. But of course we have no idea how that actually happens (and the book doesn't pretend that it knows). Another drawback is that the book seemingly can't quite decide on its audience: is it pop science or more serious work? It oscillates strangely between being very concrete and being hand-wavy: for example, when discussing the limits of machine learning (semantics, brittleness, complexity, grounding), there isn't anything offered in terms of why machine learning is so brittle (just try Apple's Siri). It also somewhat casually throws around ideas that are mind-blowing but totally unproven: for example, it is known that our working memory can only hold 7 +/- 2 objects at any point in time. The author argues that this is by design, so that the subsequent learning algorithms have an easier time picking up features. That's a pretty cool line of thinking, because it would suggest that nature uses the same heuristics that we as computer scientists use when tackling a learning problem (reduction in features and dimensionality). But it's also totally unproven that THIS is why we have limited working memory, or that THIS is what it does. The book also doesn't go into any depths on learning algorithms we already know, even though a lot of the known algorithms actually have pretty simple intuitions underlying them that could nicely be treated for a non-computer science audience.

But overall, there are some awesome thought starters in this book. It is not always an easy read. But certainly worth it.
21 of 27 people found this helpful.
on January 25, 2015
This book is by a machine learning expert. He is interested in models of learning and particularly their assessment in terms of computational complexity theory. The book takes seriously the role of evolution itself in the context of knowledge acquisition processes. It argues that evolution is a subset of learning processes.

Overall, the book is a reasonable one. However, the presentation is a bit dry and boring. The author apparently likes coining terms, and dislikes reviewing the work of others. As Leslie says, there is indeed a close link between the theories of evolution and learning. He correctly argues against the modern dogma of directionless evolution (since evolution and learning are linked and learning is clearly directional). Leslie argues that "fitness" provides such a direction. In fact a much stronger case than the one Leslie gives can be made - based on thermodynamics.

Overall, I am inclined to think that the book has its core thesis backwards. Instead of evolution being a subset of learning processes, learning processes are part of evolution. Most of the rest of this review focuses on this one point, because I think it is an important one.

The idea that learning is a part of evolution is an old one. James Mark Baldwin proposed that organisms could learn a behavioural trait and then see genetic predispositions to learning that behaviour amplified by evolution. This idea was later generalised by Waddington - who proposed that genes could take over the trait completely - via a process known as "genetic assimilation". We see this effect in modern times, with learned milk drinking preceding genetically encoded lactose tolerance. Overall, the course of evolution is altered significantly by individual and social learning processes.

Leslie says that "The idea that evolution is a form of learning sounds implausible to many people when they first hear it." I think this is because he has things backwards - and learning is better seen as one of the products of evolution. How does Leslie argue that evolution is part of learning - and not the other way around? Leslie confines his attention to the case of "Darwinian evolution". According to Leslie, this term refers to evolution without learning. Leslie asserts that, in Darwinian evolution, genetic variations are generated independently of current experiences - a constraint that does not apply to learning systems. Unfortunately for Leslie's thesis, this isn't the kind of evolution that Darwin believed in. Darwin was well aware of the role of learning in evolution. Indeed he formulated a theory to explain how current experiences went on to affect the next generation. Darwin's proposed "gemmules" were subsequently discredited, but they clearly show that Darwin thought that current experiences influenced heritable variation.

Leslie goes on to describe modern cultural evolution, saying that "culture also undergoes change or evolution, but this change is no longer limited by Darwinian principles". However, Darwin was, in fact, a pioneer in the discovery of cultural evolution, writing about how words and languages were subject to natural selection. Leslie argues that human culture introduced learning to evolution. He minimizes the significance of cultural inheritance in other animals and the influence of individual learning on DNA evolution via the Baldwin effect and genetic assimilation. He says that before human culture: "the learning and reasoning carried out by an organism during its life had limited impact that outlived the individual". I think this is a big understatement that is not really consistent with the scientific evidence on the role of learning in evolution. Learning is important, and its impact on evolution long pre-dates human cultural evolution.

The "Darwinian evolution" described by the author would have been foreign to Darwin. Also, we know that the idea that genetic variations are generated independently of current experiences is wrong - not least because of the role of stress in stimulating the production of mutations. This it isn't the kind of evolutionary theory that is much use for explaining what happens in nature. Why Leslie focuses on this impoverished version of evolutionary theory is not completely clear. Perhaps he really believes that this is what Darwinian evolutionary theory says. Or perhaps making Darwinism look weak makes his own field of learning seem more important.

So far, this has been mostly an argument over terminology - specifically over what the term "Darwinian evolution" refers to. This debate has limited interest - and can mostly be avoided with clear definitions. However, the problem with learning theorists placing learning centrally and denigrating the power of Darwinian evolution is that they then fail to make proper use of the insights evolutionary theory provides. In fact, Darwinism has much to say about how the brain processes responsible for animal learning work. Natural selection acts on synapses. Axon pulses are copied with variation and selection. There's competition between ideas within the brain for attention. The result is a good adaptive fit between an organism's model of its world and its environment. Interesting though these ideas are, you won't find anything like them in this book. Indeed, few machine learning experts appear to have looked into the implications of modern versions of Darwinism. Instead, Leslie sees Darwinian evolution as a primitive ladder that led to modern learning systems. He doesn't deal with the more powerful, generalized versions of evolutionary theory that also cover organisms that learn or make use of cultural transmission.
2 of 2 people found this helpful.
on February 26, 2014
In the preface the author states that, “The focus here will be the unified study of the mechanisms of evolution, learning, and intelligence using the methods of computer science.” This is not an original topic - a vast literature extending back decades already exists. Unfortunately, the author fails to (1) review the existing literature, (2) identify a limitation or defect in the existing literature, and (3) propose a solution to the identified limitation or defect.
5 of 6 people found this helpful.
on July 12, 2013
It is not what some of the other reviews say it is, nor is it what the pundits claim. It is actually a sloppy, vague and imbalanced account of some very complex and deep subjects. It might serve as an introduction to some of these, but gives no clues on further study and substantiates very little along the way.

It is evidently a whimsically and hurriedly written book, perhaps written in a few fell swoops, as the author would actually speak, without thought as to what the reader would be reading. It purports to present new and interesting ideas. Instead it presents old ideas that have been lying around in the author's closet for decades, and does so in a manner that insults the technical reader. It is perhaps a fairy tale for the novice - like a sci-fi novel. It is a surprising offering from a fairly distinguished professor.

It contains many passages phrased in a manner that makes one LOL -- after re-reading the passage to make sense out of it. There are many instances of this so-called elegant phraseology and missing substantiation (see the comments below for examples), and thus the read is a bumpy ride for all. Even if one persists, no coherent ideas are expressed; the book just rambles on about this and that.

Those familiar with the subjects will want to refer to more formal content - available for free on the net (see [...] for an excellent plethora) - for some clarity and missing substantiation; the book is definitely not written for them.

I also found a few statements in the book that are just not true.
34 of 50 people found this helpful.
on February 14, 2014
What I was expecting: a mostly non-technical description of the ideas currently present in the various sub-fields of evolutionary computation, perhaps with elegant parallels drawn to the worlds of biology, cognitive science, genetics, etc., in a way that might facilitate interesting thought on how this stuff works in general.

What this actually seems to be: a mostly non-technical description of some of the author's own ostensibly new-ish theories based on some, but not all, of the things above; although I still can't quite understand where precisely the novelty and the theories come in (or why he chose this format and audience to talk about them). The descriptions are not very precise (again, probably due to the non-technical tone/format/intended audience). The chapters that give the reader an overview of background topics *are* almost what I was looking for, although without the interesting synthesis of distinct but related fields.

What I'm confused about: the apparent lack of distinction being drawn between Machine Learning principles and Genetic Algorithm principles, either in the book or in the reviews, given that the ideas from genetics are closer to, well, genetic algorithms; whereas the parallels drawn are mostly to machine learning.

Next book I'll try:
http://www.amazon.com/The-Engine-Complexity-Evolution-Computation/dp/0231163045/ref=pd_sim_b_21
2 of 3 people found this helpful.
on March 22, 2015
PAC learning is an interesting paradigm but the writer obscures his argument for it as a path to describing evolution mathematically by spending so much time repeating his view that evolution lacks explanatory/predictive value. The whole work comes off as yet another attack on evolution based on outdated understanding rather than the intended quest to find a fitting computational model for it.
on June 22, 2015
The book has some excellent ideas, but none of them are expanded enough. That is probably because this is a book of thought rather than evidence, so many things remain in a to-be-proved state. I am looking forward to seeing how research in this understanding of evolution could provide more concrete information.
on May 23, 2014
This is a very interesting book. The author presents the case for ecorithms (algorithms, heuristics perhaps) that could explain and ultimately allow quantitative assessment and testable predictions of the mechanisms (and timescale) of evolution and one of its most “mysterious” byproducts, consciousness/cognition (I should perhaps not conflate these two).

The author looks at the central problem of evaluating and making decisions based on incomplete information and small empirical samples, within biological and physical constraints, and still being successful. Linear/polynomial time algorithms (using the generalized concept of computation: universal Turing machines) for learning from inputs from external environments in a “theory-less” context could lead to “probably approximately correct” classifications, decisions and actions, and could be explanatory for evolution and perhaps human learning and human cultural evolution (with the latter having Lamarckian as well as Darwinian aspects).

The book explores these matters through the lens of computer science (the author’s expertise). This is a very interesting and instructive perspective. The limits of, and the similarities and contrasts between, computer systems and algorithms were well presented.

I think this book fits nicely with Penrose’s “Emperor’s New Mind” (which argues for non-algorithmic aspects to consciousness and learning), Kahneman’s “Thinking, Fast and Slow” (which explores the limitations of human reason - our hard wiring for making fast decisions with limited information, but our limitations in statistical and probabilistic reasoning) and Silver’s “The Signal and the Noise”.

I had to (and continue to) think hard about the concepts in the book. It is, however, I believe, a very refreshing viewpoint from which to seek to explain the gaps in evolution, cognition and learning. The author ends by looking at issues of artificial intelligence and why this has been more challenging than anticipated, and the author’s appeal to reason in relation to fears about a ‘Skynet’ future was very interesting. The integration of external inputs, the central role of learning, the power of inductive reasoning, and the need to combine inductive and deductive reasoning (the latter indispensable for what the author calls theory-ful contexts) are all part of the author’s rich explanation… it seemed to me “probably approximately correct”.
1 of 2 people found this helpful.
on June 29, 2015
It is a very nice and thought-provoking book. I was not aware that modern biology had no clue how to explain the speed of evolution, nor that learning algorithms and complexity theory could bring very useful insights into that area.
