
Neural Networks: A Comprehensive Foundation, 2nd Edition

25 customer reviews
ISBN-13: 978-0132733502
ISBN-10: 0132733501

Editorial Reviews

From the Publisher

This text represents the first comprehensive treatment of neural networks from an engineering perspective. Thorough, well-organized, and completely up-to-date, it examines all the important aspects of this emerging technology. Neural Networks provides broad coverage of the subject, including the learning process, back-propagation, radial basis functions, recurrent networks, self-organizing systems, modular networks, temporal processing, neurodynamics, and VLSI implementations. Chapter objectives, computer experiments, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary reinforce key concepts. The author's concise and fluid writing style makes the material more accessible. --This text refers to an out of print or unavailable edition of this title.

From the Back Cover

Renowned for its thoroughness and readability, this well-organized and completely up-to-date text remains the most comprehensive treatment of neural networks from an engineering perspective. Thoroughly revised.


  • NEW—New chapters now cover such areas as:
    • Support vector machines.
    • Reinforcement learning/neurodynamic programming.
    • Dynamically driven recurrent networks.
  • NEW—End-of-chapter problems revised, improved, and expanded in number.
  • Extensive, state-of-the-art coverage exposes readers to the many facets of neural networks and helps them appreciate the technology's capabilities and potential applications.
  • Detailed analysis of back-propagation learning and multi-layer perceptrons.
  • Explores the intricacies of the learning process—an essential component for understanding neural networks.
  • Considers recurrent networks, such as Hopfield networks, Boltzmann machines, and mean-field theory machines, as well as modular networks, temporal processing, and neurodynamics.
  • Integrates computer experiments throughout, giving the reader the opportunity to see how neural networks are designed and how they perform in practice.
  • Reinforces key concepts with chapter objectives, problems, worked examples, a bibliography, photographs, illustrations, and a thorough glossary.
  • Includes a detailed and extensive bibliography for easy reference.
  • Uses MATLAB SE version 5 for the computer-oriented experiments distributed throughout the book.

Product Details

  • Hardcover: 842 pages
  • Publisher: Prentice Hall; 2nd edition (July 16, 1998)
  • Language: English
  • ISBN-10: 0132733501
  • ISBN-13: 978-0132733502
  • Product Dimensions: 6.9 x 1.6 x 9.4 inches
  • Shipping Weight: 3.1 pounds
  • Average Customer Review: 4.1 out of 5 stars (25 customer reviews)
  • Amazon Best Sellers Rank: #788,149 in Books

Customer Reviews

Most Helpful Customer Reviews

45 of 45 people found the following review helpful By David Elder on April 19, 2003
Format: Hardcover
Haykin's book is probably the most comprehensive compendium of traditional neural network theory currently available. I say "traditional" because historically neural networks developed within the field of computer science, only loosely inspired by actual neuroscience. Feedforward networks, backpropagation, self-organizing maps, PCA, and hierarchical machines fit into this traditional lineage. A second branch of neural networks, inspired more heavily by biology, has sought to model brain function and structure. Within this camp are network models such as adaptive resonance theory (ART), BCS/FCS, integrate-and-fire models, and a variety of others. Though this second branch of neural network theory has applications in pattern recognition, image processing, clustering, etc., Haykin barely mentions it. In other words, Haykin presents the material that computer scientists and engineers want to see, but skimps on the more biological side of the field. That being said, the material covered in Haykin is very well presented, with clear mathematical notation and typesetting throughout. The book is accessible to graduates and advanced undergraduates. It should be on the shelf of every serious researcher, though workers in the biological sciences may want supplementary material. Computer scientists, mathematicians, and other engineers will not be disappointed at all.
64 of 68 people found the following review helpful By Dr. Lee D. Carlson on July 12, 2001
Format: Hardcover
This book, excellent for self-study and for use as a textbook, covers a subject that has had enormous impact in science and technology. One can say with confidence that neural networks will increase in importance in the decades ahead, especially in the field of artificial intelligence. The book is a comprehensive overview, and does take some time to read and digest, but it is worth the effort, as there are many applications of neural networks and the author is detailed in his discussion.
In the first part of the book, the author introduces neural networks and the modeling of brain functions. A good overview of the modeling of neural networks and knowledge representation is given, along with a discussion of how they are used in artificial intelligence. Ideas from computational learning are introduced, as well as the important concept of the Vapnik-Chervonenkis (VC) dimension. The VC dimension is defined in this book in terms of the maximum number of training examples that a machine can learn without errors. The author shows it to be a useful parameter that allows one to avoid the difficult problem of finding an exact formula for the growth function of a hypothesis space.
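The VC dimension the review mentions can be made concrete with a toy hypothesis class. The sketch below (my own illustration, not code from the book) brute-forces the labelings achievable by one-dimensional threshold rules and confirms that this class shatters any two distinct points but no three, i.e. its VC dimension is 2:

```python
def threshold_labelings(points):
    """All labelings of `points` achievable by 1-D threshold rules
    h(x) = [x >= t] or h(x) = [x <= t] for some real threshold t."""
    xs = sorted(points)
    # One candidate threshold per gap between points, plus one below
    # and one above all points; other thresholds give the same labelings.
    cuts = [xs[0] - 1.0] + [(a + b) / 2 for a, b in zip(xs, xs[1:])] + [xs[-1] + 1.0]
    labelings = set()
    for t in cuts:
        labelings.add(tuple(x >= t for x in points))
        labelings.add(tuple(x <= t for x in points))
    return labelings

def shattered(points):
    """True if the threshold class realizes all 2**n labelings of `points`."""
    return len(threshold_labelings(points)) == 2 ** len(points)

print(shattered([0.0, 1.0]))        # True: any 2 distinct points are shattered
print(shattered([0.0, 1.0, 2.0]))   # False: e.g. (+, -, +) is unachievable
```

The pattern (+, -, +) cannot be produced by any single threshold, which is exactly why the growth function of this class stops doubling at three points.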
In the next part of the book, the author discusses learning machines that have a teacher. The single-layer perceptron is introduced and shown to have an error-correction learning algorithm that is convergent. There is a fine discussion of optimization techniques and Bayes classifiers in this part. The least-mean-square algorithm is generalized to the back-propagation algorithm in order to train multi-layer perceptrons along with a discussion on how to optimize its performance using heuristics. The author gives a detailed discussion of the limitations of back-propagation learning.
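As a rough sketch of the error-correction rule the review refers to (my own toy example, not the book's code), here is the classic perceptron update applied to a linearly separable AND problem; because the data are separable, the perceptron convergence theorem guarantees the loop terminates after finitely many passes:

```python
import numpy as np

# Toy linearly separable problem: the logical AND of two inputs,
# with targets in {-1, +1} as in the classical perceptron setup.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
eta = 1.0         # learning rate

for epoch in range(100):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
            w += eta * yi * xi       # error-correction update
            b += eta * yi
            errors += 1
    if errors == 0:                  # a full pass with no mistakes: done
        break

print(all(yi * (w @ xi + b) > 0 for xi, yi in zip(X, y)))  # True once converged
```

The back-propagation algorithm discussed in the book generalizes this idea to multi-layer networks by replacing the hard threshold with differentiable units and propagating the error gradient backwards through the layers.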
26 of 28 people found the following review helpful By Luke Palmer on March 6, 2002
Format: Hardcover
Extremely concise, extremely complete. Every new page has a new concept or method. In the first chapter, I knew more than I did after reading two other books I bought on the subject.
I would suggest, however, not using this as an introduction. It is a bit more mathematically rigorous than some others, so use it once you understand the concepts first. It will shed new light on the concepts you already know, but it will probably fail at teaching them to you from the ground up.
I suggest this for the experienced Artificial Intelligence experimenter.
And for the love of god, use Perl for your test programs! Writing C++ classes for artificial intelligence is wholly impractical!
16 of 17 people found the following review helpful By Itzcoatl Muñoz Martinez on July 24, 2005
Format: Hardcover
Regarding theory, the book is good, but it needs more practical and numerical examples to make the information easier to understand.

There are too many concepts and ideas that are very hard to assimilate without a good example.

Also, the computer-oriented experiments in MATLAB do not use the Neural Network Toolbox, so it is hard to bridge the gap between the knowledge and working computer code.

If these two recommendations are addressed in a future edition, the book will become an excellent one.

17 of 19 people found the following review helpful By Sidhant on May 7, 2006
Format: Hardcover
I used this as a textbook for a Neural Networks course I did in the second year of my undergraduate program in Mathematics and Computing.

My mathematical background up to that point comprised Linear Algebra and upper-level Calculus. With that rather 'limited' mathematical exposure, I found the book quite difficult to follow. It becomes harder when you are expected to convert the mathematical equations into working programs (without using toolboxes or libraries, that is). The end-of-chapter exercises are pretty hard and try to go beyond what the text covers; most undergraduates may not be able to appreciate that. I think this is an excellent reference book for those who are pretty comfortable with the math. For undergraduates doing a first course in Neural Networks, I strongly recommend Timothy Masters' "Practical Neural Network Recipes in C++". The math there is manageable, and yes, it comes with working code to make your life easier.
