Perceptrons: An Introduction to Computational Geometry, Expanded Edition Paperback – December 28, 1987


See all 6 formats and editions

  • Paperback: $31.50 (new from $27.49, used from $6.49)
  • Unknown Binding: (no price listed)

$31.50. FREE Shipping on orders over $35. In Stock. Ships from and sold by Amazon.com. Gift-wrap available.

Frequently Bought Together

Perceptrons: An Introduction to Computational Geometry, Expanded Edition + The Society of Mind
Price for both: $43.54

  • The Society of Mind $12.04

Product Details

  • Paperback: 308 pages
  • Publisher: The MIT Press; expanded edition (December 28, 1987)
  • Language: English
  • ISBN-10: 0262631113
  • ISBN-13: 978-0262631112
  • Product Dimensions: 9.1 x 5.9 x 0.8 inches
  • Shipping Weight: 1 pound
  • Average Customer Review: 5.0 out of 5 stars (3 customer reviews)
  • Amazon Best Sellers Rank: #831,619 in Books

Editorial Reviews

About the Author

Marvin L. Minsky is Toshiba Professor of Media Arts and Sciences and Professor of Electrical Engineering and Computer Science at M.I.T. Seymour A. Papert is Professor of Education and Media Technology, also at M.I.T.

Customer Reviews

5.0 out of 5 stars (3 customer reviews)
  • 5 star: 3
  • 4 star: 0
  • 3 star: 0
  • 2 star: 0
  • 1 star: 0

Most Helpful Customer Reviews

11 of 11 people found the following review helpful By A Customer on April 2, 2000
Format: Paperback
This is a seminal work in the field of Artificial Intelligence. Following an initial period of enthusiasm, the field encountered a period of frustration and disrepute. Minsky and Papert's 1969 book summed up this general feeling of frustration among researchers by demonstrating the representational limitations of Perceptrons (used in neural networks). Their arguments were very influential in the field and accepted by most without further analysis.
I found this book to be generally easy to read. Despite being written in 1969, it is still very timely.
16 of 20 people found the following review helpful By Howard Schneider on November 26, 2000
Format: Paperback
In 1958, Cornell psychologist Frank Rosenblatt proposed the 'perceptron', one of the first neural networks to become widely known. A retina sensory layer projected to an association layer made up of threshold logic units, which in turn connected to the third layer, the response layer. If two groups of patterns are linearly separable, then the perceptron network works well in learning to classify them into separate classes. In this reference, Minsky and Papert show that, assuming a diameter-limited sensory retina, a perceptron network could not always compute connectedness, i.e., determine whether a line figure is one connected line or two separate lines.
Extrapolating the conclusions of this reference to other sorts of neural networks was a big setback to the field at the time. However, it was subsequently shown that adding a 'hidden' layer to the neural network overcame many of the limitations. This reference figures prominently in the field of neural networks and is often cited in modern works. But of even greater significance, the history of the perceptron demonstrates the complexity of analyzing neural networks. Before this reference, artificial neural networks were considered terrific; after it, limited; and then in the 1980s, terrific again. But at the time of this writing, it is recognized that despite their physiological plausibility, artificial neural networks do not scale well to the large or complex problems that brains handle easily, and artificial neural networks as we know them may actually be not so terrific.
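The learning behavior this review describes can be sketched in a few lines of Python (a modern illustration, not code from the book): a single threshold logic unit trained with the classic perceptron error-correction rule converges on a linearly separable problem such as AND, but no single unit can ever reproduce XOR, the simplest example of the kind of representational limit Minsky and Papert analyzed.

```python
# Minimal sketch of one threshold logic unit trained with the
# perceptron learning rule. Function and variable names are illustrative.

def train_perceptron(samples, epochs=20, lr=1.0):
    """samples: list of (inputs, target) pairs with targets 0 or 1."""
    n = len(samples[0][0])
    w = [0.0] * n          # weights
    b = 0.0                # bias (negative threshold)
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y    # error-correction rule: update only on mistakes
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)
print([predict(w, b, x) for x, _ in AND])   # learns AND: [0, 0, 0, 1]

w, b = train_perceptron(XOR)
# A single linear threshold unit can never output XOR's targets [0, 1, 1, 0],
# no matter how long it trains; a hidden layer is needed, as noted above.
print([predict(w, b, x) for x, _ in XOR])
```

The XOR failure is exactly the non-separability issue: no line in the plane puts (0,1) and (1,0) on one side and (0,0) and (1,1) on the other, which is why the later addition of a hidden layer mattered.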
0 of 3 people found the following review helpful By Ali Alderete on March 9, 2013
Format: Paperback Verified Purchase
A big book on specialized computer science topics; great if you're looking for research material. I used it as a reference for my thesis.
