- Hardcover: 992 pages
- Publisher: Addison-Wesley Professional; 4th edition (March 19, 2011)
- Language: English
- ISBN-10: 032157351X
- ISBN-13: 978-0321573513
- Product Dimensions: 7.6 x 1.4 x 9.2 inches
- Shipping Weight: 3.2 pounds
- Average Customer Review: 222 customer reviews
- Amazon Best Sellers Rank: #26,941 in Books
Algorithms (4th Edition)
About the Author
Robert Sedgewick has been a Professor of Computer Science at Princeton University since 1985, where he was the founding Chairman of the Department of Computer Science. He has held visiting research positions at Xerox PARC, the Institute for Defense Analyses, and INRIA, and is a member of the board of directors of Adobe Systems. Professor Sedgewick’s research interests include analytic combinatorics, design and analysis of data structures and algorithms, and program visualization. His landmark book, Algorithms, now in its fourth edition, has appeared in numerous versions and languages over the past thirty years. In addition, with Kevin Wayne, he is the coauthor of the highly acclaimed textbook, Introduction to Programming in Java: An Interdisciplinary Approach (Addison-Wesley, 2008).
Kevin Wayne is the Phillip Y. Goldman Senior Lecturer in Computer Science at Princeton University, where he has been teaching since 1998. He received a Ph.D. in operations research and industrial engineering from Cornell University. His research interests include the design, analysis, and implementation of algorithms, especially for graphs and discrete optimization. With Robert Sedgewick, he is the coauthor of the highly acclaimed textbook, Introduction to Programming in Java: An Interdisciplinary Approach (Addison-Wesley, 2008).
Top customer reviews
"Algorithms" (4th edition, by Robert Sedgewick and Kevin Wayne, published
by Addison-Wesley in March 2011) is one of the best computer science
books I have ever read. It should be required reading for all CS
students and all programmers - it aims to cover the "50 algorithms
every programmer should know". Below I discuss some of the main
reasons why I think the book is so good.
Unlike its main rival, "An introduction to algorithms" by Cormen,
Leiserson, Rivest and Stein (CLRS), "Algorithms" contains actual
source code (written in a subset of Java). The importance of this
cannot be overstated: it means students can actually use the
algorithms to solve real problems. This enables a wealth of
interesting and motivating applications --- from web search to
genomics --- which are sprinkled throughout the book. (Source code and
data are available on the book's website.)
A natural worry with real code is that it will obscure the basic
ideas. However, by careful use of abstract data types (classes
such as queues, bags, hash tables, trees, DAGs, etc.), the authors have
done a masterful job of creating extremely concise and readable code.
Using real code also forces one to address important implementation
details that are easy to overlook. For example, it is well known that
mergesort requires auxiliary memory. In the CLRS pseudocode, they
allocate temporary storage space inside their merge routine. In
practice it is much more efficient to allocate temporary storage space
once, and then pass this in as a pointer to the merge function (or let
it be a private member of the mergesort class). Where else can you
learn such important tricks?
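The trick described above can be sketched in Java (illustrative names, not the book's exact code): the public sort method allocates the auxiliary array once, and every recursive call and merge reuses it instead of allocating fresh storage.

```java
import java.util.Arrays;

public class MergeSketch {
    // Merge a[lo..mid] with a[mid+1..hi], using the shared aux array.
    private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];   // copy the run
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid)              a[k] = aux[j++];  // left run exhausted
            else if (j > hi)          a[k] = aux[i++];  // right run exhausted
            else if (aux[j] < aux[i]) a[k] = aux[j++];
            else                      a[k] = aux[i++];
        }
    }

    private static void sort(int[] a, int[] aux, int lo, int hi) {
        if (hi <= lo) return;
        int mid = lo + (hi - lo) / 2;
        sort(a, aux, lo, mid);
        sort(a, aux, mid + 1, hi);
        merge(a, aux, lo, mid, hi);
    }

    public static void sort(int[] a) {
        int[] aux = new int[a.length];  // allocated once, not per merge call
        sort(a, aux, 0, a.length - 1);
    }

    public static void main(String[] args) {
        int[] a = { 5, 2, 9, 1, 5, 6 };
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 5, 5, 6, 9]
    }
}
```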
In addition to presenting the code, there are of course accompanying
English explanations of the methods, which are very clear. One unique
thing about "Algorithms" is that there are also many detailed worked
examples, illustrating the behavior of the algorithms while running on
actual data (something that is hard to do unless you actually
implement all the algorithms!)
Another strength is that the book exemplifies good software
engineering practice: write the API first, devise unit tests and/or
implement applications ("clients") that use the data structure or
algorithm, and only then worry about how to implement the
API. Furthermore, multiple implementations of the API are usually
discussed, with different tradeoffs between simplicity, speed, and
memory use.
For data structures, it is obviously natural to use classes, but they
also adopt this approach for many algorithms, esp. graph processing
ones. This allows the algo to do pre-processing and to store internal
state, and then to provide a service to the caller. This is more
general than the standard stateless functional view of algorithms.
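A minimal sketch of this pattern (the class and method names are illustrative, not the book's API): the constructor does the preprocessing, here a DFS from a source vertex, and queries are then cheap method calls against the stored state.

```java
import java.util.ArrayList;
import java.util.List;

public class ReachabilitySketch {
    private final boolean[] marked;   // internal state built once in the constructor

    public ReachabilitySketch(List<int[]> edges, int vertexCount, int source) {
        // Build adjacency lists from an undirected edge list.
        List<List<Integer>> adj = new ArrayList<>();
        for (int v = 0; v < vertexCount; v++) adj.add(new ArrayList<>());
        for (int[] e : edges) { adj.get(e[0]).add(e[1]); adj.get(e[1]).add(e[0]); }
        marked = new boolean[vertexCount];
        dfs(adj, source);                     // preprocessing happens here
    }

    private void dfs(List<List<Integer>> adj, int v) {
        marked[v] = true;
        for (int w : adj.get(v)) if (!marked[w]) dfs(adj, w);
    }

    public boolean connectedTo(int v) { return marked[v]; }  // O(1) query
}
```

After construction, the caller can issue any number of connectivity queries without re-running the search.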
Each section of the book has a large number of exercises, classified
into "simple", "creative" and "experimental". Solutions to some
exercises are available online.
An unusual feature of the book is that it contains a lot of empirical
algorithmics, in addition to theory. That is, it shows actual running
times of different implementations on problems of different sizes, and
uses this to supplement traditional theoretical analysis.
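A minimal sketch of this style of experiment (illustrative, not the book's code): time a routine on doubling input sizes and look at the ratio of successive running times, which should approach 2^b for an N^b algorithm.

```java
import java.util.Arrays;
import java.util.Random;

public class DoublingSketch {
    // Time one trial: sort a random array of size n, return elapsed nanoseconds.
    static long timeTrial(int n) {
        int[] a = new Random(42).ints(n).toArray();
        long start = System.nanoTime();
        Arrays.sort(a);
        return System.nanoTime() - start;
    }

    public static void main(String[] args) {
        long prev = timeTrial(1 << 14);
        for (int n = 1 << 15; n <= 1 << 18; n <<= 1) {
            long t = timeTrial(n);
            // For an N log N sort the ratio hovers a little above 2.
            System.out.printf("n=%d ratio=%.1f%n", n, (double) t / prev);
            prev = t;
        }
    }
}
```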
A small bonus relative to CLRS is that the book is slightly shorter
(~1000 pages instead of 1300). In addition, it is available in Kindle
format, which means one just has to carry around an iPad instead of a
back-breaking tome. (The formatting of the Kindle edition is not
perfect, however.)
Not surprisingly, the content of "Algorithms" overlaps a lot with
CLRS. This is not obvious from the table of contents, which only
gives a high level view of the book. I have therefore created a more
detailed list of topics (see below).
The overall ordering and flow of the book is great: ideas (and code)
that are introduced early on get re-used in several places later in
the book (e.g., heaps -> priority queues -> Prim's algo for min
spanning trees). The topics also become more advanced. Consequently,
the book is best read sequentially. It is worth reading the whole thing.
Prof. of Computer Science
University of British Columbia
Below I give a detailed summary of the topics in the book,
since this is not apparent from the table of contents.
1.1 Basic programming model
- Intro to Java
- APIs and libraries
- Binary search (recursion)
1.2 Data abstraction
- Basics of OOP
- Avoiding 'wide' interfaces
1.3 Bags, queues and stacks
- Generics (similar to templates in C++)
- Dijkstra's 2 stack algo for evaluating arithmetic expressions
- Resizing arrays
- Linked lists, pointers
1.4 Analysis of algorithms
- empirical algorithmics
- big O notation ("linearithmic" as a term for O(N log N))
- Randomized algorithms
- Memory usage
1.5 Case study: Union-find
- Application: Dynamic connectivity (are p,q in same set?)
- 3 implementations, culminating in the classic algorithm
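The classic algorithm that the union-find case study builds up to is commonly taken to be weighted quick-union with path compression; a sketch (illustrative names, not the book's exact API):

```java
public class UFSketch {
    private final int[] parent, size;

    public UFSketch(int n) {
        parent = new int[n];
        size = new int[n];
        for (int i = 0; i < n; i++) { parent[i] = i; size[i] = 1; }
    }

    public int find(int p) {
        while (parent[p] != p) {
            parent[p] = parent[parent[p]];  // path compression (by halving)
            p = parent[p];
        }
        return p;
    }

    public boolean connected(int p, int q) { return find(p) == find(q); }

    public void union(int p, int q) {
        int rp = find(p), rq = find(q);
        if (rp == rq) return;
        if (size[rp] < size[rq]) { int t = rp; rp = rq; rq = t; }  // weighting:
        parent[rq] = rp;                 // always hang the smaller tree below
        size[rp] += size[rq];
    }
}
```

This answers the dynamic-connectivity question ("are p and q in the same set?") in nearly constant amortized time per operation.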
2.1 Elementary sorts
- Selection sort
- insertion sort
- shell sort
2.2 Mergesort
- Top-down (recursive)
- Proof that running time is N log N
- Proof that the lower bound for sorting requires N log N compares
2.3 Quicksort
- 3-way partitioning to speed up the case of equal keys
- Lower bound for sorting is N * (entropy of the key distribution)
2.4 Priority queues
- priority queue (binary heap implementation)
- top N items from a list using PQ
- multiway merge of N sorted lists using indexed PQ
2.5 Applications
- comparison of sorting algos (speed, stability, in place, extra space)
- order statistics / median finding in O(N) expected time
3.1 Symbol tables (aka associative arrays)
- ST vs ordered ST (where keys can be compared, so can get min and max)
- count word frequencies in a large document
- sequential search through unordered linked list
- binary search through ordered array
3.2 Binary search trees
- BST property (parent is bigger than left child, smaller than right)
- get and put implementation and analysis O(log N) time
- find min, delete min, delete any node
- inorder traversal
3.3 Balanced search trees
- 2-3 trees and red-black trees
3.4 Hash tables
- hash functions (eg modular hashing with Horner's rule)
- separate chaining
- linear probing
3.5 Applications
- Dictionary lookup
- inverted index
- file index
- sparse matrix-vector multiplication
4.1 Undirected graphs
- Adjacency list representation
- Depth first search
- Breadth first search
- single source shortest paths using bfs
- connected components using dfs
- is G acyclic using dfs
- is G bipartite using dfs
- Kevin Bacon game (degrees of separation)
4.2 Directed graphs
- Multi-source reachability
- Application to mark-sweep garbage collection
- Cycle detection using dfs
- topological sort (reverse of post order)
- Kosaraju's algo for strongly connected components
- Transitive closure (all pairs reachability)
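The "reverse of post order" idea noted above can be sketched as follows (a minimal digraph as plain adjacency lists; names are illustrative):

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class TopoSketch {
    // Topological order of a DAG = reverse DFS postorder.
    public static List<Integer> topologicalOrder(List<List<Integer>> adj) {
        int n = adj.size();
        boolean[] marked = new boolean[n];
        Deque<Integer> reversePost = new ArrayDeque<>();
        for (int v = 0; v < n; v++)
            if (!marked[v]) dfs(adj, v, marked, reversePost);
        return new ArrayList<>(reversePost);  // stack iteration yields reversed postorder
    }

    private static void dfs(List<List<Integer>> adj, int v,
                            boolean[] marked, Deque<Integer> reversePost) {
        marked[v] = true;
        for (int w : adj.get(v)) if (!marked[w]) dfs(adj, w, marked, reversePost);
        reversePost.push(v);  // pushed in postorder, so popping reverses it
    }
}
```

Every edge ends up pointing from an earlier vertex to a later one in the returned order (assuming the input really is acyclic).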
4.3 Min spanning trees of undirected weighted graphs
- Prim's algo
- Kruskal's algo
4.4 Shortest paths in weighted digraphs
- Dijkstra's algo
- Shortest paths in weighted DAGs (possibly with negative weights)
- Critical path method for scheduling
- Shortest paths in weighted cyclic digraphs (Bellman-Ford and negative cycle detection)
- Application to arbitrage
5.1 String sorts
- key indexed counting (radix sort)
- least significant digit (LSD) sorting
- most significant digit (MSD) sorting for variable length strings
- 3-way string quicksort for repeated prefixes.
5.2 Tries
- R-way trie
- Ternary search tries (BST representation of R-way array)
5.3 Substring search
- brute force method
- KMP method
- Boyer-Moore method
- Rabin-Karp fingerprint
5.4 Regular expressions
- Syntax of regexp
- Check if string in language using non-deterministic finite automaton
5.5 Data compression
- Run-length encoding
- Huffman compression
- LZW compression (using tries)
6.1 Event driven simulation using PQs
6.3 Suffix arrays.
- Find longest repeated substring.
- Indexing a string (keyword in context)
6.4 Ford-Fulkerson maxflow.
- Find shortest augmenting path.
- Maximum bipartite matching reduces to maxflow
- maxflow and shortest paths reduce to linear programming
6.5 NP completeness
Writing like this permeates the entire tome, but in planning this review, I happened to flip to page 86. The page focuses on Java methods. Here are a few examples, just from this page, of why I am writing this review:
They say, "To implement data-type operations (the behavior of each object) we implement instance methods with code that is precisely like the code that you learned in section 1.1 to implement static methods (functions)."
Translation: Writing instance methods in Java is just like writing static methods.
The complexity is unnecessary, and the sentence turns out not to be correct. Later in the paragraph, they explain the key difference between the instance methods and static methods, and they reveal that "precisely" was precisely the wrong word choice; they meant to say "much". This may seem a picky complaint, but it is important. A good technical book must be either crystal clear or extremely accurate. To be dense and careless is unfair to the reader, who must take the time to decode each and every word carefully in order to determine the meaning.
The very next sentence goes on to accidentally explain that *every* Java method must have a return statement. Here it is:
"Each instance method has a return type, a signature (which specifies its name and the types and names of its parameter variables), and a body (which consists of a sequence of statements, including a return statement that provides a value of the return type back to the client)."
I say "accidentally" because it's pretty clearly unintentional. It's not at all that the authors don't know Java - they obviously do! - it's rather that they appear to get lost in their own unwieldy sentences. Clearly, many correct Java methods do not contain return statements, and this is illustrated beautifully by the two short and elegant examples of Java code provided on the page!
Again, from the same page: "How do we specify which objects instance variables we want to use? If you think about this question for a moment, you will see the logical answer: a reference to a variable in an instance method refers to the value for the object that was used to invoke the method."
Translation: When there is an instance of some object, its methods can only access its own variables, not those of other instances. Put another way, if some code were to call myThing.getInfo(), getInfo() would only be able to use variables stored inside of myThing.
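That translation can be shown in a few lines (the class and method names here are made up for the example, echoing the review's myThing.getInfo()):

```java
public class Thing {
    private final String info;   // each instance gets its own copy

    public Thing(String info) { this.info = info; }

    public String getInfo() {
        return info;  // refers to *this* object's field, never another instance's
    }

    public static void main(String[] args) {
        Thing myThing = new Thing("alpha");
        Thing other = new Thing("beta");
        System.out.println(myThing.getInfo()); // prints "alpha", never "beta"
        System.out.println(other.getInfo());   // prints "beta"
    }
}
```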
I was hoping to use this book with my students, but I am disappointed that the authors seem to have lost sight of students as readers. The students who learn out of these books don't yet know the material. It takes careful crafting to unpack difficult ideas for a reasonably motivated novice. The authors didn't take this challenge seriously enough, leaving the reading experience very frustrating for the audience that matters the most.
(As a related side note: The authors also create dense code with 1980's-style variable names and significant use of side effects. You must keep k++ straight from ++k if you are to understand the sorting examples in this book! This is a single line of code from quicksort: "while (less(a[++i], v)) if (i == hi) break;")
I still hold high hopes for future editions. In spite of this review, there are some really good aspects to this work. I have no doubt that, if the authors took the time to switch from an academic style of writing to a more pedagogical one, this book could become the truly extraordinary resource it is meant to be. The authors have high aspirations, but they haven't achieved them yet.