You Are Not a Gadget: A Manifesto Paperback – February 8, 2011
A NATIONAL BESTSELLER
A programmer, musician, and father of virtual reality technology, Jaron Lanier was a pioneer in digital media, and among the first to predict the revolutionary changes it would bring to our commerce and culture. Now, with the Web influencing virtually every aspect of our lives, he offers this provocative critique of how digital design is shaping society, for better and for worse.
Informed by Lanier’s experience and expertise as a computer scientist, You Are Not a Gadget discusses the technical and cultural problems that have unwittingly risen from programming choices—such as the nature of user identity—that were “locked-in” at the birth of digital media and considers what a future based on current design philosophies will bring. With the proliferation of social networks, cloud-based data storage systems, and Web 2.0 designs that elevate the “wisdom” of mobs and computer algorithms over the intelligence and wisdom of individuals, his message has never been more urgent.
- Print length: 240 pages
- Language: English
- Publisher: Vintage
- Publication date: February 8, 2011
- Dimensions: 5.2 x 0.7 x 8 inches
- ISBN-10: 0307389979
- ISBN-13: 978-0307389978
Editorial Reviews
Review
A New York Times, Los Angeles Times, and Boston Globe Bestseller
“Lucid, powerful and persuasive. . . . Necessary reading for anyone interested in how the Web and the software we use every day are reshaping culture and the marketplace.”
—Michiko Kakutani, The New York Times
“Persuasive. . . . Lanier is the first great apostate of the Internet era.”
—Newsweek
“Thrilling and thought-provoking. . . . A necessary corrective in the echo chamber of technology debates.”
—San Francisco Chronicle
“Mind-bending, exuberant, brilliant. . . . Lanier dares to say the forbidden.”
—The Washington Post
“With an expertise earned through decades of work in the field, Lanier challenges us to express our essential humanity via 21st century technology instead of disappearing in it. . . . [You Are Not a Gadget] compels readers to take a fresh look at the power—and limitations—of human interaction in a socially networked world.”
—Time (“The 2010 Time 100”)
“Lanier is not of my generation, but he knows and understands us well, and has written a short and frightening book, You Are Not a Gadget, which chimes with my own discomfort, while coming from a position of real knowledge and insight, both practical and philosophical.”
—Zadie Smith, The New York Review of Books
“Sparky, thought-provoking. . . . Lanier clearly enjoys rethinking received tech wisdom: his book is a refreshing change from Silicon Valley’s usual hype.”
—New Scientist
“Important. . . . At the bottom of Lanier’s cyber-tinkering is a fundamentally humanist faith in technology. . . . His mind is a fascinating place to hang out.”
—Los Angeles Times
“A call for a more humanistic—to say nothing of humane—alternative future in which the individual is celebrated more than the crowd and the unique more than the homogenized. . . . You Are Not a Gadget may be its own best argument for exalting the creativity of the individual over the collective efforts of the ‘hive mind.’ It’s the work of a singular visionary.”
—Bloomberg News
“A bracing dose of economic realism and Randian philosophy for all those techno utopianists with their heads in the cloud. . . . [Lanier is] a true iconoclast. . . . He offers the sort of originality of thought he finds missing on the Web.”
—The Miami Herald
“For those who wish to read to think, and read to transform, You Are Not a Gadget is a book to begin the 2010s. . . . It is raw, raucous and unexpected. It is also a hell of a lot of fun.”
—Times Higher Education
“[Lanier] confronts the big issues with bracing directness. . . . The reader sits up. One of the insider’s insiders of the computing world seems to have gone rogue.”
—The Boston Globe
“Gadget is an essential first step at harnessing a post-Google world.”
—The Stranger (Seattle)
“Lanier turns a philosopher’s eye to our everyday online tools. . . . The reader is compelled to engage with his work, to assent, contradict, and contemplate. . . . Lovers of the Internet and all its possibilities owe it to themselves to plunge into Lanier’s manifesto and look hard in the mirror. He’s not telling us what to think; he’s challenging us to take a hard look at our cyberculture, and emerge with new creative inspiration.”
—Flavorwire
“Poetic and prophetic, this could be the most important book of the year. . . . Read this book and rise up against net regimentation!”
—The Times (London)
“[Lanier’s] argument will make intuitive sense to anyone concerned with questions of propriety, responsibility, and authenticity.”
—The New Yorker
“Inspired, infuriating and utterly necessary. . . . Lanier tells of the loss of a hi-tech Eden, of the fall from play into labour, obedience and faith. Welcome to the century’s first great plea for a ‘new digital humanism’ against the networked conformity of cyber-space. This eloquent, eccentric riposte comes from a sage of the virtual world who assures us that, in spite of its crimes and follies, ‘I love the internet.’ That provenance will only deepen its impact, and broaden its appeal.”
—The Independent (London)
“Fascinating and provocative. . . . Destined to become a must-read for both critics and advocates of online-based technology and culture.”
—Publishers Weekly
About the Author
Jaron Lanier is known as the father of virtual reality technology and has worked on the interface between computer science and medicine, physics, and neuroscience. He lives in Berkeley, California.
Visit the author's website at www.jaronlanier.com.
Excerpt. © Reprinted by permission. All rights reserved.
THE IDEAS THAT I hope will not be locked in rest on a philosophical foundation that I sometimes call cybernetic totalism. It applies metaphors from certain strains of computer science to people and the rest of reality. Pragmatic objections to this philosophy are presented.
What Do You Do When the Techies Are Crazier Than the Luddites?
The Singularity is an apocalyptic idea originally proposed by John von Neumann, one of the inventors of digital computation, and elucidated by figures such as Vernor Vinge and Ray Kurzweil.
There are many versions of the fantasy of the Singularity. Here’s the one Marvin Minsky used to tell over the dinner table in the early 1980s: One day soon, maybe twenty or thirty years into the twenty-first century, computers and robots will be able to construct copies of themselves, and these copies will be a little better than the originals because of intelligent software. The second generation of robots will then make a third, but it will take less time, because of the improvements over the first generation.
The process will repeat. Successive generations will be ever smarter and will appear ever faster. People might think they’re in control, until one fine day the rate of robot improvement ramps up so quickly that superintelligent robots will suddenly rule the Earth.
In some versions of the story, the robots are imagined to be microscopic, forming a “gray goo” that eats the Earth; or else the internet itself comes alive and rallies all the net-connected machines into an army to control the affairs of the planet. Humans might then enjoy immortality within virtual reality, because the global brain would be so huge that it would be absolutely easy—a no-brainer, if you will—for it to host all our consciousnesses for eternity.
The coming Singularity is a popular belief in the society of technologists. Singularity books are as common in a computer science department as Rapture images are in an evangelical bookstore.
(Just in case you are not familiar with the Rapture, it is a colorful belief in American evangelical culture about the Christian apocalypse. When I was growing up in rural New Mexico, Rapture paintings would often be found in places like gas stations or hardware stores. They would usually include cars crashing into each other because the virtuous drivers had suddenly disappeared, having been called to heaven just before the onset of hell on Earth. The immensely popular Left Behind novels also describe this scenario.)
There might be some truth to the ideas associated with the Singularity at the very largest scale of reality. It might be true that on some vast cosmic basis, higher and higher forms of consciousness inevitably arise, until the whole universe becomes a brain, or something along those lines. Even at much smaller scales of millions or even thousands of years, it is more exciting to imagine humanity evolving into a more wonderful state than we can presently articulate. The only alternatives would be extinction or stodgy stasis, which would be a little disappointing and sad, so let us hope for transcendence of the human condition, as we now understand it.
The difference between sanity and fanaticism is found in how well the believer can avoid confusing consequential differences in timing. If you believe the Rapture is imminent, fixing the problems of this life might not be your greatest priority. You might even be eager to embrace wars and tolerate poverty and disease in others to bring about the conditions that could prod the Rapture into being. In the same way, if you believe the Singularity is coming soon, you might cease to design technology to serve humans, and prepare instead for the grand events it will bring.
But in either case, the rest of us would never know if you had been right. Technology working well to improve the human condition is detectable, and you can see that possibility portrayed in optimistic science fiction like Star Trek.
The Singularity, however, would involve people dying in the flesh and being uploaded into a computer and remaining conscious, or people simply being annihilated in an imperceptible instant before a new superconsciousness takes over the Earth. The Rapture and the Singularity share one thing in common: they can never be verified by the living.
You Need Culture to Even Perceive Information Technology
Ever more extreme claims are routinely promoted in the new digital climate. Bits are presented as if they were alive, while humans are transient fragments. Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality.
Kevin Kelly says that we don’t need authors anymore, that all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. Wired editor Chris Anderson proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway.*
Antihuman rhetoric is fascinating in the same way that self-destruction is fascinating: it offends us, but we cannot look away.
The antihuman approach to computation is one of the most baseless ideas in human history. A computer isn’t even there unless a person experiences it. There will be a warm mass of patterned silicon with electricity coursing through it, but the bits don’t mean anything without a cultured person to interpret them.
This is not solipsism. You can believe that your mind makes up the world, but a bullet will still kill you. A virtual bullet, however, doesn’t even exist unless there is a person to recognize it as a representation of a bullet. Guns are real in a way that computers are not.
Making People Obsolete So That Computers Seem More Advanced
Many of today’s Silicon Valley intellectuals seem to have embraced what used to be speculations as certainties, without the spirit of unbounded curiosity that originally gave rise to them. Ideas that were once tucked away in the obscure world of artificial intelligence labs have gone mainstream in tech culture. The first tenet of this new culture is that all of reality, including humans, is one big information system. That doesn’t mean we are condemned to a meaningless existence. Instead there is a new kind of manifest destiny that provides us with a mission to accomplish. The meaning of life, in this view, is making the digital system we call reality function at ever-higher “levels of description.”
People pretend to know what “levels of description” means, but I doubt anyone really does. A web page is thought to represent a higher level of description than a single letter, while a brain is a higher level than a web page. An increasingly common extension of this notion is that the net as a whole is or soon will be a higher level than a brain. There’s nothing special about the place of humans in this scheme. Computers will soon get so big and fast and the net so rich with information that people will be obsolete, either left behind like the characters in Rapture novels or subsumed into some cyber-superhuman something.
Silicon Valley culture has taken to enshrining this vague idea and spreading it in the way that only technologists can. Since implementation speaks louder than words, ideas can be spread in the designs of software. If you believe the distinction between the roles of people and computers is starting to dissolve, you might express that—as some friends of mine at Microsoft once did—by designing features for a word processor that are supposed to know what you want, such as when you want to start an outline within your document. You might have had the experience of having Microsoft Word suddenly determine, at the wrong moment, that you are creating an indented outline. While I am all for the automation of petty tasks, this is different.
From my point of view, this type of design feature is nonsense, since you end up having to work more than you would otherwise in order to manipulate the software’s expectations of you. The real function of the feature isn’t to make life easier for people. Instead, it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves.
Another example is what I call the “race to be most meta.” If a design like Facebook or Twitter depersonalizes people a little bit, then another service like Friendfeed—which may not even exist by the time this book is published—might soon come along to aggregate the previous layers of aggregation, making individual people even more abstract, and the illusion of high-level metaness more celebrated.
Information Doesn’t Deserve to Be Free
“Information wants to be free.” So goes the saying. Stewart Brand, the founder of the Whole Earth Catalog, seems to have said it first.
I say that information doesn’t deserve to be free.
Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it’s even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?
Of course, there is a technical use of the term “information” that refers to something entirely real. This is the kind of information that’s related to entropy. But that fundamental kind of information, which exists independently of the culture of an observer, is not the same as the kind we can put in computers, the kind that supposedly wants to be free.
Information is alienated experience.
You can think of culturally decodable information as a potential form of experience, very much as you can think of a brick resting on a ledge as storing potential energy. When the brick is prodded to fall, the energy is revealed. That is only possible because it was lifted into place at some point in the past.
In the same way, stored information might cause experience to be revealed if it is prodded in the right way. A file on a hard disk does indeed contain information of the kind that objectively exists. The fact that the bits are discernible instead of being scrambled into mush—the way heat scrambles things—is what makes them bits.
But if the bits can potentially mean something to someone, they can only do so if they are experienced. When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information.
Information of the kind that purportedly wants to be free is nothing but a shadow of our own minds, and wants nothing on its own. It will not suffer if it doesn’t get what it wants.
But if you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith.
*Chris Anderson, “The End of Theory,” Wired, June 23, 2008 (www.wired.com/science/discoveries/magazine/16-07/pb_theory).
Product details
- Publisher : Vintage; Reprint edition (February 8, 2011)
- Language : English
- Paperback : 240 pages
- ISBN-10 : 0307389979
- ISBN-13 : 978-0307389978
- Item Weight : 2.31 pounds
- Dimensions : 5.2 x 0.7 x 8 inches
- Best Sellers Rank: #66,500 in Books
- #44 in Computer Graphics & Design
- #52 in Internet & Telecommunications
- #59 in Social Aspects of Technology
Customer reviews
Customers say
Customers find the book insightful, philosophical, and an important introduction to discourse about the future of the web. They describe it as well worth reading, interesting, and enjoyable. Readers also appreciate the clear writing.
AI-generated from the text of customer reviews
Customers find the book insightful, philosophical, and important. They say it establishes a better overall context for understanding the future of the web, offers good analogies, and makes them think about how humans are lowering their standards.
"...I found this book to be a fantastic thought exercise and it made me take a hard look at my technological worldview..."
"...that he and many of his colleagues pioneered, this book is very relevant today..."
"...I could go on but will simply say that this book is important and worth a read..."
"...says, but I think he brings up a number of very important points. The point that resonated..."
Customers find the book well worth reading, interesting, and enjoyable. They also say it's a rough ride.
"...It was well worth the time and the price spent..."
"...So suffer through the sentences. The rough ride is worth it--even if it takes you, as it did in my case, a few months to finish..."
"I thought this was a great, thought-provoking book. Definitely buy it..."
Customers find the writing easy, clear, and refreshing. They also say it's a decent read.
"...direction. Lanier's writing is clear, and should be required reading for regular followers of..."
"...It is an easy book to read in short segments." --Mark Winter
"...Poor writing, poor editing, from start to finish..."
"...Go and read it for yourself. It's an easy read and, although he tends to ramble and repeat points, the book is an important introduction to..."
Customers find the content hardly worth the effort. They say the book doesn't deliver.
"...study of hand-waving and proof by intimidation, but the content is hardly worth the effort..."
"...based on quantitative research via statistical sampling, this book does not deliver, especially when he speaks of the tendency of crowds to revert..."
"This is honestly one of the worst books I've ever read (and I read a lot!)..."
"Crappy book. Hard to understand."
Top reviews from the United States
This unsettling book explores some of the strange conundrums created by our fascination with all things 'web 2.0'. From the way one programmer's convenience becomes the next generation's strait-jacket, to the loss of identity in wiki-based knowledge and the lowering of self-esteem among Facebook-addicted youth, to the 'ideal' of perpetual existence as a stream of electrons in a computer's 'consciousness', this book takes science fiction and roots it deep into the rich manure of common current 'culture'.
The concept that structure and process can speed up adoption and dissemination of new ideas by lowering volatility and improving message targeting is anathema to the proponents of wiki-style freedom. But is the freedom of information necessarily worth the sacrifice of individual expression, attribution and control? Proponents of the hive mind or noosphere would argue that case but Lanier takes an independent stance that values contribution of individuals as individuals, with their personal intelligence, experience and emotion, above the anonymous, often re-edited and variable outputs of agglomerated information mash-ups. It is a brave, but valid, stance and coherently reasoned.
The doctrine of crowd-based wisdom is infiltrating strategy and policy development processes. Whilst involvement is inherently useful, it appears obvious, upon reading this treatise, that there should be clear limits to the way in which crowds are used and scope for individual attributable contributions to retain relevance.
The use of pseudonyms and anonymous postings is definitely supporting the rise of 'trolls'. Trolls, in cyberspace, are people who are abusive towards other people or ideas. They have been implicated in cyber-bullying, which leaves boards exposed to claims of failure to prevent harassment and/or discrimination. The move towards transparency is greatly hampered when organisations interact online with anonymous respondents.
As Lanier points out, "If you win anonymously no one knows, and if you lose, you just change your pseudonym and start over, without having modified your point of view one bit. If the Troll is anonymous and the target is known then the dynamic is even worse." Any company is at risk of a cyber-storm if their operations, brand or philosophy should offend a tribe of trolls. The case of Nestle and the palm oil debate is a dramatic illustration of this principle in action.
Another of Lanier's bugbears is the principle of 'lock-in', where decisions made in the early stages of development establish constraints on decision-making in the later stages until they become ingrained as 'facts'. Reducing the richness of individual experience to suit the templates of networking sites is a harrowing process for any innovative thinker. Cutting the glissando of music into computer-recognisable notes is anathema to many musicians. Both of these processes have enabled sharing and progress on a scale unparalleled in human history. Both are reducing the expression of future potential by fitting it into a template based on past expedience.
Lanier is one of the leading thinkers of the internet age, and this book has set him apart from, and at odds with, his fellows. It has also provided a necessary space for consideration in our headlong rush to the brave new lands of the internet-fuelled universe. Like the maps of olden times, at the edges of our current knowledge it would be well to mark the internet with signs stating 'Here be Dragons'. They may only be dragons of our own invention, but it is as well to proceed towards them with due caution.
Highly recommended for both fans and sceptics of web 2.0 plus anyone who is still undecided.
* Julie Garland McLellan is a professional non-executive director, board and governance consultant and mentor. She is the author of "Presenting to Boards", "Dilemmas, Dilemmas: Practical Case Studies for Company Directors", "The Director's Dilemma", "All Above Board: Great Governance for the Government Sector" and numerous articles on corporate strategy and governance.
The book starts out with the limitations inherent in our current technology due to lock-in, as many of the programming languages in use today were written ten, twenty, or thirty years ago. A good example is MIDI. MIDI was created as a simple mimic of a synthesizer keyboard on a computer, but it only specifies certain notes in a limited range (like the keys on a piano). Pick up a saxophone or start to sing and there are many more possible sounds than MIDI can produce; the technology is now so embedded in everything we do that it's locked in.
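The piano-key limitation the reviewer mentions is visible in MIDI's own data model: a note-on message carries a 7-bit note number (0 to 127, one per equal-tempered semitone), so a continuous pitch such as a vocal glissando must be snapped to the nearest key (or approximated separately with pitch-bend messages). A minimal sketch of that quantization, using only the standard MIDI note-number formula (note 69 = A4 = 440 Hz); the function names here are my own, not part of any MIDI library:

```python
import math

def freq_to_midi_note(freq_hz: float) -> int:
    """Map a frequency in Hz to the nearest MIDI note number (0-127).

    Standard MIDI convention: note 69 = A4 = 440 Hz, with twelve
    equal-tempered semitones per octave.
    """
    note = 69 + 12 * math.log2(freq_hz / 440.0)
    return max(0, min(127, round(note)))

def midi_note_to_freq(note: int) -> float:
    """Inverse mapping: MIDI note number back to a frequency in Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

# A voice sliding smoothly upward from A4 passes through pitches that
# MIDI note numbers cannot name: each intermediate frequency snaps to
# the nearest semitone on the piano-key grid.
glide_hz = [440.0, 446.4, 452.9, 459.4, 466.2]
print([freq_to_midi_note(f) for f in glide_hz])
```

Everything between two adjacent note numbers collapses onto one of them, which is exactly the lock-in Lanier complains about: a representation chosen for keyboards in the early 1980s still constrains how pitch is encoded today.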
Lanier stands in contrast to proponents of the free/open culture movement; most free culture advocates perceive themselves as rebellious and liberal, but Lanier posits that they are the conservative ones. He makes a great point in that many of our best pieces of software have come from closed systems, e.g. the iPhone or Adobe Flash. This quote sums up his view nicely: "If we choose to pry culture away from capitalism while the rest of life is still capitalistic, culture will become a slum." I think a happy medium does exist, but it is currently not present in efforts such as Creative Commons licensing (though it's certainly a start).
"Am I accusing all of those hundreds of millions of users of social networking sites of reducing themselves in order to be able to use the services? Well yes, I am. [...] A real friendship ought to introduce each person to unexpected weirdness in the other. Each acquaintance is an alien, a well of unexplored difference in the experience of life that cannot be imagined or accessed in any way but through genuine interaction. The idea of friendship in database-filtered social networks is certainly reduced from that." Sure all of that is true, but it all depends on how we define and value various words. I don't consider all of my 1,192 "Facebook friends" to be my close friends in real life; many are people that I've met along the way and simply want to keep in touch with occasionally. I realized shortly after reading that chapter that I was being small. I grew up straddling the analog and digital ages and I know both. Lanier is looking beyond that at future generations that will grow up on Facebook, Twitter, and other web 2.0 networks.
I've found great joy in people I've met on the internet and proceeded to meet in person, many of whom have become great friends. I've been meeting people from online communities for nearly a decade now and it's never been weird or creepy, aside from the middle-school-dance feeling that might occur for the first few minutes. To be fair, Lanier does spend about a page or two praising this result of the web, but I don't think he gives it enough credit.
One point I have to strongly disagree with is Lanier's assertion that musical progress has been greatly slowed and everything is just "retro, retro, retro." He says most people in their 20s can't differentiate between 90s and 00s music. Can you tell me there's anything from the 20th century that sounds like The Postal Service or The Knife? Those are just two examples off the top of my head, but there's a plethora of original music out there right now that is distinct to our time. I'm also not sure why musical genres and trends have to be spliced into ten-year increments that coincide neatly with decades, but that's just an aside. Musical genres have splintered and there isn't currently an overarching archetype, but I would say that's simply because we have access to so much music and record companies no longer have as much power to set the standard for what is appropriate for the masses. The masses decide for themselves by finding new music on the internet.
I found this book to be a fantastic thought exercise and it made me take a hard look at my technological worldview. I wish the conclusion were more coherent and less tangential; Lanier goes on to talk about cephalopods for several pages at the end of the book.
My life is steeped in web 2.0; this review itself will be sent to four different web 2.0 platforms after I hit the publish button. We as a culture and society have become so engrossed in these platforms that I think it's important to step back and evaluate exactly what it is we're doing. I hope a compromise exists between a totally free and open culture and closed systems; I suppose we'll find out.
Word-starved as I was, I devoured it in a few days. It was well worth the time and the price spent. As an amateur, self-taught musician who once thought digital audio software was the greatest thing since sliced bread, Lanier's grasp of digital music is taking me through something of a paradigm shift, and I'm recognizing why I'm yearning for a good, old acoustic guitar. One misgiving I have, though, is that Lanier seems to be seeing sensory reality through a filter of software code, and may possibly be confusing the two at times. I agree that self-expression in the digital world is extremely repressed and compressed, but I don't think that changes our natures as much as Lanier fears. It may just be encouraging bad habits.
Top reviews from other countries
We need more of these enlighteners.
Have carried this book around with me for a couple of years. Just finished it today. Great read and lots to think about. Would not claim to understand all of the points made but food for thought for anyone like myself who spends much time contributing to social networks.
Lanier deals with a long list of concerns he has with recent developments. In fact one of these relates to information being taken out of context e.g. fragments being reused in various social networks. While reviewing the book - and therefore selecting some of the ideas - I suggest that if you think the subject matter is of interest you should read the full book.
The author addresses the subject of 'authorship', referencing a discussion between Kevin Kelly (who postulates that eventually there will be only one book) and John Updike on the subject. His opinion is that authorship is not a priority for the new ideology promoted by the Singularity crowd, the anti-humanist computer scientists, and the promoters of 'digital Maoism' or the 'noosphere'.
Lanier is highly critical of web 2.0 designs which actively demand that people define themselves downwards. Nor is he a fan of Wikipedia, which he sees as (1) a system that removes individual 'points of view' and (2) one that lends itself to 'lazy' search engines serving up its content as the first answer every time.
Lanier also has lower expectations of crowd wisdom than James Surowiecki. The author stresses the need for a combination of collective and individual intelligence. In fact, he would avoid having crowds frame their own questions. He has concerns for a society that risks mob rule as a follow-on from crowd wisdom in its extreme form.
Interestingly, the author claims to be optimistic and to see benefits in technology. But the technology should exist to serve people and to improve the human condition. He seems unconvinced about the benefits of much of the web 2.0 culture and its associated ideology, seeing it lending itself to a winner-takes-all outcome in which the lords of the cloud and search prosper while the creators of cultural experiences work for very little (if anything at all).
He spends a reasonable amount of time looking at modern music and suggesting that we have lost much of the creativity of previous generations, and that in fact much of what we hear is a rehash of previously created music. Later in the book he also references phenotropics (his own approach to programming and development).
Lanier is encouraging everyone to value their own individualism - in this context we are all encouraged to be expressive in our website content, to be reflective and to take more time in preparing blog postings. His concern is that we are devaluing the individual and are at risk of ‘spirituality committing suicide’ as consciousness wills itself out of existence.
He is a long way from accepting the Ray Kurzweil view (the 'Singularity'): that the computing cloud will scoop up the contents of our brains so we can live in virtual reality. While not necessarily signing up to all of his commentary and analysis (e.g. re music), I certainly find myself more aligned with the humanist than the 'noosphere' group.