You Are Not a Gadget: A Manifesto Paperback – February 8, 2011
Amazon Best Books of the Month, January 2010: For the most part, Web 2.0--Internet technologies that encourage interactivity, customization, and participation--is hailed as an emerging Golden Age of information sharing and collaborative achievement, the strength of democratized wisdom. Jaron Lanier isn't buying it. In You Are Not a Gadget, the longtime tech guru/visionary/dreadlocked genius (and progenitor of virtual reality) argues the opposite: that the unfettered--and anonymous--ability to comment results in cynical mob behavior, the shouting-down of reasoned argument, and the devaluation of individual accomplishment. Lanier traces the roots of today's Web 2.0 philosophies and architectures (e.g., he posits that Web anonymity is the result of '60s paranoia), persuasively documents their shortcomings, and provides alternatives to "locked-in" paradigms. Though its strongly stated opinions run against the grain of popular assumptions, You Are Not a Gadget is a manifesto, not a screed; Lanier seeks a useful, respectful dialogue about how we can shape technology to fit culture's needs, rather than the way technology currently shapes us.
A Q&A with Author Jaron Lanier
Question: As one of the first visionaries in Silicon Valley, you saw the initial promise the internet held. Two decades later, how has the internet transformed our lives for the better?
Jaron Lanier: The answer is different in different parts of the world. In the industrialized world, the rise of the Web has happily demonstrated that vast numbers of people are interested in being expressive to each other and the world at large. This is something that I and my colleagues used to boldly predict, but we were often shouted down, as the mainstream opinion during the age of television’s dominance was that people were mostly passive consumers who could not be expected to express themselves. In the developing world, the Internet, along with mobile phones, has had an even more dramatic effect, empowering vast classes of people in new ways by allowing them to coordinate with each other. That has been a very good thing for the most part, though it has also enabled militants and other bad actors.
Question: You argue the web isn’t living up to its initial promise. How has the internet transformed our lives for the worse?
Jaron Lanier: The problem is not inherent in the Internet or the Web. Deterioration only began around the turn of the century with the rise of so-called "Web 2.0" designs. These designs valued the information content of the web over individuals. It became fashionable to aggregate the expressions of people into dehumanized data. There are so many things wrong with this that it takes a whole book to summarize them. Here’s just one problem: It screws the middle class. Only the aggregator (like Google, for instance) gets rich, while the actual producers of content get poor. This is why newspapers are dying. It might sound like it is only a problem for creative people, like musicians or writers, but eventually it will be a problem for everyone. When robots can repair roads someday, will people have jobs programming those robots, or will the human programmers be so aggregated that they essentially work for free, like today’s recording musicians? Web 2.0 is a formula to kill the middle class and undo centuries of social progress.
Question: You say that we’ve devalued intellectual achievement. How?
Jaron Lanier: On one level, the Internet has become anti-intellectual because Web 2.0 collectivism has killed the individual voice. It is increasingly disheartening to write about any topic in depth these days, because people will only read what the first link from a search engine directs them to, and that will typically be the collective expression of the Wikipedia. Or, if the issue is contentious, people will congregate into partisan online bubbles in which their views are reinforced. I don’t think a collective voice can be effective for many topics, such as history--and neither can a partisan mob. Collectives have a power to distort history in a way that damages minority viewpoints and calcifies the art of interpretation. Only the quirkiness of considered individual expression can cut through the nonsense of the mob--and that is the reason intellectual activity is important.
On another level, when someone does try to be expressive in a collective, Web 2.0 context, she must prioritize standing out from the crowd. To do anything else is to be invisible. Therefore, people become artificially caustic, flattering, or otherwise manipulative.
Web 2.0 adherents might respond to these objections by claiming that I have confused individual expression with intellectual achievement. This is where we find our greatest point of disagreement. I am amazed by the power of the collective to enthrall people to the point of blindness. Collectivists adore a computer operating system called LINUX, for instance, but it is really only one example of a descendant of a 1970s technology called UNIX. If it weren’t produced by a collective, there would be nothing remarkable about it at all.
Meanwhile, the truly remarkable designs that couldn’t have existed 30 years ago, like the iPhone, all come out of "closed" shops where individuals create something and polish it before it is released to the public. Collectivists confuse ideology with achievement.
Question: Why has the idea that "the content wants to be free" (and the unrelenting embrace of the concept) been such a setback? What dangers do you see this leading to?
Jaron Lanier: The original turn of phrase was "Information wants to be free." And the problem with that is that it anthropomorphizes information. Information doesn’t deserve to be free. It is an abstract tool; a useful fantasy, a nothing. It is nonexistent until and unless a person experiences it in a useful way. What we have done in the last decade is give information more rights than are given to people. If you express yourself on the internet, what you say will be copied, mashed up, anonymized, analyzed, and turned into bricks in someone else’s fortress to support an advertising scheme. However, the information, the abstraction, that represents you is protected within that fortress and is absolutely sacrosanct, the new holy of holies. You never see it and are not allowed to touch it. This is exactly the wrong set of values.
The idea that information is alive in its own right is a metaphysical claim made by people who hope to become immortal by being uploaded into a computer someday. It is part of what should be understood as a new religion. That might sound like an extreme claim, but go visit any computer science lab and you’ll find books about "the Singularity," which is the supposed future event when the blessed uploading is to take place. A weird cult in the world of technology has done damage to culture at large.
Question: In You Are Not a Gadget, you argue that the idea that the collective is smarter than the individual is wrong. Why is this?
Jaron Lanier: There are some cases where a group of people can do a better job of solving certain kinds of problems than individuals. One example is setting a price in a marketplace. Another example is an election process to choose a politician. All such examples involve what can be called optimization, where the concerns of many individuals are reconciled. There are other cases that involve creativity and imagination. A crowd process generally fails in these cases. The phrase "Design by Committee" is treated as derogatory for good reason. That is why a collective of programmers can copy UNIX but cannot invent the iPhone.
In the book, I go into considerably more detail about the differences between the two types of problem solving. Creativity requires periodic, temporary "encapsulation" as opposed to the kind of constant global openness suggested by the slogan "information wants to be free." Biological cells have walls, academics employ temporary secrecy before they publish, and real authors with real voices might want to polish a text before releasing it. In all these cases, encapsulation is what allows for the possibility of testing and feedback that enables a quest for excellence. To be constantly diffused in a global mush is to embrace mundanity.
From Publishers Weekly
Computer scientist and Internet guru Lanier's fascinating and provocative full-length exploration of the Internet's problems and potential is destined to become a must-read for both critics and advocates of online-based technology and culture. Lanier is best known for creating and pioneering the use of the revolutionary computer technology that he named virtual reality. Yet in his first book, Lanier takes a step back and critiques current digital technology, more deeply exploring the ideas from his famous 2000 Wired magazine article, "One-Half of a Manifesto," which argued against more wildly optimistic views of what computers and the Internet could accomplish. His main target here is Web 2.0, the current dominant digital design concept commonly referred to as open culture. Lanier forcefully argues that Web 2.0 sites such as Wikipedia undervalue humans in favor of anonymity and crowd identity. He brilliantly shows how large Web 2.0–based information aggregators such as Amazon.com—as well as proponents of free music file sharing—have created a hive mind mentality emphasizing quantity over quality. But he concludes with a passionate and hopeful argument for a new digital humanism in which radical technologies do not deny the specialness of personhood. (Jan.)
Copyright © Reed Business Information, a division of Reed Elsevier Inc. All rights reserved.
Top Customer Reviews
Lanier is a name in a creative area of computing, and he is a professional composer and musician. Unfortunately, that did not confer the chops required to write a coherent book on economics, history, cultural analysis, software engineering, or the host of other things he attempts here. So he wings it. The result is a number of real howlers accompanied by less egregious chapters where he obsesses about the color of an elephant's tail without understanding that there is an elephant attached.
Lanier's Method of Operation:
* pick a personal pet peeve
* select some phenomenon of general interest
* fabricate a random cause-effect relation linking the popular phenomenon to his pet peeve.
* ignore any relevant historical, economic or cultural information.
Looking back on that paragraph, I realize that I just described Fox News's MO. Except Fox does it on purpose for millions of dollars. I believe Lanier was just misguided by bilious enthusiasm for his own biases.
The biggest howler? Lanier walks into a hedge fund office and sees a wall of computer monitors. "Aha!" he thinks, "the 'cloud' is responsible for the bank crash."
Now behind all the hype, the "cloud" is just a more rational way of billing for computer hardware use, supported by a means of running multiple servers safely on one machine. It does allow much smaller businesses to afford hosted "servers", but it explains absolutely nothing about the 2006-2008 financial crash -- or the 15 other "pre-cloud" financial crashes in U.S. history! Lanier was simply looking the wrong way in that hedge-fund office. If, like Michael Lewis, he had ignored the hardware and paid attention to the "wetware", he might have learned something.
If you want an adult discussion of the crash, see the excellent movie "Inside Job", which will change forever the meaning of the phrase "leading economists" for you. Read Michael Lewis's "The Big Short", which gives you a manager's-eye view of the thing, or "Confidence Men", which gives you the political side back to Clinton and Larry Summers. Lanier could have at least read Lewis's "Liar's Poker", which was available at the time he was writing this book, or could have simply kept silent until he knew enough not to embarrass himself.
Howler number two: Lanier experiences a lag on his new iPhone. "It must be Linux! I can just feel it!" Funny that every other iPhone user identified the problem correctly as the AT&T network, which could not handle the sudden increase of activity from the new iPhone. iPhone apps phone home promiscuously, but that's not Linux, it's marketing. If Lanier had been equally wrong but less biased, he might have attacked Objective-C, a more immediate software platform than Linux. But Lanier was a Microsoftie, and Linux was his pet peeve.
Google: Lanier seems unaware that capitalism breeds monopolies! But he thinks that Google is a really dangerous one. Lanier forgot to count the horse he rode in on. In the nineties and the oughts, Microsoft was the oppressive monopoly. Unlike Google, MS were always happy to throw users under the bus with software that simply did not run (Windows 3.0, DOS 4.0, Windows Millennium, Vista). They were willing to use illegal means to keep better software off the market (DR-DOS). (With iOS, Apple now throws developers as well as users under the bus.) I don't like monopolies, but Google has contributed a tremendous amount of good free software and free developer cycles to the community, so between Microsoft, Apple, and Google, I'll take Google.
If Lanier had any insight, he would have identified U.S. internet service providers as the monopoly/cartel most dangerous to the average computer user. In most markets in this country, internet providers charge outrageous rates for broadband in order to subsidize their "no-choices" cable TV business. Broadband is a utility: if you are employed you need it, and if you are unemployed you need it even more. You can easily live without cable TV (for PBS, local news, and weather, I get better reception with an antenna). So the cable TV company doubling as an internet provider is a monopoly with devastating consequences and no justification. And it costs us three times as much for half the speed that many European countries enjoy.
Which brings me to open-source software. I have read some reviews here where the review writer is apologizing for giving his review away. Sad, sad. Amazon reviews are probably the greatest innovation in shopping since cash. I am grateful for this community and contribute freely to it. (Contrast Amazon with a really useless shopping site like the iStore/appStore.)
Because of Lanier's bias (or a kool-aid overdose), he doesn't notice that open source software may be the biggest single job creator in our economy. Thousands of tiny businesses exist because of it. The problem with commercial software like Oracle, Windows and Apple software is not so much the up-front cost (well, with Oracle it used to be). It is the ongoing maintenance and management expense due to the poor usability and the opacity of the software itself. The proprietary vendors seem to glory in re-inventing computer science with out-of-date technology masked by really impenetrable nomenclature.
If I run into a bug in open source software that affects only a few users, I can go in and fix it myself right away because I have the source and because the open source community generally evolves solid, readable, pattern-based software. With a commercial program, bugs that affect you will not be fixed in the next release unless you throw tons of extra money at the software vendor -- and maybe not then, because the best and the brightest programmers have already left for more interesting open source projects. Major innovation will not be happening in proprietary software because you can't pre-calculate the bottom line for innovation and all the manpower is required just to keep the legacy stuff from crashing.
By the way, if YOU use open source, do what I do: contribute money -- it's well worth paying for.
Creative Commons: Lanier's cultural history chops are far too weak to understand the Creative Commons concept. He allows that he might be willing to let some of his work be used for free but he would like to control how it is used.
Now, Shakespeare was the greatest mash-up artist of all time (along with Marlowe, Moliere, Aeschylus, Sophocles, Euripides, Ovid, Vergil...). If forced to, the Bard might have grudgingly given a few shillings to the prior authors of everything he borrowed (and he borrowed pretty much everything). But if he had to write Hamlet with Kyd's executor looking over his shoulder, or R&J, Richard II, III, or Henry IV, V, VI, or VIII with Brooke's or Holinshed's attorneys telling him what he could or could not write, he would surely have packed it in and gone home to make beer.
I believe strongly in the modern concept of paying authors for their work -- even though Shakespeare, Mozart and most other geniuses had to have day jobs. But the intellectual rigidity of our copyright laws (not to mention our libel laws) absolutely guarantees that there will never be another Shakespeare.
Creative Commons is an attempt to restore literature's ability to cross-fertilize (the way classical and folk music used to and visual art still does) by trying to reproduce the open-source software phenomenon there.
My problem with this book is that Lanier did not respect his subject or his readers enough to question his own prejudices or ignorance. You could read Lanier's book as a study of hand-waving and proof by intimidation, but the content is hardly worth the effort. Read "How to Win Every Argument" instead for shorter, more humorous and more self-aware examples of chopped logic.
If you want an example of somebody doing a great job at what Lanier fails at here, read "Freakonomics". (Read it anyway, it's great.)
[Disclosure: the author of this review is an ex-actor/musician and a writer of fiction who has done graduate research in artificial intelligence and has been doing software engineering for about 30 years.]
Lanier's personal interest in exotic music fits with his own form of personal revolt against the dehumanizing forces at play. Though laced with insightful snippets, the book shows Lanier "locked" into his own matrix. Lanier fears and believes in a digital future. Its economically and socially revolutionary potential intersects with a growing proportion of humanity. Yet just as Latin (then French) dominated intellectual discourse in the West (and arguably English in a more global sense today), the very breadth of intellectual resources will eventually require the return of specialists and revitalize demand for "original" sources in much the same manner that ideas from classical times became the passion of Renaissance humanists.
Lanier identifies the forces and possible implications of modern information technology. As the law of unforeseen consequences would suggest, Lanier defines a window with extensive potential to remake the world as we know it -- or at least perceive it. However, anticipating future trends (the mantra of human existence) carries the burden of having only the present to project into the future. Lanier's critique of contemporary culture is insightful, but it is also limited in application to a passing generation.
Lanier aspires to pull us away from the dehumanizing abyss of current technology. His own evidence regarding the expanding capacity of computers and information technology generally, however, could as easily prove its own undoing as data evades quantification under the weight of its own quantity. Qualitative analyses (along with an appreciation of music) require a mature intuitive cherry-picking of reliable evidence. Or as previous leading thinkers have noticed, logical constructs preclude non-logical outcomes, which in turn preclude progress beyond what is commonly viewed as reasonable or rational. Or as Einstein once noted, paraphrasing, if you want your children to better understand math, teach them fairy tales. Perhaps Lanier can bridge this gap in his next book.