Amazon.com: Customer Reviews: You Are Not a Gadget: A Manifesto

Format: Paperback
Price: $11.74 + Free shipping with Amazon Prime



Showing 1-10 of 11 reviews (2 star).
on June 3, 2014
A rule of thumb is that it takes ten years to become an expert. Among PhDs and software engineers, there is a common delusion that hard-won expertise in a minuscule area of human knowledge automatically extends to everything under the sun.

Lanier is a name in a creative area of computing, and he is a professional composer and musician. Unfortunately, that did not confer the chops required to write a coherent book on economics, history, cultural analysis, software engineering, or the host of other things he attempts here. So he wings it. The result is a number of real howlers accompanied by less egregious chapters where he obsesses about the color of an elephant's tail without understanding that there is an elephant attached.

Lanier's Method of Operation:
* pick a personal pet peeve
* select some phenomenon of general interest
* fabricate a random cause-effect relation linking the popular phenomenon to his pet peeve.
* ignore any relevant historical, economic or cultural information.

Looking back on that paragraph, I realize that I just described Fox News's MO. Except Fox does it on purpose for millions of dollars. I believe Lanier was just misguided by bilious enthusiasm for his own biases.

The biggest howler? Lanier walks into a hedge fund office and sees a wall of computer monitors. "Aha!" he thinks, "the 'cloud' is responsible for the bank crash."

Now, behind all the hype, the "cloud" is just a more rational way of billing for computer hardware use, supported by a means of running multiple servers safely on one machine. It does allow much smaller businesses to afford hosted "servers," but it explains absolutely nothing about the 2007-2008 financial crash -- or the 15 other "pre-cloud" financial crashes in U.S. history! Lanier was simply looking the wrong way in that hedge-fund office. If, like Michael Lewis, he had ignored the hardware and paid attention to the "wetware," he might have learned something.
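
To put that in concrete terms, here is a toy sketch of what "a more rational way of billing" means -- you pay per instance-hour instead of buying a machine outright. All the rates and usage figures below are invented for illustration; they are not any real provider's pricing:

```python
# Toy illustration of usage-based "cloud" billing.
# Rates and usage figures are invented for this example.

HOURLY_RATE = 0.05           # dollars per instance-hour (hypothetical)
SERVER_PURCHASE_COST = 1200  # dollars to buy a comparable machine outright

def monthly_cloud_bill(instance_hours: float, rate: float = HOURLY_RATE) -> float:
    """Metered billing: pay only for the hours actually used."""
    return instance_hours * rate

# A small business running one modest server 8 hours a day for a month:
hours = 8 * 30
print(f"Cloud bill:   ${monthly_cloud_bill(hours):.2f}/month")
print(f"Owned server: ${SERVER_PURCHASE_COST:.2f} up front, plus upkeep")
```

Nothing in that arithmetic has anything to do with mortgage derivatives, which is the reviewer's point: cheap metered hosting doesn't explain a banking collapse.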

If you want an adult discussion of the crash, see the excellent movie "Inside Job," which will change forever the meaning of the phrase "leading economists" for you. Read Michael Lewis's "The Big Short," which gives you a manager's-eye view of the thing, or "Confidence Men," which gives you the political side going back to Clinton and Larry Summers. Lanier could have at least read Lewis's "Liar's Poker," which was available at the time he was writing this book, or could have simply kept silent until he knew enough not to embarrass himself.

Howler number two: Lanier experiences a lag on his new iPhone. "It must be Linux! I can just feel it!" Funny that every other iPhone user correctly identified the problem as the AT&T network, which could not handle the sudden increase in activity from the new iPhone. iPhone apps phone home promiscuously, but that's not Linux; it's marketing. If Lanier had been equally wrong but less biased, he might have attacked Objective-C, a more immediate software platform than Linux. But Lanier was a Microsoftie, and Linux was his pet peeve.

Google: Lanier seems unaware that capitalism breeds monopolies! But he thinks that Google is a really dangerous one. Lanier forgot to count the horse he rode in on. In the nineties and the oughts, Microsoft was the oppressive monopoly. Unlike Google, MS was always happy to throw users under the bus with software that simply did not run (Windows 3.0, DOS 4.0, Windows Millennium, Vista). They were willing to use illegal means to keep better software off the market (DR-DOS). (With iOS, Apple now throws developers as well as users under the bus.) I don't like monopolies, but Google has contributed a tremendous amount of good free software and free developer cycles to the community, so between Microsoft, Apple, and Google, I'll take Google.

If Lanier had any insight, he would have identified U.S. internet service providers as the monopoly/cartel most dangerous to the average computer user. In most markets in this country, internet providers charge outrageous rates for broadband in order to subsidize their "no-choices" cable TV business. Broadband is a utility: if you are employed you need it, and if you are unemployed you need it even more. You can easily live without cable TV (for PBS, local news, and weather, I get better reception with an antenna). So the cable TV company doubling as an internet provider is a monopoly with devastating consequences and no justification. And it costs us three times as much for half the speed that many European countries enjoy.

Which brings me to open-source software. I have read some reviews here where the review writer is apologizing for giving his review away. Sad, sad. Amazon reviews are probably the greatest innovation in shopping since cash. I am grateful for this community and contribute freely to it. (Contrast Amazon with a really useless shopping site like the iStore/appStore.)

Because of Lanier's bias (or a kool-aid overdose), he doesn't notice that open source software may be the biggest single job creator in our economy. Thousands of tiny businesses exist because of it. The problem with commercial software like Oracle, Windows and Apple software is not so much the up-front cost (well, with Oracle it used to be). It is the ongoing maintenance and management expense due to the poor usability and the opacity of the software itself. The proprietary vendors seem to glory in re-inventing computer science with out-of-date technology masked by really impenetrable nomenclature.

If I run into a bug in open source software that affects only a few users, I can go in and fix it myself right away because I have the source and because the open source community generally evolves solid, readable, pattern-based software. With a commercial program, bugs that affect you will not be fixed in the next release unless you throw tons of extra money at the software vendor -- and maybe not then, because the best and the brightest programmers have already left for more interesting open source projects. Major innovation will not be happening in proprietary software because you can't pre-calculate the bottom line for innovation and all the manpower is required just to keep the legacy stuff from crashing.

By the way, if YOU use open source, do what I do: contribute money -- it's well worth paying for.

Creative Commons: Lanier's cultural history chops are far too weak to understand the Creative Commons concept. He allows that he might be willing to let some of his work be used for free but he would like to control how it is used.

Now, Shakespeare was the greatest mash-up artist of all time (along with Marlowe, Moliere, Aeschylus, Sophocles, Euripides, Ovid, Vergil...). If forced to, the Bard might have grudgingly given a few shillings to the prior authors of everything he borrowed (and he borrowed pretty much everything). But if he had to write Hamlet with Kyd's executor looking over his shoulder, or R&J, Richard II, III or Henry IV,V,VI,VIII with Brooke's or Holinshed's attorneys telling him what he could or could not write, he would surely have packed it in and gone home to make beer.

I believe strongly in the modern concept of paying authors for their work -- even though Shakespeare, Mozart and most other geniuses had to have day jobs. But the intellectual rigidity of our copyright laws (not to mention our libel laws) absolutely guarantees that there will never be another Shakespeare.

Creative Commons is an attempt to restore literature's ability to cross-fertilize (the way classical and folk music used to and visual art still does) by trying to reproduce the open-source software phenomenon there.

My problem with this book is that Lanier did not respect his subject or his readers enough to question his own prejudices or ignorance. You could read Lanier's book as a study of hand-waving and proof by intimidation, but the content is hardly worth the effort. Read "How to Win Every Argument" instead for shorter, more humorous and more self-aware examples of chopped logic.

If you want an example of somebody doing a great job at what Lanier fails at here, read "Freakonomics". (Read it anyway, it's great.)

[Disclosure: the author of this review is an ex-actor/musician and a writer of fiction who has done graduate research in artificial intelligence and has been doing software engineering for about 30 years.]
2 comments | 19 people found this helpful.
on July 6, 2012
Lanier is very good at describing the superficiality of the Web and the triviality of social networks like Twitter and Facebook. He shows how computer programmers embed their faulty philosophical beliefs in their technology and how that in turn affects society. While he is good at showing the symptoms, he is not so good at showing how that happens.

He justly complains about the lack of authorial expertise and primary-source material on Wikipedia. He often refers to the limitations of the MIDI system of replicating tones on a computer, the Linux operating system, and the Web 2.0 architecture, but he doesn't show how they transfer to social practices.

Lanier's book exhibits the same problems he is complaining about. He wanders through many interesting subjects, but he doesn't show how they are connected. In short, like the Web, his work lacks organization.

He brings up the lack of organization on the Web, but he dances around that subject and never takes it on. He fails to realize how central that problem is, not only for the Web, but for the whole field of knowledge and learning.

Because of the binary structure of our brains and nerves, we think and learn in categories. We learn by means of incorporating new information into what we already know. Learning resembles nothing so much as how our bodies metabolize food, transforming foreign matter into our own living tissue. We make something we learn part of us that we can use to learn yet other things.

The whole process of education relies on the sharing and learning of information as it is conceived and organized by experts in their particular fields. All of science and learning going back to Aristotle rests on that principle of organization, incorporating the new into what we already know. Mastering a subject means having a grasp of its organization.

Carl Linnaeus captured that vision in the binomial nomenclature (genus and species) that he used in his 1735 landmark "Systema Naturae." In the whole field of science and knowledge, everything is part of something else. That classical view of knowledge was also embodied in the Dewey Decimal and Library of Congress classification systems in the 19th century and the Universal Decimal Classification in the 20th.

When you are looking for a book in a library, the classification system does two things: 1. it tells you where the book is physically located, and 2. it puts that book and its subject matter in the context of a body of knowledge as organized by experts. It introduces you to other related books on the same shelf as the book you are looking for.

In using a classification system, you gain knowledge by gaining context. Everything in the library, like learning itself, is based on context, organization, and hierarchy.
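
To make those two functions concrete, here is a toy sketch in Python (the call numbers, titles, and shelf layout are invented for illustration, not taken from any real catalog): a classification code both locates a book and surrounds it with related works.

```python
# Toy model of a library classification system (invented call numbers/titles).
# A call number does two jobs: it locates a book, and it places the book
# among related books, so finding one thing shows you its context.

catalog = {
    "006.3 LAN": "You Are Not a Gadget",
    "006.3 KUR": "The Age of Spiritual Machines",
    "006.7 BER": "Weaving the Web",
    "330.9 LEW": "The Big Short",
}

def locate_and_contextualize(call_number: str) -> None:
    shelf_class = call_number.split()[0].rsplit(".", 1)[0]  # e.g. "006"
    print(f"{catalog[call_number]!r} lives at {call_number}")
    print("Nearby on the shelf:")
    for cn, title in sorted(catalog.items()):
        if cn != call_number and cn.startswith(shelf_class):
            print(f"  {cn}: {title}")

locate_and_contextualize("006.3 LAN")
```

A keyword search returns one isolated hit; the classification walk returns the hit plus its neighbors, which is exactly the context a learner needs.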

Early programmers before the Web embraced that view of knowledge. The most popular programs in the 80s and 90s were databases and spreadsheets, which were tools for organizing information. Object-oriented languages and mark-up languages such as SGML embodied that same concept of hierarchical grouping of information. It was not only the best method for storing and finding information but also for learning it (see the sketch below).
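
Here is a small sketch of what that hierarchical grouping looks like in practice, using Python's standard XML library to stand in for an SGML-family parser (the document and its tags are invented for the example):

```python
# A tiny SGML-family (XML) document: every element lives inside a parent,
# so each piece of information carries its context with it.
import xml.etree.ElementTree as ET

doc = """
<library>
  <subject name="computing">
    <book>You Are Not a Gadget</book>
    <book>Weaving the Web</book>
  </subject>
  <subject name="finance">
    <book>The Big Short</book>
  </subject>
</library>
"""

root = ET.fromstring(doc)
for subject in root.findall("subject"):
    for book in subject.findall("book"):
        # The hierarchy tells us what each item is "part of".
        print(f"{book.text} -> part of subject '{subject.get('name')}'")
```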

The problem with organization, however, is that 1. only the human brain can do it, and 2. it is very difficult to do. As the ancients said, "Sapientis est ordinare": it is the mark of the wise to put things in order. It will be a very long time before computers will be able to do that.

About the time the Web started, some programmers were trying to avoid that problem. They proceeded on the popular but unfounded belief that people don't need any help organizing materials as long as they have access to the information. In their fear of authority, they regarded the previous organization of thought by experts as Euro-centric, snobbish, and elitist. They mistakenly felt that technology would eliminate the need for experts and make the task of organization -- pointing out where everything belongs -- unnecessary.

They saw data as discrete, free-floating items in no need of context. They rejected the need for experts to design the organization of knowledge. They promoted digital searches as a better path to finding information but forgot about the need to put it in context.

Asking individuals to do that without the help of others who have already done it is like asking everyone to start from scratch and re-invent the wheel.

The success of the programmers has resulted in the hive mentality and the trivialization of truth that Lanier writes about. People are led to think that bits of information out of context can somehow accumulate into something significant. The creators of Wikipedia continue to hope that the body of information will somehow organize itself without the need of experts. Because of that deficiency, real experts in any subject are hesitant to contribute anything, crippling the whole effort.

The opponents of this childish obsession with technology were the educators, scientists, classifiers, and librarians. They had a different vision of what the Web could be -- a hierarchical structure, like any other body of information such as an encyclopedia or a library. A great window of opportunity was missed when Web programmers went off in a different direction and failed to provide it with a classification system.

It was an opportunity to adopt and refine a system of classification and create something really great for the world. There are still those around who are working on modern systems of classification. Too bad that Tim Berners-Lee and the other developers of the Web did not follow that alternative line. For more information on that possibility, see: [...]
3 comments | 13 people found this helpful.
on December 23, 2014
I picked up the book based on a friend's recommendation. The author did not maintain my interest in the topic, and although there are some interesting concepts, my takeaway from the book was to basically ignore this fellow's theories.
on August 6, 2011
A better title for this book would have been "I Am Not a Gadget." It is centered on personal thoughts, experiences, and apprehensions that can hardly be generalized (at least not convincingly).

This book has the great merit of being creative and making connections between concepts that initially seem remote from each other. It also takes a firm stand against frivolous uses of user-generated content that end up producing more useless noise than quality content. To this extent, it may sound a bit anti-democratic. I adhere to Lanier's position that mediocrity is encouraged these days; however, I wouldn't know (any more than he does) how to address this problem without hindering creativity, innovation, free speech, or emerging phenomena. We know that, throughout history, massive mediocrity has prevailed over selective rationality: we are just going through another of these "human" and barbaric phases.

I must admit that the excessive use of the overly harsh and biased notion of "cybernetic totalism" to designate the "opposing camp" (inclusive of all categories) turned me off, and it was hard to keep adhering to the author's position from then on. Though I attempted to observe the landscape from his point of view, the facts and substance he produces are so rare and shallow that (sorry) I wasn't convinced. Big names such as Turing, Darwin, and Popper are dropped, but more often than not simplistic quotes from these people are used to drive a point, not facts that stem from their research.

The last chapter contains a relevant (although partial) discussion of post-symbolic communication which, in fact, emphasizes the importance of primitive, non-verbal, direct body-language communication. In essence, post-symbolic communication is also pre-symbolic communication.

Overall, this book is more of a philosophical dissertation on value systems than a work upon which one can build knowledge.
13 people found this helpful.
on February 25, 2011
Jaron Lanier invokes totalitarian collectivism in order to discredit contemporary internet culture. While the book contains a worthwhile critique of the corporations and software developers pulling the strings, Lanier conceives of the problem as a threat to middle-class privilege. The social shift toward free content erodes the position of people such as Lanier who've had success selling their ideas in the marketplace. He employs the conventional conservative appeals to make this case, most notably in repeatedly describing the horror of the crowd or mob. Great dudes like Einstein drive history, not the ignorant masses. Lanier's rhetoric here comes out of a long elitist and specifically anti-radical tradition. His defense of intellectual property laws serves the interests of his class but harms the species as a whole. This ideology worries me every bit as much as Ray Kurzweil's -- at least Singularitarianism allows for the possibility of a post-scarcity economy and thus an effective end to capitalism. Lanier longs for perpetual competition, hierarchy, and want.
6 comments | 24 people found this helpful.
on January 3, 2015
This edition is VERY below standard.
on August 25, 2011
I cannot read the tiny print without getting a headache, and my corrected vision is 20/20. I really should have looked at some pages before ordering the book. I will try to resell it for a gift card.
2 comments | 7 people found this helpful.
on May 18, 2010
I have to admit that half of this book went over my head. I picked it up because of Mr. Lanier's criticism of social media, and I completely agree with what he says about that. His ideas about the way "Web 2.0" is dumbing us all down sound right on, and his thoughts about the idea of "lock-in" were very interesting. But I found most of the book to be aimed at a much more tech-savvy audience than me. I had never even heard of the "hive mind" or the "noosphere," and Mr. Lanier seems to suffer from the over-familiarity that too many tech writers develop, where they assume the reader knows more than he or she does. I struggled to understand many of his concepts, especially his ideas on the financial world.

The book is simply written, but his concepts wander, and headings like "Goldingesque Neoteny, Bachelardian Neoteny, and Infantile Neoteny" started to feel daunting. He seems to be obsessed with Wikipedia's influence, which I found weird because I hardly ever use Wikipedia and don't trust most of it, but Lanier acts as though 100% of internet users treat it like the Bible. Also, although I was not sure how I felt (agree or disagree) about all of his ideas, he totally lost my respect when he said that the video game Spore was really great. I found that game to be one of the worst video games ever and a gigantic personal disappointment, so after that I couldn't take anything Lanier said seriously. At the end of the book he went off on a personal tangent about how much he likes cephalopods, which didn't seem to fit in with the rest of the book and felt self-indulgent to me.
1 comment | 19 people found this helpful.
on December 10, 2010
The premise of the book enticed me to get it. I braved through the first few pages and found it choppy at best. Some points were thought-provoking, while others drifted so far that you have to check the cover to make sure you are reading the same book.

I gave up after less than 100 pages.

I won't be too critical, as I have not completed the book; maybe it picks up later on, maybe not.

All I have to say is: it's not for everyone, and in my opinion it would take a dedicated and patient reader to get through the entire book.
1 comment | 8 people found this helpful.
on June 8, 2011
He makes some good points; I just wish he had used "everyday" English to say them.

I wanted to like this book more than I did. The author's difficult-to-follow (for me) prose, along with his habit of alluding to problems with the web, social networks, etc. as they've developed rather than giving more SPECIFIC examples, was bothersome.
4 people found this helpful.