"Technology criticism," the author writes, "should not be left to the Luddites." Jaron Lanier is certainly no Luddite, but in this "manifesto" he blasts the Web 2.0 mentality, highlights long-standing technology lock-ins, and ranges far and wide in his criticisms of the Internet, computing, and the cultures surrounding the two today.
The core of his argument is that the achievements of the Web 2.0 collaborations are neither exciting nor new. "Let's suppose that, back in the 1980s, I had said, 'In a quarter century, when the digital revolution has made great progress and computer chips are millions of times faster than they are now, humanity will finally win the prize of being able to write a new encyclopedia and a new version of UNIX!' It would," he writes, "have sounded utterly pathetic." He's referring to Wikipedia and Linux, two clear successes of collaborative construction. Furthermore, the intellectual work of those thousands of people has been undervalued; in fact, they're unpaid volunteers. The middle classes have spent their hours working without pay to build wonderful constructs for the profit of major companies. Hmmm...as I write this book review, unpaid, with Amazon looking to earn money from selling more copies of this book...
Ranging further across the Web 2.0 field, Jaron notes the prescribed formats of Facebook and Myspace pages, with individuals reduced to lists of favorite books and movies, five options for politics, and six options for relationship status. Other parts look at technology lock-in, with the example of MIDI. It was developed in the early 1980s for keyboard synthesizer control and output, and it reproduces the nuances of a keyboard but not, for example, a violin. It would be hard to get support for a new, broader tool. "A thousand years from now, when a descendant of ours is traveling at relativistic speeds to explore new star systems, she will probably be annoyed by some awful beepy MIDI-driven music to alert her that the antimatter filter needs to be recalibrated."
Well, I certainly don't agree with everything Jaron has to say, even if I do fondly recall the handmade (with blink tags) web pages from before the AOL deluge (the September without end), when the masses discovered the Internet. There's a lot of crap online, but then again, there's a lot of crap everywhere. I can happily share my family photos over Facebook with people who are barely computer literate, and still be critical of the silly lock-ins of the Facebook pages. Lanier is not a Luddite, though: he doesn't want us to smash the digital world, but to criticize it in order to make it better. Nothing wrong with that, whether we agree with his criticism or not.
In his book You Are Not a Gadget: A Manifesto, Jaron Lanier becomes a solitary voice in the wilderness, shouting as loudly as he can that all is not well with the virtual world or with the tools that make the virtual world possible: software and computers. That this book was written by an insider from the world of the Internet should get everyone's attention.
Jaron Lanier is a household name for those who follow the world of computers and virtual reality, and his book is nothing less than a manifesto warning us that there is a dark side to the Internet. Even seemingly innocuous companies such as Facebook and Google, "lords of the cloud," do not escape Lanier's exposé. "Emphasizing the crowd means de-emphasizing individual humans," and that, in the end, leads to "mob" behavior. Utterly true.
As I flipped through the book, the point that resonated most loudly with me was the impact "anonymity" has had on our virtual world (and maybe the real world as well). I can remember visiting a chat room dedicated to "Books and Literature" in 2000 or 2001. As a librarian, I was naturally drawn to a space that I thought would be filled with others like me who had a love of the written word and of good books. Did that assumption backfire? You bet! What I found was a chat area filled with virtual people who wanted to chat about anything but books and literature. When I posted a question about what people were reading or what they thought of a given book, I was torn (virtually) limb from limb. Having served in the military, I have a pretty good operational understanding of foul language, and I'm pretty good at throwing the words around when necessary. However, that this language would be used in that particular venue by people who could remain anonymous was a shock. I'm pretty certain that most of the visitors to that website hadn't read a book in years and had no problem violating the most basic rules of civility. Lanier is correct when he argues that this is not a step in the right direction. (Please forgive this personal observation.)
Obviously I'm a fan of the virtual world. I post reviews online for free (which is another point Lanier makes) but the joy isn't the posting of reviews but in reading the books; real books. What Lanier has to say should be of interest to all of us.
You Are Not a Gadget is written for the ordinary reader with a minimal background in computers. Lanier floats from idea to idea not necessarily fully exploring a point, but instead simply raising an issue and then moving on. Very effective!
I predict that You Are Not a Gadget is destined to become a cultural icon in the future. We now point to books such as Silent Spring by Rachel Carson and I'm OK, You're OK by Dr. Thomas Harris as books that changed society and altered the future. I suspect that You Are Not a Gadget may become that type of signpost.
I highly recommend it.
What Jaron Lanier does is take us up 50,000 feet and allow us to view things with perspective. He says we have been overwhelmed by the unnoticed "lock-in" and simply adjust and reduce ourselves to fit the requirements of online dating, social media, forums, and the software we employ. Web 2.0 is homogenizing humanity, taking us down to the lowest common denominator instead of allowing or encouraging us to bloom in different directions. Everything we now "enjoy" seems to be backward looking - music is sampled and retro, news is criticized mercilessly, but very few are creating it any more, relationships are Tweets...
It sounds like Lanier recommends friends don't let friends communicate via Facebook - they do it on the phone or in person. But the direction we are taking instead reduces interaction, kills creativity, journalism, music, science....it's not as pretty as predicted.
These are truly valuable criticisms, and this is an important, if flawed, book. Flawed because after a hundred-page pounding of logic and evidence, Lanier spends the second hundred pages telling us how wonderful it is to be a scientist and play with humans and cuttlefish. I was particularly annoyed with a gratuitous couple of paragraphs devoted to swearing, in which he says it might be connected to the parts of the brain controlling orifices and obscenity.
Well, to my knowledge, swearing is purely cultural, not physiological. In Quebec, the worst swearing is against the Catholic Church. Translated into English, "Christ Tabernacle" sounds like something W.C. Fields said to skirt the censors. But it's the most vile thing you can say in polite conversation in Montreal. On the other hand, Motherf----r doesn't translate into French at all. And what's any of this got to do with online reductionism? Zilch - is my point. The last 100 pages are full of such diversions.
Others have pointed to other sections they disagree with, and they all seem to occur in the last half of the book. But don't let that deter you, even if it distracted him. The original message is important. People create. Software does not. Software restricts. Don't leave anonymous contributions. Build a creative website of your own design. Probe deeply and uniquely - beyond Wikipedia. Reflect before you blog.
Lanier says our humanity and creativity are being put at risk by the miasma foisted on us by the incredible leveling machine of the internet. Instead of becoming exciting, the internet has become boring. Instead of creating new music, it has assassinated the entire industry. Instead of bringing people together, it lets them off the hook. That's worth exploring, and for about 100 pages, Lanier does a grand job of it.
on January 15, 2010
This is a very interesting book. It's a critique of the "internet culture," which has until now been mostly beyond challenge. The author hits exactly on the key problems of the culture: collectivism, mob mentality, conformity, and the marginalization of the individual. He also hits upon the problem that small decisions made by individuals can lock people into mindsets or patterns of behavior.
It's an excellent book for highlighting the problems of the era. But it doesn't really provide any easy answers about how to change things. And the unfortunate truth is that many of the problems have less to do with technology than with human nature.
The joke of "free" software is that it isn't "free" at all. It always comes with a licence agreement which spells out the duties of the individual to the "collective." The innovation of Linux and its licence over the works that had preceded it was that any additions to Linux belonged to the collective. An individual can't ever own anything.
Wikipedia is even worse. Want to create your own facts or history? Create a web-page where you say something about a particular subject, then quote the webpage as the source for what you want to say on Wikipedia. Suddenly your web page is the equal of any scholarship in the whole of human history.
In pointing to the growth of mob mentality across society and the accompanying anti-intellectual climate, the author has hit upon *the* key philosophical issue of the new century. This is an important and necessary book that deserves to be read.
While my review remains positive, I want to point out one major problem in the book. The account of events on pp. 125-126 is full of misinformation and errors. The LISP machine, in retrospect, was a horrible idea. It died because the RISC and MIPS CPU efforts on the west coast were a much better idea. Putting high-level software (LISP) into electronics was a bad idea.
Stallman's dysfunctional relationship with Symbolics is badly misrepresented. Stallman's licence was not the first or only free software licence. Where Stallman was unique was that his licences are more about enforcing the rights of the collective and claiming the work of others than anything to do with making things free. And often the growth of the so-called culture was driven by personal feuds with the BSD community, with Symbolics, and with anyone who dared touch the holy EMACS editor. Much of the time, the so-called movement seemed more about picking fights and asserting control than anything to do with making things free.
And the irony of Linus Torvalds is that he didn't follow in their footsteps. Stallman and company were driven by flawed collectivism into a massive failed project known as "Hurd". Linus was successful in that he brought an individualist mindset, a simple set of ideas, and the ability to get along with other people to his effort. Linux isn't that way anymore, but the reasons that Linux (with no resources) was successful and the Hurd (with huge resources) was a massive failure present a case study in how collectivism fails.
There have been any number of massive collectivist failures. To list a few: the OSI networking protocols, the Ada programming language, the first generation of microkernel operating systems, OSF/1 (and the OSF in general), any number of initiatives at the IETF.... Things that have tended to be successful over time are things that grew up in secret.
And calling Linux an "antique" was really strange as is the idea that it represents a 1970s mindset. The fact is that all kinds of people have tried new radical designs for operating systems since the 1980s and they have all generally been dismal failures (like Hurd from GNU). And the fact is, many people who worked on such things discovered over time that investing creativity at the lower levels of the system was generally a bad idea. Abstract entities were best created at the higher levels of systems where hardware and operating system would stay out of the way as much as possible.
The first thing that must be said about Jaron Lanier's "You Are Not a Gadget: A Manifesto" is that it is a very intricate book, full of several different arguments and lines of thought. It might be best to say that it is a manifesto containing several submanifestos. His arguments against the current directions in "web 2.0" technology are many and multifaceted, taking us through questions of the effectiveness of capitalism, how culture evolves, whether there can really be "wisdom in crowds," and even the nature of what "human" is.
If we have to sum up the book into an overall point or argument, here's how I'd do it: web technology, which was hoped to lead to vigorous innovation and individualization, has done precisely the opposite. On the consumption side, the idea of the "wisdom of crowds" has made the group (Lanier says "hive mind") more important and more "real" than voices of individuals. On the production side, the internet has led less to innovative production than to the recycling of old ideas in new forms, while making it hard for inventors/pioneers to make a living being creative. (Yes, I know I am missing some things in this description but, as mentioned, Lanier's work is very hard to sum up with concision.)
Lanier believes that there are two big reasons for this. First, we are not using our conception of humanity to drive how we shape technology so much as we are allowing technology to shape how we define humanity. A shining example is our faith in the "wisdom of crowds" as exemplified by our increasing obsession with all things wiki. Lanier reminds us that, in reality, there is no such "wisdom in crowds" because crowds are simply collections of individuals making individual decisions. (I would also add that "wisdom of crowds" is a literal impossibility as wisdom can only happen embodied in a point-of-view, of which a crowd has none.)
Secondly, Lanier believes that innovation may be lagging behind expectations because of our belief in the "information wants to be free" model. Yes, this has benefits, like offering information in a way that is accessible to...well...most. But it has the disadvantage of removing the incentives provided by markets out of a market. Lanier often uses the example of music and art: it was thought that the internet would allow more artists to make livings off of their art by removing the middle-men and allowing artists direct access to consumers. But with so much free content and exponentially increased competition, it is becoming even harder for artists to (a) get noticed in the milieu and (b) make a living off of their creativity.
While Lanier does not directly champion capitalism (he does contemplate its goods and bads), I think it is fair to argue that Lanier is championing a market system as the surest spur to innovation. Here, I must quote him directly: "Why are so many of the more sophisticated examples of code in the online world - like the page-rank algorithm in the top search engines or like Adobe's Flash - the results of proprietary development? Why did the adored iPhone come out of what many regarded as the most closed, tyrannically managed software-development shop on earth? An honest empiricist must conclude that while the open approach has been able to create lovely, polished copies, it hasn't been so good at creating notable originals." Lanier is not against the open source movement (think YouTube) altogether, but he does present good pragmatic arguments as to why it is severely limited.
In a book so rich and varied, I certainly can't say I agree with everything Lanier puts forth. One of the major criticisms I have of the book is that while Lanier sees the internet's failure to meet expectations as a problem with the internet, he never blames the expectations. For example, Lanier bemoans the fact that much music created in the past 15 years (with technology) hasn't been wholly innovative, as he thought it would be. But I would remind him that such whole-cloth innovation has always been rare. Jazz, he says, was innovative, as were the Beatles' experiments with multi-track recording. Why nothing like that now? Well, jazz used the same musical forms and concepts as Dixieland before it and ragtime before that. And the Beatles' multitrack experiments didn't sound THAT different from the rock and roll which preceded them. Similarly, Lanier bemoans the fact that Wikipedia is simply the combination of the existing ideas of the encyclopedia and usenet. Okay, but couldn't it just be that the encyclopedia and usenet were such good ideas that combining them is better than scrapping them and inventing from whole cloth? Long and short, Lanier expected the type of whole-cloth invention out of the internet that never really existed before the internet.
There are several other areas where I think Lanier's arguments are weak (and several places where I think he argues against "straw man" positions held by only a few). I will not get into them as this isn't the place. What I will say is that I cannot recommend this book strongly enough. Even though I am sure everyone will find areas of agreement AND disagreement with Lanier, every reader will think very deeply as a result of what he writes. He is neither a Luddite nor a techno-utopian, neither a reductionist nor a mysterian, and neither a techno-anarchist nor a techno-Maoist. But he is a challenging thinker who deserves to be thought about.
on January 27, 2010
I had never heard of Jaron Lanier before reading this book; I bought it for one reason only - a blurb on the back cover by Lee Smolin (The Life of the Cosmos) - and I am so glad I did.
Essentially, Lanier has written a well-founded criticism of the uses and abuses of technology in the world today. One of the main culprits in Lanier's view is the metaphor that people are computers and that we can ultimately reduce descriptions of both humans and computers to simple processes of information exchange. Lanier rightly believes this metaphor is inherently damaging to people's psyches and to society in general - this is a view I share with Lanier. Some of his targets include the "computationalism" philosophers like Daniel Dennett and Douglas Hofstadter; "eliminative materialists" like Patricia and Paul Churchland; biologically heavy-handed academics like Richard Dawkins and Christof Koch; and the over-the-top Singularity preachers like Ray Kurzweil (The Singularity Is Near: When Humans Transcend Biology) and Vernor Vinge (Rainbows End). There is much missing from these people's reductionist approach - People Are Not Computers (hence the title: You Are Not a Gadget). I praise Lanier for his sensible, pragmatic, and inspiring manifesto on this issue and his call for a more humanistic approach. He must really feel like a lone voice in the wilderness.
Throughout Lanier's work, I couldn't help but be reminded of the general ennui that seems to be sweeping through our culture these days. Lanier has captured the universal angst that some of my other favorite books speak about too (Empire of Illusion: The End of Literacy and the Triumph of Spectacle; Idiot America: How Stupidity Became a Virtue in the Land of the Free; Denialism: How Irrational Thinking Hinders Scientific Progress, Harms the Planet, and Threatens Our Lives; Bright-sided: How the Relentless Promotion of Positive Thinking Has Undermined America; and The Trap: Selling Out to Stay Afloat in Winner-Take-All America). I am definitely adding it to this small collection of books as a reminder to myself to break free from the historical "lock-in" that seems to come as technological niches get filled (think Google, Facebook, Amazon (which I do love), iTunes, MS Windows, et al.). Try something new, Lanier says. Don't simply rely on the Matthew effect, Cumulative Advantage, or what Michael Shermer calls the Bestseller Effect: "It takes only a tiny group of engineers to create technology that can shape the entire future of human experience with incredible speed. Therefore, crucial arguments about the human relationship with technology should take place between developers and users before such direct manipulations are designed. This book is about those arguments."
Some cool new terminology I learned: open-culture, cloud, hive-mind, noosphere, and lulz. I love this book and highly recommend it. Thanks Lee Smolin!
on June 3, 2014
A rule of thumb is that it takes ten years to become an expert. Among PhDs and software engineers, there is a common delusion that hard-won expertise in a minuscule area of human knowledge automatically extends to everything under the sun.
Lanier is a name in a creative area of computing, and he is a professional composer and musician. Unfortunately, that did not confer the chops required to write a coherent book on economics, history, cultural analysis, software engineering, or the host of other things he attempts here. So he wings it. The result is a number of real howlers accompanied by less egregious chapters where he obsesses about the color of an elephant's tail without understanding that there is an elephant attached.
Lanier's Method of Operation:
* pick a personal pet peeve
* select some phenomenon of general interest
* fabricate a random cause-effect relation linking the popular phenomenon to his pet peeve.
* ignore any relevant historical, economic or cultural information.
Looking back on that paragraph, I realize that I just described Fox News's MO. Except Fox does it on purpose for millions of dollars. I believe Lanier was just misguided by bilious enthusiasm for his own biases.
The biggest howler? Lanier walks into a hedge fund office and sees a wall of computer monitors. "Aha!" he thinks, "the 'cloud' is responsible for the bank crash."
Now, behind all the hype, the "cloud" is just a more rational way of billing for computer hardware use, supported by a means of running multiple servers safely on one machine. It does allow much smaller businesses to afford hosted "servers", but it explains absolutely nothing about the 2006-2008 financial crash -- or the 15 other "pre-cloud" financial crashes in U.S. history! Lanier was simply looking the wrong way in that hedge-fund office. If, like Michael Lewis, he had ignored the hardware and paid attention to the "wetware", he might have learned something.
If you want an adult discussion of the crash, see the excellent movie "Inside Job", which will change forever the meaning of the phrase "leading economists" for you. Read Michael Lewis's "The Big Short", which gives you a manager's-eye view of the thing, or "Confidence Men", which gives you the political side back to Clinton and Larry Summers. Lanier could at least have read Lewis's "Liar's Poker", which was available at the time he was writing this book, or could simply have kept silent until he knew enough not to embarrass himself.
Howler number two: Lanier experiences a lag on his new iPhone. "It must be Linux! I can just feel it!" Funny that every other iPhone user identified the problem correctly as the AT&T network, which could not handle the sudden increase of activity from the new iPhone. iPhone apps phone home promiscuously, but that's not Linux, it's marketing. If Lanier had been equally wrong but less biased, he might have attacked Objective-C, a more immediate software platform than Linux. But Lanier was a Microsoftie, and Linux was his pet peeve.
Google: Lanier seems unaware that capitalism breeds monopolies! But he thinks that Google is a really dangerous one. Lanier forgot to count the horse he rode in on. In the nineties and the oughts, Microsoft was the oppressive monopoly. Unlike Google, MS were always happy to throw users under the bus with software that simply did not run (Windows 3.0, DOS 4.0, Windows Millennium, Vista). They were willing to use illegal means to keep better software off the market (DR-DOS). (With iOS, Apple now throws developers as well as users under the bus.) I don't like monopolies, but Google has contributed a tremendous amount of good free software and free developer cycles to the community, so between Microsoft, Apple, and Google, I'll take Google.
If Lanier had any insight, he would have identified U.S. internet service providers as the monopoly/cartel most dangerous to the average computer user. In most markets in this country, internet providers charge outrageous rates for broadband in order to subsidize their "no-choices" cable TV business. Broadband is a utility: if you are employed you need it, and if you are unemployed you need it even more. You can easily live without cable TV (for PBS, local news, and weather, I get better reception with an antenna). So the cable TV company doubling as an internet provider is a monopoly with devastating consequences and no justification. And it costs us three times as much for half the speed that many European countries enjoy.
Which brings me to open-source software. I have read some reviews here where the review writer is apologizing for giving his review away. Sad, sad. Amazon reviews are probably the greatest innovation in shopping since cash. I am grateful for this community and contribute freely to it. (Contrast Amazon with a really useless shopping site like the iStore/appStore.)
Because of Lanier's bias (or a kool-aid overdose), he doesn't notice that open source software may be the biggest single job creator in our economy. Thousands of tiny businesses exist because of it. The problem with commercial software like Oracle, Windows and Apple software is not so much the up-front cost (well, with Oracle it used to be). It is the ongoing maintenance and management expense due to the poor usability and the opacity of the software itself. The proprietary vendors seem to glory in re-inventing computer science with out-of-date technology masked by really impenetrable nomenclature.
If I run into a bug in open source software that affects only a few users, I can go in and fix it myself right away because I have the source and because the open source community generally evolves solid, readable, pattern-based software. With a commercial program, bugs that affect you will not be fixed in the next release unless you throw tons of extra money at the software vendor -- and maybe not then, because the best and the brightest programmers have already left for more interesting open source projects. Major innovation will not be happening in proprietary software because you can't pre-calculate the bottom line for innovation and all the manpower is required just to keep the legacy stuff from crashing.
By the way, if YOU use open source, do what I do: contribute money -- it's well worth paying for.
Creative Commons: Lanier's cultural history chops are far too weak to understand the Creative Commons concept. He allows that he might be willing to let some of his work be used for free but he would like to control how it is used.
Now, Shakespeare was the greatest mash-up artist of all time (along with Marlowe, Moliere, Aeschylus, Sophocles, Euripides, Ovid, Vergil...). If forced to, the Bard might have grudgingly given a few shillings to the prior authors of everything he borrowed (and he borrowed pretty much everything). But if he had to write Hamlet with Kyd's executor looking over his shoulder, or R&J, Richard II, III or Henry IV,V,VI,VIII with Brooke's or Holinshed's attorneys telling him what he could or could not write, he would surely have packed it in and gone home to make beer.
I believe strongly in the modern concept of paying authors for their work -- even though Shakespeare, Mozart and most other geniuses had to have day jobs. But the intellectual rigidity of our copyright laws (not to mention our libel laws) absolutely guarantees that there will never be another Shakespeare.
Creative Commons is an attempt to restore literature's ability to cross-fertilize (the way classical and folk music used to and visual art still does) by trying to reproduce the open-source software phenomenon there.
My problem with this book is that Lanier did not respect his subject or his readers enough to question his own prejudices or ignorance. You could read Lanier's book as a study of hand-waving and proof by intimidation, but the content is hardly worth the effort. Read "How to Win Every Argument" instead for shorter, more humorous and more self-aware examples of chopped logic.
If you want an example of somebody doing a great job at what Lanier fails at here, read "Freakonomics". (Read it anyway, it's great.)
[Disclosure: the author of this review is an ex-actor/musician and a writer of fiction who has done graduate research in artificial intelligence and has been doing software engineering for about 30 years.]
on April 7, 2010
In You Are Not a Gadget, Jaron Lanier proposes that the emergence of web 2.0, cloud computing, and the "hive mind" of humanity are beginning to stifle creativity, individualism, and expression in the human race. He believes a paradigm shift has occurred (and is rapidly continuing to occur) over the last two decades that is reducing our fundamental human-ness. I find his ideas fascinating.
The book starts out speaking about the limitations inherent in our current technology due to lock-in, as many of the protocols and programming languages in use today were written ten, twenty, or thirty years ago. A good example is MIDI. MIDI was created as a simple mimic of a keyboard synthesizer on a computer, but MIDI only specifies certain notes in a limited range (like the keys on a piano). Pick up a saxophone or start to sing and there are many more possible sounds than can be produced using MIDI; the technology is so embedded in everything we do now that it's locked in.
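The piano-key limitation Lanier describes is easy to see in code. The sketch below (a minimal illustration, not MIDI software) uses the standard equal-temperament relationship between MIDI note numbers and frequency: note 69 is A4 at 440 Hz, and each note number is one semitone. Any pitch that falls between semitones, like a singer's glide, gets quantized to the nearest key:

```python
import math

# A MIDI Note On message carries a 7-bit note number (0-127),
# one per equal-tempered semitone -- the piano-key model.
def midi_note_to_freq(note: int) -> float:
    """Frequency in Hz of a MIDI note number (A4 = note 69 = 440 Hz)."""
    if not 0 <= note <= 127:
        raise ValueError("MIDI note numbers are 7-bit: 0-127")
    return 440.0 * 2 ** ((note - 69) / 12)

def nearest_midi_note(freq: float) -> int:
    """Quantize an arbitrary pitch to the nearest MIDI note number."""
    return round(69 + 12 * math.log2(freq / 440.0))

# A voice sliding to 450 Hz sits between two keys; a plain note-on
# can only name the nearest semitone, so the in-between pitch is lost.
print(midi_note_to_freq(69))      # 440.0 (A4)
print(nearest_midi_note(450.0))   # 69 -- snapped back to A4
```

This is the lock-in in miniature: a violin or a voice produces a continuum of pitch, but the message format only has 128 discrete slots to put it in.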
Lanier stands in contrast to proponents of the free/open culture movement; most free culture advocates perceive themselves as rebellious and liberal, but Lanier posits that they are the conservative ones. He makes a great point that many of our best pieces of software have come from closed systems - i.e. the iPhone or Adobe Flash. This quote sums up his view nicely: "If we choose to pry culture away from capitalism while the rest of life is still capitalistic, culture will become a slum." I think a happy medium does exist, but it is currently not present in efforts such as Creative Commons licensing (though it's certainly a start).
"Am I accusing all of those hundreds of millions of users of social networking sites of reducing themselves in order to be able to use the services? Well yes, I am. [...] A real friendship ought to introduce each person to unexpected weirdness in the other. Each acquaintance is an alien, a well of unexplored difference in the experience of life that cannot be imagined or accessed in any way but through genuine interaction. The idea of friendship in database-filtered social networks is certainly reduced from that." Sure, all of that is true, but it all depends on how we define and value various words. I don't consider all of my 1,192 "Facebook friends" to be my close friends in real life; many are people that I've met along the way and simply want to keep in touch with occasionally. I realized shortly after reading that chapter that I was being small. I grew up straddling the analog and digital ages and I know both. Lanier is looking beyond that at future generations that will grow up on Facebook, Twitter, and other web 2.0 networks.
I've found great joy in people I've met on the internet and proceeded to meet in person, many of whom have become great friends. I've been meeting people from online communities for nearly a decade now and it's never been weird or creepy, aside from the middle-school-dance feeling that might occur for the first few minutes. To be fair, Lanier does spend about a page or two praising this result of the web, but I don't think he gives it enough credit.
One point I have to strongly disagree with is Lanier's assertion that musical progress has been greatly slowed and everything is just "retro, retro, retro." He says most people in their 20s can't differentiate between '90s and '00s music. Can you tell me that there's anything from the 20th century that sounds like The Postal Service or The Knife? Those are just two examples off the top of my head, but there's a plethora of original music out there right now that is distinct to our time. I'm also not sure why musical genres and trends have to be spliced into ten-year increments that coincide neatly with decades, but that's just an aside. Musical genres have splintered and there isn't currently an overarching archetype, but I would say that's simply because we have access to so much music and record companies no longer have as much power to set the standard for what is appropriate for the masses. The masses decide for themselves by finding new music on the internet.
I found this book to be a fantastic thought exercise, and it made me take a hard look at my technological worldview. I wish the conclusion were more coherent and less tangential; Lanier goes on to talk about cephalopods for several pages at the end of the book.
My life is steeped in Web 2.0; this review itself will be sent to four different Web 2.0 platforms after I hit the publish button. We as a culture and society have become so engrossed in these platforms that I think it's important to step back and evaluate exactly what it is we're doing. I hope there's a compromise that exists between totally free and open culture and closed systems; I suppose we'll find out.
on July 6, 2012
Lanier is very good at describing the superficiality of the Web and the triviality of social networks like Twitter and Facebook. He shows how computer programmers embed their faulty philosophical beliefs in their technology and how that in turn affects society. While he is good at showing the symptoms, he is not so good at showing how that happens.
He justly complains about the lack of authorial expertise and primary-source material on Wikipedia. He often refers to the limitations of the MIDI system of replicating tones on a computer, the Linux operating system, and the Web 2.0 architecture, but he doesn't show how those limitations transfer to social practices.
Lanier's book exhibits the same problems he is complaining about. He wanders through many interesting subjects, but he doesn't show how they are connected. In short, like the Web, his work lacks organization.
He brings up the lack of organization on the Web, but he dances around that subject and never takes it on. He fails to realize how central that problem is, not only for the Web, but for the whole field of knowledge and learning.
Because of the binary structure of our brains and nerves, we think and learn in categories. We learn by means of incorporating new information into what we already know. Learning resembles nothing so much as how our bodies metabolize food, transforming foreign matter into our own living tissue. We make something we learn part of us that we can use to learn yet other things.
The whole process of education relies on the sharing and learning of information as it is conceived and organized by experts in their particular fields. All of science and learning going back to Aristotle rests on that principle of organization, incorporating the new into what we already know. Mastering a subject means having a grasp of its organization.
Carl Linnaeus captured that vision in the binomial nomenclature (genus and species) that he used in his 1735 landmark "Systema Naturae." In the whole field of science and knowledge, everything is part of something else. That classical view of knowledge was also embodied in the Dewey Decimal and Library of Congress classification systems in the 19th century and the Universal Decimal Classification in the 20th.
When you are looking for a book in a library, the classification system does two things: 1. it tells you where the book is physically located, and 2. it puts that book and its subject matter in the context of a body of knowledge as it is organized by experts. It introduces you to other related books on the same shelf as the book you are looking for.
In using a classification system, you gain knowledge by gaining context. Everything in the library, like learning itself, is based on context, organization, and hierarchy.
Early programmers, before the Web, embraced that view of knowledge. The most popular programs in the '80s and '90s were databases and spreadsheets, which were tools for organizing information. The arrival of object-oriented programming and of markup languages such as SGML was embedded in that same concept of hierarchical grouping of information. It was not only the best method for storing and finding information but also for learning it.
The problem with organization, however, is that 1. only the human brain can do it, and 2. it is very difficult to do. As the ancients said, "Sapientis est ordinare": it is the task of the wise to put things in order. It will be a very long time before computers are able to do that.
About the time the Web started, some programmers were trying to avoid that problem. They proceeded on a popular but unfounded belief that people don't need any help in organizing materials if they only have access to the information. In their fear of authority, they regarded the previous organization of thought by experts as Euro-centric, snobbish, and elitist. They mistakenly felt that technology would eliminate the need for experts and make the task of organization--pointing out where everything belongs--unnecessary.
They saw data as discrete, free-floating items in no need of context. They rejected the need of experts to design the organization of knowledge. They promoted digital searches as a better path to finding information but forgot about the need to put it in context.
Asking individuals to do that without the help of others who have already done it is like asking everyone to start from scratch and re-invent the wheel.
The success of the programmers has resulted in the hive mentality and the trivialization of truth that Lanier writes about. People are led to think that bits of information out of context can somehow accumulate into something significant. The creators of Wikipedia continue to hope that the body of information will somehow organize itself without the need of experts. Because of that fundamental flaw, real experts in any subject are hesitant to contribute anything, crippling the whole effort.
The opponents of this childish obsession with technology were the educators, scientists, classifiers, and librarians. They had a different vision of what the Web could be: a hierarchical structure, like any other body of information such as an encyclopedia or a library. A great window of opportunity was missed when Web programmers went off in a different direction and failed to provide it with a classification system.
It was an opportunity to adopt and improve a system of classification and create something really great for the world. There are still those around who are working on modern systems of classification. Too bad that Tim Berners-Lee and the other developers of the Web did not follow their alternative line. For more information on that possibility, see: [...]
on February 6, 2010
This is the sort of book I normally avoid: a compilation of jargon-filled columns and short bits written for other outlets, mashed into a book. But here's the thing: if you haven't read Lanier's work before, you should give this book a look, if only for this thought-provoking quote on Facebook: "The real customer is the advertiser of the future, but this creature has yet to appear at the time this is being written. The whole artifice, the whole idea of fake friendship, is just bait laid by the lords of the clouds to lure hypothetical advertisers--we might call them messianic advertisers--who might someday show up." So much of the Digital Age is built around making money on things we once never associated with the material world: friendship (Facebook, MySpace), love (eHarmony), and sex (so many porn sites). In some ways, the internet is one big advertising medium, and it has come to control so much of our world, influence our decisions, and dominate our lives. And as someone who works with his hands, I worry a lot about "free content" and the devaluing of craft. This book grapples with that, and a lot more. Recommended for anyone who thinks deeply on these matters, or who wonders where the digital world is leading us, and how we can set a new path.