20 Guiding Principles of Computing
1. Brooks’ Law
Adding More Programmers to a Late Project will make it LATER.
It takes time to bring new people up to speed, and there are group dynamics that slow this process. The most obvious cost of new people is that someone has to explain the project and its procedures to them, and while doing so the teacher is not doing any work, and neither are the students. Until they are up to speed, the new staff produce nothing and divert experienced staff from their work, so the addition is a net loss to the project.
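Part of the overhead is combinatorial: Brooks points out in The Mythical Man-Month that if each part of the task must be coordinated with each other part, intercommunication effort grows as n(n-1)/2. A quick sketch of that growth:

```python
def channels(n):
    """Pairwise communication channels among n team members: n(n-1)/2."""
    return n * (n - 1) // 2

# Doubling a team from 5 to 10 nearly quintuples the coordination paths.
for n in (3, 5, 10, 20):
    print(n, channels(n))
```

The head count grows linearly, but the coordination paths grow quadratically, which is why the newcomers' drag on the veterans outruns their contribution.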
2. Choose the Middle Way
One of the great role models you should look to in your development activities is a young girl who didn’t actually exist. Her name was Goldilocks, of Goldilocks and the Three Bears fame, and although not reported to be a practicing Buddhist, she understood well the Eastern philosophical principle of the Middle Way [An Introduction to Zen Buddhism]. The principle advises avoiding extremes. Imagine her looking at some Entity-Relationship Diagrams (ERDs): “This ERD is too high-level”; “This ERD is too detailed”; “This ERD is ju-u-u-ust right.” Much of software engineering involves finding a middle way between two extremes. For example, you must thoroughly analyze every situation while avoiding analysis paralysis. For specifications, tests, and edits—almost everything—you must copy Goldilocks and get it just right. By following the middle way, you will avoid extremes, as they are rarely the best way, and by finding the middle ground between two competing ideas, you can get the best of both. Another way of stating this principle is:
Bad ideas are just good ideas carried to an extreme.
3. Conservation of Complexity
Simplicity is Complicated
Robert W. Lucky said: “John Naisbitt has observed that the computer is a tool that manages complexity, and as such, just as highways encourage more cars, the computer invites more complexity into society.” [Silicon Dreams: Information, Man, and Machine] For example, the Internal Revenue Code could never have been as complicated as it is without computers to keep track of all the data needed to support it.
Technology was supposed to simplify our lives?!?
4. Embrace Contradiction
True Lies & Other Oxymorons
The First Law of Logic for the software developer: Logic doesn’t always work! Surprisingly enough, Computer Programming teaches empiricism, even intuition, but NOT rationalism. Lucky for me, I’ve had some exposure to Zen concepts, which really helps in this regard.
As a dedicated contrarian, to convince you of the contradictions in life, I will offer in evidence:
¤ The opposite of a Profound Truth is another Profound Truth.
¤ There is No Such Thing as Nothing.
¤ The greatest Truths are told in Fiction.
¤ You can hide in plain sight.
¤ You can kill with kindness.
¤ To remember something, stop trying to remember it.
¤ In AI, the hardest problems are the easiest; the easiest the hardest.
¤ If you want something done, assign it to someone who is busy.
¤ Go slow to go fast.
¤ Whether You Think You Can, or You Think You Can’t, You’re Right.
As a systems analyst you must embrace contradiction. Systems analysis is contradiction. In fact, it’s a riddle inside a conundrum that’s part of a mysterious puzzle. Remember, genius is the ability to hold two completely contradictory opinions at the same time. As with The Force in the Star Wars trilogy, you must learn to trust your intuition, and let your gut guide you when your head can’t.
Beware the nagging question. If you feel uneasy, there’s probably a reason, and in analysis, minor contradictions can lead to major consequences later. If there is something you don’t understand, there must be something wrong. So you must dig and dig until you resolve the issue. You’re in peril if your gut disagrees with your head.
Wait! Isn’t that the opposite of what I just said in the previous paragraph? So what should you do, accept a contradiction and live with it, or not proceed until you eliminate it??? The answer, of course, is “Yes”. Learn to deal with ambiguity. [Zen and the Art of Systems Analysis: Meditations on Computer Systems Development]
5. Gould’s Spandrel
Things are the Way They are because They got that Way.
(For a good reason, for a bad reason, or for no reason at all!)
In Florence and Rome, many cathedrals have murals of astonishing beauty in the spaces between the arches that support the ceiling. The architect’s word for the space between arches is spandrel. Stephen Jay Gould [The Structure of Evolutionary Theory] first suggested, in a 1979 paper with Richard Lewontin, that some biological effects might be like spandrels. Often there are incredible decorations in the spandrel, so it is possible to think “the purpose of a spandrel is to provide a canvas for the artist.” But Gould argues that the spandrel wasn’t intentional; it was a consequence of the arch, and just happened. Likewise, sometimes a gene has several effects, and some features are consequences of other features and not actually selected for.
Hume’s Fallacy: OUGHT cannot be Deduced from what IS.
There is a tendency to think things ought to be as they are. David Hume [An Enquiry Concerning Human Understanding] pointed this out as the fallacy of assuming that whatever is, ought to be. In history, we often assume the actual victors ought to have won. Consider the Second Punic War [Livy, The War with Hannibal]: “Luckily, Hannibal was defeated, and Rome was saved.” But if things had gone differently at Zama, we’d likely say, “Luckily, Scipio was defeated, and Carthage was saved.” If you assume what “is” ought to be, there will never be improvement. In Hume’s time, there were colonies, slaves, peonage, diseases—which people assumed were “right” because they’d been around for so long. The same is true with systems. Many times a feature remains long after the original reason for it is gone.
6. Hofstadter’s Law
Douglas Hofstadter is the author of GEB [Gödel, Escher, Bach: An Eternal Golden Braid] and [Le Ton beau de Marot: In Praise of the Music of Language]. Hofstadter’s Law is given as:
It always takes longer than you expect, even when taking into account Hofstadter’s Law.
His new book, I Am a Strange Loop, is proof positive of that. It was delayed again and again.
It is amazing the complexity of what the human mind can do. Many human experts, using what seem to be incredibly crude methods, do as well as the most sophisticated mathematical techniques running on the most sophisticated products of modern technology. No doubt, someday we will crack the chicken feed problem, if we haven’t already. But the moral of the story is: never underestimate the complexity the human mind can handle. Developing a computer system is always harder than you think it will be, because the uneducated humans are doing a lot more than we well-educated analysts and managers think they are.
7. Occam’s Razor
Keep it Simple, Stupid
Every engineer knows this one: KISS—Keep it simple, stupid. Another word for powerful is complicated. If you don’t have it already, get [Newton’s Telecom Dictionary]. Despite its title, it’s not just about telecom, but about computers and systems generally. As soon as your book comes, look up KISS.
8. Lubarsky’s Law of Cybernetic Entomology
There’s Always One More Bug.
When I was in the fifth grade, I lived at Sewart AFB near Smyrna, Tennessee, and delighted in observing the many varieties of insects that inhabited the streams and fields near my home. And so I decided I would be an Entomologist when I grew up. This was a momentous decision indeed, since it required abandoning my long-held (since early fourth grade) plan to be a Nobel-prize-winning Chemist. But when I read that chemical companies were one of the largest employers of entomologists, I decided my two passions were obviously complementary: I would be an entomologist for a chemical company. It was some months before the awful realization struck: the tie between chemistry and entomology is insecticides, and the goal of most entomologists is to kill as many insects as possible during their lifetimes. As a consequence, I abandoned my plans for both chemistry and entomology, but eventually found a career that involved ruthlessly exterminating bugs without harming any insects—computer programming.
The more you find, the more there are. So get as many people looking for them as you can; as Eric S. Raymond expressed it in what he called Linus’s Law: “Given Enough Eyeballs, All Bugs Are Shallow.” [The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary]
9. There are Many Ways to the Mountaintop
All the One True Ways
After years of research, computer science has finally discovered the one true best methodology—at least a hundred of them, in fact! Each guru will tell you his is the only way, but no way is always right. When asked to compare various methods of attaining enlightenment, the Buddha is said to have remarked: “There are many ways to the Mountaintop.” Some are harder, and some are easier; some are surer, some slipperier. But many ways are valid. You must find your own best way, but one of the first things to learn in the analyst business: You’ve got to be flexible. There is no one correct, or even best, methodology—or technique, or tool, or approach. The rub of course is that although there is no one right way, there are in fact many wrong ways, and the essence of analysis is avoiding them. So this book contains tips and rules of thumb to help you determine which approach to embrace and which to avoid. An idea is a very dangerous thing if you only have one.
10. Moore’s Law
Computing Power Doubles every 18 Months
Moore’s Law postulates that computing power doubles approximately every 18 months, a trend that has held for three decades. Will the trend continue? Current integrated circuit (IC) technology will top out around 2020, and Reed & Tour tell us “Few if any researchers believe that our present technology—semiconductor-based solid-state microelectronics—will lead to circuitry dense and complex enough to give rise to true cognitive abilities.” [Understanding Artificial Intelligence] But Kurzweil thinks Moore’s Law will keep going beyond that: “There are more than enough new computing technologies now being researched, including three-dimensional chips, optical computing, crystalline computing, DNA computing, and quantum computing, to keep the law of accelerating returns going for a long time.” [Are We Spiritual Machines?: Ray Kurzweil vs. the Critics of Strong A.I.] Julian Brown even suggests that other universes of the multiverse might exist to do computing for us. [Minds, Machines, and the Multiverse: The Quest for the Quantum Computer]
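Taking the 18-month doubling period at face value, the compounding is easy to sketch (an illustration of the arithmetic only, not a forecast):

```python
def growth_factor(years, doubling_months=18):
    """Relative computing power after `years`, doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

# Three decades of 18-month doublings is 20 doublings: about a million-fold.
for years in (3, 15, 30):
    print(years, growth_factor(years))
```

Three years buys a factor of four; thirty years buys a factor of roughly a million, which is why no other engineering discipline has anything like it.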
But an important point: whatever might or might not develop in the future, we have plenty of processing power now; it doesn’t seem to be hardware that is holding us back. Whether Moore’s Law continues or not, we haven’t kept up in software development—software is the limiting factor, not hardware.
11. Murphy’s Law
Anything that can Go Wrong, will Go Wrong
Anyone who has programmed much recognizes the truth of Murphy’s Law: “Anything that can go wrong, will go wrong”, and knows to expect the worst. Murphy’s Law is a Law, a useful Law, and yes, even a true Law. It is irrefutably empirically demonstrable that the vast majority of things that can go wrong, do NOT go wrong, but Murphy’s Law is a truth that needs no proof to anyone who has ever worked on a complex system.
The bad news is, many systems people know O’Niel’s Law: “Murphy was an optimist.”
12. No Silver Bullet
Software is Hard
Frederick P. Brooks, Jr.’s essay “No Silver Bullet: Essence and Accident in Software Engineering” was first published in 1986, “refired” in 1995 in [The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition], and appears to be just as applicable today as it was then. In it he discussed whether software development was likely to discover a bullet which, like the silver bullet that could slay the otherwise immortal werewolf, could slay the otherwise immortal “monster of missed schedules, blown budgets, and flawed products”. Brooks’ essay is subtitled “Essence and Accident in Software Engineering”, which requires a little explanation. Brooks is using “accident” in the Aristotelian sense, meaning “nonessential”, not “happening by chance or unintentionally”. For example, my automobile is shiny black, which is not an essential feature, as is proven by the fact that the car was blue when manufactured, and is still a classic 280-Z despite the fact that I had it re-painted. So its color is accidental in the Aristotelian sense, even though the change of color was no accident—it was fully planned, and cost me several thousand dollars. Having wheels, on the other hand, is essential, since without them it wouldn’t be mobile and thus not an auto-mobile.
Basically, Brooks describes analysis as the essential part of systems development, and programming as the accidental part. Since programming has advanced to the stage that it is less than half the problem, if you eliminated it entirely you wouldn’t even double productivity, not to mention get an order-of-magnitude improvement. And virtually all products mentioned as silver bullets have no effect on the essence, just the accident. “Not only are there no silver bullets in view, the very nature of software makes it unlikely there will be any—no inventions that will do for software productivity, reliability, and simplicity what electronics, transistors, and large-scale integration did for computer hardware.” So Brooks feels order-of-magnitude improvements that are normal in hardware are not possible in software.
The development of software to continue the incredible saga of human achievement in computer systems is one of the most stimulating adventures of all time. But I’m not promising a rose garden. It will be tough, as all significant quests are. I’ll give Brooks the last word: “The tar pit of software engineering will continue to be sticky for a long time to come. One can expect the human race to continue attempting systems just within or just beyond our reach; and software systems are perhaps the most intricate of man’s handiworks. This complex craft will demand our continual development of the discipline, our learning to compose in larger units, our best use of new tools, our best adaptation of proven engineering management methods, liberal application of common sense, and a God-given humility to recognize our fallibility and limitations.” Good luck on your journey!
13. Pareto Principle
The 80%-20% Rule
[Or even 90-10]
The Pareto Principle, also known as the 80/20 Rule, is named after Vilfredo Pareto, an Italian economist [The Economics of Vilfredo Pareto], who noted that in many distributions 80 percent of the dividend goes to 20 percent of the observations. For example, 80 percent of the wealth is held by the richest 20 percent of households, 80 percent of the workers are employed by the largest 20 percent of companies, etc. By this reckoning, 80 percent of your problems will be caused by 20 percent of your bugs, or cases, or whatever. And 80 percent of the people who use a software application use only 20 percent of its features.
In many circumstances the 80/20 rule becomes the 90/10 rule: 90 percent of the problems are caused by 10 percent of the items. So the opportunity is to be sure to find all of the 10 to 20 percent of items that will cause most of the problems, and then reduce the other problems as much as you can. Include the critical items you can’t operate without, look inside complicated processes in which the chance of error is high, and consider the past history of problems you've encountered from each system. There is even a book, The 80/20 Principle: The Secret to Achieving More with Less, and several sequels, on the subject, so I guess he didn’t get the critical 80% into the first book.
But remember Joel Spolsky’s caveat: although 80 percent of people use only 20 percent of the features, “Unfortunately, it’s never the same 20 percent. Everybody uses a DIFFERENT set of features.” [Joel on Software: And on Diverse and Occasionally Related Matters That Will Prove of Interest to Software Developers, Designers, and Managers, and to Those Who, Whether by Good Fortune or Ill Luck, Work with Them in Some Capacity]
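As a rough illustration of the principle (the bug counts below are made up), a few lines of Python can find how small a slice of the items accounts for 80 percent of the problems:

```python
def pareto_cutoff(counts, share=0.8):
    """Smallest number of items covering `share` of the total.

    Returns (items_needed, fraction_of_items, fraction_of_total_covered).
    """
    ordered = sorted(counts, reverse=True)
    total = sum(ordered)
    running = 0
    for i, c in enumerate(ordered, start=1):
        running += c
        if running >= share * total:
            return i, i / len(ordered), running / total

# Hypothetical bug counts per module: 3 of 10 modules cause 80% of the bugs.
bugs = [50, 20, 10, 5, 5, 4, 2, 2, 1, 1]
print(pareto_cutoff(bugs))
```

Sorting by impact first is the whole trick: work the top of the list and most of the problem disappears before you’ve touched most of the items.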
14. Parkinson’s Law
Work Expands to Fill the Time Allotted
Parkinson’s Law: work expands to fill the time allotted to it (C. Northcote Parkinson, 1954).
So sometimes you need an arbitrary deadline, or even an unreasonable one.
15. Populations as Individuals
Patrick's Population Principle: Individuals are Individual
You can view communities:
As a super-organism
Or a bunch of Individuals
Or as a Swarm Prey
Sometimes any view will do, but sometimes the distinction becomes important. This explains why organizations do things seemingly against their own interest: the individual’s and the organization’s interests do not always align, as with a manager’s pet project versus the good of the company.
16. Shapiro’s Observation
Technology Changes. Economic Laws do not
Shapiro’s Observation: [Information Rules: A Strategic Guide to the Network Economy] Technology changes. Economic Laws do not. This has also been expressed as the Law of Disruption: “Social systems change incrementally, technology exponentially.” [The Social Life of Information] David Shenk’s Second Law of Data Smog: “Silicon circuits evolve much more quickly than human genes.” [Data Smog: Surviving the Information Glut Revised and Updated Edition]
17. Slingerland’s Law of Fools
No System’s Foolproof, because there’s Always a Bigger Fool
When it is wrong, it should be unmistakably wrong. If there are two ways to do something, make the wrong way impossible if possible. If not, make it obviously wrong.
When you see an error, fix it. (Cf. “If it ain’t broke, don’t fix it.”)
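One way to make the wrong way impossible is to refuse to represent it at all. A minimal sketch, assuming Python; the names Mode and open_log are invented for illustration:

```python
from enum import Enum

class Mode(Enum):
    READ = "r"
    WRITE = "w"

def open_log(path: str, mode: Mode) -> str:
    """Accepting only Mode values makes an invalid mode unrepresentable."""
    if not isinstance(mode, Mode):
        # Fail loudly and immediately, not later and mysteriously.
        raise TypeError("mode must be a Mode")
    return f"{path}:{mode.value}"

print(open_log("app.log", Mode.READ))  # prints "app.log:r"
```

A caller cannot pass a misspelled mode string; the only valid spellings are the enum’s members, and anything else is unmistakably wrong at the call site.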
18. There is No Such Thing as a Free Lunch
This is an observation of Software Engineering Economics, and in fact Economics itself.
Economics is called the Dismal Science, for good reason. It’s about limits: Satisfying unlimited human wants with limited resources. Economics is the study of allocation of scarce resources. And in any competitive business environment, resources will be limited. I was an Economist before I got into the computer business. We were fond of saying: “There is no such thing as a free lunch”, often while we were eating one.
19. Unintended Consequences
1st Law of Ecology: Everything is Connected to Everything Else
Barry Commoner, The Closing Circle: Nature, Man, and Technology.
You can never do just one thing.
Culture + New Technology → New Culture
All systems analysts have experienced this one. Almost inevitably, technological change has effects, good and bad, that were never intended, nor even foreseen, by its creators. Through “Social Adaptation”: when you add a technology to a culture, you don’t get the old culture plus the technology; you get a different culture that includes the new technology. And note the arrow is one way; you can’t go home again.
My theory is that emergence has an evil twin called unintended consequences. For a little humility—remember the best-laid plans of mice and men—you might want to read Edward Tenner’s Why Things Bite Back. The cover, designed by Abby Weintraub with photography by Edward Matalon, shows a cord and plug rearing up to strike, cobra-like. [Why Things Bite Back: Technology and the Revenge of Unintended Consequences] Instead of lots of dumb things doing something smart, we have lots of smart things doing something dumb.
I doubt that any of the original inventors of the technologies that might revolutionize education (computers, the Internet, telephones, VCRs, etc.) set out to revamp the nation’s educational system, but the result may well be an unforeseen educational system very different from today’s.
Consider the airplane. The spread of disease has been linked to air travel. Several Europeans who live near airports have contracted malaria although they had never traveled outside of Europe, and some epidemiologists cite the aircraft as a contributor to the spread of the AIDS virus. The jet airplane is a source of air and noise pollution, and uses tremendous resources, especially non-renewable fuel, which could be put to other uses. The concrete of airports and the necessary clearance of trees that would obstruct flight paths removes plant life, and like the automobile it produces greenhouse gases, but even worse, it delivers them directly into the upper atmosphere. Aircraft accidents, although rare, have killed thousands of people. Some other examples:
The consequences can even be the opposite of what was intended. The paperless office increased paper use. Power de-regulation in California led to a more regulated industry.
Or consider even a seemingly beneficial move, such as a company fixing a compiler bug. Now none of the programs work, because they took advantage of the loose editing the bug allowed.
20. Veblen’s Principle
A Change that Helps Somebody usually Hurts Someone Else
Veblen, The Theory of the Leisure Class.
ALL changes help some people and hurt others.
So technological innovation will probably bring at least some harm to some people, and some of the historically accepted examples of injustice may have had some justice in them. I’m sure most of us have our qualms about where technology is taking us, and whether it’s for good or ill. I’ve thought long and hard and have concluded it is indeed good, and offer three steps in the argument. First, we humans in a technological society live significantly longer than our non-technological counterparts, both past and present. Second, given a choice, the vast majority of people choose the technological lifestyle over the primitive one, and those who reject it find, often to their horror or disgust, that their children embrace it. So although Jacques Ellul argues in The Technological Bluff that while we live longer lives, they are not better lives, we must accept that the technological life is better, since those with a free choice freely choose it. And third, our lives have less drudgery, more entertainment, and are more interesting than they would be without technology. So the bottom line is: I’m glad I was born into a technologically advanced time and place, and those who disagree might turn off the power to their homes for a week and see if it changes their perspective. You’ll face similar quandaries on a smaller scale as a computer systems developer, but I am convinced that on balance, technology improves our lives. [Zen and the Art of Systems Analysis: Meditations on Computer Systems Development]
With A Deep Bow to Funakoshi Gichin & Nakasone Genwa for their inspiration in Karate-do Nijukkajo to sono kaishaku, 1938.
The Twenty Guiding Principles of Karate: The Spiritual Legacy of the Master