A principle is a statement that guides or constrains action. We used three criteria to select computing principles:
1. Universal: The principle arises from taking care of a pervasive concern. Everyone is affected. It is unavoidable. The concern is durable if not permanent.
2. Recurrent: The principle has been encountered repeatedly in many contexts. Different groups have independently discovered it. It is reproducible. It is useful for prediction and design.
3. Broadly Influential: The principle informs and constrains all the technologies and applications of computing. It shapes standard practice; its impact is wide and deep in science, industry, and society.
Although related, these criteria are not the same. The universality criterion says that people everywhere find it relevant to their success. The recurrence criterion says that people in different fields, places, and times are likely to independently recognize the principle. The breadth criterion says that everybody is practicing it, whether they are aware of it or not.
It is tempting but misleading to say that principles are invariant. Over time the interpretation of a principle can change -- for example, the US Constitution is constantly reinterpreted by the courts. Moreover, the set of active principles can change as new principles are discovered and older principles go out of use. A principle may become obsolete, outmoded, or irrelevant. An example of a relatively new principle in computing is the page-ranking scheme of the Google search engine. An example of a reinterpreted principle is that computation is a sequence of states of a representation; the older interpretation of a sequence of instructions executed by a computer is too narrow. An example of an outmoded principle is the guideline for building a vacuum-tube flip-flop.
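The page-ranking idea mentioned above can be sketched as a power iteration over a link graph: a page's rank is the chance a random surfer lands on it, accumulated from the ranks of pages linking to it. The tiny graph, damping factor, and function name below are illustrative assumptions, not Google's actual implementation:

```python
# Minimal PageRank sketch via power iteration.
# The link graph and damping factor are illustrative assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with uniform rank
    for _ in range(iterations):
        # Each page keeps a base share, plus damped shares from inlinks.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)   # split rank across outlinks
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# Ranks sum to ~1; pages with more incoming link weight score higher.
```

In this toy graph, page C receives links from both A and B, so it ends up with the highest rank; the iteration converges because each pass redistributes a fixed total amount of rank.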
Big ideas do not automatically qualify as principles. An idea is a mental construct. It can exist in a single mind. Even if many people accept an idea, it isn't a principle unless it is universal, recurrent, and broadly influential. Moore's Law is an example of a widely held idea that is not a principle. The law says that the number of transistors on a silicon computer chip (and by implication the chip's speed) doubles approximately every 24 months. The idea has been extended to communications bandwidth and memory, with doubling times as short as 12 months. The "law" is an empirical relationship with a limited lifetime, depending on how soon shrinking feature sizes reach a physical limit.
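The doubling relationship described above is simple exponential growth, which can be made concrete with a short calculation; the starting transistor count and time span below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Sketch of the doubling relationship Moore's Law describes.
# The initial count and time span are illustrative assumptions.

def projected_count(initial, months, doubling_period=24):
    """Project a component count forward under periodic doubling."""
    return initial * 2 ** (months / doubling_period)

# A hypothetical 1,000,000-transistor chip, projected 10 years
# (120 months) ahead: 120/24 = 5 doublings, i.e. a factor of 32.
print(projected_count(1_000_000, 120))  # 32000000.0
```

Shortening the doubling period to 12 months, as with the bandwidth and memory variants, doubles the exponent and squares the growth factor over the same span; this sensitivity is one reason the "law" is an empirical observation rather than a principle.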