Engineering Laws
Brooks’s Law: “Adding manpower to a late software project makes it later.” This captures how onboarding new people to an ongoing project initially decreases productivity.
Murphy’s Law: “Anything that can go wrong, will go wrong.” A reminder to plan for contingencies.
Peter Principle: People in organizations rise to their level of incompetence - meaning employees are promoted based on success in their current role until they reach a position they’re not competent at.
Hofstadter’s Law: “It always takes longer than you expect, even when you take into account Hofstadter’s Law.” This is particularly relevant to project planning and time estimation.
Goodhart’s Law: “When a measure becomes a target, it ceases to be a good measure.” When people optimize for a metric, they often find ways to game it rather than improve the underlying quality.
Conway’s Law: “Organizations design systems that mirror their own communication structure.” The way teams are organized influences the design of the products they create.
Sturgeon’s Law: “90% of everything is crud.” Most output in any field is mediocre, so judge a field by its best work and don’t mistake quantity for quality.
Dunning-Kruger Effect: People with limited knowledge or expertise in a given domain often overestimate their own competence, while experts tend to underestimate theirs.
The Law of Diminishing Returns: After a certain point, additional effort or resources yield progressively smaller benefits.
Price’s Law: The square root of the number of people in a domain does 50% of the work. In a company of 100 people, about 10 people produce half of the results.
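Price’s claim fits in a line of Python - a back-of-the-envelope sketch of the rule of thumb, not an empirical model:

```python
import math

def prices_law_core(n_people: int) -> int:
    """Number of people expected to produce ~50% of the output
    under Price's law: the square root of the headcount."""
    return round(math.sqrt(n_people))

print(prices_law_core(100))     # 10 people do half the work
print(prices_law_core(10_000))  # 100 people do half the work
```

Note how the productive core grows much more slowly than the organization: growing headcount 100x grows the core only 10x.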
Occam’s Razor: Among competing explanations, prefer the one with the fewest assumptions. Don’t multiply complexity unnecessarily.
The Cobra Effect: When an attempted solution makes the problem worse. Named after a failed policy in colonial India where a reward for dead cobras led to people breeding cobras.
Hanlon’s Razor: “Never attribute to malice that which can be adequately explained by stupidity.” Most mistakes come from ignorance rather than ill intent.
The Lindy Effect: The longer something has been around, the longer it’s likely to continue existing. Particularly applicable to technologies and ideas.
The 90-9-1 Rule: In online communities, 90% of users just observe, 9% contribute occasionally, and 1% create most of the content.
Berkson’s Paradox: When a selection process creates misleading correlations. For example, in a dating pool, you might notice smart people tend to be unattractive, but this could just be because the smart attractive people are already taken.
The Matthew Effect: “The rich get richer and the poor get poorer.” Initial advantages compound over time, leading to growing inequalities.
Brandolini’s Law (Bullshit Asymmetry Principle): The amount of energy needed to refute bullshit is an order of magnitude larger than to produce it. This explains why misinformation spreads so easily.
The Ben Franklin Effect: People who do favors for you tend to like you more, not less. Counterintuitively, asking for small favors can build stronger relationships than doing favors.
Zipf’s Law: In many large datasets, the second most common item appears half as often as the most common, the third appears one-third as often, and so on. This pattern appears in everything from word usage to city populations.
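The idealized Zipf pattern is simply frequency proportional to 1/rank, which this minimal sketch illustrates:

```python
def zipf_frequency(rank: int, top_frequency: float = 1.0) -> float:
    """Relative frequency of the item at a given rank under an
    idealized Zipf distribution: top_frequency / rank."""
    return top_frequency / rank

print(zipf_frequency(1))  # 1.0   (the most common item)
print(zipf_frequency(2))  # 0.5   (half as often)
print(zipf_frequency(3))  # ~0.333 (one third as often)
```

Real datasets only approximate this curve, but the 1/rank shape shows up remarkably often.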
The Law of Triviality (Parkinson’s Law of Triviality): Groups spend disproportionate time on trivial issues while neglecting more important ones. Also known as the “bike-shed effect” - a committee might spend more time discussing the color of a bike shed than complex nuclear plant designs.
Gall’s Law: Complex systems that work evolved from simple systems that worked. Complex systems designed from scratch never work and can’t be patched up to make them work.
The Streisand Effect: Attempting to suppress information often leads to it spreading more widely. Named after Barbra Streisand’s attempt to suppress photos of her house, which led to more people seeing them.
Cunningham’s Law: The best way to get the right answer on the internet is not to ask a question; it’s to post the wrong answer. People are more motivated to correct errors than to answer questions.
The Broken Windows Theory: Visible signs of disorder and misbehavior in an environment encourage further disorder. One broken window leads to more broken windows.
The Law of Large Numbers: Large numbers tend to iron out fluctuations and reveal underlying patterns. The more data points you have, the closer your sample average will be to the true population average.
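A quick simulation with dice makes this concrete - the true mean of a fair die is 3.5, and larger samples land closer to it (not guaranteed for any single run, but overwhelmingly likely):

```python
import random

def sample_mean(n_rolls: int) -> float:
    """Average of n fair six-sided die rolls; the true mean is 3.5."""
    return sum(random.randint(1, 6) for _ in range(n_rolls)) / n_rolls

random.seed(42)  # fixed seed so the demo is repeatable
print(sample_mean(10))       # noisy, can be far from 3.5
print(sample_mean(100_000))  # very close to 3.5
```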
Sayre’s Law: “In any dispute, the intensity of feeling is inversely proportional to the value of the stakes at issue.” People fight most bitterly when the least is at stake.
Fitts’s Law: The time to reach a target is determined by the distance to the target and its size. Explains why corner and edge targets (like the ‘X’ to close a window) are easy to hit - the cursor stops at the screen edge, giving them effectively infinite size.
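The law is usually written in its Shannon formulation, T = a + b · log2(D/W + 1), where D is distance to the target, W its width, and a, b are device-dependent constants. A small sketch (the constants here are illustrative, not measured):

```python
import math

def fitts_time(distance: float, width: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Predicted movement time in seconds under Fitts's law
    (Shannon formulation): T = a + b * log2(D/W + 1).
    The constants a and b are made-up illustrative values."""
    return a + b * math.log2(distance / width + 1)

# A far, small target takes noticeably longer than a near, large one:
print(fitts_time(distance=800, width=20))
print(fitts_time(distance=100, width=80))
```

This is why good UIs make frequent targets big and put them near where the cursor already is.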
The Pygmalion Effect: People tend to perform according to the expectations placed on them. High expectations lead to better performance, low expectations to worse.
Campbell’s Law: When a quantitative metric is used for decision-making, it becomes corrupted and distorts the process it’s intended to monitor. Similar to Goodhart’s Law but focuses on social processes.
The Dunbar Number: Humans can maintain only about 150 stable social relationships. Beyond this, group cohesion requires more formal structures.
The Peter Principle’s Corollary (Putt’s Law): Technology is dominated by two types of people - those who understand what they do not manage, and those who manage what they do not understand.
The Ringelmann Effect: Individual performance decreases as group size increases. The larger the group, the less effort each member puts in.
The Law of Conservation of Complexity (Tesler’s Law): Every application has an inherent amount of complexity that cannot be removed, only moved. You can shift it between the user and the developer, but someone always has to deal with it.
The Iron Law of Bureaucracy: In any organization, people who prefer the rules to the organization’s goals will gain control. The means become more important than the ends.
The Law of Demeter: Software components should only talk to their immediate collaborators, not reach through them into strangers’ internals. Also known as the principle of least knowledge.
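A classic illustration is the “two dots” smell (the `Customer`/`Wallet` classes here are the textbook example, not from any particular codebase):

```python
class Wallet:
    def __init__(self, balance: float):
        self.balance = balance

class Customer:
    def __init__(self, balance: float):
        self.wallet = Wallet(balance)

    # Demeter-friendly: the customer exposes the operation itself,
    # so callers never need to know a Wallet exists.
    def pay(self, amount: float) -> None:
        if self.wallet.balance < amount:
            raise ValueError("insufficient funds")
        self.wallet.balance -= amount

def charge_bad(customer: Customer, amount: float) -> None:
    # Talking to a stranger: reaches two objects deep into
    # another object's structure, coupling us to its internals.
    customer.wallet.balance -= amount

def charge_good(customer: Customer, amount: float) -> None:
    # Talks only to its immediate collaborator.
    customer.pay(amount)
```

If `Customer` later stores money differently, `charge_good` keeps working while every `charge_bad`-style caller breaks.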
Zawinski’s Law: Every program attempts to expand until it can read email. Reflects how software tends to grow beyond its original scope until it becomes a communication platform.
The Ninety-Ninety Rule (the Pareto Principle’s evil twin): The first 90% of the task takes 90% of the time, and the remaining 10% takes the other 90%. A cynical take on project timelines.
Muphry’s Law (intentionally misspelled): If you write anything criticizing editing or proofreading, there will be a fault of some kind in what you have written. An extension of Murphy’s Law specific to writing.
The Law of Unintended Consequences: Actions (especially in complex systems) always have effects that were not anticipated. Often these unexpected effects are more impactful than the intended ones.
The Anna Karenina Principle: Success requires many things to go right, while failure can occur from a single flaw. Named after Tolstoy’s line “Happy families are all alike; every unhappy family is unhappy in its own way.”
Postel’s Law (Robustness Principle): “Be conservative in what you do, be liberal in what you accept from others.” Originally about computer systems, but applies to human interactions too.
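In code, Postel’s principle often shows up as a parser that tolerates messy input but emits one canonical output. A minimal sketch (the accepted spellings here are an arbitrary illustrative set):

```python
def parse_flag(value) -> bool:
    """Liberal in what we accept: several common truthy spellings,
    any capitalization, stray whitespace, even non-string input.
    Conservative in what we produce: always a plain bool."""
    return str(value).strip().lower() in {"1", "true", "yes", "on"}

print(parse_flag(" YES "))  # True
print(parse_flag("on"))     # True
print(parse_flag("0"))      # False
```

Note the modern caveat: being too liberal in what you accept can ossify sloppy inputs (see Hyrum’s Law below), so many protocol designers now prefer stricter parsing.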
The Second Law of Consulting: No matter how it looks at first, it’s always a people problem. Technical issues usually mask underlying human/organizational issues.
Eagleson’s Law: Any code you haven’t looked at for six months might as well have been written by someone else. Reflects how quickly we forget the context and reasoning behind our work.
Godwin’s Law: As an online discussion grows longer, the probability of a comparison involving Hitler or the Nazis approaches 1. Long arguments tend to degenerate into extreme analogies.
The Law of Raspberry Jam: The wider you spread it, the thinner it gets. Resources or qualities often degrade when distributed too widely.
Benford’s Law: In naturally occurring collections of numbers, the leading digit is likely to be small. The first digit is 1 about 30% of the time, and it is 9 only about 5% of the time.
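The distribution has a simple closed form: the probability of leading digit d is log10(1 + 1/d). A one-function sketch:

```python
import math

def benford_probability(leading_digit: int) -> float:
    """Probability that the leading digit is d under Benford's law:
    log10(1 + 1/d), for d in 1..9."""
    return math.log10(1 + 1 / leading_digit)

print(round(benford_probability(1), 3))  # 0.301
print(round(benford_probability(9), 3))  # 0.046
```

The nine probabilities sum to exactly 1, and the skew toward small digits is strong enough that Benford’s law is used in practice to screen accounting data for fabricated numbers.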
Amara’s Law: We tend to overestimate the effect of a technology in the short run and underestimate it in the long run. Explains why revolutionary technologies often disappoint initially but exceed expectations later.
The Law of Requisite Variety: The complexity of a control system must match the complexity of the system it’s controlling. You can’t manage a complex system with simple rules.
Chesterton’s Fence: Don’t remove a fence until you understand why it was put up. Reforms should understand the reasoning behind existing structures before changing them.
The Baader-Meinhof Phenomenon (Frequency Illusion): After noticing something for the first time, you start seeing it everywhere. Your brain is just paying attention to something it previously ignored.
Hyrum’s Law: With enough users, all observable behaviors of your system will be depended on by somebody. Even bugs become features if people rely on them.
The Law of Leaky Abstractions: All non-trivial abstractions are leaky. Higher-level systems can’t completely hide the complexity of lower-level ones.
Shirky’s Law: Institutions will try to preserve the problem to which they are the solution. Organizations resist changes that would make them unnecessary.
The Narcissist’s Prayer Principle: In hierarchical systems, blame flows downward while credit flows upward. Success has many fathers, failure is an orphan.
The Curse of Knowledge: Once you understand something, it becomes hard to imagine not understanding it. Makes it difficult for experts to teach beginners effectively.
Wirth’s Law: Software gets slower faster than hardware gets faster. Despite better technology, systems often feel slower because software complexity grows faster than hardware capability.