[[TITLE]]
[[SUBTITLE]]
A friend once told us about the day she almost had her daughter vaccinated against measles and then didn’t. It wasn’t anti-vax stuff. She trusted the science. But the nurse mentioned a small chance of a fever after the shot. Our friend pictured herself buckling her feverish, miserable child into the car seat later, because she had “chosen” the shot. She felt the heaviness of that responsibility in her hands and walked out holding the consent form unsigned. Nothing happened that day—no fever, no crying. It felt like the safer choice. A year later a measles outbreak hit the neighboring town. She admitted, “I was more scared of a mistake I’d make than a problem I couldn’t see yet.”
Omission bias is our tendency to judge harmful actions as worse than harmful inactions, even when the outcomes are equally bad or worse. It sneaks in when doing nothing feels morally cleaner or less risky than doing something, and it silently tilts our decisions.
We’re the MetalHatsCats Team, and we’re building a Cognitive Biases app because we want to catch these brain quirks before they cost us health, money, reputation, and time. This article is about omission bias—the gentle brake pedal you don’t notice your foot on until you’re late and the road’s empty.
What is Omission Bias — when doing nothing feels safer than making a mistake — and why it matters
Omission bias makes inaction feel safer, wiser, and more moral than taking action that might backfire. It’s why we’ll accept the risk of a future problem rather than accept responsibility for a decision that could go wrong today. Researchers have shown we judge harm from action more harshly than identical harm from inaction (Ritov & Baron, 1990; Spranca, Minsk, & Baron, 1991). That’s not a thought experiment; you can see it in vaccine uptake, investment inertia, product roadmaps, hiring, and even relationships.
It matters because:
- Inaction has costs. We just don’t get receipts for them. A product we don’t launch never earns, a feature we don’t deprecate keeps draining, an illness we don’t test for keeps growing.
- Inaction feels less blameworthy. If the status quo goes bad, it feels like fate. If our action goes bad, it feels like our fault. So we pick the path that lets us sleep tonight and pay later.
- The world rewards adaptation, not neutrality. Markets, teams, and bodies move. What looks safe on Monday is costly by Friday.
Most of us don’t wake up thinking, “I will avoid action today.” We simply feel a tug to wait for more data. We plan to “decide after the next milestone.” The next milestone arrives, and we are “90% sure” but want to be “sure-sure.” Omission bias isn’t procrastination exactly; it’s a moral tilt. We’ll accept a 30% chance of a big problem in six months to avoid a 5% chance of feeling like the cause of a small problem tomorrow.
Examples
Stories stick better than definitions. Here are cases we’ve seen, with the quiet logic that guided the choice to not choose.
1) The vaccine form
A parent sees two risks: side effects now, disease later. Action (vaccination) creates tangible responsibility: “If my kid has a fever, it’s because I chose the shot.” Inaction splits responsibility with the world: “If they get sick, that’s because disease exists.” Even if the absolute risk is lower with the vaccine, omission bias makes the action feel heavier (Ritov & Baron, 1990).
What breaks it:
- The nurse reframes: “Your choice is between vaccine side-effect risk and the disease risk. Doing nothing is also a decision.” She pairs numbers with stories of outbreaks nearby.
- The parent sets a rule in advance: “We vaccinate on schedule unless medical advice says otherwise.” Pre-commitment pulls weight away from in-the-moment fear.
2) The product manager and the deprecated API
A PM knows they should migrate off a legacy API, but there’s a nonzero chance of breaking flows during the cutover. So they push it a sprint. And another. Six months later, the vendor sunsets the API and the team scrambles with weekend fire drills.
Why omission bias bites:
- Breaking something through your own change is a visible sin of commission.
- Keeping a shaky dependency feels like merely “not doing harm.” No alarms today, no blame today.
What breaks it:
- Treat “action debt” like tech debt. Log “delays” with projected inaction costs. Review them monthly. If inaction cost crosses a threshold (downtime risk, fines), action becomes the default.
- Run a game day to simulate failure. Real-feeling risk makes inaction look riskier.
3) The investor who keeps cash on the sidelines
A new investor waits for the perfect entry. Markets dip, spike, dip. They never buy. Two years pass. They feel safe, because “I didn’t lose money.” They also didn’t gain. Inflation quietly ate 6–10% of purchasing power.
Why omission bias bites:
- Losses from action hurt more than missed gains. People fear regret from a purchase that drops tomorrow. They rarely audit the cost of staying in cash (loss aversion informs this, per Tversky & Kahneman, 1991, though omission bias adds the action/inaction layer).
What breaks it:
- Automatic monthly investments with a split across funds. The decision moves upstream; each month is no longer a full “yes/no.”
- A personal rule: “I measure missed-opportunity cost quarterly.” Write down “Cash that could have been invested in X = Y% foregone.” A rough sketch of that arithmetic follows.
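To make that quarterly audit concrete, here is a minimal Python sketch with invented numbers (the 6% return and 3% inflation figures are assumptions for illustration, not advice). It prices both the gains skipped by waiting and the purchasing power inflation quietly removes.

```python
def missed_opportunity(cash: float, years: float,
                       expected_return: float = 0.06,
                       inflation: float = 0.03) -> dict:
    """Rough audit of idle cash versus a simple invested alternative."""
    invested_value = cash * (1 + expected_return) ** years  # what the cash might have grown to
    real_cash_value = cash / (1 + inflation) ** years        # what the idle cash still buys later
    return {
        "foregone_gain": invested_value - cash,              # gains skipped by not acting
        "purchasing_power_lost": cash - real_cash_value,     # the quiet cost of "doing nothing"
    }

# Illustrative run: two years of 10,000 sitting on the sidelines.
print(missed_opportunity(cash=10_000, years=2))
# Roughly: foregone_gain ~ 1,236 and purchasing_power_lost ~ 574
```

None of this predicts markets; it only turns “I didn’t lose money” into a number you can argue with.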
4) The team that won’t sunset a feature
A B2B company keeps a little-used legacy feature. Support tickets are low, but every new release must accommodate the old code. Devs hate it; customers barely notice. PMs fear uproar if they remove it and a vocal few complain.
Why omission bias bites:
- Action creates noise. Inaction creates drag. Humans fear noise and visible conflict.
- The unknown of removal feels riskier than the known annoyance of maintenance.
What breaks it:
- Flag a deprecation path with dates, migration guides, and proactive calls to top accounts. Offer a replacement. Document savings in cycle time. Share the win.
5) HR and the “almost right” hire
A candidate checks many boxes but raises team concerns. The hiring manager worries turning them down will extend time-to-hire. Instead of acting, the team keeps them as a “temp contractor for now,” postponing the hard call. Friction smolders; good people leave.
Why omission bias bites:
- Rejecting feels like you might be wrong and to blame. Drifting feels safe. No headline.
- Managers fear “firing the wrong person.” They forget that not-hiring/not-firing is also an active choice with effects.
What breaks it:
- Structured hiring rubrics with bright lines and a “no by default unless criteria N are met.” This moves responsibility from a gut-trigger to a standard.
- Timebox: “We decide by Friday at 4 p.m. with these inputs.” Decision deadlines reduce endless deferral.
6) The couple putting off a hard conversation
One partner resents the other’s spending. Each time, they swallow it, promising to “bring it up when the moment’s right.” Months pass; resentment festers. A blowup finally forces the talk, now sitting on a powder keg.
Why omission bias bites:
- Choosing to have the conversation could cause an argument now (visible pain).
- Not choosing just leaks pressure—quietly. The leak doesn’t scream.
What breaks it:
- Ritualized check-ins: “We do 20-minute weekly money talks, same time, same coffee mugs.”
- Script the opener: “I’m not blaming; I want to align. Here’s what I’m seeing.”
7) The hospital that doesn’t implement a checklist
Everyone knows surgical checklists reduce errors. But adopting them means training sessions, pushback from senior staff, and a risk of immediate friction. A director delays. The complication rates don’t spike overnight; they just stay higher than they could be (Gawande popularized surgical checklists; the omission/commission tilt explains adoption drag).
What breaks it:
- A pilot in one unit with metrics. Voice-of-nurse and voice-of-patient stories. A pre-set expansion plan if metrics improve.
- Leadership states: “Maintaining the status quo is a decision; show me its outcomes next month.”
8) Data security patching
A critical patch requires planned downtime. Ops fears user complaints, so they push it to the next window. Two weeks later a breach uses the unpatched exploit. The apology tour is longer than the downtime would have been.
Why omission bias bites:
- “We didn’t cause an outage” feels safer than “we caused a patch outage.”
- The breach costs are probabilistic and later; patch downtime is certain and now.
What breaks it:
- SLOs that include security posture. If CVSS >= 8, patch within 72 hours. No debate.
- Communicate downtime early, provide progress updates, and celebrate zero-day resilience.
9) The city that delays a flood barrier
A town models increasing flood risk but stalls deciding on a barrier that might be an eyesore and cost millions. Each sunny day feels like they chose well. One storm dumps a decade of rain.
Why omission bias bites:
- Voters see an ugly wall they paid for. They don’t see avoided floods.
- Councils fear blame for spending, not blame for weather.
What breaks it:
- Map costs over 30 years: insurance, property values, cleanup. Include lived stories from nearby towns that flooded.
- Frame: “Doing nothing costs X per year on average; the barrier costs Y. We choose one.”
10) The founder who won’t pivot
Metrics flatten. Customers use the product for one small feature, not the core vision. A pivot means admitting the old plan isn’t working. So they “optimize” the landing page for three more months.
Why omission bias bites:
- Action = identity hit: “I was wrong.”
- Inaction lets you stay who you thought you were.
What breaks it:
- Monthly kill/pivot criteria set in advance: “If retention < 15% by date, we pivot to the use-case we see in 60% of sessions.”
- External advisory board that reviews the criteria and enforces them.
How to recognize and avoid it
Omission bias hides inside your words: “Let’s wait until we’re sure.” “The risk is low; why poke the bear?” “We can revisit later.” Look for that special relief that comes from not having to sign your name yet. That relief is information. It says, “I care more about not being the cause than about the total outcome.”
Below is a practical playbook. The goal isn’t reckless action. It’s honest accounting—making inaction compete fairly with action.
1) Make inaction explicit and measurable
- Write both options as actions. “Do A on date X” vs. “Don’t do A; accept cost Y until date Z.” If “do nothing” has a date and a cost, it becomes a real choice.
- Put numbers on the invisible. If you can’t quantify, narrate: “Inaction likely means two more outages per quarter,” “Inaction means I’ll keep resenting you weekly.”
2) Swap the default in your head
Ask: “If the current situation were not the default, would I choose it today?” If the answer is no, the status quo is coasting on omission bias. This question punctures the comfort of “just continuing.”
3) Unbundle fear of regret from risk
Our brains spike on regret from action. Name it: “I’m scared of causing harm, not of harm itself.” Then frame it in expected value:
- Action: small chance of small pain now; lowers big risk later.
- Inaction: no pain now; higher chance of big pain later.
Write those side-by-side and read them out loud to someone who doesn’t care about your pride. A small worked example follows.
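Here is a minimal worked sketch of that side-by-side framing. The probabilities and costs are invented purely for illustration; the point is the comparison, not the numbers.

```python
# Hypothetical estimates; swap in your own. "Cost" can be dollars, hours, or any unit you track.
act = {"p_harm": 0.05, "cost_if_harm": 2_000}     # small chance of small pain now
wait = {"p_harm": 0.30, "cost_if_harm": 20_000}   # higher chance of big pain later

expected_cost_act = act["p_harm"] * act["cost_if_harm"]     # 0.05 * 2,000  = 100
expected_cost_wait = wait["p_harm"] * wait["cost_if_harm"]  # 0.30 * 20,000 = 6,000

print(f"Expected cost of acting:  {expected_cost_act:,.0f}")
print(f"Expected cost of waiting: {expected_cost_wait:,.0f}")
# Waiting feels safer because nothing hurts today, yet its expected cost is 60x higher here.
```

The exercise is crude on purpose. Even rough numbers drag the hidden half of the choice into the light.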
4) Precommit while you’re calm
Make policies while not under pressure:
- “We patch critical vulnerabilities within 72 hours.”
- “We sunset features with <1% usage after 90 days, with two announcements.”
- “We invest 10% of income monthly, regardless of headlines.”
Precommitment narrows room for the bias to sneak in. One way to keep such a rule from being re-litigated in the moment is to encode it where the work happens, as in the sketch below.
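A minimal sketch, assuming a patch-window policy like the first rule above (the severity labels and windows are illustrative, not a standard):

```python
from datetime import datetime, timedelta

# Precommitted while calm: each severity gets a hard patch window. No in-the-moment debate.
PATCH_WINDOWS = {
    "critical": timedelta(hours=72),   # e.g., CVSS >= 8
    "high": timedelta(days=7),
    "default": timedelta(days=30),
}

def patch_deadline(disclosed_at: datetime, severity: str) -> datetime:
    """The deadline falls out of the policy, not out of how the team feels today."""
    return disclosed_at + PATCH_WINDOWS.get(severity, PATCH_WINDOWS["default"])

print(patch_deadline(datetime(2024, 3, 1, 9, 0), "critical"))  # 2024-03-04 09:00:00
```

The value is not the code; it is that the argument happened once, in advance, and the deadline now shows up automatically.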
5) Use reversibility to pick speed
If a decision is reversible and cheap to roll back, bias for action. If it’s one-way and heavy, bias for careful analysis. Most decisions are more reversible than they feel. Create tiny experiments:
- Roll the change out to 5% of users. If metrics degrade by X, roll back.
- Have the hard conversation in a 30-minute block with a planned escape hatch: “Let’s pause and return Sunday if we feel heated.”
6) Run a premortem and an “inaction-mortem”
Premortem: “It’s six months later, we acted, and it went badly. Why?” Then flip it: “It’s six months later, we didn’t act, and it went badly. Why?” Put both lists on the wall. If you only run one side, omission bias wins by default (Klein popularized premortems; the inaction mirror balances the view).
7) Assign an explicit owner and a date
Vague ownership is fertilizer for inaction. Name a person, not a committee, and a date:
- “Elena will decide on TLS migration by Sept 15 after consulting DevOps and Legal.”
If the date moves, log why. A drifting date should feel like a fresh decision, not a ghost.
8) Separate moral stain from responsibility
We conflate “I caused it” with “I’m a bad person.” Create language that allows credit for responsible risk-taking:
- “We choose deliberate action and accept learning pain.”
- “We make the cost of inaction visible and share responsibility for it.”
9) Ask the empty-chair question
Imagine the future user/customer/partner who benefits if you act. Put an empty chair in the room and give it their name. Ask, “What does Jordan (future customer) lose if we don’t act?” It sounds theatrical. It works.
10) Track “action debt”
Alongside tech debt, add an “action debt” column—decisions delayed, with accrued cost. Review monthly. If a card appears three times, escalate it. Seeing the pile turns “wait” from fog into weight.
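As a minimal sketch of what an action-debt card could look like (the 30-day month and the three-deferral escalation rule come from the text; the field names and numbers are assumptions, not a prescribed format):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionDebtCard:
    """One delayed decision, tracked like a tech-debt card."""
    decision: str
    first_deferred: date
    deferral_count: int = 0
    est_monthly_inaction_cost: float = 0.0   # rough estimate, in whatever unit you track

    def accrued_cost(self, today: date) -> float:
        months = max((today - self.first_deferred).days / 30, 0)
        return months * self.est_monthly_inaction_cost

    def needs_escalation(self) -> bool:
        # The rule from above: a card that appears three times gets escalated.
        return self.deferral_count >= 3

# Monthly review: surface what "waiting" has already cost.
card = ActionDebtCard("Migrate off the legacy API", date(2024, 1, 15),
                      deferral_count=3, est_monthly_inaction_cost=4_000)
print(round(card.accrued_cost(date(2024, 7, 15))), card.needs_escalation())  # 24267 True
```

A spreadsheet works just as well; the point is that every deferral gets a number and a reviewer.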
A checklist you can actually use next week
- Write the “don’t act” choice as a dated, costed plan.
- Ask: “If this weren’t the default, would I choose it today?”
- Estimate both harms: caused-by-action vs. allowed-by-inaction.
- Decide reversibility; run a small reversible test if possible.
- Premortem both paths: “Action fails because…” and “Inaction fails because…”
- Precommit to rules where you can (patch times, investment schedules, deprecations).
- Assign a single owner and a deadline; log slippages.
- Put a name in the empty chair and argue their case.
- Review your action debt every month.
- When you feel relief from waiting, write why. Read that reason tomorrow. Decide then.
Stick this list on the wall next to your roadmap. Or your fridge. Or your mirror.
Related or confusable ideas
Omission bias rubs shoulders with a few other gremlins. They often travel together, but they’re not the same.
- Status quo bias: Prefers the current state because it’s the current state. Omission bias adds a moral tint: doing something feels like sin; doing nothing feels neutral. You can love the status quo and not fear action, or fear action even when you hate the status quo.
- Loss aversion: Losses hurt about twice as much as equivalent gains feel good (Tversky & Kahneman, 1991). Omission bias piggybacks on this by making action feel like the source of losses. Inaction frames losses as external, dulling the sting.
- Regret aversion: We avoid choices that could lead to regret. Omission bias shows one way regret aversion plays out: we prefer to regret not acting than to regret acting, even if the outcomes match (Zeelenberg & Pieters, 2007 discuss regret aversion broadly).
- Procrastination: Delaying due to present bias or task aversion. Omission bias is about moral and blame dynamics, not just effort. Some delays are about “ugh, effort”; omission bias is “ugh, responsibility.”
- Analysis paralysis: Overanalyzing and never deciding. Omission bias is one reason analysis feels safe—numbers protect you from being the person who acted. The cure is not always more data; it’s a clean frame that shows inaction as a choice.
- Bystander effect: People don’t help in groups because responsibility diffuses. Omission bias adds an internal story: “I’ll feel worse if I intervene and make it worse.” Both can stack in emergencies.
- Endowment and default effects: You value what you have more than new options, and defaults stick. Omission bias fuels both by making movement feel stained compared to “keeping.”
We call these “the stillness syndicate.” The cure is not reckless motion; it’s honest motion—giving action and inaction equal lighting on the stage.
Wrap-up: Choose your future, not your alibi
We’ve all felt the clean relief of choosing not to choose. No messy side effects today. No angry emails. No awkward talk. The day stays smooth, and nobody can say it was “your fault.” But the world keeps moving whether we move or not. And the cost of holding still usually arrives with interest.
Omission bias isn’t evil; it’s our desire to not cause harm. That desire needs a chaperone: the truth that harm happens from holes we leave unfilled, bugs we don’t patch, and words we don’t say. Responsible lives and teams practice cutting small, smart slices into the unknown and learning fast. They also practice mourning gentle losses like “the story we liked about who we were” and then pivoting anyway.
We’re building a Cognitive Biases app to make this easier in real time—to surface the “do nothing” tug, put numbers and narratives around both paths, and nudge you toward the choice you’ll respect in a month. Until then, tape the checklist up. Next time your gut whispers “wait,” ask it to show its math. Then choose your future, not your alibi.
FAQ
Q: How do I tell if I’m being cautious or biased? A: Ask two questions: “If this weren’t the default, would I choose it today?” and “What is the cost of waiting one unit of time?” If your answers are “no” and “nontrivial,” caution may be shading into omission bias. Caution gathers data and sets a date; bias drifts without a clock.
Q: What if my team punishes failed actions but ignores failed inactions? A: You’ve built an omission-bias factory. Change incentives: measure and celebrate high-quality decisions regardless of outcome. Run blameless postmortems for actions and “no-actions.” Add “avoided costs” to wins. Make inaction costs visible on dashboards.
Q: How do I reduce the fear of making the wrong call? A: Shrink the decision. Can you test on 5% of users? Pilot with one customer? Talk for 15 minutes instead of solving everything in a two-hour showdown? Reversible, smaller choices lower regret risk and train your brain to move.
Q: Isn’t waiting for more information rational? A: Sometimes. Waiting is rational when the value of information exceeds the cost of delay. Write both down. If you can’t quantify, narrate and timebox: “We wait one week for data X. If it doesn’t arrive, we decide anyway.” Open-ended waiting invites bias.
Q: How do I handle stakeholders who always say “let’s not rock the boat”? A: Reframe: “Our options are to act now and risk X, or not act and pay Y over Z months. Which loss do we choose?” Then propose a tiny, safe test. Most boat-worriers accept splashes if the hull stays intact.
Q: Can omission bias be useful? A: Yes. In high-stakes, irreversible situations, a bias toward inaction can prevent rash harm. The trick is to consciously evaluate reversibility and expected value. Use the bias as a brake, not a parking brake.
Q: How can I apply this at home without sounding like a management memo? A: Use warm scripts and small rituals. “I love us enough to have the awkward talk for 15 minutes; timer on?” Or, “Let’s try this for two weeks, then we can roll it back.” Keep it human and time-bound.
Q: What metric can I track to spot omission bias? A: Track “decision cycle time” and “action debt.” How long from identifying an issue to choosing a path? How many decisions delayed more than once? If the numbers creep up, your culture is drifting toward inaction.
Q: How do I keep myself from rationalizing inaction as “prudence”? A: Write a one-paragraph “Inaction Justification” and read it tomorrow. If it still persuades you, fine. If it sounds like fear cosplaying as wisdom, you’ve caught the bias. Adding a trusted peer review helps.
Q: What about legal and compliance contexts where inaction is safer? A: In some regimes, inaction reduces liability. Even then, list the non-legal costs—reputation, safety, employee trust—and consider controlled action with documentation. Safety and dignity beat legal minimalism long-term.
Checklist
- Name the “do nothing” path with a date and a cost.
- Ask: “If this weren’t the default, would I choose it today?”
- Estimate the expected harm of acting and not acting.
- Prefer small, reversible tests; bias for action when rollback is easy.
- Run premortem and inaction-mortem side-by-side.
- Set precommitment rules (patch windows, investment schedules, deprecations).
- Assign single-owner decisions with deadlines; log slippages.
- Put a future beneficiary in the room (empty chair method).
- Review action debt monthly and escalate repeat deferrals.
- Notice the relief of waiting; write it down; re-evaluate tomorrow.
We’re the MetalHatsCats Team. We’re building tools to make the invisible costs of inaction visible. Because doing nothing isn’t neutral—it’s a choice. And you deserve better than a choice you never realized you made.
References (for the curious, not the dogmatic):
- Ritov, I., & Baron, J. (1990). Reluctance to vaccinate: Omission bias and ambiguity.
- Spranca, M., Minsk, E., & Baron, J. (1991). Omission and commission in judgment and choice.
- Tversky, A., & Kahneman, D. (1991). Loss aversion in riskless choice.
- Zeelenberg, M., & Pieters, R. (2007). A theory of regret regulation.
