NomosLogic
The Gap Between Technical Debt and Technical Dishonesty
Tags: tech debt, architecture, system design, software principles, science


Matt Hardy · March 21, 2026 · 5 min read



Technical debt has become one of the most useful — and most abused — phrases in engineering.

Useful because it's real. Every team accumulates it. Every codebase has corners where someone made a pragmatic decision under pressure and left a note that said "we'll come back to this." That's not failure. That's engineering in the real world.

Abused because somewhere along the way it became a blanket term. A way to describe not just the intentional shortcuts we took with full awareness, but the things we built badly and didn't want to admit. The architecture we knew was wrong when we shipped it. The assumption we never tested but told stakeholders we had. The system we called "scalable" in the roadmap meeting and quietly knew wasn't.

That second category isn't technical debt. It's something else. And I think we do the profession a disservice by calling it the same thing.


Debt implies a decision. Dishonesty implies an evasion.

When I think about genuine technical debt, there's always a moment of clarity in the origin story. Someone — usually under time pressure, usually with imperfect information — made a conscious tradeoff. They knew what they were deferring. They could articulate what it would cost to fix it later. The decision was made with eyes open.

Technical dishonesty is different. It's what happens when the uncomfortable truth about a system gets smoothed over in the retelling. When "this will need work" becomes "this is solid for now." When the demo environment gets presented as production-ready. When the estimate gets shaved because the real number would have killed the project — and the team knew it.

I've been in both situations. I've made real tradeoffs and documented them honestly. I've also been in rooms where the truth about a system got quietly edited before it reached the people making decisions. The second thing is harder to talk about, which is probably why we rarely do.


Why it matters more than we admit

Technical debt is manageable. You can inventory it, prioritize it, schedule it. Teams do this successfully all the time.
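To make "inventory, prioritize, schedule" concrete: a debt register doesn't need tooling. A small structured list is enough to keep tradeoffs visible and sortable. This is a hypothetical sketch (the field names, entries, and prioritization heuristic are all illustrative, not a standard), ranking items by their ongoing drag relative to the cost of paying them down:

```python
from dataclasses import dataclass

@dataclass
class DebtItem:
    """One consciously deferred tradeoff, recorded when the decision was made."""
    summary: str
    decided_by: str
    fix_cost: int    # rough effort to pay it down, in days
    interest: int    # ongoing drag it causes per quarter, in days
    path_to_fix: str

# Hypothetical entries for illustration only
register = [
    DebtItem("Hard-coded retry limits in payment client", "checkout team",
             3, 1, "move limits to config service"),
    DebtItem("Single-region database, no failover", "platform team",
             20, 2, "add read replica, then a promote-on-failure runbook"),
]

# One possible heuristic: pay down the debt with the highest
# interest-to-fix-cost ratio first.
for item in sorted(register, key=lambda d: d.interest / d.fix_cost, reverse=True):
    print(f"{item.summary} -> {item.path_to_fix}")
```

The point isn't the ratio; it's that every entry records who made the call, what it costs to carry, and how to get out of it. Dishonest debt, by definition, never makes it into a register like this.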

Technical dishonesty compounds in ways that are much harder to unwind — because it doesn't just live in the codebase. It lives in the mental model that leadership has of the system. Decisions get made on top of it. Roadmaps get built on top of it. Commitments get made to customers on top of it.

And then the system fails in production, or the scaling event exposes the assumption that was never tested, or the acquisition due diligence asks a question nobody prepared an honest answer to — and suddenly you're not just fixing code. You're rebuilding trust. With your team, with your stakeholders, sometimes with your customers.

That's an order of magnitude more expensive than the original problem would have been.


How it happens — and it's not always malicious

This is the part I want to be careful about, because I've seen this framed as a character issue when it's usually a systems issue.

Most technical dishonesty doesn't start with someone deciding to deceive. It starts with pressure. A deadline that doesn't move. A stakeholder who responds badly to uncertainty. A culture where "I don't know yet" reads as incompetence rather than intellectual honesty. A promotion cycle that rewards confident delivery over accurate forecasting.

In those environments, the incentives are all pointed in the wrong direction. The engineer who says "this isn't ready" gets overruled. The architect who flags the assumption gets told to find a way to make it work. Over time, people learn to edit themselves before they ever get to the meeting. The dishonesty gets internalized and starts to feel like just how things are done.

I've watched good engineers develop this habit. I've had to work to unlearn it myself.


Leadership accountability

If you're leading an engineering team and this is happening on your watch, the first question isn't about the engineers. It's about what you've made safe.

Do people on your team feel like they can tell you something is broken without it becoming a performance conversation? Do your standups surface real blockers or curated progress updates? When someone says "I'm concerned about this," what happens next?

The honest answer to those questions tells you more about your technical culture than any architecture review or code audit.

The best engineering cultures I've been part of had one thing in common: telling the truth about the system was easier than hiding it. Not because everyone was naturally virtuous, but because the incentives were set up correctly. Honesty was rewarded. Surprises were treated as information, not failures.

That's a leadership design choice. It doesn't happen by accident.


A simple distinction I've started using

When I'm trying to assess the health of a system — or a team's relationship with their system — I ask one question:

Can you tell me what you know, what you think, and what you don't know yet — and keep those three things clearly separated?

Debt lives in "what we know." It's documented, understood, and has a path to resolution.

Dishonesty lives in the collapse of that distinction. In "what we think" being presented as "what we know." In "what we don't know yet" being quietly omitted from the conversation.

Teams that can hold that separation clearly are teams I trust. Teams that can't — regardless of their technical skill — are teams that will eventually surprise you in ways you don't want to be surprised.
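If you wanted to build that separation into how a team reports status, it can be as simple as refusing to let any claim into the room without one of the three labels attached. A minimal sketch of that convention (the labels and example claims are hypothetical, not any standard framework):

```python
from enum import Enum

class Epistemic(Enum):
    KNOWN = "what we know"              # verified: tested, measured, documented
    BELIEVED = "what we think"          # plausible, but not yet verified
    UNKNOWN = "what we don't know yet"  # an open question, stated out loud

# A status report is claims plus labels; the discipline is that
# no claim appears without a bucket.
report = [
    ("p99 latency is under 200 ms at current load", Epistemic.KNOWN),
    ("the queue will absorb a 10x traffic spike", Epistemic.BELIEVED),
    ("behavior during a full region failover", Epistemic.UNKNOWN),
]

for claim, bucket in report:
    print(f"[{bucket.value}] {claim}")
```

The mechanical version matters less than the habit: once "what we think" has to wear its own label, it can no longer be quietly promoted to "what we know."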


Thirty years in, the systems I respect most are the ones that were built by people who were willing to be honest about their limits. Not because they lacked confidence, but because they understood that honesty about constraints is what makes real engineering possible.

The debt is manageable. It's the evasion that costs you.
