You Are a Quantitative Analyst (but Likely Not Very Skilled)

June 3, 2025

The Hidden Economist in Every Engineer

Every time you decide whether to refactor code or push a new feature, you’re doing economics.
Software developers and tech leaders make cost-benefit decisions daily, often without realizing it. Should we spend an extra week on automated tests or release now and fix bugs later? That’s a cost-benefit analysis. Do we build a custom solution or use an off-the-shelf service? Another cost-benefit analysis.

In fact, you are a quantitative analyst every time you weigh effort against reward. However, chances are, you’re not doing it as skillfully or intentionally as you could.

In our previous discussion about establishing a “standard of care” in software, we touched on themes of intentionality and professional responsibility. A big part of that is making better decisions guided by evidence and economics.

The truth is that the software industry’s lack of formal decision-making discipline has real consequences. Consider that only 31% of technology projects are considered successful (on time, on budget, and meeting goals). The rest under-deliver, overspend, or outright fail – often due to decisions made (or avoided) along the way (Standish Group. CHAOS Report 2015. The Standish Group International, 2015).

We can do better. To start, we need to embrace our role as analysts and bring cost-benefit thinking to the forefront.

Every Software Decision is an Economic Decision

Whether you call it engineering judgment, gut feeling, or just support triage, what you’re really doing is evaluating costs and benefits. Economics isn’t just for CFOs and MBAs; it’s a daily reality in engineering and product development. Some common scenarios include:

  • Build vs. Buy: Do we invest developer time to build this feature in-house or pay for a third-party product? Here, you’re weighing development costs (time, salaries, opportunity cost) against the benefit of control or saved licensing fees.
  • Speed vs. Quality: Should we rush this release to satisfy a deadline or take extra time to reduce bugs? This weighs the benefit of early delivery against the potential cost of defects in the field. A bug caught after release can cost up to 100x more to fix than if caught in design, a dramatic economic penalty for poor early decision-making (Boehm, Barry W. Software Engineering Economics. Prentice Hall, 1981; and McConnell, Steve. Code Complete, 2nd ed., Microsoft Press, 2004).
  • New Features vs. Tech Debt: Do we implement a cool new feature or spend this sprint paying down technical debt? Here the benefit of shiny new capabilities is weighed against the long-term cost of crappy code. As Ward Cunningham famously put it, “Shipping first-time code is like going into debt… Every minute spent on not-quite-right code counts as interest on that debt” (Ward Cunningham, OOPSLA ’92 Experience Report). In other words, postponing refactoring is taking out a loan that your team will pay back with interest later.

The problem is that many of us make these decisions on autopilot or emotion. We favor new features because they’re visible and fun, undervaluing less visible work like testing or documentation. We underestimate costs (especially future costs) and overestimate benefits (especially short-term ones).

Lessons from History

Throughout history, great engineers, economists, and leaders have wrestled with cost-benefit tradeoffs. Their stories are not only fascinating – they’re instructive for us in software.

World War II Bomber Armor

During WWII, Allied military analysts faced a critical question: where to add armor on bombers to best protect them. Initially, it seemed logical to reinforce the areas where returning planes had the most bullet holes (fuselage, wings, tails). But Hungarian-American statistician Abraham Wald pointed out a fatal flaw in this reasoning. The planes with bullet holes in those areas were the ones that made it home; damage in those spots was survivable. The planes that didn’t come back were likely hit elsewhere – engines, cockpits – where even a single bullet was catastrophic.

Wald famously advised the military to armor the places where there were no bullet holes on the surviving planes. This counterintuitive analysis saved lives. It’s a powerful reminder that gut instinct or “obvious” data can mislead; careful analysis of both what you see and what you don’t see leads to better decisions.

The Netscape Rewrite Debacle

Not all historical lessons are about good decisions. In the late 1990s, Netscape (then a leading browser maker) decided to rewrite its browser code from scratch. This was an attempt to leap ahead by cleaning up years of “messy” code. The result? A disaster.

The complete rewrite took three years, during which time Internet Explorer zoomed ahead and captured the market. When Netscape’s new version finally shipped (built on the open-source Mozilla codebase), it was buggy, slow, and too late; Netscape’s market share had dwindled to almost nothing.

This is a classic case of failing to weigh costs and benefits realistically. The cost (years of no product improvement and delay) far outweighed the uncertain benefit of a cleaner codebase. For developers, it’s a cautionary tale. Sometimes, incremental improvement beats starting over. The quantitative analyst in you should ask, “Can we afford the time this will take? What is the opportunity cost?” Netscape’s team either didn’t ask or ignored the answers – and paid the price.

These examples, from domains as diverse as warfare and the software business, carry a common theme:

Better decision-making comes from consciously applying analysis and learning from data.

In Roman times, as we noted previously, engineers understood tradeoffs intuitively. They focused on infrastructure fundamentals over luxuries because they knew durability yielded the greatest benefit for society. In more recent history, the formal practice of cost-benefit analysis emerged in the 1800s (credited to thinkers like Jules Dupuit in France and later popularized in the U.S. for public works). Over and over, the message is clear: being intentional and quantitative about decisions leads to better outcomes.

Bringing Cost-Benefit Thinking into Software Engineering

How can we apply these lessons to our day-to-day technical work? We don’t need a PhD in economics, but we do need a mindset shift and a few practical habits.

Make Tradeoffs Explicit

Whenever a significant technical decision is on the table (whether it is to adopt a new framework, delay a launch for extra testing, refactor a module, etc.), articulate the costs and benefits explicitly. This can be as simple as a two-column list or a team discussion of pros and cons. The key is to surface the assumptions. How much effort will choice A require? What do we gain from choice B? Writing it down forces clarity. It also injects a bit of objectivity into conversations that can otherwise be driven by who talks loudest. Encourage your team to ask, “What are we giving up by doing this?” and “What do we get, and is it worth it?” This is the essence of opportunity cost and return on investment (ROI) thinking.

Use Data and Estimates

We often have more data than we realize. Do you have historical metrics on how long similar tasks took or the impact a past outage had on revenue? Even rough estimates or industry benchmarks can inform decisions.

For example, knowing that fixing a bug in production might cost 10x more in developer time (not to mention user trust) than fixing it earlier can justify investing in better QA or automated tests now. If you’re debating whether to rewrite a component, look at usage analytics: if only 5% of users use it, perhaps it’s not worth a massive rebuild.

Conversely, if that component handles 50% of your transaction volume, any improvement could yield big benefits. Developers love to optimize code. Channel some of that analytical zeal into optimizing your choices. Make back-of-the-envelope calculations a normal part of your engineering proposals. No need for perfection – even ballpark numbers beat pure guesswork.
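A back-of-the-envelope calculation like the one described above can be as simple as a few lines of arithmetic. The sketch below compares investing in automated tests now against paying to fix escaped bugs later; every number in it (hourly rate, bug counts, the 10x multiplier) is an illustrative assumption, not data from any real project:

```python
# Hypothetical back-of-the-envelope comparison: invest in automated tests
# now, or pay to fix escaped bugs in production later. All figures are
# invented for illustration.

HOURLY_RATE = 100  # assumed fully loaded cost per developer hour, in dollars

def cost_of_testing_now(test_hours):
    """Upfront cost of writing automated tests."""
    return test_hours * HOURLY_RATE

def cost_of_fixing_later(expected_bugs, hours_per_fix, multiplier=10):
    """Expected cost of fixing escaped bugs in production.

    `multiplier` reflects the rough rule of thumb that a production fix
    costs ~10x an early fix (Boehm's data suggests up to 100x).
    """
    return expected_bugs * hours_per_fix * multiplier * HOURLY_RATE

invest = cost_of_testing_now(test_hours=40)                      # $4,000
defer = cost_of_fixing_later(expected_bugs=5, hours_per_fix=2)   # $10,000

print(f"Test now: ${invest:,}  Fix later: ${defer:,}")
print("Testing pays off" if invest < defer else "Shipping now is cheaper")
```

The point is not precision; it’s that even ballpark inputs turn a gut-feel argument into one the whole team can inspect and challenge.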

Learn from “Economic” Frameworks

There are existing frameworks and principles that marry well with software decision-making. One is the Pareto Principle (80/20 rule) – often 20% of the features deliver 80% of the value. A savvy product engineer asks: which work is in that valuable 20%? Prioritize that and be cautious about gold-plating the rest.
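Finding that valuable 20% can itself be a small calculation: rank work by estimated value and stop once you’ve covered most of the total. The sketch below does exactly that with invented feature names and value scores:

```python
# Toy Pareto-style prioritization: rank features by estimated value and
# take the smallest set covering ~80% of total value. Feature names and
# scores are hypothetical.

features = {"export": 40, "search": 25, "dark_mode": 5, "sync": 15,
            "emoji_reactions": 3, "sharing": 12}

def vital_few(feature_values, threshold=0.80):
    """Return features, highest value first, until `threshold` of the
    total estimated value is covered."""
    total = sum(feature_values.values())
    chosen, covered = [], 0
    for name, value in sorted(feature_values.items(), key=lambda kv: -kv[1]):
        if covered >= threshold * total:
            break
        chosen.append(name)
        covered += value
    return chosen

print(vital_few(features))  # ['export', 'search', 'sync'] — 80% of the value
```

Three of six features carry 80% of the estimated value here; the rest are candidates for deferral rather than gold-plating.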

Another concept is sunk cost. Just because we’ve spent five months on a module doesn’t mean we should spend five more if it’s not working out. Economists teach us to ignore sunk costs and make decisions looking forward (only considering future cost vs. future benefit). That’s hard for humans, but it is critical in software projects that might need pivoting.
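The discipline of ignoring sunk costs can be made mechanical: the money and months already spent simply never appear in the comparison. A minimal sketch, with entirely hypothetical figures:

```python
# Sunk-cost-free decision making: only future costs and future benefits
# enter the comparison. The five months already spent on the struggling
# module deliberately appear nowhere. All figures are hypothetical.

def forward_value(future_benefit, future_cost):
    """Net value of an option, looking forward only."""
    return future_benefit - future_cost

# Option A: keep pushing on the struggling custom module
keep_going = forward_value(future_benefit=50_000, future_cost=60_000)  # -10,000

# Option B: pivot to an off-the-shelf component
pivot = forward_value(future_benefit=45_000, future_cost=20_000)       # 25,000

print("pivot" if pivot > keep_going else "keep going")  # pivot
```

Note that the pivot wins even though it delivers less benefit: looking forward, its net value is higher, and the past spend is irrelevant either way.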

Even the idea of technical debt is explicitly an economic metaphor. Taking on a “debt” (quick and dirty coding) incurs interest in the form of slower future development. Using this language with stakeholders can help quantify why paying off that debt (through refactoring or redesign) is a smart investment. It shifts the conversation from “engineering wants to do a cleanup” to “we’ll save X effort long-term and reduce risk by doing this now.”
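That “interest” can even be sketched as compound growth: unpaid debt adds a drag factor to every subsequent sprint’s work. The drag rate below is invented for illustration; in practice it would come from your own velocity metrics:

```python
# Illustrative model of technical-debt "interest": each sprint, unpaid
# debt makes feature work a bit more expensive, compounding over time.
# The drag rate is an assumed parameter, not a measured one.

def cumulative_feature_cost(sprints, base_cost=10.0, drag_per_sprint=0.05):
    """Total cost of shipping one feature per sprint while unpaid debt
    makes each successive sprint `drag_per_sprint` more expensive."""
    total = 0.0
    for sprint in range(sprints):
        total += base_cost * (1 + drag_per_sprint) ** sprint
    return total

no_debt = 10.0 * 12                      # 120 effort-units over a year
with_debt = cumulative_feature_cost(12)  # ~159 effort-units

print(f"Debt 'interest' over 12 sprints: {with_debt - no_debt:.1f} effort-units")
```

Framed this way, “we’ll save roughly 40 effort-units next year by refactoring now” is a conversation stakeholders can engage with.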

Consider Second-Order Effects

True cost-benefit thinking goes beyond the immediate impact. It asks, “And then what?” Sure, deploying quickly wins customers now, but then what? Will we spend the next year fixing scalability issues (at the cost of missing other opportunities)? Or if we postpone a feature, what is the cost in user goodwill or lost market share? Professionals take a holistic view.

One historical example: in the 1960s, NASA managers decided to use an all-up testing approach for the Saturn V rocket – testing all stages together rather than one at a time – which was riskier upfront but dramatically sped up the timeline (benefit: making Kennedy’s moonshot deadline). They looked past the immediate risk to see the broader benefit of saving years of development.

In software, we similarly need to weigh short-term vs. long-term effects. This is where intentionality and professional responsibility come in: sometimes the “best” decision on paper involves saying no to a shortcut that would harm the product’s reputation or yes to an investment that doesn’t pay off until next year but will delight users or avert a security fiasco.

Toward Intentional and Accountable Engineering

The beauty of embracing your role as a (budding) quantitative analyst is that it leads to more intentional engineering. Instead of stumbling forward and hoping for the best, you take ownership of outcomes. Being deliberate about costs and benefits doesn’t make you robotic or stifle creativity. Instead, it grounds your innovation in reality. It means you can go to your stakeholders, clients, or team and provide a rationale for why a certain approach is best, not just “I have a feeling this is right.” That’s empowering! It elevates our craft closer to the level of mature engineering disciplines that have long used calculations and standards to inform decisions.

Importantly, better decision-making is learnable and contagious. You can start small. At your next sprint planning or architecture review, infuse a bit of cost-benefit thinking. Ask the questions others aren’t asking. Over time, you’ll find that teams begin to catch on. It creates a culture where choices are made with eyes open. And when mistakes happen (they still will; this isn’t a magic wand), the post-mortems become richer learning opportunities because you can trace back which assumptions or estimates were off.

In software product development, our raw materials (time, talent, computing resources) are precious and limited. Guiding our decisions with economic thinking ensures those resources are used wisely to maximize value. It’s a professional responsibility we should embrace. Lives and livelihoods increasingly depend on the systems we build. The hopeful news is that by sharpening our cost-benefit skills, we can dramatically improve project outcomes. We can deliver software that not only works but delivers real value without runaway costs or nasty surprises.

Let’s hold ourselves to this higher standard of intentional decision-making. By doing so, we’ll not only avoid the proverbial collapsed bridges of our industry but also build stronger, more reliable products that stand the test of time to the benefit of everyone.

Better decision-making is achievable, and it’s transformative. Each thoughtful choice we make is a step toward a more disciplined, trustworthy, and ultimately empowering software practice. Let’s get to it.

Want to Build Your Quantitative Muscles?

If you’re serious about making better decisions and building better products because of it, here are a few books that have shaped how I think about cost-benefit tradeoffs, measurement, and intentional engineering:

Bill Udell Chief Operating Officer
Bill is happily creating a groundswell of change and innovation in the Silicon Prairie ecosystem with his buddies at Don’t Panic Labs. He can be found regularly at community events or busily working in coffee shops and boardrooms to create opportunities for collisions of entrepreneurs, community leaders, and the ecosystem.
