Data’s Double Edge: Why Spreadsheets are Both a Lifeline and a Possible Lie


AUTHOR: Mark Crothers on 8/11/2025

Can you even begin to imagine the confusion and chaos that follows a major disaster, and the difficulty of communicating and coordinating a response? This is a situation where data management, especially with a simple but powerful tool like a spreadsheet, shows its magic.

One very well-known example that comes to mind is the Space Shuttle Challenger disaster. Facing a daunting amount of data, physicist Richard Feynman and his colleagues on the investigating commission had a hunch about the cause. They used a simple spreadsheet to plot launch temperatures against O-ring damage, creating an easily understandable graph that clearly showed the O-rings growing more brittle as temperatures dropped. This simple act cut through a mass of data that would have taken months to examine, quickly pinpointing the reason for the failure.
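Even without a spreadsheet, the technique described above, lining flights up by temperature with the damage beside them, can be sketched in a few lines. The numbers below are invented for illustration; they are not the actual Challenger flight record:

```python
# Hypothetical (launch temperature in °F, O-ring incident count) pairs.
# These values are made up for demonstration, not the real data set.
flights = [(70, 1), (63, 1), (53, 3), (80, 0), (57, 2), (75, 0)]

# Sorting by temperature and drawing a crude bar chart makes the
# cold-equals-damage pattern jump out, much as the simple plot did.
for temp, incidents in sorted(flights):
    print(f"{temp:3d}°F | {'#' * incidents}")
```

The point is not the tooling but the reordering: once the rows are sorted on the suspect variable, the trend is visible at a glance.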

A bit of online research reveals another great example: the 2014 Ebola outbreak in West Africa. After a shaky start with manual data entry, a spreadsheet-based system was developed that vastly improved the tracing and tracking of infected people. Using data visualization, public health officials were able to identify disease hotspots and enact quarantine zones, which slowed the outbreak to an eventual standstill and saved many lives.

While spreadsheets are powerful for analyzing and presenting data, they share the same vulnerability as handwritten math: they are only as good as the information you put into them. A single mistake, whether a miscopied number in a manual calculation or a flawed assumption in a spreadsheet, can invalidate the entire result.

The danger with spreadsheets, I feel, is that their automated power and polished outputs can hide these errors, creating a false sense of certainty. This is true of everything from public health data to retirement planning, where a small, incorrect input can lead to a dangerously misleading outcome. The usefulness of the output will always depend on the accuracy of the input. In simple terms, it's no different from using the wrong numbers in an equation.
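A quick sketch shows how small that "small, incorrect input" can be in a retirement projection. All figures here are hypothetical, chosen only to illustrate compounding:

```python
# Hypothetical illustration: one mistyped digit in an assumed return
# compounds into a six-figure error over a long projection.

def future_value(balance, annual_return, years):
    """Project a starting balance forward at a constant annual return."""
    return balance * (1 + annual_return) ** years

correct = future_value(500_000, 0.05, 30)  # intended assumption: 5%
typo    = future_value(500_000, 0.06, 30)  # one wrong digit: 6%

print(f"At 5%: ${correct:,.0f}")
print(f"At 6%: ${typo:,.0f}")
print(f"Gap:   ${typo - correct:,.0f}")
```

The spreadsheet happily prints both projections to the dollar; nothing in the output hints that one of them rests on a typo.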

Spreadsheet skeptics, it seems to me, seize on this fundamental characteristic of all “variable-driven” mathematics to dismiss the very real-world utility of a well-designed spreadsheet.

12 Comments
dlc06492
26 days ago

In the late 1960s, we engineers working on large, high-voltage circuit breakers all knew the glass-transition points for the various rubber compounds used in our O-rings. (The rubber hardens, like glass, and doesn’t seal quite as well at lower temperatures, resulting in leaks of air or SF6.)

It still puzzles me that NASA, Morton-Thiokol and Martin Marietta (managers and engineers, except for Roger Boisjoly) did not know the possible outcomes of ignoring this well-known phenomenon when trying to seal highly explosive substances.

Rick Connor
26 days ago
Reply to  dlc06492

My understanding is that Thiokol management had been warned by their engineers, including Boisjoly, about the O-ring risk well before the tragedy. It had become an “acceptable risk” because, despite O-ring issues in the past, nothing catastrophic had happened. Sadly, this seems to be part of human nature – we ignore health issues, maintenance issues, financial issues – until a disaster occurs. My father was a very smart man, yet he smoked for over four decades, despite multiple heart issues. He finally quit in his 60s when my mother had some health issues and he had grandchildren.

My experience working with NASA in the decades after the Challenger tragedy is that they became very cautious, even on unmanned missions. You can never eliminate all the risks of space flight, and at some point you have to decide whether to proceed or not. I worked on robotic missions most of my career, and I never had to make any calls that involved human safety. I hope I would have been strong enough to make the right call.

Adam Starry
26 days ago
Reply to  Rick Connor

True – the Thiokol engineers tried to persuade NASA and Thiokol leadership to delay the launch in a conference call the night before the launch, because they did not have data to support that the seals would work at the unusually low launch temperatures. Leadership decided to go ahead with the launch.

jan Ohara
26 days ago
Reply to  dlc06492

That’s alarming, and especially sad given the loss of lives. Do you have a theory on how that critical oversight happened? I recently watched the documentary “Titan: The OceanGate Disaster,” in which ego-driven disregard of data in pursuit of a myopic goal led to disaster, and I wonder if you think there are any similarities.

David Powell
27 days ago

“While spreadsheets are powerful for analyzing and presenting data, they share the same vulnerability with handwritten math: they are only as good as the information you put into them.”

This is why you:

  1. Always proofread any manually entered data if it’s a small quantity, and use data import for bigger sets.
  2. Use only high-quality historical data sources that deal with common biases.
  3. Create data-quality assertions to cross-check calculated results when you can; these often catch data-entry bugs.
  4. Build safety margin into every plan, because every day things happen that have never happened before.

The alternative to building a plan based on good historical data, forecasts, and safety margin is throwing darts in the dark, praying, and hoping. That rarely ends well.
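Point 3 above is worth making concrete. Here is a minimal sketch of data-quality assertions, using made-up monthly figures; the column names and bounds are purely illustrative:

```python
# Hypothetical monthly budget rows; all numbers are invented.
rows = [
    {"month": "Jan", "income": 4200, "spending": 3100},
    {"month": "Feb", "income": 4200, "spending": 3350},
    {"month": "Mar", "income": 4200, "spending": 2980},
]

total_saved = sum(r["income"] - r["spending"] for r in rows)

# Assertion 1: the total computed row-by-row must equal the same total
# computed column-by-column. A hand-typed cell would break this.
assert total_saved == sum(r["income"] for r in rows) - sum(r["spending"] for r in rows)

# Assertion 2: crude sanity bounds often catch data-entry slips,
# such as an extra zero typed into one cell.
for r in rows:
    assert 0 < r["spending"] < 50_000, f"Suspicious spending in {r['month']}"

print(f"Total saved: ${total_saved:,}")
```

Neither assertion proves the data is right; each just makes one common class of entry error fail loudly instead of silently flowing into the result.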

Keith Pleas
27 days ago

That’s a fair point – and one of the most common mistakes is manually entering a value rather than a calculated value (formula). You change one of the underlying values and the number in that cell is “off”. One way to address that is with an extra row at the bottom for totals, and a column at the right for totals, and making sure they match.
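That totals-row-plus-totals-column check is the classic cross-footing technique, and it can be sketched outside a spreadsheet too. The grid values here are hypothetical:

```python
# A sketch of cross-footing: sum each row and each column of a grid,
# then verify the two grand totals agree. Values are invented.
grid = [
    [10, 20, 30],
    [ 5, 15, 25],
    [ 1,  2,  3],
]

row_totals = [sum(row) for row in grid]
col_totals = [sum(col) for col in zip(*grid)]

# If one cell held a hand-typed constant instead of a formula result,
# the two grand totals would disagree and flag the error.
assert sum(row_totals) == sum(col_totals), "Cross-foot mismatch!"
print("Grand total:", sum(row_totals))
```

The check costs one extra row and one extra column, and it catches exactly the stale-hand-typed-cell mistake described above.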

Another challenge I see is the apparent high precision of floating-point numbers that are based – in whole or in part – on rough integer inputs.

As someone who worked with the Excel team to develop the first Excel SDK, I’ve always appreciated that product’s enduring relevance.

bbbobbins
27 days ago

You can of course “audit” your own spreadsheets for data errors and formulae. But in the world of personal finance there are plenty of 3rd party ones also out there to run the data through (at a high level) to validate overall results. When things broadly align (even without trying to tie up all minor assumptions like monthly vs annual timings, date on which to apply inflation etc etc) you can develop reasonable confidence.

jerry pinkard
27 days ago

I have used spreadsheets extensively in my career and also for retirement planning. There have been some famous gaffes by people using spreadsheets, which is why you should always verify your results a second way.

Richard Hayman
27 days ago

I try to add checks and balances in mine. After 3 years, I feel confident all the calculations are correct. The assumptions, however, may not be.

mytimetotravel
28 days ago

Ah yes. Back in the day, we called that GIGO. Garbage in, garbage out. Right up there with KISS as a term of art.

Richard Hayman
27 days ago
Reply to  mytimetotravel

In my first job out of college, anything printed on green bar paper was assumed to be correct. So we printed our hand-calculated results on that type of paper.

mytimetotravel
27 days ago
Reply to  Richard Hayman

Lol. Back then I wrote code on green and white striped coding pads after flowcharting on special sheets… I kept the 360 green reference card for years; I think I finally threw it out during the last move.
