How a Simple Report and Data Integration Prevented a $300,000 Loss

The Challenge: A $300,000 Question

Our team was considering a major expense: $300,000 spread over a few years to hire consultants who would formalize a due diligence process for evaluating estimates produced by third-party engineering firms. There was concern that these estimates weren't reliable, and that inaccurate figures were leading to misallocated resources and misinformed decisions.

Before committing to such a significant investment, leadership wanted to confirm the scope of the issue. Were external estimates really inaccurate enough to justify bringing in consultants, or was this a solution looking for a problem?

I was tasked with creating a report to assess the accuracy of these estimates.

The Data Puzzle: Three Pieces, Different Departments

The analysis itself wasn’t the hard part. The real challenge was piecing together information that lived across three separate areas of the business in different formats:

  1. PDF Estimates: The initial estimates from external firms were stored as PDFs on a network drive, each tagged with a unique identifier.

  2. Engineering Data: This data connected those identifiers to assets, which were organized by location in our GIS system.

  3. Financial Records: Actual usage data was stored in our data warehouse managed by the finance department.

Individually, these data sources were useful, but none of them told the full story on its own. To answer the question of estimate accuracy, I needed to bring them together into one cohesive dataset.
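As a rough illustration of the join involved, here is a minimal sketch in Python with pandas. The file names and column names are hypothetical stand-ins for the three systems (in the real project, the estimate values first had to be extracted from the PDFs); the assumption is that each PDF identifier maps to assets in the GIS export, and each asset maps to actual usage in the finance warehouse.

    import pandas as pd

    # Hypothetical extracts from the three systems (illustrative names):
    #   estimates.csv  -- values pulled from the PDF estimates, keyed by
    #                     the unique identifier on each document
    #   gis_assets.csv -- engineering export linking estimate IDs to assets
    #   actuals.csv    -- actual usage pulled from the finance warehouse
    estimates = pd.read_csv("estimates.csv")  # estimate_id, estimated_cost
    assets = pd.read_csv("gis_assets.csv")    # estimate_id, asset_id, location
    actuals = pd.read_csv("actuals.csv")      # asset_id, actual_cost

    # Chain the joins: PDF estimates -> GIS assets -> financial actuals.
    unified = (
        estimates
        .merge(assets, on="estimate_id", how="inner")
        .merge(actuals, on="asset_id", how="inner")
    )

    print(unified.head())

One useful property of inner joins here: any estimate that fails to match an asset or an actual simply drops out, which immediately surfaces gaps in the identifier mappings.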

The Real Work: Unifying the Data

Connecting these data sources wasn’t a straightforward task. It required working closely with data owners in each department to figure out:

  • Where the data lived: What systems were used, and who owned them?

  • How the data connected: What relationships existed between the identifiers in the PDFs, engineering data, and financial records?

  • How to make it repeatable: Could this process be formalized for future cross-department analyses?

Some data sources weren’t well-documented. Others had quirks—like mismatched formats or naming conventions—that needed to be resolved. These are the kinds of hurdles that don’t make it into textbooks but often determine the success of a project in the real world.
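To give a flavor of those quirks: suppose (purely hypothetically) the same identifier appeared as "EST-00123" in the PDFs but as "est 123" in the GIS export. A small normalization step resolves that kind of mismatch before the join:

    import re

    def normalize_id(raw: str) -> str:
        # Map identifier variants ("EST-00123", "est 123", "123") to one
        # canonical form. The EST- prefix and zero-padding are invented
        # for illustration, not the actual convention from these systems.
        digits = re.sub(r"\D", "", str(raw))  # keep digits only
        return f"EST-{int(digits):05d}"       # zero-pad to five places

    # Applied to each source before merging (continuing the earlier sketch):
    # estimates["estimate_id"] = estimates["estimate_id"].map(normalize_id)
    # assets["estimate_id"] = assets["estimate_id"].map(normalize_id)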

The Revelation

Once the data was unified, I could finally dig into the analysis. The result? The estimates were much more accurate than anyone had assumed.

Leadership had feared a systemic issue with inaccurate estimates. Instead, the data showed a handful of outliers, not a widespread problem. The decision was clear: there was no need to spend $300,000 on consultants. The due diligence process was already functioning well enough.
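The accuracy check itself was the easy part once the data was unified. Continuing the earlier sketch, and with the caveat that the 25% outlier threshold below is an illustrative assumption rather than the criterion from the actual analysis:

    # Percentage error of each estimate against actual usage.
    unified["pct_error"] = (
        (unified["actual_cost"] - unified["estimated_cost"]).abs()
        / unified["estimated_cost"]
    )

    # Flag large misses; the 25% cutoff is an illustrative assumption.
    outliers = unified[unified["pct_error"] > 0.25]
    print(f"Median error: {unified['pct_error'].median():.1%}")
    print(f"Outliers: {len(outliers)} of {len(unified)} estimates")

A low median error with only a few flagged rows is exactly the "handful of outliers, not a widespread problem" pattern described above.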

The Lesson: Infrastructure > Flashy Reports

This project reinforced a fundamental truth about data:

An average report built on solid data infrastructure is always better than a beautiful report built on bad data.

In this case, the report wasn’t flashy, but it was built on reliable, unified data. That made all the difference.

Unifying data across different parts of the business is one of the fastest ways to provide actionable insights. It saves money, time, and frustration, and it helps people make better decisions.

Why This Matters for Small Businesses

You don’t need a massive team or expensive tools to get started. Many businesses already have the data they need—it’s just scattered across different systems, files, or departments. The challenge is bringing it all together in a way that tells a clear story.

And when you do? You can uncover insights that might just save you $300,000—or more.

So, next time you’re faced with a big question, don’t worry about making the prettiest report. Focus on building a solid data foundation first. The results might surprise you.
