Your AI Investment Is Worthless Without These 6 Foundations
Every week another company announces they're implementing AI. A new forecasting tool. An intelligent demand planner. A predictive maintenance system. The vendor promises transformational results, and the leadership team signs off because everyone else is doing it and nobody wants to be left behind. Six months later, the tool is barely used. The outputs don't match reality. The team doesn't trust it. And the expensive consultants who sold the implementation are long gone.

This is not a technology problem. It is a foundation problem. 42% of companies abandoned most of their AI initiatives in 2025, up from just 17% the year before (WorkOS). Nobody is blaming the algorithm. The failures trace back to data and the people around it, every single time.
Here are the six data foundations that have to exist before AI can do anything useful at all.
1. Data ingestion/collection
Ask one question before anything else: is your data actually being captured? Not assumed to be captured. Actually captured, consistently, at the right frequency, in the right format, in the right system. In most SMEs, the answer is no. Production data lives in a spreadsheet someone maintains manually. Inventory adjustments get entered into the ERP later, sometimes. Supplier performance data exists only in the head of the procurement manager who has been there twelve years. You cannot build AI on top of that. Data collection is a process discipline problem, not a technology problem. Someone has to define what gets captured, where, how often, and who is responsible. Until that exists you have nothing to give AI to work with.
2. Data quality/scrubbed data
Collecting data is not the same as having good data. A German logistics company invested €2.5 million in an AI demand forecasting system. It failed because historical sales data had been recorded inconsistently across locations, with different sites using different product categories (Goldright). The AI wasn't broken. The data was, and the outcome was the same. Dirty data doesn't just produce bad outputs. It produces confident bad outputs, which is worse. AI doesn't say it's not sure. It gives you a number. That number goes into a decision. That decision costs you money.
Data quality requires an ongoing process, not a one-time cleanse. Someone has to own it, checks have to be built into how data enters the system, and regular audits have to catch drift before it compounds.
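To make "checks built into how data enters the system" concrete, here is a minimal sketch of a point-of-entry validation gate. The field names, category labels, and rules are hypothetical examples, not a prescription; the point is that a record with an inconsistent label gets flagged at entry instead of silently stored and later fed to a forecasting model.

```python
# Minimal sketch of point-of-entry data quality checks.
# Field names, categories, and rules are hypothetical examples.

VALID_CATEGORIES = {"raw-material", "finished-good", "spare-part"}

def validate_record(record: dict) -> list[str]:
    """Return a list of quality problems; an empty list means the record passes."""
    problems = []
    if not record.get("sku"):
        problems.append("missing SKU")
    if record.get("category") not in VALID_CATEGORIES:
        problems.append(f"unknown category: {record.get('category')!r}")
    qty = record.get("quantity")
    if not isinstance(qty, (int, float)) or qty < 0:
        problems.append(f"invalid quantity: {qty!r}")
    return problems

# A record using a site's local category label is rejected, not silently stored.
bad = {"sku": "A-1001", "category": "Finished Goods", "quantity": 50}
print(validate_record(bad))  # → ["unknown category: 'Finished Goods'"]
```

The check itself is trivial; the discipline of running it on every record, and of having a named owner who acts on the rejects, is the part most organizations skip.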
3. Infrastructure and integration
Your ERP doesn't talk to your warehouse system. Your CRM doesn't connect to your planning tool. Your production floor machines generate data in a proprietary format nobody outside the machine vendor can read. Spreadsheets fill every gap. AI needs a data pipeline. Clean data sitting in five disconnected systems is still inaccessible without an integration layer pulling it together. You can have perfect data quality in each individual system and still be completely unable to do anything intelligent across them. This is where Industry 4.0 infrastructure becomes unavoidable. In Europe, AIVHY and their OpenIIoT platform are doing exactly this work, connecting machines, ERPs, people, products, and systems regardless of brand or age into a single unified digital ecosystem.
If your systems don't talk to each other, no analytics tool in the world fixes that with a dashboard. You need the infrastructure layer first.
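As a toy illustration (the system names, field names, and numbers are all hypothetical), the core job of that integration layer is simply to map records from disconnected systems into one shared shape so they can be compared at all:

```python
# Sketch: normalizing inventory records from two disconnected systems
# into one shared shape. System names, fields, and values are hypothetical.

def from_erp(row: dict) -> dict:
    return {"sku": row["ItemCode"], "on_hand": row["QtyOnHand"]}

def from_wms(row: dict) -> dict:
    return {"sku": row["sku"].upper(), "on_hand": row["units"]}

erp_rows = [{"ItemCode": "A-1001", "QtyOnHand": 120}]
wms_rows = [{"sku": "a-1001", "units": 115}]

inventory = {}
for rec in [from_erp(r) for r in erp_rows] + [from_wms(r) for r in wms_rows]:
    inventory.setdefault(rec["sku"], []).append(rec["on_hand"])

# The same SKU now shows both systems' counts side by side,
# so the 120-vs-115 discrepancy is visible instead of hidden.
print(inventory)  # → {'A-1001': [120, 115]}
```

Real integration platforms do far more than this, but every one of them rests on this same move: agree on a canonical shape, then translate every source into it.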
4. Processes
Clean data inside a broken process produces clean garbage. If your replenishment process is built around a buyer's gut feel, AI demand planning on top of it doesn't fix anything. It generates statistically optimized recommendations that get ignored in exactly the same way the system recommendations always were. Before AI can add value, the underlying process has to be defined, documented, and actually followed. Not as a theoretical procedure in a folder nobody reads. As the way work actually gets done. The AI tool is the easy part. Getting a team to actually change how they work is the hard part.
That is where most implementations quietly die.
5. Data-literate employees
A 2024 NewVantage survey found that 92.7% of executives identify data as the most significant barrier to successful AI implementation. Not compute power. Not talent. Not budget. Data (Amit Kothari). But the real problem isn't just having good data. It's having people who can look at what the data is saying and know whether it makes sense. AI will tell you demand is going up 40% next quarter. Is that right? Is it a genuine trend or is it reacting to a one-off spike because a customer over-ordered before a price increase? A person who understands the industry, the customer base, and the statistical methodology will know. A person handed a dashboard and told to trust it will not. Companies spend €200,000 on an AI platform and put a junior analyst in charge of interpreting the outputs with no statistical background, no industry context, and no authority to push back when something looks wrong.
The AI runs, bad decisions get made, and the AI gets blamed. But the AI didn't fail. The people layer did.
6. Governance and ownership
Who owns the data? Not who cleans it. Who is accountable when it deteriorates, when two systems show different numbers for the same metric, when a critical dataset hasn't been updated in three months? In most organizations data falls into a grey zone where IT thinks the business owns it and the business thinks IT owns it. Nobody is accountable. And without accountability, all five pillars above erode the moment normal business pressure resumes. Governance means a named data owner for each critical dataset, defined standards for how data enters the system, and executive sponsorship that treats data integrity as a business priority, not an IT problem. Without it, you are not building a foundation.
You are building on sand.
Then, and only then, AI
Get those six right and AI becomes a genuine force multiplier. Forecasting that took two days runs in minutes. Supply disruption risks surface before they become crises. Inventory optimization across thousands of SKUs becomes tractable instead of overwhelming. That is the version of AI that delivers ROI. Not because the technology is different but because the foundation underneath it is solid. Without the foundation, you are not implementing AI. You are buying a very expensive way to automate bad decisions at scale.
Gartner predicts that through 2026, organizations will abandon 60% of AI projects unsupported by AI-ready data (Gartner). That number will not surprise anyone who has watched an AI implementation get sold, deployed, and quietly shelved inside eighteen months. Before your business spends another euro on an AI tool, ask honestly whether all six of these foundations are in place. If any are missing, fix those first.
The AI will still be there when you're ready. And when you are ready, it will actually work.
— — —
Kevin Cerullo is the founder of Fourth Echelon, a supply chain and operations consultancy based in Malta. We help businesses across Europe build the data infrastructure, processes, and operational capability that make AI investments actually deliver. For industrial IoT and system integration, we work with AIVHY and their OpenIIoT platform, Malta-based pioneers in Industry 4.0 connectivity.
If you want to know where your operation stands against these six foundations, book a free 30-minute call and we will tell you honestly what is in place, what is missing, and what it would take to fix it.

