From Data to Decisions: How AI Is Unlocking Hidden Value in Supply Chain Data

BY CAROL MILLER, CHIEF MARKETING OFFICER, MHI

For decades, supply chains have relied on spreadsheets, siloed systems and gut instinct to balance cost, service and risk. But as global networks have become more complex and data volumes have exploded, traditional tools are no longer enough.

Today, structured data pours in from enterprise resource planning (ERP) systems, warehouse management systems (WMS) and transportation management systems (TMS). Unstructured data floods inboxes, PDFs and call center transcripts. Internet of Things (IoT) sensors and telematics generate streams of real‑time signals. The question is no longer whether companies have enough data; it’s how they can turn that data into decisions.


Artificial intelligence (AI) is increasingly providing the answer. To explore how, four supply chain experts shared their perspectives:

  • Randy V. Bradley, PhD, dean of the Jack C. Massey College of Business at Belmont University.
  • Jim Frazer, VP of corporate strategy at ARC Advisory Group.
  • Rick McDonald, CEO of Rick McDonald Supply Chain Advisory and former chief supply chain officer at The Clorox Company.
  • Arman Moussavi, engagement manager at Deloitte Consulting LLP.

Together, their insights reveal both the opportunities and the challenges of harnessing AI to unlock hidden value in supply chain data.

FROM SPREADSHEETS TO STREAMING DECISIONS

A key advantage of applying AI to the supply chain is that it enables operations to move from static planning to continuous, dynamic decision‑making.

“Supply chains today are no longer about spreadsheets and static forecasts. They are about streaming decisions made against a backdrop of volatility,” Frazer said. “AI is the operational layer that converts millions of signals—inventory feeds, weather alerts, IoT pings, invoices, even customer service emails—into decisions at scale.”

He pointed to applications such as dynamic routing, where AI models recompute routes continuously as traffic or port congestion changes; demand forecasting that blends historicals with macroeconomic data and social media sentiment; and exception management, where autonomous agents triage late shipments or customs delays.
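The dynamic-routing idea Frazer describes can be sketched in a few lines: hold a weighted lane network, recompute the shortest path whenever a congestion signal changes an edge cost. The facility names and travel times below are hypothetical, and a production system would run this continuously against live feeds rather than a static dictionary.

```python
import heapq

def shortest_time(graph, origin, dest):
    """Dijkstra's algorithm: minimum travel minutes from origin to dest."""
    best = {origin: 0}
    heap = [(0, origin)]
    while heap:
        t, node = heapq.heappop(heap)
        if node == dest:
            return t
        if t > best.get(node, float("inf")):
            continue
        for nxt, w in graph.get(node, {}).items():
            nt = t + w
            if nt < best.get(nxt, float("inf")):
                best[nxt] = nt
                heapq.heappush(heap, (nt, nxt))
    return float("inf")

# Hypothetical lane times in minutes between facilities.
lanes = {
    "DC": {"HubA": 60, "HubB": 90},
    "HubA": {"Store": 50},
    "HubB": {"Store": 30},
}
baseline = shortest_time(lanes, "DC", "Store")  # fastest path runs via HubA

# A congestion signal arrives: the HubA-to-Store leg now takes 120 minutes.
lanes["HubA"]["Store"] = 120
rerouted = shortest_time(lanes, "DC", "Store")  # plan shifts to the HubB leg
```

The same recompute-on-signal loop applies whether the trigger is traffic, port congestion or a carrier delay; only the edge weights change.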

“The value is not that AI replaces ERP, TMS or WMS,” Frazer said. “It augments them, giving these systems a new capacity to sense, decide and act in real time.”

McDonald emphasized similar value drivers.

“Data ingestion capabilities are enabling supply chains to induct lots of data from unstructured, structured and semi‑structured sources,” he said. “Keying errors are reduced, while significant time savings and timely insights are delivered to those that need them.”

He cited three practical examples:

  • Auto document processing across purchase orders, bills of lading and parts listings.
  • Advanced demand sensing and inventory optimization platforms like ketteQ (see sidebar on page 16), which integrates external signals such as weather and flu case counts to balance stock levels.
  • Logistics route planning with real‑time shipment visibility.
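The demand-sensing example above, in which external signals such as weather and flu counts adjust stock levels, can be illustrated with a simple weighted blend. This is an illustrative sketch only, not ketteQ's actual model; the signal names, weights and figures are all hypothetical.

```python
def sensed_forecast(base_forecast, signals, weights):
    """Adjust a baseline forecast with weighted external signals.

    Each signal is expressed as a multiplier deviation from 1.0
    (e.g., 1.20 means a 20% expected demand lift); weights control
    how much each signal influences the final number.
    """
    adjustment = 1.0
    for name, value in signals.items():
        adjustment += weights.get(name, 0.0) * (value - 1.0)
    return base_forecast * adjustment

# Hypothetical week: a cold snap and rising flu cases lift demand.
forecast = sensed_forecast(
    base_forecast=10_000,  # units from the statistical baseline
    signals={"weather": 1.20, "flu_cases": 1.10},
    weights={"weather": 0.5, "flu_cases": 0.3},
)
# adjustment = 1 + 0.5*0.20 + 0.3*0.10 = 1.13, i.e., 11,300 units
```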

When it comes to applying AI, Bradley emphasized that not all data is equal and that AI’s role depends on the character of the data.

“Machine learning (ML) models excel at finding patterns in structured data—such as inventory records or sensor readings from IoT devices monitoring temperature, humidity, vibration or global positioning systems (GPS)—where formats and values are consistent,” he said. “Unstructured data, including supplier emails, shipping documents, maintenance logs or photos of damaged goods, on the other hand, benefits from generative AI. That’s because such large‑scale models are more adept at uncovering complex relationships and providing context to data of mixed character and structure.”

WHERE AI THRIVES—AND WHERE IT STRUGGLES

AI has proven especially adept at handling certain types of data.

“AI thrives on high‑volume telemetry,” Frazer said. “IoT sensor streams, telematics data and electronic product code information services (EPCIS) events are natural fodder for AI‑driven anomaly detection and estimated time of arrival (ETA) recalculation.”

He also pointed to the power of natural language processing (NLP) in unlocking the value of documents that had previously sat idle. “Supplier emails, customs PDFs or standard operating procedure (SOP) manuals were once dead weight in shared drives. NLP now turns those into structured, searchable signals.”
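The shift Frazer describes, from documents as "dead weight" to structured, searchable signals, amounts to extracting named fields from free text. Production systems use NLP or LLM-based extraction; the regex sketch below, with hypothetical field names and formats, only illustrates the transformation itself.

```python
import re

def extract_signals(email_text):
    """Pull structured fields out of a free-text supplier email.
    Patterns and field names here are illustrative assumptions."""
    patterns = {
        "po_number": r"PO[-\s]?(\d{6})",
        "new_eta": r"ETA[:\s]+(\d{4}-\d{2}-\d{2})",
        "quantity": r"(\d+)\s+(?:units|cases)",
    }
    return {field: (m.group(1) if (m := re.search(p, email_text)) else None)
            for field, p in patterns.items()}

email = ("Hi team, PO-584213 is delayed at the port. "
         "Revised ETA: 2025-11-04 for the 500 cases.")
print(extract_signals(email))
# {'po_number': '584213', 'new_eta': '2025-11-04', 'quantity': '500'}
```

Once emails, PDFs and SOPs are reduced to fields like these, they can be indexed, queried and fed to the same planning systems that consume ERP records.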

McDonald agreed, noting that AI‑driven data ingestion tools already deliver major wins in processing unstructured documents. But he also underscored the challenges.

“AI struggles with its ability to interpret, properly compare and contrast terms across languages. For example, ‘soda,’ ‘pop,’ and ‘Coke’ mean the same thing in different parts of the U.S.,” he said. “Dirty, poor or missing data, causal reasoning and understanding nuance and context that humans understand easily are all ongoing obstacles for AI.”

Moussavi further highlighted the split between structured and unstructured use cases, noting, “In the past, AI or automation required structured data. Generative AI unlocks use cases where there is a lot of unstructured data.”

Specifically, structured data is ideal for forecasting, troubleshooting and exception management. Unstructured data is where generative AI shines—parsing supplier contracts, manuals and documents at scale.

However, he cautioned that not all use cases are equally mature. “High‑context, knowledge‑based decisions, such as supplier negotiations or consumer sentiment, still require human expertise,” Moussavi added. “This is also a function of there often not being the data or experience for the AI to make a particular decision, which is where human experience and knowledge come into play.”

Bradley reinforced that point, noting that AI’s real promise often lies in surfacing what’s missing.

“The true value is not just in spotting patterns, but in revealing gaps and relationships that have gone unnoticed,” he said, noting that generative AI, in particular, is effective at inferring context from incomplete information. “Generative AI draws on both the data it’s ingesting and processing and the patterns it has learned, which is why it can reveal connections that seem new to us. That’s the source of its insights.”

BARRIERS TO INTEGRATION

Despite progress, all four experts identified barriers that have slowed AI adoption. Frazer laid out four.

“The first barrier is inconsistency,” he said. “A shipment ID can mean one thing in the TMS and something else in the ERP. The second is governance. Many organizations run data lakes without contracts or stewardship, and the result is often confusion over the ‘source of truth.’ The third is latency. It’s one thing to run batch analytics; it’s another to combine real‑time IoT telemetry with nightly ERP loads and expect sub‑minute decisions. And finally, there’s trust. Without clear ownership of exceptions and closed‑loop resolution, planners revert to legacy ways of working.”

McDonald put the trust issue in human terms. During Clorox’s transition to a sophisticated planning platform, he recalled, one of the biggest barriers he faced was the believability of the data and the recommendations generated by the AI engine.

“Experienced planners did not believe the new recommendations, because they were so different than the way they would have been executed in the prior system,” he recalled.

Another root cause? Data fragmentation, said Moussavi.

“Data is siloed, messy and often offline on spreadsheets and laptops,” he explained. “That limits what companies can realistically achieve with AI. Too many are jumping into the idea of a ‘self‑healing supply chain’ before fixing the basics.”

EMERGING BEST PRACTICES

To tackle these challenges, a set of best practices is beginning to take shape.

Organizations considering applying AI to their data should start small, urged Moussavi.

“The best practice is to begin with low‑hanging fruit, such as SOP management and training,” he said. “Minimize customization, simplify the tech stack and avoid building bespoke AI where ERP or WMS vendors are already embedding agentic AI into their platforms.”

Frazer described the technical scaffolding of modern AI supply chains.

“We are seeing agent‑to‑agent (A2A) protocols so autonomous systems can negotiate with each other instead of waiting for human handoffs,” he said. “We are seeing the model context protocol (MCP), which gives AI memory and continuity across sessions. We are seeing retrieval‑augmented generation (RAG) pipelines that pull fresh, domain‑specific documents into model reasoning and GraphRAG—an AI that does not just look at flat lists, but traverses entire supply networks to reason across dependencies.”

All of this, added Frazer, sits on top of lakehouse‑plus‑feature‑store architecture stacks, integrated with real‑time event streams such as Kafka and governed by built‑in role‑based access controls (RBAC).

Governance frameworks are equally important, emphasized Bradley.

“Just as data governance has matured into a recognized discipline, AI governance frameworks are now taking shape,” he said. “Boards, guided by organizations such as the National Association of Corporate Directors, for example, are increasingly focused on putting safeguards in place.”

 

