InoGen

Analytics Platform

Unified Cloud Data Platform and Procurement Intelligence for a Premium Drinks Manufacturer
Consumer Goods
Data Engineering
Analytics
Procurement
£6M value delivered

£6M

procurement savings identified and taken into negotiations

Full spend visibility unified across all source systems

Weekly procurement intelligence, previously unavailable at any cadence

Lakehouse platform ready for ML workloads

A unified cloud analytics platform on Azure Databricks identified £6M in procurement savings for a premium drinks manufacturer by consolidating fragmented data from ERP, supply chain, and legacy systems into a governed spend cube. The platform gave procurement and finance teams weekly intelligence on supplier spend, contract compliance, and pricing variances for the first time, while establishing the foundation for future ML workloads.

The Problem

A premium drinks manufacturer had recently invested in new planning and execution systems across its supply chain and logistics operations. Alongside these sat legacy tools still running core finance and commercial processes. The result was a fragmented data landscape where critical business information was locked inside systems that did not talk to each other.

The Solution

We designed and built a secure cloud analytics platform on Azure, using Databricks and a Data Lakehouse architecture, to unify data from the ERP, supply chain tools, and legacy systems into a single governed environment. The platform followed medallion architecture patterns (bronze, silver, gold) to progressively clean, conform, and enrich data as it moved through the pipeline.

The bronze layer preserved raw ingestion with full lineage, so any figure in a gold-layer report could be traced back to its source record. The silver layer applied data quality checks, deduplication, and business rule validation: purchase orders matched to goods receipts, supplier names standardised against a master list, category codes mapped to a canonical taxonomy. Failed records were quarantined with reason codes, not silently dropped, and a data quality dashboard surfaced pass rates and outstanding issues. The gold layer contained purpose-built analytical models, starting with a procurement spend cube: a dimensional model sliceable by supplier, category, plant, region, time period, and contract status, integrating purchase orders, contract terms, rebate agreements, and goods receipt confirmations.
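The quarantine-with-reason-codes pattern described above can be sketched in plain Python. This is an illustrative sketch only: the production pipeline ran on Databricks, and the field names (supplier, category, amount) and master lists here are hypothetical.

```python
# Illustrative silver-layer validation: records that fail a check are
# quarantined with reason codes rather than silently dropped.
# All field names and reference sets are hypothetical examples.

MASTER_SUPPLIERS = {"ACME LTD", "GLOBEX PLC"}   # standardised supplier master list
VALID_CATEGORIES = {"PKG", "RAW", "LOG"}        # canonical category taxonomy

def validate(record: dict) -> list[str]:
    """Return a reason code for every failed quality check."""
    reasons = []
    if record.get("supplier") not in MASTER_SUPPLIERS:
        reasons.append("UNKNOWN_SUPPLIER")
    if record.get("category") not in VALID_CATEGORIES:
        reasons.append("UNMAPPED_CATEGORY")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        reasons.append("INVALID_AMOUNT")
    return reasons

def split_silver(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Route each record to the clean set or the quarantine, reasons attached."""
    clean, quarantined = [], []
    for rec in records:
        reasons = validate(rec)
        if reasons:
            quarantined.append({**rec, "dq_reasons": reasons})
        else:
            clean.append(rec)
    return clean, quarantined

clean, quarantined = split_silver([
    {"supplier": "ACME LTD", "category": "PKG", "amount": 1200.0},
    {"supplier": "acme", "category": "XXX", "amount": -5},
])
```

Because every rejected record carries its reason codes, a data quality dashboard can count failures by reason and route them to the right business owner, which is what makes quarantining actionable rather than just a dead-letter bin.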


The Power BI layer delivered weekly procurement intelligence: spend by supplier and category, contract compliance versus maverick purchasing, price variance analysis, supply risk indicators, and negotiation support profiles. Role-based access control ensured plant managers saw their plant's data, finance saw the full picture, and sensitive commercial terms were restricted appropriately.

The platform was designed to be operated and extended by the client's own teams, with pipeline documentation, CI/CD deployment, monitoring and alerting, and a data ownership model assigning business owners to each gold-layer entity. The lakehouse architecture was deliberately chosen to support both BI and machine learning workloads on the same platform, avoiding the common trap of separate BI and ML environments with divergent definitions.
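The essence of the spend cube — one fact table sliceable along any combination of dimensions — can be shown with a minimal Python sketch. The dimension names and figures below are hypothetical; the real gold layer was a governed dimensional model queried from Power BI.

```python
# Minimal spend-cube sketch: aggregate purchase-order facts along any
# requested combination of dimensions. Field names and values are
# hypothetical examples, not client data.
from collections import defaultdict

FACTS = [
    {"supplier": "ACME LTD",  "category": "PKG", "region": "UK", "spend": 120.0},
    {"supplier": "ACME LTD",  "category": "RAW", "region": "UK", "spend": 80.0},
    {"supplier": "GLOBEX PLC", "category": "PKG", "region": "EU", "spend": 50.0},
]

def slice_spend(facts: list[dict], *dims: str) -> dict:
    """Total spend grouped by the requested dimensions."""
    totals: dict = defaultdict(float)
    for fact in facts:
        key = tuple(fact[d] for d in dims)
        totals[key] += fact["spend"]
    return dict(totals)

by_supplier = slice_spend(FACTS, "supplier")            # one total per supplier
by_cat_region = slice_spend(FACTS, "category", "region")  # finer-grained slice
```

Because every report slices the same conformed fact table, a category manager's weekly view and finance's full-spend view can never disagree on the underlying numbers — the property the single governed cube was built to guarantee.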

Results and Impact

| Metric | Value |
| --- | --- |
| Procurement savings identified | £6 million taken directly into supplier negotiations |
| Spend coverage | Full procurement spend unified across all source systems |
| Reporting cadence | Weekly procurement intelligence, previously unavailable at any cadence |
| Time to negotiation-ready data | Reduced from days of manual assembly to on-demand dashboard access |
| Governance | RBAC, data quality checks, pipeline CI/CD, and defined data ownership |
| Platform extensibility | Lakehouse architecture ready for ML workloads on the same data |

The £6 million was not a projection. Procurement used the spend analytics to identify specific savings levers (supplier consolidation, contract compliance gaps, pricing variances, rebate thresholds within reach) and took those numbers into live supplier negotiations. Beyond the headline savings, category managers could see their spend position weekly instead of waiting for quarterly finance reports, and supply risk became visible before it became a problem.

Key Takeaways

  • Ship something useful before asking for more investment. The platform only gained real sponsorship once it delivered something procurement used every week. An architecture diagram does not earn trust; a dashboard that changes how someone prepares for a supplier negotiation does.

  • Treat master data as a product, not a project. Supplier and category mappings, contract data, and rebate schedules require ongoing curation, clear ownership, and a maintenance process. The analytics are only as good as the entities underneath them.

  • Data quality is a feature, not a phase. Building quality checks into the pipeline (not as a separate initiative to be done later) meant the business could trust the numbers from day one. Quarantining bad records with reason codes gave business owners visibility into data health and the confidence to act on it.