£1M+ annual margin improvement

- Promotion assessment reduced from days to minutes
- Cannibalisation visibility introduced for the first time across all categories
- Self-serve tool integrated into weekly planning workflow
Delivered over £1M in annual margin improvement for an international stationery retailer by building a self-serve Promotion Effectiveness Tool. The tool replaced guesswork with a searchable library of every past promotion, enriched with incremental lift estimation, margin flags, and cannibalisation detection, enabling category planners to make evidence-based decisions in minutes rather than days.
## The Problem
An international stationery retailer ran hundreds of promotions each year, but the process of selecting what to promote, at what discount depth, and when was driven largely by instinct and habit. Promotions that had previously destroyed value were re-run because nobody could easily find what had happened last time. Institutional memory lived in spreadsheets, inboxes, and the heads of people who had since moved on.
## The Solution
We built a Promotion Effectiveness Tool: a structured library of every past promotion, enriched with contextual controls and commercial guardrails, delivered through a self-serve interface designed for category planners.
The tool had three core components. First, a Promotion Library providing a searchable, taxonomised record of every historical promotion, its mechanics, and its outcomes. Every promotion was classified by type (percentage off, multi-buy, gift with purchase), category, discount depth, duration, and timing. A planner considering a "3 for 2 on notebooks in September" could instantly see every similar promotion from the past three years, with outcome data attached.
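The library lookup described above amounts to a structured record plus a similarity filter. A minimal sketch of how such a lookup might work, assuming an illustrative schema (the field names and the `find_similar` helper are hypothetical, not the retailer's actual data model):

```python
from dataclasses import dataclass

@dataclass
class Promotion:
    """One historical promotion record (illustrative schema)."""
    promo_type: str        # e.g. "multi-buy", "percentage-off", "gift-with-purchase"
    category: str          # e.g. "notebooks"
    discount_depth: float  # effective discount as a fraction, e.g. 0.33 for "3 for 2"
    month: int             # 1-12, for timing comparisons
    duration_days: int
    incremental_lift: float  # outcome data attached after the fact
    margin_positive: bool

def find_similar(library, promo_type, category, month, month_window=1):
    """Return past promotions of the same type and category run at a similar time of year."""
    return [
        p for p in library
        if p.promo_type == promo_type
        and p.category == category
        and abs(p.month - month) <= month_window
    ]

# A planner considering "3 for 2 on notebooks in September" (month 9):
library = [
    Promotion("multi-buy", "notebooks", 0.33, 9, 14, 0.18, True),
    Promotion("multi-buy", "notebooks", 0.33, 8, 14, 0.05, False),
    Promotion("percentage-off", "pens", 0.20, 9, 7, 0.12, True),
]
matches = find_similar(library, "multi-buy", "notebooks", month=9)
print(len(matches))  # prints 2: both notebook multi-buys, with outcomes attached
```

In practice the taxonomy would sit in a database with faceted search, but the principle is the same: classify every promotion on a fixed set of dimensions so that "similar" is a query, not a memory exercise.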
Second, an Effectiveness Engine that estimated each promotion's true incremental lift after controlling for seasonality, baseline price movements, and trend effects. This meant planners could compare promotions on a like-for-like basis. Two critical flags were layered onto every assessment: a margin flag (did the promotion generate enough incremental margin to cover the discount cost?) and a cannibalisation flag (did the promoted product pull sales from full-price lines in the same category?). These guardrails did not prevent planners from running a margin-negative promotion where justified, but they ensured the cost was visible upfront.
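Once an incremental-lift estimate exists, the two guardrail flags reduce to simple comparisons. A hedged sketch of that final step (the baseline here is assumed to be already seasonality-adjusted; the actual engine controlled for seasonality, baseline price movements, and trend with a proper model, and all names below are illustrative):

```python
def assess_promotion(promo_units, baseline_units, unit_margin, discount_per_unit,
                     category_units, category_baseline_units):
    """Compute incremental lift plus margin and cannibalisation flags (illustrative)."""
    # Incremental lift: units sold above the seasonality-adjusted baseline.
    incremental_units = promo_units - baseline_units

    # Margin flag: did incremental margin cover the discount given away on ALL units sold?
    incremental_margin = incremental_units * unit_margin
    discount_cost = promo_units * discount_per_unit
    margin_flag = incremental_margin < discount_cost  # True = red flag

    # Cannibalisation flag: did the rest of the category fall below its own baseline?
    cannibalisation_flag = category_units < category_baseline_units

    return {
        "incremental_units": incremental_units,
        "net_margin_impact": incremental_margin - discount_cost,
        "margin_flag": margin_flag,
        "cannibalisation_flag": cannibalisation_flag,
    }

# A deep discount that shifts some volume but destroys margin and pulls
# sales from full-price lines in the same category:
result = assess_promotion(
    promo_units=1200, baseline_units=1000, unit_margin=1.50,
    discount_per_unit=0.80, category_units=4800,
    category_baseline_units=5000,
)
print(result["margin_flag"], result["cannibalisation_flag"])  # prints: True True
```

The worked example illustrates the core point: a 20% unit lift can still be a net loss once the discount is paid on every unit sold, which is exactly the cost the flags made visible upfront.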
Third, a Self-Serve Planner UI that let planners search, filter, compare, and assess promotions without needing analyst support. Previous analytical work had been delivered as static reports produced by a central data team: useful but slow. The self-serve interface changed this entirely: it used commercial language rather than statistical jargon and was designed to fit naturally into the weekly planning workflow.
## Results and Impact
| Metric | Value |
|---|---|
| Annual margin improvement | £1,000,000+ |
| Promotions catalogued | Full historical library spanning multiple years and all product categories |
| Margin-negative promotions identified | Flagged and reduced in subsequent planning cycles |
| Planner adoption | High: tool integrated into weekly planning workflow |
| Time to assess a promotion | Reduced from days (analyst-dependent) to minutes (self-serve) |
| Cannibalisation visibility | Introduced for the first time across all categories |
The £1M margin improvement came not from a single change but from a cumulative shift in behaviour: fewer value-destroying promotions repeated, better-informed discount depth choices, and earlier identification of cannibalisation patterns. The tool did not make the decisions. It made the cost of bad decisions visible, and planners responded.
## Key Takeaways
- A searchable record of past promotions prevented repeated mistakes. When planners could see that a specific promotion type had failed in similar conditions three times before, they stopped proposing it. This alone eliminated a significant portion of value-destroying activity.
- Guardrails shifted behaviour faster than guidelines ever did. The business had previously issued guidance on minimum margin thresholds. Compliance was inconsistent. When the same rules were embedded as visible flags in the planning tool, adherence improved sharply: it is harder to ignore a red margin flag than a paragraph in a policy document.
- Making the tool easy to use was as important as the analytics behind it. A sophisticated model behind a request queue gets used quarterly. A simple interface that planners can access during their normal workflow gets used weekly. Adoption drove the impact, and adoption was a design problem, not a data science problem.