Category management in Australian grocery has always been about the relationship between retailers and suppliers. Coles and Woolworths set the shelf. Suppliers compete to be on it — and to stay there. Metcash's independent network adds another layer of complexity. The tools that category managers use to navigate this environment have improved significantly over the last decade, but the fundamental challenge remains: how do you demonstrate that your product deserves more shelf space before a range review decides otherwise?
The answer increasingly depends on software. Not just planogram tools — though those matter — but a broader category management stack that connects data, shelf visualisation, and shopper evidence. This article covers where that stack currently stands in Australian grocery, and where the gaps are.
What category management means in Australian grocery
Category management is the process of treating a product category as a strategic business unit rather than a collection of individual SKUs. In practice, it means making shelf decisions based on category-level data: which products drive basket size, which are destination purchases versus impulse buys, how shelf organisation affects shopper navigation, and how range breadth versus depth affects total category sales.
In the Australian context, category management is largely retailer-led. Coles and Woolworths each have dedicated category management teams that control range, pricing, and planogram decisions. Suppliers influence those decisions — but they don't make them. The supplier's job is to come to a range review with compelling evidence that their product improves the category, not just their brand's performance.
Metcash operates differently. The independent network it supports gives individual retailers more flexibility in range decisions, but Metcash's wholesale distribution model still means category-level thinking applies. For suppliers working across all three networks, the category management challenge is understanding how shelf decisions differ by format and tailoring the evidence accordingly.
The category review is the moment that matters. Everything in a CPG brand's category management toolkit is ultimately pointed at one goal: walking into a Coles or Woolworths range review with data that makes the retailer's decision obvious.
The current category management software landscape
The tools most commonly used in Australian grocery category management fall into a few distinct categories:
Planogram creation and compliance software
Tools like Blue Yonder (formerly JDA), RELEX, and Nielsen Spaceman have long been the standard for building and managing planograms. They allow category managers to lay out shelf configurations, track facing counts, and produce the schematic documents that retailers and suppliers work from during range reviews.
The limitation of these tools is that they're fundamentally static. A planogram is a floor plan — it shows you what the shelf looks like, but it doesn't tell you how shoppers behave on it. Knowing whether the layout you've designed actually gets noticed, generates purchase intent, or steers the shopper to your product rather than a competitor's requires additional data that planogram software doesn't produce.
Sales analytics and EPOS data
Sales data from electronic point-of-sale (EPOS) systems — including data accessible through retailer data programmes — tells you what sold. It's retrospective by nature. You can identify that a category reset changed velocity, but only after the reset has already happened and the next review cycle is approaching.
For major CPG brands with access to detailed scan data, EPOS analysis is foundational. But it doesn't answer the pre-launch question: will this new placement, packaging change, or facing count adjustment actually improve performance? Historical data can inform a hypothesis; it can't validate one.
Shopper research platforms
Traditional shopper research — qualitative focus groups, eye-tracking studies, in-store intercept surveys — produces richer behavioural data but at significant cost and lead time. A formal eye-tracking study in a simulated store environment can take six to eight weeks to design, recruit, and execute. That timeline rarely aligns with the pace of a retailer's category review calendar.
The cost also puts these tools out of reach for mid-market suppliers who lack the research budgets of multinational FMCG brands. The gap between what large and small suppliers can bring to a range review has historically been significant.
Where the category management stack falls short
The fundamental limitation running across most category management tools is the separation between shelf design and shopper evidence. Planogram software tells you what the shelf looks like. Sales data tells you what sold. Shopper research tells you why — but slowly and expensively.
This creates a predictable problem. Category managers and trade marketing teams spend significant time building planogram scenarios and constructing sales-based arguments for range reviews, but arrive without direct evidence of how the proposed shelf configuration will affect shopper behaviour. The retailer's category buyer, who has access to the same scan data, is unlikely to be moved by an argument they could have made themselves.
What's missing is a fast, affordable way to validate shelf configurations against real shopper behaviour before the review — not after.
Virtual shelf testing as a category management tool
Virtual shelf testing has emerged as the practical answer to this gap. By replicating a physical shelf in a digital environment using real planogram data, category managers can run structured shopper tests against specific shelf configurations in days rather than weeks — and at a fraction of traditional research costs.
The output is exactly what a range review conversation needs: behavioural data showing how shoppers navigate the category, which positions generate attention and purchase intent, and how your product performs relative to competitors under realistic conditions. That's different in kind from a sales argument or a theoretical planogram.
For Australian grocery specifically, this matters because the shelf environments at Coles, Woolworths, and Metcash differ meaningfully. A planogram built for a Woolworths metropolitan superstore has different format characteristics from a Coles regional store or an IGA independent. Virtual shelf testing against the right planogram produces results that are relevant to the specific format you're competing in — not a generic template.
Fitting into the category review process
The most effective way to use virtual shelf testing is as a pre-review validation step, not a replacement for the rest of the category management toolkit. The process typically looks like this:
- Build the shelf hypothesis: Use existing planogram data and sales analysis to develop a proposed configuration — a different position, adjusted facings, a packaging variant, or a new product introduction.
- Test against the virtual shelf: Run a structured shopper study on the proposed configuration versus the current state. Capture dwell time, navigation patterns, and purchase intent.
- Quantify the uplift: Translate behavioural differences into projected performance — how much does moving from position 4 to position 2 improve purchase intent? What's the visibility uplift from three facings versus two?
- Walk into the review with evidence: Present the retailer's category buyer with data showing that your proposed configuration improves category performance, not just your brand's performance. That reframing is what changes the conversation.
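The "quantify the uplift" step above often reduces to a simple comparison of intent rates between the current and proposed configurations. The sketch below is illustrative only: the function name, sample sizes, and intent counts are assumptions, not output from any particular testing platform. It frames the comparison as a standard two-proportion z-test, which is one common way to check whether an observed uplift is more than noise.

```python
# Illustrative sketch: comparing purchase-intent results from a shopper
# test of a current vs proposed shelf configuration. All figures are
# made up for the example; real counts would come from your test data.
from math import sqrt, erf

def purchase_intent_uplift(current, proposed):
    """Return (relative uplift, two-sided p-value).

    `current` and `proposed` are (shoppers_tested, purchase_intents) tuples.
    """
    n1, x1 = current
    n2, x2 = proposed
    p1, p2 = x1 / n1, x2 / n2
    uplift = (p2 - p1) / p1                 # relative change in intent rate
    pooled = (x1 + x2) / (n1 + n2)          # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return uplift, p_value

# Hypothetical study: 400 shoppers saw each configuration.
uplift, p = purchase_intent_uplift(current=(400, 88), proposed=(400, 118))
print(f"Relative uplift: {uplift:.1%}, p-value: {p:.3f}")
```

A result like a 34% relative uplift with a small p-value is the kind of number that survives a buyer's scrutiny; the same uplift on 40 shoppers per cell might not, which is why sample size belongs in the test design conversation.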
The category review reframe: A supplier arguing for more facings because "our product deserves it" is a hard sell. A supplier showing that more facings improves category conversion and basket value is a different conversation entirely. Shopper data makes the second argument possible.
Where ShelfLab fits in
ShelfLab is a virtual store testing platform built specifically for Australian grocery retail. It combines real Australian planogram data with structured shopper tests, producing the behavioural evidence that category managers need for range review conversations — without the timelines and costs of traditional shopper research.
The platform works alongside the planogram and sales tools already in use. It doesn't replace category management software — it adds the shopper evidence layer that those tools can't produce. For category managers and trade marketing teams at CPG brands competing in Coles, Woolworths, and the Metcash network, it fills the specific gap between shelf planning and shopper validation.
Explore a live planogram visualiser to see how real Australian shelf data looks in the platform — or request a demo to walk through a structured shelf test for your category. The range review timeline doesn't wait — and neither should your evidence.