Every year, Australian CPG brands spend millions launching new SKUs, negotiating for better shelf placement, and resetting categories — then find out after the fact whether it worked. The products that didn't sell get delisted. The category reset that looked good on paper confused shoppers. The new facing count that seemed like a win turned out to hurt visibility.
Virtual shelf testing is the answer to a straightforward problem: you need to know how shoppers will respond to a shelf change before you commit to it. Not after a three-month sales cycle. Before.
What is virtual shelf testing?
Virtual shelf testing is a research methodology that replicates a physical retail shelf in a digital environment, then measures how real shoppers interact with it. Instead of rearranging products on a live store shelf and hoping the data comes in, you build a digital replica of the shelf — using real planogram data — and run structured shopper tasks against it.
Shoppers browse the virtual shelf as they would in-store: scanning products, picking items up, adding them to their basket. The difference is that every action is captured. Every hover, every dwell moment, every purchase decision — all of it is data.
The result is behavioural insight that physical retail simply can't produce: why did the shopper pick that product? What were they looking at before it? How long did they spend in the category before deciding?
The core value proposition: Virtual shelf testing replaces post-launch learning with pre-launch learning. You find out what doesn't work before it costs you a listing.
The Australian grocery context
The stakes for shelf positioning are unusually high in Australia. The grocery market is one of the most concentrated in the world: Coles and Woolworths together hold approximately 67% of the market, with Metcash (which supplies the IGA and independent network) adding another significant block. That duopoly structure means there are very few second chances.
Retailers control range review cycles. They set the planogram. They decide how many facings your product gets and whether it sits at eye level or at the bottom of the gondola. For most CPG brands, influencing those decisions means walking into a category review with evidence — not instinct.
Virtual shelf testing produces that evidence. You can test your product at different shelf positions and quantify the visibility difference. You can show that three facings outperform two not just in theory but in actual shopper behaviour data. You can demonstrate that your new packaging improves purchase intent compared to the current format.
That's a fundamentally different category review conversation.
How planogram data powers shelf testing
The quality of a virtual shelf test depends entirely on the quality of the shelf replica. A generic shelf mock-up with placeholder products doesn't tell you anything useful about how shoppers behave in a real Coles aisle.
Planogram data is what makes the test real. A planogram defines the exact placement of every product on a shelf: which SKU sits where, how many facings it has, which products are adjacent, what the shelf height and depth are. When you build a virtual shelf from actual planogram data, shoppers are navigating a faithful replica of what they'd see in-store — not an approximation.
This matters for two reasons:
- Ecological validity: Behaviour observed in a realistic environment generalises. If the shelf looks like a real category, the browsing patterns, dwell times, and purchase decisions will carry over to real stores.
- Actionable output: When the test is grounded in real planogram data, the findings translate directly to real decisions. "Move from position 4 to position 2 in bay 3" is actionable. "Be near the left side" is not.
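To make the shelf definition concrete, here is a minimal sketch of what a planogram record might look like in code. The field names, SKU codes, and "eye level" shelf numbers are illustrative assumptions, not a real retailer schema:

```python
from dataclasses import dataclass

# Hypothetical planogram record: every field name and value below is
# illustrative, not drawn from an actual Coles or Woolworths planogram.
@dataclass
class Placement:
    sku: str        # product identifier
    bay: int        # which bay of the fixture
    shelf: int      # shelf number, counted from the top
    position: int   # horizontal slot within the shelf
    facings: int    # number of side-by-side facings

# A virtual shelf replica is an ordered collection of placements.
planogram = [
    Placement("CHOC-001", bay=3, shelf=2, position=1, facings=3),
    Placement("CHOC-002", bay=3, shelf=2, position=2, facings=2),
    Placement("CHOC-003", bay=3, shelf=4, position=1, facings=1),
]

def eye_level_skus(planogram, eye_shelves=(2, 3)):
    """SKUs placed on shelves conventionally treated as eye level."""
    return [p.sku for p in planogram if p.shelf in eye_shelves]

print(eye_level_skus(planogram))  # ['CHOC-001', 'CHOC-002']
```

Because each placement carries bay, shelf, position, and facings, a finding like "move from position 4 to position 2 in bay 3" maps directly onto the data the test was built from.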
Australian category planograms — across Coles, Woolworths, and independent formats — have enough variation that category-specific data matters. A confectionery planogram for a Woolworths Metro store differs meaningfully from a regional Coles format. Testing against the right planogram gives you results that are relevant to the specific retail context you're competing in.
Key metrics from a virtual shelf test
A well-structured virtual shelf test produces several distinct measurement types, each answering a different question:
Dwell time
How long did shoppers spend looking at each shelf zone? Dwell time reveals attention — which products and positions captured shopper engagement, and which were effectively invisible. High dwell with low purchase suggests a consideration barrier. Low dwell in a high-traffic zone suggests a visibility problem.
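The two failure modes above can be expressed as a simple classification over dwell and pick-up data. The thresholds below are illustrative assumptions; in practice they would be set per category from the test's own distributions:

```python
def diagnose(dwell_seconds, pickup_rate,
             dwell_threshold=2.0, pickup_threshold=0.10):
    """Classify a shelf zone from mean dwell time and pick-up rate.

    A sketch only: threshold values are made up for illustration and
    would be calibrated per category in a real analysis.
    """
    if dwell_seconds < dwell_threshold:
        return "visibility problem"       # effectively invisible
    if pickup_rate < pickup_threshold:
        return "consideration barrier"    # seen but not chosen
    return "performing"                   # seen and chosen

print(diagnose(3.5, 0.04))  # consideration barrier
print(diagnose(0.8, 0.12))  # visibility problem
```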
Purchase intent
Which products did shoppers select? Purchase intent data shows conversion at the shelf level, before any price or promotion variables are introduced. It tells you whether your product's packaging and positioning are doing the work of triggering a pick-up.
Facings efficiency
How does purchase intent change as the number of facings increases? Facings efficiency analysis answers one of the most common questions in shelf strategy: at what point does adding another facing stop generating proportional return? For most categories, the answer isn't linear — there's a threshold beyond which additional facings provide minimal uplift. Knowing that threshold lets you negotiate smarter.
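The threshold logic can be sketched as a marginal-uplift check: keep adding facings while each extra facing still buys a meaningful lift in purchase intent. The intent figures and the 1-point cut-off below are made-up numbers for illustration:

```python
# Illustrative purchase-intent rates by facing count (invented data):
intent_by_facings = {1: 0.06, 2: 0.09, 3: 0.11, 4: 0.115, 5: 0.117}

def efficient_facings(intent, min_marginal_uplift=0.01):
    """Largest facing count whose incremental uplift over the previous
    count still clears the threshold. Threshold is an assumption."""
    counts = sorted(intent)
    best = counts[0]
    for prev, cur in zip(counts, counts[1:]):
        if intent[cur] - intent[prev] >= min_marginal_uplift:
            best = cur
        else:
            break  # diminishing returns set in here
    return best

print(efficient_facings(intent_by_facings))  # 3
```

With these numbers, the jump from three to four facings adds only half a point of intent, so three is the last facing worth negotiating hard for.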
Navigation patterns
Where did shoppers look before they selected a product? Navigation data reveals the decision journey — how shoppers move through a category, what anchors their attention, and what visual cues drive brand switching. This is particularly useful for new product launches where the goal is to intercept shoppers who currently have an established preference.
Share of attention vs. share of shelf
Your brand may have 20% of the shelf by facing count, but is it getting 20% of shopper attention? Share-of-attention analysis maps actual visual engagement against physical space, identifying whether your current position is punching above or below its weight.
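The comparison reduces to a single index: share of attention divided by share of shelf. The input figures below are invented for illustration:

```python
def attention_index(brand_dwell, total_dwell, brand_facings, total_facings):
    """Share of attention divided by share of shelf.

    > 1.0: the brand attracts more attention than its footprint predicts.
    < 1.0: the position is underdelivering on the space it occupies.
    """
    share_attention = brand_dwell / total_dwell
    share_shelf = brand_facings / total_facings
    return share_attention / share_shelf

# Invented example: 20% of facings (8 of 40) but 28% of dwell time.
print(round(attention_index(brand_dwell=28, total_dwell=100,
                            brand_facings=8, total_facings=40), 2))  # 1.4
```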
Where ShelfLab fits in
ShelfLab is a virtual store testing platform built specifically for Australian grocery retail. It combines real Australian planogram data with structured shopper tests — so the shelf you're testing reflects what's actually in Coles, Woolworths, and independent stores, not a generic US or UK category template.
The platform is designed for CPG brands and retailers who need to move quickly. You can explore a live planogram visualiser immediately, or request a demo to walk through a structured shelf test for your category. No physical store visits. No waiting for a category review cycle to deliver results. Just the data you need to make a better case at the next range review.
Virtual shelf testing isn't a niche research methodology reserved for multinationals with dedicated shopper insights teams. It's become table stakes for any brand serious about competing on shelf in Australian grocery. The tools to do it properly now exist — and the cost of not doing it is a listing decision made without evidence.