Look at any performance marketing report and ROAS is the first line. Clients ask about it, campaign managers report on it, budgets get increased or cut based on it. It's simple, direct, easy to compare. And that's exactly the trap.

That simplicity hides serious distortions. A 5:1 ROAS on a channel can look excellent on paper while simultaneously funding conversions that would have happened anyway. A recent article on MarTech.org raises this exact issue: dependence on ROAS creates a structural bias toward short-term efficiency at the expense of real business growth.

We see this in practice in every account audit we run. And most of the time, the solution isn't to abandon ROAS but to complement it with metrics that tell the full story.

Why ROAS alone distorts reality

ROAS measures revenue generated relative to ad spend. That's it. It doesn't account for profit margin, doesn't differentiate between a new customer and one who would have returned anyway, and says nothing about the long-term value of the conversion obtained. It's a snapshot of a single transaction.

A concrete example: you have a 4:1 ROAS on brand keywords in Google Ads. Looks great in the report. But how many of those customers would have purchased without clicking the ad? Incrementality studies (lift tests) consistently show that between 10% and 30% of attributed conversions would have happened regardless of ad exposure. That means the real ROAS is lower than reported, sometimes significantly so.
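The adjustment described above is simple arithmetic: scale reported revenue down by the share of conversions the lift tests show as non-incremental. A minimal sketch, with all figures (the 4:1 ROAS, the spend, the 10-30% range) purely illustrative:

```python
def incremental_roas(revenue, spend, non_incremental_share):
    """Discount reported revenue by the share of conversions
    that would have happened without the ad."""
    incremental_revenue = revenue * (1 - non_incremental_share)
    return incremental_revenue / spend

spend = 10_000
revenue = 4.0 * spend  # the reported 4:1 ROAS on brand keywords

for share in (0.10, 0.20, 0.30):
    real = incremental_roas(revenue, spend, share)
    print(f"{share:.0%} non-incremental -> real ROAS {real:.2f}:1")
```

At the high end of the range, the "great" 4:1 ROAS is really 2.8:1, which is why the coefficient matters before budgets get reallocated.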

The second problem: ROAS systematically favors bottom-funnel channels. Remarketing and brand search will always show a higher apparent ROAS than prospecting or awareness campaigns. If you allocate budget strictly based on ROAS, you inevitably underfund the channels that bring new customers into the ecosystem. Long-term, this leads to audience saturation and growth that stalls abruptly.

The third problem, more subtle: ROAS doesn't account for customer quality. A 100 EUR conversion from a customer who'll never return and a 100 EUR conversion from a customer who'll purchase repeatedly for two years have the same ROAS but radically different business value. Without this distinction, you optimize for volume rather than quality.

Which metrics complete the picture

There's no single metric that replaces ROAS. But there is a set of metrics that, taken together, provides a realistic view of performance:

Customer Lifetime Value (LTV) changes the perspective entirely. According to Bain & Company research, acquiring a new customer costs 5-7x more than retaining an existing one. A customer who costs more to acquire but purchases repeatedly over 18 months is worth 3-5x more than one who converts cheaply and never returns. In Google Analytics 4, you can segment customer cohorts by acquisition source and see which channels deliver real long-term value, not just immediate conversions.
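The cohort view GA4 gives you can be reproduced from raw order data: sum revenue per customer, then average customer value per acquisition source. A minimal sketch with hypothetical order records (the customer IDs, sources, and values are invented for illustration):

```python
from collections import defaultdict

# Hypothetical order records: (customer_id, acquisition_source, order_value)
orders = [
    ("c1", "brand_search", 100), ("c1", "brand_search", 80),
    ("c2", "prospecting", 120), ("c2", "prospecting", 90),
    ("c2", "prospecting", 110),
    ("c3", "brand_search", 100),
]

def ltv_by_source(orders):
    """Sum revenue per customer, then average total customer value
    for each acquisition source."""
    per_customer = defaultdict(float)
    source_of = {}
    for cid, source, value in orders:
        per_customer[cid] += value
        source_of[cid] = source
    by_source = defaultdict(list)
    for cid, total in per_customer.items():
        by_source[source_of[cid]].append(total)
    return {s: sum(v) / len(v) for s, v in by_source.items()}

print(ltv_by_source(orders))
```

In this toy data, prospecting's average customer is worth more than twice brand search's, even though a last-click ROAS report would likely show the opposite ranking.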

Incrementality answers the most important question in media buying: how many of these conversions would have happened without the ad? Incrementality tests (geo-split tests, holdout groups) provide concrete answers. We recently ran a test on an e-commerce account spending around 15,000 EUR per month on Meta Ads. The result: 22% of attributed conversions were not incremental. That completely changed the budget allocation between prospecting and remarketing campaigns.

CAC (Customer Acquisition Cost) by cohort shows what a new customer actually costs on each channel, not just what a generic conversion costs. The difference matters: when AI controls the bidding, cost per conversion may decrease while real CAC increases if the algorithm preferentially targets existing customers who are easier to convert.
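The gap between cost per conversion and real CAC falls out of one division once you know the new-customer share. A sketch with illustrative numbers (spend, conversion count, and the 30% new-customer share are all assumptions):

```python
def real_cac(spend, conversions, new_customer_share):
    """Return the blended cost per conversion and the cost
    per *new* customer, which is what acquisition actually costs."""
    blended_cpa = spend / conversions
    new_customers = conversions * new_customer_share
    return blended_cpa, spend / new_customers

blended, cac = real_cac(spend=5_000, conversions=200, new_customer_share=0.3)
print(f"blended CPA: {blended:.2f} EUR, real CAC: {cac:.2f} EUR")
```

A 25 EUR cost per conversion can hide an 83 EUR acquisition cost, and the gap widens as the algorithm leans harder on existing customers.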

90-day retention rate tells you whether the channel brings quality customers. A channel with high ROAS but 5% retention at 90 days costs more than one with modest ROAS but 40% retention. This metric only becomes visible when you connect ads data with CRM or e-commerce platform data.
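Once ads data is joined with CRM order history, the 90-day retention rate is a straightforward count: customers whose second order lands within 90 days of their first. A sketch assuming a hypothetical customer-to-order-dates mapping:

```python
from datetime import date

# Hypothetical joined ads + CRM data: customer -> order dates
orders = {
    "c1": [date(2024, 1, 5), date(2024, 2, 20)],   # repeat within 90 days
    "c2": [date(2024, 1, 10)],                      # one-time buyer
    "c3": [date(2024, 1, 12), date(2024, 6, 1)],    # repeat, but too late
}

def retention_90d(orders):
    """Share of customers with a second order within 90 days of the first."""
    retained = 0
    for dates in orders.values():
        dates = sorted(dates)
        if len(dates) > 1 and (dates[1] - dates[0]).days <= 90:
            retained += 1
    return retained / len(orders)

print(retention_90d(orders))
```

Run per acquisition channel, this is the number that separates a high-ROAS channel full of one-time buyers from a modest-ROAS channel that compounds.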

How to implement this without overcomplicating reports

The most common objection we hear: "Sounds good, but that means a 15-page report nobody reads." It doesn't have to be that way. The shift can be gradual and pragmatic.

First step: add estimated LTV to monthly reporting. Google Ads and Meta Ads already support offline data imports and customer lifetime value segmentation. It won't be perfect immediately, but it's far better than nothing. Even a rough estimate (returning customers vs. one-time buyers) adds critical context to ROAS.

Second step: run an incrementality test on your largest channel once per quarter. It doesn't need to be sophisticated or expensive. A simple geo-split test (pause the campaign in one region for 3-4 weeks and compare sales with a similar region where the campaign runs normally) gives you an incrementality coefficient you can apply to all reported conversions. According to Google/BCG data, companies using advanced attribution see 20-30% improvement in media efficiency.
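The geo-split arithmetic above reduces to comparing the test region's actual sales against what it would have sold based on the control region. A sketch, where the sales figures and the pre-test baseline ratio between regions are illustrative assumptions:

```python
def incrementality(control_sales, test_sales, baseline_ratio=1.0):
    """Ads paused in the test region, running in the control region.
    baseline_ratio = control / test sales in the pre-test period,
    used to normalize for pre-existing differences between regions."""
    expected_test = control_sales / baseline_ratio  # test sales if ads had run
    lift = expected_test - test_sales               # sales the ads drove
    return lift / expected_test

# Control sold 110k with ads on; test sold 85k with ads paused.
# Historically, control sells 1.1x what the test region sells.
coef = incrementality(control_sales=110_000, test_sales=85_000, baseline_ratio=1.1)
print(f"incrementality coefficient: {coef:.0%}")
```

The resulting coefficient is the discount factor to apply to attributed conversions, as in the real-ROAS adjustment discussed earlier. A single pre-test ratio is a crude normalization; longer baselines and multiple region pairs make it more robust.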

Third step: report new and returning customers separately. Most platforms already support this distinction natively. If 70% of your conversions come from existing customers, your 5:1 ROAS actually tells a very different story than the one you read at first glance. Perhaps performance is excellent at retention but weak at acquisition. Without this separation, there's no way to know.
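Splitting a blended ROAS into its acquisition and retention components is one line of arithmetic per segment. A sketch with invented figures matching the 70/30 scenario above:

```python
def split_roas(spend, new_revenue, returning_revenue):
    """Break a blended ROAS into acquisition vs retention components."""
    total = new_revenue + returning_revenue
    return {
        "blended": total / spend,
        "new_customers": new_revenue / spend,
        "returning": returning_revenue / spend,
    }

r = split_roas(spend=10_000, new_revenue=15_000, returning_revenue=35_000)
print(r)
```

Here a healthy-looking 5:1 blended ROAS conceals a 1.5:1 acquisition ROAS, which is exactly the stalled-growth pattern described below.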

Fourth step: connect marketing data with business data. The classic funnel model assumes a straight line from click to sale. Reality is far more circular, and metrics need to reflect that. Media Mix Modeling (MMM) provides a holistic view of each channel's contribution, including those hard to attribute directly like video campaigns or awareness display.

What changes when you report differently

We've seen accounts where ROAS was 6:1 and the client was frustrated that growth had stalled. And we've seen accounts with 3:1 ROAS generating aggressive quarter-over-quarter growth. The difference always lay in what sat beneath ROAS: customer quality, real incrementality, the split between acquisition and retention spending.

ROAS remains a useful indicator. But it's a starting point, not the destination. The difference between a performance team that delivers pretty numbers and one that delivers real growth lies precisely in the metrics they add on top of ROAS. And in the willingness to present them to the client, even when the story becomes more nuanced than a single number on a slide.