Every week, Jessica, the senior marketing manager at a prominent retail brand, eagerly awaits her performance report. It arrives punctually, and at first glance, it appears promising: **paid search is flourishing**, **social media is performing efficiently**, and **remarketing continues to dominate in ROAS**. The agency’s dashboards boast clear wins, comprehensive data, and seemingly straightforward recommendations.
Yet, beneath this polished exterior, Jessica feels an unsettling hesitation. She understands a crucial truth: her audience **does not follow a linear journey**.
- They’re mobile-first, constantly **switching screens** and **absorbing content** with fractured attention.
- They encounter her brand on platforms like **podcasts**, **CTV**, and even **mobile games**—vital channels where ad impact is significant, but measurement remains elusive.
What should she prioritize? The tidy data from her dashboards or the **complex, messy reality** of consumer behavior?
When Top Performers Are Just the Best-Measured Channels
It’s no surprise that channels like **remarketing**, **paid search**, and **lower-funnel social ads** often take center stage in performance reports. These platforms showcase strong last-click conversions, favorable ROAS, and well-defined audience targeting. But what about:
- The **CTV campaign** that prompted a subsequent brand search?
- The **mobile game ad** that ignited initial awareness?
- The **podcast sponsorship** that fostered trust long before the user reached the website?
These influential channels often go unrecognized in standard reporting. Because they rarely earn last-click credit, they expose a **common frustration**: the channels that are easiest to measure tend to garner the most credit, even when they aren't the ones driving overall brand success.
Explore more: How smarter measurement can fix marketing’s performance trap
The Role of the Ad Server: An Often-Overlooked Source of Truth
Complicating matters further is the **ad server**. Jessica’s media strategy operates across multiple platforms such as search, social, and DSPs, relying on a third-party ad server to unify impressions, clicks, and conversions. This is expected to serve as a neutral arbiter of truth. However, discrepancies between platform-reported data and ad server statistics are common.
Why the mismatch? Platforms tend to self-attribute, often presenting an inflated picture of their performance. On the other hand, the ad server takes a more conservative approach, tracking unassisted conversions using a stricter attribution window. While this helps prevent double-counting, it can downplay the influence of upper-funnel channels, especially those involving view-through behavior.
When Jessica compares the data from the ad platform with the data from the ad server, they frequently contradict each other. Worse, neither aligns with insights from her internal CRM and analytics platform. The result is what can be termed a **data conflict triangle**: ad platform vs. ad server vs. internal metrics.
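To make the triangle tangible, here's a small illustrative sketch in Python. Every number below is invented for demonstration; only the three source categories come from the scenario above:

```python
# A toy quantification of the "data conflict triangle": compare conversion
# counts from each source against the internal CRM as the chosen baseline.
# All figures are invented for illustration.
conversions = {
    "ad_platform": 1180,   # platforms self-attribute, so this skews high
    "ad_server": 940,      # stricter windows and deduplication skew it low
    "internal_crm": 1015,  # orders actually recorded internally
}

baseline = conversions["internal_crm"]
for source, count in conversions.items():
    delta = (count - baseline) / baseline * 100
    print(f"{source:>12}: {count:5d}  ({delta:+.1f}% vs. CRM)")
```

Even a simple readout like this makes the disagreement explicit, which is the first step toward deciding which source to trust for which decision.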
View-Through Conversions: A Messy Middle Ground
Jessica is aware that she cannot overlook **view-through conversions**, particularly in channels like display, video, and CTV, where clicks may be sparse, but impressions fuel brand recall. However, it’s crucial to recognize that **not all view-through conversions** are created equal.
- Assisted view-through conversions suggest that an ad played a role but wasn’t the final touchpoint.
- Unassisted (or last-view) conversions credit the last impression before the conversion.
The manner in which view-through conversions are tracked can dramatically alter the apparent performance of various channels. For instance, an ad platform might credit CTV for driving conversions via view-through, yet the ad server may dispute this, showing users converting later through search or direct traffic.
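To make the distinction concrete, here's a minimal Python sketch of how a single conversion might be classified. The event schema, the 24-hour window, and the timestamps are all assumptions for illustration, not any platform's actual logic:

```python
from datetime import datetime, timedelta

# Hypothetical touchpoint log for one converting user. The 24-hour
# view-through window and the event schema are illustrative assumptions.
VIEW_WINDOW = timedelta(hours=24)

touchpoints = [
    {"channel": "ctv",    "type": "impression", "ts": datetime(2024, 5, 1, 20, 15)},
    {"channel": "search", "type": "click",      "ts": datetime(2024, 5, 2, 9, 30)},
]
converted_at = datetime(2024, 5, 2, 9, 35)

def classify(touchpoints, converted_at, window):
    recent = [t for t in touchpoints if timedelta(0) <= converted_at - t["ts"] <= window]
    clicks = [t for t in recent if t["type"] == "click"]
    views = [t for t in recent if t["type"] == "impression"]
    if clicks:
        # A click closed the journey, so in-window impressions count only
        # as *assisted* view-throughs -- credit a strict ad server may drop.
        return {"credited": max(clicks, key=lambda t: t["ts"])["channel"],
                "assisted_view_through": [v["channel"] for v in views]}
    if views:
        # No click at all: the last impression earns *unassisted*
        # (last-view) credit.
        return {"credited": max(views, key=lambda t: t["ts"])["channel"],
                "assisted_view_through": []}
    return {"credited": None, "assisted_view_through": []}

print(classify(touchpoints, converted_at, VIEW_WINDOW))
# {'credited': 'search', 'assisted_view_through': ['ctv']}
```

Run with these inputs, search gets the conversion and CTV is reduced to an assist, which is exactly the disagreement described above: the CTV platform would claim a view-through conversion, while the ad server hands the credit to search.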
In this murky landscape, the pressing question isn’t about being right or wrong, but about understanding what type of influence you’re measuring. This insight is essential for making optimization decisions that genuinely reflect consumer behavior.
The Cross-Device Conundrum
Then comes the wildcard: **cross-device tracking**. Jessica’s audience transitions not only between channels but across various screens. A potential customer might:
- Watch a CTV ad in the evening.
- Search for the brand on their mobile device the following morning.
- Click on a remarketing ad on their desktop during lunch.
These touchpoints often appear disjointed unless robust identity resolution is employed, which most brands struggle to achieve. Consequently, the last touchpoint tends to receive the bulk of credit—commonly a desktop click. The dilemma is not that last-click interactions lack importance; rather, every interaction preceding the final click contributes to the journey, even if it remains invisible in standard metrics.
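As a simplified illustration of what identity resolution does when it works, consider this sketch. The device IDs, the event log, and the login-based stitching are all hypothetical; real identity graphs are far more involved:

```python
from collections import defaultdict

# Illustrative event log: (device_id, login_id_or_None, touchpoint).
# Deterministic stitching on a shared login is one simple approach;
# everything here is invented for illustration.
events = [
    ("ctv-device-9", None,        "ctv_impression"),        # evening CTV ad
    ("phone-abc",    "user#4821", "branded_search_click"),  # next morning
    ("desktop-xyz",  "user#4821", "remarketing_click"),     # lunch break
    ("ctv-device-9", "user#4821", "app_login"),             # links the TV
]

person_devices = defaultdict(set)
for device, login, _ in events:
    if login:
        person_devices[login].add(device)

# Re-read the log through the stitched identity: three screens, one journey.
devices = person_devices["user#4821"]
journey = [touch for dev, _, touch in events if dev in devices]
print(journey)
# ['ctv_impression', 'branded_search_click', 'remarketing_click', 'app_login']
```

Without that stitching step, the same log reads as three unrelated visitors, and the desktop click collects all the credit.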
Compounding this issue are mobile tracking challenges, app-to-web navigation hurdles, and privacy limitations (consider iOS and SKAdNetwork), which can further obfuscate attribution. As a result, upper-funnel mobile impact frequently goes unacknowledged.
Delve deeper: The real reason marketing measurement keeps failing
What Should You Optimize Then?
This is the juncture where remarkable marketers distinguish themselves from those who merely settle for good reporting. Exceptional marketers shift their focus from “**What performed?**” to “**What influenced?**” Here’s how you can adopt this mindset.
1. Look at Attribution in Layers
Utilize multiple attribution models—**last-click**, **first-touch**, and **multi-touch**—to analyze how performance shifts across different frameworks. Channels like podcasts, CTV, and gaming may appear underwhelming in last-click models but resonate powerfully in assist roles. If CTV and podcast efforts indicate lift in first-touch or assist models, that’s a **positive sign**, not a failure.
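A quick sketch makes the point: the "winner" depends entirely on the model. The journey below is invented to mirror the channels discussed in this article:

```python
from collections import defaultdict

# One invented journey, ordered from first touch to last.
journey = ["podcast", "ctv", "paid_search", "remarketing"]

def last_click(journey):
    return {journey[-1]: 1.0}

def first_touch(journey):
    return {journey[0]: 1.0}

def linear_multi_touch(journey):
    credit = defaultdict(float)
    for channel in journey:
        credit[channel] += 1.0 / len(journey)
    return dict(credit)

for model in (last_click, first_touch, linear_multi_touch):
    print(f"{model.__name__:>18}: {model(journey)}")
# Same journey, three different winners: remarketing under last-click,
# the podcast under first-touch, and an even 25% split under linear.
```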
2. Use the Ad Server as a Check, Not the Bible
While ad servers can help deflate inflated platform numbers, they are not infallible. Use them to sanity-check reporting and understand cross-platform behavior, especially when untangling overlapping conversions.
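One concrete way to apply this is deduplication. In the toy sketch below, each platform self-attributes the same invented orders, and a simple set union shows how much double-counting a unified ad-server view would surface:

```python
# Invented self-attributed claims: each platform takes full credit,
# so the same order IDs appear in more than one report.
platform_claims = {
    "search":      {"ord-1", "ord-2", "ord-3"},
    "social":      {"ord-2", "ord-3", "ord-4"},
    "remarketing": {"ord-3", "ord-4", "ord-5"},
}

claimed = sum(len(ids) for ids in platform_claims.values())  # 9 claimed
unique = set().union(*platform_claims.values())              # 5 real orders

print(f"platforms claim {claimed} conversions, "
      f"but only {len(unique)} unique orders exist "
      f"({claimed - len(unique)} double-counted)")
```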
3. Isolate Impact with Holdouts or Geo Tests
If you're uncertain about a channel's value (say, CTV or podcasts), run a **holdout test**: pause the campaign in specific regions or segments. If performance in those regions dips, including in the channels that kept running, the "underperforming" channel was likely contributing more than the reports suggested.
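A back-of-the-envelope readout of such a test might look like the sketch below. The user counts and conversion figures are invented, and a real test would also check statistical significance and control for regional seasonality:

```python
# Toy geo-holdout readout: the channel stays live in exposed regions
# and is paused in holdout regions. All numbers are invented.
exposed = {"users": 120_000, "conversions": 2_640}  # channel live
holdout = {"users": 118_000, "conversions": 2_301}  # channel paused

exposed_rate = exposed["conversions"] / exposed["users"]
holdout_rate = holdout["conversions"] / holdout["users"]
lift = (exposed_rate - holdout_rate) / holdout_rate

print(f"exposed {exposed_rate:.2%} vs. holdout {holdout_rate:.2%} "
      f"-> {lift:+.1%} relative lift")
# exposed 2.20% vs. holdout 1.95% -> +12.8% relative lift
```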
4. Look for Proxy Signals Where Direct Tracking Fails
In the complex landscape of cross-device journeys, use **proxy indicators** to gauge impact:
- **Branded search volume**.
- **Direct traffic spikes**.
- **Social mentions** or **TikTok engagement**.
- **Post-exposure surveys**.
Even imperfect signals can provide greater guidance than mere click data.
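For example, a simple live-versus-dark comparison of branded search volume against a hard-to-track channel's flight dates can serve as a first proxy check. Everything in the sketch below (the weeks, the volumes, the flight schedule) is invented for illustration:

```python
# Weekly branded-search volume, tagged by whether the hard-to-track
# channel (say, the podcast sponsorship) was live or dark that week.
weekly_branded_search = {
    "2024-W18": (3_100, "dark"),
    "2024-W19": (3_250, "dark"),
    "2024-W20": (4_080, "live"),
    "2024-W21": (4_310, "live"),
    "2024-W22": (3_330, "dark"),
}

def mean(values):
    return sum(values) / len(values)

live = mean([v for v, status in weekly_branded_search.values() if status == "live"])
dark = mean([v for v, status in weekly_branded_search.values() if status == "dark"])

print(f"branded search: {live:,.0f}/wk live vs. {dark:,.0f}/wk dark "
      f"({(live - dark) / dark:+.1%})")
# branded search: 4,195/wk live vs. 3,227/wk dark (+30.0%)
```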
Discover more: The smarter approach to marketing measurement
Bottom Line: Trust the Data, But Trust Your Audience More
You need data and structure, but your dashboard alone does not define your strategy. When channels appear to underperform in reports even though your audience is clearly engaging, the issue lies with the measurement, not the media.
- **Optimize what truly works**, not just what your dashboards can see.
- **Ask tough questions** before reallocating budget.
- And remember: your customer’s journey is indifferent to your attribution model.