MarTech SaaS · Audit

How a MarTech SaaS Discovered Their True CAC Was 2.1× Higher Than Reported

A seed-stage MarTech SaaS running $18K/mo across Google, Meta, and LinkedIn looked healthy on paper: CPLs were "in range" and lead volume was growing. Revenue wasn't. The founder asked: "Ad platforms say 120+ leads per month. HubSpot says 50. Who's right?" Neither. A year of optimizations had been made on inflated conversion data, and true CAC was 2.1× higher than what had been shown to the board.

The Problem

The platforms said 120+ leads per month. HubSpot said 50. The real number was ~70. Both sides were wrong, for different reasons.

Platforms were over-counting. Google had two conversion tags firing on the same form submission — one from a previous freelancer, one the founder added later. Every form fill counted twice. Meta's pixel was firing on page load of the thank-you page, not the form submission — bot visits, page refreshes, and direct URL navigations all counted as "conversions." One contact who bookmarked the thank-you page generated 14 "conversions" in a single month. LinkedIn was treating demo requests, whitepaper downloads, and newsletter signups as the same "Lead" event with equal weight.
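
The page-load misfire is easy to see in raw event data. A minimal sketch below (hypothetical event log, not the client's actual data) counts at most one form-submit conversion per contact per hour and discards page-load triggers, showing how the platform total inflates:

```python
from datetime import datetime

# Hypothetical raw "conversion" events as an ad platform might log them.
# Each event: (contact_id, event_type, timestamp, trigger)
events = [
    ("c1", "Lead", "2024-03-01T10:00:00", "form_submit"),
    ("c1", "Lead", "2024-03-01T10:00:01", "form_submit"),  # duplicate tag firing
    ("c2", "Lead", "2024-03-02T09:15:00", "page_load"),    # thank-you page refresh
    ("c2", "Lead", "2024-03-02T09:15:30", "page_load"),
    ("c3", "Lead", "2024-03-03T14:00:00", "form_submit"),
]

def true_conversions(events, window_seconds=3600):
    """Count at most one conversion per contact per time window,
    ignoring events triggered by page loads rather than form submits."""
    seen = {}
    count = 0
    for contact, _, ts, trigger in sorted(events, key=lambda e: e[2]):
        if trigger != "form_submit":
            continue  # page-load "conversions" are refreshes/bots, not leads
        t = datetime.fromisoformat(ts)
        last = seen.get(contact)
        if last is None or (t - last).total_seconds() > window_seconds:
            count += 1
            seen[contact] = t
    return count

print(true_conversions(events))  # platform would report 5 events; deduped: 2
```

The same logic, run against a platform export, is how the 120-vs-50-vs-70 discrepancy was reconciled.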

CRM was under-counting. HubSpot only captured leads from the main website form. Trial signups from the PLG flow were tracked in Mixpanel and never synced — about 30% of actual leads were invisible to the CRM entirely.
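
Closing that 30% gap is a small sync job: pull trial-signup events from Mixpanel and upsert them as HubSpot contacts. A sketch of the mapping step, with illustrative field names rather than the client's actual Mixpanel or HubSpot schema:

```python
# Map a Mixpanel "trial_signup" event to a HubSpot contacts-API style
# properties payload. Property names here are illustrative assumptions.

def mixpanel_to_hubspot(event: dict) -> dict:
    """Translate a Mixpanel trial-signup event into a HubSpot
    contact-upsert payload so PLG leads become visible in the CRM."""
    props = event["properties"]
    return {
        "email": props["$email"],
        "properties": {
            "lifecyclestage": "lead",
            "lead_source": "plg_trial",          # distinguishes PLG from form leads
            "trial_signup_date": props["time"],
        },
    }

signup = {
    "event": "trial_signup",
    "properties": {"$email": "jane@example.com", "time": "2024-03-05"},
}
payload = mixpanel_to_hubspot(signup)
print(payload["properties"]["lead_source"])  # plg_trial
```

Once every trial signup lands in HubSpot with a distinct source, CRM lead counts stop undercounting by channel.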

Algorithms were flying blind. No offline conversions on any platform. No clear primary conversion action anywhere. Budget allocation hadn't changed in 4 months — not because it was optimal, but because no one trusted the data enough to move it.

"Our entire last board deck was based on wrong numbers. That's… not great."

What I Found

  1. Meta was the biggest misallocation. Inflated pixel data had it looking like a strong performer — $5K/mo, nearly 28% of total spend. CRM-verified reality: fewer than 10 pipeline-qualified leads in the past 6 months. The pixel had been optimizing toward phantom conversions — page loads, bot visits, refreshes — so the algorithm never learned what a real lead looked like. Every dollar of that $5K/mo was training Meta to find more noise.

  2. Google had a reasonable structure but no signal clarity. All conversion types lumped together. A branded search trial signup looked identical to a non-brand whitepaper download. The algorithm couldn't distinguish high-intent from low-intent.

  3. The trial and demo funnels were cannibalizing each other. Both audiences saw the same ads pointing to the same landing page with both "Start Free Trial" and "Book a Demo" buttons. Trial visitors converted at 4× the rate, but demo-seekers had 6× higher ACV. The page was optimized for neither.

  4. Nobody was incompetent. The system had no owner. Tracking had been built incrementally over 18 months by different people, and nobody ever reconciled the full path from click → conversion → CRM.

The Roadmap

🔴 Critical fixes (Week 1–2): Remove duplicate GTM tags. Reconfigure Meta pixel to fire on form submission, not page load. Separate LinkedIn conversion events by type. Implement server-side tracking for Google and Meta CAPI. Set up Mixpanel → HubSpot sync for PLG trial signups. Result: platform-reported conversions reconciled with CRM.
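
One detail worth calling out on the CAPI item: server-side events only reconcile with the browser pixel if both carry the same `event_id`, which Meta uses for deduplication. A minimal payload-construction sketch (no API call, illustrative values):

```python
import hashlib
import time
import uuid

def capi_lead_event(email, event_id=None):
    """Build a Meta Conversions API 'Lead' event payload. Sharing the same
    event_id between the browser pixel and this server event lets Meta
    deduplicate the pair, so each form submit counts once."""
    return {
        "event_name": "Lead",
        "event_time": int(time.time()),
        "event_id": event_id or str(uuid.uuid4()),  # must match the pixel's eventID
        "action_source": "website",
        "user_data": {
            # Meta expects identifiers hashed with SHA-256, lowercased/trimmed
            "em": [hashlib.sha256(email.strip().lower().encode()).hexdigest()],
        },
    }

event = capi_lead_event("Jane@Example.com", event_id="form-123")
```

Without matching IDs, server-side tracking would have re-created the double-counting problem it was meant to fix.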

🟡 Quick wins (Week 3–4): Set up offline conversion imports from HubSpot to Google and LinkedIn. Scale Meta back to $3K/mo — awareness and remarketing only until it can prove pipeline value with clean data. Define one primary conversion action per platform. Result: $2K/mo reallocated to pipeline-generating channels, algorithms start optimizing toward pipeline.
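
The offline import itself is mechanical once HubSpot stores the GCLID captured at form submit. A sketch that builds rows in the shape of Google Ads' offline conversion upload template, using hypothetical deal data:

```python
import csv
import io

# Hypothetical HubSpot deals that qualified, each carrying the GCLID captured
# at form submit. The upload template expects the column headers below.
deals = [
    {"gclid": "Cj0ExampleClickId1", "qualified_at": "2024-04-02 14:30:00", "amount": 4800},
    {"gclid": "Cj0ExampleClickId2", "qualified_at": "2024-04-09 09:05:00", "amount": 12000},
]

def offline_conversion_csv(deals, conversion_name="Pipeline Qualified Lead"):
    """Write deals as rows for a Google Ads offline conversion upload."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Google Click ID", "Conversion Name", "Conversion Time",
                     "Conversion Value", "Conversion Currency"])
    for d in deals:
        writer.writerow([d["gclid"], conversion_name,
                         d["qualified_at"], d["amount"], "USD"])
    return buf.getvalue()

csv_text = offline_conversion_csv(deals)
```

Feeding qualified-lead conversions back this way is what lets the bidding algorithms optimize toward pipeline instead of raw form fills.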

🟢 Scale opportunities (Month 2–3): Split landing page into dedicated trial (PLG) and demo (SLG) paths. Build retargeting sequence for trial users who don't activate within 7 days. Re-evaluate channel mix after 30 days of clean data.
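
The 7-day retargeting segment is a simple filter once trial signups are visible in the CRM. A sketch with hypothetical trial records:

```python
from datetime import date

# Hypothetical trial records synced from the PLG flow.
trials = [
    {"email": "a@example.com", "signed_up": date(2024, 5, 1),  "activated": date(2024, 5, 3)},
    {"email": "b@example.com", "signed_up": date(2024, 5, 2),  "activated": None},
    {"email": "c@example.com", "signed_up": date(2024, 5, 20), "activated": None},
]

def retarget_audience(trials, today, window_days=7):
    """Trial users past the activation window with no activation event:
    the segment the retargeting sequence targets."""
    return [
        t["email"] for t in trials
        if t["activated"] is None
        and (today - t["signed_up"]).days >= window_days
    ]

print(retarget_audience(trials, today=date(2024, 5, 24)))  # ['b@example.com']
```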

The client executed the critical fixes and quick wins internally within 3 weeks, using the roadmap as a step-by-step guide. I stayed available for two months after delivery — clarifying specifics as questions came up and sharing references where needed. Implementation was fully on their side.

Results

Metric                          | Delta                                              | Timeframe
--------------------------------|----------------------------------------------------|----------
Platform vs. CRM conversion gap | 2.4× discrepancy → <10% variance                   | 2 weeks
True CAC                        | Surfaced (was 2.1× underreported)                  | 2 weeks
Meta ad spend                   | $5K → $3K (awareness + remarketing only)           | 2 weeks
Google Cost per Opportunity     | −26% after signal correction + offline conversions | 60 days
PLG leads visible in CRM        | ~70% → ~98% captured                               | 3 weeks

Based on 6 months of platform data cross-referenced with HubSpot and Mixpanel.
"Qualified" = demo completed + confirmed fit (SLG) or trial activated with repeat usage (PLG)

Key Takeaways

Reducing Meta spend felt counterintuitive — but the data was clear. Inflated pixel data had Meta looking like the second-best channel. CRM showed almost zero pipeline in 6 months. Sometimes the right finding is "stop doing this."

The scariest deliverable was a corrected CAC number. Telling a founder "your real CAC is 2× what you told the board" is not a fun conversation. But better to find out from a consultant than from a Series A investor doing due diligence.

"I was making decisions based on numbers that were fundamentally wrong. The scariest part wasn't the wasted spend. It was realizing I'd been telling the board a story that wasn't true." — Founder & CEO

This is exactly what the Paid Media Audit delivers

A full audit of tracking, account structure, and messaging-to-funnel resonance, with a 90-day execution roadmap. If your ad platforms and your CRM are telling different stories — start here: