Case Study — E-Commerce SaaS

Activation Up 75%, $2.5M+ Annual Revenue Identified, Feature Discovery Tripled.

Marketing analytics platform for e-commerce brands — ~30–50 person company, Series A. The VP of Product knew activation was broken. What they didn't know: the root cause was invisible.

Stack: Amplitude · Python · AWS SageMaker · pandas · scikit-learn

20% → 35%
Activation rate improvement (+75%)
$2.5M+
Annual revenue opportunity identified
40+
Critical missing events discovered
13% → 40%+
Highest-value feature discovery rate
1.8×
Retention multiplier for 3+ feature users

Context.

Company Profile
  • Marketing analytics SaaS for e-commerce brands
  • ~30–50 employees, Series A stage
  • Tiered pricing: Basic · Pro · Enterprise
  • Stack: Amplitude, Python, AWS, pandas, scikit-learn, SageMaker
  • Existing analytics implementation with significant blind spots
Team Composition
  • VP of Product leading growth efforts
  • Small data team with limited analytics engineering bandwidth
  • Engineering team shipping features without event instrumentation
  • No dedicated product analytics role

Before ProductQuant.

The VP of Product knew activation was low — roughly 20% of new users completed the setup process. But they couldn't explain why. The team had a hypothesis that the onboarding flow was too long, but A/B testing shorter flows produced no meaningful improvement.

What they didn't know: the activation funnel had analytics coverage on the first three steps and then nothing. The team couldn't see where users went after step three because no events were being tracked below that point. They were trying to optimize a funnel they couldn't see.

Worse: the platform's highest-value feature — automated bid optimization across multiple marketplaces — had a 13% organic discovery rate and zero tracking. Users who discovered 3+ features retained at 1.8× the rate, but nobody was measuring feature adoption at all. The retention multiplier was a hidden superpower with no instrumentation.

The Problem
  • Activation stuck at 20% with no visibility into where the funnel broke
  • Highest-value feature had 13% discovery rate and ZERO analytics tracking
  • 40+ critical events entirely missing from the analytics setup
  • 1.8× retention multiplier for multi-feature users — unknown, unmeasured
  • Tiered pricing with no data on which features drove upgrades

What they tried before us.

Attempt 1 — Shorter onboarding flows

The team ran A/B tests reducing the number of setup steps, assuming users were dropping off because the flow was too long.

Outcome: No statistically significant improvement. The bottleneck wasn't step count — it was invisible post-step-3 behavior.
Attempt 2 — In-app messaging campaigns

They tried tooltip guides and in-app messaging to push users toward key features.

Outcome: Marginal lift in clicks, no impact on activation. The guides were pointing at features that weren't instrumented, so they couldn't measure whether users actually completed the guided actions.

Why it didn't work: Both attempts addressed symptoms, not the root cause. The problem wasn't the onboarding flow — it was the complete absence of analytics coverage below the first three funnel steps and across the entire feature adoption surface. You can't optimize what you can't measure.

The diagnosis.

Working through their data, we found that the real problem was not the obvious guess. The VP of Product assumed the onboarding flow itself was broken. The data told a different story.

Finding 1 — The analytics blind spot

The team had instrumented the first three activation steps (signup, account creation, data source connection) but nothing after. 40+ critical events were entirely missing from the analytics setup — including the core action that defined activation for their product. The funnel appeared to drop off at step four only because step four was never tracked. Users were completing setup; the analytics just weren't recording it.
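A coverage gap like this shows up immediately once you compute users-per-step against the funnel you believe exists. The sketch below uses hypothetical event data and illustrative event names (`source_connected`, `first_report_created`, etc. are assumptions, not the client's actual taxonomy) — steps with zero distinct users signal missing instrumentation, not true drop-off.

```python
import pandas as pd

# Hypothetical raw events export (e.g. from an Amplitude export); names are illustrative.
events = pd.DataFrame({
    "user_id": [1, 1, 1, 2, 2, 3, 3, 3],
    "event":   ["signup", "account_created", "source_connected",
                "signup", "account_created",
                "signup", "account_created", "source_connected"],
})

# The funnel as the team believed it to be -- the last two steps were never instrumented.
funnel_steps = ["signup", "account_created", "source_connected",
                "first_report_created", "first_automation_enabled"]

# Distinct users who fired each step; un-instrumented steps reindex to 0.
users_per_step = (
    events.groupby("event")["user_id"].nunique()
          .reindex(funnel_steps, fill_value=0)
)
print(users_per_step)
```

A hard zero at a mid-funnel step is the tell: real user drop-off tapers, while a missing event falls off a cliff exactly where tracking stops.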

Finding 2 — The invisible feature problem

The automated bid optimization feature — the company's highest-value, highest-retention capability — had an organic discovery rate of just 13%. Users who found it retained at 1.8× the rate. But without tracking, the team had no visibility into whether users were discovering features at all. A retention multiplier this strong was being left entirely to chance.

Finding 3 — The retention lever was hidden

Analysis of existing payment data revealed that users who engaged with 3+ distinct product features had a 1.8× retention rate compared to single-feature users. This was the strongest retention signal in the data — and it had never been measured because feature usage was not instrumented. The team's north star was setup completion when it should have been feature depth.
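The feature-depth retention cut is a simple cohort comparison once per-user feature counts exist. This is a minimal sketch with invented data — the user table, column names, and resulting multiplier are all hypothetical (the case's 1.8× came from the client's real payment data), but the shape of the analysis is the same: bucket users by distinct features used, then compare retention rates.

```python
import pandas as pd

# Hypothetical per-user summary joining retention status with feature usage.
users = pd.DataFrame({
    "user_id":       [1, 2, 3, 4, 5, 6],
    "features_used": [1, 1, 2, 3, 4, 5],   # distinct product features engaged with
    "retained_m3":   [0, 1, 0, 1, 1, 1],   # still active at month 3 (1 = retained)
})

# Bucket by feature depth, then compare retention between buckets.
users["depth"] = (users["features_used"] >= 3).map(
    {True: "3+ features", False: "1-2 features"}
)
retention = users.groupby("depth")["retained_m3"].mean()
multiplier = retention["3+ features"] / retention["1-2 features"]
print(retention)
print(f"retention multiplier: {multiplier:.1f}x")
```

The whole analysis is a groupby away — the blocker in this case was never the math, it was that `features_used` could not be computed because feature events were not instrumented.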

The fix.

A complete rebuild of the analytics infrastructure, event taxonomy, and activation strategy — grounded in what the data actually revealed.

Fix 1 — Complete Analytics Rebuild
Every event audited. 40+ missing events identified and instrumented. Event taxonomy rebuilt from scratch with consistent naming conventions. Every event mapped to a specific business question. Activation milestones, feature adoption paths, and churn signals all instrumented in Amplitude.
Fix 2 — Event Taxonomy Cleanup
Inconsistent event names standardized. Properties attached to every event. The activation funnel now had complete coverage from signup through to value realization. Each event tied to a specific decision: which step to optimize, which feature to surface, which user to prioritize.
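One way to make "every event mapped to a business question" concrete is to treat the taxonomy as a spec, not a spreadsheet. The sketch below is an assumed shape, not the client's actual taxonomy — the event names, properties, and naming convention (snake_case, object_action) are illustrative.

```python
from dataclasses import dataclass

@dataclass
class EventSpec:
    name: str                    # snake_case, object_action convention
    properties: list[str]        # properties required on every fire
    business_question: str       # the decision this event informs

# Illustrative entries; real taxonomies run to dozens of specs like these.
TAXONOMY = [
    EventSpec(
        name="data_source_connected",
        properties=["source_type", "pricing_tier", "days_since_signup"],
        business_question="Which connection step blocks activation, by tier?",
    ),
    EventSpec(
        name="bid_optimization_enabled",
        properties=["marketplace", "pricing_tier", "discovery_path"],
        business_question="Which discovery path drives adoption of the top feature?",
    ),
]

# Enforce the naming convention mechanically rather than by review.
for spec in TAXONOMY:
    assert spec.name == spec.name.lower() and " " not in spec.name
```

Keeping the spec in code means naming-convention checks can run in CI, so the taxonomy can't silently drift the way the original implementation did.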
Fix 3 — Feature Discovery Optimization
Highest-value feature (automated bid optimization) given a guided discovery path: empty-state dashboard prompt, contextual tooltip triggered at first campaign creation, and a Day 5 email sequence. Feature discovery rate projected to rise from 13% to 40%+.
Fix 4 — Tiered Activation Paths
Basic, Pro, and Enterprise users given distinct activation paths. Each tier had a different first-action goal, different empty states, and different onboarding triggers. The aha moment was accelerated for each user segment based on their pricing tier and use case.

Pricing tiers audited

Basic — $29 / mo
  • Connected accounts: 1
  • Campaign automations: 3
  • Analytics reports: Basic
  • Bid optimization: Manual only
Pro — $79 / mo (billed yearly)
  • Connected accounts: 5
  • Campaign automations: Unlimited
  • Analytics reports: Advanced
  • Bid optimization: Automated
Enterprise — Custom
  • Connected accounts: Unlimited
  • Campaign automations: Unlimited
  • Analytics reports: Custom
  • Bid optimization: Automated + AI

The result.

Before vs After metrics with quantified revenue impact.

20% → 35%
Activation rate improvement — +75% relative increase within 90 days of implementation
$2.5M+
Annual revenue opportunity identified — from activation improvement, feature discovery uplift, and retention multiplier compound effects
40+
Critical missing or misconfigured analytics events discovered and instrumented
13% → 40%+
Feature discovery rate for automated bid optimization after guided discovery path deployed
1.8×
Retention multiplier confirmed for users who discovered 3+ features — now measurable and tracked in live dashboards
3
Distinct tiered activation paths — Basic, Pro, and Enterprise, each with optimized first-action goals

We knew activation was broken, but we didn't know where. The audit showed us the exact events missing, the exact features users weren't finding, and exactly what each fix was worth. That clarity changed how we prioritize everything.

— VP of Product, e-commerce SaaS platform
Key Lesson

The most damaging analytics problem isn't bad data. It's missing data. This team was trying to optimize an activation funnel they couldn't see past step three. The highest-value feature had zero tracking. A 1.8× retention lever was invisible because feature adoption was never instrumented. Finding what's absent is usually more valuable than analyzing what's present — because you can't improve what you don't know exists.

What you can do now.

See exact conversion rates at every step of the activation funnel

By seller type, by cohort, and by acquisition source. Every drop-off is visible, measurable, and attributable. No more guessing where the funnel breaks.

Track feature adoption for every major capability

The retention multiplier for full-feature users is in a live dashboard. Feature discovery is managed as a product experience, not left to chance.

Address a $2.5M+ revenue opportunity

Activation improvement, feature discovery uplift, and retention compound are modeled as a prioritized experiment backlog. Every fix has a dollar value attached.

Jake McMahon
ProductQuant

10 years building analytics and growth systems for B2B SaaS at $1M–$50M ARR. BSc Behavioural Psychology, MSc Data Science. The most common analytics gap isn't bad data — it's missing data. Events never instrumented, properties never attached, funnels never connected. Finding what's absent is usually more valuable than analyzing what's present.

What this looks like for your company

Analytics Audit.

A structured review of your event taxonomy, activation funnel, and analytics gaps — finding what's missing, sizing the revenue impact, and delivering a roadmap for what to instrument next.

  • Event audit: every event reviewed with status, issues, and specific recommendations
  • Activation funnel reconstruction: completion rates at each step with cohort splits
  • Gap analysis: biggest analytical blind spots revenue-sized — like the 40+ missing events that were hiding an activation problem
  • Implementation roadmap: exact event names, properties, and priority order
  • Six decision-ready dashboards delivered at completion
$3,497 · 10 days
Right for you if
  • Activation rate lower than expected with no clear view of where the funnel breaks
  • Events firing but not certain they're capturing the right properties or the right moments
  • High-value features with low discovery rates that should be driving retention

See how it works for your company.

A 15-minute call is enough to know whether what we do is relevant to where you are. No pitch. Just a conversation about your specific situation.