Case Study — Amazon PPC Platform · Activation Strategy

45% of signups stalled before activation. The product worked, but too many users never reached the part that retained them.

Users who activated automation within 31 days retained 1.8–1.9x better. The problem: most new users were not getting there. We mapped the stalled segments, found 28 missing analytics events, and built a phased roadmap to improve activation and unlock $2.5M+ in annual revenue.

  • 45% · Signups never activated automation
  • 28 · Critical missing analytics events identified
  • 1.9x · Retention lift confirmed for early activators
  • $2.5M+ · Annual revenue impact from roadmap

Stack: Amplitude · Python · JTBD

Before.

The platform had scaled past $10M ARR with a powerful automation engine. The value was real for users who reached it: rule creators assigned automations 6.6x on average, and Strategic Objective creators assigned them 10.1x.

The problem was the first mile. 45% of signups registered, connected their marketplace, and stalled before activating automation. The onboarding assumed users already knew what rule to build. Beginners got configuration work when they needed a goal-based starting point.

Analytics could not explain the gap. The team tracked 15 events, but none of the ones needed to measure time-to-activation, rule-setup abandonment, feature discovery, or the difference between users progressing and users at risk.
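Once the right events exist, time-to-activation reduces to two timestamps per user. A minimal sketch of the calculation, assuming a raw event log of (user, event, timestamp) tuples; the event names here are illustrative, not the platform's actual taxonomy:

```python
from datetime import datetime

# Hypothetical raw event log: (user_id, event_name, ISO timestamp).
events = [
    ("u1", "signup_completed",   "2025-10-01T09:00:00"),
    ("u1", "first_rule_created", "2025-10-04T12:00:00"),
    ("u2", "signup_completed",   "2025-10-01T10:00:00"),  # never activates
    ("u3", "signup_completed",   "2025-10-02T08:00:00"),
    ("u3", "first_rule_created", "2025-11-20T08:00:00"),  # past the 31-day window
]

def time_to_activation_days(events, start="signup_completed", goal="first_rule_created"):
    """Days from signup to first activation per user; None if never activated."""
    firsts = {}
    for user, name, ts in events:
        t = datetime.fromisoformat(ts)
        key = (user, name)
        if key not in firsts or t < firsts[key]:
            firsts[key] = t
    result = {}
    for user in {u for u, _, _ in events}:
        s, g = firsts.get((user, start)), firsts.get((user, goal))
        result[user] = (g - s).days if s and g else None
    return result

tta = time_to_activation_days(events)
activated_in_31 = sum(1 for d in tta.values() if d is not None and d <= 31)
```

With the sample log above, only one of three users activates inside the 31-day window; the gap between "activated" and "activated in time" is exactly what the missing events hid.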

The Situation
  • 45% of signups never activated automation — “Stalled Starters”
  • Only 13% ever discovered Automate Assignment — the platform’s highest-converting feature
  • 38% signup → rule creation rate; no data on what caused the other 62% to stall
  • 28 critical analytics events missing — time-to-activation was invisible
  • Guided tour had documented failure modes: users couldn’t find it or launch it reliably

Four segments hiding in the signup cohort.

The activation data separated the signup cohort into four groups. Each group needed a different intervention.

45% · Stalled Starters
They sign up, connect a marketplace, and never create a rule. This was the biggest activation gap and the primary test target: goal-oriented onboarding instead of configuration-first setup.

25% · Slow Activators
They eventually create a rule, but too late to capture the full retention lift. They need a day 7–21 intervention that suggests the first rule for their segment instead of making them hunt for it.

15% · Day-One Winners
They create a rule within 24 hours and retain best. These users already know what to build, so the design challenge was to help beginners without slowing experts down.

15% · Early Churners
They leave within two weeks and never activate. This was not the same as stalling: many expected a simpler tool or managed service. Some early departures were structural to the self-serve motion.
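The four labels reduce to a small decision rule over two timestamps per user. A hedged sketch; the 1-day and 14-day thresholds mirror the definitions above, and the field names are hypothetical:

```python
def classify(signup_day, first_rule_day=None, churn_day=None):
    """Assign a signup to one of the four activation segments.

    Days are measured from an arbitrary epoch; None means the event
    never happened. Thresholds are illustrative, not canonical.
    """
    if first_rule_day is not None:
        # Activated: split by how quickly the first rule appeared.
        return "Day-One Winner" if first_rule_day - signup_day <= 1 else "Slow Activator"
    if churn_day is not None and churn_day - signup_day <= 14:
        # Left within two weeks without ever activating.
        return "Early Churner"
    # Still around (or churned late) but never created a rule.
    return "Stalled Starter"
```

Running every signup in a cohort through this rule is what produces the 45 / 25 / 15 / 15 split, and it makes each segment a targetable list rather than a slide statistic.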

What we did.

A full activation audit: user segmentation, UX research, competitive analysis, JTBD synthesis, analytics gap analysis, and a phased implementation roadmap.

Step 1 — Baseline Automation Usage Analysis
Analysed 15 existing Amplitude events to establish the baseline. The core finding: power-user behavior was strong, but discovery was weak. Only 1.95% of weekly active users created Strategic Objectives, even though users who found the feature assigned them at a 10.1x multiplier.
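The headline multipliers in this step are simple ratios over the Amplitude export. A quick reproduction from the event counts reported in the feature table below:

```python
# Event counts from the Oct 3–10, 2025 Amplitude export.
rule_created, rule_assigned = 320, 2112  # Rule Creation / Rule Assignment
so_created, so_assigned = 144, 1454      # Strategic Objective create / assign

# Assignment multiplier: assignments generated per creation event.
rule_multiplier = rule_assigned / rule_created  # 6.6x for rules
so_multiplier = so_assigned / so_created        # ≈ 10.1x for Strategic Objectives
```

The two ratios are the 6.6x and 10.1x figures quoted in the case study: users who reach either feature lean on it heavily, which is why the weak numbers are discovery rates, not usage depth.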
Step 2 — UX Research & Transcript Analysis
Analysed a UX test recording and documented the failure points with timestamps. The guided tour was hard to find, sometimes failed to launch, and tooltips covered the UI they were supposed to explain. One tour also referenced a feature that no longer existed in the current product.
Step 3 — Competitive Architecture Analysis
Compared the onboarding model against two competitor patterns: white-glove managed service and self-serve goal selection. The strategic direction was clear: keep self-serve accessibility, but make activation more goal-oriented so beginners do not have to manually design their first automation rule.
Step 4 — JTBD Framework
Synthesised the core job from 650 exit comments, 33 year-end surveys, and 2,914 Intercom support tickets. The user did not want to "build a rule." They wanted confidence that PPC optimization was covered without daily manual work. That made goal-based onboarding a better fit than rule-configuration prompts.
Step 5 — Analytics Gap Analysis (28 Missing Events)
Mapped 28 missing events across the automation funnel, advanced features, and engagement layer. The most important missing event was first_automation_activated. Without it, the team could not measure who reached the critical 31-day activation window.
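first_automation_activated is a first-occurrence event, so the instrumentation has to deduplicate per user. A minimal sketch, not the actual implementation: `track` stands in for the real analytics SDK call (Amplitude or otherwise), and the in-memory set would be a persisted per-user flag in production.

```python
emitted = []

def track(user_id, event_name, properties=None):
    """Placeholder for the real analytics SDK call; records (user, event)."""
    emitted.append((user_id, event_name))

_first_seen = set()  # in production: a persisted per-user flag, not process memory

def on_automation_activated(user_id, rule_id):
    """Call whenever any automation rule goes live for a user."""
    track(user_id, "automation_activated", {"rule_id": rule_id})
    if user_id not in _first_seen:
        # Emit the first-occurrence event exactly once per user.
        _first_seen.add(user_id)
        track(user_id, "first_automation_activated", {"rule_id": rule_id})

on_automation_activated("u1", "r1")
on_automation_activated("u1", "r2")  # second activation: no duplicate first_* event
```

With this one event in place, the 31-day activation window becomes a straightforward cohort query instead of an unanswerable question.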
Step 6 — Phased Implementation Roadmap
Built a three-phase roadmap. Phase 1: critical event implementation and a goal-oriented onboarding A/B test. Phase 2: Automate Assignment discovery and results attribution. Phase 3: AI-suggested first rule plus analytics dashboards. Total modeled investment: $600K. Expected annual revenue impact: $2.5M+.

Where users were — and weren’t.

The feature data showed a simple pattern: the strongest features worked for users who found them. Discovery, not product value, was the bottleneck.

Feature | Discovery (% WAU) | Total Events | Avg per User | Signal
Rule Assignment (core automation usage) | 9.36% | 2,112 | 1.46x | Strong Reuse
Strategic Obj. Assign (most powerful feature) | 6.08% | 1,454 | 1.51x | 10.1x Multiplier
Scale Optimizer Fix Now (recommendation acceptance) | 3.13% | 314 | 1.65x | Underutilised
Rule Creation (entry-level action) | 4.47% | 320 | 1.27x | Low Repeat
Strategic Obj. Create (feature initiation) | 1.95% | 144 | 1.27x | Hidden
Scale Optimizer Automate (automation from recommendation) | 0.36% | 18 | 1.50x | Rarely Found

Data from Oct 3–10, 2025. WAU = 3,289 users. Strategic Objectives: 144 created, 1,454 assigned. The 10.1x multiplier confirms power-user value; the 1.95% create rate confirms the discovery problem.

After.

  • 45% · Stalled Starter rate quantified, giving the team a real baseline for activation experiments
  • 1.9x · Retention lift confirmed for users who activate within 31 days
  • 28 · Missing analytics events specified across three priority tiers
  • 10.1x · Strategic Objectives assignment multiplier, proving the feature was valuable when discovered
  • $2.5M+ · Annual revenue impact from the three-phase activation roadmap
  • $150K · Phase 1 investment for critical events and the goal-oriented onboarding A/B test

What changed for the team.

The retention advantage became measurable. The team knew activation within 31 days mattered, and had a clear event plan for tracking who reached that milestone.

The next experiment became obvious. Instead of asking beginners to configure rules manually, test goal-oriented onboarding: Growth, Profitability, or Cost Control as the starting point.

Strategic Objectives moved from hidden power feature to targeted intervention. The feature had a 10.1x assignment multiplier, but only a 1.95% create rate, making discovery the clear product opportunity.

Jake McMahon
ProductQuant

10 years building growth systems for B2B SaaS companies at $1M–$50M ARR. BSc Behavioural Psychology, MSc Data Science. This engagement combined UX transcript analysis, competitive architecture review, JTBD synthesis from 3,500+ data points, analytics gap analysis, and a three-phase activation roadmap.

What this looks like for your company

Activation Deep Dive.

Two weeks to map your activation funnel end-to-end, confirm where it breaks with data, identify your top three fixes ranked by impact, and agree on an activation definition tied to retention.

  • Full activation funnel mapped from signup to aha moment with completion rates at each step
  • Drop-off points confirmed with data — cohort breakdowns by plan, channel, and user type
  • Top 3 fixes ranked by revenue impact: quick wins separated from structural changes
  • Activation event defined and validated against 30-day retention; baseline established
$4,997 · 2 weeks
Right for you if
  • Activation rate below 40% or declining — users signing up but not reaching value
  • Multiple user types with radically different starting points, goals, and prior experience
  • Know people are dropping off but can’t pinpoint where in the funnel or why

Power users retain. New users stall. What blocks the path?

If your best users get value but most new users never reach that behavior, the problem is usually activation, not product-market fit. A short call is enough to see whether the same pattern is showing up in your funnel.