6-WEEK COHORT PROGRAM · $1,297/SEAT

6 weeks. Leave with clean instrumentation, a metric hierarchy your whole team agrees on, and 3 experiments running — with an analytics operating rhythm that keeps working after the program ends.

Live cohort, 6–12 participants, applied to your real product data. You audit your current stack, rebuild instrumentation from first principles, define the metrics that actually matter for your product type, and run your first structured experiment — all with coaching from Jake and a group working on the same problems.

Jake McMahon, ProductQuant
Apply for the next cohort → Read the full curriculum ↓

PROGRAM DETAILS

Duration: 6 weeks
Sessions: 2 live sessions per week
Cohort size: 6–12 participants
Async work: Homework applied to your real product
Recording: All sessions recorded

$1,297/seat · limited seats per cohort

Delivered by Jake McMahon · Founder, ProductQuant · 8+ years B2B SaaS product analytics · Australian product leader
Duration: 6 weeks, 2 sessions/week
Format: Live cohort via Zoom, 6–12 seats
Deliverable: Clean PostHog setup + metric hierarchy + 3 live experiments
Price: $1,297/seat

You have the tools. The dashboards. None of it connects.

PostHog is set up. Mixpanel has been running for 8 months. There are 12 dashboards. And when the CEO asks "what's our activation rate?" the answer changes depending on who's in the room. Product has one number. Marketing has another. The analyst pulls a third. Nobody's wrong — they're just using different event definitions that nobody agreed on before they started tracking.

Your events are firing, but you started tracking before you decided what to track. So you have 3 different definitions of "active user," a funnel that stops halfway through because someone forgot to instrument the final step, and a dashboard that's technically correct but answers a question nobody's actually asking. Every time someone builds a new query, they have to make a decision that should have been made once, documented, and never revisited.

Marketing is reporting a number that doesn't match product's number. Not because anyone made a mistake — because nobody agreed on the event schema before the first engineer started firing events. The taxonomy was never designed. It accumulated.

The result: dashboards nobody trusts, analysis that takes twice as long as it should, and a growing sense that your analytics investment isn't actually helping you make faster decisions. You're not behind because you chose the wrong tool. You're behind because clean instrumentation is infrastructure, and infrastructure has to be built deliberately.

What changes after 6 weeks.

Before the program
  • 3 different definitions of "active user" depending on who pulls the query
  • Dashboards that are technically correct but answer questions nobody's actually asking
  • No agreed metric hierarchy — every team measures what they care about independently
After the program
  • Clean event taxonomy with governance documentation — one definition, one source of truth
  • Metric hierarchy aligned to your growth motion, built specifically for your product type
  • 3 live experiments running with proper statistical controls, in your product not a doc

4 concrete deliverables, built on your own product.

A fully instrumented PostHog (or equivalent) setup with clean event taxonomy
Not a recommended configuration — your actual setup, audited and fixed. Clean event naming conventions, governance documentation, and an instrumentation plan so the taxonomy stays clean as the product changes.
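To give a flavor of what "clean event naming conventions" means in practice, here is a minimal sketch. The snake_case `object_action` pattern and the validator below are illustrative assumptions, not the program's prescribed scheme — the point is that a convention you can check mechanically is one that stays clean.

```python
import re

# Illustrative convention (an assumption, not program material):
# snake_case "object_action" names, e.g. "signup_completed", "report_exported".
EVENT_NAME_RE = re.compile(r"^[a-z]+(_[a-z]+)+$")

def is_valid_event_name(name: str) -> bool:
    """Return True if the event name follows the object_action convention."""
    return bool(EVENT_NAME_RE.fullmatch(name))

print(is_valid_event_name("signup_completed"))   # True
print(is_valid_event_name("SignUp-Completed"))   # False — mixed case and hyphen
```

A check like this can run in CI against the governance doc, so a new event that breaks the convention never ships.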
A metric hierarchy aligned to your growth motion
The 3–5 metrics that actually matter for your stage and growth model, structured into a hierarchy that connects team-level work to business outcomes. Built specifically for your product, not copied from a generic framework.
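"One definition, one source of truth" can be as simple as a metric definition versioned in code instead of re-derived per query. The activation events and 7-day window below are hypothetical examples, not a recommended definition — the structure, not the specifics, is the point.

```python
from datetime import datetime, timedelta

# Hypothetical single source of truth for "activated user".
# The specific events and window are placeholders, not a recommendation.
ACTIVATION_EVENTS = {"project_created", "teammate_invited"}
ACTIVATION_WINDOW = timedelta(days=7)

def is_activated(signup_at: datetime, events) -> bool:
    """events: iterable of (event_name, timestamp) pairs for one user.

    Activated = completed every ACTIVATION_EVENT within the window.
    """
    done = {name for name, ts in events
            if ts - signup_at <= ACTIVATION_WINDOW}
    return ACTIVATION_EVENTS <= done
```

When product, marketing, and the analyst all call this one function (or its SQL equivalent), the "whose number is right?" meeting disappears.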
3 live experiments running with proper statistical controls
Experiments designed from your audit findings, with correct hypothesis formation, sample size calculation, primary metric selection, and guardrail metrics. Running in your product, not in a planning doc.
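For a sense of what "sample size calculation" involves, here is a standard two-proportion z-test sizing sketch using only the Python standard library. The baseline rate and lift in the example are made up; this is textbook power analysis, not the program's proprietary method.

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p_baseline: float, mde_abs: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Per-variant sample size for a two-sided two-proportion z-test.

    p_baseline: baseline conversion rate (e.g. 0.20).
    mde_abs: minimum absolute lift worth detecting (e.g. 0.05).
    """
    p2 = p_baseline + mde_abs
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha=0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    p_bar = (p_baseline + p2) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p_baseline * (1 - p_baseline) + p2 * (1 - p2))) ** 2
         / mde_abs ** 2)
    return ceil(n)

# e.g. detecting a 5-point lift on a 20% baseline needs ~1,100 users per variant
print(sample_size_per_variant(0.20, 0.05))
```

Running the number before launch is what keeps an experiment from being quietly declared a winner on 80 users.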
A repeatable analytics operating rhythm your team owns after you leave
The weekly and monthly review process, escalation paths, and governance model your team needs to keep analytics trustworthy without Jake in the room. The goal is independence, not dependency.

Live cohort. Applied to your real product every week.

Duration
6 weeks
2 live sessions per week. Each session builds on the last. Applied work between sessions on your own product.
Cohort size
6–12 seats
Small enough that Jake reviews every async submission personally and every participant gets airtime in live sessions.
Async work
~2 hrs/week
Structured homework applied to your own product. Reviewed by Jake before the next session with specific written feedback.
Recording
All sessions
Every session recorded and available for the duration of the cohort and 12 months after.
Platform
Zoom + Slack
Live sessions on Zoom. Async work and cohort discussion in a private Slack channel.
Tools
PostHog focus
Primary focus on PostHog. The audit framework applies to Mixpanel, Amplitude, or any event-based tool.

What gets built each week.

W1
Analytics audit
Audit your current stack — event schema, taxonomy, funnel gaps. Every participant audits their own product with Jake reviewing live.
W2
Clean instrumentation
Rebuild the event taxonomy from first principles. Naming conventions, governance doc, and instrumentation plan applied to your actual product.
W3
Metric hierarchy
Define the 3–5 metrics that actually matter for your stage and growth model. Build the hierarchy that connects team-level work to business outcomes.
W4
Dashboard build
Build the dashboards your team will actually use — structured around your metric hierarchy, not around what PostHog defaults to showing.
W5
Experiment design & launch
Design 3 experiments from your audit findings. Hypothesis, sample size, primary metric, guardrails. Each participant launches their experiments during this week.
W6
Operating rhythm & handoff
Build the weekly and monthly review process, escalation paths, and governance model. The system that keeps analytics trustworthy after the program ends.
Read the full curriculum →

Who this program is built for — and who it isn’t.

Good fit
  • Product teams with PostHog, Amplitude, or Mixpanel set up — but dashboards nobody trusts and no agreed metric hierarchy
  • Seed–Series B companies where every analyst runs different queries to answer the same question
  • Teams that know they have an instrumentation problem but don’t have bandwidth to fix it without external structure
Not the right fit
  • Teams who already have a clean taxonomy and an agreed metric hierarchy — you’re past the foundation phase
  • Solo founders without an engineering resource to implement tracking — the instrumentation work requires someone who can ship code
  • Teams on a BI tool without a product analytics layer — the audit framework is built for event-based tools

What teams walk away with.

“Placeholder — replace with a real quote from a past cohort participant.”

Name, Role — Company

“Placeholder — replace with a real quote from a past cohort participant.”

Name, Role — Company

“Placeholder — replace with a real quote from a past cohort participant.”

Name, Role — Company

Per-seat pricing. Apply to join the next cohort.

A product analytics contractor running an instrumentation project for 4 weeks costs $8,000–$20,000. This cohort delivers the same structured output for $1,297/seat — with the added benefit of peer review across 6–12 companies working through identical problems.

Per Seat
$1,297 /seat

One-time payment per seat. No recurring fee.

  • 12 live sessions over 6 weeks (2 per week)
  • Cohort of 6–12 participants
  • Async homework reviewed by Jake each week
  • Full analytics audit of your own stack
  • Clean event taxonomy and instrumentation plan
  • Metric hierarchy built for your growth motion
  • 3 experiments designed and launched
  • Analytics operating rhythm documentation
  • All sessions recorded, 12-month access
  • Private cohort Slack channel
Apply for the next cohort →
2-week guarantee: if your team doesn’t have a working event taxonomy and a single agreed activation metric by end of Week 2, you get a full refund.

Clean instrumentation is the prerequisite. What you build on top of it — the metric stack, the experiment cadence, the weekly decision review — is the system. But you can’t build the system on broken data.

Questions.

Or apply directly →
Is this worth $1,297 vs. hiring a contractor? +
A contractor running a standalone instrumentation project costs $8,000–$20,000 for 4 weeks of work — and when they leave, the knowledge leaves with them. This cohort is $1,297, delivers the same structured output, and the person who builds it is your team member who now owns it. The other thing a contractor can’t offer: peer review across 6–12 companies working through exactly the same problems at the same time. The cohort model means you hear how a company two stages ahead of you approached their taxonomy decision, and you apply that directly to your own product.
Which analytics tools does this work with? +
PostHog is the primary tool. The audit framework and dashboard process apply to any event-based analytics tool — Mixpanel, Amplitude, or others. If you’re on a different platform, mention it when you apply and Jake will confirm fit before the cohort starts.
Do I need to share my analytics data with other cohort members? +
You share what you’re comfortable sharing. Most participants share their screen during live sessions showing dashboards and event lists. No raw user data or PII is shared with the group. The cohort format exists for peer learning, not data exposure.
What if I can’t make a live session? +
All sessions are recorded. You can submit async work and questions even if you miss a session. That said, the live review format is where most of the value is — plan to attend at least 10 of the 12 sessions.
When is the next cohort? +
Dates are confirmed when enough applications are in. Apply via the booking link and Jake will reach out with cohort dates, with no obligation to enroll.
Can multiple team members join from the same company? +
Yes, and it often works well — one person from product and one from engineering means the instrumentation work can be split sensibly. Each seat is priced separately at $1,297. Get in touch if you want to bring a team of 3 or more and we can discuss.

Your analytics problem won’t fix itself.

Every sprint you run without a clean taxonomy is another sprint of data you can’t trust. 6 weeks from now, you can have the infrastructure — or you can still be in the same meeting asking why the numbers don’t match.

6 weeks · $1,297/seat · applied to your real product