How to calculate churn in SQL

Sharpen SQL for your next interview
500+ SQL problems with worked solutions — joins, window functions, CTEs.
Join the waitlist

What churn actually measures

Imagine the scenario: it's Sunday night, your PM pings you on Slack asking for "the churn number" for the Monday business review, and the only thing they specify is "we lost some Pro customers last month, right?". That single ambiguous request hides at least four different churn calculations, and the one you pick determines whether the meeting ends with a celebration or a layoff plan. Customer churn, revenue churn, voluntary cancellations, and behavioral churn for the freemium tier are all different numbers, sometimes by a factor of two, and they all answer different business questions.

Churn rate, at its most basic, is the share of users who stopped being active in a given period — for subscription products, the ones who cancelled. The formula every analyst memorizes the first week of the job is churned during period / active at start of period × 100%. It pairs directly with retention through Churn + Retention = 100%, which means the two metrics are literally the same shape of data viewed from opposite ends. Pick the one your team finds more motivating: leadership at Stripe and Snowflake usually talks retention because it sounds like growth, while finance teams at Netflix or DoorDash talk churn because it ties cleanly to lost MRR.
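As a quick sanity check, the formula and its retention complement can be sketched in a few lines of Python (illustrative numbers, not pulled from any real product):

```python
def churn_rate(churned: int, active_at_start: int) -> float:
    """Share of the at-risk population that cancelled during the period."""
    if active_at_start == 0:
        return 0.0
    return churned / active_at_start

# 40 cancellations out of 1,000 accounts active on the 1st of the month
churn = churn_rate(40, 1_000)      # 0.04 -> 4% churn
retention = 1 - churn              # 0.96 -> 96% retention; the two always sum to 100%
print(f"churn {churn:.1%}, retention {retention:.1%}")
```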

The hard part is not the formula. The hard part is the denominator, the window, the cancellation taxonomy, and what you do with trials and downgrades. This post walks through seven ready-to-paste SQL queries you can adapt for any subscription or freemium product, plus the pitfalls that come up in real production dashboards and analytics interviews.

1. Monthly customer churn for a subscription

The canonical version. Pick a period, count who was active at the start of it, count who cancelled inside it, divide.

WITH active_start AS (
    SELECT COUNT(*) AS active
    FROM subscriptions
    WHERE started_at <= '2026-03-01'
      AND (ended_at IS NULL OR ended_at > '2026-03-01')
),
churned AS (
    SELECT COUNT(*) AS cnt
    FROM subscriptions
    WHERE ended_at >= '2026-03-01'
      AND ended_at <  '2026-04-01'
)
SELECT
    c.cnt::NUMERIC / NULLIF(a.active, 0) AS churn_rate
FROM active_start a, churned c;

The denominator is "active at start of period", not "everyone who ever subscribed". Mix those two and you will report a churn rate three to five times lower than reality, which is a great way to look smart for a quarter and lose all credibility in the next QBR.
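A toy example makes the size of the error concrete (numbers are made up for illustration):

```python
# Why the denominator matters: same numerator, very different rates.
churned_this_month = 50
active_at_start = 1_000     # the population actually at risk this period
ever_subscribed = 4_000     # includes accounts that left years ago

correct = churned_this_month / active_at_start   # 0.05  -> 5% churn
wrong = churned_this_month / ever_subscribed     # 0.0125 -> looks 4x healthier
print(f"correct {correct:.2%} vs wrong {wrong:.2%}")
```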

2. Behavioral churn for products without subscriptions

For freemium consumer apps, B2B usage-based products, and anything without an explicit cancel button, churn is defined by inactivity. A user is "churned" if they haven't performed a meaningful action within a chosen window.

SELECT COUNT(DISTINCT u.user_id) AS churned
FROM users u
WHERE EXISTS (
    SELECT 1 FROM events e WHERE e.user_id = u.user_id
)
AND NOT EXISTS (
    SELECT 1
    FROM events e
    WHERE e.user_id  = u.user_id
      AND e.event_at >= CURRENT_DATE - INTERVAL '30 day'
);

The 30-day window is illustrative. For a daily-use product like a messaging app or a productivity tool, 30 days is already a long absence and you might define churn at 14 days. For a tax-filing app used twice a year, 30 days means nothing and 9 months is more honest. Pick a window with the product team, write it down in a one-pager, and never silently change it — analytics teams at Airbnb and Notion have re-litigated this definition more than once after a leadership change.

3. Voluntary vs involuntary churn

A cancellation flagged as user_cancel is a product or pricing problem. A cancellation flagged as payment_failed or card_expired is a billing infrastructure problem. Reporting them as one number hides the lever you should actually pull.

-- voluntary: the user pressed the cancel button
SELECT COUNT(*) AS voluntary
FROM subscriptions
WHERE ended_at >= '2026-04-01'
  AND ended_at <  '2026-05-01'
  AND cancel_reason = 'user_cancel';

-- involuntary: payment failed, card expired, etc.
SELECT COUNT(*) AS involuntary
FROM subscriptions
WHERE ended_at >= '2026-04-01'
  AND ended_at <  '2026-05-01'
  AND cancel_reason IN ('payment_failed', 'card_expired');

Involuntary churn is often a hidden 20-40% of total churn at consumer SaaS companies. It is also the easiest to recover: smart retry logic, account updater services with the card networks, and a pre-expiration email flow can claw back a meaningful chunk of MRR in a single quarter. Voluntary churn, on the other hand, takes onboarding rework, value-prop iterations, and longer feedback loops with product.

4. Cohort churn

Aggregate churn numbers lie if your acquisition mix is shifting. A surge of low-intent traffic from a paid campaign will inflate this month's "active base" and depress next month's churn rate without anything actually improving. Cohort churn — grouped by acquisition month — fixes this.

WITH cohort AS (
    SELECT
        user_id,
        DATE_TRUNC('month', started_at)::DATE AS cohort_month
    FROM subscriptions
),
ended AS (
    SELECT
        user_id,
        DATE_TRUNC('month', ended_at)::DATE AS end_month
    FROM subscriptions
    WHERE ended_at IS NOT NULL
)
SELECT
    c.cohort_month,
    COUNT(*) AS cohort_size,
    COUNT(e.user_id) FILTER (
        WHERE e.end_month <= c.cohort_month + INTERVAL '3 month'
    ) AS churned_in_3m,
    COUNT(e.user_id) FILTER (
        WHERE e.end_month <= c.cohort_month + INTERVAL '3 month'
    )::NUMERIC
    / NULLIF(COUNT(*), 0) AS churn_3m
FROM cohort c
LEFT JOIN ended e USING (user_id)
GROUP BY c.cohort_month
ORDER BY c.cohort_month;

The cohort cut is what tells you whether the product is actually getting better. If recent cohorts have lower 3-month churn than older ones, your onboarding, pricing, or feature work is paying off. If churn is flat across cohorts, the aggregate number is moving because of acquisition mix, not product improvement.

5. Churn by plan

Pricing tiers churn at very different rates. Almost universally, the cheapest paid tier has the highest churn — the customers who pick it are the most price-sensitive, the most marginal, and the first to leave when budgets tighten.

SELECT
    plan,
    COUNT(*)                                          AS cohort_size,
    COUNT(*) FILTER (WHERE ended_at IS NOT NULL)      AS churned,
    COUNT(*) FILTER (WHERE ended_at IS NOT NULL)::NUMERIC
        / NULLIF(COUNT(*), 0)                         AS overall_churn
FROM subscriptions
WHERE started_at >= CURRENT_DATE - INTERVAL '6 month'
GROUP BY plan
ORDER BY overall_churn DESC;

Run this before any discounting decision. If your $9/month tier already churns at 12% per month and your $49/month tier churns at 3%, a "summer promo" that pushes more people into the cheap tier is going to look great for two weeks and ugly for two quarters.
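The mix effect is easy to demonstrate with toy numbers (hypothetical tiers and rates, not outputs of the query above):

```python
# Hypothetical two-tier base with per-tier monthly churn rates
def blended_churn(tiers):
    """Weighted-average monthly churn across pricing tiers."""
    total = sum(t["customers"] for t in tiers)
    churned = sum(t["customers"] * t["monthly_churn"] for t in tiers)
    return churned / total

before = blended_churn([
    {"customers": 1_000, "monthly_churn": 0.12},  # $9 tier
    {"customers": 1_000, "monthly_churn": 0.03},  # $49 tier
])                                                # 7.5% blended

# A promo shifts 500 extra signups into the cheap tier; per-tier rates unchanged
after = blended_churn([
    {"customers": 1_500, "monthly_churn": 0.12},
    {"customers": 1_000, "monthly_churn": 0.03},
])                                                # 8.4% blended
print(f"{before:.1%} -> {after:.1%}")
```

Blended churn rises even though neither tier got worse, which is exactly the mix-shift trap.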


6. Dollar churn (revenue churn)

Lost revenue, not lost users. A $500/month enterprise customer leaving has the same customer-churn weight as a $9/month consumer cancelling, but the financial damage is 55x worse. Dollar churn — sometimes called gross revenue churn — is the version your CFO actually cares about.

WITH mrr_start AS (
    SELECT SUM(monthly_price) AS mrr_start
    FROM subscriptions
    WHERE started_at <= '2026-04-01'
      AND (ended_at IS NULL OR ended_at > '2026-04-01')
),
mrr_churned AS (
    SELECT SUM(monthly_price) AS mrr_lost
    FROM subscriptions
    WHERE ended_at >= '2026-04-01'
      AND ended_at <  '2026-05-01'
)
SELECT
    mrr_churned.mrr_lost,
    mrr_start.mrr_start,
    mrr_churned.mrr_lost::NUMERIC / NULLIF(mrr_start.mrr_start, 0) AS dollar_churn_rate
FROM mrr_start, mrr_churned;

When higher-tier customers leave at any noticeable rate, dollar churn runs visibly above customer churn — that gap is exactly the financial impact the executive team should be tracking. For B2B SaaS, the gold-standard variant is net revenue retention (NRR), which also adds expansion MRR from upsells; we'll keep this post focused on the gross version.
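For reference, NRR is the same gross calculation with contraction and expansion added back. A minimal sketch with made-up MRR figures:

```python
def net_revenue_retention(start_mrr, churned_mrr, contraction_mrr, expansion_mrr):
    """NRR over a period: what $1 of starting MRR became, upsells included."""
    return (start_mrr - churned_mrr - contraction_mrr + expansion_mrr) / start_mrr

# Made-up figures: $100k starting MRR, $5k lost to cancellations,
# $2k lost to downgrades, $10k gained from upsells
nrr = net_revenue_retention(100_000, 5_000, 2_000, 10_000)
print(f"NRR {nrr:.0%}")   # above 100% means the base grew despite churn
```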

7. Cancellation reasons

If your cancel flow includes a short exit survey, the reasons table tells you where to spend engineering and product cycles next.

SELECT
    cancel_reason,
    COUNT(*)                                               AS cnt,
    ROUND(100.0 * COUNT(*) / SUM(COUNT(*)) OVER (), 2)     AS pct
FROM subscriptions
WHERE ended_at >= CURRENT_DATE - INTERVAL '30 day'
GROUP BY cancel_reason
ORDER BY cnt DESC;

Cancellation reason buckets that show up most often in real exit surveys: "too expensive", "not using it enough", "missing feature X", "switching to competitor", "company shutting down / left the role". The first two point at value perception and onboarding; the third points at product roadmap; the last two are mostly outside your control and worth flagging separately so they don't inflate the "addressable" churn number.

Common pitfalls

The single most expensive mistake is the wrong denominator. Dividing churned users by every account that has ever existed in the database systematically understates churn, sometimes by an order of magnitude. The correct denominator is users who were active at the start of the period — the population at risk of churning during the window. Analysts at companies as varied as Uber and Figma have shipped dashboards with the wrong denominator and only caught it after a finance review.

Lumping voluntary and involuntary cancellations into one bucket is the second trap. They are caused by different problems, owned by different teams, and fixed by different work. Voluntary churn is a product and pricing problem owned by PM and design; involuntary churn is a payments and billing problem owned by engineering or a third-party tool like Stripe Smart Retries. If your dashboard shows one combined number, leadership will spend time on the wrong intervention.

Mixing trial cancellations with paid churn corrupts the metric in the other direction. Users who never converted from a free trial are not really paid churners — they are conversion-funnel dropouts. They belong in a trial-to-paid conversion report, not in the paid-churn number, and rolling them in will make a healthy product look like it's bleeding customers. Filter trial-only subscriptions out of the paid churn query, or report them as a separate "trial churn" column.
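In a notebook, the filter is one extra mask. This sketch assumes the subscriptions table carries a boolean flag such as `is_trial` (a hypothetical column name; yours may differ):

```python
import pandas as pd

# Hypothetical schema: is_trial marks subscriptions that never converted to paid
subs = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "is_trial": [False, False, True, False],
    "ended_at": pd.to_datetime(["2026-04-10", None, "2026-04-12", "2026-04-20"]),
})

paid = subs[~subs["is_trial"]]                  # trial dropouts excluded from paid churn
paid_churned = paid["ended_at"].notna().sum()
trial_churned = subs.loc[subs["is_trial"], "ended_at"].notna().sum()  # report separately
print(paid_churned, trial_churned)
```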

The arithmetic of "annual churn equals twelve times monthly churn" is wrong, and it shows up in interviews more often than candidates expect. If monthly churn is constant, the correct compounding is annual_churn = 1 − (1 − monthly_churn)^12. A 5% monthly churn compounds to about 46% annual, not 60%. Get this wrong on a whiteboard in front of a head of analytics and you will not get the offer.
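The compounding is one line of Python:

```python
def annualize(monthly_churn: float) -> float:
    """Annual churn under a constant monthly rate: 1 - (1 - m)**12."""
    return 1 - (1 - monthly_churn) ** 12

print(f"{annualize(0.05):.1%}")   # ~46%, not the naive 5% x 12 = 60%
print(f"{annualize(0.02):.1%}")   # ~21.5%
```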

Finally, silently changing the churn definition mid-quarter is a credibility-killer. If you switched from a 30-day to a 60-day inactivity window because the 30-day number looked bad, document it, version it, and label the chart. Otherwise the trend line moves for reasons nobody can explain and the metric becomes useless.

Optimization tips

The queries above are intentionally readable, not fast. On a subscriptions table with tens of millions of rows the most useful indexes are on (started_at) and (ended_at) — both are filtered against constants in nearly every churn query, and both are great candidates for B-tree indexes. For the behavioral churn query, the events table should have at least (user_id, event_at) as a composite index; without it the NOT EXISTS subquery degenerates into a full scan per user.

If your churn dashboard is recomputed on every page load, it's wasting compute. Pre-aggregate the monthly churn and dollar churn into a daily_churn_metrics rollup table, refreshed once a day with a small DAG step. Cohort churn benefits from a materialized view keyed on cohort_month plus the months_since_signup offset; the same view also powers the retention curve chart that PMs will inevitably ask for next. On Snowflake or BigQuery, partition the underlying subscriptions table on started_at month — pruning at the partition level usually beats indexing for these date-range filters.

Churn in Python

For ad-hoc analysis in a notebook, the same monthly churn calc looks like this:

import pandas as pd

# Parse the date columns explicitly so the comparisons below behave as datetimes
subs = pd.read_sql('SELECT * FROM subscriptions', conn,
                   parse_dates=['started_at', 'ended_at'])

active_start = (
    (subs['started_at'] <= '2026-04-01') &
    ((subs['ended_at'].isna()) | (subs['ended_at'] > '2026-04-01'))
).sum()

churned = (
    (subs['ended_at'] >= '2026-04-01') &
    (subs['ended_at'] <  '2026-05-01')
).sum()

churn_rate = churned / max(active_start, 1)

Pandas is fine for a one-off, but resist the urge to migrate your production churn dashboard from SQL to a Python script — it will be slower, harder to govern, and impossible to share with the BI tool the rest of the company uses.

If you want to drill SQL questions like this every day, NAILDD is launching with 500+ SQL problems across exactly this pattern.

FAQ

Customer churn or revenue churn — which one should I report?

Both, in the same dashboard, side by side. Customer churn tells you about people leaving and is the right metric for product and onboarding teams. Revenue churn tells you about money leaving and is the right metric for finance, fundraising, and any board-deck slide. Reporting only one of them invites the other team to build a shadow dashboard with the version they prefer, and now you have two sources of truth.

When is a user "churned" in a product without a subscription?

It depends on the product's expected usage cadence. For a product people open daily — messaging, music, productivity — 14 to 30 days of inactivity is a reasonable cutoff. For weekly products, 60 days. For seasonal or low-frequency products like tax software or travel booking, 90 days or more is honest. The single most important rule is to pick a definition, write it down with the product owner, and resist the temptation to retroactively change it when the number stops looking good.

How do I lower churn?

There is no single lever. Involuntary churn responds best to payment infrastructure work — smart retries, account updater integrations, pre-expiration emails. Voluntary churn responds to onboarding improvements that get users to first value faster, pricing tier rationalization, and proactive outreach when a churn-risk model flags an account. High-value accounts deserve a human touch via a CS team. The order of operations is almost always the same: fix involuntary churn first because it is the highest-ROI work, then attack voluntary churn with product improvements.

Why is monthly churn × 12 not equal to annual churn?

Because churn compounds. If 5% of your customers leave every month, the survivors next month are 95% of the original, and 5% of those 95% leave the following month — not 5% of the original. The correct annual conversion under constant monthly churn is 1 − (1 − m)^12. A 5% monthly rate compounds to roughly 46% annual, a 2% monthly rate compounds to roughly 22% annual, and getting this distinction right in an interview is one of the cheapest ways to look senior.

What's a "good" churn rate?

Heavily context-dependent. For consumer subscription apps, 5-7% monthly customer churn is common and considered acceptable. For SMB SaaS, 3-5% monthly is the typical range. For enterprise B2B SaaS with annual contracts, anything above 1% monthly is alarming and most healthy companies sit closer to 0.5%. There is no single cross-industry benchmark to compare against; the most useful references are public S-1 filings and SaaS index reports from firms like OpenView or Bessemer.

Should I include downgrades in churn?

Customer churn, no — the user is still a customer. Dollar churn, partially — the difference in MRR between the old plan and the new plan is lost revenue and belongs in gross revenue churn. The cleanest version is to report three numbers: customer churn (fully cancelled accounts), gross dollar churn (cancellations plus downgrade losses), and net revenue retention (gross churn minus upgrades and expansion). That triple is what an experienced finance partner will ask for in the first 10 minutes of any QBR.