How to calculate Add to Cart Rate in SQL

Sharpen SQL for your next interview
500+ SQL problems with worked solutions — joins, window functions, CTEs.
Join the waitlist

Why Add to Cart Rate matters

E-commerce funnels collapse into four steps: view product, add to cart, start checkout, purchase. Add to Cart Rate (ACR) is the conversion between the first two steps, and it isolates a single page. If checkout breaks, ACR is unaffected. If pricing is wrong on the listing, ACR moves first. That makes it a useful canary for the product detail page (PDP) team at Amazon, DoorDash, and most consumer marketplaces.

Picture this: your PM pings you on Slack at 5pm — "checkout revenue is down 8% this week, but we shipped nothing." First check whether ACR dropped on the categories that drive checkout volume. If ACR is flat, the leak is downstream. If ACR cratered on a specific SKU, you have a PDP regression. The same SQL gets asked in interviews at Stripe, Etsy, and Airbnb.

ACR sounds trivial — adds divided by views — but the real query has three judgment calls: what counts as a view, how you deduplicate within a session, and how you join the add back to the originating view.

The SQL formula

The definition every product analyst should know by heart:

Add to Cart Rate = adds_to_cart / product_views × 100%

There are three flavors. Event-based ACR counts every view and every add — easy, but inflates ACR for users who view the same SKU repeatedly. User-based ACR uses distinct users on both sides — better for shopper behavior, worse for PDP diagnostics. Session-based ACR scopes adds to views inside the same session — the closest thing to a "true" funnel rate and what most companies report externally. Name all three in an interview, pick session-based as the default. Examples below use Postgres-flavored SQL; Snowflake, BigQuery, and Databricks support the same patterns with minor syntax differences.

Base query: site-wide ACR

Start with the simplest version — total adds divided by total views over the last 30 days, no segmentation. This is the headline number you would publish on a dashboard.

SELECT
    COUNT(*) FILTER (WHERE event_type = 'product_view') AS views,
    COUNT(*) FILTER (WHERE event_type = 'add_to_cart')  AS adds,
    COUNT(*) FILTER (WHERE event_type = 'add_to_cart')::NUMERIC * 100
        / NULLIF(COUNT(*) FILTER (WHERE event_type = 'product_view'), 0) AS acr_pct
FROM events
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days';

NULLIF on the denominator returns NULL instead of raising a division-by-zero error when there are no views. The ::NUMERIC cast prevents integer division in Postgres; without it the fractional part is truncated and a 7.4% rate reports as 7. This is event-based ACR; switch to user-based by wrapping each count in COUNT(DISTINCT user_id) FILTER (...).
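As a sketch, the user-based variant of the same 30-day query, assuming the same events table:

```sql
-- User-based ACR: distinct users on both sides of the ratio.
SELECT
    COUNT(DISTINCT user_id) FILTER (WHERE event_type = 'product_view') AS viewers,
    COUNT(DISTINCT user_id) FILTER (WHERE event_type = 'add_to_cart')  AS adders,
    COUNT(DISTINCT user_id) FILTER (WHERE event_type = 'add_to_cart')::NUMERIC * 100
        / NULLIF(COUNT(DISTINCT user_id) FILTER (WHERE event_type = 'product_view'), 0)
        AS user_acr_pct
FROM events
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days';
```

Because repeat viewers collapse to one row on each side, user-based ACR usually reads higher than event-based ACR on the same data.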

Breakdown by product

The next question an interviewer asks: "Now show me the worst-performing PDPs." This is where grouping by product_id comes in, plus a sanity floor on view volume so you do not surface a SKU with three views and one add as a "20% ACR top performer."

SELECT
    product_id,
    COUNT(*) FILTER (WHERE event_type = 'product_view') AS views,
    COUNT(*) FILTER (WHERE event_type = 'add_to_cart')  AS adds,
    COUNT(*) FILTER (WHERE event_type = 'add_to_cart')::NUMERIC * 100
        / NULLIF(COUNT(*) FILTER (WHERE event_type = 'product_view'), 0) AS acr_pct
FROM events
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days'
  AND product_id IS NOT NULL
GROUP BY product_id
HAVING COUNT(*) FILTER (WHERE event_type = 'product_view') >= 100
ORDER BY acr_pct DESC
LIMIT 50;

The HAVING threshold (views >= 100, written out with the full FILTER expression because Postgres does not let you reference the views alias in HAVING) is the most important line in the query. Without it, your top-ACR list is dominated by long-tail SKUs with meaningless samples. Explain the threshold out loud: 100 is reasonable for a mid-size catalog, but Etsy or Amazon would use 1,000-10,000. The number is a knob, not a constant. For a PDP audit, flip ORDER BY acr_pct DESC to ASC. The lowest-ACR products at sufficient volume are usually out of stock without being gated, mispriced, or missing photos on a new variant.

Breakdown by acquisition channel

The traffic-mix question — "is ACR down because we changed channel mix or because the page is worse?" — needs the channel join.

SELECT
    u.acquisition_channel,
    COUNT(*) FILTER (WHERE e.event_type = 'product_view') AS views,
    COUNT(*) FILTER (WHERE e.event_type = 'add_to_cart')  AS adds,
    COUNT(*) FILTER (WHERE e.event_type = 'add_to_cart')::NUMERIC * 100
        / NULLIF(COUNT(*) FILTER (WHERE e.event_type = 'product_view'), 0) AS acr_pct
FROM events e
JOIN users u ON u.user_id = e.user_id
WHERE e.event_date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY u.acquisition_channel
ORDER BY acr_pct DESC;

Organic search users almost always have the highest ACR — they searched for the product, landed with intent, and added. Paid social sits in the middle; display retargeting at the bottom. If channel-level ACR is flat but blended ACR fell, you have a mix shift, not a page regression. Different problem, different owner.


Session-scoped ACR

The most defensible version of ACR scopes adds to views inside the same session. This requires relating view and add events through session_id (the query below uses per-session flags rather than an explicit self-join on user_id and session_id), and it answers the question "of the sessions where a user viewed a product, how many ended with at least one add?" Most dashboards skip this version because it is heavier to compute, but interviewers love it.

WITH session_flags AS (
  SELECT
    session_id,
    MAX(CASE WHEN event_type = 'product_view' THEN 1 ELSE 0 END) AS viewed,
    MAX(CASE WHEN event_type = 'add_to_cart'  THEN 1 ELSE 0 END) AS added
  FROM events
  WHERE event_date >= CURRENT_DATE - INTERVAL '30 days'
  GROUP BY session_id
)
SELECT
  SUM(viewed)                                       AS sessions_with_view,
  SUM(CASE WHEN viewed = 1 AND added = 1 THEN 1 ELSE 0 END) AS sessions_with_add,
  SUM(CASE WHEN viewed = 1 AND added = 1 THEN 1 ELSE 0 END)::NUMERIC * 100
    / NULLIF(SUM(viewed), 0)                                AS session_acr_pct
FROM session_flags;

This treats ACR as a session-level conversion: among sessions that viewed at least one product, what fraction added at least one item? The number is usually 2-4x higher than event-based ACR because users browse multiple products per session and only need to add once to count. Pick the definition that matches the question — per-product event-based for PDP regressions, session-based for executive dashboards.

Common pitfalls

The first pitfall is mixing distinct-user and event counts in the same ratio. If the numerator is COUNT(DISTINCT user_id) FILTER (...) and the denominator is raw COUNT(*) FILTER (...), the result is users-per-view, not a rate. Pick one granularity for both halves and stick with it. Interviewers catch this faster than almost any other bug.

The second pitfall is forgetting to deduplicate adds. Users add, remove, then add again — that is two add_to_cart events but one shopper signal. Event-based ACR overcounts intent by 15-30% on mature carts. Fix it with a DISTINCT (user_id, session_id, product_id) wrapper on the add count, or with an upstream cart_state_change event that flags only first-time adds.
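One way to apply that fix in Postgres, assuming events carries user_id, session_id, and product_id, is to count distinct (user, session, product) tuples on the add side only:

```sql
-- Deduplicated ACR: add/remove/re-add within a session counts once.
SELECT
    COUNT(*) FILTER (WHERE event_type = 'product_view')    AS views,
    COUNT(DISTINCT (user_id, session_id, product_id))
        FILTER (WHERE event_type = 'add_to_cart')          AS deduped_adds,
    COUNT(DISTINCT (user_id, session_id, product_id))
        FILTER (WHERE event_type = 'add_to_cart')::NUMERIC * 100
        / NULLIF(COUNT(*) FILTER (WHERE event_type = 'product_view'), 0) AS acr_pct
FROM events
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days';
```

The row-constructor form COUNT(DISTINCT (a, b, c)) is Postgres syntax; on other engines, concatenating the keys or pre-deduplicating in a CTE achieves the same thing.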

The third pitfall is "quick adds" with no preceding view. Recommendation widgets, "buy again" carousels, and one-tap reorder flows fire add_to_cart without product_view first. If you naively join adds to views, those adds drop out of the numerator and ACR looks lower than reality. Count all adds in the site-wide numerator, but only adds-with-views when computing PDP-specific ACR.
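A sketch of the PDP-specific version, using an EXISTS check to keep only adds whose session also viewed the same product (the quick adds excluded here still belong in the site-wide numerator):

```sql
-- Flag each add as "preceded by a view of the same product in the same session",
-- then compute per-product ACR from view events and view-scoped adds only.
WITH scoped AS (
  SELECT e.*,
         (e.event_type = 'add_to_cart'
          AND EXISTS (
              SELECT 1 FROM events v
              WHERE v.session_id = e.session_id
                AND v.product_id = e.product_id
                AND v.event_type = 'product_view'
          )) AS add_with_view
  FROM events e
  WHERE e.event_date >= CURRENT_DATE - INTERVAL '30 days'
)
SELECT
  product_id,
  COUNT(*) FILTER (WHERE event_type = 'product_view') AS views,
  COUNT(*) FILTER (WHERE add_with_view)               AS adds_with_view,
  COUNT(*) FILTER (WHERE add_with_view)::NUMERIC * 100
    / NULLIF(COUNT(*) FILTER (WHERE event_type = 'product_view'), 0) AS pdp_acr_pct
FROM scoped
GROUP BY product_id;
```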

The fourth pitfall is treating ACR as a goal metric instead of a diagnostic. Optimizing for ACR alone is how you ship a "buy now" button that quadruples cart additions and tanks checkout completion. Always pair ACR with checkout start rate and purchase rate before declaring a PDP change a win.

Optimization tips

For event tables in the hundreds of millions of rows, partition by event_date. On Snowflake that means a clustering key; on BigQuery, native date partitioning; on Postgres, declarative partitioning by month with a B-tree index on event_type. Without partitioning, you scan the whole table on every dashboard load.
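A minimal Postgres sketch of that setup, assuming the events schema used throughout (column types and partition names are illustrative):

```sql
-- Declarative monthly range partitioning on event_date.
CREATE TABLE events (
    user_id    BIGINT,
    session_id TEXT,
    product_id BIGINT,
    event_type TEXT,
    event_date DATE NOT NULL
) PARTITION BY RANGE (event_date);

CREATE TABLE events_2025_01 PARTITION OF events
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');

-- B-tree index per partition so event_type filters stay cheap.
CREATE INDEX ON events_2025_01 (event_type, event_date);
```

With this in place, the 30-day WHERE clause prunes to at most two partitions instead of scanning the full table.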

Materialize the daily aggregate. A nightly job that computes (event_date, product_id, channel, views, adds) reduces interactive query cost by 50-100x and lets the dashboard query millions of rows instead of billions — a one-line dbt incremental model on a modern lakehouse. For session-scoped ACR, compute session flags once at session-close time and store them in a session_summary table. Recomputing MAX(CASE WHEN ...) over 30 days of raw events on every dashboard load is how you end up on your team's top-10 expensive queries leaderboard.
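A sketch of that nightly rollup, assuming the same events and users tables (daily_acr is a made-up table name):

```sql
-- Nightly job: one row per (day, product, channel) instead of raw events.
CREATE TABLE daily_acr AS
SELECT
    e.event_date,
    e.product_id,
    u.acquisition_channel,
    COUNT(*) FILTER (WHERE e.event_type = 'product_view') AS views,
    COUNT(*) FILTER (WHERE e.event_type = 'add_to_cart')  AS adds
FROM events e
JOIN users u ON u.user_id = e.user_id
GROUP BY e.event_date, e.product_id, u.acquisition_channel;

-- The dashboard query then sums pre-aggregated rows:
SELECT SUM(adds)::NUMERIC * 100 / NULLIF(SUM(views), 0) AS acr_pct
FROM daily_acr
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days';
```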

On BigQuery, rewrite COUNT(*) FILTER (WHERE condition) as COUNTIF(condition). Snowflake supports both. Databricks accepts FILTER in recent versions, but the older CASE WHEN form is universally portable and what you should write in interviews unless told otherwise.
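For reference, the base 30-day query rewritten in the portable CASE WHEN form runs unchanged on all four engines:

```sql
-- Portable event-based ACR: no FILTER clause, no engine-specific cast.
SELECT
    SUM(CASE WHEN event_type = 'product_view' THEN 1 ELSE 0 END) AS views,
    SUM(CASE WHEN event_type = 'add_to_cart'  THEN 1 ELSE 0 END) AS adds,
    SUM(CASE WHEN event_type = 'add_to_cart'  THEN 1.0 ELSE 0 END) * 100
        / NULLIF(SUM(CASE WHEN event_type = 'product_view' THEN 1 ELSE 0 END), 0)
        AS acr_pct
FROM events
WHERE event_date >= CURRENT_DATE - INTERVAL '30 days';
```

The 1.0 literal forces non-integer arithmetic without a dialect-specific cast.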

If you want to drill SQL questions like this every day, NAILDD is launching with 500+ SQL problems across exactly this pattern.

FAQ

What's a good Add to Cart Rate?

It depends on category. Mainstream e-commerce sits at 5-15% site-wide. Premium and luxury verticals run 2-5% because shoppers research longer. Fast fashion and impulse categories hit 15-25% because the median price is low and the psychological cost of adding is near zero. Baymard Institute and Contentsquare publish quarterly benchmarks — use them as sanity checks, but trust your own historical trend more than any industry median.

How is ACR different from Conversion Rate (CVR)?

ACR measures the view-to-add step. CVR usually means view-to-purchase end-to-end, or sometimes add-to-purchase depending on the company. The cleanest mental model is to track three rates: view-to-add (ACR), add-to-checkout, and checkout-to-purchase. Multiply them and you get end-to-end CVR. If end-to-end CVR drops, decomposing into the three sub-rates tells you which step regressed. A single CVR number tells you nothing about where to look.
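Under the same assumed events table, and assuming checkout_start and purchase event names exist in it, the three-rate decomposition can be sketched as:

```sql
-- Decompose end-to-end CVR into the three sub-rates named above.
WITH counts AS (
  SELECT
    COUNT(*) FILTER (WHERE event_type = 'product_view')   AS views,
    COUNT(*) FILTER (WHERE event_type = 'add_to_cart')    AS adds,
    COUNT(*) FILTER (WHERE event_type = 'checkout_start') AS checkouts,
    COUNT(*) FILTER (WHERE event_type = 'purchase')       AS purchases
  FROM events
  WHERE event_date >= CURRENT_DATE - INTERVAL '30 days'
)
SELECT
  adds::NUMERIC      * 100 / NULLIF(views, 0)     AS view_to_add_pct,      -- ACR
  checkouts::NUMERIC * 100 / NULLIF(adds, 0)      AS add_to_checkout_pct,
  purchases::NUMERIC * 100 / NULLIF(checkouts, 0) AS checkout_to_purchase_pct,
  purchases::NUMERIC * 100 / NULLIF(views, 0)     AS end_to_end_cvr_pct
FROM counts;
```

The last column equals the product of the first three, which is exactly the multiplication described above.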

Why is mobile ACR usually lower than desktop?

Mobile users browse more casually — more views per session, lower add-per-view conversion. And mobile UX makes adding harder: a small "add to cart" button on a thumb-driven screen has higher friction than a large desktop button. Apple Pay narrows the gap but never closes it. Always decompose ACR by platform before comparing to a competitor benchmark — comparing 60%-mobile traffic to 30%-mobile traffic without normalizing is a classic mistake.

How do I handle "buy now" buttons that skip the cart?

Treat them as add-to-cart events for ACR, then immediately as checkout-start events for the downstream rate. Most analytics teams fire both on a "buy now" tap so the funnel does not visibly skip a step. If your stack only fires the purchase event, you have a funnel gap that distorts every downstream rate, and the fix is instrumentation, not a SQL trick. Instrumentation gaps explain most "weird ACR numbers" you'll be asked about in your first month on a new team.

Does ACR move with promotions?

Yes, sharply. Site-wide promos pump up ACR by 20-50% during the campaign because users add aspirationally with a discount-anchored mental model. The followup question is whether add-to-purchase also moves. If it does, the promo is converting incremental demand. If only ACR moves and downstream rates flatten, you have lured users into the cart with no intent to complete — fine for retargeting list growth, bad for cart abandonment. Always report ACR alongside cart abandonment when evaluating promo lift.