User Engagement Metrics: 18 KPIs, Formulas & Best Practices

Allan de Wit · September 4, 2025

Your product may boast a flood of new sign-ups, yet revenue stagnates. The missing link often hides between acquisition and retention: engagement. When you know exactly how users move, pause, and return, you can stop guessing and shape experiences that keep them around.

That clarity comes from user engagement metrics—quantitative signals that reveal how valuable, habit-forming, and share-worthy your product feels to real people. But not every shiny number belongs on your dashboard. Pageview fireworks and other vanity stats distract; actionable KPIs tie behavior to outcomes like lower churn, higher expansion, and sharper roadmap choices.

In the guide below you’ll find 18 engagement KPIs, each paired with a plain-English formula, tracking advice, and practical levers for improvement. We’ll start with DAU, stickiness, and session duration, weave in sentiment signals such as NPS and CES, and close with a customizable composite score.

Use the list as a menu—begin with metrics that match today’s objectives, validate findings with user feedback, then expand. Ready to swap dashboard noise for insight? Let’s jump into the numbers that matter.

1. Daily, Weekly & Monthly Active Users (DAU, WAU, MAU)

Before you dig into sophisticated ratios, make sure you know how many people actually show up. DAU, WAU, and MAU count unique users who perform a pre-defined “active” event—opening the app, sending a message, completing a task—inside a given period. Together they sketch a top-level pulse of traction, momentum, and reach.

Definition & Strategic Value

  • DAU measures everyday stickiness and quickly spots dips after UX changes.
  • WAU smooths out weekday/weekend noise and highlights short-term momentum.
  • MAU shows overall market penetration and fuels long-range planning.

Comparing the three reveals usage cadence: steady DAU + growing MAU often means new sign-ups aren’t forming habits yet.

Formulas & Benchmarks

DAU  = count of unique active users today
WAU  = count of unique active users in last 7 days
MAU  = count of unique active users in last 30 days
Stickiness = DAU ÷ MAU × 100

Typical DAU/MAU stickiness benchmarks: roughly 20–30 % for B2C social apps, 10–15 % for B2B SaaS, and single digits for utilities or seasonal tools.
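
If you log raw activity events, all four numbers fall out of a few lines of code. Here is a minimal Python sketch; the event list, user IDs, and reference date are hypothetical, and it assumes one row per qualifying “active” event.

from datetime import date, timedelta

# Hypothetical event log: one row per qualifying "active" event.
events = [
    ("u1", date(2025, 9, 4)), ("u2", date(2025, 9, 4)),
    ("u1", date(2025, 9, 1)), ("u3", date(2025, 8, 20)),
]

def unique_actives(events, start, end):
    """Unique users with at least one active event between start and end (inclusive)."""
    return {user for user, day in events if start <= day <= end}

today = date(2025, 9, 4)
dau = len(unique_actives(events, today, today))
wau = len(unique_actives(events, today - timedelta(days=6), today))
mau = len(unique_actives(events, today - timedelta(days=29), today))

stickiness = dau / mau * 100 if mau else 0.0
print(dau, wau, mau, round(stickiness, 1))  # 2 2 3 66.7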

Tracking Guidance

GA4’s “Active Users” card, Mixpanel’s retention board, or Amplitude’s “Personas” all work—just lock the same qualifying event across tools. Tag guest and logged-in traffic differently to avoid double-counting.

Optimization Playbook

  • Shorten onboarding paths so new users hit the key event faster.
  • Send behavior-based nudges (email, push) when someone drifts past 24 hours of inactivity.
  • Surface personalized dashboards or saved filters that reward daily check-ins.
  • Celebrate streaks with badges to convert weekly visitors into daily regulars.

With a solid handle on DAU, WAU, and MAU you’re ready to explore how sticky those users really are.

2. Stickiness Ratio (DAU ÷ MAU)

If DAU tells you who showed up today, the stickiness ratio tells you how many of those visitors are coming back often enough to matter. High stickiness means users aren’t just testing the waters—they’ve woven your product into their routine. Low stickiness is an early-warning siren for churn and ballooning acquisition costs, because you’re paying for customers who never form a habit.

Why It Matters

A rising stickiness curve correlates strongly with retention, feature adoption, and word-of-mouth. It’s also a handy normalizer: whether your MAU is 1,000 or 1 million, the percentage highlights true engagement health.

Calculation & Targets

Stickiness (%) = (DAU ÷ MAU) × 100
  • Social/consumer apps: 20–30 % is healthy
  • B2B SaaS: 10–15 % good, 15 %+ excellent
  • Utilities or seasonal tools: single digits may be acceptable

How to Raise Stickiness

  • Trigger context-aware reminders (e.g., “You have 3 unresolved tickets”).
  • Introduce lightweight community spaces—comments, reactions, leaderboards.
  • Reward streaks with progressive badges or perks.
  • Ship small, frequent enhancements so there’s always something new to explore.
  • Capture qualitative feedback via in-app polls to uncover friction and close the loop fast.

Consistently iterate on these levers and watch one-time users convert into daily regulars.

3. Session Frequency

A user can be “active” without being truly engaged. Session frequency fills that gap by revealing how often the same person opens your product within a set window—daily, weekly, or monthly. More sessions per user generally translate to more opportunities for value realization, upsell, and advocacy.

KPI Overview

Session frequency is the average number of sessions generated by each active user during the analysis period. A spike after a feature launch signals excitement; a gradual slide flags fading interest.

Measurement Methods

Session Frequency = Total sessions in period ÷ Unique active users in period
  • GA4: Explore → “Sessions per user” (break down by cohort or device).
  • Mixpanel/Amplitude: Create a custom formula metric and trend it week over week.
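
Outside those tools, the arithmetic is the same division. A minimal Python sketch, assuming a hypothetical session log with one entry per session recording which user started it:

# Hypothetical session log for one analysis period (e.g., one week).
sessions = ["u1", "u1", "u2", "u1", "u3", "u2"]  # one entry per session, by user

unique_users = set(sessions)
session_frequency = len(sessions) / len(unique_users) if unique_users else 0.0
print(round(session_frequency, 2))  # 2.0 (6 sessions across 3 active users)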

Ways to Increase

  • Release bite-sized content drops on a predictable cadence.
  • Schedule behavior-based push or email nudges when a user’s session gap exceeds their norm.
  • Add streak rewards or “since your last visit” highlights to make return trips feel worthwhile.

4. Average Session Duration

More minutes spent inside the product usually reflect deeper curiosity, but raw “time on site” can deceive. Someone who leaves a tab open while grabbing coffee shouldn’t inflate your user engagement metrics, and a rapid yet purposeful visit can still deliver value.

Nuances of “Time Spent”

GA4 tackles the idle-tab problem by counting a session as “engaged” only when it lasts at least 10 seconds, fires a conversion (key) event, or includes two or more page or screen views. Track both the classic duration and the stricter “engaged time” to understand whether attention is real or passive.

Formula & Industry Benchmarks

Average Session Duration = Total session time (seconds) ÷ Total sessions

Typical ranges:

  • Content or media sites: 180–300 s
  • B2B SaaS dashboards: 120–240 s
  • E-commerce catalogs: 90–150 s
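
To separate real attention from idle tabs, compute the classic average and an engaged-only average from the same records. A rough Python sketch, assuming hypothetical session rows that carry total duration in seconds and whether a conversion fired (simplified: GA4 also counts sessions with two or more views as engaged):

# Hypothetical session records: (duration_seconds, fired_conversion)
sessions = [(12, False), (300, True), (4, False), (95, False)]

avg_duration = sum(d for d, _ in sessions) / len(sessions)

# Engaged-only view: keep sessions lasting 10+ seconds or firing a conversion.
engaged = [d for d, conv in sessions if d >= 10 or conv]
avg_engaged_time = sum(engaged) / len(engaged) if engaged else 0.0

print(round(avg_duration, 1), round(avg_engaged_time, 1))  # 102.8 135.7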

Improvement Tactics

  • Optimize first meaningful paint to shave loading dead time.
  • Embed interactive elements—tables that filter, live previews, calculators—to keep the cursor moving.
  • Use contextual tooltips and “next best action” prompts so users naturally chain tasks instead of bouncing after one.

5. Pages (or Screens) per Session

Depth, not just presence, signals real engagement. Pages per Session (or Screens per Session on mobile) tracks how many distinct views the average visitor chalks up before ending a session. A climb shows users are exploring features and content; a dip can reveal friction or thin navigation, or it may simply mean users found what they needed right away (whether that matters depends on your goals).

Purpose & Context

  • Diagnose discovery quality—are users finding adjacent value?
  • Pair with exit or conversion data to catch “wandering” versus purposeful browsing.
  • Helpful early-warning metric when launching new sections or app flows.

Calculation

Pages per Session = Total pageviews (or screen views) ÷ Total sessions

Best Practices to Lift the Metric

  • Strengthen internal linking and breadcrumb trails.
  • Add “related items” or “next step” carousels after primary content.
  • Keep navigation labels crystal-clear and task-oriented.
  • Implement predictive search with auto-suggest to surface hidden gems.
  • Use subtle progress indicators so users know there’s more worth checking out.

6. Scroll Depth

Scroll depth shows exactly how much of a page—or in-app article—a visitor truly consumes. Unlike session duration, it ignores idle tabs and focuses on purposeful reading or scrolling, making it an essential complement to time-based metrics.

Why It’s Critical for Long-Form Content

For blogs, knowledge-base entries, and changelog posts, completion rate beats clicks. A 75 % scroll paired with few CTA clicks flags copy fatigue; readers stalling at 25 % point to headlines that over-promise or pages that load too slowly. Pairing depth with conversions exposes where attention leaks.

Tracking Setup

Use GA4 or Tag Manager to fire events at 25 %, 50 %, 75 %, and 100 % thresholds. Heat-mapping tools like Hotjar or Microsoft Clarity layer on visual context, pinpointing rage-scrolls and sudden abandon points.
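
Once those threshold events are flowing, a completion curve takes only a few lines. A minimal Python sketch, assuming a hypothetical export that records the deepest threshold each pageview reached:

# Hypothetical export: deepest scroll threshold reached per pageview.
max_depth = {"pv1": 100, "pv2": 50, "pv3": 25, "pv4": 75, "pv5": 25}

thresholds = (25, 50, 75, 100)
total = len(max_depth)
for t in thresholds:
    reached = sum(1 for depth in max_depth.values() if depth >= t)
    print(f"{t}%: {reached / total:.0%} of pageviews reach this depth")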

Optimization Ideas

  • Hook readers above the fold with a punchy payoff.
  • Break walls of text with images, quotes, or GIFs.
  • Insert TL;DR boxes so skimmers still grasp value.
  • Lazy-load assets to stop jarring layout shifts.

7. Click-Through Rate (CTR) on Key UI Elements

Every engagement journey hinges on micro-decisions: a user spots a button, hesitates, and either clicks or drifts away. Tracking CTR on your core interface elements quantifies those split-second choices and shows whether copy, color, or placement persuades users to advance.

Scope

Measure CTR for:

  • Primary and secondary CTAs
  • Navigation links and feature tabs
  • In-app banners or upsell ribbons
  • Onboarding tooltips and modal prompts

Formula

CTR (%) = (Clicks ÷ Impressions) × 100

Segment by device, page type, and user cohort to spot hidden friction.
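
That segmentation is easy to script once clicks and impressions are counted per element. A minimal Python sketch with hypothetical counters broken out by device:

# Hypothetical counters: (element, device) -> [impressions, clicks]
counters = {
    ("primary_cta", "desktop"): [5400, 432],
    ("primary_cta", "mobile"): [7100, 284],
    ("upgrade_banner", "desktop"): [2300, 46],
}

for (element, device), (impressions, clicks) in counters.items():
    ctr = clicks / impressions * 100 if impressions else 0.0
    print(f"{element} / {device}: {ctr:.1f}% CTR")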

Boosting CTR

  • A/B-test button text that stresses outcome (“Save 10 mins”) over action (“Submit”).
  • Increase visual contrast and whitespace so CTAs pop without feeling spammy.
  • Use micro-animations—subtle pulse or hover state—to signal interactivity.
  • Place contextual prompts near the moment of need, not just at the page top.

8. Feature Adoption Rate

New users might sign up for one killer workflow, but long-term value appears when they begin exploring the rest of your product. Feature Adoption Rate tracks how many active users embrace a specific capability, helping you decide whether to double down, redesign, or sunset it.

Definition

The metric represents the percentage of unique active users who execute at least one qualifying event tied to the feature—opening a dashboard, exporting a report, scheduling an automation—during the analysis period.

Formula

Feature Adoption Rate (%) = (Feature users ÷ Total active users) × 100

Run it weekly after launch, then monthly once usage stabilizes. Anything above 30 % for a marquee feature or 10-15 % for an advanced one is a healthy starting benchmark.

Driving Adoption

  • Insert guided tours or interactive hotspots the first time the feature becomes relevant.
  • Announce value-rich use cases in “What’s New” modals and lifecycle emails.
  • Surface contextual prompts (e.g., “Try bulk edit to save clicks”) triggered by user behavior.
  • Collect friction points with in-app feedback widgets and loop fixes back into the roadmap.

Consistent visibility plus friction removal usually lifts adoption within a sprint or two.

9. Activation Rate

Getting people in the door is easy; proving value fast is the hard part. Activation Rate tracks how many new sign-ups complete the key event that signals “this is useful,” making it a leading indicator for retention and monetization. Nail activation, and later metrics—stickiness, frequency, LTV—tend to move in the right direction almost automatically.

The “Aha” Moment Explained

The activation event must reflect a genuine outcome, not a vanity click. For Slack it’s sending the first message; for Koala Feedback it might be collecting the first piece of user feedback. Define it once and socialize it across product, marketing, and support teams.

Calculation

Activation Rate (%) = (Activated users ÷ Total sign-ups) × 100

Review by cohort (signup week or campaign) to catch onboarding blind spots early.
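
That cohort view is a short script once you know each user’s signup week and whether they hit the activation event. A rough Python sketch with hypothetical data:

from collections import defaultdict

# Hypothetical sign-ups: (user_id, signup_week, reached_activation_event)
signups = [
    ("u1", "2025-W34", True), ("u2", "2025-W34", False),
    ("u3", "2025-W35", True), ("u4", "2025-W35", True),
    ("u5", "2025-W35", False),
]

cohorts = defaultdict(lambda: [0, 0])  # week -> [activated, total]
for _, week, activated in signups:
    cohorts[week][1] += 1
    if activated:
        cohorts[week][0] += 1

for week, (activated, total) in sorted(cohorts.items()):
    print(f"{week}: {activated / total:.0%} activation ({activated}/{total})")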

Ways to Improve

  • Progressive, non-skippable onboarding that walks users straight to the payoff
  • Checklist gamification with visible progress bars and celebratory micro-copy
  • Pre-populated templates or sample data to eliminate blank-state paralysis
  • Time-boxed nurture emails and in-app nudges that highlight the single next step
  • Immediate feedback—confetti, badges, or success modals—when the activation event fires

Continuous A/B testing of these levers typically lifts activation within a few sprints.

10. Retention Rate & Churn Rate

Keeping hard-won users is cheaper—and more sustainable—than recruiting new ones. Retention Rate shows how many people you kept over a given period, while Churn Rate reveals the slice you lost. Track the pair together and you’ll know whether your engagement experiments are truly paying off or just masking an outflow beneath the surface.

Dual Definitions

  • Retention Rate: Percentage of users from a starting cohort who are still active at the end of the period.
  • Churn Rate: Percentage of that same cohort who left (no activity, subscription canceled, or account deleted). They’re mathematical mirror images—when one rises, the other falls.

Cohort-Based Formula

Retention (%) = ((Users at end − New users) ÷ Users at start) × 100
Churn (%)     = 100 − Retention

Run this weekly or monthly by signup cohort to expose silent attrition before it dents revenue.
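
Here is the same cohort formula in Python with hypothetical counts; new users are subtracted so mid-period sign-ups don’t inflate retention:

# Hypothetical counts for one monthly cohort window.
users_at_start = 1200  # active accounts when the period opened
users_at_end = 1150    # active accounts when it closed
new_users = 180        # accounts added during the period

retention = (users_at_end - new_users) / users_at_start * 100
churn = 100 - retention
print(f"Retention: {retention:.1f}%  Churn: {churn:.1f}%")  # Retention: 80.8%  Churn: 19.2%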

Retention Levers

  • Lifecycle email or push sequences that surface underused features
  • Win-back offers for lapsed accounts (discounts, free seat, extension)
  • In-app success tips and nudges tied to stalled workflows
  • Proactive support check-ins when product usage drops sharply
  • Community events or webinars that reconnect drifting users

11. Customer (or User) Lifetime Value (LTV)

Revenue is the scorecard that proves engagement is working. When users keep logging in, adopting new features, and advocating for you, they also renew, expand, and upgrade—behavior that shows up as a higher Lifetime Value.

Engagement-Revenue Link

A healthy LTV means you can spend more on acquisition without bleeding cash. In B2B SaaS, every extra percentage point of activation, stickiness, or retention compounds into months (or years) of added revenue per account.

LTV Formulas

Classic LTV           = ARPU × Gross Margin × Average Customer Lifespan
Subscription Shortcut = ARPU ÷ Monthly Churn Rate
Cohort View           = Σ Net Revenue from cohort ÷ Cohort size

Choose the model that matches your data fidelity and update quarterly to catch trend shifts early.
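
All three variants reduce to a few lines once the inputs are agreed on. A minimal Python sketch with hypothetical numbers (monthly ARPU, churn as a decimal):

# Hypothetical inputs.
arpu = 80.0              # average revenue per user per month
gross_margin = 0.78      # 78%
avg_lifespan_months = 26
monthly_churn = 0.03     # 3%

classic_ltv = arpu * gross_margin * avg_lifespan_months
subscription_ltv = arpu / monthly_churn

# Cohort view: net revenue the cohort has generated so far, per account.
cohort_net_revenue = 96_000.0
cohort_size = 250
cohort_ltv = cohort_net_revenue / cohort_size

print(round(classic_ltv), round(subscription_ltv), round(cohort_ltv))  # 1622 2667 384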

Increasing LTV

  • Turn power users into power payers with seat or usage-based expansion.
  • Package premium support, integrations, or training as add-ons.
  • Offer annual billing discounts to extend lifespan and lower churn risk.
  • Build a customer community that fosters peer learning and reduces support costs.
  • Use product analytics plus feedback loops (hello, Koala!) to refine features users are willing to pay more for.

12. Net Promoter Score (NPS)

Engagement isn’t only clicks and session time—loyal users also talk. Net Promoter Score turns that chatter into a quantifiable gauge of advocacy. A high NPS signals that people find repeated value, trust the brand, and are willing to stake their own reputation on it—all prime indicators of future retention and expansion.

Sentiment as Engagement

Promoters (scores 9–10) behave like free marketing channels: they submit fewer tickets, adopt new features faster, and refer peers. Passives (7–8) are satisfied but silent, while detractors (0–6) often churn or spread negative word-of-mouth. Tracking shifts among these groups adds a qualitative layer to your user engagement metrics stack.

NPS Equation

NPS = (% Promoters − % Detractors)

Scores range from –100 to +100. In SaaS, anything above +30 is considered healthy; +50 is world-class.
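
Scoring raw survey answers is a one-pass loop. A minimal Python sketch with a hypothetical list of 0–10 responses:

# Hypothetical survey responses on the 0-10 scale.
scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]

promoters = sum(1 for s in scores if s >= 9)
detractors = sum(1 for s in scores if s <= 6)
nps = (promoters - detractors) / len(scores) * 100
print(round(nps))  # 30 (5 promoters, 2 detractors, 10 responses)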

Implementing & Acting on NPS

  • Trigger an in-product, one-question survey after users complete a high-value action.
  • Follow up instantly: ask promoters for reviews or referrals; route detractors to a concierge support path.
  • Tag responses by account size, plan, or feature usage to uncover segmented pain points.
  • Feed qualitative comments into your Koala Feedback board so product teams can prioritize fixes that convert detractors into promoters.

Regular pulse checks (quarterly or after major releases) keep the score—and the conversation—current.

13. Customer Effort Score (CES)

Few things kill enthusiasm faster than friction. Customer Effort Score quantifies how easy—or maddening—it is for users to achieve a task inside your product. Lower effort consistently links to higher repeat usage, stronger loyalty, and fewer rage-tickets, making CES a handy early warning light before churn shows up in revenue reports.

Impact on Future Engagement

A user who glides through setup or support is far more likely to activate secondary features, leave positive reviews, and upgrade later on. Gartner research even ties a one-point reduction in customer effort to a 22 % increase in repurchase intent, underscoring why “easy” beats “delight” in day-to-day UX.

Survey & Scoring

Ask a single, time-boxed question after a key workflow:
“On a scale of 1 (very difficult) to 7 (very easy), how easy was it to ___?”

CES = Sum of all scores ÷ Number of responses

Some teams prefer a 1–5 Likert scale—just keep it consistent so trends stay clean.

Reducing Effort

  • Offer self-serve articles and AI search before routing to support
  • Use inline, context-aware tooltips instead of generic walkthroughs
  • Implement keyboard shortcuts and bulk actions for power users
  • Auto-save preferences to cut repetitive setup steps
  • Regularly review Koala Feedback tickets tagged “confusing” to prioritize UX fixes

Shaving even seconds from a common task can nudge CES upward and, with it, overall engagement.

14. Bounce Rate & Engagement Rate (GA4)

When someone lands on your site and ghosts after a single view, that’s a bounce—classic “thanks, but no thanks.” Google Analytics 4 pushes the conversation forward with Engagement Rate, which flips the lens to highlight sessions that did stick around for at least 10 seconds, triggered a conversion event, or viewed two or more pages. Tracking both side-by-side shows whether changes are truly inviting exploration or merely masking exits.

Legacy vs. Next-Gen Metrics

  • Bounce Rate (Universal Analytics era) flags one-page sessions with no further interaction—useful for spotting irrelevant traffic or slow pages.
  • Engagement Rate (GA4) spotlights positive behavior: it counts sessions that meet minimal interaction thresholds, giving credit where it’s due even on single-page apps.

Formulas

Bounce Rate (%)     = (Single-page sessions ÷ Total sessions) × 100
Engagement Rate (%) = (Engaged sessions ÷ Total sessions) × 100
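
To see how the two lenses diverge, classify each session once and compute both rates. A minimal Python sketch with hypothetical session records of (page_views, duration_seconds, fired_conversion):

# Hypothetical sessions: (page_views, duration_seconds, fired_conversion)
sessions = [(1, 4, False), (1, 45, False), (3, 210, True), (2, 12, False), (1, 8, False)]

total = len(sessions)
single_page = sum(1 for views, _, _ in sessions if views == 1)
engaged = sum(1 for views, secs, conv in sessions if secs >= 10 or conv or views >= 2)

bounce_rate = single_page / total * 100
engagement_rate = engaged / total * 100
print(f"Bounce: {bounce_rate:.0f}%  Engagement: {engagement_rate:.0f}%")  # Bounce: 60%  Engagement: 60%

Note that the two rates are not simple complements: a single-page session lasting 45 seconds counts as a bounce under the legacy definition yet still qualifies as engaged in GA4.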

Improvement Ideas

  • Trim First Contentful Paint (FCP) and Largest Contentful Paint (LCP) to under 2 s.
  • Lead with a crystal-clear headline that nails the problem you solve.
  • Place a visually dominant primary CTA above the fold.
  • Preload hero images and defer non-critical scripts.
  • Offer an in-page table of contents so visitors can jump straight to what matters.

15. Time to First Action (TTFA)

The longer someone stares at an empty dashboard, the more likely they are to close the tab and never return. Time to First Action zeroes in on that danger zone by tracking how quickly a new visitor performs the very first meaningful event—uploading a file, submitting feedback, sending a message. It’s the quantitative heartbeat of “speed-to-value,” and a stubbornly high TTFA often foreshadows weak activation and soaring churn.

Why Speed-to-Value Counts

Users judge usefulness in minutes, not days. A brisk TTFA correlates with higher activation, stronger word-of-mouth, and lower support volume because early momentum builds confidence and curiosity.

Measuring

TTFA = Timestamp of first key event – Session start timestamp

Pull the delta for each new account, then chart the median by signup cohort to expose onboarding regressions.
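
A quick Python sketch of that cohort view, assuming hypothetical records of each account’s first-session start and first key event timestamps:

from datetime import datetime
from statistics import median

# Hypothetical records: (signup_week, session_start, first_key_event)
records = [
    ("2025-W35", datetime(2025, 8, 25, 9, 0), datetime(2025, 8, 25, 9, 2)),
    ("2025-W35", datetime(2025, 8, 26, 14, 0), datetime(2025, 8, 26, 14, 20)),
    ("2025-W36", datetime(2025, 9, 2, 11, 0), datetime(2025, 9, 2, 11, 1)),
]

by_cohort = {}
for cohort, start, first_action in records:
    by_cohort.setdefault(cohort, []).append((first_action - start).total_seconds())

for cohort, deltas in sorted(by_cohort.items()):
    print(f"{cohort}: median TTFA {median(deltas) / 60:.1f} min")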

How to Shorten

  • Pre-fill sample data so nothing looks blank.
  • Use interactive walkthroughs that guide clicks, not just show tooltips.
  • Collapse optional fields behind an “Advanced” toggle to reduce cognitive load.
  • Trigger contextual nudges if no action occurs within X seconds.
  • Celebrate the first action with instant visual feedback—confetti, badge, or success modal.

Consistently trimming friction here pays compounding dividends across every downstream engagement metric.

16. Social Shares & Referral Rate

When people broadcast your product to friends or colleagues, they’re vouching for its value—and importing fresh prospects at near-zero cost. Tracking how often users share content or invite others captures this “viral loop” and translates it into concrete engagement data.

Metrics & Formulas

  • Shares per User gauges organic buzz:
    Shares per User = Total social shares ÷ Active users
    
  • Referral Rate shows how many new accounts stem from invitations:
    Referral Rate (%) = (Referred sign-ups ÷ Total sign-ups) × 100
    

Aim for a steady upward trend; even a 2–3 % referral lift can slash acquisition spend.

Encouragement Tactics

  • Embed one-click share buttons with prewritten, benefit-driven copy.
  • Offer dual-sided credits or swag when a referral activates.
  • Spotlight success stories from top referrers to spark friendly competition.
  • Trigger in-app prompts right after a user reaches a milestone (“Loved that feature? Share it!”).
  • Make invite flows seamless—autofill email contacts and allow personal notes.

Treat sharing friction like any other UX bug, and watch engagement compound through word-of-mouth.

17. Support Ticket Volume per User

Customer questions are inevitable, but an avalanche of tickets can bury your support team and hint at product friction. Support Ticket Volume per User shows how many help requests the average active user opens during a chosen window. Trend it alongside release dates and onboarding cohorts to spot whether spikes come from healthy growth or confusing UX changes.

Balance Between Usage & Friction

A mild uptick after a feature launch often reflects exploration—good news. A sustained climb, especially paired with falling CES or NPS, screams “usability issue.” Segment tickets by topic to see if the same workflow causes repeat pain.

Formula

Ticket Volume per User = Total support tickets in period ÷ Active users in period

Reduction Strategies

  • Build a searchable, video-rich knowledge base and surface it contextually.
  • Add predictive in-app tips that answer common “how do I…?” moments.
  • Simplify workflows: fewer steps, clearer labels, smarter defaults.
  • Tag tickets in Koala Feedback to prioritize high-frequency friction points for the next sprint.

Continual ticket deflection boosts user satisfaction and frees support for high-value conversations.

18. Engagement Score Composite

No single metric nails the full picture. A composite engagement score rolls your most telling KPIs into one number that everyone—from product to execs—can scan at a glance. Think of it as an engagement “credit score” that updates each sprint and flags shifts before churn or support tickets spike.

Building a Custom Index

Pick 3–5 metrics that align with your growth model—e.g., Session Frequency, Feature Adoption Rate, NPS, and CES. Normalize each input to a common 0–100 scale so no single metric’s units dominate, then assign weights that mirror strategic importance (weights must sum to 1). Keep the recipe transparent so teams know which lever to pull when the score dips.

Sample Formula

Engagement Score = (0.4 × Session Frequency)
                 + (0.3 × Feature Adoption Rate)
                 + (0.3 × NPS)
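
Because those inputs live on different scales (sessions per user, a percentage, and a score from −100 to +100), the Python sketch below shows one way to apply the normalization before the weights; the current values and benchmark ranges are hypothetical and should come from your own baselines:

def scale(value, low, high):
    """Map a metric onto 0-100 against a benchmark range, clamped."""
    pct = (value - low) / (high - low) * 100
    return max(0.0, min(100.0, pct))

# Hypothetical current values mapped onto hypothetical benchmark ranges.
session_frequency = scale(3.2, low=0, high=10)    # sessions per user per week
feature_adoption = scale(28.0, low=0, high=100)   # already a percentage
nps = scale(24.0, low=-100, high=100)             # -100..100 -> 0..100

weights = {"session_frequency": 0.4, "feature_adoption": 0.3, "nps": 0.3}
engagement_score = (weights["session_frequency"] * session_frequency
                    + weights["feature_adoption"] * feature_adoption
                    + weights["nps"] * nps)
print(round(engagement_score, 1))  # 39.8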

Practical Uses

  • Bucket users into power, healthy, and at-risk segments to personalize nurture flows.
  • Set quarterly OKRs around moving the composite score, not vanity hits.
  • Spotlight anomalies after releases—if the score drops while DAU climbs, dig for friction.
  • Feed the score into renewal and upsell models to predict revenue more accurately.

With one metric, you’ll rally the whole org around engagement, not just eyeballs.

Quick Wrap-Up

Data is loud; insight is selective. By pairing behavioral signals (DAU, session frequency), financial KPIs (LTV), and sentiment scores (NPS, CES), you create a 360° view that explains what users do and why they do it. No single metric wins on its own—movement in one should always be read alongside two or three companions to confirm the story.

Start small: pick one activation metric, one habit metric, and one loyalty metric that map directly to your current objective. Instrument them well, set a baseline, then run one experiment at a time. When numbers move, dig into qualitative feedback to uncover root causes, rinse, and repeat. Iteration—not dashboard sprawl—drives real engagement gains.

Need help capturing that qualitative “why”? Spin up a free feedback board with Koala Feedback and let users tell you exactly where to steer the next release. Happy measuring!
