Key Takeaways
- Few KPIs, clearly prioritised: Conversion rate, CPL, CAC and ROAS beat any reach statistic. Those who measure everything steer nothing.
- GA4 changes the logic: Bounce rate in GA4 is now only the inverse of the engagement rate (Google Analytics Help).
- Consent duty in 2026: Since 16 January 2024, a certified CMP for personalised ads has been mandatory in the EEA and UK, and since 31 July 2024 in Switzerland as well (Google AdSense Help).
- Page Experience counts: LCP under 2.5 s, INP under 200 ms, CLS under 0.1 are the thresholds (web.dev).
Marketing gets measured, or it stays a gut call. Once a quarterly budget passes 50,000 euros, very few executive teams still accept reach charts as proof of success. The central question for 2026 is no longer "Which tools should we use?" but rather: which few online marketing KPIs really show whether the budget is working? This article shows which metrics matter today, what has changed with GA4 (Google Analytics Help), and how you turn data into decisions.
Which online marketing KPIs really matter in 2026?
The honest answer is uncomfortable: two to four. No more. In B2B these are usually conversion rate, cost per lead, customer acquisition cost and return on ad spend. Everything else is diagnostics. Google defines a conversion as a valuable user action you specify (Google Ads Help), and that definition is the anchor for any analysis.
In our work with mid-market clients, I see a recurring pattern: marketing teams maintain dashboards with 30 tiles, yet still make decisions on intuition. The reason is that too many metrics compete, and nobody knows which movement actually matters. In workshops, we almost always reduce the set to four lead KPIs, with diagnostic metrics clearly separated one layer below.
The split between north-star KPI and diagnostic KPI is essential. The north star is the economic outcome: customers won, quarterly revenue, pipeline value. Diagnostic KPIs explain why the north-star number moves. Click-through rate, engagement rate or session duration are diagnostic KPIs. They explain. They do not judge.
When you plan marketing strategically, the budget should follow suit. A clean allocation does not start with tools, it starts with goals. How to build the marketing budget step by step is covered in a separate guide.
A short rule of thumb helps with selection: a KPI belongs on the dashboard when it meets three conditions. First, it is cleanly defined and reproducibly measurable. Second, a 20 percent change would trigger an action. Third, someone on the team owns it. If a metric fails even one of these criteria, it is diagnostic material, not a steering KPI.
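The rule of thumb can be written down as a quick filter for dashboard reviews. This is a sketch only; the field names and example metrics are our own invention, not an established framework:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Metric:
    name: str
    clearly_defined: bool         # cleanly defined and reproducibly measurable
    action_on_20pct_change: bool  # a 20 percent change would trigger an action
    owner: Optional[str]          # someone on the team owns it

def is_steering_kpi(m: Metric) -> bool:
    """A metric belongs on the dashboard only if all three conditions hold."""
    return m.clearly_defined and m.action_on_20pct_change and m.owner is not None

ctr = Metric("Click-through rate", True, False, None)
cpl = Metric("Cost per lead", True, True, "Head of Marketing")

print(is_steering_kpi(ctr))  # False -> diagnostic material
print(is_steering_kpi(cpl))  # True  -> steering KPI
```

Failing even one test demotes the metric to the diagnostic layer, which keeps the top of the dashboard small by construction.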
What changed with GA4?
Universal Analytics stopped processing new data on 1 July 2023 (Google Analytics Help). That matters because many reports still rely on UA-era definitions, such as bounce rate as "single-pageview session" or pages per session as the central engagement metric. Neither exists in that form in GA4.
Bounce rate is no longer what it was
In GA4, bounce rate is the inverse of engagement rate. According to Google, a session counts as "engaged" if it lasts longer than ten seconds, fires a conversion event or contains at least two page views (Google Analytics Help). A "bounce" is therefore not a single-pageview session anymore, but a non-engaged session. Comparing GA4 bounce values with old UA reports compares two different metrics.
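Google's published rule can be restated as a tiny classifier. This sketches the definition only, not GA4's internal processing; the parameter names are ours:

```python
def is_engaged(duration_s: float, conversion_events: int, pageviews: int) -> bool:
    """GA4 counts a session as engaged if it lasts longer than 10 seconds,
    fires a conversion event, or contains at least two page views."""
    return duration_s > 10 or conversion_events >= 1 or pageviews >= 2

def is_bounce(duration_s: float, conversion_events: int, pageviews: int) -> bool:
    # A GA4 bounce is simply a non-engaged session.
    return not is_engaged(duration_s, conversion_events, pageviews)

print(is_bounce(8, 0, 1))   # True: short, no conversion, single page view
print(is_bounce(8, 0, 2))   # False: two page views make the session engaged
```

Note that a UA "bounce" (single-pageview session) and a GA4 bounce overlap but are not the same set, which is why cross-era comparisons mislead.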
Pageviews per session is no longer the lead KPI
Universal Analytics put pageviews per session forward prominently as an engagement indicator. GA4 reframes the engagement concept: the central metrics are now "engaged sessions" and "average engagement time", and pageviews are called "views" in GA4, counted across web and app combined (Google Analytics Help). If a 2026 report still uses classic pageview ratios as a lead KPI, it is time to switch to engagement values.
Third-party cookies, the surprise status
In April 2025, Google announced it would no longer pursue automatic deprecation of third-party cookies in Chrome. Users keep control through their privacy settings instead (Privacy Sandbox). That does not mean tracking gets easier. ITP in Safari, strict consent requirements in the EU and shrinking cookie lifetimes still make server-side tagging a sensible investment (Google Tag Platform).
Which metrics should you systematically track?
Track what you actually decide on. A rough hierarchy has proven itself in many mid-market projects: north-star outcomes at the top, a few performance KPIs per channel below, and technical health metrics at the bottom. Most companies need no more layers than that.
| Layer | Metric | Why it counts |
|---|---|---|
| Outcome | Conversion rate | Measures the share of sessions with a target action |
| Outcome | Cost per Lead (CPL) | Shows efficiency of lead generation |
| Outcome | Customer Acquisition Cost (CAC) | Evaluates the cost per won customer |
| Outcome | Return on Ad Spend (ROAS) | Ratio of revenue to ad spend |
| Diagnostic | Engagement rate (GA4) | Replaces classic bounce rate |
| Diagnostic | Click-through rate (CTR) | Evaluates ads, snippets, emails |
| Diagnostic | Average engagement time | Content quality per session |
| Health | LCP, INP, CLS | Google's page experience thresholds |
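The four outcome metrics reduce to simple ratios. A minimal sketch with invented figures (all numbers are illustrative, not benchmarks):

```python
sessions = 10_000
conversions = 250        # sessions with a defined target action
leads = 250
customers = 40           # leads that became paying customers
ad_spend = 12_500.0      # euros
revenue = 60_000.0       # euros attributed to the ads

conversion_rate = conversions / sessions  # share of sessions converting
cpl = ad_spend / leads                    # cost per lead
cac = ad_spend / customers                # cost per won customer
roas = revenue / ad_spend                 # revenue per advertising euro

print(f"Conversion rate: {conversion_rate:.1%}")   # 2.5%
print(f"CPL: {cpl:.2f} EUR, CAC: {cac:.2f} EUR")   # 50.00 EUR, 312.50 EUR
print(f"ROAS: {roas:.1f}")                         # 4.8
```

The gap between CPL (50 euros) and CAC (312.50 euros) in this example shows why CAC is the more honest figure: most leads never become customers.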
Conversion rate, the most important outcome metric
Conversion rate measures the share of all sessions in which users perform a defined target action (Google Ads Help). In B2B these are usually two macro conversions: demo booking and a qualified inquiry form. Beyond that, it pays to define micro conversions, such as PDF downloads or webinar sign-ups, because they make trust measurable.
Engagement time instead of session duration
Average engagement time is the GA4 equivalent of session duration. It only counts the seconds a tab was in the foreground. That is much more meaningful than the old UA session duration, which was often skewed by open but inactive tabs. If you want to test content, compare engagement rate and engagement time as a pair.
In practice that means: a session with ten seconds of engagement time and no conversion event is no proof of strong content marketing. A session with forty seconds of engagement time, followed by a PDF download, is closer. Only the combination delivers signal. Do not rely on a single engagement number, build pairs of quantity and quality.
Page experience health
Core Web Vitals are a confirmed page experience signal in Google Search (web.dev). Since 12 March 2024, INP (Interaction to Next Paint) has replaced FID as the Core Web Vital for interactivity (web.dev). Thresholds for "good": LCP under 2.5 seconds, INP under 200 milliseconds, CLS under 0.1. These values belong in every marketing dashboard, not only in tech reporting, because they influence conversion rate.
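The thresholds can be checked mechanically when pulling field data into the dashboard. A sketch with metric labels of our own choosing (`lcp_s`, `inp_ms`, `cls` are not an official API):

```python
# "Good" thresholds per web.dev: LCP <= 2.5 s, INP <= 200 ms, CLS <= 0.1.
THRESHOLDS = {"lcp_s": 2.5, "inp_ms": 200, "cls": 0.1}

def page_experience_status(lcp_s: float, inp_ms: float, cls: float) -> dict:
    """Return a per-metric pass/fail against the 'good' thresholds."""
    measured = {"lcp_s": lcp_s, "inp_ms": inp_ms, "cls": cls}
    return {name: measured[name] <= limit for name, limit in THRESHOLDS.items()}

print(page_experience_status(lcp_s=1.8, inp_ms=150, cls=0.05))
# {'lcp_s': True, 'inp_ms': True, 'cls': True}
print(page_experience_status(lcp_s=4.5, inp_ms=320, cls=0.02))
# {'lcp_s': False, 'inp_ms': False, 'cls': True}
```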
How much should a lead or customer cost?
The truly useful question is not "What is a good CPL?" but rather: "At what CPL does the business model stay viable?" The answer comes from a simple ratio, CAC to CLV (customer lifetime value). If the ratio sits at 1:3 or better, the system is healthy. If it sits at 1:1, you grow yourself broke.
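The viability question becomes concrete as arithmetic. A sketch, where `max_viable_cpl` is a hypothetical helper working backwards from CLV to a CPL ceiling; the 1:3 target and the lead-to-customer rate are assumptions, not universal constants:

```python
def cac_clv_ratio(cac: float, clv: float) -> float:
    """CLV divided by CAC; 3.0 or higher corresponds to a healthy 1:3 ratio."""
    return clv / cac

def max_viable_cpl(clv: float, lead_to_customer_rate: float,
                   target_ratio: float = 3.0) -> float:
    """Hypothetical helper: derive the CPL ceiling from CLV and close rate."""
    max_cac = clv / target_ratio
    return max_cac * lead_to_customer_rate

print(cac_clv_ratio(cac=400, clv=1_200))                      # 3.0 -> healthy
print(f"{max_viable_cpl(clv=1_200, lead_to_customer_rate=0.2):.2f}")  # 80.00
```

Read backwards: with a 1,200-euro CLV and one customer per five leads, paying more than about 80 euros per lead erodes the 1:3 safety margin.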
Cost per lead describes the budget per prospect, cost per acquisition or customer acquisition cost the budget per paying customer. CAC is the more honest figure because it includes the full chain. If 80 percent of leads are filtered out during qualification, a cheap CPL is still expensive.
ROAS, return on ad spend, directly measures the ratio of advertising revenue to advertising cost. In e-commerce, ROAS is the lead currency. In B2B it usually sits downstream, because the customer journey is longer and offline touchpoints play a role. Here it helps to understand your customers' buying process and to attribute interim values fairly along the phases.
The connection to the website is closer than many teams assume. A strong ad CTR will not save a weak landing page. To bring the CPL down, the work often starts where the ad account ends: with the trust, clarity and speed of a professionally built website. How trust elements make confidence visible has a direct effect on conversions.
Concretely: poor page experience makes every ad more expensive. If a landing page loads in 4.5 seconds instead of 1.8, conversion rate drops, CPL rises, CAC rises with it. This chain often only becomes visible when a tracking audit lays outcome and health data side by side. In practice, that is regularly where the fastest impact per euro of investment sits, faster than any in-account ad optimisation.
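A worked example makes the chain visible. All numbers are invented for illustration; in particular, the conversion rates assumed for the fast and slow page are assumptions, not measured benchmarks:

```python
# Same ad spend and clicks; only the landing-page conversion rate differs.
ad_spend = 5_000.0
clicks = 2_000

fast_cr = 0.05  # assumed conversion rate at 1.8 s load time
slow_cr = 0.03  # assumed conversion rate at 4.5 s load time

cpl_fast = ad_spend / (clicks * fast_cr)
cpl_slow = ad_spend / (clicks * slow_cr)

print(f"CPL fast page: {cpl_fast:.2f} EUR")  # 50.00 EUR
print(f"CPL slow page: {cpl_slow:.2f} EUR")  # 83.33 EUR
```

Nothing in the ad account changed, yet the cost per lead rose by two thirds; the same multiplier then propagates into CAC.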
Common mistakes in KPI analysis, and how to avoid them
In audits, four mistakes show up so often that they qualify as patterns. They cost companies more money than any single campaign.
Confusing reach with success
Visibility is a precondition, not a proof. A campaign with two million impressions can be an economic loss if no inquiries result. Reach belongs on the diagnostic layer, never on the outcome layer. A list of typical stumbling blocks is in common online marketing mistakes to avoid.
Tracking gaps no one notices
Without clean conversion tracking, every other metric is chance. Classic gaps include form-submit events that never fire, double-counted purchases, and cross-domain tracking that breaks across subdomains. On top of that comes the duty to use a TCF-certified consent management platform for personalised ads, since 16 January 2024 in the EEA and UK, and since 31 July 2024 also in Switzerland (Google AdSense Help). If you do not solve this technically, you lose not only data but also bidding quality.
Collecting data without making decisions
Many dashboards run without ever triggering a decision. A simple test: if a KPI doubled this week, would anyone do something differently? If the answer is no, that KPI does not belong on the dashboard. In quarterly workshops, we often delete half the tiles and replace them with two clear decision triggers.
Setting vanity conversions
Newsletter sign-ups as a macro conversion, a "contact" click as a lead, a "learn more" click as engagement: such definitions inflate success numbers without producing any business. Define macro conversions strictly: an inquiry with real intent, a demo booking, a purchase. Everything else stays a micro conversion. Thinking through how to turn website visitors into customers cleanly is essential.
Adopting last-click attribution uncritically
Standard reports often credit the last touchpoint with the conversion. In B2B with customer journeys spanning weeks or months, that is rarely accurate. A LinkedIn ad creates first awareness, a brand search on Google later leads to the inquiry. If you only credit the brand search, you have built a cheap-looking acquisition that would not have worked without the LinkedIn awareness. Data-driven attribution, or at least side-by-side comparisons between first-click, last-click and linear attribution, helps make distortions visible. Workshops often reveal here that ostensibly expensive awareness channels are in fact profitable.
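The model comparison can be sketched in a few lines. This is a simplified illustration of first-click, last-click and linear credit, not how any specific analytics tool implements attribution; the journey itself is an invented example:

```python
# One conversion, three ordered touchpoints on the journey towards it.
journey = ["linkedin_ad", "organic_blog", "brand_search"]

def attribute(journey: list, model: str) -> dict:
    """Distribute one unit of conversion credit across the touchpoints."""
    credit = {touch: 0.0 for touch in journey}
    if model == "first_click":
        credit[journey[0]] = 1.0
    elif model == "last_click":
        credit[journey[-1]] = 1.0
    elif model == "linear":
        share = 1.0 / len(journey)
        for touch in journey:
            credit[touch] += share
    return credit

for model in ("first_click", "last_click", "linear"):
    print(model, attribute(journey, model))
```

Under last-click, the LinkedIn ad earns nothing; under linear, it carries a third of the credit. The spread between the models is itself the diagnostic signal.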
How do you build a reliable KPI dashboard?
A good dashboard is not the one with the most blinking tiles. It is the one in which the quarterly decision can be made in two glances. Four building blocks tend to work in practice.
First: a clearly defined conversion plan in tracking. Macro and micro conversions are named, events fire server-side, consent is clean. Second: an outcome view. Conversion rate, CPL, CAC, ROAS per channel, no clutter. Third: a diagnostic view that is consulted only when the outcome view shows an anomaly. Fourth: a tech health view with Core Web Vitals, crawling and index status.
Maintain the dashboard on a quarterly cadence, not daily. Daily fluctuations lead to micro-optimisation instead of strategy. Clean content management with consistent templates also keeps tracking implementations reproducible per new page, which directly affects data quality.
Which tools do you really need?
Three are enough for most SMEs. GA4 as the web analytics backbone, Google Tag Manager (ideally with a server-side container) as the data hub, and a simple BI or Looker Studio dashboard built on your conversion data. CRM data joins in as soon as you want to compare qualified leads per channel. Server-side tagging gives you additional control over data collection and reduces data loss caused by browser tracking protection (Google Tag Platform).
How often should you review KPIs?
Outcome KPIs belong on a quarterly rhythm, with a monthly interim look. Look at diagnostic KPIs when outcome KPIs move noticeably. Tech health values once a month, or with every larger release. Watching ROAS daily usually optimises noise. Going through an anomaly list once a week leads to better decisions, especially in setups with long customer journeys, where a serious website determines lead generation.
What role does data quality play?
KPIs are only as reliable as the tracking behind them. Three data-source problems show up especially often. First: tag manager containers that have grown organically over time, with old tags never deactivated. Second: missing cross-domain links, for example between the main domain and a booking or shop subdomain. Third: inconsistent UTM parameters, because different teams maintain their own naming conventions. A short data-quality checklist before every quarterly report pays off:
- Were all expected events fired?
- Does the conversion count match the CRM data?
- Are there day-level breaks pointing to tracking outages?
Only when all three questions are clearly answered "yes" does the discussion about optimisation make sense. Before that, you steer on the basis of distorted data.
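A naming-convention check for the third problem can be automated. A minimal sketch; the allowed sources and the campaign pattern are invented examples of a convention, not a standard:

```python
import re

# Illustrative convention: lowercase, hyphen-separated, fixed source list.
ALLOWED_SOURCES = {"google", "linkedin", "newsletter"}
CAMPAIGN_PATTERN = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

def check_utm(source: str, campaign: str) -> list:
    """Return a list of convention violations; an empty list means clean."""
    problems = []
    if source not in ALLOWED_SOURCES:
        problems.append(f"unknown utm_source: {source}")
    if not CAMPAIGN_PATTERN.match(campaign):
        problems.append(f"utm_campaign violates naming convention: {campaign}")
    return problems

print(check_utm("google", "q3-demo-launch"))  # [] -> clean
print(check_utm("Google", "Q3_Demo Launch"))  # two violations
```

Run over an export of recent campaign URLs, a check like this surfaces the fragmentation ("Google" vs "google") that silently splits one channel into several report rows.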
From Evelan's Practice
Mini-case: B2B SaaS provider in Northern Germany with a customer portal
Starting point: fragmented tracking, no unified KPI focus. Inquiries came in, but nobody knew reliably from which channel. Measures: GA4 plus server-side conversion tracking on inquiry form and demo booking, clearly defined micro and macro conversions, one quarterly KPI dashboard with CPL comparison per channel. Result within six months: clear visibility on the lead mix. One SEA channel delivered leads roughly three times cheaper than parallel LinkedIn ads. Budget shifted, CPL halved. No relaunch, just a clean measurement architecture.
Frequently Asked Questions
Which online marketing KPI is the most important?
There is no single most important KPI. In B2B, conversion rate, cost per lead, customer acquisition cost and ROAS lead the field. Google defines conversions as the actions valuable to your business (Google Ads Help). Reach and clicks are diagnostic metrics, not success in themselves.
Related Evelan articles
- Marketing Budget Planning: The Guide for B2B SMEs
- How companies can avoid common online marketing mistakes
- Turn Website Visitors Into Customers: 9 Conversion Levers
- Micro-Conversions: Make Trust Measurable
- Why is a professional website crucial for lead generation?
Sources
- Google Analytics Help: Engagement rate and bounce rate (2024)
- Google Analytics Help: Comparing metrics: Universal Analytics vs GA4 (2024)
- Google Analytics Help: Universal Analytics is going away (2023)
- Google Ads Help: About conversion tracking (2024)
- Google AdSense Help: Google consent management requirements for serving ads in the EEA, UK, and Switzerland (2024)
- Google Tag Platform: Server-side tagging documentation (2024)
- web.dev: Web Vitals (2024)
- web.dev: INP becomes a Core Web Vital on March 12 (2024)
- Privacy Sandbox: Next steps for Privacy Sandbox and tracking protections (2025)



