Measuring what doesn’t click: real KPIs for SEO in the LLM (AI) era

There was a time when measuring SEO was relatively simple.
Organic traffic, rankings, CTR, conversions.
Imperfect, yes. But understandable.
Today, that framework falls short.
Not because SEO has stopped working, but because user behaviour has changed… and with it, the role search engines and language models play.
In many contexts, the answer no longer leads to a click.
And yet, the influence is still there.
The problem isn’t the lack of data — it’s the mental model
When LLMs, generated answers and AI Overviews show up, the first reaction is often defensive:
“We’re losing traffic.”
Sometimes that’s true.
But it’s not the most important question.
The real question is:
are we measuring the right kind of impact for this new environment?
For years, we’ve confused clicks with value.
Now that clicks don’t always happen, we’ve lost our compass.
When visibility doesn’t end on your website
Language models don’t browse like people do.
They don’t compare ten tabs.
They don’t scroll.
They synthesise.
And in doing so, they decide which sources, brands and concepts shape the answer.
That introduces a deep shift:
- Your content can influence without getting visits.
- Your brand can appear without being linked.
- Your authority can grow without showing up in Analytics.
If we only look at traffic, the diagnosis will always be incomplete.
KPIs that still matter (even without a click)
There’s no magic dashboard for SEO in the LLM era.
But there are cumulative signals that, when read together, tell a fairly clear story.
1. Brand demand (direct and assisted)
When a brand starts showing up as a reference in generated answers, something interesting often happens:
branded search grows.
Not always immediately or linearly.
But it’s one of the strongest indicators of real impact.
Not because the user clicked before, but because they saw you.
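One rough way to watch this signal is to track the branded share of search impressions over time, for example from a search-console export. A minimal sketch, where the data, column layout and brand terms are all hypothetical:

```python
# Hypothetical monthly query data, e.g. exported from a search console tool.
# Each row: (month, query, impressions). All figures are illustrative.
monthly_queries = [
    ("2025-01", "acme pricing", 1200),
    ("2025-01", "crm software", 8000),
    ("2025-02", "acme reviews", 2100),
    ("2025-02", "crm software", 7900),
]

BRAND_TERMS = ("acme",)  # assumed brand tokens for this example


def branded_share(rows):
    """Share of total impressions coming from branded queries, per month."""
    totals, branded = {}, {}
    for month, query, impressions in rows:
        totals[month] = totals.get(month, 0) + impressions
        if any(term in query.lower() for term in BRAND_TERMS):
            branded[month] = branded.get(month, 0) + impressions
    return {m: round(branded.get(m, 0) / t, 3) for m, t in sorted(totals.items())}


print(branded_share(monthly_queries))  # → {'2025-01': 0.13, '2025-02': 0.21}
```

A rising curve here, with flat or falling non-branded traffic, is exactly the "they saw you before they searched for you" pattern described above.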
2. Share of voice in hybrid SERPs
It’s no longer enough to measure classic rankings.
You also need to watch:
- Presence in featured snippets.
- Mentions inside AI Overviews.
- Visibility in extended generated answers.
It’s not trivial to measure, but it’s measurable with method and consistency.
And, most importantly, with context.
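The "method and consistency" part can be as simple as logging, per tracked keyword, which SERP features you appeared in, then computing share of voice per feature. A sketch with invented observations (the feature labels and data would come from whatever rank tracker or manual audit you use):

```python
from collections import Counter

# Hypothetical SERP observations, collected by a tracker or by hand.
# Each record: (keyword, feature, brand_present). All data is illustrative.
observations = [
    ("crm software", "featured_snippet", True),
    ("crm software", "ai_overview", False),
    ("best crm", "ai_overview", True),
    ("best crm", "organic_top10", True),
    ("crm pricing", "featured_snippet", False),
    ("crm pricing", "ai_overview", True),
]


def share_of_voice(rows):
    """Fraction of tracked SERPs, per feature, where the brand appears."""
    seen, present = Counter(), Counter()
    for _, feature, brand_present in rows:
        seen[feature] += 1
        if brand_present:
            present[feature] += 1
    return {f: round(present[f] / n, 2) for f, n in seen.items()}


print(share_of_voice(observations))
# → {'featured_snippet': 0.5, 'ai_overview': 0.67, 'organic_top10': 1.0}
```

The absolute numbers matter less than the trend per feature: growing presence in AI Overviews can coexist with flat classic rankings.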
3. Unlinked mentions (brand citations)
For years, unlinked mentions were treated as “secondary”.
Today, they’re a key signal.
Language models rely on semantic and reputational patterns.
Being cited matters, even without a link.
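Counting those citations doesn't need heavy tooling. A naive sketch: strip out anchor elements from a page's HTML, then count the brand mentions that remain (the page, brand name and markup here are all invented):

```python
import re

# Hypothetical HTML from a monitored page. Illustrative only.
html = (
    '<p>We compared <a href="https://acme.example">Acme</a> with others.</p>'
    '<p>Acme was the most accurate.</p>'
)


def unlinked_mentions(html_text, brand):
    """Count brand mentions outside <a> elements (naive regex sketch)."""
    # Remove anchor elements entirely, then count the remaining mentions.
    without_links = re.sub(r"<a\b[^>]*>.*?</a>", "", html_text, flags=re.I | re.S)
    return len(re.findall(re.escape(brand), without_links, re.I))


print(unlinked_mentions(html, "Acme"))  # → 1
```

For production use you'd want a real HTML parser rather than regexes, but the metric itself is this simple: mentions minus links.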
4. “Dark” leads
It’s a pattern we’re seeing more and more:
- Traffic doesn’t grow.
- Opportunities do.
Leads that arrive saying:
“I saw you being recommended.”
“You showed up in an answer.”
“Your name rang a bell.”
No UTM will capture that.
But ignoring it would be a mistake.
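What you can do is capture it imperfectly: a free-text "how did you hear about us?" field, plus a simple classifier that buckets answers hinting at an AI touchpoint. A sketch, where the answers and the hint phrases are assumptions for illustration:

```python
import re

# Hypothetical free-text answers to "How did you hear about us?".
answers = [
    "You showed up in a ChatGPT answer",
    "A colleague recommended you",
    "I saw you in an AI overview",
    "Google search",
]

# Assumed phrases suggesting an AI-assisted touchpoint; extend as needed.
AI_HINTS = re.compile(r"\b(chatgpt|ai overview|ai answer|copilot|gemini)\b", re.I)


def tag_source(answer):
    """Bucket a free-text answer as AI-assisted or not (keyword sketch)."""
    return "ai_assisted" if AI_HINTS.search(answer) else "other"


print([tag_source(a) for a in answers])
# → ['ai_assisted', 'other', 'ai_assisted', 'other']
```

It won't replace attribution, but it turns "no UTM will capture that" into a rough, countable signal.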
Measuring SEO for LLMs means accepting imperfection
One of the uncomfortable truths of this stage is simple:
not everything will be perfectly attributable.
There will be:
- More influence than traceability.
- More accumulation than spikes.
- More brand than isolated keywords.
And that requires maturity — especially in environments where reporting has always been surgical.
What is definitely a mistake
There are two common reactions that, in my view, don’t help:
- Deny the shift and keep measuring as if nothing has changed.
- Embrace fuzzy metrics without judgement or context.
No nostalgia. No blind faith.
What we need now is to think better about what we measure — and why.
Fine-tuning
At its core, this debate isn’t only about SEO or artificial intelligence.
It’s about something broader:
learning how to evaluate impact in complex systems.
When results aren’t immediate.
When influence isn’t always visible.
When value doesn’t fit into a single metric.
For years we fine-tuned models built for a world of clicks.
Now we need to fine-tune how we listen.
Look beyond traffic.
Treat brand as a cumulative asset.
Accept that not everything can be attributed — but almost everything can be interpreted.
Measuring what doesn’t click isn’t about lowering standards.
It’s about raising the level of judgement.
And in 2026, that starts to make a real difference.