CX-201a · Module 1
Leading vs. Lagging Indicators
4 min read
Here is the difference that separates health scores that predict from health scores that report. A lagging indicator tells you what has already happened. The client's NPS dropped from 8 to 5. That is useful information, but the drop happened weeks ago — you are reading the autopsy, not the vital signs.

A leading indicator tells you what is about to happen. The client's response time to your emails has increased from 4 hours to 36 hours over the past three weeks. That is a trajectory — and the trajectory predicts the NPS drop before it lands.
Do This
- Weight leading indicators more heavily than lagging indicators in your health score — you need prediction, not retrospection
- Track response time trends, meeting attendance patterns, and stakeholder breadth changes as leading engagement signals
- Monitor login frequency trends and feature utilization changes as leading adoption signals
- Use lagging indicators (NPS, CSAT, renewal decisions) to calibrate whether your leading indicators are actually predictive
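The weighting idea above can be sketched as a simple composite score. This is a minimal illustration, not a prescribed formula: the signal names, the 0–100 normalization, and the 0.75/0.25 split are all assumptions you would calibrate against your own renewal outcomes.

```python
from dataclasses import dataclass

@dataclass
class AccountSignals:
    """Hypothetical signals, each already normalized to a 0-100 scale."""
    # Leading, trend-based signals (engagement and adoption trajectories)
    response_time_trend: float       # improving response times -> higher
    meeting_attendance_trend: float
    login_frequency_trend: float
    # Lagging signals (used mainly to calibrate the leading ones)
    nps: float
    csat: float

def health_score(s: AccountSignals) -> float:
    """Weight leading indicators heavily; the 75/25 split is illustrative."""
    leading = (s.response_time_trend
               + s.meeting_attendance_trend
               + s.login_frequency_trend) / 3
    lagging = (s.nps + s.csat) / 2
    return 0.75 * leading + 0.25 * lagging

# An account with strong lagging scores but weakening leading trends
# still comes out unhealthy, because the trends dominate:
score = health_score(AccountSignals(40, 55, 50, 80, 85))
```

With these illustrative numbers the composite lands in the mid-50s despite an NPS of 80 — exactly the "stable survey scores masking decline" pattern the lesson warns about.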
Avoid This
- Build health scores primarily from survey data — surveys are lagging indicators that arrive after sentiment has already shifted
- Treat stable metrics as healthy metrics — stability can mask slow decline if you are not watching the trend
- Ignore the direction of change — a score of 70 that was 85 last quarter is more concerning than a score of 65 that was 60 last quarter
- Wait for the renewal conversation to learn the client's sentiment — the renewal decision was made months ago
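The direction-of-change point above (a 70 that was 85 is worse than a 65 that was 60) can be made mechanical by blending the delta into the score. The `trend_weight` knob is a hypothetical parameter, not a standard:

```python
def trend_adjusted(current: float, prior: float, trend_weight: float = 0.5) -> float:
    """Penalize decline and reward improvement by blending in the period delta.

    trend_weight is an illustrative assumption; tune it on your own data.
    """
    return current + trend_weight * (current - prior)

# The account at 70 (down from 85) now ranks below the one at 65 (up from 60):
falling = trend_adjusted(70, 85)  # 70 + 0.5 * (-15) = 62.5
rising = trend_adjusted(65, 60)   # 65 + 0.5 * (+5)  = 67.5
```

The raw scores say 70 beats 65; the trend-adjusted scores reverse the ranking, which is the behavior you want from a predictive score.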
I have tracked the leading indicators across every account I manage. The strongest single predictor of churn is not NPS, not CSAT, not even outcomes achievement. It is the ratio of client-initiated contact to you-initiated contact. When a client stops reaching out first — when every interaction is you pushing — the relationship is in decline. That ratio is a leading indicator that arrives four to six weeks before any survey score moves.
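The initiation ratio described above is easy to track from contact logs. A minimal sketch, assuming monthly counts of who started each interaction (the window size and the "declining across consecutive windows" rule are assumptions, not the author's stated method):

```python
def initiation_ratio(client_initiated: int, vendor_initiated: int) -> float:
    """Ratio of client-initiated to vendor-initiated contacts in a window."""
    if vendor_initiated == 0:
        return float("inf")  # client drives all contact: very healthy
    return client_initiated / vendor_initiated

# Hypothetical monthly windows of (client_initiated, vendor_initiated) counts.
history = [(6, 4), (4, 5), (2, 6), (0, 7)]
ratios = [initiation_ratio(c, v) for c, v in history]

# Flag the account when the ratio falls across every consecutive window --
# the "client stopped reaching out first" pattern, weeks before surveys move.
declining = all(later < earlier for earlier, later in zip(ratios, ratios[1:]))
```

Here the ratio slides from 1.5 to 0 over four months, so the flag trips long before any NPS or CSAT number would register the shift.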