CW-301a · Module 3

Productivity Metrics

3 min read

Not all metrics matter. The temptation with any new tool is to measure everything — prompts submitted, agents spawned, tokens consumed, sessions per day, average session length. These are activity metrics. They tell you the tool is being used. They do not tell you whether it is producing value. A team that spawns 50 agents per day but produces no additional deliverables is consuming tokens, not creating value.

The metrics that matter are output metrics. Deliverables per week: how many reports, analyses, proposals, and briefs did the team produce? Time-to-output: how long from request to finished deliverable? Revision cycles: how many rounds of feedback before stakeholder approval? First-draft acceptance rate: what percentage of deliverables pass review without revision? These four metrics tell you whether Co-Work is making the team more productive or just more busy.
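As a rough sketch, all four output metrics can be computed from a simple deliverable log — one record per deliverable with its request date, delivery date, and revision count. The dates, counts, and record shape below are hypothetical, not prescribed by this module:

```python
from datetime import date

# Hypothetical log: (requested, delivered, revision_rounds) per deliverable.
deliverables = [
    (date(2025, 3, 3),  date(2025, 3, 5),  0),
    (date(2025, 3, 4),  date(2025, 3, 10), 2),
    (date(2025, 3, 10), date(2025, 3, 12), 1),
    (date(2025, 3, 11), date(2025, 3, 13), 0),
]
weeks_observed = 2  # length of the measurement window

# Deliverables per week: raw output rate
deliverables_per_week = len(deliverables) / weeks_observed

# Time-to-output: mean days from request to finished deliverable
time_to_output = sum((done - req).days for req, done, _ in deliverables) / len(deliverables)

# Revision cycles: mean rounds of feedback before approval
revision_cycles = sum(rev for _, _, rev in deliverables) / len(deliverables)

# First-draft acceptance rate: share of deliverables approved with zero revisions
first_draft_rate = sum(1 for _, _, rev in deliverables if rev == 0) / len(deliverables)

print(deliverables_per_week)  # 2.0
print(time_to_output)         # 3.0
print(revision_cycles)        # 0.75
print(first_draft_rate)       # 0.5
```

The point of keeping the log this simple is that all four numbers fall out of the same source: if the team already records when work was requested, shipped, and revised, no extra instrumentation is needed.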

There is a fifth metric that most organizations miss: workflow coverage. What percentage of the team's recurring workflows have been formalized into Co-Work pipelines? A team that has formalized 3 of its 10 recurring workflows has captured 30% of its potential productivity gain. A team at 80% coverage is approaching the ceiling of what automation can deliver. Workflow coverage tells you how much room for improvement remains.
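Workflow coverage is simple division: formalized workflows over total recurring workflows. A sketch of the 3-of-10 example above, with entirely hypothetical workflow names:

```python
# Illustrative inventory matching the 3-of-10 example; names are hypothetical.
formalized = {"weekly status report", "meeting summary", "competitor brief"}
recurring = formalized | {
    "client proposal", "quarterly review", "onboarding packet",
    "budget forecast", "press release", "board deck", "sales follow-up",
}

# Coverage: share of recurring workflows formalized into pipelines
coverage = len(formalized) / len(recurring)
print(f"{coverage:.0%}")  # 30%
```

The prerequisite, of course, is maintaining the denominator: a written inventory of every recurring workflow, formalized or not.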

Ignore vanity metrics. Tokens consumed is not a productivity metric — it is a cost metric. Prompts per day is not a productivity metric — it is an activity metric. Sessions per week is not a productivity metric — it is an engagement metric. These metrics tell finance how much the tool costs and tell IT how much infrastructure to provision. They do not tell leadership whether the investment is working.

Do This

  • Track deliverables per week — the primary output metric
  • Measure time-to-output from request to stakeholder-ready deliverable
  • Monitor revision cycles — fewer revisions means higher first-draft quality
  • Track workflow coverage — what percentage of recurring tasks are formalized

Avoid This

  • Obsess over tokens consumed — that is a cost metric, not a value metric
  • Count prompts per day — activity is not productivity
  • Measure average session length — long sessions might mean struggling, not producing
  • Report vanity metrics to leadership — they need business impact, not usage stats