“What gets measured gets managed,” said Peter Drucker, the father of modern management.

It’s an appealingly concise piece of wisdom: you will effect change on those things you pay attention to. But as unemployed phrenologists will attest, something that is measurable is not necessarily meaningful.

Customer service is a highly measurable activity. Call volume, chat times, resolution rates, interaction counts, and myriad other numbers are more easily recorded and measured today than ever before. We can produce endless pages of reporting, slicing and sorting data like a Fruit Ninja world champion.

Help Scout automatically generates some very useful reports in real time:

Help Scout Reports

But what your help desk can’t do is tell you what matters most to your company. So of all those options, what should you measure and how should you report on it? I spoke to a number of SaaS customer service managers and put that question to them.

  • Bill Bounds, formerly of MailChimp, where he built their customer service team from scratch
  • Taylor Morgan, head of customer service at SurveyGizmo
  • Kristin Aardsma, head of customer service for Basecamp
  • Justin Seymour, who leads the support team right here at Help Scout
  • Davida Fernandez, head of support at Campaign Monitor

How to choose the right customer service metrics

Measurements can easily be taken at the level of the individual ticket and then aggregated to report on overall team performance and individual customer service agents.


As a customer service leader, you have access to most of these numbers and probably a ton more. The challenge is deciding which to report on, who to report it to, and how it should be presented.

To figure out the most important metrics for your team, consider these three questions:

1. Why are you reporting?

The point of your customer service team is (I hope!) not to generate nice-looking graphs and reports.

It’s to provide great service to your customers.

Metrics are just a more measurable proxy for the real outcome.

Kristin Aardsma is head of support for Basecamp, a company that considers their great service and fast response times to be product features. For Aardsma’s team, the combination of first-response time and customer satisfaction is a meaningful way to tell if they are staying on track.

During the super-high growth days of MailChimp, Bill Bounds’ single most important job was hiring enough new support staff to maintain support quality. In his words, “We were so focused on growth and getting enough people in that my primary concern was really on, ‘Hey, we’re not done hiring yet.’” So Bounds’ primary metrics were trends of volume per agent and customer satisfaction level.

When you are clear about why you are reporting, you can decide more easily what you should report on and, equally important, what not to report on.

2. Who are you reporting to?

Understanding your audience is critical to communication in all forms. What matters most to your frontline support team might not make any sense to your CEO, who doesn’t have that ground-level perspective.

What you show and how you explain it might differ considerably depending on whom you are reporting to. At Campaign Monitor, customer service reporting is done at three levels, and the contents of those reports are slightly different each time.

  • Individual agents are emailed daily reports on their personal activity and their team’s activity.

  • A monthly report is shared on the company’s internal wiki with the whole team. These reports remove some of the individual agent detail but add some long-term perspective.

  • The highest level of reporting is presented on a couple of slides to the senior management team, with some written comments to explain the trends on display.

For a global, distributed company, that’s a great way to make sure everyone is up to date.

Alternatively, in beautiful Boulder, Colo., SurveyGizmo’s team is all in one building. Taylor Morgan presents the weekly reports in person to the support team, and there is an open discussion that senior managers are invited to attend. Physical proximity means that their whole team gets the full context and can ask for clarity easily.

Make sure to determine whom you are reporting to and what they care most about. That will help direct you to the right measures.

3. What outcome do you want to see?

Think back to the quote we began with: what gets measured gets managed (even if that’s to the detriment of the company).

“There can be too much emphasis on fluff numbers in support,” Help Scout’s Justin Seymour told me. “The team likes to know what our goals are, what types of conversations we’re having, and how we’re moving the needle month to month.”

The customer service leader is in the best position to understand where the biggest opportunities are for the company. At MailChimp, Bounds needed to quantify the case for more support staff, so he focused his reports on telling that story clearly and accurately.

Campaign Monitor, meanwhile, is a product company at its core, and identifying ways to improve the customer experience through a better product is a big focus of customer service reporting.

Your management team doesn’t have the perspective you do as the customer service lead, so your job is to lead them, through consistent and clear reporting, to an honest understanding of what action needs to be taken.

What makes a good metric?

The metrics you choose to report should be the following:

  • Meaningful. They should tie back to something your company wants to achieve. For example, when your goal is highly responsive support, time to first response is an ideal metric. Resolution time may not matter.
  • Moveable. You should measure things on which your team can have impact. In Taylor Morgan’s words, “If there are metrics that aren’t moving, or we feel like they aren’t important, we just drop them.”
  • Authentic. Your reports must tell a true story. It’s possible to use real numbers to send a misleading message. Be honest even when it hurts.
  • Contextualized. Numbers in isolation can be stripped of meaning, so provide them in context.
  • Consistent. The trends over time are usually more important than specific data, and looking back over a quarter or a year can give you some fantastic insights and encouragement.

Presenting useful reports

Focus on trends, because the direction of change usually matters most. Eighty percent customer satisfaction may not sound great, but a month-on-month increase from 70% to 80% is excellent news.

Direct limited attention to anomalies and changes. Your managers are busy people, and they have a limited amount of attention to give you. Make sure it’s easy for them to know what to spend their time on.

“Here are our monthly reports—we received 20% fewer questions about exporting this month, so the reworking we did in the app saved us 12 hours of support time already!”

Look for correlations that tell a bigger story. Looking at individual metrics is useful, but understanding the connections between them is where the real insight can come.

Combining metrics can help you identify deeper issues. For example:

“When our email time to first response goes above four hours, we see consistent dips in customer satisfaction.”

“Answering billing questions takes us three times the average ticket length.”

Below is an example from my experience at Campaign Monitor. Our reporting tool could tell us when tickets arrived and how long customers were waiting for a first reply, but it couldn’t show us how many tickets were waiting for us to respond at any given time.

By exporting data from our help desk and combining it with a week’s worth of manual measurements, we could produce a single chart that showed the correlation between larger queues and higher waiting times.
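To make the idea concrete, here is a minimal sketch of that kind of analysis. It assumes a hypothetical export where each ticket has an arrival time and a first-response time (in hours); the sample data and helper names are invented for illustration, not taken from any real help desk export.

```python
# Hypothetical sketch: correlate queue depth with first-response wait times.
# Each ticket is (arrival_hour, first_response_hour); the data below is invented.

def queue_depth_at(moment, tickets):
    """Count tickets that had arrived but not yet received a reply at `moment`."""
    return sum(1 for arrival, response in tickets if arrival <= moment < response)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed with the standard library only."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Invented sample data: (arrival_hour, first_response_hour)
tickets = [(0.0, 1.0), (0.5, 2.0), (1.0, 3.5), (1.2, 4.0), (3.0, 3.5), (5.0, 5.5)]

depths = [queue_depth_at(arrival, tickets) for arrival, _ in tickets]
waits = [response - arrival for arrival, response in tickets]
r = pearson(depths, waits)
print(f"correlation between queue depth and wait time: {r:.2f}")
```

With real export data, a positive coefficient like this is what would show up visually as the correlation between larger queues and longer waits; plotting the two series together usually tells the story more persuasively than the number alone.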

chart showing correlation between larger queues and higher waiting times

Our support team reviewed this chart, which sparked a discussion about the stress and impact of a large queue of waiting tickets. Davida, our head of support, worked with her team to split the main queue into smaller, more manageable chunks. That change produced a significant decrease in response times without adding any new staff or changing the volume of tickets.

What will you report on next month?

Customer service metrics matter. What you choose to report and how you report it can make a real difference in the level of service you provide.

Don’t waste your valuable time compiling reports that provoke no questions and generate no action.

Bill Bounds said it beautifully: “Metrics only tell you where to look for the story, they don’t tell you the story itself.”

Pick the right metrics and use them to tell a compelling story about how your customer service team is contributing to your company’s goals.

Mathew Patterson

About the author: After running a support team for years, Mat Patterson joined the marketing team at Help Scout, the invisible help desk software. Learn how Help Scout takes the headache out of email support.