Webinar

Customer Service Reporting with Real Impact

 

Your help desk tool gives you a heap of predefined reports to work with, but they can’t tell you which ones matter to your company. How can you turn your customer service reporting from a document that goes unread to information that makes a real difference to your business?

Join Chase Clemons, Support Pro at Basecamp and founder of SupportOps, and Nick Francis, CEO of Help Scout, for a live chat on how support teams can be more effective in their reporting.

What you’ll learn:

  • Why measuring the right things matters
  • How to find out what your CEO wants to know about your support team
  • How not to waste your boss’s time with your reporting
  • When to review your reporting method and content

Full Transcript

Nick Francis: My name is Nick Francis. I am the co-founder and CEO of a company called Help Scout. We run a community site called HelpU, which provides all sorts of resources and content for customer support professionals, entrepreneurs, everyone in between. No matter which tools you use, if you’re interested in building a customer-centric business, please do check out HelpU. It’s https://www.helpscout.net/helpu. Subscribe to the newsletter. There’s a guy named Mat Patterson on our team who curates that newsletter; it comes out with some really great content once a week. We’d love for you to be a part of that community.

Nick Francis: So let’s get talking with a good friend of mine, Chase Clemons. How you doing today, Chase?

Chase Clemons: Fantastic. How are you, man?

Nick Francis: Awesome, awesome. Chase is a longtime support pro on the Basecamp support team, and for those of you that don’t know Basecamp, it’s an incredible company. I think more than a million users. Absolutely inspirational. And of course, for Help Scout, I’m not sure where we’d be without companies like Basecamp really laying the foundation. So, a company that we’ve always admired a great deal, and of course a support team that’s outstanding beyond all measure. Chase is a big part of that, really helping their customers make the most of the product.

Nick Francis: He also founded a website called SupportOps. It’s a podcast that’s actually coming to an end at the end of the year, but a really incredible podcast that he’s produced over the last few years, helping people deliver better support experiences to their customers, and he really laid a foundation for a lot of the things we’re now doing with HelpU for the support-driven community. He’s really advocated for support professionals in a truly admirable way over the last few years, so, Chase, it’s a real honor to be with you.

Nick Francis: Today we’re gonna be talking about customer service reporting with real impact. So let’s dive right in. Chase, can you just start off by telling me what Basecamp’s general philosophy is on support reporting, and has that changed over time as you’ve been part of the team there?

Chase Clemons: Yeah, it’s … I mean, the best way to think about it is that we tag with intent, and what I mean by that is whenever we do any kind of reporting, any kind of tagging, labeling, any kind of gathering up all the metrics that we can and looking at them, we’re looking to answer specific questions. That question might be something like, “What’s the reaction from customers to the latest iOS release?” It might be something like, “Hey, we’re seeing this specific bug. How many people are running into it? Is it fixed? Can we let them know about that?” But with everything that we do, when we approach that reporting it’s always with that intent, that intent to answer a specific question.

Chase Clemons: Now, it hasn’t always been that way. Back when I first started with Basecamp in 2011, the kind of prevailing thing that was … I don’t know, the cool thing to do, for lack of a better word, was to basically gather up as much data as you possibly can, look at all of it, and then try to make sense out of all of that. And for us, we tried it, and it never really worked. It kind of … You end up walking in the wilderness of this data not knowing what to do with it, not knowing if the reports that you’re pulling from it are actually useful or if you should be doing other stuff, and then you just end up wasting time. We would find ourselves spending three, four hours basically putting together reports and then realizing they’re not really useful.

Chase Clemons: So we’ve definitely moved away from that more into a very specific … When we go in, when we look at specific data, when we look at specific reports, it’s all about answering a specific question or group of questions that we have.
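To make “tagging with intent” concrete, here’s a minimal sketch of the kind of question-driven query Chase is describing. The records and tag names are hypothetical, not Basecamp’s actual setup; the point is that the report starts from one question rather than from every tag you have:

    from datetime import datetime, timezone

    # Hypothetical conversation records; in practice these would come from
    # your help desk's API or an export.
    conversations = [
        {"id": 101, "tags": ["ios-release", "bug:attachment-upload"],
         "created_at": datetime(2018, 12, 10, tzinfo=timezone.utc)},
        {"id": 102, "tags": ["billing"],
         "created_at": datetime(2018, 12, 11, tzinfo=timezone.utc)},
        {"id": 103, "tags": ["bug:attachment-upload"],
         "created_at": datetime(2018, 12, 12, tzinfo=timezone.utc)},
    ]

    # The question we went in with: how many customers reported this bug
    # after the fix shipped, so we can follow up and let them know?
    fix_shipped = datetime(2018, 12, 11, tzinfo=timezone.utc)
    still_affected = [
        c for c in conversations
        if "bug:attachment-upload" in c["tags"] and c["created_at"] >= fix_shipped
    ]

    print(f"{len(still_affected)} conversations mention the bug since the fix shipped")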

Nick Francis: I absolutely love that. As soon as I saw the phrase “tagging with intent” … ‘Cause you recently wrote about this on HelpU, but for those of you who want to learn a little bit more about the tactics the team uses, please visit HelpU and read Chase’s article. But what I love about that is how helpful that is to a product organization, right? If support really wants to have an outstanding impact on the product team and within the organization, tagging with intent and presenting really clear data to answer that question that a product team may be asking is so, so valuable.

Chase Clemons: Mm-hmm (affirmative). Yeah, it’s one of those where … I think back to … So I was at a support conference in Atlanta a little while ago, and one of the things that we talked about was this just prevailing feeling that support teams have to … When you’re in the inbox, you have to go through and just tag everything with every possible tag, because you never know when that might come in handy down the road, and that’s just such a waste of time because 90% of the time it’s not gonna come in handy down the road, and the other 10% of the time you get down the road and you’re going, “All right, well, I saw a 5% increase in sign-up-related emails in the last two months.” All right, well, what’s that got to do with anything? What good does a number like a 5% increase tell you? And the answer is it doesn’t. You’ve gotta go in with that intent that when you choose to look at data points you’re doing it with that question in mind, because otherwise you’re gonna get to the end and go, “Well, I’ve got what could be answers to something, but I don’t know what.”

Nick Francis: Right. And when it comes to discovering a trend or having to answer a question you weren’t really planning on answering, the team is going to see that trend. It doesn’t necessarily have to be based on tags, but generally … And you’re speaking to a particular way of building products as well, which is, I know at Basecamp you all work on six-, eight-week cycles, right?

Chase Clemons: Right.

Nick Francis: Where you don’t even think about the roadmap beyond that period, and that allows you to say, “Hey, this is what we’re focused on. These are the questions we really want to have answered in the next six to eight weeks,” and you really don’t have to worry about the long term, whether you’re gonna come back to that a year later. ‘Cause again, like you said, with any kind of growing product you never get far enough down the line to answer those questions.

Chase Clemons: Yeah. You know, those six-week cycles are so great because you know what’s gonna happen that cycle. You don’t have any idea what’s gonna happen the next cycle, much less a year down the road, so it forces you to be really, really honest with your customers. There’s no wishy-washy “Eh, maybe” that you give customers. It’s either a, “Yeah, that’s something we’re looking at right now. Can I get some more information from you about that, like how would you use it?” or it’s a no. It’s just clear, transparent, and makes for such a better relationship with your customers.

Nick Francis: I love that. Yeah, and I know your customers sure do appreciate that. Even when it’s an answer they don’t love, I know they appreciate the honesty.

Chase Clemons: Yeah, even with a no it gives them information to make their own choice. Do we stick with Basecamp? Do we move to something else? Do we find a workaround? Do we tweak how we’re doing something? I’ve seen where customers will actually … We say no, and they say, “Well, how does Basecamp approach this?” and we say, “Well, like this,” and then they take that and implement that in their own company.

Nick Francis: That’s awesome. How do you see the reporting that you do as a company being used outside of support? Have you or the team ever spent time on reporting that you really didn’t end up using?

Chase Clemons: Yeah. You know, I think … We’ll tackle that second question first. For the ones that we didn’t end up using, we created a tool called Smiley when I first started at Basecamp, so like 2010, 2011. And I think it was the first kind of thing that was out there. Don’t quote me on that, but I think it was one of the first ones. Basically, at the end of every email that we sent out, there was a rating where you could pick how well that interaction went with us, so it was like a green smiley face, a yellow “meh” face, and then a sad frowny face for when it didn’t go well.

Chase Clemons: That, in the beginning I was obsessed, obsessed, because it’s like, “I wanna see full green smiles across the board,” and we were very, very vocal and like, “Oh, look, Marissa has 100 perfect green smiley interactions across the board. Chase has 100 just perfect across the board.” And it was all automated, so it wasn’t like we were pulling reports necessarily, but it was a report that we looked at that gave you almost a dopamine high, basically. Now I look back on that and go, “Wow, that’s great,” but then you get addicted. You get addicted to seeing those green smilies, and when you don’t get those … I think, for instance, when you roll out a feature that people aren’t happy with and all of a sudden it’s just line after line of people being upset, those red frowns looking at you, that can be really demoralizing when you put so much stock into a report like that.

Chase Clemons: I can’t tell you the last time I looked at that report. It’s been at least a year, probably more like two years. We use those very sparingly for new hires. That’s where it does come in really, really handy, for when you’ve got a new hire that is learning how to interact with customers, learning what phrases work, what explanations don’t work, when they’re finding their voice, basically, on the team. That’s when it comes in really, really handy, because that immediate feedback is really beneficial for them.

Chase Clemons: So I think now we’re at this happy place where, for somebody like me who’s been doing it for a couple years, I don’t really pay attention to it. For somebody that is brand new to our team, yeah, we’re kind of looking at that for the first couple of months, and then after that we just drop it.

Nick Francis: I think you’re touching on some really important things about customer support reporting, and from a high level this is what I talk with a lot of people about, which is there is no single support metric. No matter what organization you’re in, there’s no one metric you can look at to quantify the performance of your team. Over the last couple of years, as we try to evaluate the success of any initiative, we always look for a quantity metric and a quality metric. So a quantity metric could be, let’s say, response time, how quickly are we able to get back to customers, but happiness ratings — which are very much based on the smilies that you all invented several years ago — any sort of customer satisfaction rating would be the quality metric. So you’re not just answering quickly, but you’re also balancing that with a quality metric to make sure customers are happy.
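As a rough illustration of pairing a quantity metric with a quality metric the way Nick describes, here’s a minimal sketch using hypothetical conversation data with a first-response time and a happiness rating; it isn’t Help Scout’s actual reporting code:

    from statistics import median

    # Hypothetical data: first response time in minutes and a happiness
    # rating ("great", "okay", "not good") per resolved conversation.
    conversations = [
        {"first_response_minutes": 6, "rating": "great"},
        {"first_response_minutes": 14, "rating": "great"},
        {"first_response_minutes": 45, "rating": "okay"},
        {"first_response_minutes": 9, "rating": "not good"},
    ]

    # Quantity metric: how quickly are we getting back to customers?
    median_response = median(c["first_response_minutes"] for c in conversations)

    # Quality metric: of the customers who rated us, what share were happy?
    rated = [c for c in conversations if c["rating"] is not None]
    happiness = sum(1 for c in rated if c["rating"] == "great") / len(rated)

    print(f"Median first response: {median_response} minutes")
    print(f"Happiness score: {happiness:.0%}")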

Nick Francis: And that’s really the story with support. There’s no one metric, and even at Help Scout with our reports we’ve tried really hard to not overemphasize any one metric over another or to gamify the whole process. You talked a little bit about how the team just gets this dopamine high. You start competing on happiness ratings and so on and so forth. It’s very easy to fall into that trap, and we’ve tried to take a very agnostic view, present the data in a very kind of flat way so that it’s available to you, but we try to express no opinion on how that data should be interpreted, because there’s so … Whenever you’re trying to gauge the success of something, there’s a variety of metrics you can leverage to try and inform that opinion. It’s never one metric, and there’s always some qualitative stuff that we can’t report on that’s always a factor there. Is that something that you all have struggled with as well?

Chase Clemons: Yeah, it definitely is, and that extends even out from the customer support team, like when we’re interacting with the product team. So a good example, you were talking about our six-week cycles earlier. With those, the way that we decide what we’re gonna work on as a company is that we do what’s called pitches. So a pitch is, “Here’s the specific, here’s the specific question that the idea is solving.” It might not be a full-blown beta working kind of thing, but the pitch at least has the bare-bone things going into it.

Chase Clemons: And with those, it can be really tempting to just slather on as much data as you can into that. It’s like, “Oh, well, we see 10% of customers requesting this a day. This is going to affect 5% of the customer base. It’s gonna bring in an extra 3% of profit every year,” yada, yada, pick your number, insert it into that kind of thing. It’s really tempting to go that route, but what we found is that with these pitches, especially from the customer support angle of it, you’re building a story using what you know, both quantitative and qualitative data there. So it’s, yeah, you might mention that a couple of people a day write in about this, but you’re really looking more to tell the story about those couple of people. Tell the exact situation that they’re in and how this new idea that you’re proposing is gonna fix that for them.

Chase Clemons: Ryan, one of our product strategy guys here, likes to talk about how something could be really, really important and not ever really requested from customers. So I think about one of the last features that we implemented at Basecamp was the ability to go in and delete a chat line. That’s something that is very, very small, has a very small footprint. Not something that a lot of customers were asking us for, so when you look at just pure data on that it’s iffy if we should even do something like that.

Chase Clemons: But it turns out when you dig into the stories about people that need to delete a chat line, it’s usually because there is a big mistake that’s been made, it’s embarrassing, it’s a source of anxiety, it has to be gone right then and right there, and that story outweighs all of that quantitative data there that is like, “Eh, maybe you should, maybe you shouldn’t. It’s whatever.” When you look at the qualitative emotion wrapped into that, well of course Basecamp should tackle that, and we did, and we pulled it off in six weeks, and that was … I don’t even think it was six weeks. It was like a week’s worth of work or something like that. It wasn’t much. But if we’d only looked at the quantitative data there and only presented that, it would’ve been very, very misleading.

Nick Francis: That’s a great illustration of how anyone should go about looking at customer support reporting. There’s always a mix of the quantitative and qualitative story, and that’s why, yeah, with our reporting we’ve always had a really hard time, ‘cause we can’t tell the full story. You’d love to have reporting that’s just like, “Here you go, silver platter, this is what you need to know,” but there’s always another side to that story and you just have to do your best to kind of talk with people about that, make sure they’re aware of it.

Nick Francis: Now, there is one metric that I know you guys at least for a while were pretty vocal about, and that was response time. I mean, you have been legendary when it comes to response time. I mean, back in the day it was like two, three, four, five minutes, where a customer would email Basecamp support and you’d be able to get right back with them. And that was something that seems, from the outside, it seems to have generated a lot of really positive sentiment from your customers and about your brand in general, just in terms of what your values are. Talk to me a little bit about how you used that in the past and whether that’s evolved or not based on what you learned.

Chase Clemons: Yeah. So it’s still something we keep an eye on. It’s on the Basecamp support site, so if you go over there right now, it’s 12:15 Central Time on Wednesday the 13th. Our response time is seven minutes. That’s what it’s showing on the site right now. That’s an average, so it’s not like every single one is gonna be like that. It’s on average seven minutes. And, yeah, so I think back to 2015, 2016, we made a heavy, heavy push to get that down to one and two minutes. We hired on more people, we expanded to 24/7 support. We went all in on that, thinking that it’s kind of like … You know, Jeff Bezos has this great quote about how people are never gonna say no to faster shipping times from Amazon. People aren’t gonna say no to getting a reply faster from a support team. That’s just not something that people do.

Chase Clemons: So we did that, and we focused on that and focused on it heavily, and it’s kind of like the smiley data point we were talking about earlier: it can really, really be demoralizing when that’s all your team is paying attention to, that everything that the team does and thinks about is oriented to how fast can we get replies out to customers. It was really, really easy at that point to get into this just almost like queue monkey mentality of, “I’m just gonna go in. I’ve gotta answer all these emails. I don’t have time for anything else because we’re only worried about how fast we can get replies out.”

Nick Francis: Right. That can be stressful.

Chase Clemons: It’s stressful, it’s frustrating, it’s demoralizing, it’s all of the above. And so we looked at that and said, “This is just not healthy as a team. This is not something we want, especially at Basecamp, where we want this to be a calm company. Staring at how quick a reply time is, is just not calm.” I even think back to how they used to have them in fast food restaurants. Our local McDonald’s would have the timer right above the drive-thru window, like from the moment you order to the moment it goes out, that time had to be a certain thing. It’s a countdown clock. It’s just stressful.

Chase Clemons: So we looked and said, “This is not healthy. This is not something we want to be focused on. Let’s back off and just give ourselves room to reply as we would normally, not making a special effort to get certain times, not looking at that number anymore but going, all right, well, just naturally what would our reply times look like.” And they’re fine now. They’re still sub-10 minutes. It’s still really, really fast replies, all things considered, and customers haven’t even mentioned it. It’s one of those where does it really matter if it’s a 2-minute reply versus a 20-minute reply? Probably not. 1% of the time maybe, when you’re locked out of your account or something like that, yeah, but you don’t optimize your company for 1% of the time.

Nick Francis: Right. Well, that’s outstanding, still super commendable that you all are able to keep up with that. Very cool.

Nick Francis: Basecamp is led by two pretty famous guys, Jason Fried and David Heinemeier Hansson, better known as DHH, creator of Ruby on Rails and some other things. How do they let the team know what’s important for them to understand about your customers and support as a whole? What do they really pay attention to as founders of the business?

Chase Clemons: Yeah, so the cool thing with Jason and David is that they don’t pay a lot of attention to it. It’s not something that we have to regularly update them on because when they hired us it was basically, “We trust you to take care of our customers.” It’s very much a hands-off approach, so really the only times that David and Jason kind of come into the equation is when we do kind of reviews, like it’s the end of the year, we’re looking back at how did things go in January, how did things go in February, all the way up through December? What’s that compared to last year? We’re looking at have we seen an increase in tickets compared … increase in emails compared to last year. Has our reply time gone crazy out of whack recently, anything like that? And those are really only looked at for hiring purposes, really. We want to make sure we stay ahead of the hiring game.

Nick Francis: Right.

Chase Clemons: They look at some of the reports that are part of those pitches that I mentioned earlier, so like when somebody … When somebody does a pitch, they come to the support team and say, “Hey, can you help us out with kind of building this out,” and we’ll give them data and things like that for it. But for the most part they trust us to take care of the customers, and that’s what we do.

Nick Francis: I was trying to think about this question just in terms of how we do it at Help Scout as well, and it’s a very similar approach. I mean, you get to a point with a support team where you know their values align, you know that culturally we’ve got the right people in the right places and they’re gonna do a great job for us, so there may be once every quarter that I look at some high-level numbers just to understand some of the staffing, but the real value of support within our company is informing product decisions like you’re talking about. So informing those pitches, understanding what the trends are, what the questions are that we’re trying to answer as a team, that’s what I’m super into as a founder and trying to just build a better product. It’s all about how support is contributing to the larger product development discussion, which is a super exciting thing, yet it involves a lot of kind of nuanced, qualitative data in addition to some of the tagging that you do. So it really does … I’d say, in our world it’s probably 80%, 90% more about product and informing that roadmap than it is kind of monitoring certain metrics.

Chase Clemons: Yeah, I think … So, we released kind of a slight tweak to a feature in Basecamp yesterday, so Jason popped into the team chatroom that we have. He was just kind of keeping an eye on things. He was looking at some of the feedback we were collecting and what customers were thinking about it, but again, that’s all just kind of helping him build a better picture of what customers are saying about that specific feature release. It’s not him looking for, “Hey, tell me what percentage of customers complained today about this specific thing.” The reporting there is very much more that qualitative informing of the product rather than an “I need some TPS report to look at.”

Nick Francis: Yeah, absolutely. Okay, so we’ve talked a little bit about how we do support within our companies, but how should support teams generally decide what to report on? What makes a good reporting cadence versus a waste of time and energy? We may have kind of answered this to some extent so far, but talk to me a little bit about what your general guidance would be. I’m sure you get these questions all the time.

Chase Clemons: Yeah, it’s always like I come back to that intent, right? So, what’s the intent of your reports? Obviously your CEO or your leadership, they’re gonna have questions that they think about, so any questions that they have would be something you want to consider. If it’s something like, “Hey, what are …” Like with Jason yesterday, what are customers saying about the feature we just released? That’s a question coming down from leadership that you need to have an answer for, and that’s where looking at some type of reporting is gonna help out.

Chase Clemons: It’s also one where I tell folks that it’s okay not to track every single thing. That’s the one that everyone is always like, “Wait a second, we can track everything. Why don’t we track everything?” And I like to tell folks, “Yeah, you can, but then you get lost in the data wilderness again.” You miss the forest for the trees or trees for the forest, whatever that phrase is. And you kind of get overwhelmed with all of these data points, where, on the flip side, go into it going, “All right, I’ve got one question I’m gonna answer. I need these data points to do that. I’m gonna track those. I’m gonna look at those. Question Two: all right, here’s the second question that I’ve got. I need these data points. I’m gonna start looking at those.” And build up and scale up from there rather than starting wide and trying to focus down.

Chase Clemons: Worst case scenario? Maybe you miss a month or two of data that might’ve been helpful, but I think that’s a decent trade-off to make for having very clear, very specific reports that you’re building versus, again, just wandering around in the wilderness.

Nick Francis: Yeah, and, I mean, at some level of scale, if your support is kind of there, you don’t really need more than a month of data to really understand what your customers are asking about, what your customers need. I mean, looking back much further is really just gonna be a lot of the same story, so that’s another good reason why really you don’t have to worry about tagging every single little thing along the way. As long as you’re answering a question over maybe a course of a month or so, you’ve probably got enough data. Would you say that’s about right?

Chase Clemons: Yeah, absolutely. I mean, that’s one of those things where I look back and … So Basecamp uses what’s called the Jobs to be Done approach to do interviews about how customers are using Basecamp and what brought them to it, just trying to figure out some marketing things and all that. We’ve taken a similar approach with feature requests, where we go out and do those similar types of customer interviews to really get at what customers want. Because customers, when they email, will tell you one thing, and then when you actually get them on a phone call they’ll be like, “Oh, well, it’s actually this other thing that was going on,” and then that’s the real root of their need, their feature request. Those are things that you can’t get from tags and just from emails alone.

Chase Clemons: So when I think about … My proverbial example, like 5% of customers need this. They’ve emailed about it, we’ve tagged it for the past month, we’ve got 5% increase from last month. So what? I could have gone out and talked to just a handful, just maybe six or 10 of those customers that were in that group and found out exactly what they need and exactly what’s going on. I’ve gotten better overall data from that than from however many hours you spend putting together a report looking at metrics like that.

Chase Clemons: So, yeah, you’re not gonna have to reach back over a year and look at stuff. I mean, maybe on occasion you’re looking at that high-level stuff like emails this month compared to a year ago, are they going up or down? Is that trending? But even that might be questionable. I mean, you’re always gonna have more emails this year than last year overall. So, yeah, I would say go back a month. That should be more than enough, and you know what, Nick? If it’s not, then next time you go out two months.

Nick Francis: Right. I mean, can I just say, by the way, that I love how this discussion has really turned into how support can inform the product roadmap? ‘Cause that really is what it’s about. It’s about making great products. It’s not necessarily about increasing or decreasing response time by this much or handle time or resolutions or any of that junk. It’s really about making better products.

Chase Clemons: Yeah, absolutely. That’s the … At the end of the day, the support team is there because they’re the ones that have a direct connection with the customers. They’re talking to them on a daily basis. They are a fantastic resource for the product team to get information from, and that should be our goal. Our goal is not just bang out the next email and be queue monkeys, it’s to go out and do the investigation work, be basically support detectives, and bring all of that information back into the company so that they can use it.

Nick Francis: Awesome. So I’m gonna move over to the other side of the coin and talk a little bit about metrics, even though we know that the most important reason for this sort of reporting is to inform product development. I’m seeing a couple questions from viewers here, so I just want to talk through a couple of those. So, first question: “We can relate to wanting to stay diversified with metrics, not just focusing on one too much, maybe a quality and quantity metric, maybe two or three, since it can get a little bit too stressful or maybe unintentionally gamify support. But how, though, do you take actionable meaning out of the metrics that you do pull?” So maybe there are a few high-level metrics that you do pay attention to, maybe on a monthly or even quarterly cadence. It doesn’t have to be every day. What are those metrics for you currently, and what sort of insights might you pull from those?

Chase Clemons: Yeah. I hate to sound like I’m beating a dead horse, but the actionable part of that is all about, well, what question are you answering? Because there is gonna be the action that you take going forward. So you think about, all right, one of the questions that we like to think about this time of year is do we need to do any hiring for next year. Does our support team need to grow? One of the metrics that we look at for that is response times. Are response times at a half hour, an hour, two hours, whatever? If so, that is a good indication that we don’t have enough people in the inbox at any specific moment. Handle times. Are handle times stretching out for a day or two or something like that? Another good indication that we don’t have enough people helping out with answering emails and things.

Chase Clemons: So the question we go into looking at those high-level metrics with is do we need to hire more people for next year, because if we do, we need to hire now so we can train them and have them ready to go. And the action comes out of that answer. It’s the, “Oh, yes, we do need to hire because response times are high, handle times are high, whatever metric is out of whack,” and so we hire. Or we look and go, “You know what? We’re still … We’re only up maybe 5% on incoming emails from last year. We’re up 2% on handle time from last year. That’s fine. We’re good with those, so no, we don’t need to hire.”

Chase Clemons: So it’s all about that: the action is gonna come out of the answer, and if you are looking at those high-level numbers, if you’re just looking at a dashboard that Help Scout gives you or Zendesk gives you or whatever other app gives you and you’re just looking at numbers, it’s easy to find what you think is an answer and then start working from there. It’s a much better approach to go in, look at the dashboard, and go, “All right, I have this question. I need to find the answer from this information.”
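Here’s a minimal sketch of the hiring question Chase describes. The thresholds and year-over-year numbers are made up for illustration, not Basecamp’s actual figures; the shape to notice is that the report exists to answer one question, and the action falls out of the answer:

    # Hypothetical current-quarter metrics versus the same period last year.
    current = {"median_response_minutes": 38, "median_handle_hours": 30, "emails_per_week": 2300}
    last_year = {"emails_per_week": 2000}

    # Thresholds are illustrative only.
    RESPONSE_LIMIT_MINUTES = 30   # replies slower than this suggest the inbox is understaffed
    HANDLE_LIMIT_HOURS = 24       # resolutions dragging past a day suggest the same
    GROWTH_LIMIT = 0.15           # volume growth the current team likely can't absorb

    volume_growth = current["emails_per_week"] / last_year["emails_per_week"] - 1

    should_hire = (
        current["median_response_minutes"] > RESPONSE_LIMIT_MINUTES
        or current["median_handle_hours"] > HANDLE_LIMIT_HOURS
        or volume_growth > GROWTH_LIMIT
    )

    print(f"Email volume growth year over year: {volume_growth:.0%}")
    if should_hire:
        print("Start hiring now so new folks are trained and ready for next year.")
    else:
        print("The current team is keeping up; no hiring needed yet.")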

Nick Francis: I love that. And just to speak from kind of the Help Scout side of things, the most important thing we’re looking at as well is when do we need to hire. How do we staff up? And so I’d say on a quarterly basis we really look at how volume has changed just overall, how that has impacted our response time, and then there’s one other metric that we calculate. I really wish we could calculate this in Help Scout reports. We just don’t have enough data yet. But it’s what percentage of our customers are we hearing from? So what we don’t have is the customer count, right? So we don’t have Basecamp’s customer count, but essentially we just do some really quick math. So it’s like, okay, we heard from 2,500 customers over the last quarter. What percentage of our entire customer base is that?

Nick Francis: And that’s a really great reflection, at least to me, of, okay, how good a product are we building? How many customers really do have to reach out with questions, feedback, dissatisfaction or maybe satisfaction about the product? And we know that it’s probably … People don’t wake up every day and say, “Hey, I hope I get to reach out to support today.” It’s usually an inconvenience, and so we try to keep that percentage … At least quarter over quarter we try to monitor that, and that’s somewhat of a reflection of the kind of quality product we’re building. Chances are if it’s got a lot of bugs or we’ve released something that made people upset, then we’re going to see that reflected in the percentage of customers that are reaching out to us.
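The “what percentage of customers are we hearing from” number is simple arithmetic once you have both counts. A minimal sketch with made-up figures:

    # Hypothetical quarterly figures: the help desk knows the first number,
    # your own customer records know the second.
    customers_who_contacted_support = 2500
    total_active_customers = 60000

    contact_rate = customers_who_contacted_support / total_active_customers
    print(f"Contact rate this quarter: {contact_rate:.1%}")

    # Tracked quarter over quarter, a rising contact rate can be an early
    # signal that a release introduced bugs or confusion.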

Nick Francis: So those are a few of the numbers that we look at at a really high level. And then, again, it’s like you’ve mentioned and we’ve talked about this entire time. It’s really about, past that, answering a question that you might have, and hopefully, with Help Scout, we’ve got a ton of different metrics that we report on. Hopefully we can help you find an answer, but it’s really not meant for you to have to obsess over those things. It’s really that we just want to be able to be there to provide that raw data but otherwise sort of get out of the way, not really express an opinion about what that data means. It may be a clue as to a question that you need to be asking or something that you need to be talking with your customers about, but generally it’s not more than that. It’s not the end-all be-all, whereas in other departments, potentially other areas of the business, it’s all about the numbers and you can know exactly what’s going on and what you need to change based on the quantifiable data. Support just isn’t one of those.

Chase Clemons: Yeah, definitely. And it also … Just one last thing on that. It’s gonna depend a lot on your leadership, the company leadership, too, because if your company leadership comes in and says, “Hey, we have to meet this target for revenue growth, which means that we need to bring in X,” and if they’re coming in and their goals for the year are very defined out as percentage increases and things like that, they might be expecting that from the support team, too. So you have to realize who your internal customer is and make sure that what you’re bringing back to them matches up with how they best receive that data.

Chase Clemons: Now, fortunately with David and Jason, the way that we build Basecamp, it’s not dictated by a bunch of outside influences or anything like that, so they just want to know what the best ideas for moving forward are. We don’t have to do those kinds of reports. But, again, if you’re working for a company that has a specific revenue deadline to meet or whatever and they’re like, “What’s the one feature we can get that’ll make X amount of people happy and give an X boost in whatever?” then you’re gonna be in a different situation from what you and I have been talking about.

Nick Francis: Absolutely. Very awesome. Well, dude, I think that’s an amazing place for us to end. We’ve been at it for about 33 minutes. I think there’s been a lot of value. I certainly learned a lot. So thank you so much, everybody, for watching. I really appreciate it. Chase, thank you so much for being a part of this. This has been super fun. And keep up the beard, man. You’re kicking my butt on the beard.

Chase Clemons: It’s looking … It’s winter, so it’s gotta keep my face warm, basically.

Nick Francis: I love it. It’s perfect [inaudible 00:33:37].

Nick Francis: Next month we’re gonna be talking with Jeff Toister, a well-known author. We’re gonna be talking about the role of self-service, which is super important as well. It can really help you kind of scale your support team over time, and there’s a lot of really great things we have to discuss there. So every month we’re doing really great free webinars just like this with industry experts. It doesn’t really matter what tools you use, we’re here to try and create value for you and your company and your support team, so we hope you’ll join us next month. Be sure to sign up for the HelpU newsletter to be notified of these sorts of things.

Nick Francis: Again, Chase, thanks so much. Have a wonderful holiday, man. We’ll talk soon.

Chase Clemons: My pleasure. You, too.

Nick Francis: All right. See you, everyone.
