Customer Satisfaction Score (CSAT) is one of the most widely used customer service metrics. Companies send short CSAT surveys after a support interaction—but these are tied to your customers’ most recent interaction with your company and can be skewed if they had an issue or poor experience. And because CSAT surveys are kept short to make it easy for customers to respond, they don’t tell you why customers feel the way they do.

To get more detailed insights, you should send broader surveys to your entire customer base periodically throughout their lifecycle, separate from and in addition to single-question CSAT surveys. But be careful not to make your surveys too long: SurveyMonkey found that “abandon rates increase for surveys that took more than 7–8 minutes to complete; with completion rates dropping anywhere from 5% to 20%.” That’s roughly 15 multiple-choice survey questions, or more like 10 if you have a lot of free-text answers.

Broader customer satisfaction surveys are important for understanding how happy customers are in general, not just after individual support interactions. Send them periodically, separately from your existing CSAT, Customer Effort Score (CES), or Net Promoter Score (NPS) surveys, but be careful not to overload customers with surveys, or you risk negatively affecting customer satisfaction.

This article includes 18 questions to help you start building customer satisfaction surveys that look at the whole customer experience. Don’t try to use them all in the same survey: choose a mix of questions from each section, and ask the same questions each time so you can track trends more easily.

Customer service questions

Customer satisfaction is normally thought of as the customer service team’s remit. When sending customer satisfaction surveys, you’ll want to gather feedback on the quality of your support and service, and these questions are a great place to start.

These are designed to be asked at various times throughout the customer lifecycle rather than tied to a specific customer support interaction. But you could adapt them to ask about their “recent support experience” or “when you contacted us today” if you wanted to use them to follow up after a support interaction.

Many of the questions here involve different rating scales, such as a 1–10 scale or a Likert scale. The Likert scale is typically a 5- or 7-point scale ranging from “strongly disagree” to “strongly agree” or similar. This allows customers to express degrees of opinion, although customers may avoid choosing the most extreme options on the scale. A 1–10 scale offers more options than the Likert scale, but it can be difficult to quantify the difference between a 3 and a 4. So you may find Likert scale ratings the best fit for many of the questions in this article.

If you’re including several questions that use rating scales in the same survey, be consistent: Keep the top end of the scale as the most positive response and the bottom end as the most negative one.

1. Overall, how satisfied are you with our customer service?

  • Why ask this question: Tracking your Customer Satisfaction Score (CSAT) helps you understand the quality of your support. This question encourages customers to provide feedback on your customer service and helps you understand the quality of service you’re providing year-round, not just for individual interactions (see the calculation sketch after this question). You should keep these responses separate from your regular CSAT results, though, to avoid mixing long-term and short-term customer satisfaction measures.
  • How to collect answers: Rating scale followed by an open text field for the customer to provide clarifying comments
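
If it helps to make the metric concrete, here is a minimal sketch of how a CSAT score is commonly calculated from rating-scale responses. It assumes a 1–5 scale where 4 and 5 count as “satisfied”; the scale, threshold, and example data are illustrative assumptions, so adjust them to match your own survey.

  # Hypothetical helper: percentage of respondents who gave a positive rating.
  def csat_score(ratings, positive_threshold=4):
      # Assumes a 1-5 scale; ratings at or above the threshold count as satisfied.
      if not ratings:
          return 0.0
      positive = sum(1 for r in ratings if r >= positive_threshold)
      return positive / len(ratings) * 100

  # 8 of these 10 responses are a 4 or 5, so CSAT is 80%.
  print(csat_score([5, 4, 3, 5, 4, 2, 5, 4, 4, 5]))  # 80.0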

2. How easy is it to contact our support team when you need help?

  • Why ask this question: Ask this to track your Customer Effort Score (CES). This shows how effective your support processes are. For example, is it easy for customers to find a way to contact your support team? Are customers able to contact you on the support channel that’s most convenient for them? Like the CSAT question above, you should keep these responses separate from your regular CES results to separate short- and long-term measurements.
  • How to collect answers: Rating scale, including “N/A” in case customers haven’t needed to contact your support team

3. How satisfied are you with how quickly our support team is able to resolve your issues?

  • Why ask this question: Long resolution times are a common cause of customer dissatisfaction. Tracking this alongside your Average Resolution Time will help you understand customer expectations and set resolution time targets that match them.
  • How to collect answers: Rating scale, including “N/A” in case customers haven’t needed to contact your support team recently

4. Thinking back to the most recent time you contacted our support team, what best describes the problem you needed help with?

  • Why ask this question: This can help you spot common problems that can cause customer dissatisfaction. If you ask this question periodically (for example, quarterly), you can track changes in responses to understand whether your company has been able to resolve some of your customers’ most frequent challenges and pain points—or whether you’re seeing the same issues every time.
  • How to collect answers: Multiple choice, including “N/A” in case customers haven’t needed to contact your support team recently

5. Before reaching out to our support team, in what other ways did you try to solve your problem?

  • Why ask this question: You can use this to identify gaps in your support provision because it shows ways customers tried to solve their problems without success. For example, if you see a trend of customers who tried searching your help center but couldn’t find the answer to their question, that suggests you may need to add or update some of your help center content.
  • How to collect answers: Multiple choice or open text box

Customer loyalty questions

These six questions offer insights and indicators of customer loyalty and can help you understand how likely it is that a one-time purchaser will turn into a repeat customer.

6. On a scale of 1 to 10, how likely are you to recommend [company name]?

  • Why ask this question: Net Promoter Score (NPS) is the traditional measure of customer loyalty and a leading indicator of word-of-mouth growth. It lets you measure the number of satisfied customers as well as the degree of customer satisfaction (see the calculation sketch after this question). If you ask this as part of your customer satisfaction survey, keep these responses separate from your regular NPS results to separate short- and long-term measurements.
  • How to collect answers: Rating scale
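
For reference, here is a minimal sketch of the conventional NPS calculation: respondents scoring 9–10 are promoters, 0–6 are detractors, 7–8 are passives, and NPS is the percentage of promoters minus the percentage of detractors. The example data is made up purely for illustration.

  # Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
  def nps(scores):
      if not scores:
          return 0.0
      promoters = sum(1 for s in scores if s >= 9)
      detractors = sum(1 for s in scores if s <= 6)
      return (promoters - detractors) / len(scores) * 100

  # 5 promoters, 3 passives, and 2 detractors out of 10 responses -> NPS of 30.
  print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # 30.0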

7. How likely are you to purchase from [company name] again?

  • Why ask this question: This question won’t apply to all companies. But if your products or services are designed to be purchased regularly, it’s an important question to ask because dissatisfied customers are unlikely to be repeat customers. This question provides an indicator of customer loyalty (though just an indicator, not a measure, as there’s no guarantee these customers will purchase from you again).
  • How to collect answers: Rating scale

8. Overall, how satisfied or dissatisfied are you with [company name]?

  • Why ask this question: A customer may interact with several different departments—sales, marketing, and support—as part of their buying process. This question focuses on the customer’s complete experience buying from your company, not just their support experience, so you can understand how satisfied they are with your company as a whole.
  • How to collect answers: Rating scale

9. How can we improve your experience with [company name]?

  • Why ask this question: This question gives customers the chance to share their thoughts on areas where you could do better or where you’re not meeting their expectations. For example, you might have customers who come to you from a competitor and so have different expectations for their customer experience. This will give you insight into what you can improve and shows you’re committed to providing the best experience for your customers.
  • How to collect answers: Open text field

10. Based on the service you received, how likely are you to recommend [company/product] to others?

  • Why ask this question: This puts a customer service slant on the traditional NPS survey. High-quality customer service and support can drive customer loyalty, and this question helps to quantify the impact support teams can have on customer retention.
  • How to collect answers: Rating scale

11. How disappointed would you be if you could no longer use [company/product]?

  • Why ask this question: This question was developed by startup advisor Sean Ellis as a leading indicator of product/market fit. But even established companies can get value from this question: It helps contextualize how “sticky” your product is and how easy it would be for customers to stop using your product in favor of a competitor’s.
  • How to collect answers: Rating scale followed by an open text field for the customer to provide clarifying comments

Product usage questions

If your customers aren’t using the product regularly or are struggling to get value from it, they’re unlikely to be satisfied, no matter how good your customer service is. While you should try to get as many product-related insights as possible from your product analytics, these questions provide useful context about the customer’s experience using your product, which is just as important as the service you provide.

12. How long have you been using our product?

  • Why ask this question: Where possible, you should try to get this information from your product analytics to avoid asking additional or unnecessary questions. But if you’re not able to link up customer data across multiple platforms, this question provides useful context if you want to break your survey responses down into customer cohorts. Look for trends across the customer lifecycle: Are long-time customers more or less satisfied with your product and service than new customers? You can use these insights to understand where in the customer lifecycle you’re providing the best experience and work to replicate that experience consistently for all your customers.
  • How to collect answers: Multiple choice

13. How often do you use our service/product?

  • Why ask this question: Like the question above, you should be able to get this information from your product analytics. If you can’t, this question can help you predict customer loyalty. A customer who uses your product every day is more likely to renew at the end of their contract than one who only uses it every couple of weeks.
  • How to collect answers: Multiple choice

14. Does our service/product help you achieve your goals?

  • Why ask this question: This question lets you know whether your product meets your customers’ needs. If they’re not achieving their goals, it’s unlikely they’ll be satisfied with your product. You can also use this question to check that your sales and marketing are aligned with your product: Do customer expectations match up to their real experience with your product?
  • How to collect answers: Yes/No followed by an open text field for the customer to provide clarifying comments

15. Which product features do you find most valuable?

  • Why ask this question: This helps you understand where your customers get value from your product. It also offers great insights for your sales and marketing teams, who can use them in conversations with prospects, in product positioning on your website, and in marketing campaigns.
  • How to collect answers: Multiple choice followed by an open text field for the customer to provide clarifying comments

16. Which product features do you use most often in your day-to-day life?

  • Why ask this question: This helps you understand how customers use your product—they’ll use different features to achieve different things. And if they select “I don’t know” or “N/A,” that suggests they’re not that engaged with your product and are unlikely to be having the best experience using it.
  • How to collect answers: Multiple choice followed by an open text field for the customer to provide clarifying comments

17. What would you improve about the service/product?

  • Why ask this question: This question can be used as an early warning sign of customer dissatisfaction. It helps you identify areas of your service or product that aren’t meeting customer expectations. If you’re seeing similar responses again and again, you can work to improve those specific areas. Alternatively, customers might use this question to request new features, so you can assess the appetite for new product features and gather data to help prioritize your product roadmap.
  • How to collect answers: Open text field

Closing question

Closing questions are a nice way to round off your customer satisfaction survey. They signal to your customers that this is the last question and give them an opportunity to share any final thoughts.

18. Do you have any additional comments or feedback for us?

  • Why ask this question: Don’t end your survey abruptly. This question gives customers a chance to add anything you might’ve missed with an open question to finish. This may surface small concerns that aren’t big enough to warrant reaching out to your customer support—little annoyances like a delivery that was a day or two slower than expected—but that still affect customer satisfaction.
  • How to collect answers: Open text field

Bonus: Best practices for building customer satisfaction surveys

Your customer satisfaction survey needs to be quick and easy for customers to fill out to maximize response rates and bring in the most valuable feedback from your customers. Keep these best practices in mind when creating your surveys.

Build surveys for your customers, not your company

When you think about building a survey, it can be tempting to start from the perspective of, “What information do we want from our customers?” This can lead to you trying to fit too much into one survey because different departments will want different information.

May Lauren Arad, product marketing manager at CoScreen, shared this advice: “Keep in mind who your users are! You are building the survey for them. Make the survey understandable and concise.”

Put this into practice: Keep a customer-centric mindset when creating your survey. Go through the process of filling it out as though you were a customer. Is it easy to complete? Do the questions all make sense? Are the questions relevant? If any questions make you pause or you find yourself skipping them, revisit them: it’s likely your customers will struggle with or skip over them too.

Ask different types of questions

If your survey uses the same types of questions and the same rating scale throughout, it’s easy for customers to complete. But it also makes it easy for them to disengage from your survey and simply go through the motions.

Asking different types of questions means you can collect different types of feedback—both qualitative and quantitative—and get more insights than you might get if you stick to a single question format, such as multiple-choice questions.

Put this into practice: Mix up your survey with questions that require different types of responses, such as binary yes/no answers, multiple choice, rating scales, and open text answers.

Keep it concise

The longer and more complex your surveys, the less likely you are to get the quality and quantity of responses you need to draw meaningful insights.

SurveyMonkey found that survey completion rates dropped when surveys took more than 7–8 minutes to complete. Respondents spent 25–30 seconds per question on average when completing surveys that were between three and 15 questions long. At 30 seconds per question, 15 questions would take seven and a half minutes, which is about as long as your survey should be.

Source: SurveyMonkey: How much time are respondents willing to spend on your survey?

Put this into practice: Don’t fall into the trap of adding nice-to-have questions to your customer satisfaction survey. Think carefully about each question you’re adding and consider what insights you’ll get from that question. Look at all the questions collectively to minimize overlap between them.

Time it right

The timing of your survey has a significant impact on both your response rate and the feedback you receive. If you send a customer satisfaction survey before the customer’s issue has been resolved, they’ll be more likely to provide negative feedback, which will make your support team look worse than it really is.

According to Geckoboard VP of Customer Success Luis Hernandez, it’s important to send your survey at the right time in the customer journey, but also to give customers time to complete the survey:

“Time it well, both in the sense of when to send the survey (i.e., you don't want to send a survey if the issue isn't solved or the question hasn't been answered entirely or send a survey to someone that still has a conversation open for that matter) but also how long to keep it open for (you don't want people trying to complete the survey a few days after it was sent only to find they no longer can).”

Put this into practice: Build out a customer satisfaction survey playbook so you’ve got a plan for when and how to send different types of surveys. For example, you might send single-question pulse surveys like CSAT or CES surveys as soon as a support ticket is marked as “resolved,” and send longer multi-question satisfaction surveys monthly or quarterly.

Send regular surveys

Collecting regular feedback by sending surveys frequently shows that you’re interested in improving the overall customer experience. Sending surveys after each purchase or support interaction, rather than just once a year, demonstrates a clear commitment to and investment in customer satisfaction.

Put this into practice: Don’t try to ask everything at once. Alternate between sending longer customer satisfaction surveys and single-question pulse-check polls.

Don’t be fooled by response bias

Not all of your customers will fill out your customer satisfaction survey. The ones most likely to respond are those with the most extreme experiences: either they had an amazing experience and love your product, or they had a lot of challenges and want to vent about it. The middle-of-the-road customers are much less likely to respond, so remember that your response data is likely to showcase the two extreme ends of the customer experience spectrum.

Vasu Prathipati, CEO & co-founder at MaestroQA, advises: “Mind the response bias! Customers tend to fill out surveys only when delighted, upset, or incentivized to do so. This tends to bias the data a little bit, so CX/CS leaders should keep that in mind when building customer surveys. Making those surveys as easy to fill up as possible helps you get a wider sample of customers, and helps mitigate the response bias a little.”

Put this into practice: Look at all your response data rather than fixating on individual responses and data points. Focus on overarching trends and common themes to help mitigate the response bias when analyzing your survey response data.

Your customer satisfaction survey template

Be concise. Ask too many questions, and you’ll get far fewer responses. While we’ve included 18 questions in this article, you shouldn’t aim to ask them all in one go. To help you get the best insights from your customers without overwhelming them with survey questions, we’ve put together a customer satisfaction survey template that combines questions on customer service, customer loyalty, and product usage. Make a copy of this template (Google Form), use it to build out your own customer survey, and start getting that valuable feedback from your customers.


Customer satisfaction survey template

Send this customer satisfaction survey template to get feedback from your customers and understand how satisfied they are with your company, product, and services.

1. On a scale of 1 to 10, how likely are you to recommend [company name]?
1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10

2. How long have you been using our service/product?

  • Less than one month
  • 1-3 months
  • 3-6 months
  • 6-12 months
  • 1-2 years
  • More than two years

3. Which product features do you find most valuable?

  • Feature 1
  • Feature 2
  • Feature 3
  • Feature 4
  • Feature 5

4. What would you improve about the service/product?
[open text field]

5. How easy is it to contact our support team when you need help?

  • Very easy
  • Easy
  • Neutral
  • Difficult
  • Very difficult
  • N/A - I haven’t needed to contact the support team [skip to question 8]

6. Thinking back to the most recent time you contacted our support team, what best describes the problem you needed help with?

  • Missing items
  • Delayed delivery
  • Returns or exchanges
  • Problems on website
  • Other [open text field]

7. How satisfied are you with how quickly our support team is able to resolve your issues?

  • Very satisfied
  • Satisfied
  • Neutral
  • Dissatisfied
  • Very dissatisfied

8. How can we improve your experience with [company name]?
[open text field]

9. How likely are you to purchase from [company name] again?

  • Very likely
  • Likely
  • Neutral
  • Unlikely
  • Very unlikely

10. Do you have any additional comments or feedback for us?
[open text field]