How to improve your reputation with average customer service

Customer service is getting worse.

That's the finding from two recent reports. Companies are struggling to avoid service failures and keep their customers happy. But a third report and two top customer experience experts offer a glimmer of hope.

You can stand out from the competition by being average.

Forget wows. Stop worrying about delight. Don't fret over extraordinary. Just be consistently, perfectly, boringly average.

There are a few caveats.

  1. Your average has to be just a little better than the competition.

  2. Your average has to be consistent.

More on those in a moment. But first, let's look at the state of customer service.

A customer completing a survey on an app.

Is customer service getting worse?

Yes. Two prominent consumer studies show that customers perceive that service is getting worse.

The first is the American Customer Satisfaction Index (ACSI), which publishes a quarterly national customer satisfaction score for the United States. The composite ACSI score has declined or remained the same for seven quarters.

The second study is the 2020 National Customer Rage Study from Customer Care Measurement and Consulting (CCMC). It found that the number of households experiencing at least one problem over the past 12 months increased by 10 percentage points since the 2017 study.

There is some hope.

I conducted a survey of 1,084 U.S. consumers in November 2020 and asked them what type of service they receive most often. Surprisingly, 66.7 percent said they usually receive good service.

Get the Report

Download a copy of the report, “What type of service do customers receive most often?”

Should you try to delight every customer?

No. It’s impossible to delight every customer, and trying to do so can be costly. It might seem counterintuitive, but research shows that delighting customers has no significant benefits.

Customer experience expert Matt Dixon is the author of the classic business book The Effortless Experience.

Dixon told me in an interview that he and his colleagues set out to research customer delight and discovered something unexpected. They found that companies were better off avoiding service failures.

"On average, most service interactions don't create loyalty at all. They create disloyalty."

That's because, try as they might, companies often fail to keep their promises.

  • Products don't work

  • Services fall short of expectations

  • Delivery is a logistical nightmare

CCMC's Customer Rage Study shows that companies continue to make life miserable for customers when something goes wrong:

  • 2.9 contacts were needed to resolve a typical complaint.

  • 58 percent of customers never got a resolution.

  • 65 percent felt rage while trying to get a problem solved.

Despite the widespread use of surveys, many companies are doing a poor job identifying the problems that lead to customer rage.

For example, I recently experienced 18 points of frustration when ordering a table and barstools. It's likely the company only identified one.

There's got to be a better way.


How average service can win customers

Average really isn't the right word. Consistency is the key, as Shep Hyken points out in his excellent book, The Cult of the Customer. Hyken has a fantastic definition of customer amazement.

"Amazement is above average, but it's above average all of the time."

Hyken elaborates that companies win customers by being just a little above average, but doing it consistently. It's the consistency that captures customers' attention and eventually earns their trust.

You can hear more from Hyken in this interview.

There are a few things companies can do to be more consistent. The starting point is to create a customer service vision, which is a shared definition of outstanding service that gets everyone on the same page.

I researched customer-focused companies while writing The Service Culture Handbook. A clear vision was a common trait that set elite organizations apart.

The next step is to gather voice of customer feedback. Surveys can play a role, but there are many ways to gather customer feedback without a survey.

It's also important to identify reasons employees struggle to provide consistent customer service. I uncovered ten obstacles in my book, Getting Service Right.

Finally, it's important to understand that every customer is different.

Which brings us back to my study on the type of service customers receive most often. Most people felt they usually receive good service, which is service that meets their expectations.

But there was nuance to the responses. Perceptions changed by age group, gender, and geography. For example:

  • More women than men reported they receive outstanding service most often.

  • People in the western United States were more likely to report outstanding service.

  • Older customers felt they receive more outstanding service.

Cover image of the report, “What type of service do customers receive most often?”

Get the Report

Download a copy of the report, “What type of service do customers receive most often?”

Take Action

Think about the companies you admire most.

They didn't earn their reputation by wowing customers once in a while. Elite companies are known for dependably good experiences.

Let your competition flail about trying to delight customers. They'll inevitably fall short. You can stand out by being really good.

How to Get Massive Customer Feedback Without a Survey

Nate Brown, Co-Founder, CX Accelerator


It's no secret that customers are tired of surveys.

We get too many, they take too long to complete, and many fail to adequately capture how we really feel about our experience. There has to be a better way.

Conversations are an untapped resource. We talk to customers face-to-face and over the phone. We have written conversations via email, chat, SMS, and social media. This conversational data, often referred to as "unstructured" data, represents a treasure trove of customer insight, but customer experience leaders struggle to capture and organize it all.

Nate Brown, Co-founder of CX Accelerator, has discovered a novel way to solve the problem. He's designed a simple process that allows frontline representatives to quickly and easily capture data from customer conversations.

Brown shares his simple process in this 20-minute interview. We cover:

  • Why capturing data from customer conversations is so important

  • How to turn a simple USB webkey into a “CX Magic Button”

  • Where in the customer journey to look for data

  • How to encourage employees to capture and share customer feedback

  • Simple ways to quickly analyze and act on the data
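To picture what the capture step might produce, here's a minimal Python sketch of a conversation-insight record. The fields and the CSV format are my own assumptions for illustration, not Brown's actual template; the point is simply that every captured comment gets a consistent, analyzable shape.

```python
import csv
from dataclasses import dataclass, asdict
from datetime import datetime

@dataclass
class ConversationInsight:
    """One piece of customer feedback captured by a frontline rep."""
    captured_at: str
    channel: str        # phone, email, chat, SMS, social...
    journey_stage: str  # where in the customer journey it surfaced
    verbatim: str       # the customer's words, as close as possible
    rep: str

def log_insight(path: str, insight: ConversationInsight) -> None:
    # Append to a shared CSV so insights stay easy to aggregate later.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(asdict(insight).values())

log_insight("insights.csv", ConversationInsight(
    captured_at=datetime.now().isoformat(timespec="seconds"),
    channel="phone",
    journey_stage="onboarding",
    verbatim="I couldn't find the setup guide anywhere.",
    rep="jdoe",
))
```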

You can get step-by-step instructions from this post or follow him on Twitter at @CustomerIsFirst. You’ll also get more customer feedback help on this survey resource page.

You can get low-cost USB webkeys with your company’s logo from Lev Promotions.

How ecobee Wins Customers With Smart Surveys

Advertising disclosure: We are a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for us to earn fees by linking to Amazon.com and affiliated sites.

I recently purchased an ecobee3 lite smart thermostat for The Overlook, a vacation rental property my wife and I own. It's one of those smart thermostats that lets you control the temperature remotely via an app.

The decision to go with ecobee came down to service and support. 

The company uses a customer-centric approach to its product design, its pre-sales support, and its customer service. Ecobee also takes voice of customer (VOC) feedback seriously, and does an impressive job deploying both Net Promoter Score (NPS) and Customer Satisfaction (CSAT) surveys.

Ecobee's Director of Customer Service, Andrew Gaichuk, was kind enough to share some insight into how ecobee uses VOC feedback to stay on top.

The Ecobee 3 Lite. Image source: Ecobee


My ecobee Experience

I considered a number of different options before purchasing the ecobee3 Lite ($169 on Amazon).

It had received a number of good reviews. A vacation rental I stayed in a few months ago had the same model, and it was very easy to use from a guest's perspective. Ecobee even has this simple tool on its website that allows you to verify compatibility with your house's heating and cooling system.

These factors, coupled with a poor support experience from one of ecobee's main competitors, cemented the decision.

Installation was a breeze with this helpful online guide. There were also easy-to-follow instructions in the box along with a few extras such as a plate to cover the hole in the wall left behind from your previous thermostat.

Once installed, I downloaded the ecobee app that lets me adjust the heating schedule remotely. This is a big plus since I'll be able to lower the temperature whenever guests check out, which means a lower propane bill this winter.

Best of all, it's easy for guests to use. Temperature adjustment is intuitive and simple, with a slide of the finger being all that's required.

 

Ecobee and NPS

Customers get an NPS survey two weeks after registering their ecobee. The survey arrives via an email sent by the NPS survey company Delighted.

This is a good way to deploy a Net Promoter Score survey, since it asks how likely a customer is to recommend a company's product or service.


Gaichuk explained the rationale behind sending the NPS survey after two weeks. "This gives the customer enough time to experience the product and feel the benefits of ownership."

This was certainly the case for me. The Overlook had guests the first two weekends after I installed the ecobee, so I was already getting a sense of how the thermostat was working.

Many companies make the mistake of sending out an NPS survey after each customer service transaction. NPS really isn't the best tool for assessing customer service alone, since likelihood to recommend is based on many factors beyond a single interaction.

In the case of ecobee, the purchase experience, installation, and the product itself all weigh on whether a customer would recommend the product to a friend.

Ecobee's NPS survey also has an open comment question. This allows customers to provide additional detail on why they gave a certain rating, which can be analyzed later.


The survey asks just two questions, a rating question and an open comment question, yet it's a powerful tool because Ecobee uses the data correctly.

Ecobee's customer service team follows up with anyone who gives a rating of 6 or lower on the likelihood-to-recommend question. In NPS parlance, people who give a 6 or lower are known as detractors, so this is a chance to dig deeper into customer feedback or perhaps even save the customer.

Gaichuk and his team also analyze NPS survey comments for trends.

"We define trends through key words such as Customer Service, Installation, Wifi, etc. to help narrow down what key issues customers are experiencing so we can action it for future improvements. For example if we see any detractor for 'Customer Service' we can investigate the interaction, determine the issue and provide one on one coaching/feedback with the CSR."

 

Ecobee and CSAT

Customers who contact ecobee's customer service team receive a CSAT survey at the end of the interaction. 

CSAT is a much more appropriate survey type than NPS for service transactions, so it's good to see ecobee using both NPS and CSAT in an appropriate way.

Ecobee uses Zendesk customer service software, which has a built-in survey question that simply asks customers, "Are you satisfied or unsatisfied?"

Like the NPS data, Gaichuk uses these responses to identify trends.

"I can measure these C-Sat scores by department, CSR team or agent level. The Supervisors are each responsible to review the Unsatisfied results with their respective team members and identify areas for improvement."

Ecobee's customer service team currently has an outstanding 91 percent CSAT rate.

The company sends customers a survey as a post-transaction email. My research shows this is a best practice, and Ecobee enjoys a robust 19 percent response rate.
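For anyone curious about the arithmetic behind those two figures, here's a trivial sketch. The volumes below are hypothetical numbers chosen to reproduce the reported rates.

```python
def csat_rate(satisfied: int, unsatisfied: int) -> float:
    """CSAT: satisfied responses as a share of all rated responses."""
    return 100 * satisfied / (satisfied + unsatisfied)

def response_rate(responses: int, surveys_sent: int) -> float:
    """Share of survey invitations that come back completed."""
    return 100 * responses / surveys_sent

# Hypothetical volumes chosen to match the rates reported above.
print(csat_rate(satisfied=910, unsatisfied=90))                   # 91.0
print(round(response_rate(responses=950, surveys_sent=5000), 1))  # 19.0
```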

According to Gaichuk, the customer service team uses the survey invitation to create another positive customer touch point.

"In early 2017 we changed our call process so CSR’s are now responsible to email the customer a summary of the call interaction. This is a great way to finish the interaction, wow the customer and provide them any additional information that may help. As a result the customer is provided the ability to rate the CSR’s support they provided."

 

Conclusion

Writing this blog post means I'm definitely recommending the ecobee3 Lite to friends and colleagues. 

The product is excellent, though I think it's the service and support that really makes the difference. Perhaps most impressive is how Gaichuk and his team at ecobee are using customer feedback to continuously improve.

What is a Good Survey Response Rate?

It's the most common question I get about surveys.

Customer service leaders are understandably concerned about getting a lot of voice of customer feedback. So my clients want to know, "What is a good response rate for our customer service survey?" 

The answer may surprise you—there's no standard number. 

There are situations where an 80 percent response rate might be bad, while in other circumstances a 5 percent response rate might be phenomenal.

In fact, I'm not overly concerned with the percentage of people who respond. My advice to clients is to use a different set of criteria for judging their survey responses.

Here's how to evaluate your own survey response rate the same way I do.

Three Response Rate Criteria

There are three criteria that you can use to determine if you're getting a good response to a customer service survey:

  • Usefulness

  • Representation

  • Reliability

Usefulness is the most important consideration.

Any response rate that provides useful customer feedback is good. That's not to say you can't do even better than your current rate, but the whole purpose of a customer service survey should be to yield useful data.

For example, let's say you implement a contact opt-in feature that allows you to follow up with customers who leave negative feedback. That survey could become tremendously useful if it allows you to contact angry customers, fix problems, and reduce churn.

Representation is another important way to gauge your response rate.

You want your survey to represent all of the customers you are trying to get feedback from. Imagine you implement a new self-help feature on your website. A representative survey in this case would ask for feedback from customers who successfully used self-help as well as customers who weren't successful and had to try another channel.

Sometimes you need to augment your survey with other data sources to make it more representative. The authors of The Effortless Experience discuss the self-help scenario in their book and suggest having live agents ask customers if they first tried using self-help.

This question can help identify people who didn't realize self-help was available and therefore wouldn't complete a survey on its effectiveness. It could also capture feedback from people who tried self-help, were unsuccessful, and didn't notice a survey invitation because their priority was contacting a live agent to solve the problem.

My final criterion is reliability.

This means the survey can be relied upon to provide consistently accurate results. Here's a summary of considerations from a recent post on five characteristics of a powerful survey.

  1. Purpose. Have a clear reason for offering your survey.

  2. Format. Choose a format (CSAT, NPS, etc.) that matches your purpose.

  3. Questions. Avoid misleading questions.

Many surveys have problems in one or more of these areas. For instance, a 2016 study by Interaction Metrics discovered that 92 percent of surveys offered by the largest U.S. retailers asked leading questions that nudged customers to give a more positive answer.

For example, Ace Hardware had this question on its survey:

How satisfied were you with the speed of your checkout?

The problem with a question like this is it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.

A more neutral question might ask, "How would you rate the speed of your checkout?"

 

Resources

A survey response rate is good if it generates useful data, is representative of the customer base you want feedback from, and is reliable.

That doesn't mean you shouldn't strive to continuously improve your survey. Here are some resources to help you:

A Simple Way to Double Your B2C Survey Responses

Everyone wants a better survey response rate. The Center For Client Retention (TCFCR) recently shared some data about business-to-consumer (B2C) surveys that revealed an easy way to improve results.

TCFCR helps businesses conduct customer satisfaction research. The company's client focus is primarily Fortune 500 companies in business-to-business (B2B) and B2C segments.

There's a big need for this type of service, given that a recent study from Interaction Metrics found 68 percent of surveys offered by America's largest retailers were "total garbage."

I provide similar services to small and mid-sized businesses, so I was curious to see what TCFCR's survey data might reveal.

One quick look and I immediately saw a way for businesses to double the response rate on their B2C surveys.

The Response Rate Secret

TCFCR pulled aggregate data from thousands of surveys across all of their clients for a 12-month period. The company compared response rates for "in the moment" surveys versus follow-up surveys sent via email. 

Here are the results: follow-up surveys had more than twice the average response rate of in-the-moment surveys!

An in-the-moment survey is offered at the time of service. It could be a link in an email response from a customer service rep, an after-call transfer to an automated survey, or a link in a chat dialog box.

A follow-up email survey is sent after the customer service interaction is complete.

TCFCR also found that sending a reminder email after the initial survey invitation typically generated an additional 5-point increase in response rates!

Some companies do follow-up surveys via telephone instead of email. TCFCR's data shows that those surveys get an average response rate of 12-15 percent, which is on par with in-the-moment surveys.

One thing to keep in mind is that this data is for B2C surveys only. TCFCR found that B2B surveys typically get a response rate that's about half of what you'd expect from a comparable B2C survey.
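Here's a quick back-of-the-envelope sketch of what those benchmarks mean in absolute numbers. The 13 percent in-the-moment rate is an assumption taken from the middle of TCFCR's 12-15 percent range; the other multipliers come straight from the findings above.

```python
invites = 1_000
in_the_moment = 0.13                    # assumed midpoint of the 12-15% range
follow_up_email = in_the_moment * 2     # "more than twice" the response rate
with_reminder = follow_up_email + 0.05  # reminder email adds ~5 points
b2b_follow_up = follow_up_email / 2     # B2B runs about half of B2C

for label, rate in [
    ("in the moment", in_the_moment),
    ("follow-up email", follow_up_email),
    ("follow-up email + reminder", with_reminder),
    ("B2B follow-up email", b2b_follow_up),
]:
    print(f"{label}: {rate:.0%} -> ~{invites * rate:.0f} responses")
```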

 

Increase Response Rates Even More

There are a few more things you can do to stack the deck in your favor.

One is to keep your surveys short. A 2011 study from SurveyMonkey found that survey completion rates drop 5-20 percent once a survey takes more than 7 minutes to complete. The same study found that the 7-minute mark typically corresponds to about 10 questions.

Most surveys will gather adequate data with just three short questions.

Another way to improve response rates is through rules-based offering. A lot of customer service software platforms, such as Zendesk, have a built-in survey feature that allows you to adjust which customers receive a survey and when.

For instance, you might only send a follow-up survey once a support ticket is closed rather than after every single interaction. Or, if you offer a subscription-based service, you might survey all customers when they reach the six-month mark of their annual subscription, regardless of whether they've contacted your company for support.
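To illustrate, here's a minimal sketch of what that rule logic might look like. The field names, the 90-day suppression window, and the thresholds are all hypothetical; real customer service platforms expose this kind of logic through their own trigger settings rather than code you write yourself.

```python
from datetime import date, timedelta

def should_send_survey(ticket_status=None, subscription_start=None,
                       last_surveyed=None, today=None):
    """Hypothetical rules: survey when a support ticket closes, or at the
    six-month mark of an annual subscription, but never twice in 90 days."""
    today = today or date.today()
    if last_surveyed and today - last_surveyed < timedelta(days=90):
        return False  # suppress: this customer was surveyed too recently
    if ticket_status == "closed":
        return True   # rule 1: survey once the ticket is closed
    if subscription_start and today - subscription_start >= timedelta(days=182):
        return True   # rule 2: six-month subscription check-in
    return False

print(should_send_survey(ticket_status="closed"))  # True
print(should_send_survey(subscription_start=date.today() - timedelta(days=200)))  # True
print(should_send_survey(ticket_status="open"))    # False
```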

You can learn more about response rates and other survey-related topics here.

The Powerful Survey Feature That Drives Customer Loyalty

Improving loyalty is a big reason companies survey customers.

The challenge is finding ways to actually accomplish that goal. Customer service leaders tell me confidentially that analyzing survey data is a struggle. Getting leaders to take meaningful action is another tough task.

There's one survey feature that can immediately improve your results. Seriously, you could implement it today and start reducing customer defections.

What is it? 

It's the contact opt-in. Here's a rundown of what it is, why it's essential, and how to implement it immediately.

What is a Contact Opt-In?

A contact opt-in is a feature at the end of your customer service survey that allows customers to opt in to a follow-up contact.

The opt-in does three important things:

  • It allows you to follow up with an upset customer and save their business.

  • The survey itself remains anonymous, which is important to some customers.

  • The opt-in doesn't promise a contact, it just gives you the option.

Best of all, it's really simple. Here's a sample opt-in:

May we contact you if we have additional questions?

Just make sure you add fields to capture a customer's name and contact information if they say yes!

 

Why are Follow-ups Essential?

There's a widely held perception among customers that surveys are meaningless.

That's because we're inundated with survey requests, but we rarely see any meaningful changes as a result of our feedback. Many customers are convinced their feedback is routinely ignored. (Spoiler alert: they're right.)

A follow-up tells customers you're listening. It demonstrates caring and empathy. Some customers have told me they were surprised and amazed to get a follow-up contact!

Now here's the best part: you might even be able to solve the problem and save the customer!

Data provided by the customer feedback analysis company, Thematic, shows that customers who give a "0" rating on Net Promoter Surveys have a lot more to say in the comment section than customers who give other ratings:

Data source: Thematic


“Detractors across dozens of companies we’ve worked with complain about the inability to contact the company about an issue they have, lack of communication, or difficulty finding information on how to fix an issue themselves,” says Alyona Medelyan, CEO at Thematic. “We have also observed that many customers leave their full name, phone number or reference number in a free-text comment. Detractors are three times more likely to leave contact details than others.”

This presents customer service leaders with two choices:

You can ignore all that anger and wait for the customer to tell family, friends, and colleagues, or you can contact the customer and try to iron things out.

 

How to Implement a Contact Opt-In

The process is very straightforward.

  1. Add a contact opt-in to the end of your survey.

  2. Review your survey for opt-ins (I recommend daily; see the sketch after this list).

  3. Contact as many customers as possible, especially angry ones.
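Here's a bare-bones sketch of the review step, pulling a prioritized call list out of your survey responses. The field names and the rating-of-three threshold are illustrative; map them to whatever your survey tool actually exports.

```python
def follow_up_queue(responses, threshold=3):
    """Opted-in respondents with low ratings, angriest first."""
    queue = [
        r for r in responses
        if r["opted_in"] and r["rating"] <= threshold
    ]
    return sorted(queue, key=lambda r: r["rating"])  # lowest rating first

sample = [
    {"name": "Pat", "rating": 1, "opted_in": True, "phone": "555-0101"},
    {"name": "Sam", "rating": 5, "opted_in": True, "phone": "555-0102"},
    {"name": "Lee", "rating": 2, "opted_in": False, "phone": None},
]
for r in follow_up_queue(sample):
    print(f"Call {r['name']} at {r['phone']} (rated {r['rating']})")
# Call Pat at 555-0101 (rated 1)
```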

Through trial and error, I've found that a phone call often works better than email or other channels for following up. It's easier to have a dialogue if you catch the customer on the phone, and a surprising number of customers will call you back if you leave a message with a direct phone number.

Here are a few other tips:

  • Empower your follow-up person (or team) to resolve as many issues as possible.

  • Use customer conversations to learn more about their situation.

  • Summarize feedback from customer follow-ups to identify broad trends.

 

Conclusion

Some leaders worry about the time required. If that's your focus, your head's probably not in the right place.

Here are three compelling reasons why you definitely have the time:

  1. Follow-up is optional. You don't have to contact every single customer.

  2. Saving customers can directly generate revenue and reduce servicing costs.

  3. Fixing chronic problems leads to fewer customer complaints in the long run.

Here are some additional resources to help you turn your survey into a feedback-generating, customer-saving, money-making machine:

Study: Surveys On Store Receipts Are "Total Garbage"

We've all gotten a survey invitation on a store receipt.

A 2016 study from Interaction Metrics found that 41 of the 51 largest U.S. retailers included a survey invitation on the standard receipt. The surveys were evaluated to see how useful and engaging they were.

Not a single one was fully engaging and scientific.

The study also found that 68 percent of the surveys were "total garbage," meaning the surveys were so flawed they weren't worth the time required to complete them.

You can view the entire study here. Below is a summary of the results along with some action items and resources to help improve your organization's customer satisfaction survey.

How the Study Worked

The study assessed surveys based on four criteria. Each was weighted to reflect its relative importance:

  • Access: Ease of locating and beginning the survey (5%)

  • Branding: Style reflecting the brand, correct spelling and grammar (10%)

  • Engaging: Keeping customers engaged throughout the process (35%)

  • Accuracy: Survey design that yielded accurate data (50%)

The surveys were all obtained by making purchases from the retailer, either in store or online.

 

Accuracy Flaws Uncovered

Inaccurate data can prevent companies from taking the right action to improve service. 

Or worse, a survey might be gamed to yield high scores that disguise the fact that service needs improvement.

Asking leading questions was one of the most prevalent flaws, showing up in 92 percent of the surveys examined. These are questions that are worded in a way that naturally leads customers to a particular answer. 

For example, Ace Hardware had this question on its survey:

How satisfied were you with the speed of your checkout?

The problem with a question like this is it assumes the customer was satisfied. This assumptive wording makes a positive answer more likely.

A more neutral question might ask, "How would you rate the speed of your checkout?"

Another issue was the use of overly positive wording that can bias a customer's response. The study found that 82 percent of surveys contained at least one question with overly positive wording.

Here's an example from GAP:

The look and feel of the store was very appealing.

This question also suffers from vague wording. Does "look and feel" refer to branding such as signage, displays, and decor? Or does it refer to cleanliness and organization? Perhaps it means the store's layout?

Here's an example from the now-defunct Sports Authority, where a cashier biased the survey in another way. He stamped the expected response right on the invitation:


Engagement Flaws Revealed

Surveys reflect on your company's brand.

They're part of the customer journey. Many retailers have made their surveys so needlessly long or aggravating that the survey itself reflects poorly on the brand, like this egregious example from Buffalo Wild Wings that required customers to navigate through 39 different screens!

The average retailer's survey had 23 questions.

That's a tedious number of questions to expect customers to answer. Nordstrom advertised that its survey took just 2 minutes, but it contained 25 questions and actually took 4 minutes to complete.

The study found that 13 percent of surveys were difficult to access. Walmart required customers to enter not one but two receipt codes. Rite Aid, Ross, and Walgreens all had broken links.

The best surveys are short and easy to complete. In many cases, you can capture troves of useful data with just three questions.

 

Resources

There are many resources to help you develop, implement, and refine your customer service survey while avoiding these mistakes. Here are just a few:

How to Use Surveys to Save Angry Customers

Companies that use customer service surveys fall into three groups.

The first is the majority. These companies just report the numbers. They don't really understand why they're surveying their customers, they just know that higher numbers are good.

Unfortunately, you really haven't learned anything if all you know is your Customer Satisfaction (CSAT) score is 85 percent one month and 86 percent the next. 

The second group uses their survey data to identify actionable insight. This group knows why CSAT moved from 85 percent to 86 percent. They also have a clear idea on how to get it to 87 percent next month.

The final group uses their survey to identify actionable insight, but they also use it to connect with individual customers.

This group knows that if 85 percent of customers were satisfied, then 15 percent were not. They want to find that 15 percent and help them before they take their business to a competitor.

This post explains how you can be a part of that third group too.

Why Yelp is (Almost) the Perfect Survey System

Take a moment to consider the beauty of Yelp.

Yes, it has some flaws that businesses don't like. The reviews are public (scary!), some of the reviews are fake (true story), and most people leave negative reviews (patently false).

Yelp also has a simple design that can give you a lot of feedback.

First, it asks customers to give a single rating. There's no convoluted mess of 36 different dimensions that will never be read or analyzed. Just one rating. One to five stars, with five being best.

Do you think people would write Yelp reviews if they had to answer 36 questions? Not a chance.

Next, Yelp asks customers to explain their rating in the comment section. The beauty of this is you can do some basic text analysis to understand why someone would give you a five star rating versus a three star rating.

Best of all, Yelp allows you to close the loop with your customers.

You can follow up with the customer in private to (hopefully) resolve their issue. You can also respond to their review publicly so other customers know you're listening.

In many ways, Yelp emulates the ultimate three-question survey. In fact, the biggest problem with Yelp, as I see it, is that most businesses don't get enough reviews.

 

Creating Your Own Better Yelp Model

You can easily create a survey that includes Yelp's best features.

Unlike Yelp, you will likely get a lot more responses and the results will remain private unless you choose to release your data to the world.

Here's a sample survey:

  1. How would you rate your experience? (1-5, with 5 being best)

  2. Why did you give that rating?

  3. May we contact you if we have additional questions?

A survey like this can yield lots of useful data without burdening your customers with unnecessary questions. You just need to know how to analyze it. 

Fortunately, you can use this handy guide.

Notice the third question allows customers to opt in to a follow-up contact. This is the linchpin that can allow you to identify and follow up with angry customers.

For example, you can set a rule that any customer who gives a rating of three or lower gets a follow-up contact. (Provided, of course, that the customer opted in.)

This follow-up can yield all sorts of great things:

  • You might fix the problem.

  • You might save the customer.

  • You might gain additional insight.

There's also a bonus.

One data analyst at a large company confided in me that customers who received a follow-up contact generally gave top scores on their next survey. So, closing the loop with angry customers can be really, really good for your overall survey score.

Let's not forget that our executives really do care about that score.

 

Resource

You can learn more about creating customer service surveys by watching this training video on LinkedIn Learning (subscription required).

How to Stop Employees From Survey Begging

We've all experienced survey begging.

Sometimes, employees offer an incentive. My nephew was recently offered free food in exchange for giving a fast food restaurant all 10s on their survey.

Other times, employees try to pull on our heartstrings. They tell customers they'll get in trouble if they don't receive a good score.

My friend Halelly recounted a recent experience taking her car into the dealership for service. "The (service advisor) coached me in person when I got the car serviced and has now sent me this email too."

The email warned Halelly that she would be getting a survey from the dealership and possibly the manufacturer. The advisor wrote:

We would greatly appreciate your time to complete the surveys. Anything other than all 10's is considered a fail.

This post explains why employees engage in survey begging. It also explains how you can put a stop to this annoying habit.

Survey Begging Defined

Here's my definition of survey begging:

Asking a customer to give a positive score on a survey by explaining how it will directly benefit the customer, the employee, or both.

Here are a few examples:

  • Offering customers discounts in exchange for a good score

  • Telling customers a bad survey will get you fired

  • Displaying "We strive for five" or similar signs

  • Directly asking customers for a positive survey score

  • Ignoring actual feedback that's not attached to a positive score

Side note: this definition is a first draft, so I welcome your feedback!

 

Why It's a Problem

Survey begging causes two problems.

First, it's annoying. Customers don't like being begged and cajoled into giving a survey score. This practice reinforces the perception that companies aren't really using voice of customer data to improve service.

The second problem is survey begging can cover up real service issues by artificially inflating scores. Customers might start spending less or stop doing business with a company entirely, without the company ever understanding what's causing the problem.

In other words, survey begging defeats the purpose of using a survey.

 

Why Employees Survey Beg

It's all about incentives.

Employees engage in survey begging because they have a clear incentive to achieve a high score or a strong incentive to avoid getting a low score.

Some employees have bonuses tied to their average survey score. This incentivizes them to ask customers for good scores because those positive surveys are literally adding to their paycheck. A slightly negative, but truthful survey might prevent an employee from earning their bonus.

Other employees can face disciplinary action if they receive too many low scores. One automotive service advisor told me he only pushes the survey to customers he thinks are happy because he could lose his job if he gets too many low scores.

Survey begging happens in many industries, but it's a particularly big problem in the automotive sector. Here's a great article on Edmunds.com that explains why.

The bottom line is if you want to stop the begging, you need to remove the begging incentive.

 

Getting Rid of Incentives

Many customer service managers are reluctant to get rid of survey incentives.

They operate under the false assumption that employees need these incentives to be motivated. There's a mountain of evidence that shows this isn't true. In fact, the number one motivator for customer service employees is being able to help their customers.

I wrote about a great example of this in my book, Service Failure. The Westin Portland was achieving consistently high guest service scores. Then-general manager Chris Lorino explained that part of their success came from resisting survey score incentives.

Instead, the hotel made guest service a core part of each associate's job. Here's an excerpt from the book:

"Associates coach and encourage each other to deliver high levels of service that will help them achieve their (guest satisfaction) goals. The hotel's leadership team regularly discusses guest feedback with the associates and encourages people to share ideas that will improve service even further."

Other managers are concerned that eliminating incentives makes it difficult to monitor employee performance through survey scores.

The problem is survey begging artificially inflates survey scores, so you end up rewarding employees who are best at begging, not best at service. 

A better approach is to use survey feedback to manage behaviors. For example, if an employee frequently gets surveys saying they are a little abrupt, you can coach them on ways to create a better impression.

The Strange Effect of Surveys on Consumer Behavior

The survey equation seems simple.

You ask a customer a few questions. Their answers help you spot problems. You then use their feedback to improve. If all goes well, customers become more loyal and sales go up.

The sticking point is applying the feedback; many companies never do. One study suggests that only 10 percent of companies use survey data to improve service.

It turns out that a survey has influence, even if you don't use the data.

A 2002 study published in the Journal of Consumer Research determined that the mere act of surveying customers increased sales and loyalty. Here's a summary of what the study's authors found.

The Study

The research was conducted by Utpal M. Dholakia and Vicki G. Morwitz. They separated customers at a financial services firm into a test and control group. The test group took a customer satisfaction survey while the control group did not.

Dholakia and Morwitz then used actual customer transactions to identify any differences between the customers who were surveyed and those who were not.

 

The Results

Two conclusions jumped out.

First, the surveyed customers were much more likely to open new accounts with the financial services firm.

Second, the surveyed customers were much less likely to defect. 

The Implications

It would be a mistake to just assume you can launch a survey and watch sales soar. 

There's a bit of nuance here. For example, the study was conducted in 2002 when surveys weren't nearly as constant as they are now. It's entirely possible the study would see different results if it was conducted today.

My guess is the survey itself isn't what's driving consumer behavior. My hypothesis is there are two parts of the survey that are really making the impact.

The first is the act of storytelling.

Customers relive their experiences when they complete a survey. Retelling their story through a survey can make strong feelings (good or bad) even stronger. And, like so many stories, the details change and become exaggerated over time to create a stronger narrative.

The second is that a survey can demonstrate a company actually cares.

It was probably easier to do that in 2002, when surveys were less common than they are in 2015. The big takeaway should be that if companies can show customers they care, then customers will likely reward them.