My local car dealership struggles with service.
On multiple occasions, I've arrived for an appointment only to learn that a needed part hadn't arrived as expected. That meant I had to drive home and come back another day.
The mechanic once badly scratched my car's front fender and didn't say anything—I noticed the damage just as I was getting in the car. The dealer fixed it, but my car was in the body shop for a few days.
A recent experience was the last straw.
I called to make a service appointment and asked how long it would take. The employee informed me it would be two hours, but when I arrived, the service advisor told me it would take four hours.
That was time I didn't have.
You'd think the dealership would be interested in learning from mistakes and finding a way to keep my business.
In reality, what matters most to the dealer is my survey score. An employee has directly asked me to give a good rating on their survey after every one of these service failures.
Unfortunately, they aren't alone. Here's how surveys can make service failures worse.
Does your survey focus on the wrong thing?
Surveys should focus on the experience itself, not just the customer service employees who are there to help when things go wrong.
For example, I recently bought an inflight internet pass to use while I was flying cross-country. The internet was spotty the entire trip and my connection repeatedly dropped, so I emailed customer service to ask for a refund.
The customer service rep responded quickly and offered a credit for a future flight, which I accepted as a fair compromise.
I received a survey the next day. It asked me to evaluate the support employee, but not my experience using the company's product.
From a customer's perspective, I had already shared all the feedback the company needed:
The service failure itself
My satisfaction with the resolution
We had ended the poor experience on a high note. Now the survey reminded me of the bad experience all over again. As Shep Hyken recently wrote, the survey shouldn’t be the last thing the customer remembers about you.
The day after the latest service failure at the dealership, I received this text from my service advisor:
The message was clearly automated, but it still came across as completely oblivious.
It’s an example of survey begging.
I already shared my feedback with the service advisor directly.
He knows it wasn't an exceptional service experience.
You can prevent this problem by establishing a clear purpose before creating a survey. Understand who you want to survey, why you want to survey them, and what you plan to do with that information.
This short video explains how to set a survey goal.
Should you even send a survey?
There are situations when a survey is a bad idea. For example, some companies send a survey after each customer service interaction. That could really infuriate a customer who has to contact support multiple times to resolve the same issue.
That text from my service advisor was another poor example. It rehashed the service failure and made the memory even fresher in my mind.
Our ensuing text exchange tells me he still doesn't get it.
The last thing I said to him when we spoke in person was, "I'm tired of you wasting my time. I'm taking my business to another dealer."
He still hasn't apologized. Now he's inviting me to come back in like nothing happened?!
The good news is many customer service survey platforms can be configured with rules that determine when to send a survey and when not to. For example, you can:
Limit the number of surveys a customer is sent in a certain time period
Avoid sending multiple surveys for the same issue
Prevent surveys from being sent when they're not warranted
This last one is tricky.
Some companies have found that unscrupulous employees will prevent surveys from going out just to keep their average score higher. I never received the promised survey from the dealership, which leads me to believe that's what happened here. The service advisor likely anticipated a low score and may have prevented the survey from going out.
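If it helps to picture how those three rules fit together, here's a minimal sketch in Python. It isn't any particular survey vendor's API; the class, field names, and thresholds are hypothetical. It simply illustrates throttling surveys per customer, skipping repeat surveys for the same issue, and logging who suppressed a survey so that last rule can't be quietly abused.

```python
# Hypothetical sketch of survey-sending rules. Not a real vendor API;
# names and thresholds are illustrative only.
from datetime import datetime, timedelta
from dataclasses import dataclass, field

@dataclass
class SurveyRules:
    max_per_period: int = 1                            # at most one survey...
    period: timedelta = timedelta(days=30)             # ...every 30 days per customer
    history: dict = field(default_factory=dict)        # customer_id -> list of send times
    surveyed_issues: set = field(default_factory=set)  # (customer_id, issue_id) already surveyed
    suppression_log: list = field(default_factory=list)

    def should_send(self, customer_id, issue_id, suppressed_by=None, now=None):
        now = now or datetime.utcnow()

        # Rule 1: limit the number of surveys per customer in a time period.
        recent = [t for t in self.history.get(customer_id, []) if now - t < self.period]
        if len(recent) >= self.max_per_period:
            return False

        # Rule 2: avoid sending multiple surveys for the same issue.
        if (customer_id, issue_id) in self.surveyed_issues:
            return False

        # Rule 3: allow suppression when a survey isn't warranted, but record
        # who suppressed it so the rule can't be used to bury low scores.
        if suppressed_by is not None:
            self.suppression_log.append((customer_id, issue_id, suppressed_by, now))
            return False

        # Record the send and allow it.
        self.history.setdefault(customer_id, []).append(now)
        self.surveyed_issues.add((customer_id, issue_id))
        return True


rules = SurveyRules()
print(rules.should_send("cust-42", "repair-001"))  # True: first survey for this customer
print(rules.should_send("cust-42", "repair-001"))  # False: same issue, within the same period
```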
Does your survey inspire action?
It's frustrating for customers to give the same feedback over and over again. You need to use your surveys to identify issues and take action to fix them. Otherwise, you're just wasting your customers' time.
You probably see a lot of survey invitations at the bottom of receipts. A 2016 study from Interaction Metrics found that 68 percent of those surveys “are total garbage.” The questions are so manipulative and the surveys so badly designed that they yield little useful information.
I once spoke with an executive who proudly announced her company had implemented a new survey. "What are you doing with the data?" I asked.
She explained that the survey scores were shared in a monthly executive meeting. There was a long pause as I waited for her to continue. No, that was it.
The survey was a waste of time in a number of ways:
It didn't have a comment field so customers could explain their ratings.
The company wasn't analyzing survey data to identify trends.
Nobody was taking any action to improve service.
My local dealership has experienced the same issue.
I’ve directly shared my concerns about service quality multiple times. The service advisor knows about it. Several of his colleagues do, too. I’ve talked with at least two of his bosses.
And yet, after every poor experience, someone awkwardly approaches me and asks me to be nice on the survey. Meanwhile, nothing gets better.
The sad part is the issues are fixable.
I called another dealership to book an appointment for the service my car still needed. The employee was careful to advise me that the appointment would take approximately four hours, and he gave me the option to wait, get a loaner car, or have Uber take me somewhere.
Take Action
Now is a good time to take a hard look at the surveys you offer. Ask yourself:
Why are we surveying our customers?
How are we using this data to improve the experience?
What aspects of these surveys could be annoying?
You can learn more about customer service surveys and get tools to create a great one on this resource page.