Why Customer Satisfaction Surveys Get it Wrong

We just switched our phone service from Verizon to Cricket (real savings!). A few hours after we had set up service, I got one of those customer satisfaction surveys. Get ready for a rant.

“How satisfied were you with the service provided by your sales advocate – from 10 (completely) to 1 (not at all)?”

Um, 5? We were the only customers in the store, and one of the guy’s friends came in and they started talking about parties. The service rep got the job done, but it wasn’t that difficult a job, we weren’t very demanding, and he wasn’t particularly welcoming.

He was also manning the dingy store by himself. It didn’t look like a fun job, but he did it and our new phones were set up and numbers transferred with no issues.

I don’t know, it was Cricket. My expectations weren’t that high, but they were met. If I valued a sparkly clean store and better service more than I valued the $100 a month we should save by switching, I’d have stayed with Verizon.

I put down 8. It seemed reasonable to let the company know there were no real complaints, but there was room for improvement.

And so it goes. Another customer satisfaction survey taken, another chance to get meaningful feedback from a customer blown by a well-meaning company that prioritized getting any answer over getting answers that mattered.

This One Goes to Eleven

I remember when I bought my next to last car. The salesman looked straight at me and said, “Now you’re going to get a survey. If you’re happy with your service, give me a 5 because anything else is bad.”

I appreciated his honesty, but the system ticked me off. The salesman did a good job, but why should I give a perfect score for good, solid work? Shouldn’t perfect scores be reserved for above-and-beyond service, not meeting expectations?

It’s like giving the very best trophy for participation, a big “You Did Okay” award.

I think this approach to customer satisfaction surveys devalues excellence. If I give a perfect score for “Good”, what should I do for “Above and Beyond the Call of Duty to make sure I was a Happy Camper?”

It wasn’t the fault of the car salesman that I didn’t need him to do anything special, and he didn’t deserve to be penalized for doing the job well. I gave him his 5s, even though I wanted to go with 4s. (Very good. Not supercalifragilistic, but I would recommend him with no qualms to anyone else who wanted to buy a Honda.)

But if Excellent is the standard for good, how do you recognize excellence?

And if I say something is “Good”, why the heck should someone else judge that as being bad?

Customer Satisfaction Surveys: A Dissatisfying System

Businesses need to know how they are doing, and getting customer feedback can be key to finding out what’s working and failing. Businesses get some feedback from sales levels, but well-designed customer satisfaction surveys can give more targeted and actionable feedback.

Usually those surveys are long and tedious, like the one I filled out for my financial advisor last week. Three pages of very specific questions about my satisfaction with my advisor, his staff, their company, their company’s website, etc. (I’m pretty happy and gave mostly 9s.) At least there was some room to differentiate on aspects of their service.

Contrast this to the text survey, where I was asked three questions to sum up my entire Cricket experience. Three questions on a 10-point scale are not targeted and actionable feedback. They’re noise.

It’s easy to text back a number, so I did it, but the number has no meaning. Cricket just asked for a number, so I gave them one that indicated that the experience was fine but not perfect.

Maybe the company will read it that way, maybe they won’t. They certainly won’t know what about the experience they could make better.

A better question would have been something more specific. How did the sales advocate behave?

It’s an open-ended question. I could have answered any way I wanted. I could have said, “Adequate service considering there was only one associate working.” I could have said, “Got the job done in a very workmanlike manner.” I could have told them about his buddy coming in.

That would have taken time to answer. With an open-ended question, I might have given the answer, or I might have just decided the survey was too much trouble.

That’s why almost every business gives you these point-scale surveys. They’re easy for customers to submit, but they are not very informative.

My 8 means I got the service I expected, not that it was good service. On a bad day, I might have given him a 3 for smirking.

(The sales rep was a smirker. My impression of him was that he looked like a jerk on his best behavior, the guy you’re so happy your best friend finally broke up with. He did not give a good first impression.)

Should I have penalized him because he looked like someone I wouldn’t enjoy knowing socially? Should I have rewarded him more for meeting my very minimal expectations?

Should I have ignored the survey?

Usually I do.

Amazon has sent me 4 or 5 this week alone. Did the item meet my expectations?

It’s December. I won’t be able to answer that until the items get tried out Christmas morning. Delete.

At least Amazon asks for more than a 5-star rating. They want at least a short justification for any rating you give. Why was it great, terrible, or mediocre? Putting the stars and the comments together gives much better feedback.

Read enough Amazon reviews of a product, and you’ll have a pretty good idea of how it performs, what its best qualities are, what its potential problem areas are, and whether it will meet your needs.

That kind of feedback takes time to write, though, so I generally only give feedback on the great or the terrible. If I love it or hate it, I want everyone to know why. If it’s just okay or even pretty good, I don’t usually feel the need to add my voice.

So on most items, I don’t leave my opinion, even if I like the item. When sharing requires a commitment of time and energy, I don’t have the inclination to share the good or the fair, only the great or the terrible.

That means there’s a whole lot of customer satisfaction going unreported. There’s also a fair amount of customer apathy being lumped together with satisfaction, which is an entirely different problem.

The Real Feedback

So, the feedback I didn’t leave for Cricket? A fair, Amazon-style 5-point review of my transaction would have read:

(3.5/5) I’m looking forward to the major savings I can get from switching my cell service to Cricket, and appreciate the solid Android phone I purchased at a good discount. It was not difficult to switch over to my new service and my phone number got ported over easily and in a timely fashion. The phone and internet service seems like it will be fine, though I did notice slightly less audio clarity on the one or two phone calls I made. The phone calls were still audible and easily understandable. Cricket is cheap in part because it doesn’t put a lot of money into having comfortable physical locations. The actual Cricket store experience is decidedly minimalist in both staffing and décor. Fortunately, I anticipate few visits to the Cricket store.

I didn’t like the Cricket store, but that’s not a deal breaker. The Verizon store was nicer, but I went there once every other year and don’t love the phone store experience even under the best of circumstances. I still have the option of going to a store if needed, and I like having that option.

The Cricket phone service itself, while not absolutely the best, is pretty good. There are a lot of good phone options, and the price of the service after enrolling in autopay ($35 per month per phone for talk, text and 2.5 gigs of data) is an improvement.

I anticipate being satisfied with this consumer choice.

Better and more complete feedback. Not necessarily something I would fill out via text, but far better than “8.”

I understand there’s a trade-off. When companies ask for more complete or specific feedback, customers are less likely to provide it. When companies ask for simple feedback that customers are more willing to provide, they get only the vaguest information about the customer’s state of mind.

I’m not sure what to do about surveys that seem to be such a waste of everyone’s time and resources. In this case, I did the customer satisfaction survey and ranted afterwards about its lack of meaning. I could swear I’ll never do another point-scale survey, but I will. I can promise not to inflate my answers on the next survey, but I probably will anyway.

There’s no solution. Just more surveys, more things for the consumer to do, and more noise for marketing to sift through to find out how the company is actually doing.

So, on a 1 to 10 scale, with 10 being the awesomest and 1 being the worst, how do you rate this post?

 Oh, and how do you feel about customer satisfaction surveys? Do you take them or ignore them? How do you answer them fairly?

Image courtesy of nenetus at FreeDigitalPhotos.net

4 thoughts on “Why Customer Satisfaction Surveys Get it Wrong”

  1. Ha. The last one I took, I gave all ones because a salesperson had lied to me. Five minutes later the company called back with a cobbled-together package to make up for it. So I got the imaginary rate he had made up. Hope no one got in trouble because I took it before they had fixed it. Good point about people not reviewing unless ticked off or over-the-moon happy.
    FF @ Femme Frugality

  2. Tim recently got one where the salesman said he would get in trouble for anything less than perfect. Which left us thinking that the system was ridiculous, but also that the salesman had no business telling us that. It’s passive aggressive. I get that these sorts of systems can hurt the employee. But if anything it makes me less inclined to give him/her a perfect score. Because that last statement was unnecessary and is essentially a guilt trip. That’s the note you leave me on, which colors my experience.

    I prefer the places that leave room for detail. It feels like more of them have that option now. We go to the same restaurant for date night each week. If you fill out a survey within 3 days of your visit, you get a free appetizer (which is what I order as my entree). I like it because it lets you elaborate in a few places if you want. And our server is really amazing — one of the reasons we keep going back. So I love to sing her praises.
    Abigail @ipickuppennies

    • Telling someone you have to get perfect scores can certainly backfire, and make people either not want to give the score or ignore the survey as useless.
      It sounds like your fave restaurant has a good way of getting customers to engage with the survey.
