
How to Improve Customer Satisfaction with IT

Here are some strategies for creating real satisfaction when it comes to helping customers.

Here's a fun exercise: For the next month, seek out and complete every survey and customer comment card you are offered. I warn you, this exercise isn't for the faint of heart, because surveys are everywhere. The truth is, we are so inundated with surveys that we ignore most of them. Accept this challenge, and by the end of the 30 days you'll be amazed at how many organizations seem interested in obtaining your feedback, with the implied goals of improving service and exceeding your expectations.

During my month of surveying the surveys, several recurring themes made me question the effectiveness of these widely accepted tools. The most revealing finding was that almost a year later, I haven't seen one process change at the restaurants where I eat, the hotels I stay in, my gym, my bank or my barber. It may appear that these cards are the fabric that keeps organizations running, but I've peeked behind the curtain, and more often than not, our comments are given less consideration than we give to filling out the surveys.

Let's analyze a classic survey on a five-point scale and see if any of these sound familiar.


1: Very Unsatisfied - Strongly Disagree
When I get bad service, the last thing I want to do is tell someone about it on paper. I may want to scream at someone, but more often than not, self-control gets the better of me and I choose to tell them about my disapproval by never coming back. If I bring my truck in for new tires and leave with a four-inch scratch in the door, I'm not sure a comment card is going to accurately capture my feelings at that moment, or if any response could possibly convince me to return for future service. The damage is done and I want to go home.


2: Unsatisfied - Somewhat Disagree
In addition to reaching me after the damage is done, comment cards never seem to ask the questions I want to answer. I recently purchased some new computer hardware online. The electronic survey that came two weeks later asked if I found what I was looking for on the Web site, if the product was reasonably priced, if shipping time was acceptable and if the product met my needs - along with a host of demographic questions seemingly more interested in marketing than customer satisfaction. What was not on the survey was the biggest disappointment I experienced: the time it took to install due to the lack of adequate instructions. That question was conveniently left off the survey.


3: Neutral - Complete Apathy
It's great that the employee at the drive-through was smiling. I appreciate being offered a receipt, and my food was served at the proper temperature. I guess I feel like I got "good value for my food budget dollar," but I went through the drive-through. I didn't use the restrooms, so I can't comment on the establishment's cleanliness, and no, I'm not going to tell a friend. After all, I'm getting a burger from a clown's mouth. I just want my burger; let me go.

I really don't care about your business, and the comment card does little to further my relationship with your organization. You're asking the wrong customer.


4: Satisfied - Somewhat Agree
During my tenure at a large marketing firm, we tracked customer comment cards from more than 30 clients in a Microsoft Access database that would have made the staunchest application developer proud. I could break out data so detailed that only a statistician could truly appreciate it. But when it came right down to it, all I had was a lot of subjective ratings, and worse yet, we had no idea what to do with the information we collected.

You could score 4.3 one month and 4.6 the next, but no one knew why the numbers fluctuated, or if anything we changed made any impact. Ultimately we had relatively meaningless measures.


5: Very Satisfied - But I Still Disagree
A couple of years ago, I bought a new pickup truck. When I signed the final papers, the dealer handed me a new-customer survey that was already filled out. The dealer correctly assumed I would never fill out the eight pages of questions, so he did it for me. He also assumed I was going to answer fives across the board.

I've now heard similar stories from friends. Apparently various companies use these surveys to rate dealerships, and anything less than a five leads to issues with the manufacturers. I'm sure the intent is that dealers are so concerned with getting a bad survey that they are changing all their processes and tactics to meet today's customer expectations ... nope, they just fill out the survey for you.

On principle alone, I filled out a fresh survey and sent it in. They sent me lawn chairs. That was nice.

So what does any of this have to do with IT? Unfortunately when we attempt to measure customer satisfaction in our field, this is the model we use - a flawed instrument that provides us little more than a warm fuzzy feeling when the ratings look good.

We see this primarily in our support centers or help desks. Typically these folks have the most interaction with customers and usually under unpleasant conditions - when something is broken. They are the face of IT, but by the time customers see them, those customers are generally dissatisfied. Because of this relationship, most IT organizations measure customer satisfaction as a matter of survival and self-preservation. Show customers and executive management a warm fuzzy or else feel the wrath of the business director.

Support centers typically didn't cause the problems, and they share the customer's desire to fix them as quickly as possible, yet they are still the target for customers venting their frustration with technology. With unhappy people calling all day, every day, and very little control over call volume, we measure what we can to show anyone who will look that we're doing a good job. Did we answer the phone? Were we courteous? Did we fix the problem? And if we can track it, how quickly was the problem resolved?

We e-mail surveys after each work order is complete and dutifully collect the 3 percent that are returned. This way, when it hits the fan - and it will hit the fan - we can justify the six months it took to install new speakers by showing a 90 percent "satisfied" rating at a staff meeting.

I've seen IT directors and CIOs use help desk measures as ratings for the entire technology department. It's ludicrous to think that the complete customer technology experience and the shop's ability to meet the organization's needs can be captured in end-user support comment cards. In many cases, that's the best we have to offer because that's the best - or at least the most common - model we see. However, other than the warm fuzzy, survey results often have little else to offer.

I'm not suggesting that measuring customer satisfaction in our field is easy. First of all, very few people really know what IT does. We were recently asked in a budget hearing for a "regular English" definition of circuits, routers and switches. Eyes glazed over before we could even answer. Face it: We operate within the fog of war, and although many times it works to our advantage, when trying to communicate customer satisfaction, it makes our task very difficult.

Three other things hinder our ability to adequately measure customer satisfaction:

1. In most cases, we don't make an easily definable product. We may store and manage terabytes of data, but it's not really our product since the data is created and used by the organization. We simply house it. So exactly what is the product of the data center?

2. We juggle competing interests. In addition to having to spread limited resources across abundant demand from various divisions, IT must fight for infrastructure costs, such as disaster recovery, that end up competing for funding with new system development.

3. We rarely interact with our customers. Over the last five years, we've made significant strides in project management and in bridging the gap between IT and the business side. But the customer is too often out of sight and out of mind for the everyday technician.

With everything stacked against us, we either give up under the pressure or resort to the help desk survey. I would argue there is a better way.


Step 1: Talk to your customers, and they'll tell you what they want.

It starts with opening the lines of communication between the CIO and the division business leaders. In a consolidated environment, you may have 20 distinct entities that you must balance. In the increasingly rare case that you serve only a single agency, you probably have competing divisions. I encourage you to reach out to each and every one of them regularly.

When Missouri consolidated IT functions for 14 executive departments, we segmented our new customers into groups and asked the group leaders a simple question: "What results do you expect from your IT provider?"

We noticed a pattern. They wanted reliable systems that were secure, available and accessible to end-users. They wanted dependable project management on application development, and better communication regarding timelines and budgets. This was excellent to hear because we already measured some of these things. Maybe they hadn't been addressed on an enterprise scale or with the appropriate gusto, but most network managers already track service availability, server farm administrators calculate uptime, and we all should have tools that help with security and disaster recovery.

For the areas where we had no current metric, we worked with the customers to define what we could count and measure to demonstrate the effectiveness of that area. At the end of step one, we had a list of expectations and, for each item identified, a metric developed in conjunction with the customer. That's the exact opposite of a comment card, where the "what to ask" and "how to rate" are set by the service provider. Comment cards assume we know what the customer wants, in the same way the car dealer assumes that if I bought the vehicle, I must be happy.
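To make that concrete, here is a minimal sketch of what step one's output might look like. The division names, expectations and metrics below are purely hypothetical stand-ins, not the actual list we built with our customers.

    # A minimal sketch of step one's output: each customer-stated expectation
    # paired with the metric agreed on with that customer. Every entry is hypothetical.
    expectations = [
        {"customer": "Division A",
         "expectation": "Systems secure, available and accessible to end-users",
         "metric": "Percent uptime during business hours"},
        {"customer": "Division B",
         "expectation": "Dependable project management on application development",
         "metric": "Percent of milestones delivered on the communicated timeline"},
    ]

    for item in expectations:
        print(f'{item["customer"]}: "{item["expectation"]}" -> {item["metric"]}')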

Step 2: Have your customers set the performance targets.

The expectations and measures are the foundations on which you can build your service-level agreements or your management dashboards, but in themselves, they don't help you determine the real levels of customer satisfaction. Setting a performance target or level of service that will make your customer happy requires more discussion.

If a division tells us the most important thing IT does for it is update its Web site, we can set up a metric that measures how quickly the site is updated by counting the time between when a request is made and when the update is posted. Using a simple follow-up question, we can determine a measure of customer satisfaction: "How fast does the update need to take place for you to be extremely satisfied?"

If the answer is one day, we can easily count how many requests are posted within a day. That percentage is your customer satisfaction score. For updates that took more than a day, we know we missed the mark. No survey, comment card or follow-up is needed - just simple up-front communication combined with clear expectations and meaningful measures.
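As a rough illustration of the arithmetic, here is a small sketch, assuming you log when each update request comes in and when the update is posted; the timestamps, field layout and one-day target are all hypothetical.

    from datetime import datetime

    TARGET_SECONDS = 24 * 60 * 60  # the customer-set threshold: one day

    # Hypothetical request log: when each Web site update was requested and when it was posted.
    update_log = [
        ("2024-03-01T09:00", "2024-03-01T15:30"),
        ("2024-03-04T10:00", "2024-03-06T08:00"),
        ("2024-03-05T13:00", "2024-03-05T16:45"),
    ]

    def satisfaction_rate(log, target=TARGET_SECONDS):
        """Percentage of update requests completed within the customer-set target."""
        met = sum(
            1 for requested, posted in log
            if (datetime.fromisoformat(posted) - datetime.fromisoformat(requested)).total_seconds() <= target
        )
        return 100.0 * met / len(log)

    print(f"Customer satisfaction: {satisfaction_rate(update_log):.0f}%")  # 67% for this sample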

Step 3: Improve your processes.

Unlike a comment card, you can use these performance targets to drive process improvement. Let's say the performance target is three hours. You know from the measures that you average two working days. Closing the gap between two days and three hours is the goal. Changes to your processes and procedures can be measured by the time they save and how much closer they bring you to the performance target.
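A minimal sketch of that gap tracking might look like the following, assuming the three-hour target from the example and hypothetical turnaround samples taken before and after a process change.

    # Track the gap between average turnaround and a customer-set performance target.
    # The target and the sample turnaround times (in working hours) are hypothetical.
    TARGET_HOURS = 3

    before_change = [16, 20, 14, 18]   # turnaround times before a process change
    after_change = [9, 7, 11, 8]       # turnaround times after the change

    def gap_to_target(times, target=TARGET_HOURS):
        average = sum(times) / len(times)
        return average - target

    print(f"Gap before: {gap_to_target(before_change):.1f} hours over target")
    print(f"Gap after:  {gap_to_target(after_change):.1f} hours over target")
    # The hours a change saves show up directly as a smaller gap to the target.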

You can see how improvements affect your customer satisfaction in a very real way - and a way that's different from looking at a comment card and trying to guess why you went from 4.3 to 4.6 this quarter. This approach changes the entire mindset of customer focus. We move away from asking, "Are you satisfied with what I'm giving you?" to saying, "I understand what you need or want, and we're striving to provide it."

Step 4: Lather, rinse, repeat.

Customer expectations change over time. About every three to five years, it's essential to review the customer satisfaction measures, and make sure you're still delivering the right service the right way and meeting your business customers' needs.

Defining the process in four easy steps isn't intended to give you the impression that customer satisfaction measurement is easy. Internally we struggled with developing measures that were meaningful to our customers. Sometimes what's most meaningful to our customers is very difficult for us to measure.

The availability of an application depends on at least four major factors: the software itself, the performance of the platform (servers and desktops) it runs on, the availability of the network it uses, and the end-user utilizing the application. With a customer satisfaction target of 100 percent availability during work hours, I know disruption in any of those areas can result in application downtime for the end-user. Measuring each of the four factors as a stand-alone system is relatively easy. Measuring them all as they work in harmony to bring the application to the end-user is a challenge worthy of another article altogether.
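One common simplification - and it is only an assumption on my part, not how we actually measure it - is to treat the four factors as independent links in a chain, so end-to-end availability is roughly the product of the individual availabilities. A rough sketch with hypothetical numbers:

    # Assume the four factors act as independent links in a chain; end-to-end
    # availability is then approximately the product of the pieces. Sample numbers are hypothetical.
    availability = {
        "application software": 0.999,
        "platform (servers and desktops)": 0.995,
        "network": 0.998,
        "end-user environment": 0.990,
    }

    end_to_end = 1.0
    for factor, uptime in availability.items():
        end_to_end *= uptime

    print(f"Approximate end-to-end availability: {end_to_end:.3%}")
    # Roughly 98.2% here - each factor can look fine on its own while the combined
    # experience still falls short of a 100 percent work-hours target.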

After the measures are set, the benefit of objectively measuring improvements, and the confidence of knowing what you must do to meet customers' needs and what you must do to proactively exceed their expectations, are rewards worth all the work - rewards I'm betting you'll enjoy twice as much as scoring a 4.8 on that subjective survey.

So take my challenge: First complete as many surveys as you can, then put the surveys away and actually talk to your customers. You will find it to be a much better measurement method.