OPERATIONALIZING SERVICE QUALITY: CUSTOMERS’ PERSPECTIVE

Linda Boardman Liu, Simmons College School of Management 300 The Fenway, Boston, MA 02115 617-521-2412 / [email protected]

ABSTRACT

This research examines the relationships between technical quality, functional quality, and overall quality in the service operations environment. We capture individual customer assessments of service quality both in terms of what activities and measures are important to service quality and in terms of a specific assessment of a particular service encounter. Our findings provide insights into how to manage service quality.

Introduction & Literature Review Customers select services that will both satisfy their needs and provide a pleasant experience. Service companies try to do a good job for their customers yet customers are frequently disappointed and companies often do not understand why. For services, there are many approaches to defining quality but common to all is that the concept of quality is important as service quality is clearly linked to customer satisfaction (examples include [2], [1], or [6]). Harvey [5] differentiated between quality of results and quality of process. He proposed that results are the objective, technical aspect of quality and that the process quality includes four of the Parasuraman, Berry & Zeithaml [7] dimensions: empathy, responsiveness, assurance, and tangibles. Grönroos [3] described a dichotomized service quality framework consisting of technical and functional quality. Grönroos connected these two dimensions together as “experienced” quality. More recently, Grönroos wondered if he should have called it “technical and functional features” which supports the idea that technical and functional quality are distinct from each other and contribute each in their own way to overall quality [4]. Recognizing that service quality is a critical goal of the service delivery system, managing the system to deliver this quality experience is a critical, if not the critical, operations objective. What specific aspects of the service delivery system actually result in service quality continues to be elusive. If service quality is only assessed through customer perception, then each specific service encounter will have to be unique, as each customer is unique. But managing a system that is completely customized to each customer’s expectations is simply not viable for most service providers. Therefore, the service process needs to be designed and managed to achieve specific, directly measured elements that will result in the indirectly measured, customer perceived service quality. Understanding customers’ expectations is challenging and each customer may have different expectations – from other customers, and even from encounter to encounter. Managing to an average level of quality satisfies no one specifically, and setting service quality perception goals that are “the best” may be impossible to achieve, challenging to manage, and prohibitive from both a cost and process perspective.

This research examines the relationships between technical quality, functional quality, and overall quality in the service operations environment. We capture individual customer assessments of service quality both in terms of what activities and measures are important to service quality and in terms of a specific assessment of a particular service encounter. Our findings provide insights into how to manage service quality.

Research Setting

This research is set in the call center environment, a mass service process type in Schmenner’s classification [8]. Call centers are deployed throughout the world as a cost-effective way of enabling customer-company interactions. A call center provides front-line contact with customers. An inbound call center is generally accessed by a customer initiating a call to the company. The call is routed to a group of agents, usually with the help of a call distribution platform and/or voice response unit, so that the caller reaches the ‘right’ agent, depending on the firm’s choice of segmentation.

Survey Data Collection

We conducted a survey to explore the operational definition of service quality via the customers’ conceptualization of service quality. We partnered with an organization that provided customer-level contact information immediately after a service encounter. We collected customer-response data related to the specific service encounter via an online survey distributed within four days after the service encounter. The data set includes the customer’s perception of technical quality, functional quality, and overall service quality. The data from this study were analyzed to identify relationships between and among the quality constructs.

Survey Respondent Demographics

Two hundred thirty-one individuals provided contact information to participate in the survey. Of the 207 emailed surveys, 53 were completed. Of the 24 telephone contacts attempted, 10 contacts were made, and all 10 surveys were completed. We compared the data from the two respondent groups, online and telephone, using t-tests to determine whether there was a difference between the groups. No significant difference was found between the two groups’ responses; therefore, we combined the groups into one data set. With only one touch allowed for data collection, we obtained a 30.4% response rate.

We initially asked the respondents an open-ended question to solicit general ideas and concepts relative to a quality call center experience. Respondents were asked, “When you call a company and have an interaction with a service representative, what makes that encounter a quality experience? List up to three things that happen during a call that make you think a company provides high quality service.” Respondents provided 174 individual statements, with 61 of the 63 respondents providing at least one statement. We used open coding to analyze and reduce these statements (Table 1). Four dominant themes emerge from these statements. First, customers expect the agent to have a certain kind of attitude. Second, customers expect that the agent will be capable of resolving the issue. Third, customers expect that the encounter will be expedient. Finally, they expect the agent to be genuine. Although not a dominant theme, 10 respondents specifically stated that the call needed to be answered by a “live person” or “human.”

Table 1 – Customer Expectations

Expectation   Key words                                                                                      # Responses
Attitude      Friendly… courteous… polite… tone… attitude… helpful… easy… interested… pleasant                    51
Resolve       Knowledge… answered question… information available… get answer… resolves issue… competence        46
Expedient     Speed… wait time… efficient… hold time                                                             31
Genuine       Listen… understanding… empathy… unscripted                                                         27
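As an illustration of the pooling check described under Survey Respondent Demographics, the following minimal sketch compares the online and telephone groups with an independent-samples t-test. The rating vectors and the SciPy-based approach are illustrative assumptions; they are not the study’s actual data or code.

```python
# Minimal sketch (not the authors' code): checking whether the online and
# telephone respondent groups differ before pooling them.
# The rating vectors below are hypothetical placeholders, not study data.
from scipy import stats

online_ratings = [5, 6, 7, 6, 5, 7, 6, 6, 7, 5]      # hypothetical 7-point ratings
telephone_ratings = [6, 6, 7, 5, 6, 7, 6, 5, 7, 6]   # hypothetical 7-point ratings

t_stat, p_value = stats.ttest_ind(online_ratings, telephone_ratings, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# If p exceeds the chosen alpha (e.g., 0.05), the two groups are treated as
# indistinguishable and combined into a single data set, as in the study.
```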

Customer Evaluation: Measures & Experiences

Survey participants were presented a list of typical call center measurements and asked to indicate how important these measures were relative to service quality on a 7-point scale, ranging from “very unimportant” to “very important.” See Table 2 for the average rating and standard deviation for each measure. Customers believe that low wait time is important for service quality, measured as service level (“most of the time when you call, the call is answered quickly”), wait time (“how long you wait until the CSR answers the phone”), queue length (“how long you wait until you hang up and call later”), and access (“whether you get a busy signal”). In addition, customers believe that first call resolution (“whether your needs are met on your first call”) is also important relative to service quality. From the customer perspective, these five measures are statistically equal. Following these measures, customers perceive the amount of time spent on the phone as being important to service quality (mean = 5.63). Of the measures evaluated, customers rank call monitoring as the least important measure relative to service quality (mean = 5.00). Of note, all of the measures are relatively important from the customer perspective, with 5.00 being the lowest mean on the 1–7 scale.

Table 2 – Customer Importance Rating of Call Center Measures

Measure                                                                                       Importance Rating (1–7)   Standard Deviation
How long you wait until the customer service representative answers the phone.                        6.40                   1.056
Whether your needs are met on your first phone call.                                                  6.38                   1.113
The total time you are on the phone interacting with the customer service representative.             5.63                   1.299
Whether you get a busy signal when you call the company.                                              6.29                   1.122
How long you wait for someone to answer before you hang up and call again another time.               6.37                   0.927
Most of the time when you call, the call is answered quickly.                                         6.43                   0.979
Whether the company has someone monitoring calls to ensure quality.                                   5.00                   1.558
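The observation that the top five measures are statistically equal can be illustrated with a comparison built from the summary statistics in Table 2. The sketch below uses SciPy’s summary-statistics t-test; it assumes n = 63 for both measures and treats them as independent samples, whereas the actual responses were paired within respondents, so this is only an approximation of the study’s analysis.

```python
# Minimal sketch (an approximation, not the authors' analysis): compare two of
# the Table 2 importance ratings using only their reported means and standard
# deviations. Assumes n = 63 responses per measure and independence between
# measures, which ignores the within-respondent pairing.
from scipy import stats

n = 63  # assumed number of respondents per measure

# "Call is answered quickly" (6.43, 0.979) vs. "needs met on first call" (6.38, 1.113)
t_stat, p_value = stats.ttest_ind_from_stats(
    mean1=6.43, std1=0.979, nobs1=n,
    mean2=6.38, std2=1.113, nobs2=n,
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value here is consistent with the statement that these measures are
# rated as equally important.
```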

Customers were asked to identify the nature of the relationship between these call center measures and service quality. Table 3 presents the response frequencies for each measure, in order of highest consistency. There is high consistency in the perception that having to wait to reach a CSR is related to low service quality, as is having to call back. Interestingly, the perceived relationship between actual call duration and service quality is mixed: a majority (33 of 57 respondents) indicate that longer encounters increase service quality. Given the emphasis on speed, this is an interesting finding.

Table 3 – Call Center Measures’ Relationship to Service Quality

Description                                                                                                            Up   Down
If most of the time when you call the call is answered quickly, service quality goes ...                              59      1
When you wait longer for someone to answer before you hang up and call again another time, service quality goes ...    1     59
When you have to call back to get your needs satisfied, service quality goes ...                                       2     59
If you get a busy signal when you call the company, service quality goes ...                                           2     57
When you wait longer for a customer service representative to answer the phone, service quality goes ...               3     57
If the company has someone monitoring calls to ensure quality, service quality goes ...                               53      4
When you are on the phone longer interacting with the customer service representative, service quality goes ...       33     24
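To gauge how mixed the call-duration perception in Table 3 is, a simple sign test can be applied to the 33 “up” versus 24 “down” split. This test is not part of the original analysis; it is a sketch using SciPy’s exact binomial test.

```python
# Minimal sketch (not part of the original analysis): a sign test on the Table 3
# split for call duration: 33 respondents said longer calls raise service
# quality, 24 said they lower it.
from scipy.stats import binomtest

result = binomtest(k=33, n=33 + 24, p=0.5, alternative="two-sided")
print(f"p = {result.pvalue:.3f}")

# A p-value well above 0.05 would indicate that the 33/24 majority is not a
# reliable one, underscoring how mixed the perception of call duration is.
```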

Participants were asked to evaluate the importance of various experiences, identified as being part of a quality service encounter, relative to service quality using a 7-point scale ranging from “very unimportant” to “very important.” Table 4 presents the average ratings. Most important to these customers is that the customer service representative is professional and knowledgeable. Also important are that the representative resolves the inquiry in one call, answers the call quickly, works without technology issues, and does not transfer the customer.

Table 4 – Call Center Experiences’ Relationship to Service Quality

The customer service representative…                                                        Importance Rating (1–7)   Standard Deviation
is easy to understand, polite, professional, courteous & friendly.                                   6.92                  0.272
has technology that supports calls, and system issues don’t slow down the process.                   6.60                  0.610
is knowledgeable, has all the information needed and is able to answer your questions.               6.89                  0.317
does not transfer you, or transfers you only once to someone who answers your question.              6.32                  0.839
resolves your issue in one call, the first time you call.                                            6.76                  0.465
answers the call quickly, promptly, in a reasonable amount of time.                                  6.73                  0.545

Technical Quality, Functional Quality, Overall Quality

The last set of survey questions pertained to the specific service encounter recently completed. Respondents were asked to evaluate their experience with this particular company in terms of service quality using a 5-point semantic differential scale, defined as 1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, and 5 = Excellent. This scale was used in order to be consistent with other surveys deployed by the focal organization. These customers rate the overall quality, technical quality (the service being done correctly), and functional quality (a satisfying interaction) very high (Table 5). We compared these average ratings using t-tests and found no statistical difference between the ratings. We conducted exploratory factor analysis on these ratings to determine whether the three constructs were evaluating different aspects of service quality. Using the three variables identified (overall quality, technical quality, and functional quality), we found that they loaded onto a single factor that explained 86.94% of the variation inherent in the elements (Table 6).

Table 5 – Evaluation of Overall Quality, Technical Quality, and Functional Quality

How would you rate this company on…                                             Average Rating (1–5)   Standard Deviation
The overall quality of your recent call?                                                4.79                  0.410
The quality of the service being done correctly during your recent call?                4.81                  0.398
The quality of your interaction being satisfying during your recent call?               4.79                  0.410

Table 6 – Exploratory Factor Analysis of Customer Service Quality Evaluation

Total Variance Explained
                 Initial Eigenvalues                        Extraction Sums of Squared Loadings
Component    Total   % of Variance   Cumulative %       Total   % of Variance   Cumulative %
1            2.608   86.942          86.942             2.608   86.942          86.942
2            0.292    9.733          96.675
3            0.100    3.325          100.000

Component Matrix (a)
                           Component 1
Overall Quality               0.914
Done Correctly                0.967
Interaction Satisfying        0.914

a. 1 component extracted. Extraction Method: Principal Component Analysis.
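A minimal sketch of the kind of principal component extraction reported in Table 6 is shown below, using scikit-learn. The small response matrix is a hypothetical placeholder standing in for the 63 survey responses, so the output will not reproduce the reported eigenvalues or loadings exactly; it only illustrates how a dominant single component can be identified.

```python
# Minimal sketch (not the authors' code): principal component analysis of the
# three quality ratings, mirroring the single-factor extraction in Table 6.
# The response matrix is a hypothetical placeholder, not the study's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Columns: overall quality, done correctly, interaction satisfying (1-5 scale).
ratings = np.array([
    [5, 5, 5],
    [4, 5, 4],
    [5, 5, 5],
    [4, 4, 4],
    [5, 5, 5],
    [3, 4, 3],
])

z = StandardScaler().fit_transform(ratings)   # standardize, i.e., PCA on the correlation matrix
pca = PCA(n_components=3).fit(z)

print(pca.explained_variance_ratio_)          # a dominant first ratio suggests a single factor
loadings = pca.components_[0] * np.sqrt(pca.explained_variance_[0])
print(loadings)                               # loadings of each rating on component 1
```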

DISCUSSION

Customers have clear ideas about what kinds of experiences should occur in a service encounter. We coded these expectations into four themes: provider attitude, resolving the issue, expedience, and genuine affect. Attitude, expedience, and affect align nicely with the broad concept of functional quality, defined as providing a positive interaction, or how the service is provided. Resolving the issue is another way of saying meeting the output performance requirements: technical quality. Customers identify wait time measures and first call resolution as equally important measures of service quality, without prioritizing either technical or functional quality above the other. How these various measures affect service quality was consistent with our expectations, with one exception: average handle time. The value-added nature of being directly engaged with the service provider to resolve the issue seems to eliminate the need for speed, as long as the issue is resolved through that single encounter. With a majority of the respondents indicating that longer interactions with the service provider actually increase service quality, a focus on resolution rather than call duration is appropriate for this process type. Our exploratory factor analysis established that, from the customer perspective, technical quality, functional quality, and overall service quality are indistinguishable from one another.

REFERENCES

[1] Aga, M. and O. V. Safakli (2007). "An Empirical Investigation of Service Quality and Customer Satisfaction in Professional Accounting Firms: Evidence from North Cyprus." Problems and Perspectives in Management 5(3): 84-98.
[2] Chen, C.-N. and S.-C. Ting (2002). "A Study Using the Grey System Theory to Evaluate the Importance of Various Quality Factors." The International Journal of Quality and Reliability Management 19(6/7): 838-860.
[3] Grönroos, C. (1988). "Service Quality: The Six Criteria of Good Perceived Service Quality." Review of Business 9(3): 10-13.
[4] Grönroos, C. (2001). "The Perceived Service Quality Concept - A Mistake?" Managing Service Quality 11(3): 150-152.
[5] Harvey, J. (1998). "Service Quality: A Tutorial." Journal of Operations Management 16: 583-597.
[6] Hume, M., G. Sullivan Mort, et al. (2006). "Understanding Service Experience in Non-profit Performing Arts: Implications for Operations and Service Management." Journal of Operations Management 24: 304-324.
[7] Parasuraman, A., V. A. Zeithaml, and L. L. Berry (1988). "SERVQUAL: A Multiple-Item Scale for Measuring Consumer Perceptions of Service Quality." Journal of Retailing 64(1): 12-40.
[8] Schmenner, R. (1986). "How Can Service Businesses Survive and Prosper?" Sloan Management Review 27(3): 21-33.
