Is there a benchmark for IT support?

As head of research at the Service Desk Institute, the questions I’m asked most often are not “How do I motivate or retain staff?” or “How can I improve my service desk’s processes?”


The questions I’m asked most often are: “What are the industry average call answering times?”, “How many incidents will 1,000 end users log every month?” and “How do I know whether our 10 per cent call abandon rate is above or below the industry average?”

In a way, perhaps it is not surprising that I am asked these questions so often. Inherent in all of us is a desire to understand how good we are; we all compare and contrast ourselves, and our organisations, with others.
    
The craving to compare and contrast is not unique to the IT support industry, but I feel that the desire burns even stronger within IT support organisations because they can sometimes feel isolated and siloed, lacking the way markers and indicators that would assure them they are indeed doing a good job (one that is at least comparable to other organisations within the same sector). Another reason for this burning desire is that service desks attract and collect huge volumes of data: every interaction is recorded, so the urge to use that data to better understand your service desk’s performance is completely logical. The data is all there to be used, so why not use it to see how good your service desk really is?

Unfortunately, direct comparisons between service desks and support organisations are virtually impossible, for numerous reasons that I will expand on below. That is not to say that benchmarking is a futile exercise, but it is one that should be approached with caution and a healthy degree of scepticism.

An alternative solution, and one that can really help to drive and improve your service desk’s performance, is to track and trend your own service desk’s metrics over a period of time (I refer to this as internal benchmarking; comparing benchmarks with other service desks is external benchmarking). Internal benchmarking is an incredibly powerful tool and, combined with elements of selective quantitative external benchmarking and qualitative work, can provide a true picture of your service desk and its performance.

Why 65.879 per cent of statistics are wrong
The above figure is completely facetious, but it does demonstrate an important point about the emphasis and credibility we place on statistics. Whilst it is not my intention to start a debate on how statistics should be measured, recorded and utilised, it is important to understand that when comparing statistics in the IT support sphere, you will most likely be comparing apples with oranges. This means it is dangerous to draw direct comparisons with other service desks based on metrics alone. I’ll use this real-life example (I was asked this exact question this morning) to demonstrate what I mean.

The question directed to me was: “What’s the industry average for how many calls we should expect from 1,000 end users on a weekly and monthly basis?” This appears to be quite a reasonable question; indeed, I’m sure many service desks would love to know the answer, as it would aid immeasurably in their planning and resource allocation (amongst other things). Being able to plan in advance would be a definite boon for Service Desk Managers and Team Leaders, and it would also let you know how busy your service desk is compared to others in the industry.
    
Unfortunately, an industry average measure for calls received does not exist. Some of the reasons for this are:

End users’ technical skills differ between organisations (consider the fallacy of comparing data from a technology company with data from one where it has been a struggle to stop people using pen and paper). The better end users’ technical knowledge and skills, the less likely they are to log calls.

Service desks are better regarded in some organisations than in others – if end users don’t like contacting the service desk, that organisation will have a lower support call volume.

Hardware and software are likely to be very different across organisations – is it fair to compare the call volumes of an organisation with brand new tools and technology to one that is working on 10-year-old PCs? And if so, should we take this into consideration in our calculations?

And many others besides: staff turnover rate (do lots of new starters generate high call volumes?); how easy it is to contact the service desk; whether calls are logged in the same way, or support calls are differentiated from calls for contact details; and the complexity of the calls – are they taking a long time to fix, so that your lines are always engaged?

Dangerous comparisons
Above are just a few examples of why direct comparisons of metrics and KPIs are dangerous: it is very likely that we’re not comparing apples with apples. Taken as gospel, any industry average for the number of support calls could be very damaging. You might see it as a good sign that you receive fewer calls than other service desks, but what if the reason is that your end users have such a low opinion of your service that they see you as a last resort? On the other hand, perhaps your call volume is much higher than the average – does this mean you need to provide end users with extra training or implement a self-help solution? These conclusions could well be valid, but they should not be prompted by the misleading information that industry benchmarks can create – the true source of improvement lies within ourselves and our own support organisations.

That being said, industry benchmarks do still have their place in the support industry. For anyone who is starting to get to grips with measurement, they provide a useful indicator of what you should be measuring and offer some ballpark figures. Knowing what metrics other service desks are utilising can point you towards some important measures that you had not hitherto considered. Used with caution, and with careful consideration afforded to the fallibility of industry averages, industry benchmarks have their value and their place, but they are not a sound basis for decision making or improvement initiatives. For those, we need to look a little closer to home.

Internal benchmarking
The best way to benchmark your service desk is to benchmark internally. This may seem like a worthless endeavour – how can you know how good you are if you are only competing against yourself? The answer is that you compete against your previous performance, pushing and driving to improve results month on month and year on year. To do this, take a snapshot of all of your metrics and KPIs (call waiting times, resolution times and so on) and record each as an average for the last month. Then do the same next month, the month after, and so on. Very soon you’ll have trending data, and after a year you will have some pretty comprehensive results. Compare these measures to a year ago and they will tell you in what direction your desk is heading; you are then in a position to set goals and targets and to check that the data is trending towards them. The very best service desks have been doing this for a long time, and it has aided immeasurably in their ability to identify improvements, ensure that they have sufficient resources, and build business cases for additional expenditure.
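The monthly snapshot-and-compare routine described above can be sketched in a few lines of code. This is a minimal illustration only: the metric names and figures are made up for the example, not drawn from any real service desk.

```python
# A minimal sketch of internal benchmarking: keep one snapshot of KPI
# averages per month, then compare the latest month against the oldest
# to see which direction each metric is trending. All names and numbers
# here are illustrative assumptions, not real industry data.

def year_on_year_change(history, metric):
    """Percentage change in `metric` between the first and last snapshot.

    `history` is a list of monthly snapshots, oldest first; each snapshot
    is a dict mapping a KPI name to that month's average value.
    """
    first, last = history[0][metric], history[-1][metric]
    return round((last - first) / first * 100, 1)

# Thirteen months of invented snapshots: average call-wait in seconds
# (falling each month) and first-contact resolution rate (rising).
history = [
    {"avg_wait_seconds": 45 - m, "resolution_rate": 0.70 + 0.01 * m}
    for m in range(13)
]

print(year_on_year_change(history, "avg_wait_seconds"))  # negative = waits shrinking
print(year_on_year_change(history, "resolution_rate"))   # positive = resolution improving
```

Even a simple spreadsheet achieves the same thing; the point is that the comparison is against your own history, under your own conditions, rather than against an industry average gathered under unknown ones.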
    
On the subject of internal benchmarking, it is essential that service desks include qualitative measures to complement the quantitative ones. Useful qualitative work includes interviewing customers to gauge their thoughts and opinions concerning the service desk, and going out into the organisation’s end user population (floor-walking is a good way to do this) to understand more about how IT is used. What are end users’ concerns or complaints? How could IT be improved? Are there any identifiable ways that the service desk can use its knowledge to improve the way that people work? When qualitative work is combined with internal benchmarking, it creates a very solid foundation for continual service improvement (CSI) initiatives to help push the service desk to the next level.

Conclusion
Benchmarking is a vital part of service delivery, but it is important that it is used in the right way. Benchmarking against industry standards has value, but much greater returns will be seen when benchmarking against your own standards. If you have recently implemented a new change management process, you will be able to see whether it has improved service by examining your metrics and looking at the trend of your data. If metrics have improved as a result of the processes you have implemented, then celebrate these achievements with your organisation – it’s the only way it will truly know that you are committed to improving service for its employees and increasing productivity.

Further information
www.servicedeskinstitute.com

About the Author
Dr Daniel Wood is an experienced analyst on all matters concerning IT business management and is committed to helping support organisations realise their full potential through the sharing of ideas, knowledge and best practice. In his capacity as a best practice auditor, Daniel has worked with some of the UK’s largest public sector bodies and has advised leading blue-chip organisations on driving business improvements through better use and understanding of the potential of IT. Daniel is currently Head of Research and Publications at the Service Desk Institute, Europe’s largest IT service and support organisation.

