Ken Auerbach from NYU is leading a conversation about help desk metrics.
What kinds of things are meaningful to know about the help desk?
Bill says that time to resolution of help desk tickets has a strong correlation to the perception of organizational quality.
Joel says that reporting what kinds of issues get passed to second- and third-level support, and how many, is of interest. Ken thinks that the second level needs to know what the first level solved.
I said that real-time information on what kinds of things are coming in to the desk is important. Kitty says that using help desk requests to understand impacts of changes in services is of interest.
Paul Hill points out in the back channel this service at MIT that gives some real-time information on service availability: http://3down.mit.edu/3down/index.php.
Greg shows this service at Chicago, which is also available as an rss feed: http://hp-announce.uchicago.edu/archive.php?areaID=30&listType=current.
Shel responds with Berkeley’s site, which is hosted offsite in the event of local failure: http://ucbsystems.org/.
Steven chimes in with Princeton’s version: http://helpdesk.princeton.edu/outages/list.plx.
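None of these status pages advertises a documented API, but a feed like Chicago's could be polled programmatically. A minimal sketch in Python, assuming only the generic RSS 2.0 layout (any site-specific fields would be assumptions):

```python
import xml.etree.ElementTree as ET

# Sketch: parse an outage-announcement RSS feed into (title, pubDate) pairs.
# This assumes the generic RSS 2.0 structure (channel/item/title/pubDate),
# not any documented behavior of the services linked above.

SAMPLE = """<rss version="2.0"><channel>
  <title>Service Announcements</title>
  <item><title>Email outage resolved</title>
        <pubDate>Mon, 01 May 2006 09:00:00 GMT</pubDate></item>
</channel></rss>"""

def parse_outages(rss_xml: str) -> list[tuple[str, str]]:
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        pub = item.findtext("pubDate", default="")
        items.append((title, pub))
    return items

if __name__ == "__main__":
    print(parse_outages(SAMPLE))
```

In practice the XML would come from fetching the feed URL rather than a hard-coded sample string.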
What are the kinds of things we want to know about our services? (from NYU):
Categories such as:
- Performance: availability of a particular service, mean time between failures, mean time to repair
- Utilization: who, when, and for what
- Satisfaction
- Costing
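The performance measures above can be computed directly from incident records. A minimal sketch, assuming each incident is a hypothetical (start, end) pair in hours within an observation window:

```python
# Sketch of the performance metrics named above (MTBF, MTTR, availability),
# computed from hypothetical incident records. Each incident is a
# (start_hour, end_hour) pair inside the observation window.

def mttr(incidents):
    """Mean time to repair: average incident duration."""
    return sum(end - start for start, end in incidents) / len(incidents)

def mtbf(incidents, window_hours):
    """Mean time between failures: total uptime divided by failure count."""
    downtime = sum(end - start for start, end in incidents)
    return (window_hours - downtime) / len(incidents)

def availability(incidents, window_hours):
    """Fraction of the window the service was up."""
    downtime = sum(end - start for start, end in incidents)
    return 1 - downtime / window_hours

incidents = [(10, 12), (50, 51), (90, 93)]  # hypothetical outages
print(mttr(incidents))              # average outage length in hours
print(mtbf(incidents, 168))         # over a one-week (168-hour) window
print(availability(incidents, 168))
```

Real reporting would of course pull these tuples from the ticketing system rather than a hard-coded list.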
Ken says that metrics have to tell a story: correlating the numbers with specific events and contexts (e.g., XP released, the Blaster worm, machine registration improved).
Karen from CMU says that they often do lightweight benchmarking with other institutions. She asks whether we should have a designated contact for IT benchmarking at each institution, or whether it has to go through the CIOs.
Shel notes that there’s a lot of work in normalizing data, though ITIL helps with some of that. Bill and Jerry agree that the Stanford/MIT benchmarking work was not at all lightweight, but that it had significant impact, changing the way they ran the help desk and their client surveys. It took them months to agree on data definitions, but that’s where the payoff lies.