Part 2: The contact centre's relationship with callers

Performance of the contact centre for Work and Income.

2.1
In this Part we:

  • assess access for callers, including the contact centre’s performance against its service level and abandonment targets;
  • evaluate whether the contact centre’s service level target is measured and reported appropriately;
  • examine how the contact centre records and responds to the needs of clients; and
  • describe Service Express, the contact centre’s automated self-service facility.

Access for callers

2.2
To evaluate access for callers, we:

  • measured the contact centre’s performance against its service level targets; and
  • assessed the practical implications of our findings for callers.

2.3
We also evaluated the contact centre’s service level targets and reporting to see whether:

  • the contact centre’s service level target is an appropriate indicator of the contact centre’s call-answering performance; and
  • the contact centre’s performance against its service level target is measured and reported in the most appropriate way.

2.4
Our analysis used:

  • monthly service levels for the 2003/04, 2004/05, and 2005/06 years; and
  • detailed data on call answering times for the 12 weeks from 28 January to 22 April 2006.

Main findings

2.5
The contact centre met its monthly service level target1 (answering a certain percentage of calls within a set period of time) and abandonment target (having no more than a set percentage of callers give up on waiting for their call to be answered) for six months of the 2005/06 year. The periods when the contact centre did not meet its targets coincided with the introduction of new processes and technology, which caused difficulties.

2.6
We found that, even in months when the contact centre met its monthly targets, there were fluctuations in weekly and daily service levels. The contact centre’s practice of reporting to the Work and Income Executive on an average monthly service level means that these fluctuations are not explicitly visible.

Service levels

2.7
Answering every call immediately is not practical in most contact centres. An immediate answer for every caller would require some CSRs to be free and waiting to take calls at all times. This is not usually possible or desirable, given the resources that would be required to maintain this level of staffing.

2.8
Contact centres therefore define a target “service level” – that is, the percentage of calls that they aim to answer within a specified waiting time for the caller (expressed in seconds). For example, a target service level of 75/15 means that the contact centre aims to answer 75% of calls within 15 seconds.
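
The calculation behind a service level figure is straightforward. The sketch below shows it in Python for the 75/15 example above; the function name and answer times are illustrative, not part of Work and Income’s systems.

```python
def service_level(answer_times_seconds, threshold=15):
    """Percentage of answered calls picked up within `threshold` seconds."""
    if not answer_times_seconds:
        return 0.0
    within = sum(1 for t in answer_times_seconds if t <= threshold)
    return 100 * within / len(answer_times_seconds)

# A 75/15 target is met if at least 75% of calls are answered within 15 seconds.
times = [5, 10, 12, 14, 18, 3, 9, 25]  # illustrative answer times, in seconds
level = service_level(times, threshold=15)
print(f"{level:.0f}% of calls answered within 15 seconds")
```

With these illustrative times, six of the eight calls are answered within 15 seconds, so the 75/15 target is exactly met.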

2.9
Service level is a crucial measure of accessibility – how easy it is for callers to have their calls answered.

2.10
The pattern of incoming calls is commonly subject to random variations, making it difficult to forecast call volumes accurately or meet a service level target set for short periods. It is therefore realistic to accept that service levels will fluctuate, and sometimes be below the target service level. However, if the service level is below target for extended periods, many callers will experience longer waiting times, and will continue to experience those waiting times even if they abandon their telephone call and ring back later.

Work and Income’s service level targets

2.11
The contact centre’s monthly service level target is to answer 80% of incoming calls within 20 seconds (an 80/20 service level). However, while some callers will have to wait to be answered, no calls should receive a busy signal or be cut off.

2.12
The General Manager Contact Centres reports the monthly service level performance to the Work and Income Executive. The average service level for the year is reported to the Minister for Social Development and Employment in the Ministry’s annual report.

2.13
The contact centre’s target for abandonment – the percentage of callers who hang up before reaching a CSR – is to have no more than 5% of callers abandon their calls.

The contact centre’s performance against its targets

2.14
We examined the contact centre’s monthly service level for the years 2003/04, 2004/05, and 2005/06.

2.15
The contact centre’s performance in meeting its monthly service level target has improved steadily in the last three years.

2.16
In the 2005/06 year, the contact centre met its monthly service level target in six out of 12 months (see Figure 3). This was a substantial improvement on previous years’ results.

Figure 3
The contact centre’s monthly service levels and abandonment rates for the 2005/06 year


Month | Service level (% of incoming calls answered within 20 seconds) | Abandonment rate (% of callers who hung up before the call was answered)
July 2005 | 55% | 9%
August 2005 | 87% | 2%
September 2005 | 94% | 1%
October 2005 | 75% | 5%
November 2005 | 91% | 2%
December 2005 | 91% | 1%
January 2006 | 77% | 8%
February 2006 | 39% | 16%
March 2006 | 59% | 8%
April 2006 | 73% | 7%
May 2006 | 86% | 2%
June 2006 | 81% | 4%
Yearly average | 76% | 6%

Source: Work and Income.

2.17
The contact centre did not meet or exceed its monthly service level target (of answering 80% of calls within 20 seconds, or 80/20) in July 2005, October 2005, and the four months January to April 2006.

2.18
These periods of low service levels coincided with the contact centre introducing new processes and technology. Contact centre management told us that introducing these new processes and technology caused the average call handling time (that is, the amount of time a CSR will take, on average, to handle a call, including time to complete work after the call) to rise. Because CSRs were taking longer with each call, the waiting times for other callers were longer.
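
The capacity effect described above can be illustrated with simple arithmetic; the handling times below are invented for illustration, not the contact centre’s actual figures.

```python
# A CSR's hourly capacity is the hour divided by the average call handling time
# (talk time plus after-call work). A rise in handling time directly reduces
# how many calls each CSR can take, so the same incoming volume produces
# growing queues and longer waits for other callers.
def calls_per_hour(avg_handle_seconds):
    return 3600 / avg_handle_seconds

print(calls_per_hour(300))  # 300-second average handling time: 12.0 calls/hour
print(calls_per_hour(360))  # 360-second average handling time: 10.0 calls/hour
```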

2.19
Although we were unable to identify all possible contributing causes or assess their effect, our own examinations and discussions confirmed that performance was affected by special factors in these periods.

Abandonment

2.20
The number of abandoned calls is often linked to the service level – a high service level will usually result in fewer callers hanging up before their call is answered. However, many factors outside the control of the contact centre may also influence a caller’s decision to abandon their call. Abandonment is of some value in assessing accessibility, but should not be used as a measure on its own.

2.21
In the 2005/06 financial year, the contact centre met its target abandonment rate (of lower than 5%) for six out of the 12 months. The abandonment rate was 5% or higher for the months that the contact centre did not achieve its target service level.

Assessing the relevance of the contact centre’s service level target

2.22
To evaluate the contact centre’s service level target and reporting, we wanted to answer two questions:

  • Is the contact centre’s service level target an appropriate indicator of its call answering performance?
  • Is the contact centre’s performance against its service level measured and reported in the most appropriate way?

Measuring the percentage of calls answered within 20 seconds

2.23
To assess whether this measure is a good indicator of the contact centre’s performance in answering calls, we looked at what was happening to the calls the contact centre did not answer within the 20-second target.

2.24
We analysed call answering times for the 12 weeks from 28 January to 22 April 2006.2

2.25
The contact centre provided data for this period that showed the number of calls answered within 20 seconds, within 30 seconds, and within 60 seconds.

Figure 4
Call answer time distribution, 28 January to 22 April 2006

2.26
Figure 4 shows the number of calls answered within 20 seconds, between 20 and 30 seconds, between 30 and 60 seconds, and after 60 seconds, by the time of day. It also shows abandoned calls. For example, the section of the graph between 0700 and 0730 represents all the calls made in that half hour during the 12-week period.

2.27
The darker green area on this graph represents calls answered within 20 seconds. The blue areas represent calls answered after 20 seconds but before 60 seconds. The pale green area shows the calls answered after 60 seconds. The black area shows abandoned calls.

2.28
Only a small percentage of calls were answered in the period between 20 and 60 seconds. Therefore, callers who were not answered within the target time of 20 seconds were likely to be waiting for longer than a minute. Other analysis we undertook showed that the abandonment rate was usually well over the 5% target in periods where many callers had to wait a minute or more.
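
The banding used in Figure 4 can be reproduced with a simple bucketing function. This is a sketch only; the sample data below is illustrative, not the audit dataset.

```python
def band_answer_times(calls):
    """Group call outcomes into the answer-time bands used in Figure 4.

    `calls` is a list of answer times in seconds, with None for abandoned calls.
    """
    bands = {"<=20s": 0, "20-30s": 0, "30-60s": 0, ">60s": 0, "abandoned": 0}
    for t in calls:
        if t is None:
            bands["abandoned"] += 1
        elif t <= 20:
            bands["<=20s"] += 1
        elif t <= 30:
            bands["20-30s"] += 1
        elif t <= 60:
            bands["30-60s"] += 1
        else:
            bands[">60s"] += 1
    return bands

sample = [5, 18, 22, 45, 75, 90, None, 10, 130, None]  # illustrative only
print(band_answer_times(sample))
```

A distribution with few calls in the 20-60 second bands and many beyond 60 seconds is the pattern described above: callers not answered within the target time tend to wait well over a minute.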

2.29
Our analysis of call answering patterns indicates that measuring the percentage of calls answered within 20 seconds provides a good overall picture of the contact centre’s performance in answering calls. We note that 80/20 is one of the standard service level targets in the contact centre industry.

Is the contact centre’s service level performance measured and reported appropriately?

2.30
The contact centre’s performance against its target service level of 80/20 is assessed and reported to the Work and Income Executive as a monthly average.

2.31
A service level averaged over a day, week, or month is likely to conceal fluctuations in the service level during the period. Assessing the service level using a monthly average does not necessarily reflect how easy or difficult it has been for individual callers to access the contact centre at different times during that month.
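
A small sketch illustrates how a healthy monthly average can conceal poor individual days. The daily figures are invented, and a real monthly service level would be weighted by each day’s call volume, which this sketch ignores.

```python
# Daily service levels (%) for a hypothetical month: the simple average looks
# healthy, but several individual days fall well below an 80% target.
daily_levels = [55, 92, 95, 90, 88, 60, 94, 96, 91, 89,
                58, 93, 95, 92, 90, 62, 94, 95, 91, 88]

monthly_average = sum(daily_levels) / len(daily_levels)
days_below_target = [d for d in daily_levels if d < 80]

print(f"Monthly average: {monthly_average:.0f}%")   # above the 80% target
print(f"Days below 80%: {len(days_below_target)}")  # yet 4 days were far below it
```

A report showing only the monthly average would record this month as a success, even though callers on four of the twenty days faced much longer waits.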

2.32
The contact centre forecasts daily and weekly service levels, based on forecast call volumes, forecast call handling times, available staff, and other relevant factors. The contact centre refers to this forecast as the “planned service level”.

2.33
The actual daily service level is monitored throughout the day. If the service level for the day and for the week is likely to exceed 80/20, staff may be rostered off the telephones for coaching or other work.

2.34
Lower than planned service levels may prompt the contact centre to divert staff away from other work and on to answering calls.

2.35
However, planned service levels may be less than 80/20. The contact centre’s performance targets for site managers accept that service levels may be below 80/20 for 75% of the weeks in a year.

2.36
Figure 5 shows planned and actual service levels for the period 3-8 April 2006. The figure also shows the number of calls made to the contact centre for each day in that period. The pattern of calls in Figure 5 is typical of the call pattern experienced by the contact centre. Call volumes are generally heaviest on Mondays, and there are fewer calls made towards the end of the week.

Figure 5
Planned and actual service levels for the period 3-8 April 2006, and the number of incoming telephone calls

Day | Planned service level (% of calls forecast to be answered within 20 seconds) | Actual service level (% of calls that were answered within 20 seconds)* | Number of calls made to the contact centre
Monday | 0%** | 23% | 32,192
Tuesday | 15% | 79% | 22,309
Wednesday | 44% | 94% | 20,440
Thursday | 48% | 92% | 18,083
Friday | 72% | 78% | 16,596
Saturday | 78% | 90% | 1,650
Average | 33% | 69% |

* Rounded to the nearest percent.
** The contact centre forecast that on this Monday none of its calls would be answered within 20 seconds.
Source: Work and Income.

2.37
The contact centre’s overall service level for the month of April 2006 was 73% (that is, 73% of all calls were answered within 20 seconds). The variation in daily planned service levels was usually less extreme in months in which the contact centre achieved its monthly service level target.

2.38
We analysed the daily service levels for the 2005/06 year.

2.39
Our analysis showed that, even for those months in which the contact centre met its monthly target, there were significant variations in the daily service level. The contact centre regularly failed to meet the 80/20 service level on Mondays and Saturdays, and occasionally failed to meet it on other days of the week. However, there was not a sustained pattern of low daily service levels – service levels on other days of the week often exceeded 80/20 (meaning that more than 80% of calls were answered within 20 seconds).

2.40
For months in which the contact centre failed to meet its monthly service level target, we observed a sustained pattern of low daily service levels. However, even in these months, there were some days on which service levels met or were close to the 80/20 target.

2.41
We accept that it is not realistic for the contact centre to consistently meet its service level target. However, in practice, there is a pattern of marked variation in service levels from week to week and day to day.

2.42
Daily service levels well below 80/20 mean that many callers are likely to be waiting some minutes. Our analysis shows that more callers are likely to hang up when this occurs.

2.43
Daily service levels well above the target mean that, for that day, the contact centre had more staff answering telephones than was necessary. A contact centre that regularly exceeds its service level target may be operating inefficiently.

2.44
The contact centre’s practice of averaging the service level for the month and setting lower service level targets for some days and weeks means that it accepts that callers will experience different waiting times depending on the day or week in which they call.

2.45
Advice we have taken suggests that contact centre best practice is to set a consistent daily service level target. However, this daily service level target is best set as a range – for example, the target could be to maintain daily service levels of between 75/20 and 85/20 for every day that the contact centre is open. Using a range recognises that there will be fluctuations in daily service levels because of factors outside the contact centre’s control, but, nonetheless, requires the contact centre to strive to provide a defined level of service to callers each day.
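
A daily check against a target range, rather than a single figure, could be sketched as follows. The 75-85% band follows the example above, the classification labels are our own, and the daily figures are the actual service levels from Figure 5.

```python
TARGET_LOW, TARGET_HIGH = 75, 85  # a 75/20 to 85/20 daily target band

def classify_day(service_level_pct):
    """Classify a day's service level against the target range."""
    if service_level_pct < TARGET_LOW:
        return "under"   # callers waited too long; investigate causes
    if service_level_pct > TARGET_HIGH:
        return "over"    # possibly more staff on the telephones than needed
    return "within"

# Actual daily service levels for 3-8 April 2006 (from Figure 5)
week = {"Mon": 23, "Tue": 79, "Wed": 94, "Thu": 92, "Fri": 78, "Sat": 90}
for day, level in week.items():
    print(day, classify_day(level))
```

Against such a band, only Tuesday and Friday of the Figure 5 week would count as within range: Monday falls far under it, and Wednesday, Thursday, and Saturday exceed it.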

2.46
Best practice also suggests that the service level by interval (for example, the service level for every 15 minutes) for the month be reported to senior management. We have included an example of the kind of graph that could be used for this reporting in Appendix 2. We note that the service level in the contact centre is measured in 15-minute intervals.

2.47
We accept that senior managers are aware that the contact centre does not provide a consistent level of service throughout the month. However, a daily service level target and more detailed reporting to the Work and Income Executive would encourage the contact centre to continue to improve its daily service levels as well as its monthly performance. More detailed reporting would also allow the Work and Income Executive to make better-informed decisions about resourcing and managing the contact centre, as interval and daily service level patterns will more accurately show the level of service experienced by callers during the month.

Recommendation 1
We recommend that, in conjunction with existing monthly service level targets, Work and Income set a consistent daily service level target for the contact centre, expressed as a range.
Recommendation 2
We recommend that performance against this daily target be included in the contact centre’s formal monthly reporting to the Work and Income Executive. These reports should include explanations for any variation from the target range, together with a plan to address the underlying cause.
Recommendation 3
We recommend that the contact centre’s formal monthly reporting to the Work and Income Executive include the service level performance by 15-minute interval.
Recommendation 4
We recommend that the contact centre identify all possible causes of current variations in daily service levels, and provide the Work and Income Executive with an analysis of options for addressing these.

How the contact centre records and responds to the needs of clients

2.48
The contact centre is an important point of contact between Work and Income and its clients. It is important that the contact centre is well informed about clients’ needs, and responds effectively.

2.49
We examined how the contact centre communicates with clients, and contacted several groups for their views on the quality of the services delivered by the contact centre.

2.50
We also looked at how the contact centre assessed how satisfied clients were with the service the contact centre had provided, and how the contact centre handled complaints.

Main findings

2.51
Work and Income has established channels for communicating with client groups. Feedback is shared between the contact centre and other parts of Work and Income. This communication helps to ensure that the contact centre and other parts of Work and Income work together effectively in the interests of clients.

2.52
The groups that we contacted were generally positive about the quality of the contact centre’s services, and their comments raised no new issues for this performance audit.

2.53
Regular surveys are carried out to assess client satisfaction, using a robust methodology. Survey results show consistently high levels of client satisfaction. However, the Ministry needs to consider increasing the sample size if it wishes to extract more detailed data from the survey.

2.54
An effective system is in place to deal with complaints. There are very few complaints, given the volume of calls handled by the contact centre.

Communicating with client groups

The role of national client managers

2.55
Client relations for the contact centre are managed through Work and Income’s national client managers, who are responsible for specific client groups such as those receiving New Zealand Superannuation or people on working age benefits.

2.56
These managers meet regularly with client groups and pass on any issues raised at those meetings to the contact centre management team. They told us that the contact centre management team responds well to these suggestions and ideas for improving services.

2.57
The contact centre also passes relevant information to the national client managers. For example, the relevant national client manager would be advised if there was an increase in the number of calls from a particular group of clients.

2.58
National client managers sometimes also provide guidance to the contact centre on policy matters or on how to handle particular types of call.

Communication with beneficiary advocate groups

2.59
Work and Income holds quarterly meetings with beneficiary advocate groups at which the advocate groups raise any matters of concern. The General Manager Contact Centres attends these quarterly meetings if contact centre issues are on the agenda. On occasion, the contact centre has also used these meetings to inform groups of planned changes, and to explain the reasons for the changes and how they will affect clients.

2.60
We reviewed records of these meetings and found that the groups had raised few issues relating to the contact centre. Any such issues were passed on to the contact centre management team.

2.61
One group of advocates visited a local contact centre site, and then brought some questions about contact centre practices and processes to the next meeting. Such visits can be a useful way to show how the contact centre operates.

Our survey

2.62
A large number of non-government agencies deal with the contact centre, either directly or on behalf of clients. We asked a small selection of these about the quality of the contact centre’s services. Our purpose was to identify common views, and to reveal any new issues we needed to examine as part of our performance audit.

2.63
We asked whether contact centre staff:

  • answered the telephone promptly;
  • were polite;
  • were helpful; and
  • were knowledgeable about Work and Income services.

2.64
Nine agencies responded. They were generally positive about the quality of contact centre services. The responses revealed no new issues for our audit.

The client satisfaction survey

2.65
The Ministry contracts an independent company to undertake continuous client satisfaction surveys. Clients who have recently rung the contact centre are contacted and asked about the quality of the service they received.

2.66
We examined the questionnaire used for the client satisfaction survey and sought assurances about the robustness of the sample selection, and about the design and relevance of the questionnaire.

2.67
Samples are randomly selected. We were told that the current sample size is large enough to have an acceptable margin of error for quarterly and annual reports. However, the monthly reports are indicative only, because the sample size is not large enough for the monthly data to be statistically reliable.
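
The link between sample size and margin of error for a reported satisfaction proportion follows the standard formula for a 95% confidence interval on a sample proportion. The sample sizes below are illustrative, not the survey’s actual sizes.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# With 85% satisfaction, a small monthly sample gives a much wider interval
# than a larger quarterly one -- which is why monthly figures are indicative only.
for n in (100, 300, 900):
    print(f"n={n}: +/-{margin_of_error(0.85, n) * 100:.1f} percentage points")
```

The margin of error shrinks only with the square root of the sample size, which is why extracting more detailed (and therefore smaller) sub-samples from the survey would require a larger overall sample.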

2.68
The questionnaire’s design was robust and covered the main aspects of contact centre service. The questionnaire is reviewed annually.

Survey results

2.69
We examined the results of the client surveys for the year January to December 2005. Total overall satisfaction was consistently well above the contact centre’s target of 85%.

2.70
Monthly results are reported to the contact centre’s management team. Quarterly and annual results are included in the Ministry’s output reporting.

2.71
The survey results are presented in various ways, including by site, client type, and relevant demographics. We were told that site managers monitor their site’s results, and are asking for more detailed reporting than in the past. However, the current sample size is too small for the research company to provide assurance that these more detailed results are statistically reliable.

Recording and resolving complaints

2.72
CSRs enter the details of complaints they receive into a contact centre database.

2.73
The client’s record shows which CSRs they have spoken with, so it is possible to assign all complaints related to the contact centre to one of the five sites.

2.74
Each site has a staff member responsible for assigning complaints to the appropriate person to resolve – usually a service manager. Complaints about service centre staff are referred to the relevant service centre.

2.75
After a set period of time, unresolved complaints are referred to a senior manager.

How complaints are resolved

2.76
We reviewed all complaints entered into the database in the 12-month period from 8 May 2005 to 8 May 2006. During this time, the contact centre received about 6 million calls.

2.77
We examined:

  • how complaints were followed up;
  • how quickly complaints were resolved; and
  • what complaints were about.

2.78
During the 12-month period we reviewed, 369 complaints were recorded in the complaints database. Of these, 92% (340) were identified as resolved.

2.79
The most common follow-up action (83%) was telephone contact with complainants. Letters were sent to complainants in 12% of recorded complaints. No further action was considered necessary for the remaining 5% of recorded complaints.

2.80
Most of the resolved complaints (72%) were resolved within five working days. Only 9% of resolved complaints took more than 20 working days to resolve.

2.81
Most complaints were about the knowledge of CSRs. These made up nearly two-thirds of all recorded complaints. The next most common complaint type was about the behaviour of CSRs, which made up nearly one-quarter of complaints recorded during the year we reviewed.

2.82
Call recording aids the investigation of complaints about contact centre staff. The staff who investigate complaints often use call recordings to examine the nature and validity of complaints.

2.83
In our view, an effective system is in place to deal with complaints. There are very few complaints, given the volume of calls handled by the contact centre.

Service Express

2.84
Service Express is a self-service facility accessed through an 0800 telephone number.3 It uses voice recognition4 technology to link clients receiving benefits or New Zealand Superannuation from Work and Income to the Work and Income payments management database, giving clients access to personal information about their payments and debt balances. Clients can also report income directly without having to speak to a CSR. Callers can choose to leave Service Express and join the main queue of contact centre callers at any time during their call.

2.85
Service Express has been in operation since early 2002. There are about 50,000 calls to Service Express each month. The contact centre has an 0800 demonstration telephone number that callers can call to find out how Service Express works before using it.

2.86
Service Express has the potential to reduce the number of calls that need to be answered by CSRs. Fewer calls could lead to savings for the contact centre and shorter waiting times for callers.

2.87
CSRs are expected to promote Service Express in cases where this is appropriate for the particular client. CSRs have been provided with training and guidance to help with this.

2.88
However, the 2005 survey of case managers and CSRs found that some case managers were not actively encouraging clients to use Service Express. The survey also found that CSRs could more actively encourage clients to use this service. One group of CSRs we spoke to told us that Service Express was not well promoted and that clients did not yet trust the technology. Another group of CSRs confirmed that they mentioned it to clients who were likely to find it useful.

2.89
The contact centre has run an outbound calling campaign to make clients aware of this option for reporting income. We note that the contact centre is working on a marketing strategy to promote Service Express, to ensure that best use is made of it.

2.90
In our view, Service Express could be used more often, and by more clients, if Work and Income were to more actively promote this service.

2.91
Work and Income has investigated opportunities to expand the range of call types handled by Service Express or other automated contact channels. This could provide a more convenient service for many clients and enable CSRs to answer other calls or attend to other work. We encourage Work and Income to actively pursue these options.


1: A single service level target applies to almost all calls, except for a very small number of calls to the employer line.

2: As noted in paragraph 2.18, the contact centre’s service level was affected by the introduction of new processes and technology during this period.

3: Calls made to Service Express are not included in service level measurement.

4: Voice recognition (otherwise known as “natural language recognition”) technology interprets spoken words, enabling a caller to report their income or perform other transactions without the assistance of a CSR.
