
10.2 Performance Measure Results

Program 1.1 Services to the Community—Social Security and Welfare—Performance overview

Performance measure | Met
1. Customer satisfaction: Achievement of customer satisfaction standards. | ×
2. Achievement of digital service level standards: Departmental interactions completed through digital channels. | ✓
3. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access. | ✓
4. Achievement of payment quality standards: Centrelink: Delivery of correct customer payments. | ✓
5. Achievement of face-to-face service level standards: Average wait time. | ×
6. Achievement of telephony service level standards: Average speed of answer. | ✓
7. Achievement of processing service level standards: Claims processed within standard. | ×
8. Internal reviews: Percentage of decision reviews requested by Centrelink customers finalised within standard. | ✓
9. Achievement of payment integrity standards: Centrelink: Debt under recovery. | ✓

Services to the Community – Social Security and Welfare – Program performance overview

The department met six of nine Social Security and Welfare performance measures in 2018–19. This compares with 13 of 15 in 2017–18 and 13 of 15 in 2016–17. Four performance measure results improved, two were comparable and three declined compared to 2017–18 results.

The department delivers high quality services and payments through its digital, face-to-face and telephony channels, providing a variety of options for customers to interact with the department. During 2018–19, the department processed over 3.5 million Social Security and Welfare claims, while working to simplify services and communicate more effectively with customers.

To improve customer experience of face-to-face services, many staff are now skilled to complete both Social Security and Welfare and Health services during the same interaction. Our performance measures record customers as receiving either a Social Security and Welfare or Health service, rather than being able to record that they received both services in the same interaction. We are looking at how we can update our methodology to reflect this.

Interactions completed digitally for the Social Security and Welfare Program grew by 11.4 per cent in 2018–19. We will continue to encourage customers to utilise the department’s digital channels, providing them with opportunities to be more self-sufficient.

We aim to provide families, individuals and communities with appropriate Social Security and Welfare support, such as the Age Pension, Family Tax Benefit and Newstart Allowance. Our debt recovery activities ensure the integrity of government outlays, while also providing customers with flexible repayment options to suit their circumstances.

The department is focused on improving the customer experience. This includes making the correct payments to the right customer the first time, improving the average wait time and speed of answer for customers, undertaking timely internal reviews of decisions to verify accuracy, and reviewing and processing claims in a timely manner. Collectively, these factors provide a customer-centric focus.

Underpinning this work were reliable ICT systems that were available 99 per cent of the time, enabling greater uptake of digital services and providing customers with 24/7 access. To help maintain and improve digital servicing, further investment in the department’s ICT systems will be required.

The department remains committed to delivering high-quality services and payments for the community on behalf of Government.

1. Customer satisfaction: Achievement of customer satisfaction standards.

Criterion

1. Customer satisfaction: Achievement of customer satisfaction standards.

Target

Customer Satisfaction of ≥85 out of 100. This is the level of satisfaction survey respondents have with their most recent interaction.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥85 out of 100 | 76.7 | × | N/A
2017–18 | ≥85% | 75.2% | × | 5.7%
2016–17 | ≥85% | 69.5% | × | 2.7%

Analysis

While we delivered a range of activities aimed at improving customer experience this financial year, the target for 2018–19 was not achieved.

It should be noted that an index score of 75 out of 100 indicates an average customer satisfaction rating of 4 out of 5 (Satisfied). This means that overall, Social Security and Welfare customers were satisfied with the service they received, and satisfaction has increased over the past 3 years.

Staff assisted channels

The department has simplified its telephony queues and used voice recognition software to better direct customer calls. These efforts improved the 'telephony time to receive service' and 'effort' satisfaction drivers. Delivery by external Service Delivery Partners increased to the equivalent of 2,750 operators.

However, telephony channel results remain 7.4 points lower than results for the face-to-face channel. This is driven by lower results in every driver of satisfaction except 'fair treatment', which is comparable. In particular, 'time to receive service' is 19.7 points lower and 'effort' is 10.6 points lower.

Digital channels

For digital channels, satisfaction results increased for multiple drivers across the year for both online and mobile applications. Projects designed to improve the digital service experience for customers have contributed to these increases.

The department continues to invest in improvement initiatives to enhance the digital experience, including:

· The Student Transformation and Job Seeker Transformation projects focused on improving the online claiming experience and increasing digital services for students, job seekers and other customer cohorts where applicable.

· The Job Seeker Engagement Transformation project will streamline processing by removing manual steps and eliminating unnecessary back-end administrative processing.

· The Digital Enablement project moved Jobseeker and Youth Allowance onto a new platform, allowing customers and their nominees to manage payments, enhancing the Digital Assistants and Centrelink Online Account and streamlining claiming.

The department is considering ways to develop more targeted initiatives to improve customer satisfaction and experience.

The historical results below provide context for performance; however, due to a change in how the measure is calculated, the results for 2018–19 are not directly comparable with previous years. Results for the individual drivers of satisfaction were:

Satisfaction drivers | 2016–17 | 2017–18 | Satisfaction drivers | 2018–19
Perceived quality | 69.2% | 74.8% | Perceived quality | 76.0 out of 100
Personalised service | 75.8% | 79.6% | Personalised service | 82.2 out of 100
Communication | 79.4% | 83.2% | Communication | 80.8 out of 100
Time to receive service | 46.3% | 55.5% | Time to receive service | 68.5 out of 100
Fair treatment | 89.8% | 91.3% | Fair treatment | 91.7 out of 100
Ease of access | 56.7% | 66.6% | Effort | 73.0 out of 100

Method

Customer survey

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 22

2018–19 Portfolio Budget Statements, page 32

Data source

Customer survey

Calculation formula

(A + B + C + D) ÷ 4

A. Overall Index Score for the face-to-face channel.

B. Overall Index Score for the telephony channel.

C. Overall Index Score for the online channel.

D. Overall Index Score for the mobile apps channel.

Calculation variables

Based on the responses provided to the drivers of satisfaction questions, an index score ranging from 0 to 100 is established for each survey respondent.

The survey questions align with the six drivers, with questions tailored to the service channel. Personalised service and fair treatment relate to staff-service and are only applicable to the face-to-face and telephony channels. The remaining four drivers are applicable across all channels.

The drivers are measured as follows:

· perceived quality is a measure of the overall quality of service.

· personalised service is a measure of staff taking into account individual circumstances.

· communication is a measure of staff communication/clarity of online information.

· time to receive service is a measure of time taken to complete business.

· fair treatment is a measure of staff treating customers with respect.

· effort is measured by the customer’s assessment of the ease of handling their request.

All questions are measured on a 5-point scale, with 3 being neutral.
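To make the mechanics concrete, the Python sketch below computes a per-respondent index and then applies the (A + B + C + D) ÷ 4 formula to four channel Index Scores. The linear mapping of the 5-point scale onto 0–100 and all input values are assumptions for illustration only; the report does not spell out the exact index construction.

```python
# Minimal sketch of the satisfaction index calculation, assuming a linear
# mapping of 1-5 ratings onto a 0-100 index; the exact index construction
# is not specified in the report.

def respondent_index(ratings: list[int]) -> float:
    """Average a respondent's 1-5 driver ratings, rescaled to 0-100."""
    return sum((r - 1) / 4 * 100 for r in ratings) / len(ratings)

def channel_index(channel_ratings: list[list[int]]) -> float:
    """Overall Index Score for a channel: mean of respondent indexes."""
    return sum(respondent_index(r) for r in channel_ratings) / len(channel_ratings)

def overall_satisfaction(face_to_face: float, telephony: float,
                         online: float, mobile_apps: float) -> float:
    """Calculation formula: (A + B + C + D) / 4."""
    return (face_to_face + telephony + online + mobile_apps) / 4

# Hypothetical channel Index Scores averaged into an overall result.
print(overall_satisfaction(80.3, 72.9, 77.1, 76.5))  # 76.7
```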

Notes and Definitions

A number of changes were made to this performance measure for 2018–19:

· inclusion of online survey results from 1 July 2018.

· inclusion of mobile applications survey results from 1 October 2018.

· inclusion of our Service Delivery Partners telephony results from 1 January 2019 onwards.

· changed the measurement approach from a percentage-based measure (≥85 per cent) to an index-based measure (≥85 out of 100). This change was adopted as an index-based approach provides a more accurate result, being more sensitive to changes in ratings.

· updated the survey questionnaire to improve response rates and enhance our understanding of both satisfaction and customer experience.

All results are derived from the customer survey and are the customer’s perception.

Due to a change in survey provider, 2017–18 results are for 1 September 2017 to 30 June 2018 only.

2. Achievement of digital service level standards: Departmental interactions completed through digital channels.

Criterion

2. Achievement of digital service level standards: Departmental interactions completed through digital channels.

Target

≥5 per cent increase in the total number of interactions conducted via digital channels compared with 2017–18.

Results

Year | Target | Result | Achieved
2018–19 | ≥5% increase | 11.4% increase | ✓
2017–18 | ≥5% increase | 6.6% increase | ✓
2016–17 | ≥5% increase | 5.3% increase | ✓

Analysis

There was an 11.4 per cent increase in Social Security and Welfare departmental interactions completed through digital channels compared to 2017–18. This increase comprised:

· 204.4 million online and mobile apps interactions, an increase of 25 million

· 56.6 million online letters, an increase of 2.6 million

· 51.4 million electronic data exchange transactions, an increase of 5 million

· 2.9 million phone self-service interactions, a decrease of 360 000.

The department increased the online options available to customers in accessing payments and services. During 2018–19, the department moved several types of claims to a new, more user-friendly online platform, including job seeker, age pension, concession card, carer payment/allowance and parenting payment. The shift from phone self-service transactions towards digital was supported by the department’s Digital Adoption Strategy, as well as improvements to the Express Plus app.

Method

Data mining.

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 22

2018–19 Portfolio Budget Statements, page 32

Data source

Enterprise Data Warehouse, Business Activity Reporting and Analytics and the Outbound Correspondence Report.

Calculation formula

(A – B) ÷ B × 100

A. Total number of departmental interactions by customers conducted through digital channels in 2018–19.

B. Total number of departmental interactions by customers conducted through digital channels in 2017–18.
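For illustration, the sketch below applies this formula to the channel volumes reported in the analysis above. The 2017–18 totals are back-calculated here from the stated increases and decrease, so they are approximations, not official figures; the result matches the reported 11.4 per cent up to rounding.

```python
# Sketch of the year-on-year growth calculation: (A - B) / B * 100.
# 2018-19 channel volumes are taken from the analysis above; 2017-18
# totals are back-calculated from the reported changes (approximate).

a = 204.4e6 + 56.6e6 + 51.4e6 + 2.9e6           # 2018-19 digital interactions
b = (204.4e6 - 25e6) + (56.6e6 - 2.6e6) \
    + (51.4e6 - 5e6) + (2.9e6 + 360e3)          # 2017-18 digital interactions

print(f"{(a - b) / b * 100:.1f}% increase")     # 11.4% increase
```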

Calculation variables

Nil.

Notes and Definitions

Departmental interactions include the following self-managed transactions and electronic interactions:

· online and mobile applications.

· phone self-service (Interactive Voice Response).

· online letters.

· electronic data interchange transactions.

3. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access.

Criterion

3. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access.

Target

ICT systems that support 24/7 customer access are available ≥98 per cent of the time.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥98% | 99.0% | ✓ | -0.1%
2017–18 | ≥98% | 99.1% | ✓ | -0.2%
2016–17 | ≥98% | 99.3% | ✓ | 1.3%

Analysis

The department continued to meet digital service level standards relating to the availability of ICT services, with customers able to access ICT systems 99 per cent of the time.

The department continued to develop and implement programming to improve communication between Centrelink’s mobile and web channels. This has improved the customer experience through increasing choice in how customers interact with us. Express Plus Centrelink continued to simplify the user interface of available services to make it more modern and user friendly.

Method

Data mining.

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 22

2018–19 Portfolio Budget Statements, page 33

Data source

Internal data sources including incident records, problem records, and scheduled maintenance periods are utilised to calculate the availability result.

Calculation formula

A ÷ B × 100

A. Service uptime (availability window minus outage time).

B. Availability window (total service hours minus scheduled maintenance periods).
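A minimal sketch of this availability calculation follows; all hour values are hypothetical and chosen only to show the arithmetic.

```python
# Availability = A / B * 100, where A is service uptime and B the
# availability window. All hour values below are hypothetical.

total_service_hours = 365 * 24          # 24/7 service across a full year
scheduled_maintenance_hours = 60        # hypothetical planned maintenance
outage_hours = 85                       # hypothetical unplanned outages

availability_window = total_service_hours - scheduled_maintenance_hours  # B
service_uptime = availability_window - outage_hours                      # A

print(f"{service_uptime / availability_window * 100:.1f}%")  # 99.0%
```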

Calculation variables

The performance measure includes the following services in its calculation:

· Centrelink online accounts.

· Centrelink mobile applications.

Notes and Definitions

· The title was updated in 2018–19 to clearly articulate that scheduled maintenance periods are excluded from the calculation of this performance measure. The calculation formula for this performance measure has not changed.

· Outage time means a confirmed disruption to Centrelink online accounts or Centrelink mobile applications.

4. Achievement of payment quality standards: Centrelink: Delivery of correct customer payments.

Criterion

4. Achievement of payment quality standards: Centrelink: Delivery of correct customer payments.

Target

≥95 per cent of Centrelink customer payments delivered correctly.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥95% | 98.3% | ✓ | -0.2%
2017–18 | ≥95% | 98.5% | ✓ | 0.2%
2016–17 | ≥95% | 98.3% | ✓ | -0.1%

Analysis

Performance in 2018–19 is consistent with previous annual results. The department continued to expand self-service options and automated actions, which helped reduce the opportunity for staff errors.

To improve the accuracy of payments, the department also undertakes a range of activities and strategies to educate and assist people in meeting their obligations. This includes sending letters and text messages (SMS) to prompt people to update their circumstances to make sure they receive the correct rate of payment.

Maintaining strong performance is supported by the modernisation of systems, appropriate staff training, and movements towards our vision for integrity of 'the right payment, to the right person, at the right time, without their active involvement (unless required by policy)'.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 22

2018–19 Portfolio Budget Statements, page 33

Data source

Extracts from the Random Review Results and Integrated Review Systems.

Calculation formula

A ÷ B × 100

A. Number of completed Random Sample Survey (RSS) reviews for customers which did not have an administrative error with a dollar impact.

B. Number of completed RSS reviews for all customers.

Notes and Definitions

Reviews with administrative error with a dollar impact are classified into staff error, system error, and legislative error.

A correct payment is where the RSS has found no rate variation or debt has been caused by administrative error.

The performance measure result represents the errors detected in a given year and may relate to prior year payments.

RSS reviews are a point in time analysis of customer circumstances, designed to establish whether the customer is being paid correctly.

The RSS Program covers a subset of Social Security and Welfare payments, and includes the following payments:

· ABSTUDY

· Age Pension

· Austudy

· Carer Payment

· Carer Allowance

· Disability Support Pension

· Family Tax Benefit

· Newstart Allowance

· Parenting Payment

· Partner Allowance

· Sickness Allowance

· Special Benefit

· Widows Allowance

· Youth Allowance

5. Achievement of face-to-face service level standards: Average wait time.

Criterion

5. Achievement of face-to-face service level standards: Average wait time.

Target

The average length of time a customer waits to access social security and welfare face-to-face services in our service centres is ≤15 minutes.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≤15 minutes | 15 minutes 10 seconds | × | 41 seconds
2017–18 | ≤15 minutes | 14 minutes 29 seconds | ✓ | 2 minutes 25 seconds
2016–17 | ≤15 minutes | 12 minutes 4 seconds | ✓ | 1 minute 6 seconds

Analysis

In 2018–19, to reduce wait times for face-to-face services, the Front of House application was used to identify customers who could be provided with a service over the phone from another service centre if their location was experiencing longer wait times.

The department will continue exploring other opportunities to manage demand across locations, including balancing resourcing between competing priorities and improved technology (for example, customer video conferencing ICT capability).

During 2018–19, the reporting of annual performance has been affected by customers who complete multiple transactions in a single visit to a Service Centre:

· Currently, our reporting records customers as receiving a Social Security and Welfare or Health service rather than being able to record that they received both services in the same interaction.

· Customers attending in relation to a Social Security and Welfare enquiry may also complete a Health enquiry at the same time.

The underlying methodology for this performance measure does not reflect the current situation, in which many staff are now skilled to complete both Social Security and Welfare and Health services during the same visit, and this affects reporting. For example, 17 per cent (over 2.8 million) of Health contacts are not represented in the data for 2018–19. To provide a more accurate and integrated view of face-to-face customer interactions, we are working towards aligning the performance measure methodology with the current service offer.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 23

2018–19 Portfolio Budget Statements, page 33

Data source

Enterprise Data Warehouse/SAS Enterprise Guide.

Calculation formula

A ÷ B

A. Total wait time of customers serviced in the face-to-face channel.

B. Total number of customers serviced in the face-to-face channel.

Calculation variables

· Data is captured in the SAPUI5 Front of House application. It includes Virtual Waiting Room (VWR) and resolved enquiry data.

· Abandoned contacts are not included as the time that the customer elects to leave the premises cannot be recorded.

· Wait times attributed to reassigned contacts are not included in the result as the wait time measures the initial wait time of a customer only.

· A zero wait time is recorded for small sites where customer enquiries are resolved at first contact with a Customer Liaison Officer.

· Customers serviced via outreach arrangements or by agents are not included in this calculation. These contacts are not recorded in the queue management system.

· Customers who attend a site to use self-service facilities that are not recorded by a ticketing machine or Customer Liaison Officer are not included in this calculation. These contacts are not recorded in the queue management system.
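The sketch below makes these inclusion rules concrete by computing the average wait time over hypothetical queue records, excluding abandoned contacts and the second wait of reassigned contacts. Field names are illustrative, not the actual SAPUI5 schema.

```python
# Average wait time = A / B over serviced contacts, applying the
# exclusions listed above. Records and field names are hypothetical.

contacts = [
    {"wait_seconds": 600,  "status": "resolved",  "second_wait": False},
    {"wait_seconds": 0,    "status": "resolved",  "second_wait": False},  # small site, resolved at first contact
    {"wait_seconds": 900,  "status": "abandoned", "second_wait": False},  # excluded: customer left before service
    {"wait_seconds": 300,  "status": "resolved",  "second_wait": True},   # excluded: wait after reassignment
    {"wait_seconds": 1200, "status": "resolved",  "second_wait": False},
]

serviced = [c for c in contacts
            if c["status"] == "resolved" and not c["second_wait"]]

average = sum(c["wait_seconds"] for c in serviced) / len(serviced)  # A / B
minutes, seconds = divmod(round(average), 60)
print(f"{minutes} minutes {seconds} seconds")  # 10 minutes 0 seconds
```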

Notes and Definitions

· The SAPUI5 Front of House application is a single browser-based application that operates on both desktop and mobile devices to manage Front of House customer contacts.

· VWR refers to all contacts that have a status of waiting, assigned and presented.

· Abandoned contacts refers to the count of customers booked into the VWR that left the service centre before being served.

· Reassigned contacts are a count of walk-ins reassigned to the wait room. If a person has entered the service centre to discuss matters in more than one program, for example Centrelink and Medicare, they will be put into a wait room for one program and once those transactions have been completed they will be returned to the wait room before doing transactions for the other program. The second wait time is the reassigning of contacts.

6. Achievement of telephony service level standards: Average speed of answer.

Criterion

6. Achievement of telephony service level standards: Average speed of answer.

Target

The average length of time a social security and welfare customer waits to have a call answered through our telephony services is ≤16 minutes.

Results

Year

Target

Result

Achieved

Yearly change

2018–19

≤16 minutes

15 minutes
32 seconds

-26 seconds

2017–18

≤16 minutes

15 minutes 58 seconds

14 seconds

2016–17

≤16 minutes

15 minutes 44 seconds

35 seconds

Analysis

During 2018–19, the department answered more than 16.2 million Social Security and Welfare calls. The target was met through an improved ability to handle customer demand with additional Smart Centre operators from Service Delivery Partners, who handled a third of the overall call volume. Service Delivery Partners handled calls in main business lines (Disabilities, Sickness and Carers, Families and Parenting, Employment Services, Older Australians and Youth and Students), Earnings, MyGov, Online Services Support and Income Management (including BasicsCard).

The department has simplified its telephony queues and used voice recognition software to better direct customer calls and improve access to self-service telephony applications.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 23

2018–19 Portfolio Budget Statements, page 34

Data source

Telstra Computer Telephony Interface files.

Calculation formula

A ÷ B

A. Total time customers waited for their call to be answered.

B. Total number of calls answered by service officers.

Calculation variables

· Average speed of answer is measured from the time a customer enters the queue to the time their call is answered by a service officer.

· Place in queue calls are not included in the calculation.

· Calls transferred internally between queues are counted as separate calls with separate wait times and are included in the calculation.

· Calls that are abandoned after entering a queue are not included in the calculation as the calculation measures how long calls have waited to be answered only.
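A similar sketch for the telephony measure, applying the rules above to hypothetical call records: abandoned calls and place in queue call backs are excluded, while each transferred leg counts as a separate call with its own wait time.

```python
# Average speed of answer = A / B over answered calls. All records are
# hypothetical; transferred legs appear as separate entries.

calls = [
    {"queue_seconds": 420, "outcome": "answered",  "pq_callback": False},
    {"queue_seconds": 960, "outcome": "answered",  "pq_callback": False},
    {"queue_seconds": 180, "outcome": "answered",  "pq_callback": False},  # transferred leg, counted separately
    {"queue_seconds": 600, "outcome": "abandoned", "pq_callback": False},  # excluded: abandoned in queue
    {"queue_seconds": 0,   "outcome": "answered",  "pq_callback": True},   # excluded: outbound call back
]

answered = [c for c in calls
            if c["outcome"] == "answered" and not c["pq_callback"]]

average = sum(c["queue_seconds"] for c in answered) / len(answered)  # A / B
minutes, seconds = divmod(round(average), 60)
print(f"{minutes} minutes {seconds} seconds")  # 8 minutes 40 seconds
```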

Notes and Definitions

· From 1 January 2019 onwards, the results handled by our Service Delivery Partners were included in the calculation for this performance measure.

· Place in queue calls are made by dialling the caller back after they accept an offer of a call back. The initial time the caller waited is included in this calculation; the call back itself is an outbound call, where the department waits for the caller to answer, and is therefore not included.

7. Achievement of processing service level standards: Claims processed within standard.

Criterion

7. Achievement of processing service level standards: Claims processed within standard.

Target

≥82 per cent of claims processed within standard.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥82% | 68.0% | × | -14.1%
2017–18 | ≥82% | 82.1% | ✓ | No change
2016–17 | ≥82% | 82.1% | ✓ | 3.6%

Analysis

More than 3.5 million Social Security and Welfare claims were processed in 2018–19, compared to over 3.4 million claims in 2017–18.

The reduction in the department's performance against this measure during 2018–19 reflects the priority given to reducing the number of claims on hand, in particular older claims.

In addition, a number of other factors influenced performance throughout 2018–19:

· The diversion of resources to support emergency responses to natural disasters affected Social Security and Welfare claims processing capacity.

· The expansion of the blended workforce with the use of Service Delivery Partners required a large upfront resource commitment by the department to train and support these staff to reach proficiency. This reduced the number of departmental staff available to process claims in the short term.

· The department's Service Delivery Partners were onboarded during the year, and the additional capacity they provide for Social Security and Welfare calls allows more departmental staff to process claims.

The performance measure calculation also continues to include time outside the department’s control, such as requesting documentation or medical assessments. The ability to influence customer behaviour and have information returned in a timely manner remains a challenge for the department despite a number of innovations in this area, including SMS reminders.

The department is committed to transforming the delivery of Social Security and Welfare payments and making it easier for people to claim online and is continuing to invest in:

· Ongoing transformation of the processing system with more payments and services moving from legacy systems to new platforms.

· Enhancements to the customer experience in the digital channel through improved online claiming and reduced touch points for customers.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 23

2018–19 Portfolio Budget Statements, page 34

Data source

Enterprise Data Warehouse/SAS Enterprise Guide.

Calculation formula

A ÷ B × 100

A. Total number of claims processed within standard.

B. Total number of claims processed.

Calculation variables

· Newstart Allowance, Youth Allowance (Full Time Student), Youth Allowance (Other) and Family Tax Benefit claim results have the highest influence on this performance measure.

· A processed claim equals ‘A new claim lodged for a social security payment or a concession card that has been assessed by a service officer, resulting in the claim being either granted or rejected. This does not include claims that are cancelled, deleted or withdrawn’.

Notes and Definitions

Standard | Service Reason / Claim Type
≥90% within 2 days | Crisis Payment
≥90% within 21 days | Parenting Payment Single
≥85% within 14 days | Child Care Benefit (Approved Care) excluding Lump Sum claims; Child Care Benefit (Registered Care)
≥85% within 21 days | Child Care Benefit (Approved Care) Lump Sum claims only; Dad and Partner Pay; Paid Parental Leave
≥85% within 42 days | Austudy; Mobility Allowance
≥85% within 49 days | Carer Allowance (excluding claims older than 84 days)
≥70% within 21 days | ABSTUDY; ABSTUDY PES; Pensioner Education Supplement; Youth Allowance (Other)
≥80% within 16 days | Newstart Allowance
≥80% within 21 days | Assistance for Isolated Children; Bereavement Allowance; Special Benefit; Stillborn Baby Payment
≥80% within 28 days | Low Income Card; Parenting Payment Partnered; Seniors Health Care Card
≥80% within 42 days | Youth Allowance (Full time student)
≥80% within 49 days | Age Pension; Carer Payment (excluding claims older than 84 days)
≥80% within 56 days | Double Orphan Pension
≥75% within 16 days | Widow Allowance
≥70% within 33 days | Family Tax Benefit
≥70% within 35 days | Sickness Allowance
≥70% within 49 days | Disability Support Pension (excluding claims older than 84 days)
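As a simplified illustration of how claims could be assessed against these standards, the sketch below checks elapsed days against a claim type's timeliness standard and aggregates the overall A ÷ B × 100 result. The per-type percentage targets in the table are not modelled, and the claim types, dates and field names are hypothetical.

```python
# Simplified check of claims against the timeliness standards above,
# then the overall result: A / B * 100. Subset of the table only.

from datetime import date

STANDARD_DAYS = {            # days allowed, from the table above
    "Crisis Payment": 2,
    "Newstart Allowance": 16,
    "Family Tax Benefit": 33,
    "Age Pension": 49,
}

def within_standard(claim_type: str, lodged: date, finalised: date) -> bool:
    """True if the claim was finalised within its timeliness standard."""
    return (finalised - lodged).days <= STANDARD_DAYS[claim_type]

claims = [  # hypothetical (claim type, lodged, finalised) records
    ("Newstart Allowance", date(2019, 3, 1), date(2019, 3, 12)),   # 11 days: within
    ("Age Pension", date(2019, 1, 10), date(2019, 4, 1)),          # 81 days: outside
]

met = sum(within_standard(t, lodged, done) for t, lodged, done in claims)
print(f"{met / len(claims) * 100:.1f}% processed within standard")  # 50.0%
```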

8. Internal reviews: Percentage of decision reviews requested by Centrelink customers finalised within standard.

Criterion

8. Internal reviews: Percentage of decision reviews requested by Centrelink customers finalised within standard.

Target

≥70 per cent of internal reviews are finalised within the 49-day standard.

Results

Year

Target

Result

Achieved

Yearly change

2018–19

≥70%

84.6%

-12.5%

2017–18

≥70%

97.1%

22.6%

2016–17

≥70%

74.5%

-2.0%

Analysis

In 2018–19, the department received 66,083 applications for formal internal review, compared to 61,347 in 2017–18. Although performance against this measure was lower than in 2017–18, the department continued to meet its target.

Strategies to help manage processing of reviews included:

· a workload allocation model, which ensures timely progression of reviews

· a review of staff skills to ensure there is sufficient capability to undertake the various review types, and upskilling staff where required to address identified skill shortages

· initiatives to improve the efficiency of the review process.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 24

2018–19 Portfolio Budget Statements, page 34

Data source

Key date values in the appeals system, as captured in a SAS dataset.

Calculation formula

A ÷ B × 100

A. Total number of internal reviews finalised within 49 days.

B. Total number of internal reviews finalised.

Calculation variables

· ‘A’ is dependent on the number of reviews finalised where the calculation of days elapsed from ‘Date Review Requested’ to ‘Date Review Finalised’ is less than or equal to 49 days, and ‘Date Review Finalised’ falls into the reportable financial year.

· ‘B’ is determined by the number of reviews where ‘Date Review Finalised’ falls into the reportable financial year.

· For both variables, a review is one that has been finalised at the Authorised Review Officer (ARO) level under the Improved Review Process introduced on 12 November 2016, or at the DHS Review Officer level for Enhanced Internal Reviews prior to 12 November 2016.
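These variables translate directly into a date comparison. The sketch below, over hypothetical review records, counts reviews finalised in 2018–19 (B) and the subset finalised within 49 elapsed days (A).

```python
# Internal review timeliness: A / B * 100, where B is reviews finalised
# in the reporting year and A those with <= 49 elapsed days. Records are
# hypothetical.

from datetime import date

FY_START, FY_END = date(2018, 7, 1), date(2019, 6, 30)

reviews = [
    {"requested": date(2018, 8, 1), "finalised": date(2018, 9, 10)},  # 40 days: within standard
    {"requested": date(2019, 1, 5), "finalised": date(2019, 3, 20)},  # 74 days: outside standard
    {"requested": date(2018, 5, 1), "finalised": date(2018, 6, 20)},  # finalised before 2018-19: excluded
]

finalised_in_year = [r for r in reviews
                     if FY_START <= r["finalised"] <= FY_END]          # B
within_standard = [r for r in finalised_in_year
                   if (r["finalised"] - r["requested"]).days <= 49]    # A

print(f"{len(within_standard) / len(finalised_in_year) * 100:.1f}%")   # 50.0%
```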

Notes and Definitions

Reviews finalised refers to formal reviews finalised via the internal review process within the 2018–19 reporting period.

The steps in the internal review process are:

· a customer or their nominee requests a review of a decision.

· a subject matter expert undertakes a quality check of the decision, and where appropriate, the original decision can be altered.

· if the customer remains dissatisfied with the decision, a formal internal review of the decision is completed by an ARO.

The title of the performance measure was updated in 2018–19 to clearly articulate that decision reviews are requested by customers.

9. Achievement of payment integrity standards: Centrelink: Debt under recovery.

Criterion

9. Achievement of payment integrity standards: Centrelink: Debt under recovery.

Target

≥60 per cent of Centrelink debt has a current debt recovery arrangement in place.

Results

Year

Target

Result

Achieved

Yearly change

2018–19

≥60%

69.5%

N/A

2017–18

≥60%

59.0%

×

-2.8%

2016–17

≥60%

61.8%

-1.7%

Analysis

Legislation requires the department to manage and pursue the recovery of debts and negotiate suitable payment arrangements for customers based on their capacity to repay their debt.

The department continued to progress strategies to assist in increasing the recovery of outstanding debt to the Commonwealth, including:

· promoting the online payment facility (Money You Owe), which saw increased customer uptake and enables former recipients to make debt repayment arrangements online

· offering a range of repayment methods including withholdings from social welfare payments, direct debit, BPAY, telephone or internet banking, Australia Post's Billpay service, and the Money You Owe online service.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 24

2018–19 Portfolio Budget Statements, page 34

Data source

Enterprise Data Warehouse - Debt Under Arrangement report.

Calculation formula

A ÷ B × 100

A. Value of debt under arrangement as at end of reporting period.

B. Value of total outstanding debt as at end of reporting period.

Calculation variables

Both A and B:

· include debts with external collection agents

· exclude debts with a status of temporary write off or held/parked, as they are not recoverable debts and cannot be subject to an arrangement. Debt under arrangement does not include debts temporarily written off (debt recovery suspended).
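A minimal sketch of this ratio over hypothetical debt records, applying the status exclusions above to both the numerator and the denominator:

```python
# Debt under recovery = A / B * 100, excluding debts with a temporary
# write-off or held/parked status from both A and B. All records are
# hypothetical.

debts = [
    {"amount": 1200.0, "status": "active", "under_arrangement": True},
    {"amount": 800.0,  "status": "active", "under_arrangement": False},
    {"amount": 500.0,  "status": "active", "under_arrangement": True},   # e.g. with an external collection agent
    {"amount": 300.0,  "status": "temporary_write_off", "under_arrangement": False},  # excluded
]

recoverable = [d for d in debts if d["status"] == "active"]

a = sum(d["amount"] for d in recoverable if d["under_arrangement"])  # debt under arrangement
b = sum(d["amount"] for d in recoverable)                            # total outstanding debt

print(f"{a / b * 100:.1f}%")  # 68.0% against the >=60% target
```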

Notes and Definitions

The definition for a current debt recovery arrangement is that:

· the debt has been determined and the customer owes the department the current outstanding amount

· the customer has a current, pending, future or broken (recent) arrangement in place at the reporting date.

A broken arrangement is where a customer has a current payment arrangement in place but has missed their regular agreed payment. The system allows a period of time in which the payment is overdue before the arrangement is no longer considered an agreed payment arrangement and is cancelled. When this happens, the debt is no longer considered to be under arrangement.

The performance measure was updated in October 2018 to include debts referred to external collection agents for debt recovery. The historical results provide context for performance; however, the result for 2018–19 is not directly comparable with previous years.

Program 1.2 Services to the Community—Health—Performance overview

Performance measure | Met
1. Satisfaction with Medicare provider service delivery: Practitioners, pharmacists and practice managers. | ✓
2. Customer satisfaction: Achievement of customer satisfaction standards. | ✓
3. Achievement of digital service level standards: Medicare Benefits Schedule digital claiming rate. | ✓
4. Achievement of digital service level standards: Departmental interactions completed via digital channels. | ✓
5. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access. | ✓
6. Achievement of payment quality standards: Medicare: Delivery of accurate medical benefits and services. | ✓
7. Achievement of face-to-face service level standards: Average wait time. | ✓
8. Achievement of telephony service level standards: Average speed of answer: Pharmaceutical Benefits Scheme Authorities and My Health Record Providers. | ✓
9. Achievement of telephony service level standards: Average speed of answer: Providers. | ✓
10. Achievement of telephony service level standards: Average speed of answer: Customers. | ✓
11. Achievement of processing service level standards: Claims processed within standard. | ✓

Services to the Community – Health – Program performance overview

The department met all 11 of its Health performance measures in 2018–19. This compares with meeting 13 of 14 measures in 2017–18 and 12 of 14 measures in 2016–17. Five performance measure results improved, four are comparable and one declined, compared to 2017–18 results. The result for Medicare provider satisfaction is reported as an overall figure and is not directly comparable to the individual provider segments results for 2017–18.

By measuring the average wait time and speed of answer across telephony and face-to-face services, the department is able to understand the customer experience and highlight areas for improvement, supporting high-level standards across health services.

The department is able to determine the wait times for customers, providers and medical professionals, such as Pharmaceutical Benefits Scheme Authorities and My Health Record Providers. In doing so, the department upholds standards for customer engagement and supports providers, medical practitioners and customers through efficient service delivery.

The department is also able to support the health care needs of families, individuals and communities over a range of face-to-face, telephony and digital channels. The use of 24/7 digital channels continues to grow, with a 5.7 per cent increase in 2018–19. We will continue to encourage digital claiming and utilise resources to assist customers, providers and medical practitioners in achieving greater self-sufficiency and accessing support services.

Underpinning this work were reliable ICT systems that were available 99.4 per cent of the time, enabling greater uptake of digital services and providing customers with 24/7 access. To help maintain and improve digital servicing, further investment in the department's ICT systems will be required.

Overall, the 2018–19 results for the Health Program performance measures demonstrate that the department has achieved its purpose of delivering high-quality services and payments for the community on behalf of Government.

1. Satisfaction with Medicare provider service delivery: Practitioners, pharmacists and practice managers.

Criterion

1. Satisfaction with Medicare provider service delivery: Practitioners, pharmacists and practice managers.

Target

≥70 per cent of practitioners, pharmacists and practice managers are satisfied with or neutral towards the service provided.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥70% | 96.8% | ✓ | N/A

Analysis

Overall, practitioners, pharmacists, and practice managers were most satisfied with:

· how the department respects people’s rights including privacy (97.8 per cent, 100 per cent and 100 per cent respectively)

· the department’s customer service (96.8 per cent, 97 per cent and 90.8 per cent)

· the accuracy of claims and payments (95.2 per cent, 97.8 per cent and 90.1 per cent)

· how easy it is to understand the information the department gives (94.8 per cent, 95 per cent and 85 per cent).

Practitioners, pharmacists and practice managers continue to prefer to contact the department via phone:

· 73 per cent of practitioners contacted the department via phone in the last 12 months, compared to 46 per cent via email, 43 per cent via the department’s website/online and 8 per cent via post.

· 93 per cent of pharmacists contacted the department via phone in the last 12 months, compared to 84 per cent through online claiming, 78 per cent via the department’s website and 31 per cent via email.

· 93.1 per cent of practice managers contacted the department via phone in the last few months, compared to 53.5 per cent via email, 59.4 per cent via the department's website/online and 33.7 per cent via post.

· 91.7 per cent of practitioners, 94.6 per cent of pharmacists and 85.1 per cent of practice managers who contacted the department via phone were satisfied with their interactions.

The table below includes historical results for the individual provider segments:

Provider | 2016–17 | 2017–18 | 2018–19
Practitioners | 95.0% | 86.4% | 96.8% (overall result, all segments combined)
Pharmacists | 94.0% | 95.5% | —
Practice managers | 82.0% | 86.0% | —

Method

Survey.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 25

2018–19 Portfolio Budget Statements, page 35

The title of this performance measure has been updated from 'General practitioners' to 'Practitioners' to accurately reflect that a broader cohort of Medicare and ancillary providers are surveyed.

Data source

Survey.

Calculation formula

A ÷ B × 100

A. The weighted number of respondents who provide a rating of 3 or above to the overall satisfaction question.

B. The total weighted number of respondents who respond to the overall satisfaction question.

Calculation variables

· Respondents who respond ‘don’t know’, ‘not applicable’ or refuse to answer the question are excluded from the calculation.

· All questions are measured on a 5-point scale, with 3 being neutral.
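A minimal sketch of this weighted calculation follows; the ratings and survey weights are hypothetical, chosen only to show how the exclusions and the 3-or-above threshold apply.

```python
# Weighted satisfaction = A / B * 100. Ratings of 3 (neutral) or above
# count toward A; 'don't know', 'not applicable' and refusals are
# excluded. Ratings and weights below are hypothetical.

responses = [
    {"rating": 5,    "weight": 1.2},
    {"rating": 3,    "weight": 0.8},   # neutral still counts toward A
    {"rating": 2,    "weight": 1.0},
    {"rating": None, "weight": 0.9},   # 'don't know' / refused: excluded
]

valid = [r for r in responses if r["rating"] is not None]

a = sum(r["weight"] for r in valid if r["rating"] >= 3)  # satisfied or neutral
b = sum(r["weight"] for r in valid)                      # all valid respondents

print(f"{a / b * 100:.1f}%")  # 66.7% for these values
```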

Notes and Definitions

Based on an external review, the three Medicare provider performance measures from 2017–18 were consolidated into a single performance measure for 2018–19 and beyond. Because of this change, results are not directly comparable to those of previous years.

2. Customer satisfaction: Achievement of customer satisfaction standards.

Criterion

2. Customer satisfaction: Achievement of customer satisfaction standards.

Target

Customer Satisfaction of ≥85 out of 100. This is the level of satisfaction survey respondents have with their most recent interaction.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥85 out of 100 | 86.1 | ✓ | N/A
2017–18 | ≥85% | 83.7% | × | 2.0%
2016–17 | ≥85% | 81.7% | × | 4.1%

Analysis

The annual result reflects the department's increased focus on customer-centric service design and delivery. The department continued to adapt its service delivery to meet customer expectations for convenient servicing, alongside the one-on-one support provided to customers in service centres.

The top performing drivers of satisfaction in 2018–19 were the staff-related drivers of 'fair treatment' (94.5) and 'communication' (92.6).

The lowest performing drivers of satisfaction in 2018–19 were 'time to receive service' (71.0) and 'effort' (83.2).

The historical results below provide context for performance; however, the results for 2018–19 are not directly comparable with previous years due to a change in how the measure is calculated. Results for the individual drivers of satisfaction were:

Satisfaction drivers | 2016–17 | 2017–18 | Satisfaction drivers | 2018–19
Perceived quality | 78.5% | 81.5% | Perceived quality | 84.6 out of 100
Personalised service | 83.5% | 87.9% | Personalised service | 91.4 out of 100
Communication | 89.6% | 92.1% | Communication | 92.6 out of 100
Time to receive service | 69.2% | 68.2% | Time to receive service | 71.0 out of 100
Fair treatment | 93.3% | 94.2% | Fair treatment | 94.5 out of 100
Ease of access | 75.8% | 78.1% | Effort | 83.2 out of 100

Method

Customer survey

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 25

2018–19 Portfolio Budget Statements, page 35

Data source

Customer survey

Calculation formula

A

A. Overall Index Score for the face-to-face channel.

Note: additional service channels will be added to the calculation when available, including the overall Index Scores for the telephony, online and mobile apps channels.

Calculation variables

Based on the responses provided to the drivers of satisfaction questions, an index score ranging from 0 to 100 is established for each survey respondent.

The survey questions align with the six drivers, with questions tailored to the service channel. Personalised service and fair treatment relate to staff-service and are only applicable to the face-to-face and telephony channels. The remaining four drivers are applicable across all channels.

The drivers are measured as follows:

· perceived quality is a measure of the overall quality of service.

· personalised service is a measure of staff taking into account individual circumstances.

· communication is a measure of staff communication/clarity of online information.

· time to receive service is a measure of time taken to complete business.

· fair treatment is a measure of staff treating customers with respect.

· effort is measured by the customer’s assessment of the ease of handling their request.

All questions are measured on a 5-point scale, with 3 being neutral.

Notes and Definitions

A number of changes were made to this performance measure for 2018–19:

· changed the measurement approach from a percentage-based measure (≥85 per cent) to an index-based measure (≥85 out of 100). This change was adopted as an index-based approach provides a more accurate result, being more sensitive to changes in ratings.

· updated the survey questionnaire to improve response rates and enhance our understanding of both satisfaction and customer experience.

All results are derived from the customer survey and are the customer’s perception.

The historical results provide context for performance; however, the result for 2018–19 is not directly comparable with previous years.

Due to a change in survey provider, 2017–18 results are for 1 September 2017 to 30 June 2018 only.

Teradata issues associated with extraction of the Medicare sample resulted in a limited sample being used for March 2019 (only 9 days of transactions) and no sample being used for April or May 2019. This reduces the confidence level for the overall Medicare result.

3. Achievement of digital service level standards: Medicare Benefits Schedule digital claiming rate.

Criterion

3. Achievement of digital service level standards: Medicare Benefits Schedule digital claiming rate.

Target

≥97 per cent of Medicare claimed services are lodged electronically across all digital Medicare service channels.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥97% | 98.6% | ✓ | 0.7%
2017–18 | ≥96% | 97.9% | ✓ | 0.8%
2016–17 | ≥96% | 97.1% | ✓ | 1.0%

Analysis

In 2018–19, the department processed 429.6 million Medicare services, and paid $24.4 billion in benefits. In comparison, the department processed 419.9 million Medicare services and paid $23.5 billion in benefits in 2017–18, and processed 399.4 million Medicare services and paid $22.4 billion in benefits in 2016–17.

The annual result reflects the department's commitment to increasing digital claiming, by providing a convenient option for customers that results in fast processing of claims.

This is reflected in a 0.2 per cent increase in digital Bulk Bill claiming over the 12-month period (99.3 per cent digital claiming), a 4.2 per cent increase in digital patient claims (93.8 per cent digital claiming), and a 0.1 per cent increase in digital Simplified Billing claiming (99.7 per cent digital claiming).

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 25

2018–19 Portfolio Budget Statements, page 35

Data source

Data extract.

Calculation formula

A ÷ B × 100

A. Digitally lodged Medicare Benefits Schedule (MBS) Services.

B. Total lodged MBS services.

Calculation variables

Total lodged MBS services = digital + non-digital lodgement.

Notes and Definitions

MBS service types include:

· Bulk Billing, Patient Claiming and Simplified Billing.

Bulk Bill claiming is where the patient assigns their right to the Medicare benefit to the health professional. There are no out of pocket costs for the patient as the health professional claims the Medicare benefit for the service directly from Medicare.

Digital claiming is where a Medicare benefit is claimed electronically from Medicare – this can either be the claimant submitting the claim electronically or the health professional submitting the claim electronically on the claimant’s behalf.

Simplified Billing allows for claims to be submitted for unpaid in-patient services where a patient has been admitted as a private patient of a public or private hospital. Simplified Billing enables a patient to assign their right to benefit to the private health insurer or billing agent.

MBS lodgement channels include:

· digital lodgement: Medicare Online, Medicare Easyclaim, Health Provider Online Services, Electronic Claim Lodgement and Information Processing Service Environment, Simple Mail Transfer Protocol, Claiming Medicare Benefits Online via MyGov, Express Plus Medicare App.

· non-digital lodgement: Manual.

4. Achievement of digital service level standards: Departmental interactions completed via digital channels.

Criterion

4. Achievement of digital service level standards: Departmental interactions completed via digital channels.

Target

≥5 per cent increase in the total number of interactions conducted via digital channels compared with 2017–18.

Results

Year | Target | Result | Achieved
2018–19 | ≥5% increase | 5.7% increase | ✓
2017–18 | ≥5% increase | 8.2% increase | ✓
2016–17 | ≥5% increase | 6.2% increase | ✓

Analysis

Patient verifications undertaken online continue to be a convenient method for providers to verify patient details for billing purposes. These verifications contribute the majority of transactions to this performance measure each year. Work with software vendors, together with ongoing upgrades to our digital interaction options, continues to produce steady increases in self-service transactions each year.

There was a 5.7 per cent increase in Medicare interactions completed through digital channels compared to 2017–18. This increase comprises:

· 26.9 million Online Concessional Entitlement Verifications and Online Patient Verifications, an increase of 3.1 million

· 4.3 million Health Provider Online Services transactions, an increase of 0.9 million

· 1.3 million Medicare Online Account transactions, an increase of 0.2 million.

Method

Data mining.

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 26

2018–19 Portfolio Budget Statements, page 35

Data source

Enterprise Data Warehouse.

Calculation formula

(A - B) ÷ B × 100

A. Total number of departmental interactions by customers, providers and third parties conducted via digital channels in 2018–19.

B. Total number of departmental interactions by customers, providers and third parties conducted via digital channels in 2017–18.

Calculation variables

Departmental interactions include the following self-service transactions and electronic interactions:

· Health Professional Online Services.

· Medicare Online Accounts – via MyGov and Express Plus Medicare App.

· Online Concessional Entitlement Verification and Online Patient Verification.

Notes and Definitions

Patient verification volumes have been adjusted to exclude transactions caused by an external software vendor experiencing a technical issue that resulted in multiple searches for the same transaction initially being recorded in the system. A review of the previous transaction history for this vendor shows an average transaction volume of 1.3 million per month. This averaged volume has been used to re-calculate the result for the impacted period.

Medicare Benefits Schedule service claim transactions are captured under the ‘Achievement of digital service level standards: Medicare Benefits Schedule digital claiming rate’ performance measure.

5. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access.

Criterion

5. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access.

Target

ICT systems that support 24/7 customer access are available ≥98 per cent of the time.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥98% | 99.4% | ✓ | -0.2%
2017–18 | ≥98% | 99.6% | ✓ | 0.2%
2016–17 | ≥98% | 99.4% | ✓ | No change

Analysis

The department continued to meet digital service level standards relating to the availability of ICT services, with customers able to access ICT systems 99.4 per cent of the time.

To improve customer experience and security, a number of system changes were made during 2018–19 to reduce maintenance periods. The department also moved to a new configuration management platform to improve the stability and availability of digital services, including Express Plus Medicare and Medicare online accounts.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 26

2018–19 Portfolio Budget Statements, page 35

Data source

Internal data sources including incident records, problem records, and scheduled maintenance periods are utilised to calculate the availability result.

Calculation formula

(A ÷ B) × 100

A. Service uptime (availability window minus outage time).

B. Availability window (total service hours minus scheduled maintenance periods).

Calculation variables

The services supporting 24/7 customer access included in this calculation are:

· Medicare online accounts.

· Express Plus Medicare mobile applications.

Notes and Definitions

· The title was updated in 2018–19 to clearly articulate that scheduled maintenance periods are excluded from the calculation of this performance measure. The calculation formula for this performance measure has not changed.

· Outage time means a disruption to Medicare online accounts or Express Plus Medicare mobile applications.

6. Achievement of payment quality standards: Medicare: Delivery of accurate medical benefits and services.

Criterion

6. Achievement of payment quality standards: Medicare: Delivery of accurate medical benefits and services.

Target

≥98 per cent of medical benefits and services are delivered accurately.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥98% | 98.8% | ✓ | No change
2017–18 | ≥98% | 98.8% | ✓ | -0.3%
2016–17 | ≥98% | 99.1% | ✓ | -1.1%

Analysis

In 2018–19, the department processed 429.6 million Medicare services, and paid $24.4 billion in benefits. In comparison, the department processed 419.9 million Medicare services and paid $23.5 billion in benefits in 2017–18, and processed 399.4 million Medicare services and paid $22.4 billion in benefits in 2016–17.

The department continues to invest in its people to ensure they have the necessary skills to meet this performance measure. While staff are developing their capability to process medical benefits and services, all of their work is quality checked until they are able to process with reduced supervision.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 26

2018–19 Portfolio Budget Statements, page 35

Data source

Enterprise Data Warehouse.

Calculation formula

(A - B) ÷ A × 100

A. Total of manually and automatically processed and paid services.

B. Weighted number of errors for manually and automatically processed and paid services.

Calculation variables

There are two components to this calculation. The first relates to the quality of manual data entry associated with payment of a service. The second relates to the number of payment errors for automatically processed and paid services that do not require any manual intervention.
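For illustration, the sketch below combines the two components into the (A − B) ÷ A × 100 formula. The split between manual and automatic volumes and the weighted error counts are hypothetical; only the totals' order of magnitude follows the figures reported above.

```python
# Accuracy = (A - B) / A * 100, where A is the total of processed and
# paid services and B the weighted number of errors across the manual
# and automatic components. All counts are hypothetical.

manual_services = 40_000_000
automatic_services = 389_600_000
manual_errors_weighted = 1_500_000       # weighted data-entry errors
automatic_errors_weighted = 3_600_000    # weighted payment errors

a = manual_services + automatic_services                  # A
b = manual_errors_weighted + automatic_errors_weighted    # B

print(f"{(a - b) / a * 100:.1f}%")  # 98.8% against the >=98% target
```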

Notes and Definitions

Manually processed and paid services:

· Medicare Manual and those submitted digitally requiring operator intervention.

· Medicare Eligibility Enrolments.

· Medicare Safety Net Registrations.

Automatically processed and paid services:

· Bulk Bill, Patient Claim and Simplified Billing Services submitted digitally.

7. Achievement of face-to-face service level standards: Average wait time.

Criterion

7. Achievement of face-to-face service level standards: Average wait time.

Target

The average length of time a customer waits to access face-to-face services in our service centres is ≤15 minutes.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≤15 minutes | 13 minutes 9 seconds | ✓ | -5 seconds
2017–18 | ≤15 minutes | 13 minutes 14 seconds | ✓ | 2 minutes 43 seconds
2016–17 | ≤15 minutes | 10 minutes 31 seconds | ✓ | 2 minutes 23 seconds

Analysis

The target was met through effective demand management, including by using the Front of House application to identify customers who could be serviced virtually over the phone by staff in other locations when wait times were longer.

The face-to-face channel serviced approximately 2.9 million Health contacts during 2018–19, a 2.7 per cent decrease compared to 2017–18. The reduced demand is in part due to increased uptake of digital channels.

In 2018–19, the reporting of annual performance has been affected by customers who complete multiple transactions in a single visit to a Service Centre:

· Currently, our reporting records customers as receiving a Social Security and Welfare or Health service rather than being able to record that they received both services in the same interaction.

· Customers attending in relation to a Social Security and Welfare enquiry may also complete a Health enquiry at the same time.

The underlying methodology for this performance measure does not reflect the current service offer, where many staff are now skilled to complete both Social Security and Welfare and Health services during the same visit, and this affects reporting. For example, 17 per cent (over 2.8 million) of Health contacts are not represented in the data for 2018–19. To provide a more accurate and integrated view of face-to-face customer interactions, we are working towards aligning the performance measure methodology with the current service offer.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 27

2018–19 Portfolio Budget Statements, page 36

Data source

Enterprise Data Warehouse/SAS Enterprise Guide.

Calculation formula

A ÷ B

A. Total wait time of customers serviced in the face-to-face channel.

B. Total number of customers serviced in the face-to-face channel.

Calculation variables

· Data is initially captured in the SAPUI5 Front of House application. It includes Virtual Waiting Room (VWR) and resolved enquiry data.

· Abandoned contacts are not included in the result.

· Wait times attributed to reassigned contacts are not included in the result.

· A zero wait time is recorded for small sites where customer enquiries are resolved at first contact with a Customer Liaison Officer.

· Customers serviced via outreach arrangements or by agents are not included in this calculation. These contacts are not recorded in the queue management system.

· Customers who attend a site to use self-service facilities that are not recorded by a ticketing machine or Customer Liaison Officer are not included in this calculation. These contacts are not recorded in the queue management system.
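For illustration only, a minimal Python sketch of the average wait time calculation (A ÷ B), applying the exclusions listed above; the contact statuses and wait times are invented, not drawn from the Front of House application.

```python
# Abandoned and reassigned contacts are excluded; only serviced
# contacts contribute to total wait (A) and the count (B).
from dataclasses import dataclass

@dataclass
class Contact:
    wait_seconds: int
    status: str  # e.g. 'serviced', 'abandoned', 'reassigned'

def average_wait(contacts: list[Contact]) -> float:
    """Average wait in seconds over serviced contacts only."""
    serviced = [c for c in contacts if c.status == "serviced"]
    total_wait = sum(c.wait_seconds for c in serviced)  # A
    return total_wait / len(serviced)                   # B

contacts = [
    Contact(600, "serviced"),
    Contact(120, "abandoned"),   # excluded from the result
    Contact(900, "reassigned"),  # excluded from the result
    Contact(978, "serviced"),
]
print(average_wait(contacts))  # 789.0 seconds, i.e. 13 min 9 s
```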

Notes and Definitions

· SAPUI5 Front of House application is a single browser-based application that operates on both desktop and mobile devices to manage Front of House customer contacts.

· VWR refers to all contacts that have a status of waiting, assigned and presented.

· Abandoned contacts refers to the count of customers booked into the VWR that left the service centre before being served.

· Reassigned contacts are a count of walk-ins reassigned to the wait room. If a person enters the service centre to discuss matters in more than one program, for example Centrelink and Medicare, they are placed in a wait room for one program and, once those transactions are complete, returned to the wait room before completing transactions for the other program. This second wait is recorded as a reassigned contact.

8. Achievement of telephony service level standards: Average speed of answer: Pharmaceutical Benefits Scheme Authorities and My Health Record Providers.

Criterion

8. Achievement of telephony service level standards: Average speed of answer: Pharmaceutical Benefits Scheme Authorities and My Health Record Providers.

Target

The average length of time a Pharmaceutical Benefits Scheme Authority or My Health Record provider waits to have a call answered through our telephony services is ≤30 seconds.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≤30 seconds | 28 seconds | ✓ | -2 seconds
2017–18 | ≤30 seconds | 30 seconds | ✓ | -10 seconds
2016–17 | ≤30 seconds | 40 seconds | × | No change

Analysis

In 2018–19, the department shifted to an ‘opt-out’ participation model for the My Health Record system.

In comparison to 2017–18, there was a 4.3 per cent increase in the number of calls from Pharmaceutical Benefits Scheme Authorities and My Health Record Providers answered by the department.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 27

2018–19 Portfolio Budget Statements, page 36

Data source

Telstra Computer Telephony Interface files

Calculation formula

A ÷ B

A. Total time providers waited for their call to be answered.

B. Total number of calls answered by service officers.

Calculation variables

· Average speed of answer is measured from the time a provider enters the queue to the time their call is answered by a service officer.

· Calls transferred internally between queues are counted as separate calls with separate wait times and are included in this calculation.

· Calls that are abandoned after entering a queue are not included in the calculation as the calculation measures how long calls have waited to be answered only.
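For illustration only, a minimal Python sketch of the average speed of answer calculation (A ÷ B), assuming each internal transfer produces a separate queue record as described above; the figures are invented. The same calculation applies to the other average speed of answer measures below.

```python
# Each answered call is a list of per-queue waits: a transferred
# call yields one entry per queue. Abandoned calls are ignored.

def average_speed_of_answer(queue_waits: list[list[int]],
                            abandoned: list[int]) -> float:
    """abandoned waits are accepted but deliberately unused,
    mirroring their exclusion from the calculation."""
    waits = [w for call in queue_waits for w in call]  # flatten
    return sum(waits) / len(waits)                     # A / B

# One direct call (20 s) and one call transferred once (30 s + 34 s):
print(average_speed_of_answer([[20], [30, 34]], abandoned=[90]))  # 28.0
```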

Notes and Definitions

Nil.

9. Achievement of telephony service level standards: Average speed of answer: Providers.

Criterion

9. Achievement of telephony service level standards: Average speed of answer: Providers.

Target

The average length of time a provider waits to have a call answered through our telephony services is ≤2 minutes.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≤2 minutes | 1 minute 54 seconds | ✓ | 1 second
2017–18 | ≤2 minutes | 1 minute 53 seconds | ✓ | -3 seconds
2016–17 | ≤2 minutes | 1 minute 56 seconds | ✓ | 29 seconds

Analysis

In comparison to 2017–18, there was a 12 per cent decrease in the number of calls from providers answered by the department.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 27

2018–19 Portfolio Budget Statements, page 36

Data source

Telstra Computer Telephony Interface files

Calculation formula

A ÷ B

A. Total time providers waited for their call to be answered.

B. Total number of calls answered by service officers.

Calculation variables

· Average speed of answer is measured from the time a provider enters the queue to the time their call is answered by a service officer.

· Calls transferred internally between queues are counted as separate calls with separate wait times and are included in this calculation.

· Calls that are abandoned after entering a queue are not included in the calculation as the calculation measures how long calls have waited to be answered only.

Notes and Definitions

Providers includes all types of health practitioners, practice managers and pharmacists who contact the 13 21 50 Provider Enquiry telephone line.

10. Achievement of telephony service level standards: Average speed of answer: Customers.

Criterion

10. Achievement of telephony service level standards: Average speed of answer: Customers.

Target

The average length of time a customer waits to have a call answered through our telephony services is ≤7 minutes.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≤7 minutes | 6 minutes 59 seconds | ✓ | 3 seconds
2017–18 | ≤7 minutes | 6 minutes 56 seconds | ✓ | 22 seconds
2016–17 | ≤7 minutes | 6 minutes 34 seconds | ✓ | -17 seconds

Analysis

In comparison to 2017–18, there was a 3.6 per cent decrease in the number of calls from customers answered by the department.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 28

2018–19 Portfolio Budget Statements, page 36

Data source

Telstra Computer Telephony Interface files.

Calculation formula

A ÷ B

A. Total time customers waited for their call to be answered.

B. Total number of calls answered by service officers.

Calculation variables

· Average speed of answer is measured from the time a customer enters the queue to the time their call is answered by a service officer.

· Calls transferred internally between queues are counted as separate calls with separate wait times and are included in this calculation.

· Calls that are abandoned after entering a queue are not included in the calculation as the calculation measures how long calls have waited to be answered only.

Notes and Definitions

Health customer refers to a member of the public only, not businesses or health practitioners.

11. Achievement of processing service level standards: Claims processed within standard.

Criterion

11. Achievement of processing service level standards: Claims processed within standard.

Target

≥82 per cent of claims processed within standard.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥82% | 90.3% | ✓ | -7.1%
2017–18 | ≥82% | 97.4% | ✓ | 2.0%
2016–17 | ≥82% | 95.4% | ✓ | -3.9%

Analysis

During 2018–19, 388 million of 430 million Medicare service claims were processed within standard. The department carefully monitored staff scheduling to ensure the performance measure target was met this year.

While peaks and troughs are expected for some work types (e.g. provider registration has an annual cycle), unexpected spikes in incoming work can affect the achievement of performance measure targets.

During 2018–19, an increased number of services were processed and a greater value of benefits paid, and the average period from lodgement to processing decreased significantly:

Measure | 2016–17 | 2017–18 | 2018–19
Bulk billing services | 313.6 million | 332.3 million | 341.0 million
Patient claiming services | 52.9 million | 53.5 million | 54.0 million
Simplified billing services | 32.9 million | 34.0 million | 34.5 million
Total services processed | 399.4 million | 419.9 million | 429.6 million
Bulk billing benefits | $15.6 billion | $16.5 billion | $17.1 billion
Patient claiming benefits | $4.3 billion | $4.4 billion | $4.5 billion
Simplified billing benefits | $2.5 billion | $2.6 billion | $2.7 billion
Total benefits paid | $22.4 billion | $23.5 billion | $24.4 billion
Average benefit per service | $56.08 | $56.04 | $56.78
Average period (date of lodgement to processing) | 2.5 days | 2.3 days | 0.88 days

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 28

2018–19 Portfolio Budget Statements, page 36

Data source

Enterprise Data Warehouse.

Calculation formula

A ÷ B × 100

A. Total number of Medicare Benefits Schedule (MBS) services processed within standard.

B. Total number of MBS services processed.

Calculation variables

The measure only includes MBS processed service items.

The data includes both manually and digitally lodged MBS services.

The data measures the timeframe from the date of lodgement to the date that processing is complete.

MBS services include:

· Bulk Billing, Patient Claiming and Simplified Billing.

MBS lodgement channels and standards are as follows:

· standard of 2 days:

Digital lodgement: Medicare Online, Medicare Easyclaim, Health Provider Online Services, Electronic Claim Lodgement and Information Processing Service Environment.

· standard of 7 days:

Simple Mail Transfer Protocol, Claiming Medicare Benefits Online via MyGov, Express Plus Medicare App.

· standard of 21 days:

Non-digital lodgement: Post, Face to face, Teleclaims, Two-way (via health funds).
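For illustration only, a minimal Python sketch of the within-standard test under these channel standards; the channel groupings are paraphrased from the list above and the claims are invented.

```python
# Each lodgement channel group carries its own processing standard.
STANDARD_DAYS = {
    "digital": 2,       # e.g. Medicare Online, Medicare Easyclaim
    "self_service": 7,  # e.g. myGov online claiming, Express Plus app
    "non_digital": 21,  # e.g. post, face-to-face, Teleclaims
}

def within_standard(channel: str, lodged_day: int,
                    processed_day: int) -> bool:
    """True if processing time meets the channel's standard."""
    return (processed_day - lodged_day) <= STANDARD_DAYS[channel]

claims = [("digital", 0, 1), ("non_digital", 0, 25), ("self_service", 0, 6)]
met = sum(within_standard(*c) for c in claims)
print(f"{met / len(claims) * 100:.1f}% within standard")  # 66.7%
```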

Notes and Definitions

Nil.

Program 1.3 Child Support—Program performance overview

Performance measure | Met
1. Customer satisfaction: Achievement of customer satisfaction standards. | ×
2. Achievement of digital service level standards: Departmental interactions completed via digital channels. | ✓
3. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access. | ✓
4. Child Support collection: Percentage of domestic active paying parents with less than one month Child Support collect liability. | ✓
5. Achievement of telephony service level standards: Average speed of answer. | ×
6. Achievement of processing service level standards: New registrations completed within standard. | ×
7. Achievement of payment quality standards: Child Support: Debt under arrangement. | ×

Child Support – Program performance overview

The department met three of seven Child Support performance measures in 2018–19. In comparison, the department met three of seven in 2017–18 and five of seven in 2016–17. Three performance measure results improved, two were comparable and one declined compared to 2017–18 results. The Child Support debt under arrangement performance measure is new for 2018–19.

The department supports families, parents and children through the Child Support Program. The Program aims to ensure that children receive an appropriate level of financial support from separated or separating parents. We provide child support assessment, registration, collection and disbursement services to parents and non-parent carers such as grandparents, legal guardians and other family members. A range of referral services and products are also provided to separated parents and non-parent carers to help them with their child support needs.

The Child Support Program provides services to parents and non-parent carers in a timely manner for the benefit of children. The department provides services to parents and non-parent carers through telephony and digital channels. This year, the department implemented a number of changes to improve the Program in line with the Government Response to the Parliamentary Inquiry into the Child Support Program—Implementation 2017–18 Budget measure.

Interactions completed through digital channels for the Child Support Program grew by 8.3 per cent in 2018–19, with the department successfully migrating over 530,000 customers to a new online system. We will continue to encourage the use of digital channels to assist customers in achieving greater self-sufficiency.

Underpinning this work were reliable ICT systems that were available 99.6 per cent of the time, enabling greater uptake of digital services and providing customers with 24/7 access.

A large focus of our work is reducing child support debt through compliance and enforcement programs. By providing payment arrangements to paying parents who still owe child support debt, we enable parents to manage their debt and repayments in a self-sufficient manner.

In 2018–19, the department worked with separated parents to facilitate the transfer of $3.7 billion to support approximately 1.2 million children.

The department continued to balance telephony strategies and case management processes to ensure that new registrations were completed in a timely manner to better assist customers.

Overall, the 2018–19 results for the Child Support Program performance measures are consistent with previous years. While the average speed of answer for Child Support customers was longer in 2018–19 compared to 2017–18, the other performance measures that did not meet annual targets in 2017–18 improved in 2018–19, demonstrating the department’s commitment to delivering high-quality services and payments for the community on behalf of Government.

1. Customer satisfaction: Achievement of customer satisfaction standards.

Criterion

1. Customer satisfaction: Achievement of customer satisfaction standards.

Target

Customer Satisfaction of ≥85 out of 100. This is the level of satisfaction survey respondents have with their most recent interaction.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥85 out of 100 | 75.2 out of 100 | × | N/A
2017–18 | ≥85% | 74.5% | × | -10.4%
2016–17 | ≥85% | 84.9% | × | 9.3%

Analysis

While we delivered a range of activities aimed at improving customer experience this financial year, the target for 2018–19 was not achieved.

An index score of 75 out of 100 indicates an average customer satisfaction rating of 4 out of 5 (Satisfied). This means that, overall, Child Support customers were satisfied with the service they received.

The top performing drivers of satisfaction in 2018–19 were the staff-related drivers of 'fair treatment' (85.6) and 'communication' (83.1).

The lowest performing drivers of satisfaction in 2018–19 were 'time to receive service' (56.3) and 'effort' (70.3); however, both results improved during 2018–19.

The historical results below provide context for performance; however, the results for 2018–19 are not directly comparable with previous years due to a change in how the measure is calculated. Results for the individual drivers of satisfaction were:

Satisfaction drivers (2016–17, 2017–18) | 2016–17 | 2017–18 | Satisfaction drivers (2018–19) | 2018–19
Perceived quality | 85.7% | 78.3% | Perceived quality | 80.3 out of 100
Personalised service | 88.2% | 85.7% | Personalised service | 81.6 out of 100
Communication | 90.4% | 87.9% | Communication | 83.1 out of 100
Time to receive service | 75.0% | 50.9% | Time to receive service | 56.3 out of 100
Fair treatment | 92.8% | 89.6% | Fair treatment | 85.6 out of 100
Ease of access | 77.1% | 54.9% | Effort | 70.3 out of 100

Method

Customer survey.

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 29

2018–19 Portfolio Budget Statements, page 37

Data source

Customer survey.

Calculation formula

A. Overall Index Score for the telephony channel.

Note: additional service channels will be added to the calculation when available, including the overall Index Scores for the online and mobile app channels.

Calculation variables

Based on the responses provided to the drivers of satisfaction questions, an index score ranging from 0 to 100 is established for each survey respondent.

The survey questions align with the six drivers, with questions tailored to the service channel. Personalised service and fair treatment relate to staff-assisted service and are only applicable to the telephony channel for Child Support. The remaining four drivers are applicable across all channels.

The drivers are measured as follows:

· perceived quality is a measure of the overall quality of service.

· personalised service is a measure of staff taking into account individual circumstances.

· communication is a measure of staff communication/clarity of online information.

· time to receive service is a measure of time taken to complete business.

· fair treatment is a measure of staff treating customers with respect.

· effort is measured by the customer’s assessment of the ease of handling their request.

All questions are measured on a 5-point scale, with 3 being neutral.
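The exact index construction is not published here; the following is a minimal sketch only, assuming each 5-point response is rescaled linearly to 0–100 and respondents' driver scores are averaged. The driver names and ratings below are illustrative.

```python
# Hypothetical index construction: rating 1 maps to 0, rating 5
# maps to 100, and the respondent's index is the driver average.

def respondent_index(responses: dict[str, int]) -> float:
    """Map 1-5 ratings to a 0-100 index and average across drivers."""
    rescaled = [(r - 1) / 4 * 100 for r in responses.values()]
    return sum(rescaled) / len(rescaled)

answers = {"perceived quality": 4, "communication": 4,
           "time to receive service": 3, "effort": 4}
print(respondent_index(answers))  # 68.75
```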

Notes and Definitions

A number of changes were made to this performance measure for 2018–19:

· changed the measurement approach from a percentage-based measure (≥85 per cent) to an index-based measure (≥85 out of 100). This change was adopted because an index-based approach provides a more accurate result, being more sensitive to changes in ratings.

· updated the survey questionnaire to improve response rates and enhance our understanding of both satisfaction and customer experience.

All results are derived from the customer survey and are the customer’s perception.

The historical results provide context for performance, however, the result for 2018–19 is not directly comparable with previous years.

2. Achievement of digital service level standards: Departmental interactions completed via digital channels.

Criterion

2. Achievement of digital service level standards: Departmental interactions completed via digital channels.

Target

≥5 per cent increase in the total number of interactions conducted via digital channels compared with 2017–18.

Results

Year | Target | Result | Achieved
2018–19 | ≥5% increase | 8.3% increase | ✓
2017–18 | ≥5% increase | 1.3% increase | ×
2016–17 | ≥5% increase | 5.0% increase | ✓

Analysis

In 2018–19 the number of Child Support interactions completed through digital channels increased by 8.3 per cent. This increase comprised:

· 4.8 million online letters, an increase of almost 700,000

· 480,000 Child Support online account interactions, an increase of 128,000

· 296,000 phone self-service interactions, a reduction of 75,000

· 71,000 online registrations, a reduction of approximately 2,400

· 4.6 million electronic payments, a reduction of 41,000.

The department continues to improve its 24/7 online services to support customers to manage their affairs at a time to suit them. The department's continued investment in digital servicing will assist in driving strong performance against this performance measure in 2019–20.

Method

Data mining.

Rationale

This performance measure supports the strategic themes of customer-centric service delivery and modernisation.

Reference

2018–19 Corporate Plan, page 29

2018–19 Portfolio Budget Statements, page 37

Data source

Online IMS Reporting, COGNOS, Infoserve Reporting, SAP Outbound Correspondence, Pluto Online Services, Excel workbooks for electronic payments, and Child Support Adams System.

Calculation formula

(A - B) ÷ B × 100

A. Total number of departmental interactions by customers conducted through digital channels in 2018–19.

B. Total number of departmental interactions by customers conducted through digital channels in 2017–18.

Calculation variables

Departmental interactions include the following self-managed transactions and electronic interactions:

· online letters.

· customer online services.

· phone self-services.

· electronic payment transactions.

· online registrations.

The digital interactions included are:

· electronic letters.

· Child Support online account (all transactions from CUBA and PLUTO).

· phone self-services.

· online registrations (CUBA and PLUTO).

· Centrelink deductions.

· Gateway results.

· electronic payments – (Billpay, EFT, Bpay and Credit Card).
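For illustration only, a minimal Python sketch of the growth calculation ((A - B) ÷ B × 100), summing interaction counts across digital channels such as those listed above; the channel names and figures are placeholders, not reported values.

```python
# A is the current year's total digital interactions; B is the
# previous year's total. The result is the percentage change.

def digital_growth(current: dict[str, int],
                   previous: dict[str, int]) -> float:
    a = sum(current.values())   # total interactions this year
    b = sum(previous.values())  # total interactions last year
    return (a - b) / b * 100

prev = {"letters": 4_100_000, "online_accounts": 352_000,
        "payments": 4_641_000}
curr = {"letters": 4_800_000, "online_accounts": 480_000,
        "payments": 4_600_000}
print(f"{digital_growth(curr, prev):.1f}% increase")  # 8.7% increase
```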

Notes and Definitions

CUBA and PLUTO are the systems that support the online services Child Support customers use, including the Child Support online account and the Child Support mobile app.

3. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access.

Criterion

3. Achievement of digital service level standards: Availability of ICT services excluding scheduled maintenance periods that support 24/7 customer access.

Target

ICT systems that support 24/7 customer access are available ≥98 per cent of the time.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥98% | 99.6% | ✓ | -0.1%
2017–18 | ≥98% | 99.7% | ✓ | 0.2%
2016–17 | ≥98% | 99.5% | ✓ | -0.4%

Analysis

The department continued to meet digital service level standards relating to the availability of ICT services, with customers able to access ICT systems 99.6 per cent of the time.

A continued focus on customer experience has resulted in changes to internal processes that have improved the overall experience and security of digital services throughout 2018–19.

Method

Data mining of internal metrics and records to inform calculation of results.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 29

2018–19 Portfolio Budget Statements, page 37

Data source

Internal data sources including incident records, problem records, and scheduled maintenance periods are utilised to calculate the availability result.

Calculation formula

A ÷ B × 100

A. Service uptime (availability window minus outage time).

B. Availability window (total service hours minus scheduled maintenance periods).

Calculation variables

The services supporting 24/7 customer access included in this calculation are:

· Child Support online accounts.

· Express Plus Child Support.
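For illustration only, a minimal Python sketch of the availability calculation (A ÷ B × 100), assuming times are tracked in minutes; the outage and maintenance figures below are invented.

```python
# The availability window (B) excludes scheduled maintenance;
# uptime (A) is the window less unplanned outage time.

def availability(total_minutes: int, maintenance: int,
                 outage: int) -> float:
    window = total_minutes - maintenance  # B: availability window
    uptime = window - outage              # A: service uptime
    return uptime / window * 100

# A 365-day year with 40 hours of scheduled maintenance and
# roughly 35 hours of unplanned outages:
print(f"{availability(365 * 24 * 60, 40 * 60, 2_090):.1f}%")  # 99.6%
```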

Notes and Definitions

· The title was updated in 2018–19 to clearly articulate that scheduled maintenance periods are excluded from the calculation of this performance measure. The calculation formula for this performance measure has not changed.

· Outage time means a confirmed disruption to Child Support online accounts or Express Plus Child Support services.

4. Child Support collection: Percentage of domestic active paying parents with less than one month Child Support collect liability.

Criterion

4. Child Support collection: Percentage of domestic active paying parents with less than one month Child Support collect liability.

Target

≥63 per cent of domestic active paying parents in child support collect cases have less than one month liability outstanding.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥63% | 64.6% | ✓ | No change
2017–18 | ≥63% | 64.6% | ✓ | -1.4%
2016–17 | ≥63% | 66.0% | ✓ | 0.1%

Analysis

Parents and non-parent carers who are unable to arrange child support payments directly may ask the department to collect and transfer payments on their behalf. In providing this service, we use a customer-centric approach, offering a range of payment options to customers. The department has a number of business processes that contributed to the positive performance, including:

· setting up employer withholdings to ensure regular and timely payments

· early intervention follow-up where paying parents go into debt for the first time, to get them back on track early

· a debt repayment methodology that explores payment in full in every customer interaction to get payments back on track as quickly as possible.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 30

2018–19 Portfolio Budget Statements, page 37

Data source

Enterprise Data Warehouse.

Calculation formula

A ÷ B × 100

A. Number of domestic active paying parents in collect cases without debt or where debt is less than or equal to one month liability.

B. Number of domestic active paying parents in collect cases.

Calculation variables

The calculation for this measure is a point in time result as at 30 June each year.

Notes and Definitions

Domestic active paying parents in collect cases are:

· paying parents, with both parents residing in Australia

· involved in at least one active case (the case has not ended and at least one child is under 18 years of age)

· a Child Support collect case (i.e. not a private collect case).

5. Achievement of telephony service level standards: Average speed of answer.

Criterion

5. Achievement of telephony service level standards: Average speed of answer.

Target

The average length of time a customer waits to have a call answered through our telephony services is ≤3 minutes.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≤3 minutes | 12 minutes 45 seconds | × | 3 minutes 40 seconds
2017–18 | ≤3 minutes | 9 minutes 5 seconds | × | 4 minutes 53 seconds
2016–17 | ≤3 minutes | 4 minutes 12 seconds | × | 1 minute 19 seconds

Analysis

In 2018–19, the department answered more than 1.7 million Child Support calls and made over 1.6 million outbound calls to separated or separating parents. This compares with nearly 1.9 million answered calls and more than 1.5 million outbound calls in 2017–18.

The department continues to balance its resources to achieve customer outcomes across all performance measures. Despite all available resources contributing to the management of inbound telephony demand, the department's existing workforce capacity was insufficient to achieve the Child Support average speed of answer target.

The decline in 2018–19 performance continued to be affected by the transition of Child Support legacy technology to a new platform.

The Child Support Program is also unable to use the department's Service Delivery Partners to create additional telephony capacity, due to legislative constraints that require the department to use only Australian Public Service staff for Child Support business.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 30

2018–19 Portfolio Budget Statements, page 37

Data source

Telstra computer telephony interface files.

Calculation formula

A ÷ B

A. Total time customers waited for their call to be answered.

B. Total number of calls answered by service officers.

Calculation variables

· Telephony lines used to calculate this measure include all Child Support inbound queues.

· Average speed of answer is measured from the time a customer enters the queue to the time their call is answered by a service officer.

· Calls transferred internally between queues are counted as separate calls with separate wait times and are included in this calculation.

· Calls that are abandoned after entering a queue are not included in the calculation as the calculation measures how long calls have waited to be answered only.

Notes and Definitions

Nil.

6. Achievement of processing service level standards: New registrations completed within standard.

Criterion

6. Achievement of processing service level standards: New registrations completed within standard.

Target

≥82 per cent of new registrations processed within standard.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥82% | 80.7% | × | 4.5%
2017–18 | ≥82% | 76.2% | × | -6.8%
2016–17 | ≥82% | 83.0% | ✓ | -7.6%

Analysis

Although the department did not meet the target in 2018–19, the annual result for the processing of new registrations improved by 4.5 percentage points. This was despite a reduction in the number of staff processing new registrations as the department continued to balance resources to meet demand.

The 2018–19 improvement is attributed to better work management practices including providing staff with dedicated processing time, tailored system usage for managing registrations, and targeted staff training.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 30

2018–19 Portfolio Budget Statements, page 37

Data source

Child Support Enterprise Data Warehouse.

Calculation formula

A ÷ B × 100

A. Total number of registrations processed within the 28-day standard.

B. Total number of registrations.

Calculation variables

This performance measure counts the percentage of registrations that are finalised within 28 days.
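For illustration only, a minimal Python sketch of the 28-day standard test (A ÷ B × 100) using calendar dates; the registration dates below are invented.

```python
# A registration is within standard if it is finalised within
# 28 days of being received.
from datetime import date

def processed_within_standard(registrations: list[tuple[date, date]],
                              standard_days: int = 28) -> float:
    within = sum((done - received).days <= standard_days
                 for received, done in registrations)   # A
    return within / len(registrations) * 100            # ÷ B × 100

regs = [(date(2019, 3, 1), date(2019, 3, 20)),   # 19 days - within
        (date(2019, 3, 1), date(2019, 4, 10)),   # 40 days - outside
        (date(2019, 4, 2), date(2019, 4, 29))]   # 27 days - within
print(f"{processed_within_standard(regs):.1f}%")  # 66.7%
```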

Notes and Definitions

Registrations include:

· New registrations: A new registration refers to an application for a child support assessment received from a customer who is claiming child support for a child or children they have not previously claimed for.

· Restarts: A restart refers to a case that was previously registered but whose assessment was not accepted (invalid), was withdrawn (the customer decided not to proceed with the application) or was ended, and the customer applies to have the case restarted.

Finalised means the case status changed from recorded or pending to any status other than cancelled.

7. Achievement of payment quality standards: Child Support: Debt under arrangement.

Criterion

7. Achievement of payment quality standards: Child Support: Debt under arrangement.

Target

≥39 per cent of child support debt is under arrangement.

Results

Year | Target | Result | Achieved | Yearly change
2018–19 | ≥39% | 35.4% | × | N/A

Analysis

This is the first time this performance measure has been reported on.

The department has a number of business processes in place that contribute to this measure, including:

· setting up employer withholdings to ensure regular and timely payments

· early intervention follow up where paying parents go into debt for the first time to get them back on track early

· a debt repayment methodology that explores payment in full in every customer interaction to get payments back on track as quickly as possible

· working with the Australian Taxation Office to apply available tax refunds to Child Support debts.

In 2018–19, the department introduced a number of new initiatives, including the 'nudge letter' program. The wording of these letters is based on our behavioural analytics research, which helps us understand how to motivate positive customer responses. The department issued 68,314 'nudge' letters to customers who had missed payments, which resulted in $77.9 million being paid in full and a further $66.2 million captured under payment arrangements. The department also applies a customer management approach, which ensures that debt is discussed and followed up as part of every customer interaction.

Method

Data mining.

Rationale

This performance measure supports the strategic theme of customer-centric service delivery.

Reference

2018–19 Corporate Plan, page 30

2018–19 Portfolio Budget Statements, page 38

Data source

Enterprise Data Warehouse.

Calculation formula

A ÷ B × 100

A. Child Support debt under payment arrangement.

B. Total Child Support debt.

Calculation variables

Total Child Support debt

Equals the sum of all Child Support Program (CSP) maintenance arrears.

Child Support debt under payment arrangement

CSP maintenance debt under an agreement for repayment, via one of the following repayment methods:

· customers with manual payment arrangements.

· customers with employer withholding of arrears.

· customers with Centrelink deductions.

CSP Customer

A parent or carer that has or had a CSP case registered.

Child Support Debt

CSP maintenance arrears, excluding costs, fines or penalties, resulting from unpaid liabilities raised for the payment of child support that is collectable by the CSP.

Payment arrangement

· manual payment arrangement – where a customer makes a payment to CSP.

· employer withholding arrears – where an employer garnishes a customer’s wage or salary.

· Centrelink deductions – where Centrelink pays a portion of some Centrelink payments to CSP on behalf of a customer.
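For illustration only, a minimal Python sketch of the debt under arrangement calculation (A ÷ B × 100), using the three repayment methods defined above; the debt amounts and arrangement flags are invented.

```python
# A debt counts towards A if it sits under one of the recognised
# payment arrangement types; B is the total of all CSP debt.
from typing import Optional

ARRANGEMENT_TYPES = {"manual", "employer_withholding",
                     "centrelink_deduction"}

def debt_under_arrangement(
        debts: list[tuple[float, Optional[str]]]) -> float:
    """debts: (amount owed, repayment method or None if none)."""
    total = sum(amount for amount, _ in debts)            # B
    arranged = sum(amount for amount, method in debts
                   if method in ARRANGEMENT_TYPES)        # A
    return arranged / total * 100

debts = [(1200.0, "manual"), (800.0, None),
         (2500.0, "employer_withholding"), (1500.0, None)]
print(f"{debt_under_arrangement(debts):.1f}%")  # 61.7%
```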

Notes and Definitions

This performance measure commenced in 2018–19.