Appendix 2: Methodologies
Attorney-General’s Department stakeholder survey 2021
The department conducts an annual stakeholder survey to evaluate performance against targets that apply across our key activity areas. We engaged ORIMA Research to independently conduct the 2021 stakeholder survey using a list of stakeholders provided by the department.
Who we survey
For results to be comprehensive and reliable, the survey seeks feedback from representatives of organisations and individuals. The survey was an attempted census of key stakeholders.1 A total of 2,931 stakeholders were sent a survey, a larger number than in previous years.
As this survey was conducted as a census, the results are not subject to sampling error. They are, however, subject to non-sampling measurement error.
The survey questionnaire was designed to differentiate between the following types of stakeholders:
- 'knowledgeable observers' who are in a position to provide an informed view about the department's effectiveness, timeliness and responsiveness (the latter 2 performance aspects are the department's proxy measures for efficiency)
- other stakeholders.
Stakeholders who self‑identified through the survey that they had dealt with the department at least once every 3 months in the past 12 months were assessed to be knowledgeable observers. Knowledgeable observers are likely to be more familiar with what the department does than other stakeholders and are therefore better placed to assess our performance.
A relative weighting of 4:1 has been applied to knowledgeable observers and other stakeholders, respectively. This has been deemed an appropriate weight to provide balanced measures that take account of the views of all respondents while minimising the distortion caused by including feedback from those not well placed to judge performance.
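As an illustration only, the 4:1 relative weighting can be expressed as a weighted mean across respondent groups. The figures below are hypothetical, not survey data:

```python
# Illustrative sketch of the 4:1 relative weighting (hypothetical figures,
# not actual survey responses).

def weighted_mean(ko_scores, other_scores, ko_weight=4, other_weight=1):
    """Weighted mean giving each knowledgeable observer 4x the weight
    of each other stakeholder."""
    total = ko_weight * sum(ko_scores) + other_weight * sum(other_scores)
    weight = ko_weight * len(ko_scores) + other_weight * len(other_scores)
    return total / weight

# Example: two knowledgeable observers rate 4 and 5; one other stakeholder rates 2.
print(weighted_mean([4, 5], [2]))  # (4*9 + 1*2) / (4*2 + 1*1) = 38/9
```

The weighted result sits much closer to the knowledgeable observers' ratings than a simple average would, which is the intended effect of the weighting.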
Knowledgeable observers were also asked to comment in more detail on their experience of dealing with the department. This approach ensures each group of stakeholders receives the set of questions it is best informed to answer.

How the survey was conducted
Survey fieldwork began on 7 May and closed on 21 May 2021. ORIMA emailed stakeholders a secure unique web link to access the survey online. To encourage the provision of open feedback, respondents were provided with the option of providing their responses on an anonymous basis (i.e. without their individual survey response being provided to the department).
Measuring success over time
The results of this year’s survey are not directly comparable to those reported last year due to methodological changes, including different sample selection criteria, weighting of responses and changes to the survey questions. All of these changes, however, represent methodological improvements that have increased the survey’s reliability.
The structure of the survey was changed from previous years to focus on individual performance targets, as set out in the Corporate Plan 2020–24. The approach to survey questions also changed significantly with a shift from a satisfaction scale to an agreement rating scale, focusing on what stakeholders are best able to assess and rate based on their direct experience in dealing with the department. Respondents were streamed to question sets based on their qualification to respond.
The survey questionnaire contained groups of questions addressing stakeholder perceptions of the department’s performance on key activities specified in the corporate plan. The corporate plan includes 13 performance targets that rely on the annual stakeholder survey. For each performance target we have measured both effectiveness and efficiency.
Composite index measures were constructed for each performance target addressed. Each reported index for a performance target is the average of responses to the individual questions that address that target. The index approach provides more meaningful and complete measures of stakeholder ratings of effectiveness and efficiency.
The index for a question is the mean (average) response for the question across respondents (using the numerical score from the 5-point response scale) transformed into a 0 to 100-point scale.
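On the assumption that the 5-point response scale is scored 1 (least positive) to 5 (most positive), the transformation to a 0 to 100-point index can be sketched as:

```python
# Sketch of the 0-100 index transformation, assuming the 5-point scale
# is scored 1 (least positive) to 5 (most positive).

def index_score(responses):
    """Mean response on a 1-5 scale, rescaled linearly to 0-100."""
    mean = sum(responses) / len(responses)
    return (mean - 1) / (5 - 1) * 100

print(index_score([1, 1, 1]))  # all least positive -> 0.0
print(index_score([3, 3, 3]))  # all neutral        -> 50.0
print(index_score([5, 5, 5]))  # all most positive  -> 100.0
```

This rescaling reproduces the index properties listed below: a uniformly least-positive response set scores 0, a uniformly neutral set scores 50, and a uniformly most-positive set scores 100.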
The aggregate indices have the following properties:
- index scores of 0–49 indicate that, on average, respondents have provided an unfavourable assessment of the department's performance
- an index score of 50 indicates that, on average, respondents have provided a neutral assessment
- index scores of 51–100 indicate that, on average, respondents have provided a favourable assessment
- the higher the index score, the more positive the average respondent’s perception of the department's performance
- if all respondents provided the most positive rating possible to all of the questions covering a performance target, the index score would be 100
- if all respondents provided the least positive rating possible to all of the questions covering a performance target, the index score would be 0.
The percentage of respondents providing a positive rating is the proportion whose index score exceeded 50 points, that is, fell within 51–100 points. This percentage can then be compared to the target for stakeholder and client satisfaction of greater than 80%.
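Under the same scale assumptions, the percentage-positive calculation and its comparison against the greater-than-80% target could be computed as follows. The per-respondent index scores here are hypothetical:

```python
# Sketch of the percentage-positive calculation against the >80% target
# (hypothetical per-respondent index scores, not survey data).

def percent_positive(index_scores):
    """Proportion (as a percentage) of respondents whose index score
    exceeds 50 points, i.e. falls within 51-100."""
    positive = sum(1 for s in index_scores if s > 50)
    return 100 * positive / len(index_scores)

scores = [75, 100, 62.5, 62.5, 37.5, 87.5, 100, 75, 62.5, 87.5]
rate = percent_positive(scores)
print(f"{rate}% positive; target met: {rate > 80}")  # 90.0% positive; target met: True
```

Note that a score of exactly 50 counts as neutral, not positive, consistent with the index properties above.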
The Corporate Plan 2020–24 reset the department’s performance framework, moving away from the concept of strategic priorities to focus on the work of the department across 5 key activities. While results for the 2020 and 2021 surveys are therefore not directly comparable, they do provide trend information over time. High-level results from this year demonstrate continuing high rates of stakeholder satisfaction over time and suggest increased stakeholder satisfaction with the department overall since 2019–20.
Each year, the survey has included questions to measure our effectiveness in conducting the business of the department. In 2020, respondents could indicate their level of satisfaction from ‘very dissatisfied’ to ‘very satisfied’2, with 87.4% being ‘satisfied’ or ‘very satisfied’ with the department’s overall effectiveness. An average of 93% of respondents to this year’s survey provided positive ratings of ‘agree’ or ‘strongly agree’ in response to questions about our effectiveness.
In 2020, survey questions to measure efficiency in specific strategic priority areas related to the timeliness with which the department delivered its work, with 84.7% of respondents being ‘satisfied’ or ‘very satisfied’ with our efficiency.3 This year’s survey included questions to measure our efficiency in conducting our business largely focused on timeliness and responsiveness. An average of 87% of respondents provided overall positive ratings of ‘agree’ or ‘strongly agree’ for the department’s overall efficiency.
Survey results and analysis
We received 830 responses to the survey, a response rate of 28%. This response rate is similar to the 2020 response rate of 29% and is within the usual range of response rates for comparable government agency stakeholder surveys (20–40%). While a higher response rate would have reduced the degree of potential non-response measurement error (stemming from the possibility that stakeholders who did not respond held systematically different views to those who did), the achieved response rate is sufficient to provide reliable results. In addition, it should be noted that many non-respondents are likely to have not responded on the basis that they did not feel that they were in a position to provide informed feedback. Given this, a higher response rate may not have necessarily resulted in more reliable data. The following table sets out the number of responses and response rates by organisation type.
| Organisation type | Invitations sent | Number of responses | Response rate |
| --- | --- | --- | --- |
| Department portfolio agency | 300 | 136 | 45% |
| Other Australian Government department or agency | 1,505 | 361 | 24% |
| Australian state/territory department or agency | 460 | 100 | 22% |
| Government organisation from a country other than Australia | 97 | 32 | 33% |
| International organisation | 11 | 9 | 82% |
| Peak body or representative organisation or association | 174 | 46 | 26% |
| Individual business, company or firm | 260 | 53 | 20% |
| Not-for-profit organisation, non-government organisation | 54 | 42 | 78% |
| University/research institute | 36 | 8 | 22% |
| Other | 34 | 20 | 59% |
| Did not provide a response to the organisation type question | | 23 | |
| Total | 2,931 | 830 | 28% |
The overall performance ratings of those surveyed were high (similar to the 2020 survey) with average (across all performance targets) positive ratings for effectiveness of 93% and efficiency of 87%. The proportion of respondents who provided positive ratings for overall effectiveness ranged from 82–100%. The proportion who provided positive ratings for overall efficiency ranged from 63–100%, with 2 performance targets obtaining positive ratings of less than 80%.
Overall, respondents rated the effectiveness of the department as at least equal to, if not greater than, its efficiency against the majority of performance targets. Knowledgeable observers overall provided higher positive ratings than other stakeholders in relation to the majority of performance targets for both effectiveness and efficiency. Stakeholders dealing with the department for 3 years or less reported higher effectiveness and efficiency ratings than those who had been dealing with the department for 4 years or more across 4 performance targets.
Measuring effectiveness
The survey included questions to measure our effectiveness in conducting our business against each performance target, largely focused on the department’s expertise and the quality of our relationship with stakeholders, as shown in the following table. On average, 93% of respondents provided positive ratings of ‘agree’ or ‘strongly agree’ on questions about the department’s overall effectiveness.
Statements used to measure effectiveness
Effectiveness statements asked of knowledgeable observers:
- The department demonstrated a high level of expertise.
- The department provided high-quality advice.
- The department provided consistent advice.
- The department added value in informing decision-making.
- The department’s advice considered the views of all relevant stakeholders.
- The department was committed to finding solutions to problems.
- The department based its decisions on sound evidence.
- The department was effective, overall, in working to support the achievement of the Australian Government’s objectives.
- The department was effective in its contribution to the management of international disputes.
- The department was effective in its contribution to the management of constitutional and related legal risks.
Effectiveness statements asked of knowledgeable observers and other stakeholders:
- The department communicated with me effectively.
- The department’s staff engaged with me in a respectful manner.
- The department provided sufficient information to me.
We achieved particularly high levels of positive ratings among responding stakeholders for Key Activities 1 and 2, as shown in the following table.
| Key activity | Performance measure | 2021 target | 2021 result | Number of responses | Number of qualifying stakeholders | Response rate |
| --- | --- | --- | --- | --- | --- | --- |
| 1: Provide legal services and policy advice and oversee legal services across government | Performance Measure 1.2: International law and policy advice | >80% | 98% | 43 | 58 | 74% |
| | Performance Measure 1.3: Constitutional policy and related public law advice | >80% | 100% | 10 | 38 | 26% |
| 2: Manage casework | Performance Measure 2.1: International crime cooperation, federal offender, international family law, private international law and United Nations human rights committee communications casework | >80% | 100% | 48 | 166 | 29% |
| 3: Administer and advise on legal and policy frameworks | Performance Measure 3.1: Legal and policy advice on the federal justice system | >80% | 82% | 38 | 112 | 34% |
| | Performance Measure 3.2: Administration of family law and marriage legislation and policy frameworks | >80% | 89% | 35 | 97 | 36% |
| | Performance Measure 3.3: Administration of the industrial relations system | >80% | 90% | 61 | 139 | 44% |
| | Performance Measure 3.4: Legal and policy advice on Australia’s integrity and rights frameworks | >80% | 100% | 20 | 68 | 29% |
| | Performance Measure 3.5: Legal and policy advice on criminal justice and national security frameworks | >80% | 89% | 51 | 99 | 52% |
| 4: Administer and implement programs and services | Performance Measure 4.1: Legal assistance | >80% | 85% | 17 | 31 | 55% |
| | Performance Measure 4.5: Building counter-fraud and protective security capability across government | >80% | 94% | 60 | 148 | 41% |
| | Performance Measure 4.6: Administration of the Foreign Influence Transparency Scheme and Lobbying Code of Conduct | >80% | 100% | 6 | 13 | 46% |
| | Performance Measure 4.7: Pacific law and justice programs | >80% | 100% | 21 | 77 | 27% |
| 5: Establish and support Royal Commissions and other bodies | Performance Measure 5.2: Support for the overarching and ongoing purpose of Royal Commissions | >80% | 85% | 14 | 60 | 23% |
Measuring efficiency
The department primarily provides policy and legal advice to government. There are challenges to measuring the efficiency, defined as the unit cost of an output generated by an activity4, of providing legal and policy advice. Accordingly, we use client and stakeholder satisfaction with the timeliness of our policy and legal advice and our responsiveness in resolving complex legal and policy issues as a proxy measure for efficiency.
The survey included questions to measure our efficiency in conducting our business against each performance measure, largely focused on timeliness and responsiveness, as shown in the following table. On average, 87% of respondents provided positive ratings of ‘agree’ or ‘strongly agree’ on questions about the department’s overall efficiency.
Statements used to measure efficiency
Efficiency statements asked of knowledgeable observers:
- The department provided timely advice.
- The department was responsive to requests for assistance.
Efficiency statements asked of knowledgeable observers from Australian Government agencies:
- There was a clear delineation of responsibilities between the department and my organisation.
Efficiency statements asked of knowledgeable observers and other stakeholders:
- The department provided information to me in a timely manner.
- The department’s staff responded in an appropriate time frame to issues or concerns raised by me.
We achieved particularly high levels of positive ratings among responding stakeholders for Key Activities 1 and 2, as shown in the following table.
| Key activity | Performance measure | 2021 target | 2021 result | Number of responses | Number of qualifying stakeholders | Response rate |
| --- | --- | --- | --- | --- | --- | --- |
| 1: Provide legal services and policy advice and oversee legal services across government | Performance Measure 1.2: International law and policy advice | >80% | 90% | 43 | 58 | 74% |
| | Performance Measure 1.3: Constitutional policy and related public law advice | >80% | 100% | 10 | 38 | 26% |
| 2: Manage casework | Performance Measure 2.1: International crime cooperation, federal offender, international family law, private international law and United Nations human rights committee communications casework | >80% | 91% | 48 | 166 | 29% |
| 3: Administer and advise on legal and policy frameworks | Performance Measure 3.1: Legal and policy advice on the federal justice system | >80% | 72% | 38 | 112 | 34% |
| | Performance Measure 3.2: Administration of family law and marriage legislation and policy frameworks | >80% | 85% | 35 | 97 | 36% |
| | Performance Measure 3.3: Administration of the industrial relations system | >80% | 80% | 61 | 139 | 44% |
| | Performance Measure 3.4: Legal and policy advice on Australia’s integrity and rights frameworks | >80% | 94% | 20 | 68 | 29% |
| | Performance Measure 3.5: Legal and policy advice on criminal justice and national security frameworks | >80% | 85% | 51 | 99 | 52% |
| 4: Administer and implement programs and services | Performance Measure 4.1: Legal assistance | >80% | 63% | 17 | 31 | 55% |
| | Performance Measure 4.5: Building counter-fraud and protective security capability across government | >80% | 92% | 60 | 148 | 41% |
| | Performance Measure 4.6: Administration of the Foreign Influence Transparency Scheme and Lobbying Code of Conduct | >80% | 100% | 6 | 13 | 46% |
| | Performance Measure 4.7: Pacific law and justice programs | >80% | 100% | 21 | 77 | 27% |
| 5: Establish and support Royal Commissions and other bodies | Performance Measure 5.2: Support for the overarching and ongoing purpose of Royal Commissions | >80% | 83% | 14 | 60 | 23% |
Qualitative assessment
This year, we used similar qualitative analysis processes to assess our performance against a number of targets. Each assessment was conducted by a specially convened panel or by nominated officers, who reviewed a representative sample of relevant work, either chosen randomly or based on criteria specified in internal methodology documents and approved at the beginning of the performance cycle. A number of panels applied assessment criteria based on the New Zealand Government’s Policy Quality Framework and used the associated assessment scoring template and scoring scale. At the conclusion of these assessments, the panels documented their findings.
Detail on each process is provided below.
Performance measure 1.2: International law and policy advice
1.2.2 The assessment panel consisted of 3 Senior Executive Service officers (SES) from the Office of International Law and one SES from another part of the department. It examined 5 work products that covered different types of legal advice and briefings. To mitigate the risk of bias, the work products were selected at random from our document management systems by an officer outside the Office of International Law.
Performance measure 1.3: Constitutional policy and related public law advice
1.3.2 and 1.3.3 An assessment panel consisting of 3 Executive Level 2 (EL2) officers and SES officers from other business areas within the department assessed work for both targets.
For 1.3.2, samples for assessment were chosen in 2 ways:
- Four were randomly selected from submissions to the Attorney-General on matters of constitutional law, policy and litigation, and briefings for the Attorney‑General and the Secretary on pandemic-related matters.
- Two were samples of advice we provided to other agencies on new policy proposals, selected using the following criteria: we had provided substantive comments, and the advice involved a particularly complex proposal involving various heads of constitutional power.
For 1.3.3, the panel considered a case study of a constitutional litigation matter, selected based on the following criteria:
- the matter was significant for the department in that it both:
- involved a substantial amount of work, and
- required the department to play a leading role in managing the matter (noting that our work is often collaborative)
- the work occurred predominately during 2020–21
- the matter was resolved or expected to be resolved during 2020–21, meaning the matter has at least proceeded to hearing and judgment was or will shortly be delivered.
Performance measure 2.1: International crime cooperation, federal offender, international family law, private international law and United Nations human rights committee communications casework
2.1.4 A panel comprising 2 SES officers and one EL2 from the business unit and one EL2 from another area of the department considered 4 pieces of advice across different casework types and applied qualitative assessment criteria. The 4 pieces of advice were selected by EL2 officers as work that represented the complexity of the matters dealt with in the business unit. The panel reviewed all pieces of advice and met to discuss its views of whether the criteria were met. A numerical scoring system was not used for the qualitative assessment process, but will be used in future processes.
Performance measure 3.1: Legal and policy advice on the federal justice system
3.1.2 A panel of 3 EL2 officers, 2 external to the branch and one external to the immediate sections, evaluated 10 pieces of work undertaken during 2020–21. A sample of 125 pieces of policy work, covering ministerial submissions, legislative scrutinies, Cabinet documents, meeting briefs and new policy proposals, was initially identified. Responses to submissions that returned ‘nil comment’ and speeches and accompanying logistics captured as ‘meeting briefs’ were excluded from the sample as not being representative of substantive work undertaken by the branch. Items from the sample were selected for review using a random number generator.
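The random selection step described for 3.1.2 can be sketched with Python’s standard library. The sample size and pool size are from the text; the item identifiers are hypothetical placeholders:

```python
# Sketch of the random item selection described above: 10 items drawn
# without replacement from an eligible pool of 125 pieces of work.
# Item identifiers are hypothetical placeholders, not departmental records.
import random

eligible = [f"work-item-{n:03d}" for n in range(1, 126)]  # 125 eligible pieces

rng = random.Random(2021)            # seeded so the sketch is reproducible
selected = rng.sample(eligible, 10)  # draw 10 distinct items at random

print(len(selected))                           # 10
print(len(set(selected)) == len(selected))     # True: no duplicates
```

Drawing without replacement via `random.sample` mirrors using a random number generator over a numbered list, since every eligible item has an equal chance of selection.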
Performance measure 3.3: Administration of the industrial relations system
3.3.3 A panel of 6 officers, including 1 external to the Industrial Relations Group, reviewed 3 pieces of work from each relevant work area. The panel applied the New Zealand Policy Quality Framework and used the associated Policy Quality paper-scoring template.
3.3.4 Six pieces of advice pertaining to 3 (out of a total of 41) litigation matters finalised in 2020–21 were reviewed by nominated officers. Each nominated officer was an experienced Principal Government Lawyer (EL2) selected from a legal branch other than the branch that developed the legal advices being assessed. The work sample was selected through a stratified random sampling process and tested against equally weighted criteria, adapted from the New Zealand Policy Quality Framework paper-scoring template.
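A minimal sketch of the stratified random sampling described for 3.3.4, assuming strata correspond to litigation matters and a fixed number of items is drawn per stratum. The matter and advice names are hypothetical:

```python
# Minimal sketch of stratified random sampling: draw a fixed number of
# items from each stratum without replacement. Stratum and item names
# are hypothetical placeholders.
import random
from collections import defaultdict

def stratified_sample(items, strata, per_stratum, seed=0):
    """Group items by stratum, then sample `per_stratum` items from each."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for item, stratum in zip(items, strata):
        groups[stratum].append(item)
    return {s: rng.sample(members, per_stratum) for s, members in groups.items()}

items = [f"advice-{i}" for i in range(12)]
strata = ["matter-A", "matter-B", "matter-C"] * 4  # 4 items per matter
picked = stratified_sample(items, strata, per_stratum=2)
print({s: len(v) for s, v in picked.items()})  # 2 items drawn per stratum
```

Stratifying before sampling guarantees each matter is represented in the reviewed sample, which a simple random draw across all items would not.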
Performance measure 3.4: Legal and policy advice on Australia’s integrity and rights frameworks
3.4.2 and 3.4.3 A panel consisting of 3 senior officials, including one external panel member from the Department of the Prime Minister and Cabinet and one from a separate business unit assessed a case study for both of these targets. The case study was selected using criteria set at the beginning of the performance cycle. The responsible business unit presented to the panel and provided written materials to aid the evaluation. For 3.4.3, the panel assessed the effectiveness of the policy advice using the New Zealand Policy Quality Framework, assessment template and scoring scale.
Performance measure 3.5: Legal and policy advice on criminal justice and national security frameworks
3.5.2 and 3.5.3 A panel consisting of 3 senior officials, including one external panel member from the Department of the Prime Minister and Cabinet and one from a separate business unit assessed a case study for both of these targets. The case study was selected using criteria set at the beginning of the performance cycle. The responsible business unit presented to the panel and provided written materials to aid the evaluation. For 3.5.3, the panel assessed the effectiveness of the policy advice using the New Zealand Policy Quality Framework, assessment template and scoring scale.
Performance measure 3.6: Administration of other legal frameworks for which the department is responsible
3.6.2 Case studies that met the following criteria were nominated for assessment:
- The case study raises or raised significant, complex or sensitive issues (whether legal advice has been obtained may be a good indicator).
- The case study in some way involves or involved the Attorney‑General, Assistant Minister to the Attorney‑General or the Commonwealth generally.
- The case study relates or related to the administration of the native title system in a broad sense, including native title and native title compensation claims and related policy work.
Case studies were separated into 3 categories (native title compensation litigation, other native title litigation and policy matters) to ensure the analysis included an overview of the different types of work involved in the administration of the native title system. One case study from each category (3 in total) was selected using a random number generator.
The 3 selected case studies were evaluated by action officers from the Native Title Unit to determine whether the administration of the system had been ‘effective’. They considered whether the Commonwealth’s involvement in the matters had been targeted, proportionate and represented an appropriate application of legal and other resources.
3.6.3 An assessment panel consisting of one SES Band 1 officer and 2 EL2 officers, 2 of whom were external to the branch and one external to the business unit whose work was being assessed, assessed a sample of policy work undertaken over the year. Items were selected for review using a random number generator. The panel applied the New Zealand Policy Quality Framework, template and scoring scale.
Footnotes
1. Key stakeholders were defined as those who have had 2 or more business-related interactions with the department in 2020–21.
2. From page 205 of the Attorney-General's Department Annual Report 2019–20.
3. From page 208 of the Attorney-General's Department Annual Report 2019–20.
4. Department of Finance (2020), RMG 131 Developing good performance information.
Visit https://www.transparency.gov.au/annual-reports/attorney-generals-department/reporting-year/2020-21-57