Fair Presentation of Service Delivery Performance 2022

Tabled: 22 March 2023

Snapshot

Why this review is important 

The government spends public money to deliver goods and services to the Victorian community. Parliament and the community require accurate and fair reporting of the performance of those services. 

Our 2021 Measuring and Reporting on Service Delivery report examined the way 8 government departments measure and report on service delivery. We found they did not meet their responsibilities to measure and report on their performance as required by the Department of Treasury and Finance (DTF).

Who and what we examined

We determined whether the public sector fairly presents its service delivery performance information.

We assessed 9 Victorian Government departments’ performance statements in DTF’s Budget Paper No. 3: Service Delivery (BP3) and whether they complied with DTF’s Resource Management Framework (RMF).

We focused on 210 new performance measures and the Department of Education and Training's (DET) presentation of school performance information.

What we concluded

Service delivery performance is not clearly visible to Parliament and the community.

Departments do not fully follow the requirements of the RMF, and BP3 includes too much information that is not relevant to output budgeting. 

This extra information detracts from the primary purpose of BP3 and makes it harder to discern how well departments are delivering services.

Nothing came to our attention to indicate that departments’ performance information is not accurate and reliable.

What we recommended

We recommended that DTF further improve the RMF’s guidance materials.

Departments need templates and definitions to develop their data dictionaries.

Departments also need somewhere other than BP3 to report performance information about inputs and processes.

When departments change their objectives without explaining why, DTF should advise them to follow the RMF guidance.


Key findings

Of departments' 210 new performance measures in 2021–22 and 2022–23:

  • 37% did not measure outputs
  • 53% were not useful for strategic decision-making
  • 40% were not attributable to departments
  • 27% were not relevant to departmental objectives
  • 22% were not written clearly
  • 50% were not comparable over time.

Source: VAGO.


Our recommendations

We made 3 recommendations for the Department of Treasury and Finance. 

Recommendations and agency responses

Department of Treasury and Finance

1. Provides departments with guidance or a framework for reporting performance information about inputs and processes and broader demographic information (see Section 2). Reviews Budget papers and advises departments to exclude performance measures other than those for outputs.
Response: Accepted in principle

2. Improves the Resource Management Framework’s guidance materials to:
  • show departments how to develop a data dictionary, including templates and definitions
  • include practical examples of data dictionary entries (see Section 2).
Response: Accepted

3. Reviews Budget papers and provides advice to departments if they do not explain why they changed their objectives (see Section 2).
Response: Accepted


1. A framework for service delivery performance

The Department of Treasury and Finance (DTF) issues the Resource Management Framework (RMF), which is mandated for use by all departments by the Assistant Treasurer. We adapted internationally recognised performance measurement models to the Victorian context to assess compliance with the RMF in service performance reporting.

Department names

In January 2023, machinery of government changes affected some departments.

In this report we use the former department names when referring to historical data and the current names when referring to the new departments.


 

How departments measure and report on service delivery performance

Departments use outputs, objectives and measures to assess performance

The government funds public service departments to deliver goods and services (‘outputs’) to the Victorian community in clear alignment with departmental ‘objectives’ (what they aim to achieve).
In Budget Paper No. 3: Service Delivery (BP3), DTF provides information about the performance reporting framework, departmental objectives, output ‘measures’ and how the government meets its performance ‘targets’ for delivering those outputs.


 

Revenue certification is dependent on output performance

The Assistant Treasurer certifies departments’ revenue based on their outputs. This certification depends on the successful delivery of outputs, and each department must submit an output performance report with its invoice.

Progress in delivering departmental outputs, and departmental performance against the targets published in BP3, are key to the certification process.


 

How inputs become outputs and meet objectives

Departments deliver services by using ‘inputs’ to create ‘outputs’ that meet their intended ‘objectives’.

Figure 1: Service delivery map

Source: VAGO.


 

Departmental performance statements should focus on outputs

BP3 and revenue certification focus on service delivery. Accordingly, the information in BP3 should be specific to outputs and objectives (as shown in Figure 1). Performance measures in BP3 should relate to the provision of goods and services external to the department, not its inputs or activities.

DTF publishes each department’s objectives, outputs, performance measures, targets and actual results from the previous financial year in departmental performance statements in BP3. The information reflects both new and existing Budget initiatives.


 

Departments publish performance results

Each department also publishes the results of its service delivery performance in its annual report, measured against the agreed indicators, targets and measures.

A department may also report information about the internal workings of an agency (inputs and activities) in its annual report and other internal reports. 

DTF guides departmental performance reporting

DTF gives departments and agencies guidance in the RMF about planning, specifying objectives, outputs, performance measures, targets and reporting performance information. 

The RMF is a governance and operational framework for public sector accountability that gives the responsibility for portfolio performance to portfolio ministers. Ministers and their departments (and accountable officers) manage the Budget to deliver agreed outputs that align with departmental objectives.

Mandatory requirements relating to departmental performance reporting
The RMF has mandatory requirements for:
  • the content and annual review of departmental performance statements
  • performance measure footnotes for new, amended and discontinued measures
  • identification of outputs that best achieve objectives
  • the specification of a meaningful mix of quality, quantity, timeliness and cost performance measures for each output that assess:
    • service efficiency and effectiveness
    • all major activities of the output.

 

The place of outcomes in service delivery

DTF’s departmental funding model, as described in the RMF, includes objectives but not outcomes. 

Objectives are what the department aims to achieve – they are a measure of the goods and services produced. Outcomes are the result or impact of the service for the recipient or the community – they are a measure of success. More than one department or external factors may contribute to outcomes.

The Department of Premier and Cabinet (DPC) published Outcomes Reform in Victoria, which says that outcomes are key to delivering a modern, responsive and adaptable public service. This model includes outcomes but not objectives. 

In response to the recommendations of our 2021 Measuring and Reporting Service Delivery report (https://www.audit.vic.gov.au/report/measuring-and-reporting-service-delivery), DTF and DPC told us they would work together to ensure coherence and cohesiveness in department performance reporting.


 

PAEC reviews output performance measures

Each year, the government reviews performance measures and publishes in BP3 any measures it proposes to change or discontinue. Parliament’s Public Accounts and Estimates Committee (PAEC) then reviews the proposals. After its review, PAEC publishes the results and its recommendations. In some cases, PAEC highlights issues it finds with departments’ proposed changes to measures. Parliament also publishes the government’s response to PAEC’s recommendations.

Discontinued performance measures
The RMF states performance measures may be discontinued if:
  • they are no longer relevant due to a change in government policy or priorities and/or departmental objectives
  • projects or programs have been completed, substantially changed or discontinued
  • milestones have been met
  • funding is not provided in the current Budget for the continuation of the initiative
  • improved measures have been identified for replacement.

 

Ensuring fair presentation of service delivery performance information

What fair presentation is

For this annual assessment VAGO created a framework based on DTF’s RMF. Appendix D explains the rationale for our assessment of each step in the framework.

Using this framework, service delivery performance information is fairly presented when it:

  • represents what it purports to represent
  • is capable of measurement
  • is accurate, reliable and auditable.

 

How we assessed new performance measure information

To assess whether performance information represents what it purports to represent, we determined whether:

  • measures reflect the delivery of goods or services (outputs)
  • measures are useful to inform decisions or understand service delivery performance
  • the agency is responsible for performance or delivering the goods and services (attributable)
  • measures have a logical relationship to departmental outputs and objectives (relevant)
  • it is clear what the agency intends to achieve.

To assess whether it is capable of measurement, we determined whether measures can demonstrate performance over time (comparable).

To assess whether it is accurate, reliable and auditable, we determined whether:

  • agencies have clear processes to define measures and set targets
  • agencies have controls in place to assure the accuracy and reliability of the data obtained.

 

Assessing whether performance information represents what it purports to represent

Measures should reflect outputs

Information contained in BP3 departmental performance statements should enable the reader to understand what the department intends to achieve. That is, it should include information about the outputs (goods and services) that the government funds departments to deliver. 

Departmental performance statements are not the place to report performance measures of inputs, activities or outcomes (such as the internal workings or activities of an agency).

They are also not the place to report on specific initiatives or individual programs. Rather, any such initiatives or programs should be mapped to outputs and their expected effect on either output targets or actual results.

Outputs
The RMF defines outputs as:
'The final products, or goods and services produced or delivered by, or on behalf of, a department or public agency to external customers/recipients. Outputs include products and services delivered to the community (e.g. education, health services), or products and services provided to other departments (e.g. services provided by the Victorian Public Sector Commission to support the public sector)'.

 

Measures are classified as outputs or not outputs

To classify each measure we used a decision tree to identify which were outputs and which were not (that is, were inputs, activities or outcomes).

Figure 2: Measure classification decision tree

Source: VAGO.
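The logic of such a decision tree can be sketched in code. This is a hypothetical illustration only: the questions below are assumptions drawn from the RMF definitions of inputs, activities, outputs and outcomes, not a reproduction of the actual tree in Figure 2.

```python
# Hypothetical sketch of a measure-classification decision tree, based on the
# RMF's definitions. The questions and examples are illustrative assumptions.

def classify_measure(describes_resources: bool,
                     describes_internal_activity: bool,
                     delivered_to_external_recipients: bool) -> str:
    """Walk a simplified decision tree to label a performance measure."""
    if describes_resources:
        return "input"       # e.g. staff numbers or funding consumed
    if describes_internal_activity:
        return "activity"    # e.g. internal processes or meetings held
    if delivered_to_external_recipients:
        return "output"      # goods/services delivered externally - belongs in BP3
    return "outcome"         # result or impact for the community

# Illustrative classifications
print(classify_measure(False, False, True))   # "output"
print(classify_measure(True, False, False))   # "input"
```

Only measures that land on "output" under this kind of test belong in BP3 departmental performance statements; the rest belong in annual reports or internal reporting.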


 

Measures should reflect better-practice criteria

The RMF includes a checklist of characteristics that indicate a ‘better standard in public sector output performance measurement information’ (better-practice criteria).

To assess whether performance information represents what it purports to represent, we tested the measures against 4 of the RMF’s better-practice criteria. We assessed whether each measure is useful, attributable, relevant and clear.

Usefulness
Measures enable performance reporting and analysis and inform decisions about resource allocation.
Attribution
The organisation is responsible for the actions or delivery of the goods and services being measured.
Relevance
Measures align with both the departmental objectives and the relevant output.
Clarity
Measures use clear, concise, non‑technical language, and what is being measured is not ambiguous.

 

Assessing whether performance information is capable of measurement

Measures demonstrate performance over time

To be comparable over time, measures or targets should account for variations in factors such as population size, service demand and volume of service use. Measuring targets as proportions can help account for these variations. For example, the proportion of students with a career action plan is comparable over time, while the number of students with a plan is not, because total student numbers change.
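A minimal arithmetic sketch makes the point. The student numbers below are invented for illustration: the raw count rises when the cohort grows, even though the underlying rate of service delivery is unchanged.

```python
# Why proportions compare across years while raw counts do not.
# All figures are hypothetical.

year_1 = {"students": 10_000, "with_career_action_plan": 8_000}
year_2 = {"students": 12_000, "with_career_action_plan": 9_600}  # cohort grew

count_change = year_2["with_career_action_plan"] - year_1["with_career_action_plan"]
prop_1 = year_1["with_career_action_plan"] / year_1["students"]
prop_2 = year_2["with_career_action_plan"] / year_2["students"]

# The raw count rose by 1,600, which looks like improvement, but the
# proportion is unchanged at 80 per cent - performance is actually flat.
print(count_change)     # 1600
print(prop_1, prop_2)   # 0.8 0.8
```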

Comparable over time
The RMF requires accountable officers to ensure 
‘… any outputs and performance measures created enable meaningful comparison and benchmarking over time. Where possible, across departments and against other jurisdictions’.
'The accountable officer must include [a] footnote disclosure… in the departmental performance statements… [for] all proposed discontinued [performance] measures … with changes in source data/methodology used to measure target or changes in unit of measurement, which renders past performance history incomparable’.

 

Improvement trends can be an increase, a decrease or neutral

To support comparability and target setting, expectations about performance should indicate whether performance improvement is an increase or a decrease in the measure. For example, performance improvement can be:

  • an increase in the number of mental health consumers who report a positive experience of care
  • a decrease in the number of acute mental health inpatients readmitted within 28 days of discharge.

For example, take the measure ‘average daily number of young people aged 10 to 13 years under supervision’. The Department of Justice and Community Safety (DJCS) included a footnote to explain ‘New performance measure for 2022–23 to reflect the focus on reducing the number of young people aged under 14 in custody’. With the addition of this footnote, we understand that a decrease is a performance improvement.

Some measures do not have a ‘right’ level of output and we call them neutral measures. We cannot assess what improvement looks like for neutral measures without more information.


 

Assessing whether performance information is accurate, reliable and auditable

How the government manages information

The Victorian Government has an information management framework (https://www.vic.gov.au/information-management-whole-victorian-government) that is intended to improve decision-making and support the planning and delivery of services to the public. It names information governance and data management as key enablers.

Alongside the framework, the government has policies, standards and templates to guide departments in managing data and information. The RMF also has some information about documentation and reporting of performance measures.

DPC published the Data Quality Guideline: Information Management Framework (Data Quality Guideline; https://www.vic.gov.au/data-policies-and-standards) that explains that data and methods should be well documented and traceable. It suggests departments should have a data dictionary for each data collection. DPC told us the management of the Data Quality Guideline shifted to the Department of Government Services after the machinery of government changes in January 2023.

Data dictionary
DPC’s Data Quality Guideline states:
'Data dictionaries are a reference of standardised concepts including data definitions, business rules, validations and allowable formats for data which should be applied. Implementation of data dictionaries creates a common understanding of data items which can be applied consistently by data suppliers'.
They should be:
'available, regularly maintained and updated with any changes made to data. For example, the definition, naming conventions, or scope of data that is collected periodically may change over time'.

 

How we gathered information about data

Using the government’s guidance for departments in managing data and information, we chose 6 criteria to gather departments’ information about data.

Figure 3: Criteria we used to gather information about data

Criterion What information should be included
Measure description What activity is being measured, key terms and what is being reported
Data collection What data is collected, how the data is collected, the frequency of data collection and data security arrangements
Business rules What the measure counts and any assumptions relevant to how the data is captured
Inclusions and exclusions Key quantitative or qualitative data, categories, groups or activities that are specifically included or excluded
Method How the result is calculated
Data validation Processes for validating/assuring the quality of the raw data and/or calculated result, for example, whether the result is verified and endorsed internally or by an internal or external audit

Source: VAGO.
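A data dictionary entry covering the 6 criteria above might look like the following sketch. The measure and its details are hypothetical; only the field names come from Figure 3.

```python
# Hypothetical data dictionary entry addressing the 6 criteria in Figure 3.

data_dictionary_entry = {
    "measure_description": "Proportion of clients who receive a service "
                           "within the agreed timeframe; reported quarterly.",
    "data_collection": "Extracted monthly from the case management system; "
                       "held on a secured departmental server.",
    "business_rules": "Counts unique clients; assumes a case is closed when "
                      "the final service record is entered.",
    "inclusions_and_exclusions": "Includes all funded service streams; "
                                 "excludes cases withdrawn by the client.",
    "method": "Clients served within timeframe / total clients served x 100.",
    "data_validation": "Result verified by the data custodian and endorsed "
                       "by internal audit before publication.",
}

# The 6 criteria from Figure 3 that each entry should address
required_fields = {"measure_description", "data_collection", "business_rules",
                   "inclusions_and_exclusions", "method", "data_validation"}
assert set(data_dictionary_entry) == required_fields
```

A template of this kind, with a worked example for each field, is essentially what recommendation 2 asks DTF to provide.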


 

Improving the fair presentation of service delivery performance information

Departments are on a journey of improvement

Our 2021 Measuring and Reporting on Service Delivery report included 11 recommendations to improve the fair presentation of service delivery performance information. In response, departments committed to a timeframe for each of the recommendations they accepted. Some of those timeframes have not yet passed. We are mindful of this in our assessment of the 210 new service measures from BP3 2021–22 and BP3 2022–23.

We note the number of performance measures being changed or discontinued will likely increase in the next few years. In the short term this may affect the ability of Parliament and the community to assess performance over time but should have long-term benefits.

Measuring and Reporting on Service Delivery includes the recommendations, departmental responses and the timeframes for them.


 

Using our online dashboard to compare departments’ performance 

About our dashboard

We developed a fair presentation of service delivery performance dashboard in 2021 so you can see how departments perform against their output measures.

Using the dashboard, you can compare departments’ performance against each other and drill down to examine trends for individual measures over time. You can also export raw data on output performance measures.

The dashboard shows whether a department met its targets or not and provides trend data for each measure.


 

September 2022 dashboard update

In September 2022 we published an update to the dashboard (https://www.audit.vic.gov.au/dashboards/fair-presentation-service-delivery-performance-2022).

It presents the results of departments’ output performance published in DTF’s BP3 2021–22 and includes data from 2015–16 to 2021–22.


 

DTF is developing a dashboard

DTF told us it is developing options for an output performance dashboard for the government to consider. It aims to publish this dashboard in 2023, subject to the government's consideration.


 


2. Measuring departmental performance

In this section we summarise the changes that departments have made to their objectives and outputs since BP3 2020–21. We also show the results of our assessment of departments’ 210 new performance measures using the framework outlined in Section 1.

A series of assessments

This is our first limited assurance review in a series that will assess the way departments measure output performance each year. 

Departments have a total of 1,436 performance measures in 2022–23. We limited our assessment to the new performance measures introduced since our last report.


 

Assessing new output performance measures

210 new measures introduced and 10 discontinued

Between 2021 and 2023, departments introduced a total of 210 new performance measures, including 100 measures in 2021–22 and 110 in 2022–23. 

In 2022–23, departments discontinued 10 measures that were new in 2021–22, but they remain in our analysis for completeness.

Appendix G provides the data for new performance measures for each department by attribute and by year.


 

Most new performance measures have a quantity attribute

We recorded the attribute of each of the 210 new performance measures as reported in BP3 – that is, whether it was a measure of quality, quantity, timeliness or cost.

Of the new measures, most (58 per cent) were measures of the quantity of outputs delivered, with the fewest (12 per cent) being measures of the timeliness of service delivery. 

There were no new cost measures. Cost performance measures are usually the full accrual cost to a department of producing an output, so would rarely change.

We also noted an almost complete absence of cost information relating to efficiency of service delivery. DTF plans to give departments further guidance on efficiency measures. 

Mandatory mix of performance measure attributes
The RMF requires the accountable officer to ensure the specification of a meaningful mix of quality, quantity, timeliness and cost performance measures for each output to assess:
  • service efficiency and effectiveness
  • all major activities of the output.

 

Figure 4: Number of new performance measures by attribute (2021–22 and 2022–23)

Across the 9 departments, most new performance measures (58%) measured quantity; only 12% measured timeliness.

Note: DELWP stands for Department of Environment, Land, Water and Planning, DET stands for Department of Education and Training, DFFH stands for Department of Families, Fairness and Housing, DH stands for Department of Health, DJPR stands for Department of Jobs, Precincts and Regions, and DoT stands for Department of Transport.
Source: VAGO analysis of DTF's BP3 2021–22 and BP3 2022–23.


 

69 per cent of measures cover all 4 attributes

Ideally, each output would have a measure of each of the 4 attributes.

When departments have performance measures for all 4 attributes of an output, they cannot trade off one attribute against another (for example, quality for timeliness) without the trade-off being visible.

Our 2021 Measuring and Reporting on Service Delivery report found that 64 per cent of departments’ outputs had a mix of performance measures that cover all 4 attributes. In 2022–23 this increased to 69 per cent.

Departments are making modest progress in this regard. Between 2021–22 and 2022–23, the number of outputs with a mix of performance measures across 4 attributes increased by 6 (DPC with 2, and DET, DFFH, DH, and DJCS with one each).
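The completeness check described above can be sketched as a simple set comparison. The outputs and measure attributes below are invented for illustration; the 4 required attributes come from the RMF.

```python
# Sketch of the attribute-mix check: does each output have at least one
# measure of each of the 4 attributes? Example data is hypothetical.

REQUIRED_ATTRIBUTES = {"quality", "quantity", "timeliness", "cost"}

measures_by_output = {
    "Output A": ["quantity", "quality", "timeliness", "cost"],
    "Output B": ["quantity", "quantity", "timeliness"],   # no quality or cost
}

def missing_attributes(attrs):
    """Return the attribute types an output's measures fail to cover."""
    return REQUIRED_ATTRIBUTES - set(attrs)

for output, attrs in measures_by_output.items():
    gaps = missing_attributes(attrs)
    print(output, "covers all 4" if not gaps else f"missing: {sorted(gaps)}")
```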


 

PAEC found issues with only 10 discontinued measures

We analysed the number of measures that departments discontinued in the last 2 years.

In that time, departments proposed to discontinue 134 measures (60 in 2021–22 and 74 in 2022–23). PAEC supported most of these, raising issues about 10 (3 in 2021–22 and 7 in 2022–23).

The government tabled a response to the issues that PAEC raised about the 2021–22 measures on 8 March 2022. It will table a response to the issues PAEC raised about the 2022–23 measures in 2023.

In PAEC’s Report on the 2022–23 Budget Estimates, it noted:

'transparency could be improved by including, where relevant, details of any changes made to the performance measure proposed to be discontinued in the prior year’s budget. This can be important in considering whether the explanation included in the budget papers for the proposed discontinuation of a performance measure is sufficient'.

Appendix H shows how many performance measures each department proposed to discontinue and how many PAEC had issues with.


 

 

10 per cent of measures discontinued within a year

Of the new measures departments introduced in 2021–22, 10 of them (10 per cent) were proposed to be discontinued in 2022–23:

  • Departments replaced 6 with more appropriate measures.
  • Departments discontinued the other 4 because the program or funding ended.

Changing performance measures sometimes means a department cannot measure performance over time.

When departments introduce measures for each stage or disaggregation of a program there are likely to be more changes. For example, DELWP introduced 5 measures for the cladding rectification works program in 2021–22 and one was discontinued in 2022–23.


 

37 per cent of departments’ measures do not relate to outputs

We classified each of the 210 new performance measures introduced in 2021–22 and 2022–23 as a measure of input, activity, output or outcome (as explained in Figure 2).

We found that:

  • 63 per cent were measures of outputs and should be reported in BP3
  • 37 per cent were measures of inputs, processes or outcomes and should be reported in the department’s annual report or internal reporting systems.

For clarity and consistency BP3 should report measures only of outputs – not inputs, processes or outcomes.

Departments that choose to report a measure elsewhere and discontinue reporting it in BP3 should explain this to PAEC.


 

The results for each department vary

The range of results for the proportion of each individual department’s new performance measures that are outputs or inputs is broad. A department with a lower-than-average proportion of output measures has a higher-than-average proportion of input measures.

The proportion of each department’s measures that are outputs varies by 63 percentage points: one department has 92 per cent of measures classified as outputs while another has only 29 per cent.

The difference between the highest and lowest result when counting inputs is 67 percentage points.

Figure 5: Range of departments’ proportions of new performance measures classified by type

Source: VAGO analysis of DTF's BP3 2021–22 and BP3 2022–23.


 

53 per cent of new measures are not useful

We assessed whether new performance measures are useful (that is, enable performance reporting and analysis and inform decisions about resource allocation).

Of the 210 new performance measures, only 47 per cent would:

  • be useful for informing strategic government decision-making about priorities and resourcing
  • provide stakeholders with an understanding of the department’s service delivery.

Results for each department’s measures varied – between 19 and 73 per cent of new measures were useful.

 

Defining ‘useful’

The RMF explains ‘useful’ in two senses: first, assisting the government to make resource allocation decisions; and second, informing government decision-making or internal management.

Performance measures are used to support government resource allocation
The RMF states: 

'Performance measures are used in the planning stage to assist government in making resource allocation decisions, specifically how many units (or additional units) of goods or services can be delivered at what cost. Performance measures are used to ensure the delivery of outputs, and as a mechanism for accountability over government spending by specifying what the Government wants to achieve'.

Performance measures should be capable of being used in a variety of ways
The RMF states: 

'In addition to assessing and reporting performance, they should also inform decision making by the organisation and by Government as well as helping other stakeholders understand the organisation’s performance. The data should be available to meet relevant planning and reporting timeframes'.

Our framework for service delivery performance reporting in BP3 is based on the usefulness of performance measures to support government resource allocation. 

Many of the performance measures that do not meet our criteria may be useful for other reasons than BP3 reporting, such as monitoring internal performance.


 

60 per cent of new measures are attributable

We assessed whether new performance measures are attributable (the department is responsible for the actions or delivery of the goods and services being measured).

Of the 210 new performance measures, 60 per cent were:

  • within the responsibility of the department or agency
  • directly attributable to the actions of the department in delivering the service.

Results for each department varied – between 27 and 83 per cent of new measures were attributable. However, if we include measures that were partly attributable, the results increase to between 79 and 100 per cent.

Two departments explained that by considering how external influences impact on services they can increase the extent to which performance is attributable to them. For example, Department of Transport and Planning can improve public transport fare compliance by making the ticketing system easier to use and by checking tickets. We included an assessment of ‘partly attributable’ for measures like these.

74 of the new measures (35 per cent) were only partly attributable because external forces may influence performance (such as demand for services or user behaviour).


 

73 per cent of new measures are relevant

We assessed whether new performance measures are relevant (they align with the departmental objectives and the relevant output).

Of the 210 new performance measures:

  • 73 per cent align with both the department’s objective and the relevant output
  • 27 per cent did not clearly indicate how achieving the target would assist the department to achieve its objective and were considered not relevant.

Results for each department varied – between 33 and 97 per cent of new measures were relevant.


 

78 per cent of new measures are clear

We assessed whether new performance measures are clear (they use clear, concise, non‑technical language and what is being measured is not ambiguous).

Of the 210 new performance measures, 78 per cent were written clearly and demonstrated what was being measured. However, the others:

  • did not express how users would measure results
  • did not express who would provide the good or service or who the recipients were
  • used technical language or jargon
  • were hard to understand.

Results for each department varied – between 62 and 92 per cent of new measures were clear.

Departments told us they have already selected many of the measures that were not clear for revision or discontinuation.


 

Half of the new measures are comparable over time

We assessed whether new performance measures are comparable over time.

We found that 106 of the 210 new performance measures did not allow for comparison of performance over time. This was usually because they were a count of a product or service that did not account for changes in population, funding or demand. 

Numeric measures can be useful for performance reporting but they often require contextual information to understand their comparability over time. Departments told us they can also address this through target setting. However, we found this does not make the measure itself comparable over time.

Our assessment shows that the proportion of each department’s performance measures that are comparable over time varies between 31 and 71 per cent.


 

What we recommend regarding output performance measures

We recommend DTF provide departments with guidance or a framework for reporting performance information about inputs and processes and broader demographic information.

We found that many of the new measures introduced by departments in the last 2 years do not follow some aspects of the RMF. 

Information about inputs, processes and context may be useful for broader government decision-making. But, given the importance of output performance to the revenue certification process, BP3 should include only output measures.


 

Assessing departments’ information about data

Data dictionary work is progressing

We asked departments to give us information about the way they measure, collect, calculate and validate the data they use to explain service delivery performance. We gathered information against each of the 6 criteria in Figure 3 for the 210 new performance measures. 

Some departments told us they were still developing their data dictionaries. One told us it would be useful to know what the key elements of a data dictionary should be and to have an example.

DTF confirmed there is currently no template for a data dictionary.


 

Information about data is limited

Our assessment of departments’ data information was constrained by:

  • our method
  • this report being a limited assurance review
  • the status of departments’ work on responding to our 2021 Measuring and Reporting on Service Delivery recommendations. 

The departmental results are variable and the information we gathered cannot be used to determine whether a department can fairly present its service delivery performance.


 

What we recommended in 2021

In 2021 we recommended DTF regularly review departments’ data dictionaries to ensure they include all the required information. 

DTF accepted our 2021 recommendation in principle because it believes accountability for compliance rests with each department.

DTF committed to review the RMF guidance and clarify the requirements for documenting methodologies.


 

DTF must hold departments to account

The Assistant Treasurer mandates the RMF for use by all departments. Portfolio performance is the responsibility of portfolio ministers. It is DTF’s responsibility to review the information about data that departments gather and to provide advice based on the rules it has set.


 

What we recommend regarding data dictionaries

We recommend DTF improve the RMF's guidance materials to:

  • show departments how to develop a data dictionary, including templates and definitions
  • include practical examples of data dictionary entries.

Departments require detailed guidance to develop a data dictionary. They also need a systematic approach to produce information about data that is consistent with other departments and the Victorian Government’s information management framework and data quality guidelines.


 

Changes to departmental objectives

Why objectives might change

Departmental objectives are the results that departments hope to achieve. Objectives and objective indicators should show progress over time, so departments should not change them each year.

Making changes to objectives
The RMF allows departments to make changes to objectives, which may include the following circumstances:
  • machinery-of-government changes
  • changes to the government's strategic direction
  • a change in government
  • other reasons decided by the government of the day.
The accountable officer must include in the Budget papers an explanation as to why such changes have been made.

 

Changes to objectives since 2020–21

Since BP3 2020–21, 3 departments have changed objectives:

  • In February 2021, the Department of Health and Human Services (DHHS) became DFFH and DH. As a result, DHHS’s objectives were distributed to DFFH and DH.
  • At the same time, one of DPC’s objectives became an objective of DFFH; DPC also discontinued one objective and introduced another.
  • In 2021, DFFH updated its 4 objectives. BP3 2022–23 named the new objectives and DFFH’s 2021–22 annual report reported against them, but neither document explained the change. DFFH’s questionnaire response to PAEC’s Inquiry into the 2022–23 Budget Estimates explains that ‘Departmental objectives have been updated to better reflect the activities of the department’.

Appendix E shows the changes to departmental objectives between BP3 2020–21, BP3 2021–22 and BP3 2022–23.

BP3 2023–24 will reflect the machinery-of-government changes that occurred in January 2023 and any subsequent changes to objectives.


 

What we recommend regarding departments’ changing objectives

The accountable officer for DFFH was responsible for including in the 2022–23 Budget papers an explanation of why its objectives changed, but no explanation was included.

We recommend DTF review Budget papers and provide advice to the department if an explanation of objective changes is not included.


 

Changes to departments’ outputs

Outputs change as part of the Budget process

Parliament funds departments to deliver outputs. When the government decides to reallocate funds, outputs may also change. Each year, departments review their outputs to ensure they are relevant. They make changes as part of the Budget process.

Making changes to outputs
The RMF requires that the accountable officer ensures:
  • an annual review of the department’s outputs and performance measures is conducted to assess continued relevance
  • any changes to departmental outputs and performance measures are only made annually as part of the Budget process (in departmental performance statements).
When considering changes, comparability of performance over time should be taken into account.

 

Disaggregating an output or changing its name can cause problems

When a department disaggregates or changes the name of an output, it becomes harder to compare performance over time. For example:

  • DFFH disaggregated ‘women’s policy’ into ‘women’s policy’ and ‘primary prevention of family violence’. This means that data for women’s policy will no longer include the same programs.
  • DELWP changed the name of Solar Homes to Solar Victoria to reflect an expanded set of deliverables that includes rebates for businesses and zero-emission vehicles as well as residential homes. This means the Department of Energy, Environment and Climate Action (previously DELWP) will no longer report output cost data for solar homes separately in BP3.

 

Some outputs have changed since BP3 2020–21

In BP3 2021–22:

  • 3 departments changed one or more outputs
  • DPC transferred 5 outputs to DFFH.

In BP3 2022–23, 5 departments changed one or more outputs.

DTF also noted that outputs moved because of the machinery-of-government changes (DHHS to DH and DFFH).

Figure 6: Outputs that have changed since BP3 2020–21

Department | Change from BP3 2020–21 to BP3 2021–22 | Change from BP3 2021–22 to BP3 2022–23
DELWP | N/A | 1 output renamed; 1 output became 2 outputs
DFFH | 1 output became 2 outputs | 1 output removed
DH | 9 outputs became 23 outputs | 1 output renamed; 1 output partially transferred to DFFH
DJCS | 2 outputs renamed; 1 output became 2 outputs | 1 output became 2 outputs
DPC | 5 outputs transferred to DFFH | 3 outputs became 6 outputs; 3 outputs renamed
Source: VAGO summary of output changes between DTF's BP3 2020–21, BP3 2021–22 and BP3 2022–23.


 


3. Measuring school performance

We take a closer look at a different department’s output performance measures each year. This year we focus on DET's school performance measures.

DET is now DE

DET became the Department of Education (DE) in January 2023. In this report we use DET when referring to data from the past and DE when referring to the new department.

How DE manages the performance of Victoria’s school services 

DE is accountable for school services

The Minister for Education is responsible for Victoria’s education system, including government, Catholic and independent schools. DE is accountable to the Minister for:

  • administering the education system
  • running and maintaining government schools
  • school performance and compliance.

Parliament and the community can use DE’s performance reporting to hold it accountable for the public funds it spends on school services.


 

DE must find measures to assess output performance

DTF requires DE to develop output performance measures for schools that meet the RMF criteria.

DE is continuing to work on its response to our 2021 Measuring and Reporting on Service Delivery report recommendations. DE told us it is:

  • currently reviewing its departmental performance statement
  • addressing issues with measures that do not align with the RMF, which may mean some performance measures change.

 

Schools cost 79 per cent of DET’s output budget

In BP3 2022–23, DET planned to spend $13.0 billion on school services (school education and support services), which is 79 per cent of its total output budget ($16.5 billion). 
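The 79 per cent figure follows directly from the two budget numbers cited above (a quick arithmetic check):

```python
# School services' share of DET's total output budget in BP3 2022–23.
school_services_spend_bn = 13.0  # $ billion, school education and support services
total_output_budget_bn = 16.5    # $ billion, DET's total output budget

share = school_services_spend_bn / total_output_budget_bn * 100
print(f"School services: {share:.0f}% of DET's output budget")  # 79%
```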

DET has 7 outputs. 5 relate to school services:

  • school education – primary
  • school education – secondary
  • strategy review and regulation
  • support for students with disabilities
  • support services delivery.

And 2 do not:

  • early childhood education and training
  • higher education and workforce development.

 

DET's 104 school services performance measures

In BP3 2022–23, DET planned to report the performance of school services using 104 performance measures. At that time, DET had plans to discontinue 2 of those measures pending PAEC's response. 

DET’s school services outputs are broken down by number of performance measures and budget as follows:

Figure 7: DET’s school services outputs, performance measures and budget (2022–23)

Output | Description | Performance measures (number) | Output cost ($m)
School education – primary | Provides services to develop essential skills and learning experiences to engage young minds and improve the quality of learning of students in prep to year 6 in government and non-government schools. | 41 | 5,942.9
School education – secondary | Provides education and support services designed to improve student learning, development and wellbeing in years 7 to 12 in government and non-government schools. These services seek to consolidate literacy and numeracy competencies, including creative and critical thinking, as well as physical, social, emotional and intellectual development in adolescence. Also covers services to improve pathways to further education, training and employment. | 39 | 5,026.9
Strategy review and regulation | Develops, plans and monitors strategic policy settings across all stages of learning, including intergovernmental negotiations as well as research, data and performance evaluations. Also supports regulation that ensures quality education and training is delivered. | 5 | 110.9
Support for students with disabilities | Covers programs and funding to support students with disabilities, as well as transport, welfare and support services for students with special needs. | 6 | 1,522.3
Support services delivery | Primarily provides student welfare and support, student transport (excluding transport for special needs students) and health services. | 13 | 440.4
Total | | 104 | 13,043.4

Source: VAGO analysis of DTF's BP3 2022–23.


 

DET did not align its outputs to objectives

DET did not align each of its outputs to one of its stated objectives. In its performance statement, all of DET’s outputs contributed to all of DET’s objectives. This means it is not possible to determine each output’s distinct contribution to each objective.

The RMF requires that performance outputs align with departmental objectives. DTF gives an output summary by objectives in each departmental performance statement in BP3. For each objective there are usually one or more outputs.

Figure 8 shows that DET had 4 objectives in BP3 2022–23.

Figure 8: DET’s departmental objectives

Theme | Objective
Achievement | Raise standards of learning and development achieved by Victorians using education and training
Engagement | Increase the number of Victorians actively participating in education and training
Wellbeing | Increase the contribution that education and training make to quality of life for all Victorians, particularly children and young people
Productivity | Increase the productivity of our services

Source: DTF's BP3 2022–23.

DE told us it will be working on the alignment of outputs to objectives prior to the publication of BP3 2023–24.


 

VAGO’s framework for assessing output performance 

Service performance framework

We developed a framework to assess school performance based on the service logic map in Section 1 (Figure 1). The framework maps inputs through to outcomes (or the objectives that departments meet). Naming outcomes in the framework helps departments identify output measures.


 

The unit of education output is an educated student

The key to performance reporting is naming the intended output and the right measures for it.

The purpose of schools is to educate students. The unit of output is a student who has been educated. The inputs and processes are those that support the delivery of that output. The next figure shows the framework that we used to assess DET's school performance measures.

Figure 9: School services performance framework

Source: VAGO. 


 

A model performance statement

This framework can be used to guide the development of an output performance statement.

A model performance statement gives DTF an example of what better-practice performance reporting by departments looks like. A performance statement should include some analysis of the trends in data and policy context for understanding those trends.

Appendix I provides an example of a model performance statement.


 

Assessing DET’s school performance measures

Most of DET’s school performance measures relate to inputs

We assessed DET’s 104 school services performance measures listed in BP3 2022–23. 

We found only 35 of the 104 measures (33.7 per cent) relate to the provision of outputs. Most (51.0 per cent) are measures of input.

We classified 53 performance measures as input measures. These include:

  • 21 measures that reflect investment or funding for services 
  • 17 measures that reflect school staff.
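The percentages cited above follow from these classification counts (a quick check; the outcome and process measures are simply what remains of the 104):

```python
# Shares of DET's 104 school services measures by our classification.
total = 104
output_measures = 35
input_measures = 53
other = total - output_measures - input_measures  # outcome and process measures

print(f"output: {output_measures} ({output_measures / total * 100:.1f}%)")
print(f"input: {input_measures} ({input_measures / total * 100:.1f}%)")
print(f"outcome/process: {other}")
```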

Rather than reporting on the performance of inputs in BP3, DE should present this information in its annual report or internal reporting systems.

Figure 10 shows our classification of school performance measures by input, outcome, output and process.

Figure 10: Classification of school performance measures, 2022–23

Of school performance measures in 2022–23, over half were input measures, with most being related to secondary school education.

Source: VAGO analysis of DTF's BP3 2022–23.


 

Teachers are an input

DET includes 17 measures related to staff, mostly training for teachers and principals. DET sees the school services system as one that has several outputs, including teachers. This view is conceptually flawed and at odds with international, national and sub-national approaches taken by other jurisdictions in understanding performance of education services.

Teachers are an input, not an output, of school services. Performance measures for teachers, especially teacher quality, are useful to DET for internal reporting purposes, but their inclusion in an output performance report only dilutes the focus on students.


 

School quality and quantity measures are balanced

DET has a balance of quality and quantity in the mix of school performance measures. 

DET has only one measure of timeliness: the percentage of government schools compliant with the Child Safety Standards 3 months after review. Prior to 2022–23, DET categorised this as a measure of quality. We would also count this as a measure of quality because the nature of school delivery does not lend itself to timeliness measures related to outputs.

Figure 11: Mix of quantity, quality, timeliness and cost measures, 2022–23

Nearly half of measures were of quality; slightly fewer were of quantity; just 5.8% were of cost or timeliness.

Source: VAGO analysis of DTF's BP3 2022–23.


 

DET has no output performance measures of efficiency

The RMF requires each output to have a meaningful mix of quality, quantity, timeliness and cost performance measures. Those measures should include an assessment of service efficiency and effectiveness.

DET includes a measure of total output cost for each of its outputs. But none of its 2022–23 output performance measures are measures of efficiency.

One of DET’s objectives is to increase the productivity of its services. DET’s departmental performance statement links each output with its relevant productivity measure: 

  • expenditure per kindergarten student per year
  • expenditure per primary school student per year
  • expenditure per secondary school student per year
  • expenditure per vocational education and training student contact hour.

 

Assessing whether DET's performance measures reflect better practice

Most school performance measures are better suited to internal reporting

We found 42 school performance measures (40 per cent) would be useful for informing government decision-making in the context of BP3 reporting. The remaining 62 measures (60 per cent) are better suited to internal departmental monitoring and reporting.

The next figure shows some examples of DET’s school performance measures that would be better suited to inform internal departmental monitoring and reporting.

Figure 12: School performance measures better suited to internal monitoring

Measures that do not help stakeholders understand service delivery (output) performance | Measures that do not inform strategic decisions about priorities and resourcing
Schools allocated a nurse through the Secondary School Nursing Program | Number of students participating in the Victorian Young Leaders program
Investment in travelling allowances and transport support | Number of Digital Assessment Library items developed
Number of registered training organisation quality audits and school reviews undertaken annually | Number of Victorian schools participating as a ‘lead school’ for the Respectful Relationships program

Source: DTF's BP3 2022–23.


 

40 per cent of measures are directly attributable

We found all 104 school performance measures were either directly (42 measures or 40 per cent) or partly (62 measures or 60 per cent) attributable to DET.

External forces (such as demand for services or user behaviour) may influence 38 of the measures assessed as partly attributable. For example:

  • student choice influencing the proportion of Navigator program participants (students supported to return to school) who re-engage in schooling
  • the number of schools using the Local Administrative Bureau.

 

46 per cent of school performance measures are relevant

We found 48 school performance measures (46 per cent) aligned with outputs or departmental objectives. 

We found the other measures did not clearly express how they would support DET in achieving its objectives. This is challenging for DET because its outputs are not directly aligned to its objectives. Examples include:

  • education peak bodies that rate the Victorian Registration and Qualifications Authority effective or highly effective in performing its regulatory function
  • number of school staff attending strategic business and financial support training.

 

Almost all measures are clearly written

When departments use technical terms or do not explain what is being measured, we consider those performance measures unclear.

We found only 4 of 104 school performance measures were not clearly written.


 

66 per cent of school performance measures are comparable over time

We found 66 per cent of school performance measures (69 measures) can be used to compare results over time.

33 per cent of school performance measures did not:

  • support comparison of performance over time (3 measures)
  • account for changes in population, funding or demand (32 measures).

For example, median VCE study score is not comparable over time because the result is standardised. Measures like the number of students participating in accredited vocational programs or the number of principals participating in leadership development programs depend on the population of students and principals and access to programs.
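The comparability counts above reconcile as follows (a quick check using only the figures stated in this section):

```python
# Reconciling the comparability counts for DET's 104 school measures.
total = 104
comparable = 69
no_comparison_support = 3    # did not support comparison over time
no_demand_adjustment = 32    # did not account for population, funding or demand

not_comparable = no_comparison_support + no_demand_adjustment
assert comparable + not_comparable == total

print(f"{comparable / total * 100:.0f}% comparable, "
      f"{not_comparable} measures not comparable")
```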


 

The challenge of measuring school services' performance

Developing a framework for school services performance is difficult, particularly because of the overwhelming social and demographic factors that influence outcomes.

The impact of school education on an individual is far greater than what a performance framework can measure. A student’s experience is formative: it shapes their identity and values. Students develop many skills that are not assessed, build behaviours that support lifelong physical and mental wellbeing, and are exposed to diverse perspectives.

Including data that explores these factors provides valuable context but is not necessary in BP3. DET provides some of this contextual data in its statistics on Victorian schools and teaching.

We intend this framework and model performance statement to illustrate what better-practice output performance reporting looks like. Appendix I presents data that is either available in BP3 now or elsewhere.

Alternative approaches to school service performance measures exist, but all depend on the data that is available. For example, a better measure of efficiency might be weighted by student attendance, but the data is not available. Our model uses data that is available for presenting a time series. Where data does not yet exist, departments should seek to obtain it.


 


Appendix A. Submissions and comments

Download a PDF copy of Appendix A. Submissions and comments.

 


Appendix B. Abbreviations, acronyms and glossary

Download a PDF copy of Appendix B. Acronyms, abbreviations and glossary.

 


Appendix C. Review scope and method

Download a PDF copy of Appendix C. Review scope and method.

 


Appendix D. How VAGO assessed departmental measures

Download a PDF copy of Appendix D. How VAGO assessed departmental measures.

 


Appendix E. Departmental objective changes

Download a PDF copy of Appendix E. Departmental objective changes.

 


Appendix F. Departmental output changes

Download a PDF copy of Appendix F. Departmental output changes.

 


Appendix G. New performance measures by department by attribute

Download a PDF copy of Appendix G. New performance measures by department by attribute.

 


Appendix H. Discontinued performance measures

Download a PDF copy of Appendix H. Discontinued performance measures.

 


Appendix I. Model Performance Statement

Download a PDF copy of Appendix I. Model Performance Statement.

 
