Effectiveness of the Victorian Public Sector Commission

Tabled: 8 June 2017

3 Efficiency and effectiveness

Working efficiently and effectively enables public sector agencies to make the best use of scarce resources and achieve their objectives. Robust performance information is important for agencies to understand how well they are performing. However, measuring performance can be complex, particularly when agencies are trying to measure the impact of their activities.

The Victorian Public Sector Commission's (VPSC) objectives, set out in the Public Administration Act 2004 (the Act), are complex to measure because they have multiple parts that are not all within VPSC's control. A 2012 review of the State Services Authority (SSA) conducted by the Department of Premier and Cabinet (DPC) identified the need for improved performance reporting, but also noted the difficulty of doing so for a central agency like SSA—later VPSC—that has no clear, attributable outcomes.

In this Part, we examined four case studies to understand the effectiveness of individual activities VPSC carries out, as well as its overall performance measurement.

3.1 Conclusion

We were unable to conclude on how effectively VPSC is achieving its objectives. VPSC's performance measurement is too limited to provide insight into the impact of its work. As a result, it is not possible to determine the extent to which it is achieving its objectives to strengthen the efficiency, effectiveness and capability of the public sector, and to maintain and advocate for public sector professionalism and integrity.

Our assessment of four of VPSC's key activities highlighted that it fulfils its statutory obligations. We found examples of good practice, where VPSC has effectively managed activities to achieve positive results. We also found examples of inefficiencies and gaps that compromise the efficiency and effectiveness of VPSC's work. VPSC has recently implemented improvements in some of the areas that we examined and is planning more. These are encouraging signs.

A critical gap is VPSC's understanding of its own performance. Addressing this gap is not only good practice; it would also provide better assurance that VPSC's resources are being used effectively.

3.2 Understanding performance

To understand VPSC's efficiency and effectiveness, we examined VPSC's performance measurement, monitoring and reporting, and the extent to which VPSC's Budget Paper 3 (BP3) measures accurately measure its performance.

Performance measurement

VPSC's performance measurement is limited, and provides little insight into the effectiveness or efficiency of its activities. This affects not only VPSC's ability to understand the effectiveness of its operations, but also its ability to advocate for additional funding. For example, VPSC's 2016 business case lacked evidence of VPSC's effectiveness and was not supported by robust performance information. The Department of Treasury and Finance advised government that the case for additional funding had not been fully demonstrated.

In the development of its 2016–17 annual plan, VPSC developed 101 mostly output‑based performance measures for the activities included in the plan, but it does not report against these measures. This is a key weakness and it means that VPSC cannot be sure that it is directing its resources to activities that are most effective in addressing the key risks and challenges facing the public sector.

Instead, VPSC's executive team monitors progress of some, but not all, of the activities in the annual plan through bi-monthly status reports. These status reports began in 2016, and indicate whether the activity is on track or complete, and what stage it is at—however, the reports do not contain all of the activities from the annual plan. Prior to this, no status reporting of annual plan activities occurred. Individual business areas, such as VPSC's Integrity and Advisory team, also undertake routine status updates against their specific projects and activities.

This lack of performance measurement is a significant gap. While status updates are a recent improvement, they need to be supplemented with robust performance information. This is essential for VPSC's executive team to make informed decisions about how it will allocate public resources. Further, this limited performance measurement compromises VPSC's understanding of the effectiveness and efficiency of its activities, and whether it is achieving its objectives.

Other ways of understanding performance

VPSC's annual report is the principal tool that it uses to demonstrate its accountability to Parliament and the public. VPSC's annual reports consist mainly of descriptive information about activities and the volume of work undertaken, rather than focusing on the quality of the work or outcomes.

VPSC's 2015–16 annual report gives little insight into VPSC's performance, and provides limited information about the effectiveness of VPSC's activities. The information included about the Graduate Recruitment and Development Scheme (GRADS) reports on its increased communications activities and subsequent increases in application rates, but this information is limited and it is not included for all of VPSC's activities.

Ineffective planning is one factor that has contributed to the weakness of VPSC's annual reporting (see Section 2.2). Without better measures and performance reporting, VPSC's annual reports will not be able to provide useful information.

We acknowledge the challenges associated with measuring and reporting performance, but it is essential for transparent and accountable use of public funds and the achievement of policy goals. Furthermore, not effectively measuring performance hampers VPSC's requests for increased funding, as evidenced by VPSC's unsuccessful 2016 business case.

The New Zealand State Services Commission (NZSSC) has a well-developed set of performance measures that could be a useful guide for VPSC. NZSSC's four-year plan contains a set of objectives, outcomes, impact and output measures and targets that help it understand its performance (see Appendix B for an example).

Most of VPSC's activities have only one measure, which limits understanding of its performance. In contrast, NZSSC uses a set of measures for each outcome, enabling a more comprehensive understanding of performance against cost, quality and output‑based measures.

Although NZSSC has a different role to VPSC, VPSC should consider adapting these measures or adopting a more comprehensive set of measures for each of its activities to enable a better understanding of its performance.

BP3 measures—service delivery

BP3 is published annually, and details the goods and services or outputs that departments are funded to deliver, and how they support government's strategic objectives. BP3 includes performance measures for monitoring departments' outputs.

The Victorian Government Performance Management Framework (PMF) requires measures to be appropriate and easily understood. They must consider timeliness, quality, quantity and cost, and be benchmarked over time and for comparison with similar activities in other jurisdictions. Performance measures should also demonstrate efficiency and effectiveness.

Portfolio departments are subject to the mandatory requirements of the PMF, and the BP3 measures should be consistent with the requirements and guidance in the PMF.

VPSC's BP3 measures do not provide an adequate understanding of VPSC's effectiveness. DPC has not effectively met its responsibility to ensure that the BP3 measures set for VPSC meet the requirements of the PMF.

Figure 3A shows VPSC's BP3 measures.

Figure 3A

VPSC's BP3 measures

Measure | Description | Target | 2014–15 actual | 2015–16 actual | 2016–17 expected
Quantity | Advice and support provided to the public sector on relevant issues | 80 | 80 | 80 | 80
Quantity | Number of referred reviews(a) underway or completed aimed at improving service delivery, governance and/or public administration efficiency and effectiveness | 5 | 5 | 5 | 5
Quality(b) | Proportion of recommendations arising from reviews of actions reported to be implemented by the public service | 100% | 100% | 100% | 100%
Timeliness(c) | Proportion of data collection and reporting activities completed within target time frames | 100% | 100% | 100% | 100%

(a) Referred reviews are organisational reviews requested by the Premier or ministers, not fee-for-service reviews.

(b) VPSC's quality measure refers to reviews undertaken in relation to Employment Standards.

(c) For 2015–16, VPSC's target for timeliness was 90 per cent.

Source: VAGO, based on BP3, Service Delivery.

From 2014–15 to 2015–16, VPSC met all of the targets in its BP3 measures, and is expected to do so in 2016–17. However, VPSC's BP3 measures are not a reliable means of understanding its performance because they do not comprehensively demonstrate its effectiveness.

VPSC has met both of its quantity measures every year since 2010. It does not provide any supplementary information that demonstrates why these quantity targets are reasonable, and the targets themselves do not provide any insight into performance.

In contrast, the quality measure addresses effectiveness, but only in relation to the Employment Standards—that is, it measures agencies' reported implementation of VPSC's recommendations after VPSC investigates complaints from public service employees about recruitment or other employment-related activities.

This is a proxy measure for the impact of recommendations, and it provides some insight into the quality and effectiveness of VPSC's work on promulgating codes of conduct and standards. However, VPSC has no control over the reliability of this measure because it relies on unverified reports from departments. Further, this work is not a large part of VPSC's activities.

VPSC's timeliness measure is a more useful measure of its efficiency—it measures the volume of services provided within a specific time frame.

As VPSC's role is to lead good governance in the public sector, its BP3 measures should adequately capture its performance.

3.3 Case studies of key activities

Because of VPSC's limited performance measurement and reporting, we examined its efficiency and effectiveness by looking at its delivery of four key activities:

  • organisational reviews
  • GRADS
  • codes of conduct and employment standards for the Victorian public sector
  • Workforce Data Collection and People Matter Survey.

We selected these activities to cover VPSC's two key divisions—leadership and workforce, and performance and integrity. These activities also reflect differences in the legislative basis of VPSC's work—organisational reviews and GRADS align with VPSC's responsibilities under the Act but are not explicitly stated functions, whereas codes of conduct, employment standards and data collection are explicitly stated functions.

We considered the effectiveness and efficiency of these activities.

For the four key activities we examined, VPSC undertakes a range of activities to understand their effectiveness and efficiency, including seeking stakeholder feedback, analysing data and reviewing its own work. However, these activities provide a limited understanding of VPSC's performance, and VPSC does not have a clear understanding of their overall effectiveness or the outcomes they are achieving.

The following sections provide further insights from our assessments of these four activities.

3.3.1 Organisational reviews

One of the established functions of VPSC, and SSA before it, is undertaking reviews of part or all of a public sector agency. From 2005 to 2014, SSA completed 75 reviews, and from April 2014 to 2016, VPSC completed 22. VPSC's reviews are delivered under section 39(1)(a) of the Act (see Section 1.1.2).

We examined two reviews:

  • Organisational Capability Review of Ambulance Victoria, 2016 (the AV review), commissioned by Ambulance Victoria (AV)
  • Monitoring of Department of Education and Training (DET) Integrity Reforms (the DET review), a two-stage review undertaken in 2015–16, commissioned by the Minister for Education.

VPSC develops the method, deliverables, project schedule and project management tools for each individual review, to reflect its specific terms of reference and context.

Key findings

Feedback from stakeholders indicates that VPSC's organisational reviews are effective and highly valued. We also found that VPSC follows well-developed project management practices in undertaking its reviews. Further, VPSC uses client surveys to gain insights into its performance and is committed to continuously improving its review practice.

However, VPSC could do more to understand and improve the efficiency and effectiveness of its review activities—this would enhance the effectiveness of VPSC's broader work by ensuring its reviews address public sector issues that relate directly to VPSC's priorities.

Planning and delivery

Both the AV review and the DET review demonstrated sound planning and were delivered efficiently within agreed time frames. The AV review was also delivered within budget. Both were subject to ongoing monitoring while they were underway and, at their conclusion, VPSC's team conducted an internal review to reflect on the process, identify effective practices, and identify room for improvement.

The documentary evidence we examined for the AV review demonstrated more comprehensive project planning and monitoring than that for the DET review, which reflects the different requirements for fee-for-service reviews compared to reviews funded out of VPSC's budget.

The chief executive officer (CEO) of AV approved the project brief, which defined the objectives, methodology, governance, cost and time line of the AV review. The AV review's progress and costs were monitored through weekly meetings between the review director and VPSC executives. The AV CEO confirmed that VPSC worked to efficiently manage the costs and time lines of the project.

The Minister for Education approved the project plan, time line and deliverables for the DET review. Regular monitoring occurred through meetings between the review director and VPSC executives. Because the review was funded from VPSC's budget, it was not necessary to monitor all costs, but doing so may assist VPSC to find efficiencies in its work and apply these to future reviews.

The AV CEO highlighted the value of the AV review and the professionalism of VPSC staff. This value is further evidenced by the fact that VPSC continues to receive requests to undertake reviews from public sector clients.

Use of lead reviewers

For many of its reviews, VPSC appoints one or more external lead reviewers, who are usually senior, highly experienced public servants. Both the AV and DET reviews used lead reviewers. However, the costs associated with a lead reviewer are high—for example, they accounted for 23 per cent of the AV review budget. Despite this, VPSC does not evaluate the contribution of lead reviewers, nor do VPSC's client surveys ask questions about the value of the lead reviewer from the client's perspective.

VPSC has advised that it will begin evaluating the contribution of lead reviewers, including seeking clients' views. This should enable it to find efficiencies in the way it allocates resources and understand the effectiveness of lead reviewers.

Tracking the implementation of recommendations

Both DET and AV accepted the findings and recommendations of the reviews. However, although VPSC makes recommendations through its reviews, it does not have a formal practice of tracking their implementation. Implementation is outside VPSC's control, but tracking the changes that result from VPSC's review work would provide important evidence of how it is fulfilling its statutory objectives.

VPSC is helping AV to implement its review recommendations, but this occurs on a case-by-case basis rather than for every review.

Similarly, VPSC has not systematically leveraged insights gained through its organisational reviews to identify broader issues across the public sector. VPSC took an important first step towards doing this in April 2017, when it aggregated the issues identified in its organisational reviews to shape its future initiatives.

Measuring effectiveness

VPSC reviewed the effectiveness of its review activities both during and at the end of the DET review. It did so by determining what work practices were effective and how to manage the relationship, given that DET did not commission the review.

VPSC has 10 measures relating to seven review activities in its 2016–17 annual plan, but these are output based or relate to endorsement from the review commissioner. Figure 3B shows the performance measures for three of the activities.

Figure 3B

Review activities and measures

Activity | Performance measure

Subject to the direction of the Premier, lead implementation of key recommendations arising from the Review of Victoria's Executive Officer Employment and Remuneration Framework:

  • Conduct industry segment reviews of executive employment and remuneration | Reviews supported by Premier and relevant minister and portfolio secretary
  • Establish an executive remuneration panel | Panel established
  • Commission the design of a framework and tools for determining VPS executive classification and remuneration | Commission designed

Source: VAGO, based on VPSC.

The last two of these measures are output-based and do not provide an understanding of effectiveness. The first measure, which focuses on support for the review from the Premier and relevant minister and portfolio secretary, is appropriate because it measures whether the output aligns with the specifications of the review—in this way, it acts as a proxy measure for quality. Even so, it does not provide an understanding of the efficiency of the activity and, as a result, is not comprehensive.

VPSC does not report against these measures, but provides high-level information to the VPSC executive about the status of projects. This limits VPSC's ability to improve its review activity, to understand whether reviews are successful, and to apply review findings to its broader activities within the public sector.

Responses to the reviews from AV and the Minister for Education indicate that review commissioners value VPSC's reviews. However, the lack of reporting against these measures is a missed opportunity for VPSC to demonstrate its effectiveness, rather than relying on ongoing demand for its review services and positive feedback from clients.

3.3.2 Victorian Graduate Recruitment and Development Scheme

VPSC delivers GRADS on behalf of participating public sector agencies. GRADS is a 12‑month employment program intended to recruit high-performing graduates to meet the current and future needs of the Victorian public service. VPSC is responsible for coordinating agencies' participation in the program, marketing and promotion, procurement and contract management, delivery of a learning and development program for graduates, and supporting stakeholders.

In 2016, VPSC's management of GRADS resulted in the recruitment, learning and development of 83 graduates across 11 agencies.

The total cost of delivering the program in 2016 was $1.596 million. VPSC delivers GRADS on a cost-recovery basis—participating agencies cover all costs, which are charged on a fee-per-graduate basis. This fee model means that the higher the number of participants in the program, the lower the cost per graduate for departments.
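As an illustration only (assuming the program's costs are largely fixed), the 2016 cost of $1.596 million spread across 83 graduates equates to roughly $19 200 per graduate, whereas the same cost base spread across 100 graduates would equate to about $16 000 per graduate.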

Key findings

VPSC has a solid understanding of the GRADS program's performance each year, but does not have a comprehensive understanding of its impact. There is an ongoing lack of strategic oversight of the program, which means that the long-term outcomes are not known.

Seeking feedback and reviewing the program

VPSC regularly seeks feedback on the management and operation of the program from graduates and other stakeholders, both formally—through committees and surveys—and informally. VPSC uses this feedback to improve the way it manages GRADS—for example, it recently changed its model for delivering learning and development from multiple providers to a single provider.

While VPSC reviews the program and seeks feedback, it could improve the way it delivers GRADS so it can gain a greater understanding of the effectiveness and efficiency of the program. A comprehensive review of GRADS is scheduled for 2017, which should assist VPSC to do this.

Strategic oversight

SSA established the GRADS Governance Board in 2013, to improve strategic oversight of the program in response to a recommendation from a 2012 review. The governance board continued to operate under VPSC. However, VPSC advised that the board did not have any strategic impact, and it was disbanded in 2015.

After the board was disbanded, the Human Resources Directors Network, made up of representatives from the seven departments, was intended to provide a strategic oversight role. However, VPSC's engagement with the network has been limited and the network is largely informal. This arrangement has not provided any strategic oversight of GRADS.

As a result, VPSC's management of GRADS does not incorporate the strategic direction needed to identify and respond to the future needs of the Victorian public sector. There is little understanding of the long-term effectiveness of the program, and GRADS does not have defined objectives to work towards or measures to help evaluate its performance. There is currently no governance structure in place to facilitate these activities.

Measuring effectiveness

VPSC's 2016–17 annual plan contains a range of measures for GRADS, but they are output-based and VPSC does not report against them. However, VPSC did report on the effectiveness of its marketing and attraction strategies for GRADS in its 2015–16 annual report—for the 2016 intake, VPSC created a video and changed its broader attraction strategy, which led to a 27 per cent increase in applications for graduate roles.

Gathering data

VPSC could improve its review activities for GRADS by collecting longer-term data on retention, which would help it to understand the longer-term effectiveness of the program and whether the costs of participating are an efficient means of recruitment for agencies. VPSC does not currently undertake this work.

VPSC does not collect raw application and assessment data on education, cultural background or other demographic information from applicants. VPSC could use this data to better understand the pool of applicants, which, in turn, could help it to identify possible efficiencies within the application and assessment process and contribute to broader VPSC initiatives around workforce management. Given the strategic priority that VPSC places on data, this is a missed opportunity.

3.3.3 People Matter Survey and Workforce Data Collection

Collecting and reporting on whole-of-government data is one of VPSC's core functions under the Act. VPSC collects and maintains six major datasets, shown in Figure 3C.

Figure 3C

Major datasets maintained by VPSC

Dataset | Description
People Matter Survey (PMS) | Results of an annual survey on employees' views about the application of public sector values and employment principles in their workplaces
Workforce Data Collection (WDC) | Data from an annual census of all public sector employees
Executive Data Collection | Data from an annual census of public sector executives
Government Sector Executive Remuneration Panel (GSERP) | Data on remuneration of public sector executives
Progression Data Collection | Data on progression outcomes for the public sector
Government Appointments and Public Entities Database (GAPED) | Data on appointments and composition of public sector boards

Source: VAGO.

These datasets are essential for VPSC to fulfil its statutory functions of:

  • monitoring adherence to public sector values and codes of conduct
  • reporting to agency heads on their organisations' workforces, and their adherence to the codes of conduct.

These data collections are also an important input for publications such as the annual report State of the Public Sector in Victoria and other initiatives.

We examined VPSC's management of two key datasets—PMS and WDC.

Key findings

Timeliness of data collection activities

VPSC has delivered its data collection activities within agreed time frames. VPSC's BP3 timeliness measure looks at the proportion of its data collection activities completed within agreed time frames, as an indicator of the efficiency of its activities.

In 2015–16, the timeliness target was 90 per cent, but VPSC exceeded this target, delivering 100 per cent of its activities on time.

Addressing issues with data accuracy, sustainability and security

Although VPSC exceeded its timeliness measures, issues with data management documentation, data storage, obsolescence of data collection platforms and manual data validation methods create risks to the accuracy, sustainability and security of the data. VPSC's Data System Strategy 2016–19 details these issues and their impact on its efficiency, as well as how they will be addressed.

One example of this is VPSC's decision to replace the obsolete Workforce Analysis and Collection Application (WACA) tool—the first stage of this strategy. VPSC included this action in its 2016–17 annual plan and is currently considering options for how to progress it.

VPSC previously made efforts to replace the WACA tool. It prepared procurement documentation in 2015, but the project did not proceed.

It is important for VPSC to follow through on its current attempt.

Processes and controls

Our analysis of the processes associated with PMS and WDC shows that they are sufficient for VPSC to fulfil its statutory obligations. However, there are flaws in VPSC's processes and controls that create a risk that the data may be unreliable. These risks have not materialised, and the PMS and WDC data we tested was accurate. Nevertheless, these risks continue to threaten VPSC's ability to deliver activities that rely on data analysis.

VPSC validates and investigates its data, but documentation for these activities is incomplete.

WDC is collected through WACA, but this tool is no longer supported by the original developer and cannot be repaired or enhanced. VPSC has attempted to manage this issue by using additional spreadsheets to support its own data validation. However, this process is manual and highly inefficient.

VPSC previously used the WACA tool for both WDC and the VPS executive data collection. However, due to WACA's declining functionality, the VPS executive data collection is now received by email and manually stored in spreadsheets, so it is not subject to appropriate data warehousing or security. These issues highlight the importance of VPSC following through on its plans to replace WACA.

Reporting survey results back to clients

VPSC has recently worked to enhance its use of data by developing new products to report the results of its data collection back to clients. These include:

  • presenting PMS results as heat maps to make data more meaningful
  • preparing 'data insights' reports on a range of specific topics, using data from the 2016 PMS survey
  • piloting data dashboards for each department to enable them to gain insights and use this data for their planning.

These are positive developments that demonstrate VPSC's efforts to leverage its valuable datasets. However, as VPSC's use of data increases, so too does the need for it to address issues with management and storage of its datasets.

3.3.4 Code of conduct and employment standards

Under the Act, two of VPSC's functions are issuing and applying codes of conduct derived from the Public Sector Values, and issuing and applying standards derived from the Employment Principles. VPSC's codes of conduct and standards work is linked with its statutory objective of maintaining and advocating for public sector professionalism and integrity.

Figure 3D shows VPSC's responsibilities for issuing and applying codes of conduct and standards.

Figure 3D

Values, principles, standards and codes

Note: The Codes of Conduct for VPS Employees and Employees of Special Bodies were issued in 2007, and updated and reissued in 2015. The Code of Conduct for Directors of Victorian Public Entities was issued in 2006, and updated and reissued in 2016.

Source: VAGO.

Public sector agencies' employment processes must be consistent with the principles and the standards. There are six standards:

  • fair and reasonable treatment
  • merit in employment
  • equal employment opportunity
  • human rights
  • reasonable avenue of redress
  • career public service.

Codes of conduct are derived from the values set out in the Act, so they remain relatively stable unless there are changes to the legislation. VPSC updated the codes of conduct in 2015 and 2016, prompted by the changes to the Act in 2014 that created VPSC.

The Standards for the Application of Public Sector Employment Principles (the standards) are mandatory. They were originally issued in 2006 and were updated and reissued by VPSC in January 2017. VPSC communicates these changes to heads of public agencies.

Section 40(1)(c) of the Act requires VPSC to monitor and report to the heads of public sector agencies on compliance with the values, codes, principles and standards. VPSC's performance and analytics team does this annually, using the results of the PMS.

VPSC's Integrity and Advisory team responds to enquiries from public sector employees about their employer's compliance with the standards, investigates these enquiries and makes recommendations to public sector agencies to help them comply. In 2016, VPSC received 179 enquiries. It exercised formal information request powers and made recommendations on 13 of these.

Key findings

VPSC's codes and standards work complies with the requirements of the Act.

Developing a program of work

Through the Integrity Strategy, introduced in 2016, VPSC now monitors its codes of conduct and standards work more actively and has a more defined, consolidated program of work than it had previously. These are positive steps.

The Integrity Strategy aims to strengthen integrity and promote sustained community and government trust in the Victorian public sector. The strategy sets a program of work for the Integrity and Advisory team, but VPSC has not yet developed systems to help it understand how effective its codes of conduct and standards work has been.

VPSC's performance measurement in this area is output-based and measures compliance with the Act, rather than the effectiveness of its activities.

VPSC has undertaken activities to apply PMS results to planning its codes of conduct and standards work under the Integrity Strategy, but this work is ad hoc rather than systematic. For example, VPSC's analysis of 2016 PMS results helped it to identify a cohort of departments with bullying and harassment issues, and it is currently developing resources to help these departments improve their performance in these areas.

Identifying issues and monitoring compliance

As part of the Integrity Strategy, VPSC is in the early stages of developing a matrix to identify public sector agencies that need help to comply with codes and standards. A small number of them will be selected for annual review. These reviews aim to identify noncompliance before a breach occurs and should enable VPSC to become more effective in targeting compliance issues with the codes and standards.

The Integrity Strategy includes an initiative to use data from the PMS, reviews and enquiries to identify emerging issues, which should enable VPSC to better exploit its datasets to anticipate sector issues and, potentially, to reduce the number of enquiries that VPSC must address.

VPSC has also undertaken some preliminary analysis activities, using the results of its organisational reviews to identify key areas of concern.

VPSC's activities in relation to bullying and harassment, the Integrity Strategy and its anticipation of future demand demonstrate a willingness to be responsive to emerging issues. While this work is in its early stages, VPSC is taking positive steps to develop a more strategic approach to its work.
