Contract Management Capability in DHHS: Service Agreements

Tabled: 20 September 2018

Overview

The Department of Health and Human Services (DHHS) partners, through service agreements, with approximately 1 900 funded organisations to deliver person-centred services and care. It spends approximately $2.8 billion annually in this way. Service agreements define DHHS's and funded organisations' mutual responsibilities and obligations.

Funded organisations provide a wide range of health and human services through service agreements. Some of these services support clients who are particularly vulnerable, including children placed in out of home care and clients experiencing homelessness and family violence.

Previous reviews of government departments' partnership with community sector organisations have commonly highlighted the need for improved oversight of outsourced health and human services, particularly inadequate monitoring practices.

In this audit, we assessed whether DHHS has sufficient capability in managing service agreements to ensure funded organisations deliver agreed health and wellbeing supports and outcomes to clients.

We make five recommendations for DHHS.


Transmittal Letter

Ordered to be published

VICTORIAN GOVERNMENT PRINTER September 2018

PP No 442, Session 2014–18

The Hon. Bruce Atkinson MLC
President
Legislative Council
Parliament House
Melbourne
 
The Hon. Colin Brooks MP
Speaker
Legislative Assembly
Parliament House
Melbourne
 

Dear Presiding Officers

Under the provisions of section 16AB of the Audit Act 1994, I transmit my report Contract Management Capability in DHHS: Service Agreements.

Yours faithfully

[Signature of the Auditor-General]

Andrew Greaves 
Auditor-General

20 September 2018


Acronyms and abbreviations

ANAO   Australian National Audit Office
CIMS   Client Incident Management System
DHHS   Department of Health and Human Services
DPAC   Divisional Performance, Assurance and Compliance
FAC    Funded Agency Channel
FOPMF  Funded Organisation Performance Monitoring Framework
HACC   Home and Community Care
KPI    key performance indicator
LEO    local engagement officer
MACNI  Multiple and Complex Needs Initiative
RAT    Risk Assessment Tool
SACC   Service Agreement Compliance Certification
SAMS2  Service Agreement Management System
SDT    Service Delivery Tracking
SSG    service standards and guidelines
TAFE   technical and further education
VAGO   Victorian Auditor-General's Office
VGPB   Victorian Government Purchasing Board
VPS    Victorian Public Sector


Audit overview

The Department of Health and Human Services (DHHS) is responsible for policies, programs and services to support and enhance the health and wellbeing of all Victorians.

DHHS partners, through service agreements, with approximately 1 900 funded organisations to deliver person-centred services and care. It spends approximately $2.8 billion annually in this way. Service agreements define DHHS's and funded organisations' mutual responsibilities and obligations.

Funded organisations provide a wide range of health and human services through service agreements. Some of these services support clients that are particularly vulnerable, including children placed in out of home care and clients experiencing homelessness and family violence.

Establishing and maintaining sufficient contract management capability across both health and human services is inherently challenging. Service agreements must be managed in a way that caters to a wide range of service types and client needs of varying complexity across the state. DHHS needs assurance that outsourced services are delivered as contracted and to the required quality, and that clients' safety is not compromised.

Previous reviews of government departments' partnership with community sector organisations have commonly highlighted the need for improved oversight of outsourced health and human services, in particular inadequate monitoring practices.

The objective of the audit was to determine whether DHHS has sufficient capability in managing service agreements to ensure funded organisations deliver agreed health and wellbeing supports and outcomes to clients.

Conclusion

DHHS does not have sufficient capability to manage its service agreements.

In responding to multiple past reviews highlighting a need for improved oversight of outsourced health and human services, DHHS's approach to managing and monitoring service agreements has become increasingly fragmented and duplicative, and is not commensurate with service risk. Its capability has been further constrained by a lack of investment in developing its service agreement staff. This has denied staff opportunities to acquire and maintain core contract management skills and has resulted in an overall lack of staff awareness about the purpose of their role.

DHHS has a duty of care to the individuals who access its contracted services, many of whom are particularly vulnerable. Its contract management shortcomings compromise its ability to consistently meet this obligation and heighten the risk of further instances where significant client safety risks go undetected. Ultimately, a more strategic service agreement management framework is needed that is integrated, risk-based and capable of reporting on performance at a system-wide level.

It is encouraging to see that DHHS is already working to address these issues and to significantly reform its service agreement management function.

Findings

Setting service agreement requirements

Performance standards refer to the quality of the service or activity that funded organisations are contracted to deliver, such as family violence support services. Relevant agreement clauses, departmental policies and guidelines fall within this definition.

DHHS needs sufficient assurance that clients are receiving quality services in a proper, timely and efficient manner. This requires that service agreements:

  • contain clearly defined performance standards, deliverables and review mechanisms
  • impose requirements on funded organisations that are proportionate to their risk profiles.

Service agreement performance standards

While some service agreement performance standards are explicitly listed within the agreement itself, others are detailed in documents that sit alongside the agreements. For agencies that deliver a broad range of activities, the applicable standards can be extensive. Organisations would benefit from DHHS clearly linking standards to deliverables where relevant, within the agreement, so that specific requirements for each funded activity are clear.

Service agreement deliverables

Deliverables are service activity outputs, including what needs to be delivered, to what standard and in what timeframe. Performance measures fall within this definition.

The performance measures in service agreements are inadequate. Not only are they inconsistent across service agreements for similar services, they are also inconsistent across documents and systems recording performance measures for the same organisation. Service agreements also do not consistently include mandatory performance measures set out in the Department of Health and Human Services Policy and Funding Guidelines 2017 (Policy and Funding Guidelines).

These issues indicate a lack of system-wide oversight and quality control over service agreements within DHHS. DHHS does not perform a system-wide review of service agreement performance measures for similar activities to ensure that they are both set and recorded in a compliant and consistent manner.

Performance measures are also heavily output-driven, lack focus on service quality, and are not clearly linked to DHHS's desired service system outcomes.

Service agreement review mechanisms

Review mechanisms refer to the triggers and supporting processes that enable variations to the terms and conditions of the service agreement.

The mechanisms to review the terms and conditions of DHHS service agreements are sound. However, DHHS lacks assurance that variations are being processed in accordance with these mechanisms. Specifically, DHHS has not completed its annual variation compliance audit for 2017–18 after first introducing this process in 2016–17.

Categorising funded organisations according to risk

The scale and complexity of outsourced health and human services varies greatly, so it is important that the requirements set under each service agreement are targeted and proportionate to service risks.

DHHS has used a growing number of mechanisms to identify and manage service agreement risks. Over time these mechanisms have become increasingly fragmented and largely disconnected from each other.

One key mechanism is a risk-tiering framework that DHHS introduced in July 2015 to categorise funded organisations according to risk. However, the framework has limited coverage, applying only to approximately one-third of all organisations. Additionally, DHHS does not use the risk-tiering results, nor any results from its other risk oversight mechanisms, to inform funded organisations' service agreement obligations. Consequently, funded organisations commonly viewed their compliance and administrative obligations as excessive and duplicative.

Funded organisations' administrative and compliance requirements

Through our online survey of funded organisations, we sought views on service agreement administrative and compliance requirements:

  • Seventy per cent of surveyed funded organisations either agreed or strongly agreed that their administrative and compliance obligations were proportionate to service risk. However, funded organisations' open-text responses commonly raised concerns about excessive administrative and compliance requirements set by DHHS that were not proportionate to organisation size or level of funding provided.
  • A high proportion of funded organisations view their service agreement administrative and compliance requirements as duplicative, at both a departmental and inter-jurisdictional level—52 per cent and 67 per cent respectively.
  • Funded organisations that deliver services in multiple DHHS areas reported greater misalignment between their administrative and compliance requirements and their service risks, as well as higher duplication across data and reporting obligations.
  • Only about half of the surveyed organisations believe they are consistently able to meet their service agreement administrative and compliance obligations.

Staff skills, capabilities and capacity

The varied and often competing priorities of service agreement staff reinforce the need for DHHS to clearly define their roles and responsibilities, and the key skills and capabilities they require. DHHS also needs to provide new and experienced staff with sufficient opportunities to acquire and develop key skills and capabilities over time.

Defining required skills and capabilities

DHHS restructured its service agreement management function at a divisional and area level across the first half of 2018. This included:

  • combining the roles of the human services-focused local engagement officers (LEO) and the health services-focused program advisers into a single service agreement adviser role that extends across both portfolios
  • creating a new central performance unit to oversee and manage funded organisation performance at a statewide level
  • creating a new regulatory enforcement unit to focus on system-wide regulation of health and human service practitioners, providers and facilities.

The new position descriptions for service agreement advisers—as well as the newly created regulatory and performance units—focus more explicitly on managing the performance of funded organisations against contractual obligations, compared to the previous position descriptions for LEOs and program advisers. The new position descriptions more closely align with better practice contract management skills and capabilities, such as those from the Australian National Audit Office's (ANAO) 2012 better practice guide Developing and Managing Contracts (ANAO's better practice guide) and the Victorian Government Purchasing Board's (VGPB) VPS Procurement Capability Framework.

In adopting a more performance management-focused approach, it is important that DHHS also retain its focus on relationship management and tailor its engagement approach to the capability of each funded organisation, as well as to the risks associated with the services they provide.

Beyond aligning position descriptions more closely with better practice, DHHS will need to ensure that its staff perform their roles according to the new position descriptions and do not undertake tasks outside their roles, as occurred prior to the restructure. Our DHHS staff survey results show that a high proportion of respondents believe much of their work is on tasks outside their position description:

  • 28 per cent of respondents believe that somewhere between 25 and 50 per cent of their tasks are outside their position description
  • 21 per cent of respondents believe that over 50 per cent of their tasks are outside their position description.

Examples of additional tasks that staff have performed outside of their roles include meeting service clients to resolve individual issues and finding information and data for DHHS's central office.

Providing learning and development pathways

DHHS provides some training for service agreement staff, including an introduction to managing service agreements and training focused on how to use relevant DHHS systems and follow established processes.

The training does not sufficiently focus on good practice principles for contract or risk management. Although the introductory program covers good practice contract management and governance principles, the content is high-level and is not sufficiently targeted to equip service agreement staff with the contract management and governance skills needed to effectively manage service agreements.

Results from our online survey of DHHS service agreement staff also indicated that:

  • 29 per cent of respondents viewed their role orientation and induction as ineffective at giving them the basic skills needed to manage service agreements
  • 32 per cent of respondents viewed their training as ineffective at building and maintaining the skills needed to manage service agreements
  • only 76 per cent of respondents had an individual performance plan
  • 32 per cent of respondents viewed the performance planning and review process as ineffective at meeting their learning and development needs.

Corporate knowledge risks

Only two key DHHS staff hold a significant amount of corporate knowledge relating to the DHHS Service Agreement Management System (SAMS2), which DHHS uses to record and manage service agreements. One of these two staff members recently moved into another role within DHHS but is still regularly called upon to assist with SAMS2-related issues and queries. DHHS currently has no formal measures in place to capture the knowledge of these two staff.

This poses a risk to DHHS and its ongoing capacity to manage service agreements.

Monitoring and managing performance of funded organisations

Performance monitoring framework

DHHS's Funded Organisation Performance Monitoring Framework (FOPMF) provides the process for DHHS staff to assess funded organisations' compliance with service agreement requirements and respond to identified risks and underperformance.

There are limitations in FOPMF's design which reduce its effectiveness as a performance management framework:

  • FOPMF is essentially a one-size-fits-all framework, with some minor exceptions where FOPMF requirements are either optional or not applicable. It does not scale to account for the varying complexities and sizes of funded organisations, nor their risk profiles.
  • FOPMF monitoring tools are heavily compliance driven, and while this helps ensure funded organisations meet legislative and policy requirements, there is a lack of focus on monitoring service quality and performance issues.
  • FOPMF drives a fragmented and duplicative approach to collecting performance information. In particular, DHHS staff need to enter performance data into various systems, which makes completing FOPMF monitoring tools administratively difficult and time consuming. This is further compounded by the lack of clarity in FOPMF guidance about the frequency of performance data collection and overlapping requirements across different FOPMF monitoring tools.

Our DHHS staff survey highlighted that overall satisfaction with FOPMF is relatively low. Only 42 per cent of respondents said they agree or strongly agree that FOPMF helps them monitor and manage the performance of funded organisations effectively.

Applying the performance monitoring framework

DHHS service agreement staff are not applying FOPMF as intended.

The uptake of FOPMF tools has been inconsistent. The main reasons for this are design limitations, lack of staff awareness about FOPMF components, insufficient training, and a heavy reliance on alternate local systems and tools.

The low uptake of the Risk Assessment Tool (RAT)—52 per cent of surveyed FOPMF users reported using it—is particularly problematic, as the tool is intended to ensure staff assess the severity of performance issues consistently and accurately. This undermines the ability of staff to track actions and address underperformance effectively and in a timely manner.

Our analysis found that 127 planned remedial actions to address funded organisation performance issues were overdue as at 17 April 2018, by an average of 264 days.
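To make the arithmetic behind this finding concrete, the sketch below shows one way to compute overdue remedial actions and their average days overdue. It is a minimal illustration using invented records; the field names and data structure are assumptions, not DHHS's systems.

```python
from datetime import date

# Invented remedial action records; field names are illustrative only.
actions = [
    {"id": "A-001", "due": date(2017, 7, 27), "completed": None},
    {"id": "A-002", "due": date(2018, 2, 14), "completed": date(2018, 3, 1)},
    {"id": "A-003", "due": date(2017, 11, 20), "completed": None},
]

as_at = date(2018, 4, 17)

# An action is overdue if it remains incomplete past its due date.
overdue = [a for a in actions if a["completed"] is None and a["due"] < as_at]
days_overdue = [(as_at - a["due"]).days for a in overdue]

print(f"{len(overdue)} actions overdue, "
      f"average {sum(days_overdue) / len(days_overdue):.0f} days overdue")
```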

Additionally, we could not find evidence that DHHS had used existing performance information—generated through FOPMF or otherwise—to inform future service agreement funding decisions. This is despite DHHS's documented guidance instructing staff to do so.

Recommendations

We recommend that the Department of Health and Human Services:

1. apply centralised, system-wide quality assurance when setting service agreement performance measures so that they:

  • are set consistently across different service agreements where appropriate
  • are recorded consistently across different documents and systems
  • clearly link to desired service system outcomes (see Section 2.2)

2. develop and apply a system-wide framework for risk-profiling funded organisations that:

  • integrates the department's various disconnected risk oversight mechanisms
  • is applied to all funded organisations
  • is used to set service agreement requirements that are proportionate to the level of risk associated with the funded organisation and the services they are funded to deliver (see Section 2.3)

3. develop and implement support structures to ensure staff skills and capabilities, and the tasks performed, align with the new position descriptions including:

  • ongoing regular supervision and support for all service agreement staff that reinforces the new roles and responsibilities
  • individual performance plans for all service agreement staff that reflect the identified skills and capabilities needed to manage service agreements
  • a clear learning and development pathway for all service agreement staff for developing and attaining the identified skills and capabilities needed to manage service agreements (see Sections 3.2, 3.3 and 3.4)

4. capture and retain the corporate knowledge held exclusively by key staff in relation to its Service Agreement Management System (SAMS2) (see Section 3.5)

5. redesign its Funded Organisation Performance Monitoring Framework so that it:

  • scales monitoring effort according to service risk, organisational capability and funding levels
  • balances monitoring effort between compliance and service quality
  • integrates and streamlines performance data collection arrangements
  • systematically informs future service agreement funding decisions (see Sections 4.2 and 4.3).

Responses to recommendations

We have consulted with DHHS and we considered its views when reaching our audit conclusions. As required by section 16(3) of the Audit Act 1994, we gave a draft copy of this report to DHHS and asked for its submissions or comments. We also provided a copy of the report to the Department of Premier and Cabinet.

DHHS provided a response which is summarised below. The full response is included in Appendix A.

DHHS acknowledged the value of this audit and accepted all five recommendations in full. It provided an action plan that addresses each recommendation.


1 Audit context

DHHS is responsible for policies, programs and services to support and enhance the health and wellbeing of all Victorians.

DHHS's service responsibilities are vast and include:

  • health services—acute health care, aged and home care, primary and dental health, mental health and drug services
  • human services—child protection and family services, housing assistance, community participation and disability services.

DHHS plays multiple roles in delivering health and human services as shown in Figure 1A.

Figure 1A
DHHS roles in delivering health and human services

Source: VAGO.

Collaboration, including through partnership with people and organisations, is one of DHHS's core values. DHHS is responsible for ensuring service systems are sustainable and capable of delivering continuity of care. This includes supporting funded organisations to provide health and human services and ensuring its oversight balances appropriate accountability and administrative obligations.

DHHS partners with approximately 1 900 funded organisations to deliver person-centred services and care.

1.1 Service agreements

A service agreement is a contract between a department and an organisation to deliver services on behalf of government.

DHHS spends approximately $2.8 billion annually on funded organisations to deliver services to Victorians through service agreements. The service agreement defines DHHS's responsibilities and obligations in funding organisations, and organisations' responsibilities and obligations in delivering services.

Statement of Priorities (SOP)—annual accountability agreements between Victorian public healthcare services and the Minister for Health.

The Victorian Common Funding Agreement is mandated for all Victorian Government departments that fund not-for-profit community organisations to deliver services and projects. DHHS service agreements are based on the Victorian Common Funding Agreement, with additional clauses and schedules reflecting the nature of the services funded. Service agreements typically follow a four-year cycle, except for disability services, where contracts are limited to three years under the Victorian Disability Act 2006. Public healthcare services, such as public hospitals, dental health services and Ambulance Victoria, are managed through a Statement of Priorities (SOP) instead of a service agreement.

DHHS uses a standard service agreement template for all organisations that it funds. Figure 1B details the structure of the standard agreement.

Figure 1B
DHHS standard service agreement structure

Signatories: Contains the signature clauses for authorised persons to sign for the department and the organisation.

Background: Contains a brief background to the service agreement and describes reasons for the organisation and the department entering into the service agreement.

Details: Contains core service agreement details—the organisation's legal name, the department's name, the agreement's start and end dates and the primary contact details of each party.

Terms and conditions: Contains the standard terms and conditions of the service agreement.

Schedule 1: Lists the applicable departmental policies related to the delivery of services by the organisation.

Schedule 2: Includes:

  • a funding summary and payment schedule
  • data collection requirements (for performance reporting and other data reporting)
  • services to be provided by the organisation
  • funding to be paid by the department for the services
  • service performance measures and targets
  • delivery catchments information if applicable
  • any additional requirements related to the specific service plan or activity.

Schedule 3 (optional): Enables the department and organisation to record any special conditions and/or actions that sit outside the service plan.

Source: VAGO.

DHHS annually updates its Policy and Funding Guidelines, which contain information for managing and administering service agreements. Descriptions of funded activities are linked to the service agreement and provide further detail on an organisation's service delivery, regulatory and compliance obligations. The guidelines also include service standards and guidelines (SSG) and applicable policies.

As with all procurement, DHHS is required to manage these contracts actively to ensure providers deliver the purchased services to the appropriate standard, and that they represent value for money for the investment of government funds. Managing service agreements requires different and more complex capabilities to those necessary for managing contracts for commercial goods and services. In particular, managing service agreements requires staff, supported by systems and processes, to carefully balance objectives of delivering quality services, fulfilling a duty of care to clients, and maintaining a sustainable service system across Victoria.

Based on currently available data for the 2015–19 agreement period in SAMS2, the following are some key statistics for service agreements as of April 2018:

A short-form agreement is commonly used for organisations that receive funding that is low in value and one-off in nature.

  • In total, 1 927 organisations hold 2 680 distinct agreements with DHHS. This includes 563 organisations that have a short-form agreement.
  • The total value of these agreements over four years is $11.3 billion, an average of $5.84 million per funded organisation.
  • The funding provided to organisations varies significantly, from over $350 million to as low as $983. The top 10 funded organisations account for $2.6 billion of funding, whereas the bottom 100 organisations make up only $556.4 million.

Figure 1C shows the number of organisations funded by quartile.

Figure 1C
Funding distribution for current service agreement period (2015–19)

Per cent of funding | Number of organisations | Median value for each cohort ($) | Average value for each cohort ($)
First 25            | 11                      | 244 594 467                      | 249 936 650
25–50               | 40                      | 56 018 739                       | 71 695 808
50–75               | 114                     | 23 397 983                       | 24 728 434
75–100              | 1 762                   | 232 415                          | 1 601 362
Total/All           | 1 927                   | 290 519                          | 5 842 126

Note: Amounts include both standard and short-form service agreements.
Source: VAGO based on DHHS data.
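As an illustration of how a cohort breakdown like Figure 1C can be derived, the sketch below walks organisations from largest to smallest funding and cuts a new cohort each time cumulative funding passes another 25 per cent of the total. The funding figures are invented; this is a sketch of the method, not DHHS's data or code.

```python
import statistics

# Invented per-organisation funding totals (dollars), largest first.
funding = sorted(
    [350_000_000, 120_000_000, 45_000_000, 9_000_000,
     2_500_000, 600_000, 75_000, 983],
    reverse=True,
)

total = sum(funding)
cohorts, cohort, cumulative, boundary = [], [], 0, 0.25

# Cut a cohort each time cumulative funding crosses another quarter of the total.
for amount in funding:
    cohort.append(amount)
    cumulative += amount
    if boundary < 1.0 and cumulative >= boundary * total:
        cohorts.append(cohort)
        cohort, boundary = [], boundary + 0.25
if cohort:
    cohorts.append(cohort)

for i, c in enumerate(cohorts, start=1):
    print(f"Cohort {i}: {len(c)} organisations, "
          f"median ${statistics.median(c):,.0f}, average ${statistics.mean(c):,.0f}")
```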

1.2 Service agreement framework

The framework that supports DHHS and funded organisations to implement and manage service agreements comprises the following components:

  • Policy and Funding Guidelines—information for managing and administering service agreements, including descriptions of funded activities
  • Service agreement information kit—information for funded organisations about service agreement requirements
  • Service Agreement business rules and guidelines—information to assist DHHS staff in managing and meeting DHHS's contractual obligations under the service agreement
  • FOPMF—a toolkit comprising checklists and a RAT to assist DHHS to monitor funded organisations' performance
  • information systems—to assist with managing service agreements and collecting client and program data
  • Funded Agency Channel (FAC)—an online portal through which funded organisations can access information about their service agreements and related resources.

1.3 Roles and responsibilities

The responsibility for executing, managing and monitoring service agreements rests with various parts of DHHS in central and regional offices. Figure 1D describes the distribution of responsibilities.

Figure 1D
Service agreement roles and responsibilities across DHHS

Central office

Operational Performance and Quality Branch (from February 2018):

  • develop and maintain service agreement policies, frameworks, systems and other resources
  • renegotiate the standard service agreement template
  • provide system oversight and lead the development of a new operating model
  • manage FOPMF including online functions
  • develop learning and development tools and provide training to staff
  • advise and support staff and funded organisations on systems and policy issues
  • advise and support divisions and areas undertaking service reviews
  • provide centralised oversight and analysis of funded organisation performance
  • undertake complex or high-risk performance reviews and other risk assessment
  • process variations to funding commitments where DHHS is implementing complex or systemic change.

Program areas:

  • provide program requirements and guidelines, measures and applicable departmental policies
  • manage service agreements for funded activities (only performed by some central program areas).

Divisions (x 4)

Service agreement contact officers/Deputy Secretary/financial delegates:

  • approve funding commitments under service agreements (subject to delegation)
  • analyse funded organisations' financial reporting and advise monitoring staff on their financial health (performed by divisional finance teams, and by Central Office Finance Branch for centrally managed agreements)
  • develop performance reports and analytics for areas and divisions (Performance and Analysis units).

Areas (x 17)

Agency Performance and Systems Support Units (from June 2018):

  • service agreement advisers perform day-to-day management of service agreements with funded organisations, including monitoring performance and adherence to DHHS policy, program guidelines and requirements.

Before June 2018, the service agreement adviser role was performed by:

  • LEOs for the human services portfolio
  • program advisers for funded activities for the health services portfolio.

Source: VAGO.

1.4 Recent changes

DHHS instigates a service review where it identifies a high level of risk or issues of concern with a funded organisation. It can be collaborative or investigative in nature.

In October 2017, DHHS announced plans to restructure aspects of its organisation, which affect responsibilities for managing service agreements.

Central office changes

The new structure within DHHS's central office took effect in February 2018. It brings together staff responsible for establishing service agreements and developing policies and frameworks for performance monitoring, and staff responsible for broader, system-wide performance and quality. These functions were previously located in different parts of DHHS. The purpose of this restructure is to establish a single point of responsibility for the oversight of service agreement policy, and for inquiry and action to address underperformance of funded organisations. The change also included the creation of a new leadership position overseeing the performance of service agreements, and new positions to perform data analysis and to support and strengthen compliance.

This new organisational design aims to improve alignment between service agreement creation, execution and analysis and to ensure funded organisations meet the conditions of both operational and financial performance.

Area-level changes

In May 2018, DHHS announced changes at the area level that included the creation of an Agency Performance and System Support Unit within each of DHHS's 17 Areas. Each unit consolidates the service agreement management functions for health and human services, with the roles of LEO and program adviser combining into one service agreement adviser role.

These changes aim to:

  • improve the capacity and capability of staff to manage service agreement performance in a risk-based manner
  • focus effort on improving funded organisations' performance
  • provide an in-depth understanding of locally provided funded services
  • balance service agreement management and capability activities within areas, based on demand.

The changes took effect in June 2018, with ongoing implementation managed through DHHS's Agency Performance and System Support Operating Model Working Group.

1.5 Managing and monitoring service agreements

Prior to the area-level changes, LEOs and program advisers in the 17 area offices across Victoria played the lead role in monitoring and managing service agreements with funded organisations. This included implementing FOPMF, which DHHS developed in 2015 to provide a consistent framework for performance monitoring. FOPMF consists of three key elements:

  • Service agreement monitoring—ongoing collection of information relevant to core performance metrics and broader considerations like organisational governance, financial viability, compliance with relevant standards, quality and safety considerations. It involves collecting and reviewing data through regular engagement with funded organisations and drawing on other relevant DHHS databases.
  • Risk assessment of identified performance issues—applying a defined methodology to assess risks associated with performance issues (a generic illustration follows this list).
  • Responses to performance issues—responding appropriately to performance issues, ranging from agreeing to remedial actions with the funded organisation, to undertaking a wider service review, through to de‑funding organisations.
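The report does not detail the RAT's methodology, but risk assessment frameworks of this kind typically rate issues on a likelihood-by-consequence matrix. The sketch below is a generic illustration of that approach, not DHHS's actual tool; the scales, thresholds and suggested responses are assumptions.

```python
# Generic likelihood-by-consequence risk rating; illustrative only,
# not DHHS's actual Risk Assessment Tool.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
CONSEQUENCE = {"insignificant": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def rate_issue(likelihood: str, consequence: str) -> str:
    """Map a performance issue to a risk band (thresholds are assumptions)."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "high"    # e.g. escalate to a service review
    if score >= 8:
        return "medium"  # e.g. agree remedial actions with the organisation
    return "low"         # e.g. monitor through routine engagement

print(rate_issue("likely", "major"))  # prints: high
```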

SAMS2 is the key information system used in managing service agreements. It records DHHS's contractual arrangements with the organisations it funds, agency performance data and DHHS monitoring data. DHHS also uses this system to create service agreements.

There are multiple additional information systems relevant to service agreements, including the Client Incident Management System (CIMS) and a range of program-specific data collections to which funded organisations must submit information.

1.6 Why this audit is important

With approximately $2.8 billion spent on service agreements each year across a wide range of services with varying complexity, DHHS needs assurance that its investment is producing high-quality services.

Service agreements represent an intersection between multiple priorities including:

  • value-for-money procurement
  • responsive services that meet citizens' needs
  • management of risks associated with outsourced service provision.

Effectively managing service agreements requires carefully balancing these priorities.

Previous reviews of government departments' partnership with community sector organisations have highlighted unique challenges. These reviews include:

  • VAGO's 2010 performance audit Partnering with the Community Sector in Human Services and Health, which highlighted the need for the former departments to 'improve consistency in managing and monitoring service agreements to further reduce administrative and related cost burdens placed on funded organisations without compromising accountability'.
  • The Royal Commission into Institutional Responses to Child Sexual Abuse, which produced a series of case studies resulting from public hearings in 2013–17. Findings highlighted the need for effective oversight of funded services in the areas of governance, service delivery and financial management.
  • The Productivity Commission highlighted in 2016 that governments are imposing management requirements on outsourced services that are out of proportion with the level of risk, can lead to high compliance costs, and can hinder responsiveness and innovation.

More recently, in August 2017 the Coroner's Court of Victoria released the findings of its inquest into the death of a person in the care of DHHS who resided at a residential care unit. The inquest highlighted a number of shortcomings with:

  • DHHS's monitoring of the funded organisation
  • information sharing between DHHS and the funded organisation
  • the capability of staff employed by the funded organisation.

In September 2017 the Victorian Ombudsman released the report of an investigation into the management of a disability group home and the protection of residents within it. The facility was managed by a funded organisation providing disability services through a service agreement between the organisation and DHHS. The investigation found numerous deficiencies, including that the funded organisation did not meet standards prescribed in the service agreement and that DHHS did not effectively monitor the agreement or intervene to remedy the shortcomings in its execution.

More broadly, numerous other reviews—completed at either a whole‑of‑government, departmental or service agreement level since 2011—have also identified various challenges relevant to DHHS's service agreement management. These reviews highlighted the need for:

  • a more integrated and consistent service delivery model that is both client- and outcome-focused
  • a more risk-based approach to overseeing performance of funded organisations that removes duplication and gaps in performance monitoring
  • improved information systems that reduce complexity
  • clearer staff roles and responsibilities.

These wide-ranging reviews highlight some of the challenges that service agreement management presents for departments and funded organisations. Given that service agreements are at the heart of DHHS's operating model, examining DHHS's capability to manage them effectively—including its systems, processes and human resource capability—provides important insight.

1.7 What this audit examined and how

The objective of the audit was to determine whether DHHS has sufficient capability in managing service agreements to ensure funded organisations deliver agreed health and wellbeing supports and outcomes to clients.

To address this objective, we assessed whether DHHS service agreements are fit-for-purpose, focusing on:

  • service agreement performance standards, review mechanisms and deliverables
  • whether service agreement requirements imposed on funded organisations are proportionate to risk.

We also assessed whether DHHS is implementing an effective system for managing service agreements, focusing on:

  • the skills and capabilities within DHHS that relate to managing service agreements
  • the design and implementation of DHHS's service agreement performance monitoring framework.

The audit examined whole-of-department systems and processes and included fieldwork in two DHHS divisions—East and West.

The audit also included:

  • consultation with a selection of funded organisations within these divisions
  • surveys of DHHS service agreement staff and funded organisations.

We distributed our DHHS survey to 513 staff who currently manage or have previously managed service agreements. This included staff who manage service agreements as a core part of their role, as well as staff with a lesser role in managing service agreements. The results of this survey are in Appendix D.

We distributed our funded organisation survey to 1 021 funded organisations. The results of this survey are in Appendix E.

Across the two surveys, we sought views on:

  • service agreement administrative and compliance requirements
  • staff skills, capability and capacity
  • performance monitoring and reporting arrangements.

Figure 1E shows the response rate for each survey.

Figure 1E
Response rate for VAGO surveys of DHHS service agreement staff and funded organisations

Survey               | Invitations sent | Responses received | Response rate (%)
DHHS staff           | 513              | 200                | 39
Funded organisations | 1 021            | 355                | 35

Source: VAGO.

The audit also included closer examination of 12 service agreements, covering:

  • both health and human services
  • DHHS's East and West Divisions
  • a mix of funding levels.

The funded organisations included in this selection are detailed in Figure 1F.

Figure 1F
Organisations selected for detailed examination of service agreements

 

Human Services
  • Outer Eastern Melbourne (East, metro): Australian Childhood Foundation; Anglicare Victoria
  • Western Melbourne (West, metro): MacKillop Family Services; Western Region Centre Against Sexual Assault Inc.
  • Ovens–Murray (East, rural): Junction Support Services; Rural Housing Network Limited
  • Western District (West, rural): Winda-Mara Aboriginal Corporation; Brophy Family & Youth Services Inc.

Health Services
  • Outer Eastern Melbourne (East, metro): Ranges Community Health
  • Western Melbourne (West, metro): Cohealth Ltd
  • Ovens–Murray (East, rural): Westmont Aged Care Services Limited
  • Western District (West, rural): Western Region Alcohol and Drug Centre Inc.
Source: VAGO.

The audit focused on the current service agreement cycle that commenced on 1 July 2015.

We conducted our audit in accordance with section 15 of the Audit Act 1994 and ASAE 3500 Performance Engagements. We complied with the independence and other relevant ethical requirements related to assurance engagements. The cost of this audit was $625 000.

1.8 Report structure

The remainder of this report is structured as follows:

  • Part 2 examines how DHHS sets service agreement requirements.
  • Part 3 examines staff skills and capabilities to manage service agreements.
  • Part 4 examines how DHHS monitors service agreement performance.


2 Setting service agreement requirements

With $2.8 billion spent annually on service agreements across 1 927 funded organisations, DHHS needs sufficient assurance that clients are receiving quality services in a proper, timely and efficient manner. This requires that service agreements:

  • contain clearly defined performance standards, deliverables and review mechanisms
  • impose requirements on funded organisations that are proportionate to their risk profiles.

In this part, we assessed whether DHHS service agreements are fit-for-purpose, focusing on these two areas.

2.1 Conclusion

DHHS's service agreements are not fit-for-purpose. A fragmented approach to their development and management means that performance measures are set and recorded inconsistently, without a clear focus on desired service quality and outcomes. This fragmentation has also resulted in an increasingly complicated, disjointed and duplicative approach to the risk-profiling of funded organisations that does not inform the service agreement requirements imposed on them.

These issues prevent DHHS from having a clear and accurate understanding of funded organisation performance and service delivery risks. This understanding is critical to ensuring that clients' safety and wellbeing is not compromised.

2.2 Setting performance standards, deliverables and review mechanisms

Performance standards and deliverables

Based on existing better practice material—including the ANAO's better practice guide—we applied the following definitions when assessing service agreement performance standards and deliverables:

  • Performance standards―the quality of the service or activity that funded organisations are contracted to deliver, such as family violence support services and housing assistance services. Relevant agreement clauses, DHHS policies and guidelines fall within this definition.
  • Deliverables―service activity outputs, including what needs to be delivered, to what standard and in what timeframe. Performance measures fall within this definition.

Each service agreement contains a service plan in Schedule 2 that details the service activities that the funded organisation must deliver. Each service activity has funding, performance measures and targets attached to it.

We found that:

  • DHHS could organise performance standards in service agreements in a more meaningful way so that funded organisations clearly understand how the standards apply to each funded service activity
  • performance measures are inconsistent across service agreements for similar services and are internally inconsistent across documents and systems that record performance measures for the same organisation and agreement
  • service agreements did not consistently include mandatory performance measures
  • DHHS had set service agreement performance measures without sufficient system-wide oversight and quality control arrangements.

Performance standards

Service agreements contain standard terms and conditions that detail the mandatory performance standards for funded organisations.

The Funded Agency Channel is a secure website that funded organisations use to access their service agreements, performance reports, DHHS policies and standards, as well as other supporting information.

While some of these requirements are explicitly listed in service agreements, others are in documents that sit alongside them. Funded organisations can access these documents through the FAC.

For agencies that deliver a broad range of activities, the applicable standards can be extensive. The more services an organisation is funded for, the more SSGs are listed in the agreement, but the SSG documents are not organised in any meaningful way. One of the 12 service agreements we reviewed listed over 70 SSG documents.

Organisations would benefit from standards that are clearly linked to relevant activities within the agreement, so that specific requirements for each activity are clear. Funded organisations can run a report in the FAC that provides hyperlinks to all applicable SSGs for each funded activity, but we found that the majority of hyperlinks were outdated and broken.

Appendix B details the key areas of the service agreement that establish performance standards.

Deliverables

Figure 2A summarises the service agreement clauses and schedules that detail deliverables.

Figure 2A
Key deliverables in DHHS service agreements

Clause 8: Requires that funded organisations submit service delivery and financial accountability reports to DHHS as stated in the schedules and on request.

Schedule 2: Lists each service activity that the organisation is funded to perform. Activities classified as 'non-investment activities' should have a performance measure and target. These performance measures reflect the deliverables associated with the funding received. Schedule 2 also lists various data collection requirements, including but not limited to:

  • service activity reports
  • project reports
  • national minimum dataset
  • annual reports.

Source: VAGO.

Regarding timing of deliverables, service agreements include financial year targets and DHHS requires funded organisations to report some performance measures more regularly. These additional requirements are not directly documented in the service agreement. Instead they are listed:

  • in activity descriptions available on DHHS's website (for human services activities)
  • in an appendix to the Policy and Funding Guidelines (for health services).

Omission or misalignment of mandatory performance measures

Activity descriptions in volume 3 (human services) and Appendix 4.1 of volume 2 (health) of the Policy and Funding Guidelines set out performance measures for each service activity. All performance measures in activity descriptions are mandatory. Performance measures in Appendix 4.1 are either mandatory or non-mandatory, which creates inconsistency in performance monitoring. Neither the Policy and Funding Guidelines nor the Service Agreement business rules and guidelines explain the basis for having non-mandatory measures.

Mandatory performance measures were not always included in the 12 service agreements that we reviewed. For example, Appendix 4.1 of the Policy and Funding Guidelines lists 'number of hours of service (provided to clients)' as a mandatory performance measure for the Home and Community Care program (HACC) volunteer coordination activity. Our review of three service agreements providing HACC services showed, however, that the only performance measure set for this activity is 'number of hours of coordinator time', which is a non‑mandatory measure. The mandatory measure is omitted from each of the three agreements.

In another example, the activity description for the home-based care—adolescent community placement service includes three mandatory performance measures:

  • daily average occupancy
  • percentage of the total number of children and young people in placements greater than six months who are in any of the following circumstances:
    • on family reunification
    • being cared for by DHHS Secretary
    • on long-term care orders that are contracted to the provider
  • percentage of total exits from placement that are planned.

Two service agreements that we reviewed included this adolescent community placement service, yet one of them did not include daily average occupancy as a performance measure.

The Multiple and Complex Needs Initiative is a time‑limited specialist disability service for people 16 years and older, who have been identified as having multiple and complex needs.

Our review of the 12 selected service agreements also showed that performance measures for the Multiple and Complex Needs Initiative (MACNI) service did not fully reflect the mandatory performance measures and targets set out in the Service provision framework: Multiple and Complex Needs Initiative December 2017 (MACNI Service provision framework) or the activity description.

The MACNI Service provision framework states that organisations providing MACNI service plans are required to report against three key performance indicators (KPIs):

  • 90 per cent of care plans are endorsed by the area panel within 12 weeks from the date of eligibility
  • 90 per cent of care plans are reviewed and endorsed by the area panel within six months
  • 100 per cent of clients have an exit transition plan endorsed at least six months prior to care plan termination.

These mandatory KPIs do not align with the three mandatory performance measures listed in the MACNI service activity description:

  • number of clients
  • percentage of MACNI clients that have an assessment and endorsed care plan within 12 weeks of eligibility
  • number of episodes of capacity building to provide care plan coordination for MACNI clients.

One of our 12 selected service agreements included MACNI services. We found that the agreement only listed one performance measure—100 per cent of MACNI clients have an assessment and endorsed care plan within 12 weeks of eligibility. This performance measure is slightly different to the corresponding measure in the MACNI Service provision framework and the service activity description. The agreement makes no reference to the remaining mandatory measures across these two documents.

Inconsistent performance measures between organisations

We also found that performance measures were inconsistent across different organisations with the same service activity.

Figure 2B shows the range of performance measures in five different service agreements for the Integrated Family Services activity, alongside the performance measures as required in the activity description.

Figure 2B
Differences in performance measures for the Integrated Family Services activity across five service agreements

Performance measures appearing across the activity description and the five service agreements (organisations A to E):

  • number of cases(a)
  • number of service hours provided(a)
  • number of clients
  • number of packages
  • number of families—intensive support to families—200 hours per family

(a) Mandatory measure as required in the activity description.
Source: VAGO.

DHHS attributes this inconsistency to some organisations not receiving funding for all components of the Integrated Family Services activity and therefore not being subject to all performance measures. However, given that each of these funded organisations is funded to provide services directly to clients, it is reasonable to expect that the 'number of clients' performance measure would apply to all. Additionally, DHHS's documented activity descriptions make no mention of the link between funding and performance measures. DHHS acknowledges that it could better explain the application of performance measures in its activity descriptions.

We also found inconsistencies across health services activities. For instance, Appendix 4.1 of the Policy and Funding Guidelines states that the mandatory performance measure for the HACC flexible service response activity is an annual service activity report. Three service agreements in our selection included the flexible response service, but one did not have the service activity report as a performance measure.

DHHS advised that the type of funding attached to a service activity can affect whether performance measures and targets are required. DHHS classifies funding for activities into one of six categories:

  • ongoing and indexable
  • ongoing and non-indexable
  • fixed-term and indexable
  • fixed-term and non-indexable
  • minor capital
  • prior year adjustment.

DHHS applies annual price indexation, at the rate approved by government, to ongoing or fixed-term funding that is linked to wages. For service activities that receive ongoing funding, SAMS2 automatically requires DHHS staff to include performance measures and targets. However, for service activities that receive fixed-term and non-indexable funding, performance measures are optional. This creates inconsistencies in how contracts are managed and limits DHHS's ability to assess the performance of these service activities. The basis for this differentiation is unclear given that organisations are delivering the same service.

Regardless of the funding arrangement, clients receiving services deserve the same level of assurance about the quality and accessibility of that service.
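The business rule described above can be expressed compactly. The sketch below is a hypothetical reconstruction of SAMS2's behaviour as reported to us: measures are forced for ongoing funding and optional for fixed-term, non-indexable funding; how the remaining categories are treated is an assumption.

```python
from enum import Enum

class FundingCategory(Enum):
    ONGOING_INDEXABLE = "ongoing and indexable"
    ONGOING_NON_INDEXABLE = "ongoing and non-indexable"
    FIXED_TERM_INDEXABLE = "fixed-term and indexable"
    FIXED_TERM_NON_INDEXABLE = "fixed-term and non-indexable"
    MINOR_CAPITAL = "minor capital"
    PRIOR_YEAR_ADJUSTMENT = "prior year adjustment"

def measures_required(category: FundingCategory) -> bool:
    """Hypothetical reconstruction of whether SAMS2 forces performance
    measures and targets to be recorded for a service activity."""
    return category in (
        FundingCategory.ONGOING_INDEXABLE,
        FundingCategory.ONGOING_NON_INDEXABLE,
    )

print(measures_required(FundingCategory.FIXED_TERM_NON_INDEXABLE))  # False
```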

Inconsistent performance measures for the same organisation

We found that, even within one organisation, performance measures could be inconsistent across the service agreement and other related records and systems for performance measurement. This creates confusion for funded organisations and for DHHS about the level of service required. Figure 2C shows the differences in performance measures across different documents and systems for the Family Violence Support Services activity in one service agreement.

Figure 2C
Example of variance in performance measures across different service agreement documents and systems—Family Violence Support Services activity

Performance measures appearing across the activity description, the service agreement, Service Delivery Tracking (SDT) and SAMS2:

  • number of new cases(a)
  • number of contacts/referrals (Court Network)(a)
  • percentage of clients sampled who are satisfied with the service provided

(a) This performance measure is mandatory as stated in the Family Support Services activity description document.
Source: VAGO.

Service Delivery Tracking is an online tool within the FAC website where funded organisations submit performance data on a monthly basis. It applies to approximately one-third of human services activities, discussed further in Section 3.2.

DHHS advised that SDT can only record one key performance measure per activity. DHHS plans to address this limitation through system improvements currently planned for 2018–19.

Other performance measure issues

We found that the performance measures and targets detailed in Schedule 2 of the 12 service agreements we examined were not always practical or easily understood. For example, some performance measures had a target of '0.1 new cases'. We heard conflicting reasoning for this from DHHS:

  • Area-based DHHS staff advised that an arbitrary target of 0.1 is entered when the SAMS2 system requires a target to be entered before finalising the agreement.
  • Other DHHS staff advised that it could be an administrative error.

Regardless of the reasoning, having a target of 0.1 new cases provides no insight into the level of service provided.

We also saw examples of what appear to be duplicate performance measures set for the same service activity and financial year. While DHHS attributes these duplicates to the functionality of its SAMS2 system, they are potentially a source of confusion for funded organisations.

A lack of quality control in setting performance measures

The omission of mandatory performance measures, along with the inconsistency in how measures have been set and recorded, highlights a lack of system-wide oversight and quality control within DHHS.

Program area staff enter proposed performance measures into SAMS2, which then must be approved by:

  • a peer or team leader within the same program area
  • a finance approver for the relevant group, division or region.

DHHS's Service Agreement business rules and guidelines provide no guidance for staff on what to consider when approving proposed performance measures.

Beyond the program and finance-level approvals, DHHS does not perform a system-wide review of service agreement performance measures for similar activities to ensure that they are set and recorded in a compliant and consistent manner. This prevents DHHS from obtaining a clear and accurate understanding of performance across the state.

Agreement review mechanisms

Service agreements should include mechanisms and triggers for reviewing their terms and conditions.

Service agreements provide for two types of reviews:

  • Clause 9 on audits and performance reviews
  • Clause 21 on reviewing terms and conditions of the service agreement.

This section focuses on the Clause 21 review process. Clause 9 is discussed in Part 3 of this report.

We found that the mechanisms to review the terms and conditions of DHHS service agreements are sound. However, DHHS lacks assurance that its service agreement variations are being processed in accordance with these mechanisms.

Clause 21 of each service agreement states that the agreement may only be varied if either:

  • DHHS and the organisation agree in writing to the variation, or
  • DHHS notifies the organisation in writing of a proposed variation to the agreement and the date the proposed variation will take effect, and the organisation continues to deliver all or part of the services or delivers new services as described in the proposed variation after the effective date.

Variations are commonly used for service growth or new services. They can also be used for other changes, such as to funding and performance targets. In the service agreements of the 12 selected funded organisations we examined in this audit, variations included:

  • increase/decrease to targets following a performance review
  • one-off funding allocation
  • recoup of unspent funds
  • transfer of funding from one organisation to another
  • lapsing funding allocation from the previous financial year.

DHHS has a standard variation process to support this clause, which sets out a monthly variation schedule and an approval process. This ensures consistency in how variations are documented and timed.

The Service agreement information kit sets out the triggers for an agreement variation. These triggers include changes to funding and deliverables, or changes to other requirements contained in the agreement. Either DHHS or the organisation can initiate negotiations for a variation.

DHHS documents the details of each variation in SAMS2 and a finance delegate approves it. Once approved, organisations can review the variation and an amended service agreement through FAC. Organisations have five working days to check that the new version of the agreement reflects their expectations and to advise if there are any errors. Variations are effective five working days after being published on FAC.
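
For illustration, the five-working-day rule can be expressed as a simple date calculation. This sketch counts Monday to Friday only and ignores public holidays, which the report does not address.

from datetime import date, timedelta

def variation_effective_date(published: date, working_days: int = 5) -> date:
    """Date a variation takes effect: five working days after it is
    published on FAC. Weekends are skipped; public holidays are not
    modelled in this sketch."""
    d = published
    while working_days > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            working_days -= 1
    return d

# A variation published on Friday 1 June 2018 would take effect
# on Friday 8 June 2018.
assert variation_effective_date(date(2018, 6, 1)) == date(2018, 6, 8)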

The DHHS Service Agreement business rules and guidelines also include further guidance for DHHS staff on the variation process. They set out the minimum information required for the finance delegate to approve a variation, and introduced a requirement for annual compliance audits of variations to verify that staff record this minimum information in SAMS2 when processing a variation. DHHS has conducted the audit only once, using a small sample of 25 variations, and did not conduct an audit for the 2017–18 financial year due to staff resourcing constraints.

The Adult, Community and Further Education Board plans and promotes adult learning, allocates resources, develops policies and advises the Minister for Training and Skills on matters related to adult education in Victoria.

The 2016–17 compliance audit found that:

  • nine variations (36 per cent) were fully compliant
  • eight variations (32 per cent) were partially compliant, with the majority of these supported by signed approval records but missing other key information, such as the cost centre or allocation method
  • eight variations (32 per cent) were noncompliant, having no supporting documentation recorded in SAMS2. Six of these variations related to service agreements that are managed by the Adult, Community and Further Education Board but recorded in SAMS2.

The absence of any subsequent variation compliance audits since 2016–17 limits assurance that DHHS is approving and processing service agreement variations in a compliant, evidence-based manner.

Links to the Department of Health and Human Services strategic plan and the Victorian public health and wellbeing outcomes framework

Service agreements should contain explicit links to DHHS's desired service system outcomes. In particular, a service agreement's accountability structures—comprising performance reporting and compliance with standards—should link to these outcomes.

We found that only some service agreement accountability structures clearly link with the Department of Health and Human Services strategic plan (DHHS strategic plan) and the Victorian public health and wellbeing outcomes framework (DHHS's outcomes framework).

DHHS established five outcomes in its 2017 strategic plan. This audit focused on one of these—'Victorian Health and Human Services are person‑centred and sustainable'. It is consistent with DHHS's outcomes framework.

To achieve this strategic direction and outcome, DHHS has established four supporting service system outcomes and identified underlying key results for each outcome as shown in Appendix C of this report. The four supporting service system outcomes are:

  • services are appropriate and accessible in the right place, at the right time
  • services are inclusive and respond to choice, culture, identity, circumstances and goals
  • services are efficient and sustainable
  • services are safe, high-quality and provide a positive experience.

The Department of Health and Human Services Standards are a single set of service quality standards for DHHS‑funded organisations and DHHS‑managed services. Organisations that provide direct client services must meet the standards as an obligation of their service agreement.

The accessibility and quality of services that funded organisations provide under the service agreements directly impact DHHS's ability to achieve key results under the service system outcomes.

Service agreements require funded organisations that deliver services within the scope of the Department of Health and Human Services Standards (DHHS Service Standards) to obtain accreditation, every three years, against four standards:

  • Empowerment—people's rights are promoted and upheld.
  • Access and engagement—people's right to access transparent, equitable and integrated services is promoted and upheld.
  • Wellbeing—people's right to wellbeing and safety is promoted and upheld.
  • Participation—people's right to choice, decision-making and to actively participate as a valued member of their chosen community is promoted and upheld.

These standards align with DHHS's service system outcomes. Therefore, the service agreement requirements to obtain accreditation against the DHHS Service Standards contribute towards ensuring that service delivery aligns with the DHHS strategic plan and DHHS's outcomes framework.

In contrast, the way in which performance measures and activity reporting requirements in service agreements are linked to the DHHS strategic plan and DHHS's outcomes framework is less clear. Performance measures in service agreements almost exclusively reflect outputs and do not demonstrate how well organisations are achieving the outcomes DHHS has identified. Nor do service agreements explicitly mention the DHHS strategic plan or DHHS's outcomes framework.

Output-based performance measures

Typical examples of the output-driven performance measures in service agreements are:

  • number of service hours
  • number of clients
  • number of sessions.

These measures do not provide any information on service quality. Furthermore, the outcomes framework does not have any benchmarks or targets to assess performance or achievement of outcomes.

Across the 12 selected service agreements we found only two examples of performance measures that clearly focus on service quality:

  • 'percentage of clients who are satisfied with the service provided'—included in six of the 12 agreements
  • 'percentage of services provided and/or referred to against identified key needs'—included in four of the 12 agreements.

While both measures are directly relevant to the system outcome 'services are safe, high-quality and provide a positive experience', they are not mandatory for all funded organisations delivering the corresponding service activity.

2.3 Aligning service agreement requirements to risk

The scale and complexity of outsourced health and human services varies greatly, so it is important that the requirements set under each service agreement are targeted and proportionate to service risks.

We found that:

  • DHHS uses numerous mechanisms to manage service agreement risks, but these mechanisms are fragmented and largely disconnected from each other
  • DHHS's main tool for categorising funded organisations according to risk has limited coverage, applying to only around a third of all funded organisations
  • DHHS's fragmented risk oversight does not inform funded organisations' service agreement obligations, which funded organisations commonly view as excessive and duplicative.

Categorising funded organisations according to risk

In recent years DHHS has introduced new methods, tools and systems to identify and assess risks associated with funded organisations and service agreements:

  • DHHS established FOPMF in 2016 to monitor performance and risks associated with funded organisations' service delivery, financial management and governance. This includes using a RAT to assess the severity of performance issues. We discuss FOPMF in further detail in Part 4 of this report.
  • In January 2016 DHHS launched the live monitoring component of SAMS2, where DHHS staff can record, in real time, performance issues and risks relating to funded organisations' service agreements.
  • DHHS launched its new CIMS in early 2018 to record and investigate incidents that have a direct impact on the safety of clients. Incidents recorded in CIMS include, but are not limited to, death; physical, emotional and sexual abuse; and poor quality of care.
  • Since 2015 DHHS has performed spot audits of residential care providers to ensure that they deliver high-quality, compliant services to children and young people who reside in out-of-home care. DHHS undertakes these audits exclusively for residential care due to the higher risk that the activity poses.

Additionally, in 2015 DHHS introduced a risk-tiering framework to categorise funded organisations according to risk. This occurred as a result of internal reviews conducted between 2011 and 2014, all of which highlighted the need for a more risk-based approach to monitoring funded organisations.

Risk-tiering framework

Under the risk-tiering framework, DHHS performs quarterly assessments of funded organisations using criteria based on various reporting, systems and reviews, such as:

  • failure to meet the DHHS Service Standards
  • risk to client safety
  • loss or unauthorised disclosure of client information
  • failure to meet targets as highlighted in the SDT data
  • failure to meet its obligations in a timely manner.

Based on the results of these assessments, DHHS places organisations on one of three tiers—high, medium or low risk.
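
The report does not publish DHHS's scoring thresholds, so the cut-offs in the sketch below are purely illustrative; it only shows the general shape of a quarterly score-to-tier mapping.

def assign_risk_tier(quarterly_score: float) -> str:
    """Map a quarterly assessment score to one of DHHS's three tiers.
    The thresholds here are assumptions for illustration; the actual
    criteria weightings and cut-offs are internal to DHHS."""
    if quarterly_score >= 7.0:
        return "high"
    if quarterly_score >= 4.0:
        return "medium"
    return "low"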

The risk-tiering framework has some key limitations. DHHS applies risk-tiering exclusively to funded organisations that fall under the DHHS Service Standards, which apply to direct client contact human services and not health or mental health activities. This means that approximately two-thirds of the 1 331 funded organisations with a standard service agreement are excluded from risk-tiering assessments. While risk-tiering does apply to higher risk client-facing services, DHHS acknowledges the need to expand its risk-based oversight to all funded organisations.

Figure 2D shows the average risk-tiering assessment score given to individual funded organisations during 2017 against the total funding received. Risk-tiering assessments do not consider the level of funding that an organisation receives.

Figure 2D
Average risk-tiering assessment score given to funded organisations against funding received, 2017

Note: Green dots = low-risk organisations; Orange dots = medium-risk organisations; red dots = high-risk organisations.
Source: VAGO based on DHHS data.

DHHS also does not use the results of risk-tiering assessments to inform the service agreement obligations imposed on funded organisations. Consequently, regardless of their risk assessment, funded organisations are subject to similar service agreement requirements, except for those on short-form agreements and for variations made to active service agreements.

Instead, risk-tiering results are sent to:

  • divisional staff for consideration, alongside local monitoring
  • central office staff to assist with prioritising unannounced audits of residential care providers, and to inform decisions to register organisations in line with requirements under the Children, Youth and Families Act 2005 and the Disability Act 2006.

We found that there is a lack of integrated strategic risk assessment and management of DHHS service agreements. DHHS's other forms of risk-based oversight, such as risks identified through FOPMF, live monitoring and audits of residential care providers, are not considered in combination with the risk‑tiering results and are mostly dealt with in isolation. This fragmentation increases the chance of inconsistent results and of significant risks being missed by relevant departmental staff.

Funded organisations' administrative and compliance requirements

Through our online surveys and face-to-face interviews, we sought the views of funded organisations on whether:

  • service agreement administrative and compliance obligations align with the level of risk associated with contracted services
  • there is any duplication in the service agreement and data reporting requirements
  • they are able to consistently meet their service agreement and data reporting requirements
  • DHHS follows up when administrative and compliance obligations are not met.

We found that:

  • while the majority of funded organisations view their administrative and compliance obligations as being proportionate to service risk, a significant proportion of organisations believe they are excessive
  • service agreement administrative and compliance requirements are often duplicative at the departmental and inter-jurisdictional level—especially for larger funded organisations that provide services across multiple DHHS areas
  • only about half of the surveyed organisations believe they are consistently able to meet their service agreement administrative and compliance obligations
  • human services-focused organisations more commonly viewed their administrative and compliance obligations as being disproportionate to risk and beyond their own capacity.

Survey results

We summarise all the survey responses from funded organisations in Appendix E. The following sections focus on survey responses regarding funded organisations' administrative and compliance requirements.

Matching administrative and compliance obligations to risk

Figure 2E summarises funded organisations' responses to our survey question about whether their administrative and compliance obligations were appropriately matched to service risks. It shows that 70 per cent of respondents either agreed or strongly agreed that administrative and compliance obligations in their service agreement aligned with the associated risk.

Figure 2E
Survey responses—Funded organisations
Question 3: To what extent do you agree with this statement: Service agreement administrative and compliance requirements are appropriately matched to the level of risk associated with the services we are funded to deliver

Source: VAGO.

Health services-focused organisations gave more favourable responses to this question than human services-focused organisations:

  • Seventy-eight per cent of surveyed organisations that primarily deliver health services either agreed or strongly agreed that their administrative and compliance obligations were appropriately matched to their service risk. Another 10 per cent of respondents either disagreed or strongly disagreed, while 12 per cent neither agreed nor disagreed.
  • Sixty-seven per cent of surveyed organisations that primarily deliver human services either agreed or strongly agreed that their administrative and compliance obligations were appropriately matched to service risks. Another 13 per cent either disagreed or strongly disagreed, while 20 per cent neither agreed nor disagreed.

The funded organisations' open-text responses commonly raised concerns about excessive administrative and compliance requirements set by DHHS that do not scale with service risks, organisation size or the level of funding provided. Figure 2F gives examples of these concerns.

Figure 2F
Survey responses—Funded organisations: Open-text examples highlighting excessive administrative and compliance requirements

'It seems that there is the same quality system requirements for small, relatively simple programs as for the large and complex ones.'

'We are a small service delivery organisation in a small rural town. Our compliance requirements are far above the level of risk associated with the delivery of services we provide.'

'There is significant administrative expectations from the Department for low level funding arrangements. Excessive meetings and duplication of processes.'

'We are a small one staff member organisation having to meet the requirements of hugely funded agencies so nearly all areas are a burden to us.'

'Compliance is an onerous process and more time is spent on this area than program delivery.'

'We receive funding from the commonwealth and NSW government as well DHHS. We find that reporting is much greater from DHHS than from other governments.'

'We do not provide direct care services, but receive emails about compliance requirements as if we did.'

'Given the meagre program funding received, the level of reporting on ASM [Active Service Model], Diversity and Care Plans for example do not fit into the very low level offered by [name of organisation] and the model of operation we use.'

'We have already completed accreditation through registered Quality and Regulatory providers, e.g. [name of independent review body]. Why do we have to go through it all again. Unnecessary red tape. Risk aversion is over the top!'

'Focus is on throughput numbers according to targets, but little concentration on the quality of service or differentiation of which programs hold the most risky situations.'

'The requirements for Accreditation against HSS [Human Services Standards] plus governance standards, as well as the FOPMF and the SACC [Service Agreement Compliance Certification] are overly duplicative and burdensome. We are required to report in an extraordinary amount of detail how we go about our business, and maintain multiple registers for small numbers. This level of reporting does not assist us to manage risk in fact it creates a risk to the organisation in terms of our capacity to deliver quality services.'

'The amount of compliance required for our very small organisation is significant.'

'Compliance requirements have increased significantly without adequate funding. Most services we deliver are not high risk however we do need to comply with a wide range of legal and other requirements because of the variety of services we provide.'

'Reporting, data collection and compliance arrangements vary greatly between different sections of DHHS but can include double submission of data, face to face meetings, and reporting both centrally and regionally. The compliance requirements appear to be increasing across the board with little or no relationship to the level of risk of services.'

Source: VAGO.

Despite respondents' concerns about excessive administrative and compliance obligations, many still believe that they do not receive sufficient performance information in return from DHHS. Only 55 per cent of surveyed organisations either agreed or strongly agreed that they receive all the information they need from DHHS to understand how well their organisation is performing against service agreement targets. Figure E10 in Appendix E further details survey responses to this question.

Duplication across administrative and compliance obligations

Responses to our survey of funded organisations indicated that there is duplication in DHHS's service agreement reporting and data requirements as shown in Figure 2G below.

Figure 2G
Survey responses—Funded organisations: Duplication in service agreement reporting and data requirements

 

Level of duplication reported:

Type of duplication                              Significant   Moderate   Minimal   None   Not sure
Duplication within DHHS(a)                            8%          13%       31%      42%      6%
Duplication across DHHS and other parties(b)         12%          19%       36%      29%      4%

(a) Relates to survey question 4—Is there any duplication in the service agreement reporting and data your organisation is required to provide to DHHS?
(b) Relates to survey question 5—Is there any duplication in the service agreement reporting and data your organisation is required to provide to other parties?
Source: VAGO.

Fifty-two per cent of respondents reported some level of duplication of reporting and data that their organisation is required to provide to DHHS. This includes being required to provide the same data to different DHHS information systems or providing the same information to different DHHS divisions or areas. The funded organisations' open-text responses raised numerous concerns with duplicative reporting and data requirements set by DHHS. Figure 2H gives examples of these concerns.

Figure 2H
Survey responses—Funded organisations: Open-text examples highlighting duplicative data and reporting requirements within DHHS

'The same data is reported often, monthly, quarterly and annually, via a variety of systems. DHHS do a 2–3 hour desk review with us which is a "mini" accreditation, absolute waste of time.'

'We have to provide three identical sets of quality documents one to health and one to North West Human Services and one to Southern Human Services - we also meet with three different LEOs or PASAs [Program Advisers] during the year.'

'Historically there has been a lack of consistency in reporting requirements and no streamlining. There are multiple reporting requirements to different stakeholders and individual contract managers request KPI reporting or additional reports in the format they require. There appear to be some attempts to change this.'

'As we are a state funded organisation, we continuously have to provide the exact same information to different DHHS divisions/areas.'

'The SACC [Service Agreement Compliance Certification], FOPMF, Desk Review and Accreditation process all involve the same questions.'

'Reporting is programmatic based so clients that receive multiple services from agencies are required to provide their client data to each individual service provider multiple times which is then entered into separate program databases.'

'An example is in homelessness and family violence counselling services where we are required to submit MDS [minimum dataset] data quarterly to DHHS, but submit the same data monthly to the regional office.'

Source: VAGO.

Sixty-seven per cent of respondents reported some level of duplication between the reporting and data requirements of DHHS and other parties such as Commonwealth government departments and accreditation bodies. The funded organisations' open-text responses commonly raised concerns with these duplicative reporting and data requirements. Figure 2I gives examples of these concerns.

Figure 2I
Survey responses—Funded organisations: Open-text examples highlighting duplicative data and reporting requirements across DHHS and other state or Commonwealth departments.

'The data we are required to submit to DHHS, the Adult, Community and Further Education Board and Local Government are often the same but required in different formats adding to the administrative burden. Then there are all the other Government departments. If one software program could be developed which would export data to everyone it would make our lives easier.'

'We have had instances where we have had to provide a great deal of resources and time to dealing with the same type of data and issues for different accreditation bodies and Government agencies to demonstrate compliance.'

'Homelessness data is reported to both the AIHW [Australian Institute of Health and Welfare] data collection and to DHHS separately.'

'Reporting to both DHHS and CHSP [Commonwealth Home Support Programme] requires some duplicate reporting and both conduct similar quality audits which have similar questions and aims.'

'As an example we provide significant information to our accreditation body and then have to submit at various times the exact same information and detail to DHHS and other Government bodies upon request.'

'Auditing requirements for DHHS and Dept Health [Commonwealth] funded program are duplicated in many ways. Separate audits means double information provided and additional cost.'

'Compliance for DHHS and Commonwealth is the same yet in a different format.'

'When one program is funded by two Victorian State departments, then the same information has to be reported to both depts.'

Source: VAGO.

The survey results regarding duplicative administrative and compliance obligations are consistent with our findings in Section 4.2 regarding the design of FOPMF.

Capacity to meet administrative and compliance obligations

Figure 2J summarises responses to our survey question about the extent to which funded organisations can meet all service agreement administrative and compliance obligations. Only 52 per cent of surveyed funded organisations believe that they can consistently meet all requirements.

Figure 2J
Survey responses—Funded organisations
Question 8: To what extent is your organisation able to meet all the service agreement administrative and compliance requirements?

Source: VAGO.

The survey results showed that human services-focused organisations have less capacity to meet DHHS's service agreement administrative and compliance requirements than health services organisations:

  • Staff in human services-focused organisations find it harder to meet all service agreement administrative and compliance requirements. Forty‑eight per cent of human services-focused organisations reported that they are always able to meet these requirements, compared to 64 per cent of health services-focused organisations.
  • Fifty-nine per cent of human services-focused organisations reported that they have staff resources dedicated to meeting these requirements, compared to 65 per cent of health services-focused organisations.
  • Staff in human services-focused organisations spend more time than staff in health services-focused organisations on meeting service agreement administrative and compliance requirements. In 36 per cent of surveyed human services-focused organisations, service delivery staff spend more than one-fifth of their time on meeting these requirements. In comparison, 22 per cent of service delivery staff in health services-focused organisations spend more than one-fifth of their time on meeting these requirements.

Administrative and compliance requirements across multiple DHHS areas

Surveyed organisations that deliver services in multiple DHHS areas reported greater misalignment between their administrative and compliance requirements and their service risks, as well as greater duplication across their data and reporting obligations:

  • Nineteen per cent of surveyed organisations that deliver services in multiple DHHS areas either disagreed or strongly disagreed that their administrative and compliance requirements were appropriately matched with service risks, compared to 10 per cent of organisations that deliver services in one DHHS area.
  • Thirty-four per cent of surveyed organisations that deliver services in multiple DHHS areas reported either significant or moderate levels of duplication in the service agreement reporting and data required by DHHS, compared to 17 per cent of organisations that deliver services in one DHHS area.

Face-to-face interviews with selected funded organisations

Our discussions with four selected funded organisations provided mixed views on their administrative and compliance obligations.

One organisation reported that there is little duplication across its administrative and compliance obligations. However, it also reported that DHHS's introduction of CIMS in early 2018 had led to excessive investigative and reporting obligations compared to previous arrangements.

Another organisation reported that its administrative and compliance obligations were no longer excessive after the DHHS LEO started meeting every two months with the organisation approximately two years ago. This reportedly led to more proactive engagements and streamlined performance monitoring processes.

The third organisation advised that its administrative and compliance obligations were resource-intensive to meet, but not excessive. It acknowledged that some degree of duplication exists across these obligations; however, it believed that this duplication helped to reinforce its understanding of performance.

The fourth organisation, which provides services across Victoria, viewed its administrative and compliance obligations as highly excessive. It attributed this to:

  • duplicative and fragmented monitoring performed by different LEOs across multiple regions
  • significant overlap between different performance monitoring functions. In particular, the organisation advised that various annual performance monitoring requirements under FOPMF were already covered in greater detail by a separate triennial review of each organisation by an independent accreditation body.

Consistent with the results from our survey of funded organisations, our face-to-face discussions with funded organisations indicated that larger funded organisations—which typically deliver more services across multiple DHHS areas—are more likely to have duplication in administrative and compliance obligations.

3 Staff skills, capabilities and capacity

DHHS service agreement staff need to strike a balance between managing funded organisations' performance in accordance with contractual requirements and partnering with them so that they are best placed to provide quality services to clients. The varied and often competing priorities of service agreement staff reinforce the need for DHHS to clearly define their roles, including key skills and capabilities. DHHS also needs to provide all staff with sufficient opportunities to acquire and develop key skills and capabilities over time.

In this part, we assess:

  • whether DHHS has defined the skills and capabilities necessary to manage service agreements
  • whether DHHS has developed a learning and development pathway for service agreement staff that reflects defined skills and capabilities
  • how DHHS service agreement staff perceive their capacity to manage service agreements.

3.1 Conclusion

While retaining its relationship management approach, DHHS is increasing the focus of its service agreement staff on performance management of funded organisations. This follows a series of reviews that highlighted gaps in how DHHS monitors and manages service agreement performance. This increased focus is consistent with better practice principles for contract management, but varying sector capability and depth will pose challenges to its implementation.

This shift is made even more difficult by the absence of a structured and comprehensive learning and development framework for service agreement staff. Learning and development offerings focus too heavily on systems and processes and are insufficient for giving staff the key skills and capabilities needed to manage service agreements. Poor uptake of individual performance plans among service agreement staff has further limited their opportunities to acquire skills and develop.

Combined with the wide geographic spread of service agreement staff, these issues have contributed to a lack of staff awareness of their roles and of the extent of their responsibilities.

3.2 Defining required staff skills and capabilities

DHHS restructured its service agreement management function at the central office, divisional and area level across the first half of 2018.

As part of the restructure process, in 2017 DHHS commissioned a review of the current system of service agreement management (the restructure review). It found there was conflict in DHHS's roles of funder, provider, contract manager and regulator, and confusion about the multiple functions of LEO and program adviser roles. The report proposed splitting the regulatory and contract management functions. DHHS implemented this in multiple stages:

  • It created a new central Regulatory Enforcement Unit in November 2017 that focuses on system-wide regulation of health and human service practitioners, providers and facilities.
  • It created four divisional Regulatory Compliance and Enforcement teams in June 2018 to undertake inspections, investigations and enforcement activities across specified regulatory schemes applying to community services regulated by DHHS.
  • It also created a new central Service Agreement Performance Unit in February 2018 to oversee and ensure funded organisation performance at a statewide level.

Another key change implemented in June 2018 was to combine the roles of LEOs and program advisers into a single service agreement adviser role that extends across the health and human services portfolios. These roles sit within the new Agency Performance and System Support Units in each of DHHS's 17 areas. DHHS expects that this new role will have an increased focus on performance management of funded organisations while keeping a relationship management focus. We examined the required skills and capabilities of previous LEO and program adviser roles, along with the recently announced service agreement adviser roles.

Key contract management skills and capabilities

Figure 3A details the key skills and capabilities of contract managers, based on the ANAO's better practice guide and the VGPB's VPS Procurement Capability Framework.

Figure 3A
Key skills and capabilities of contract managers

Category                                      Key skills and capabilities
Commercial/financial                          Understands and applies relevant laws and accountability requirements and financial arrangements.
Performance management                        Monitors service levels, provides feedback and manages underperformance.
Interpersonal/relationship building           Builds strong working relationships, encourages cooperation and communication.
Influence and negotiation                     Applies negotiation skills and expertise to ensure benefits are realised and continuous service delivery improvements are identified and implemented.
Problem-solving and conflict management       Maintains a positive approach to solving problems and encourages mutual cooperation in resolving disagreements.
Organisational context                        Understands the operating environment of the organisation.
Leadership                                    Manages team resources to maximise performance.

Source: VAGO based on the ANAO's better practice guide and the VGPB's VPS Procurement Capability Framework.

The restructure review commissioned by DHHS to support the proposed restructure of its service agreement management function identified the need for contract managers to have similar skills and capabilities.

Position descriptions for service agreement staff

Previous LEO and program adviser roles

We examined the previous position descriptions for LEOs and program advisers to see whether they addressed the above skills and capabilities. We found that the position descriptions for these roles did not explicitly focus on the performance management and commercial/financial skills identified in the ANAO's better practice guide and VGPB's VPS Procurement Capability Framework. However, other central and divisional DHHS staff are responsible for monitoring the financial performance of funded organisations under both the previous and current organisational structure. The LEO and program adviser position descriptions more clearly reflected the remaining five skills identified in Figure 3A, although they were not identical.

LEO position descriptions focused on softer skills such as service excellence, customer/client focus and decisiveness, while program adviser position descriptions focused on analysis and written communication skills. This reflects the historical split between the service agreement roles in health and human services. Both position descriptions reflected DHHS's strong focus on supporting and collaborating with funded organisations; however, they did not commensurately recognise the contract management skills the roles require.

New roles

We performed the same analysis for the new roles within the newly created Agency Performance and System Support Units in each DHHS area, including service agreement adviser roles and other new roles that relate to managing service agreements. We found that:

  • the senior and principal service agreement adviser roles have a clearer focus on managing and monitoring the performance of funded organisations
  • the VPS Grade 3 and 4 service agreement adviser position descriptions focus on similar skills to the previous roles, and do not have an increased focus on the performance management-related skills identified in the ANAO and VGPB guides
  • three new roles created within each DHHS area's Agency Performance and System Support Unit would focus on improving funded organisations' performance in relation to their contractual obligations.

It is evident from the new position descriptions—as well as the proposed functions of the newly created regulatory and performance units—that DHHS is more explicitly focusing on managing the performance of funded organisations against contractual obligations.

Our discussions with LEOs, program advisers and funded organisations highlighted concerns with the increased focus on performance management:

  • Funded organisations consistently placed high importance on DHHS managing service agreement performance in a supportive and collaborative manner.
  • LEOs and program advisers in the Western Melbourne and Ovens-Murray Areas also placed high importance on being able to manage service agreements in a way that supports and collaborates with funded organisations.
  • Multiple DHHS staff expressed concern that the proposed change to a more performance management-focused service agreement adviser role that does not specialise in health or human services would damage stakeholder relationships and diminish awareness of the environment that funded organisations operate within.
  • LEOs and program advisers said they felt poorly equipped to manage service agreements under the proposed new arrangements, chiefly due to a historical lack of training focused on developing their contract management-related skills.

The next section examines the range of training programs available to service agreement advisers in further detail.

Successfully adopting a more performance management-focused approach will require DHHS to actively manage these concerns. In particular, it is important that DHHS:

  • tailor its engagement approach to the capability of each funded organisation, as well as to the risks associated with the services they provide
  • clarify its expectations of service agreement staff regarding sector support and partnerships.

3.3 Providing learning and development pathways

Overall, we found that:

  • learning and development programs focus heavily on applying processes and using service agreement systems like SAMS2 and do not sufficiently focus on good practice principles for contract or risk management
  • there is a lack of focus within DHHS on individual performance planning and review for service agreement staff.

In terms of orientation and induction, DHHS centrally provides an 'Introduction to managing service agreements' training program for new service agreement advisers and other relevant staff that:

  • outlines the key elements of the service agreement
  • confirms key requirements of staff in their day-to-day work across all stages of service agreement management, from negotiation through to performance assessment
  • details functions for managing and monitoring service agreements.

The introductory program includes high-level content on good practice contract management and governance principles that is confined to a series of brief PowerPoint slides. This content alone is insufficient to provide service agreement advisers with the contract management and governance skills needed to effectively manage service agreements and the performance of funded organisations.

Figure 3B details all of DHHS's centralised training programs that relate to managing service agreements. We found that these programs focus on how to use relevant DHHS systems and follow established processes.

Figure 3B
DHHS training programs for managing service agreements and frequency of delivery

Training program                                                      Duration   Frequency (2018)
Introduction to managing service agreements (incorporating FOPMF)     Two days   March, May, July, September, October, November
SAMS2 overview                                                        Half day   Monthly
SAMS2 general                                                         Two days   Monthly
FAC                                                                   Full day   Monthly
Desktop review                                                        Full day   February, April, June

Note: Excludes any local training programs that are designed and delivered at a divisional or area level.
Source: VAGO based on DHHS's 2018 training calendar.

A desktop review is an annual assessment of a funded organisation's performance based on information collected throughout the year. It forms part of FOPMF.

DHHS does not have a central attendance register for all training programs that would show how many staff have completed courses or induction.

SAMS2 training is compulsory for staff who manage service agreements. Records for SAMS2 General and SAMS2 Overview training in 2017 show that 143 DHHS staff attended these courses. Half of the participants were based in DHHS's central office, and half were based in the divisions. These courses are open to anyone who uses the SAMS2 system.

DHHS also provides online training modules for SAMS2, live monitoring and the SDT reporting system.

The internal report of the restructure review, which assessed the organisational redesign of the service agreement management function, identified an increased need for training.

Staff views on learning and development pathways

Through our online surveys and face-to-face interviews, we sought the views of DHHS service agreement staff on whether:

  • the orientation and induction offered for their role gave them the basic knowledge and skills needed to manage service agreements
  • training provided by DHHS helped to build and maintain the skills needed to manage service agreements
  • staff have an individual performance plan that addresses learning and development needs.

We summarise all DHHS staff survey responses in Appendix D.

Overall, the survey results highlight that a significant proportion of staff are dissatisfied with DHHS's learning and development framework for managing service agreements.

Staff views on orientation and induction

Figure 3C summarises DHHS staff responses to our survey question about whether the orientation and induction for their role was effective. It shows that 29 per cent of respondents viewed their orientation and induction as either 'not so effective' or 'not at all effective'.

Figure 3C
Survey responses—DHHS staff
Question 8: How effective has the orientation and induction provided by DHHS been at giving you the basic skills and knowledge needed to manage service agreements?

Source: VAGO.

There was little difference in the survey responses to this question between human services-focused staff and health services-focused staff. However, there were differences in the responses across DHHS divisions:

  • East Division staff were most satisfied with their orientation and induction, with just 18 per cent of respondents reporting that it was either 'not so effective' or 'not at all effective'.
  • West Division staff were least satisfied with their staff orientation and induction, with 34 per cent of respondents reporting it was either 'not so effective' or 'not at all effective'.

We also asked the DHHS staff surveyed to suggest how DHHS could improve its orientation and induction for new staff. Respondents' open-text responses commonly highlighted the need for:

  • a more structured and standardised program for giving new staff the basic skills needed to manage service agreements
  • improved mentoring of new staff by more experienced colleagues
  • more frequent orientation and induction offerings for new staff
  • more accessible guidance material for new staff to access outside of formal orientation and induction activities
  • a risk-based approach to allocating new staff to manage service agreements and funded organisation performance.

Our discussions with central and regional DHHS staff highlighted concerns with staff orientation and induction for managing service agreements. Specifically, various LEOs and program advisers advised that they never received a formal orientation or induction for their role and had to rely on the experience and knowledge of nearby colleagues instead.

Staff views on training

Figure 3D summarises DHHS staff responses to our question about the effectiveness of training in building and maintaining their service agreement management skills. It shows that 32 per cent of respondents viewed training as either 'not so effective' or 'not at all effective'.

Figure 3D
Survey responses—DHHS staff
Question 9: How effective has training provided by DHHS been at building and maintaining the skills you need to manage service agreements?

Source: VAGO.

Human services-focused staff reported higher levels of dissatisfaction with their training than health services-focused staff. Specifically, 36 per cent of human services-focused staff viewed training as either 'not so effective' or 'not at all effective', compared with 26 per cent of health services-focused staff.

We also found differences in survey responses across DHHS divisions:

  • Survey respondents from DHHS's East Division were the most satisfied with training of all divisions. Thirty-six per cent of East Division respondents viewed training as either 'very effective' or 'extremely effective'.
  • Survey respondents from the North Division were the least satisfied with training. Only 12 per cent of North Division staff viewed their training as either 'very effective' or 'extremely effective'.

We asked the DHHS staff surveyed to suggest how DHHS could improve its training of service agreement staff. Respondents' open-text responses commonly highlighted the need for:

  • clearer expectations regarding service agreement management roles so that training programs can be better targeted
  • an overarching learning and development framework that brings together all the skills and knowledge required to manage service agreements, rather than disjointed offerings that focus heavily on systems and processes
  • a redesign of the existing SAMS2 system training
  • increased training frequency and accessibility for regional staff
  • more web-based training programs.

Our discussions with regional DHHS staff highlighted concerns with the training offered for managing service agreements. Specifically:

  • staff working in the Western Melbourne and Ovens-Murray Area offices expressed the need for increased training in contract management and governance
  • Wangaratta-based staff highlighted the logistical challenges of attending service agreement training programs that are only available in Melbourne.

Staff views on individual performance plans

Figure 3E summarises DHHS staff responses to our question about whether they have an individual performance plan. Only 76 per cent of respondents reported that they have one.

Figure 3E
Survey responses—DHHS staff
Question 11: As a service agreement monitoring staff member, do you (or did you) have an individual performance plan?

Source: VAGO.

Based on the survey results, we found that the use of individual performance plans was particularly low for human services-focused staff and North Division staff:

  • Seventy-three per cent of human services-focused staff reported that they have a performance plan, compared with 82 per cent of health services-focused staff.
  • The North Division had the lowest proportion of respondents that reported having a performance plan (66 per cent).
  • DHHS's central office had the highest proportion of staff that were not sure if they had a performance plan (14 per cent).

We asked DHHS staff survey respondents who reported having a performance plan about how often their performance was reviewed against it with their manager. We summarise the responses to this question in Figure 3F. It shows that staff most commonly review their performance against their plan twice per year (58 per cent), in accordance with DHHS's Performance and Development Process Policy. This trend was consistent across DHHS divisions and between health and human services-focused staff.

Figure 3F
Survey responses—DHHS staff
Question 12: How often do you (or did you) review performance against your individual performance plan with your manager?

Source: VAGO.

We also asked DHHS staff about how effective the individual performance planning and review process had been at addressing their learning and development needs. Figure 3G shows that 48 per cent of staff viewed the process as 'somewhat effective', while 32 per cent viewed it as either 'not so effective' or 'not at all effective'.

Figure 3G
Survey responses—DHHS staff
Question 13: How effective has the individual performance planning and review process been at addressing your learning and development needs?

Source: VAGO.

We found some variation in staff views on the individual performance planning process between the health and human services portfolios, as well as between DHHS divisions:

  • Human services-focused staff were less satisfied than their counterparts in health services-focused roles—38 per cent viewed the process as either 'not so effective' or 'not at all effective', compared with 28 per cent of health services-focused staff.
  • Staff in DHHS's North Division were particularly dissatisfied with the process compared to other divisions, with 48 per cent of respondents viewing the process as either 'not so effective' or 'not at all effective'.

3.4 Staff capacity to manage service agreements

Our survey of DHHS staff also sought their views on their capacity to manage service agreements. This included questions on:

  • the proportion of staff tasks performed that are not in their position description
  • the amount of time staff spend on monitoring and managing the performance of funded organisations
  • the amount of time staff spend on other tasks.

In particular, the survey results show that a high proportion of staff believe much of their work is focused on tasks that are outside their position description. This is consistent with the restructure review commissioned by DHHS. It found that staff time and effort spent on service agreement activities had been compromised by 'a lack of clarity in the interface between contract management, relationship management or sector capability building, and the appropriate time for LEOs and PASAs to spend on each of these functions'. The restructure review also found that, in the absence of role clarity, service agreement staff had 'taken on responsibility for managing pressing client needs, rather than managing compliance of service providers, nor addressing systemic issues and gaps'.

Tasks performed outside position descriptions

It is important that service agreement staff focus their effort on the core functions of their role. Having staff regularly perform tasks that are outside their role—such as addressing client needs—limits the capacity to monitor and address funded organisation performance issues.

Figure 3H summarises survey responses to our question on the proportion of staff tasks performed that are outside their position description. It shows that:

  • 28 per cent of respondents believe that somewhere between 25 and 50 per cent of their tasks are outside their position description
  • 21 per cent of respondents believe that over 50 per cent of their tasks are outside their position description.

Figure 3H
Survey responses—DHHS staff
Question 14: As a service agreement monitoring staff member, what proportion of tasks that you perform are NOT reflected in your position description?

Source: VAGO.

These results did not vary significantly across DHHS divisions. However, there was some variation between the health and human services portfolios. Fifty-two per cent of human services-focused staff believe that they spend at least 25 per cent of their time on tasks that are outside their position description, compared with 42 per cent of health services-focused staff.

Time spent on monitoring and managing funded organisation performance

Figure 3I summarises survey responses to our question on how much time DHHS staff spend per day on monitoring and managing funded organisation performance. It shows staff spending variable amounts of time on these tasks, with 21 per cent of respondents estimating that they spend three to four hours per day on monitoring and managing organisations' performance.

Figure 3I
Survey responses—DHHS staff
Question 15: On average, how much time per day do you spend monitoring or managing the performance of funded organisations?

Source: VAGO.

There was some variation in the responses to this question between DHHS divisional and central office staff:

  • In central office, 9 per cent of surveyed staff who manage service agreements estimate that they spend three hours or more per day on monitoring and managing funded organisation performance.
  • Between 50 and 54 per cent of staff in DHHS's four divisions estimated that they spend three hours or more per day on these same tasks.

We also found that the results varied between health services-focused staff and human services‑focused staff, with 44 per cent of health services-focused staff spending two hours or more per day on monitoring and managing funded organisation performance, compared to 63 per cent of human services-focused staff.

Time spent on other tasks

Figure 3J summarises survey responses to our question on how much time DHHS staff spend per day on other tasks beyond monitoring and managing funded organisation performance. Like Figure 3I, it shows staff report spending variable amounts of time on these other tasks.

Figure 3J
Survey responses—DHHS staff
Question 16: On average, how much time per day do you spend on other tasks (i.e. beyond monitoring or managing the performance of funded organisations)?

Source: VAGO.

These results did not vary significantly when broken down by DHHS portfolio or division. However, the results did vary between DHHS central office and divisional staff. Specifically, 56 per cent of central office staff estimated that they spend five hours or more per day on other tasks, compared with between 15 and 25 per cent of divisional staff. This reflects the fact that central office staff often manage service agreements as a secondary part of their role.

3.5 Corporate knowledge risks

Agencies need to retain the knowledge of key employees in the event that they are unavailable or leave their role.

We found that a significant amount of corporate knowledge relating to the SAMS2 information system is held exclusively by two key DHHS staff—one with knowledge of the system's infrastructure and another with knowledge of the system's operational functions.

One of these two staff members recently moved into another role within DHHS but is still regularly called upon to assist with SAMS2-related issues and queries. DHHS currently has no formal measures to capture the knowledge of these two staff.

Considering that SAMS2 is DHHS's main information system for storing service agreement information and managing performance, this poses a significant risk to DHHS and its ongoing capacity to manage service agreements.


4 Monitoring and managing performance

The aim of contract management is to ensure that all parties meet their obligations. All contracts—including service agreements—require active management throughout their life to ensure that the goods and services are delivered to the agreed standards and timeframes.

In line with the ANAO's better practice guide, monitoring and managing service agreement performance involves:

  • collecting sufficient but not excessive data from funded organisations on their service delivery and adherence to other contractual requirements
  • using collected data to assess whether funded organisations are meeting contractual requirements
  • taking appropriate action to address underperformance.

In this part, we assess whether:

  • FOPMF aligns with better practice contract management and supports staff to effectively manage service agreements
  • DHHS is consistently and comprehensively implementing the performance monitoring framework to drive service outcomes for clients.

4.1 Conclusion

DHHS's FOPMF is inefficient and ineffective and does not support the provision of system-wide assurance that clients receive safe, high-quality services that meet their needs. The framework does not enable staff to gain a clear insight into performance issues and whether contracted services are being delivered as intended. Its inability to match monitoring requirements to risk, complexity and funding levels—combined with fragmented and duplicative approaches to collecting performance information—leads to wasted monitoring efforts for both DHHS and funded organisations.

It is therefore not surprising that the uptake of prescribed performance monitoring tools among DHHS staff has been inconsistent, with many turning to local systems and tools to offset the framework's shortcomings. Such fragmentation prevents DHHS from providing informed advice to its senior management and ministers on performance issues that could put client safety and service delivery at risk.

4.2 Performance monitoring framework

DHHS's performance monitoring framework, FOPMF, provides the process for DHHS staff to assess funded organisations' compliance with service agreement requirements and to respond to identified risks and underperformance. It became operational on 1 January 2016. Figure 4A shows the components of FOPMF.

Figure 4A
Components of FOPMF

Source: VAGO.

FOPMF comprises a series of pre-existing and new monitoring tools as shown in Figure 4B. FOPMF tools focus on monitoring funded organisations' governance, financial management and quality and safety of service delivery to clients.

Figure 4B
FOPMF tools

Existing tools (pre-FOPMF)

  • Desktop review: Annual assessment of organisations' performance undertaken by monitoring teams.
  • Service review: Conducted where DHHS identifies a high level of risk or issues of concern. Can be collaborative or investigative in nature.
  • Financial Accountability Requirements: Organisations submit their financial position to DHHS each year, which is then reviewed and assessed to confirm that the organisation is financially sustainable.

New tools

  • Service agreement monitoring checklists: Completed by service agreement advisers annually to assess organisations' compliance and performance. Includes the Organisational Compliance Checklist, Service Plan Checklist, Quality and Safety Checklist and specialist checklists. Some checklists were embedded within SAMS2 as of July 2017.
  • Live monitoring: SAMS2 feature that allows service agreement advisers to record real-time data on organisation performance and track resolution of issues.
  • RAT: A tool DHHS staff use to determine the severity of performance issues. Can trigger remedial actions or a service review.
  • Service agreement compliance certification (SACC): Online attestation completed by organisations annually regarding financial performance, risk management, staff safety screening and privacy. Embedded within SAMS2.

Source: VAGO.

Overall, we found that:

  • FOPMF is essentially a one-size-fits-all framework with some minor exceptions where FOPMF requirements are either optional or not applicable. It cannot be scaled up or down to account for the varying complexities and sizes of funded organisations, or their risk profiles.
  • FOPMF tools such as the service agreement checklists and desktop reviews are heavily compliance driven to ensure funded organisations meet legislative and policy requirements. However, they do not enable deeper insights into service quality and performance issues.
  • FOPMF drives a fragmented and duplicative approach to collecting performance information.

The framework applies to organisations funded through a service agreement with DHHS. There are some exceptions where FOPMF requirements are either optional or not applicable:

  • FOPMF does not apply to organisations funded only through a short-form agreement (used for lower risk grant funding) or a corporate commercial contract. These agreements have their own specific reporting requirements and monitoring arrangements.
  • The service agreement monitoring checklists do not apply to community participation service activities such as neighbourhood houses.
  • The service agreement monitoring checklists are optional for:
    • services that are not direct client facing such as research and training
    • health services that only receive funding through DHHS's Budget Performance System.
  • The desktop review does not apply to hospitals, local governments, organisations funded under the National Disability Insurance Scheme, universities, technical and further education institutes (TAFE), schools and some specific community participation activities.
  • The risk attestation component of the SACC form does not apply to TAFEs and health services that already include an attestation against risk management in their annual reports.

FOPMF design

Development of FOPMF

The development of FOPMF was informed by issues identified through past external and internal reviews. In developing FOPMF, DHHS drew on sources including the Royal Commission into Institutional Responses to Child Sexual Abuse, a range of Victorian Ombudsman and VAGO reports, parliamentary inquiries and internal reviews. FOPMF's key monitoring areas align with the key risk areas these inquiries and reviews identified as needing more effective oversight—governance, financial management and quality and safety of service delivery to clients. FOPMF tools, such as the service agreement checklists, desktop reviews and the SACC form, cover these three key risk areas. Figure 4C lists the topics FOPMF tools collectively cover under each of these key risk areas.

Figure 4C
Topics covered by FOPMF tools

Governance

  • Board and management capabilities and responsibilities
  • Strategic and work planning
  • Risk management

Financial management

  • Financial viability and risks
  • Financial management

Quality and safety of service delivery

  • Performance measures and reporting
  • Staff safety screening (e.g. police, working with children, and referee checks)
  • Staff training
  • Incident management and reporting
  • Complaints management
  • Registration, accreditation and quality standards
  • Fire risk and emergency management
  • Service user safety and wellbeing
  • Records management
  • Privacy, data protection and data quality

Source: VAGO.

Limitations of FOPMF design

FOPMF has some limitations which hinder its effectiveness as a performance management framework.

Scope

Exemptions for different FOPMF requirements are not clear. This makes it difficult to ensure that exemptions are applied as intended, especially when an organisation is funded for multiple services where exemptions may or may not be applicable.

One-size-fits-all

FOPMF does not sufficiently account for the varying complexities and sizes of funded organisations. It is essentially a one-size-fits-all framework with some minor exceptions—those discussed earlier or when the shorter quality and safety checklists are used for lower risk, lower funded organisations. FOPMF does not distinguish medium-risk organisations from high-risk organisations. The same level of performance monitoring applies to these organisations, which can lead to administrative burden for both the funded organisation and DHHS staff. The absence of a risk-tiered approach to performance monitoring is discussed in Section 2.3.

Compliance driven

FOPMF tools such as the service agreement checklists and desktop reviews are heavily compliance driven. The questions in these tools mostly assess whether an organisation is meeting legislative and policy requirements, such as those under the Children, Youth and Families Act 2005 and the Public Records Act 1973. Performance is assessed as compliant, compliant in part or noncompliant. While this approach is important, these monitoring tools still need to enable deeper insights into service quality and performance issues.

Our examination of 38 service reviews showed that FOPMF tools do not cover some of the recurring issues identified in these reviews. Examples include:

  • staff supervision, management and support (including staff performance management)
  • staff rostering and ratios (including use of casual staff)
  • review of board and CEO performance
  • engagement with external stakeholders (including service users' family members and other services)
  • facilities management and upkeep.

Better practice principles in contract management

FOPMF could be strengthened by incorporating better practice principles for monitoring contractor performance, such as those in the ANAO's better practice guide and the VGPB's VPS Procurement Capability Framework. For example, if service agreement staff adopted a structured approach to managing relationships with contractors, including formal meetings at predetermined intervals, both parties would have a clear understanding of when contractor performance was to be formally reviewed. This would assist with managing the contractor relationship and would help DHHS staff to enforce the terms of the contract in a professional manner, based on evidence of contractual performance.

Staff views on FOPMF design

Responses to the DHHS staff survey we conducted on service agreement management also provide an insight into some of the limitations with FOPMF design. The following is a summary of the key survey results related to FOPMF design, which are detailed in Appendix D.

Only 42 per cent of respondents said they agree or strongly agree that FOPMF helps them monitor and manage the performance of funded organisations effectively. Limited access to information appears to be a central issue with FOPMF design according to survey respondents:

  • Only 66 per cent of respondents agreed or strongly agreed that they can access information that allows them to see how a funded organisation is performing against its contracted KPIs.
  • Only 46 per cent of respondents agreed or strongly agreed that they can access information that allows them to see how a funded organisation is performing in a different DHHS division or area.
  • Only 43 per cent of respondents agreed or strongly agreed that they can access information that allows them to compare how a funded organisation is performing against other funded organisations that deliver similar services.
  • Only 12 per cent of respondents agreed or strongly agreed that they can access information that allows them to compare how a funded organisation is performing against a better practice benchmark.
  • Only 35 per cent of respondents agreed or strongly agreed that they have access to all the information they need to effectively monitor and manage the performance of funded organisations.

Respondents identified key barriers to accessing the necessary information, including the need to gather performance information from multiple data systems for different funded activities, and the difficulty of navigating SAMS2.

Collecting performance data

Collecting data on performance against targets

DHHS uses its SDT tool to capture funded organisations' performance data. DHHS introduced SDT in 2014 to enable more frequent performance monitoring of high-risk activities. SDT requires in-scope funded organisations to self-report performance results against performance targets monthly, by entering data directly into the FAC. However, SDT is limited to approximately a third of human services activities; other human services activities and health activities are excluded.

For activities not subject to SDT, DHHS monitors service provision using data it collects from funded organisations under its standard data collection requirements. This monitoring, however, occurs at the state and local level and is intended only for strategic and operational reporting to senior management and ministers. It is not designed to monitor performance at the individual funded organisation level. This data collection also involves multiple data management systems, and the frequency of collection varies by activity. This makes it challenging to collect and monitor performance data specific to a funded organisation that is not subject to SDT.

These limitations also mean that DHHS does not have a complete picture of funded organisations' performance against targets.

Collecting data on performance against DHHS objectives

DHHS collects output-focused performance data which does not show how well a funded organisation is performing against relevant DHHS objectives and outcomes—see 'Links to the Department of Health and Human Services strategic plan and the Victorian public health and wellbeing outcomes framework' in Section 2.2.

Issues with performance information collection

We found multiple areas of concern with how performance information is collected.

Multiple systems

FOPMF requires service agreement advisers to refer to other existing monitoring systems—such as those relating to incident reporting information, complaints handling and Financial Accountability Requirements—and to transfer the relevant information when completing FOPMF tools, including service agreement checklists and desktop reviews. The disparate nature of these systems makes completing FOPMF tools administratively difficult and time consuming. It also increases the risk of human error when gathering the relevant performance information.

Unclear information collection frequency

FOPMF requires service agreement advisers to complete service agreement monitoring checklists in SAMS2 annually. Some checklist questions require performance information to be collected once a year while other questions require more regular information collection to enable ongoing monitoring. The nature of a checklist question and the availability of performance information determines the frequency of collection.

DHHS provides some guidance on performance information collection frequency. However, the guidelines and checklist templates do not clearly identify which checklist questions require annual information and which require more frequent information collection. Service agreement advisers ultimately determine how frequently they collect, record and monitor performance information for each checklist question in SAMS2, which could be too infrequent to enable meaningful performance monitoring.

The infrequent collection of performance information and inconsistent use of service agreement monitoring checklists were evident in the checklists for the 12 funded organisations we examined in this audit. As of 3 April 2018—just three months before the end of the 2017–18 reporting period—SAMS2 showed:

  • organisational compliance checklists were blank for eight of the 12 funded organisations, even though, according to DHHS guidelines, there is at least one question that would require regular information collection and monitoring
  • service plan checklists were blank for six of the 12 selected funded organisations, even though, according to DHHS guidelines, there are at least six questions that would require regular information collection and monitoring.

The lack of clear guidance on which checklist questions require annual information collection and which require ongoing collection is further highlighted by the Outer Eastern Melbourne Area, which has created a monitoring questions map to ensure staff apply checklist questions at a consistent frequency. The map colour codes checklist questions to distinguish those that require annual monitoring from those that require ongoing monitoring. A simplified sketch of this kind of mapping is shown below.
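The sketch below shows one way such a map can be expressed so that staff (or a system like SAMS2) could filter checklist questions by their required collection frequency. The question identifiers, frequency tags and comments are hypothetical illustrations, not the area's actual map.

```python
# A sketch of a monitoring questions map: each checklist question is tagged
# with the collection frequency it requires. The question identifiers,
# frequencies and comments below are hypothetical, not the area's actual map.

MONITORING_QUESTIONS_MAP = {
    "OCC-01": "annual",   # hypothetical: accreditation status confirmed yearly
    "OCC-02": "ongoing",  # hypothetical: staff safety screening as staff change
    "SPC-01": "ongoing",  # hypothetical: incident reporting reviewed as raised
    "SPC-02": "annual",   # hypothetical: service plan reviewed at renewal
}

def questions_due(frequency: str) -> list[str]:
    """Return the checklist questions requiring the given collection frequency."""
    return [qid for qid, freq in MONITORING_QUESTIONS_MAP.items() if freq == frequency]

print(questions_due("ongoing"))  # ['OCC-02', 'SPC-01']
```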

Overlapping information collection

DHHS requires both the service agreement monitoring checklists and the desktop review to be completed on an annual basis. Both comprise questions focused on identifying risks in relation to governance, financial management and service delivery, although the layout and language used are not identical. Figure 4D maps the topics across FOPMF tools.

Figure 4D
Mapping of topics covered by FOPMF tools

The figure maps the following topics, by theme, against the FOPMF monitoring tools(a): the Organisational Compliance and Service Plan service agreement checklists, the desktop review and the SACC.

Governance
  • Board and management capabilities and responsibilities
  • Strategic/work planning
  • Risk management

Financial management
  • Financial viability and risks
  • Financial management

Service delivery
  • Performance measures and reporting
  • Staff safety screening (police, working with children and referee checks)
  • Staff training (service delivery staff)
  • Incident management and reporting
  • Complaints management
  • Registration, accreditation and quality standards
  • Fire risk/emergency management
  • Service user safety and wellbeing
  • Records/information management
  • Privacy, data protection and data quality

(a) The quality and safety checklist has not been included as it is used in place of the Organisation Compliance Checklist and the Service Plan Checklist for lower risk, lower funded organisations. Specialist checklists have not been included as they only apply to specific services (e.g. disability; residential care for children and young people).
Source: VAGO.

DHHS advises that the uptake of the service agreement monitoring checklists has not been consistent across the state since FOPMF became operational in 2016. Consequently, DHHS will only decommission its desktop reviews—a legacy tool from the previous monitoring framework that serves a similar function—once there is more consistent uptake of monitoring checklists. While the logic underpinning this decision is understandable, the continued availability of a familiar, pre-existing monitoring tool with a similar intent will continue to cause duplication and discourage service agreement advisers from using the checklists more consistently. Refer to Section 2.3 for funded organisations' survey results regarding duplication across administrative and compliance obligations.

Frequency of engagement with organisations

According to the ANAO's better practice guide, it is better practice to adopt a structured approach to managing the relationship with a contractor that consists of both informal interactions and formal meetings at predetermined intervals. Having predetermined meetings scheduled ensures both parties understand when contractor performance will be reviewed formally.

Under FOPMF, there is an expectation to monitor and engage with funded organisations throughout the year. However, FOPMF guidelines do not specify minimum requirements for formal meetings at predetermined intervals with funded organisations. There are no mechanisms under FOPMF to ensure service agreement advisers engage with and monitor funded organisations at a frequency that would be appropriate to their complexity, level of funding and risk. It is up to service agreement advisers to determine how often they engage with and monitor funded organisations.

The need to have formal meetings at predetermined intervals with funded organisations has been highlighted by the engagement model that the Outer Eastern Melbourne Area has developed to address this gap in FOPMF. The engagement model recognises the diversity of funded organisations and the need to tailor engagement frequencies accordingly. It categorises funded organisations into one of four tiers that reflect the complexity and level of funding of an organisation—Tier 1 for high funding, high complexity organisations down to Tier 4 for individual support packages providers (low funding, low complexity). It then specifies the minimum frequency for engagement and performance monitoring activities required at each tier (such as quarterly, six monthly and annually). The engagement model is supported by a monitoring and engagement schedule that identifies the organisations to be engaged with each month for the financial year.
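The sketch below illustrates how a tier-based engagement schedule of this kind can be generated. The organisation names are hypothetical, and the engagements-per-year assigned to each tier are indicative assumptions based on the frequencies described above (quarterly, six monthly and annually), not the area's actual settings.

```python
# A sketch of a tier-based engagement schedule. Tier frequencies and
# organisation names are hypothetical assumptions, not the area's settings.

ENGAGEMENTS_PER_YEAR = {1: 4, 2: 2, 3: 1, 4: 1}  # e.g. Tier 1 quarterly, Tier 4 annually

def engagement_schedule(organisations: dict[str, int]) -> dict[int, list[str]]:
    """Spread each organisation's formal engagements across the 12 months of a
    financial year according to its tier (1 = most engagement)."""
    schedule: dict[int, list[str]] = {month: [] for month in range(1, 13)}
    for i, (org, tier) in enumerate(sorted(organisations.items())):
        visits = ENGAGEMENTS_PER_YEAR[tier]
        start = i % 12  # stagger organisations so visits do not bunch in month 1
        for v in range(visits):
            month = (start + v * (12 // visits)) % 12 + 1
            schedule[month].append(org)
    return schedule

orgs = {"Provider A": 1, "Provider B": 2, "Provider C": 4}
for month, due in engagement_schedule(orgs).items():
    if due:
        print(month, due)
```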

Assessing and managing performance

Assessing performance issues and risks

FOPMF requires service agreement advisers to use a RAT to drive the assessment of performance. The RAT aims to give service agreement advisers a consistent approach to assessing the severity of an issue and to determining appropriate action where required.

The RAT is intended to be used in conjunction with FOPMF tools and associated guidelines. The risk ratings in the RAT are:

  • 0—no issue
  • 1—minor severity
  • 2—moderate severity
  • 3—major severity
  • 4—critical severity.

The RAT provides a consequence description against each rating to help staff determine the rating for the issue. It also provides some examples of actions service agreement advisers can take in response to each rating.
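As a simple illustration of how a five-point scale like the RAT supports consistent triage, the sketch below encodes the ratings and maps each one to an indicative response type. The mapping itself is an illustrative assumption, not DHHS's prescribed guidance.

```python
# A sketch of the RAT's five-point severity scale with an indicative mapping
# to response types. The mapping is an illustrative assumption, not DHHS's
# prescribed guidance.

from enum import IntEnum

class RATRating(IntEnum):
    NO_ISSUE = 0
    MINOR = 1
    MODERATE = 2
    MAJOR = 3
    CRITICAL = 4

def suggested_response(rating: RATRating) -> str:
    """Map a severity rating to an indicative response type."""
    if rating is RATRating.NO_ISSUE:
        return "no action required"
    if rating in (RATRating.MINOR, RATRating.MODERATE):
        return "agree remedial actions with the funded organisation"
    if rating is RATRating.MAJOR:
        return "consider a collaborative or investigative service review"
    return "consider enacting service agreement or legislative clauses"

print(suggested_response(RATRating.MODERATE))
```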

While the consequence descriptions in the RAT and Live Monitoring Business Rules cover the same content with slight variations to wording, they are not as clear and succinct as the rating descriptions provided in SAMS2 for recording live monitoring issues. An example of the difference in rating descriptions is shown in Figure 4E.

Figure 4E
Comparison of descriptions for 'moderate severity' rating

RAT

  • Compromised safety, rights, wellbeing of service user/employee, for example, inadequate response to a complaint or lack of evidence about staff training
  • Fire risk certificate not provided
  • Some performance failures against service agreement targets. Failure in service quality, for example, lack of evidence of complaint handling process
  • Modest disruption regarding provisions of services
  • Organisation has not provided SACC form after repeated reminders
  • There is a lack of evidence about governance roles and responsibilities, expertise of board/committee and planning, or lack of board/committee training over three years
  • Financial irregularities or significant decrease in Commonwealth funding
  • Adverse public reports about service quality

Live Monitoring Business Rules

  • Compromised safety, rights, wellbeing of service user/employee
  • Some performance failures against service agreement targets
  • Failure in service quality
  • Modest disruption regarding provisions of services
  • Governance/organisation performance risks for example risks to long-term viability
  • Financial irregularities
  • Complaints raised in media

SAMS2

  • Issue indicates a trend or concern that without action may escalate and put services at risk but will have no immediate impact at this point

Source: VAGO based on information from DHHS.

This inconsistency in rating descriptions can cause confusion for staff who are trying to use the RAT and live monitoring. It can also lead to an inconsistent approach to assessing the severity of an issue and developing remedial actions.

Defining actions in response to underperformance

Depending on the severity of the issues (RAT rating), FOPMF provides three types of responses to performance issues as shown in Figure 4F.

Figure 4F
Responses to underperformance

Remedial actions

Specific actions developed in discussion with DHHS and the funded organisation to address identified performance issues.

Service review

  • Collaborative review—undertaken in collaboration with the funded organisation and may involve an independent consultant to assess a funded organisation's ongoing viability and operating model with the aim of producing an action plan to address issues.
  • Investigative review—conducted by an independent consultant and managed by DHHS. Investigative reviews are undertaken when there is evidence or allegations made of a significant breach of the service agreement or service failure which will impact service user safety and wellbeing, or the ongoing provision of quality and sustainable services.

Enact clauses under service agreement or legislation

Occurs when there is sufficient evidence that the funded organisation has failed to address the requirements of the service agreement, impacting on service user safety and wellbeing, the ongoing provision of quality and sustainable services, or the ongoing viability of the service and funded organisation. An assessment of whether to enact a clause involves DHHS operational, management and executive staff and review of evidence. Enacting clauses may lead to service suspension, suspension of funding, cessation and termination of the service agreement.

Source: VAGO based on information from DHHS.

The FOPMF Guidelines for Department Monitoring Staff provide some guidance on which response action to take and the associated timeframes for each severity level. However, this guidance still relies on the judgement of service agreement advisers to determine a suitable timeframe as shown in Appendix F. This can lead to service agreement advisers setting different timeframes to resolve similar performance issues, especially when SAMS2 does not automatically set due dates for live monitoring actions. Refer to Section 4.3 regarding live monitoring issues and associated remedial actions recorded in SAMS2.
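One way to reduce this variability would be for SAMS2 to derive a default due date from the severity rating. The sketch below shows the idea; the timeframes per rating are hypothetical assumptions, since the FOPMF guidance leaves them to adviser judgement.

```python
# A sketch of severity-based default due dates for remedial actions, the kind
# of rule SAMS2 could apply automatically. The timeframes per rating are
# hypothetical assumptions; FOPMF guidance leaves them to adviser judgement.

from datetime import date, timedelta

RESOLUTION_DAYS = {"minor": 90, "moderate": 60, "major": 30, "critical": 7}

def due_date(severity: str, raised: date) -> date:
    """Derive a consistent due date from the issue's severity rating."""
    return raised + timedelta(days=RESOLUTION_DAYS[severity])

print(due_date("major", date(2018, 4, 17)))  # 2018-05-17
```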

Communicating performance

FOPMF makes performance information available to internal and external stakeholders in the following ways:

  • completed service agreement monitoring checklists and live monitoring issues are available to DHHS staff through SAMS2
  • completed desktop reviews are available to DHHS staff and the relevant funded organisation through FAC.

However, it is up to the relevant individuals within DHHS to access this information. FOPMF does not provide clear instructions for communicating or using performance information. Specifically, DHHS's internal FOPMF guidelines provide little information on:

  • the intended audience/s for each completed FOPMF tool
  • the business rules and processes for distributing completed FOPMF tools
  • how the results feed into other forms of performance monitoring and reporting, such as the Area Performance, Assurance and Compliance and the Divisional Performance, Assurance and Compliance meetings.

Performance information, such as client incident reporting and performance against funded targets, is reported at the area and divisional level to senior management through forums such as Divisional Performance, Assurance and Compliance. DHHS collects this performance information from various data management systems and some of the completed FOPMF tools.

4.3 Applying the performance monitoring framework

The restructure review commissioned by DHHS found that there is inconsistent application of FOPMF—along with other processes for monitoring funded organisations—and that DHHS needs to better communicate and oversee the implementation of this mandated policy.

Our overall findings were consistent with the restructure review:

  • While there was a high uptake of desktop reviews and monitoring checklists among staff, use of the RAT for performance monitoring was particularly low.
  • The majority of staff who use the live monitoring tool do so to raise performance issues and organisational updates. However, the value of live monitoring data is limited by a high proportion of incomplete entries and poor staff awareness of the RAT.
  • Staff offered a variety of reasons for not using the FOPMF tools made available to them, including a lack of awareness and training, as well as the tools not suiting their needs. The frequent use of local systems and tools among divisional staff presents a key challenge to improving the uptake of FOPMF tools.

Use of FOPMF tools

Eighty-one per cent of respondents to our survey of DHHS service agreement staff reported that they use FOPMF in some capacity. The remaining respondents—including 16 per cent who reported not using FOPMF and 3 per cent who were not sure—are likely to include some staff who manage service agreements that are exempt from FOPMF requirements.

We asked the surveyed staff who reported using FOPMF in some capacity which FOPMF tools they use to monitor funded organisation performance. Figure 4G summarises responses to this question.

Figure 4G
Survey responses—DHHS staff
Question 18: Which FOPMF tools do you use?

Note: This survey question applies only to respondents that reported using FOPMF.
Source: VAGO.

The particularly low uptake of the RAT is problematic, as it represents DHHS's system-wide tool for ensuring that staff assess the severity of performance issues consistently and accurately. It also undermines the reliability of performance issues entered into live monitoring, which is designed to be used alongside the RAT.

The use of FOPMF tools varied across both the health and human services portfolios and DHHS's divisions:

  • In the health services, 62 per cent of FOPMF users reported that they use the RAT, compared to only 48 per cent of human services FOPMF users.
  • West Division FOPMF users reported the lowest uptake of live monitoring out of all DHHS's four divisions. Seventy-two per cent of these staff reported using this tool, while between 83 and 87 per cent of FOPMF users in the other three divisions reported using it.
  • There was a noticeable difference in the use of the RAT between the East and South Divisions (65 per cent and 61 per cent respectively) and the North and West Divisions (47 per cent and 46 per cent respectively).
  • Central office FOPMF users reported particularly low usage of the RAT (29 per cent) and of live monitoring (50 per cent).

DHHS can run reports showing the completion status of its monitoring checklists, desktop reviews and live monitoring issues. However, these reports are limited to the divisional level and cannot provide further detail at the area level. We sought, but did not find, evidence that DHHS had used these reports to increase the uptake of FOPMF tools among staff.

Barriers to using FOPMF tools

The restructure review commissioned by DHHS identified a perception among staff that insufficient guidance, processes and tools have resulted in inconsistent approaches to addressing noncompliance by funded organisations. It also found that areas have developed their own tools to manage the scheduling of monitoring activities.

As part of our online survey, we asked DHHS staff to provide reasons for not using the FOPMF tools available to them. Common explanations included:

  • staff not being aware of the RAT
  • insufficient training and support on how to use FOPMF tools, particularly live monitoring
  • the tools not being suitable to the types of service agreements being managed
  • other competing priorities.

Through our survey, we also asked DHHS staff whether they use any local systems or tools instead of, or in addition to, FOPMF to monitor the performance of funded organisations. As shown in Figure 4H, 60 per cent of surveyed staff reported that they do use local systems or tools. This included common use of various performance monitoring templates, spreadsheets and other monitoring tools that sit outside of FOPMF.

Figure 4H
Survey responses—DHHS staff
Question 21: Do you use any local systems or tools instead of, or in addition to, FOPMF to monitor the performance of funded organisations?

Source: VAGO.

Surveyed staff from DHHS's West Division consistently cited the division's own performance escalation framework as a key local tool for escalating performance issues in a more sophisticated way than is possible using FOPMF tools alone. The division's reliance on this local tool is likely the reason why its reported use of both live monitoring and the RAT was the lowest of all four divisions.

Under the performance escalation framework, West Division staff score funded organisations according to:

  • the variation between service delivery targets and results
  • the frequency of failure to meet service delivery targets
  • the number of 'minor' or 'moderate' issues entered into live monitoring.

The West Division's monitoring effort is then scaled to each organisation's score. The score also informs the seniority of DHHS staff allocated to monitor each organisation. This addresses issues we found with FOPMF's design, including its inability to scale monitoring requirements based on risk.
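The sketch below illustrates a scoring approach of this kind. The inputs mirror the three factors listed above, but the weightings, caps and resulting monitoring intensities are hypothetical assumptions rather than the West Division's actual rules.

```python
# A sketch of a performance escalation score along the lines of the framework
# described above. Weightings, caps and intensity thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class OrganisationResults:
    target_variance_pct: float   # shortfall against service delivery targets
    periods_missed: int          # how often targets were not met
    open_minor_moderate: int     # 'minor'/'moderate' live monitoring issues

def escalation_score(r: OrganisationResults) -> int:
    """Combine the three inputs into one score; higher means more monitoring."""
    score = 0
    score += min(int(r.target_variance_pct // 10), 5)  # up to 5 points for variance
    score += min(r.periods_missed, 5)                  # up to 5 points for frequency
    score += min(r.open_minor_moderate, 5)             # up to 5 points for issues
    return score

def monitoring_intensity(score: int) -> str:
    if score >= 10:
        return "senior staff, monthly engagement"
    if score >= 5:
        return "standard monitoring, quarterly engagement"
    return "routine annual monitoring"

r = OrganisationResults(target_variance_pct=25.0, periods_missed=3, open_minor_moderate=2)
print(monitoring_intensity(escalation_score(r)))  # standard monitoring, quarterly engagement
```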

Our June 2018 Community Health Program performance audit report contains further information on the design and application of the West Division's performance escalation framework. That audit identified similar issues with the poor uptake of FOPMF and the reliance on local tools.

Identifying and addressing performance issues through live monitoring

We analysed SAMS2 data to understand how DHHS staff had used the live monitoring FOPMF tool to raise performance issues since it was introduced in mid-2015.

Live monitoring issues recorded in SAMS2

As at 17 April 2018, 1 384 live monitoring issues had been recorded in the SAMS2 information system across 616 funded organisations. This covers 32 per cent of the 1 927 funded organisations recorded in SAMS2.

For funded organisations subject to risk-tiering assessment, we found eight organisations with a high-risk score (risk score above 20) had no issues recorded in live monitoring—see Figure 4I.

Figure 4I
Number of live monitoring issues versus risk score

Source: VAGO.

Figure 4J shows that the majority of organisations with a live monitoring issue recorded had fewer than 10 entries created.

Figure 4J
Number of live monitoring issues created per funded organisation

Source: VAGO.

Figure 4K breaks down the distribution of live monitoring issues created across the four severity ratings—minor, moderate, major and critical. In particular, 69 per cent of live monitoring issues created were assigned a moderate severity rating, while less than 1 per cent of issues were assigned a critical rating.

Figure 4K
Breakdown of live monitoring issues by severity rating

Source: VAGO.

Live monitoring issues without planned remedial actions

Not all live monitoring issues have led to planned remedial actions being recorded in SAMS2. Figure 4L shows that only 28 per cent of live monitoring issues with a moderate severity rating had a planned remedial action recorded. In contrast, approximately three-quarters of live monitoring issues across the remaining three severity rating categories had a planned remedial action recorded. However, these issues only made up 31 per cent of all live monitoring issues.

Figure 4L
Percentage of live monitoring issues with actions created by severity rating

Source: VAGO.

We examined the two live monitoring issues recorded with a critical severity rating that did not have a planned remedial action recorded in SAMS2. These were:

  • an issue recorded in September 2017 raising 30 noncompliances with section 120 of the Children, Youth and Families Act 2005, which requires that an out-of-home care service ask DHHS whether a person is disqualified before approving, employing or engaging them as a foster carer
  • an issue recorded in January 2018 raising a series of concerns with the funded organisation's call system, staffing, roster management, training and supervision.

Although remedial actions may have been documented elsewhere, such as in action plans or meeting minutes, not having them recorded centrally in live monitoring limits DHHS's ability to monitor and track actions. It also limits assurance that performance issues are addressed satisfactorily.

Overdue remedial actions

We also analysed the prevalence of live monitoring issues recorded in SAMS2 with planned remedial actions that were overdue as at 17 April 2018. Figure 4M shows that 127 planned actions were overdue, by an average of 264 days.

Figure 4M
Analysis of overdue remedial actions in response to live monitoring issues

Measure                         Minor  Moderate  Major  Critical  All severity ratings
No. of overdue actions             38        74     15         0                   127
Average length overdue (days)     249       270    279       n/a                   264
Median length overdue (days)      260       199    229       n/a                   200
Maximum length overdue (days)     661       671    598       n/a                   671

Source: VAGO.
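For reference, measures like those in Figure 4M can be derived from an issue-level extract of planned actions. In the sketch below the column names and sample records are hypothetical; only the aggregation logic reflects the analysis described above.

```python
# A sketch of how Figure 4M-style measures can be derived from an issue-level
# extract. Column names and sample records are hypothetical illustrations.

import pandas as pd

actions = pd.DataFrame({
    "severity": ["minor", "moderate", "moderate", "major"],
    "days_overdue": [260, 199, 341, 229],  # days past the planned due date
})

summary = actions.groupby("severity")["days_overdue"].agg(
    overdue_actions="count",
    average_overdue="mean",
    median_overdue="median",
    maximum_overdue="max",
)
print(summary)
print("Total overdue actions:", len(actions))
```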

There were no overdue actions in relation to live monitoring issues with a critical severity rating. Of the 15 overdue remedial actions relating to live monitoring issues with a major severity rating, five actions had no record in SAMS2 of any response taken since the issue was created. These actions all related to instances of noncompliance in some form, such as verifying a funded organisation's compliance with registration requirements following an external audit.

Recording performance issues and tracking remedial actions is vital to managing service agreements, especially when issues have the potential to impact client safety and wellbeing. Until all service agreement staff use live monitoring regularly as intended, DHHS will continue to have limited oversight and assurance that performance issues are being addressed effectively and efficiently.

Using performance information to inform funding decisions

DHHS's Service Agreement business rules and guidelines instruct staff to consider the following when selecting an organisation to provide services:

  • financial viability
  • governance arrangements
  • facilities, staffing, resources and expertise
  • history of known performance and compliance issues (if it receives existing funding).

Despite this guidance, we could not find records in SAMS2 that clearly show existing performance information—generated through FOPMF or otherwise—being used to inform future service agreement funding decisions. According to DHHS, the evidence underpinning service agreement funding decisions is stored in the 'attachments' tab for each agreement in SAMS2. However, the information stored in this location did not indicate that funding decisions had considered past performance where appropriate.


Appendix A. Audit Act 1994 section 16—submissions and comments

We have consulted with DHHS, and we considered its views when reaching our audit conclusions. As required by section 16(3) of the Audit Act 1994, we gave a draft copy of this report to DHHS and asked for its submission and comments. We also provided a copy of the report to the Department of Premier and Cabinet.

Responsibility for the accuracy, fairness and balance of those comments rests solely with the agency head.

DHHS's response is included below.

RESPONSE provided by Secretary, DHHS

RESPONSE provided by Secretary, DHHS, page 1

 

RESPONSE provided by Secretary, DHHS, page 2

 

RESPONSE provided by Secretary, DHHS, page 3


Appendix B. Performance standards established in DHHS service agreements

Figure B1
Performance standards established in DHHS service agreements

Clause 3.1

States that organisations need to:

  • deliver the services in a proper, timely and efficient manner using the standard that would reasonably be expected from an expert and experienced provider of the services
  • act in accordance with the highest applicable professional ethics, principles and standards
  • obtain and maintain relevant accreditation or registration
  • comply with the standards and performance targets.

Clause 3.6

Requires that organisations remain accredited and undertake a performance review against the standards, by an independent review body, every three years.

Schedule 1

Lists the relevant DHHS policies that organisations must comply with. These include, but are not limited to the:

  • Policy and Funding Guidelines, which detail the funding conditions and performance measures for each funded health and human services activity
  • Service agreement information kit for funded organisations, which summarises service agreement terms and conditions, specific DHHS policies (such as incident reporting and fire risk management), and funding and payment information.

The schedule also lists various SSGs that organisations are required to comply with. These range from strategies and frameworks to program manuals, practice guidelines and client audit tools. DHHS Service Standards is part of this list.

Note: The agreement defines 'the standards' as those performance standards made under the Children Youth and Families Act 2005 and Disability Act 2006, and any standards developed or endorsed by DHHS.
Source: VAGO.


Appendix C. DHHS service system outcomes and key results

DHHS established five outcomes in its 2017 strategic plan. This audit focused on one of these—'Victorian Health and Human Services are person-centred and sustainable'. Figure C1 outlines the DHHS service system outcomes and key results under this direction.

Figure C1
DHHS service system outcomes and key results

Services are appropriate and accessible in the right place
  1. Increase participation in universal and earlier intervention services—especially by Aboriginal Victorians
  2. Reduce the average wait time for people on the priority housing list
  3. Improve the timeliness of access to elective surgery, emergency department treatment, outpatient services, ambulance services and palliative care
  4. Reduce unexplained variation in the care people receive—especially for disadvantaged groups

Services are inclusive and respond to choice, culture, identity, circumstances and goals
  5. Increase client and patient choice concerning the services and treatment they receive
  6. Increase diversity of the department's workforce—especially Aboriginal people employed in senior roles
  7. Increase citizen engagement in the design and delivery of services
  8. Increase participation of service providers and staff in the design of services

Services are efficient and sustainable
  9. Reduce demand for acute services to manage complex and chronic conditions
  10. Increase the proportion of service assets that are appropriately maintained
  11. Increase the proportion of capital projects delivered on time and on budget
  12. Improve alignment of our health, human services and community recreation assets with the needs of clients, patients and Victoria's growing population
  13. Reduce waste arising from the use of inappropriate care

Services are safe, high-quality and provide a positive experience
  14. Improve patient- and client-reported experiences of care and treatment
  15. Reduce restrictive practices in formal care settings
  16. Increase the transparency of service safety and quality
  17. Reduce assault, exploitation and neglect of clients and patients cared for in formal settings

Source: VAGO based on the Victorian public health and wellbeing outcomes framework.


Appendix D. Survey results: DHHS staff

The following charts summarise the responses to our survey of DHHS service agreement staff, including quantifiable responses to our survey questions. Open-text responses are excluded.

We sent our DHHS survey to 513 staff who currently manage, or have previously managed, service agreements. This included staff who manage service agreements as a core part of their role, as well as staff whose role has less of a focus on managing service agreements. We received 200 responses, equating to a response rate of 39 per cent.

Figure D1
Question 1: Which best describes your involvement with DHHS service agreements?

Source: VAGO.

Figure D2
Question 2: Has your involvement with DHHS service agreements focused on health services, human services, or both?

Source: VAGO.

Figure D3
Question 3: In your involvement with managing service agreements, what is (or was) your job title?

Note: Job titles listed on the horizontal axis were included in at least one response.
Source: VAGO.

Figure D4
Question 4: What DHHS division do you work in?

Source: VAGO.

Figure D5
Question 5: Which DHHS area do you work in?

Division  Area                     Responses  Per cent of total
North     Hume Moreland                    5  3%
          Loddon                          13  8%
          Mallee                           4  2%
          North Eastern Melbourne         10  6%
South     Bayside Peninsula               15  9%
          Inner Gippsland                 15  9%
          Outer Gippsland                  5  3%
          Southern Melbourne              12  7%
East      Goulburn                         6  4%
          Inner Eastern Melbourne         10  6%
          Outer Eastern Melbourne         10  6%
          Ovens Murray                     7  4%
West      Barwon                          13  8%
          Brimbank Melton                  5  3%
          Central Highlands               14  9%
          Western District                 8  5%
          Western Melbourne               11  7%

Note: This question only applies to those who recorded that they work in one of DHHS's four geographical divisions—North, South, East and West.
Note: Figures may not total 100 per cent due to rounding.
Source: VAGO.

Figure D6
Question 6: Approximately how long have you worked as a service agreement staff member at DHHS?

Source: VAGO.

Figure D7
Question 7: How many funded organisations are you (or were you) responsible for managing?

Note: Respondents that reported managing higher numbers of funded organisations commonly included people in managerial or team leader roles.
Source: VAGO.

Figure D8
Question 8: How effective has the orientation and induction provided by DHHS been at giving you the basic skills and knowledge needed to manage service agreements?

Source: VAGO.

Figure D9
Question 9: How effective has training provided by DHHS been at building and maintaining the skills you need to manage service agreements?

Source: VAGO.

Figure D10
Question 10: As a new service agreement monitoring staff member, were you assigned service agreements to manage that reflected your level of experience and skills?

Source: VAGO.

Figure D11
Question 11: As a service agreement monitoring staff member, do you (or did you) have an individual performance plan?

Source: VAGO.

Figure D12
Question 12: How often do you (or did you) review performance against your individual performance plan with your manager?

Source: VAGO.

Figure D13
Question 13: How effective has the individual performance planning and review process been at addressing your learning and development needs?

Source: VAGO.

Figure D14
Question 14: As a service agreement monitoring staff member, what proportion of tasks that you perform are NOT reflected in your position description?

Source: VAGO.

Figure D15
Question 15: On average, how much time per day do you spend monitoring or managing the performance of funded organisations?

Source: VAGO.

Figure D16
Question 16: On average, how much time per day do you spend on other tasks (i.e. beyond monitoring or managing the performance of funded organisations)?

Source: VAGO.

Figure D17
Question 17: Do you use the Funded Organisation Performance Monitoring Framework (FOPMF)?

Source: VAGO.

Figure D18
Question 18: Which FOPMF tools do you use?

Source: VAGO.

Figure D19
Question 19: To what extent do you agree with the following statement: There are clear instructions and guidance on how to use FOPMF

Source: VAGO.

Figure D20
Question 20: To what extent do you agree with the following statement: FOPMF helps me to monitor and manage the performance of funded organisations effectively

Source: VAGO.

Figure D21
Question 21: Do you use any local systems or tools instead of, or in addition to, FOPMF to monitor the performance of funded organisations?

Source: VAGO.

Figure D22
Question 22: To what extent do you agree with the following statement: I can access information that allows me to see how a funded organisation is performing against its contracted KPIs

Source: VAGO.

Figure D23
Question 23: To what extent do you agree with the following statement: I can access information that allows me to see how a funded organisation is performing in a different DHHS Division or Area

Source: VAGO.

Figure D24
Question 24: To what extent do you agree with the following statement: I can access information that allows me to compare how a funded organisation is performing against other funded organisations that deliver similar services

Source: VAGO.

Figure D25
Question 25: To what extent do you agree with the following statement: I can access information that allows me to compare how a funded organisation is performing against a better practice benchmark

Source: VAGO.

Figure D26
Question 26: To what extent do you agree with the following statement: I have access to all the information I need to effectively monitor and manage the performance of funded organisations

Source: VAGO.


Appendix E. Survey results: funded organisations

The following charts summarise the responses to our survey of organisations that DHHS funds through service agreements. It includes quantifiable responses to our survey questions and excludes open-text responses.

We distributed our funded organisation survey to 1 021 funded organisations. The recipient list was compiled through a SAMS2 report that listed the main contact for each funded organisation with a head office or postal address. This SAMS2 report excludes organisations that only have a short-form agreement with DHHS, as well as organisations whose contact details were not finalised as at 4 April 2018.

We received 355 responses, equating to a response rate of 35 per cent.
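For reference, the response rate is simply the number of responses received divided by the number of surveys distributed (our worked calculation, not part of the survey instrument):

355 ÷ 1 021 ≈ 0.348, which rounds to 35 per cent.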

Figure E1
Question 1: What services does DHHS mainly fund your organisation for?

Source: VAGO.

Figure E2
Question 2: Which DHHS area/s does your organisation deliver services in?

Source: VAGO.

Figure E3
Question 3: To what extent do you agree with this statement: Service agreement administrative and compliance requirements are appropriately matched to the level of risk associated with the services we are funded to deliver

Source: VAGO.

Figure E4
Question 4: Is there any duplication in the service agreement reporting and data your organisation is required to provide to DHHS?

Note: Examples of duplication may include being required to provide the same data to different information systems, or providing the same information to different DHHS divisions or areas. This excludes any reporting and data provided to external bodies like the Commonwealth Government.
Source: VAGO.

Figure E5
Question 5: Is there any duplication in the service agreement reporting and data your organisation is required to provide to other parties?

Note: Examples of duplication may include being required to provide the Commonwealth Government and accreditation bodies with the same information or data you already provide to DHHS.
Source: VAGO.

Figure E6
Question 6: Do you have staff resources dedicated to completing service agreement administrative and compliance requirements?

Source: VAGO.

Figure E7
Question 7: What proportion of time do service delivery staff in your organisation spend on service agreement administrative and compliance requirements?

Source: VAGO.

Figure E8
Question 8: To what extent is your organisation able to meet all the service agreement administrative and compliance requirements?

Source: VAGO.

Figure E9
Question 9: Does DHHS follow up with your organisation when administrative and compliance requirements are not met?

Source: VAGO.

Figure E10
Question 10: To what extent do you agree with this statement: I receive all the information I need from DHHS to understand how well my organisation is performing against service agreement targets

Source: VAGO.

Back to top

Appendix F. Risk assessment tool action types and timeframes

Figure F1
Risk assessment tool action types and timeframes

Each severity rating is paired with its response actions and the corresponding action completion time frames.

Minor
  • Remedial action: 0–12 months.

Moderate
  • Remedial action: 0–4 months; staff to ensure actions occur as soon as possible if the risk needs to be addressed within a shorter time frame.
  • Collaborative service review and action plan: review to commence as soon as possible; completion time frame will depend on size and methodology.

Major
  • Remedial action: at least one action to be addressed within one month; immediate action if service user safety or wellbeing is compromised.
  • Collaborative service review and action plan, and investigative service review: reviews to commence as soon as possible; completion time frame will depend on size and methodology.
  • Enacting of clauses under the service agreement or legislation if a breach is identified: time frame not specified.

Critical
  • Remedial action (if possible): at least one action to be addressed within one month; immediate action if service user safety or wellbeing is compromised.
  • Investigative service review: review to commence as soon as possible; completion time frame will depend on size and methodology.
  • Enacting of clauses under the service agreement or legislation if a breach is identified: time frame not specified.

Source: VAGO based on information from DHHS.
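The decision logic in Figure F1 is essentially a lookup from a severity rating to the required response actions and their completion time frames. The sketch below is a minimal illustrative encoding of that table in Python; the structure, names and wording of the entries are our own assumptions for illustration, not part of any DHHS system or tool.

# Illustrative sketch only: one possible encoding of the Figure F1
# decision table. Names and structure are hypothetical, not a DHHS system.

RAT_DECISION_TABLE = {
    "Minor": [
        ("Remedial action", "0-12 months"),
    ],
    "Moderate": [
        ("Remedial action",
         "0-4 months; sooner if the risk must be addressed earlier"),
        ("Collaborative service review and action plan",
         "commence as soon as possible; completion depends on size and methodology"),
    ],
    "Major": [
        ("Remedial action",
         "at least one action within one month; immediate if service user "
         "safety or wellbeing is compromised"),
        ("Collaborative service review and action plan",
         "commence as soon as possible; completion depends on size and methodology"),
        ("Investigative service review",
         "commence as soon as possible; completion depends on size and methodology"),
        ("Enact service agreement or legislative clauses if a breach is identified",
         "not specified"),
    ],
    "Critical": [
        ("Remedial action (if possible)",
         "at least one action within one month; immediate if service user "
         "safety or wellbeing is compromised"),
        ("Investigative service review",
         "commence as soon as possible; completion depends on size and methodology"),
        ("Enact service agreement or legislative clauses if a breach is identified",
         "not specified"),
    ],
}

def required_actions(severity):
    """Return the (response action, completion time frame) pairs for a severity rating."""
    return RAT_DECISION_TABLE[severity]

# Example: list what a 'Major' risk rating requires.
for action, time_frame in required_actions("Major"):
    print(f"- {action}: {time_frame}")

Encoding the table this way makes the escalation pattern explicit: each step up in severity adds response actions and tightens the expected time frames.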

Back to top