Contract Management Capability in DHHS: Service Agreements

Tabled: 20 September 2018

2 Setting service agreement requirements

With $2.8 billion spent annually on service agreements across 1 927 funded organisations, DHHS needs sufficient assurance that clients are receiving quality services in a proper, timely and efficient manner. This requires that service agreements:

  • contain clearly defined performance standards, deliverables and review mechanisms
  • impose requirements on funded organisations that are proportionate to their risk profiles.

In this part, we assessed whether DHHS service agreements are fit-for-purpose, focusing on these two areas.

2.1 Conclusion

DHHS's service agreements are not fit-for-purpose. A fragmented approach to their development and management means that performance measures are set and recorded inconsistently, without a clear focus on desired service quality and outcomes. This fragmentation has also resulted in an increasingly complicated, disjointed and duplicative approach to the risk-profiling of funded organisations that does not inform the service agreement requirements imposed on them.

These issues prevent DHHS from having a clear and accurate understanding of funded organisation performance and service delivery risks. This understanding is critical to ensuring that clients' safety and wellbeing are not compromised.

2.2 Setting performance standards, deliverables and review mechanisms

Performance standards and deliverables

Based on existing better practice material—including the ANAO's better practice guide—we applied the following definitions when assessing service agreement performance standards and deliverables:

  • Performance standards―the quality of the service or activity that funded organisations are contracted to deliver, such as family violence support services and housing assistance services. Relevant agreement clauses, DHHS policies and guidelines fall within this definition.
  • Deliverables―service activity outputs, including what needs to be delivered, to what standard and in what timeframe. Performance measures fall within this definition.

Each service agreement contains a service plan in Schedule 2 that details the service activities that the funded organisation must deliver. Each service activity has funding, performance measures and targets attached to it.

We found that:

  • DHHS could organise performance standards in service agreements in a more meaningful way so that funded organisations clearly understand how the standards apply to each funded service activity
  • performance measures are inconsistent across service agreements for similar services and are internally inconsistent across documents and systems that record performance measures for the same organisation and agreement
  • service agreements did not consistently include mandatory performance measures
  • DHHS had set service agreement performance measures without sufficient system-wide oversight and quality control arrangements.

Performance standards

Service agreements contain standard terms and conditions that detail the mandatory performance standards for funded organisations.

The Funded Agency Channel (FAC) is a secure website that funded organisations use to access their service agreements, performance reports, DHHS policies and standards, as well as other supporting information.

While some of these requirements are explicitly listed in service agreements, others are in documents that sit alongside them. Funded organisations can access these documents through the FAC.

For agencies that deliver a broad range of activities, the applicable standards can be extensive. The more services an organisation is funded for, the more SSGs are listed in the agreement, but the SSG documents are not organised in any meaningful way. One of the 12 service agreements we reviewed listed over 70 SSG documents.

Organisations would benefit from standards that are clearly linked to relevant activities within the agreement, so that specific requirements for each activity are clear. Funded organisations can run a report in the FAC that provides hyperlinks to all applicable SSGs for each funded activity, but we found that the majority of hyperlinks were outdated and broken.

Appendix B details the key areas of the service agreement that establish performance standards.

Deliverables

Figure 2A summarises the service agreement clauses and schedules that detail deliverables.

Figure 2A
Key deliverables in DHHS service agreements

Clause 8: Requires that funded organisations submit service delivery and financial accountability reports to DHHS as stated in the schedules and on request.

Schedule 2: Lists each service activity that the organisation is funded to perform. Activities classified as 'non-investment activities' should have a performance measure and target. These performance measures reflect the deliverables associated with the funding received. Schedule 2 also lists various data collection requirements, including but not limited to:

  • service activity reports
  • project reports
  • national minimum dataset
  • annual reports.

Source: VAGO.

Regarding the timing of deliverables, service agreements include financial-year targets, and DHHS requires funded organisations to report some performance measures more frequently. These additional reporting requirements are not directly documented in the service agreement. Instead, they are listed:

  • in activity descriptions available on DHHS's website (for human services activities)
  • in an appendix to the Policy and Funding Guidelines (for health services).

Omission or misalignment of mandatory performance measures

Activity descriptions in volume 3 (human services) and Appendix 4.1 of volume 2 (health) of the Policy and Funding Guidelines set out performance measures for each service activity. All performance measures in activity descriptions are mandatory. Performance measures in Appendix 4.1 are either mandatory or non-mandatory, which creates inconsistency in performance monitoring. Neither the Policy and Funding Guidelines nor the Service Agreement business rules and guidelines explain the basis for having non-mandatory measures.

Mandatory performance measures were not always included in the 12 service agreements that we reviewed. For example, Appendix 4.1 of the Policy and Funding Guidelines lists 'number of hours of service (provided to clients)' as a mandatory performance measure for the Home and Community Care program (HACC) volunteer coordination activity. Our review of three service agreements providing HACC services showed, however, that the only performance measure set for this activity is 'number of hours of coordinator time', which is a non‑mandatory measure. The mandatory measure is omitted from each of the three agreements.

In another example, the activity description for the home-based care—adolescent community placement service includes three mandatory performance measures:

  • daily average occupancy
  • percentage of the total number of children and young people in placements greater than six months who are in any of the following circumstances:
    • on family reunification
    • being cared for by DHHS Secretary
    • on long-term care orders that are contracted to the provider
  • percentage of total exits from placement that are planned.

Two service agreements that we reviewed included this adolescent community placement service, yet one of them did not include daily average occupancy as a performance measure.

The Multiple and Complex Needs Initiative is a time‑limited specialist disability service for people aged 16 years and older who have been identified as having multiple and complex needs.

Our review of the 12 selected service agreements also showed that performance measures for the Multiple and Complex Needs Initiative (MACNI) service did not fully reflect the mandatory performance measures and targets set out in the Service provision framework: Multiple and Complex Needs Initiative December 2017 (MACNI Service provision framework) or the activity description.

The MACNI Service provision framework states that organisations providing MACNI service plans are required to report against three key performance indicators (KPIs):

  • 90 per cent of care plans are endorsed by the area panel within 12 weeks from the date of eligibility
  • 90 per cent of care plans are reviewed and endorsed by the area panel within six months
  • 100 per cent of clients have an exit transition plan endorsed at least six months prior to care plan termination.

These mandatory KPIs do not align with the three mandatory performance measures listed in the MACNI service activity description:

  • number of clients
  • percentage of MACNI clients that have an assessment and endorsed care plan within 12 weeks of eligibility
  • number of episodes of capacity building to provide care plan coordination for MACNI clients.

One of our 12 selected service agreements included MACNI services. We found that the agreement only listed one performance measure—100 per cent of MACNI clients have an assessment and endorsed care plan within 12 weeks of eligibility. This performance measure is slightly different to the corresponding measure in the MACNI Service provision framework and the service activity description. The agreement makes no reference to the remaining mandatory measures across these two documents.

Inconsistent performance measures between organisations

We also found that performance measures were inconsistent across different organisations with the same service activity.

Figure 2B shows the range of performance measures in five different service agreements for the Integrated Family Services activity, alongside the performance measures as required in the activity description.

Figure 2B
Differences in performance measures for the Integrated Family Services activity across five service agreements

Performance measures compared across the activity description and organisations A to E:

  • Number of cases(a)
  • Number of service hours provided(a)
  • Number of clients
  • Number of packages
  • Number of families—intensive support to families—200 hours per family

(a) Mandatory measure as required in the activity description.
Source: VAGO.

DHHS attributes this inconsistency to some organisations not receiving funding for all components of the Integrated Family Services activity and therefore not being subject to all performance measures. However, given that each of these organisations is funded to provide services directly to clients, it is reasonable to expect that the 'number of clients' performance measure would apply to all of them. Additionally, DHHS's documented activity descriptions make no mention of the link between funding and performance measures. DHHS acknowledges that it could better explain the application of performance measures in its activity descriptions.

We also found inconsistencies across health services activities. For instance, Appendix 4.1 of the Policy and Funding Guidelines states that the mandatory performance measure for the HACC flexible service response activity is an annual service activity report. Three service agreements in our selection included the flexible service response activity, but one did not have the service activity report as a performance measure.

DHHS advised that the type of funding attached to a service activity can affect whether performance measures and targets are required. DHHS classifies funding for activities into one of six categories:

  • ongoing and indexable
  • ongoing and non-indexable
  • fixed-term and indexable
  • fixed-term and non-indexable
  • minor capital
  • prior year adjustment.

DHHS applies annual price indexation, at the rate approved by government, to ongoing or fixed-term funding that is linked to wages. For service activities that receive ongoing funding, SAMS2 automatically requires DHHS staff to include performance measures and targets. However, for service activities that receive fixed-term and non-indexable funding, performance measures are optional. This creates inconsistencies in how contracts are managed and limits DHHS's ability to assess the performance of these service activities. The basis for this differentiation is unclear given that organisations are delivering the same service.

Regardless of the funding arrangement, clients receiving services deserve the same level of assurance about the quality and accessibility of that service.

Inconsistent performance measures for the same organisation

We found that, even within one organisation, performance measures could be inconsistent across the service agreement and other related records and systems for performance measurement. This creates confusion for funded organisations and for DHHS about the level of service required. Figure 2C shows the differences in performance measures across different documents and systems for the Family Violence Support Services activity in one service agreement.

Figure 2C
Example of variance in performance measures across different service agreement documents and systems—Family Violence Support Services activity

Performance measures compared across the activity description, the service agreement, Service Delivery Tracking (SDT) and SAMS2:

  • Number of new cases(a)
  • Number of contacts/referrals (Court Network)(a)
  • Percentage of clients sampled who are satisfied with the service provided

(a) This performance measure is mandatory as stated in the Family Violence Support Services activity description document.
Source: VAGO.

Service Delivery Tracking is an online tool within the FAC website where funded organisations submit performance data on a monthly basis. It applies to approximately one-third of human services activities and is discussed further in Section 3.2.

DHHS advised that SDT can only record one key performance measure per activity. DHHS intends to address this limitation through system improvements planned for 2018–19.

Other performance measure issues

We found that the performance measures and targets detailed in Schedule 2 of the 12 service agreements we examined were not always practical or easily understood. For example, some performance measures had a target of '0.1 new cases'. We heard conflicting reasoning for this from DHHS:

  • Area-based DHHS staff advised that they enter an arbitrary target of 0.1 because SAMS2 requires a target before the agreement can be finalised.
  • Other DHHS staff advised that it could be an administrative error.

Regardless of the reasoning, having a target of 0.1 new cases provides no insight into the level of service provided.

We also saw examples of what appear to be duplicate performance measures set for the same service activity and financial year. While DHHS attributes these duplicative performance measures to the functionality of its SAMS2 system, they are a potential source of confusion for funded organisations.

A lack of quality control in setting performance measures

The omission of mandatory performance measures, along with the inconsistency in how measures have been set and recorded, highlights a lack of system-wide oversight and quality control within DHHS.

Program area staff enter proposed performance measures into SAMS2, which must then be approved by:

  • a peer or team leader within the same program area
  • a finance approver for the relevant group, division or region.

DHHS's Service Agreement business rules and guidelines provide no guidance for staff on what to consider when approving proposed performance measures.

Beyond the program and finance-level approvals, DHHS does not perform a system-wide review of service agreement performance measures for similar activities to ensure that they are set and recorded in a compliant and consistent manner. This prevents DHHS from obtaining a clear and accurate understanding of performance across the state.

Agreement review mechanisms

Service agreements should include mechanisms and triggers for reviewing their terms and conditions.

Service agreements provide for two types of reviews:

  • Clause 9 on audits and performance reviews
  • Clause 21 on reviewing terms and conditions of the service agreement.

This section focuses on the Clause 21 review process. Clause 9 is discussed in Part 3 of this report.

We found that the mechanisms to review the terms and conditions of DHHS service agreements are sound. However, DHHS lacks assurance that its service agreement variations are being processed in accordance with these mechanisms.

Clause 21 of each service agreement states that the agreement may only be varied if either:

  • DHHS and the organisation agree in writing to the variation, or
  • DHHS notifies the organisation in writing of a proposed variation to the agreement and the date the proposed variation will take effect, and the organisation continues to deliver all or part of the services or delivers new services as described in the proposed variation after the effective date.

Variations are commonly used for service growth or new services. They can also be used for other changes, such as to funding and performance targets. In the service agreements of the 12 selected funded organisations we examined in this audit, variations included:

  • increase/decrease to targets following a performance review
  • one-off funding allocation
  • recoup of unspent funds
  • transfer of funding from one organisation to another
  • lapsing funding allocation from the previous financial year.

DHHS has a standard variation process to support this clause that sets out a monthly variation schedule and approval process. This ensures consistency in documenting and timing variations.

The Service agreement information kit sets out the triggers for an agreement variation. These triggers include changes to funding and deliverables, or changes to other requirements contained in the agreement. Either DHHS or the organisation can initiate negotiations for a variation.

DHHS documents the details of each variation in SAMS2 and a finance delegate approves it. Once approved, organisations can review the variation and an amended service agreement through FAC. Organisations have five working days to check that the new version of the agreement reflects their expectations and to advise if there are any errors. Variations are effective five working days after being published on FAC.

The DHHS Service Agreement business rules and guidelines also include further guidance for DHHS staff on the variation process. They set out the minimum information required for the financial delegate to approve a variation, and they introduce a requirement for annual compliance audits of variations to verify that staff record the minimum required information in SAMS2 when processing a variation. DHHS has conducted this audit only once, using a small sample of 25 variations. DHHS did not conduct the audit for the 2017–18 financial year due to staff resourcing constraints.

The Adult, Community and Further Education Board plans and promotes adult learning, allocates resources, develops policies and advises the Minister for Training and Skills on matters related to adult education in Victoria.

The 2016–17 compliance audit found that:

  • nine variations (36 per cent) were fully compliant
  • eight variations (32 per cent) were partially compliant, with the majority of these supported by signed approval records but missing other key information, such as the cost centre or allocation method
  • eight variations (32 per cent) were noncompliant, having no supporting documentation recorded in SAMS2. Six of these variations related to service agreements that are managed by the Adult, Community and Further Education Board but recorded in SAMS2.

The absence of any subsequent variation compliance audits since 2016–17 limits assurance that DHHS is approving and processing service agreement variations in a compliant, evidence-based manner.

Links to the Department of Health and Human Services strategic plan and the Victorian public health and wellbeing outcomes framework

Service agreements should contain explicit links to DHHS's desired service system outcomes. In particular, a service agreement's accountability structures—comprising performance reporting and compliance with standards—should link to these outcomes.

We found that only some service agreement accountability structures clearly link with the Department of Health and Human Services strategic plan (DHHS strategic plan) and the Victorian public health and wellbeing outcomes framework (DHHS's outcomes framework).

DHHS established five outcomes in its 2017 strategic plan. This audit focused on one of these—'Victorian Health and Human Services are person‑centred and sustainable'. It is consistent with DHHS's outcomes framework.

To achieve this strategic direction and outcome, DHHS has established four supporting service system outcomes and identified underlying key results for each outcome as shown in Appendix C of this report. The four supporting service system outcomes are:

  • services are appropriate and accessible in the right place, at the right time
  • services are inclusive and respond to choice, culture, identity, circumstances and goals
  • services are efficient and sustainable
  • services are safe, high-quality and provide a positive experience.

The Department of Health and Human Services Standards are a single set of service quality standards for DHHS‑funded organisations and DHHS‑managed services. Organisations that provide direct client services must meet the standards as an obligation of their service agreement.

The accessibility and quality of services that funded organisations provide under the service agreements directly impact DHHS's ability to achieve key results under the service system outcomes.

Service agreements require funded organisations that deliver services within the scope of the Department of Health and Human Services Standards (DHHS Service Standards) to obtain accreditation against four standards every three years:

  • Empowerment—people's rights are promoted and upheld.
  • Access and engagement—people's right to access transparent, equitable and integrated services is promoted and upheld.
  • Wellbeing—people's right to wellbeing and safety is promoted and upheld.
  • Participation—people's right to choice, decision-making and to actively participate as a valued member of their chosen community is promoted and upheld.

These standards align with DHHS's service system outcomes. Therefore, the service agreement requirements to obtain accreditation against the DHHS Service Standards contribute towards ensuring that service delivery aligns with the DHHS strategic plan and DHHS's outcomes framework.

In contrast, the way in which performance measures and activity reporting requirements in service agreements are linked to the DHHS strategic plan and DHHS's outcomes framework is less clear. Performance measures in service agreements almost exclusively reflect outputs and do not demonstrate how well organisations are achieving the outcomes DHHS has identified. Nor do service agreements explicitly mention the DHHS strategic plan or DHHS's outcomes framework.

Output-based performance measures

Typical examples of the output-driven performance measures in service agreements are:

  • number of service hours
  • number of clients
  • number of sessions.

These measures do not provide any information on service quality. Furthermore, the outcomes framework does not have any benchmarks or targets to assess performance or achievement of outcomes.

Across the 12 selected service agreements we found only two examples of performance measures that clearly focus on service quality:

  • 'percentage of clients who are satisfied with the service provided'—included in six out of the 12 agreements
  • 'percentage of services provided and/or referred to against identified key needs'—included in four of the 12 agreements.

While both measures are directly relevant to the system outcome 'services are safe, high quality and provide a positive experience', they are not mandatory for all funded organisations delivering the corresponding service activity.

2.3 Aligning service agreement requirements to risk

The scale and complexity of outsourced health and human services varies greatly, so it is important that the requirements set under each service agreement are targeted and proportionate to service risks.

We found that:

  • DHHS uses numerous mechanisms to manage service agreement risks—which are also fragmented and largely disconnected from each other
  • DHHS's main tool for categorising funded organisations according to risk has limited coverage, applying to only around a third of all funded organisations
  • DHHS's fragmented risk oversight does not inform funded organisations' service agreement obligations—which funded organisations commonly viewed as excessive and duplicative.

Categorising funded organisations according to risk

In recent years DHHS has introduced new methods, tools and systems to identify and assess risks associated with funded organisations and service agreements:

  • DHHS established FOPMF in 2016 to monitor performance and risks associated with funded organisations' service delivery, financial management and governance. This includes using a RAT to assess the severity of performance issues. We discuss FOPMF in further detail in Part 4 of this report.
  • In January 2016 DHHS launched the live monitoring component of SAMS2, where DHHS staff can record in real time performance issues and risks relating to funded organisations' service agreements.
  • DHHS launched its new CIMS in early 2018 to record and investigate incidents that have a direct impact on the safety of clients. Incidents recorded in CIMS include, but are not limited to, death, physical, emotional and sexual abuse and poor quality of care.
  • Since 2015 DHHS has performed spot audits of residential care providers to ensure that they deliver high-quality, compliant services to children and young people who reside in out-of-home care. DHHS undertakes these audits exclusively for residential care due to the higher risk that the activity poses.

Additionally, in 2015 DHHS introduced a risk-tiering framework to categorise funded organisations according to risk. This occurred as a result of internal reviews that took place between 2011 and 2014, all of which highlighted the need for a more risk-based approach to monitoring funded organisations.

Risk-tiering framework

Under the risk-tiering framework, DHHS performs quarterly assessments of funded organisations using criteria based on various reports, systems and reviews, such as:

  • failure to meet the DHHS Service Standards
  • risk to client safety
  • loss or unauthorised disclosure of client information
  • failure to meet targets as highlighted in the SDT data
  • failure to meet obligations in a timely manner.

Based on the results of these assessments, DHHS places organisations on one of three tiers—high, medium or low risk.

The risk-tiering framework has some key limitations. DHHS applies risk-tiering exclusively to funded organisations that fall under the DHHS Service Standards, which apply to human services involving direct client contact, not to health or mental health activities. This means that approximately two-thirds of the 1 331 funded organisations with a standard service agreement are excluded from risk-tiering assessments. While risk-tiering does apply to higher-risk client-facing services, DHHS acknowledges the need to expand its risk-based oversight to all funded organisations.

Figure 2D shows the average risk-tiering assessment score given to individual funded organisations during 2017 against the total funding received. Risk-tiering assessments do not consider the level of funding that an organisation receives.

Figure 2D
Average risk-tiering assessment score given to funded organisations against funding received, 2017


Note: green dots = low-risk organisations; orange dots = medium-risk organisations; red dots = high-risk organisations.
Source: VAGO based on DHHS data.

DHHS also does not use the results of risk-tiering assessments to inform the service agreement obligations imposed on funded organisations. Consequently, funded organisations are subject to similar service agreement requirements regardless of their risk assessment, with the exception of organisations on short-form agreements and of variations made to active service agreements.

Instead, risk-tiering results are sent to:

  • divisional staff for consideration, alongside local monitoring
  • central office staff to assist with prioritising unannounced audits of residential care providers, and to inform decisions to register organisations in line with requirements under the Children, Youth and Families Act 2005 and the Disability Act 2006.

We found a lack of integrated, strategic risk assessment and management of DHHS service agreements. DHHS's other forms of risk-based oversight, such as risks identified through FOPMF, live monitoring and audits of residential care providers, are not considered in combination with the risk-tiering results and are mostly dealt with in isolation. This fragmentation increases the chance of inconsistent results and of significant risks being missed by relevant departmental staff.

Funded organisations' administrative and compliance requirements

Through our online surveys and face-to-face interviews, we sought the views of funded organisations on whether:

  • service agreement administrative and compliance obligations align with the level of risk associated with contracted services
  • there is any duplication in the service agreement and data reporting requirements
  • they are able to consistently meet their service agreement and data reporting requirements
  • DHHS follows up when administrative and compliance obligations are not met.

We found that:

  • while the majority of funded organisations view their administrative and compliance obligations as being proportionate to service risk, a significant proportion believe these obligations are excessive
  • service agreement administrative and compliance requirements are often duplicative at the departmental and inter-jurisdictional level—especially for larger funded organisations that provide services across multiple DHHS areas
  • only about half of the surveyed organisations believe they are consistently able to meet their service agreement administrative and compliance obligations
  • human services-focused organisations more commonly viewed their administrative and compliance obligations as being disproportionate to risk and beyond their own capacity.

Survey results

We summarise all the survey responses from funded organisations in Appendix E. The following sections focus on survey responses regarding funded organisations' administrative and compliance requirements.

Matching administrative and compliance obligations to risk

Figure 2E summarises funded organisations' responses to our survey question about whether their administrative and compliance obligations were appropriately matched to service risks. It shows that 70 per cent of respondents either agreed or strongly agreed that administrative and compliance obligations in their service agreement aligned with the associated risk.

Figure 2E
Survey responses—Funded organisations
Question 3: To what extent do you agree with this statement: Service agreement administrative and compliance requirements are appropriately matched to the level of risk associated with the services we are funded to deliver?


Source: VAGO.

Health services-focused organisations gave more favourable responses to this question than human services-focused organisations:

  • Seventy-eight per cent of surveyed organisations that primarily deliver health services either agreed or strongly agreed that their administrative and compliance obligations were appropriately matched to their service risk. Another 10 per cent of respondents either disagreed or strongly disagreed, while 12 per cent neither agreed nor disagreed.
  • Sixty-seven per cent of surveyed organisations that primarily deliver human services either agreed or strongly agreed that their administrative and compliance obligations were appropriately matched to service risks. Another 13 per cent either disagreed or strongly disagreed, while 20 per cent neither agreed nor disagreed.

The funded organisations' open-text responses commonly raised concerns about excessive administrative and compliance requirements set by DHHS that do not scale with service risks, organisation size or the level of funding provided. Figure 2F gives examples of these concerns.

Figure 2F
Survey responses—Funded organisations: Open-text examples highlighting excessive administrative and compliance requirements

'It seems that there is the same quality system requirements for small, relatively simple programs as for the large and complex ones.'

'We are a small service delivery organisation in a small rural town. Our compliance requirements are far above the level of risk associated with the delivery of services we provide.'

'There is significant administrative expectations from the Department for low level funding arrangements. Excessive meetings and duplication of processes.'

'We are a small one staff member organisation having to meet the requirements of hugely funded agencies so nearly all areas are a burden to us.'

'Compliance is an onerous process and more time is spent on this area than program delivery.'

'We receive funding from the commonwealth and NSW government as well DHHS. We find that reporting is much greater from DHHS than from other governments.'

'We do not provide direct care services, but receive emails about compliance requirements as if we did.'

'Given the meagre program funding received, the level of reporting on ASM [Active Service Model], Diversity and Care Plans for example do not fit into the very low level offered by [name of organisation] and the model of operation we use.'

'We have already completed accreditation through registered Quality and Regulatory providers, e.g. [name of independent review body]. Why do we have to go through it all again. Unnecessary red tape. Risk aversion is over the top!'

'Focus is on throughput numbers according to targets, but little concentration on the quality of service or differentiation of which programs hold the most risky situations.'

'The requirements for Accreditation against HSS [Human Services Standards] plus governance standards, as well as the FOPMF and the SACC [Service Agreement Compliance Certification] are overly duplicative and burdensome. We are required to report in an extraordinary amount of detail how we go about our business, and maintain multiple registers for small numbers. This level of reporting does not assist us to manage risk in fact it creates a risk to the organisation in terms of our capacity to deliver quality services.'

'The amount of compliance required for our very small organisation is significant.'

'Compliance requirements have increased significantly without adequate funding. Most services we deliver are not high risk however we do need to comply with a wide range of legal and other requirements because of the variety of services we provide.'

'Reporting, data collection and compliance arrangements vary greatly between different sections of DHHS but can include double submission of data, face to face meetings, and reporting both centrally and regionally. The compliance requirements appear to be increasing across the board with little or no relationship to the level of risk of services.'

Source: VAGO.

Despite respondents' concerns about excessive administrative and compliance obligations, many still believe that they do not receive sufficient performance information in return from DHHS. Only 55 per cent of surveyed organisations either agreed or strongly agreed that they receive all the information they need from DHHS to understand how well their organisation is performing against service agreement targets. Figure E10 in Appendix E further details survey responses to this question.

Duplication across administrative and compliance obligations

Responses to our survey of funded organisations indicated that there is duplication in DHHS's service agreement reporting and data requirements, as shown in Figure 2G below.

Figure 2G
Survey responses—Funded organisations: Duplication in service agreement reporting and data requirements

 

                                               Level of duplication reported
Type of duplication                            Significant   Moderate   Minimal   None   Not sure
Duplication within DHHS(a)                          8%          13%        31%      42%      6%
Duplication across DHHS and other parties(b)       12%          19%        36%      29%      4%

(a) Relates to survey question 4—Is there any duplication in the service agreement reporting and data your organisation is required to provide to DHHS?
(b) Relates to survey question 5—Is there any duplication in the service agreement reporting and data your organisation is required to provide to other parties?
Source: VAGO.

Fifty-two per cent of respondents reported some level of duplication of reporting and data that their organisation is required to provide to DHHS. This includes being required to provide the same data to different DHHS information systems or providing the same information to different DHHS divisions or areas. The funded organisations' open-text responses raised numerous concerns with duplicative reporting and data requirements set by DHHS. Figure 2H gives examples of these concerns.

Figure 2H
Survey responses—Funded organisations: Open-text examples highlighting duplicative data and reporting requirements within DHHS

'The same data is reported often, monthly, quarterly and annually, via a variety of systems. DHHS do a 2–3 hour desk review with us which is a "mini" accreditation, absolute waste of time.'

'We have to provide three identical sets of quality documents one to health and one to North West Human Services and one to Southern Human Services - we also meet with three different LEOs or PASAs [Program Advisers] during the year.'

'Historically there has been a lack of consistency in reporting requirements and no streamlining. There are multiple reporting requirements to different stakeholders and individual contract managers request KPI reporting or additional reports in the format they require. There appear to be some attempts to change this.'

'As we are a state funded organisation, we continuously have to provide the exact same information to different DHHS divisions/areas.'

'The SACC [Service Agreement Compliance Certification], FOPMF, Desk Review and Accreditation process all involve the same questions.'

'Reporting is programmatic based so clients that receive multiple services from agencies are required to provide their client data to each individual service provider multiple times which is then entered into separate program databases.'

'An example is in homelessness and family violence counselling services where we are required to submit MDS [minimum dataset] data quarterly to DHHS, but submit the same data monthly to the regional office.'

Source: VAGO.

Sixty-seven per cent of respondents reported some level of duplication between the reporting and data requirements of DHHS and other parties such as Commonwealth government departments and accreditation bodies. The funded organisations' open-text responses commonly raised concerns with these duplicative reporting and data requirements. Figure 2I gives examples of these concerns.

Figure 2I
Survey responses—Funded organisations: Open-text examples highlighting duplicative data and reporting requirements across DHHS and other state or Commonwealth departments

'The data we are required to submit to DHHS, the Adult, Community and Further Education Board and Local Government are often the same but required in different formats adding to the administrative burden. Then there are all the other Government departments. If one software program could be developed which would export data to everyone it would make our lives easier.'

'We have had instances where we have had to provide a great deal of resources and time to dealing with the same type of data and issues for different accreditation bodies and Government agencies to demonstrate compliance.'

'Homelessness data is reported to both the AIHW [Australian Institute of Health and Welfare] data collection and to DHHS separately.'

'Reporting to both DHHS and CHSP [Commonwealth Home Support Programme] requires some duplicate reporting and both conduct similar quality audits which have similar questions and aims.'

'As an example we provide significant information to our accreditation body and then have to submit at various times the exact same information and detail to DHHS and other Government bodies upon request.'

'Auditing requirements for DHHS and Dept Health [Commonwealth] funded program are duplicated in many ways. Separate audits means double information provided and additional cost.'

'Compliance for DHHS and Commonwealth is the same yet in a different format.'

'When one program is funded by two Victorian State departments, then the same information has to be reported to both depts.'

Source: VAGO.

The survey results regarding duplicative administrative and compliance obligations are consistent with our findings in Section 4.2 regarding the design of the FOPMF.

Capacity to meet administrative and compliance obligations

Figure 2J summarises responses to our survey question about the extent to which funded organisations can meet all service agreement administrative and compliance obligations. Only 52 per cent of surveyed funded organisations believe that they can consistently meet all requirements.

Figure 2J
Survey responses—Funded organisations
Question 8: To what extent is your organisation able to meet all the service agreement administrative and compliance requirements?


Source: VAGO.

The survey results showed that human services-focused organisations have less capacity to meet DHHS's service agreement administrative and compliance requirements than health services-focused organisations:

  • Staff in human services-focused organisations find it harder to meet all service agreement administrative and compliance requirements. Forty‑eight per cent of human services-focused organisations reported that they are always able to meet these requirements, compared to 64 per cent of health services-focused organisations.
  • Fifty-nine per cent of human services-focused organisations reported that they have staff resources dedicated to meeting these requirements, compared to 65 per cent of health services-focused organisations.
  • Staff in human services-focused organisations spend more time than staff in health services-focused organisations on meeting service agreement administrative and compliance requirements. In 36 per cent of surveyed human services-focused organisations, service delivery staff spend more than one-fifth of their time on meeting these requirements. In comparison, in 22 per cent of health services-focused organisations, service delivery staff spend more than one-fifth of their time on meeting these requirements.

Administrative and compliance requirements across multiple DHHS areas

Surveyed organisations that deliver services in multiple DHHS areas reported greater misalignment between their administrative and compliance requirements and their service risks, as well as greater duplication across their data and reporting obligations:

  • Nineteen per cent of surveyed organisations that deliver services in multiple DHHS areas either disagreed or strongly disagreed that their administrative and compliance requirements were appropriately matched with service risks, compared to 10 per cent of organisations that deliver services in one DHHS area.
  • Thirty-four per cent of surveyed organisations that deliver services in multiple DHHS areas reported either significant or moderate levels of duplication in the service agreement reporting and data required by DHHS, compared to 17 per cent of organisations that deliver services in one DHHS area.

Face-to-face interviews with selected funded organisations

Our discussions with four selected funded organisations provided mixed views on their administrative and compliance obligations.

One organisation reported that there is little duplication across its administrative and compliance obligations. However, it also reported that DHHS's introduction of CIMS in early 2018 had led to excessive investigative and reporting obligations compared to previous arrangements.

Another organisation reported that its administrative and compliance obligations were no longer excessive after the DHHS LEO started meeting with the organisation every two months, approximately two years ago. This reportedly led to more proactive engagement and streamlined performance monitoring processes.

The third organisation advised that its administrative and compliance obligations were resource-intensive to meet, but not excessive. It acknowledged that some degree of duplication exists across these obligations; however, it believed that this duplication helped to reinforce its understanding of performance.

The fourth organisation, which provides services across Victoria, viewed its administrative and compliance obligations as highly excessive. It attributed this to:

  • duplicative and fragmented monitoring performed by different LEOs across multiple regions
  • significant overlap between different performance monitoring functions. In particular, the organisation advised that various annual performance monitoring requirements under FOPMF were already covered in greater detail by a separate triennial review of each organisation by an independent accreditation body.

Consistent with the results from our survey of funded organisations, our face-to-face discussions with funded organisations indicated that larger funded organisations—which typically deliver more services across multiple DHHS areas—are more likely to have duplication in administrative and compliance obligations.
