Contract Management Capability in DHHS: Service Agreements

Tabled: 20 September 2018

4 Monitoring and managing performance

The aim of contract management is to ensure that all parties meet their obligations. All contracts—including service agreements—require active management throughout their life to ensure that the goods and services are delivered to the agreed standards and timeframes.

In line with the ANAO's better practice guide, monitoring and managing service agreement performance involves:

  • collecting sufficient but not excessive data from funded organisations on their service delivery and adherence to other contractual requirements
  • using collected data to assess whether funded organisations are meeting contractual requirements
  • taking appropriate action to address underperformance.

In this part, we assess whether:

  • FOPMF aligns with better practice contract management and supports staff to effectively manage service agreements
  • DHHS is consistently and comprehensively implementing the performance monitoring framework to drive service outcomes for clients.

4.1 Conclusion

DHHS's FOPMF is inefficient and ineffective and does not support the provision of system-wide assurance that clients receive safe, high-quality services that meet their needs. The framework does not enable staff to gain a clear insight into performance issues and whether contracted services are being delivered as intended. Its inability to match monitoring requirements to risk, complexity and funding levels—combined with fragmented and duplicative approaches to collecting performance information—leads to wasted monitoring efforts for both DHHS and funded organisations.

It is therefore not surprising that the uptake of prescribed performance monitoring tools among DHHS staff has been inconsistent, with many turning to local systems and tools to offset the framework's shortcomings. Such fragmentation prevents DHHS from providing informed advice to its senior management and ministers on performance issues that could put client safety and service delivery at risk.

4.2 Performance monitoring framework

DHHS's performance monitoring framework, FOPMF, provides the process for DHHS staff to assess funded organisations' compliance with service agreement requirements and to respond to identified risks and underperformance. It became operational on 1 January 2016. Figure 4A shows the components of FOPMF.

Figure 4A
Components of FOPMF

Figure 4A shows the components of FOPMF

Source: VAGO.

FOPMF comprises a series of pre-existing and new monitoring tools as shown in Figure 4B. FOPMF tools focus on monitoring funded organisations' governance, financial management and quality and safety of service delivery to clients.

Figure 4B
FOPMF tools

Existing tools (pre-FOPMF)

  • Desktop review: Annual assessment of organisations' performance undertaken by monitoring teams.
  • Service review: Conducted where DHHS identifies a high level of risk or issues of concern. Can be collaborative or investigative in nature.
  • Financial Accountability Requirements: Organisations submit their financial position to DHHS each year, which is then reviewed and assessed to confirm that the organisation is financially sustainable.

New tools

  • Service agreement monitoring checklists: Completed by service agreement advisers annually to assess organisations' compliance and performance. Includes the Organisational Compliance Checklist, Service Plan Checklist, Quality and Safety Checklist and specialist checklists. Some checklists were embedded within SAMS2 as of July 2017.
  • Live monitoring: SAMS2 feature that allows service agreement advisers to record real-time data on organisation performance and track the resolution of issues.
  • RAT: A tool DHHS staff use to determine the severity of performance issues. Can trigger remedial actions or a service review.
  • Service agreement compliance certification (SACC): Online attestation completed by organisations annually regarding financial performance, risk management, staff safety screening and privacy. Embedded within SAMS2.

Source: VAGO.

Overall, we found that:

  • FOPMF is essentially a one-size-fits-all framework with some minor exceptions where FOPMF requirements are either optional or not applicable. It cannot be scaled up or down to account for the varying complexities and sizes of funded organisations, or their risk profiles.
  • FOPMF tools such as the service agreement checklists and desktop reviews are heavily compliance driven to ensure funded organisations meet legislative and policy requirements. However, they provide little insight into service quality and performance issues.
  • FOPMF drives a fragmented and duplicative approach to collecting performance information.

The framework applies to organisations funded through a service agreement with DHHS. There are some exceptions where FOPMF requirements are either optional or not applicable:

  • FOPMF does not apply to organisations funded only through a short-form agreement (used for lower risk grant funding) or a corporate commercial contract. These agreements have their own specific reporting requirements and monitoring arrangements.
  • The service agreement monitoring checklists do not apply to community participation service activities such as neighbourhood houses.
  • The service agreement monitoring checklists are optional for:
    • services that are not direct client facing such as research and training
    • health services that only receive funding through DHHS's Budget Performance System.
  • The desktop review does not apply to hospitals, local governments, organisations funded under the National Disability Insurance Scheme, universities, technical and further education institutes (TAFE), schools and some specific community participation activities.
  • The risk attestation component of the SACC form does not apply to TAFEs and health services that already include an attestation against risk management in their annual reports.

FOPMF design

Development of FOPMF

The development of FOPMF was informed by issues identified through past external and internal reviews, including the Royal Commission into Institutional Responses to Child Sexual Abuse, a range of Victorian Ombudsman and VAGO reports, parliamentary inquiries and DHHS's own internal reviews. FOPMF's key monitoring areas align with the key risk areas these inquiries and reviews identified as needing more effective oversight—governance, financial management and quality and safety of service delivery to clients. FOPMF tools, such as the service agreement checklists, desktop reviews and the SACC form, cover these three key risk areas. Figure 4C lists the topics FOPMF tools collectively cover under each of these key risk areas.

Figure 4C
Topics covered by FOPMF tools

Governance

  • Board and management capabilities and responsibilities
  • Strategic and work planning
  • Risk management

Financial management

  • Financial viability and risks
  • Financial management

Quality and safety of service delivery

  • Performance measures and reporting
  • Staff safety screening (e.g. police, working with children, and referee checks)
  • Staff training
  • Incident management and reporting
  • Complaints management
  • Registration, accreditation and quality standards
  • Fire risk and emergency management
  • Service user safety and wellbeing
  • Records management
  • Privacy, data protection and data quality

Source: VAGO.

Limitations of FOPMF design

FOPMF has some limitations which hinder its effectiveness as a performance management framework.

Scope

The exemptions from different FOPMF requirements are not clearly defined. This makes it difficult to ensure that exemptions are applied as intended, especially when an organisation is funded for multiple services, only some of which may be exempt.

One-size-fits-all

FOPMF does not sufficiently account for the varying complexities and sizes of funded organisations. It is essentially a one-size-fits-all framework with some minor exceptions—those discussed earlier, or when the shorter quality and safety checklists are used for lower risk, lower funded organisations. FOPMF does not distinguish medium-risk organisations from high-risk organisations. The same level of performance monitoring applies to both, which creates unnecessary administrative burden for the funded organisation and DHHS staff alike. The absence of a risk-tiered approach to performance monitoring is discussed in Section 2.3.

Compliance driven

FOPMF tools such as the service agreement checklists and desktop reviews are heavily compliance driven. The questions in these tools mostly assess whether an organisation is meeting legislative and policy requirements, such as those under the Children, Youth and Families Act 2005 and the Public Records Act 1973. Performance is assessed as compliant, compliant in part or noncompliant. While this approach is important, these monitoring tools also need to enable deeper insights into service quality and performance issues.

Our examination of 38 service reviews showed that FOPMF tools do not cover some of the recurring issues identified in these reviews. Examples include:

  • staff supervision, management and support (including staff performance management)
  • staff rostering and ratios (including use of casual staff)
  • review of board and CEO performance
  • engagement with external stakeholders (including service users' family members and other services)
  • facilities management and upkeep.

Better practice principles in contract management

FOPMF could be strengthened by incorporating better practice principles for monitoring contractor performance, such as those in the ANAO's better practice guide and the VGPB's VPS Procurement Capability Framework. For example, if service agreement staff adopted a structured approach to managing relationships with contractors, including formal meetings at predetermined intervals, both parties would have a clear understanding of when contractor performance was to be formally reviewed. This would assist with managing the contractor relationship and would help DHHS staff to enforce the terms of the contract in a professional manner, based on evidence of contractual performance.

Staff views on FOPMF design

Responses to the DHHS staff survey we conducted on service agreement management also provide insight into some of the limitations of FOPMF design. The following is a summary of the key survey results related to FOPMF design, which are detailed in Appendix D.

Only 42 per cent of respondents agreed or strongly agreed that FOPMF helps them monitor and manage the performance of funded organisations effectively. According to survey respondents, limited access to information is a central issue with FOPMF design:

  • Only 66 per cent of respondents agreed or strongly agreed that they can access information that allows them to see how a funded organisation is performing against its contracted KPIs.
  • Only 46 per cent of respondents agreed or strongly agreed that they can access information that allows them to see how a funded organisation is performing in a different DHHS division or area.
  • Only 43 per cent of respondents agreed or strongly agreed that they can access information that allows them to compare how a funded organisation is performing against other funded organisations that deliver similar services.
  • Only 12 per cent of respondents agreed or strongly agreed that they can access information that allows them to compare how a funded organisation is performing against a better practice benchmark.
  • Only 35 per cent of respondents agreed or strongly agreed that they have access to all the information they need to effectively monitor and manage the performance of funded organisations.

Respondents identified two key barriers to accessing the necessary information: the need to gather performance information from multiple data systems for different funded activities, and the difficulty of navigating SAMS2.

Collecting performance data

Collecting data on performance against targets

DHHS uses its SDT tool to capture funded organisations' performance data. DHHS introduced SDT in 2014 to enable more frequent performance monitoring of high-risk activities. SDT requires in-scope funded organisations to self-report monthly results against performance targets by entering data directly into the FAC. However, SDT covers only approximately a third of human services activities; the remaining human services activities and all health activities are excluded.

For activities not subject to SDT, DHHS uses data it collects from funded organisations as part of the data collection requirements to monitor service provision. This monitoring, however, is at the state and local level, and is intended only for strategic and operational reporting purposes to senior management and to ministers. It is not designed to monitor performance at the funded organisational level. This data collection also involves multiple data management systems and the frequency of data collection varies depending on the activity. This makes it challenging to collect and monitor performance data specific to a funded organisation that is not subject to SDT.

These limitations also mean that DHHS does not have a complete picture of funded organisations' performance against targets.

Collecting data on performance against DHHS objectives

DHHS collects output-focused performance data which does not show how well a funded organisation is performing against relevant DHHS objectives and outcomes—see 'Links to the Department of Health and Human Services strategic plan and the Victorian public health and wellbeing outcomes framework' in Section 2.2.

Issues with performance information collection

We found multiple areas of concern with how performance information is collected.

Multiple systems

FOPMF requires service agreement advisers to refer to other existing monitoring systems—such as those relating to incident reporting information, complaints handling and Financial Accountability Requirements—and to transfer the relevant information when completing FOPMF tools, including service agreement checklists and desktop reviews. The disparate nature of these systems makes completing FOPMF tools administratively difficult and time consuming. It also increases the risk of human error when gathering the relevant performance information.

Unclear information collection frequency

FOPMF requires service agreement advisers to complete service agreement monitoring checklists in SAMS2 annually. Some checklist questions require performance information to be collected once a year while other questions require more regular information collection to enable ongoing monitoring. The nature of a checklist question and the availability of performance information determines the frequency of collection.

DHHS provides some guidance on performance information collection frequency. However, the guidelines and checklist templates do not clearly identify which checklist questions require annual information and which require more frequent information collection. Service agreement advisers ultimately determine how frequently they collect, record and monitor performance information for each checklist question in SAMS2, which could be too infrequent to enable meaningful performance monitoring.

The infrequent collection of performance information and use of service agreement monitoring checklists was illustrated in the checklists for the 12 funded organisations we examined in this audit. As of 3 April 2018—just three months before the end of the 2017–18 reporting period—SAMS2 showed:

  • organisational compliance checklists were blank for eight of the 12 funded organisations, even though, according to DHHS guidelines, there is at least one question that would require regular information collection and monitoring
  • service plan checklists were blank for six of the 12 selected funded organisations, even though, according to DHHS guidelines, there are at least six questions that would require regular information collection and monitoring.

The lack of clear guidance on which checklist questions require annual information collection and which require ongoing collection is further highlighted by the Outer Eastern Melbourne Area, which has created a monitoring questions map to ensure staff apply checklist questions at a consistent frequency. The map colour codes checklist questions to distinguish those that require annual monitoring from those that require ongoing monitoring.
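To illustrate the idea, a minimal sketch of such a frequency map follows. The question names and the structure are hypothetical; the actual map is a colour-coded DHHS working document, not software.

```python
from enum import Enum

class Frequency(Enum):
    ANNUAL = "annual"    # collected and assessed once per reporting year
    ONGOING = "ongoing"  # collected and monitored throughout the year

# Hypothetical checklist questions mapped to a collection frequency,
# mirroring the colour coding in the Outer Eastern Melbourne Area's map.
QUESTION_FREQUENCY = {
    "Staff safety screening completed": Frequency.ONGOING,
    "Client incident reports reviewed": Frequency.ONGOING,
    "Annual risk attestation received": Frequency.ANNUAL,
}

def questions_for(frequency: Frequency) -> list[str]:
    """Return the checklist questions collected at the given frequency."""
    return [q for q, f in QUESTION_FREQUENCY.items() if f is frequency]

print(questions_for(Frequency.ONGOING))
```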

Overlapping information collection

DHHS requires both the service agreement monitoring checklists and the desktop review to be completed on an annual basis. Both comprise questions focused on identifying risks in relation to governance, financial management and service delivery, although the layout and language used are not identical. Figure 4D maps the topics across FOPMF tools.

Figure 4D
Mapping of topics covered by FOPMF tools

Figure 4D maps the governance, financial management and service delivery topics listed in Figure 4C against the FOPMF monitoring tools(a) that cover them: the Organisational Compliance and Service Plan checklists, the desktop review and the SACC.
(a) The quality and safety checklist has not been included as it is used in place of the Organisation Compliance Checklist and the Service Plan Checklist for lower risk, lower funded organisations. Specialist checklists have not been included as they only apply to specific services (e.g. disability; residential care for children and young people).
Source: VAGO.

DHHS advises that the uptake of the service agreement monitoring checklists has not been consistent across the state since FOPMF became operational in 2016. Consequently, DHHS will only decommission its desktop reviews—a legacy tool from the previous monitoring framework that serves a similar function—once there is more consistent uptake of monitoring checklists. While the logic underpinning this decision is understandable, the continued availability of a familiar, pre-existing monitoring tool with a similar intent will continue to cause duplication and discourage service agreement advisers from using the checklists more consistently. Refer to Section 2.3 for funded organisations' survey results regarding duplication across administrative and compliance obligations.

Frequency of engagement with organisations

According to the ANAO's better practice guide, it is better practice to adopt a structured approach to managing the relationship with a contractor that consists of both informal interactions and formal meetings at predetermined intervals. Having predetermined meetings scheduled ensures both parties understand when contractor performance will be reviewed formally.

Under FOPMF, there is an expectation to monitor and engage with funded organisations throughout the year. However, FOPMF guidelines do not specify minimum requirements for formal meetings at predetermined intervals with funded organisations. There are no mechanisms under FOPMF to ensure service agreement advisers engage with and monitor funded organisations at a frequency that would be appropriate to their complexity, level of funding and risk. It is up to service agreement advisers to determine how often they engage with and monitor funded organisations.

The need for formal meetings at predetermined intervals with funded organisations is highlighted by the engagement model that the Outer Eastern Melbourne Area has developed to address this gap in FOPMF. The engagement model recognises the diversity of funded organisations and the need to tailor engagement frequencies accordingly. It categorises funded organisations into one of four tiers that reflect the complexity and level of funding of an organisation—Tier 1 for high funding, high complexity organisations down to Tier 4 for individual support package providers (low funding, low complexity). It then specifies the minimum frequency for engagement and performance monitoring activities required at each tier (such as quarterly, six-monthly and annually). The engagement model is supported by a monitoring and engagement schedule that identifies the organisations to be engaged with each month for the financial year.
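The core of such a model is a simple tier-to-frequency lookup. The sketch below illustrates this; the four tiers are as described above, but the specific frequency assigned to each tier is an assumption, not the Outer Eastern Melbourne Area's actual settings.

```python
# Minimum engagement frequency per tier. Tier descriptions follow the
# report; the frequency assigned to each tier here is an assumption.
MIN_ENGAGEMENT = {
    1: "quarterly",    # high funding, high complexity
    2: "quarterly",    # assumed
    3: "six-monthly",  # assumed
    4: "annually",     # individual support package providers
}

def minimum_engagement(tier: int) -> str:
    """Return the minimum engagement frequency for an organisation's tier."""
    if tier not in MIN_ENGAGEMENT:
        raise ValueError(f"unknown tier: {tier}")
    return MIN_ENGAGEMENT[tier]

print(minimum_engagement(1))  # quarterly
```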

Assessing and managing performance

Assessing performance issues and risks

FOPMF requires service agreement advisers to use the RAT to drive their assessment of performance. The RAT aims to give service agreement advisers a consistent approach to assessing the severity of an issue and to determining appropriate action where required.

The RAT is intended to be used in conjunction with FOPMF tools and associated guidelines. The risk ratings in the RAT are:

  • 0—no issue
  • 1—minor severity
  • 2—moderate severity
  • 3—major severity
  • 4—critical severity.

The RAT provides a consequence description against each rating to help staff determine the rating for the issue. It also provides some examples of actions service agreement advisers can take in response to each rating.
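In effect, the RAT is a lookup from a severity rating to a consequence description and candidate actions. A minimal sketch follows; the five ratings come from the report, but the example action attached to each rating is an illustrative assumption, not DHHS's actual guidance.

```python
# RAT severity ratings as listed above; the example action paired with
# each rating is an illustrative assumption, not DHHS guidance.
RAT_RATINGS = {
    0: ("no issue", "no action required"),
    1: ("minor severity", "note the issue and monitor at the next scheduled contact"),
    2: ("moderate severity", "agree remedial actions with the funded organisation"),
    3: ("major severity", "escalate and consider a service review"),
    4: ("critical severity", "escalate immediately to senior management"),
}

def respond(rating: int) -> str:
    """Return the severity label and illustrative response for a RAT rating."""
    severity, action = RAT_RATINGS[rating]
    return f"{severity}: {action}"

print(respond(2))
```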

While the consequence descriptions in the RAT and Live Monitoring Business Rules cover the same content with slight variations to wording, they are not as clear and succinct as the rating descriptions provided in SAMS2 for recording live monitoring issues. An example of the difference in rating descriptions is shown in Figure 4E.

Figure 4E
Comparison of descriptions for 'moderate severity' rating

RAT

  • Compromised safety, rights, wellbeing of service user/employee for example inadequate response to a complaint or lack of evidence about staff training
  • Fire risk certificate not provided
  • Some performance failures against service agreement targets. Failure in service quality for example lack of evidence of complaint handling process
  • Modest disruption regarding provisions of services
  • Organisation has not provided SACC form after repeated reminders
  • There is a lack of evidence about governance roles and responsibilities, expertise of board/committee and planning, lack of board/committee training over three years
  • Financial irregularities or significant decrease in Commonwealth funding
  • Adverse public reports about service quality

Live Monitoring Business Rules

  • Compromised safety, rights, wellbeing of service user/employee
  • Some performance failures against service agreement targets
  • Failure in service quality
  • Modest disruption regarding provisions of services
  • Governance/organisation performance risks for example risks to long-term viability
  • Financial irregularities
  • Complaints raised in media

SAMS2

  • Issue indicates a trend or concern that without action may escalate and put services at risk but will have no immediate impact at this point

Source: VAGO based on information from DHHS.

This inconsistency in rating descriptions can cause confusion for staff who are trying to use the RAT and live monitoring. It can also lead to an inconsistent approach to assessing the severity of an issue and developing remedial actions.

Defining actions in response to underperformance

Depending on the severity of the issues (RAT rating), FOPMF provides three types of responses to performance issues as shown in Figure 4F.

Figure 4F
Responses to underperformance

Remedial actions

Specific actions developed in discussion with DHHS and the funded organisation to address identified performance issues.

Service review

  • Collaborative review—undertaken in collaboration with the funded organisation and may involve an independent consultant to assess a funded organisation's ongoing viability and operating model with the aim of producing an action plan to address issues.
  • Investigative review—conducted by an independent consultant and managed by DHHS. Investigative reviews are undertaken when there is evidence or allegations made of a significant breach of the service agreement or service failure which will impact service user safety and wellbeing, or the ongoing provision of quality and sustainable services.

Enact clauses under service agreement or legislation

Occurs when there is sufficient evidence that the funded organisation has failed to address the requirements of the service agreement, impacting on service user safety and wellbeing, the ongoing provision of quality and sustainable services, or the ongoing viability of the service and funded organisation. An assessment of whether to enact a clause involves DHHS operational, management and executive staff and review of evidence. Enacting clauses may lead to service suspension, suspension of funding, cessation and termination of the service agreement.

Source: VAGO based on information from DHHS.

The FOPMF Guidelines for Department Monitoring Staff provide some guidance on which response action to take and the associated timeframes for each severity level. However, this guidance still relies on the judgement of service agreement advisers to determine a suitable timeframe as shown in Appendix F. This can lead to service agreement advisers setting different timeframes to resolve similar performance issues, especially when SAMS2 does not automatically set due dates for live monitoring actions. Refer to Section 4.3 regarding live monitoring issues and associated remedial actions recorded in SAMS2.

Communicating performance

FOPMF makes performance information available to internal and external stakeholders in the following ways:

  • completed service agreement monitoring checklists and live monitoring issues are available to DHHS staff through SAMS2
  • completed desktop reviews are available to DHHS staff and the relevant funded organisation through FAC.

However, it is up to the relevant individuals within DHHS to access this information. FOPMF does not provide clear instructions for communicating or using performance information. Specifically, DHHS's internal FOPMF guidelines provide little information on:

  • the intended audience/s for each completed FOPMF tool
  • the business rules and processes for distributing completed FOPMF tools
  • how the results feed into other forms of performance monitoring and reporting, such as the Area Performance, Assurance and Compliance and the Divisional Performance, Assurance and Compliance meetings.

Performance information, such as client incident reporting and performance against funded targets, is reported at the area and divisional level to senior management through forums such as Divisional Performance, Assurance and Compliance. DHHS collects this performance information from various data management systems and some of the completed FOPMF tools.

4.3 Applying the performance monitoring framework

The restructure review commissioned by DHHS found that FOPMF—along with other processes for monitoring funded organisations—is applied inconsistently, and that DHHS needs to better communicate and oversee the implementation of this mandated policy.

Our overall findings were consistent with the restructure review:

  • While there was a high uptake of desktop reviews and monitoring checklists among staff, use of the RAT for performance monitoring was particularly low.
  • The majority of staff who use the live monitoring tool do so to raise performance issues and organisational updates. However, the value of live monitoring data is limited by a high proportion of incomplete entries and poor staff awareness of the RAT.
  • Staff offered a variety of reasons for not using the FOPMF tools made available to them, including a lack of awareness and training, as well as the tools not suiting their needs. The frequent use of local systems and tools among divisional staff presents a key challenge to improving the uptake of FOPMF tools.

Use of FOPMF tools

Eighty-one per cent of respondents to our survey of DHHS service agreement staff reported that they use FOPMF in some capacity. The remaining respondents—including 16 per cent who reported not using FOPMF and 3 per cent who were not sure—are likely to include some staff who manage service agreements that are exempt from FOPMF requirements.

We asked the surveyed staff who reported using FOPMF in some capacity which FOPMF tools they use to monitor funded organisation performance. Figure 4G summarises the responses.

Figure 4G
Survey responses—DHHS staff
Question 18: Which FOPMF tools do you use?

Figure 4G displays survey responses—DHHS staff Question 18: Which FOPMF tools do you use?

Note: This survey question applies only to respondents who reported using FOPMF.
Source: VAGO.

The particularly low uptake of the RAT is problematic, as it represents DHHS's system-wide tool for ensuring that staff assess the severity of performance issues consistently and accurately. It also undermines the reliability of performance issues entered into live monitoring, which is designed to be used alongside the RAT.

The use of FOPMF tools varied across both the health and human services portfolios and DHHS's divisions:

  • In the health services portfolio, 62 per cent of FOPMF users reported that they use the RAT, compared to only 48 per cent of human services FOPMF users.
  • West Division FOPMF users reported the lowest uptake of live monitoring out of all DHHS's four divisions. Seventy-two per cent of these staff reported using this tool, while between 83 and 87 per cent of FOPMF users in the other three divisions reported using it.
  • There was a noticeable difference in the use of the RAT between the East and South Divisions (65 per cent and 61 per cent respectively) and the North and West Divisions (47 per cent and 46 per cent respectively).
  • Central office FOPMF users reported particularly low usage of the RAT (29 per cent) and of live monitoring (50 per cent).

DHHS can run reports showing the completion status of its monitoring checklists, desktop reviews and live monitoring issues. However, these reports are limited to the divisional level and cannot report in further detail at the area level. We sought but did not find any evidence showing the extent to which DHHS had used these reports to increase the uptake of FOPMF tools among staff.

Barriers to using FOPMF tools

The restructure review commissioned by DHHS identified a perception among staff that insufficient guidance, processes and tools have resulted in inconsistent approaches to addressing noncompliance by funded organisations. The review also found that areas have developed their own tools to manage the scheduling of monitoring activities.

As part of our online survey, we asked DHHS staff to provide reasons for not using the FOPMF tools available to them. Common explanations included:

  • staff not being aware of the RAT
  • insufficient training and support on how to use FOPMF tools, particularly live monitoring
  • the tools not being suitable to the types of service agreements being managed
  • other competing priorities.

Through our survey, we also asked DHHS staff whether they use any local systems or tools, instead of or in addition to FOPMF, to monitor the performance of funded organisations. As shown in Figure 4H, 60 per cent of surveyed staff reported that they do. Common examples included performance monitoring templates, spreadsheets and other monitoring tools that sit outside FOPMF.

Figure 4H
Survey responses—DHHS staff
Question 21: Do you use any local systems or tools instead of, or in addition to, FOPMF to monitor the performance of funded organisations?

Figure 4H displays survey responses—DHHS staff Question 21: Do you use any local systems or tools instead of, or in addition to, FOPMF to monitor the performance of funded organisations?

Source: VAGO.

Surveyed staff from DHHS's West Division consistently cited the division's own performance escalation framework as a key local tool for escalating performance issues in a more sophisticated way than is possible using FOPMF tools alone. The division's reliance on this local tool is likely why its reported use of both live monitoring and the RAT was the lowest of all four divisions.

Under the performance escalation framework, West Division staff score funded organisations according to:

  • the variation between service delivery targets and results
  • the frequency of failure to meet service delivery targets
  • the number of 'minor' or 'moderate' issues entered into live monitoring.

The West Division's monitoring effort is then scaled to each organisation's score. The score also informs the seniority of DHHS staff allocated to monitor each organisation. This addresses issues we found with FOPMF's design, including its inability to scale monitoring requirements based on risk.
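A sketch of how such a composite score could be computed is shown below. The three inputs are those the West Division uses; the weightings, caps and thresholds are assumptions for illustration only, not the division's actual rules.

```python
def escalation_score(target_variation_pct: float,
                     missed_target_count: int,
                     minor_or_moderate_issues: int) -> int:
    """Composite score from the three West Division inputs.
    The weighting and caps here are assumptions, not the division's rules."""
    score = 0
    score += min(int(target_variation_pct // 10), 5)  # variation from targets
    score += min(missed_target_count, 5)              # frequency of missed targets
    score += min(minor_or_moderate_issues, 5)         # live monitoring issues
    return score

def monitoring_intensity(score: int) -> str:
    """Scale monitoring effort and staff seniority to the score (thresholds assumed)."""
    if score >= 10:
        return "intensive monitoring, senior staff allocated"
    if score >= 5:
        return "enhanced monitoring"
    return "routine monitoring"

print(monitoring_intensity(escalation_score(25.0, 3, 2)))  # enhanced monitoring
```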

Our June 2018 Community Health Program performance audit report contains further information on the design and application of the West Division's performance escalation framework. This audit identified similar issues with the poor uptake of FOPMF and the reliance on local tools.

Identifying and addressing performance issues through live monitoring

We analysed SAMS2 data to understand how DHHS staff had used the live monitoring FOPMF tool to raise performance issues since it was introduced in mid-2015.

Live monitoring issues recorded in SAMS2

As at 17 April 2018, 1 384 live monitoring issues had been recorded in the SAMS2 information system across 616 funded organisations. This covers 32 per cent of the 1 927 funded organisations recorded in SAMS2.

For funded organisations subject to risk-tiering assessment, we found that eight organisations with a high risk score (above 20) had no issues recorded in live monitoring—see Figure 4I.

Figure 4I
Number of live monitoring issues versus risk score

Figure 4I shows the number of live monitoring issues versus risk score

Source: VAGO.

Figure 4J shows that the majority of organisations with a live monitoring issue recorded had fewer than 10 entries created.

Figure 4J
Number of live monitoring issues created per funded organisation

Figure 4J shows the number of live monitoring issues created per funded organisation

Source: VAGO.

Figure 4K breaks down the distribution of live monitoring issues created across the four severity ratings—minor, moderate, major and critical. In particular, 69 per cent of live monitoring issues created were assigned a moderate severity rating, while less than 1 per cent of issues were assigned a critical rating.

Figure 4K
Breakdown of live monitoring issues by severity rating

Figure 4K displays a breakdown of live monitoring issues by severity rating

Source: VAGO.

Live monitoring issues without planned remedial actions

Not all live monitoring issues have led to planned remedial actions being recorded in SAMS2. Figure 4L shows that only 28 per cent of live monitoring issues with a moderate severity rating had a planned remedial action recorded. In contrast, approximately three-quarters of live monitoring issues across the remaining three severity rating categories had a planned remedial action recorded. However, these issues only made up 31 per cent of all live monitoring issues.

Figure 4L
Percentage of live monitoring issues with actions created by severity rating

Figure 4L shows the percentage of live monitoring issues with actions created by severity rating

Source: VAGO.

We examined the two live monitoring issues recorded with a critical severity rating that did not have a planned remedial action recorded in SAMS2. These were:

  • an issue recorded in September 2017 raising 30 noncompliances with section 120 of the Children, Youth and Families Act 2005, which requires that an out-of-home care service ask DHHS whether a person is disqualified before approving, employing or engaging them as a foster carer
  • an issue recorded in January 2018 raising a series of concerns with the funded organisation's call system, staffing, roster management, training and supervision.

Although remedial actions may have been documented elsewhere, such as in action plans or meeting minutes, not having them recorded centrally in live monitoring limits DHHS's ability to monitor and track actions. It also limits assurance that performance issues are addressed satisfactorily.

Overdue remedial actions

We also analysed the prevalence of live monitoring issues recorded in SAMS2 with planned remedial actions that were overdue as at 17 April 2018. Figure 4M shows that 127 planned actions were overdue, by an average of 264 days.

Figure 4M
Analysis of overdue remedial actions in response to live monitoring issues

Measure                          Minor   Moderate   Major   Critical   All severity ratings
No. of overdue actions              38         74      15          0                    127
Average length overdue (days)      249        270     279        n/a                    264
Median length overdue (days)       260        199     229        n/a                    200
Maximum length overdue (days)      661        671     598        n/a                    671

Source: VAGO.
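The summary statistics in Figure 4M can be reproduced from issue-level records. The sketch below shows the calculation over hypothetical overdue-day counts standing in for the SAMS2 extract behind the figure.

```python
from statistics import mean, median

# Hypothetical overdue-day counts per severity rating, standing in for
# the SAMS2 extract behind Figure 4M.
overdue_days = {
    "minor": [12, 260, 661],
    "moderate": [45, 199, 671],
    "major": [88, 229, 598],
    "critical": [],
}

for severity, days in overdue_days.items():
    if not days:
        print(f"{severity}: no overdue actions")
        continue
    print(f"{severity}: n={len(days)}, average={mean(days):.0f}, "
          f"median={median(days):.0f}, max={max(days)}")
```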

There were no overdue actions in relation to live monitoring issues with a critical severity rating. Of the 15 overdue remedial actions relating to live monitoring issues with a major severity rating, five actions had no record in SAMS2 of any response taken since the issue was created. These actions all related to instances of noncompliance in some form, such as verifying a funded organisation's compliance with registration requirements following an external audit.

Recording performance issues and tracking remedial actions is vital to managing service agreements, especially when issues have the potential to impact client safety and wellbeing. Until all service agreement staff use live monitoring regularly as intended, DHHS will continue to have limited oversight and assurance that performance issues are being addressed effectively and efficiently.

Using performance information to inform funding decisions

DHHS's Service Agreement business rules and guidelines instruct staff to consider the following when selecting an organisation to provide services:

  • financial viability
  • governance arrangements
  • facilities, staffing, resources and expertise
  • history of known performance and compliance issues (if it receives existing funding).

Despite this guidance, we could not find records in SAMS2 that clearly show existing performance information—generated through FOPMF or otherwise—being used to inform future service agreement funding decisions. According to DHHS, the evidence underpinning service agreement funding decisions is stored in the 'attachments' tab for each agreement in SAMS2. However, the information stored in this location did not indicate that funding decisions had considered past performance where appropriate.
