Child and Youth Mental Health

Tabled: 5 June 2019

3 Monitoring performance, quality and outcomes

Government departments have a core responsibility to monitor the performance of their funded agencies, to understand what services or other public value is being delivered with the funding they provide, and ideally also to understand what outcomes are achieved.

The processes and information that departments use to make decisions about funding and the performance of funded agencies—often described as performance and accountability frameworks—should be clear and transparent to all relevant stakeholders. In most cases they should also be transparent to the public and the recipients of services.

3.1 Conclusion

DHHS does not have a clear method for monitoring the performance of the CYMHS system within broader health service and mental health system performance monitoring and oversight. Without this, DHHS cannot fulfil its role to advise government on the system's performance, its resourcing needs, or the challenges patients and health services face in engaging other necessary social services. For example, the current performance monitoring system has not highlighted the significant number of young people regularly 'stuck' in inpatient mental health services. Legislated mechanisms to protect the most vulnerable Victorians are also impeded by bureaucratic hierarchies and silos within DHHS.

3.2 Monitoring performance

DHHS has no effective governance arrangements to provide oversight of CYMHS; instead, CYMHS monitoring is embedded within broader performance monitoring and system oversight, as shown in Figure 3A. Legislation mandates some of the Chief Psychiatrist's and Secretary's monitoring, but other components of the monitoring would benefit from a more systematic approach.

Figure 3A
DHHS performance monitoring agreements for CYMHS, their oversight bodies and mechanisms, the KPIs and other items

Figure 3A shows DHHS performance monitoring agreements for CYMHS, their oversight bodies and mechanisms, the KPIs and other items

Key to ‘Monitored items’ section: dark grey = population-level health outcomes; blue = items that consider only inpatient services; dark purple = items that monitor either community programs or CYMHS as a whole.
Note: Regarding reporting of deaths, Safer Care Victoria and the Chief Psychiatrist each require reporting of certain types of deaths only. Different criteria are legislated/mandated for each reporting requirement.
Source: VAGO, with information provided by DHHS.

DHHS does not have mechanisms to identify and address consistent issues in CYMHS in order to proactively prevent safety breaches or improve the quality of CYMHS.

When the performance monitoring that occurs around CYMHS is presented together, as in Figure 3A, it is evident that a lack of overarching governance and coordinated monitoring of CYMHS creates an unnecessary reporting burden on health services.

There are 29 components of CYMHS service delivery or outcomes that are monitored, by mandate or request, through seven different systems managed by DHHS and overseen by four authorities with different roles and responsibilities in monitoring CYMHS performance. There are also six public reports, produced by six different agencies or groups, covering different aspects of CYMHS performance.

This section examines the effectiveness and appropriateness of each of the seven different systems for performance monitoring of CYMHS, which are:

  • the Victorian Health Services Performance Monitoring Framework, published by DHHS's Health Services Performance and Regulation Branch, which also convenes the primary performance monitoring discussions between DHHS and health service CEOs each quarter
  • 15 KPIs on CYMHS service delivery and outcomes, collected and publicly reported by the Mental Health Branch
  • legislated responsibilities of the Chief Psychiatrist to monitor five components of service delivery and the quality of services more broadly
  • the Quality and Safety Bill 2017, under which Safer Care Victoria (SCV) was established as an administrative office of DHHS with a range of functions, including monitoring sentinel events
  • public reporting on five components of CYMHS service delivery by the Victorian Agency for Health Information (VAHI), another administrative office of DHHS
  • the Mental Health Services Annual Report, which the Secretary of DHHS has a legislated responsibility to produce
  • the Victorian Government's Health and Wellbeing Outcomes Framework, which monitors three components of Victorian children and young people's mental health.

Sentinel events are defined by SCV as 'unexpected events that result in death or serious harm to a patient while in the care of a health service'.

Three items are reported and monitored by multiple mechanisms and agencies, as follows:

  • Some deaths must be reported as a sentinel event to SCV and the Chief Psychiatrist also requires notification of 'reportable deaths'. The two mechanisms use different definitions of what needs to be reported and have different processes for how they are reported and analysed.
  • The rate of seclusion events is monitored by the Chief Psychiatrist; through the Monitor report, which makes it a topic of the quarterly CEO health service performance meetings; and through public reporting on both the VAHI website and the Mental Health Branch's section of the DHHS website.
  • The rate of follow-up after discharge is similarly reported in Monitor and therefore CEO health service performance meetings as well as public reporting on VAHI and DHHS websites.

The latter two items relate only to inpatients and DHHS could not provide any rationale for why the most actively monitored items neglect the largest component of CYMHS clients, those receiving community programs.

In addition to the performance monitoring arrangements described above, both the Public Advocate and the Mental Health Complaints Commissioner, which was established as an independent authority under the Act in 2014, undertake investigations into mental health services. These entities require health services to provide information about incidents, which can duplicate what they provide to DHHS.

DHHS could not provide evidence that it had taken any action to understand the complexity and duplication that exists within the performance monitoring arrangements. It has taken no action to streamline or resolve the conflicts and confusion caused by the overlapping reporting requirements for health services or its own duplicated monitoring arrangements, except for discussions around sentinel event reporting, which resulted in a joint sentinel event review process between the Office of the Chief Psychiatrist (OCP) and SCV.

During the audit, DHHS commenced a project to develop a new performance and accountability framework for mental health, with an initial focus on adult services. DHHS should ensure the inclusion of CYMHS in this work and also consider health services' reporting burden under current and future accountability arrangements.

DHHS has advised that 'quality and safety reporting and monitoring requirements will continue to be managed by the Office of the Chief Psychiatrist'. DHHS needs to ensure that there is clarity and transparency about the role of the Chief Psychiatrist in performance monitoring, and the accountability framework should articulate the Chief Psychiatrist's role.

There is no single source of information for CYMHS about either reporting requirements or DHHS's performance monitoring activities. The DHHS Policy and Funding Guidelines 2018–19 contain only an incomplete description of reporting and monitoring for CYMHS.

During the audit, the Director of Mental Health proposed some additions to the 2019–20 version of the DHHS Policy and Funding Guidelines to detail all of health services' mandatory reporting requirements to the Chief Psychiatrist.

Even if the proposed changes to the DHHS Policy and Funding Guidelines 2019–20 proceed, there remains no single source of information on the reporting requirements and performance monitoring arrangements for CYMHS. This reflects a siloed approach within DHHS and within the Mental Health Branch itself (between its performance monitoring area and the OCP), which inhibits DHHS from effective and efficient performance monitoring. It also inhibits DHHS's capability to provide accurate advice to government on CYMHS and the needs of Victorian children and young people with serious mental health problems. However, during the audit, DHHS finalised an Operational Model that outlines a more integrated approach to performance monitoring and details the roles of the performance monitoring area and the OCP.

The Victorian Health Services Performance Monitoring Framework

The Victorian Health Services Performance Monitoring Framework describes DHHS's roles, responsibilities and processes for monitoring the performance of health services across all areas of quality of care, governance, access to care, and financial management. It specifies two KPIs for CYMHS that health services' CEOs are held accountable for by DHHS, through its quarterly performance meetings.

Figure 3B
Victorian Health Services Performance Monitoring Framework KPIs for child and youth mental health

KPI: Rate of seclusion events relating to a child and adolescent acute mental health admission
Target: 15 seclusions per 1 000 bed days

KPI: Percentage of child and adolescent mental health inpatients with post-discharge follow-up within seven days
Target: 80 per cent (75 per cent prior to 1 July 2018)

Source: VAGO, from the Victorian Health Services Performance Monitoring Framework 2018–19.

DHHS could not provide a rationale for why it chose these two KPIs as the measures of CYMHS performance at a high level. Each KPI addresses areas that are important to monitor, but we found critical weaknesses in how the follow-up rate is calculated and how the seclusion rate is reported. It is a significant concern that these KPIs apply to inpatients only. The Victorian Health Services Performance Monitoring Framework therefore does not monitor the largest cohort of CYMHS clients, those in community or outpatient programs, which represent 74.4 per cent of CYMHS funding and a greater, though undetermined, proportion of people receiving CYMHS services.
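Both KPIs in Figure 3B are simple rates, and the weaknesses described in this report concern how their inputs are counted and attributed rather than the arithmetic itself. As a minimal sketch of that arithmetic (the function names and worked figures are illustrative assumptions, not DHHS data):

```python
# Illustrative calculation of the two Figure 3B KPIs.
# All names and figures below are assumptions for the example, not DHHS data.

def seclusion_rate(seclusion_events: int, occupied_bed_days: int) -> float:
    """Seclusion events per 1,000 occupied bed days."""
    return seclusion_events / occupied_bed_days * 1000

def follow_up_percentage(followed_up_within_7_days: int, discharges: int) -> float:
    """Percentage of discharged inpatients followed up within seven days."""
    return followed_up_within_7_days / discharges * 100

# A hypothetical quarter for one ward: 12 seclusion events over 950 occupied
# bed days, and 41 of 48 discharged inpatients followed up within seven days.
print(round(seclusion_rate(12, 950), 1))       # 12.6 (under the target of 15)
print(round(follow_up_percentage(41, 48), 1))  # 85.4 (above the 80 per cent target)
```

The calculation is trivial; what the report questions is the validity of the numerators and denominators, such as inconsistent recording of what counts as a seclusion event.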

Monitoring seclusion

If a health service exceeds DHHS's target for seclusions in CYMHS, DHHS brings this to the attention of the health service CEO and Board through the Monitor report and the quarterly health service performance meetings. CYMHS leaders are subsequently asked to explain the breach of the target to their hospital executive and Board. The explanation demands significant time and resources because of the detail required, which includes the background of the young people involved and the complex and rapid clinical decisions that led to the seclusions.

Quality improvement activities must accompany monitoring activities, otherwise a punitive culture can develop, which can worsen the practice the KPI is seeking to improve.

International literature shows that effectively reducing seclusion requires a comprehensive set of actions including training, debriefing and leadership on organisational change, alongside monitoring data. DHHS should consider reporting a health service's engagement in evidence-based activities to improve seclusion practices alongside the KPI in the Monitor report.

Figure 3C
Definition of seclusion in mental health care

Seclusion is the confinement of a patient at any time of the day or night alone in a room or area from which free exit is prevented.

The purpose, duration, structure of the area and awareness of the patient are not relevant in determining what constitutes seclusion. Seclusion also applies if the patient agrees to or requests confinement and cannot leave of their own accord.

While seclusion can be used to provide safety and containment at times when this is considered necessary to protect patients, staff and others, it can also be a source of distress; not only for the patient but also for support persons, representatives, other patients, staff and visitors. Wherever possible, alternative, less-restrictive ways of managing a patient's behaviour should be used, and hence the use of seclusion minimised.

Source: Australian Institute of Health and Welfare, Mental Health Services in Australia, 2019.

Since 2013, the Office of the Chief Mental Health Nurse within DHHS has led important work to introduce systems-oriented quality-improvement practices focused on seclusion and other restrictive practices across Victorian mental health services, including CYMHS.

Safewards is a UK-developed model with a set of 10 interventions designed to reduce conflict and containment in inpatient services. The trial of Safewards in Victoria in 2013 included two youth and one adolescent ward, as well as an adult ward that receives funding for two adolescent beds. A comprehensive evaluation by The University of Melbourne, published in 2015, showed that the rate of seclusion in the three adolescent/youth wards reduced significantly from 19 seclusions per 1 000 occupied bed days before the 12-week trial to 9.5 seclusions at the 12-month follow-up. The trial also showed promising, though less significant, improvements in adult services.

A 2016 expansion of Safewards engaged all but one of the adolescent and youth CYMHS wards. One of the child wards did not yet exist and the other was deemed out of scope for the project.

In 2018–19, another initiative to expand the Safewards program saw eight of the 11 CYMHS inpatient wards funded to employ a Clinical Nurse Consultant whose responsibilities include implementing Safewards and other activities to reduce restrictive interventions. A ninth CYMHS, Orygen Youth Health, had recently established a similar position at the time that the Victorian Budget provided this new funding, so it did not receive the funding but is doing the same work. Figure 3D shows the CYMHS wards that have been engaged in each of these three components of the Safewards program. DHHS needs to ensure that all CYMHS inpatient wards are engaged with Safewards.

Figure 3D
Participation in the three components of the Safewards program, by CYMHS wards

Columns: Ward | Trial, 2013 | Expansion, 2016 | Clinical Nurse Consultant funded, 2018–19 | Any of the three components. (Participation markers for individual wards are not reproduced here.)

Metropolitan services: Austin—adolescent; Austin—child; Eastern—adolescent; Monash—youth; Monash—adolescent; Monash—child (n/a for the 2013 trial and 2016 expansion); Orygen Youth Health (Clinical Nurse Consultant self-funded); RCH

Regional services: Ballarat Health; Latrobe Regional Hospital; Mildura Base Hospital

Totals: Trial, 2013: 4 wards; Expansion, 2016: 8 wards; Clinical Nurse Consultant funded, 2018–19: 8–9 wards; Any of the three components: 10 wards

Note: Monash Child ward opened in 2018.
Note: The sites in Figure 3D were nominated for the expansion of the project but their actual participation varied due to staged implementation based on service capability.
Note: Monash Health and Austin Health both have one Clinical Nurse Consultant position split across their child and adolescent wards, making the total number of Clinical Nurse Consultants across CYMHS six.
Source: VAGO, based on information from DHHS.

Despite the significant work with Safewards, the rates of seclusion in CYMHS still at times exceed DHHS's target of 15 seclusions per 1 000 bed days and consistently exceed the national rate in 2017–18 of 8.1 seclusions per 1 000 bed days. DHHS advises that the national rate is lower than the Victorian rate due to a stricter definition of seclusion in Victoria, and therefore higher rates of reporting in Victoria. However, DHHS has not conducted an audit or review to confirm this theory.

DHHS's rationale for setting the target at 15 seclusions per 1 000 bed days includes data from a report it published in 2018 that shows trends in seclusion rates in adult mental health services only. DHHS advises that its Restrictive Interventions Governance Group has considered the target on several occasions. This group decided against reducing the target, but DHHS could not provide evidence of these deliberations.

DHHS monitors and publicly reports the seclusion rate at individual health services and calculates the total rate for all metropolitan CYMHS, which has only exceeded the target in one quarter since 2015–16, as shown in Figure 3E.

Figure 3E
Seclusion rate for audited CYMHS per 1 000 bed days, 2015–16 to 2017–18

Figure 3E shows seclusion rate for audited CYMHS per 1 000 bed days, 2015–16 to 2017–18

Note: In the graph Q refers to 'Quarter'.
Note: The target in the Victorian Health Performance Framework is 15 seclusions per 1 000 bed days.
Source: VAGO analysis of information available on DHHS website. National rate 2017–18 as reported by Australian Institute of Health and Welfare in Mental Health Services in Australia, published 22 March 2019.

The Act defines seclusion as 'the sole confinement of a person to a room or any other enclosed space, from which it is not within the control of the person confined to leave'.

A variety of scenarios can occur when a young person in a mental health facility becomes agitated or aggressive. These can include a clinician leaving the room so that the young person is left alone. The door to the room where the young person is can be left open, closed or locked. There are different understandings among CYMHS clinicians about whether an open door, or a closed but unlocked door, must be reported and recorded as a seclusion.

In 2018, the Chief Psychiatrist wrote to authorised psychiatrists about a concern that seclusion was under-reported. At a November 2018 forum of mental health nurses, chaired by DHHS's Chief Mental Health Nurse, there was debate about reporting practices, including different interpretations of the 2018 correspondence from the Chief Psychiatrist.

An authorised psychiatrist is appointed by the board of a health service and has specific legislated responsibilities under the Act around compulsory assessment and treatment and other situations where a consumer's rights may be at risk.

Inconsistent reporting of seclusion reduces the validity of the KPI. DHHS needs to better understand whether there are variable reporting practices around seclusion in CYMHS and take strategic action to improve them.

Chief Psychiatrist monitoring of individual seclusions

Health services must report every case of seclusion to the Chief Psychiatrist through monthly reporting processes. Reports include the patient's date of birth, which allows the Chief Psychiatrist to identify seclusions of children and young people. Reports also include the duration of the seclusion, the reason for it, who secluded the patient, and who approved and authorised the seclusion. The Chief Psychiatrist's Data Review Working Group reviews all reports of seclusion monthly. Services are required to clarify the circumstances of seclusion and restraint episodes that exceed thresholds. Where the clarification raises concerns regarding quality and safety, the matter is brought to the Chief Psychiatrist's Portfolio Governance meeting for assessment and review. This process is highly resource-intensive, with approximately 10 senior staff involved for at least half a day every month.

DHHS could not provide evidence of how it decides that there is a 'concern' that requires action or analysis to identify trends or persistent issues to inform strategic action.

The current process is reactive and focuses on singular cases. DHHS advises that actions arising from the meetings include phone calls by senior staff to the authorised psychiatrists in the health services concerned; the meetings may also instigate visits to health services by the Chief Psychiatrist and/or the Chief Mental Health Nurse. DHHS should develop a more transparent, timely, and efficient mechanism for reviewing individual cases of seclusion.

To enable this, DHHS should consider analysis of individual seclusion reports that includes:

  • identifying trends at health services over time
  • identifying trends or clusters within vulnerable population groups
  • collecting and analysing additional information about the secluded client such as indicators of vulnerability, which for children and young people would include legal status with regards to child protection, disability diagnoses and family and housing characteristics.

This form of analysis would be consistent with the 2016 external review of the OCP, to be discussed in Section 3.7, which recommended increasing data analysis capacity.
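The trend identification described above need not be elaborate. As a hedged sketch, assuming each seclusion report can be reduced to a (service, month) pair, the hypothetical helper below flags services whose monthly seclusion counts have risen for several consecutive months:

```python
# Hypothetical sketch of trend identification over individual seclusion reports.
# The input format and the three-month threshold are assumptions for illustration.
from collections import defaultdict

def rising_seclusion_trends(reports, run_length=3):
    """Flag services whose monthly seclusion counts increased strictly over
    at least `run_length` consecutive months."""
    counts = defaultdict(lambda: defaultdict(int))
    for service, month in reports:  # one ('service', 'YYYY-MM') pair per seclusion
        counts[service][month] += 1

    flagged = []
    for service, by_month in counts.items():
        series = [by_month[m] for m in sorted(by_month)]  # chronological counts
        run = 1
        for prev, cur in zip(series, series[1:]):
            run = run + 1 if cur > prev else 1
            if run >= run_length:
                flagged.append(service)
                break
    return flagged

# Example: service A rises 2 -> 3 -> 5 across three months; service B is flat.
reports = ([("A", "2019-01")] * 2 + [("A", "2019-02")] * 3 + [("A", "2019-03")] * 5
           + [("B", "2019-01")] * 4 + [("B", "2019-02")] * 4)
print(rising_seclusion_trends(reports))  # ['A']
```

The same grouping pattern, keyed on indicators of vulnerability rather than on services, would support the cluster analysis within vulnerable population groups suggested above.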

There is no evidence that this monitoring of individual seclusion cases by the OCP is coordinated with the monitoring of the seclusion rate by the performance monitoring areas of DHHS. When DHHS communicates a breach of the seclusion rate target to hospital CEOs through its performance monitoring, it should draw on the information it holds through the OCP's extensive monitoring processes to form consolidated and consistent advice to health services.

Monitoring seclusion in regional health services

DHHS does not report the seclusion rate—or any other KPI—for the six CYMHS inpatient beds it funds for regional health services in Ballarat, Mildura, and Gippsland as it does for metropolitan services. DHHS advises that it 'monitors all seclusions for all health services … against age-determined benchmarks', but it could not provide evidence of seclusion rates for these regional CYMHS beds nor for children and adolescents in these regional services. One example of a seclusion report received by the OCP for a young person in a regional service was provided, but there is no evidence that seclusion of children and young people in regional health services is monitored strategically.

DHHS has not taken action to investigate the rate of seclusion for children and adolescents in these or any other regional mental health services. In Section 2.4, we detailed the significant problems with the system design around these beds and the risk that health services are not able to use them as a child and adolescent inpatient service.

In Section 2.4, we also showed how there are significant numbers of both adolescents and young adults using adult inpatient services in the regional health service that we audited. DHHS has taken no action to monitor the seclusion rate for children and young people who are being admitted as inpatients to adult services.

Post-discharge follow-up

Continuity of care, especially after an inpatient admission, is a critical component of good-quality mental health care. It requires the discharging inpatient service to make appropriate referrals, and communicate with the 'receiving' community program as well as the client and family. It also requires the 'receiving' community program to participate in the discharge process and communicate effectively with the inpatient service, client and family.

The post-discharge follow-up KPI holds only the 'receiving' CYMHS accountable for follow-up, when the process is also highly dependent on the inpatient service communicating well during the discharge process.

This issue is significantly exacerbated in the Western metropolitan region, where CYMHS funding and service delivery is shared between two organisations. RCH provides an inpatient service for adolescents aged 13–18 years, but its community programs that offer follow-up care for younger clients cease at 15 years. For RCH inpatients aged 15 years or over who live in the Western metropolitan catchment—approximately 300 young people each year—follow-up care is provided by Orygen Youth Health.

DHHS's performance monitoring system does not account for this complexity in its KPI reporting. RCH reports against post-discharge follow-up within seven days for inpatients over 15 years, although this service is provided by Orygen Youth Health. Orygen Youth Health, which is funded to provide the follow-up care, is only held accountable for following up its own inpatients, excluding those transferred from RCH. During the audit, DHHS committed to 'improve the transparency' around this matter, but did not advise how it plans to do this or whether it will rectify this error in its performance monitoring system.

DHHS advises that the follow-up target is set at only 80 per cent to allow for clients who may use services outside of the Victorian public mental health system for their follow-up care. However, it has not done any research or analysis to confirm what proportion of CYMHS clients receive their follow-up care in the private or other parts of the health system. The target is also consistently lower than the current data on follow-up, which shows an average of 90 per cent for metropolitan services and 87 per cent for rural services. DHHS increased the target from 75 per cent to the current 80 per cent on 1 July 2018, but could not provide a rationale for why the target was increased or why 80 per cent was chosen.

3.3 Mandatory reporting to the Chief Psychiatrist

The Act stipulates that health services must report to the Chief Psychiatrist on:

  • use of ECT
  • results of neurosurgery performed
  • use of restrictive interventions
  • reportable deaths under the Coroners Act 2008.

The Chief Psychiatrist has powers under the Act to request reporting additional to what is legislated. This is communicated to health services through guidelines that address individual topics, such as the guideline on ECT, for which the Chief Psychiatrist requires reporting 'in advance' for children and adolescents. As another example, in 2018, the Chief Psychiatrist issued a new reporting instruction to health services mandating the reporting of sexual safety violations.

During the audit, the OCP provided us with some proposed changes to the DHHS Policy and Funding Guidelines 2019–20 to include all mandatory reporting requirements to the Chief Psychiatrist. This is an important development if it proceeds because, while some of the mandatory reporting to the Chief Psychiatrist is contained in legislation, other requirements are issued through topic-specific guidelines or directives. There are currently 32 documents on the Chief Psychiatrist's website that health services need to review to identify whether they contain a reporting mandate.

A person with dual disability has a developmental disability, such as intellectual disability or an autism spectrum disorder, as well as severe mental health problems.

During the audit, the OCP proposed a new requirement for health services to report long periods of seclusion and long stays in high-dependency units, which our audit has shown are important matters to monitor in CYMHS given the challenges with complex clients with dual disability. However, this is only proposed as a revision to the DHHS Policy and Funding Guidelines 2019–20. The OCP advises that it will consider whether a guideline or reporting directive is required after reporting has commenced. This creates another level of inconsistency in reporting guidance for health services, where some reporting requirements have directives or guidelines from the Chief Psychiatrist to explain the context, rationale and reporting processes, but others do not.

Health services are confused about their reporting obligations, and the associated actions, because SCV and the Chief Psychiatrist have some overlapping responsibilities. DHHS has not taken any action to clarify or explain this.

Some of the information health services report to the Chief Psychiatrist is made public through the Chief Psychiatrist's annual reports. The reports use age groupings or service types (child and youth, adult, or aged), which allows monitoring of these measures for CYMHS, but only at a statewide level. There are privacy issues that reasonably explain why the data should not be publicly reported at a health service level.

The last published annual report for 2016–17 reported:

  • seclusion episodes per 1 000 occupied bed days, by clinical program
  • bodily restraint episodes per 1 000 occupied bed days
  • number of ECTs for people under 18 years, and for 18–29 year-olds, by gender.

Deaths reported to the Chief Psychiatrist are not stated in the annual report by age for privacy reasons, although they are carefully monitored by the OCP.

The Chief Psychiatrist's annual report for the period ending June 2018 was not published until March 2019, nine months after the end of the reporting period. This data is less useful when it is reported with such extensive delays.

The OCP analyses the data it receives and identifies trends that it acts on either by issuing statewide guidance or instigating investigations of specific services. However, we found that this guidance is not communicated to CYMHS leaders nor is it understood consistently by senior staff in CYMHS responsible for implementing it.

The Chief Psychiatrist also chairs a Sentinel Event Review Committee, which makes recommendations about themes and system issues.

3.4 Mental Health Branch's monitoring activities

DHHS's Mental Health Branch monitors the performance of CYMHS through a suite of 15 KPIs, allocation of 'health service leads' and convening program meetings with each health service.

Key performance indicators

DHHS has a broader suite of 15 KPIs for CYMHS, shown in Figure 3F below, which it publicly reports four times a year on its website. These KPIs have significant problems with their appropriateness as a representation of performance.

There is no governance structure or performance monitoring framework that oversees the development or use of these KPIs. DHHS has never evaluated, reviewed or consulted health services nor other experts on these or other KPIs' appropriateness for monitoring the performance of CYMHS. DHHS advises that its plans to develop a performance and accountability framework for mental health include reviewing its KPIs. DHHS should ensure that this review extends to the CYMHS KPIs, and addresses the problems with validity and appropriateness that are described in this report.

DHHS's rationale for the selection of these KPIs is that they are 'based on the national KPIs', but it could not explain the rationale for many of the significant differences between the national KPIs and those that it uses.

The national KPIs are formally known as the Key Performance Indicators for Australian Public Mental Health Services, Third Edition ('the national KPIs'), which are developed and overseen by a subcommittee established to advise the Australian Health Ministers' Advisory Council. The national KPIs contain 15 KPIs that DHHS reports against to the Australian Government annually. DHHS could not provide any evidence of its decision-making process in developing its own suite of KPIs and the changes made from the national KPIs, which had been developed by high-level committees of experts in the field.

DHHS does not have a corresponding KPI for six of the national KPIs, which are marked red in Figure 3F. For three of these, its rationale for the exclusion is absent or incomplete, as follows:

  • 'Costs of services'—data is available because it is reported to the Australian Institute of Health and Welfare, but DHHS could not provide any rationale for excluding this KPI
  • 'Accessibility—New client index'—no rationale
  • 'Accessibility—Proportion of population receiving care'—excluded because 'the CAMHS population is very small', without further explanation.

A further eight of the DHHS KPIs, marked orange in Figure 3F, can be aligned to counterparts in the national KPIs, but DHHS has changed them substantially. For example, 'change in consumers' clinical outcomes' is measured by DHHS only for community clients, whereas the national KPI does not prescribe this limitation. 'Comparative area resources' is measured only in metropolitan areas, even though geographic proximity is a known issue for regional areas and the rates of mental health problems for young people in regional areas are greater. DHHS excludes regional areas from this KPI without a rationale.

DHHS has an additional KPI outside the scope of the national KPIs, which is for monitoring services provided to children under the age of 12 years.

Figure 3F
DHHS's KPIs compared to national KPIs

Domain

National KPIs

DHHS Mental Health Branch's CAMHS KPIs

VAHI Victorian Health Services Performance public report

Victorian Health Services Performance Framework

Effective

Change in consumers' clinical outcomes

Percentage of clients with significant improvement case end (community only)

Change in mean number of clinically significant HoNOS items (community only)

Mean HoNOS at episode start

28-day readmission rate

 

Appropriate

National service standards compliance

 

Efficient—inpatient

Average length of acute inpatient stay

Trimmed average length of stay, excluding same day stays and stays over 35 days

Average cost per acute admitted patient day

 

Efficient—community

Average treatment days per three-month community care period

Average treatment days

Average cost per community treatment day

 

Average length of case (days)

Case re-referral rate

Accessible

Proportion of population receiving clinical mental health care

 

New client index

 

Comparative area resources

Beds per 10 000 of population (metro only)

Continuous

Rate of preadmission community care

Preadmission contact rate—CAMHS

Rate of post-discharge community care

Post-discharge contact rate—CAMHS

Responsive

Consumer outcomes participation

Percentage self-rating measures offered

Percentage self-rating measures completed

Capable

Outcomes readiness

Percentage HoNOS compliant

Safe

Rate of seclusion

Seclusion per 1 000 occupied bed days

No domain in national KPIs

n/a

Percentage clients aged under 12

Service hours

Key: Red = very significant variation or omission; orange = substantial variation; green = consistent.
Note: HoNOS = Health of the Nation Outcome Scales.
Note: 28-day readmissions: CAMHS services advised that the 28-day readmission rate is not a reliable indicator of service performance, as clinicians need flexibility to readmit patients if needed, without being concerned about KPIs.
Note: National service standard compliance: this KPI is measured at a health service/organisational level.
Source: VAGO analysis of Key Performance Indicators for Australian Public Mental Health Services (Third Edition), DHHS Mental Health Branch quarterly reports to Ministers and public, VAHI's Victorian Health Services Performance public report and the Victorian Health Services Performance Framework.

Health services could use these KPIs to benchmark their performance against other services, but they advise that this is rarely useful because the measures do not appropriately represent their performance. They report that the KPIs do not sufficiently monitor client outcomes, and that seclusion is monitored in a manner that does not help them understand problems or respond to performance issues at the health service level.

DHHS has not reviewed the effectiveness or appropriateness of its KPIs in monitoring CYMHS performance, nor has it strategically consulted with CYMHS or acted on their advice that the KPIs are not a useful measure of performance.

Section 4.3 of this report details how bed occupancy by long-stay patients is a significant issue in CYMHS, but DHHS could not provide any rationale or evidence to explain why it does not monitor it for CYMHS.

Six-monthly program meetings

Six-monthly program meetings are attended by senior staff from the health service (at least the Director of Mental Health, and sometimes the CEO as well) and the DHHS Mental Health Branch, often including the Chief Psychiatrist.

DHHS reports that these meetings are not for performance monitoring. However, audited health services consider them an important forum for seeking and receiving guidance from DHHS and for communicating the challenges and opportunities they face. Some audited health services noted that these meetings are their sole opportunity to formally communicate with DHHS, and in some cases their only opportunity to communicate at all, as they did not know who to contact at DHHS if they needed guidance.

The meetings are held inconsistently for all but one health service, and do not regularly cover any matters related to CYMHS, focusing instead on adult mental health services. The meetings occur significantly less frequently with regional health services.

Our analysis of DHHS's minutes of these meetings over the past four years for 13 health services that have CYMHS and/or a Y-PARC found that, on average, CYMHS had been discussed on only two occasions in four years, and that only half of the meetings held mentioned child and youth mental health; for regional services it was less often. For four health services, only one meeting in four years had addressed child and youth matters, as seen in Figure 3G. DHHS could not provide evidence of any program meetings at AWH or Ballarat Health.

It is unclear whether either party follows through on agreed actions, as the meetings do not routinely include reference to progress against previously agreed actions.

During the audit, the Mental Health Branch finalised an Operational Model, which outlines a range of processes and protocols including for:

  • engaging with health services
  • monitoring programs
  • escalating performance issues
  • working across DHHS to address systems issues
  • program meetings
  • ensuring the follow up of agreed actions.

Figure 3G
Six-monthly program meetings held in 2015–18

 

Health service                                      Meetings held   Meetings that included matters related to children and youth
Metropolitan services
Alfred Health                                       6               6
Austin Health                                       8               7
Eastern Health                                      7               4
Melbourne Health (auspice of Orygen Youth Health)   5               1
Monash Health                                       7               3
RCH                                                 5               5
Regional services
AWH                                                 0               0
Ballarat Health                                     0               0
Barwon Health                                       4               2
Bendigo Health                                      3               2
Goulburn Valley Health                              3               1
Latrobe Regional Hospital                           3               1
Mildura Base Hospital                               3               1
Peninsula Health                                    7               2
South West Healthcare                               4               2

Source: VAGO analysis of DHHS minutes of six-monthly program meetings held between 1 January 2015 and 31 December 2018.

The role of health service leads

DHHS's Mental Health Branch allocates a health service lead to each mental health service. The lead is primarily responsible for liaison between DHHS and the health service on mental health matters. One staff member may act as the lead for one or two health services.

One of the tasks of the health service lead is to ask services to explain their KPI results to DHHS when they are released quarterly. There is no protocol for what does or does not trigger the health service lead to ask for these explanations, and there is no evidence that DHHS has taken any follow-up action on the responses.

DHHS has acknowledged weaknesses in its engagement model with health services and had commenced a review at the time of the audit, but this was in a very early stage so we cannot report on its appropriateness for addressing the issues identified in this audit.

3.5 Failing to monitor accessibility

The only KPI for accessibility of services is limited to inpatient beds in metropolitan services, as shown in Figure 3F. Our March 2019 performance audit Access to Mental Health Services found that accessibility is a significant problem that DHHS needs to monitor more closely.

The Health of the Nation Outcome Scales for Children and Adolescents (HoNOSCA) is a clinician-rated instrument comprising 15 questions measuring behaviour, impairment, symptoms, social problems and information problems for those under 18 years.

DHHS's KPI of a mean (average) HoNOSCA score does not allow it to monitor whether the most severely unwell clients are accessing CYMHS. To do this, DHHS should monitor the distribution of outcome measures at admission and examine the proportion that indicate high severity. DHHS's KPI shows that CYMHS are collecting outcome measures at a high rate, so DHHS could commence more meaningful analysis and monitoring of accessibility immediately.

In 2012, the National Mental Health Performance Subcommittee considered adopting a new accessibility measure that would monitor whether the most seriously unwell clients were getting access to CYMHS. The measure was to look at what proportion of clients accessing CYMHS community programs had very high (95th percentile) scores on a self-rating measure that is routinely collected—the Strengths and Difficulties Questionnaire (SDQ)—Parent versions. While this measure did not become a national KPI, it is an example of a measure that DHHS could use to monitor access for CYMHS. DHHS has been collecting this data for many years, so changes over time could also be determined and monitored going forward.
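Such a measure is straightforward to compute from routinely collected data. The sketch below is a minimal illustration using hypothetical scores and an assumed cutoff of 3 (elsewhere in this section, an SDQ impact score over 3 is described as placing a person among the most unwell 5 per cent of the population); it is not DHHS's method or data.

```python
# Sketch of the accessibility measure described above: the share of clients
# entering CYMHS community programs whose SDQ (Parent) impact score falls in
# the population's most-unwell 5 per cent. Data and cutoff are illustrative
# assumptions, not DHHS figures.

POPULATION_95TH_PERCENTILE_CUTOFF = 3  # assumed cutoff on the SDQ impact scale

def proportion_high_severity(impact_scores, cutoff=POPULATION_95TH_PERCENTILE_CUTOFF):
    """Return the share of recorded scores above the cutoff, or None if no scores."""
    scores = [s for s in impact_scores if s is not None]  # drop missing measures
    if not scores:
        return None
    return sum(1 for s in scores if s > cutoff) / len(scores)

# Hypothetical SDQ impact scores recorded at access to a community program
scores_at_access = [7, 6, 8, 2, 5, 9, 1, 6, 7, 4]
print(f"{proportion_high_severity(scores_at_access):.0%} of clients above the cutoff")
# prints: 80% of clients above the cutoff
```

Tracking this proportion over time would show whether the most seriously unwell young people continue to gain access as demand changes.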

The Strengths and Difficulties Questionnaire (SDQ) is a brief emotional and behavioural screening tool. The tool can capture the perspective of children and young people, their parents and teachers.

The SDQ can be used for various purposes, including clinical assessment, evaluation of outcomes, research and screening.

The most recent data from the Australian Mental Health Outcomes and Classification Network, shown in Figure 3H, demonstrates that clients admitted to CYMHS in Victoria had an average SDQ impact score of between 6 and 6.1 out of 10 over the past four years. The most unwell 5 per cent of the population will have a score over 3. In 2016–17, 89 per cent of CYMHS clients in Victoria had an SDQ impact score over 3, placing them among the most unwell 5 per cent of the population. This demonstrates that CYMHS are seeing the most unwell young people in the population.

Figure 3H shows that in 2015–16 the severity of CYMHS 'youth' clients was the highest of the four years of available data, while the severity of 'child' clients was lower than in the other years. It also shows that in 2014–15, the minimum severity for children accessing CYMHS was higher than in other years. These differences cannot be seen in the corresponding average, which is also shown in Figure 3H and does not vary enough to identify changes.

Further analysis and monitoring of this data, which DHHS receives daily through its Client Management Interface (CMI) database, could show DHHS any changes over time and any variations between health services. There is no evidence that DHHS has ever reviewed the severity of clients accessing CYMHS nor taken any action where variation has occurred over time or between services.
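The analysis and monitoring described above amount to grouping severity scores by year and by health service and flagging variation. A minimal sketch, using hypothetical records and field names rather than DHHS's actual CMI schema:

```python
# Sketch: surfacing variation in client severity over time and between
# services. Records, service names and field layout are hypothetical.
from collections import defaultdict
from statistics import median

# (year, health service, SDQ impact score at access) - illustrative data only
records = [
    ("2015-16", "Service A", 7), ("2015-16", "Service A", 5),
    ("2015-16", "Service B", 2), ("2015-16", "Service B", 3),
    ("2016-17", "Service A", 8), ("2016-17", "Service A", 6),
    ("2016-17", "Service B", 6), ("2016-17", "Service B", 7),
]

groups = defaultdict(list)
for year, service, score in records:
    groups[(year, service)].append(score)

CUTOFF = 3  # assumed population 95th-percentile SDQ impact cutoff
for (year, service), scores in sorted(groups.items()):
    share_high = sum(1 for s in scores if s > CUTOFF) / len(scores)
    print(f"{year} {service}: median score {median(scores)}, "
          f"{share_high:.0%} of clients above cutoff")
```

A summary like this, run over the daily CMI extract, would make changes over time and outlier services immediately visible.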

Figure 3H
Severity of mental health problems as shown by the SDQ's impact scores for Victorian CYMHS clients at access to community programs, by year

Figure 3H shows severity of mental health problems as shown by the SDQ's impact scores for Victorian CYMHS clients at access to community programs, by year

Note: The box in each chart shows the upper and lower limits of the middle 50 per cent of all scores recorded, and the lines or 'whiskers' show the highest and lowest scores recorded.
Source: VAGO analysis of Australian Mental Health Outcomes and Classification Network information.

The measure DHHS uses as a KPI to monitor the severity of CYMHS clients is less precise and less useful because it does not capture the wide range of severity of mental health problems seen in CYMHS. It therefore does not allow any determination of whether the intended client group, those with more severe problems, is accessing CYMHS.

3.6 Other DHHS areas and agencies' performance monitoring of CYMHS

There are four performance monitoring activities for CYMHS that are managed by areas of DHHS outside of the Mental Health Branch:

  • quarterly health service performance meetings
  • VAHI reporting of CMI data
  • sentinel event reporting to SCV
  • reporting to Department of Treasury and Finance (DTF).

Quarterly health service performance meetings

The primary mechanism for managing the performance of health services is a quarterly meeting between health service CEOs and DHHS. The inputs to this meeting are the KPIs in the Victorian Health Services Performance Monitoring Framework and the agreement made through the Statement of Priorities (SoP), which are described below.

The meetings cover the performance of the entire health service, and DHHS was not able to advise the extent to which they address mental health performance matters. A representative of the Mental Health Branch is invited to attend if a mental health program matter is on the agenda. Such matters can be raised by a health service, or the Mental Health Branch can recommend agenda items relating to the health service's performance against KPIs.

Statements of Priorities

Each year, a SoP is developed as an agreement between each health service and the Minister for Health on what that year's priorities will be. We analysed the SoPs for all 17 Victorian health services that receive funding directed at children and youth with mental health problems (see Appendix B for funded agencies).

None of the 2018–19 SoPs mention child and youth mental health, though two did in the previous year. In 2016–17, when there was a substantial investment of new funds and new programs, only five health services mentioned child and youth mental health in their SoP.

Victorian Agency for Health Information

VAHI was established as an administrative office within DHHS in 2017 after Targeting zero: Report of the Review of Hospital Safety and Quality Assurance in Victoria made 66 recommendations about improving quality and safety in Victorian hospitals.

VAHI produces a quarterly report on the performance of Victorian health services, which includes four items that allow performance monitoring of CYMHS. All are reported at the individual health service level. These are:

  • service hours provided by community mental health services
  • child and adolescent mental health average length of stay
  • child and adolescent mental health post discharge follow-up rate
  • child and adolescent mental health seclusion events per 1 000 bed days.
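The last of these items is a normalised rate. As a quick illustration of the arithmetic (figures are illustrative only, not actual CYMHS data):

```python
# Sketch: normalising a count of events to a rate per 1 000 occupied bed days,
# so services of different sizes can be compared. Illustrative figures only.
def per_1000_bed_days(events: int, occupied_bed_days: int) -> float:
    """Normalise an event count to a rate per 1 000 occupied bed days."""
    return events * 1000 / occupied_bed_days

# e.g. 4 seclusion events over 2 500 occupied bed days
print(per_1000_bed_days(4, 2500))  # prints: 1.6
```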

DHHS advises that 27 items that VAHI reports on relate to CYMHS. However, the four above are the only items where CYMHS data is separated from other parts of the mental health system and can therefore be used to monitor CYMHS performance. It is not possible to monitor the performance of CYMHS using data that does not distinguish between the different sectors of the mental health system.

VAHI's quarterly report is the primary source of evidence for advising the ministers for Health, Ambulance Services and Mental Health on the performance of CYMHS.

DHHS could not provide a rationale or evidence of the decision-making for how or why these four indicators were selected. The indicators do not adequately monitor the performance of community programs, or critical issues for inpatients such as long stays, continuity of care for vulnerable groups and aspects of accessibility, as discussed in Section 3.5. This limits the quality of DHHS's advice to its ministers.

VAHI produces a range of reports with this information, which are distributed to different groups, as follows:

  • VAHI Board Quality and Safety Report is sent quarterly to health service boards and CEOs.
  • Inspire is sent directly to clinicians quarterly with special issues also produced for in-depth reporting on specific clinical areas. There have been two Inspire reports into mental health, which included data reported for children and youth.
  • Monitor is sent to public health service boards, CEOs and DHHS quarterly, and reports performance information across measures contained in each health service's SoP.
  • Program Report for Integrated Service Monitoring (PRISM) is sent to public health service boards, CEOs and DHHS quarterly, with a broader range of performance information to complement Monitor.

Sentinel event reporting to Safer Care Victoria

The Health Legislation Amendment (Quality and Safety) Bill 2017 established SCV as an administrative office of DHHS to monitor and improve the quality and safety of care delivered across Victoria's health system. It also gave the Secretary of DHHS powers to request information from health services, which has been one of the enablers of SCV's mandatory reporting of sentinel events.

Health services must report sentinel events within three days and must complete and provide to SCV a 'root cause analysis' within 30 working days.

DHHS advises that it has received one report of a sentinel event in CYMHS since SCV was established three years ago, but it could not provide evidence to the audit about this event or its treatment. There is no evidence that this sentinel event report was shared with other areas of DHHS with responsibility for monitoring quality, safety or performance, such as the Chief Psychiatrist or the Mental Health Branch, nor of how these different areas coordinate their various monitoring activities.

Reporting to the Department of Treasury and Finance

DHHS must report to the DTF every year on the performance of the mental health system through Budget Paper 3.

The 14 performance measures and targets relate to the whole mental health system and are not broken down by age, so none of the measures used for this purpose allow DTF to monitor CYMHS or children and young people's mental health.

Budget Paper 3 performance measures can be changed through negotiations with DTF during the annual state budget process, but there is no evidence that DHHS considered giving DTF visibility of the performance of CYMHS.

Given the significant economic impact of addressing mental health problems early in life, DTF should be able to monitor the performance of CYMHS as a priority area.

3.7 Monitoring the quality of service delivery

The Act prescribes three functions to the Secretary of DHHS that relate to service quality and a further six service quality functions to the Chief Psychiatrist, as shown in Figure 3I. The only activities we identified around monitoring or improving service quality for CYMHS were those undertaken by the Office of the Chief Psychiatrist (OCP).

Figure 3I
Functions in the Act for the Secretary and the Chief Psychiatrist that relate to program quality

The Secretary's functions include:

(a) to develop and implement mental health strategies, policies, guidelines and Codes of Practice

(b) to plan, develop and promote a range of mental health services that are comprehensive, integrated, accessible, safe, inclusive, equitable, and free from stigma

(c) to promote continuous improvement in the quality and safety of mental health services.

The Chief Psychiatrist's functions include:

(a) to develop standards, guidelines and practice directions for the provision of mental health services and publish or otherwise make available those standards, guidelines and practice directions

(b) to assist mental health service providers to comply with the standards, guidelines and practice directions developed by the chief psychiatrist

(c) to develop and provide information, training and education to promote improved quality and safety in the provision of mental health services

(e) to assist mental health service providers to comply with this Act, regulations made under this Act and any Codes of Practice

(i) to conduct investigations in relation to the provision of mental health services by mental health service providers

(j) to give directions to mental health service providers in respect of the provision of mental health services.

Source: Excerpts from the Mental Health Act 2014.

Seven of the 15 KPIs that DHHS's Mental Health Branch monitors every quarter measure different aspects of service quality: clinical outcomes, seclusion, continuity of care and consumer participation in outcome monitoring. However, all the KPIs have significant limitations, which we have detailed in Section 3.4.

The role of the Chief Psychiatrist

DHHS has assigned all responsibility for matters of service quality in mental health services to the OCP. The Chief Psychiatrist undertakes a wide range of activities that endeavour to improve service quality across mental health services for children and young people, adults and aged people, such as issuing guidelines, holding forums, conducting service reviews and corresponding with services.

Figure 3J shows the functions of the Chief Psychiatrist legislated in the Act that relate to service quality and evidence of action against each of these areas with relevance to CYMHS.

Figure 3J
OCP's actions against the Chief Psychiatrist's legislated functions that relate to service quality

Legislated function

Evidence of action in 2017–19

(a) to develop standards, guidelines and practice directions for the provision of mental health services and publish or otherwise make available those standards, guidelines and practice directions

  • Sixteen current guidelines published.
  • Four new guidelines completed and released.
  • New reporting instruction on sexual safety violations.
  • Clinical practice framework for intensive mental health nursing.
  • Guideline and practice resource: Family violence.

(b) to assist mental health service providers to comply with the standards, guidelines and practice directions developed by the chief psychiatrist

  • Minimal. New guidelines are sent to authorised psychiatrists by email. No guidance or support is provided on their implementation. There is evidence that some are not implemented, with no follow-up.

(c) to develop and provide information, training and education to promote improved quality and safety in the provision of mental health services

  • Two Quality and Safety bulletins published. Note: Committed to two per year but only published one in each of 2017–18 and 2018–19.
  • CYMHS Senior Nurses Forum—three meetings held, plus monthly forums for all senior nurses.
  • CYMHS Clinical Leaders meeting—two meetings held during the audit in October 2018 and March 2019, after they were put on hold for two years commencing August 2016.
  • Quarterly forums for authorised psychiatrists.
  • Three quality and safety forums.
  • Safewards project—27 events, forums, workshops, meetings.

(d) to monitor the provision of mental health services in order to improve the quality and safety of mental health services

  • Analysis of individual reports of seclusions and restrictive interventions.
  • Monitored implementation of the Hospital Outreach Post-suicidal Engagement (HOPE) program at six sites.
  • Review of inpatient deaths.
  • Review of community deaths.

(e) to assist mental health service providers to comply with this Act, regulations made under this Act and any Codes of Practice

  • Correspondence to authorised psychiatrists regarding restrictive interventions in CYMHS in July 2018, and subsequent discussions of same at CYMHS Clinical Leaders Network and CYMHS Senior Nurses forum.

(i) to conduct investigations in relation to the provision of mental health services by mental health service providers

  • Eleven major investigations since 2016, including two relating to young people.
  • Recommended a review of whole mental health service at AWH, which the service accepted and commissioned. Chief Psychiatrist's role in the process unclear. Reasons for review also unclear.
  • Last review of CYMHS specifically was Goulburn Valley Health in 2016.

(j) to give directions to mental health service providers in respect of the provision of mental health services

  • While it is clear that the Director of Mental Health cannot provide directions to health services on clinical care, the Chief Psychiatrist is consulted by the Director of Mental Health for ad hoc advice that informs directions to health services, such as prioritising access to beds. The Chief Psychiatrist also provides their own directions on occasions where more collaborative improvement strategies have failed. The respective roles of the Chief Psychiatrist and the Director of Mental Health in providing directions to health services are unclear and lack transparency.

Key: Green = actions have fully met the legislated functions; orange = actions have not completely met legislative functions.
Source: Mental Health Act 2014; VAGO analysis of Chief Psychiatrist Annual Report 2017–18 (draft) and other documentation provided to the audit.

The audited health services confirmed that the two areas noted as 'minimal' in Figure 3J are both lacking actions by the OCP and that the lack of guidance in these areas is a cause of significant challenges for CYMHS.

External review of the OCP

In 2016, DHHS commissioned an external review of the Chief Psychiatrist's role by two senior health service managers from New South Wales, one of whom was a psychiatrist. The review made 22 recommendations about organisational structure and resourcing, engagement with health services and stakeholders, internal business processes and several matters of scope and role definition.

The OCP's internal acquittal of the review in February 2019 shows that five of the recommendations were not supported. The OCP replaced one of them with an alternate response, but for the remaining four, which relate to resourcing the OCP and extending its scope to oversight of the private sector, there is no explanation of the rationale or decision-making process for their dismissal, nor any alternative response to deal with the underlying issues the review identified.

Three years after the review, eight recommendations have been only partially implemented and one has not been implemented at all. That recommendation concerned the Chief Psychiatrist providing regular independent briefings to the DHHS Secretary; its implications are discussed in Section 4.4.

For three of the eight recommendations that the OCP's internal acquittal notes as fully implemented, our audit found evidence that contradicts these assertions. These recommendations relate to:

  • clarifying respective roles around sentinel event reporting with SCV
  • clarifying and communicating to mental health services the respective roles of the OCP and other parts of the Mental Health Branch
  • contributing to health service performance discussions.

It is unclear why the OCP would acquit these recommendations as complete when they have not been. DHHS should more thoroughly respond to this review and increase its transparency by reporting against its progress to the Minister.

Chief Psychiatrist's Guidelines

The OCP develops and issues guidelines on a range of topics. The guidelines do not differentiate between parts of the mental health system that have different models of care. As a result, implementing them in CYMHS can be complex and can conflict with health services' own policies. The OCP and DHHS provide insufficient interpretation or implementation support.

Audited health services report that it can be difficult to determine whether communication from the Chief Psychiatrist is a mandate or directive that must be implemented under legislation, or whether it is merely advice of recommended practice that can be adapted and implemented to suit local needs. This is an important distinction that should be made clear in all communications if the impact of the mandated directives is to be upheld. The OCP does not monitor the implementation of guidelines.

The guidelines are issued to only one person in each health service, the authorised psychiatrist, which can cause delays or, on occasion, failures in delivery to those responsible for implementing them in CYMHS. Ensuring distribution of guidelines to senior staff is a simple administrative matter. There is no evidence that these staff change frequently in CYMHS, so maintaining a database of the relevant names and email addresses would not be a significant resource burden for DHHS.

Governance and 'umpire' functions

Health services described the occasional need for an 'umpire' where there are disputes or differing policies and procedures between services, for example, disputes about responsibility for patients and the processes for transferring patients between catchments.

The Chief Psychiatrist convenes four external committees and six internal groups that provide governance around the functions in the Act, as shown in Figure 3K. There is no evidence that these committees have considered CYMHS specifically in the past year, as their meetings are fully occupied by matters in the adult mental health system.

Figure 3K
Governance of quality and safety issues in mental health services managed by the OCP

Figure 3K shows governance of quality and safety issues in mental health services managed by the OCP

Source: OCP.

Unreleased evaluations and reviews

In 2008, the Victorian Government invested $34 million over four years in child and youth reform initiatives. Of this, $200 000 was allocated to evaluation, which internal correspondence noted was insufficient. $100 000 was spent on an internal evaluation; however, competing priorities prevented it from being finalised. The remaining $100 000 was later combined with some program delivery underspend, bringing the total allocated to evaluating the 2008 reforms to $453 963.

DHHS commissioned two external evaluations, but did not release either 'due to a change in government'. The reports were titled:

  • Evaluation of selected Victorian child and youth mental health reform initiatives. Stage 1 Preliminary Investigation Final Report, 31 October 2012
  • Evaluation of selected child and youth mental health reform initiatives, 29 May 2013.

Audited health services advise that the findings of these two evaluations are highly sought-after and would remain relevant and useful today.

There are another 10 reviews and reports that also provide data and lessons about program quality and improvement opportunities for CYMHS that have not been released for reasons that DHHS could not explain. These documents represent a considerable expenditure of government funds and resources. Where CYMHS have directly contributed to the evaluations and reviews, withholding these reports has eroded their trust in DHHS. It is also inconsistent with DHHS's organisational value to 'generously share our knowledge'.

The University of Melbourne evaluated the Frankston Y-PARC in 2017, but only an executive summary was published. DHHS did not commission this evaluation, and ownership sits with the university and the agency that manages the Y-PARC, Mind Australia. However, as the funding body for the service, DHHS could direct and enable this evaluation to be shared with the wider sector. Further, DHHS has not sought or reviewed either the unreleased full evaluation report or the publicly available executive summary.

DHHS has also not released a comprehensive report by Alfred Health on the establishment of a dual disability service there, which Alfred Health provided to DHHS in 2018. DHHS has taken no action to communicate the project's outcomes with other CYMHS. The report contains valuable lessons for improving service quality in CYMHS. The importance of this project is discussed further in Section 4.5. Alfred Health has presented at one conference about the project, but has not discussed it more widely.

The following eight reviews and analyses of clinical mental health services, conducted or commissioned by DHHS, include specific consideration of CYMHS, but have never been released or communicated to the sector:

  • Mental Health Services Strategy Data Analysis Report—Draft Report, April 2018
  • Reform of Victoria's specialist clinical mental health services: Advice to the Secretary, December 2017
  • DHHS Linkage, Modelling and Forecasting Section, Mental Health 2018–23 Services Strategy analysis—Draft, 2018
  • Design, Service and Infrastructure Planning Framework for Victoria's Clinical Mental Health System, April 2017
  • Consultation paper Clinical mental health service catchments, August 2013
  • Review of acute mental health assessment and treatment for Victorian children aged 0–12 Summary Report, April 2010
  • Next steps 0–25 Next Steps in Mental Healthcare Reform for Children, Young People and their families: Guidance for state-funded specialist mental health services, August 2012
  • Victorian Department of Health and Human Services' Expert Taskforce on Mental Health, 10-year mental health plan wave 2 priorities—discussion paper, 28 June 2016.

All relevant lessons learned from these 12 evaluations and reviews, and where possible complete reports, should be released to CYMHS leaders, and more widely to consumers and the general public, so they can be used to inform service development and quality improvement.

3.8 Monitoring outcomes

Monitoring outcomes is a priority for all mental health services and for the Victorian Government, but there are significant failings in DHHS's approach to monitoring outcomes in CYMHS.

In a March 2019 publication about the Victorian Government's commitment to 'outcomes-thinking', the Secretary of the Department of Premier and Cabinet stated that:

The best way to deliver public value to the people of Victoria is to clearly define the outcomes we are trying to achieve, and measure progress along the way.

There is no evidence that DHHS has asked any CYMHS to explain its performance on outcome measures—neither their collection nor their results.

DHHS's Health and Wellbeing Outcomes Framework

DHHS's Health and Wellbeing Outcomes Framework, published in 2016, includes as an outcome that 'Victorians have good mental health', but none of the targets monitor the wellbeing of children and young people who have mental health problems, nor the effectiveness of government programs to assist them. The selected outcome measures take a prevention and population health approach to mental health, which is important, but does not allow for any outcome monitoring for people with more severe mental health problems who are using CYMHS.

The outcomes framework is not an effective mechanism for monitoring outcomes of CYMHS or the children and young people with mental health problems who CYMHS supports.

The only target set against this outcome is a 20 per cent increase in the resilience of adolescents by 2025, which comes from the government's education policy, 'Education State'. Three of the measures defined against this outcome relate to children and young people:

  • the proportion of adolescents who experience psychological distress
  • the proportion of adolescents with a high level of resilience
  • the proportion of children living in families with unhealthy family functioning.

The suicide rate is also a measure, but it is not broken down by age to measure the rate for children and young people specifically.

10-year Mental Health Plan outcomes framework

DHHS's 10-year plan defines 16 outcomes and DHHS has developed indicators and measures for 10 of these. It reports progress against the indicators in its annual report to the Minister for Mental Health and publishes the report on its website.

Four of the 34 indicators relate to children and young people with severe mental health problems, as shown in Figure 3L. A further 18 of the indicators would provide useful information about CYMHS and its clients; however, they are only reported for adults or for the whole system, so children and young people cannot be separately monitored.

Figure 3L
Indicators in the 10-year mental health plan that relate to children and young people with severe mental health problems

Indicator | Reference year | Two years prior | One year prior | Most current data
--- | --- | --- | --- | ---
Proportion of Victorian young people with positive psychological development | 2016 | 70.1% | n/a | 68.8%
Proportion of children at school entry at high risk of clinically significant problems related to behaviour and emotional wellbeing | 2017 | 4.6% | 4.8% | 4.9%
Proportion of Victorian Aboriginal children at school entry at high risk of clinically significant problems related to behaviour and emotional wellbeing | 2017 | 14.2% | 15.6% | 14.4%
Proportion of registered clients experiencing stable or improved clinical outcomes (children and adolescents) | 2017–18 | 90.6% | 91.3% | 90.6%

Source: DHHS Mental Health Services Annual Report 2017–18.

DHHS has not developed indicators or measures to monitor four other outcomes in its framework that relate to the wellbeing of children and young people with severe mental health problems. These outcomes are that Victorians with mental illness:
  • participate in learning and education
  • participate in and contribute to the economy
  • have financial security
  • are socially engaged and live in inclusive communities.

DHHS advises that it has developed an indicator for CYMHS clients' participation in learning and education, but this was not being used at the time of the audit and DHHS could not provide any evidence of this work. Our analysis of data from the five audited CYMHS shows that educational and economic participation is a significant issue for CYMHS clients, as seen in Figure 3M. We found 65 CYMHS clients of school age who had never attended school, while 19 per cent of clients over the age of 15 years were not employed or in any education program.

Figure 3M
Number of clients with 'education status' recorded as 'not at school' or 'unemployed/pensioner'

Figure 3M shows the number of clients with 'education status' recorded as 'not at school' or 'unemployed/pensioner'

Note: Analysis is limited to the 12 848 CYMHS clients over three years to 31 December 2018 who had their education status recorded in the clinical database.
Source: VAGO analysis of information from five audited health services.
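An analysis of the kind described above can be sketched in a few lines. The record structure, field names and response values below are hypothetical simplifications: the actual CMI extract layout and coding were not published, and the age cut-off used for 'school age' here is an assumption.

```python
from dataclasses import dataclass

@dataclass
class ClientRecord:
    # Hypothetical, simplified view of one client's CMI education-status record
    age: int
    education_status: str  # e.g. 'at school', 'not at school', 'unemployed/pensioner', 'employed'

def education_indicators(records):
    """Summarise education/employment engagement, mirroring the VAGO-style analysis.

    Returns (count of school-age clients not at school,
             percentage of over-15s not employed or in education).
    """
    # Assumed cut-off: under 15 counts as compulsory school age for this sketch
    school_age_not_at_school = sum(
        1 for r in records
        if r.age < 15 and r.education_status == 'not at school'
    )
    over_15 = [r for r in records if r.age > 15]
    not_engaged = sum(
        1 for r in over_15 if r.education_status == 'unemployed/pensioner'
    )
    pct_not_engaged = 100 * not_engaged / len(over_15) if over_15 else 0.0
    return school_age_not_at_school, pct_not_engaged
```

Run over a full extract of the 12 848 clients with a recorded education status, the two return values would correspond to the '65 clients never at school' and '19 per cent not engaged' findings reported above.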

Our 2019 audit Access to Mental Health Services also found significant failings in the outcomes selected for the 10-year plan, with the following finding:

There are few measures in the outcomes framework for the 10-year plan that directly capture performance against providing access to services or increasing service reach—this despite the acknowledged performance problems in this area—which shows a lack of focus on the most pressing issue the system faces.

Nationally agreed outcomes collection

A self-rating measure is a survey that asks clients, or their parents, or both, to rate various aspects of their health and wellbeing. The surveys are used as a part of therapeutic care and the results can also be used to inform research and service development.

The most commonly used self-rating measure in Australian clinical mental health services is the Strengths and Difficulties Questionnaire (SDQ), which has three versions: one for parents of children, one for parents of youth, and a youth self-report version.

Under the Second National Mental Health Plan, endorsed in 1998, all Australian states and territories committed to routine collection of outcomes data in public mental health services. The National Outcomes and Casemix Collection (NOCC) was first specified in August 2002. It outlines the agreed national minimum requirements and includes a set of protocols about the times and points in service delivery when each outcome should be collected. The outcomes include a mix of self-rated and clinician-rated assessment tools.

The NOCC protocols require CYMHS to collect seven different outcomes for all clients and they specify whether they need to be collected at admission, review, discharge, or all three points, which varies between inpatient, community residential or ambulatory settings.

DHHS monitors compliance with three elements of the NOCC protocols through KPIs that require health services to:

  • complete the Health of the Nation Outcome Scales for Children and Adolescents (HoNOSCA) outcome tool
  • offer a self-rating outcome measure (inpatients only)
  • complete a self-rating measure.

CYMHS's performance against these KPIs shows that their completion of outcome measures varies between different settings and between services, as shown in Figure 3N.

Figure 3N
Completion of the HoNOSCA outcome measures in April to June 2018 by health service

Figure 3N shows the completion of the HoNOSCA outcome measures in April to June 2018 by health service

Source: VAGO analysis of quarterly CAMHS KPIs published on the DHHS website.

DHHS's guidance to health services, Outcome measurement in clinical mental health services, published on its website, has not been reviewed since 2009 and contains errors. It states that the NOCC outcomes for 'community residential' are not applicable in Victoria for children and adolescents, despite such facilities now existing. Elsewhere on the DHHS website, guidance on implementing the NOCC protocols contradicts this publication on whether measures should be collected at intake or at admission.

International Declaration on Youth Mental Health

The United Kingdom's Association for Child and Adolescent Mental Health together with the International Association for Youth Mental Health published a declaration on youth mental health in 2011 and updated it in 2013. The declaration sets eleven 10-year targets for service provision for young people aged 12–25 years.

Benchmarking performance against other organisations or consensus targets like the international declaration is an important and effective strategy to identify opportunities for improvements in systems and processes. DHHS has never benchmarked Victorian CYMHS against these international targets, and it does not collect the data or other information that would allow it to monitor them. For some of the targets, DHHS does not collect data in the right format, such as the suicide rate broken down by age. For other targets, DHHS has data that would allow some monitoring but has never done so, such as using CMI data to monitor accessibility, as discussed in Section 3.5, or the user-experience survey data, which it collects but has never analysed or used.

If DHHS proceeds with its commitment to develop strategic directions under its Clinical Mental Health Services Improvement Implementation Plan, which this audit recommends, it should consider benchmarking against the international targets. It should share the results of the benchmarking with CYMHS leaders and involve them in developing strategies to address any discrepancies identified between Victorian CYMHS's performance and the international targets. DHHS should also rectify the issues described in Figure 3O regarding its capacity to monitor the important issues covered by the international targets.

Figure 3O
Relevant(a) targets in the International Declaration on Youth Mental Health and DHHS's ability to monitor each

Each target below is followed by whether DHHS has the data or systems to monitor it.

Target 1. Suicide rates for young people aged 12–25 years will have reduced by a minimum of 50 per cent over the next 10 years.
Monitoring: The Victorian population's suicide rate is not measured or reported by age.

Target 5. All young people and their families or carers will be able to access specialist mental health assessment and intervention in youth-friendly locations.
Monitoring: Location of services is not monitored.

Target 6. Specialist assessment and intervention will be immediately accessible to every young person who urgently needs them.
Monitoring: Accessibility and timeliness of access are not monitored, as discussed in Section 3.5.

Target 7. All young people aged 12–25 years who require specialist intervention will experience continuity of care as they move through the phases of adolescence and emerging adulthood. Transitions from one service to another will always involve a formal face-to-face transfer of care meeting involving the young person, his or her family/carers and each service involved in his or her care.
Monitoring: The KPI for follow-up applies only to inpatients and does not monitor the type of transition service provided.

Target 8. Two years after accessing specialist mental health support, 90 per cent of young people will report being engaged in meaningful educational, vocational or social activity.
Monitoring: Not monitored, though the CMI database does collect information which shows poor outcomes, with 19 per cent not engaged.

Target 9. Every newly developed specialist youth mental health service will demonstrate evidence of youth participation in the process of planning and developing those services.
Monitoring: Youth participation in CYMHS is not monitored.

Target 10. A minimum of 80 per cent of young people will report satisfaction with their experience of mental health service provision.
Monitoring: DHHS's Your Experience of Service (YES) survey is completed by people aged over 16 years using mental health services. In the three months from March to May 2018, 1 051 people aged 16–25 years completed the survey. The survey is not mandatory and health services administer it at widely differing rates. There is no evidence that DHHS analyses the results for CYMHS clients aged under 25 or uses them to monitor performance.

Target 11. A minimum of 80 per cent of families will report satisfaction that they felt respected and included as partners in care.
Monitoring: The YES survey asks the young person's perception of their family's experience. Families themselves are not surveyed.

(a) Three of the targets relate to prevention and workforce issues that were out of scope for this audit.
Source: VAGO analysis of information provided by DHHS and the '10-year targets' in the International Declaration on Youth Mental Health published by the International Association for Youth Mental Health, October 2013.

Audited health services' outcomes monitoring

In addition to the outcome measures mandated by DHHS and NOCC, RCH has begun to use 10 tools to measure and monitor clinical outcomes. The selected tools are freely available and are commonly used in research, allowing RCH to benchmark outcomes for their clients nationally and internationally. The tools are rating scales and questionnaires developed for specific disorders, such as anxiety, obsessive compulsive disorder and suicidality, and are all tailored to children and young people.

DHHS was not aware of this work, possibly because it had not convened its six-monthly program meeting with RCH for nine months at the time of the audit and the work had occurred during that period. This represents a missed opportunity for DHHS to share RCH's work with other health services.

3.9 Data collection and reporting to DHHS

The Mental Health Branch requires CYMHS to report all client contacts and information through a computer application called the Client Management Interface (CMI), which delivers data into a central database managed by DHHS called the Operational Data Store (ODS).

DHHS has a manual that provides guidance on how to report activity into CMI. The manual is not publicly available, and DHHS advises that the manual is out of date and being updated, but could not provide evidence of the process or the expected completion date.

DHHS also communicates some reporting requirements to health services in a series of bulletins published on its website, each relating to a specific matter, such as 'recording admissions' or 'deceased clients'. There is no single source of information on CMI and its reporting requirements other than seeking advice from the DHHS staff who are responsible for maintaining the database. CYMHS independently convene a network of their health information managers to provide support and upskilling to these specialised staff.

Our analysis of some CMI/ODS data (see the scope and methodology in Appendix D) identified significant gaps in the information that can be entered into the database, and limits to the usefulness of other information that is entered, because there is no current guidance on terminology or field definitions.

We found the following specific issues with the CMI database:

  • The 'legal status' information that can be recorded on DHHS's printed client registration forms does not have a corresponding field and cannot be entered into the CMI/ODS database.
  • The 'sex' field on the client registration form and in the CMI database includes only the options 'male' and 'female', which is not consistent with the Victorian Government's guidance on inclusive language. DHHS advises that it is working to improve mental health services' collection of sex/gender information in line with the 'Rainbow Tick' national accreditation program for organisations that are committed to safe and inclusive service delivery for lesbian, gay, bisexual, transgender and intersex people. There is no evidence yet of implementation in CYMHS.
  • 'Living arrangements' has 20 response options that are not mutually exclusive or defined with business rules.
  • 'Living status' response options include 'acute hospital' and 'psychiatric hospital', but there are no business rules to explain the distinction or why these would be an individual's place of residence.
  • The 'carer relationship' field has 24 response options that are not defined or clearly described.
  • The 'carer' field is used by health services to record the client's medical professional's details, which should be a separate and different field.
  • Health services recorded 15 CYMHS clients born in Adelie Land, a French‑claimed territory in Antarctica. This is likely to be a data entry mistake, as Adelie Land is the first option for country of birth in the alphabetic list on CMI and no respondent in the 2016 census was born there. During the audit, DHHS advised that it had introduced validations to identify this and ensure corrections are made.
  • Health services can create their own response options for many fields in the database, which creates inconsistent data that is difficult to analyse at the sector or statewide level.
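Validations of the kind DHHS says it has introduced could, in principle, take the following form. The field names, reference lists and the `LOCAL:` prefix convention are illustrative assumptions for this sketch, not the actual CMI implementation.

```python
# Illustrative data-quality checks for a CMI-style client record.
# All field names and reference values below are assumptions.

ALLOWED_SEX_VALUES = {'male', 'female'}   # the only options CMI currently offers, per the audit
DEFAULT_FIRST_COUNTRY = 'Adelie Land'     # first option in CMI's alphabetic country list

def validate_record(record):
    """Return a list of data-quality warnings for one client record (a dict)."""
    warnings = []
    # Flag the alphabetic-first country: selecting it is a likely default-click error,
    # as with the 15 clients recorded as born in Adelie Land
    if record.get('country_of_birth') == DEFAULT_FIRST_COUNTRY:
        warnings.append('country_of_birth: suspicious default-first option')
    # Flag values outside the agreed reference list for a field
    if record.get('sex') not in ALLOWED_SEX_VALUES:
        warnings.append('sex: value outside current reference list')
    # Flag locally created response options, which break statewide comparability
    if record.get('living_arrangements', '').startswith('LOCAL:'):
        warnings.append('living_arrangements: locally defined option')
    return warnings
```

Checks like these only catch entries that are implausible on their face; they cannot compensate for fields that are undefined, non-exclusive or missing from the database altogether, which is why current guidance on terminology and field definitions remains necessary.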

DHHS owns the CMI database and is responsible for managing and maintaining it. The Mental Health Branch uses this data to generate the KPI reports it uses to monitor CYMHS performance. It advised us that making changes to the database is difficult because it is managed by a different area within DHHS whose resources are stretched. DHHS's Digitising health strategy, published in 2016, notes 'Mental health modernisation' as a priority. DHHS advises that significant work is underway, including the appointment of a provider to transition the CMI/ODS database to a new platform. However, DHHS could not provide evidence of progress, methodology or timelines for this activity.

DHHS does not review whether CMI is collecting the appropriate and necessary information and was not aware of the failures of the CMI system that we identified through our analysis.

As a result of these database issues, DHHS cannot understand many important components of CYMHS, such as whether they are providing services to vulnerable populations, or the complexity and vulnerability of the clients who do access CYMHS. Without an accurate way to collect this information, DHHS cannot appropriately monitor the performance of CYMHS. The problems with collecting this information will also impede DHHS's ability to describe performance issues to government and advocate for additional resources where they might be needed.
