Enrolment Processes at Technical and Further Education Institutes

Tabled: 11 September 2019

3 Monitoring enrolments at TAFEs

In this Part, we examine whether TAFEs routinely monitor the efficiency and effectiveness of their enrolment processes. This oversight is important, as understanding the strengths and weaknesses of their practices will help TAFEs improve the enrolment experience for prospective students.

3.1 Conclusion

Some TAFEs cannot comprehensively assess the timeliness and effectiveness of their enrolment processes, which affects their ability to improve efficiency. Melbourne Polytechnic, SuniTAFE and William Angliss rely on manual processes, meaning they cannot scrutinise the time taken for prospective students to complete critical enrolment-related tasks.

Despite this limitation, William Angliss—like Box Hill and Swinburne—does track conversion between key steps in its enrolment process. This analysis is important, as high conversion rates suggest that TAFEs adequately support prospective students through the enrolment process.

By better understanding their enrolment processes, TAFEs can optimise their limited resources, which will enable them to operate more efficiently.

3.2 TAFEs' monitoring methods

TAFEs collect data to assess their:

  • Conversion rates—the proportion of prospective students who progress between key points in the enrolment process. Box Hill, Swinburne and William Angliss track this data.
  • Customer service levels—the extent to which prospective students receive timely and efficient support from TAFEs. Box Hill, Swinburne and William Angliss monitor and report against these.
  • Prospective students' feedback—the systematic collection and analysis of individuals' experiences and observations. All TAFEs—except SuniTAFE—systematically collect feedback from prospective students.

Monitoring progression through the enrolment process

As shown in Figure 3A, a TAFE's pool of prospective students diminishes throughout the enrolment process. Both TAFEs and prospective students drive this attrition. For example, staff may conclude that an individual's chosen course does not align with their career aspirations, or successful candidates may reject or defer their offer, or let it lapse. Examining the underlying reasons for this attrition would help TAFEs to identify and address issues, such as process bottlenecks, that might deter prospective students. TAFEs can then maximise the number of prospective students who progress, thereby increasing their conversion rates.

Figure 3A
The enrolment funnel

Figure 3A shows the enrolment funnel

Note: While this funnel does not capture all key steps in each TAFE's unique process, it conceptualises enrolment trends in educational institutions.
Source: VAGO.

Measuring conversion rates

Measuring conversion gives TAFEs a sound foundation for internal benchmarking and for targeted improvement initiatives. To measure conversion, TAFEs must collect data on the basic outputs of their enrolment processes. All TAFEs collect and report on these outputs with varying degrees of frequency and formality. Box Hill and William Angliss also have dashboards that provide real-time access to enquiry and application data.
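
To illustrate what this measurement involves, the sketch below computes step-by-step and end-to-end conversion rates from a small, hypothetical extract of per-student records. The stage names, field names and figures are assumptions for the example, not any TAFE's actual touchpoints or data.

```python
# A minimal sketch of funnel conversion reporting, based on a hypothetical
# per-student extract. Stage and field names are illustrative only; each
# TAFE's actual touchpoints differ (see Appendix C).
from collections import Counter

FUNNEL = ["enquiry", "application", "offer", "acceptance", "enrolment"]

# Hypothetical records: the furthest stage each prospective student reached.
records = [
    {"student_id": 1, "furthest_stage": "enrolment"},
    {"student_id": 2, "furthest_stage": "offer"},
    {"student_id": 3, "furthest_stage": "enquiry"},
    {"student_id": 4, "furthest_stage": "enrolment"},
    {"student_id": 5, "furthest_stage": "application"},
]

# Count how many students reached at least each stage of the funnel.
furthest = Counter(r["furthest_stage"] for r in records)
reached, running = [], 0
for stage in reversed(FUNNEL):
    running += furthest.get(stage, 0)
    reached.append((stage, running))
reached.reverse()

# Step-by-step conversion between adjacent touchpoints.
for (stage, count), (next_stage, next_count) in zip(reached, reached[1:]):
    print(f"{stage} -> {next_stage}: {next_count / count:.0%}")

# End-to-end conversion, as in Figure 3B.
print(f"end-to-end ({FUNNEL[0]} -> {FUNNEL[-1]}): {reached[-1][1] / reached[0][1]:.0%}")
```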

As shown in Figure 3B, all five TAFEs measure their end-to-end conversion—the proportion of individuals who initially expressed interest that later finalised their enrolment.

Figure 3B
End-to-end conversion measures used

Figure 3B indicates, for Box Hill, Melbourne Polytechnic, SuniTAFE, Swinburne and William Angliss, which of two end-to-end measures each uses: enquiry to enrolment or application to enrolment.

Note: TAFEs' end-to-end points differ due to their unique enrolment processes—see Appendix C.
Source: VAGO, based on TAFEs' documents.

Figure 3C shows the key touchpoints in each TAFE's enrolment process. TAFEs currently measure the conversion rates between points marked by the blue arrows, but not those marked by the orange arrows. Without this more granular analysis, TAFEs may struggle to pinpoint where attrition occurs, impairing their ability to identify and address the underlying issues.

Figure 3C
Current and potential measures of conversion

Figure 3C shows the current and potential measures of conversion

(a) We have analysed Box Hill's reporting against its old process (as shown in Figure C2 of Appendix C).
Note: Blue arrows represent points of transition that a TAFE currently measures in terms of conversion. Orange arrows represent transition points that are not measured. William Angliss also measures transition rates between two non-sequential touchpoints.
Note: These points are critical actions in each TAFE's enrolment process—refer to Appendix C for more comprehensive process maps. This chart also assumes that prospective students successfully finalise their enrolment with minimal issues.
Source: VAGO.

In contrast to Melbourne Polytechnic and SuniTAFE, Box Hill, William Angliss and Swinburne can assess the rates of transition between several of their key touchpoints. This analysis should give TAFEs further insight into the behaviour of their prospective students, which may help staff to improve their processes. Box Hill also attempts to contact disengaged individuals to understand why they withdrew from the enrolment process, and records their reasons against a standardised list. Capturing this information helps Box Hill to identify and address enrolment issues from the user's perspective.

The example in Figure 3D highlights the benefits of routinely collecting and analysing this information.

Figure 3D
Swinburne's VET Onboarding Taskforce

In March 2017, Swinburne created the VET Onboarding Taskforce—a multidisciplinary team focused on removing organisational barriers to an efficient and effective enrolment process. To achieve this goal, Swinburne enhanced its oversight of prospective students' key transition points, including the proportion of applicants who receive an offer, accept their offer, and finalise their enrolment. In early 2018, Swinburne used this information to deliver specialised support to individuals with incomplete applications and unaccepted offers through targeted email and phone campaigns, as well as drop-in sessions. This resulted in:

  • an 8 per cent increase in the proportion of applicants receiving an offer compared to the same period in 2017
  • a 4 per cent increase in the proportion of admitted(a) applicants finalising their enrolment compared to the same period in 2017.

The taskforce also assessed the efficacy of Swinburne's enrolment processes, systems, and documentation to identify further opportunities for improvement. As part of this analysis, Swinburne conducted usability testing with students to ensure that any changes to its enrolment process would yield the intended results.

(a) 'Admitted' individuals are those who have received an offer to study at Swinburne.
Source: VAGO.

While all TAFEs measure their basic enrolment outputs, most do not assess this information in a strategic manner. Swinburne, however, produces biannual reports that break down the critical six-week period prior to the start of each semester. These reports analyse the barriers and enablers to Swinburne's success, and highlight its commitment to continuous improvement.

In these reports, Swinburne analyses the volume and subject of its enquiries, as well as their originating channels. Evaluating this information allows Swinburne to more efficiently allocate its resources during peak enrolment periods. For example, it identified that Monday is typically the busiest day of the week for enquiries. It has also observed that prospective students tend to apply 'at the last minute', causing a spike in activity towards the end of the enrolment period. This behavioural analysis helps Swinburne ensure that staff are in the right place at the right time.
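
As a rough illustration of this kind of behavioural analysis, the sketch below tallies a hypothetical set of timestamped enquiries by weekday and by originating channel. The records, field names and channels are invented for the example and do not reflect Swinburne's actual reporting.

```python
# A minimal sketch of enquiry-volume analysis using hypothetical timestamped
# enquiry records. Field names and channels are illustrative assumptions.
from collections import Counter
from datetime import datetime

enquiries = [
    {"received": "2019-02-04 09:12", "channel": "phone"},
    {"received": "2019-02-04 10:47", "channel": "web form"},
    {"received": "2019-02-05 14:03", "channel": "phone"},
    {"received": "2019-02-08 16:30", "channel": "email"},
    {"received": "2019-02-11 09:55", "channel": "phone"},
]

# Volume by day of week, to show where staffing is needed most.
by_weekday = Counter(
    datetime.strptime(e["received"], "%Y-%m-%d %H:%M").strftime("%A")
    for e in enquiries
)
# Volume by originating channel.
by_channel = Counter(e["channel"] for e in enquiries)

print("Enquiries by weekday:", dict(by_weekday))
print("Enquiries by channel:", dict(by_channel))
```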

Monitoring customer service

To thoroughly assess the efficacy of their enrolment practices, TAFEs need performance measures that assess how staff interact with prospective students. Customer-focused performance measures, such as timeliness, provide valuable insight into the prospective students' journey from enquiry to enrolment, as they assess the implementation of a service as opposed to its outcome. These measures help staff isolate the cause of any downward trends in their conversion rates, as inefficient internal decision-making may exacerbate attrition. Box Hill, Swinburne and William Angliss report against enrolment‑related customer service standards, to varying extents.

Individuals can call or attend studentHQ to receive advice from Swinburne staff. Swinburne assesses its performance biannually against various service standards, including:

  • the average wait time and handle time per in-person enquiry at each studentHQ location
  • the average wait time and handle time per phone enquiry
  • the proportion of phone calls that prospective students abandon.
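
The sketch below illustrates how measures of this kind (average wait time, average handle time and abandonment rate) might be computed from a phone-queue extract. The field names and figures are assumptions, not Swinburne's actual reporting schema.

```python
# A minimal sketch of the service-standard measures listed above, computed
# from hypothetical phone-queue records. Field names and values are
# illustrative; the actual data would come from a TAFE's queue-management system.
calls = [
    {"wait_sec": 45,  "handle_sec": 300, "abandoned": False},
    {"wait_sec": 120, "handle_sec": 0,   "abandoned": True},
    {"wait_sec": 30,  "handle_sec": 420, "abandoned": False},
    {"wait_sec": 75,  "handle_sec": 180, "abandoned": False},
]

answered = [c for c in calls if not c["abandoned"]]

avg_wait = sum(c["wait_sec"] for c in calls) / len(calls)
avg_handle = sum(c["handle_sec"] for c in answered) / len(answered)
abandon_rate = sum(c["abandoned"] for c in calls) / len(calls)

print(f"Average wait time:   {avg_wait:.0f} seconds")
print(f"Average handle time: {avg_handle:.0f} seconds")
print(f"Abandonment rate:    {abandon_rate:.0%}")
```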

Swinburne encourages staff to focus on prospective students' enrolment experience. By prioritising the delivery of prompt information to prospective students, Swinburne may gain a competitive edge in the demand-driven environment.

Reporting against these service standards also helps Swinburne understand whether changes in its conversion rates are due to internal or external factors. For example, in early 2016, Swinburne identified that wait and handle times for both phone and in-person enquiries had significantly increased over the previous year. Swinburne attributed these longer times to improved staff training, which allowed admissions officers to resolve complex issues on the spot without the need for onward referral. To mitigate this issue and its impact on prospective students, Swinburne opened a third studentHQ location at its Hawthorn campus in January 2017. This reduced wait and handle times at the TAFE's other studentHQ locations.

Box Hill and William Angliss also assess their performance from a customer service perspective. Box Hill monitors the number of phone calls received, the number of calls each admissions officer handles, and prospective students' hold times. This information informs Box Hill's resourcing decisions. William Angliss's customer service standards form part of its admissions officers' performance and development cycles, and include:

  • the proportion of phone calls that prospective students abandon
  • the average wait time for each phone enquiry
  • the number of online enquiries and applications cleared daily
  • the number of voicemails cleared daily
  • the number of emails cleared daily.

While Box Hill and William Angliss encourage high-quality customer service by monitoring this aspect of performance, they do not use their findings to inform their improvement initiatives.

Melbourne Polytechnic and SuniTAFE do not perform similar customer‑focused reporting. This hinders their ability to understand changes in their conversion rates and their prospective students' enrolment experience.

Prospective students' feedback

Traditionally, educational institutions have focused on students' satisfaction with training and assessment rather than on the effectiveness of support services, such as enrolment. Institutions that neglect this analysis lack critical information, as they do not holistically consider the full suite of services that may contribute to an individual's sense of satisfaction. Collecting and analysing prospective students' feedback on the enrolment process may help TAFEs understand its impact and improve their conversion rates.

All audited TAFEs, except for SuniTAFE, use various mechanisms to collect feedback from prospective students about their enrolment processes.

Box Hill

In 2017, Box Hill commissioned consultants to design and implement a survey tool to better understand why students study at the TAFE. The optional, one‑off survey contained seven free-text fields and 41 questions, seven of which related to enrolment.

Box Hill's enrolment-related questions reflect those that ASQA uses during its five‑yearly audits of training providers. To inform the scope of its audits, ASQA surveys students to identify potential areas of concern. ASQA's enrolment‑related questions focus on compliance with the Standards for Registered Training Organisations 2015.

As a result, Box Hill's questions primarily assess whether staff conveyed critical enrolment-related information in a clear and accurate manner. While this provides insight regarding its staff's communication skills, its questions do not assess other elements of enrolment, such as timeliness or convenience. By incorporating these elements, Box Hill could better understand any procedural barriers to prospective students' enrolment.

Box Hill adapted this tool in 2019 to create a new course evaluation survey. This survey contains enrolment-related questions that broadly reflect the themes of the 2017 version. Staff administer the survey to individuals at the end of their course, which highlights Box Hill's commitment to understanding and improving the student experience.

Melbourne Polytechnic

In late 2016, Melbourne Polytechnic developed a survey tool that sought to evaluate the prospective student's journey from enquiry to enrolment. In contrast to Box Hill, Melbourne Polytechnic's survey focuses exclusively on the enrolment process. This provides Melbourne Polytechnic with step‑by‑step insight into the strengths and weaknesses of its enrolment procedures, such as marketing and communication, the enquiry and application process, the conduct of the pre-training review, and training plan development. Overall, Melbourne Polytechnic's survey provides detailed information regarding its process, which should enable staff to action various student‑centred improvements.

Melbourne Polytechnic administered the survey twice in 2017. However, it has been under review since 2018.

SuniTAFE

SuniTAFE does not administer enrolment-related surveys or conduct other student‑centred research. This means that it lacks critical information about prospective students' enrolment experience, which affects its ability to identify inefficiencies and undertake corrective action.

Swinburne

Since mid-2015, Swinburne has administered two surveys to individuals who use its specialist support services, including:

  • Ask George—an email application that responds to prospective students' frequently asked questions by analysing keywords
  • the enrolment hub—an on-campus location that provides on‑the‑spot assistance to prospective students on the enrolment process.

Swinburne administers the first survey through a follow-up email. This survey aims to enhance the quality of Swinburne's generic information material. At the enrolment hub, Swinburne provides tablets for prospective students to complete the second survey immediately after their visit. The survey asks prospective students whether staff successfully resolved their issue without onward referral.

In contrast to Box Hill and Melbourne Polytechnic, Swinburne extends its surveys to individuals who did not finalise their enrolment. By including this additional perspective, Swinburne enhances its ability to identify procedural barriers that may impact its end-to-end conversion rate.

William Angliss

William Angliss emails a brief survey to all prospective students who interact with its customer service team. The survey asks individuals to rate the support they received from staff as either satisfactory or unsatisfactory. This allows William Angliss to assess the efficacy of its ongoing interactions with students, which may lead to service improvements. Like Swinburne's, William Angliss's survey captures respondents who may not necessarily finalise their enrolment, offering a wider perspective.

In addition, William Angliss ran a focus group in July 2018 with five domestic students to better understand their enrolment experiences.

Free TAFE Student Experience Minimum Service Standards

As steward of the education system, the department plays a significant role in supporting TAFEs to deliver high-quality services. However, in the absence of mandatory reporting requirements, TAFEs have no incentive to embed the department's best practice standards into their everyday operations.

In late 2018, the department developed minimum service standards in consultation with TAFEs to improve the enrolment process for people interested in Free TAFE courses. As shown in Figure 3E, these standards encourage TAFEs to support prospective students through the enquiry and application process in a timely and more consistent way.

Figure 3E
Free TAFE Student Experience Minimum Service Standards

Student interest

  • Contact channels must include phone, in-person, online forms, and email
  • Contact student within 24 hours of referral or enquiry
  • Enquiry centres and phone lines open 8 am to 6 pm
  • Voicemail enabled for out-of-hours enquiries
  • All enquiries logged in CRM, including those received through faculties
  • If course not offered, direct referral to appropriate TAFE within 24 hours of enquiry

Application and offer

  • Online application form available
  • Application process to be completed in two to four weeks
  • Complete the pre-training review and the literacy and numeracy test to assess suitability during application
  • If suitable, provide written conditional offer to course
  • Staff available during the day and out-of-hours to complete reviews
  • TAFEs to monitor application process and follow up with stalled students to encourage completion
  • Offers may lapse after four weeks of no reply and be made to a new student

Maintaining interest

  • Commence nurture campaigns (for example, taster courses or newsletters)
  • Send enrolment reminders at least four, two and one week prior to cut-off

Completes enrolment

  • Complete contract compliance requirements (for example, identification checks)
  • Conduct operational decisions (for example, timetabling)
  • Input student into the Skills Victoria Training System (SVTS) within 30 days of enrolment

Source: VAGO, based on documentation from the department.

TAFEs have not embedded the service standards into their performance monitoring and reporting frameworks for Free TAFE courses. However, Box Hill advises that it is in the process of doing this for all courses. The department is also yet to apply any system‑wide monitoring requirements using these service standards. In embedding them, the department and/or TAFEs would need to develop corresponding targets for meaningful performance measurement. In addition, some TAFEs may need support from the department to develop their capability to report against these standards.
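
To illustrate what reporting against these standards could involve, the sketch below checks a hypothetical CRM extract against one Figure 3E standard: contacting a prospective student within 24 hours of enquiry. The field names and records are assumptions, not the department's specification or any TAFE's data.

```python
# A minimal sketch of reporting against the 24-hour contact standard in
# Figure 3E, using a hypothetical CRM extract. Field names and records are
# illustrative assumptions only.
from datetime import datetime, timedelta

enquiries = [
    {"id": "E1", "received": "2019-02-04 09:12", "first_contact": "2019-02-04 15:40"},
    {"id": "E2", "received": "2019-02-05 14:03", "first_contact": "2019-02-07 10:02"},
    {"id": "E3", "received": "2019-02-08 16:30", "first_contact": None},  # not yet contacted
]

def parse(ts):
    """Parse a timestamp string, returning None for missing values."""
    return datetime.strptime(ts, "%Y-%m-%d %H:%M") if ts else None

met_count = 0
for e in enquiries:
    received, contacted = parse(e["received"]), parse(e["first_contact"])
    met = contacted is not None and contacted - received <= timedelta(hours=24)
    met_count += met
    print(f"{e['id']}: 24-hour contact standard {'met' if met else 'not met'}")

print(f"Standard met for {met_count}/{len(enquiries)} enquiries "
      f"({met_count / len(enquiries):.0%})")
```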

The department and TAFEs could apply these standards beyond Free TAFE courses to the broader Skills First program. Doing so could lead to system‑wide improvements to the consistency, responsiveness, and timeliness of the enrolment processes.

3.3 Limitations of TAFEs' enrolment data

To effectively scrutinise their enrolment performance, TAFEs require information that is complete, accurate, and consistent. Most TAFEs' information management systems and associated enrolment processes do not enable this type of data collection, impacting the quality of their performance reporting. This hinders TAFEs' ability to identify and resolve performance issues.

To assess the efficiency of TAFEs' enrolment processes, we undertook walkthroughs with each institution. At these walkthroughs, we followed the pathway of a typical domestic applicant seeking to enrol in a government‑subsidised course. Our primary aims were to:

  • experience each TAFE's enrolment process from prospective students' viewpoint
  • understand how each TAFE's data collection procedures work in practice, including how and when they capture critical information.

Process mining is an analytical method that helps the user understand different workflows with the aim of improving their efficiency. It involves extracting and evaluating a workflow's key data points over a designated period to show common and unique pathways.

We examined prospective students' pathways through the enrolment process at four of the five TAFEs using process mining software. Process mining is an exploratory approach that shows the order in which prospective students complete critical tasks, as well as the time taken to perform each step. Process mining has the potential to illuminate key points of attrition, irregular pathways, bottlenecks, or other inefficiencies.

To perform this analysis, we identified the data points that represent TAFEs' unique, end-to-end enrolment processes. We then requested an extract of each prospective student's pathway through these data points, regardless of whether they finalised their enrolment.
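
The sketch below gives a simplified, illustrative version of this type of analysis: it orders each prospective student's events, measures the elapsed time between the first and last recorded steps, and tallies common pathways. The event names and records are invented, and our analysis used dedicated process mining software rather than this code.

```python
# A simplified, illustrative version of the event-log analysis that process
# mining performs, using an invented per-student extract of timestamped
# enrolment events. Event names are not any TAFE's actual data points.
from collections import Counter
from datetime import datetime

events = [
    ("S1", "enquiry",     "2019-01-07"), ("S1", "application", "2019-01-09"),
    ("S1", "offer",       "2019-01-15"), ("S1", "enrolment",   "2019-01-20"),
    ("S2", "application", "2019-01-08"), ("S2", "offer",       "2019-01-30"),
    ("S3", "enquiry",     "2019-01-10"),
]

# Group each prospective student's events into a time-ordered pathway.
cases = {}
for student, step, ts in events:
    cases.setdefault(student, []).append((datetime.strptime(ts, "%Y-%m-%d"), step))

pathways = Counter()
for student, trace in sorted(cases.items()):
    trace.sort()
    pathways[" -> ".join(step for _, step in trace)] += 1
    elapsed = (trace[-1][0] - trace[0][0]).days
    print(f"{student}: {elapsed} days between first and last recorded step")

# Common and unique pathways, as a process map would display them.
for path, count in pathways.most_common():
    print(f"{count} student(s): {path}")
```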

As Melbourne Polytechnic, SuniTAFE and William Angliss use manual processes to collect enrolment-related information, we could not extract real-time data points for certain tasks. We encountered the same issue when extracting data points from Box Hill's old enrolment system, but its new system provided us with higher-quality, real-time information. Given this reliance on manual practices, as well as the differences in each TAFE's process, we could not meaningfully compare the timeliness of individual enrolment steps across the five institutions.

Despite these issues, we were able to compare the overall time taken to process an individual's enrolment at three TAFEs—Swinburne, Melbourne Polytechnic and William Angliss—using process mining software. Our results showed, at the 50th percentile, that Swinburne took the least time to process an individual's enrolment (six days), followed by Melbourne Polytechnic (10 days) and William Angliss (61 days). Swinburne enables prospective students to complete the enrolment process online, while Melbourne Polytechnic requires one in-person visit and William Angliss requires two in-person visits.

We tested the median enrolment times and found statistically significant differences between the TAFEs at the 95 per cent confidence level. Appendix E details the statistical methodology we used to establish this.
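
As an illustration of one way such a comparison could be tested, the sketch below applies a Kruskal-Wallis test to simulated enrolment durations. This is not the methodology set out in Appendix E, and the data are simulated to roughly match the medians reported above.

```python
# An illustrative test of whether median enrolment times differ across three
# institutions, using a Kruskal-Wallis test on simulated durations. This is
# not the methodology in Appendix E; the data are simulated so that medians
# roughly match the 6, 10 and 61 days reported above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
swinburne = rng.gamma(shape=2.0, scale=3.6, size=200)    # median roughly 6 days
melb_poly = rng.gamma(shape=2.0, scale=6.0, size=200)    # median roughly 10 days
wm_angliss = rng.gamma(shape=2.0, scale=36.4, size=200)  # median roughly 61 days

stat, p_value = stats.kruskal(swinburne, melb_poly, wm_angliss)
print(f"Kruskal-Wallis H = {stat:.1f}, p = {p_value:.3g}")
if p_value < 0.05:
    print("Differences are statistically significant at the 95 per cent confidence level.")
```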

Our process mining offered valuable insights into TAFEs' information management systems and their ability to conduct performance reporting. For example, the prevalence of divergent pathways suggests that TAFEs may lack strong data integrity rules and, as a result, do not accurately or consistently record interactions with prospective students in their SMS or CRM. This limitation means that some TAFEs do not have complete oversight of each prospective student's journey through the enrolment pipeline, which may affect their ability to identify attrition issues.

Likewise, the prevalence of manual processing and associated lack of data points affects some TAFEs' ability to systematically assess key aspects of their performance, such as timeliness. This means that some TAFEs will struggle to report against the department's Free TAFE minimum service standards, particularly those that focus on turnaround times. The differences in TAFEs' information management systems and enrolment processes also compromise the department's ability to conduct sector-wide analyses.
