State Purchase Contracts

Tabled: 20 September 2018

3 Overseeing and managing SPCs

VGPB is responsible for monitoring the compliance of departments and specified entities with VGPB supply policies.

Under VGPB's oversight, lead agencies are responsible for the day-to-day management of SPCs. Each SPC is different; however, the fundamental principles for managing contract performance are the same.

Effective SPC management ensures:

  • a high standard of service and quality delivered by the suppliers
  • value for money
  • reduced risk
  • the ability to address market changes or developments
  • the identification of poor contractor performance.

This part examines VGPB's oversight role and how well lead agencies oversee and manage their SPCs.

3.1 Conclusion

The split in responsibilities for managing SPCs has led to inconsistent management practices across lead agencies. Further, VGPB does not have the resources to directly oversee the management of all SPCs or ensure compliance with its supply policies.

Lead agencies use contract management frameworks to manage SPCs, but the lack of quality data limits the effectiveness of these frameworks. In some instances, lead agencies have failed to actively manage SPCs despite holding the data to do so. This has resulted in missed savings.

3.2 VGPB oversight of SPCs

Before June 2016, DTF was responsible for overseeing all SPCs. A 2016–17 policy review conducted by VGPB identified a need for VGPB to take a stronger role in establishing, reporting on and overseeing SPCs.

VGPB has only a small secretariat, provided by DTF, to undertake all monitoring activities to ensure compliance with VGPB policies. With its limited resources, it sensibly monitors only the compliance of the seven departments, Public Transport Victoria, VicRoads, Victoria Police and Cenitex, as opposed to all 34 VPS agencies in its scope.

VGPB oversight of SPC users

The Market analysis and review policy outlines the requirement for agencies to use mandatory SPCs. VGPB attempts to monitor compliance through:

  • ASRs, which entities submit to VGPB at the end of each financial year—these summarise procurement activity for the year and report instances of non‑compliance with VGPB policies, including in the use of mandatory SPCs
  • an audit program—entities audit their own compliance with VGPB policies and submit a report to VGPB once every three years.

While these mechanisms play a role in assessing compliance, VGPB acknowledges their limitations. For example, in their 2016–17 ASRs, the seven departments raised no compliance issues with their SPC obligations. As we discuss in Part 5, these departments were unable to tell us whether leakage was occurring from SPCs, and three departments—DET, DPC and DELWP—were unable to identify SPC spend within their own financial and procurement systems.

The Financial Compliance Management Attestation process was introduced in 2017–18. It requires agencies to conduct an annual assessment of compliance with all applicable requirements in the FMA, including VGPB procurement policies.

How these departments were able to make an attestation of compliance given these limitations is unclear. VGPB accepted these assertions at face value, which it stated was due to the lack of data and the tight time frames specified in the FMA between when entities submit their ASRs and when VGPB tables these ASRs in its annual report.

While VGPB's audit program requires entities to verify compliance with mandatory policy requirements and submit a report to VGPB every three years, this verification takes place well after attestations are made.

VGPB is assessing whether the new Financial Compliance Management Attestation will provide further assurance as to the accuracy of ASRs. However, this does not address the underlying data limitations that hinder departments' ability to attest to compliance with SPC requirements.

VGPB oversight of lead agencies

VGPB oversees lead agency management of SPCs by:

  • approving key documents—VGPB must endorse all SPC business cases before the lead agency submits the business case for ministerial approval
  • overseeing strategic procurements—entities can nominate certain strategic procurements for VGPB to oversee, and VGPB can identify specific procurements to oversee, including the establishment or renewal of an SPC
  • reviewing procurement activity plans—entities submit procurement activity plans every year for review by VGPB, which may identify potential aggregation opportunities. As discussed in Section 2.2, this is not an effective process for identifying new SPC opportunities.

Five SPCs that were renewed in 2016–17 progressed through the VGPB strategic oversight program—Telecommunications Purchasing and Management Strategy, eServices Register, Cash and Banking Services, Security Services and Master Agency Media Services. Letters between VGPB and the lead agencies for these SPCs show evidence of VGPB's review of key documentation, including business cases and category strategies.

While this oversight process requires lead agencies to provide evidence that they have complied with VGPB policies throughout the SPC tendering process, VGPB's oversight of lead agencies' contract management activities is minimal once the contract is executed.

VGPB requests an update from lead agencies at certain milestones. However, these milestones are at one- or two-year points of contracts that run for three years. Consequently, there is little oversight and reporting by VGPB of contract management activities once a lead agency executes an SPC.

3.3 Lead agencies' development and management of SPCs

Active management of SPCs by lead agencies presents the opportunity to improve SPC performance—not only by realising cost reductions through demand aggregation, but also by monitoring suppliers' performance and sharing information with users to increase savings opportunities. The management cycle for an SPC includes:

  • developing a business case
  • implementing a CMP
  • measuring client satisfaction
  • managing key suppliers
  • sharing savings opportunities with departments and entities
  • tracking SPC prices.

Business cases

We examined the business case documentation for 31 SPCs to assess whether they provide an audit trail of the decision‑making process and include the following key elements:

  • objectives of the SPC establishment/renewal
  • analysis of past spend data and forward spend estimation
  • market analysis
  • expected financial and non‑financial benefits, including assumptions and methodology for calculating financial benefits
  • sourcing options.

Figure 3A summarises the results of our assessment of business case documentation.

Figure 3A
Assessment of key elements of SPC business cases

| Key elements of business cases | Substantially met | Partly met | Not met |
| --- | --- | --- | --- |
| Objectives—clear set of objectives | 31 | 0 | 0 |
| Spend—analysis of user consumption and spend data based on projected volume and product mix | 18 | 11 | 2 |
| Market analysis—detailed research of the current market place and existing opportunities and/or constraints for government | 29 | 1 | 1 |
| Benefits—outline of expected benefits and how they will be measured | 17 | 8 | 6 |
| Sourcing options—exploration of sourcing options to respond to the problem and deliver benefits | 31 | 0 | 0 |

Figures show the number of business cases (31 assessed in total).

Note: Substantially met—adequately addressing requirements. Partly met—partially addressing requirements with identifiable gaps. Not met—weaknesses mean the documentation falls short of the minimum required.
Source: VAGO analysis of business cases provided by lead agencies.

Most of the business cases substantially addressed the key requirements, with improvements evident in those developed more recently. Lead agencies are undertaking market analysis and consulting key stakeholders from SPC users early to support business case development or the extension of an SPC.

However, we identified major shortcomings, including:

  • a lack of documented, stakeholder-agreed assumptions for calculating financial and non-financial benefits
  • the absence of comprehensive and reliable spend and volume data—lead agencies rely on historical self-reported supplier spend and volume data, which creates risks such as:
    • not considering category spend that has occurred outside an SPC
    • underestimating volumes or historical spend due to incomplete information from suppliers
    • the possibility that lead agencies may not identify suppliers increasing the prices of non-contract items, which offsets the benefits gained from the items included in the contract.

Category Management Plans

Once an SPC is executed, the lead agency must develop a CMP. CMPs outline the activities associated with managing an SPC, including:

  • benefits tracking—specifying the methodology used to calculate financial benefits
  • risk management—outlining risks specific to the SPC and mitigation strategies
  • performance monitoring—outlining performance measures to assess supplier performance
  • continuous improvement—outlining improvements that result in decreased costs or improved service levels and quality.

Of the 34 SPCs, 29 have CMPs. The remaining five SPCs, with a combined spend of more than $17.9 million in 2016–17, do not have CMPs:

  • DX Services—$2.3 million
  • Rosetta—$650 000
  • Data Centre Facilities—$14.9 million
  • eServices Register—spend data not collected by lead agency
  • Microsoft Licensing Solution Provider—no spend data attributable.

Lead agencies allocate resources to the management of SPCs by assessing the importance of the category to government in terms of business impact, value, risk and complexity. The DX Services SPC and Rosetta SPC have minimal management and documentation, given the small spend amount and that there is one supplier. This is a reasonable approach.

The lack of CMPs for the Data Centre Facilities SPC, eServices Register and Microsoft Licensing Solution Provider limits DPC's ability to ensure that these SPCs are managed consistently and that they meet their objectives.

Figure 3B summarises the results of our assessment of the 29 CMPs across their key elements.

Figure 3B
Assessment of CMPs against better practice

Figure 3B summarises the results of our assessment of the 29 CMPs across their key elements.

Note: Substantially met—adequately addressing requirements. Partly met—partially addressing requirements with identifiable gaps. Not met—weaknesses mean the documentation falls short of the minimum required.
Note: Figures may not total 100 per cent due to rounding.
Source: VAGO analysis of category management plans provided by lead agencies.

The majority of CMPs substantially or partly met the requirements to identify the methodology used to calculate financial benefits and the risks and mitigation strategies associated with the SPC.

However, more than half of the CMPs included no continuous improvement initiatives, and 14 (48 per cent) did not outline performance measures to assess supplier performance. DPC manages 10 of these 14. The absence of performance measures affects DPC's ability to identify poor contract performance and opportunities for improvements to service provision.

Four CMPs (14 per cent) did not adequately outline how lead agencies would track and measure benefits, which heightens the risk of an inconsistent approach to the calculation of benefits for these SPCs. We discuss this in Section 4.4.

Client satisfaction surveys

DTF surveys SPC users annually to assess their satisfaction with SPCs. The 2016–17 results indicate that almost three-quarters of users were satisfied or very satisfied with their overall experiences using the SPC. The survey also revealed that:

  • 81 per cent of users were satisfied or very satisfied that the SPC met their departments' needs
  • 73 per cent of users were satisfied or very satisfied with the suppliers' performance
  • 77 per cent of users were satisfied or very satisfied with DTF engagement.

While the annual survey is useful in general, it does not show users' assessment of suppliers on individual engagements. This is particularly important for panel supplier arrangements such as PAS, where DTF could use the information to address performance issues and notify users of issues with specific suppliers.

Lead agencies engage with stakeholders to seek feedback on supplier performance. This is predominantly done through a user reference group—which comprises key representatives from each of the user departments—and through surveys.

While the PAS SPC requires users to complete a satisfaction survey and forward it to DTF at the completion of each engagement, only a limited number of users do so. Consequently, DTF has little visibility of the SPC performance and buyer satisfaction.

In 2016–17, DJR undertook an extensive consultation process with all user agencies and providers on the Legal Services Panel to develop a new client satisfaction survey. The survey results feed into annual performance review meetings with suppliers. Our review of the survey results indicates that users have generally been satisfied with the services provided under the Legal Services Panel.

DPC and Cenitex have limited visibility of users' satisfaction with supplier performance because they do not survey SPC users.

Managing key suppliers

The establishment of an SPC concentrates government expenditure with a select number of suppliers—eight of the top 10 suppliers receiving government expenditure are on SPC arrangements. To manage an SPC well, lead agencies need to understand the level of spend with each supplier and use this information to potentially leverage further savings.

Panel arrangements

Figure 3C shows our review of three SPCs with a panel of suppliers—PAS (230 suppliers), Staffing Services (eight suppliers) and the Legal Services Panel (23 suppliers)—using data provided by the lead agencies, DTF and DJR. This analysis highlights that a significant percentage of expenditure is concentrated with a small number of suppliers on each of these SPCs. This may indicate user preference for dealing with known suppliers and some reluctance to engage new suppliers, or that some suppliers provide a greater number of services under the SPC. For example, on the Legal Services Panel there are 13 areas of law in which a panel firm can compete. Some firms provide services in only one area of law, while others provide services in more than eight areas, meaning they are able to generate more revenue.

Figure 3C
SPC spend for PAS, Legal Services Panel and Staffing Services by supplier, 2016–17

Figure 3C shows SPC spend for PAS, Legal Services Panel and Staffing Services by supplier, 2016–17

Source: VAGO, based on information from DTF and DJR.

DPC's June 2018 review into labour hire and professional services found that despite the significant expenditure on PAS to a limited number of suppliers, 'there is no active account management of these suppliers at the whole of government level and the aggregation of demand is not actively used to drive better pricing outcomes'.

DTF advised it is in the process of developing a strategy for the future PAS SPC, focusing on more active central category management.

Suppliers on a register

Registers are typically used where a variety of skills and capabilities are sought across a large supplier base. They allow pre-qualification of suppliers who satisfy certain key selection criteria relating to their capability to supply goods or services to government business.

Pre-qualified registers have the advantage of not 'locking up' a market, as new entrants can be added at any time. One challenge is that, although many suppliers may be registered, a significant number do not receive any government work. Limited performance information about suppliers is available to user agencies, so it is common for users to select firms they have previously used.

For example, Figure 3D shows that the top 10 suppliers on the IT Infrastructure Register, in terms of spend, made up around 91 per cent ($40.6 million) of the register's total spend ($44.6 million). Half the suppliers on the register received no work from the Victorian Government in 2016–17. This highlights that entities continually use the same suppliers, which may compromise their ability to get the best value for money.

Figure 3D
IT Infrastructure spend by suppliers, 2016–17

Figure 3D shows IT Infrastructure spend by suppliers, 2016–17

Source: VAGO based on data from DPC.
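The concentration analyses behind Figures 3C and 3D rest on a simple calculation over supplier-level spend data: rank suppliers by spend and measure the share held by the largest few. The sketch below illustrates one way such a check could be performed; the function, supplier names and amounts are hypothetical and are not drawn from the audited data.

```python
# Illustrative sketch only: reproducing a spend-concentration check of the kind
# underpinning Figures 3C and 3D. All supplier names and amounts are hypothetical.

from typing import Dict

def top_n_share(spend_by_supplier: Dict[str, float], n: int = 10) -> float:
    """Return the share of total spend held by the n highest-spend suppliers."""
    total = sum(spend_by_supplier.values())
    if total == 0:
        return 0.0
    top_n = sorted(spend_by_supplier.values(), reverse=True)[:n]
    return sum(top_n) / total

# Hypothetical register data (supplier -> spend in $ million)
register_spend = {
    "Supplier A": 12.1, "Supplier B": 9.4, "Supplier C": 6.2,
    "Supplier D": 4.3, "Supplier E": 3.1, "Supplier F": 2.2,
    "Supplier G": 1.4, "Supplier H": 0.9, "Supplier I": 0.6,
    "Supplier J": 0.4, "Supplier K": 0.3, "Supplier L": 0.0,
}

share = top_n_share(register_spend, n=10)
inactive = sum(1 for spend in register_spend.values() if spend == 0)
print(f"Top 10 suppliers hold {share:.0%} of spend; {inactive} suppliers received no work")
```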

Marketing Services Register

Our February 2012 Government Advertising and Communications audit identified that DPC's management of the Marketing Services SPC was poor, with ineffective recording, reporting and monitoring of government expenditure for marketing services. DTF has managed the original Marketing Services SPC as a register since December 2009.

While the August 2013 business case for the Marketing Services SPC identified the need for an online system that captures spend, this was not fully implemented. Consequently, DTF has had to estimate spend and cannot measure benefits.

An online system that captures procurement activity is crucial to the operations of the Marketing Services Register to support compliance and visibility of procurement activity, and to minimise the risk of contract leakage. Furthermore, it would allow DTF to obtain accurate data to quantify the benefits of the Marketing Services Register.

Sharing savings opportunities

Lead agencies share high-level information on departmental spend and usage with VGPB and stakeholders on an ongoing basis. However, an opportunity exists to better communicate and highlight savings opportunities and trends across users because, presently, users have no transparent way to assess whether they are receiving competitive rates from suppliers compared with other users.

This information can be useful for SPC users where suppliers may charge different users varying rates for equivalent goods or services, such as on the PAS, Legal Services Panel and Staffing Services SPCs.

The Staffing Services SPC is for the engagement of temporary staff to work in government agencies.

As an example, we analysed 58 engagements of temporary senior policy officers through the Staffing Services SPC in 2016–17 across four user departments:

  • 17 engagements by DEDJTR
  • 18 engagements by DELWP
  • 9 engagements by DJR
  • 14 engagements by DET.

All these engagements were for staff hired at a VPS 5 level for between three and six months. The hourly rate varied within and across departments. Figure 3E shows the minimum, maximum and average hourly rates achieved by the departments.

Figure 3E
Hourly rates for temporary VPS 5 senior policy officers, 2016–17

Figure 3E shows hourly rates for temporary VPS 5 senior policy officers, 2016–17

Source: VAGO, based on supplier reports.

While no two staffing engagements are the same, this comparison of hourly rates is valuable. As the lead agency, DTF should review and distribute such information to SPC users to help them identify where they may not be achieving the same level of savings as other users.

This analysis also highlights the need for user departments to do more work to understand where different parts of their businesses are paying varying rates for the same service. Understanding internal spending patterns will help SPC users negotiate lower prices during future engagements.
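The rate comparison in Figure 3E amounts to grouping comparable engagements by department and reporting the minimum, maximum and average hourly rate for each. A minimal sketch of that calculation is below; it assumes engagement-level records with a department and an hourly rate, and the rates shown are hypothetical rather than audited figures.

```python
# Illustrative sketch only: deriving minimum, maximum and average hourly rates by
# department from engagement-level records, as in Figure 3E. Department names
# follow the report; the rates are hypothetical.

from collections import defaultdict
from statistics import mean
from typing import List, Tuple

# (department, hourly rate in $) for comparable VPS 5 engagements
engagements: List[Tuple[str, float]] = [
    ("DEDJTR", 62.0), ("DEDJTR", 71.5), ("DELWP", 58.0),
    ("DELWP", 69.0), ("DJR", 64.5), ("DET", 75.0), ("DET", 60.0),
]

rates_by_dept = defaultdict(list)
for dept, rate in engagements:
    rates_by_dept[dept].append(rate)

for dept, rates in sorted(rates_by_dept.items()):
    print(f"{dept}: min ${min(rates):.2f}, max ${max(rates):.2f}, avg ${mean(rates):.2f}")
```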

In addition to the Who Buys What and How interactive dashboard—discussed in Part 2 of this report—the WA Department of Finance is developing interactive dashboards for its whole-of-government goods and service contracts that will show the comparative rates agencies have paid the same supplier. The intention of presenting a collective view of agency data is to encourage collaboration and the proactive sharing of advice across agencies, resulting in better outcomes. To protect the commercial sensitivity of this information, it will be available only to agencies.

Price tracking to ensure compliance

Some SPCs provide a negotiated discount—such as Oracle Software and Support—or a ceiling rate—such as PAS—for the goods or services provided. Price tracking against the contract is an important contract compliance measure.

Each department has an Internal Procurement Unit responsible for ensuring that procurement activity complies with VGPB policy.

The SPC user is primarily responsible for ensuring that the prices it pays accord with the SPC contract. However, purchasing decisions in user departments are made by different business units and are not all centrally tracked through the Internal Procurement Unit. This hinders the ability of SPC users to monitor compliance with SPC pricing.

DPC's June 2018 review into labour hire and professional services raised concerns with how departments check compliance of invoices with agreed rates on the Staffing Services SPC and ceiling rates on the PAS SPC.

Lead agencies also have a role in highlighting the variation in prices paid compared to the agreed rates and discounts outlined in an SPC agreement. However, they have limited visibility of purchasing decisions, partly due to the lack of centralised procurement information.

Lead agencies, as contract managers, should also conduct spot-check analyses of supplier-reported invoices for high-risk SPCs to ensure pricing validity and accuracy, including ensuring ceiling rates are not exceeded. However, they have not done so.

As Figure 3F discusses, in June 2017 DTF engaged a third party to develop a dashboard to identify spend above the ceiling rates agreed for the PAS SPC.

Figure 3F
Price tracking for the PAS SPC

Suppliers on the PAS panel have submitted 'not to exceed' prices for hourly rates charged for their consultancy services. The PAS model, agreed with stakeholders, requires purchasers to negotiate improved fees, thereby maximising value for money at the point of procurement. This process depends on the SPC user. DTF, as the lead agency, only sees the hourly rates charged later, when it receives quarterly reported data from suppliers.

Our analysis of DTF's dashboard revealed that although DTF has attempted to monitor compliance with PAS ceiling rates, significant work is required before the data is accurate, including on the number of hours worked. DTF advised that these numbers are not always accurate—consequently it is currently unable to monitor if SPC users have paid more than the ceiling rates.

With these data caveats in mind, we assessed the data reported by one PAS supplier in 2016–17. Using the number of hours and total spend reported by this supplier, we analysed how many transactions were above the ceiling rate. Of the 61 purchase orders, 10—or 16 per cent—appeared to be above the ceiling rate. Our analysis depends on the supplier correctly recording the number of hours charged, which may not be accurate. Regardless of whether this data is completely correct, it highlights the risk that payments above the ceiling rate may occur in the PAS SPC.

User departments are responsible for the amounts they agree to pay PAS suppliers. As a second layer of protection, DTF should continue to build capacity to perform this type of compliance monitoring to help user entities achieve value for money.

Source: VAGO based on information provided by DTF.
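The compliance check described in Figure 3F reduces to deriving an effective hourly rate from supplier-reported spend and hours, then comparing it with each supplier's ceiling rate. The sketch below shows that comparison in minimal form; the function, purchase orders, hours and ceiling rate are hypothetical, and, as noted above, the reliability of any real check depends on the accuracy of supplier-reported hours.

```python
# Illustrative sketch only: flagging purchase orders whose effective hourly rate
# exceeds a supplier's 'not to exceed' (ceiling) rate, as described in Figure 3F.
# All figures are hypothetical; supplier-reported hours may be inaccurate.

from typing import Dict, List

def flag_above_ceiling(purchase_orders: List[Dict], ceiling_rate: float) -> List[Dict]:
    """Return purchase orders whose implied hourly rate exceeds the ceiling rate."""
    flagged = []
    for po in purchase_orders:
        if po["hours"] <= 0:
            continue  # cannot derive a rate without reported hours
        effective_rate = po["total_spend"] / po["hours"]
        if effective_rate > ceiling_rate:
            flagged.append({**po, "effective_rate": effective_rate})
    return flagged

# Hypothetical supplier-reported data
orders = [
    {"po": "PO-001", "total_spend": 54_000.0, "hours": 200},  # $270/hr
    {"po": "PO-002", "total_spend": 30_000.0, "hours": 125},  # $240/hr
    {"po": "PO-003", "total_spend": 18_000.0, "hours": 60},   # $300/hr
]

for po in flag_above_ceiling(orders, ceiling_rate=250.0):
    print(f"{po['po']}: ${po['effective_rate']:.2f}/hr exceeds the $250.00 ceiling")
```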
