https://dataingovernment.blog.gov.uk/employer-routed-funding-service-assessment/

Employer Routed Funding - Service Assessment

The Employer Routed Funding service provides a single digital journey to take employers and training providers from initial interest in recruiting an apprentice, through to the completion of the apprenticeship. This includes:

  • control of funding provision and payments
  • comparison and selection of training providers
  • choosing the standard their apprentice will complete
  • choosing the assessment organisations

The digital service will offer assisted digital provision to help employers who have low digital skills.

Department / Agency:
BIS / SFA

Date of Assessment:
28/09/2015

Assessment Stage:
alpha

Result of Assessment:
Pass

Lead Assessor:
S. Wood

Service Manager:
G. Tucker

Digital Leader:
E. Stace


Assessment Report

Outcome of service assessment

After consideration, the assessment panel have concluded that the Employer Routed Funding service is on track to meet the Digital Service Standard at this early stage of development.

Reasons and Recommendations

The Employer Routed Funding service is part of the overall Apprenticeship Reform Programme. This will deliver a new funding system that will give employers control of apprenticeship funding. The service will provide a single digital journey that will allow employers to search for an appropriate set of rules for the apprenticeship, and give employers greater control over apprenticeship funding. The service will take employers and training providers from initial interest in recruiting an apprentice, through to completion of the apprenticeship. This will include: control of funding provision and payments, comparison and selection of training providers, choosing the standard that the apprentice will complete, and choosing the assessment organisations. The digital service will offer assisted digital provision to help employers who have low digital skills. The overriding aim is to shift the emphasis from providers to employers.

During the alpha service standard assessment, the assessment panel looked at the following areas.

User needs

The service team has clearly identified the user needs that will underpin the service. The policy intent (for employers to have more control over the training providers they use) is a user need for some employers, but possibly not all, as some will be satisfied with the current situation. Care must be taken here not to create pain points where none exist at the moment. However, this cannot be fully tested yet, as the policy around funding is yet to be finalised. The panel was pleased to see that over 70 users had been interviewed during discovery. Commendably, the team had carried out additional surveys with both employers (approximately 350 responses) and providers (approximately 250 responses).

Employers will be the main users of this service and it was good to see that testing and research was primarily taking place at employers’ premises. It was also encouraging to hear that, despite the obvious difficulties, all the team were taking part in user research.

The team has made a very positive start to assessing user needs for assisted digital support, particularly given that it can be difficult to identify businesses that don’t have digital skills or access. During their research, the team identified several companies that would require support, and have estimated a 5% ongoing need for support once the service is established. They will continue to work with these companies in beta, as well as engaging with Remploy. The team is considering all options for support and will start to solidify this in beta.

The team

The panel recognises that the service manager is experienced, knowledgeable, and highly competent, and the team is multidisciplinary and is only supplemented in one or two areas by a supplier. The team is clearly working in an agile environment; sprints that were originally three weeks long have been shortened and are now weekly. In part this was possible because the prototypes are no longer being built using HTML, with Axure being used instead. We would expect fortnightly sprints to become the norm once “real” code is used again, not least because short sprints can become relentless, and may not allow for sufficient thinking time. Show & tells are well attended, and the expected techniques are taking place. The service manager attends many, but not all, of the stand-ups, but this is not an issue as the team is highly competent and take turns in leading them.

It was confirmed that assisted digital support will be paid for by the service, and that people will not have to pay providers for support.

Technology

The panel was pleased to learn that fraud vectors have already been investigated, and that validation is already being carried out by the agency. Plans are being put into place for the service to plug into these. The agency’s data controller has been involved throughout the alpha phase. It was noted that the back-end service has already been involved in two successful service standard assessments, and there appears to be a depth of technical skill within the team.

Design

While it is perfectly acceptable for the alpha to be built using Axure, the panel noted non-standard layout and decorative elements, such as pictograms, which will need to be addressed during beta. Also, while it appeared that the alpha prototype was iterated frequently on the basis of user feedback, the panel felt the team had considered only a limited range of options. This risks the prototype not being the best possible option for the beta.

Content design

We were pleased to learn that the team had identified a number of content design issues during the alpha, especially with regard to the variety of different user needs for information and for start points to their interaction with the service. The team had already iterated their terminology within the service, but the change from ‘Standard’ to ‘Apprenticeship’ is a concern. The service currently uses the word ‘Apprenticeship’ to mean two different things: a specific instance of an apprenticeship (a job with training, also used by the candidate journey), and the choice of a type of apprenticeship. We strongly recommend that the team stop using the same word in two ways, find another word or phrase to describe the things formerly called ‘framework’ or ‘standard’, and then test that word with employers and providers. The panel noted that the team have not yet been able to conduct significant research on the funding model. The panel suspect that the word ‘funding’ may also prove to be problematic, as it suggests a payment to the employer rather than the involvement of the employer in choosing the training provider who will receive the payment. Given that there is one certainly problematic term, and one that is potentially problematic, the panel recommends the creation of a ‘controlled vocabulary’: a list of all the service-related nouns that are currently in use, to check that each noun is being used uniquely, and to help in testing the nouns with employers.

Analytics

The team is already making use of enquiries received by the helpdesk to inform the design of the user journey. A web metrics service is in place, but this might change. The team is still teasing out which Key Performance Indicators (KPIs) are needed, and is working with the GDS Performance Platform team to identify these. The user researcher will be responsible for assisted digital analysis. However, as there is no equivalent existing service, it will be difficult to establish benchmarks.

Service development

There is always a risk that the end of the alpha phase merely evolves into the beta service. Serious consideration should be given to taking the start of the private beta phase as an opportunity to draw a line under the work to date, and design the new service from scratch. Services should be based upon what has been learned, not what has been built.

The team should regularly engage with the GDS Design Team throughout beta, and it is advisable that this engagement starts as soon as the beta is underway. The service is complicated insofar as there are a number of user groups (employers, training providers, government staff, apprentices), some of whom will be perfectly happy with the existing model and could therefore be confused.

We also recommend the service take advantage of the wider GOV.UK design community. There are other government services in development that have similar features and user needs. For example, Digital Marketplace, Choose and Book (NHS), and Performance Tables (DfE). The team should collaborate with these and other services to establish common patterns that all can benefit from. GDS can support you in this.

The terms “apprentice” and “apprenticeship” mean different things to different people and did cause the panel some confusion. More work will be required to ensure users aren’t similarly confused when the service returns for the beta assessment.

Summary

The panel would like to congratulate the service manager and the team on passing the assessment. The team appears highly competent and is clearly working well. It was good to see that there is a member of the policy team involved, and that lessons and observations from user research sessions are helping to inform policy decisions. Equally it was good to hear that all members of the team took part in user research sessions. The panel looks forward to seeing the team again at the beta assessment.


Digital Service Standard criteria

Criterion   Passed      Criterion   Passed
1           Yes         2           Yes
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          Yes         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes