
NHS e-Referral - Service Assessment

The NHS e-Referral Service will succeed the current Choose and Book (CAB) service. CAB has been live in the NHS in England for 10 years, with over 60 million patient referrals processed in that time.

Department / Agency:
DH / HSCIC

Date of Assessment:
12/05/2015

Assessment stage:
Beta

Result of Assessment:
Pass

Lead Assessor:
J. Hughes

Service Manager:
B. Gildersleve

Digital Leader:
A. Bye


Assessment Report

Outcome of service assessment
After consideration, the assessment panel have concluded that the NHS e-Referral service should be given approval to launch as a Beta service. This means that the service is ready to be made available to the public on a beta basis, whilst continuing to be developed and improved for live.

The panel have, as requested, assessed the service against the beta criteria, but are concerned that, in reality, there is no planned beta phase for the service (see further detail below). According to the definition of a ‘live’ service, the planned launch means the service will effectively be live from 15 June 2015, since the existing system will be turned off at that point and all users will move to the new system.

Moving straight from alpha to live in this way, without a period of beta testing, is not recommended by the panel. The panel note that in this particular circumstance it would be very difficult (though not impossible) to do a phased beta launch, given the complexity and scale of the system, and they recognise that the team has done a lot of valuable work in advance to minimise the risk of things going wrong. However, the risk and impact of failure, given the complexity and scale of the system, are the precise reasons why a beta phase, rather than a big bang launch, is recommended for new digital services.

This successful beta assessment should not be interpreted as de facto approval of this approach. To approve the move into ‘live’, the service needs to undergo a live service standard assessment. At this stage, the panel think it is unlikely the service would pass such an assessment, because there is work still to be done during the beta phase to fully meet the criteria for a live service. The panel have made recommendations in this report to help the team take action to meet the required standards.

The assessment panel recommend that the service be submitted for a live assessment at the earliest opportunity, and definitely within 3 months of the service going into production.

Reasons
At the last beta assessment, the assessment panel proposed two possible ways forward for the service. The team has chosen the option that prioritises replacing the legacy technology whilst keeping the user experience broadly as it is under the old system (although the team has made improvements to the user interface based on feedback and research), providing a basis for further iteration and improvement.

The assessment panel think it would have been preferable, possible, and more consistent with the approach laid out in the Digital by Default Service Standard and the Service Manual, to transform the service more fully end to end from the outset. But the panel accept that the option the team has chosen represents a significant step forward, replacing legacy technology with a service that can be iterated and improved over time. The team is clearly committed to improving and iterating the service once it is in beta.

It is clearly important that the team now fulfils its commitment to improve the service over time - the risk is that, once the first phase (technology replacement) is achieved, investment in and commitment to ongoing improvement start to wane. The panel were convinced in the beta assessment that the team is committed and is putting resources in place to do this and, on this basis, were satisfied that the relevant beta criteria are met through this option.

Since the previous beta assessment of the service, the team has made a lot of progress and the service is now much more clearly on track to meet the required criteria. In particular:

  • The team has carried out some user research using a prototype and has developed a backlog of prioritised needs which it plans to work on after the service goes into production, and it has plans in place to carry out ongoing user research.
  • The team has developed a prototype for a new patient-facing service, using agile, user-centred methods - it is currently at prototype / alpha stage but if the team continues to test and develop that element of the service using these methods, the panel are confident it will represent a big improvement in the quality of the service for patients.
  • The service is already on gov.uk/performance, and the team has plans to expand the range of measures included on the dashboard during the beta.
  • The team has made significant progress in its understanding of assisted digital user needs and how support might be delivered, and has plans to test and develop support during the beta.

Recommendations
The assessment panel have seen sufficient evidence that the team is on track to meet the service standard requirements to pass a beta assessment, but to meet the requirements for live, the panel believe the team will need to complete work that is still underway on several aspects of the service standard. These are as follows.

  • iteration and improvement - the team is planning to monitor analytics and carry out ongoing user research to inform the future development of the service. It’s not possible for the team to do this until the service goes into production, because the legacy system doesn’t allow for analytics or frequent iteration. For a live assessment, the team would need to demonstrate evidence that it has iterated and improved the beta service on the basis of analytics and research.
  • research and analytics - the team recognises the importance of analytics and has a broad understanding of how it can use them to improve the service. During the beta, it will need to rapidly develop its plans and capability to match its aspirations to use analytics to improve the service. This includes having a clear point of responsibility and the capacity to carry out the required analysis.
  • The team is placing a lot of emphasis on the value of analytics in the new service, but has not yet put in place the capability that will be required to monitor and interpret the analytics, and is still recruiting for 3 research posts. This capability will need to be fully in place and working, with evidence of frequent user research and analysis taking place and feeding into ongoing improvements, in order for the service to meet the requirements for a live assessment.
  • design capability - the team does not include any designers - to fulfil the team’s commitment to ongoing iteration and improvement, it will be essential to fill this gap. The plan is to do this through new contractual arrangements with the development partner. For a live assessment the panel would need to see evidence that the design capability is fully embedded in the team and its ongoing development processes.
  • assisted digital - the team has assisted digital support in place for beta that will effectively complete the transaction on behalf of people, either in person or on the phone. The team plans to extend and develop the assisted digital offering to provide support for people to complete the transaction themselves. Research should be undertaken with assisted digital users at each point of the user journey to demonstrate how assisted digital support meets user needs.
  • publishing code - the team has not yet established its policy for opening its code (this is partly contingent on corporate decisions within the organisation), or published any components of it, although it is sharing code within the organisation and is using and contributing to open source tools. The team has identified some components that can be published during the beta, and for its new patient-facing service will start publishing code as that part of the service moves into beta. The panel were satisfied that there was sufficient potential to meet the requirement for a live service, providing the team makes significant progress during the beta phase in developing its formal policy and releasing its code.

Use of design patterns and style guide
The service going into beta does not use the GOV.UK design patterns and style guide, because of the phased approach the team has taken (technology replacement first, with UX iteration on that basis). The panel have accepted this decision, based partly on the inability to measure and improve the legacy system’s user interface, as sufficient for starting the beta phase, on the assumption that the team will iterate and improve the service, with significant design input, from the start of the beta onwards. To meet this criterion at a live assessment, the service will need to have gone through significant iteration and design improvement, based on user research and design work in the context of GOV.UK and NHS.UK.

Summary
The panel would like to recognise and congratulate the team for the progress it has made in a complex technical, operational and stakeholder environment.


Digital by Default Service Standard criteria

Criteria  Passed    Criteria  Passed
1         Yes       2         Yes
3         Yes       4         Yes
5         Yes       6         Yes
7         Yes       8         Yes
9         Yes       10        Yes
11        Yes       12        Yes
13        Yes       14        Yes
15        Yes       16        Yes
17        Yes       18        Yes
19        Yes       20        Yes
21        Yes       22        Yes
23        Yes       24        Yes
25        Yes       26        Yes