
https://dataingovernment.blog.gov.uk/secure-communications-service-assessment/

Secure Communications - Service Assessment

Secure Communications allows third parties to send information securely to DWP. The current scope of the service allows GP surgeries and Macmillan nurses to send medical information to the department.

Department / Agency:
DWP

Date of Assessment:
14/5/2015

Assessment Stage:
alpha

Result of Assessment:
Pass

Lead Assessor:
D. Williams

Service Manager:
R. Woods

Digital Leader:
K. Cunnington


Assessment Report

Outcome of service assessment

After consideration, the assessment panel concluded that the Secure Communications service is on track to meet the Digital by Default Service Standard at this early stage of development. However, the panel identified some areas where the service needs to demonstrate considerable improvement before coming in for a beta assessment. These are outlined in the recommendations below.

Reasons

User needs and user research

The service is currently a proof of concept demonstrating how GPs and Macmillan nurses can submit DS1500 forms online to DWP. Currently this can be done by post or, for those registered to do so, by emailing a Word document version of the form. The service team had insight into user needs here and had established that the current process leads to a large number of delays, with forms lost in the post or referred back to the issuer because of errors and/or illegibility.

The needs that the service is currently addressing were identified from a mixture of desk research, focus groups, in-depth interviews and ‘day in the life of’ visits to 20 GP surgeries. The team showed evidence of how they used feedback to determine that users wanted a web-based service. The team have gathered evidence of the need to make the process quicker for both the patient (where time sadly really is of the essence) and the GP - one observation was that for a GP even 10 seconds saved is worthwhile. The team showed some knowledge of as yet unmet user needs, some of which they expected to address shortly, while others were awaiting prioritisation in the backlog.

The team have plans to use lab testing and to engage further with GPs through the British Medical Council (BMC) and the forum on DWP, and possibly with patients through these channels (with Macmillan nurses perhaps acting as a proxy, given the sensitivity of the subject). Analytics will be employed when the online form comes into use.

The team showed how they have made some changes following evidence gathered from research.

The team

The service has a dedicated team with one empowered lead service manager. It is likely that the team will expand in the beta stage and be divided into two, with each focussing on separate but strongly linked tasks. The team demonstrated an understanding of how these tasks will be coordinated.

The team is using agile and provided evidence of adhering to the key principles including adapting processes as required. They are using Scrum and working in two-week sprints.

Security, privacy, tools and standards

The team are visible within the ‘security community’ at DWP by having representatives attend their show & tells; the team also update their CESG forum. In addition, the DWP identity team regularly visit and are often embedded within the team. The team have regular communications with the security transactions team for risk discussions. The overall strategy is to be "noisy" so as to encourage engagement with other departments.

The DWP data protection team have been engaged and assessments are ongoing.

There is an intention to make the code available (excluding NHS code) and the team would be happy for it to be used. Senior management are aware of this and the team expect to be able to proceed. The service will need to evidence making code available under an appropriate open source licence at the beta assessment.

Design

The design of the service is still very much a work in progress which is understandable at this alpha stage. There has been end-to-end testing with GPs and this will continue as the panel would expect.

The current hardcopy form requires the patient’s National Insurance number; as user feedback suggests this is frequently not known by the user, the team will be challenging this requirement.

There is a reliance on an external designer; as this is not a dedicated resource, it is a vulnerability.

Assisted digital and channel shift

The service team has demonstrated that assisted digital (AD) support does not need to be provided at this time. If the scope of the service changes the user base, the team may need to undertake research with AD users and design, test and provide appropriate AD support which meets user needs.

Analysis and benchmarking

Prototyping has involved end-to-end testing with GPs and there have been four versions to date. The team separated smart card and form journeys, and have tested the data gathering element (i.e. the form) more extensively. Lessons were learned by observation and incorporated into the process.

The team intend to use Nagios and possibly Google Analytics. There is an aspiration to use analytics to verify user research; however, this is dependent on users accessing and using the service during beta.

The team have thought about how to measure success in addition to the 4 mandated key performance indicators (KPIs).

Benchmarking will be problematic as data on the current service is poorly defined; however, efforts are being made to engage with operations managers to take measurements and establish a baseline.

Recommendations

User needs and user research

Point 1 - Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for digital and assisted digital service design.

Point 2 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

Point 20 - Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users.

Concentrate on planned user research with actual users and ensure the service is regularly tested end-to-end.

Integrate new designs that have tested well into the service and test these with users.

Continue to involve the whole team in user research and help the team understand the user needs this service will be meeting.

The team

Point 2 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

The team mentioned a plan for beta to reorganise into two teams. This should reduce the prioritisation tensions between the DS1500 piece of work and the smartcard piece of work, and allow for more focus on user experience. The panel supports this approach.

Recruit a full-time designer and a full-time content designer to work with the service team. This is a complex and sensitive service, and without proper analysis and design the user experience will not move past an online replica of the existing paper form. The designer and content designer (when recruited) should work alongside the team, get involved in user research and feed into the design and flow of the service.

Assisted digital

Point 10 - Put appropriate assisted digital support in place that’s aimed towards those who genuinely need it.

Point 11 - Plan (with GDS) for the phasing out of any existing alternative channels, where appropriate.

Carry out further research to identify users with assisted digital needs and develop proposed support to meet user needs and the assisted digital standard.

Design and content design

Point 9 - Create a service that is simple and intuitive enough that users succeed first time, unaided.

Point 13 - Build a service consistent with the user experience of the rest of GOV.UK by using the design patterns and the style guide.

The service team should work with DWP teams and NHS teams to ensure user journeys around the service provide the best experience for users.

The designer and content designer (when recruited) should collaborate with the content community at GDS and across government to ensure that the service adopts the style patterns and best practice that have proved successful in comparable services.

During the next assessment, the service team should be prepared to show more examples of how evidence gathered from user research and testing has informed the service design.

Analytics, benchmarking and reporting

Point 7 - Establish performance benchmarks, in consultation with GDS, using the 4 key performance indicators (KPIs) defined in the manual, against which the service will be measured.

Point 18 - Use analytics tools that collect performance data.

Point 21 - Establish a benchmark for user satisfaction across the digital and assisted digital service. Report performance data on the Performance Platform.

Point 22 - Establish a benchmark for completion rates across the digital and assisted digital service. Report performance data on the Performance Platform.

Point 23 - Make a plan (with supporting evidence) to achieve a low cost per transaction across the digital and assisted digital service. Report performance data on the Performance Platform.

Point 24 - Make a plan (with supporting evidence) to achieve a high digital take-up and assisted digital support for users who really need it. Report performance data on the Performance Platform.

Work with the GDS performance platform team to have a dashboard measuring performance against KPIs publicly available when you are ready for public beta. The panel recommends that the team consider measuring abandoned versus successful submissions in addition to the mandatory KPIs.
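As a minimal illustration of that extra measure (the counts and figures below are invented, not the service’s data), abandoned versus successful submissions reduces to a simple ratio of completed to started forms, which the team could report alongside the mandatory KPIs:

# Minimal sketch: completion rate and abandonment derived from counts of
# started and successfully submitted DS1500 forms. The counts would come from
# whichever analytics tool the team adopts; the numbers here are examples only.
def completion_rate(started: int, completed: int) -> float:
    # Proportion of started submissions that were completed.
    return completed / started if started else 0.0

started, completed = 250, 200
print(f"completion rate: {completion_rate(started, completed):.0%}")
print(f"abandoned submissions: {started - completed}")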

Open standards and common government platforms

Point 16 - Use open standards and common government platforms (e.g. GOV.UK Verify) where available.

The PDFs generated in the service should be PDF/A to comply with open standards. The team should familiarise themselves with the government Standards Hub.
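As an illustrative sketch only (the report does not describe the team’s PDF pipeline), one common way to produce PDF/A output is to post-process the generated PDF with Ghostscript and validate conformance separately; the file names below are placeholders and the approach assumes a recent Ghostscript installation:

# Illustrative sketch: converting a generated PDF to PDF/A-2 by shelling out
# to Ghostscript (assumes the gs binary is installed). Full conformance also
# depends on embedded fonts and colour profiles, so the output should be
# checked with a validator such as veraPDF.
import subprocess

def convert_to_pdfa(input_path: str, output_path: str) -> None:
    subprocess.run(
        [
            "gs",
            "-dPDFA=2",                      # target PDF/A-2 conformance
            "-dPDFACompatibilityPolicy=1",   # drop elements that would break conformance
            "-dBATCH",
            "-dNOPAUSE",
            "-sDEVICE=pdfwrite",
            "-sColorConversionStrategy=RGB",
            f"-sOutputFile={output_path}",
            input_path,
        ],
        check=True,
    )

convert_to_pdfa("ds1500.pdf", "ds1500-pdfa.pdf")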

The team mentioned that many GPs use off-the-shelf software to manage patient records, but have yet to approach the software providers to discuss possible integrations. Even a very simple API to pre-populate the patient’s name and address could save valuable GP time and improve the user experience.
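To illustrate the kind of integration the panel has in mind, a minimal sketch of such a pre-population lookup follows; the endpoint path, lookup key and in-memory record store are hypothetical and are not part of the team’s design:

# Hypothetical sketch only: a minimal lookup endpoint that a GP system could
# call to pre-populate the patient's name and address on the DS1500 form.
# A real integration would also need authentication (for example the existing
# smartcard) and appropriate information governance.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the GP practice system's patient records.
PATIENT_RECORDS = {
    "9434765919": {
        "name": "Jane Example",
        "address": "1 Example Street, Exampletown, EX1 2MP",
    },
}

@app.route("/patients/<nhs_number>", methods=["GET"])
def patient_details(nhs_number):
    record = PATIENT_RECORDS.get(nhs_number)
    if record is None:
        abort(404)
    # Return only the fields needed to pre-fill the form.
    return jsonify(record)

if __name__ == "__main__":
    app.run()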

Make source code open and reusable

Point 15 - Make all new source code open and reusable, and publish it under appropriate licences (or give a convincing explanation as to why this can’t be done for specific subsets of the source code).

Continue the work to open source code.

Testing the end-to-end service

Point 17 - Be able to test the end-to-end service in an environment identical to that of the live version on all common browsers and devices. Use dummy accounts and a representative sample of users.

Ensure the service has been penetration tested.

User data and security

Point 3 - Evaluate what user data and information the service will be providing or storing, and address the security level, legal responsibilities, and risks associated with the service (consulting with experts where appropriate).

The team should continue to review the target security level to ensure that it is neither too low nor, importantly, too high. Given the specific fraud risks around this service, the team should consider whether, for example, requiring multiple smartcard authentications is excessive.

Testing end-to-end

Point 17 - Be able to test the end-to-end service in an environment identical to that of the live version on all common browsers and devices. Use dummy accounts and a representative sample of users.

The team should follow-up with the Health & Social Care Information Centre (HSCIC) to better understand the future roadmap for NHS staff authentication, particularly with regard to use of a broad range of devices and browsers.

Testing with the minister

Point 26 - Test the service from beginning to end with the minister responsible for it.

The team are aware of the need to test the service with the minister responsible for it and plan to do so before the service moves into live.

Summary

The panel were impressed with the cohesion and skill set within the team. The team demonstrated a passion and dedication to providing the best possible solution for users, and a deep understanding of the benefits for the patients, who will ultimately be the main beneficiaries during a very difficult time.


Digital by Default Service Standard criteria

Criterion  Passed    Criterion  Passed
1          Yes       2          Yes
3          Yes       4          Yes
5          Yes       6          Yes
7          No        8          Yes
9          Yes       10         Yes
11         Yes       12         Yes
13         No        14         Yes
15         No        16         Yes
17         No        18         Yes
19         Yes       20         Yes
21         No        22         No
23         No        24         No
25         Yes       26         Yes