
Charity Commission’s Digital Services - Service Assessment

The service will allow a person authorised to act on behalf of a charity to make a change to the charity’s governing document so that it can operate more effectively.

Department / Agency:
The Charity Commission

Date of Assessment:
22/5/2015

Assessment Stage:
alpha

Result of Assessment:
Pass

Lead Assessor:
M. Harrington

Service Manager:
C. Cooke

Digital Leader:
C. Cooke


Assessment Report

Outcome of service assessment

After consideration, the assessment panel have concluded that the Charity Commission's Digital Service is on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons

User needs

The service team have a good understanding of their users and their needs. This understanding has been gathered through a range of user research methods, and feedback is already being used to iterate and improve the service.

The team

The alpha phase has been used well to understand options for delivery and it was good to hear that the team had considered the options available to them. There is a multidisciplinary team in place, with technical skills from the software provider. The team are using agile processes and have developed the alpha in four sprints. It was positive to hear that there was a cycle of sprint planning, daily stand-ups, showcases and retrospectives.

Security

Unfortunately the authentication mechanism in use is not fit for purpose (see recommendations below).

Privacy

The team have a good understanding of the data they are capturing and the bulk of it is intended to be part of a public register; there are no obvious privacy concerns surrounding the data currently captured. The data sensitivity assessment currently in progress should inform future direction. Some concerns are outlined below.

Tools

The team have made pragmatic choices that let them iterate fast within the alpha - the tool chosen to design and capture form submissions has a low learning curve and reasonably flexible deployment and support options (on-premises, Infrastructure as a Service (IaaS) or hosted by the vendor). While the tool is proprietary, the system uses XML to interface with downstream systems, which would allow for a migration at a later date if needed. The tool supports acting as a Security Assertion Markup Language (SAML) service provider, so should be interoperable with an improved authentication mechanism.
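For illustration only, a downstream XML interchange payload might look something like the sketch below. The element names and values here are invented, not the vendor tool's actual schema, and the sketch uses Python's standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical interchange payload: all element names and values are
# illustrative only, not the vendor tool's actual schema.
submission = ET.Element("submission", service="change-governing-document")
charity = ET.SubElement(submission, "charity")
ET.SubElement(charity, "registeredNumber").text = "1234567"
change = ET.SubElement(submission, "change")
ET.SubElement(change, "type").text = "amend-objects-clause"
ET.SubElement(change, "submittedBy").text = "authorised-trustee"

# Serialise for handoff to a downstream case-management system.
print(ET.tostring(submission, encoding="unicode"))
```

Because the interchange format is plain XML rather than a proprietary binary format, a future migration away from the tool would only need to reproduce the agreed schema, not the tool itself.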

Standards

The software provider's product allows sharing of resources between customers, and the team have taken advantage of the GOV.UK toolkit which skins the product for government services. Use of XML for data interchange is also a sensible choice. There is a need to work on standards for identity assurance and authentication (see recommendations).

Improving the service

The team have been able to rapidly iterate the service during the alpha phase meaning multiple variants have been tested with users. There is a backlog of prioritised work which has been created as the service has been tested and iterated. The decision was made to make changes weekly so that versions could be tested with multiple users. Changes were made by the team using the graphical user interface (GUI) provided by the software provider. A change to a form does not stop a user from completing a transaction, and does not require service downtime.

Design

The team has made great progress in understanding user needs and iterating the service on the basis of that research. The team has significantly improved the flow and content design of the service and has good ideas for how the service can be made better during the beta phase.

Assisted digital and digital take-up

The team have not considered assisted digital at this stage. No testing has been done with assisted digital users and there has been no consideration of assisted digital support. The fact that the existing service is online only does not mean that no one needs, or is currently receiving, help to use it.

Analysis and benchmarking

The existing service has analytics in place and the team have used this to gain insight and influence the design of the new service. Analytics has been installed on the alpha and is ready to be used in the beta phase. The team are engaged with the Performance Platform. There is historic data available which could be used on the Performance Platform.

Recommendations

User needs

The team should investigate what the actual service or services should be (just the two transactions presented, or a service incorporating all changes to charities). This will affect the name of the service and how it is represented on GOV.UK.

During the alpha, many of the issues found were around language use. During the next phase it is important to make sure that recommendations about language can be acted upon. The team should also consider widening the research scope to include login and internal case-workers, so that the whole user journey can evolve to a point where it is fast and straightforward.

The team has identified a significant proportion of professional agents among the service's likely users. Research needs to be carried out with this user group to understand whether their needs are different and whether the service being developed is appropriate for them.

The team

Currently, many members of the team take on two or more roles. This has been fine for the alpha phase but as the product develops to integrate with the back-end system and take on more services there must be a clearer separation between key roles.

Security and privacy

The authentication mechanism is not fit for purpose, as there is a single shared credential for an entire organisation. This poses severe risks to privacy, transaction non-repudiation and the reputation of the service. A solution that does not rely on shared credentials needs to be put in place as a matter of urgency. Until such a solution is in place the team has no assurance of the identity of the individual transacting, and therefore cannot determine their authority to transact without an out-of-band (i.e. non-digital) step. We were pleased to see that the team is aware of the shortcomings of the authentication mechanism and has planned a review.
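As a minimal sketch of the direction recommended (assuming nothing about the team's chosen stack, and using invented names throughout), per-user credentials allow every transaction to be attributed to a named individual, which a shared organisational login cannot:

```python
import hashlib, hmac, os, time

# Hypothetical illustration: each individual holds their own salted
# password hash, rather than one credential shared by an organisation.
users = {}

def register(username: str, password: str) -> None:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    users[username] = (salt, digest)

def authenticate(username: str, password: str) -> bool:
    salt, expected = users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, expected)

audit_log = []

def record_transaction(username: str, action: str) -> None:
    # Non-repudiation depends on tying the action to an individual,
    # not to a shared organisational account.
    audit_log.append({"who": username, "what": action, "when": time.time()})

register("j.smith", "correct horse battery staple")
if authenticate("j.smith", "correct horse battery staple"):
    record_transaction("j.smith", "amend-governing-document")
```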

As the service is highly dependent on third party software and hosting, the team should take care to procure this in a sustainable way. Timely security updates, security monitoring, service and data resilience will need to be considered in any procurement.

We understand that currently the changes requested by service users generate casework which is dealt with by Charity Commission staff and that there is a desire to eliminate casework from “low risk” transactions. A thorough analysis of the threats to the system will be needed to achieve this, and it seems likely any work on this would need to proceed in tandem with, or after dealing with, the authentication mechanism.

Tools

There is a danger of becoming locked-in to the software provider’s tool. To avoid this we would recommend:

  • Ensuring that the capability to use the tool to design and modify forms exists within the organisation rather than via supplier staff - these members of staff would play a key role in evaluating alternatives and conducting any future migration.
  • Ensuring that the infrastructure is well understood by a group of technical staff within the Charity Commission, to support both running the contract with the supplier and operational support.
  • Continuing to evaluate the software provider and considering other suppliers if they do not meet your needs - we are keen to see, at the beta assessment, the evaluation report that is a planned outcome of the alpha phase.

Standards

There is an ongoing stream of work within GDS on registers. As the service, overall, manages updates to a register, there’s potential to contribute to, and benefit from, the experience of other bodies that maintain similar registers.

The "Cross Government Organisation and Authority Management Working Group" includes a number of interested parties across government with identity assurance requirements to process transactions on behalf of organisations. It is possible that GOV.UK Verify could meet a part of these needs.

Improving the service

Currently the service is hosted on a single machine by the software provider. While this has been fine for alpha, the team should ensure they understand what is needed for running a service at beta.

Design

There will need to be significant work on the visual design style and HTML coding in the beta to fully conform to the service design manual and design patterns. The team will need to include a front-end developer who can quickly iterate the code and make changes to the template.

The service must work well on mobile, preferably through the adoption of a 'mobile first' approach, and there should be automated browser testing as well as manual testing to make sure the service works correctly and renders properly. We would expect there to be accessibility testing on the service and for its recommendations to be incorporated.
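As a hedged sketch of what an automated browser check might look like, the snippet below uses Selenium (one of several possible tools) to load the service at a phone-sized viewport; the URL and the CSS selector are placeholders, not the service's real ones:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Emulate a small mobile viewport; the URL and selector are placeholders.
options = webdriver.ChromeOptions()
options.add_argument("--headless=new")
options.add_argument("--window-size=375,667")  # roughly a phone screen

driver = webdriver.Chrome(options=options)
try:
    driver.get("https://example.gov.uk/change-governing-document")  # placeholder
    # Basic smoke checks: the page loaded and the start button is visible.
    assert "governing document" in driver.title.lower()
    start_button = driver.find_element(By.CSS_SELECTOR, "a.button-start")
    assert start_button.is_displayed()
finally:
    driver.quit()
```

Checks like this can run against every weekly form change, catching rendering regressions before they reach users.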

The team should continue to iterate the content design and language of the service. A content review will be supplied separately.

Assisted digital and digital take-up

The team need to understand their assisted digital users and how support is currently being received. The team should put together a model of support designed to meet the identified support needs of the users who need it. As discussed at the assessment, voluntary associations would be a good place to start this research, as a way of finding users with the lowest levels of digital skills and confidence. This report might also provide a useful starting point for analysis of assisted digital needs.

The team should carefully review what is expected at beta before the next assessment.

Analysis and benchmarking

The team should consider the key performance indicators (KPIs) that will measure the success of the service, in addition to the four KPIs stated in the manual. It is also recommended that the discussions with the technology supplier about the ability to undertake multivariate (A/B) testing are continued.
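For reference, the four mandatory KPIs in the service manual are cost per transaction, user satisfaction, completion rate and digital take-up. As a minimal sketch with invented figures, two of these can be derived directly from transaction counts:

```python
# Illustrative figures only; real values would come from the service's
# analytics and be reported via the Performance Platform.
started_online = 1_200      # users who began the online transaction
completed_online = 960      # users who finished it online
completed_offline = 240     # transactions handled via non-digital routes

completion_rate = completed_online / started_online
digital_take_up = completed_online / (completed_online + completed_offline)

print(f"Completion rate: {completion_rate:.0%}")   # 80%
print(f"Digital take-up: {digital_take_up:.0%}")   # 80%
```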

Summary

It was particularly good to see the alpha phase being used to rapidly build a product showing the art of the possible in just four weeks. The team have also done some great work to bring the business along on the journey. The panel are confident in the team's ability to pick up the recommendations in this report and make progress in beta.


Digital by Default Service Standard criteria

Criterion   Passed      Criterion   Passed
1           Yes         2           Yes
3           No          4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          No
11          Yes         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes
19          Yes         20          Yes
21          Yes         22          Yes
23          Yes         24          Yes
25          Yes         26          Yes