Individual Electoral Registration (IER) is part of the wider electoral registration transformation programme being undertaken by the Cabinet Office, aimed at increasing the accuracy of the electoral register.
IER will reform how individuals register to vote. Instead of the ‘head of the household’ supplying the details of all people living at the same address (which can result in fraud and errors), IER will require people to register individually. In line with legislative changes, IER goes live in England and Wales on 10 June 2014, and in Scotland after the referendum on 19 September 2014.
The IER digital service (IER-DS) allows us to meet the needs of people that arise from these changes in legislation. IER-DS allows people to submit applications online for the first time, and provides mechanisms for verifying the details of people on existing electoral registers (46 million) and of those who apply online, over the phone and on paper.
Department / Agency:
Date of Assessment:
After completing the assessment the panel can confirm the Individual Electoral Registration (IER) service has shown sufficient evidence of meeting the Digital by Default Service Standard and should go Live as a Digital by Default service on GOV.UK.
The team have built a service so good, people should prefer to use it.
The service was assessed against and has met all 26 points of the Digital by Default Service Standard.
The service team clearly demonstrated how they have built and developed the service by starting with user needs, tackling the “no chance for a Beta” blocker with continuous user research among well-identified user groups. Feedback and input have been sought from both end users of the digital service and backend users (Electoral Registration Officers and department colleagues).
The service team gave good examples of how they designed and iterated the service with the needs of users in mind, collaborating with and contributing to the GDS-wide user-centred design community.
The team has a user-led iteration plan in place and reliable sources of user feedback. The Service Manager and Product Manager are responsible for and empowered to take user feedback to a dedicated development team to enable continuous improvement.
The service team is an excellent example of how a multi-disciplinary team, sat together, using agile delivery principles, can design and build a user-focused service quickly and flexibly. There were excellent examples of team collaboration, demonstrating how the whole team are involved in research analysis and story writing, and hence have a real understanding of the context and value of their work.
GDS were encouraged and impressed by the team’s dedication, particularly in challenging legislation to overcome stumbling blocks in the service for users. This is a great example of user-focused service design influencing policy and legislation.
GDS note that there is a phased transition plan in place to hand the service over to a Cabinet Office team later in 2014. GDS would urge that the expertise and understanding the current service team has accrued are thoroughly handed over to the new team.
Security, Privacy, Tools and Standards
The service team demonstrated a thorough understanding of and commitment to Data Protection laws, and have thought carefully and conducted research around handling the question about inclusion on the Open Register.
The service is using Open Source throughout, restricting SaaS to peripheral features, eg GitHub for code control, and can switch suppliers easily. As the service is seasonal, agreements are in place so the service can scale up and down easily, saving money in periods of low usage.
There are plans to open up the code once the service is Live; the assessment panel were very pleased to hear that the team is moving non-sensitive source code into a public repository. The team is building a cross-government platform for address lookups, which will be very useful. There are multiple environments, and the team has tested across multiple browsers and devices (determined by usage stats), either physically or using emulation. The service is designed ‘mobile-first’.
The service team has tested resilience and worked out plans for ‘worst case scenarios’, particularly during the 3-week lead time before a general election. Bandwidth issues have been addressed and the service has flexible operational support contracts in place.
The service team has detailed plans in place for various gradations of failure of the digital service. During the assessment the service manager explained how the service could be deployed elsewhere, or a fallback to the current manual system could be employed in the case of extreme emergency.
The service has a flexible deployment process allowing for very frequent deploys if needed. Current estimates put a fix going from commit to live in under an hour, without the need for downtime releases.
To compensate for no Beta, the service team have carried out intensive user research over the last 6 months, focusing on reluctant or less confident users. The research (allowing for the slight bias of moderated research) gave the panel confidence that this service is so intuitive that users will succeed first time. The inline help text was thoroughly researched and tested, helping people move through the service. The panel particularly liked the approach to this - only adding Help where research had indicated it was useful. The intention is to take it away if stats show few people are using it.
The assessment panel was really impressed with the efforts around the non-digital aspects of the service, constrained currently by legislation dictating provision of the paper form. The service team have gone to extra lengths to work with partners on calls to action to encourage takeup of the digital service, collaborating with other service teams facing similar issues.
The benefits of having a content designer and designer embedded in the service team are demonstrable by the quality of the copy and design, and adherence to the GDS Style Guide and Design Patterns Toolkit.
Analysis and Benchmarking
The service showed expert knowledge of setting KPIs and interpreting data, especially around completion rates and successful journeys. Currently the service is collecting its own analytics; the panel suggested that removing Google Analytics could be an option should privacy concerns make this worth serious consideration. The team is collaborating with GDS on the Performance Platform dashboard, the start and done pages, and the user satisfaction survey.
The service team has plans for increasing digital takeup, currently predicted to start at ~60%. GDS acknowledge the complexity in determining a cost-per-transaction, and the lack of a comparable current service, making any benchmarks somewhat arbitrary. However the panel encourage the team to work with the GDS Analytics team to determine some figures, detailed in the recommendations section below.
Assisted Digital and Channel Shift
The service team has worked with the Electoral Commission to turn the aboutmyvote website into a resource for local authorities, with relevant public user needs met on GOV.UK.
Despite legislative restrictions, the service team and the Cabinet Office have put in place comprehensive assisted digital support for this service through a broad range of 3rd party providers, in line with GDS guidance. The team have briefed and trained providers, who are expected to have more time for assisted digital due to the new digital service.
Recommendations
1. Closely monitor plans for Assisted Digital (AD) delivery of this service.
The Assisted Digital service has not been tested, as the digital service could not be fully tested during Beta (by law). The panel recommend that assisted digital provision is closely monitored in the early stages of Live and iterated if it is found not to meet user needs.
As an exemplar, the service must work with the GDS Assisted Digital team as the cross-government model for assisted digital support is developed, to share their best practice and ensure that the service’s provision remains consistent with the government standard.
2. Gauge the satisfaction rate with the current, offline, register to vote service.
GDS understands the concerns that any comparison of completion rates would be comparing different policies. However, a more detailed look at current completions may give a benchmark for similar issues that may arise with the new service. GDS note the service team’s concerns about surfacing a value without context. Work could be done to ascertain current baseline costs and those projected from paths through the new version. The GDS Analytics team would be happy to advise on this.
Digital by Default Service Standard criteria