Tax Credits Change of Circumstances allows users to view and update the information used to calculate their tax credits. The aim is to offer an easy-to-use service that encourages and supports users to advise HMRC of the correct information at the correct time, thereby reducing the levels of error, overpayment and debt.
Department / Agency:
HM Revenue & Customs (HMRC)
Date of Assessment:
11 November 2014
Result of Assessment:
The Tax Credits Digital Change of Circumstances service has been reviewed against the 26 points of the Service Standard at the end of the Alpha development.
After consideration, the assessment panel have concluded that the Tax Credits Digital Change of Circumstances service is on track to meet the Digital by Default Service Standard at this early stage of development.
The service manager and the service team spoke knowledgeably about the service and had done a considerable amount of research to understand user needs during discovery and alpha.
The service team had developed and iterated the prototype quickly during the alpha, based on feedback from users. The assessment team appreciated being able to see the alternate designs that were initially considered, as well as a video walk through.
The current multidisciplinary team will remain in place throughout the next stage of development.
The service manager showed a good understanding of assisted digital principles, but the team did not demonstrate sufficient research with assisted digital users and it was too early in development to meet the service standard for alpha on this point. The team have good plans to undertake research with existing third party providers of support and to join up with related government services. To pass the service standard at beta, the team must undertake research with assisted digital users to be able to demonstrate that their proposed assisted digital support meets user needs and the assisted digital standard (for example, low wait times for calls). The team must also demonstrate that support is sustainable and does not rely on third parties such as charities to provide ongoing support to their users.
The team weren’t able to talk the assessment panel through their plan for making all new source code open and reusable or, where necessary, the reasons why subsets of the code could not be open. The assessment panel understand that HMRC has an open source policy, and that there are discussions about the security of opening up the code, but it is our opinion that prototypes have very little security impact and therefore should already be open sourced. The team appear to believe they need to justify open sourcing the prototype in the same way they would need to justify open sourcing real production code. The assessment panel feel that the source code for the prototype would be a valuable resource for other projects to learn from.
Safety and Security
The assessment panel had some serious concerns about the security implications of the service, in terms of sharing information with users, and whether the team is receiving the appropriate advice to ensure that they carefully balance the security needs and the user needs. The panel feel that, at this early stage, this does not preclude advancement from the alpha stage as it can be addressed during the beta phase. Whilst a great deal of security work has already been done, the panel strongly recommend that the service manager continues to talk to the appropriate specialists.
A content designer and product analyst (or an appropriate amount of time from each of these disciplines) should be embedded in the service team during the next stage of development.
The largest volume channel for users to tell HMRC about a change of circumstances affecting their tax credits is currently over the phone. Using call centre data and speaking to the people who handle those calls could help inform user needs; engaging with the call centre(s) early should be considered.
The panel would also suggest, at the next stage of development, focussing more on what users do and what works for them, rather than asking them for feedback. For example, the panel noted that the A/B user research the team explained at the assessment involved asking users for a preference, rather than monitoring the effectiveness of the different options.
The team should consider how follow-on actions are presented to users (eg notifications), and make sure that the concept of ‘pending’ changes is well understood.
The assessment panel had some concerns that the design that has been worked on in the prototype would not be possible to achieve during the public beta because of technical constraints. The panel did not see evidence that the intermediate state of the service was being carefully designed as well. The panel recommend that the service team work both on the intended end state of the system, and the system that is achievable within the constraints of the service development timeline.
A large percentage of the service’s users are likely to be on mobile devices, but the prototype is not currently optimised for these users. The service should be designed with this in mind, potentially using a mobile-first strategy.
The team had an early discussion with the Performance Platform team. They should renew these discussions to establish how they are going to track the 4 KPIs and report them on the Performance Platform.
Digital by Default Service Standard criteria