The Legal Aid Agency administers the Legal Aid budget for England and Wales. The Advocate Defence Payments service aims to replace the existing paper process by which barristers and solicitor advocates submit claims for Crown Court cases.
Department / Agency:
Date of Assessment:
Result of Assessment:
Outcome of service assessment
After consideration, the assessment panel have concluded that the Advocate Defence Payments service is on track to meet the Digital by Default Service Standard at this early stage of development.
The panel found that the Advocate Defence Payments team demonstrated a sound understanding of the task they have embarked upon. Together the team confidently and competently demonstrated that they understood many of the issues that they faced and showed that they were flexible enough in their approach to be able to respond to user needs that are emerging from the research. For example, having spoken to around 60 people the team identified a new user type – the claim chaser.
The team is working with two broad groups of users who work at different ends of the service: advocates’ clerks and Legal Aid caseworkers. The assessment panel liked the plans to bring the two groups together to avoid the “them and us” culture. The panel would like to see the outcome of this when the Advocate Defence Payments team returns for their beta assessment.
The panel also liked that there was a desire in the team to test with users in their offices, where they do their day-to-day work.
The panel were pleased to hear that the accreditor had been engaged at an early stage of the delivery.
The panel also liked the way that the Advocate Defence Payments team recognised that their service was part of a wider transformation programme. On that same note, the panel were pleased to know that the team will be talking to other MOJ colleagues about approaches to remove the need for a "wet signature". Clearly more work needs to be done here, but the Advocate Defence Payments team recognise this.
Although engagement with the GDS Performance Platform has been tentative, the panel believe that the team have time to rectify this. The panel were interested to hear that the Legal Aid Agency has its own internal dashboard, as well as that the team thought about the need for KPIs over and above the standard four that GDS recommends.
The Advocate Defence Payments team had started by simply replicating the existing paper form. While this is not unusual, it does mean that issues with that form are carried over to the online service. However, the panel were assured that the team recognised the flaws in this approach and explained how they intended to tackle them.
Additionally, the role of the business analyst (BA) seemed to be underplayed in the delivery team. The panel has found that government too often confuses the role of a BA with that of a subject matter expert. Translating user needs into actionable user stories and acceptance criteria that a dev team can work on is a vital element in the role of a BA. At the beta assessment, the panel will be looking for stronger evidence as to the effectiveness of this position.
The service did not pass point 10 of the Digital by Default Service Standard due to a lack of research into assisted digital (AD) users and their needs. The team referred repeatedly to the fact that 7.5% of claims in one particular week were handwritten submissions, and took this as a starting point for AD research during the beta build phase. During the alpha build, however, no AD users had been identified, much less engaged with.
The team must identify users who require AD support (including support from third parties) in order to complete the service, and establish how many such users there are. The panel would then need to see that a support plan has been put in place based on that research, and that the support is being tested and iterated. The team must also develop a fuller understanding of the likely costs of providing support, for all providers (including third parties) and across all support routes.
User needs and research
There is a need to address cultural issues at some advocates’ offices, specifically the reticence of clerks to ask their bosses for missing information. At the moment, this means that incomplete claims are being submitted, resulting in payment delays to the advocates. It could be that the online service helps to overcome this issue and is something the assessment panel look forward to hearing about at the beta assessment.
Regarding the claim chasers, the panel recommend further work with this group to ensure that their needs are both understood and met.
The panel recommend that the team keep the scope minimal for their minimum viable product. In particular, identify which features it would be impossible to launch without and focus on those. Avoid implementing features where there’s not a clear user need. The team should also hold off on creating a global experience language until the product is more mature.
The panel also recommend that the team identify and remove unnecessary questions, working with a BA to fully understand why each question is being asked. For example, if you need to confirm a user is over 18, you can simply ask them to confirm this instead of requesting a date of birth.
The panel also suggest moving away from replicating the paper form, and instead testing whether the service works best as one long page (as it is currently) or split into individual questions (see www.gov.uk/register-to-vote). The team should consider their choice of form fields carefully. For example, change drop-downs with 8 items or fewer to radio buttons, and for very long drop-downs explore other ways for users to make their selection.
For the beta assessment, the assessment panel will be looking for evidence that the topic of "wet signatures" has been fully explored and solutions tested.
The panel note that the nature of the data, and the interaction with untrusted end users, requires some thought around security and fraud prevention. End users will more than likely (as borne out by research) be using operating systems that are near end-of-life and vulnerable to viruses and other malware. The panel believe it will be crucial to see the team de-risk these client interactions.
Additionally, the panel think that the security approach taken for the alpha is well thought out. However, a decision on the correct solution for this dataset will need to be taken soon, and, more broadly, the team should see if there are ways of reducing the dataset stored.
It was noted by the panel that the role of the BA was not shown on the slide about the team structure. Although it was accepted that this was an oversight and that the BA is indeed well placed to challenge the business requirements, this is an important function in a delivery team. The panel believe that it is important to ensure that the BA is not just a subject matter expert (all too common in government), but is also someone who is skilled in writing user stories and acceptance criteria that capture the product owner’s requirements and that developers can understand.
To conclude, the team showed the sort of passion and commitment that the panel like to see. The team clearly believe in what they are doing and want to do the best for the users of their service - both in advocates’ offices and in the Legal Aid Agency. The panel encourage the service to continue their good work, mindful of the recommendations made.
A successful alpha review is not a guarantee of success, but it is a clear indicator that a service is on the right track. The panel look forward to seeing the service team again at the beta assessment in due course.
Digital by Default Service Standard criteria