https://gdsdata.blog.gov.uk/civil-legal-advice-service-assessment-3/

Civil Legal Advice - Service Assessment

Civil Legal Advice provides state-funded legal help with problems such as repossession, abusive partners and housing disrepair. This legal help is expensive, and is only available to citizens who pass a means test and whose problems fall within key areas of law.

Department / Agency:
MoJ

Date of Assessment:
22/7/2015

Assessment stage:
Live

Result of Assessment:
Not passed

Lead Assessor:
S. Wood

Service Manager:
L. Citron

Digital Leader:
M. Coats


Assessment Report

The Civil Legal Advice service is seeking permission to be branded a live service on the service.gov.uk domain.

Outcome of Service Assessment

The assessment panel has concluded the Civil Legal Advice service should not be given approval to remove beta branding and launch on a service.gov.uk domain as a live service.

Reasons

Although the Civil Legal Advice Service has not passed the live assessment, the service team demonstrated many positives across the 18 points of the service standard and the panel was impressed by the quality of the product.

Particularly strong areas of performance include:

User Research
The team clearly understands the needs of their users. Research has covered an impressive 239 people, including call centre staff and those who need assistance to use the digital service. The team has made use of a variety of resources, including the Citizens Advice Bureau and disability groups, and pop-up research has also been used effectively. Where there has been difficulty finding the right people, the team has put out adverts and used agencies. And when it was discovered that some deaf people had difficulty using the service, British Sign Language support was introduced. All this demonstrates that the researcher in the team is in full command of their brief and that the team acts on research findings. It was good to see that all members of the team have attended user research sessions and that user research will continue throughout the life of the service.

Technical
The panel were satisfied that the service is robust and stable. The panel noted that developers swap in and out with other services through a central MoJ “hub”, thereby increasing the skill base and adding to knowledge sharing. The technical architect in the team explained that migration to Amazon services is about to take place, and this appears to be well managed. Two CLAS consultants have been involved, and there is close liaison with the Office of the Government Senior Information Risk Owner (OGSIRO) and accreditors on issues like backups. The team has a sound understanding of the datasets being used, and is aware of where any issues lie. The technical architect also described the measures in place (sanitisation, workflows, role access) to ensure that data is handled appropriately. Call centre code has been reviewed and design patterns reused. The service owns the call centre code, so can “lift and shift” the call centre if required.

The panel were slightly concerned about the live support model and the demands it may place on what is a relatively small team. The panel recommends ensuring that the service has robust documentation and alerts, and that the bulk of support issues can be triaged by the central support function.

Design
The team demonstrated that they had worked with the content teams of both GOV.UK and MoJ digital services, and are in regular contact with other GDS teams. The product owner and user researcher demonstrated how changes to the pages were evidenced based on user research. The service team showed video footage from testing sessions, and were able to explain what they had learnt and how this had resulted in changes to the service.

The designer and content designer are involved in each sprint, resulting in changes being made iteratively. The text is shown to lawyers, which does run the risk of content not being written for the user, and has the potential to slow things down. However, the team explained how they had developed good working relationships with the legal teams and how this mitigates the risk of the content losing user focus.

Support for users is primarily through a contact centre, although face to face support is available if needed. Close working with the Legal Aid Agency Business Owner, and a good partnership with the outsourced call centre provider, means that the team is able to gather a wealth of information about user needs and can quickly respond to feedback. The team have changed processes to make the on-screen service simpler for users and keep them online (for example, removing the need to scan documents) and to improve the end-to-end user journey for assisted digital users (by making changes to how delegation to another person is handled). They identified peaks in demand for support, and addressed these by amending bookable appointment times to ensure that waiting times are low. The team is measuring the four mandatory KPIs, among other metrics.

Digital take-up has increased since beta, and the team is reviewing messaging and drop-out points to try to keep users in the online service. Digital take-up will be a key focus for the team post-live.

Despite these positives, the service did not pass the live assessment. It failed on two points of the service standard, and the reasons for not passing are closely related. They are:

  • Point 3 (Put in place a sustainable multidisciplinary team…) and
  • Point 15 (Use tools for analysis that collect performance data. Use this data to analyse the success of the service and to translate this into features and tasks for the next phase of development.)

Point 3
There is no performance analyst dedicated to the team who is responsible for identifying actionable data insights from the service. The assessment panel for the beta assessment recommended that
‘the service team continue in their efforts to recruit a product/data analyst’, and this has not yet been addressed by the service team. The team is an agile, multidisciplinary one that works well together, and the panel had no other concerns in this area.

Point 15
The team has collectively adopted responsibility for managing analytics (for instance, by setting up funnels on Google Analytics where required). This seemed merely to add to the amount of management information that is generated, rather than provide the actionable data insights needed to help the team focus their efforts on improving the service. The lack of an embedded performance analyst meant the service team were not able to adequately demonstrate how they are analysing the success of the service through data, and using this to improve the service. For example, there was not enough evidence to show how user research linked to actual on-site activity. It was also unclear how data feeds into the product development process, and the team has yet to implement virtual pageviews as goals, which would be an effective way to identify leaks within user journeys.
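As a brief illustration of the virtual pageviews mentioned above: Google's analytics.js library lets a page report an arbitrary path with ga('send', 'pageview', path), so that interactions which do not trigger a real page load (such as completing a step in a form) still appear in funnel and goal reports. This is a minimal sketch only; the helper function and the path name are hypothetical, not taken from the service itself.

```javascript
// Hypothetical sketch: report a virtual pageview via analytics.js.
// The path '/virtual/eligibility-check' is an illustrative name, not
// a real page in the Civil Legal Advice service.
function sendVirtualPageview(path) {
  // Guard so the snippet degrades safely where analytics.js is not loaded.
  if (typeof ga === 'function') {
    ga('send', 'pageview', path);
    return true;
  }
  return false;
}

sendVirtualPageview('/virtual/eligibility-check');
```

A goal in Google Analytics can then be configured to match the virtual path, making drop-off between steps of a journey visible without any change to the underlying pages.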

The product manager explained that there are a number of data-related items on the backlog. However, it was not clear how or when these would be prioritised. Consequently, there is a gap where in-depth, iterative analytics should be supporting user research: comments from users are not being validated against accompanying analytical data.

Furthermore, ownership (and therefore prioritisation) of data and analytics is split between a number of individuals within the team. This reduces the opportunity for ad hoc reporting and analysis, which can often lead to the greatest insights.

Recommendations

In order to pass the reassessment:

  • the service team should recruit a performance analyst
  • the performance analyst should be able to demonstrate analysis that has led to actionable data insights the team has used to improve the service. The panel can put the team in touch with a performance analyst at GDS for additional information about the role and how it can help the service team.

Beyond the live assessment, the service team should consider:

  • continuing to work with the Citizens Advice Bureau and other relevant third parties to understand user needs for support, particularly around needs for face to face support and signposting users to develop their digital skills
  • involving content designers from other MoJ digital services to carry out the content and design “second eyes” review

Summary

To conclude, a lot of excellent work has been done by the team and the panel hopes they are not too downhearted by the findings. The panel is confident that the team is generally on track to build a service that meets user needs, and looks forward to seeing the service again for reassessment against the points that were not passed.


Digital Service Standard criteria

Criteria  Passed    Criteria  Passed
1         Yes       2         Yes
3         No        4         Yes
5         Yes       6         Yes
7         Yes       8         Yes
9         Yes       10        Yes
11        Yes       12        Yes
13        Yes       14        Yes
15        No        16        Yes
17        Yes       18        Yes