https://dataingovernment.blog.gov.uk/employment-tribunal-fees-service-assessment/

Employment Tribunal Fees - Service Assessment

Employment tribunals determine disputes between employers and employees over employment rights. An application to an employment tribunal is known as a claim, and it is submitted by a claimant.

Before submitting a claim, all claimants must contact Acas to attempt early conciliation with the respondent. If no resolution is found, the claimant is issued with a certificate which enables them to make a claim to an employment tribunal.

Tribunals review each claim and assess whether it falls under their jurisdiction and consequently whether it can be heard by a judge. Respondents are notified of the claim and have an opportunity to respond. Once all information is gathered and accepted, the claim (or claims if a group) becomes a case which is reviewed by a judge who makes the final decision on the outcome during a hearing.

https://www.gov.uk/transformation/pay-tribunal-fees.html

Department / Agency:

MOJ

Date of Assessment:

24/2/2015

Assessment stage:

Live

Result of Assessment:

Pass

Lead Assessor:

T. Scott

Service Manager:

T. Wynne-Morgan

Digital Leader:

M. Coats

Assessment Report

Outcome of service assessment

After completing the assessment, the panel can confirm the Employment Tribunal Fees service has shown sufficient evidence of meeting the Digital by Default Service Standard and should go Live as a Digital by Default service on GOV.UK. The service can now remove any Beta branding. 

Reasons

The service was assessed against and has met all 26 points of the Digital by Default Service Standard.

User needs

The service team displayed a deep understanding of their users and their needs to the assessment panel. The panel thought the team were an excellent example of how starting with user needs, embedding research into sprint cycles, and focussing the whole service team on meeting the most important user needs first, helps create a digital service that users can complete successfully, unaided. 

The panel saw plenty of the evidence the service team had gathered to inform the service design. The team had recruited real users for research and carried out research with all types of users using appropriate methodologies. Almost half of users (individual claimants) will encounter the service during a stressful and unpleasant time. The service team displayed empathy with this user group. The other main user group are professionals (solicitors, advisers and other representatives) claiming on behalf of someone else. The team have put equal effort into both sets of users, and showed evidence that some of the professional group will also be infrequent users of the service.

User researchers will continue to be embedded in the service team. Feedback from all sources is captured by the team and analysed, informing stories for the backlog.

The service team has made an exemplary effort to ensure users with assisted digital needs are catered for. The team sought out users with assisted digital or digital inclusion needs and has set up a sustainable service for them.

The team

The assessment panel thought the service team were an excellent example of a co-located, multi-disciplinary team working together, focussed on meeting user needs. There were no gaps in the team and the recommended roles are filled. The service manager is empowered and responsible for the entire end-to-end service (the digital service referred to here is solely about making a claim, and is managed by an empowered product manager).

The team is using agile principles, including a variety of tools and techniques that allow the team to collaborate and share responsibilities for meeting needs and supporting the service. The team participate in user research.

Security, privacy, tools and standards

The team have adopted a collaborative approach with the relevant experts to make sure that security doesn't get in the way of meeting user needs. The assessment panel saw compelling evidence of the effectiveness of this approach. Security colleagues are involved from the start, meaning that they can work with the service team and benefit from context and an understanding of users when discussing potentially contentious security issues. As a result, the team was able to justify downgrading the security classification. The team has secured all necessary data accreditation and has a cookie policy in place.

The service team have rebuilt the front-end of the service during beta. They have opened up the source code (they have reused code from other services and anticipate some features of this one will be useful too) and have adopted an approach that moves away from vendor lock-in. MOJ owns the data. Capacity planning is in place and has been tested with 5x the expected load. The service has processes in place for dealing with an unexpected or malicious incident - a failover page will be available that will direct users to the paper application form. Backups are in place, and pager duty will be activated. An impact assessment of service downtime shows that users will still be able to make and follow up claims using readily available alternative channels. This process has been tested with a real incident by the service team, and the results were positive.

The team use a staging environment for testing, including user research. They have a scaled testing approach, depending on the size and impact of each feature. Pen testing is used for larger features. More comprehensive testing and user research on different designs and functionality are carried out in a demo environment. The service team have access to a professional hacker who attacks the service regularly. Any issues found are resolved within 2 days.

The service is responsive, although data and research suggest that desktops are most likely to be used.

There is a disaster recovery plan in place - once the underlying issue is resolved, the service will be back within 10 to 12 minutes, or 2 to 3 hours if a different system is needed.

Improving the service

The team demonstrated the ability to be very flexible and to iterate rapidly. This ability will continue through to live operation. The team iterated features continuously during beta in response to user research. They are committed to further improving the user experience and have a prioritised backlog of features to research and iterate next. The team has set up automated testing and deploys as soon as something is ready. There is enough structure in place to ensure the right levels of product review are adhered to, without making the release process overly cumbersome. The service team plan to encourage content designers to make pull requests for small copy changes, rather than incorporating these into sprint work.

The team have ensured they are gathering all sources of data and evidence to be confident they are getting a full picture of how the service is performing and what users are saying. They are linked up with other parts of the end-to-end service, such as the Public Enquiry Line team, and any insights are incorporated back into the service design.

Design

The team have evidence to show that users are able to complete the service unaided. Their knowledge of their users and the context in which they're operating reveals that many will return to the service to complete it later. They are seeking to add tracking to ensure that the journey for re-accessing the service is successful. Currently users access their saved claim via their case reference number (sent by email) and their memorable word. The team investigated GOV.UK Verify and explained that a full identity check was neither appropriate nor needed. Users tend to collaborate on a claim with their representative. The team have an assumption, which they're testing, that the memorable word step is neither useful nor required for assurance purposes. They are collaborating with MOJ security experts on this and their SIRO is aware of it. The team are still in discussions with these experts about this feature.

The service team displayed a thorough and impressive understanding and engagement with the end-to-end service. This is a complex legal process involving several parts of government and third parties, such as Acas. The team have led the way in collaborating with all parties to improve the user experience and encourage a digital by default approach. At present, the service becomes mostly paper-based once a claim has been submitted. The team have made several improvements, eg paper applications are now entered into the digital service by tribunal staff. The team are working with BIS and HMCTS to redesign the paper application form and to ensure the language matches the digital service.

The team has collaborated closely with Acas (who provides the mandatory early conciliation service) and the CAB, who are providing a trial of face-to-face assisted digital support and who also have the remit to advise claimants.

The service uses the GOV.UK design patterns and style guide. The team showed how they have contributed to these toolkits when there is no standard pattern, and when user research has recommended a different approach. The designer and content designer on the service team work closely with the wider government community. The service includes inline guidance, which the team justified with evidence from user research. The assessment panel would encourage the team to work with the GOV.UK content team to ensure that any overlapping content on GOV.UK is consistent. The panel also spotted some style guide inconsistencies.

Assisted digital and digital take-up   

The service used research with assisted digital users to forecast the demand for assisted digital support, although in testing there has been lower demand than expected. They have tested their telephone support for 4 months and iterated the support to meet user needs, incorporating appropriate needs assessment processes and making improvements to the digital service. They have tested face-to-face support in a pilot with a third party in a few locations. There are challenges to delivering this (splitting out supporting the transaction from providing legal advice) but these have been resolved by the team. They are running a pilot on payment for this support to ensure that it is sustainable. Awareness that support is available could be clearer on the digital service's 'contact us' page but otherwise is good, including signposting through third parties where appropriate.

This service is one part of a wider process which is largely paper-based: most applications are made by intermediaries who find paper a more convenient way to engage with their clients, and users can pay by cheque in the post. This makes digital take-up for this service more challenging, but the team demonstrated that they are working with related services, agencies and intermediaries to move users online. They are also in the early stages of working with BIS to improve the paper form, in line with their findings for the digital service.

Analysis and benchmarking

The service is using Google Analytics and has several team members with the skills to interpret data. As well as a fully optimised Google Analytics dashboard, the team uses a Geckoboard to display real-time data. They have a dashboard on the performance platform currently tracking 3 KPIs, with a plan to add the 4th KPI very soon and to switch the dashboard over to the new digital service. The team have identified 2 further KPIs that they will use to track success: the percentage of users who use the 'save and return' feature and the percentage of professional users who use the service interface to record a claim, versus uploading an attachment.

The team are proficient in using performance metrics to track the service against the benchmarks they have set. They will be able to see how the redesigned service performs against the current beta to establish whether they have improved the experience for users. Data captured during the private beta suggests that performance against the KPIs will show improvement.

The team have carefully considered and consulted around identifying the cost per transaction figure - they have ensured this represents the entire end-to-end service. Similarly with the user satisfaction metric - the team explained how the positioning of the data capture survey can skew results away from measuring the actual digital part of the service.

Testing with the Minister

The service team showed a video of the minister making a claim via the service, as if they were a user. The minister was able to complete the claim unaided, and gave good feedback.

Recommendations

  1. Adopt the same approach as the excellent collaboration shown within government by working more closely with GOV.UK content teams. The service team has developed an in-depth understanding of the wider needs surrounding employment tribunals and could impart some of that expertise to the GOV.UK team, both in GDS and in other government departments and agencies.
  2. Specifically, work directly with the content team at GOV.UK to ensure that the content in the inline guidance reflects content design standards and GDS house style. Encourage collaboration and learning from each other.
  3. The team should continue to develop their plans for digital take-up as part of the wider employment tribunals process. Processes should be improved so that users who can use the digital service choose it over paper.
  4. The work on deploying the application into Amazon Web Services (AWS) and achieving authority to operate is great. The team are working towards accreditation. The panel believes it would be fantastic if the service team could share their experiences more widely within government if they are successful.

Summary

Making a claim to an employment tribunal is a difficult challenge to address digitally. The team has shown an exemplary service design approach and has demonstrated impressive results. They have turned around the service during beta and have produced a user-focussed service with evidence to show that users prefer it. Their "putting users first" approach, including users with assisted digital and digital inclusion needs, is demonstrable from their research methods and ways of team working, through to their attitude towards security considerations and offline elements of the wider service.

The panel was extremely impressed and had no doubt in passing the service against all 26 criteria. The service team was incredibly well prepared and was able to answer questions clearly and concisely. The level of knowledge of users, service design and technical stack was high in all members of the service team. All disciplines represented in the assessment showed commitment and passion for building and improving a user-focussed service, and their empathy and understanding towards these users was evident.

Digital by Default Service Standard criteria

Criteria Passed Criteria Passed
1 Yes 2 Yes
3 Yes 4 Yes
5 Yes 6 Yes
7 Yes 8 Yes
9 Yes 10 Yes
11 Yes 12 Yes
13 Yes 14 Yes
15 Yes 16 Yes
17 Yes 18 Yes
19 Yes 20 Yes
21 Yes 22 Yes
23 Yes 24 Yes
25 Yes 26 Yes