https://dataingovernment.blog.gov.uk/mot-testing-service-assessment/

MOT Testing - Service Assessment

The MOT Testing service will enable users to efficiently and effectively record and report the results of an MOT test in accordance with the MOT scheme rules. For example, it will enable:

  • qualified and pre-authorised vehicle testers, operating at pre-authorised private garages, to electronically record and amend an MOT test result and print a certificate
  • private garages to pay a transaction (slot) fee to DVSA for the submission of the MOT test
  • DVSA staff to record the outcome of a vehicle re-inspection
  • the results of the MOT test to be shared with DVLA, in turn supporting DVLA's online electronic vehicle licensing service


Department / Agency:
DfT / DVSA

Date of Assessment:
1/12/2014

Assessment stage:
Alpha

Result of Assessment:
Pass

Lead Assessor:
M. Sheldon

Service Manager:
N. Barlow

Digital Leader:
B. Etheridge


Assessment Report

The MOT Testing service has been reviewed against the 26 points of the Service Standard at the end of the Alpha development.

Outcome of service assessment

After consideration, the assessment panel have concluded that the MOT Testing service is on track to meet the Digital by Default Service Standard at this stage of development.

Reasons

The service currently meets the requirements of the standard for an Alpha. Areas of good performance against the standard included:

User needs

The assessment panel were pleased to see that the DVSA team have a good understanding of the service's users and their needs. This understanding has been gathered through a range of user research methods, with feedback already being used to iterate and improve the service.

The team

The DVSA service team is led by a skilled and empowered service manager, who owns and is responsible for all online and offline elements of the MOT scheme. All product managers in the scrum teams have experience in the motor industry and have autonomy to lead their multidisciplinary teams and work in an agile way.

Security, privacy, tools and standards

Early and regular engagement with experts to evaluate the security and privacy risks of the service has allowed the team to put in place appropriate security with an emphasis on user needs. The assessment panel were pleased to see that the service team have followed a similar approach to identity assurance when developing the authentication interfaces and patterns, and are documenting that approach to share with other services not requiring GOV.UK Verify.

Improving the service

Improvements to and iteration of the service's features are already being made, based on feedback from real users and on testing of several prototypes prior to feature development.

Recommendations

User needs

The service team are aware that they lack sufficient user researcher capability and are currently looking to fill that gap in the teams. For beta assessment we expect to see a more regular and methodical approach to formal user research, one which is sustainable and incorporated into the continuous improvement process. The current approach, although conducted with real users, appears to rely on general feedback rather than structured research. Whilst we agree that online surveys are a good way to reach a wider set of users quickly, and that remote user research software allows testing of specific features, these techniques must only be used to support a robust user research process that the entire team can observe.

The team

The service team are currently supported by a high proportion of interim staff, procured through contracts with external companies. At beta assessment we would expect to see a plan in place to redress this balance and build a sustainable multidisciplinary team that can continue to own, operate and improve the service when live. The service team must consider the need for a dedicated analyst on the team who can identify actionable insights from data and analytics. There must also be a named person on the service team who takes ownership of assisted digital.

Security, privacy, tools and standards

It was agreed that the service team will provide further detail on what tools and systems have been procured to support the service, identifying how those procurements were made and with which companies or services.

Open source

For beta assessment we expect to see steps taken for this project to be coded in the open, with all code published on github.com in open repositories. Where this is not possible there should be a convincing explanation as to why.

Improving the service

To be able to update and improve the service frequently, the service team must continue to shorten the build pipeline, with the goals of giving development teams quicker feedback on builds and reducing the 5-day lead time for getting production-ready code into the live environment.
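
A minimal sketch of the fail-fast principle behind this recommendation, assuming hypothetical stage commands (the make targets below are placeholders, not the team's actual tooling): order pipeline stages cheapest-first so developers learn about a broken build within minutes, and stop at the first failure so slower stages never delay that feedback.

    import subprocess
    import time

    # Hypothetical pipeline stages, ordered fastest-first so the most
    # common failures are reported to developers as early as possible.
    STAGES = [
        ("unit tests",        ["make", "test"]),            # placeholder commands
        ("static analysis",   ["make", "lint"]),
        ("integration tests", ["make", "integration"]),
        ("deploy to staging", ["make", "deploy-staging"]),
    ]

    def run_pipeline() -> bool:
        for name, cmd in STAGES:
            start = time.monotonic()
            result = subprocess.run(cmd)
            print(f"{name}: {'ok' if result.returncode == 0 else 'FAILED'} "
                  f"({time.monotonic() - start:.0f}s)")
            if result.returncode != 0:
                return False  # fail fast: later, slower stages never run
        return True

    if __name__ == "__main__":
        raise SystemExit(0 if run_pipeline() else 1)

The same ordering applies whatever the pipeline tooling: putting the quickest checks first shortens the feedback loop for the failures that happen most often.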

Design

The GOV.UK style guide and front-end toolkit are currently being applied across the service. There are, however, a few areas where the patterns vary somewhat from GDS recommendations (e.g. the GOV.UK header and the select-a-failure buttons). If there is deemed to be an appropriate user need to build new patterns, these must be solidly backed up by evidence from usability testing rather than by the preference of a small number of users.

Use of the Crown icon and the GDS Transport font will be determined by whether or not the service sits on a GOV.UK service domain, and this should be decided before going live.

Analysis and benchmarking

The assessment panel recognise that the new service will be different to the existing one, but the service team should be benchmarking user satisfaction for comparison. Some of the direct questionnaire feedback highlighted that users are not entirely unhappy with the current service. Measuring and comparing user satisfaction between the two services will allow the team to focus on any dissatisfied users or difficult user journeys, and take steps to improve the service and its features.

Assisted Digital

Initial steps have been taken to understand the level of assisted digital required to support users who need it. The service team must use non-digital channels to get a full picture of the needs and numbers of assisted digital users, understanding that online surveys only reach users with a level of digital skills and confidence. The service team must carry out user research with users who have low or no digital skills, and use this research to influence design of both the on-screen digital service and any assisted digital support. The service team should plot their service’s users on the digital inclusion scale, to help demonstrate what level of assisted digital support will be required.

The service team must evidence that user needs will be met by the planned DVLA call centre offering or face-to-face support from users' colleagues. They must also evidence that support from colleagues is sustainable and meets user needs.


Digital by Default Service Standard criteria

Criterion Passed Criterion Passed
1 Yes 2 Yes
3 Yes 4 Yes
5 Yes 6 Yes
7 Yes 8 Yes
9 Yes 10 Yes
11 Yes 12 Yes
13 Yes 14 Yes
15 Yes 16 Yes
17 Yes 18 Yes
19 Yes 20 Yes
21 Yes 22 Yes
23 Yes 24 Yes
25 Yes 26 Yes