https://dataingovernment.blog.gov.uk/all-service-assessments-and-self-certification/hmrc/vat-mini-one-stop-shop-live/

VAT Mini One Stop Shop - Live Assessment

VAT Mini One Stop Shop (MOSS) allows users to register, report and pay VAT due on sales of digital services to consumers in the EU.
The VAT Mini One Stop Shop guidance provides further information.

Department / Agency:
HM Revenue & Customs (HMRC)

Date of Assessment:
3 October 2014

Assessment Stage:
Live

Result of Assessment:
Not Passed

Lead Assessor:
M. Harrington

Service Manager:
A. Collins

Digital Leader:
M. Dearnley


Assessment Report

The VAT Mini One Stop Shop service is seeking permission to be branded as a Live Digital by Default service within the HMRC portal.

Outcome of service assessment

After completing our assessment, we have concluded that the VAT Mini One Stop Shop service should not be given approval to be a live service on the HMRC portal.

Reasons

User Needs

The team have not tested the service with a broad enough sample of users to develop a good understanding of their needs, particularly those of SMEs. While the team have demonstrated elements of the service to groups, in-depth testing has only been done with 4 people. This is not enough to try out different design directions. Although the assessment panel understand that the full service has only recently become available, elements of the service could and should have been tested throughout development. The design is based on the assumption that because a user is familiar with VAT sign-up they should be fine with this service, but there is no strong evidence to back this up. Similarly, there was little evidence of a plan to continue user research once the service went live, making iterative improvements very difficult. There had been no testing of the assisted digital support.

The Team

It was good to hear that the team worked in an agile way during the build, with a product owner in charge of the backlog. The Senior Responsible Owner also has a close relationship with the team, which is positive. However, the lack of a dedicated user researcher or designer (for example) in the team full time has impacted the team's ability to meet all the points of the standard.

Security, Privacy, Tools and Standards

The Mini One Stop Shop application is built on existing infrastructure and is essentially an extension rather than new software. Security and privacy have been taken into account by the team. The application uses proprietary products for the database tier and the application servers. This is the common platform upon which all the portal applications are built. The assessment panel would prefer an open source solution for these common service tiers, of which there are several options, though it is appreciated that this common approach had programme benefits with regards to the operational aspects of the service.

Frequent deployments of the Mini One Stop Shop service are hindered by existing HMRC structures. With regards to disaster recovery, the team had put little work into this area and were utilising existing HMRC infrastructure to handle concerns. Links are established to existing support teams to handle queries and issues.

Improving the service

The service is exempt from this part of the assessment, as agreed at the GDS Ops Board on 21 July 2014. Nevertheless, it was positive to hear that the team could quickly respond to copy issues if they became aware of them, and that deployments are happening more regularly.

Design

The service is exempt from having the look and feel of GOV.UK, and there are no offline sections which require integration. It is very difficult to judge whether the service is simple and intuitive enough for users to succeed first time, as user research has been limited to only 4 people and a great deal of reliance is placed on the user having experience of the VAT transaction in the same portal.

Assisted Digital and Channel Shift

The service team have not done specific research to discover assisted digital users of this service and have made a broad assumption that their users have the skills to use the service independently. It would be good to validate this assumption by working with a user researcher.

Assisted digital users are supported by a central HMRC team who have been given scripts/guidance to help them offer support. This has not been tested at all. The lack of testing and research here meant that the team was unable to talk about many aspects of the assisted digital channel, such as awareness, wait times, completion rates and trust.

Digital take-up is also covered in this section of the assessment. This service is only available online, so anyone using this service will do so via the digital channel. There is a comms plan to prepare users for when the service goes live, but this has not involved GDS.

Analysis and Benchmarking

There are tools in place to collect data on the service. These include Webtrends for analytics and access log data to better understand journeys and user behaviour. There was no evidence at the assessment that this data was being used to learn about the service and improve it.

The service team have recently approached the Performance Platform team to discuss displaying the four KPIs outlined in the service manual, but are not currently in a position to do so. It was clear from the assessment that the team cannot access all of the service's data without involving a third party, which might impact their ability to report on the four KPIs.

Recommendations

User Needs

Put into action a plan for regular user testing with a user researcher. Many of the not passed points of this assessment are due to the lack of user research. A better understanding of user needs and validation of the team's assumptions would put them in a much stronger position for re-assessment as well as giving the team real insight into the service. A dedicated user researcher on the team would help with this.

The Team

As noted above, the lack of dedicated skills in certain areas in the team has made it difficult for them to reach the service standard. At the assessment it was made clear that there is available resource to create the right team for the VAT Mini One Stop Shop service going forwards. This is positive and the team should think carefully about the skills required to create the best possible service. Our recommendations would include user research, design, assisted digital and data/performance analysis.

Security, Privacy, Tools and Standards

The service team spoke at the assessment about expanding the Mini One Stop Shop application. It is the assessment team's recommendation that future development should be based on the HMRC Tax Platform, as this offers more flexibility for building and iterating. It would also enable the service to better meet the standard with regards to open source and technology choices. While the existing architecture makes this difficult, it would be good to see the live service demonstrated at assessments.

Assisted Digital and Channel Shift

Assisted digital support for this service should be clearly and wholly owned within the service team who should proactively seek out users of the service with the lowest digital skills and carry out research to understand their specific needs and numbers. With this knowledge, the service team should design a model of assisted digital support for those users and then test and iterate it to show the support is meeting user needs and volumes. The service team should put in place mechanisms to measure performance of the assisted digital support when live, and plan to iterate the support in line with that feedback.

With regards to digital take-up, the service team should contact the digital take-up team at GDS to review planning and ensure all is being done to reduce avoidable contact from potential users using non-digital channels to contact HMRC about the service.

Analysis and Benchmarking

The service team needs to make better use of available data and ensure there are systems in place to record insights from the service. This will help to instil a culture of data-driven decision making. Alongside user research, it will also help validate or better understand the assumptions the team is making about the service. The team must display the four KPIs, as set out in the service manual, on the Performance Platform.

Next Steps

You should follow the recommendations made in this report and see the Government Service Design Manual for further guidance. In order for the service to proceed we require a full reassessment.

Summary

Given the challenges faced, such as EU legislation and legacy systems, there are positives for the team to take from this assessment.

The lack of user research on this product really hampers its ability to meet the service standard. However, the assessment team believe with the right additions to the team and user research in place, the service could meet the standard, with the exemptions as previously agreed. It is good to know that recruitment is within the service team's control and that it could help them meet the standard.

The finding that the automatically calculated VAT total was not popular illustrates the invaluable insight that user research can provide. The more the service is properly tested, the more insight will be gained.

The fact that the team can act quickly to iterate copy is good news and will enable them to respond to feedback from users. Releases are much more regular than expected, meaning that iterative improvements are possible. Similarly, it was very good to hear that the team had worked in an agile way through the build, and it was good to have the backlog owner at the assessment to give an account of that experience.


Digital by Default Service Standard criteria

Criterion 1: No
Criterion 2: No
Criterion 3: Yes
Criterion 4: Yes
Criterion 5: Yes
Criterion 6: N/A
Criterion 7: No
Criterion 8: No
Criterion 9: No
Criterion 10: No
Criterion 11: No
Criterion 12: Yes
Criterion 13: No
Criterion 14: No
Criterion 15: No
Criterion 16: Yes
Criterion 17: No
Criterion 18: Yes
Criterion 19: No
Criterion 20: No
Criterion 21: No
Criterion 22: No
Criterion 23: No
Criterion 24: No
Criterion 25: Yes
Criterion 26: Yes