We’re coming up to a year of live running of the digital service standard. Throughout the year the ‘pass rate’ for service standard assessments has been at about 70%.
I was asked recently whether a 70% pass rate is good. What’s the benchmark?
How many services should pass assessments?
The truth is that there isn’t a benchmark for how successful the development of new government digital services should be. In running assessments and publishing data about the pass rate and the criteria most often not passed we’re setting the benchmark. As far as we know there isn’t anyone else around the world assessing the development of digital services on this scale (and if there is then we’d love to hear from them).
To me, having about 70% of services pass their assessment seems about right. Not passing an assessment isn’t the end of a service – it’s a chance for a team to take on board the recommendations and use them to improve the service, making it better for users. In fact, maybe the most satisfying thing about working on the service standard is seeing that in action.
Delivering services people prefer to use
Because of their high numbers of users, a lot of the digital services being assessed at GDS come from HMRC. In the last year 19 of our assessments have been for HMRC services, and overall they have passed 74% of these.
Recently an HMRC team brought in the Inheritance Tax Online service for an alpha assessment. This assessment happens at the end of the alpha stage, and is an opportunity for an early review, before a team starts their beta development.
While there were lots of good things about the service, the GDS assessment panel were worried that the digital service was too closely modelled on the existing paper form, and that the opportunity to test more radical, user-focussed designs was being missed. On that basis the service didn’t pass its first assessment.
But the team considered and addressed the panel’s recommendations, simplifying the process and removing unnecessary fields. This led to a much improved service, and a pass when it was re-assessed.
Sharing what we’ve learned
Of course we’d love to get to a stage where everything meets the service standard first time, but it is rightly a high bar. We’ve run almost 100 assessments of new and redesigned digital services, ranging from Contracts Finder to Carer’s Allowance. We’ve also got a new dashboard on the performance platform, which shows how many services pass, and which criteria teams most often find challenging.
We’ve learned a lot, and we’ve published all of the reports from assessments so that people developing other services can learn too.