
https://dataingovernment.blog.gov.uk/office-for-national-statistics-ons-website-voluntary-service-assessment-2/

Office for National Statistics (ONS) Website - Voluntary Service Assessment

The Office for National Statistics (ONS) is the UK’s largest independent producer of official statistics and is the recognised national statistical institute for the UK. It is responsible for collecting and publishing statistics related to the economy, population and society at national, regional and local levels. It also conducts the census in England and Wales every ten years. The website is the primary channel for dissemination of these statistics.

Department / Agency:
ONS

Date of Assessment:
16/12/2015

Assessment Stage:
Live

Result of Assessment:
Pass

Lead Assessor:
H. Garrett

Service Manager:
M. Jukes

Digital Leader:
T. Makewell


Assessment Report

Outcome of service assessment

After consideration, the assessment panel can confirm that the ONS website has shown sufficient evidence of meeting the Digital Service Standard at this stage of development. The service should now remove any beta branding.

Reasons

The website was assessed against all 18 points of the Digital Service Standard.

The assessment panel were really impressed with the service team’s detailed and knowledgeable answers during the assessment. Their enthusiasm and commitment to creating a website based on user needs that could be iterated at pace was clear.

Research and design

The ONS website is exempt from using the frontend toolkit and the GOV.UK look and feel (the Transport font, the crown icon, etc.); however, the team demonstrated that the website is built in the spirit of the design principles and the principles of the design patterns.

The team gave several examples of how they iterated the design of the website during the beta phase of development. These iterations were informed by regular user research using appropriate methods, and the research was based on what the team needed to learn. Examples of research techniques used included one-to-one, face-to-face research, lab sessions, online task-based sessions, A/B tests, and click testing. The team completed 27 rounds of research, talking to users of the website and users of the publishing tools.

The team also benchmarked the beta website against the existing site by testing tasks that users had struggled to complete on the existing site. The team have plans for ongoing research once the website is live, including diary studies and further A/B tests.

The team completed an accessibility audit which identified three main areas of work for improving accessibility. The team have plans to complete this work before launching the website, including improving the search and listing pages, reorganising the layout of the pages to make them more readable by screen readers, making the CSV downloads more prominent, and ensuring that the CSVs can be accessed without the use of JavaScript. In the longer term the team also plan to do more work to improve the accessibility of the interactive charts.
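To illustrate that last point, the sketch below shows one way a dataset could be served as a plain CSV download at a stable URL, so that an ordinary link works and no client-side JavaScript is needed. This is an illustrative example only, written in Python with Flask; the ONS publishing stack, routes and dataset are not described in the assessment and are assumed here.

```python
# Minimal sketch (assumed stack: Python + Flask) of serving statistics as a
# plain CSV download. Because the CSV is generated server-side at an ordinary
# URL, a standard link is enough to reach it and no JavaScript is required.
import csv
import io

from flask import Flask, Response

app = Flask(__name__)

# Hypothetical in-memory dataset standing in for a published time series.
DATASET = [
    {"period": "2015 Q1", "value": 101.2},
    {"period": "2015 Q2", "value": 102.7},
]


@app.route("/datasets/example.csv")
def download_csv():
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["period", "value"])
    writer.writeheader()
    writer.writerows(DATASET)
    return Response(
        buffer.getvalue(),
        mimetype="text/csv",
        headers={"Content-Disposition": "attachment; filename=example.csv"},
    )


if __name__ == "__main__":
    app.run()
```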

The team

There is an empowered multi-disciplinary team in place and there are clear and sensible plans for transferring knowledge from members of the team who’ve been building the beta to the permanent team members recruited during the beta phase.

The team are working in an agile way using themed sprints, and quickly bringing findings from research into the next phase of work.

Recruitment had been a challenge during the beta phase. The panel were really impressed with how the service manager had approached the problem, including putting job descriptions on a hackpad to be peer reviewed, and tailoring them to explain what it meant to work for the ONS, rather than using generic job descriptions.

Technology decisions

The technology decisions the team have made were clearly articulated during the assessment and based on user needs. There are some huge improvements over the legacy site, including a simple deployment process which means that the website can be iterated quickly; code changes to the current site are made only once or twice a year. They have a well-thought-out set of environments that allow continuous feedback through functional, performance and load testing. All the source code has been made available.
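As an illustration of the kind of continuous feedback those environments can provide, the sketch below shows a simple post-deployment functional check. It is not the team's actual test suite; the base URL and the pages checked are assumptions made for the example.

```python
# Minimal sketch of an automated smoke/functional check that a test
# environment could run after each deployment. URLs and checks are
# illustrative assumptions, not the ONS team's real tests.
import requests

BASE_URL = "https://www.ons.gov.uk"  # assumed target environment


def check_homepage_is_up():
    response = requests.get(BASE_URL, timeout=10)
    assert response.status_code == 200


def check_search_returns_results():
    # Hypothetical search page and query used purely for illustration.
    response = requests.get(f"{BASE_URL}/search", params={"q": "population"}, timeout=10)
    assert response.status_code == 200
    assert "population" in response.text.lower()


if __name__ == "__main__":
    check_homepage_is_up()
    check_search_returns_results()
    print("Smoke checks passed")
```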

The service manager explained that they had implemented monitoring for downtime and for the health of all their applications. Full application support is provided from 8am to 6pm, with some further support provided by their supplier; however, the ONS have accepted the risk, signed off by their SIRO (Senior Information Risk Owner), that some issues could remain unresolved outside of those full support hours.
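The assessment does not describe the monitoring tooling itself, but a simple HTTP poller gives a sense of what downtime and health monitoring can look like in practice. The endpoint names below are hypothetical.

```python
# Minimal sketch of downtime/health monitoring as a periodic HTTP poller.
# The real ONS monitoring stack is not described in the assessment; the
# endpoints listed here are hypothetical placeholders.
import time

import requests

ENDPOINTS = {
    "website": "https://www.ons.gov.uk",
    "publishing-api": "https://publishing.example.invalid/health",  # hypothetical
}


def poll_once():
    """Check each endpoint once and report whether it responded healthily."""
    for name, url in ENDPOINTS.items():
        try:
            response = requests.get(url, timeout=5)
            healthy = response.status_code == 200
        except requests.RequestException:
            healthy = False
        print(f"{name}: {'up' if healthy else 'DOWN'}")


if __name__ == "__main__":
    while True:
        poll_once()
        time.sleep(60)  # poll every minute
```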

Summary

The GDS assessment panel would like to thank the ONS service team for a really positive assessment, and for their enthusiasm and well-articulated answers to our questions. We look forward to seeing the ONS website launch and continue to develop and iterate based on user needs.


Digital Service Standard criteria

Criteria  Passed    Criteria  Passed
1         Yes       2         Yes
3         Yes       4         Yes
5         Yes       6         Yes
7         Yes       8         Yes
9         Yes       10        Yes
11        Yes       12        Yes
13        Yes       14        Yes
15        Yes       16        Yes
17        Yes       18        Yes