
https://dataingovernment.blog.gov.uk/national-archives-discovery-voluntary-service-assessment/

National Archives Discovery - Voluntary Service Assessment

Discovery is The National Archives' catalogue, providing a way to explore our collections and - where available - download digital copies of our records. Discovery has been designed to host, search and display the many different databases and datasets held at The National Archives.
http://discovery.nationalarchives.gov.uk/

The next version of Discovery is currently at Beta stage, with work underway to integrate a further 5 digital services. http://beta.discovery.nationalarchives.gov.uk/

Department / Agency:
MoJ / National Archives

Date of Assessment:
22/04/2014

Assessment Stage:
Live assessment

Result of Assessment:
Passed

Lead Assessor:
P. Ferris

Service Manager:
E. Bayne

Digital Leader:
M. Coats


The National Archives Discovery service is not considered within the remit of the Digital by Default Service Standard; however, the service team wanted to ensure they were aligned with it. The voluntary assessment was completed in the same way as any assessment held by GDS would be, with a panel of assessors representing the different disciplines at GDS.


Assessment report

The National Archives (TNA) Discovery beta was reviewed against the Digital by Default Service Standard and the service is proposed to go Live during Summer 2014. This was a voluntary assessment against the standard as TNA is outside the remit of GOV.UK.

The presentation was strong and demonstrated a clear understanding of the business aim. There was a clear understanding of the 26 points of the standard; other services should aspire to understand and apply the standard to their development in the same way. GDS was very impressed with the passion across the whole team to deliver the new Discovery service.

Does the service meet user needs?

TNA has a number of mechanisms in place for gathering user data (surveys, focus groups, analytics, paid user feedback etc.) and these were used to inform the programme of changes currently underway. We noted a slight bias towards your specialist user groups; the service dealt with over 350,000 visits per month (over 4 million annually). Feedback mechanisms are used to support your decision making around changes; this includes a plan to remove the 'browse' option from your homepage, as less than 5% of users accessed the service this way.

The panel noted that TNA tended to operate this as a bi-monthly user feedback opportunity, rather than approaching user groups each time a new product or process was proposed. The panel considered the largest challenge facing your service to be search and usability, and how your users would navigate to the content they needed on the site. From our use of the service in preparation for this review, we found common journeys can be complex, with significant use of filtering needed.

Access to an onsite user research team was excellent. The panel felt that you had a good understanding of your product and the needs it was intended to meet, and that you were planning iteratively to continue your development towards meeting them.

Recommendation: consider increasing your user feedback opportunities to align with product releases, to gain faster insight into success and operate to 'fail fast' principles.

Can the service be iteratively improved?

Criterion 6 of the Service Standard requires that services use agile methodologies for the quick, cost-effective delivery of user-centred digital projects. TNA uses a mixture of agile methodology and Kanban and has separated out its business as usual processes from this. There are regular retrospectives and the learning from these informs future sprints. TNA currently works to a high-level three-weekly release cycle rather than an ongoing release process, and is working towards a stable system where it could deploy every day, which is in line with agile methods of delivery. The panel suggested you could streamline your processes to use the same systems for tracking work, and noted that Kanban is a strong tool for managing business as usual tracking.

TNA has a separate User Insight team which, it recognises, moves to rapid prototypes faster than the digital team, but this was known and accounted for within working arrangements. The team set up by TNA to deliver the project was focused and in place, with the Service Manager's support for delivery being a strong element. The Service Manager demonstrated a significant amount of knowledge of the overall product, as well as deeper technical knowledge.

Recommendation: consider the opportunity to align processes so they are the same across 'business as usual' and project delivery.

Is the service safe?

TNA spoke briefly about security risks and the need to protect user data. In particular, TNA goes through an annual penetration test and there is a process in place to acquire SIRO sign-off for the use of non-standard data. TNA were very aware of wider reputational risk and also operated a high level of informal internal testing. TNA had an agile privacy impact assessment and worked closely with their departmental security officer. In terms of a denial of service attack, TNA were confident in their processes and have a mirror site in place.

TNA spoke about its 3-tier technical stack, which uses MongoDB (chosen because it is open), Solr, .NET WCF and a Mongo management capsule. It also advised that TNA hosts onsite as it has the facilities to do so, although it is looking at some cloud-related opportunities. In terms of open code, TNA has released some to GitHub as part of an open approach, but noted that its use is specialist and almost bespoke in terms of statutory and legislative response, and unlikely to be of significant benefit to many other organisations. TNA also uses an open API for data. TNA operates a test environment with full scripting for end-to-end testing and uses a browser-stack testing system to check accessibility across devices.
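
The open API mentioned above lends itself to simple programmatic reuse of catalogue data. The short Python sketch below shows roughly what a keyword search against such a JSON API could look like; the base URL, endpoint path and parameter names are assumptions made for illustration, not the documented Discovery API contract.

    import requests  # third-party HTTP client

    # Assumed base URL and endpoint; check the published API documentation
    # for the real paths and parameter names before relying on this sketch.
    BASE_URL = "https://discovery.nationalarchives.gov.uk/API"

    def search_records(query, page=1):
        """Send a keyword search and return the decoded JSON response."""
        response = requests.get(
            BASE_URL + "/search/records",                 # assumed endpoint
            params={"searchQuery": query, "page": page},  # assumed parameter names
            headers={"Accept": "application/json"},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        # Example: a keyword search for a well-known collection
        print(search_records("Domesday Book"))

Because record structures vary widely between collections, a real client would need to handle heterogeneous response shapes, consistent with the specialist, almost bespoke nature of the data noted above.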

TNA has a clear disaster recovery plan in place and has the ability to rebuild its entire product as part of this. The service was also aware of GovCert reporting requirements and has a clear rota of staff as part of disaster response and management. Data backups are completed on a regular basis and there is an offsite data system arrangement in place.

Design

TNA used a conversion funnel to test design, and this has resulted in improvements to customer dropout rates, demonstrating that customers are now successfully buying records. Design and filtering systems are in place to help users filter to their specific need (e.g. academics/journalists) and TNA demonstrated a good understanding of its niche audience needs. The products offered are a mix of online and offline and this will remain the case going forward, although significant growth of the online aspect is anticipated. Future design improvements proposed include order tracking. TNA recognises that its mobile device audience is growing at a great pace. TNA is exempt from GOV.UK and therefore the website was not built to the same design; TNA has published its own patterns on GitHub. Some of the wording demonstrated was known to be confusing and is currently under consideration.

Recommendation: consider speaking to the GDS Content Team on language issues and recommendations to help with challenges faced by TNA.

Analysis

TNA has a mature, well developed digital analytics implementation. They use Webtrends and have shown that they have a good data culture, using a mix of actionable analytics data along with user research to drive improvements. The analytics implementation has been approved by their SIRO and they do not collect any personal data.

Recommendation: TNA should demonstrate their innovative Webtrends dashboards at a future GDS Show & Tell as a learning opportunity for GDS and the government digital analytics community.

Assisted Digital

TNA understood the need for assisted digital and provided a variety of channels for users to access assisted digital support for its services. The service has a longer-term plan to increase digital channel use, particularly in relation to viewing its most popular collections.

Recommendation: consider speaking to the GDS Assisted Digital Team when the service is at a stage to look at more opportunities in this area.

Summary

The assessment panel very much enjoyed hearing about this important and fascinating service and its adherence to the service standard; this sets a challenge for other government services to reach the same level of quality and user focus.


Digital by Default Service Standard criteria

Criterion 1: Yes     Criterion 2: Yes
Criterion 3: Yes     Criterion 4: Yes
Criterion 5: Yes     Criterion 6: Yes
Criterion 7: N/A     Criterion 8: Yes
Criterion 9: Yes     Criterion 10: N/A
Criterion 11: Yes    Criterion 12: Yes
Criterion 13: N/A    Criterion 14: Yes
Criterion 15: Yes    Criterion 16: Yes
Criterion 17: Yes    Criterion 18: Yes
Criterion 19: Yes    Criterion 20: Yes
Criterion 21: N/A    Criterion 22: N/A
Criterion 23: N/A    Criterion 24: N/A
Criterion 25: Yes    Criterion 26: Yes
Details of criteria that are not applicable to this service
10 - broadly, the plan to provide Assisted Digital support was appropriate for the service
13 - this service does not sit on GOV.UK, however much of the design and content are aligned
21 to 24 - the service is not obliged to publish data to the performance platform; the service is considering the possibility of publishing relevant data