
https://dataingovernment.blog.gov.uk/schools-performance-tables-service-assessment/

Schools Performance Tables - Service Assessment

Department / Agency:
DfE

Date of Assessment:
28/5/2015

Assessment stage:
Alpha Review

Result of Assessment:
Pass

Lead Assessor:
M. Harrington

Service Manager:
I. Thomson

Digital Leader:
L. Diep


Assessment Report

The Schools Performance Tables service has been reviewed against the 26 points of the Service Standard at the end of the alpha phase of development.

Outcome of service assessment

After consideration, the assessment panel have concluded that the Schools Performance Tables service is on track to meet the Digital by Default Service Standard at this early stage of development. There are, however, a number of recommendations in this report which the team must take into account for the beta phase.

Reasons

User needs:

The panel were impressed with the approach and the results gained by the UX Lead and the team in a short time. The service manager was able to explain clearly who the service is for and to enumerate the target personas. A sensible plan, and an appropriate amount of funding, was set out for continual iterative research through beta. It is helpful to keep in mind that the needs of user research change as development matures. Initially, more ethnographic and wide-ranging research, ideally in situ, should be carried out. This should then make way for more focused work on the interface and interaction as the key assumptions about user needs are filled in. The panel were pleased to hear that DfE plans to dedicate a space for a user research lab, but this will need sustained attention if it is to be completed in time to be useful for this service.

The team:

The team is not co-located; however, they appear to be dealing with this well, spending at least 2 days per sprint face to face. User research has also been conducted in different locations to ensure that all of the team are able to take part. The team is working using agile methodologies and there is good separation of roles in most cases. The panel were concerned that too much is expected of the UX Lead, who is responsible for both user research and the design of the service (see recommendations).

Security, Privacy, Tools and Standards:

The team appear very familiar with the actions and conversations that need to happen when operating a digital service. They are taking conscious steps to avoid lock-in, and are carefully evaluating their technology options. There are still a lot of unknowns, for example how the service will be operated, but the team are aware of what they don't know and have plans for how to address this.

The team were very knowledgeable about which standards they can support, and are building the service with a view to it being an API that can be consumed, rather than something consumers would need to scrape to make use of the data.
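
To illustrate the difference this makes for consumers, the sketch below shows what calling such an API might look like; the endpoint, path and field names are purely illustrative assumptions, not the service's actual interface.

    // Illustrative sketch only: the endpoint and response shape below are
    // assumptions for the purpose of the example, not the real API.
    interface SchoolResult {
      urn: string;             // unique reference number for the school
      name: string;
      localAuthority: string;
      keyStage2Average?: number;
    }

    async function findSchools(postcode: string): Promise<SchoolResult[]> {
      // A consumer calls a documented, versioned endpoint and receives structured
      // JSON, rather than scraping HTML pages to get at the underlying data.
      const response = await fetch(
        `https://schools.example/api/v1/schools?postcode=${encodeURIComponent(postcode)}`
      );
      if (!response.ok) {
        throw new Error(`Request failed with status ${response.status}`);
      }
      return (await response.json()) as SchoolResult[];
    }

    // Example use: list matching schools for a postcode.
    findSchools("XX1 1XX").then(schools =>
      schools.forEach(s => console.log(`${s.urn}: ${s.name}`))
    );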

Improving the service:

At alpha the team have been able to rapidly change and deploy the product, and multiple variants have been tested with users.

Design:

The service team are aware of design problems with the service and are in the process of iterating the service based on solid user research. The panel are confident the UX Lead is well placed to push the design of the project forward and is engaging with the design community around government. Although the panel are confident the service team has the ability to deliver a good service which meets the design criteria, the current alpha service does not meet Service Standard Point 13: 'Build a service consistent with the user experience of the rest of GOV.UK by using the design patterns and style guide.'

Assisted digital and digital take up:

The existing service is 100% digital and the new service will also be fully digital. Since the service is not transactional, assisted digital is not a component that the panel assessed.

Analysis and benchmarking:

The team tested different designs with users in the alpha phase and fed this research into the build of the product. The existing service does have analytics, which provide some insight into its use, but only the out-of-the-box metrics are tracked.

Recommendations

User needs:

There is concern that there is too great an overlap between the design and research roles. The design challenges in this service will be significant, with a great deal of information to convey clearly. The primary user research should inform which features are critical to a 'minimally viable' first release of the service. The panel recommend testing with fewer on-screen elements: essentially, starting with a much more pared-down interface and user flow (carrying forward the learnings from the alpha).

Every element on screen has to be justified with primary research. The current prototype gives the impression that every idea has been included, and that the available data is driving which elements are included, rather than a user-centred approach. The panel do not recommend testing with non-working elements. Efficient testing happens when a thin, horizontal slice through the interaction is prioritised for one user group (e.g. parents searching for local schools), and then other key user persona groups are included. Alternative interactions, layouts, and interface paradigms should be explored to discover which works best for the needs, experience, expectations, and mental models of each user persona group. For example, the informational and emotional needs of parents using this service will likely be meaningfully different from those of school governors.

The team:

The team needs a full-time designer and a full-time user researcher; currently, design and user research are a single role. The service has different user groups with different user needs, and for these to be properly met there needs to be more design input.

Security, Privacy, Tools and Standards:

The panel would urge the team to look at how they intend to make the source code for the service available; doing this earlier rather than later makes it considerably easier. In particular, be aware of separating out configuration from implementation, as described in https://gds.blog.gov.uk/2014/10/08/when-is-it-ok-not-to-open-all-source-code/. The panel would also like to encourage communication between this team and other parts of government that are developing services using Microsoft technologies, for example on how to consume and extend the frontend toolkit. Publishing the code and highlighting it via blogging can help those conversations happen.
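
As a small illustration of the separation described in that post, the sketch below (TypeScript in a Node-style environment, with entirely hypothetical setting names) keeps deployment-specific values in environment variables so that the implementation itself could be published:

    // Hypothetical setting names: the point is that secrets and environment-specific
    // values live outside the codebase, so the implementation can be open-sourced
    // without exposing them.
    interface ServiceConfig {
      databaseUrl: string;   // connection string supplied by the hosting environment
      analyticsId?: string;  // analytics tracking ID, if one is configured
      listenPort: number;
    }

    function loadConfig(env: NodeJS.ProcessEnv = process.env): ServiceConfig {
      const databaseUrl = env.DATABASE_URL;
      if (!databaseUrl) {
        throw new Error("DATABASE_URL must be set by the deployment environment");
      }
      return {
        databaseUrl,
        analyticsId: env.ANALYTICS_ID,
        listenPort: Number(env.PORT ?? 3000),
      };
    }

    const config = loadConfig();
    console.log(`Listening on port ${config.listenPort}`);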

Improving the service:

As the team move into beta they should ensure they can iterate and improve the service at the same speed as they have in alpha. At the alpha assessment there were some incomplete features which may or may not help meet user needs. In beta the team should ensure they focus on the key parts of the service to deliver the most benefit to users.

Design:

A separate email will be sent by the design assessor outlining areas that need improvement in relation to Service Standard Point 13. Also see previous comments relating to the composition of the design team.

Analysis and benchmarking:

The team should install analytics on the beta service to capture data about how the service is being used and to give insight for improvements and changes. The service should measure on-screen events and goals to better understand how users are making these journeys (e.g. a user comparing two schools might follow a path of: home → search results → individual view → compare).
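
A minimal sketch of the kind of event tracking meant here, assuming the classic analytics.js ga() call is available on the page; the category, action and label names are illustrative, and the equivalent call in whichever analytics product the team chooses would serve just as well:

    // Assumes the analytics.js global is loaded on the page; the event names
    // below are illustrative only.
    declare function ga(command: string, hitType: string,
                        category: string, action: string, label?: string): void;

    // Each step of the comparison journey fires a named event, so the funnel
    // home -> search results -> individual view -> compare can be reported on.
    function trackStep(step: "search" | "view-school" | "compare", label: string): void {
      ga("send", "event", "school-comparison", step, label);
    }

    // Example: a user searches, opens a school page, then adds it to a comparison.
    trackStep("search", "postcode-search");
    trackStep("view-school", "school-page");
    trackStep("compare", "added-to-comparison");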

This is a non-transactional service; however, the service can still display cost per session, user satisfaction, and suitable completion rates (for frequent tasks identified from the analytics) on its Performance Platform dashboard, in addition to any other metrics identified by the service team.
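
Completion rates for those frequent tasks can then be derived directly from the funnel counts above; a sketch with made-up numbers purely to show the arithmetic:

    // Completion rate for a task = journeys reaching the final step divided by
    // journeys that started it (the figures here are invented for illustration).
    function completionRate(started: number, completed: number): number {
      return started === 0 ? 0 : completed / started;
    }

    // e.g. if analytics recorded 12,000 searches and 3,000 completed comparisons:
    const rate = completionRate(12000, 3000);
    console.log(`Comparison completion rate: ${(rate * 100).toFixed(1)}%`); // 25.0%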

As part of the beta development, and to remove duplication of effort across government, the team should investigate closer integration with the Performance Platform: for example, using its code for the visualisations on the Schools Performance Tables, or using the Performance Platform to serve the graphs and charts as components within the service.


Digital by Default Service Standard criteria

Criterion Passed Criterion Passed
1 Yes 2 Yes
3 Yes 4 Yes
5 Yes 6 Yes
7 Yes 8 Yes
9 Yes 10 Yes
11 Yes 12 Yes
13 No 14 Yes
15 Yes 16 Yes
17 Yes 18 Yes
19 Yes 20 Yes
21 Yes 22 Yes
23 Yes 24 Yes
25 Yes 26 Yes