
https://dataingovernment.blog.gov.uk/contracts-finder-service-assessment/

Contracts Finder - Service Assessment

Contracts Finder will be the central repository for government contract information, covering future opportunities, current opportunities, awarded contracts and pre-procurement engagement with the market. The terminology for these states is currently being tested. The website will provide enough relevant business opportunity information for buyers and suppliers to be able to manage and act on it, and to interact directly with each other through a number of channels. Contracts Finder will also present contract information to other interested parties, and will provide features that facilitate the publication of government contracts and their associated expenditure.

Department / Agency:
CO / CCS

Date of Original Assessment:
5 September 2014

Date of Reassessment:
9 January 2015

Assessment stage:
Alpha Review

Result of Original Assessment:
Not passed

Result of Reassessment:
Pass

Lead Assessor:
D. Vaughan

Service Manager:
P. Sinclair

Digital Leader:
P. Maltby


Reassessment Report

The Contracts Finder service has been reviewed against the points of the Service Standard not passed at the original Alpha Review assessment.

Outcome of service assessment

After consideration we have concluded that the Contracts Finder service is on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons 

Following the reassessment of the service, the assessment panel felt that good progress had been made in addressing the concerns raised at our last review.

The prototype has continued to be developed in an agile way with a multidisciplinary team with an empowered service manager. The service team have begun to conduct lab-based user research sessions with both government buyers and suppliers to government. These lab-based sessions are intended to continue through the next phase of development.

The service team have chosen a digital analytics package and intend to appoint a performance analyst to help measure how the service is performing and to identify any improvements that are needed.

The service team intend to publish the majority of the code for the service under an appropriate open licence.

Recommendations

The following recommendations should be acted upon before the service returns for a beta assessment.

Look and feel - Service Standard point 13:

In order to meet point 13 of the service standard for the beta assessment, Contracts Finder must have a consistent user experience with the rest of GOV.UK. You should use the GOV.UK design patterns and style guide published in the Service Manual and engage with the government service design community who have tackled many similar design patterns.

Use of digital analytics tools and collection of performance data - Service Standard points 8 and 18:

The person appointed as performance analyst should work with the service manager to identify priority data points and ensure analytics is implemented to capture these. The service should engage with the GDS Performance Platform and with the community of performance analysts to learn and share best practice.

Continue to test the service with users - Service Standard point 20:

The service team should continue with its intention to test the service with users using lab-based sessions throughout future phases of development. This should include usability testing with people with disabilities. 

Engage with GDS on opening up the service’s code - Service Standard point 15:

The team should engage with GDS to help choose an appropriate open licence for the publishing of their service’s code.

Summary

The panel was pleased to see the progress since the last assessment particularly in relation to the lab-based research sessions.


Original Assessment Report

The Contracts Finder service has been reviewed against the 26 points of the Service Standard at the end of the Alpha development.

Outcome of service assessment

After consideration, the assessment panel have concluded that the Contracts Finder service is not yet on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons

The assessment panel felt that although good work had been done on the service so far, it had not yet been proved that it would meet user needs. While progress had been made on production of the prototype, it wasn’t possible for the service team or assessment panel to judge whether the prototype was a success or simple enough for users to use unaided - points 8 and 9 in the Service Standard.

Overall, the assessment panel were pleased with the way the service is being built. The service team is an agile, multidisciplinary team with an empowered Service Manager. They’ve conducted user research with a range of users, including government buyers, suppliers to government, and other people and organisations who might find the data useful.

The service is being built in a way which allows improvements and changes to be made on a very frequent basis, with an appropriate amount of thought given to the tools and processes they are using. They’ve already engaged with the GDS Performance Platform team and are planning to test the service with the minister later in development.

The team demonstrated that there are no eligible assisted digital users for this service so the service standard does not require assisted digital support to be provided.

Recommendations

The following recommendations should be acted upon and must be satisfied before the reassessment.

Test the prototype service with users - Service Standard points 1, 8, 9 and 20:

The prototype’s end-to-end user journeys should be tested with the service’s users including buyers, suppliers and others as appropriate. Knowledge gained during this should be used to validate the success of the prototype and help inform the ongoing development.
A plan for ongoing usability testing should be put in place to continually seek feedback from users. The outcomes of this should feed into the beta development.

Roles in the team - Service Standard point 2:

While the service is being developed with a multidisciplinary team, the presence of a number of roles is unclear. The service should consider including access to a Data Analyst within the team to assist with understanding how the service is being used and recommending improvements. It was also unclear whether the team contained a User Researcher with a focus on usability testing of the service.

Look and feel - Service Standard points 13 and 17:

The service should be designed to be consistent with the user experience of the rest of GOV.UK using the design patterns and style guide published in the Service Manual. This currently isn’t the case.
The service should also ensure it works regardless of the browser’s capability using progressive enhancement - some elements of the prototype currently only work with JavaScript enabled.
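As an illustrative sketch only (the names below are hypothetical and not from the Contracts Finder codebase), progressive enhancement means the core journey works without client-side script, with JavaScript layered on only where the browser supports it:

```typescript
// Hypothetical sketch of progressive enhancement for a contract search.

interface Contract {
  title: string;
  status: "future" | "current" | "awarded";
}

// Core search logic: plain data in, data out, so the same behaviour
// can back a server-rendered form submission (the no-JavaScript
// fallback) and a client-side enhancement.
function searchContracts(contracts: Contract[], term: string): Contract[] {
  const t = term.trim().toLowerCase();
  return contracts.filter((c) => c.title.toLowerCase().includes(t));
}

// Enhancement gate: attach client-side behaviour only when the
// environment supports what the enhancement needs; otherwise the
// form simply posts to the server as normal.
function canEnhance(env: { querySelector?: unknown; fetch?: unknown }): boolean {
  return typeof env.querySelector === "function" && typeof env.fetch === "function";
}
```

With this split, a browser without JavaScript still completes the search via an ordinary form post, while capable browsers get in-page filtering on top of the same logic.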

Opening up code - Service Standard point 15:

The service team should consider how they can make all the new source code open and reusable, including which licences are appropriate.

Use of digital analytics tools - Service Standard point 8:

The service team should install a digital analytics tool and identify a person who has clear responsibility for analysing the data with a view to identifying insights that can be used to inform service design going forward.

Next Steps

You should follow any recommendations made in this report and see the Government Service Design Manual for further guidance. In order for the service to proceed we require a reassessment against the not passed criteria.

Summary

Overall the assessment panel was pleased with the work undertaken on the service so far. With some additional development on the prototype and appropriate usability testing, we believe the service is well on its way to meeting the requirements of the standard.


Digital by Default Service Standard criteria

Criteria Passed Criteria Passed
1 Yes 2 Yes
3 Yes 4 Yes
5 Yes 6 Yes
7 Yes 8 Yes
9 Yes 10 Yes
11 Yes 12 Yes
13 No 14 Yes
15 Yes 16 Yes
17 Yes 18 Yes
19 Yes 20 Yes
21 Yes 22 Yes
23 Yes 24 Yes
25 Yes 26 Yes

Point 13: Although the service does not have a consistent user experience with the rest of GOV.UK, this should not impede the service from continuing through to the next phase of development as long as this issue is resolved prior to the beta assessment.