https://dataingovernment.blog.gov.uk/digital-marketplace-service-assessment-2/

Digital Marketplace - Service Assessment

The Digital Marketplace is a browse and search tool that lets civil servants find products and services available on the G-Cloud framework.

It allows buyers to run both keyword searches and filtered searches based on service attributes. It also allows buyers to create accounts, which enables them to save searches.

Suppliers who have products on the G-Cloud framework have additional functionality for managing their supplier and service details. Functionality in this area is limited for this release to enabling new suppliers to create an instance and existing suppliers to manage the users associated with their company. Future functionality will allow further management of supplier and product data.

Department / Agency:
CO / GDS

Date of Original Assessment:
29/9/2014

Date of Reassessment:
5/11/2014

Assessment stage:
Beta

Result of Original Assessment:
Not passed

Result of Reassessment:
Pass

Lead Assessor:
L. Scott

Service Manager:
I. Majic

Digital Leader:
P. Maltby


Reassessment Report

The assessment panel have agreed that the Digital Marketplace is now in great shape to move into Beta development.

Below is the assessment panel’s response to the steps the Digital Marketplace team have taken to address the 4 criteria not passed at the original beta assessment. The panel were really encouraged to see how quickly the service team had acted on all of the panel’s recommendations.

2. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

  • The backlog of web operations tasks is now almost complete, and a user support and web operations model has been discussed and agreed, set up and tested.
  • There is an agreement with the GOV.UK 2nd Line Tech Support team to provide 3rd line support, and this support model has been tested operationally.
  • The service team reiterated that there are 1,200 users/day, with no out-of-hours use expected, so pager duty is unlikely to be needed.
  • At the time of assessment, the Ops Manual was ¾ complete, and is expected to be finished imminently.

5. Evaluate what tools and systems will be used to build, host, operate and measure a service, and how to procure them

  • The panel discussed monitoring, log collection and aggregation with the service team. Graphite is in place, and other measures, e.g. key application metrics, should be finished imminently (planned for the current sprint). Standard system metrics will be used (a sketch of pushing a metric to Graphite follows this list).
  • The service team explained why they will not be doing log aggregation. The panel agree that it is not worth making these infrastructure changes, since the plan is to move to more of a PaaS (platform as a service) in 2 to 3 months’ time.
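
The report does not say how the team’s application metrics reach Graphite; as a minimal sketch only, the snippet below pushes one metric over Carbon’s standard plaintext protocol. The hostname and metric path are illustrative assumptions, not the team’s actual configuration.

    import socket
    import time

    # Hypothetical Carbon endpoint; the real host is not named in the report.
    GRAPHITE_HOST = "graphite.example.internal"
    GRAPHITE_PORT = 2003  # default port for Carbon's plaintext listener

    def send_metric(path, value, timestamp=None):
        # One metric per line: "<path> <value> <timestamp>\n"
        line = f"{path} {value} {int(timestamp or time.time())}\n"
        with socket.create_connection((GRAPHITE_HOST, GRAPHITE_PORT), timeout=2) as sock:
            sock.sendall(line.encode("ascii"))

    # Illustrative application metric, e.g. the duration of a search request in ms.
    send_metric("marketplace.search.response_time_ms", 142)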

18. Use analytics tools that collect performance data

  • The panel were struck by how the service team have really turned around the service with respect to this area of the Standard.
  • The team confirmed they have upgraded their analytics provider.
  • They have set customised KPIs which better reflect the service, along with more granular service improvement measures (an illustrative sketch of one such measure follows this list).
  • There is a product analyst full-time on the team, and the service manager was rich in her praise of the individual in this role. It was clear how an embedded data analyst is benefitting the service and its users by inspiring the team, sharing knowledge and making great use of data insights to inform service development and design. The team has already made changes based on analytics.
  • The dashboards are now much better configured, and the team has multiple views of their data, including a raw profile. They have started segmenting visits and have set up real-time search terms, which helps keep the team in touch with their users.
  • The service manager explained how they are concentrating on analytics that will inform the team to make user-focussed improvements.
  • The panel and the service team had a good discussion about benchmarking the service and measuring performance. The service manager was convincing in the description of the service’s success thus far, including a successful transition from the CloudStore, unsolicited positive feedback, no vehement negative feedback, and peer organisations recommending the Digital Marketplace.
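
The report does not define the customised KPIs the team set; purely as an illustration, one bespoke measure for a search-led service could be the proportion of sessions containing a search that go on to view a service detail page. A minimal sketch, assuming a hypothetical CSV export of page views (the file name, column names and URL patterns below are assumptions, not the team’s actual analytics set-up):

    import csv
    from collections import defaultdict

    # Hypothetical analytics export: one page view per row, keyed by session.
    sessions = defaultdict(list)
    with open("pageviews.csv", newline="") as f:
        for row in csv.DictReader(f):
            sessions[row["session_id"]].append(row["page_path"])

    # Sessions that ran a search, and the subset that then viewed a service page.
    searched = [paths for paths in sessions.values()
                if any(p.startswith("/search") for p in paths)]
    converted = [paths for paths in searched
                 if any(p.startswith("/services/") for p in paths)]

    rate = len(converted) / len(searched) if searched else 0.0
    print(f"Search-to-service-view rate: {rate:.1%}")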

Recommendation: the team mentioned exploring the use of Google Tag Manager; this will need to go through the Assurance Process if pursued.

25. Make a plan for the event of the service being taken temporarily offline.

  • The team explained how there is a CDN (Content Delivery Network) in place, and downtime measures in place (via the operational support model discussed earlier). The GOV.UK Infrastructure team have done an evaluation and are happy with this approach.
  • The service team reiterated that users have a high tolerance for small outages, although no outages are anticipated. The team explained how they can contact all active users quickly in the unlikely event of prolonged downtime.

Recommendation: a bespoke failover page should be developed during the beta, to replace the standard ‘This application is unavailable’ page.


Original Assessment Report

The Digital Marketplace is seeking permission to launch as a Beta service.

Outcome of service assessment

After consideration we have concluded the Digital Marketplace should not be given approval to launch as a Beta service.

Reasons

The panel concluded that the assessment was too early and that the Digital Marketplace was not yet at the standard GDS expects for services going into Beta. The specific criteria which were not passed are explained below.

2. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

  • The panel’s main concern is the lack of permanent web operations roles in place. The service currently has a single person on loan from GOV.UK. There is a large backlog of web operations tasks outstanding, and we expect these to be largely complete before a Beta assessment can be passed. The provision of web operations once the loan period is over was unclear.

5. Evaluate what tools and systems will be used to build, host, operate and measure a service, and how to procure them

  • The panel had concerns about the lack of web operations roles in place to operate the service, and the lack of monitoring, log collection and aggregation. The panel understand that this work is planned, but to pass the Beta assessment, this work must be complete.

18. Use analytics tools that collect performance data

  • Previous recommendations from the alpha assessment indicated that the team should look at analytics from the existing CloudStore service. These analytics could be used to baseline existing success measures and track how well the replacement Digital Marketplace service performed. That work does not appear to have happened. There is a plan in place to do this, and an analyst is being seconded to the team for a reasonable term, but we would expect this work to happen before the product can pass into Beta.
  • It is acknowledged that the 4 KPIs in the Service Standard don’t map well onto a service that is fundamentally not transactional. We would like to see the team work with the Performance Platform to define which KPIs are relevant and to define success measures accordingly.

25. Make a plan for the event of the service being taken temporarily offline.

  • There was no evidence of a clear plan to follow in the case of an unexpected outage or malicious attack. The business requirements indicate that a 9am to 5pm, Monday to Friday support model is sufficient. However, alerts, a planned response, and a responsible team to deal with an unexpected outage must be in place for a product to pass into Beta.

Recommendations

1. The service team needs to include people with the capability to operate the service, either as full-time members of the service team or through a demonstrable support model provided by another team with the necessary capability and capacity.

2. Monitoring, log collection and aggregation need to be in place.

3. An operations manual should be written and available.

4. A plan must be in place for the actions the operations team should take in the event of an emergency, and this plan must have been tested.

5. A plan must be in place for scheduled or unplanned downtime, including communications to users.

6. Benchmark the performance of the old service (CloudStore) to provide a baseline to measure the performance of the Digital Marketplace.

7. Define how you will measure success and set your own KPIs.

8. Upgrade to the Premium version of your current analytics package.

Observations against other criteria

User needs

  • The excellent work to determine the user needs for this service was evident. The Digital Marketplace is a revolutionary product and the service manager, product manager and user researcher displayed a deep knowledge and understanding of users and their needs, which was shared by the whole team.
  • The service design has been truly informed by user research and analysis of user needs. The team gave excellent examples of this during the assessment by describing the research-prompted features of saved searches, ‘and/or’ searches, the need for repeatability, and service descriptions.
  • The team has used a variety of research methods and gathered valuable data and insights. The panel was impressed to hear of the team-wide attendance at weekly research sessions and the resultant knowledge and understanding of their users that the team demonstrates. There are 2 user researchers on the team.

The team

  • There is a multidisciplinary team in place, with the exception of a web operations role, as discussed above. A product analyst was about to join the team.
  • Some great examples of agile product delivery were discussed, including how the team turned around problems raised in retrospectives, put more effort into planning, and used a feature wall to give developers more autonomy. The team has seen an increase in productivity as a result.
  • The content designer on the team is working in isolation - we discussed that they should collaborate with the GDS content design community to ensure the Digital Marketplace product follows latest content design patterns.

Security, privacy, tools and standards

  • We had a good discussion around these criteria and the service team displayed a thorough understanding of the needs and how they would meet them. They have undergone assurance processes and are working through the potential areas of concern identified.
  • The Panel had some concerns about aspects of these criteria, outlined above.
  • The information on cookies is out-of-date, although we heard this work was planned.

Improving the service

  • The panel was impressed to hear how the service team can respond quickly to iterate the product, and do so frequently, following findings from research.
  • The team gave a good example of a feature they introduced within a couple of weeks, from research insights through design and testing to deployment.
  • The panel also noted how the team have removed unnecessary or incomplete features that were not contributing positively to the user experience.

Design

  • The team showed thorough evidence that users can successfully use the product unaided and have a positive experience. User research informed all design decisions. The team have been tracking technical ability of their users and making decisions accordingly.
  • The team have solved problems for users, such as having 2 start pages for the service for different audiences.
  • The service team have not specifically tackled the offline steps in this service, although they have a vision for how this could work and plenty of research to aid development in that area.
  • The panel noted how the service team is using an adapt and adopt approach to GDS style guides and design patterns. Some of the content in the service isn’t in GOV.UK style. The Content Designer should work with GOV.UK Content Designers to get the content into GOV.UK style before the next assessment.
  • The service has not been designed to be responsive. Data has informed this decision. The team explained they are monitoring data in case they need to review this decision.

Assisted Digital and Channel Shift

  • The team demonstrated that their users were highly IT literate due to the nature of their roles and the need to have completed other more complex services before accessing the Digital Marketplace. They provided evidence of this by plotting the users they tested onto the digital inclusion scale. The panel was particularly struck by this excellent approach.
  • They have not yet identified a need to provide assisted digital support for this service, but will continue to conduct research with suppliers and to seek out potential assisted digital users.

Analysis and Benchmarking

  • The service team have tracking in place but not interpretation of analytics. A new role due to start will take responsibility for this area. The panel had concerns about the stage the service is at with this criterion, outlined above.
  • The analytics profiles will benefit from a better set-up. The panel made several suggestions for how the performance of the service could be measured, including aligning measures with business goals, measuring the performance of search rankings, and segmentation possibilities such as measuring results by procurement framework.
  • The standard KPIs for services are not relevant for this product.
  • The panel agreed the team needs more defined performance measures in place to evaluate success, as outlined in the recommendations above.

Testing with the Minister

  • The team has made plans for this.

Next Steps

You should follow the recommendations made in this report and see the Government Service Design Manual for further guidance. In order for the service to proceed, we require a review of the criteria that were not passed.


Digital by Default Service Standard criteria

Criterion  Passed  |  Criterion  Passed
1          Yes     |  2          Yes
3          Yes     |  4          Yes
5          Yes     |  6          Yes
7          N/A     |  8          Yes
9          Yes     |  10         Yes
11         N/A     |  12         Yes
13         Yes     |  14         Yes
15         Yes     |  16         Yes
17         Yes     |  18         Yes
19         Yes     |  20         Yes
21         N/A     |  22         N/A
23         N/A     |  24         N/A
25         Yes     |  26         Yes