Innovate UK provides funding through a range of mechanisms, the largest of which attracts applications for collaborative research and development (CR&D) projects from business-led consortia.
The Innovation Funding Service will initially replace the current CR&D service, enabling business-led consortia to apply for public sector funding. All Innovate UK services will subsequently transition to the new service.
The Innovation Funding Service covers the end-to-end service:
- Businesses applying for funding under a themed competition.
- Independent assessment of applications by sector experts.
- Due diligence by Innovate UK staff and, if appropriate, grant award.
- Monitoring of live projects and payments.
Department / Agency:
BIS / Innovate UK
Date of Assessment:
Result of Assessment:
Outcome of service assessment
The panel previously assessed the service in April 2015 and at that time concluded that the service was not yet ready to move to the beta stage.
The panel were impressed by the progress made since that assessment and concluded that the service now meets the alpha service standard.
The team had clearly prepared for the assessment and the panel were impressed by both the pre-assessment briefing and the presentation during the assessment itself.
The Innovation Funding Service agile team have built an excellent alpha service, using a wide range of open source tools, that can be taken through to the beta stage.
The panel were particularly impressed by:
- A strong co-located team working collaboratively, self-organising and fully supported by the service manager.
- A service manager who clearly owns the service and exhibits all the behaviours required in such a role.
- The delivery manager, technical architect and user researcher representing the team, who were clearly competent in their specialist areas and able to provide considerable evidence across the service standard.
- Evidence of the wider business areas of Innovate UK being engaged in the planned transformation.
- The wide-ranging iteration and improvement made based on user research and user testing with the different user types.
- Since the first assessment, the design has been greatly improved - more clear and simple - and usability testing and iteration of some features will continue during the beta.
- The team’s research approach, which recognised the complexity of gathering the needs of all relevant sectors, for example business users with different roles within a company.
- It was good to see that the architecture had been considered from the enterprise viewpoint, ensuring loose coupling of services, which should make future integrations easier, for example plugging in the GDS Notify platform. The team should continue to keep in view the development of common government platforms and how these could be deployed for this service.
The panel have the following recommendations to be progressed during the beta phase:
- To introduce a web analytics tool as soon as possible for user journey start points, as this will provide another source of user analytics.
- To review and test the assisted digital procedures with identified assisted digital users. It would also be good to see analytics performed on the expected numbers in each of the defined user groups.
- To penetration test the service and support model early in the beta phase, so that any identified risks can be understood and addressed in the design and coding rather than becoming an add-on at the end.
- To use automated test tools to test the service across a range of browsers and devices, rather than the manual approach adopted in alpha, given that the service has been designed responsively.
- To consider the retention of knowledge, given the changes in the beta team, and to smooth the transition for any future churn in resources.
- To continue to research the user needs and user experience, particularly for the finance user within larger companies.
- To continue to develop the service in line with the GOV.UK style guide and test the design with users. The panel observed that the difference between “my applications” and “my projects” in the dashboard may not be clear to the user; the meaning of the flag and timer icons may not be obvious; and the format of the deadline and time left infographics may need to be further iterated.
- To progress further discussions with the GDS Performance Platform team, building on the engagement already initiated, to ensure a dashboard is ready for the launch of the service, with a clear understanding of all available data and how it will be used to monitor and improve the end-to-end service.
Digital Service Standard criteria