Business Properties Rental Information - Beta Assessment

One of the Valuation Office Agency’s fundamental responsibilities is to produce fair and accurate valuations of non-domestic properties. To achieve this, rental information is required for all non-domestic properties. This information is currently collected via an eight-page ‘Form of Return’ (FOR), which is issued to customers in paper format to complete and return with the required information.

An online method (eFOR) for completing the form of return (referred to on page 1 of the paper FOR) already exists and feeds data directly into the database after a validation process. However, only 10% of customers currently use it; the other 90% of submissions arrive on the paper form and are manually input into the Agency’s database. The Agency ultimately intends to replace the paper FOR with a letter advising customers to visit its website and submit online, so that the data feeds directly into the database.

Department / Agency:

Date of Assessment:

Assessment Stage:

Result of Assessment:
Not passed

Lead Assessor:
J. McEvoy

Product Manager:
H. Christian

Digital Leader:
M. Dearnley

Assessment Report

Outcome of service assessment

After consideration, the assessment panel has concluded that the Business Properties Rental Information service should not be given approval to launch on the domain as a Beta service.


The panel wants to make it clear that the development is very much going in the right direction. The team clearly works in an agile and collaborative manner, and the handover to the new service manager is being handled very well. It is good to see that the service manager is responsible for the entirety of the service and has the ability to change the paper as well as the digital parts of the service.

User needs and user research

There has been insufficient research with users with low digital skills. Although there is a desire to carry out both prototype-based research on the service and research into support needs with users with low digital skills, this has not yet fully taken place due to issues with recruiting users. This is important because some assumptions have been made about the number of users requiring support, and about how support should be designed. Research will enable the team to test these assumptions and understand those users’ needs for support, and to start making decisions during the private beta phase about how, specifically, to meet those needs. It may also help the team to improve the design of the on-screen service.

More broadly, the team were also unable to make clear statements about the granular user needs across the various audience segments for this service. This seemed to flow from the lack of any clear or systematic segmentation or profiling of the audience. While much useful research has been conducted during the beta so far, it has mainly been with small business users. Other audience sectors, e.g. large businesses and agents, have not featured significantly in recent research. The team mentioned various other audience dimensions - e.g. business size, agent/tenant, tenant experience, single/many returns - but there has been no systematic or broad-based research in any of these areas.

Recruitment for the research that is being conducted has generally been through internal channels, e.g. people who have contacted the call centre. As a consequence, the sample consists primarily of people who already have some familiarity with the VOA and with the process. The risk in this approach is that important areas of user need will be missed.

The team

The team are working in an agile way, and most disciplines are represented. The GDS design and interaction patterns are being followed; however, the panel had concerns about the front-end developer acting as a designer instead of this being a separate role. Given that the team are not observing user research sessions and are shortly to begin sharing their user researcher with another service, this is not sustainable and does not meet the service standard.


Although the team has enhanced its existing contact centre support and plans to test and measure this in beta, it was not demonstrated why this support was chosen over other providers or channels (e.g. face-to-face) to meet user needs. This is a requirement for services to move to public beta and was a clear recommendation in the report following the alpha assessment.

The panel raised two issues with the letter that is sent to users asking them to use the digital service. The logo used on the letter is of poor quality and makes the letter look like it could be fake, and the content of the letter is still worded in such a way that it sounds like a user needs to send in a paper form.

The agency logo is applied to every page of the digital service despite a lack of justification in terms of user need. The team should test removing the logo and track the effect on completion rates and take-up.


Recommendations

Point 1 - Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.

Make identification and recruitment of assisted digital users a priority. There has been some work to understand who will have support needs, their barriers to using the digital service independently, and what support provision will be effective, but more needs to be carried out to ensure these needs are well understood before making decisions about appropriate support. The GDS assisted digital team can provide further guidance on what service teams should have covered by the beta assessment.

Develop robust and systematic profiles or segmentations of the audience for the service, based on qualitative and quantitative data, which includes attitudinal and demographic dimensions relevant to each.

Develop a set of user needs statements, based on this research, which reflect the breadth of need across the audiences. A robust user needs statement typically has the following qualities:

  • It is something a real user would say
  • It helps you to design and prioritise
  • It does not unnecessarily constrain possible solutions
  • It is based on good evidence

Develop a research plan which addresses each audience sector identified in the segmentation/profile and which includes a range of research methods - including contextual research and lab research - to clarify user needs that can be further used to iterate the prototype.

Point 2 - Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users to improve the service.

Consider using a user research lab and an external recruiter. This will allow recruitment of people from the target audience who have no knowledge of the form or process (supporting a more robust understanding of user need) and it will allow the whole team to view research (which will support team buy-in and understanding).

Expand the range of user-types who are recruited for forthcoming research. Include respondents from other parts of the audience, including large businesses and large agents.

Use the findings to identify the key dimensions of user need and user behaviour, and from this develop a profile, model or segmentation of the audience against relevant dimensions, which can be used as a basis for describing user need at a more granular level, and for future research recruitment.

Point 3 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

Consider recruiting a designer (ideally, the panel would recommend, on a full-time basis). This will complement the existing skills within the team (front-end developer, user researcher) and ensure that the important interplay between user research and design is fully in place.

As mentioned above, effort should be made to involve the entire team in user research.

Point 12 - Create a service that is simple and intuitive enough that users succeed first time.

Design support to meet user needs, based on user research. All options for support should be explored, including whether a face-to-face route is appropriate. If they are planning to use it, the team should fully understand the support HMRC is proposing and ensure that it meets the user needs of this specific service. All support should be set up and ready to test alongside the on-screen part of the service during public beta.


The assessment team would like to thank the service team for their well-informed answers to our questions. We look forward to hearing what the service team learn from their research and to seeing how the service develops.

Digital Service Standard criteria

Criterion  Passed    Criterion  Passed
1          No        2          No
3          No        4          Yes
5          Yes       6          Yes
7          Yes       8          Yes
9          Yes       10         Yes
11         Yes       12         No
13         Yes       14         Yes
15         Yes       16         Yes
17         Yes       18         Yes