They are an opportunity to get feedback from a panel of experts, to solve problems while the service is being built (reducing the risk of bigger problems happening later), to share knowledge and learn more about how to build services, and to promote the work service teams are doing.
When we started assessing digital services, both GDS and the exemplar programme were very new and departments were just starting to develop their digital teams. We have all learned a lot since then and a lot has changed. That’s why we’ve decided to run a discovery and Sarah Herman, a user researcher, joined our team to help us.
Meeting user needs
Capability across government has grown at a staggering rate over the last two years. We want to make sure the assessments reflect this, while continuing to support excellent digital teams who are using the Digital Service Standard as their guide. We want to ensure that how we assess services meets the needs of those who are developing them, as well as our needs. We started by working out what those needs are.
Last year we shortened the standard from 26 to 18 points. What we haven’t looked at is the assurance process itself.
Sarah will shortly blog about the process she followed and the challenges she encountered during the discovery on the user research blog.
We tried to keep an open mind going into the discovery and not make any assumptions - not even the assumption that the assessment process should continue to exist!
Fortunately, almost everyone we interviewed felt that assessments are vital to ensure high quality and consistency between services produced by different departments.
However, those who have been involved in more than one service assessment mentioned a difference in mood. Some assessments were conducted in a collaborative and engaging way, while others seemed more like an interrogation.
The format assessments take (a 4-hour meeting in one of the GDS meeting rooms) is also not ideal. It doesn't give the assessors the full picture of these complex services, and is very tiring for the service team, who often leave the room feeling like they've sat through a particularly harrowing exam.
The interviews also highlighted that a consistent benchmark doesn't exist for some points in the standard: while some questions are a simple pass/not pass, the results of others can be very subjective.
Lastly, while the current process works well for self-contained transactional services, it can be difficult to apply to products like components, platforms, or APIs - as government builds more complex services this will need to change.
Importantly, the research also found that some issues external to assessments have a huge impact on them - such as funding and governance processes within departments and government overall, difficulties with up-skilling staff, and constraints on the scope of services that teams cannot influence.
Those are not things we can directly influence but we will continue working with other GDS teams and the Digital Leaders to ensure they are addressed.
It is clear that the assessment process will need to go through quite a radical redesign. To ensure we do this right, we will need to bring together a team, decide how we will test, evaluate and iterate any changes, select services for a pilot, and come up with a strategy to scale up the new process once we are sure we are moving in the right direction.
We are currently recruiting a permanent Delivery Manager to help us achieve this and help our team work even better in the future.
While these longer-term changes are under way, we have started running the assessments slightly differently - for example, we now schedule a compulsory meeting before each assessment for assessors to go through the service, ensure they have a shared understanding of its users, and agree discussion points for the assessment.
Assessors will no longer use laptops during assessments, so everyone is focused on the task at hand 100% of the time. We will continuously evaluate these changes and make small, easy-to-implement improvements that alleviate the pain points we identify.
Of course we will need to ensure whatever we come up with ties in with the refresh of the Service Manual, the Performance Platform, the work to develop pipelines of future spend activity and the new government digital strategy.
If you have any questions or comments please get in touch via comments or email the Service Assessments team.