The Government Digital Service (GDS) was set up in 2011 by the Cabinet Office with a mandate to transform the provision of government digital services by putting users’ needs before the needs of government.
We have been lucky enough to work with GDS since early 2013, and since then we have followed the Digital Service Standard and Service Design Phases in all our work with other government bodies as well as private sector clients – we strongly believe it is one of the most progressive ways to build digital products and services. So this is the approach we took when rebuilding The Queen’s Awards for Enterprise (part of the Department for Business, Innovation and Skills).
Starting in late 2014, we took the Queen’s Awards for Enterprise team from Discovery all the way through to the public Beta. At the moment, we are preparing to move on to the Live phase, and we are providing support and maintenance.
In this and the next blog post, we’re going to look at the project from the start through to the launch of the public Beta, including how we went about meeting user needs, fulfilling business and technology requirements, and launching in time for the next application cycle.
Discovery: Close collaboration with the QAE team to maximise learning
Because we knew that we’d be under time pressure from the very beginning of the project, we set out to collaborate as closely as possible with the QAE team during the Discovery phase. We used their knowledge of the service and existing data – telephone and web surveys with past applicants – to help speed up the learning process.
After analysing the available data, the QAE team helped to source participants – including previous Queen’s Awards applicants, QAE assessors and administrators – for the series of interviews and workshops that our UX team led to collect both qualitative and quantitative data from these different users.
A user story workshop with BIS (the Department for Business, Innovation and Skills) was carried out to prioritise stories using the MoSCoW method: ‘must have’, ‘should have’, ‘could have’ and ‘would (be nice to) have’. This helped to identify BIS’s objectives for the project, such as attracting more high-quality applications through the new service.
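The outcome of that prioritisation is easy to picture in code. The sketch below is purely illustrative – the stories are invented examples, not actual items from the QAE backlog – but it shows how MoSCoW categories give a backlog a natural ordering:

```python
# Illustrative MoSCoW prioritisation; the stories below are invented
# examples, not real Queen's Awards backlog items.
from enum import IntEnum

class Priority(IntEnum):
    MUST = 1    # the service fails without it
    SHOULD = 2  # important, but a workaround exists
    COULD = 3   # desirable if time allows
    WOULD = 4   # nice to have, deferred

stories = [
    ("As an applicant, I can check my eligibility before applying", Priority.MUST),
    ("As an assessor, I can export applications for offline review", Priority.COULD),
    ("As an applicant, I can save a draft application", Priority.SHOULD),
]

# Sorting by priority yields the order in which features get built.
for story, priority in sorted(stories, key=lambda s: s[1]):
    print(f"[{priority.name}] {story}")
```

Because `IntEnum` values compare numerically, ‘must have’ stories always sort to the top of the backlog.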
Throughout this phase, we kept detailed documentation of the findings from every interview and workshop – and the decisions made based on those results – in the form of research insights, user stories, wireframes, and flowcharts. These results were then gathered for our weekly sprint meetings, where we reported our progress to the QAE team.
Using all the collected data, we prioritised the features that could be included in the application. For example, we found that many users would get a long way through the application process before realising they weren’t eligible, so we prioritised the inclusion of an eligibility questionnaire early in the process.
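The logic behind an up-front eligibility check is simple: gather a handful of answers first, and surface every blocking problem before the applicant invests time in the full form. The sketch below is a minimal illustration of that idea – the criteria shown are assumptions for the example, not the real Queen’s Awards eligibility rules:

```python
# A minimal sketch of an up-front eligibility gate. The two criteria
# below are illustrative assumptions, not the actual QAE rules.

def check_eligibility(answers: dict) -> list[str]:
    """Return the reasons an applicant is ineligible (empty list = eligible)."""
    problems = []
    if not answers.get("based_in_uk"):
        problems.append("The business must be based in the UK.")
    if answers.get("years_trading", 0) < 2:
        problems.append("The business must have been trading for at least 2 years.")
    return problems

# Every failed criterion is reported at once, before the long form begins.
print(check_eligibility({"based_in_uk": True, "years_trading": 1}))
```

Collecting all failures rather than stopping at the first one lets the questionnaire tell users everything they would need to fix, in a single screen.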
Once our Delivery Manager and the QAE Service Manager had agreed on the prioritised features – with the application forms themselves as the top priority – these were used to shape the next stage of the project: the Alpha build.
Alpha: Good communication, short feedback loops, and rapid iteration
During the Alpha phase we began working on a fully-functioning prototype of the Queen’s Awards application. This started with the development team building a version of the application form, as well as most of the public-facing parts of the service.
The aim of this phase was to have working software ready early in the project lifecycle, giving us more time to carry out user testing to ensure the QAE service would meet user needs. In the process of developing this initial functioning prototype, we also built a template of the first application form, which we could then reuse to create the other three (there are four application forms – one for each of the award categories – in the finished service).
We worked with the QAE team to prioritise functionality which would be used earliest in the application cycle – like the eligibility questionnaire and the marketing site. Many of the administrative features were left to be built at a later stage, as they wouldn’t be needed for an extra few months.
In total, the Alpha phase consisted of four one-week sprints, with the development team continuously deploying working software to staging and putting it through peer reviews and automated quality tests. Each iteration was then put through user testing by the UX team, and also tested by the QAE team to make sure that it met user needs.
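Automated quality tests of the kind mentioned above are typically small, fast assertions run on every deploy. The example below is a hypothetical sketch – the function and its rules are invented for illustration, not taken from the real QAE codebase:

```python
# Hypothetical example of an automated quality check run on each deploy;
# the function and its rules are illustrative, not from the QAE codebase.

def validate_company_number(value: str) -> bool:
    """Check a UK-style company number: exactly 8 characters, alphanumeric."""
    return len(value) == 8 and value.isalnum()

def test_valid_number_accepted():
    assert validate_company_number("01234567")

def test_short_number_rejected():
    assert not validate_company_number("1234")

# Running the checks directly keeps this sketch self-contained; in a real
# pipeline a test runner would collect and execute them automatically.
test_valid_number_accepted()
test_short_number_rejected()
print("all checks passed")
```

Gating each staging deploy on checks like these is what keeps a weekly sprint cadence safe: a regression is caught minutes after it is introduced, not at the next round of user testing.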
In order to maintain the speed of delivery, remove project blockers, and keep feedback loops as short as possible, our Delivery Manager held daily standups with all team members involved – UX, design and development. We also held weekly sprint meetings with the QAE Service Manager and members of our UX research team, where we got together to test the software, look at the results of the last week of work, and discuss plans for the next sprint.
At the end of the Alpha we held a phase retrospective, where together with the QAE team we discussed what went well and what could be improved going forward. We also worked with QAE’s Service Manager in preparation for their post-Alpha assessment with the Government Digital Service, helping to gather evidence of how the project followed the Government Service Design Manual, and attending the assessment itself to help answer questions from assessment panel members. The QAE project passed the Alpha assessment first time.
Please read A Walk Through the GDS Service Design Phases: Beta Phase for The Queen’s Awards for Enterprise to learn about the rest of the project.
Do you need help creating or improving a digital service? Contact Matthew, our Technical Director, today on +44 207 125 0160 or drop him a line on [email protected] for a free consultation.