Introducing UCD methods at Inland Revenue/EDS
Contents of this page:
- Usability testing
- Desired improvements
- Finding a project to try the methods
- Initial maturity assessment
- Introducing the methods
- Achieving usability targets
On other pages:
- Usability maturity assessment
- Justifying UCD: calculating cost benefits
- Which methods were used and why
- Experience with use of the methods
IR/EDS have known for a number of years that delivering multi-million-pound business systems to time, cost and requirement was not the complete answer to satisfying their customers in the Revenue's nation-wide chain of local offices, who administer the Tax, Tax Credit and National Insurance systems. An increasingly sophisticated target audience saw IT playing a key role in improving their efficiency and effectiveness, and wanted business systems that were not only robust and reliable but also fitted their business tasks and were consistent with the other applications they used in their day-to-day work.
Work with the National Physical Laboratory opened our eyes to the possibility that usability testing could help plug that gap in the development lifecycle, and following that experience the IR business development process was amended to include end-of-lifecycle usability methods.
Over a period of three years that end-of-lifecycle approach worked very well, giving projects an objective view of how a system was likely to be perceived by the customer. However, it became increasingly clear that it was only a partial solution.
The main weaknesses were:
- work was being done at the end of the lifecycle as part of business assurance rather than throughout the lifecycle as part of design, so we could only find faults, not build quality in;
- evaluations were carried out by the specialist business testing group so the wider design and development groups didn’t see usability as their responsibility;
- the business testing group didn't always set a usability requirement or measure against it; too often they relied on diagnostic evaluations, which reinforced the divide with the developers, who were working to implement a business requirement (not the same thing at all).
Strengths did exist, in particular:
- a group of experienced usability analysts in the business testing group who saw the weaknesses, wanted to improve and provided a firm foundation to build on;
- willingness of senior management on both sides to see this as a good thing and support (and pay for) changes to be made;
- an IT development lifecycle that involved empowered users throughout and took an iterative, incremental approach to design and development.
The go-ahead was therefore given for IR/EDS to join forces with Serco Usability Services to put together a process improvement project with the aims of:
- establishing a benchmark to improve from;
- highlighting those areas of improvement that would give us the greatest business benefit;
- achieving a formal fit with the EDS IT lifecycle that ensured we used the right techniques at the right time in the development lifecycle;
- specifying usability requirements;
- gathering data on the pounds, shillings and pence benefits of usability.
Finding a project to try the methods
The main problem then was finding a project that would volunteer to be the pilot. Though senior management supported the initiative, there was no formal process improvement culture to plug into or call upon, so it was left to the designated IR/EDS leaders to secure the trial project. A number of formal presentations and informal enquiries to various projects quickly brought home the message that though project managers and requirements managers were sympathetic to the aims, they saw process improvement as a risk rather than a benefit to their projects and didn't want to play ball. There was no magic solution; the situation was resolved by calling in favours, selling harder and targeting key players.
That path eventually led to a forward-looking sponsor interested in ensuring that her end users were involved in the project in the right way, using the right methods. The difficulty of securing a pilot project was noted as an issue and played a part in a wider debate about putting some formality into process improvement, so that all work of this type could be done and managed in a consistent way established within the partnership.
TIP – An established process improvement group is a more effective way of introducing change than relying on individual initiatives whether started from the shop floor or the boardroom.
Initial maturity assessment
It was, however, a wary project team that came together for the first maturity assessment, uncertain what they had let themselves in for. The assessment opened everyone's eyes to:
- the different ways users could and should be involved throughout the lifecycle;
- the benefits that could accrue to both the project and IR/EDS;
- the professional support available from Lloyds Register and Serco Usability Services.
The output from the assessment was not only a clear-eyed view of the level of maturity in this area but also a straightforward model for raising it, aimed at the heart of the development lifecycle: the facilitated workshops that are the engine of the design and development stages. The model was based around:
- using context analysis to scope who will use the system and what tasks they'll undertake;
- producing task scenarios for all the main tasks;
- setting usability requirements in support of efficiency, effectiveness and satisfaction;
- distilling the above into a "preparation pack" so there is a common view of what to address for each function;
- using paper mock-ups and task scenarios to design and verify windows;
- adhering to a corporate style guide to provide a consistent look and feel to windows;
- validating the resultant design using functional prototypes;
- invariably, refining the approach through a number of iterations, and then measuring the final system against the usability requirement.
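The "set a usability requirement, then measure against it" steps above can be sketched as a simple data structure and pass/fail check. This is a minimal illustration only: the task name, target values and helper function below are hypothetical, not the project's actual requirements (though the three dimensions — efficiency, effectiveness and satisfaction — and the SUMI scale, whose standardised population mean is around 50, come from the text).

```python
from dataclasses import dataclass

@dataclass
class UsabilityRequirement:
    """Targets for one task: efficiency (time), effectiveness (quality),
    satisfaction (SUMI score)."""
    task: str
    max_time_secs: int    # efficiency target: complete the task within this time
    min_quality: float    # effectiveness target, e.g. % of task steps done correctly
    min_sumi: int         # satisfaction target on the SUMI scale (population mean ~50)

def meets_requirement(req: UsabilityRequirement,
                      time_secs: int, quality: float, sumi: int) -> bool:
    """True only if all three measured values meet their targets."""
    return (time_secs <= req.max_time_secs
            and quality >= req.min_quality
            and sumi >= req.min_sumi)

# Hypothetical example: a "register new taxpayer" task
req = UsabilityRequirement("register new taxpayer",
                           max_time_secs=300, min_quality=90.0, min_sumi=50)
print(meets_requirement(req, time_secs=270, quality=95.0, sumi=54))  # True
print(meets_requirement(req, time_secs=330, quality=95.0, sumi=54))  # False: too slow
```

The point of the structure is that a requirement is only met when all three dimensions pass; a fast task completed badly, or completed well but disliked, still fails.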
TIP – A benchmark is a great thing to have especially an industry standard one that is assessed by rigorous, methodical professionals from outside of the organisation.
Introducing the methods
The next step, however, nearly put the skids under the initiative. The pilot project was facing very short timescales, an expanding requirement and a lack of trained, experienced staff on both sides. Add to that a group of end users working on their first commercial IT project and, in hindsight, you can see it could have been a recipe for failure. It also meant that when we tried to introduce the new model of smarter working into the facilitated workshops, so much else was happening within them that little progress was being made. After a number of false starts, things came to a head late one afternoon after a marathon eight-hour workshop involving 15 people that had gone two steps backwards for every one step forward. As the workshop dissolved, with people walking away wondering where it would all lead, the light seemed to go on above the heads of the key players in EDS and IR simultaneously. A hastily convened council of war ended in agreement that from the next workshop the new methods would be given their chance, with the decks swept clear to allow them to succeed.
It would be easy to say that from then onwards a period of unalloyed success followed. It didn't, of course, but over two difficult months the methods began to prove themselves, and a degree of engineering was brought to the workshops that hadn't previously existed or even been hinted at. Even more important, perhaps, was the framework those methods provided to the end users, who started to appreciate their roles and responsibilities and to ensure their business knowledge and experience was fed into the design. Business benefits accrued not only in a computer interface closely aligned to the main business tasks but also in a quicker development process that was clearer on what to do and how to do it. Case studies of all the individual methods used are available.
TIP – Process improvement doesn't happen in a vacuum, and the other factors need to be recognised and managed. That is helped by having someone with responsibility working inside the pilot project rather than dipping in and out.
Achieving usability targets
The development experience has been verified in the business testing phase of the application, which demonstrated staff achieving the efficiency, effectiveness and satisfaction targets set in the usability requirement. The table below details the key results and shows staff completing tasks quickly and to acceptable quality standards on their first day of using the on-line system in the test laboratories. The user-centred methods employed during development, which ensured we were designing to meet real work scenarios, played an important part in delivering those test results.
|Task|Time (min:sec)|Time Requirement (min:sec)|Task Quality Score|Quality Requirement|Satisfaction Score (SUMI)|Satisfaction Requirement (SUMI)|
Copyright © 2002 Serco Ltd. Reproduction permitted provided the source is acknowledged.