An Overview of International Cyber-Security Awareness Raising and Educational Initiatives (2011)
3.9. ‘Work in progress’ evaluation, as part of the design process
Established disciplines in software and web development methodology use ‘work in progress’ evaluation throughout a project to address the high failure rate of large software projects (which some online services resemble). Core techniques include ‘iterative development’ and ‘software prototyping’, conducted to generate interim design artefacts for users and others to evaluate. The aim is to integrate evaluation into the design process and thereby improve fitness for the needs of users. Rather than treating evaluation as a traditional after-the-fact assessment of programme success or failure, this approach places ongoing evaluation at the heart of the development process: evaluation of progress towards understanding and resolving risks, and in particular towards satisfying known user needs. It promises not only greater visibility of progress, but also delivery of materials that are actually useful to known users.
This approach relies on frequent ‘work in progress’ evaluation of the developers’ understanding of, and progress towards, project goals. It is a central technique for steering current projects towards outputs better aligned with the demonstrated needs of identified groups of users. It rests on a gradually elaborated set of sketches, prototypes and disposable artefacts representing current thinking about the nature of ‘the problem’ and the shape of ‘the solution’. The goal is to make the designers’ thinking and assumptions visible to a wide range of users and user surrogates from the earliest, feasibility-assessment stages of the project, and to use these artefacts to prompt feedback from users and others that reveals errors in assumptions, language or plans at the earliest possible time. Early discovery makes remediation cheap and effective; discovery only at the end of a project makes evaluation feedback too late, expensive and difficult to act on.
This model, which we recommend for routine adoption, also depends on sophisticated project managers and clients: people willing to allow these user-centred evaluations to guide the scope, quality and content of the deliverables as the project evolves, and to reward rather than punish designers for focusing on ‘making the worst mistakes early and visibly’. These requirements can present challenges, albeit ones worth the effort.
For instance, the extensive literature starting with McConnell S (1996) Rapid Development: taming wild development schedules, Microsoft Press, especially Ch. 7 and the spiral life-cycle model; Mayhew D J (1999) The Usability Engineering Lifecycle: a practitioner’s handbook for user interface design, Morgan Kaufmann; Pearrow M (2000) Web Site Usability Handbook, Charles River Media, especially Ch. 2.