In evaluating digital projects, the focus for many cultural institutions has been on quantitative methods, using defined metrics to identify and measure impact, value, and success. Continued efforts to develop standardised frameworks and tools, such as aggregate influence scores and dashboard templates, have meant that this form of evaluation is increasingly viewed as quick, broadly applicable, and easily understood. However, there remains dispute over whether this form of measurement, which places numbers and statistics above more subjective, qualitative response, is too reductive. This, in turn, prompts questions for us when evaluating our digital projects:
• What are we actually measuring, and what are the numbers really telling us?
• Why might these results be privileged over other forms of insight, and what makes them significant?
• Are we at risk of oversimplifying and narrowing evaluation processes for digital projects because of the abundance and relative ease of digital evaluation tools?
While recognising the value of Web analytics, we have sought to engage with the above questions when designing the evaluation of digital projects at Wellcome Library. Drawing on Simon Tanner's Balanced Value Impact model, we explore iterative approaches to evaluation through agile frameworks, which can be adapted to accommodate changing priorities and tailored to the needs of different digital projects. Through a multimodal approach, we aim to combine rigorous data analysis with qualitative methods to reveal emergent patterns and behaviours, in order to better understand the often complex relationships our users have with our digital offer.
This approach has generated rich, nuanced data, giving us a greater understanding of our audiences and clear pathways to improving our digital services. Though it remains in tension with many institutions' reporting requirements, it is an approach that is lively with possibility and accommodates the continual evolution and range of response in users' experiences.