There are several key approaches to measuring and reporting on a delivered digital experience. Measuring and reporting on the brand new, or newly optimised, experience is important because it yields data-led insights that can be applied to enhance the experience further; some insights only come to light after the initial launch.
Measuring & testing during ‘Define’ & ‘Design’:
User testing, typically carried out during the design stage, assesses a prototype or an existing feature of the digital experience with real users. It involves surveying the reported experiences of users who try out sections and attributes of the experience, then drawing on the collected data to inform optimisations to the platform. It is a valuable way to measure progress against selected KPIs and to gain real-world insight into the effectiveness of chosen aspects of the digital experience. User testing also adds a humanised layer of measurement: qualitative data capture sits alongside quantitative metrics, which is essential for a balanced view of users' experiences.
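To show how user-testing results can feed quantitative KPIs alongside qualitative feedback, here is a minimal sketch that aggregates hypothetical session data into two simple measures: task success rate and mean satisfaction rating. The data, field names, and task are illustrative assumptions, not a standard schema.

```javascript
// Hypothetical user-testing results: one entry per participant attempt at a
// task. "completed" and "rating" (1-5 satisfaction) are illustrative fields.
const sessions = [
  { participant: "P1", task: "checkout", completed: true,  rating: 4 },
  { participant: "P2", task: "checkout", completed: false, rating: 2 },
  { participant: "P3", task: "checkout", completed: true,  rating: 5 },
  { participant: "P4", task: "checkout", completed: true,  rating: 3 },
];

// Task success rate: share of participants who completed the task.
const successRate =
  sessions.filter((s) => s.completed).length / sessions.length;

// Mean satisfaction rating across all participants.
const meanRating =
  sessions.reduce((sum, s) => sum + s.rating, 0) / sessions.length;

console.log(successRate); // 0.75
console.log(meanRating);  // 3.5
```

Numbers like these summarise the quantitative side; the qualitative side (what participants said while completing the task) is reported separately.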
Measuring post-delivery (after the ‘Develop’ stage):
Creating analytics dashboards that highlight the performance of chosen engagement metrics gives both the client and the consultant flexible visibility of the digital experience. For site- or app-based experiences, tracking is commonly implemented through tagging. Analytics platforms such as Google Analytics and Adobe Analytics collect data from the tagged experience, which can then be tailored into dashboards or downloaded as custom reports.
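As a rough illustration of tagging, Google Tag Manager-style implementations push events into a `dataLayer` array, which the tag manager then forwards to the configured analytics platform. In a real page the `dataLayer` is created by the tag manager's snippet; here it is stubbed so the sketch is self-contained, and the event and parameter names are illustrative assumptions.

```javascript
// Stub of the dataLayer that a tag manager snippet would normally create.
const dataLayer = [];

// Push a tagged event; the tag manager would forward each push to the
// analytics platform (e.g. Google Analytics) for dashboards and reports.
function trackEvent(eventName, params) {
  dataLayer.push({ event: eventName, ...params });
}

// Example engagement events behind a dashboarded KPI such as sign-ups.
// The event names and parameters here are hypothetical.
trackEvent("cta_click", { ctaId: "hero_signup" });
trackEvent("sign_up", { method: "email" });

console.log(dataLayer.length);   // 2
console.log(dataLayer[1].event); // "sign_up"
```

Once events like these flow into the platform, the same data can be surfaced in dashboards for the client or exported as custom reports.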
Testing to ‘Disrupt’ & Optimise:
CRO stands for Conversion Rate Optimisation. CRO testing, most commonly A/B testing, evaluates a proposed platform optimisation or change against the current experience before it is fully developed and released as part of the digital experience. By testing the proposition on real traffic first, its outcome can be measured and the proposition proved or disproved. This reduces wasted development work and largely prevents optimisations that are not worthwhile from being pushed through to development.
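A common way to judge a CRO (A/B) test is a two-proportion z-test on conversion rates. The sketch below, with illustrative visitor and conversion counts, shows how the control and variant rates are compared and how the z-score indicates whether the observed uplift is likely real rather than noise.

```javascript
// Evaluate an A/B test with a two-proportion z-test.
// convA/visitsA: control conversions and visitors; convB/visitsB: variant.
function zTest(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA; // control conversion rate
  const pB = convB / visitsB; // variant conversion rate
  // Pooled rate and standard error under the null hypothesis (no difference).
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return { pA, pB, z: (pB - pA) / se };
}

// Illustrative numbers: 4.0% vs 5.2% conversion over 5,000 visitors each.
const result = zTest(200, 5000, 260, 5000);
console.log(result.pA); // 0.04
console.log(result.pB); // 0.052
// |z| > 1.96 corresponds to significance at the 5% level (two-sided).
console.log(Math.abs(result.z) > 1.96); // true
```

If the result is significant and in the variant's favour, the proposition is supported and can be developed with confidence; if not, the change is dropped before any full build.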
Testing and measurement implementations give the delivered digital experience an additional layer of credibility: tangible insights derived from the data can back up initial positions, or justify further optimisations and changes.