Customer Effort Score (CES)
CES evaluates the ease of a customer’s interaction with a digital product.
CES is commonly collected through surveys or feedback mechanisms that prompt customers to rate their experience based on the level of effort needed to resolve issues, complete tasks, or achieve goals. The rating scale typically ranges from “Very Easy” to “Very Difficult” or similar options.
For example, a customer who found the interaction difficult would likely select a low number such as 1, while the other end of the scale represents an easy, low-effort interaction.
CES = Sum of all ratings / Number of responses
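As a quick sketch, the formula above is just the mean of all effort ratings. This example assumes a 1–7 scale where 7 means "Very Easy"; the scale and its direction vary between survey tools.

```python
def ces_score(ratings):
    """Average effort rating across all survey responses (assumes 1-7 scale)."""
    if not ratings:
        raise ValueError("No responses collected")
    return sum(ratings) / len(ratings)

# Five hypothetical responses, where 7 = "Very Easy"
print(ces_score([6, 7, 5, 6, 7]))  # → 6.2
```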
Usability Metric for User Experience (UMUX)
It evaluates the user experience of a digital product, assessing the perceived usability and user satisfaction based on participants’ feedback, with higher scores indicating a better user experience.
The UMUX is a four-statement questionnaire:
The capabilities of this website/product/tool/software/prototype meet my requirements.
Using this website/product/tool/software/prototype is a frustrating experience.
This website/product/tool/software/prototype is easy to use.
I have to spend too much time correcting things with this website/product/tool/software/prototype.
Users rate each statement from 1 to 7, with 1 being strongly disagree and 7 being strongly agree. Here is how to calculate a single UMUX score:
For each odd-numbered question (1,3), subtract 1 from the participant’s response.
For each even-numbered question (2,4), subtract the participant’s response from 7.
Add up the scores for each user, divide the total by 24, and multiply by 100 to place the score on a 0–100 scale.
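The steps above can be sketched in a few lines of Python. The positively worded items (1 and 3) contribute `response - 1` and the negatively worded items (2 and 4) contribute `7 - response`; UMUX is conventionally reported on a 0–100 scale, so the sum (out of 24) is multiplied by 100. The example ratings are illustrative only.

```python
def umux_score(responses):
    """UMUX score (0-100) from one participant's four ratings on a 1-7 scale."""
    if len(responses) != 4 or not all(1 <= r <= 7 for r in responses):
        raise ValueError("Expected four ratings between 1 and 7")
    positive = (responses[0] - 1) + (responses[2] - 1)  # items 1 and 3
    negative = (7 - responses[1]) + (7 - responses[3])  # items 2 and 4
    return (positive + negative) / 24 * 100

# Hypothetical ratings of 6, 2, 7, 3 for items 1-4
print(round(umux_score([6, 2, 7, 3]), 2))  # → 83.33
```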
System Usability Scale (SUS)
It assesses the perceived usability of a digital product and consists of 10 standardised Likert-scale questions:
I think that I would like to use this system frequently.
I found the system unnecessarily complex.
I thought the system was easy to use.
I think that I would need the support of a technical person to be able to use this system.
I found the various functions in this system were well integrated.
I thought there was too much inconsistency in this system.
I would imagine that most people would learn to use this system very quickly.
I found the system very cumbersome to use.
I felt very confident using the system.
I needed to learn a lot of things before I could get going with this system.
Participants rate their agreement from 1 to 5, with 1 representing strongly disagree and 5 representing strongly agree. Responses are scored and combined to calculate a single SUS score, ranging from 0 to 100:
For each odd-numbered question (1,3,5,7,9), subtract 1 from the participant’s response.
For each even-numbered question (2,4,6,8,10), subtract the participant’s response from 5.
Add up the scores for each participant and multiply the total by 2.5.
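The SUS scoring steps above translate directly into code. This is a minimal sketch: odd-numbered questions contribute `response - 1`, even-numbered questions contribute `5 - response`, and the 0–40 total is multiplied by 2.5 to reach the 0–100 range. The sample responses are made up for illustration.

```python
def sus_score(responses):
    """SUS score (0-100) from one participant's ten ratings on a 1-5 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("Expected ten ratings between 1 and 5")
    total = 0
    for i, r in enumerate(responses):
        # i is 0-indexed, so even indices hold the odd-numbered (positive) questions
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# One hypothetical participant's answers to questions 1-10
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # → 85.0
```

Averaging `sus_score` across all participants gives the product's overall SUS score.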
Design a UX Measurement Framework
There is no one-size-fits-all approach to measuring user experience. To improve UX, you must choose metrics that align with your business goals.
The good news is that you don’t have to create a UX measurement framework from scratch. There are many frameworks to choose from that cater to different business goals. For example, the AARRR framework by Dave McClure focuses on business metrics to drive growth, while Google’s HEART framework focuses on experience metrics.
Tips for UX Metric Measurement
As you begin building your UX measurement framework, here are a few tips to keep you on track:
Use both quantitative and qualitative metrics to understand the full UX picture.
Ensure that the metrics you track are suitable for the design changes you make. Remember that UX metrics should remain adaptable and may require adjustments over time.
Make sure that your metric analysis is relevant to the UX touchpoint you want to improve.
Leverage tools such as GA4 and an NPS calculator to make tracking and measurement more efficient.
Don’t Get Caught Up Trying to Find the Perfect UX Metrics
To identify the most appropriate UX metrics to track and measure, it’s important to start with your goals, rather than the metrics themselves. This is because there are no “perfect” metrics – only metrics that are relevant to your specific objectives. Establish a consistent process by using existing frameworks or creating a customised one to ensure the metrics reflect your organisation’s unique needs and priorities.
Adrenalin is a leading digital product and technology agency for Australia’s top brands and organisations. Stay informed about the latest digital product trends, strategies, and tactics by subscribing to the Adrenalin newsletter below.