Leveraging User Behaviour Data to Solve User Research Dilemmas
UBA (User Behaviour Analysis) is a self-initiated project that I led while working at JD. It aims to solve several research dilemmas I face in my daily work.
Team: Yuzhou Guo, Shaoxing Wang, Guozhen Chen, Wei Liu
My Role: Product designer
Date: Nov 2014 – present
Deliverable: UBA System as a final product for JD internal staff
As a senior researcher, I’m often confronted with the following dilemmas:
– UX research is time-consuming:
A usability test usually takes three weeks or more. For a webpage or a mobile app, however, product iteration is much faster than for physical products: in just three weeks, a whole new version can ship. Product teams cannot afford to wait for usability feedback on the previous version.
– The research sample is too small:
A new feature inferred from in-depth interviews with a 15-user sample is then validated by a 1,000-sample quantitative study, which is still not enough data for reliable analysis.
– Data do not talk:
Many of our research projects are conducted for designers or product managers, who are usually not data specialists. Research data should “talk” in plain language in order to be truly useful to them.
– How to make user research less time-consuming?
I interviewed quite a few user researchers in the UX industry, hoping to find a way to make user research keep pace with rapid product iteration. Fortunately, I did find some solutions among lean user research methods. However, none of them offered a fundamentally new way to get users’ responses immediately.
I got my creative juices flowing after attending a front-end engineering workshop about collecting mouse-click behaviour data and presenting it in real time on the webpage in the form of data tags. What if we collected more than mouse clicks? Could we leverage users’ behaviour data to reveal their actual intent or needs while they browse a page?
– What are the key behaviour data we should collect?
To figure out which key user behaviour data to capture, I organised a brainstorming session with a cross-functional team of product managers, interface designers, and content strategists.
Each person described the design or operational problems they faced in their daily work. We found that besides where users click, which product managers cared about most, interface designers were curious about when and why users bounce out of a page, and content strategists wanted to know which content attracted users’ attention most. Based on these findings, we chose mouse clicks, scroll reach, and attention distribution as the key behaviour data to capture.
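These three data types can be captured in the browser and shipped to a collector. Below is a minimal sketch of what the event records might look like; the type and field names are my own illustration, not the actual UBA schema. Clicks are normalised to page-relative fractions so that sessions with different viewport sizes can be aggregated on one heat-map:

```typescript
// Hypothetical event records for the three key behaviour data types.
type BehaviourEvent =
  | { kind: "click"; fx: number; fy: number; ts: number } // mouse click
  | { kind: "scroll"; maxDepthPx: number; ts: number }    // scroll reach
  | { kind: "dwell"; sectionId: string; ms: number };     // attention

// Normalise raw click coordinates to fractions of the full page size,
// so a click at the same relative spot maps to the same heat-map cell
// regardless of the visitor's screen resolution.
function normalizeClick(
  x: number, y: number, pageW: number, pageH: number, ts: number
): BehaviourEvent {
  return { kind: "click", fx: x / pageW, fy: y / pageH, ts };
}
```

In the browser, `normalizeClick` would be wired to a document-level click listener; it is kept as a pure function here.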
– How to make data talk?
We considered several ways to display the data: statistical tables, data labels, and so on. None of them, however, could present the data in a direct, visual way. As a UX researcher, I felt obligated to make this analysis tool intuitive to use and easy to interpret, which is why I ultimately chose high-fidelity heat maps to visualise the behaviour data.
– Mouse Click Heat-map
It helps us understand our users and their expectations. When visitors click or tap a link, they are expressing interest; with insight into what they are aiming for, it becomes easy to help them take the next step.
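Under the hood, a click heat-map boils down to binning normalised clicks into a grid and colouring each cell by its count. A minimal sketch of that aggregation step (function and parameter names are illustrative, not the UBA implementation):

```typescript
// Count normalised clicks (fractions in [0, 1)) per cell of a
// cols x rows grid; cell counts drive the heat-map colour ramp.
function clickGrid(
  clicks: Array<{ fx: number; fy: number }>,
  cols: number,
  rows: number
): number[][] {
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { fx, fy } of clicks) {
    // Clamp to the last cell so fx === 1 or fy === 1 stays in range.
    const c = Math.min(cols - 1, Math.floor(fx * cols));
    const r = Math.min(rows - 1, Math.floor(fy * rows));
    grid[r][c] += 1;
  }
  return grid;
}
```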
– Attention Heat-map
It reveals the most valuable real estate on the page: the areas users find most engaging and attractive. With the attention heat-map, content managers know which content users care about most and what they skip over.
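Attention can be approximated by how long each page section stays in the user's viewport. A sketch of that aggregation, assuming the collector emits per-section dwell events (section ids and milliseconds in view; the names are hypothetical):

```typescript
// Sum viewport dwell time (ms) per page section; sections users
// linger on render as "hot" regions on the attention heat-map.
function dwellTotals(
  events: Array<{ sectionId: string; ms: number }>
): Map<string, number> {
  const totals = new Map<string, number>();
  for (const e of events) {
    totals.set(e.sectionId, (totals.get(e.sectionId) ?? 0) + e.ms);
  }
  return totals;
}
```

In a real collector, the dwell events themselves could come from an `IntersectionObserver` timing each section's visibility.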
– Scroll Reach Heat-map
This feature shows how far down the page users actually scroll. It helps pinpoint exactly where users lose interest and which areas hold the elements with the most potential.
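The scroll-reach curve is simply the share of sessions whose deepest scroll reached at least a given page depth. A minimal sketch of that computation, assuming each session reports its maximum scroll depth in pixels (names are illustrative):

```typescript
// Fraction of sessions whose deepest scroll point is at or below the
// given page depth; evaluated at every depth, this yields the falling
// curve that the scroll-reach heat-map colours.
function reachRate(sessionMaxDepthsPx: number[], depthPx: number): number {
  if (sessionMaxDepthsPx.length === 0) return 0;
  const reached = sessionMaxDepthsPx.filter((d) => d >= depthPx).length;
  return reached / sessionMaxDepthsPx.length;
}
```

Evaluating `reachRate` at each screen boundary gives per-screen stay rates of the kind quoted in the case study.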
In the JD Supermarket Channel redesign project (http://chaoshi.jd.com), the UBA platform assisted both the exploratory research before the revision and the measurement of UX improvement after it.
During the exploratory research, the Scroll Reach Heat-map was used to measure whether the page content was attractive to users. Although 94% of users stayed on the page past the first screen, only 46% stayed past the promotion content on the second and third screens. Compared with similar pages such as the Grocery Food Channel (http://fresh.jd.com), where the stay rate at the third screen was 67%, the Supermarket Channel was 21 percentage points lower. This suggested the promotion area as one focus of the revision.
The Click Heat-map was used to surface possible design problems in the current version. Below is the click pattern on a typical “category floor”: the category tabs, banner switcher, and brand locator attracted the most clicks, while the banners did not attract much attention. To figure out why the banners did not seem attractive, we conducted a follow-up eye-tracking experiment and found that the main design problem was that the banners’ USP text was too small for users to notice.
After the redesign, an A/B test was launched to measure the UX improvement. In this case, attention distribution data from the Attention Heat-map was used as one of the measurement indexes.
As the chart above shows, first-screen stay time was lower in the new version than in the previous one, while stay time on the other floors was longer. Since the redesign aimed to improve users’ first-screen search efficiency and their willingness to browse the category floors below, we concluded that the revision achieved both goals.
There are other research cases in which we leveraged the UBA platform as a powerful analysis tool; however, for confidentiality reasons, I cannot share more of them here. As the product designer of this tool, I am still working on helping more people in the company understand the system and use it to solve the design problems our UX and product teams face.