To make sure you have impact, run quick experiments and redesign things based on usability principles. By doing so, we found out that displaying only one recommendation at a time makes more people click.
A click on a recommendation is a success for Playscan: it means we have provoked a reaction or sparked an interest in taking action.
Sometimes, the little stuff creates a big difference. As Thaler and Sunstein write in their bestseller Nudge: “[S]mall and apparently insignificant details can have major impacts on people’s behaviour. A good rule of thumb is to assume that ‘everything matters’.” With the “everything matters” mindset, seemingly insignificant details can indeed prove fruitful beyond expectation.
Back in the day, Playscan always showed two recommendations to players. The rationale was a trade-off: give the player more than one choice, so that he could find something relevant, but not so many choices that he would feel overwhelmed. Out of nothing but our own curiosity, we decided to test whether we were right.
We randomly divided our visitors into three groups, presenting them with one, two or three recommendations respectively. We then measured the click-through rate over a two-week period. At the end of it, we realized that we had left a good many clicks on the table.
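The setup above is a standard A/B/C split. A minimal sketch of the assignment step might look like the following; the function and variant names are our own illustration, not Playscan’s actual code. Hashing the visitor id, rather than rolling a die on every visit, keeps a returning visitor in the same group for the whole test period:

```python
import hashlib

VARIANTS = ["one_tip", "two_tips", "three_tips"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of three variants.

    The SHA-256 digest of the id is reduced modulo the number of
    variants, giving a stable, roughly uniform split.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)
    return VARIANTS[bucket]
```

Because the assignment is a pure function of the id, the same visitor always sees the same number of recommendations, which keeps the two-week measurement clean.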
Our original design with two recommendations produced a 20% click-through rate, measured as the proportion of players who clicked any tip. The three-tip version showed no significant difference, but the one-tip version did: 36% of players clicked the recommendation. Again: the only change was one recommendation versus two – no other design changes, the same selection of recommendations, no new content – and it nearly doubled our click-through rate!
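Whether a gap like 20% versus 36% is statistically significant can be checked with a standard two-proportion z-test. A sketch follows; the group sizes in the usage example are hypothetical, since the actual sample sizes are not reported here:

```python
from math import sqrt, erf

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical groups of 500 players each:
# two_proportion_z(100, 500, 180, 500) gives z ≈ 5.6,
# i.e. far beyond any conventional significance threshold.
```

At realistic traffic volumes, a jump from 20% to 36% is very unlikely to be noise, which is consistent with the result being reported as significant.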
Hindsight is always 20/20, and there is a reasonable explanation for what we found: players are more likely to click through when there are fewer competing, and perhaps confusing, recommendations to choose from. Still, before the test we thought we had an equally good theory of why two recommendations were the way to do things.
So while nearly doubling our click-through on recommendations simply by simplifying things was a big lesson, the biggest was without a doubt that “everything matters”. Ideas and hypotheses are a good starting point, but until proven they are just that: hypotheses.
Now, getting people to use our tools is only the first step toward having impact. When it comes to recommendations, the next is making them relevant. How do we make sure they are? Well, we will test that too.