Tag Archives: responsible gambling

Responsible gambling across borders – the Svenska Spel approach

In April this year the Playscan team attended the Discovery conference, hosted by the Responsible Gambling Council in Toronto, and spoke about the responsible gambling approach of the Swedish state lottery, Svenska Spel. 

In short, this is what we argued:

Registered play can help us act

The collection, storage and analysis of player data has made it possible for Svenska Spel to create a responsible gambling framework that is both personalised and measurable. The framework sets out from the player’s gambling behaviour – and if that behaviour is excessively risky, he or she is notified. The player is offered tools to help them gain control, such as a consumption history view, limit setting and self-exclusion. The success of these efforts, from identifying problem gambling and making contact to how well the tools are used by players, can all be measured, and appropriate KPIs can be set.
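To make the idea concrete, here is a minimal sketch of what behaviour-based flagging can look like in code. The thresholds and field names are invented for illustration only; the actual risk assessment behind Playscan is far richer than two fixed limits.

```python
from dataclasses import dataclass

# Illustrative thresholds only -- not Playscan's or Svenska Spel's actual risk model.
MONTHLY_NET_LOSS_LIMIT_EUR = 800
MONTHLY_SESSION_LIMIT = 40

@dataclass
class MonthlyBehaviour:
    player_id: str
    net_loss_eur: float  # stakes minus winnings over the month
    sessions: int        # number of gambling sessions over the month

def is_excessively_risky(b: MonthlyBehaviour) -> bool:
    """Flag registered play that looks excessively risky (illustrative rule)."""
    return b.net_loss_eur > MONTHLY_NET_LOSS_LIMIT_EUR or b.sessions > MONTHLY_SESSION_LIMIT

def players_to_notify(behaviours: list[MonthlyBehaviour]) -> list[str]:
    """Players who should be notified and offered tools such as the
    consumption history view, limit setting and self-exclusion."""
    return [b.player_id for b in behaviours if is_excessively_risky(b)]
```

Because every step starts from registered play, every step can also be counted: how many players are flagged, how many are reached and how many go on to use a tool, which is what makes the KPIs mentioned above possible.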

Things Svenska Spel actively avoids for the sake of responsible gambling

Based on what we know drives problematic gambling, Svenska Spel does not offer any bonuses or loyalty programmes. Elevated-risk games are marketed restrictively and there is no marketing aimed at high-risk players. This is important to recognise: an effective responsible gambling framework is not only about educating players on how to play smart – it is also about, from an operator’s perspective, acknowledging that responsible gambling means saying no to profits coming from high-risk players.

—–

Do you want to meet us? We’ll be at the EASG conference, in Malta, 11-14 September 2018. Contact us and we’ll gladly tell you more about our work!

Hello! Calling at-risk players produces a positive impact on players and staff

During four weeks in November 2017, we contacted around 70 high-risk players by telephone, informing them about their gambling habits at Svenska Spel. We found that players have a poor understanding of how much money they spend – and they appreciate the information.

Why contact players by telephone?

One responsibility of gambling companies is to minimise the risk of their players developing gambling problems. We know a great deal about the negative consequences that affect the individual and his or her nearest and dearest, and those consequences are devastating. They affect all parts of the individual’s life: economically, socially and psychologically.

We believe that transparency, showing how much gambling really costs, would lead to better-informed consumers. By contacting at-risk players, we wanted to investigate whether phone conversations were an appreciated service and whether they could prevent and reduce adverse consequences from gambling.

The idea of giving at-risk players feedback by telephone is not new to Svenska Spel. The Norwegian gambling company Norsk Tipping started as early as 2014 and now has a permanent organisation for this type of proactive call.

The content of the conversations

The pilot had an exploratory design in which we concentrated on the players’ reactions to the call. In the first phase, the project group was trained by a psychologist in different conversational techniques and developed a conversation concept. The conversations aimed to make the player more aware of his or her gambling consumption, inviting them to reflect upon their habits. If the player was interested, we offered information about possible actions for increased control, guiding them to appropriate help and treatment options.

Which customers did we contact?

For this pilot we used information from the Playscan system and selected three target groups to contact (a rough sketch of the selection in code follows the list):

  • Big losers: players with a high-risk profile in Playscan who lost (net) more than 800 euro in the last month.
  • Young risk players: men aged 18-25 with a high-risk profile in Playscan who lost (net) more than 400 euro in the previous month.
  • Problematic gambling profile: players who, through the self-test, have told us that they have problems with their gambling.
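Expressed in code, the selection might look roughly like the sketch below. The table and its column names are invented for illustration; the actual selection was based on Playscan risk profiles together with Svenska Spel transaction data.

```python
import pandas as pd

# Illustrative player snapshot -- column names are made up for this example.
players = pd.DataFrame({
    "player_id":         [101, 102, 103, 104],
    "risk_profile":      ["high", "high", "medium", "high"],  # Playscan risk assessment
    "sex":               ["male", "female", "male", "male"],
    "age":               [22, 45, 30, 52],
    "net_loss_eur":      [450, 900, 120, 820],                # net loss last month
    "self_test_problem": [False, False, True, False],         # self-test indicated problems
})

big_losers = players[(players.risk_profile == "high") & (players.net_loss_eur > 800)]

young_risk = players[(players.risk_profile == "high")
                     & (players.sex == "male")
                     & players.age.between(18, 25)
                     & (players.net_loss_eur > 400)]

problematic_profile = players[players.self_test_problem]

# One contact list, with players appearing only once even if they match several groups.
to_contact = (pd.concat([big_losers, young_risk, problematic_profile])
                .drop_duplicates("player_id"))
```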

How did they respond?

Prior to our first call we were naturally a bit nervous. How would the player react? How would we handle it if someone got sad? Or angry? There were many questions before our first call, but very soon we were encouraged by the fact that several players expressed their gratitude, and we became convinced that we had good reason to call these customers. Some conversations did not lead to any concrete action, such as changing someone’s limits or helping someone take a break from gambling, but the conversation still seemed to be appreciated. Some said that they could afford to play as they do, but that “it’s good that we are calling”.

Results

Five weeks after the intervention, we see that contacted customers spend less money on gambling (both net and gross) compared to customers we tried to contact but who did not answer. However, the result is not statistically significant. The average length of a conversation was 7 minutes, and the average age of the customers was 37. Out of the 71 phone calls, 11 people chose some type of self-exclusion immediately at the time of the call. The most common topics, however, were limit setting and information about the consumption history view. At the end of each call, the customer was asked whether they had appreciated the conversation; 98% were either positive or neutral towards the call.
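The significance check behind the spend comparison is, in essence, a comparison of means between the contacted group and the group we tried but failed to reach. A minimal sketch of such a check, with placeholder numbers rather than the real data:

```python
import numpy as np
from scipy.stats import ttest_ind

# Placeholder net spend (EUR) five weeks after the calls -- not the actual figures.
contacted   = np.array([40, 55, 10, 80, 25, 60, 35, 90, 15, 50])
not_reached = np.array([70, 90, 45, 120, 60, 85, 50, 110, 65, 75])

stat, p_value = ttest_ind(contacted, not_reached, equal_var=False)  # Welch's t-test
print(f"p = {p_value:.3f}")  # a p-value above 0.05 matches "not statistically significant"
```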

The project had a positive impact internally at Svenska Spel. Staff throughout the entire organisation were very supportive of the project, and when it was communicated externally it also created good PR for the company.

Next steps

The intervention is still being evaluated with follow-up questionnaires to the contacted customers. However, since the early results indicate a positive effect on player behaviour, we plan to continue the conversations – especially since the customers seem to appreciate the service.

New study on how the responsible gambling tool Playscan is used

In the article “Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior”, PhD student David Forsström from the Department of Psychology, Stockholm University, studied how players use Playscan. He examined Playscan 3 data from 9,528 online gamblers who used the tool voluntarily and investigated whether there are different subclasses of users by conducting a latent class analysis. He observed the number of visits to the site, self-tests taken and advice used.


The study has shown that the tool has a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes.


Professor Per Carlbring states, “The low usage of the tool is not a disappointment. As long as the right ones actually use the tool, which is exactly what we found. People with a higher risk level are using Playscan more.”


Find the study here

Using a hypothesis-driven approach to develop effective responsible gambling tools

The prevention of problematic gambling is a complex issue. We at Playscan know it all too well. But in order to learn about effective prevention initiatives, we use the method of validated learning to acquire new knowledge.

By practising hypothesis-driven development for responsible gambling we see the development of new tools and services as a series of experiments to determine whether an expected outcome will be achieved – or not. With this we challenge the concept of having fixed requirements when we develop new features. Instead, the process is iterated until we reach a desirable outcome.

6 steps toward hypothesis-driven development

1. We do user research and formulate a hypothesis

Let us look at an example: In interviews with users we often ask them to describe their general attitudes toward their risk assessment. We hear players ask themselves: ok, so this is my risk assessment…but what do I do now?

(This is where we get the chance to identify what the user is expecting from us. From this it is our responsibility to design features that address the problem.)

Our hypothesis is:
We believe that clearly communicating the answer to the question “what do I do now?”
will result in more players reducing their risk level.
We will know we have succeeded when we see an X% increase in players lowering their risk level.
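Writing the hypothesis down in a structured form makes it easier to accept or reject later on. A minimal sketch of such a record; the field names and the 0.05 value are placeholders of our own, not a fixed template:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    belief: str               # what we believe, and for whom
    change: str               # the feature or change we will ship
    metric: str               # what we will measure
    success_threshold: float  # the "X%" at which we accept the hypothesis

answer_what_now = Hypothesis(
    belief="Clearly answering 'what do I do now?' helps players act on their risk assessment.",
    change="Show one concrete next step next to the risk assessment.",
    metric="share of high-risk players who lower their risk level",
    success_threshold=0.05,   # placeholder for the X% in the text above
)
```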

2. We define targets and points to measure

We base the work on the product’s Impact Map, a document that helps us drive our software development towards effect, meaning delivering the right responsible gambling initiative to the right player.

Example: X% more risk players know what to do in order to lower their risk level. This is measured with an online questionnaire, click-through on recommendations and analysis of the gambling behavior.

3. We design an experiment to test the hypothesis

Best practices and research inspire us when we work on a solution. We talk it through with our experts on problematic gambling, write texts and produce real content.

4. We develop the solution

During the process of bringing the solution to life, software developers, UX designers and copywriters work closely together, simply because it always gives us the best result. Then we launch it.

5. We validate the use, accept or reject the hypothesis

This is where we collect feedback from the players and can see whether the solution delivers the use we expected. Did it work? Or do we need to change anything? Here we learn, iterate and make it even better.
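When the expected outcome is a proportion, such as the share of players who take a recommended action, the accept-or-reject decision can be made with a simple proportion test. A sketch with placeholder counts, not real experiment data:

```python
from statsmodels.stats.proportion import proportions_ztest

# Placeholder data: players who took the target action / players exposed, per group.
actions = [120, 175]      # control, variant
exposed = [1000, 1000]

z_stat, p_value = proportions_ztest(count=actions, nobs=exposed)

control_rate, variant_rate = actions[0] / exposed[0], actions[1] / exposed[1]
if p_value < 0.05 and variant_rate > control_rate:
    print("Accept the hypothesis: the variant outperforms the control.")
else:
    print("Reject, or keep iterating: no convincing improvement yet.")
```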

Our most important work: we iterate!

To ensure that we are on the right course, we work in short iterations that are generally two weeks long. We build the system with small additions of user-valued functionality and evolve by adapting to user feedback. Have we stumbled on any mines? Of course. But it is part of the game – we do not even expect to hit the target the first time. For every experiment we do, we learn something new. Even with a great hypothesis (based on good observations or research), the results are sometimes just neutral. But this is why the method is so effective: we quickly get a hint of what seems to work – and what does not.

From 10% to 60% click-through to an RG-tool – why user interface design matters


How do players use a responsible gambling tool – why user interface design matters – talk by Natalia Matulewicz. Presented at the SNSUS Conference, Stockholm, June 2-3 2015.

How do we increase the usage of responsible gambling tools? Is mandatory or voluntary the way to go?

This talk is meant to inspire and give new ideas on how to increase players’ interest in and usage of responsible gambling tools. With the help of user interface design guidelines and persuasive technology principles, we went from 10% of players using the tool up to an impressive 60%.

Increasing click-through to RG-tools by a simple redesign

To make sure you have impact, do quick experiments and redesign things based on usability principles. By doing so, we found out that displaying only one recommendation at a time makes more people click.

A click on a recommendation is a success for Playscan. It means that we’ve provoked a reaction or created interest in taking action.

Sometimes, the little stuff creates a big difference. As Thaler and Sunstein write in their bestseller Nudge: “[S]mall and apparently insignificant details can have major impacts on people’s behaviour. A good rule of thumb is to assume that “everything matters””. With the “everything matters” mindset, the insignificant details can indeed prove fruitful beyond expectation.

Back in the day, Playscan always showed two recommendations to players. The rationale was a trade-off: give the player more than one choice, so that he could find something relevant, but do not overwhelm him with too many. Out of nothing but our own curiosity, we decided to test whether we were correct.

We randomly divided our visitors into three groups, presenting them with one, two or three recommendations respectively. Next, we measured the click-through rate during a two-week period. At the end of it, we realized that we had left a good many clicks on the table.
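For the curious: splits like this are often done deterministically from the player id, so that the same visitor keeps seeing the same version throughout the test. A minimal sketch, not the exact implementation we used:

```python
import hashlib

VARIANTS = ("one_tip", "two_tips", "three_tips")

def assign_variant(player_id: str) -> str:
    """Map a player deterministically to one of three equally likely groups."""
    digest = hashlib.sha256(player_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same player always lands in the same group for the whole test period.
assert assign_variant("player-42") == assign_variant("player-42")
```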

Our original design with two recommendations showed a 20% click-through, measured as the proportion of players who clicked any tip. The three-tip version showed no significant difference, but our one-tip version did: 36% of players clicked the recommendation. Again: the only change was one versus two recommendations – no other design changes, the same selection of recommendations, no new content – and from this we doubled our click-through!

Lessons learned?

Hindsight is always 20/20, and there is a reasonable explanation for what we found: players are more likely to click through with fewer conflicting and maybe confusing recommendations to choose from. Still, before our test we thought we had an equally good theory of why two recommendations was the way to do things.

So while doubling our click-through on recommendations based only on simplifying things was a big lesson learned, the biggest was without a doubt that “everything matters”. Ideas and hypotheses are a good starting point, but until proven they are just that: hypotheses.

Now, getting people to use our tools is only a first step in having impact. When it comes to recommendations, the next is having relevant ones. How do we make sure that they are? Well, we will test that too.

Making big data actionable by creating user personas

Talk by Natalia Matulewicz at the New Horizons in Responsible Gambling Conference February 2-4, 2015

What do you know about your online player? With anonymous players, customer data is an important factor when making strategic business decisions with limited information. However, big data often becomes a faceless collection of information, rather than a true picture of the players’ wants and needs. One still needs to know how to interpret data and how to combine it with other sources of information.

User Personas bring together big data and qualitative user research such as interviews, field studies and observations to gain an overall picture of a user: their needs, goals and motivation. They also bridge the gap between what players claim they do and what their measured actions show, which in the context of gambling often differ. Combined with big data, User Personas answer three important questions: what are the main target groups, which target groups should be focused on to make the most impact, and how should communications towards those target groups be designed?
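One common way to let the quantitative side suggest candidate personas is to cluster players on a handful of behavioural features and then flesh out each cluster with the qualitative research. A small illustrative sketch, with made-up features and k-means standing in for whatever segmentation method fits the data:

```python
import numpy as np
from sklearn.cluster import KMeans

# Made-up behavioural features per player: deposits/month, average stake (EUR), sessions/week.
features = np.array([
    [2, 5.0, 1],
    [8, 20.0, 5],
    [1, 2.5, 1],
    [10, 50.0, 7],
    [3, 6.0, 2],
    [9, 35.0, 6],
])

segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print(segments.labels_)           # cluster membership -- the skeleton of a persona
print(segments.cluster_centers_)  # "typical" behaviour of each candidate persona
```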

A shorter Self Test: Does not increase completion rate

Summary: Shortening the 16-statement Self Test within Playscan yields negligible improvement in completion rates. The length of the test is not a problem; players either drop off during the first couple of questions or complete the test.


A recurring concern about Playscan has been that 16 statements to consider in the Self Test may be too many. The player may grow impatient and abort the test, especially since the questions themselves can be sensitive and draining. We investigated whether a shorter introductory test with “gate questions” would increase the completion rate of tests.


When a player clicks into the Self Test, an introductory text is displayed. Here, the player is encouraged to consider all gambling, at all gambling sites, during the past three months. The player is then asked to consider 16 statements, one at a time.


[Screenshot: the Playscan Self Test, presenting one statement at a time]


To investigate the usefulness of gate questions, the results from the Self Tests at Svenska Spel between 2014-07-04 and 2014-10-14 were analyzed. Statistics of these are presented below, showing the completion and drop-off rates.

[Chart: Self Test completion and drop-off rates per channel]


Looking at the numbers, the completion rate is quite satisfactory, in particular the 80% completion rate on the web. This high number is likely due to the curiosity that brought the player to Playscan in the first place, and the promise of a self-assessment at the end of the process. Self tests in general tend to have a higher completion rate than surveys thanks to the intrinsic motivation behind doing them.


The majority of the players who drop off do so at the first question. We also see a difference between channels with a 10% drop-off rate on web and 23% on mobile. The higher drop-off on mobile is hardly surprising, given the users’ attention span in the mobile context.


Only 10% of the started tests are dropped between questions 2 and 16, regardless of channel. It is interesting to note that the drop-off rate declines as the test continues.


This leaves us with a clear answer to the question of gate questions. We would have gained only 4% more completed tests if the test had consisted of four statements. That number is hardly worth chasing at the cost of players spending less time contemplating their gambling habits, or missing out on the nuances that the full 16 statements bring.
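The 4% figure is simply the share of started tests that are abandoned after the proposed gate. The arithmetic, shown here with illustrative per-question drop-off counts rather than the actual 2014 numbers:

```python
# Illustrative counts -- not the actual 2014 figures.
started = 1000
drop_off_per_question = [100, 30, 10, 5, 5, 4, 4, 3, 3, 3, 3, 3, 2, 2, 2, 1]  # q1..q16

completed_full_test = started - sum(drop_off_per_question)
completed_if_gated_at_4 = started - sum(drop_off_per_question[:4])

gain = (completed_if_gated_at_4 - completed_full_test) / started
print(f"Extra completed tests from a four-question gate: {gain:.1%}")  # a few percent with these numbers
```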


———


The research done at Playscan is not academically focused, but aimed at practical application.
We are pragmatists, knee deep in data to explore. Our mission is to help prevent problem gambling rather than to study it, so we spend our time chasing preventive effect wherever we sense it.
We value agility and adaptation.
Where the territory is uncharted, our guiding light is curiosity and making a difference. Our data is local. We sometimes see wildly varying player behavior between operators, not necessarily because the players are different, but because contexts and presentations are. We believe that the research community has lots to learn about the importance of things like wording and design, and what we say will often be framed to show this. Our findings reflect the everyday player experience. This is neither universal nor static. It can change and, more importantly, can be changed.
At the same time, we have the deepest respect for formal research and academics. We welcome critique of our findings, and hope that others find inspiration and ideas to bring into the academic world. We are happy to help, and love to exchange experiences and ideas. Give us a call if you would like to help out!