About Astraware Game Design

Designing Game Difficulty Curves

This is an article explaining our somewhat mathematical approach to designing difficulty curves (or difficulty progressions) into games!  It was originally an internal document which we used when making games from around 2005 through to 2010, where arcade and action-puzzle games formed the majority of our output.

TL;DR : This is an article intended primarily for other developers who are making games which get more difficult with time or level. The solution at its simplest is to work out how well the best possible player could play, and adjust based on that!  Note – this gets a bit more technical than everyone may be happy to read!

TL;DR/2 : There’s a mild dinosaur reference in this article!

 


Small Studio development choices – Astraware’s Cuboid Approach


There’s always so much to do, so many new things we could work on! What do we choose next?

More importantly – how do we choose what to do next?

TL;DR : This is an article intended primarily for other developers who are in small studios, about making the best choices on where to focus their efforts. The answer is usually to spend time where it will make the greatest difference, rather than just choosing what’s most fun (well, duh!). Read on for more detail about that!

 

About our games in general

We have, like many developers after a while, a number of games all sharing a similar framework.  That means that an improvement to one can be rolled out into the other games relatively easily, and maintainability (OS level fixes, new device support) is much easier too!

With that in mind, our choosing process is about where to put the next effort, to best get to a goal of making more revenue. (Got to support that tea and biscuit habit somehow!)

So, do we

  • Make a new game?
  • Work on features in an existing game?
  • Find and implement a new way to improve the monetisation across our games?
  • Work on ways to get more users to the game?

All of these are good options, so it’s about choosing the best.

 

Applying some metrics

Our collective daily revenue from games, R, is ultimately determined by:

How many games we have : G

How many daily active users there are per game (on average) : ( DAU / G )

Monetisation (aka ‘ARPDAU’ – catchy!) – how much revenue is generated on average per day per active user : ARPDAU = ( R / DAU )

 

Simply, if you multiply these values, you get the daily revenue across your whole portfolio : G x ( DAU / G ) x ( R / DAU ) = R

 

We view these three values as the three dimensions of a cuboid, with the volume being the revenue.

[Diagram: a cuboid whose dimensions are Games, Users (retention), and Monetisation (ARPDAU)]

 

 

When we decide what to work on, we look at how much our effort will increase one of those dimensions.

In turn, holding the other two values fixed, we can estimate the change in the volume.

In the diagram above, would adding a ‘box’ to games, users, or monetisation increase the volume the most?

Usually – the simplest best choice will be to increase the smallest current dimension.
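A quick sketch shows why – with made-up ‘box’ counts, adding a box to the smallest dimension multiplies the two largest together, so it adds the most volume:

```python
# Made-up dimensions in 'box' units: 2 games, 5 user-boxes, 8 monetisation-boxes
def volume(games, users, monetisation):
    return games * users * monetisation

base = volume(2, 5, 8)          # 80
print(volume(3, 5, 8) - base)   # +40 from growing the smallest dimension
print(volume(2, 6, 8) - base)   # +16
print(volume(2, 5, 9) - base)   # +10 from growing the largest dimension
```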

However, not all of those choices will cost the same, which makes it a more complex decision.

For instance, over 2016, our possible choices have included:

  • Spend 4 months to make one new game, increasing our game count from 7 to 8
  • Spend 2 months to build a new social system into the games, increasing the retention and the number of users by 10%
  • Spend 3 months to add in a new revenue system (i.e. our puzzle streams) which could increase our revenue by 20%
  • Spend 2 weeks on an interim small update to each of the games, fixing a problem that affects a small number of users on the latest OS level – a 2% improvement in users (but might become more later)
  • Spend 2 weeks on adding in an additional advert provider to increase our ad revenue by perhaps 5%
  • Spend 1 month to build another method for micro-currency earning ( “Complete surveys for extra stamps!” ) to increase revenue by 20%
  • Spend 4 months to add in daily puzzles to increase active users (via retention) by 50%
    • Spend a further 1 month to add a second daily puzzle to add another 10% to users
  • Spend 1 month on adding in extra purchasable puzzle packs to each game to increase revenue by 5%

These are very much the kind of ‘interesting choices’ that might be presented to a player in a strategic game like Civilization, for instance. It’s one of the places where we’re applying a bit of game theory to make smart choices for our games business!

It’s never quite so simple a choice; sometimes a sequence has to be chosen which isn’t so good in the short term, but is ultimately better in the long term. This is like working through a tech-tree in a game in order to get the best bonuses at the end.

For instance, we had to add puzzle streams (an expensive piece of work) in order to have a micro-currency in place that made sense, so that we could add in surveys as a final option. By doing things in that order, we were able to do a meaningful release at each stage, with the players getting a new feature each time.

It would have taken the same total time to do it in the other order, but we wouldn’t have been able to do the interim releases.  We’ve found that every time we do a release we get a burst of new users and activity, and our existing users are kept happy too, so it does make a lot of sense to do things in an order that gets us more updates.

Also there’s always more going on that needs to be done – other projects (like hitting a retail version on time), coping with a new piece of technology (Samsung’s new device resolution of the week), rebuilding to cope with latest OS version ( Apple’s new iOS version of the month! ), etc.

More Games

This is straightforward! We make another new game, create a whole load of puzzles (and plenty to keep the daily puzzle hoppers filled too), and then we can launch it.

A new ‘puzzle’ game in our existing framework takes about 3-4 months of creation time, usually split between the two of us, across all of the tasks – Design / Art / Development / Puzzle Creation / Testing / Production, etc.

We know that we’ll be able to cross-promote to our existing players, using the advertising system that’s already in the games. We can sacrifice some paid ads to promote our own games instead, which players are typically happy to try, as they already trust our games and the new game is free too!

Revenue Per User

Statistically, the revenue per user can be calculated from the number of users and how much is made, in terms of in-app purchases and ad revenue. It’s a bit more challenging in that advert revenue is comparatively small – only a trickle – but if a user keeps up with the app, playing for free for many months or years, that can add up appreciably.  The industry talks about LTV (Lifetime Value) or LARPU (Lifetime Average Revenue Per User), which are used in these calculations.

Some of the things we can do to increase the revenue are:

  • Adding in extra purchasable features or packs
  • Including adverts at appropriate times
  • Adding in cross-sells to our other games
  • Adding in other ways for people to get items in the game (micro-currency) such as doing sponsored surveys

There’s an obvious limit – too many adverts, adverts that pop up at a bad time interrupting play, or adverts that take up too much of the screen while a game is in progress will increase the number of advert views in the short term, but at the expense of retention – nobody likes a product they can’t use because it’s constantly interrupting them! Sometimes it’s a careful balance between monetisation and retention, but we tend to side more with retention!

We currently get a revenue of around $0.15 per user on average, roughly split equally between in-app purchases and revenue from advertising.

 

Number Of Users

While the number of games and the revenue per user are fairly straightforward in terms of the changes we can make, the choices about the number of users are quite a bit more subtle.

Every day we have new users finding the game, and some users uninstalling.

Each choice we make will change the number of new users trying the game, and the likelihood of them (or existing users) deleting the game. This is all about retention, which I’ve written more about in another article!

We hope to make choices that increase new users and decrease uninstalls at the same time! The key ingredients:

  • Best possible first-play experience
  • Plenty of value over time
  • Daily content
  • Sufficient free content (which may be ad supported)

 

Having Fun

Sometimes we can be guilty of being a bit blind, and launching into some development that’s fun. For us, that means running headfirst at whichever problem seems the most difficult, challenging, and downright unsolvable. We love creating new things, new systems… but it’s not always the best choice.  Yes, we end up with more fun things (and our customers love that too) but there are times when improving the existing suite of games could do better!

Over the last few years, we’ve deliberately tried to take an informed, data-driven approach to what to work on – ultimately we look at the choices, and we work out what we think each choice would mean for our revenue stream (best guesses all round). We might divide by the effort to get a value of ” revenue per month, per day of effort ” which is an awkward unit, but really makes us focus on the reality.  Sometimes that means we’ll have to punt ‘fun’ things down the development order a few times, which is always a pain, but they do get to stay on the list!
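As a sketch of that sum – the effort and percentage estimates come from our 2016 list above, but the $2,000/month baseline is invented purely for illustration (and we’ve assumed a 50% increase in users translates to roughly a 50% increase in revenue):

```python
# Rank choices by "revenue per month, per day of effort"
baseline = 2000.0   # hypothetical current revenue, $ per month

choices = {                     # name: (days of effort, est. monthly gain in $)
    "extra ad provider (+5%)":  (10, baseline * 0.05),
    "puzzle streams (+20%)":    (60, baseline * 0.20),
    "daily puzzles (+50%)":     (80, baseline * 0.50),
}

for name, (days, gain) in sorted(choices.items(), key=lambda c: -c[1][1] / c[1][0]):
    print(f"{name}: ${gain / days:.2f} per month, per day of effort")
```

On these made-up numbers the daily puzzles win despite being the biggest project – which is exactly the kind of non-obvious answer the awkward unit is there to surface.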

 

Crazy Assumptions

OK, we admit it! There are a lot of assumptions to be able to call it a cuboid!

  1. Not every game has identical monetisation.
  2. Each game will get a different profile of retention.
  3. Each new game doesn’t add the same number of users straight away (it takes a while!)
  4. Fans of certain games are more likely to follow ads (or take surveys) than fans of other games.
  5. What worked last year might not work this year or next.
  6. That we can estimate in advance the timescale to create products, features, etc.

That’s so many assumptions and errors that it’s not exactly a bankable value, but any analysis is better than none – and if you know the flaws, you can at least appreciate the risks!

 

Summary

While we do love to make new games and solve new problems, that isn’t always the best choice.  Averaging over a long time, we spend about a third of our time on making new games, a third on acquisition and retention (supporting new devices, new platforms, new features, product maintenance and updates), and a third on making sure the games can make us revenue ( extra puzzle packs, more ways to play for free supported by ads / surveys, more content, etc.)

There have been times where we’ve realised (after analysing some data) that we’ve been missing something important, and that we have an easy way to increase one of the dimensions.  An example of this was when we realised how badly we were organising our advertising! ( Hint – mediation can be brilliant! )

 

Player Retention and Acquisition


We’ve been making games for a long time, and we have quite a lot of happy and, above all, loyal players who keep on playing!  Here’s some information about some of the recent things we’ve been working on that have made a really big difference to us!

TL;DR : This is an article intended for game (and app) developers, about increasing the total number of users over time.  Summary : Retention matters more than Acquisition!  Read on for more info!

 

 

Acquisition

Acquisition is (relatively) easy to improve by advertising, and of course by improving marketing, cross-selling, app landing pages, etc.  Changing the review and rating system built in to our games has really helped with this!

While there are lots of ways of getting more people to try an app, most of them have a cost.

The free things that can improve acquisition include:

  • A good App icon
  • Great keyword choices and good app store text
  • Supporting screenshots (and video, if appropriate)
  • Good chart positioning  – it’s hard to get in the top downloads as an indie developer – but it can often be possible to get into the more niche charts.

If advertising for a game, there’s a definite cost attached.  If the cost to acquire a user exceeds the average revenue you make, then you’ll lose money, so it’s essential to make sure you know how much acquisition costs and how much revenue you make on average from a player over the entire time they have your game installed and running.

 

Retention

Retention is about how long the user keeps the game. Especially for games which generate revenue by having paid items (IAPs) or advertising (interstitials, rewarded ads), it makes a huge difference to the total revenue.

Retention certainly matters more than acquisition – if we keep people for longer (by keeping them happy), and still provide ways for them to give us ongoing revenue, then we improve our total revenue the most.

The ‘Lifetime Average Revenue Per User’ – LARPU – is what ultimately matters, much more than how much a customer buys on day 1.

We use Flurry to collect a certain amount of analytics about the usage of our games – partially for event tracking, partially for tracking any problems that happen, but also as an overview to track things like retention and usage rates.

This article at Flurry has some great info about what’s a normal retention and usage rate for various categories of apps, and from that this chart is especially helpful:

[Chart: Retention and Frequency of use by App Category]

 

We expect many of the new installs – because our game is free – to be uninstalled (or unused) fairly quickly.

Here’s an example of our retention of new users, for our iOS version of Astraware Acrostic, by week:

[Chart: Rolling Retention (Acrostic)]

More precisely, we currently have a D1 of 50%, a D7 of 37%, and a D30 of 22%. ( This doesn’t quite map to the weeks above, Flurry’s a bit weird like that!)  D1 means the retention rate 1 day after the install.

This means that about half of the people who install the game keep it beyond the first day (D1), and half delete it within the first day. We reckon that’s pretty good for a free game!

Of those who keep it beyond that day, 74% are playing a week later (D7), and 44% are still playing after a month (D30). Apparently that’s a good set of values, which was nice to discover!
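In case the conversion isn’t obvious, the 74% and 44% come from dividing the rolling figures by D1:

```python
d1, d7, d30 = 0.50, 0.37, 0.22   # rolling retention, relative to all installs
print(f"D7, given they kept it past day 1:  {d7 / d1:.0%}")   # 74%
print(f"D30, given they kept it past day 1: {d30 / d1:.0%}")  # 44%
```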

We did a lot of work on retention over the last year to achieve this!

Onboarding

We’ve worked on the ‘onboarding‘ process (don’t blame us for the term!), which is about encouraging the players to stick with the game beyond the first few seconds, to get them through their first successful game and for them to see what it’s all about. Dori Adar has a great slide deck about this! Our take on it was to make it as easy as possible for people to know what to do to get into their first game as quickly as possible, and to enjoy their first experience with the game.  We don’t overload with nags, we keep the interface easy (with a few subtle hints with glow effects to draw attention to some appropriate buttons), and we make sure that the player doesn’t get an ad for their first game (in fact, we make sure they get at least 5 ad-free games, before introducing an explanation about ads and then having them at the end of games.)
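That last rule amounts to a very simple gate – a minimal sketch (the names here are illustrative, not our actual code):

```python
AD_FREE_GAMES = 5   # the player's first few games are always ad-free

def should_show_end_of_game_ad(games_completed, ads_explained):
    """No ads until the player has finished their first few games, and
    never before the one-off explanation about ads has been shown."""
    return games_completed >= AD_FREE_GAMES and ads_explained
```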

Improving the onboarding is immediately shown by a change in the D1 value, so is extremely measurable.  We still have some ideas to make things a bit better for that first launch ‘experience’.

Medium Term Retention

We also worked on the retention rate in general, which reflects better in the D7/D30/D90 values.

We give people daily content, with extra puzzles available each new day. The changes we made were to let people catch up at any time on any of the past week’s puzzles, and to add a second puzzle each day, meaning much more free content, especially for anyone who didn’t check every day.

We also increased our number of built-in “free” puzzles from around 12 to around 60, giving the average user a few hours’ worth of completely free puzzle play extra.

Most recently we added extra puzzle streams (ad supported) so that nobody should get to a point where they completely run out of available puzzles.

 

Long Term Retention

Incredibly (and wonderfully) we have players who have been playing our daily puzzles for several years – some of them are happy to watch an ad each game, some prefer to buy the In-App Purchase which makes the daily puzzles ad-free.  Both are great for us!

It’s good that many players do eventually build playing our game into their normal daily/weekly routine, but we do acknowledge that we’re not quite as good at keeping players beyond 6 months as we would like to be.

Again, we still have more to do; in particular we want better linking with social media (Facebook) so that players can more easily compare results with their friends, which we hope will keep many more people enjoying the game (together) for longer!

Some of these ideas are already in some games; others are planned:

  • Compare daily scores and times with friends (as determined by Facebook login) as the default view for scores
  • Optionally show “local” times for nearby players ( perhaps the closest 100 players, geographically )
  • Achievements to earn over a longer time
  • Extra linking with friends – puzzles that can be played together
  • Gifting/Reciprocation between friends  ( I buy a puzzle pack, I get something to give to a friend too, hopefully they’ll return the favour! )
  • In-game bonuses – extra hints ‘donated’ by active friends, which encourages you to stay somewhat active on the game to help them out too.
  • Loyalty bonus rewards. ( Thanks for staying for 1 month! 3 months! 6 months! 12 months! ) etc.
  • “Come back!” reminders (done politely of course!) to alert a player that they’re about to miss out on a puzzle which will soon drop off their board. This should help to make it so that players come back at least once a week to catch up on free games!

For a game to become part of someone’s habit for over a year is quite an honour – it’s become part of their lives. We know we’re doing some things well when we see a user who gets in touch who has played several hundred daily games!

… Profit?

So, if everything works well, and the cost of acquisition is less than the LARPU, at some point we make profit!

Well, that means we make some net revenue per user… we still have to factor in the cost of development!

Right now we are only spending on very inexpensive acquisition – Apple Search Ads, Google Ads, etc. – where we can put in a disgracefully low bid for keywords that make sense for us.  For ‘paid’ acquisitions like this, our average cost per extra user is around $0.10, but the number of users we get at this rate is fairly low.  We can increase the number by increasing our bids… but then the average cost goes up.

Our LARPU is much harder to calculate – it is very game-dependent!

If a user buys any of the IAPs in the game, from a $1 item (for which we get $0.70) upwards, we will do well!  Our ‘average’ IAP revenue per user who tries our games is about $0.07.

The average time a user plays our games is trickier. Many users uninstall straight away; of those who keep the game beyond the first day, the average time they keep playing is about 24 days, but there’s a very wide range! (Those we convert to multi-month players will stay a very long time!)  A player who plays a reasonable number of the daily games over 24 days will see (if we’re lucky) about 10-15 adverts, worth an average of about $0.08 to us.  (Our current eCPM for adverts is about $5 on average.)

So our LARPU is the IAP revenue plus the ad revenue – about $0.15 – and it splits almost evenly between ad revenue (free players) and IAP revenue (players who purchase).

Wow, that’s not a huge value… but it means a profit of about $0.05 per user if we’ve paid to acquire (by advertising.)
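For anyone who wants the back-of-envelope sums in one place, here they are (working in cents to keep the arithmetic exact; all figures are the rough averages quoted above):

```python
iap_cents = 7                       # average IAP revenue per install
ads_seen, ecpm_cents = 15, 500      # ~10-15 ads seen, at ~$5 eCPM
ad_cents = ads_seen * ecpm_cents / 1000   # eCPM is per 1,000 views -> 7.5c, ~$0.08
larpu_cents = iap_cents + ad_cents        # ~14.5c, i.e. about $0.15
margin = larpu_cents - 10                 # minus ~$0.10 acquisition cost, ~$0.05
print(f"LARPU ~ {larpu_cents:.1f} cents, margin ~ {margin:.1f} cents per paid install")
```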

 

Purchased Installs

There are companies out there who are happy to ‘sell’ installs of your game – they incentivise players to try it out. As a developer you pay per user who makes it into your application (having gone through the install process and launched the game, perhaps to a minimum level of activity).   I’ve been quoted values from $0.50 up to $2.00 or more per install!  Obviously these would not be economical in our case.

 

All users are equal, but some are more equal than others

It may be no surprise, but users who have deliberately searched for your application (in general, or specifically), are those who want to use it!

Then there are those who saw an advert for it as they were doing something else, and thought they might give it a go.

Finally, there are those who are ‘incentivised’ to try the application, not because of the app itself, but because of a reward they’ll get elsewhere.  These users are the least likely to stick with a game (but they still might!)

 

We cross-promote our games where it makes sense – the player of one of our word games is likely to be interested in others too. If someone has had a good experience in one of our games, they’ll already know the user interface for the next, and they’ll hopefully be warm to the next game already, and more likely to stick with it. We can cross-promote almost for free, which is brilliant! ( Building cross-promotion systems into your apps is a good idea!)

We get lots of users who are searching for games of our types in general, and some will try our game if they see it.

We pay for some adverts to come up within searches. Hopefully we’re still advertising to people who are interested in the game, so it should still be fairly good value.

We haven’t (yet) tried advertising in general – i.e. in other people’s games, but for that we need to be able to advertise for lower cost than we’re likely to make back.  If our average user gets us $0.15, will someone we’ve found by advertising get us the same?  That’s something the revenue/attribution/tracking companies are real experts at working out – and why all the big companies really focus on this too!

 

Summary

We currently think much more about retention (keeping existing users) than acquisition (getting new users); we’re not the only ones to realise this!   We’ve learned a lot from other developers who’ve shared their experiences online and at various conferences, and the choices we make are about the overall retention and ‘LARPU’ rather than making a quick profit!

 

How we improved our ratings for Astraware games

Or… How we changed a few things to make a world of difference to our ratings and downloads.

TL;DR: We asked our customers to give us good reviews, in a nice way, and they did. We improved our game ratings from mid 3s to high 4s, over the course of six months.

Preface : This is a long article where we share some information about our process with other developers. We were inspired by posts from various other developers who’ve shared their ideas; we added our own thoughts and processes, and we’d like to share back too.   We hope this article will be interesting for everyone!

 


Background – the challenge we’ve always had with reviews

We’ve been developing apps and games for quite some years. We think we’re pretty good at it – we’ve had some successes, and plenty of failures, and we’ve tried to learn along the way. We must have done a good job at times, after all, we’re still just about here!

Like all developers, we work on a product, make it as good as we can, and then release it.  We test it ourselves, improving it along the way,  get feedback from beta testers and make more improvements, and then after it’s released, we continue to get feedback from customers too, and we make more fixes, tweaks, improvements.

All the time as we do this, the product (usually! ) continues to improve. Our customers love our games and many of them play every day for months or even years at a time.

However, the product ratings tend not to show this! They have traditionally stuck at a similar level.  A game which launches as a ‘3 to 4 star game’ will tend to stick at that 3-4 stars for years – perhaps indefinitely, despite our improvements!  Shouldn’t it get better over time?

This is something that vexed us – we clearly have lots of enthusiastic customers, but the reviews don’t reflect this. We get some great reviews, but also some poor reviews (some valid, some crazy, even some by disgruntled competitors), in equal balance.  While we do fix the problems, users tend not to come back and fix up their reviews, and even though our happy users outnumber our unhappy users by a huge ratio, the reviews and ratings don’t always seem to reflect that.

“Its just ok i guess”… 2 stars.


 

What’s more, reviews and ratings matter. OK, less so with free apps (‘what have I got to lose?’) – but people still look at the rating for games. Everyone knows that anything less than 3 stars means you don’t bother even installing it. 3 stars means ‘maybe try, but it’s not the best choice’, and 4 stars means ‘it’s a pretty good game’.   The exact decimal matters less than which side of the line you’re on: there’s a world of difference between a 3.9 average and a 4.1 average, not least because the app stores don’t always show fractional graphics. Once you hit the 4, you get 4 stars. 3.99 gets you that lousy 3 stars. Oh, the cruelty!

In 2016 we decided to change it. Here’s what we did!

First up – here’s what we didn’t do.  We didn’t write fake reviews, nor did we hire some teams to write hundreds of fake reviews on our products. All of our reviews are genuine.

We also didn’t really change our games – we didn’t give away masses of free things, or offer incentivised rewards in return for a glowing review. We just continued our usual pace of fixing problems, adding in support for new devices, and adding features where we could. No difference there.

Our real change was three-fold:

  1. When we’d resolved a problem via our customer support, and especially when someone was delighted, we asked people if they wouldn’t mind leaving us a good review on the store.
  2. We encouraged people to tell us when they had a problem, and made it easier for them to do so.
  3. We asked people to leave a rating or review when they’re happy with the game, without annoying them in the process.

 

Well, duh!  Isn’t that obvious?  Yes! Absolutely obvious… like most things – in hindsight.

Here’s how we accomplished this!

1 : A better customer support process

We switched from using a big email account for our customer support to an online system called Freshdesk.  ( Other systems are available! )  We’ve written before about why we love Freshdesk – it means we track problems, and usually manage to resolve them along with getting back to the customer to let them know – even if it’s been weeks or months since they got in touch.

Freshdesk also let us include a mobile component in our games – adding customer messaging from within the game, as well as in-app FAQs which really help. A customer getting in touch via this system means that we also get some key support information – what device and OS version they have, which game it is, what version of the game they have. These are items which (normally) wouldn’t all be sent by the customer in their first email, and so by getting all of that along with their very first message, we cut out at least one round of back-and-forth asking for details. Happy customers – less time spent on support – and more likely to resolve right away!  Freshdesk’s latest version of their mobile system is called Hotline and we love that too!

So, with more customers being happy, we just had to take the step of asking them to help!  Being oh-so-very painfully British, this was quite a challenge for us, but our customers are lovely and are often delighted to help us out, since we’ve gone out of our way to provide great support.  The bar for “great” customer support is so easy to reach – you just need to be better than the telecoms provider they’ve had to spend hours on the phone with trying to get their problems resolved.  Being a small team (just two of us) we don’t always reply within minutes, but our ‘next day’ kind of target still makes people happy.

Freshdesk gives us a really helpful ‘Request App Review’ system for our threaded customer support, which basically provides the customer with a button and a link to leave a review, connecting them straight to the relevant page on each of the App stores. Anything to take the effort out of finding it is good!

[Screenshot: the ‘Request App Review’ button in a support reply]

 

2 : Tell us about big problems

By giving people built-in ways to contact us, most problems now reach us as support messages. If the user can tell us about their problem and we can respond quickly (preferably resolving it too, but at least responding), then they usually don’t want to also go and leave a negative review.  Obviously this only works once they’ve been able to successfully install and run the game, but that does cover perhaps 75% of the customer problems.

We’ve also tried to follow up on reviews on the app stores (Google Play, Amazon) where we’re able to, asking people to get in touch, or telling them when a new version fixes the thing they weren’t happy with. On Google Play it’s fairly easy for users to update their review, which often means they may change their 2 star review to a 4 or 5 star – which really helps the average!

 

A reply back to a happy user who might (crossing fingers) adjust their star rating later


 

An example of a 2 star review, changed to a 5 after we fixed the problem, and replied via the reviews page


We try to engage positively with the low reviews – even if that reviewer doesn’t change their rating

 

3 : Asking for good reviews

This is the challenging one!

An easy way to get lots of people to leave a review for your app is to nag them with a big modal dialog box popping up every 15 minutes saying “Please leave a review!” – with a button taking you directly to the store page.  This will get you a LOT of reviews, and they won’t be happy ones!  Nobody likes to be interrupted or nagged, and doing this makes people cross and so they’ll leave a review saying that the app endlessly nags them, and then go on to add any other minor gripes they have.  The upshot – piles of 1 and 2 star reviews, lots of negative comments, and a star average that plummets.  Even if you remove that, you’re left with so many low reviews that it’s very hard to pull the average up again, and likely impossible to get it above the target of a 4.

After lots of advice, seeing some great examples by other developers, and a lot of white-board diagrams, we came up with the goals of:

  1. Ask in an unobtrusive way that doesn’t interrupt the game flow
  2. Don’t nag
  3. Make it easy for people to get to the review page
  4. Encourage people to contact us instead if they’re not happy about something
  5. Ask at a high point when the player is likely to be happiest.
  6. Ask them to leave a good review
  7. Only ask the players who are sticking with the game (and so are most likely to like it)
  8. Ask again (only if appropriate) if it’s a new version with a major new feature set.
  9. Aim for quality of reviews rather than quantity.

 

Here’s how we did this!

After completing any of our Daily Puzzle games, the player can enter their name, and then submit to see how they compare against other players. Here’s the usual high score page they get with just one small addition.

[Screenshot: the high score page with the added panel]

Usually they would then just hit Back (hardware button, or the arrow in the top left) to return to the title screen.

 

We added this small box underneath the rating area, with two buttons.

[Screenshot: the ‘Are you enjoying…?’ box]

This is unobtrusive ( Goal 1 ) and is completely ignorable.

 

We don’t pop this box up every time though – we only include the box at a high point (when we reckon the player is likely to be happiest), determined by:

  • After they’ve completed at least 10 daily games, and at least 5 within the last week. ( Goal 7 )
  • After the score they’ve just got is in the top 20% of their last 10 scores. ( Goal 5 )

If someone usually gets a bronze medal for their score, then the time they get a silver or gold is the best time to ask if they’re happy.

We want to ask our happiest users to leave reviews, so we ask them if they’re happy first of all! ( Goal 9 )  This is fairly simple filtering, but it makes a big difference!
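In code terms the filter is something like this – a minimal sketch with invented names, not our production code:

```python
def should_show_enjoying_prompt(daily_games, games_this_week, scores, asked_before):
    """Only ask regular players (Goal 7), at a personal high point (Goal 5),
    and never more than once (Goal 2)."""
    if asked_before or daily_games < 10 or games_this_week < 5:
        return False
    last_10 = scores[-10:]                        # most recent 10 scores
    top_fifth = sorted(last_10, reverse=True)[:max(1, len(last_10) // 5)]
    return last_10[-1] >= min(top_fifth)          # latest score among their best
```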

 

If they’re not happy (they choose “Not really” to the question of whether they’re enjoying), we switch the box out to show this:

[Screenshot: the ‘send us feedback’ box]

Choosing “OK, sure!” opens up our support system with a message box, so that the user has to do as little as possible in order to send us feedback.   ( Goal 4 ) We save a flag to know that they weren’t happy, and we won’t ask them again, although if we resolve their problem by customer support we’ll ask that way.  We find out very quickly the things that bother our players this way, since it’s a small prompt that doesn’t feel like we’re taking them out of their way.

 

 

If, as we hope, they choose “Yes I am!” to the original question, it shows this box and asks if they’d be willing to leave a positive review, explaining that positive reviews really help us. ( Goal 6 )

[Screenshot: the review request box, framed with 5 stars]

 

Note that we are a bit cheeky here in ‘framing’ the box with the 5 stars. We don’t have any direct control over what the user eventually chooses, but by suggesting that 5 stars is the default for being happy with the game, we’re hoping it increases the number of users who choose that.  Without any framing at all, the ‘default’ score for an app would be a 3 star average.

If they say “No thanks” we don’t ask again. Again, part of the ‘not nagging’ ( Goal 2 ).

 

We provide a link right to the review page (as far as possible) ( Goal 3 ) – and once they return to the app, we put up a ‘thanks!’.

[Screenshot: the ‘thanks!’ box]

We don’t offer any particular reward for doing this, although if the game was about to show an advert, we have it skip that one, mainly because after having left the app and returned, we want them to get back to playing as soon as possible.   We save a flag to know that they’ve left a review (although the reality is we don’t know whether they did or not), and consequently don’t ask again ( Goal 2 )

 

However… if a user has been happy in the past and has left a review, then after we’ve updated to a new version (with new features), after a decent gap of games (30 or more), and again at a happy point (a relatively high score), we will ask again if they like the new version. ( Goal 8 )  We only do this on iOS, which allows a user to put a new review on each version of the app. The Apple App Store shows the overall rating, and also the ratings for the current version, so it’s very useful to have some returning reviewers coming back and saying what’s new that they’re happy about.
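Tying the branches together, the flags we save behave roughly like this (again a sketch with invented names rather than the real code):

```python
def handle_answer(state, enjoying, agreed_to_review):
    """Record the player's answers so we never nag them again (Goal 2)."""
    if not enjoying:
        state["unhappy"] = True                 # support may still ask later
        return "open_feedback_form"             # Goal 4
    if not agreed_to_review:
        state["declined"] = True                # never ask this player again
        return "dismiss_box"
    state["reviewed_version"] = state["version"]   # assume they left a review
    return "open_store_review_page"             # Goal 3; skip the next ad on return

def may_ask_again(state, games_since_last_ask):
    """iOS only: a happy reviewer can be asked once per new version (Goal 8)."""
    return (state.get("reviewed_version") not in (None, state["version"])
            and games_since_last_ask >= 30)
```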

 

Here’s an overview of the Review Request Flow between the various boxes that we include.

[Diagram: the Review Request Flow]

 

Aside from the jump out to either the app store review page, or our message sending page, all of this takes place on the high score form, simply by changing the contents of that box in-place, avoiding context switching for the user.

The Results

For Astraware Crosswords, we put in this system in May 2016.  The game had built up (on Google Play) an average star rating of 3.8, with a couple of hundred reviews in total.

Although it took (on average) a week or two before we might have prompted a user to review (getting a recent personal best might take a while), the quantity of reviews that came in was surprisingly high, and the vast majority of these were 5 stars, with some 4s.  We still had some reviewers leaving low scores – as was always the case – and the rate of these didn’t change, suggesting that the prompt wasn’t affecting the low reviews – they are from people who couldn’t install it / didn’t like it, etc.

Here’s how the chart looked, with a breakdown of reviews by month (by colour – darkest green are the 5 star reviews, total on the right) and the cumulative average (scale on the left):

 

[Chart: Crosswords monthly reviews and cumulative average rating]

 

It didn’t take a long time for our review average to pick up. A month of new reviews probably added as many as all of the reviews of the game from the previous 3 years.

The update with our review system was added in early May 2016, and sometime in June, our cumulative average crept just above 4.0.  The effect of this on our download rate (all from organic discovery on the Google Play store) was :

 

[Chart: Astraware Crosswords daily installs, annotated]

 

A jump from 25 to 50 installs per day doesn’t sound like a big deal, but those extra users mean an increase in our daily users that continues to compound and grow. A higher download number then factors in to the App Store ranking, and so the game creeps up the charts, gets more visibility, and continues to increase. Fantastic!

 

We released further updates in late 2016 which have given some extra spikes of downloads too, but the general trend of increased installs is (we believe) at least as much down to the consistently high rating, as it is to our consistent rate of improving the game!

 

Customer Support

We did get an increase in customer support messages – tagged so that we knew the sender had seen the review box and had said they weren’t happy. For many of these we were able to actually resolve the problem and delight them – at which point we could ask for a review, which they’d be very happy to give.

 

Other Existing Astraware Games

The review improvement system has been a really useful way of increasing our exposure and download numbers for each of our puzzle games – since they are all built on the same framework, we could implement it into each of them relatively easily.

The effect on each of our other main games has been similar – going from average-level reviews to improved ones, with an increased number of reviews and downloads.

We rolled out the system to Codewords and Kriss Kross in April too, and you can see the spike of additional reviews for each.

[Charts: Codewords and Kriss Kross ratings]

We didn’t update Wordsearch until September, but the change is still similar!

[Chart: Wordsearch ratings]

Neatly, this reversed the trend of the review average, which had been creeping down since the beginning of 2016, heading towards that sub-4.00 level – much of that because the game hadn’t been updated for a while and wasn’t working well on new devices with large resolutions.

Other Platforms

This system has helped our iOS versions too; however, the effect isn’t as marked.

Whether adding a review on the Apple App Store is just a step too much effort, or it’s something else, we’re not sure. It could even be that the manner of asking iOS users – perhaps the wording – just isn’t as effective.  We haven’t come up with a way of improving this yet, so if anyone knows the magic formula for that we’d love to hear from you!

 

New Astraware Games

The effect on the games (Astraware Acrostic and Astraware Wordoku) that were new releases later in the year was even more interesting:

[Charts: Wordoku and Acrostic ratings]

 

In these cases we didn’t have a background of legacy users, nor a relatively low existing review average to drag things down.

In the case of these games, the early users are those who are already existing players of Astraware games where we’ve sent a message asking them to try out the new game. We mostly suggested Wordoku to our Sudoku players, and Acrostic to our Crosswords players, as we thought these would be the groups who would be most interested in each game respectively.

Of particular interest is that once the early reviews are established as high and positive, they set the ‘tone’ for future reviews. Not only are the ratings more likely to be a 5 than anything else, but the comments left in the reviews are almost always positive too. This could be down to some kind of Bandwagon Effect – people wanting to stick with the consensus, peer pressure, and perhaps not wanting to be the odd one out being negative (aka the Spiral of Silence) – or it could just be that the games are awesome and our customers are happy and loyal. I’d like to imagine more of the latter!

 

How many users leave reviews?

We took the decision that we wanted the highest-rated reviews, at the expense of quantity. That means we aimed to select only the happiest, most regular players, and suggest (in as nice a way as possible) that they leave a review. We used Flurry to track the various segments of our players – from beginning to play, through getting their best scores, to seeing how many saw and responded to the message.

 

Cost and Summary

Creating this system didn’t take a huge effort, compared to making a new game, as we were mostly building on blocks of various kinds that we already had in place, and just inserting into a flow that already existed in the game.

 

Rough estimates would be :

Design : 4 Days

Artwork : 2 Days

Implementation into first game : 10 days

Implementation into subsequent games : 1 day each

Total time – approx 3-4 ‘man weeks’

 

The value of this would appear to be, after 3-6 months, a rough doubling of our daily download rate for each game. Some of those new players stick with the games, and either spend money or watch ads to play, so ultimately, perhaps a doubling of our advertising and IAP revenue in total for those games on the Google platform.   A payback time, for us, of perhaps 4 months, and it continues to work!

For us this is a huge win! (Offsetting other endeavours which haven’t been quite so successful, or older products that are dropping away.)

 

What Now?

The changes we worked on later in 2016 have been about alternative ways to monetise the games – giving people more ways to play, whether by buying extra packs of puzzles, or by watching video ads in return for more free play puzzles. That’s the stuff of a later blog article, so stay tuned once we’ve finished that phase of work!

Thanks for reading – we hope this is useful! Feel free to send any comments or suggestions, or ask any questions!
