How we improved our ratings for Astraware games

Or… how we changed a few things to make a world of difference to our ratings and downloads.

TL;DR: We asked our customers to give us good reviews, in a nice way, and they did. We improved our game ratings from mid 3s to high 4s over the course of six months.

Preface: This is a long article in which we share some details of our process with other developers. We were inspired by posts from various other developers who’ve shared their ideas, added our own thoughts and processes, and now we’d like to share back too. We hope this article will be interesting for everyone!


Background – the challenge we’ve always had with reviews

We’ve been developing apps and games for quite a few years. We think we’re pretty good at it – we’ve had some successes, and plenty of failures, and we’ve tried to learn along the way. We must have done a good job at times; after all, we’re still just about here!

Like all developers, we work on a product, make it as good as we can, and then release it. We test it ourselves, improving it along the way, get feedback from beta testers and make more improvements, and then after it’s released we continue to get feedback from customers too, and make more fixes, tweaks, and improvements.

All the time as we do this, the product (usually!) continues to improve. Our customers love our games and many of them play every day for months or even years at a time.

However, the product ratings tend not to show this! They have traditionally stuck at a similar level. A game which launches as a ‘3 to 4 star game’ will tend to stick at that 3-4 stars for years – perhaps indefinitely, despite our improvements! Shouldn’t it get better over time?

This is something that vexed us – we clearly have lots of enthusiastic customers, but the reviews don’t reflect this. We get some great reviews, but also some poor reviews (some valid, some crazy, even some by disgruntled competitors) in roughly equal balance. While we do fix the problems, users tend not to come back and fix up their reviews, and even though our happy users outnumber our unhappy users by a huge ratio, the reviews and ratings don’t always seem to reflect that.

“Its just ok i guess”… 2 stars.



What’s more, reviews and ratings matter. OK, less so with free apps (‘what have I got to lose?’) – but people still look at the rating for games. Everyone knows that anything less than 3 stars means you don’t even bother installing it. 3 stars means ‘maybe try it, but it’s not the best choice’, and 4 stars means ‘it’s a pretty good game’. The decimal place doesn’t matter so much. There’s a world of difference between a 3.9 average and a 4.1 average, not least because the app stores don’t always show fractional graphics. Once you hit the 4, you get 4 stars. 3.99 gets you that lousy 3 stars. Oh, the cruelty!
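That cliff edge can be sketched in a couple of lines – this assumes the store truncates the average to a whole-star graphic, which is our reading of the behaviour rather than a documented store rule:

```python
import math

def displayed_stars(average_rating: float) -> int:
    """Whole-star graphic shown for a fractional average.

    Assumes truncation rather than rounding - our reading of how the
    star graphics behave, not a documented store rule.
    """
    return math.floor(average_rating)

# A 3.99 average shows the same three stars as a 3.0 flat,
# while anything from 4.0 up shows four.
```

So two apps whose averages differ by 0.02 can look a whole star apart on the listing page.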

In 2016 we decided to change it. Here’s what we did!

First up – here’s what we didn’t do. We didn’t write fake reviews, nor did we hire teams to write hundreds of fake reviews of our products. All of our reviews are genuine.

We also didn’t really change our games – we didn’t give away masses of free things, or offer incentivised rewards in return for a glowing review. We just continued our usual pace of fixing problems, adding in support for new devices, and adding features where we could. No difference there.

Our real change was three-fold:

  1. When we’d resolved a problem via our customer support, and especially when someone was delighted, we asked people if they wouldn’t mind leaving us a good review on the store.
  2. We encouraged people to tell us when they had a problem, and made it easier for them to do so.
  3. We asked people to leave a rating or review when they’re happy with the game, without annoying them in the process.


Well, duh!  Isn’t that obvious?  Yes! Absolutely obvious… like most things – in hindsight.

Here’s how we accomplished this!

1 : A better customer support process

We switched from using a big shared email account for our customer support to an online system called Freshdesk. (Other systems are available!) We’ve written before about why we love Freshdesk – it means we track problems, and usually manage to resolve them along with getting back to the customer to let them know – even if it’s been weeks or months since they got in touch.

Freshdesk also let us include a mobile component in our games – adding customer messaging from within the game, as well as in-app FAQs which really help. A customer getting in touch via this system means that we also get some key support information – what device and OS version they have, which game it is, what version of the game they have. These are items which (normally) wouldn’t all be sent by the customer in their first email, and so by getting all of that along with their very first message, we cut out at least one round of back-and-forth asking for details. Happy customers – less time spent on support – and more likely to resolve right away!  Freshdesk’s latest version of their mobile system is called Hotline and we love that too!
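The value of capturing those details up front is easy to sketch – something like the payload below gets attached to the customer’s very first message. (The function and field names here are hypothetical illustrations, not Freshdesk’s or Hotline’s actual API.)

```python
import json
import platform

def build_support_ticket(game_name: str, game_version: str,
                         user_message: str) -> str:
    """Bundle key diagnostic details with the user's first message.

    A sketch of what an in-app helpdesk SDK might send automatically;
    field names are hypothetical, not Freshdesk's real schema.
    """
    ticket = {
        "message": user_message,        # what the customer typed
        "game": game_name,              # which game it is
        "game_version": game_version,   # which build they're running
        "os": platform.system(),        # stand-in for the device OS
        "os_version": platform.release()
    }
    return json.dumps(ticket)
```

Because all of this arrives with message one, the first reply can be an answer rather than a request for details.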

So, with more customers being happy, we just had to take the step of asking them to help! Being oh-so-very painfully British, this was quite a challenge for us, but our customers are lovely and are often delighted to help us out, since we’ve gone out of our way to provide great support. The bar for “great” customer support is easy to reach: you just need to be better than the telecoms provider they’ve had to spend hours on the phone with trying to get their problems resolved. Being a small team (just two of us) we don’t always reply within minutes, but our ‘next day’ kind of target still makes people happy.

Freshdesk gives us a really helpful ‘Request App Review’ system for our threaded customer support, which basically provides the customer with a button and a link to leave a review, connecting them straight to the relevant page on each of the App stores. Anything to take the effort out of finding it is good!



2 : Tell us about big problems

By giving people built-in ways to contact us, we mostly hear about problems directly via messages. If the user can tell us about their problem and we can respond quickly (preferably resolve it too, but at least respond), then they usually don’t want to also go and leave a negative review. Obviously this only works once they’ve been able to successfully install and run the game, but that does cover perhaps 75% of the customer problems.

We have also tried to follow up with reviews on the app stores (Google Play, Amazon) where we’re able to, asking people to get in touch, or telling them when a new version fixes the thing they weren’t happy with. On Google Play it’s fairly easy for users to update their review, which means they may change their 2 star review to a 4 or 5 star – which really helps the average!


A reply back to a happy user who might (crossing fingers) adjust their star rating later



An example of a 2 star review, changed to a 5 after we fixed the problem, and replied via the reviews page


We try to engage positively with the low reviews – even if that reviewer doesn’t change


3 : Asking for good reviews

This is the challenging one!

An easy way to get lots of people to leave a review for your app is to nag them with a big modal dialog box popping up every 15 minutes saying “Please leave a review!” – with a button taking them directly to the store page. This will get you a LOT of reviews, and they won’t be happy ones! Nobody likes to be interrupted or nagged; doing this makes people cross, so they’ll leave a review saying that the app endlessly nags them, and then go on to add any other minor gripes they have. The upshot – piles of 1 and 2 star reviews, lots of negative comments, and a star average that plummets. Even if you then remove the nagging, you’re left with so many low reviews that it’s very hard to pull the average up again, and likely impossible to get it above the target of 4.

After lots of advice, seeing some great examples by other developers, and a lot of white-board diagrams, we came up with the goals of:

  1. Ask in an unobtrusive way that doesn’t interrupt the game flow
  2. Don’t nag
  3. Make it easy for people to get to the review page
  4. Encourage people to contact us instead if they’re not happy about something
  5. Ask at a high point when the player is likely to be happiest
  6. Ask them to leave a good review
  7. Only ask the players who are sticking with the game (and so are most likely to like it)
  8. Ask again (only if appropriate) if it’s a new version with a major new feature set
  9. Aim for quality of reviews rather than quantity


Here’s how we did this!

After completing any of our Daily Puzzle games, the player can enter their name, and then submit to see how they compare against other players. Here’s the usual high score page they get with just one small addition.


Usually they would then just hit Back (hardware button, or the arrow in the top left) to return back out to the title screen.


We added this small box underneath the rating area, with two buttons.


This is unobtrusive ( Goal 1 ) and is completely ignorable.


We don’t pop this box up every time though – we only include the box at a high point (when we reckon the player is likely to be happiest), determined by:

  • They’ve completed at least 10 daily games, and at least 5 within the last week. ( Goal 7 )
  • The score they’ve just got is in their personal top 20% of their last 10 results. ( Goal 5 )

If someone usually gets a bronze medal for their score, then the time they get a silver or gold is the best time to ask if they’re happy.
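Those two conditions can be sketched in code like this – the function and parameter names are ours for illustration, not the shipped implementation:

```python
def should_show_review_prompt(total_games: int, games_last_week: int,
                              last_ten_scores: list, latest_score: int) -> bool:
    """Decide whether to show the unobtrusive review box.

    A sketch of the two trigger conditions described above;
    names are hypothetical, not the actual game code.
    """
    # Goal 7: only ask players who are sticking with the game
    if total_games < 10 or games_last_week < 5:
        return False
    if not last_ten_scores:
        return False
    # Goal 5: ask at a high point - the new score must sit in the
    # player's personal top 20% of their last 10 results
    cutoff_rank = max(1, len(last_ten_scores) // 5)  # top 20% of 10 = top 2
    cutoff = sorted(last_ten_scores, reverse=True)[cutoff_rank - 1]
    return latest_score >= cutoff
```

A player averaging bronze-medal scores only sees the box on a silver or gold day, which is exactly when they’re happiest.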

We want to ask our happiest users to leave reviews, so we ask them if they’re happy first of all! ( Goal 9 )  This is fairly simple filtering, but it makes a big difference!


If they’re not happy (they choose “Not really” to the question of whether they’re enjoying), we switch the box out to show this:


Choosing “OK, sure!” opens up our support system with a message box, so that the user has to do as little as possible in order to send us feedback.   ( Goal 4 ) We save a flag to know that they weren’t happy, and we won’t ask them again, although if we resolve their problem by customer support we’ll ask that way.  We find out very quickly the things that bother our players this way, since it’s a small prompt that doesn’t feel like we’re taking them out of their way.



If, as we hope, they choose “Yes I am!” to the original question, it shows this box and asks if they’d be willing to leave a positive review, explaining that positive reviews really help us. ( Goal 6 )



Note that we are a bit cheeky here by ‘framing’ the box with the 5 stars. We don’t have any direct control over what the user eventually chooses, but by suggesting that 5 stars is the default for being happy with the game, we’re hoping that it increases the number of users who choose that. Without any framing at all, the ‘default’ score for an app would be a 3 star average.

If they say “No thanks” we don’t ask again. Again, part of the ‘not nagging’ ( Goal 2 ).


We provide a link right to the review page (as far as possible) ( Goal 3 ) – and once they return to the app, we put up a ‘thanks!’.


We don’t offer any particular reward for doing this, although if the game was about to show an advert, we have it skip that one, mainly because after having left the app and returned, we want them to get back to playing as soon as possible. We save a flag to know that they’ve left a review (although in reality we don’t know whether they did or not), and consequently don’t ask again. ( Goal 2 )


However… if a user has been happy in the past and has left a review, then after we’ve updated to a new version (with new features), after a large number of further games (30 or more), and again at a happy point (a relatively high score), we will ask again if they like the new version. ( Goal 8 )  We only do this on iOS, which allows a user to post a new review for each version of the app. The Apple App Store shows the overall rating and also the ratings for the current version, so it’s very useful to have some returning reviewers coming back and saying what’s new that they’re happy about.
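Putting the ‘don’t nag’ flags and the iOS re-ask rule together, the decision might look something like this – again a sketch with our own names, not the shipped code:

```python
from dataclasses import dataclass

@dataclass
class ReviewState:
    """Per-player flags saved by the game (hypothetical structure)."""
    declined: bool = False        # chose "No thanks" - never ask again (Goal 2)
    unhappy: bool = False         # said "Not really" - routed to support instead
    reviewed_version: str = ""    # version we believe they reviewed ("" = never)
    games_since_review: int = 0   # plays since that review

def may_ask_again(state: ReviewState, current_version: str,
                  has_major_new_features: bool, is_ios: bool) -> bool:
    # Goal 2: a "no" or an unhappy answer permanently suppresses the prompt
    if state.declined or state.unhappy:
        return False
    # Never reviewed yet: the normal first-ask rules apply instead
    if not state.reviewed_version:
        return True
    # Goal 8: re-ask only on iOS (per-version reviews), only for a major
    # update, and only after a large margin of further games (30 or more)
    return (is_ios
            and has_major_new_features
            and state.reviewed_version != current_version
            and state.games_since_review >= 30)
```

The happy-point check from earlier still gates the prompt on top of this, so even eligible players are only asked on a good day.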


Here’s an overview of the Review Request Flow between the various boxes that we include.



Aside from the jump out to either the app store review page, or our message sending page, all of this takes place on the high score form, simply by changing the contents of that box in-place, avoiding context switching for the user.

The Results

For Astraware Crosswords, we put in this system in May 2016.  The game had built up (on Google Play) an average star rating of 3.8, with a couple of hundred reviews in total.

Although it took a user a week or two (on average) before we might have prompted them to review (getting a recent personal best might take a while), the quantity of reviews that came in was surprisingly high, and the vast majority of these were 5 stars, with some 4s. We still had some reviewers leaving low scores – as was always the case – and the rate of these didn’t change, suggesting that the low reviews weren’t affected – they were from people who couldn’t install it, didn’t like it, etc.

Here’s how the chart looked, with a breakdown of reviews by month (by colour – darkest green are the 5 star reviews, total on the right) and the cumulative average (scale on the left):




It didn’t take long for our review average to pick up. A month of new reviews probably added as many as all of the reviews the game had gathered over the previous 3 years.

The update with our review system was added in early May 2016, and sometime in June our cumulative average crept just above 4.0. The effect of this on our download rate (all from organic discovery on the Google Play store) was:




A jump from 25 to 50 installs per day doesn’t sound like a big deal, but those extra users mean an increase in our daily users that continues to compound and grow. A higher download number then factors into the store ranking, and so the game creeps up the charts, gets more visibility, and continues to increase. Fantastic!


We released further updates in late 2016 which have given some extra spikes of downloads too, but the general trend of increased installs is (we believe) at least as much down to the consistently high rating, as it is to our consistent rate of improving the game!


Customer Support

We did get an increase in customer support messages – tagged so that we knew the sender had seen the review box and said they weren’t happy. For many of these we were able to actually resolve the problem and delight them – at which point we could ask for a review, which they’d be very happy to give.


Other Existing Astraware Games

The review improvement system has been a really useful way of increasing our exposure and download numbers for each of our puzzle games – since they are all built on the same framework, we could implement it into each of them relatively easily.

The effect in each of our other main games has been similar – going from average-level reviews to an improved rating, along with an increased number of reviews and downloads.

We rolled out the system to Codewords and Kriss Kross in April too, and you can see the spike of additional reviews for each.


We didn’t update Wordsearch until September, but the change is still similar!


Neatly, this reversed the trend of the review average, which had been creeping down since the beginning of 2016 and heading below 4.0 – much of that because the game hadn’t been updated for a while and wasn’t working well on newer devices with large resolutions.

Other Platforms

This system has helped our iOS versions too; however, the effect isn’t as marked.

Whether adding a review on the Apple App Store is just one step too much effort, or there’s some other reason, we’re not sure. It could even be that the manner of asking iOS users – perhaps the wording – just isn’t as effective. We haven’t come up with a way of improving this yet, so if anyone knows the magic formula for that we’d love to hear from you!


New Astraware Games

The effect on the games (Astraware Acrostic and Astraware Wordoku) that were new releases later in the year was even more interesting:



In these cases we didn’t start with a background of legacy users, nor a relatively low existing review average.

In the case of these games, the early users are those who are already existing players of Astraware games where we’ve sent a message asking them to try out the new game. We mostly suggested Wordoku to our Sudoku players, and Acrostic to our Crosswords players, as we thought these would be the groups who would be most interested in each game respectively.

Of particular interest is that once the early reviews are established as high and positive, they set the ‘tone’ for future reviews. Not only are the ratings more likely to be a 5 than anything else, but the comments left in the reviews are almost always positive too. This could be down to some kind of Bandwagon Effect – people wanting to stick with the consensus (peer pressure) and perhaps not be the odd one out being negative (aka the Spiral of Silence) – or it could just be that the games are awesome and our customers are happy and loyal. I’d like to imagine more of the latter!


How many users leave reviews?

We took a decision that we wanted to get the highest-rated reviews, at the expense of quantity. That means we aimed to select only the happiest, most regular players, and suggest (in as nice a way as possible) that they leave a review. We used Flurry to track the various segments of our players, from beginning to play, through getting their best scores, to seeing how many saw and responded to the message.


Cost and Summary

Creating this system didn’t take a huge effort, compared to making a new game, as we were mostly building on blocks of various kinds that we already had in place, and just inserting into a flow that already existed in the game.


Rough estimates would be:

Design : 4 Days

Artwork : 2 Days

Implementation into first game : 10 days

Implementation into subsequent games : 1 day each

Total time – approx 3-4 ‘man weeks’


The value of this appears to be, after 3-6 months, a rough doubling of our daily download rate for each game. Some of those new players stick with the games, and either spend money or watch ads to play – so ultimately, perhaps a doubling of our advertising and IAP revenue in total for those games on the Google platform. A payback time, for us, of perhaps 4 months – and it continues to work!

For us this is a huge win! (Offsetting other endeavours which haven’t been quite so successful, or older products that are dropping away.)


What Now?

The changes we worked on later in 2016 have been about alternative ways to monetise the games – giving people more ways to play the games, whether by buying extra packs of puzzles, or by watching video ads in return for more free play puzzles. That’s also the stuff of a later blog article, so stay tuned for that once we’ve finished that phase of work!

Thanks for reading – we hope this is useful! Feel free to send any comments or suggestions, or ask any questions!