Player Retention and Acquisition


We’ve been making games for a long time, and we have quite a lot of happy and, above all, loyal players who keep on playing! Here’s some information about the recent things we’ve been working on that have made a really big difference to us.

TL;DR: This is an article intended for game (and app) developers, about increasing the total number of users over time. Summary: retention matters more than acquisition! Read on for more info!

 

 

Acquisition

Acquisition is (relatively) easy through advertising, and of course by improving marketing, cross-selling, app landing pages, and so on. Changing the review and rating system built into our games has really helped with this too!

While there are lots of ways of getting more people to try an app, most of them have a cost.

The free things that can improve acquisition include:

  • A good App icon
  • Great keyword choices and good app store text
  • Supporting screenshots (and video, if appropriate)
  • Good chart positioning – it’s hard to get into the top downloads as an indie developer, but it’s often possible to get into the more niche charts.

If you’re advertising a game, there’s a definite cost attached. If the cost to acquire a user exceeds the average revenue that user generates, you’ll lose money, so it’s essential to know both how much acquisition costs and how much revenue you make, on average, from a player over the entire time they have your game installed and running.
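To make that concrete, here’s a tiny sketch of the check we mean (hypothetical Swift, not our real accounting code; the figures are the rough ones we quote later in this article – about $0.10 per paid install and about $0.15 lifetime revenue per user):

```swift
import Foundation

// Rough sketch: is paid acquisition worthwhile?
// The type and figures are illustrative only.
struct AcquisitionEstimate {
    let costPerInstall: Double          // what we pay, per user acquired
    let lifetimeRevenuePerUser: Double  // average revenue over the whole install lifetime ("LARPU")

    var marginPerUser: Double { lifetimeRevenuePerUser - costPerInstall }
    var isWorthwhile: Bool { marginPerUser > 0 }
}

let searchAds = AcquisitionEstimate(costPerInstall: 0.10, lifetimeRevenuePerUser: 0.15)
print(String(format: "$%.2f per user", searchAds.marginPerUser))  // a small positive margin
print(searchAds.isWorthwhile)                                     // true
```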

 

Retention

Retention is about how long the user keeps the game. Especially for games which generate revenue through in-app purchases (IAPs) or advertising (interstitials, rewarded ads), it makes a huge difference to the total revenue.

Retention certainly matters more than acquisition – if we keep people for longer (by keeping them happy), and still provide ways for them to give us ongoing revenue, then we improve our total revenue the most.

The ‘Lifetime Average Revenue Per User’ – LARPU – is ultimately what really matters, much more than how much a customer buys on day 1.

We use Flurry to collect a certain amount of analytics about the usage of our games, partly for event tracking, partly for tracking any problems that happen, but also as an overview of things like retention and usage rates.

This article at Flurry has some great info about what’s a normal retention and usage rate for various categories of apps, and from that this chart is especially helpful:

Retention and Frequency of use by App Category

 

We expect many of the new installs – because our game is free – to be uninstalled (or unused) fairly quickly.

Here’s an example of our retention of new users, by week, for the iOS version of Astraware Acrostic:

Rolling Retention (Acrostic)

More precisely, we currently have a D1 of 50%, a D7 of 37%, and a D30 of 22%. (D1 means the retention rate 1 day after the install; this doesn’t quite map to the weeks above, Flurry’s a bit weird like that!)

This means that about half of the people who install the game keep it beyond the first day (D1), and half delete it within the first day. We reckon that’s pretty good for a free game!

Of those who keep it beyond that day, 74% are playing a week later (D7), and 44% are still playing after a month (D30). Apparently that’s a good set of values, which was nice to discover!
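To spell out the arithmetic: the D7 and D30 figures above are measured against all installs, so dividing by the D1 rate gives the figures for players who made it past the first day – 0.37 ÷ 0.50 = 74%, and 0.22 ÷ 0.50 = 44%.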

We did a lot of work on retention over the last year to achieve this!

Onboarding

We’ve worked on the ‘onboarding’ process (don’t blame us for the term!), which is about encouraging players to stick with the game beyond the first few seconds, getting them through their first successful game so they can see what it’s all about. Dori Adar has a great slide deck about this! Our take on it was to make it as easy as possible for people to get into their first game quickly, and to enjoy that first experience. We don’t overload them with nags, we keep the interface easy (with a few subtle glow effects to draw attention to the appropriate buttons), and we make sure the player doesn’t get an ad for their first game – in fact, we make sure they get at least 5 ad-free games before introducing an explanation about ads and then showing them at the end of games.
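As a rough idea of how simple that gating can be, here’s a minimal sketch (hypothetical Swift – the names, the threshold constant, and the exact flow are illustrative, not our production code):

```swift
// Sketch: keep the first few completed games ad-free, then show an explanation
// about ads once, and only after that start showing interstitials.
struct OnboardingAdGate {
    let adFreeGames = 5        // the first N completed games are always ad-free
    var gamesCompleted = 0
    var adsExplained = false   // have we shown the "why ads?" explanation yet?

    mutating func gameCompleted() {
        gamesCompleted += 1
    }

    // Decide what to do at the point where an interstitial could appear.
    mutating func showAdAfterGame() -> Bool {
        guard gamesCompleted > adFreeGames else { return false }   // still in the ad-free run
        guard adsExplained else {
            adsExplained = true                                    // show the explanation instead, once
            return false
        }
        return true
    }
}
```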

Improving the onboarding shows up immediately as a change in the D1 value, so it’s extremely measurable. We still have some ideas to make that first launch ‘experience’ a bit better.

Medium Term Retention

We also worked on the retention rate in general, which shows up more in the D7/D30/D90 values.

We give people daily content, with more available each new day. The changes we made were to let people catch up on any of the past week’s puzzles at any time, and to add a second puzzle each day, meaning much more free content is available – especially for anyone who doesn’t check in every day.

We also increased our number of built-in “free” puzzles from around 12 to around 60, giving the average user a few hours’ worth of extra, completely free puzzle play.

Most recently we added extra puzzle streams (ad-supported) so that nobody should reach a point where they completely run out of available puzzles.

 

Long Term retention

Incredibly (and wonderfully) we have players who have been playing our daily puzzles for several years – some of them are happy to watch an ad each game, some prefer to buy the In-App Purchase which makes the daily puzzles ad-free.  Both are great for us!

It’s good that many players do eventually build playing our game into their normal daily/weekly routine, but we do acknowledge that we’re not quite as good at keeping players beyond 6 months as we would like to be.

Again, we still have more to do. In particular, we want better linking with social media (Facebook) so that players can more easily compare results with their friends, which we hope will keep many more people enjoying the game (together) for longer!

Some of these ideas are already in some of our games, and others are planned:

  • Compare daily scores and times with friends (as determined by Facebook login) as the default view for scores
  • Optionally show “local” times for nearby players (perhaps the closest 100 players, geographically).
  • Achievements to earn over a longer time
  • Extra linking with friends – puzzles that can be played together
  • Gifting/Reciprocation between friends  ( I buy a puzzle pack, I get something to give to a friend too, hopefully they’ll return the favour! )
  • In-game bonuses – extra hints ‘donated’ by active friends, which encourages you to stay somewhat active on the game to help them out too.
  • Loyalty bonus rewards. ( Thanks for staying for 1 month! 3 months! 6 months! 12 months! ) etc.
  • “Come back!” reminders (done politely of course!) to alert a player that they’re about to miss out on a puzzle which will soon drop off their board. This should help to make it so that players come back at least once a week to catch up on free games!

For a game to become part of someone’s habit for over a year is quite an honour – it’s become part of their life. We know we’re doing some things well when a user who gets in touch turns out to have played several hundred daily games!

… Profit?

So, if everything works well, and the cost of acquisition is less than the LARPU, at some point we make a profit!

Well, that means we make some net revenue per user… we still have to factor in the cost of development!

Right now we are only spending on very inexpensive acquisition (Apple Search Ads, Google Ads, etc.), where we can put in a disgracefully low bid for keywords that make sense for us. For ‘paid’ acquisitions like this, our average cost per extra user is around $0.10, but the number of users we get at this rate is fairly low. We can increase the number by increasing our bids… but then the average cost goes up.

Our LARPU is much harder to calculate; it is very game-dependent!

If a user buys any of the IAPs in the game, from a $1 item (for which we get $0.70) upwards, we will do well!  Our ‘average’ IAP revenue per user who tries our games is about $0.07.

The average time a user plays our games is trickier. Many users uninstall straight away; of those who keep the game beyond the first day, the average time they keep it installed and running is about 24 days, but there’s a very wide range! (Those we convert to multi-month players will stay a very long time.) A player who plays a reasonable number of the daily games for 24 days will see (if we’re lucky) about 10-15 adverts, which is worth an average of about $0.08 to us. (Our current eCPM for adverts is about $5 on average.)

So our LARPU is the IAP revenue plus the Ad revenue – or about $0.15, and splits almost evenly between ad revenue (free players) and IAP revenue (players who purchase).

Wow, that’s not a huge value… but it means a profit of about $0.05 per user if we’ve paid to acquire them (by advertising).
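For anyone who wants to follow the sums, here’s the back-of-the-envelope version (hypothetical Swift; the inputs are the rough figures quoted above, and the impression count is just our pick from the 10-15 range):

```swift
import Foundation

// Back-of-the-envelope LARPU and margin, using the rough figures from the text.
let iapRevenuePerUser = 0.07         // average IAP revenue across everyone who installs
let adImpressionsPerUser = 15.0      // roughly 10-15 interstitials over ~24 days of play
let eCPM = 5.0                       // dollars per 1,000 ad impressions
let adRevenuePerUser = adImpressionsPerUser * (eCPM / 1000.0)   // about $0.08

let larpu = iapRevenuePerUser + adRevenuePerUser   // about $0.15
let costPerInstall = 0.10                          // cheap search-ad acquisition
let profitPerPaidInstall = larpu - costPerInstall  // about $0.05

print(String(format: "LARPU ≈ $%.3f, profit per paid install ≈ $%.3f",
             larpu, profitPerPaidInstall))
```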

 

Purchased Installs

There are companies out there who are happy to ‘sell’ installs of your game – they incentivise players to try it out. As a developer you pay per user who makes it into your application (having gone through the install process and launched the game, perhaps to a minimum level of activity). I’ve been quoted values from $0.50 up to $2.00 or more per install! Obviously these would not be economical in our case.

 

All users are equal, but some are more equal than others

It may be no surprise, but users who have deliberately searched for your application (in general, or specifically) are the ones who want to use it!

Then there are those who saw an advert for it as they were doing something else, and thought they might give it a go.

Finally, there are those who are ‘incentivised’ to try the application, not because of the app itself, but because of a reward they’ll get elsewhere.  These users are the least likely to stick with a game (but they still might!)

 

We cross-promote our games where it makes sense – the player of one of our word games is likely to be interested in the others too. If someone has had a good experience with one of our games, they’ll already know the user interface, will hopefully be warm to the next game, and will be more likely to stick with it. We can cross-promote almost for free, which is brilliant! (Building cross-promotion systems into your apps is a good idea!)

We get lots of users who are searching for games of our types in general, and some will try our game if they see it.

We pay for some adverts to come up within searches. Hopefully we’re still advertising to people who are interested in the game, so it should still be fairly good value.

We haven’t (yet) tried advertising in general – i.e. in other people’s games – but for that we need to be able to advertise for a lower cost than the revenue we’re likely to make back. If our average user earns us $0.15, will someone we’ve found by advertising earn us the same? That’s something the revenue/attribution/tracking companies are real experts at working out – and why all the big companies really focus on this too!

 

Summary

We currently think much more about retention (keeping existing users) than acquisition (getting new users); we’re not the only ones to realise this! We’ve learned a lot from other developers who’ve shared their experiences online and at various conferences, and the choices we make are about overall retention and ‘LARPU’ rather than making a quick profit!

 

How we improved our ratings for Astraware games

Or… How we changed a few things to make a world of difference to our ratings and downloads.

TL;DR: We asked our customers to give us good reviews, in a nice way, and they did. We improved our game ratings from mid 3s to high 4s over the course of six months.

Preface: This is a long article where we share some information about our process with other developers. We were inspired by posts from various other developers who’ve shared their ideas, added our own thoughts and processes, and we’d like to share back too. We hope this article will be interesting for everyone!

 


Background – the challenge we’ve always had with reviews

We’ve been developing apps and games for quite some years. We think we’re pretty good at it – we’ve had some successes, and plenty of failures, and we’ve tried to learn along the way. We must have done a good job at times; after all, we’re still just about here!

Like all developers, we work on a product, make it as good as we can, and then release it. We test it ourselves, improving it along the way, get feedback from beta testers and make more improvements, and then after it’s released we continue to get feedback from customers too, and make more fixes, tweaks, and improvements.

All the while, the product (usually!) continues to improve. Our customers love our games and many of them play every day for months or even years at a time.

However, the product ratings tend not to show this! They have traditionally stuck at roughly the same level. A game which launches as a ‘3 to 4 star game’ will tend to stick at that 3-4 stars for years – perhaps indefinitely, despite our improvements! Shouldn’t it get better over time?

This is something that vexed us – we clearly have lots of enthusiastic customers, but the reviews don’t reflect this. We get some great reviews, but also some poor reviews (some valid, some crazy, even some by disgruntled competitors) in equal balance. While we do fix the problems, users tend not to come back and update their reviews, and even though our happy users outnumber our unhappy users by a huge ratio, the reviews and ratings don’t always seem to show it.

“Its just ok i guess”… 2 stars.


 

What’s more, reviews and ratings matter. OK, less so with free apps (‘what have I got to lose?’) – but people still look at the rating for games. Everyone knows that anything less than 3 stars means you don’t bother even installing it, 3 stars means ‘maybe try, but it’s not the best choice’, and 4 stars means ‘it’s a pretty good game’. The exact decimal doesn’t matter so much – except around the whole-number boundary. There’s a world of difference between a 3.9 average and a 4.1 average, not least because the app stores don’t always show fractional star graphics. Once you hit 4.0, you get 4 stars; 3.99 gets you that lousy 3 stars. Oh, the cruelty!

In 2016 we decided to change it. Here’s what we did!

First up – here’s what we didn’t do. We didn’t write fake reviews, nor did we hire teams to write hundreds of fake reviews on our products. All of our reviews are genuine.

We also didn’t really change our games – we didn’t give away masses of free things, or offer incentivised rewards in return for a glowing review. We just continued our usual pace of fixing problems, adding in support for new devices, and adding features where we could. No difference there.

Our real change was three-fold:

  1. When we’d resolved a problem via our customer support, and especially when someone was delighted, we asked people if they wouldn’t mind leaving us a good review on the store.
  2. We encouraged people to tell us when they had a problem, and made it easier for them to do so.
  3. We asked people to leave a rating or review when they were happy with the game, without annoying them in the process.

 

Well, duh!  Isn’t that obvious?  Yes! Absolutely obvious… like most things – in hindsight.

Here’s how we accomplished this!

1 : A better customer support process

We switched from using a big email account for our customer support to an online system called Freshdesk. (Other systems are available!) We’ve written before about why we love Freshdesk – it means we track problems, and usually manage to resolve them and get back to the customer to let them know, even if it’s been weeks or months since they got in touch.

Freshdesk also let us include a mobile component in our games – adding customer messaging from within the game, as well as in-app FAQs, which really help. A customer getting in touch via this system means that we also get some key support information: what device and OS version they have, which game it is, and what version of the game they have. These are details which (normally) wouldn’t all be sent by the customer in their first email, so by getting all of that along with their very first message, we cut out at least one round of back-and-forth asking for details. Happy customers, less time spent on support, and problems more likely to be resolved right away! Freshdesk’s latest version of their mobile system is called Hotline, and we love that too!

So, with more customers being happy, we just had to take the step of asking them to help! Being oh-so-very painfully British, this was quite a challenge for us, but our customers are lovely and are often delighted to help us out, since we’ve gone out of our way to provide great support. The bar for “great” customer support is so easy to reach: you just need to be better than the telecoms provider they’ve had to spend hours on the phone with trying to get their problems resolved. Being a small team (just two of us) we don’t always reply within minutes, but our ‘next day’ kind of target still makes people happy.

Freshdesk gives us a really helpful ‘Request App Review’ system for our threaded customer support, which basically provides the customer with a button and a link to leave a review, connecting them straight to the relevant page on each of the App stores. Anything to take the effort out of finding it is good!


 

2 : Tell us about big problems

By giving people built-in ways to contact us, we mostly hear about problems as support messages. If the user can tell us about their problem and we can respond quickly (preferably resolving it too, but at least responding), then they usually don’t feel the need to also go and leave a negative review. Obviously this only works once they’ve been able to successfully install and run the game, but that does cover perhaps 75% of customer problems.

We’ve also tried to follow up on reviews on the app stores (Google Play, Amazon) where we’re able to, asking people to get in touch, or telling them when a new version fixes the thing they weren’t happy with. On Google Play it’s fairly easy for users to update their review, which means they may change their 2-star review to a 4 or 5 – which really helps the average!

 

A reply back to a happy user who might (crossing fingers) adjust their star rating later


 

An example of a 2 star review, changed to a 5 after we fixed the problem, and replied via the reviews page


We try to engage positively with the low reviews – even if that reviewer never changes their rating.

 

3 : Asking for good reviews

This is the challenging one!

An easy way to get lots of people to leave a review for your app is to nag them with a big modal dialog box popping up every 15 minutes saying “Please leave a review!” – with a button taking you directly to the store page.  This will get you a LOT of reviews, and they won’t be happy ones!  Nobody likes to be interrupted or nagged, and doing this makes people cross and so they’ll leave a review saying that the app endlessly nags them, and then go on to add any other minor gripes they have.  The upshot – piles of 1 and 2 star reviews, lots of negative comments, and a star average that plummets.  Even if you remove that, you’re left with so many low reviews that it’s very hard to pull the average up again, and likely impossible to get it above the target of a 4.

After lots of advice, some great examples from other developers, and a lot of whiteboard diagrams, we came up with the following goals:

  1. Ask in an unobtrusive way that doesn’t interrupt the game flow
  2. Don’t nag
  3. Make it easy for people to get to the review page
  4. Encourage people to contact us instead if they’re not happy about something
  5. Ask at a high point when the player is likely to be happiest.
  6. Ask them to leave a good review
  7. Only ask the players who are sticking with the game (and so are most likely to like it)
  8. Ask again (only if appropriate) if it’s a new version with a major new feature set.
  9. Aim for quality of reviews rather than quantity.

 

Here’s how we did this!

After completing any of our Daily Puzzle games, the player can enter their name and then submit it to see how they compare against other players. Here’s the usual high score page they get, with just one small addition.

[Screenshot: the high score page, with the new panel added]

Usually they would then just hit Back (the hardware button, or the arrow in the top left) to return to the title screen.

 

We added this small box underneath the rating area, with two buttons.

[Screenshot: the “Are you enjoying the game?” box]

This is unobtrusive ( Goal 1 ) and is completely ignorable.

 

We don’t pop this box up every time though – we only include the box at a high point (when we reckon the player is likely to be happiest), determined by:

  • After they’ve completed at least 10 daily games, and at least 5 within the last week. ( Goal 7 )
  • When the score they’ve just got is in the top 20% of their last 10 results. ( Goal 5 )

If someone usually gets a bronze medal for their score, then the time they get a silver or gold is the best time to ask if they’re happy.

We want to ask our happiest users to leave reviews, so we ask them if they’re happy first of all! ( Goal 9 )  This is fairly simple filtering, but it makes a big difference!
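In code, that check boils down to something like this (a hypothetical Swift sketch – the property names and exact thresholds are illustrative, not our production code):

```swift
// Sketch of the "is this a good moment to ask?" filter described above.
struct ReviewPromptRule {
    let minGamesCompleted = 10       // Goal 7: only regular players
    let minGamesInLastWeek = 5       // Goal 7: ...who are currently active
    let topFraction = 0.2            // Goal 5: ask on a recent personal best

    func shouldOfferPrompt(totalGamesCompleted: Int,
                           gamesInLastWeek: Int,
                           recentScores: [Int],       // the player's last 10 scores
                           latestScore: Int) -> Bool {
        guard totalGamesCompleted >= minGamesCompleted,
              gamesInLastWeek >= minGamesInLastWeek,
              !recentScores.isEmpty else { return false }

        // Is the latest score in the top 20% of the last 10 results?
        let sorted = recentScores.sorted(by: >)
        let cutoffIndex = max(0, Int(Double(sorted.count) * topFraction) - 1)
        return latestScore >= sorted[cutoffIndex]
    }
}
```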

 

If they’re not happy (they choose “Not really” to the question of whether they’re enjoying), we switch the box out to show this:

[Screenshot: the “send us feedback” box]

Choosing “OK, sure!” opens up our support system with a message box, so that the user has to do as little as possible in order to send us feedback. ( Goal 4 ) We save a flag to record that they weren’t happy, and we won’t ask them again – although if we resolve their problem via customer support we’ll ask that way. We find out very quickly about the things that bother our players this way, since it’s a small prompt that doesn’t feel like we’re taking them out of their way.

 

 

If, as we hope, they choose “Yes I am!” to the original question, it shows this box and asks if they’d be willing to leave a positive review, explaining that positive reviews really help us. ( Goal 6 )

[Screenshot: the review request box, framed with 5 stars]

 

Note that we are a bit cheeky here by ‘framing’ the box with the 5 stars. We don’t have any direct control over what the user eventually chooses, but by suggesting that 5 stars is the default for being happy with the game, we’re hoping that it increases the number of users who choose that.  Without putting anything at all, the ‘default’ score for an app would be a 3 star average.

If they say “No thanks” we don’t ask again. Again, part of the ‘not nagging’ ( Goal 2 ).

 

We provide a link right to the review page (as far as possible) ( Goal 3 ) – and once they return to the app, we put up a ‘thanks!’.

[Screenshot: the “thanks!” box]

We don’t offer any particular reward for doing this, although if the game was about to show an advert, we have it skip that one – mainly because, after having left the app and returned, we want them to get back to playing as soon as possible. We save a flag to record that they’ve left a review (although the reality is we don’t know whether they did or not), and consequently don’t ask again. ( Goal 2 )
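The bookkeeping behind all of this is just a handful of saved flags. A minimal sketch (hypothetical Swift; the state names, the key, and the use of UserDefaults are assumptions for illustration, not our actual storage):

```swift
import Foundation

// Sketch of the "don't nag" bookkeeping (Goal 2): persist a tiny bit of state
// so the box is shown at most once per outcome.
enum ReviewPromptState: String {
    case neverAsked          // box not yet shown
    case declined            // said "No thanks" – never ask again
    case sentFeedback        // said "Not really" and contacted support
    case askedForReview      // tapped through to the store (we assume they reviewed)
}

struct ReviewPromptStore {
    private let key = "reviewPromptState"
    private let defaults = UserDefaults.standard

    var state: ReviewPromptState {
        get { ReviewPromptState(rawValue: defaults.string(forKey: key) ?? "") ?? .neverAsked }
        set { defaults.set(newValue.rawValue, forKey: key) }
    }

    // Only players we've never prompted (or re-prompt candidates on a big new
    // version – not shown here) should see the box again.
    var mayShowPrompt: Bool { state == .neverAsked }
}
```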

 

However… if a user has been happy in the past and has left a review, then after we’ve updated to a new version (with new features), after at least a good number of further games (30 or more), and again at a happy point (a relatively high score), we will ask again whether they like the new version. ( Goal 8 ) We only do this on iOS, which allows a user to leave a new review for each version of the app. The Apple App Store shows the overall rating and also the ratings for the current version, so it’s very useful to have some returning reviewers coming back and saying what’s new that they’re happy about.

 

Here’s an overview of the Review Request Flow between the various boxes that we include.

[Diagram: the review request flow]

 

Aside from the jump out to either the app store review page, or our message sending page, all of this takes place on the high score form, simply by changing the contents of that box in-place, avoiding context switching for the user.

The Results

For Astraware Crosswords, we put in this system in May 2016.  The game had built up (on Google Play) an average star rating of 3.8, with a couple of hundred reviews in total.

Although it took a user (on average) a week or two before we might have prompted them to review (getting a recent personal best can take a while), the quantity of reviews that came in was surprisingly high, and the vast majority of them were 5 stars, with some 4s. We still had some reviewers leaving low scores – as was always the case – and the rate of these didn’t change, suggesting that the prompt wasn’t generating them; they come from people who couldn’t install the game, didn’t like it, and so on.

Here’s how the chart looked, with a breakdown of reviews by month (by colour – darkest green are the 5 star reviews, total on the right) and the cumulative average (scale on the left):

 

[Chart: reviews per month and cumulative average rating for Astraware Crosswords]

 

It didn’t take long for our review average to pick up. A month of new reviews probably added as many reviews as the game had gathered over the previous 3 years.
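As a rough illustration of why the average can move that quickly (treating the ‘couple of hundred reviews at 3.8’ above as round numbers): 200 reviews averaging 3.8 stars is about 760 star-points in total, and lifting the cumulative average to 4.0 needs (760 + 5n) ÷ (200 + n) ≥ 4.0, which works out at around n = 40 additional 5-star reviews – fewer still if some existing reviewers also upgrade their ratings.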

The update with our review system was added in early May 2016, and sometime in June our cumulative average crept just above 4.0. The effect of this on our download rate (all from organic discovery on the Google Play store) was:

 

[Chart: daily installs for Astraware Crosswords]

 

A jump from 25 to 50 installs per day doesn’t sound like a big deal, but those extra users mean an increase in our daily users that continues to compound and grow. A higher download number then factors into the store ranking, and so the game creeps up the charts, gets more visibility, and continues to increase. Fantastic!

 

We released further updates in late 2016 which have given some extra spikes of downloads too, but the general trend of increased installs is (we believe) at least as much down to the consistently high rating, as it is to our consistent rate of improving the game!

 

Customer Support

We did get an increase in customer support contacts – tagged so that we know the sender had seen the review box and said they weren’t happy. For many of these we were able to actually resolve the problem and delight them – at which point we could ask for a review, which they’d be very happy to give.

 

Other Existing Astraware Games

The review improvement system has been a really useful way of increasing our exposure and download numbers for each of our puzzle games – since they are all built on the same framework, we could implement it into each of them relatively easily.

The effect in each of our other main games has been similar – review averages improving from their previous level, along with an increased number of reviews and downloads.

We rolled out the system to Codewords and Kriss Kross in April too, and you can see the spike of additional reviews for each.

[Charts: ratings over time for Codewords and Kriss Kross]

We didn’t update Wordsearch until September, but the change is still similar!

[Chart: ratings over time for Wordsearch]

Neatly, this reversed the trend of the review average, which had been creeping down since the beginning of 2016 and heading towards sub-4.00 territory – much of that being because the game hadn’t been updated for a while and wasn’t working well on newer devices with large resolutions.

Other Platforms

This system has helped our iOS versions too; however, the effect isn’t as marked.

Whether adding a review on the Apple App Store involves one step too many, or there’s some other reason, we’re not sure. It could even be that the manner of asking iOS users – perhaps the wording – just isn’t as effective. We haven’t come up with a way of improving this yet, so if anyone knows the magic formula we’d love to hear from you!

 

New Astraware Games

The effect on the games that were new releases later in the year (Astraware Acrostic and Astraware Wordoku) was even more interesting:

[Charts: ratings over time for Wordoku and Acrostic]

 

In these cases we didn’t have a background of legacy users, nor an existing review average sitting at a relatively low level.

For these games, the early users were existing players of our other Astraware games, to whom we’d sent a message asking them to try out the new game. We mostly suggested Wordoku to our Sudoku players, and Acrostic to our Crosswords players, as we thought these would be the groups most interested in each game respectively.

Of particular interest is that once the early reviews are established as high and positive, they set the ‘tone’ for future reviews. Not only are the ratings more likely to be a 5 than anything else, but the comments left in the reviews are almost always positive too. This could be down to some kind of bandwagon effect – people wanting to stick with the consensus, or peer pressure and not wanting to be the odd one out being negative (aka the spiral of silence) – or it could just be that the games are awesome and our customers are happy and loyal. I’d like to imagine more of the latter!

 

How many users leave reviews?

We took a decision that we wanted the highest-rated reviews, at the expense of quantity. That means we aimed to select only the happiest, most regular players, and suggest (in as nice a way as possible) that they leave a review. We used Flurry to track the various segments of our players – from beginning to play, through getting their best scores, to how many have seen and responded to the message.

 

Cost and Summary

Creating this system didn’t take a huge effort compared to making a new game, as we were mostly building on blocks of various kinds that we already had in place, and just inserting them into a flow that already existed in the game.

 

Rough estimates would be:

  • Design: 4 days
  • Artwork: 2 days
  • Implementation into the first game: 10 days
  • Implementation into subsequent games: 1 day each

Total time: approx. 3-4 ‘man-weeks’

 

The value of this appears to be, after 3-6 months, a rough doubling of our daily download rate for each game. Some of those new players stick with the games, and either spend money or watch ads to play – so ultimately, perhaps a doubling of our advertising and IAP revenue in total for those games on the Google platform. A payback time, for us, of perhaps 4 months, and it continues to work!

For us this is a huge win! (Offsetting other endeavours which haven’t been quite so successful, or older products that are dropping away.)

 

What Now?

The changes we worked on later in 2016 have been about alternative ways to monetise the games – giving people more ways to play, whether by buying extra packs of puzzles, or by watching video ads in return for more free puzzles. That’s the stuff of a later blog article, so stay tuned for that once we’ve finished that phase of work!

Thanks for reading – we hope this is useful! Feel free to send any comments or suggestions, or ask any questions!


 

 

Thoughts on Apple’s September 2014 Event

 

For Tim Cook and Apple, this was obviously one of those company-defining events that don’t happen very often. Cook had pretty much confirmed that they would be announcing a new product category, both in comments earlier in the year and by the choice of venue: the Flint Centre in Cupertino was previously used to announce the original Macintosh and the first iMac. So, what of the much-anticipated “iWatch”, and of the iPhone 6?

iPhone 6

Quite a lot of the details had already been leaked by Chinese manufacturers, so we knew that there would be a 4.7” phone and probably a 5.5” one. In terms of the aesthetics I’m happy to see the rounded edges (the iPhone-shaped imprint in every pair of jeans I own is embarrassing), but the protruding camera lens had me wondering whether Steve Jobs would have allowed that to be announced. Still, it’s de rigueur these days… The phones are significantly thinner than the already slim iPhone 5s, so maybe that’s a reason for the bulge. I’ve always had my iPhones in cases anyway, so hopefully a case will make it sit flat.

The display details are of particular interest to us, as we have to make sure our games look their best on all the popular screen resolutions, although the new iPhones will automatically upscale from the iPhone 5’s retina display. The larger Plus phone has a standard 1080×1920 screen, as used in the Samsung Galaxy S5 and the Nexus 5. That’s great and means that we can support that with ease (I’ll explain our “display metrics” system another day). The iPhone 6’s 750×1334 screen is quite, um, unique though, so I’m guessing we’ll use the common Android 720×1280 resolution and spread things out a bit.

While the iPhone 6’s screen keeps exactly the same “retina” 326 pixels per inch, the Plus has a rather tighter 401. Obviously our eyes have improved 23% since Steve said that the iPhone 4’s retina display was basically as good as you need. But hey, at least it’s not the LG G3’s insane 538 ppi. Apple don’t usually ramp up specs just for numerical superiority.

Other than that, the new iPhones are predictably faster, have a better camera (though interestingly they’ve again gone for better optics while keeping the megapixels the same) and supposedly better battery life.

Apple Pay

The era of the ‘i’ has passed: maybe it’s just getting too expensive to buy up the trademarks these days. Apple Pay is basically a combination of the new iPhones’ NFC capabilities (Apple always get there in the end), the Touch ID fingerprint sensor, and another new secure chip for storing payment identity details. As usual, Apple, being an American tech company, announce this all as if the whole world still uses the magnetic strips on their credit cards. No, most of us can already use ‘contactless’ payments. Oh well!

That said, Apple Pay seems to be a well thought-through and seamless system which does tick the boxes in terms of convenience and security. But if you might be going to a store that doesn’t support contactless (and maybe it actually needs to be Apple-specific software on the terminal) then you’ll need those credit cards…

Apple Watch

See, no “i” here either! The Apple Watch looked surprisingly unsurprising to me… The round Moto 360 is significantly more exciting to look at. It looks quite thick, which is always a turn-off for me when it comes to watches. But Apple have understood that watches need to be more personalised than phones, so it’s good to see a wide range of different styles of straps and three different materials for the watch itself. Gold, if you so desire!

In terms of the software, things seem pretty well thought-through for a 1.0 release. Nice virtual watch faces, good Siri voice control, tidy integrations with email, calendar, messages etc. The ability to communicate with other Apple Watch users via a virtual (tiny) scribble pad looks quite interesting – I can imagine partners or friends building up a library of pictograms for common interactions.

Some pre-event coverage suggested an on-watch App Store, which seemed like a recipe for chaos to me. What they’ve actually done is made watch apps extensions of apps on the (mandatory) iPhone, via “WatchKit” (which joins a long line of kits such as StoreKit, HealthKit and MapKit). I couldn’t discern from the video how sophisticated this is, though it seemed to enable some quite elaborate graphics, interactions and hardware integration. I wonder what the possibilities are for games? Could something like “2048”, with its purely swipe-based control, be made to work on there? How about long-term games where you need to tend to your crops/troops etc.? There’s certainly some opportunity for exposure from Apple for developers that can do unique things here.


The new iPhones will be available from the 19th of this month, so as usual there’s not long for developers to think about getting games updated in time for launch! But how about Apple Watch, coming in the new year? How about something that works with that…?