The Ultimate Guide to using Google Analytics for Cross Device Optimisation

Craig Sullivan
15 min read · Jan 22, 2016

Part 2 : Tablet Device Analysis

This is Part 2 of a 7 part guide. Parts 5–7 are coming soon. All the articles can be found in the Article Index

Tablet Insights

So what are we interested in with Tablet visitors?

It would be wonderful if we could figure out things like:

Orientation (how people hold the device)

Resolution (how big their screen is)

Viewport (how big their browsing window is)

Model (is it an iPad mini, iPad 1, 2, 3, 4, or Pro?)

Operating system (are they running Android, Windows or iOS?)

Let me take you through each one of these and explain what we’re trying to find out, whether we can actually get the data and what to look at:

Tablet Orientation

The one thing we can’t track natively in Google Analytics is how people hold their tablet device (or if they switch orientations on particular pages).

You can add code which will detect an orientation change and trigger an event in Google Analytics but that won’t help us if we don’t have this data.

So let’s look at research on orientation, from a source I trust:

Please read everything by Luke Wroblewski (he’s a genius on forms, mobile, tiny interactions and UX). His work is probably the most useful stuff I have absorbed in my entire career.

The research he shared on how people hold their devices and which fingers they use is very useful for everyone designing products.

Most people are biased by their ego — they think that their own particular way (using thumbs, index finger) is the way that many others use the device. Really? When’s the last time you studied 5 people or actually found out?

If you’re doing any work on Mobile or Tablets, these two videos probably contain things you never knew — watch them:

Market Stats on Orientation

There are plenty of stats out there on orientation but a small warning — these may not match your visitor behaviour — as the context and audience WILL vary tremendously from the ‘market average’.

If you want to get this data, you’ll need your developers to use this sort of code to set a custom variable in Google Analytics:

```javascript
return window.innerHeight > window.innerWidth ? 'Portrait' : 'Landscape';
```

(Thanks to @conversionworks)

One final note — if you’re really detail obsessed, you could record every time people change their orientation. My guess is that the initial orientation may not be a good signal if people change orientation frequently (and it might be your design that causes this).
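If you do want to capture those changes, here is a minimal sketch of how it might be wired up with analytics.js. Note the assumptions: ‘dimension1’ is an arbitrary custom dimension index I picked for illustration (use whichever one you configured in the GA admin), and this is my sketch, not official GA code.

```javascript
// Hypothetical sketch: report tablet orientation to Google Analytics.
// 'dimension1' is an ASSUMED custom dimension index - adjust to match
// the dimension you actually configured in the GA admin.
function getOrientation(width, height) {
  return height > width ? 'Portrait' : 'Landscape';
}

if (typeof window !== 'undefined') {
  // Record the initial orientation as a custom dimension on the pageview.
  ga('set', 'dimension1', getOrientation(window.innerWidth, window.innerHeight));
  ga('send', 'pageview');

  // Fire an event every time the visitor rotates the device.
  window.addEventListener('orientationchange', function () {
    ga('send', 'event', 'Device', 'Orientation Change',
      getOrientation(window.innerWidth, window.innerHeight));
  });
}
```

The event hits will then show up under Behaviour, Events, letting you see on which pages people rotate.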

Here’s an article for more reading:

If you have orientation data, it will sharpen your expectations of how people hold your experience in their hands.

If you don’t have this data, all is not lost — because we know that people use both! You ALWAYS need to try both ways when you test, unless you can PROVE your audience holds them all one way — ok? Both ways.

Tablet Viewport & Resolution

This is very interesting stuff — what size of window (and screen) do people have onto the soul of your product? And to designers who insist there is no fold line — there is, but it varies depending on the device mix and viewport size!

If you don’t know how your design actually breaks and flows horizontally and vertically on consumer devices, don’t lecture me about whether the fold line exists or not. It does. It makes a difference. It depends on the design, coding and viewport. Feel free to Go Ahead and Get So Over Yourself on this one.

So what can we find out about screen size and viewport from GA?

Resolution is easy for Tablets — you just pull it from Google Analytics eh? Well yes and no. The problem is that some tablets (all Apple iPads for example) just report the same resolution, regardless of whether they have a retina screen or not. 1024x768 is the resolution of ALL iPads, according to GA.

So what about viewport? The measurement of the inner window size that people are viewing content through? Will that not be a more precise measurement anyway?

As of early Jan 2016, viewport data is at LAST available in Google Analytics — you can now use the ‘Browser Size’ dimension. Immense cheering breaks out, until I report that it’s not that useful to you.

The problem is that all those retina screens for iPads (and some other models with multipliers like this) now report DOUBLE the pixels. So if someone is viewing content in a large window, it could be because their viewport is large but also it could be because they have a retina screen. Without knowing if the device has a retina screen, you can’t then deduce the viewport accurately.

Thanks GA for this logic loop — there are two customer attributes that I can’t use on their own, or together, to deduce all my Apple iPad screen setups.

So — even though we have browser size now, I can’t see an easy way to get round this problem, unless you can add code to the site. Viewport and Resolution are essentially useless for iPads.
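If you can add code to the site, one workaround (my own suggestion, not an official GA feature) is to send window.devicePixelRatio into a spare custom dimension — ‘dimension2’ here is an arbitrary index I picked for illustration:

```javascript
// Hypothetical sketch: record the device pixel ratio so retina and
// non-retina screens can be separated in GA reports later.
// 'dimension2' is an ASSUMED custom dimension index - use your own.
function classifyScreen(pixelRatio) {
  return pixelRatio >= 2 ? 'Retina' : 'Standard';
}

if (typeof window !== 'undefined') {
  var ratio = window.devicePixelRatio || 1;
  ga('set', 'dimension2', classifyScreen(ratio));
  ga('send', 'pageview');
}
```

With that dimension sitting alongside ‘Browser Size’, the doubled pixel counts from retina screens can be divided back out to recover the real viewport.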

iPad Models

So — we’ve established that Google Analytics can’t tell you what model of iPads are on your site.

It can’t tell you whether it’s an iPad mini, iPad 2, iPad Pro or anything. It’s just an iPad to Google Analytics.

But does the iPad Model really matter?

For web designers working to a grid or particular breakpoints, it makes a difference in terms of how the physical screen size and DPI impacts legibility, readability and scannability — core usability stuff.

Although the density of the iPad display goes up on later models, the resolution, from a web browser point of view, is still the same. Every iPad reports the same resolution (1024x768) for design and layout purposes. I guess that removes one reason to ‘need to know’ this model breakdown but I still yearn for this data, which feels missing.

If your client or company has better data on the model split — I will use that. If they don’t, I will tend to use one recent iPad model (3/4/Air) and one recent iPad mini model. Your market may be different so use some judgement on what you test.

I hate fudges like this so let’s hope Google / Apple can do something to resolve this, or at least make resolution and viewport consistent with each other.

Apple looks huge in my data — why bother with Android?

Apple nearly always looks dominant in the Google Analytics reports.

Why? Well — when you have only a limited number of Apple devices on the market and ZILLIONS of Android devices, the Android share of the audience in basic reports looks dwarfed, because it’s fragmented!

Don’t be deceived! This is Google Analytics newbie mistake 101 — simply looking at the ‘top’ stuff in any report masks the rest of the data from your inspection. You may actually have more Android devices than Apple on your site, despite what you ‘assume’.

This is one of the things we need to sort in the analysis of this data — how it is clumped, clustered or broken apart. Don’t fall into this trap of thinking the top of the analytics report tells you the true picture. We need to make our own picture here.

Let’s pull some data then — we’re going to do breakdowns or clustering by Operating System, Device Type, Brands, Models and Resolutions.

Data Crunch: Tablet Operating System

Navigate to the Audience, Mobile, Devices report and enable the ‘Tablet Device Category only’ filter you made earlier.

Make sure you turn off the ‘All sessions’ segment at the same time.

You’ll now see a list like this:

Now click the ‘Primary Dimension’ dropdown and choose ‘Operating System’ from the list.

Please expand the number of rows shown on the report to the maximum possible.

You’ll see a report like this:

You’ll see this gives a much clearer picture of the ‘broad manufacturer’ split than looking at a device report.

Here you can see that Android drives 21% of sessions on this site.

In this second example above, the Android percentage is much lower (4.5%) but you’d be silly to jump to any conclusions. The reason is that most of these Android tablets are being redirected to a dedicated mobile site. It’s a bug — not a feature, consumer preference or signal!

This is why you need to understand the flow as well as device types and categories — to interpret the data — otherwise it can lead you to a biased conclusion.

For the second example above, it tells me that Android is artificially depressed because of bugs in the flow handling. For the first example, this report tells me we have a significant audience we need to cover, not just iPhones.

Remember that this is a moving target (any device or browser version figures) because new versions are coming on the market all the time. Over a year, it will have completely moved on you.

ALWAYS redraw your testing list when things happen like new product launches (e.g. any Apple product launch), Christmas, College time and anything that might skew your previously held beliefs about the device mix.

It’s continually changing.

Data Crunch: Tablet Brands

Let’s now change the Primary Dimension on your report to show ‘Mobile Device Branding’ — to see what manufacturers are kicking around.

In this example, it’s interesting to see Amazon in 3rd place in all tablets. Apart from that, nothing hugely insightful here — except that testing one or two anchors from this list:

Apple tablets, Samsung tablets, Kindles, Tesco tablets

would represent nearly all your tablet experiences on your site. Interesting eh?

You may also find useful information in the ‘Service Provider’ dimension — for example, what 3G/4G carriers are being used by your tablet visitors or whether they are primarily using wifi connections.

Data Crunch: Apple Tablet Resolutions

Navigate to the Audience, Mobile, Devices report and enable the ‘Tablet Device Category only’ filter you made earlier.

Make sure you turn off the ‘All sessions’ segment at the same time.

You’ll now see a list like this:

Now click the ‘Secondary dimension’ dropdown and choose ‘Screen Resolution’ from the list. You’ll see a list like this:

In the filter box on this report, type in the word “iPad” and hit return.

Now you’ll be looking at a report like this.

So what sticks out here?

Well — why are there all these resolutions for the iPad? I thought it was only 1024x768?

Also — why are there so few landscape visitors (1024x768) compared to portrait visitors (768x1024)?

And what are the other resolutions here?

Firstly, some reading that helps explain this data:

So — back to the data.

As you can see here, we have a huge number of portrait visitors (768x1024) — is this really true? Nope. It isn’t reporting correctly. Ignore this.

Some of these other resolutions are browsers inside app shells — or views onto content from smaller windows. I just can’t be sure — it’s about 0.6% of the site traffic here. If there was more traffic, I might look at the site referrers for where these may be coming from.

So — please make a note of the total Apple devices here (in sessions or user counts). This is your Tablet:Apple figure — in this case, 304,418 sessions. We’ll now collect the Tablet:Android data.

Data Crunch: Android Tablet Resolutions

Now we can take a look at the Android tablets.

In this case, you do the reverse of what we just did. We now filter out iPads to concentrate on Android tablets only.

Just make a copy of the ‘Tablet Device Category’ segment and rename it as ‘Tablet Device Category — Android Only’ after changing it to include only the Android operating system.

See the following screen — just add ‘Operating System’ exactly matches ‘Android’ and save, removing the other segment please!

Now you will see data like this — these are your Android tablet resolutions:

In this case, I did some crunching to put these into resolution buckets and worked out that most of the tablets on this site fall into the 800, 1024 and 1280 resolutions for the longest edge of the tablet.

Bear one thing in mind here — we’re not looking for testing ‘specific’ resolutions. Just think more about resolutions that ‘represent’ large clusters of devices within what your customers use. If you have specific breakpoints in your designs, you may even be able to split by the responsive layout design experience each tablet is getting.

So then what I did here was look at the long and short edges, to see how these stacked up for Android:
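The crunching itself is simple enough to script. Here is a hypothetical sketch (the row data is made up for illustration) of bucketing exported GA resolution strings by their longest edge, so that portrait and landscape readings of the same device merge into one bucket:

```javascript
// Hypothetical sketch: group GA 'Screen Resolution' rows by the
// tablet's longest edge, summing sessions per bucket. The sample
// rows below are made-up numbers, not real report data.
function bucketByLongEdge(rows) {
  var totals = {};
  rows.forEach(function (row) {
    var parts = row.resolution.split('x').map(Number);
    var longEdge = Math.max(parts[0], parts[1]);
    totals[longEdge] = (totals[longEdge] || 0) + row.sessions;
  });
  return totals;
}

// Portrait and landscape variants of the same device merge:
var totals = bucketByLongEdge([
  { resolution: '800x1280', sessions: 120 },
  { resolution: '1280x800', sessions: 300 },
  { resolution: '600x1024', sessions: 90 }
]);
// totals[1280] === 420, totals[1024] === 90
```

The same function works on the short edge if you swap Math.max for Math.min.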

I could spend more time analysing this stuff but I’m after a rough idea for testing, not precision.

I can see that tablets with 600+, 800+ small edges cover almost all the models on the site.

About 10% of tablets have less than 600px on the short edge, so it’s worth trying one sometime, just to see what they’re like on the site!

In terms of big edges, it’s pretty much 900+ and 1024+ resolutions that dominate here.

SO — If I cover 800ishx600ish, 900ishx600ish and 1024+ big edge too — it gives me a LOT of reach when it comes to testing.

Once you’ve had a play with the Android data, please make a note of the total droid sessions in your report.

This is your Tablet:Android figure and you’ll need it later.

Data Crunch: Android Models

Firstly, go back to the Mobile, Devices report and make sure your segment called ‘Tablet Device Category — Android Only’ is applied (and no other segments).

Change the primary dimension on the report to ‘Mobile Device Branding’.

You’ll see a report like this:

This shows us clearly that Samsung, Amazon (Kindle), Google, Tesco (Hudl), Lenovo and Mozilla are about 90% of everything here.

Now drill down by clicking on the first brand here — Samsung:

Now we can see a list like this:

This data is kinda useless, as it has the technical name for the device — let’s add a secondary dimension of ‘Mobile Device Marketing Name’.

Voila — you then can see this type of list:

The big problem with this list (of over 100 Android Tablets) is — how the heck can I test all these?

And that’s why we did the resolution first!

Because most Android devices behave (from a browser and WebKit point of view) almost identically for websites, it’s really things like CPU, screen density, physical size, controls and resolution that impact the experience across this group (for websites — app mileage may vary).

If I can grab a fairly popular Android model, it would be a great proxy for all sorts of popular devices people are using. One part of our testing approach is to find devices that give you coverage of other similar devices too.

Imagine for a moment that you know the latest Samsung device on the market. Now imagine the older version — the back level model, that it’s replacing.

Those two devices (the new entrant, growing) and the mainstay (shrinking more slowly) are the predominant devices we end up testing for Android phones and Tablets. What’s out there now and what’s in our future!

After riffling through the model data, I conclude that the vast majority are 10.1" or 10.5" Galaxy Note/Tab models. If I can test one of those, it actually covers a huge percentage of all the models out there (and similar devices too).

If I wanted to be super thorough (or just curious) — I would also add two recent Samsung Tablets — one 7" and one 10.x"

Let’s repeat the drilldown for the other brands on the list — Amazon, Google, Tesco. I’m not going to screenshot these, just repeat the exercise!


In my example here, I opened up Amazon and found these to be nearly all Kindle Fire HD 7.


Nexus 7 covers almost all of these devices for me, Nexus 10 being a distant second


These are all low-cost Android tablets, nearly all Hudl 2.

Are you noodling too much?

Bear in mind you don’t want to be exhaustive here. If Android is 25% of your market and there’s a device that’s 2%, it’s actually 0.5% of the addressable audience.
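The arithmetic behind that warning, spelled out as a tiny helper (the 25% / 2% figures are the illustrative ones from above, not real data):

```javascript
// A device's share of the whole audience is its share within the
// platform multiplied by the platform's share of all traffic.
function addressableShare(platformShare, deviceShare) {
  return platformShare * deviceShare;
}

// 2% of an Android segment that is 25% of traffic = 0.5% of everyone.
var share = addressableShare(0.25, 0.02);
```

Run the same sum on any long-tail device before you spend an afternoon hunting its bugs.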

Sweat the big audience segments first and the bugs you fix there will also trickle down to many other devices — particularly for mobile and tablet devices which share the same underlying browser technology.

Data Crunch: Windows and Other Devices

You can also create a segment called “Tablet Device Category — Windows” or repeat this exercise for other brands or operating system segments in your report.

Some segments may be too small to bother with.

So — collect your Tablet:Windows and other measurements and make a note for later. You should have Tablet:Android, Tablet:Apple, Tablet:Windows, Tablet:Amazon figures and so on….

Data Crunch: OS Version

If you’re building an App presence, it’s worth drilling into the Android or iOS Operating System Version as collected by Google Analytics. This will give you a broad idea of the range of versions people have on the devices that visit your site.

If you’ve decided to only support a particular OS version on an App Store, make sure the business knows the impact of that decision if you have a significant audience who appear to use other versions!

The App store may get a different mixture of devices than what you see in Google Analytics but if I KNOW that 40% of my customers are on an older version of iOS, do I really want to give them the finger on the app store?

I see this as a good starting point. If you can’t write apps that support the majority of the OS versions your customers seem to have, then you need to reflect on why your assumptions are wrong!


We’ve now nailed part of the exercise here. We know what kinds of tablet devices (Droid/Windows/Apple/Amazon) are coming to the site.

We have a good idea of what screen resolutions are kicking around on the various platforms that visit our site — and we have some core numbers collected.

In the example I’ve been using, the final test list we drew up was:

One late model iPad mini (recent iOS)

One late model iPad (recent iOS)

One 7" and one 10" Samsung Tablet

One Google Nexus 7

Amazon Kindle Fire HD 7 3rd Gen

Hurrah — we’ve nailed the tablet testing list for this website using the data.

We’ve taken advantage of the similarity of a lot of devices (apple/android) in terms of how they respond to a browser, and found good devices that match our customer ownership as well as being a good technical mix.

Arriving at this list of tablets is not formulaic, as you can see — it requires a bit of judicious thinking about what the data tells you, what then gets covered by the natural ‘clusters’ in the market and finally, arriving at a *small* number of devices that represents the widest audience ‘reach’ for testing.

You should have recorded at least 4 datapoints:

Tablet:Apple sessions

Tablet:Android sessions

Tablet:Windows sessions

Tablet:Amazon (and any other) sessions

Read Part 3 — Desktop Browser Analysis


