
This week sees the publication of some more research we’ve done with Which?, aka the Consumers’ Association. This blog provides some background, explains our approach, and offers a few technical titbits.
New ICO Guidance Raises App Privacy Questions
Last month the Information Commissioner’s Office released some new draft guidance on privacy for “smart products”. Whilst it doesn’t explicitly mention apps, it does list a range of IoT products that are expected to be used alongside a companion app, so apps are certainly relevant. It’ll be interesting to watch what happens with it.
Our Research with Which? on Mobile App Privacy
We helped Which? test the privacy implications of over twenty of the most popular mobile apps, across a range of categories. The full research includes our technical pokings, the results of a large survey of public opinion on privacy, and Andrew’s analysis of the consent journey and privacy policies for all the tested apps. You can read Andrew’s piece here, or listen to him talking about it on the Which? tech podcast:
Whilst it might not be as headline-grabbing as previous work on smart-home devices and those ever-popular air fryers (see Testing Smart Device Privacy - funny to see air fryers explicitly mentioned in the ICO press release), the latest research we helped with focused on the privacy of a wider range of popular mobile applications. It used an improved version of the privacy testing approach that we contributed to and used for the air fryer testing.
The conclusions likely won’t surprise many: most apps request lots of permissions, many of which aren’t strictly required. Shopping apps will spam you with emails. Most apps talk to lots of places around the world, but chiefly cloud servers in the US. Apps from Chinese companies also talk to China.
Some of the same old stuff is still happening: sneaky UI design that makes the “do not consent” button much less obvious than the “consent” button; too many apps (15 of 20) requesting fine location access when most don’t need it; and 14 requesting microphone access, many of which have no obvious voice-related features.
The good news is that both iOS and Android have got stricter in their approach to permissions. Both support just-in-time approval, let you restrict background access, and can revoke permissions from apps you haven’t used in a while. Of course, not all developers are keeping up to date, whether deliberately or not. For example, we saw some apps still using the officially deprecated Android permission that reveals what other apps are running.
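To make the permission triage concrete, here’s a minimal sketch of how you might pull the requested permissions out of an APK and flag the ones worth a second look, using the androguard library. This isn’t our actual tooling: the APK path and the watchlist are illustrative, and GET_TASKS is the obvious candidate for the deprecated running-apps permission mentioned above.

```python
# Minimal sketch: list an APK's requested permissions and flag notable ones.
# Uses androguard (pip install androguard); the APK path and the watchlist
# below are illustrative, not taken from the Which? research.
from androguard.core.apk import APK  # androguard 4.x import path

WATCHLIST = {
    "android.permission.ACCESS_FINE_LOCATION",  # precise location - rarely essential
    "android.permission.RECORD_AUDIO",          # microphone access
    "android.permission.GET_TASKS",             # deprecated since API 21; reveals running apps
}

apk = APK("example-app.apk")  # hypothetical file
for perm in sorted(apk.get_permissions()):
    marker = "  <-- worth a second look" if perm in WATCHLIST else ""
    print(f"{perm}{marker}")
```

Run over a directory of APKs, a simple tally of who requests what gets you most of the way to the 15-out-of-20 style figures above.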
The survey was interesting - I’d adopted a rather cynical assumption that people don’t care about privacy because they just want to use the apps anyway. But the survey suggests it’s more complicated: the average position seems to be that people are concerned about privacy, use the apps anyway, but remain uneasy about the implications of doing so.
How We Tested App Privacy
Privacy testing has a lot of overlap with security testing, but includes a bunch of other considerations too - for example, analysing the interactions: what the consent journey looks like, and how easy the privacy and account features are to use. Sometimes it’s just reading the documentation.
Intercepting App Network Traffic with Corellium
This time we made heavy use of Corellium for virtual mobile devices, using a standard device image with a set of example user data. The big advantage is that it does network interception for you, so we could inspect the myriad connections that each app made to remote servers. Interestingly, only two of the tested apps balked at running on Corellium, which they presumably detect as a threat, so we had to test those on an actual phone.
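Corellium hands you the decrypted traffic, so the interesting work is in summarising it. As a hedged illustration - this is neither Corellium’s tooling nor our exact scripts - here’s the sort of post-processing you might do on an exported capture with pyshark, tallying the TLS SNI (the hostname the app asked for) to see who an app talks to:

```python
# Sketch: count the server names (TLS SNI) in a packet capture to see
# which hosts an app contacts. Requires pyshark (a tshark wrapper);
# the pcap filename is a placeholder.
from collections import Counter
import pyshark

hosts = Counter()
cap = pyshark.FileCapture(
    "app-session.pcap",  # hypothetical capture exported from the virtual device
    display_filter="tls.handshake.extensions_server_name",
)
for pkt in cap:
    hosts[str(pkt.tls.handshake_extensions_server_name)] += 1
cap.close()

for host, count in hosts.most_common():
    print(f"{count:5d}  {host}")
```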
This was the first time we’ve recorded and inspected network traffic across such a range of apps. Andrew didn’t have space to cover the network traffic in much detail, so it’s worth covering a bit here. I’d expected to see private information being sent from many apps to all over the internet, but that wasn’t the case. It makes sense in retrospect: if an app shares your details with other parties, that sharing is likely to happen from the vendor’s backend rather than from the app itself.
That said, it is surprising how chatty some of the apps are - whilst we all know the global internet is a thing, it’s still interesting to see it in action for a bunch of typical apps. For example, in a relatively small data capture, the Alibaba app talks to cloud services hosted by themselves, Akamai, Google, ChinaNet and “Zhejiang Taobao Network Company”. No, me neither on that last one, but that shows my Western bias, as a quick search tells me it’s an online shopping site and a part of Alibaba. Their servers in use here are all US-based, interestingly.
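For what it’s worth, attributing those destinations to companies and countries is straightforward too. Here’s a rough sketch using the ipwhois library - the addresses below are well-known public resolvers standing in for captured destination IPs, not data from our testing:

```python
# Sketch: map destination IPs to their owning network and country via RDAP.
# Uses ipwhois (pip install ipwhois); the addresses are placeholders.
from ipwhois import IPWhois

for ip in ["8.8.8.8", "1.1.1.1"]:  # stand-ins for IPs seen in a capture
    rdap = IPWhois(ip).lookup_rdap()
    print(ip, rdap.get("asn_country_code"), "-", rdap.get("asn_description"))
```

That’s how names like ChinaNet and “Zhejiang Taobao Network Company” fall out of a capture.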
Otherwise, I thought the following were at least generally interesting observations:
- The Amazon app talks mostly to AWS, but also a couple of Google Cloud services. Which may sting a little.
- The TikTok app is very chatty, mostly to Akamai services in the US.
- Only a couple of the apps do anything with Azure - sorry Redmond.
- And boy does everyone’s infrastructure rely on Cloudflare, which isn’t great news for Russian users. Pretty sure all our eggs (including Hexiosec’s) are safe in that one big basket.
Why App Privacy Still Needs More Attention
There probably isn’t a market for lots of privacy testing gigs: if it’s anything like product security, many manufacturers and developers just won’t put the budget into it. But for us, privacy testing is an interesting parallel to typical app security testing. Andy and his colleagues at Which? do some great work for a good cause - the General Public. Often they’re met with cynicism, whataboutery or deflection, but alongside the ICO they can hopefully shift the dial a little more in favour of consumers and app users. We’re happy to do our little bit to help.