
Smart Device Privacy

Scott Lester | 5 November 2024 | 6 min read

Introduction

Today sees the publication of the write-up of our research into the privacy of a range of smart devices, which we completed in September alongside Andrew Laughlin at the Consumers’ Association (aka Which?).

We’ve worked with Andrew and Which? over the years on various bits of security research, including on smishing, routers and the online attack surfaces of banks and big organisations. Whilst this work was analogous to security testing, this time we were chiefly concerned with user privacy.

Whilst Andrew’s piece obviously focusses on the results of the tests and the consumer perspective, we wanted to cover the technical side of what we did, and what we think about the results.

Testing

Testing Process

This research was a little different to what we normally do, although there are clear parallels between testing for privacy and testing for security. Much of our test setup was the same as we’d use for analysing a mobile application or hardware device for potential security problems.

We did all of the testing on a rooted Pixel phone, to make it easier to poke around in the data and grab the apps. Somewhat surprisingly, this didn’t cause any issues with the tested apps. Unlike banking apps, these manufacturers clearly don’t want to exclude customers running rooted phones.
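
As a rough illustration of that part of the setup, the sketch below pulls installed APKs off the phone with adb via Python. The package names are placeholders rather than the actual apps we tested, and it assumes adb is on the PATH with the phone connected and authorised.

```python
import subprocess
from pathlib import Path

# Placeholder package names for illustration - not the actual apps we tested.
PACKAGES = ["com.example.smarthome", "com.example.wearable"]

OUT_DIR = Path("apks")
OUT_DIR.mkdir(exist_ok=True)

def adb(*args: str) -> str:
    """Run an adb command against the connected phone and return stdout."""
    return subprocess.run(["adb", *args], check=True,
                          capture_output=True, text=True).stdout

for package in PACKAGES:
    # "pm path" lists the APK(s) backing an installed package,
    # one per line, each prefixed with "package:".
    paths = [line.removeprefix("package:").strip()
             for line in adb("shell", "pm", "path", package).splitlines()
             if line.startswith("package:")]
    for i, remote_path in enumerate(paths):
        local = OUT_DIR / f"{package}.{i}.apk"
        adb("pull", remote_path, str(local))
        print(f"pulled {remote_path} -> {local}")
```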

We got packet captures of as much traffic as possible, using a Raspberry Pi as a Wi-Fi intercepting proxy, much like the VM setup we blogged about before. With these captures we could at least see what each device and app was talking to, although as we didn’t enable TLS man-in-the-middle we couldn’t see the content of the traffic.
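
As a sketch of the sort of post-processing we then do on those captures, the snippet below uses scapy to list the DNS lookups and destination IP addresses seen from each device. The MAC addresses and capture file name are placeholders; the real mapping came from our test network’s DHCP leases.

```python
from collections import defaultdict
from scapy.all import rdpcap, Ether, IP, DNSQR  # pip install scapy

# Placeholder MAC-to-device mapping for illustration.
DEVICES = {
    "aa:bb:cc:dd:ee:01": "smart speaker",
    "aa:bb:cc:dd:ee:02": "air fryer",
}

packets = rdpcap("capture.pcap")  # capture taken on the Raspberry Pi proxy

dns_queries = defaultdict(set)   # device -> DNS names it looked up
destinations = defaultdict(set)  # device -> IP addresses it talked to

for pkt in packets:
    if Ether not in pkt:
        continue
    device = DEVICES.get(pkt[Ether].src.lower())
    if device is None:
        continue
    if DNSQR in pkt:
        name = pkt[DNSQR].qname.decode(errors="replace").rstrip(".")
        dns_queries[device].add(name)
    elif IP in pkt:
        destinations[device].add(pkt[IP].dst)

for device in DEVICES.values():
    print(f"== {device} ==")
    for name in sorted(dns_queries[device]):
        print(f"  DNS: {name}")
    for addr in sorted(destinations[device]):
        print(f"  IP:  {addr}")
```

Even without decrypting anything, this kind of summary is enough to see which cloud services and analytics domains a device phones home to.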

We also used MobSF (the Mobile Security Framework) as a convenient way of unpacking all the Android applications to inspect their permissions, libraries and functionality.
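
MobSF produces a full static-analysis report, but for a quick look at just the declared permissions, a few lines of androguard (the library MobSF itself uses for its Android analysis) will do. A minimal sketch, with a placeholder APK path; note the import path differs between androguard versions.

```python
# androguard 3.x import path; in androguard 4.x this is androguard.core.apk
from androguard.core.bytecodes.apk import APK

# Placeholder path - e.g. one of the APKs pulled with the adb sketch above.
apk = APK("apks/com.example.smarthome.0.apk")

print("Package:", apk.get_package())
print("Declared permissions:")
for permission in sorted(apk.get_permissions()):
    print("  ", permission)
```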

Our Conclusions

Andrew’s article covers the scoring and highlights (or lowlights) of the testing, and includes responses from the manufacturers who replied.

It was clear during testing that there are big differences between the product categories, so it’s only fair to compare within a category. This is obvious, really: the permissions and data collection that are reasonable for a smart watch tracking lots of fitness data are not necessarily fine for a smart air fryer cooking your chicken nuggies.

With smart home products there’s clearly a trend towards bringing all kinds of devices into a single home app ecosystem. That’s great if you’ve bought into it wholly, but less so if you just want an app to control one device. For example, if you want to use the smart features of the Xiaomi air fryer you need their home app, which requests all the permissions it needs for all the smart home functionality, even if you only want to control an air fryer. It’s similar for the Google and Amazon smart speakers, although their apps are only really needed for device setup.

You might assume that some of the big data-hungry companies would be the worst offenders when it comes to privacy, but that wasn’t what we saw. The Google and Amazon smart speakers ask for all kinds of permissions, but if these are declined they will operate in a non-smart mode, albeit without some functionality. And you only really need to use their apps to set up the speakers. The fact that they are big, well-known companies presumably means they are obliged to comply with legislation in a way that the pseudo-anonymous companies selling on marketplaces can keep ignoring.

And that’s where the real risk seems to come from: obscure brands selling cheap electronics, such as two of the smart watches we tested - almost identical devices from two supposedly different brands, both of which are hard to attribute to an actual developer or manufacturer, but which use the same mobile app. There’s a similar problem for security testing and research - what do we do with a zero-day vulnerability in a popular product if we can’t find a contact involved in making it? What’s worse, such products are often targeted at children and teens.

Legislation

Manufacturers have an increasing list of security and privacy requirements with which they are supposed to comply; there’s obviously GDPR, now there’s the Product Security and Telecommunications Infrastructure (PSTI) Act 2022 (see more from Andrew on it here), and there’s more coming from the Information Commissioner’s Office in the new year.

But how much they care does seem to vary wildly. It’s still clearly possible to successfully sell products that make no concessions to any security or privacy legislation on the big online shops. And whilst we see product recalls because of safety, we don’t see products dropped from sale because they pose a security or privacy risk.

This research shows there are massive gaps between what many manufacturers should be doing and what they actually do with regards to privacy and data collection.

For the record, we are always conscious that it’s easy to criticise from the outside. We’ve had internal debates about our own approach to privacy - especially as we like to practise what we preach when it comes to both privacy and security. But as a small company, that does often mean accepting some impact on sales and marketing. So we’re not always perfect, but we’re always improving.

Consumer Advice

Security people tend towards the paranoid side - during testing we discovered that the three of us involved don’t have a single smart TV between us, and weren’t interested in getting one. But as we see repeatedly, what’s good for the security paranoid isn’t the same for normal people.

So what would we recommend for most people? Here are a few points to consider:

  1. Remember that all too often, you get what you pay for. If a product or service is free, then it’s likely because they are collecting and selling your data.
  2. Do your research before buying a smart product. For example:
    1. Does it work offline?
    2. What permissions and data does it need to operate?
    3. Is it getting software/firmware/app updates?
  3. Put IoT/smart home devices on a separate Wi-Fi network, so they are isolated from user devices.
  4. Often a little bit of digging can help: in testing, we found that both smart speakers had two kinds of reset: one via the app, and another hard reset that required holding down some buttons. Figuring that out only took a little Googling.

Coverage

About Scott Lester
Scott is a technical Cyber Security professional with over fifteen years' experience across a broad range of roles within the public and private sectors. With a deep understanding of cyber security, he has in his career focussed on applied cryptography, network technologies, digital forensics and security research. At Hexiosec he leads the delivery of all of our cyber security services.