Building mobile apps that are truly privacy compliant requires a privacy by design approach from the outset. Phil Lee, Partner in our Privacy and Information team based in Palo Alto, highlights the key issues that app developers should be aware of.

My phone is my best friend. I carry it everywhere with me, and entrust it with vast amounts of my personal information, for the most part with little idea about who has access to that information, what they use it for, or where it goes. And what's more, I'm not alone. There are some 6 billion mobile phone subscribers out there, and I'm willing to bet that most – if not all of them – are every bit as unaware of how their mobile data is used as I am.

So it's hardly surprising that the Article 29 Working Party has weighed in on the issue with an "opinion on apps on smart devices". The Working Party splits its recommendations across the four key players in the mobile ecosystem (app developers, OS and device manufacturers, app stores and third parties such as ad networks and analytics providers), with app developers receiving the bulk of the attention.

Working Party recommendations

Many of the Working Party's recommendations don't come as a great surprise: provide mobile users with meaningful transparency, avoid data usage creep (data collected for one purpose shouldn't be used for other purposes), minimise the data collected, and provide robust security. But other recommendations will raise eyebrows, including that:

  • the Working Party doesn't meaningfully distinguish between the roles of an app publisher and an app developer – mostly treating them as one and the same. So the ten-man design agency engaged by Global Brand plc to build it a whizzy new mobile app is effectively treated as having the same compliance responsibilities as Global Brand, even though it will ultimately be Global Brand that publicly releases the app and exploits the data collected through it;
  • the Working Party considers EU data protection law to apply whenever a data collecting app is released into the European market, regardless of where the app developer itself is located globally. So developers who are based outside of Europe but who enjoy global release of their app on Apple's App Store or Google Play may unwittingly find themselves subjected to EU data protection requirements;
  • the Working Party takes the view that device identifiers like UDID, IMEI and IMSI numbers all qualify as personal data, and so should be afforded the full protection of European data protection law. This has a particular impact on the mobile ad industry, who typically collect these numbers for ad serving and ad tracking purposes, but aim to mitigate regulatory exposure by carefully avoiding collection of "real world" identifiers;
  • the Working Party places a heavy emphasis on the need for user opt-in consent, and does not address situations where the very nature of the app may make it so obvious to the user what information the app will collect as to make consent unnecessary (or implied through user download); and
  • the Working Party does not address the issue of data exports. Most apps are powered by cloud-based functionality and supported by global service providers meaning that, perhaps more than in any other context, the shortfalls of common data export solutions like model clauses and safe harbor become very apparent.

Designing for privacy

Mobile privacy is hard. In her guidance on mobile apps, the California Attorney-General rightly acknowledged that: "Protecting consumer privacy is a team sport. The decisions and actions of many players, operating individually and jointly, determine privacy outcomes for users. Hardware manufacturers, operating system developers, mobile telecommunications carriers, advertising networks, and mobile app developers all play a part, and their collaboration is crucial to enabling consumers to enjoy mobile apps without having to sacrifice their privacy."

Building mobile apps that are truly privacy compliant requires a privacy by design approach from the outset. But, for any mobile app build, there are some top tips that developers should be aware of:

  1. Always, always have a privacy policy. The poor privacy policy has been much maligned in recent years but, whether or not it's the best way to tell people what you do with their information (it's not), it still remains an expected standard. App developers need to make sure they have a privacy policy that accurately reflects how they will use and protect individuals' personal information, and make it available both prior to download (e.g. published on the app store download page) and in-app. Not having one is a sure-fire way to fall foul of privacy authorities – as evidenced in the ongoing Delta Airlines case.
  2. Surprise minimisation. The Working Party emphasises the need for user consents and, in certain contexts, consent will of course be appropriate (e.g. when accessing real-time GPS data). But, to my mind, the better standard is that proposed by the California Attorney-General of "surprise minimisation", which she explains as the use of "enhanced measures to alert users and give them control over data practices that are not related to an app's basic functionality or that involve sensitive information." Just-in-time privacy notices combined with meaningful user controls are the way forward (the first sketch after this list shows what that might look like in practice).
  3. Release "free" and "premium" versions. The Working Party says that individuals must have real choice over whether or not apps collect personal information about them. However, developers will commonly complain that real choice simply isn't an option – if they're going to provide an app for free, then they need to collect and monetise data through it (e.g. through in-app targeted advertising). An obvious solution is to release two versions of the app – one for "free" that is funded by exploiting user data, and one that is paid for but which only collects the user data necessary to operate the app. That way, users who don't want their data monetised can choose to download the paid-for "premium" version instead – in other words, they have choice.
  4. Provide privacy menu settings. It's surprising how relatively few apps offer this, but privacy settings should be built into app menus as a matter of course – for example, offering users the ability to delete app usage histories, turn off social networking integration, restrict location data use and so on (the second sketch after this list illustrates one way to back such a menu). Empowered users are happy users, and happy users mean happy regulators.
  5. Know Your Service Providers. Apps serve as a gateway to user data for a wide variety of mobile ecosystem operators – and any one of those operators might, potentially, misuse the data it accesses. Developers need to be particularly careful when integrating third party APIs into their apps, making sure that they properly understand their service providers' data practices. Failure to do proper due diligence will leave the developer exposed.
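To make the second tip a little more concrete, here is a minimal Android (Kotlin) sketch of a just-in-time location notice: the app explains, in plain language and at the moment the feature is actually used, why it wants location data before triggering the system permission prompt, and offers a genuine way to decline. The activity, method names and notice wording are all hypothetical – this is an illustration of the pattern, not a compliance recipe.

    import android.Manifest
    import android.content.pm.PackageManager
    import androidx.activity.result.contract.ActivityResultContracts
    import androidx.appcompat.app.AlertDialog
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.content.ContextCompat

    // Illustrative only: activity, method names and notice wording are hypothetical.
    class NearbyOffersActivity : AppCompatActivity() {

        // System permission prompt, registered up front; the callback records the user's choice.
        private val locationPermissionRequest =
            registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
                if (granted) startLocationFeature() else showNonLocationFallback()
            }

        // Called when the user taps a feature that genuinely needs location data -
        // the notice appears at the moment of use, not buried at first launch.
        fun onNearbyOffersTapped() {
            val alreadyGranted = ContextCompat.checkSelfPermission(
                this, Manifest.permission.ACCESS_FINE_LOCATION
            ) == PackageManager.PERMISSION_GRANTED
            if (alreadyGranted) {
                startLocationFeature()
                return
            }
            // Just-in-time notice: plain-language purpose, a statement on storage and sharing,
            // and a real "no" option, shown before the OS permission dialog.
            AlertDialog.Builder(this)
                .setTitle("Use your location?")
                .setMessage(
                    "We use your current location only to show offers near you. " +
                    "It is not stored or shared with advertisers. " +
                    "You can change this at any time in Settings > Privacy."
                )
                .setPositiveButton("Allow") { _, _ ->
                    locationPermissionRequest.launch(Manifest.permission.ACCESS_FINE_LOCATION)
                }
                .setNegativeButton("Not now") { _, _ -> showNonLocationFallback() }
                .show()
        }

        private fun startLocationFeature() { /* query nearby offers using location */ }
        private fun showNonLocationFallback() { /* e.g. ask for a postcode instead */ }
    }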
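And for the fourth tip, a sketch of the kind of object that could sit behind an in-app "Privacy" menu: per-feature toggles that default to off, plus a way for the user to delete the usage history the app has accumulated. Again, the class, preference keys and history-store interface are hypothetical, and this is only one of many ways to structure it.

    import android.content.Context
    import androidx.core.content.edit

    // Illustrative only: class, preference keys and the history-store interface are hypothetical.
    class PrivacySettings(context: Context) {

        private val prefs = context.getSharedPreferences("privacy_settings", Context.MODE_PRIVATE)

        // Each control defaults to the least data-hungry option.
        var locationEnabled: Boolean
            get() = prefs.getBoolean("location_enabled", false)
            set(value) = prefs.edit { putBoolean("location_enabled", value) }

        var socialSharingEnabled: Boolean
            get() = prefs.getBoolean("social_sharing_enabled", false)
            set(value) = prefs.edit { putBoolean("social_sharing_enabled", value) }

        var analyticsEnabled: Boolean
            get() = prefs.getBoolean("analytics_enabled", false)
            set(value) = prefs.edit { putBoolean("analytics_enabled", value) }

        // Deletes locally held usage history; a real implementation would also ask the
        // backend and any third-party SDKs to erase their copies.
        fun clearUsageHistory(store: UsageHistoryStore) {
            store.deleteAll()
        }
    }

    // Hypothetical abstraction over wherever the app records usage history.
    interface UsageHistoryStore {
        fun deleteAll()
    }

Defaulting every toggle to off is a deliberate choice here: it is the "privacy by default" counterpart to privacy by design, and means the app only collects additional data once the user has actively switched a feature on.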

Any developer will tell you that you don't build great products by designing to achieve compliance; instead, you build great products by designing a great user experience. Fortunately, in privacy, both goals are aligned. A great privacy experience is necessarily part and parcel of a great user experience, and developers need to address users' privacy needs at the earliest stages of development, through to release and beyond.

Published on the Field Fisher Waterhouse Privacy and Information Law Blog, March 2013.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.