Steven Roosa is a Partner in our New York office

On January 10, 2013, the California AG released "best practices" for the mobile app "ecosystem" entitled Privacy on the Go.  Unfortunately, the area is already flooded with privacy best practices.  As a result, if another entity is going to throw its hat into the ring, the proposed best practices had better be exceptional.  Well, no worries, these aren't.  That's not to say that there aren't a couple of exceptionally good points, because there are.  The larger problem is that there are too many "best practices" of only middling quality—including the California recommendations—while what the industry really needs is a single gold standard, which has yet to emerge.  In this post, we will take a hard look at some of the strong, and not-so-strong, points raised in Privacy on the Go.

The California AG's Strong Points — App Testing and Knowing Your Third Parties

Where Privacy on the Go really hits on all cylinders is its emphasis on practical steps for understanding the privacy properties of one's apps.  The number one compliance and exposure problem—across the board and irrespective of the size of the developer or publisher—is that many companies fail to grasp, at a technical level, what types of information their apps collect, store, and share.  The resulting legal and compliance problem, of course, is that you can't disclose what you don't know.  To close this gap, Privacy on the Go urges companies to take the practical step of "testing" their apps for privacy and also investigating the third parties implicated by their apps' network traffic.  This is sound advice.

In terms of a privacy strategy, technical privacy testing and follow-up investigation of third parties are critical to reducing the risk of regulatory inquiries and class action litigation.  Why?  The simple reason is that it is exceedingly rare for an app to be the subject of an FTC or state inquiry where the pre-download privacy policy accurately discloses the privacy characteristics of the app.  As for class action litigation, it's just not that compelling when a plaintiff has to plead that "notwithstanding the defendant's full and accurate disclosure of the information collection and sharing practices of the app, the plaintiffs still assert violations of the Computer Fraud and Abuse Act and the Electronic Communications Privacy Act."  Knowing an app's technical privacy properties empowers companies to make better disclosures.  Accurate disclosures, in turn, empower end-users to download and use apps with actual notice of what the app does behind the scenes.

The California AG's Not-So-Strong Points

(i) Failure to Appreciate the Developer/Publisher Distinction

Privacy on the Go includes a "decision tree" in which app developers begin with planning for privacy before building the app.  Sound familiar?  It should.  This is the FTC's "Privacy by Design" concept warmed over lightly and served with toast.  The FTC has already announced that principle and explained it to industry more clearly and in better detail.  [See Privacy Framework]  If someone wanted to amplify the FTC's theme, the way to do that would be to address the publisher-developer dynamic and dissect the particular roles of each set of entities in planning for privacy.  No such luck here.  The generic advice offered by the California AG on this point is fine as far as it goes, but it doesn't go very far, nor does it offer anything that hasn't already been discussed and advanced by others.

(ii) The California AG Muddies the Waters On Unique Identifiers and Misses an Opportunity to Leverage Privacy Enhancing Technology Developed by Apple

Privacy on the Go is vague as to critical key terms.  For example, it deems "unique device identifiers" to be "personally identifiable data."  Okay, so does that mean hardware device identifiers as well as identifiers set by software are personally identifiable data, or only the former?  This question couldn't be more important, since Apple's iOS, the leading platform for apps in the United States, has introduced the "Identifier for Advertisers," or IFA, which is a unique identifier set by software.  Indeed, Apple created the IFA specifically to alleviate privacy issues (identified in other privacy best practices, ironically) associated with the now deprecated, but still used, UDID.  Privacy advocates hailed the creation of the IFA as a major advancement.  The IFA, however, does not garner so much as a single reference from the California AG, who seems oblivious to its existence.  The California AG's oversight is not without consequences.  Since the IFA is not specifically carved out of the category of "personally identifiable data," and because the IFA is "unique" but not app-specific, the IFA seems to qualify as personally identifiable data under Privacy on the Go.

This is another reason why the California AG's recommendations are acutely unhelpful.  Not only are they ambiguous, but they fail to account for recent privacy innovations by the leading platform, namely iOS.  Privacy-enhancing technologies should be encouraged, recognized, and leveraged, not ignored.

(iii) For the California AG, All Data is Personally Identifiable Data

This leads to another question.  What, exactly, isn't "personally identifiable data" under the California AG recommendations?  Consider her list:

  • Unique device identifier
  • Geo-location (GPS, WiFi, user-entered)
  • Mobile phone number
  • Email address
  • User's name
  • Text messages or email
  • Call logs
  • Contacts/address book
  • Financial and payment information
  • Health and medical information
  • Photos or videos
  • Web browsing history
  • Apps downloaded or used

The last item, "Apps downloaded or used," counts as personally identifiable data?  Really?  Using a mobile device means using apps.  So what has really happened in Privacy on the Go is that the California AG has effectively elevated all data to the level of "personally identifiable data."  Even in the desktop/laptop arena, other items on the Privacy on the Go list, such as "web browsing history" and "health information," have never been considered "personally identifiable data" in the absence of identifiers or specific information that would tie that data to a specific individual.

At bottom, this is the main problem with Privacy on the Go: it sweeps too much data within its best practices.  If everything is personally identifiable data, then the resulting disclosures that follow those best practices will be of little use to end-users.


The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.