Privacy from a Developer's Perspective

Micah Jaffe

Micah Jaffe is the Engineering Lead at Hattery, working on iOS and Android development, with one published app on iTunes. After 15 years as a developer in Silicon Valley -- at Stanford, Yahoo, and many startups -- he has a special interest in and appreciation for the legal and ethical issues that developers must navigate. Follow him on Twitter @zeade.

Data is at the core of mobile technology offered by startups across the United States. The popularity of smartphones, tablets, and other connected devices has led to an explosion of data consumption and generation by consumers. Pervasive mobile technologies -- paired with new businesses, social networks, and applications -- have created opportunities for innovators to grow a vibrant market of applications. Aggregate data from Google and Apple show 40 billion downloads of available mobile apps, according to a March post by Flurry and a May post by The Verge.

Reviews, geolocation, status updates, and a host of other information create a base upon which thriving startups provide exciting and unforeseen services to consumers. Without access to this data, many companies growing the national economy would not have the opportunity to develop new products or enhance services for their customers.

It is in this context that lawmakers in the U.S. and around the world are considering new rules and regulations for consumer privacy protection. The Federal Trade Commission released its final report on consumer privacy in March, including recommendations for businesses and policy makers. The FTC also announced a workshop on mobile privacy to be held May 30, 2012.

Despite efforts in Washington, the requirements and responsibilities for app developers remain unclear. There seems to be a broad, if unofficial, consensus that the app maker should be accountable for consumer privacy -- but there isn’t a roadmap for developers to navigate this challenging legal landscape.

Last week I attended the App Developer Privacy Summit, hosted by the Future of Privacy Forum. The event’s purpose was to engage mobile app developers on present and emerging privacy regulation on the use of consumer data in apps. As a developer, this seemed like a rare opportunity to have some of my questions about privacy answered and to participate in the process of policy development. However, I was disappointed with the lack of clarity provided to developers on how to implement sound privacy practices.

What we need is a new perspective on privacy. Often, when we say “privacy,” we really mean “trust of personal information.” A privacy policy is about creating trust, and when a user feels that trust has been broken, that’s when strong measures like litigation come into play. App makers must be vigilant -- and government should legislate accordingly -- to protect and secure personal information online.

The fact is, there are very few practical tools for achieving full compliance with the demands for consumer privacy, especially for startups. Small startups are feeling the most pressure, as the financial and opportunity costs of understanding and complying with policy grow larger and the fear of litigation mounts. To prevent a chilling effect on innovation in the mobile app space, there needs to be a transparent process that clearly dictates the following:

  • To policy makers: what compliance looks like.
  • To developers: how to be transparent about the spirit of what you plan to do with the consumer data you collect.
  • To users: clear expectations of what specific types of information will be used for, regardless of context, in order to “future-proof” the process.
  • To enforcers: when to enforce, based on what contravenes “safety” in this space.

These are the questions that should be addressed in state and federal legislation. Too much regulation, poorly conceived regulation, or ill-informed enforcement must be avoided. Private sector solutions may prove to be the best way forward.

Clear privacy policies are a good start -- like the concise policies created by generators such as iubenda. Clear communication of the spirit of the policy, as it applies to the app, is also important. For example, it’s expected that an address book app would in fact read your address book, but taken out of context, that behavior seems much more sinister -- as was the case with Path.

By altering our approaches to these types of data, policy makers and app developers can move the privacy debate into new territory and take steps to create an environment where startups will continue to thrive. I’m hopeful that government and startups will take the right steps together toward security, privacy, and openness by developing a deeper mutual understanding of data.