November 1, 2012

California Pushes Mobile App Shops For Clarity on Privacy - But Will it Matter?

California’s Attorney General issued a warning to mobile developers this week: come clean about what kinds of user data you collect - or else! It was a laudable act - especially in the face of federal government indifference. But more daylight may not make users any safer.

After hammering out an agreement in February with platform providers Amazon, Apple, HP and RIM to improve privacy protections for users of mobile applications, California’s trailblazing Attorney General, Kamala Harris, was back again this week, calling out mobile application developers who were violating the state’s sweeping data privacy law.

In letters sent to scores of mobile application publishers, Harris criticized them for failing to publish privacy policies that make explicit what user information their creations collect and how it will be used. The Attorney General warned that she was prepared to take legal action to enforce the California Online Privacy Protection Act (CalOPPA), which requires commercial operators of online services to post a privacy policy in a conspicuous location so that consumers can review how their personal information will be used before installing an application. The publishers have 30 days to respond. Those that don’t could be fined $2,500 for every non-compliant application downloaded by a California resident - strong medicine in a market dominated by $0.99 applications!
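To see why that penalty is "strong medicine," a back-of-the-envelope calculation helps. The download count below is a hypothetical figure chosen for illustration; only the $2,500 penalty and the $0.99 price point come from the article:

```python
# Hypothetical exposure under CalOPPA's per-violation penalty.
# The download count is an illustrative assumption, not a real figure.
PENALTY_PER_VIOLATION = 2500  # dollars, per non-compliant download (per the AG's warning)
APP_PRICE = 0.99              # dollars, the typical app price cited above

downloads_by_ca_residents = 10_000  # assumed for illustration

max_fine = PENALTY_PER_VIOLATION * downloads_by_ca_residents
gross_revenue = APP_PRICE * downloads_by_ca_residents

print(f"Maximum fine:  ${max_fine:,}")        # Maximum fine:  $25,000,000
print(f"Gross revenue: ${gross_revenue:,.2f}")  # Gross revenue: $9,900.00
```

Even at modest download numbers, the theoretical maximum fine dwarfs an app's revenue by several orders of magnitude - which is the point of the Attorney General's threat.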

"Protecting the privacy of online consumers is a serious law enforcement matter," said the Attorney General in a published statement. "We have worked hard to ensure that app developers are aware of their legal obligations to respect the privacy of Californians, but it is critical that we take all necessary steps to enforce California’s privacy laws."

Harris’s actions come after increased media attention to online privacy. In December 2010, for example, The Wall Street Journal published the results of an extensive investigation of 101 popular mobile applications for Apple’s iPhone and Google’s Android devices, finding that a majority - 56 - transmitted the phone’s unique identifier (UDID) to other firms without the consent of the phone’s owner, while a smaller number (around 5%) sent personal information such as the owner’s age, gender and other personal details.

That report was followed by others that exposed how mobile applications and devices were harvesting reams of personal information - and even users’ locations - often without the explicit consent of the phone’s owner. In April 2011, for example, Apple was forced to acknowledge and publicly apologize for a bug that had iPhones continuing to transmit location data even when users had disabled Location Services. Stories like these lent new urgency to the enforcement of laws like California’s Online Privacy Protection Act (CalOPPA), passed in 2003, which requires explicit disclosure of data collection and privacy protection policies. In short: software publishers and platform vendors need to say what kinds of data they collect, which of it is personally identifiable, and what they do with it once it is collected.

But informing consumers is one thing, and actually protecting privacy is another, much bigger problem. It’s dangerous to mistake one for the other.

For one thing: laws or no laws, data harvesting by mobile devices, mobile applications and web-based applications is as popular as ever. Juniper Networks this week released a survey of 1.7 million mobile apps and found that both anonymous and individual data harvesting remain common. In particular, about a quarter of all free applications for the Android platform obtain user permission to track the user’s location, and around seven percent of free apps get permission to access the user’s address book. An April survey from the information security group ISACA found that close to 60% of smartphone users took advantage of location services, despite concerns that the information might be misused by advertisers.

And this doesn’t even address the problem of poorly coded or configured mobile applications that unwittingly share information. In just the last year, users of mobile applications from the professional social network LinkedIn and the social network Path found out that their personal information was being harvested without their consent. In LinkedIn’s case, security researchers showed that personal calendar entries on iPhones and iPads were being transmitted back to the company’s servers. Path was caught uploading its members’ entire address books to its servers for processing, in an attempt to expand the reach of its network.

The idea embodied by laws like CalOPPA is that more information will free consumers to make informed, smart decisions. But it just doesn’t pan out. Why? For one thing, it badly discounts the complexity of our fast-evolving mobile marketplace. It also ignores much of what we know about human psychology. In one related study on the effectiveness of browser warnings about unsafe web sites, for example, researchers at Carnegie Mellon University found that visitors frequently exhibited “dangerous” browsing behavior: clicking past warnings about insecure web connections to get to the pages they needed to access. Users, the researchers found, put more trust in the “reputation” of the site they were connecting to than in what their browser was telling them about how secure the site actually was. A similar phenomenon is likely at work when users on Apple’s App Store or Google Play blow past privacy agreements to get their copy of Bad Piggies or Instagram.

So what’s to be done? Pressure on mobile app developers has improved conditions somewhat: The Future of Privacy Forum released a report in June that found more mobile application developers issuing explicit privacy policies with their applications. But stronger measures are needed to protect the most sensitive user information. The CMU researchers concluded that actually securing users by blocking access to dangerous sites was one way to thwart unsafe behavior, and a similar approach might work for mobile application privacy. Namely: protect privacy by protecting privacy - make it illegal to collect certain kinds of user information without explicit consent, and do away with blanket permission statements, as the EU is doing. Beyond that, users should be told in real time which of their personal information will be shared as the result of a particular online action (downloading or running an application, signing up for a new service, and so on). Only with that information, presented in context, can users make informed decisions about their online behavior.


Paul Roberts is an experienced technology writer and editor who has spent the last decade covering hacking, cyber threats, and information technology security, including senior positions as a writer, editor and industry analyst. His work has appeared on NPR’s Marketplace Tech Report and in The Boston Globe and Fortune Small Business, as well as ZDNet, Computerworld, InfoWorld, eWeek, CIO and CSO. He was, yes, a guest on The Oprah Show - but that’s a long story. You can follow Paul on Twitter or visit his website, The Security Ledger.
