Privacy and security are two of the most hotly debated topics in mobile right now and it seems as though everyone has a different angle on what kind of fix is needed.
In the mobile space, and for that matter the world of technology in general, security and privacy are linked at the hip. Software makers, carriers, regulators, marketers and end users are all struggling to find a balance between the release and use of information and the potential security and privacy vulnerabilities created when that two-way street is opened between the device and the world at large.
It's a tough line to walk. Regulators have been on the warpath following some high-profile admissions from major app companies like Path, which admitted to accessing user address books without permission. And then, of course, there were the sensational revelations about the capabilities of solutions like Carrier IQ, which has long been employed by mobile operators as a way of managing network performance and customer experience.
The slow drip of information about how pieces of sophisticated technologies actually work certainly has caught Washington’s ear. Earlier this year, Rep. Henry Waxman, (D-Calif.), ranking member of the Energy and Commerce Committee, and Rep. G. K. Butterfield, (D-N.C.), ranking member of the Trade Subcommittee, sent letters to 34 separate companies with apps in Apple's App Store, asking the organizations to explain how their apps collect and use information.
Washington’s role could mean the beginning of the end for the unregulated world of mobile privacy, or it could simply be a call to bring the discussion out into the open and prompt technology companies to pay a little more attention to how their technologies are paying attention to end users.
The Conflict
The issue of privacy and security on mobile is very different from that in any other channel. The hardware in smartphones and the software that runs on today’s devices are powerful and invasive, with the potential for serious abuse. While someone might want to reveal their location via GPS to a loved one at all times, they might not want a brand advertiser to know their location.
Recent research from North Carolina State University (NCSU) revealed that including ads in mobile applications poses privacy and security risks on a level that the developers themselves might never have imagined or intended. NCSU found that half of 100,000 Android apps surveyed were connected to ad libraries. Fully 297 of those apps included what NCSU calls “aggressive” ad libraries that were enabled to download and run code from remote servers, which the authors of the report say raises “significant privacy and security concerns.”
The conflict between the end user and the app developer is perfectly illustrated by this research. While consumers might want to offer their location information up to a free photo application on their smartphone, they might not want the ad library associated with that app to have access to their whereabouts. The problem is that the app probably wouldn't even exist if it weren't for the revenue generated by in-app ads.
Xuxian Jiang, an assistant professor of computer science at NCSU and co-author of a paper describing the work, says that in many cases ad libraries receive the same permissions that the user granted to the app itself when it was installed, "regardless of whether the user was aware he or she was granting permissions to the ad library."
Jiang says that the exposed risks find their root in Android’s permissions model, wherein an app is the smallest entity that can be granted permission. Jiang contends there has to be a way to split permissions for the app and the ad libraries if users are to be truly informed about how their information is being used.
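Jiang's point about granularity can be sketched with a toy model (plain Python, not Android's real API, and the class and function names are illustrative): because an ad library runs inside the app's process, any permission check against the app succeeds for the library too, with no separate grant the user could decline.

```python
# Toy model of per-app permission granularity. In Android, permissions
# attach to the installed app as a whole, not to libraries bundled in it.

class App:
    """An installed app holding the set of permissions the user granted."""
    def __init__(self, name, granted_permissions):
        self.name = name
        self.permissions = set(granted_permissions)

def library_can_access(app, permission):
    # An ad library executes in the app's own process, so a check
    # against the app's permissions succeeds for the library too.
    return permission in app.permissions

photo_app = App("FreePhotoApp", {"ACCESS_FINE_LOCATION", "INTERNET"})

# The user granted location access to the *app*...
assert library_can_access(photo_app, "ACCESS_FINE_LOCATION")

# ...and the bundled ad library gets it for free. Splitting permissions,
# as Jiang suggests, would mean a second, library-level grant to check.
assert not library_can_access(photo_app, "READ_CONTACTS")
```

The split-permission model Jiang advocates would amount to giving the library its own, separately approved permission set instead of reusing the app's.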
Jiang’s team examined 100 representative ad libraries used by the 100,000 apps in its study. One significant finding was that 297 of the apps (1 out of every 337) used ad libraries “that made use of an unsafe mechanism to fetch and run code from the Internet – a behavior that is not necessary for their mission, yet has troubling privacy and security implications,” Jiang says, noting that these are only the most extreme cases.
Mobclix was one of the companies included in the NCSU report. Krishna Subramanian, co-founder of Mobclix, says that his company provides access to as much data as other ad providers, but he says “what's getting lost is that we do it securely.”
“The data is only accessed when the advertisement can make use of it and when the user explicitly agrees to grant the access,” Subramanian says.
Subramanian also says that while the NCSU report notes that Mobclix might offer access to sensitive data – listing accounts, reading calendars and call logs, changing calendar entries – those features are used locally on the device. “We do not collect, store, or process any of that information,” he says.
As for the report’s conclusion that there needs to be a separation between permissions granted to the app and permissions granted to the ad library, Subramanian says that’s already being tackled in conjunction with the Mobile Marketing Association (MMA). But in the end, he says, it really comes down to education.
“We have to get end users to an understanding that the goal is to really make ads become content. The closer we can get to that, the better the user experience is going to be for consumers and the better the experience is going to be for advertisers in terms of ROI,” Subramanian says.
Greg Stuart, CEO of the MMA, says that reports like the one from NCSU are an important part of the ongoing process to refine privacy and security policies, but he adds they can also be inflammatory, missing the point entirely.
Stuart says the title of the NCSU paper – “Including Ads in Mobile Apps Poses Privacy, Security Risks” – is off the mark.
“It’s not the ads that are the problem, it’s that there are people who do things that you might not be aware of, and that’s a different thing,” Stuart says. “You have bad people doing bad things sometimes. If my bank gets robbed, I don’t go blame the president of the bank. I blame the bad guys who robbed it.”
Stuart says it’s actually pretty good news that the study found only 1 out of every 337 apps using ad libraries made use of an “unsafe mechanism” to fetch and run code from the Internet.
“It wasn’t even that they did anything wrong,” Stuart says, “but that they just made use of an unsafe mechanism. Think of this, there’s a one in 200 chance of being a victim of identity theft. Just by waking up in the morning, you’re more at risk of having your identity stolen than running one of these apps.” And even then, he says, you’re probably not in danger.
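Stuart's comparison is easy to verify with back-of-envelope arithmetic on the figures quoted above:

```python
# Sanity-check of the rates cited in the NCSU study and by Stuart.
aggressive_apps = 297        # apps using "aggressive" ad libraries
apps_surveyed = 100_000      # Android apps in the NCSU sample

rate_apps = aggressive_apps / apps_surveyed       # share of surveyed apps
one_in = round(apps_surveyed / aggressive_apps)   # "1 out of every N apps"

identity_theft_rate = 1 / 200                     # figure Stuart cites

assert one_in == 337                  # matches the report's "1 in 337"
assert rate_apps < identity_theft_rate  # ~0.3% vs. 0.5%
```

So on these numbers the per-app rate of aggressive ad libraries (~0.3 percent) does come in below the 1-in-200 identity-theft figure Stuart invokes.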
The Disconnect and The Culture of Free
Regardless of how it’s being used, the trove of personal information on a user’s smartphone really is being accessed by various entities. In the overwhelming majority of cases, though, the problem stems more from a lack of understanding on the consumer’s part than from malicious intent on the part of developers or advertisers.
Mike King, mobile strategist for Appcelerator, thinks most developers want to do the right thing, that their intentions really are simply to better the user experience. "They want to protect their users' data and at the same time they'd like to figure out how they can use things like in-app advertising and the mining of information that the app produces for more targeted advertising," King says.
Part of the problem is the culture of “free,” birthed by the open Internet, which has since carried over to mobile, especially when it comes to software. "I think there is that expectation that software and code will be free, but in the end it really isn't. There's developer time and resources that go into building one of these apps and getting it in the app store. So I think that there is a decent disconnect there," King says.
Perhaps it's the newness of mobile technologies and the extent of the information that end users keep on their phones that makes privacy such a touchy issue. And yet there's precedent for consumers handing over extremely personal information in other areas of their lives, as long as there’s perceived value. King points to the loyalty card as one example. "I'm perfectly willing for Safeway to have a 10-year record of everything I've purchased at their store for maybe 4 percent off my grocery bill," he says. "There's not a lot more private than what I eat and drink."
But it’s not just the way end users perceive value in the app store that signals trouble; in many cases people don’t even know what they’re agreeing to when they hit the “Accept” button at the bottom of their screens.
King references Google's change in privacy policies as a good example of how to clearly communicate with users about a policy change, and yet he questions how effective it was because users didn't heed the call.
"Google did an excellent job at letting end users know that they were going to change their privacy policies before they changed them, and letting them know that the privacy policies were changed and then alerting them to the fact that this stuff is important and you should read it," King says. “And in doing that, they still didn't get a lot of people to actually read the entire Terms and Conditions."
So whose responsibility is it to foster clear guidelines around privacy and security? King says that's still being worked out but suggests that while regulation is important, it will be the big players – Apple, Google, Facebook – that will inevitably have to take the lead. At the moment, however, he says there's little in the way of a path for smaller entities to follow.
"There's some best practices and things like that but at no point in time has anybody come out and said, 'This is what's acceptable and not acceptable,'" he says.
Regulators right now are stoking the fire with a David and Goliath storyline that casts the watchful government as David against corporate Goliaths like Apple, Google and Facebook, but in the end, the posturing doesn't really solve anything, King says.
"I guess my biggest fear is that we see regulation that doesn't take into account the technology and what it's capable of," he says. "We saw this with HIPAA, with all kinds of other legislation, that quite frankly the technology to enforce doesn't exist."
Is Regulation the Answer?
Whether the industry likes it or not, Washington is pursuing the mobile privacy and security cause. The question is whether all the inquiries and hand-wringing will lead to a solution, a compromise or just more fear.
Stuart of the MMA says the issue of digital privacy isn’t new. Washington has a track record of jumping on an issue to gauge its potential as a political platform and then backing off if it doesn’t pan out, which he says could be the case with the latest hubbub.
“I would like to ask Washington to let the MMA do our damnedest to self-regulate the industry,” Stuart says, “because we’re going to be much smarter about it, and we’re much more sensitive to the negatives in the long term than they are.”
Stuart says it’s entirely possible Washington could end up doing more damage than good. “You could over-regulate and crush a fledgling industry by doing the wrong thing if you don’t know what you’re doing,” he says. “Mobile advertising is very likely, very rapidly to get to $30 to $50 billion and more.”
Kevin McGuire, vice president of product for Motricity, says the current administration has made it very clear that there is a role for governments to play at the state and local level in terms of consumer protection. But he agrees that over-reaching laws would not do justice to the subtleties of mobile privacy.
Illustrating the huge spectrum of views on privacy, McGuire cites a comment made recently by marketing consultant Faith Popcorn, who said: “Privacy is dead, get over it.”
“We thought that was really an interesting comment on where we are as an industry, where some people might be totally fine with that, while others are not OK with this idea that your life is an open book and privacy is a dead concept,” McGuire says. “Where do you draw the line?”
To be sure, any legislation could have major effects on individual companies. Motricity has a policy of never pushing past the boundaries of what’s being legislated or what it believes will be legislated. The company is of the mindset that if the current administration is re-elected, the industry will see even more consumer protection legislation coming down.
“It could literally change companies overnight. It could put companies out of business,” McGuire says.
So Whose Job Is It?
In the end, it could be the large entities – Apple, Google, Amazon – that are in the best position to facilitate responsible privacy and security policies. Michael Sutton, vice president of security research for Zscaler, contends that the "app store gatekeepers" really have not been doing enough to address protection of the end user.
"If you look at things like the Path fiasco... I think that really illustrates this very well," Sutton says. "People freaked out that Path was grabbing the address book and then a couple days later, very quietly other applications like FourSquare and Yelp!, they magically updated their apps as well. Because the dirty little secret was that everybody was doing it."
Sutton says Apple's response to Congress's inquiry on the matter was inadequate, if not disingenuous. "They said, 'Hey, we tell people not to do that.' That's a very weak answer, because they're in a position to prevent that. They provide the SDK. And now they're going to [solve the problem], but that should have been done in the first place."
Sutton says it looks like Apple is not reviewing software for security and privacy reasons but more for the end-user experience. "Something like Path stealing your address book, that would have been very easy for Apple to find and not bless that application."
Google recently launched its Google Bouncer initiative, which is supposed to weed out bad apps from Google Play before they make it onto a device. Sutton praised it as a good first step, if an unproven one.
Never a fan of regulators making the rules about technology they don’t understand, Sutton says that Washington’s current role could be effective in prompting other companies to implement more technologies like Bouncer. “Maybe they can carry the big stick and say, 'Hey guys, if you don't do something, we will."
It’s probably no shocker that the industry would rather police itself, nor is it surprising to see companies like Apple, Google, Facebook and others signing on to agreements like the one designed to bring the industry in line with a California law requiring mobile apps that collect personal information to have a privacy policy. Such agreements signal an intention to do the right thing without drawing up legally binding rules that could stifle innovation.
The Federal Trade Commission’s (FTC) recently updated report, “Protecting Consumer Privacy in an Era of Rapid Change,” commented on the “slow pace of self-regulation.”
Daniel Castro, senior analyst with the Information Technology and Innovation Foundation (ITIF) who specializes in IT policy, writes in a recent blog post that it is “unclear what metric (if any) [the FTC] used to come to this conclusion,” arguing that the FTC deems the process slow only because it has yielded a different set of rules than the FTC proposed.
Castro notes that one of the benefits of self-regulation is that it typically can move much faster than the government regulatory process.
“Although some privacy advocates claim that self-regulation of online privacy is moving too slowly right now, it is worth noting that the FTC took more than a year just to update its draft privacy policy framework,” Castro writes, adding that during this same time period, the Digital Advertising Alliance (DAA) launched its self-regulatory program for online behavioral advertising and committed to preventing the use of consumer data for secondary purposes like credit and employment decisions.
It’s commentary like Castro’s that keeps the discourse healthy and balanced. Still, the FTC’s finding that not enough is being done to protect consumers is a clear comment on where at least one regulatory agency thinks things currently stand.
Hope for Self-Regulation
There are signs of hope everywhere that the industry will do as it has done in the past and aspire to a healthy culture of self-regulation through new technologies and standards bodies.
Just recently Velti, a provider of mobile marketing and advertising technology, announced the Open Device Identification Number (ODIN) Working Group, a consortium focused on developing a secure, anonymous device identifier for the mobile advertising industry as an alternative to the unique device identifier (UDID).
And Opera Software launched App-Tribute via its advertising subsidiaries AdMarvel, Mobile Theory and 4th Screen Advertising. The new offering aims to give mobile publishers and advertisers marketing and analytics data without collecting sensitive data elements such as UDIDs, cookies or MAC addresses.
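Neither ODIN's nor App-Tribute's internals are spelled out here, but the general technique such anonymized-identifier schemes rely on (replacing a raw hardware identifier with a one-way hash, so ad servers never see the UDID or MAC address itself) can be sketched as follows; the function name and salt are illustrative, not part of either specification:

```python
import hashlib

def anonymized_device_id(hardware_id: str, salt: str = "example-salt") -> str:
    """Return a one-way SHA-256 digest of a hardware identifier.

    The raw UDID/MAC never leaves the device; remote ad servers only
    ever see the opaque hex digest.
    """
    return hashlib.sha256((salt + hardware_id).encode("utf-8")).hexdigest()

# The same device always maps to the same opaque token, so frequency
# capping and analytics still work...
a = anonymized_device_id("00:11:22:33:44:55")
b = anonymized_device_id("00:11:22:33:44:55")
assert a == b

# ...but the token cannot be trivially reversed to the hardware ID,
# and a different salt yields an unlinkable identifier per party.
c = anonymized_device_id("00:11:22:33:44:55", salt="other-party")
assert a != c
```

The per-party salt is the design choice that matters: it keeps two ad networks from joining their logs on a shared identifier, which is precisely the cross-tracking concern raised about UDIDs.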
Mahi de Silva, executive vice president of consumer mobile at Opera Software, says the company felt as though it could build a more trusted third-party platform and create transparency between publishers and advertisers and in doing so remove a lot of the friction and arbitrage that happens in that marketplace.
"We're covering new ground here because from a policy-making perspective, unlike the privacy drums that were beaten around the Internet in the late ’90s, it was really more of a Department of Commerce view. But when it comes to mobile, you also have the FCC getting involved in this," de Silva says, noting that right now the various powers that be – the Department of Commerce, the FTC and the FCC – are all trying to figure out what, if any, role they have to play in the privacy debate.
De Silva says one of the biggest issues is there are too many impressions in mobile and not an equivalent number of campaigns to soak up those impressions. "So you've got this oversupply problem and publishers are struggling to make money in these free applications. When you're a start-up company and you're low on funding and some ad network says, ‘I want to buy out your inventory at a $12 CPM,’ the first question that gets asked is not 'Are you compliant to my terms of use and privacy policy?’"
Still, he's optimistic that as the industry matures, the trend will lean toward technological solutions that have data that's all about consumer engagement and conversion and less about running scams where end users are unwittingly charged $10 for an SMS subscription to something they didn't want in the first place.
"It's going to take some time to flush those types of players out of the ecosystem," de Silva says.
But it's a rare challenge in the mobile world that can be overcome by one company or solution, and de Silva says it's also going to take a little old-fashioned teamwork to ensure things are done right going forward.
"Our hope is that with solutions like the one we just launched and through collaboration with different parties in the ecosystem, we can stand up as an industry and say: This is how we've addressed this issue and here's how we're solving it.”


