FTC Report Says App Developers Need Privacy Policies

Take a look at some interesting data from the Federal Trade Commission on the need for app developers to provide their users with privacy policies:

A June 2012 study of 150 of the most popular app developers across three leading platforms – Apple’s iTunes app store, Google’s Play app store, and Amazon’s Kindle Fire app store – reveals how much more work needs to take place. See Future of Privacy Forum, FPF Mobile Apps Study (June 2012). For example, the study found that only 28% of paid apps and 48% of free apps available in Apple’s iTunes app store included a privacy policy or link to a privacy policy on the app promotion page.

The top apps in Google’s Play store fared even worse. There, only 12% of paid apps and 20% of free apps examined provided access to a privacy policy through the app store. The Commission staff’s kids app reports reached similar conclusions, noting the paucity of information provided to parents before they or their children downloaded popular children’s apps. See FTC Staff, Mobile Apps for Kids: Current Privacy Disclosures are Disappointing, supra note 28, at 1; FTC Staff, Mobile Apps for Kids: Disclosures Still Not Making the Grade, supra note 33, at 4-6. Oddly enough, free privacy policies are available online that comply with all the rules from the FTC, Google Play, Apple and others.

To address this problem, the California Attorney General recently sent warning letters to 100 app developers notifying them that they are not in compliance with California law, which requires the posting of a privacy policy. The developers were given thirty days to conspicuously post a privacy policy within their app that informs users of what personally identifiable information about them is being collected and what will be done with that private information. See Press Release, Office of the Attorney General of California, Attorney General Kamala D. Harris Notifies Mobile App Developers of Non-Compliance with California Privacy Law (Oct. 30, 2012). In addition, the California AG has sued Delta Airlines, one of the recipients of the warning letter.

So if you want to invite problems with California, the FTC, Google Play, Apple, and your current and potential customers, don’t worry about providing a privacy policy. For the rest of you app developers who want to stay in business for a while, contact freeprivacypolicy.com for a free, zero-obligation privacy policy.

Most of the information in this article comes from the Federal Trade Commission’s 2013 report on privacy policies, which you can read here.

Mobile App Developers Need Privacy Policies Too

For most app developers, privacy policies are an afterthought in the mobile app development process.

App developers usually end up creating it after the app’s design and development are done. This legal safeguarding may seem like a last-minute addition that doesn’t merit much thought, but it may be the most important component of your entire business.

Privacy policies are not all alike, and there are numerous ways that a missing clause or a mismatch between your legal documents and your app itself can cause catastrophic problems. Quite a few ubiquitous and successful mobile apps have run into massive legal headaches and astronomical fines due to flaws in their privacy policies and a failure to integrate and unify their legal protection with the “private parts” of their app architecture.

In 2013, the social app Path was fined $800,000 by the FTC (Federal Trade Commission) for privacy violations. The penalty stemmed from two critical mistakes made by the app:

  1. Storing third-party names and numbers from their users’ address books without proper disclosure;
  2. Failing to comply with the provisions of COPPA (the Children’s Online Privacy Protection Act), a law that applies to every app that knowingly collects personal information from children under 13.

This means that if you extract phone contacts from your users, not only must you notify them, you must also explain in the app’s privacy policy how and why the information is used. If you collect users’ birth dates, you can likely figure out whether children are using your app and act accordingly. You essentially have two legal avenues: comply with COPPA or make sure users represent that they’re over 13 years old.
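The birth-date age gate described above is simple enough to sketch in a few lines. This is a minimal illustration (not legal advice) of how an app might compute a user’s age from a supplied birth date and refuse to collect personal information from users under 13; the function names are hypothetical.

```python
from datetime import date

COPPA_MIN_AGE = 13  # COPPA concerns children under 13

def age_on(birth_date: date, today: date) -> int:
    """Return completed years of age as of `today`."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def may_collect_personal_info(birth_date: date, today: date = None) -> bool:
    """Gate data collection: True only if the user is 13 or older."""
    today = today or date.today()
    return age_on(birth_date, today) >= COPPA_MIN_AGE

# A user born June 1, 2001 is still 12 on May 31, 2014, but 13 the next day.
assert not may_collect_personal_info(date(2001, 6, 1), date(2014, 5, 31))
assert may_collect_personal_info(date(2001, 6, 1), date(2014, 6, 1))
```

If the gate returns `False`, the app should either stop collecting personal information from that user or switch into a COPPA-compliant flow (verifiable parental consent).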

But there’s more. The FTC published a long document with recommendations for app developers, including platform-specific advice for big platforms like Android and iOS. The FTC wants app developers to use a (relatively) new approach called Privacy by Design: companies should build in privacy at every stage of developing their products. This means a number of things:

  • Before building an app or a feature, think of the privacy implications;
  • If you collect information, protect it. Follow the security recommendations of the FTC (with special attention to the third-party software you use) and be careful not to over-promise or make generic reassuring statements;
  • Keep your policy updated! Every time you roll out a new update to the app store, stop for a second and consider whether you’ve added something that affects your privacy statements. Added a new analytics script? It should go in there. Added “find friends via Facebook”? Go and edit your privacy policy.

There are known best practices (some of them coming from the California Attorney General) that give you some legal protection and prevent problems, privacy breaches, and lawsuits. But here is what the FTC actually says developers should do.

You must have a privacy policy and it must be accessible from the app store.

The simple way to accomplish this is to link the policy when you submit the app. But this means the privacy policy should live on your website. And although keeping the site that hosts your privacy policy free from hackers is another article altogether, PCI Compliant Vulnerability Scanning can help you with that. You could also provide the full text of the policy within the app, or a short statement describing the app’s privacy practices. Need a privacy policy from scratch? There are many online options, including Professional Privacy Policy.com.

You should provide “just-in-time” disclosures and obtain affirmative express consent when collecting sensitive information outside the platform’s API. You already know that iOS pops up a notification that a certain app is requesting access to the user’s location or other private data. In that case, the disclosure and the consent are taken care of by Apple. But your app may well collect other sensitive data, and a pop-up notification is the best way to make sure users know. The FTC names financial, health, and children’s data, but also a generic “sharing sensitive data with third parties,” as sensitive private information, so it’s best to err on the side of caution.

Know the legal implications of the code you’re using.

It’s normal for app developers to use third-party packages, but you should make sure this code is secure and fully understand exactly what information it pulls, because you’re ultimately legally responsible for it. There’s a long list of questions to ask yourself, including:

  • Does this library or SDK have known security vulnerabilities?
  • Has it been tested in real-world settings?
  • Have other developers reported problems?
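Part of that vetting can be automated. Below is a minimal sketch of a dependency audit: it checks the exact versions an app pins against a list of known-bad releases. The advisory data here is entirely hypothetical; in practice you would pull advisories from a real vulnerability feed or run a dedicated audit tool in your build pipeline.

```python
# Hypothetical advisories: (package, version) -> description of the problem.
KNOWN_BAD = {
    ("examplesdk", "1.2.0"): "leaks address-book data to third parties",
    ("oldcrypto", "0.9.1"): "broken TLS certificate validation",
}

def audit(dependencies: dict) -> list:
    """Return a human-readable finding for each known-vulnerable pin."""
    findings = []
    for name, version in dependencies.items():
        issue = KNOWN_BAD.get((name, version))
        if issue:
            findings.append(f"{name}=={version}: {issue}")
    return findings

deps = {"examplesdk": "1.2.0", "goodlib": "2.0.0"}
for finding in audit(deps):
    print("VULNERABLE:", finding)
```

Running a check like this on every release keeps the question “does this SDK have known vulnerabilities?” from depending on someone remembering to ask it.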

While Path’s $800K fine was in connection with COPPA violations, it marks the start of broader policing of privacy practices, even against non-American developers. If you cater to the American mobile market, you can still be fined by U.S. authorities. It’s time for app developers to get a properly written, constantly curated privacy policy. The FTC is encouraging the adoption of public standards and suggests tighter integration among app developers, trade associations, ad networks, and mobile platforms, so this is definitely a topic to keep on your radar. You wouldn’t want a legal problem to cripple your app right as it’s starting to soar.

Special Thanks to Veronica Picciafuoco and her article on SitePoint.com.

FBI Asks Apple to Create Hacker-Friendly Software

By now you have heard about the potentially dangerous security issues that could arise should Apple do as the FBI has requested and build new software – a backdoor into the iPhone – specifically designed to break the encryption system that protects the personal information of every iPhone user.

In his statement to a congressional committee today, Bruce Sewell, Apple’s chief lawyer, said that “the FBI is asking Apple to weaken the security of our products. Hackers and cyber criminals could use this to wreak havoc on our privacy and personal safety. It would set a dangerous precedent for government intrusion on the privacy and safety of its citizens.” In essence, if Apple creates this software, our private information would be vulnerable to the government if we deserve it and to hackers if we don’t. To the iPhone user, having Apple create the software is a lose-lose situation. Aren’t hackers doing enough damage online? It’s hard enough for business owners to scan their sites for vulnerabilities that might be accessible to hackers. Such scans, now required to achieve Payment Card Industry (PCI) compliance, help ensure our security as consumers as well as the safety of the business owner’s proprietary content. If Apple creates the requested software, no one will be safe from the possibility of having their phone hacked.

When this all started, the FBI argued that all it wanted was access to one little iPhone – albeit an important iPhone, as it belonged to a terrorist. But even if that was the case then, it isn’t the case now. Sewell reminded people of this in his opening statement, saying that “building that software tool would not affect just one iPhone. It would weaken the security for all of them.” He continued, “the US government has spent tens of millions of dollars through the Open Technology Fund and other US government programs to fund strong encryption. The Review Group on Intelligence and Communications Technology, convened by President Obama, urged the US government to fully support and not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software.” Encryption is necessary. App developers and app users alike welcome it as our last-ditch effort to keep our privacy and security safe. Sewell says that Apple has “been using it in our products for over a decade. As attacks on our customers’ data become increasingly sophisticated, the tools we use to defend against them must get stronger too. Weakening encryption will only hurt consumers and other well-meaning users who rely on companies like Apple to protect their personal information.”

Forcing Apple to create this software could undermine the freedoms and liberties we hold so dear and make us even more vulnerable to thieves and terrorists. Mandating backdoor encryption software is a very bad idea. It would just give hackers one more income stream and give the government even more access into our personal lives.

Read more here.