Protect your address book note fields from prying eyes. iPrivacy enables you to encrypt and decrypt the note fields of your contacts using 128-bit encryption. This allows you to keep sensitive information such as credit card numbers, bank account numbers, etc. in your iPhone's contacts database, eliminating the need for a separate application to store secure information.

  • Simple and clean interface
  • Based on Apple's built-in encryption technology
  • Ability to encrypt and decrypt only selected contact groups

The Story of iPrivacy

Those of you who have visited our site in the past will know that iPrivacy has been listed as "coming soon" for quite a while. You might wonder what could take so long to complete the development of a simple utility application. It turns out that iPrivacy was completed a year ago. iPrivacy was submitted to Apple for review on 9/12/2008. Yes, that's 2008.

Needless to say, iPrivacy contains no controversial content, does not use Apple UI elements, does not conflict with wireless carriers' interests, etc. However, after keeping it in review for nearly a year (and ignoring multiple status inquiries), Apple chose to reject it. Why? The rejection was on the following grounds, quoting from Apple's rejection email:

"If a user encrypts the data and forgets the set password,
the information remains encrypted and is not recoverable.
It would be advisable to have a data recovery mechanism to
ensure user data is not permanently lost.  This review was
conducted on iPhone 3G running iPhone OS 3.0 Beta 5.

"If you believe that you can make the necessary changes so
that iPrivacy does not violate the iPhone SDK Agreement,
we encourage you to do so and resubmit it for review."

Contrary to the claim above, the iPhone SDK Agreement makes no stipulation that encrypted data must be recoverable if the user forgets the password. More importantly, however, having a means of recovering the encrypted information without the password completely defeats the purpose of symmetric encryption. What is the point of encrypting information if it can be recovered or bypassed when the password is forgotten (or not known)?
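The all-or-nothing property Apple objected to is inherent to any password-based symmetric scheme, not a design flaw. The toy sketch below (a throwaway XOR stream cipher in Python, *not* the CommonCrypto AES-128 that iPrivacy actually uses; all names here are illustrative) shows why: the only path from ciphertext back to plaintext is the exact password, so any "recovery mechanism" would by definition be a back door.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Pseudorandom keystream from hashing key || counter.
    # Toy construction for illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def crypt(password: str, salt: bytes, data: bytes) -> bytes:
    # Symmetric: the same call encrypts and decrypts (XOR is its own inverse).
    # The key exists only as a function of the password -- nothing is escrowed.
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

salt = b"fixed-demo-salt"
secret = b"Visa 4111 1111 1111 1111"
ciphertext = crypt("correct horse", salt, secret)

# Only the exact password recovers the plaintext; a wrong guess yields garbage.
assert crypt("correct horse", salt, ciphertext) == secret
assert crypt("wrong guess", salt, ciphertext) != secret
```

Forget the password and the plaintext is gone: that is the security guarantee, not a bug to be "fixed" with a recovery mechanism.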

This rejection is unfortunately yet another instance of the seemingly nonsensical secret criteria that the Apple review team apply to iPhone apps.  It boggles the mind that someone working at Apple would suggest that an encryption app needs to allow access to the encrypted data if the password is forgotten. One wonders whether Apple's own Mac OS X FileVault has such a back door...

Making matters worse, there is no process for iPhone developers to appeal or escalate such nonsensical rejections, so developers generally have to do as Apple says. In this particular case, however, that is not an option, since complying would completely defeat the purpose of the app.

Many end users on the web feel that the stories of App Store rejections are nothing more than developers being sore losers for having submitted applications that don't meet the iPhone SDK guidelines. Well, iPrivacy is unequivocal proof that such is not always the case. iPrivacy meets every documented iPhone SDK guideline. It contains no controversial content. It does not use up any network bandwidth or bypass cellular carrier charges. Its rejection is random. It is arbitrary. It is unreasonable. Apple treats iPhone developers with the utmost disrespect and contempt. Unfortunately, it's as simple as that.

The Apple product experience is the envy of every tech company. The iPhone SDK is a joy to use. Unfortunately, Apple's relationship with individual iPhone developers is the diametric opposite. It's horrible.

Have we given up? No. We believe that Apple's rejection is unfounded, unreasonable, and unfair. Therefore, we will continue to submit iPrivacy for review until Apple deems it fit to establish a constructive dialog or chooses to approve it as it should have nearly a year ago.

So where does iPrivacy stand? It is still in App Store review purgatory...


One reader commented that the rejection was probably not random but due to Apple not wanting strong encryption applications on the App Store, perhaps because of some US law. I concur that the rejection is probably not random in the strictest sense of the word. Apple obviously must have its reasons. However, I can't fathom what they may be, and they are certainly NOT due to some US law.

The US encryption technology export control law does not prevent Apple from distributing 128-bit encryption applications. In fact, the iPhone has strong encryption frameworks built in and there are already many applications on the App Store offering strong encryption.

US law only requires that people and companies obtain permission from the US government to export proprietary encryption technology. In the case of publicly available (e.g. open source) encryption algorithms, what is required is to submit a notice to the relevant US government authority with the name, description, etc. of the product and the publicly available algorithm used. iPrivacy uses Apple's open source CommonCrypto framework to implement its AES-128 encryption. Therefore, we submitted the relevant notice to the US government, so iPrivacy is in the clear from a US legal and export control perspective.

In short, Apple has already approved many strong encryption applications, and iPrivacy is already compliant with US export control requirements. Therefore, the rejection is neither meant to keep strong encryption applications off the iPhone platform nor to satisfy some US legal requirement. In this sense, the rejection is "random" because I can see no logical, business, or legal reason for Apple to reject iPrivacy, and Apple hasn't been forthcoming with the real reason for the rejection. The given reason is a smoke screen. This is easily deduced from the fact that the aforementioned objection, "there is no way to recover encrypted data without a password," is already violated by the numerous encryption applications on the App Store. That is, of course, unless all such applications have undocumented back doors. In that case, the iPhone user community should be extremely worried, because it would mean that anyone could access their encrypted data if such undocumented back doors exist. That would constitute a far bigger and more serious issue than the random rejection of a single application...