iOS Keychain Security


Fraunhofer’s study on iOS keychain security:

From what I can tell, there are two levels of encryption that the iOS keychain uses. The first level uses the lock screen passcode as the encryption key. The second level uses a key generated by and stored on the device.

Fraunhofer’s researchers have figured out how to get around the second level. This is the “easier” level to defeat, since the encryption key is stored on the device. So on iOS 4, their method only works with keychain entries which do NOT use kSecAttrAccessibleWhenUnlocked or kSecAttrAccessibleWhenUnlockedThisDeviceOnly, because those entries reside in memory with the first level already decrypted, even when the phone is locked.

  • Starting from iOS 4, keys with kSecAttrAccessibleWhenUnlocked and kSecAttrAccessibleWhenUnlockedThisDeviceOnly are protected by an extra level of encryption
  • On iOS 3.x and earlier, all keys can be decrypted using Fraunhofer’s method, regardless of accessibility attribute used
  • Devices with no passcodes at all will still be vulnerable
  • Devices with weak passcodes (less than six digits) will still be somewhat vulnerable
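For reference, opting in to the stronger protection described above means setting one of those accessibility attributes when the item is written. A minimal sketch using the Keychain Services API (iOS/macOS only; the service and account strings here are placeholders, not anything from the study):

```swift
import Foundation
import Security

// Store a secret so that it gets the extra passcode-derived encryption:
// readable only while the device is unlocked, and never migrated to
// another device via backup or restore.
let secret = Data("s3cret-token".utf8)
let query: [String: Any] = [
    kSecClass as String:          kSecClassGenericPassword,
    kSecAttrService as String:    "com.example.myapp",   // placeholder
    kSecAttrAccount as String:    "login-token",          // placeholder
    kSecValueData as String:      secret,
    kSecAttrAccessible as String: kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
]
let status = SecItemAdd(query as CFDictionary, nil)
precondition(status == errSecSuccess || status == errSecDuplicateItem,
             "Keychain write failed with OSStatus \(status)")
```

Items written without an explicit kSecAttrAccessible value may end up in a weaker class, which is exactly the case the Fraunhofer method exploits.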

≈50 ms per password try → ≈20 tries per second → ≈1.7 years for a 50%
chance of guessing the correct passcode for a 6-digit alphanumeric
code with base 36. The standard simple code of 4 numeric digits would
be brute-forced in less than 9 minutes. Based on the assumption that
the counter for wrong tries in iOS can be bypassed, as it is not
hardware-based.
Apple Inc. WWDC 2010, Core OS, Session 209 “Securing Application Data”, Slide 24
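The arithmetic behind those figures is easy to check. A quick sketch, assuming the quoted ~50 ms per attempt and that the wrong-try counter can indeed be bypassed:

```swift
import Foundation

// ~50 ms per passcode attempt, as quoted from the Fraunhofer study.
let triesPerSecond = 1.0 / 0.050            // → 20 tries per second

// Standard simple code: 4 numeric digits, full keyspace of 10^4 codes.
let pinSpace = pow(10.0, 4.0)
let pinWorstCaseMinutes = pinSpace / triesPerSecond / 60.0
// ≈ 8.3 minutes, matching the "less than 9 minutes" figure.

// 6-character alphanumeric code, base 36: a 50% chance of a hit after
// searching half the keyspace.
let alnumSpace = pow(36.0, 6.0)
let halfSpaceYears = (alnumSpace / 2.0) / triesPerSecond
                   / (3600.0 * 24.0 * 365.0)
// ≈ 1.7 years, matching the study's estimate.

print(String(format: "%.1f minutes, %.2f years",
             pinWorstCaseMinutes, halfSpaceYears))
```

Note how the entire difference comes from keyspace size: 10^4 versus 36^6 is five orders of magnitude.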

Bottom line:
If you must store sensitive data, better use your own encryption. And don’t store the key on the device.
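One way to follow that advice is to encrypt the data yourself with a key derived from something the user knows, so that nothing stored on the device alone can decrypt it. A sketch only, assuming CryptoKit (iOS 13+); the function name is illustrative, and a real app should use a deliberately slow KDF (PBKDF2, scrypt) rather than the single hash shown here:

```swift
import Foundation
import CryptoKit

// Encrypt plaintext under a key derived from a user-supplied passphrase.
// The device never stores the passphrase or the derived key.
func sealed(_ plaintext: Data, passphrase: String, salt: Data) throws -> Data {
    // Simplified key derivation for illustration; use a slow,
    // iterated KDF in production.
    let keyMaterial = SHA256.hash(data: Data(passphrase.utf8) + salt)
    let key = SymmetricKey(data: keyMaterial)
    // AES-GCM gives confidentiality plus integrity; .combined packs
    // nonce, ciphertext, and tag into one blob for storage.
    return try AES.GCM.seal(plaintext, using: key).combined!
}
```

The resulting blob can then go in the keychain or on disk; without the passphrase, a jailbroken device yields only ciphertext.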

There are numerous news articles which cite the Fraunhofer study and reassure their readers not to worry unless their devices are stolen, because this attack can only be done with physical access to the device.

I’m somewhat doubtful. The fact that the researchers did their tests with physical access to the phone seems to have been a way to simplify the problem, rather than a limitation. This is their description of what they did to decrypt the keychain entries:

After using a jailbreaking tool, to get access to a command shell, we
run a small script to access and decrypt the passwords found in the
keychain. The decryption is done with the help of functions provided
by the operating system itself.

As anyone who has jailbroken a device knows, jailbreaking does not necessarily require physical access to the device. Theoretically it should be trivial to modify the jailbreak code to automate the following:

  1. Perform the jailbreak as normal (all this requires is for the user to open a maliciously crafted PDF)
  2. Run Fraunhofer’s scripts after the jailbreak is complete
  3. Send the passwords over the network to a location the attacker can read them from

So once again, be cautious about what you put in the keychain.

Normally, the keychain would be the recommended way to store such a certificate. However, it has been discovered that jailbreaking can be used to bypass the keychain’s protections (see the Fraunhofer article referenced above).

Fraunhofer did a study on the security of the iPhone keychain:

I can answer part of your question, but since the other part is still unknown, I’m voting the question up as I’m also eager to know the answer.

The part that I can answer is: ‘can an app get full keychain access if no screen lock is enabled?’ No. Every app has its own keychain area on the iPhone, which means an app can only access its own secrets (or entries explicitly shared with it via a keychain access group). Those secrets are not locked away from the app itself, so there is no way to hide keychain entries from the app that owns them. To summarize: an app can read its own entries, and no others.

What I’m interested to know, though, is what happens on jailbroken devices. Are the keychains of all apps exposed once a device has been jailbroken?