How to get only images in the camera roll using Photos Framework

In iOS 8.1, Apple added the following PHAssetCollectionSubtype values for the Camera Roll and Photo Stream albums:

  1. PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) – fetches the Photo Stream album.

  2. PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) – fetches the Camera Roll album.

Haven’t tested if this is backward-compatible with iOS 8.0.x though.
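As a minimal sketch of the approach above, using the iOS 8-era (pre-Swift 3) spellings used elsewhere in this thread:

```swift
import Photos

// Fetch the Camera Roll smart album, then the image assets inside it.
let collections = PHAssetCollection.fetchAssetCollectionsWithType(
    .SmartAlbum,
    subtype: .SmartAlbumUserLibrary,
    options: nil)

if let cameraRoll = collections.firstObject as? PHAssetCollection {
    // Restrict the fetch to images only.
    let onlyImages = PHFetchOptions()
    onlyImages.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.Image.rawValue)
    let assets = PHAsset.fetchAssetsInAssetCollection(cameraRoll, options: onlyImages)
    print("Camera Roll contains \(assets.count) images")
}
```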

Through some experimentation we discovered a hidden property not listed in the documentation (assetSource). Basically you do a regular fetch request, then use a predicate to filter for the assets from the camera roll; for those, the value is 3.

Sample code:

//fetch all image assets, then filter down to camera-roll assets only
let fetchOptions = PHFetchOptions()
let assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)

var results = NSMutableArray()
assets.enumerateObjectsUsingBlock { (obj, idx, stop) -> Void in
    results.addObject(obj)
}

// "assetSource" is undocumented; a value of 3 marks camera-roll assets.
let cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)

If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to specify that the image must be local.


From the documentation for the networkAccessAllowed property of PHImageRequestOptions:

    A Boolean value that specifies whether Photos can download the requested image from iCloud.

    If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download’s progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler’s info dictionary indicates that the image is not available unless you enable network access.
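For example, a minimal sketch of this option (pre-Swift 3 spellings; `asset` is assumed to be a PHAsset you fetched earlier):

```swift
import Photos

let options = PHImageRequestOptions()
// networkAccessAllowed defaults to false, so only locally stored
// images are delivered; no iCloud download is started.
options.networkAccessAllowed = false

PHImageManager.defaultManager().requestImageForAsset(asset,
    targetSize: CGSize(width: 200, height: 200),
    contentMode: .AspectFill,
    options: options) { image, info in
        // When the image is iCloud-only, the handler reports it via the info dictionary.
        if let inCloud = info?[PHImageResultIsInCloudKey] as? Bool where inCloud {
            print("image is only in iCloud; set networkAccessAllowed = true to fetch it")
        }
}
```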

If, like me, you were searching for code that uses the new Photos framework rather than the deprecated AssetsLibrary, this will help you:

Global Variables:

var imageArray = [UIImage]()
var mutableArray = [UIImage]()

func getAllPhotosFromCamera() {
    imageArray = []
    mutableArray = []

    let requestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .Exact
    requestOptions.deliveryMode = .HighQualityFormat
    requestOptions.synchronous = true

    let result = PHAsset.fetchAssetsWithMediaType(.Image, options: nil)
    NSLog("%d", result.count)

    let manager = PHImageManager.defaultManager()
    // result contains PHAsset objects.
    for i in 0..<result.count {
        guard let asset = result[i] as? PHAsset else { continue }
        // Because requestOptions.synchronous is true, the handler
        // runs before the loop moves on.
        manager.requestImageForAsset(asset,
                                     targetSize: PHImageManagerMaximumSize,
                                     contentMode: .Default,
                                     options: requestOptions,
                                     resultHandler: { (image, info) -> Void in
            if let image = image {
                imageArray.append(image)
            }
        })
    }
}

Objective C

Global Variables:

NSArray *imageArray;
NSMutableArray *mutableArray;

The method below will help you:

    - (void)getAllPhotosFromCamera {
        imageArray = [[NSArray alloc] init];
        mutableArray = [[NSMutableArray alloc] init];

        PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
        requestOptions.resizeMode   = PHImageRequestOptionsResizeModeExact;
        requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
        requestOptions.synchronous  = YES;

        PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
        NSLog(@"%d", (int)result.count);

        PHImageManager *manager = [PHImageManager defaultManager];
        NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];

        // result contains PHAsset objects.
        __block UIImage *ima;
        for (PHAsset *asset in result) {
            // Because requestOptions.synchronous is YES, the handler
            // runs before the loop moves on.
            [manager requestImageForAsset:asset
                               targetSize:PHImageManagerMaximumSize
                              contentMode:PHImageContentModeDefault
                                  options:requestOptions
                            resultHandler:^void(UIImage *image, NSDictionary *info) {
                                ima = image;
                                [images addObject:ima];
                            }];
        }
        imageArray = [images copy]; // Or use the NSMutableArray images directly.
    }

I’ve been banging my head over this too. I’ve found no way to filter for only assets on the device with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I’m able to use requestContentEditingInputWithOptions or requestImageDataForAsset to determine if the asset is on the device or not, but this is asynchronous and seems like it’s using way too much resources to do for every asset in the list. There must be a better way.

PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];

for (int i = 0; i < [fetchResult count]; i++) {

    PHAsset *asset = fetchResult[i];

    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
                                   if ([info[PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
                                       NSLog(@"asset is in cloud");
                                   } else {
                                       NSLog(@"asset is on device");
                                   }
                               }];
}

If you don’t want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns true if the full original is available on device.

Admittedly this is also fragile, but testing shows it works for all of the assetSource types (photostream, iTunes sync, etc).
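As a sketch (pre-Swift 3 spellings), the check can be wrapped in a small helper:

```swift
import Photos

// Returns true only when the full original is available on this device.
func isFullOriginalOnDevice(asset: PHAsset) -> Bool {
    return asset.canPerformEditOperation(.Content)
}
```

Because this call is synchronous, you could use it while enumerating a PHFetchResult to partition local versus iCloud-only assets without issuing an asynchronous request per asset.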

Here is an Objective-C version, adapted from Apple's sample code.

- (NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    // Pick the collection containing the most photos; this is assumed to be the camera roll.
    for (int i = 0; i < fetchResult.count; i++) {
        unsigned long temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if (temp > pictures) {
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];

    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        return nil;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i < [assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}

Here you can grab the following details:

    PHFetchResult *fetchResult = self.sectionFetchResults[1];
    PHCollection *collection = fetchResult[6];

The index pairs map to the collections as follows:

  1,6 – camera images
  1,0 – screenshots
  1,1 – hidden
  1,2 – selfies
  1,3 – recently added
  1,4 – videos
  1,5 – recently deleted
  1,7 – favorites

Apple demo link

Declare your properties:

@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;