How to compress/resize image on iPhone OS SDK before uploading to a server?

I’m currently uploading an image to a server using Imgur on iOS with the following code:

NSData* imageData = UIImagePNGRepresentation(image);
NSArray* paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString* fullPathToFile = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"SBTempImage.png"];
[imageData writeToFile:fullPathToFile atomically:NO];

[uploadRequest setFile:fullPathToFile forKey:@"image"];

The code works fine when run in the simulator and uploading a file from the simulator’s photo library, because the simulator is on a fast Ethernet connection. However, the same code times out on the iPhone when selecting an image taken with the iPhone’s camera. To test, I saved a small image from the web and uploaded that instead, which worked.

This leads me to believe the large images taken by the iPhone are timing out over the somewhat slow 3G network. Is there any way to compress or resize the image on the iPhone before sending it?
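One way to put the pieces together (a sketch, not the asker's code; the 1024 px cap, 0.7 JPEG quality, and the helper's name are illustrative assumptions) is to downscale the photo and re-encode it as JPEG before writing the temp file that gets uploaded:

    // Sketch: shrink the photo and encode it as JPEG before uploading.
    // The 1024 px cap and 0.7 quality are assumed values, not from the question.
    #import <UIKit/UIKit.h>

    static NSData *UploadableImageData(UIImage *image)
    {
        CGFloat maxSide = 1024.0f; // assumed upper bound on the longest side
        CGFloat longest = MAX(image.size.width, image.size.height);
        CGFloat scale = (longest > maxSide) ? maxSide / longest : 1.0f;
        CGSize newSize = CGSizeMake(image.size.width * scale, image.size.height * scale);

        // Redraw into a smaller context to actually reduce the pixel data.
        UIGraphicsBeginImageContext(newSize);
        [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
        UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // JPEG at reduced quality is far smaller than a PNG of a camera photo.
        return UIImageJPEGRepresentation(resized, 0.7f);
    }

Writing this NSData to the temp file (with a .jpg name) in place of the PNG data would cut the upload size dramatically.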


13 Answers to “How to compress/resize image on iPhone OS SDK before uploading to a server?”

    You should be able to make a smaller image by doing something like

    UIImage *small = [UIImage imageWithCGImage:original.CGImage scale:0.25 orientation:original.imageOrientation];

(for a quarter-size image) then convert the smaller image to a PNG or whatever format you need. Note, however, that imageWithCGImage:scale:orientation: changes only the image’s point size, not its underlying pixel data, so the encoded file won’t actually shrink; to reduce the data you must redraw into a smaller graphics context, as in the snippets below.

    This snippet resizes the image by redrawing it into a smaller graphics context:

    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    The variable newSize is a CGSize and can be defined like so:

    CGSize newSize = CGSizeMake(100.0f, 100.0f);

    A self-contained solution:

    - (UIImage *)compressForUpload:(UIImage *)original scale:(CGFloat)scale
    {
        // Calculate new size given scale factor.
        CGSize originalSize = original.size;
        CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);

        // Scale the original image to match the new size.
        UIGraphicsBeginImageContext(newSize);
        [original drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
        UIImage *compressedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        return compressedImage;
    }

    Thanks to @Tuan Nguyen.

    To complement @Tuan Nguyen’s answer, this is perhaps the fastest and most elegant way to do it.

    Linking to John Muchow’s post: adding a scaleToSize: category to UIImage is a very handy way to scale an image quickly.
    Just calling

        UIImage *_image = [[[UIImage alloc] initWithData:SOME_NSDATA] scaleToSize:CGSizeMake(640.0,480.0)];

    returns a 640×480 representation of your NSData (which could be an online image) without any additional code.

    Matt Gemmell’s MGImageUtilities are very nice, resizing efficiently and offering some effort-reducing convenience methods.

    In this code, 0.5 means 50% JPEG compression quality. Note that UIImageJPEGRepresentation returns NSData, not a UIImage:

    UIImage *original = image;
    NSData *compressedData = UIImageJPEGRepresentation(original, 0.5f);

    Use this simple method:

    NSData *data = UIImageJPEGRepresentation(chosenImage, 0.2f);

    Swift implementation of Zorayr’s function (with a slight change to constrain height or width in actual units rather than by scale):

    class func compressForUpload(original: UIImage, withHeightLimit heightLimit: CGFloat, andWidthLimit widthLimit: CGFloat) -> UIImage {
        let originalSize = original.size
        var newSize = originalSize
        if originalSize.width > widthLimit && originalSize.width > originalSize.height {
            newSize.width = widthLimit
            newSize.height = originalSize.height * (widthLimit / originalSize.width)
        } else if originalSize.height > heightLimit && originalSize.height > originalSize.width {
            newSize.height = heightLimit
            newSize.width = originalSize.width * (heightLimit / originalSize.height)
        }

        // Scale the original image to match the new size.
        UIGraphicsBeginImageContext(newSize)
        original.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
        let compressedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        return compressedImage
    }
    Using Image I/O to generate a downscaled thumbnail:

    #import <ImageIO/ImageIO.h>
    #import <MobileCoreServices/MobileCoreServices.h>

    + (UIImage *)resizeImage:(UIImage *)image toResolution:(int)resolution {
        NSData *imageData = UIImagePNGRepresentation(image);
        CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
        CFDictionaryRef options = (__bridge CFDictionaryRef) @{
                                                               (id) kCGImageSourceCreateThumbnailWithTransform : @YES,
                                                               (id) kCGImageSourceCreateThumbnailFromImageAlways : @YES,
                                                               (id) kCGImageSourceThumbnailMaxPixelSize : @(resolution)
                                                               };
        CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(src, 0, options);
        CFRelease(src);
        UIImage *img = [[UIImage alloc] initWithCGImage:thumbnail];
        CGImageRelease(thumbnail);
        return img;
    }
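    A usage sketch for the method above; the class name ImageUtils and the 1024 px / 0.8 quality values are placeholder assumptions, not part of the answer:

    // Hypothetical usage: cap the longest side at 1024 pixels before upload.
    UIImage *thumb = [ImageUtils resizeImage:photo toResolution:1024];
    NSData *jpeg = UIImageJPEGRepresentation(thumb, 0.8f); // assumed quality value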

    Swift 2.0 version of Jagandeep Singh’s method. Note that the NSData? result must be converted back to a UIImage explicitly, since it is not converted automatically:

    let originalImage: UIImage = image
    let compressedData = UIImageJPEGRepresentation(originalImage, 0.5)
    let compressedImage = UIImage(data: compressedData!)

    NSData *data = UIImageJPEGRepresentation(img.image, 0.2f);
    UIImage *image = [UIImage imageNamed:@"image.png"];
    NSData *imgData1 = UIImageJPEGRepresentation(image, 1.0);
    NSLog(@"Original --- Size of Image (bytes): %lu", (unsigned long)[imgData1 length]);
    NSData *imgData2 = UIImageJPEGRepresentation(image, 0.5);
    NSLog(@"After --- Size of Image (bytes): %lu", (unsigned long)[imgData2 length]);
    image = [UIImage imageWithData:imgData2];
    imgTest.image = image;

    Try converting to JPEG with a compression-quality factor; here I am using 0.5.
    In my case:
    Original — Size of Image (bytes): 85 KB; After — Size of Image (bytes): 23 KB
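    If you need to hit a specific upload budget rather than picking a fixed quality, a common variation (a sketch; the byte limit and 0.1 step size are assumptions, not from the answer) is to lower the JPEG quality until the data fits:

    // Reduce JPEG quality stepwise until the encoded data fits maxBytes.
    static NSData *JPEGDataUnderLimit(UIImage *image, NSUInteger maxBytes)
    {
        CGFloat quality = 0.9f;                  // start near the high end
        NSData *data = UIImageJPEGRepresentation(image, quality);
        while (data.length > maxBytes && quality > 0.1f) {
            quality -= 0.1f;                     // assumed step size
            data = UIImageJPEGRepresentation(image, quality);
        }
        return data; // may still exceed maxBytes at the minimum quality
    }

    Quality alone has a floor, so for very large photos combine this with a resize first.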

    - (UIImage *)resizeImage:(UIImage *)originalImage resizeSize:(CGSize)size
    {
        CGFloat actualHeight = originalImage.size.height;
        CGFloat actualWidth = originalImage.size.width;
        //  if (actualWidth <= size.width && actualHeight <= size.height)
        //  {
        //      return originalImage;
        //  }
        float oldRatio = actualWidth / actualHeight;
        float newRatio = size.width / size.height;
        if (oldRatio < newRatio)
        {
            oldRatio = size.height / actualHeight;
            actualWidth = oldRatio * actualWidth;
            actualHeight = size.height;
        }
        else
        {
            oldRatio = size.width / actualWidth;
            actualHeight = oldRatio * actualHeight;
            actualWidth = size.width;
        }
        CGRect rect = CGRectMake(0.0, 0.0, actualWidth, actualHeight);
        UIGraphicsBeginImageContext(rect.size);
        [originalImage drawInRect:rect];
        UIImage *resizedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return resizedImage;
    }

    This is how the method is called:

     UIImage *compImage = [appdel resizeImage:imageMain resizeSize:CGSizeMake(40, 40)];

    It returns an image that you can display anywhere.