How To Use AVCaptureStillImageOutput To Take Picture

I have a preview layer that is pulling from the camera and working as it should. I would like to be able to take a picture when I press a button. I have initialized an AVCaptureStillImageOutput like this:

AVCaptureStillImageOutput *avCaptureImg = [[AVCaptureStillImageOutput alloc] init];

Then I am trying to take a picture using this object:

    [avCaptureImg captureStillImageAsynchronouslyFromConnection:(AVCaptureConnection *) completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {  }];
    

    I need help on how to take a picture and save it in a variable. Thanks

    4 Answers

    You need to be sure to define an AVCaptureVideoPreviewLayer and add it to a view’s layer:

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [self.view.layer addSublayer:captureVideoPreviewLayer];
    

    This will be connected to your AVCaptureDeviceInput.

    Here’s the full solution:

    /////////////////////////////////////////////////
    ////
    //// Utility to find front camera
    ////
    /////////////////////////////////////////////////
    -(AVCaptureDevice *) frontFacingCameraIfAvailable{
    
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        AVCaptureDevice *captureDevice = nil;
    
       for (AVCaptureDevice *device in videoDevices){
    
            if (device.position == AVCaptureDevicePositionFront){
    
                captureDevice = device;
                break;
            }
        }
    
        //  couldn't find one on the front, so just get the default video device.
        if (!captureDevice){
    
            captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        }
    
        return captureDevice;
    }
    
    /////////////////////////////////////////////////
    ////
    //// Setup Session, attach Video Preview Layer
    //// and Capture Device, start running session
    ////
    /////////////////////////////////////////////////
    -(void) setupCaptureSession {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;
    
        AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        captureVideoPreviewLayer.frame = self.view.bounds; // the layer needs a frame, or the preview won't be visible
        [self.view.layer addSublayer:captureVideoPreviewLayer];
    
        NSError *error = nil;
        AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input) {
            // Handle the error appropriately.
            NSLog(@"ERROR: trying to open camera: %@", error);
            return; // don't add a nil input to the session
        }
        [session addInput:input];
    
        self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [self.stillImageOutput setOutputSettings:outputSettings];
    
        [session addOutput:self.stillImageOutput];
    
        [session startRunning];
    }
    
    
    /////////////////////////////////////////////////
    ////
    //// Method to capture Still Image from 
    //// Video Preview Layer
    ////
    /////////////////////////////////////////////////
    -(void) captureNow {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }
    
        NSLog(@"about to request a capture from: %@", self.stillImageOutput);
        __weak typeof(self) weakSelf = self;
        [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    
             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
             UIImage *image = [[UIImage alloc] initWithData:imageData];
    
             [weakSelf displayImage:image];
         }];
    }
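
    To cover the button part of the question, the two methods above would typically be hooked up in the view controller roughly like this (a minimal sketch; the action name and the assumption that both methods live in the same controller are mine, not part of the answer):

    // Hypothetical wiring: set up the session once the view loads,
    // then trigger the still capture from a button action.
    - (void)viewDidLoad {
        [super viewDidLoad];
        [self setupCaptureSession];
    }

    - (IBAction)takePicturePressed:(id)sender {
        [self captureNow];
    }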
    

    For the Swift version:

    @IBAction func capture(sender: AnyObject) {
    
        if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
            stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
    
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
                if let image = UIImage(data: imageData) {
                    self.previewImage.image = image
                    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
                }
            })
        }
    }
    
    -(void)captureImage:(NSString *)string successCallback:(void (^)(id))successCallback errorCallback:(void (^)(NSString *))errorCallback{
    
        __block UIImage *image;
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in stillImageOutput.connections)
        {
            for (AVCaptureInputPort *port in [connection inputPorts])
            {
                if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection)
            {
                break;
            }
        }
    
        //NSLog(@"about to request a capture from: %@", stillImageOutput);
        [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
         {
             CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
             if (exifAttachments)
             {
                 // Do something with the attachments.
                 //NSLog(@"attachements: %@", exifAttachments);
             } else {
                 //NSLog(@"no attachments");
             }
    
             if (error) {
                 // Pass any capture error back through the error callback.
                 errorCallback(error.localizedDescription);
                 return;
             }
    
             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
             image = [[UIImage alloc] initWithData:imageData];
    
             successCallback(image);
             //UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
         }];
    }
    

    Not sure why I didn’t see this sooner:

    iPhone SDK 4 AVFoundation – How to use captureStillImageAsynchronouslyFromConnection correctly?

    Adam’s answer works fantastically!