Video processing with OpenCV in an iOS Swift project

I’ve integrated OpenCV into a Swift iOS project using a bridging header (to connect Swift to Objective-C) and an Objective-C wrapper (to connect Objective-C to C++). With this setup I can pass single images from the Swift code, analyse them in the C++ files and get them back.

I’ve seen that OpenCV provides a CvVideoCamera object that can be integrated with an Objective-C UIViewController.

But since my UIViewControllers are written in Swift, I wondered whether this is possible as well.


    This is an update to my initial answer after I had a chance to play with this myself. Yes, it is possible to use CvVideoCamera with a view controller written in Swift. If you just want to use it to display video from the camera in your app, it’s really easy.

    Add #import <opencv2/highgui/cap_ios.h> to the bridging header. Then, in your view controller:

    class ViewController: UIViewController, CvVideoCameraDelegate {
    ...
        var myCamera : CvVideoCamera!
        override func viewDidLoad() {
            ...
            myCamera = CvVideoCamera(parentView: imageView)
            myCamera.delegate = self
            ...
        }
    }
    

    The ViewController cannot actually conform to the CvVideoCameraDelegate protocol, but CvVideoCamera won’t work without a delegate, so we work around this problem by declaring ViewController to adopt the protocol without implementing any of its methods. This will trigger a compiler warning, but the video stream from the camera will be displayed in the image view.

    Of course, you might want to implement the CvVideoCameraDelegate protocol's (only) method, processImage(), to process video frames before displaying them. The reason you cannot implement it in Swift is that its signature uses a C++ type, cv::Mat.

    So, you will need to write an Objective-C++ class whose instance can be set as the camera’s delegate. The processImage() method in that Objective-C++ class will be called by CvVideoCamera and will in turn call code in your Swift class. Here are some sample code snippets.
    In OpenCVWrapper.h:

    // Need this ifdef, so the C++ header won't confuse Swift
    #ifdef __cplusplus
    #import <opencv2/opencv.hpp>
    #endif
    
    // This is a forward declaration; we cannot include *-Swift.h in a header.
    @class ViewController;
    
    @interface CvVideoCameraWrapper : NSObject
    ...
    -(id)initWithController:(ViewController*)c andImageView:(UIImageView*)iv;
    ...
    @end
    

    In the wrapper implementation, OpenCVWrapper.mm (it’s an Objective-C++ class, hence the .mm extension):

    #import <opencv2/highgui/cap_ios.h>
    #import "OpenCVWrapper.h"
    // Import the auto-generated Swift header so ViewController is visible here.
    // Replace "MyProject" with your own product module name.
    #import "MyProject-Swift.h"
    using namespace cv;
    // Class extension to adopt the delegate protocol
    @interface CvVideoCameraWrapper () <CvVideoCameraDelegate>
    {
    }
    @end
    @implementation CvVideoCameraWrapper
    {
        ViewController * viewController;
        UIImageView * imageView;
        CvVideoCamera * videoCamera;
    }
    
    -(id)initWithController:(ViewController*)c andImageView:(UIImageView*)iv
    {
        self = [super init];
        if (self) {
            viewController = c;
            imageView = iv;
    
            videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
            // ... set up the camera
            ...
            videoCamera.delegate = self;
        }
        return self;
    }
    // This #ifdef ... #endif is not needed except in special situations
    #ifdef __cplusplus
    - (void)processImage:(Mat&)image
    {
        // Do some OpenCV stuff with the image
        ...
    }
    #endif
    ...
    @end
    

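The elided “set up the camera” step usually assigns a few of CvVideoCamera’s configuration properties before setting the delegate. A minimal sketch, assuming the back camera and a 640x480 preset (the properties are declared by CvVideoCamera in cap_ios.h, but the specific values here are only illustrative):

```objc
// Illustrative camera configuration for the initWithController:andImageView:
// method above; the chosen position, preset, orientation and FPS are examples.
videoCamera.defaultAVCaptureDevicePosition   = AVCaptureDevicePositionBack;
videoCamera.defaultAVCaptureSessionPreset    = AVCaptureSessionPreset640x480;
videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
videoCamera.defaultFPS = 30;
```

Note that no frames are delivered until you call [videoCamera start]; a convenient way to trigger that from Swift is to expose another wrapper method and invoke it from viewDidAppear.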
    Then you put #import "OpenCVWrapper.h" in the bridging header, and the Swift view controller might look like this:

    class ViewController: UIViewController {
    ...
        var videoCameraWrapper : CvVideoCameraWrapper!
    
        override func viewDidLoad() {
            ...
            self.videoCameraWrapper = CvVideoCameraWrapper(controller:self, andImageView:imageView)
            ...
        }
    }
    

    See https://developer.apple.com/library/ios/documentation/Swift/Conceptual/BuildingCocoaApps/MixandMatch.html about forward declarations and Swift/C++/Objective-C interop. There is plenty of info on the web about #ifdef __cplusplus and extern "C" (if you need it).

    In the processImage() delegate method you will likely need to interact with some OpenCV API, for which you will also have to write wrappers. You can find some info on that elsewhere, for example here: Using OpenCV in Swift iOS
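
As a concrete, hedged example of such processing, a processImage: body that converts each frame to grayscale in place might look like this (whether the incoming frame is BGR or BGRA depends on your OpenCV version, so adjust the conversion codes accordingly):

```objc
// Illustrative processImage: body for OpenCVWrapper.mm. The Mat is modified
// in place, and CvVideoCamera displays the result in the parent view.
- (void)processImage:(cv::Mat &)image
{
    cv::Mat gray;
    // Use cv::COLOR_BGRA2GRAY instead if your frames arrive as BGRA.
    cv::cvtColor(image, gray, cv::COLOR_BGR2GRAY);
    // Convert back so the frame stays in a format the view can render.
    cv::cvtColor(gray, image, cv::COLOR_GRAY2BGR);
}
```

From there the wrapper can also forward results (for example a UIImage built from the processed Mat) to the Swift view controller through its viewController reference.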