Converting raw data to displayable video for iOS
I have an interesting problem I need to research related to very low level video streaming.
Has anyone had any experience converting a raw stream of bytes (separated into per-pixel information, but not in any standard video format) into a low-resolution video stream? I believe I can map the data to an RGB value per pixel, since the color that corresponds to each raw value will be determined by us. I’m not sure where to go from there, or what the per-pixel RGB format needs to be.
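The value-to-color mapping described above can be sketched as a simple lookup table. A minimal sketch in Swift, where `makeBGRAFrame` and the palette contents are made-up illustrations (the real palette would hold whatever colors you assign to your raw values):

```swift
// Sketch: treat each raw byte as an index into a palette of BGRA colors
// that we define ourselves, producing 4 bytes of pixel data per raw value.
func makeBGRAFrame(raw: [UInt8],
                   palette: [(b: UInt8, g: UInt8, r: UInt8, a: UInt8)]) -> [UInt8] {
    var frame = [UInt8]()
    frame.reserveCapacity(raw.count * 4)
    for value in raw {
        let c = palette[Int(value) % palette.count]  // wrap out-of-range values
        frame.append(c.b)
        frame.append(c.g)
        frame.append(c.r)
        frame.append(c.a)
    }
    return frame
}

// Example: a two-entry palette mapping 0 -> black, 1 -> white.
let palette: [(b: UInt8, g: UInt8, r: UInt8, a: UInt8)] = [
    (0, 0, 0, 255),
    (255, 255, 255, 255)
]
let frame = makeBGRAFrame(raw: [0, 1, 1, 0], palette: palette)
// 4 raw values -> 16 bytes of BGRA pixel data
```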
I’ve looked at FFmpeg, but its documentation is massive and I don’t know where to start.
Specific questions I have: is it possible to create a CVPixelBuffer from that pixel data? If I were to do that, what format would the per-pixel data need to be converted to?
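It is possible. A minimal sketch, assuming the raw data has already been mapped to 32-bit BGRA (the pixel format the iOS video pipeline generally prefers); the function name and parameters are illustrative:

```swift
import CoreVideo

// Sketch: wrap an existing buffer of 32-bit BGRA pixels in a CVPixelBuffer
// without copying. No release callback is set here, so the caller must keep
// `pixels` alive for the lifetime of the returned buffer.
func makePixelBuffer(pixels: UnsafeMutableRawPointer,
                     width: Int, height: Int) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreateWithBytes(
        kCFAllocatorDefault,
        width,
        height,
        kCVPixelFormatType_32BGRA,
        pixels,
        width * 4,   // bytesPerRow: 4 bytes per BGRA pixel
        nil,         // release callback (omitted in this sketch)
        nil,         // release refCon
        nil,         // pixel buffer attributes
        &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}
```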
Also, should I be looking deeper into OpenGL, and if so, where would the best place be to look for information on this topic?
What about CGBitmapContextCreate? For example, if I went with something like this, what would a typical pixel byte need to look like? Would this be fast enough to keep the frame rate above 20 fps?
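On the CGBitmapContextCreate route, a typical pixel would be four bytes: one byte per channel, e.g. RGBA with 8 bits per component. A hedged sketch, with illustrative names:

```swift
import CoreGraphics

// Sketch: wrap a buffer of 8-bit-per-channel RGBA pixels in a bitmap
// context and extract a CGImage from it. One pixel = 4 bytes.
func makeCGImage(rgbaPixels: UnsafeMutableRawPointer,
                 width: Int, height: Int) -> CGImage? {
    let context = CGContext(
        data: rgbaPixels,
        width: width,
        height: height,
        bitsPerComponent: 8,
        bytesPerRow: width * 4,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    return context?.makeImage()
}
```

Whether this path is fast enough for 20+ fps is exactly the question; the answers below argue that going through Core Graphics carries real overhead.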
I think with the excellent help of you two, and some more research on my own, I’ve put together a plan: construct the raw RGBA data, build a CGImage from that data, and in turn create a CVPixelBuffer from that CGImage (as in CVPixelBuffer from CGImage).
However, to then play that live as the data comes in, I’m not sure what kind of FPS I would be looking at. Do I paint the frames to a CALayer, or is there some class similar to AVAssetWriter that I could use to play them back as I append CVPixelBuffers? My experience is with using AVAssetWriter to export constructed Core Animation hierarchies to video, so the videos are always fully constructed before they begin playing, and are never displayed as live video.
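One way to show frames as they arrive, assuming each frame ends up as a CGImage, is to assign it to a CALayer’s contents on the main thread. A sketch (whether this sustains 20+ fps depends on frame size and device):

```swift
import QuartzCore

// Sketch: push a freshly built frame to the screen by updating a
// CALayer's contents. Must be called on the main thread.
func display(frame image: CGImage, on layer: CALayer) {
    CATransaction.begin()
    CATransaction.setDisableActions(true)  // suppress the implicit fade animation
    layer.contents = image
    CATransaction.commit()
}
```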
2 Answers to “Converting raw data to displayable video for iOS”
I’ve done this before, and I know that you found my GPUImage project a little while ago. As I replied on the issues there, the GPUImageRawDataInput is what you want for this, because it does a fast upload of RGBA, BGRA, or RGB data directly into an OpenGL ES texture. From there, the frame data can be filtered, displayed to the screen, or recorded into a movie file.
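A sketch of that path, assuming the third-party GPUImage framework is linked into the project (method names follow GPUImage’s Objective-C API as bridged to Swift, and may differ between versions; `frameBytes` and the 320×240 size are illustrative):

```swift
// Not runnable standalone: requires the GPUImage framework.
// frameBytes is assumed to point at 320x240 BGRA data from the raw stream.
let size = CGSize(width: 320, height: 240)
let rawInput = GPUImageRawDataInput(bytes: frameBytes, size: size)
let preview = GPUImageView(frame: CGRect(origin: .zero, size: size))
rawInput.addTarget(preview)
rawInput.processData()       // upload and render the first frame

// For each subsequent frame from the stream:
rawInput.updateDataFromBytes(nextFrameBytes, size: size)
rawInput.processData()
```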
Your proposed path of going through a CGImage to a CVPixelBuffer is not going to yield very good performance, based on my personal experience. There’s too much overhead when passing through Core Graphics for realtime video. You want to go directly to OpenGL ES for the fastest display speed here.
I might even be able to improve my code to make it faster than it is right now. I currently use glTexImage2D() to update texture data from local bytes, but it would probably be even faster to use the texture caches introduced in iOS 5.0 to speed up refreshing data within a texture that maintains its size. There’s some overhead in setting up the caches that makes them a little slower for one-off uploads, but rapidly updating data should be faster with them.
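For reference, the glTexImage2D upload mentioned above looks roughly like this. It is a fragment, not a complete program: it assumes an OpenGL ES context is current, `textureID` is a valid texture name, and `frameBytes` points at RGBA pixel data for a `width` × `height` frame:

```swift
// Fragment, not runnable standalone: requires an active OpenGL ES context.
glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
             GLsizei(width), GLsizei(height), 0,
             GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), frameBytes)
```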
My 2 cents:
I made an OpenGL game which lets the user record a 3D scene. Playback was done by replaying the scene (instead of playing a video), because realtime encoding did not yield a comfortable FPS.
There is a technique which could help out, though unfortunately I didn’t have time to implement it: it should cut down the time spent getting pixels back from OpenGL, so you might get an acceptable video encoding rate.