Blur UIImage outside of bounds (Photoshop-style)
I’m trying to apply a Gaussian blur to a UIImage so that it replicates my Photoshop mockup.
In Photoshop, when I run a Gaussian blur filter, the image layer gets larger as a result of the blurred edges.
Observed Behavior: Using GPUImage, I can successfully blur my UIImages. However, the new image is cropped at the original bounds, leaving a hard edge all the way around.
Setting UIImageView.layer.masksToBounds = NO; doesn’t help, since it’s the image that is cropped, not the view.
I’ve also tried placing the UIImage centered on a larger clear image before blurring, and then resizing. This also didn’t help.
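For reference, this is roughly what the “larger clear image” attempt looked like (a minimal sketch; the padding value and helper name are mine, not the actual code):

    // Draw the image centered on a larger transparent canvas before blurring
    - (UIImage *)imagePaddedWithClearBorder:(UIImage *)image padding:(CGFloat)padding
    {
        CGSize paddedSize = CGSizeMake(image.size.width + padding * 2.0,
                                       image.size.height + padding * 2.0);
        UIGraphicsBeginImageContextWithOptions(paddedSize, NO, image.scale);
        [image drawInRect:CGRectMake(padding, padding, image.size.width, image.size.height)];
        UIImage *paddedImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return paddedImage;
    }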
Is there a way to achieve this “Photoshop-style” blur?
UPDATE: Working solution, thanks to Brad Larson:

    UIImage *sourceImage = ...;

    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:sourceImage];
    GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
    GPUImageFastBlurFilter *blurFilter = [GPUImageFastBlurFilter new];

    // Force processing at 1.4x the source size, then scale the image down by 1 / 1.4 ≈ 0.7
    // so it keeps its original pixel dimensions inside the larger canvas
    [transformFilter forceProcessingAtSize:CGSizeMake(SOURCE_WIDTH * 1.4, SOURCE_WIDTH * 1.4)];
    [transformFilter setAffineTransform:CGAffineTransformMakeScale(0.7, 0.7)];

    // Set up the desired blur filter
    [blurFilter setBlurSize:3.0f];
    [blurFilter setBlurPasses:20];

    // Chain: Image -> Transform -> Blur -> Output
    [imageSource addTarget:transformFilter];
    [transformFilter addTarget:blurFilter];
    [imageSource processImage];

    UIImage *blurredImage = [blurFilter imageFromCurrentlyProcessedOutputWithOrientation:UIImageOrientationUp];
2 Solutions
GPUImage will only produce a result that is processed up to the limits of your image. In order to extend past your image, you’ll need to expand the canvas on which it operates.
To do this, feed your image into a GPUImageTransformFilter and use -forceProcessingAtSizeRespectingAspectRatio: to enlarge the working area. By default this will also enlarge the image, so to counter that, apply a scale transform via the GPUImageTransformFilter to shrink the image relative to the larger area. This keeps the image at its original pixel dimensions while placing it within a larger overall canvas.
Then all you need to do is feed this output into your blur filter and the blur will now extend past the edge of your original image. The size you force the image to be will depend on how far the blur needs to extend past the original image’s edges.
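In code, that looks something like the following (a rough sketch mirroring the update in the question; the 1.4 canvas factor is just an example value, and sourceImage is assumed to be the original UIImage):

    GPUImagePicture *imageSource = [[GPUImagePicture alloc] initWithImage:sourceImage];
    GPUImageTransformFilter *transformFilter = [GPUImageTransformFilter new];
    GPUImageFastBlurFilter *blurFilter = [GPUImageFastBlurFilter new];

    // Enlarge the working canvas to 1.4x the source size, preserving aspect ratio
    [transformFilter forceProcessingAtSizeRespectingAspectRatio:CGSizeMake(sourceImage.size.width * 1.4,
                                                                           sourceImage.size.height * 1.4)];
    // Scale the image back down (1 / 1.4 ≈ 0.71) so it keeps its original pixel size within that canvas
    [transformFilter setAffineTransform:CGAffineTransformMakeScale(1.0 / 1.4, 1.0 / 1.4)];

    // The blur now has room to extend past the original image edges
    [imageSource addTarget:transformFilter];
    [transformFilter addTarget:blurFilter];
    [imageSource processImage];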
Try resizing the UIImageView’s bounds to accommodate the blur. A view clips whatever lies outside its bounds. Note that in your example, the box blurred in Photoshop looks to be about 20% larger than the original image.
    UIImageView *image = ...; // the existing image view
    // Enlarge the layer bounds so the blurred edges aren't clipped
    image.layer.bounds = CGRectMake(0, 0,
                                    image.layer.bounds.size.width + 5,
                                    image.layer.bounds.size.height + 5);
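To match the roughly 20% growth from the Photoshop mockup, you could derive the new bounds from the current size instead of hard-coding a few points (a sketch; the 1.2 factor is an assumption based on the mockup):

    CGFloat scaleFactor = 1.2; // assumed ~20% growth, to match the mockup
    CGSize currentSize = image.bounds.size;
    image.layer.bounds = CGRectMake(0, 0,
                                    currentSize.width * scaleFactor,
                                    currentSize.height * scaleFactor);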