Thursday, 12 September 2013

I am using CIFilter to get a blurred image, but why is the output image always larger than the input image?


The code is as follows:

CIImage *imageToBlur = [CIImage imageWithCGImage:self.pBackgroundImageView.image.CGImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                  keysAndValues:kCIInputImageKey, imageToBlur,
                                                @"inputRadius", [NSNumber numberWithFloat:10.0], nil];
CIImage *outputImage = [blurFilter outputImage];
UIImage *resultImage = [UIImage imageWithCIImage:outputImage];
For example, the input image has a size of (640.000000, 1136.000000), but the output image has a size of (700.000000, 1196.000000).

Besides the problem described in the title, when I try to use resultImage to set self.pBackgroundImageView, what I get is a completely dark image. What is happening?
Any advice is appreciated.
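For reference, CIGaussianBlur samples pixels outside the input's bounds, so the output extent grows by roughly the blur radius on every side, which matches the 640→700 and 1136→1196 sizes above. The dark image likely comes from -imageWithCIImage:, which does not render the filter into a bitmap. Below is a sketch of one common workaround, not necessarily the only fix: crop the output back to the input extent and render it through a CIContext (property names like pBackgroundImageView are taken from the question).

CIImage *imageToBlur = [CIImage imageWithCGImage:self.pBackgroundImageView.image.CGImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:imageToBlur forKey:kCIInputImageKey];
[blurFilter setValue:[NSNumber numberWithFloat:10.0] forKey:@"inputRadius"];

// The blur pads the extent; cropping to the original extent
// restores the (640, 1136) size.
CIImage *cropped = [blurFilter.outputImage imageByCroppingToRect:imageToBlur.extent];

// Render through a CIContext so the UIImage is backed by a real
// bitmap instead of an unrendered CIImage (which can display as black).
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:cropped fromRect:cropped.extent];
UIImage *resultImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);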
