Tuesday, August 24, 2010

UIImage-Blur

Posted by hiday81 at 7:49 AM
Many moons ago, I wrote a convolution kernel for Cocoa. It had convenience class functions for many different types of filters, including blurring, embossing, outlining, edge detection, horizontal shift, Laplacian, soften, high pass, etc. Now, this was before Core Image and long before the switch to Intel. I don't remember exactly when I first wrote it, but I'm guessing it was around 2001 or 2002. The method that actually applied the filter to an image used AltiVec if it was available, and if it wasn't, it fell back to a brute-force filter on the CPU.
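In case it helps to picture what all of those filters have in common: each one is just a small weight matrix convolved with the image, and only the weights change from effect to effect. A couple of illustrative 3x3 kernels (textbook examples, not pulled from my original source) look like this:

// A 3x3 box blur: every neighbor weighted equally, then divided by 9,
// so each output pixel becomes the average of its neighborhood.
static const int16_t kBoxBlurKernel[9] = { 1, 1, 1,
                                           1, 1, 1,
                                           1, 1, 1 };

// A 3x3 Laplacian: emphasizes places where the center pixel differs from
// its neighbors, which is why it shows up in edge detection.
static const int16_t kLaplacianKernel[9] = {  0, -1,  0,
                                             -1,  4, -1,
                                              0, -1,  0 };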

Of course, once the switch to Intel happened, the AltiVec code was no longer helpful, and then Apple came out with Core Image, which includes a convolution kernel and all of the filter settings I had created and more. So, I stopped maintaining the code.

Then, when the iPhone came out and didn't have Accelerate or Core Image, I saw a potential use for the old source code. I had a project early on where I needed to be able to blur an image, so I blew the dust off the old code. I didn't convert the entire convolution kernel (I didn't want to go through the effort if it wasn't going to work); instead, I created a blur category on UIImage. And it didn't work.

Pressed for time, I found another solution, since I wasn't sure the processor on the original iPhone could apply a convolution kernel fast enough for my purposes. Still, I included the broken code when I released the source code for replicating the Urban Spoon effect. Today, I received an e-mail from a reader who figured out my rather boneheaded mistake: the convolution filter was fine; I had simply specified the wrong CGImage for my data provider when converting the byte data back to a CGImage.
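To give a sense of where that kind of mistake can hide, here's a minimal sketch (assuming 32-bit RGBA data; this is not the actual category source) of wrapping filtered byte data back up in a UIImage. The data provider has to be created from the filtered bytes, and the new CGImage built from that provider, not from anything belonging to the original image:

#import <UIKit/UIKit.h>

// Wrap already-filtered RGBA bytes in a new UIImage, reusing only the
// original image's dimensions. The caller owns filteredBytes.
static UIImage *ImageFromFilteredBytes(unsigned char *filteredBytes,
                                       CGImageRef originalImage)
{
    size_t width       = CGImageGetWidth(originalImage);
    size_t height      = CGImageGetHeight(originalImage);
    size_t bytesPerRow = width * 4;   // assumes 4 bytes per pixel (RGBA)

    // The provider must wrap the *filtered* bytes, not the original image's data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(
        NULL, filteredBytes, bytesPerRow * height, NULL);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef filteredImage = CGImageCreate(
        width, height,
        8,                  // bits per component
        32,                 // bits per pixel
        bytesPerRow,
        colorSpace,
        (CGBitmapInfo)kCGImageAlphaPremultipliedLast,
        provider,           // the provider holding the filtered bytes
        NULL, false, kCGRenderingIntentDefault);

    UIImage *result = [UIImage imageWithCGImage:filteredImage];

    CGImageRelease(filteredImage);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    return result;
}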

Now, since I wrote that code, Apple has come out with the Accelerate framework for iOS, so it could definitely be made faster. It's unlikely that I'll be able to spend the time converting it to use Accelerate unless I need a convolution kernel for my own client work; I've got too much on my plate right now to tackle it. If anyone's interested in doing the full convolution kernel port, you can check out the source code to Crimson FX. It's an old Cocoa project that may not work anymore, but it has, I believe, the last version of the convolution kernel before I gave up maintaining it. It shouldn't be hard to port the entire convolution kernel to iOS in the same manner: once you get to the underlying byte data, the process is exactly the same (even if the byte order is different), and the code to convert to and from the byte data is in this UIImage-Blur category.
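If anyone does take the Accelerate route, here's a rough sketch of what handing that same byte data over to vImage might look like. This is not part of the released category; it's an untested illustration that assumes 4-byte-per-pixel data and that vImage is available on the OS version you're targeting:

#import <Accelerate/Accelerate.h>

// Run a simple 3x3 box blur over raw 4-byte-per-pixel data using vImage.
// Returns kvImageNoError on success.
static vImage_Error BlurBytesWithAccelerate(void *srcBytes, void *dstBytes,
                                            size_t width, size_t height)
{
    // vImage_Buffer fields are data, height, width, rowBytes.
    vImage_Buffer src = { srcBytes, height, width, width * 4 };
    vImage_Buffer dst = { dstBytes, height, width, width * 4 };

    // Equal weights with divisor 9: each output pixel is the average of
    // its 3x3 neighborhood.
    static const int16_t kernel[9] = { 1, 1, 1,
                                       1, 1, 1,
                                       1, 1, 1 };
    const Pixel_8888 background = { 0, 0, 0, 0 };

    return vImageConvolve_ARGB8888(&src, &dst, NULL, 0, 0,
                                   kernel, 3, 3, 9,
                                   background, kvImageEdgeExtend);
}

Bigger blurs are just bigger kernels (with a matching divisor), which is exactly where letting Accelerate handle the inner loop starts to pay off.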

So, without further ado, I've created a small scaffold project to hold the unaccelerated UIImage-Blur category. Have fun with it and let me know if you use it in anything interesting. If you improve it and would like to share your improvements, let me know, and I'll post it here.

You can find the source code right here. Here's a screenshot of the test scaffold showing the original image and the result after several blurs have been applied. The image is from the Library of Congress Prints and Photographs Collection. Plattsburgh is where I grew up, so this public domain image struck my fancy. I don't know why, but the Army base was spelled Plattsburg, without the ending 'h', even though the city has always been Plattsburgh, with the ending 'h'.

[Screenshot: the test scaffold showing the original image and the image after several blur passes]

Thanks to Anthony Gonsalves for finding my error!
