While the Accelerate framework will be faster than simple serial code, it probably won't give you the best possible performance for blurring an image.
My suggestion would be to use an OpenGL ES 2.0 shader (for devices that support this API) to do a two-pass box blur. Based on my benchmarks, the GPU can handle these kinds of image manipulation operations at 14-28X the speed of the CPU on an iPhone 4, versus the roughly 4.5X speedup that Apple reports for the Accelerate framework in the best cases.
Some code for this is described in this question, as well as in the "Post-Processing Effects on Mobile Devices" chapter of the GPU Pro 2 book (for which the sample code can be found here). If you place your image in a texture and then sample between pixel centers, the GPU's bilinear filtering gives you some blurring for free, which you can then combine with a few fast lookups and averaging operations.
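To make that concrete, the horizontal pass of such a two-pass box blur can be written as a fragment shader along these lines. This is a rough sketch rather than the book's sample code, and the uniform and varying names are placeholders:

    // Sketch: horizontal pass of a two-pass box blur, as an OpenGL ES 2.0
    // fragment shader stored in an Objective-C string constant. Sampling midway
    // between texel centers lets the GPU's bilinear filter average two texels
    // per fetch, so four fetches cover an eight-texel-wide box. The vertical
    // pass is identical, with the offsets applied along y.
    static NSString *const kHorizontalBoxBlurFragmentShader = @""
        "precision highp float;\n"
        "varying vec2 textureCoordinate;\n"
        "uniform sampler2D inputImageTexture;\n"
        "uniform float texelWidth; // 1.0 / texture width in pixels\n"
        "void main()\n"
        "{\n"
        "    vec4 sum = vec4(0.0);\n"
        "    sum += texture2D(inputImageTexture, textureCoordinate + vec2(-3.5 * texelWidth, 0.0));\n"
        "    sum += texture2D(inputImageTexture, textureCoordinate + vec2(-1.5 * texelWidth, 0.0));\n"
        "    sum += texture2D(inputImageTexture, textureCoordinate + vec2( 1.5 * texelWidth, 0.0));\n"
        "    sum += texture2D(inputImageTexture, textureCoordinate + vec2( 3.5 * texelWidth, 0.0));\n"
        "    gl_FragColor = sum * 0.25;\n"
        "}\n";

Because the texture unit does half of the averaging before the shader even runs, each pass only needs a handful of texture fetches, which is where most of the speed advantage over a CPU implementation comes from.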
If you need a starting project to feed images into the GPU for processing, you might be able to use my sample application from the article here. That sample application passes AVFoundation video frames as textures into a processing shader, but you can modify it to send in your particular image data and run your blur operation. You should be able to use my glReadPixels() code to then retrieve the blurred image for later use.
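That readback step looks roughly like the following; the dimensions, RGBA8 pixel format, and helper name here are placeholders rather than code lifted from the sample application:

    #import <UIKit/UIKit.h>
    #import <OpenGLES/ES2/gl.h>

    // Sketch: read the currently bound framebuffer back into a UIImage.
    // Assumes an RGBA8 render target; width/height are the render target size.
    static UIImage *UIImageFromBoundFramebuffer(GLint width, GLint height)
    {
        NSUInteger byteCount = width * height * 4;
        GLubyte *rawPixels = (GLubyte *)malloc(byteCount);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, rawPixels);

        // Hand ownership of the bytes to an NSData so their lifetime is managed.
        NSData *pixelData = [NSData dataWithBytesNoCopy:rawPixels length:byteCount freeWhenDone:YES];
        CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)pixelData);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGImageRef cgImage = CGImageCreate(width, height, 8, 32, 4 * width, colorSpace,
                                           kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                           provider, NULL, NO, kCGRenderingIntentDefault);

        // Note: the rows come back in OpenGL's bottom-up order, so the image is
        // vertically flipped relative to UIKit coordinates.
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CGColorSpaceRelease(colorSpace);
        CGDataProviderRelease(provider);
        return image;
    }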
Since I originally wrote this answer, I've created an open source image and video processing framework for doing these kinds of operations on the GPU. The framework has several different blur types within it, all of which can be applied very quickly to images or live video. The GPUImageGaussianBlurFilter, which applies a standard 9-tap Gaussian blur, runs in 16 ms for a 640x480 frame of video on the iPhone 4. The GPUImageFastBlurFilter is a modified 9-tap Gaussian blur that uses hardware filtering, and it runs in 2.0 ms for that same video frame. Likewise, there's a GPUImageBoxBlurFilter that uses a 5-pixel box and runs in 1.9 ms for the same image on the same hardware. I also have median and bilateral blur filters, although they need a little performance tuning.
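For a still image (rather than live video), running one of these filters looks roughly like this; the convenience method for pulling the result back out has changed names across framework versions, so treat the last line as illustrative and check the headers of the version you're using:

    #import "GPUImage.h"

    // Sketch: blur a still UIImage using the framework's Gaussian blur filter.
    UIImage *inputImage = [UIImage imageNamed:@"photo.jpg"]; // placeholder asset

    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];

    [stillImageSource addTarget:blurFilter];
    [stillImageSource processImage];

    // Extraction method name may vary with the framework version you're using.
    UIImage *blurredImage = [blurFilter imageFromCurrentlyProcessedOutput];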
In my benchmarks, Accelerate doesn't come close to these kinds of speeds, especially when it comes to filtering live video.
It's a lot easier to find OpenGL ES 2.0 material for iOS (or any OS, really) than it used to be a year or so ago.
For something written from a pure iOS perspective, it's hard to beat Jeff LaMarche's chapters from his unpublished book, which start here. You linked to his OpenGL ES 1.1 tutorials, which are also great, but he didn't place his newer 2.0 material on that list.
iPhone 3D Programming by Philip Rideout is a great book that covers both OpenGL ES 1.1 and 2.0. It does not assume that you know OpenGL ES, and he does explain a good bit of the math and other fundamentals required to understand what he's talking about. He gets into some pretty advanced techniques towards the end. However, all of his code is in C++, rather than Objective-C, so that may be a little disconcerting for someone used to Cocoa development. Still, the core C API for OpenGL ES is the same, so it's easy to see what's going on.
If you're looking for particular effects, the OpenGL Shading Language book is still one of the primary resources you can refer to. While written for desktop OpenGL, most of the shading language and shaders presented there translate directly across to OpenGL ES 2.0, with only a little modification required.
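To give a sense of how small those modifications usually are: an OpenGL ES 2.0 fragment shader needs a default precision declaration, which desktop GLSL of that vintage doesn't require, and for simple shaders little else changes. The shader body below is my own trivial example, not one taken from the book:

    // Sketch: a trivial luminance fragment shader. The only ES-specific line is
    // the precision declaration; the rest is the same GLSL you'd write on the desktop.
    static NSString *const kLuminanceFragmentShader = @""
        "precision mediump float; // required by OpenGL ES 2.0, absent in desktop GLSL\n"
        "varying vec2 textureCoordinate;\n"
        "uniform sampler2D inputImageTexture;\n"
        "void main()\n"
        "{\n"
        "    vec4 color = texture2D(inputImageTexture, textureCoordinate);\n"
        "    float luminance = dot(color.rgb, vec3(0.299, 0.587, 0.114));\n"
        "    gl_FragColor = vec4(vec3(luminance), color.a);\n"
        "}\n";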
The books ShaderX6, ShaderX7, GPU Pro, and GPU Pro 2 also have sections devoted to OpenGL ES 2.0, which provide some rendering and tuning hints that you won't find elsewhere. Those are more advanced (and expensive) books, though.
If you're just getting started with OpenGL ES 2.0, it might not be a bad idea to start with GLKit (available only on iOS 5.0 and later), which simplifies some of the normal setup chores around your render buffers and simple shader-based effects. Apple's WWDC 2011 videos have some good material on this, but their 2009 and 2010 videos (if you can find them; some are available in Apple's archive) provide a lot more introductory material around OpenGL ES 2.0.
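As a rough sketch of how little setup code that leaves you with, a minimal ES 2.0 view controller built on GLKit looks something like this. It assumes the view in your storyboard or nib is a GLKView, as in Apple's OpenGL template; the class name is a placeholder:

    #import <GLKit/GLKit.h>
    #import <OpenGLES/ES2/gl.h>

    // Sketch: GLKView / GLKViewController create and present the framebuffer
    // for you, so there's no manual renderbuffer setup code here.
    @interface BlurViewController : GLKViewController
    @end

    @implementation BlurViewController

    - (void)viewDidLoad
    {
        [super viewDidLoad];
        GLKView *view = (GLKView *)self.view; // the storyboard/nib provides a GLKView
        view.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext:view.context];
        // Compile and link your blur shader program here.
    }

    - (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
    {
        glClearColor(0.0, 0.0, 0.0, 1.0);
        glClear(GL_COLOR_BUFFER_BIT);
        // Bind the shader program and draw a full-screen quad with your texture here.
    }

    @end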
Finally, as Andy mentions, I taught a class on the subject as part of my course on iTunes U, which you can download for free here. The course notes for that class can be found here or downloaded as a VoodooPad file here. Be warned that I get fairly technical fairly quickly in the OpenGL ES 2.0 session, so you may want to watch the 1.1 session from the previous semester here. I also talk a little bit about what I've done with OpenGL ES 2.0 in this article about my open source application (whose source code can be grabbed from here, if you'd like to play with a functional OpenGL ES 2.0 iOS application).