Apply a filter algorithm to each frame of the camera
I am working on an iPhone application.
I need to do the following: when the user taps the "Camera" tab, the camera opens inside the view with circle overlays.
I want to apply a filtering algorithm to each camera frame.
I am looking for the best way to do this. Is there a library that can help?
What I am doing currently: I added the OpenCV library and use the cvCaptureFromCam() function from OpenCV (it captures a picture from the camera and returns it).
The idea is that on each timer tick I grab the image, filter it, and put it in a UIImageView. If the timer ticks fast enough, the result looks like continuous video.
However, cvCaptureFromCam() is a little slow, and the whole process uses too much memory.
Any suggestions for a better way are greatly appreciated. Thanks.
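For reference, a rough sketch of the timer-driven loop described above might look like the following. CameraTabViewController, applyMyFilter(), and UIImageFromIplImage() are hypothetical names, and the OpenCV C capture calls are used as the question describes; this only reproduces the described approach, it is not a recommendation, and header paths/capture support vary by OpenCV build.

```objc
#import <UIKit/UIKit.h>
#import <opencv/highgui.h>   // classic OpenCV C API; path may differ per version

IplImage *applyMyFilter(IplImage *frame);       // hypothetical CPU-side filter
UIImage  *UIImageFromIplImage(IplImage *image); // hypothetical IplImage -> UIImage conversion

@interface CameraTabViewController : UIViewController
@property (nonatomic, strong) UIImageView *imageView;
@end

@implementation CameraTabViewController {
    CvCapture *_capture;
    NSTimer   *_frameTimer;
}

- (void)startFiltering
{
    // Open the default camera and poll it on a timer, as described in the question.
    _capture = cvCaptureFromCAM(0);
    _frameTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 / 15.0
                                                   target:self
                                                 selector:@selector(processFrame)
                                                 userInfo:nil
                                                  repeats:YES];
}

- (void)processFrame
{
    IplImage *frame = cvQueryFrame(_capture);   // frame is owned by the capture; do not release it
    if (frame == NULL) return;

    IplImage *filtered = applyMyFilter(frame);  // filter on the CPU
    self.imageView.image = UIImageFromIplImage(filtered);
    cvReleaseImage(&filtered);
}

@end
```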
Anything that's based on CPU-bound processing, such as OpenCV, is probably going to be too slow for live video filtering on current iOS devices. As I state in this answer, I highly recommend looking to OpenGL ES for this.
As mentioned by CSmith, I've written an open source framework called GPUImage for doing this style of GPU-based filtering without having to worry about the underlying OpenGL ES involved. Most of the filters in this framework can be applied to live video at 640x480 at well over the 30 FPS framerate of the iOS camera. I've been gradually adding filters with the goal of replacing all of those present in Core Image, as well as most of the image processing functions of OpenCV. If there's something I'm missing from OpenCV that you need, let me know on the issues page for the project.
Build and run the FilterShowcase example application to see a full listing of the available filters and how they perform on live video sources, and look at the SimplePhotoFilter example to see how you can apply those filters to preview video and photos taken by the camera.
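As a starting point, a minimal sketch of wiring a GPUImage filter into a live camera preview might look like this; the sepia filter and the view setup are placeholders chosen for illustration, and the examples in the framework are the authoritative reference.

```objc
#import <AVFoundation/AVFoundation.h>
#import "GPUImage.h"

// Inside a view controller, keep strong references so the camera and filter stay alive.
GPUImageVideoCamera *videoCamera;
GPUImageSepiaFilter *sepiaFilter;

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Capture 640x480 frames from the rear camera.
    videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                      cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    // Any GPUImage filter can be swapped in here; sepia is just an example.
    sepiaFilter = [[GPUImageSepiaFilter alloc] init];

    // A GPUImageView renders the filtered frames on screen.
    GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:filteredView];

    // Camera -> filter -> view, then start capturing.
    [videoCamera addTarget:sepiaFilter];
    [sepiaFilter addTarget:filteredView];
    [videoCamera startCameraCapture];
}
```

The circle overlays mentioned in the question can be ordinary UIViews or a CAShapeLayer added on top of the GPUImageView, since the filtered video renders like any other view underneath them.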