Render pixels to texture

I need to render dynamically changing pixels to screen using OpenGL ES 2.0 on Android.

I have a method that produces a width*height array of colors (RGBA, 4 GLubytes per pixel). There is also a PutPixel(x, y, color) method that replaces the color of the pixel at coordinates (x, y) in that array.
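For concreteness, a minimal sketch of such a buffer and PutPixel might look like this (the struct and function names are illustrative assumptions, not from the original post):

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical pixel buffer: width*height pixels, 4 bytes each (R,G,B,A),
 * stored row-major, matching what glTexImage2D expects for
 * GL_RGBA / GL_UNSIGNED_BYTE. */
typedef struct {
    int width;
    int height;
    unsigned char *data;   /* width * height * 4 bytes */
} PixelBuffer;

PixelBuffer *CreatePixelBuffer(int width, int height) {
    PixelBuffer *pb = malloc(sizeof *pb);
    pb->width = width;
    pb->height = height;
    pb->data = calloc((size_t)width * height, 4);  /* zero-initialized */
    return pb;
}

/* Replace the color of the pixel at (x, y); color points to 4 bytes R,G,B,A. */
void PutPixel(PixelBuffer *pb, int x, int y, const unsigned char color[4]) {
    memcpy(pb->data + ((size_t)y * pb->width + x) * 4, color, 4);
}
```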

My solution is: on each Draw() call I call glBindTexture, then glTexImage2D with the array as the texture source, and then render that texture onto a quad covering the full viewport.

Is this the right way, or is there a better solution?


Your method might yield performance that is good enough for your needs, particularly if your width*height is less than the full screen resolution. It also has the virtue of being very simple.

However, in general you get much better performance if you can reduce the size of transfers between CPU and GPU during gameplay.
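One common way to cut that transfer size without changing your overall design is dirty-rectangle tracking: record the bounding box of pixels changed since the last frame and upload only that sub-region with glTexSubImage2D instead of re-sending the whole texture. A minimal sketch (the names are assumptions, not from the original post):

```c
#include <limits.h>

/* Bounding box of changed pixels; inclusive bounds, empty when x0 > x1. */
typedef struct {
    int x0, y0, x1, y1;
} DirtyRect;

void DirtyRectReset(DirtyRect *r) {
    r->x0 = r->y0 = INT_MAX;
    r->x1 = r->y1 = INT_MIN;
}

int DirtyRectEmpty(const DirtyRect *r) {
    return r->x0 > r->x1;
}

/* Call this from PutPixel to grow the box around each changed pixel. */
void DirtyRectAdd(DirtyRect *r, int x, int y) {
    if (x < r->x0) r->x0 = x;
    if (y < r->y0) r->y0 = y;
    if (x > r->x1) r->x1 = x;
    if (y > r->y1) r->y1 = y;
}

/* At draw time, if the rect is non-empty, upload just that region, e.g.:
 *
 *   glTexSubImage2D(GL_TEXTURE_2D, 0, r.x0, r.y0,
 *                   r.x1 - r.x0 + 1, r.y1 - r.y0 + 1,
 *                   GL_RGBA, GL_UNSIGNED_BYTE, subRegionPixels);
 *
 * then call DirtyRectReset() for the next frame. Note that ES 2.0 has no
 * GL_UNPACK_ROW_LENGTH, so the sub-region's rows must first be copied
 * into a tightly packed staging buffer. */
```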

Assuming you never need to read the pixel colours back on the CPU, and that you typically render only a few pixels per frame, your best bet is probably a render-to-texture approach. Essentially, you create a renderable texture that is width*height in size and have your PutPixel function render coloured points into that texture (batching them up to minimize draw calls).
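The batching part of that idea can be sketched as follows: PutPixel appends one vertex per changed pixel to a CPU-side array, and once per frame a single glDrawArrays(GL_POINTS, ...) flushes the whole batch into the FBO-attached texture. All names here are illustrative assumptions; the GL flush itself needs a live context, so it is shown as a comment.

```c
#define MAX_BATCH 1024

/* Interleaved per-vertex data: x, y in pixels, then r, g, b, a in 0..1. */
typedef struct {
    float verts[MAX_BATCH * 6];
    int count;
} PointBatch;

/* Append one pixel to the batch; returns 0 when full (caller should
 * flush and retry). */
int BatchPutPixel(PointBatch *b, float x, float y,
                  float r, float g, float bl, float a) {
    if (b->count >= MAX_BATCH) return 0;
    float *v = &b->verts[b->count * 6];
    v[0] = x;  v[1] = y;
    v[2] = r;  v[3] = g;  v[4] = bl;  v[5] = a;
    b->count++;
    return 1;
}

/* Flush sketch (requires a GL context with the FBO and a point shader):
 *
 *   glBindFramebuffer(GL_FRAMEBUFFER, fbo);   // texture attached as color
 *   glViewport(0, 0, width, height);
 *   glVertexAttribPointer(posLoc, 2, GL_FLOAT, GL_FALSE,
 *                         6 * sizeof(float), batch->verts);
 *   glVertexAttribPointer(colLoc, 4, GL_FLOAT, GL_FALSE,
 *                         6 * sizeof(float), batch->verts + 2);
 *   glDrawArrays(GL_POINTS, 0, batch->count);
 *   batch->count = 0;
 */
```

The point is that however many pixels change in a frame, the texture is touched only on the GPU, and the CPU-to-GPU traffic is one small vertex upload plus one draw call.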
