Drawing on top of previous frame with an offscreen texture

I am very new to OpenGL ES 2.0.

I am trying to write a fingerpaint app using OpenGL ES 2.0. The idea is to incrementally draw each frame onto a texture (without calling glClear(int)) and to sample that texture onto a fullscreen quad.
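
In pseudo-form, the per-frame flow I am aiming for is roughly the following (a condensed sketch of the onDrawFrame(GL10) implementation shown further down, nothing new beyond it):

    // Sketch of the intended per-frame flow.
    @Override
    public void onDrawFrame(GL10 unused) {
        // 1) Render only the new strokes into the offscreen texture. No glClear() here,
        //    so everything drawn in earlier frames is kept.
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFrameBufferHandle[0]);
        drawObjects();

        // 2) Switch back to the window surface and present the accumulated texture on a
        //    fullscreen quad. The window itself can be cleared every frame.
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        drawTexturedQuad();
    }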

With the code below, everything works fine when I draw the GlCircle and GlLine directly into the default framebuffer.

But when I try to draw on top of the previous frame by using an offscreen texture, the coordinates on the rendered texture seem to be off:

  • The Y axis is inverted.
  • There is an offset on the Y axis.
  • The screenshot below should show what is wrong visually (the red/blue outlines show the actual touch coordinates on the screen, the white dots are what gets drawn from the texture):

    [screenshot]

    What am I doing wrong? Is there a better way of achieving this?

    Here is my GLSurfaceView.Renderer:

    package com.oaskamay.whiteboard.opengl;
    
    import android.opengl.GLES20;
    import android.opengl.Matrix;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.MotionEvent;
    
    import com.oaskamay.whiteboard.opengl.base.GlSurfaceView;
    import com.oaskamay.whiteboard.opengl.drawable.GlCircle;
    import com.oaskamay.whiteboard.opengl.drawable.GlLine;
    import com.oaskamay.whiteboard.opengl.drawable.GlTexturedQuad;
    
    import java.util.ArrayList;
    import java.util.List;
    
    import javax.microedition.khronos.egl.EGLConfig;
    import javax.microedition.khronos.opengles.GL10;
    
    public class GlDrawingRenderer implements GlSurfaceView.Renderer {
    
        /*
         * Keys used to store/restore the state of this renderer.
         */
        private static final String EXTRA_MOTION_EVENTS = "extra_motion_events";
    
        private static final float[] COLOR_BG = new float[]{0.0f, 0.0f, 0.0f, 1.0f};
        private static final float[] COLOR_BRUSH = new float[]{1.0f, 1.0f, 1.0f, 1.0f};
    
        /*
         * Model-view-projection matrix used to map normalized GL coordinates to the screen's.
         */
        private final float[] mMvpMatrix;
        private final float[] mViewMatrix;
        private final float[] mProjectionMatrix;
    
        private final float[] mTextureProjectionMatrix;
        private final float[] mTextureMvpMatrix;
    
        /*
         * Offscreen texture rendering handles.
         */
        private int[] mFrameBufferHandle;
        private int[] mRenderTextureHandle;
    
        /*
         * Lists of vertices to draw each frame.
         */
        private List<Float> mLineVertexData;
        private List<Float> mCircleVertexData;
    
        /*
         * List of stored MotionEvents and PacketData, required to store/restore state of Renderer.
         */
        private ArrayList<MotionEvent> mMotionEvents;
    
        private boolean mRestoreMotionEvents = false;
    
        private GlLine mLine;
        private GlCircle mCircle;
        private GlTexturedQuad mTexturedQuad;
    
        /*
         * Variables to calculate FPS throughput.
         */
        private long mStartTime = System.nanoTime();
        private int mFrameCount = 0;
    
        public GlDrawingRenderer() {
            mMvpMatrix = new float[16];
            mViewMatrix = new float[16];
            mProjectionMatrix = new float[16];
    
            mTextureProjectionMatrix = new float[16];
            mTextureMvpMatrix = new float[16];
    
            mFrameBufferHandle = new int[1];
            mRenderTextureHandle = new int[1];
    
            mLineVertexData = new ArrayList<>();
            mCircleVertexData = new ArrayList<>();
    
            mMotionEvents = new ArrayList<>();
        }
    
        @Override
        public void onSurfaceCreated(GL10 unused, EGLConfig config) {
            // one time feature initializations
            GLES20.glDisable(GLES20.GL_DEPTH_TEST);
            GLES20.glDisable(GLES20.GL_DITHER);
    
            // clear attachment buffers
            GLES20.glClearColor(COLOR_BG[0], COLOR_BG[1], COLOR_BG[2],
                    COLOR_BG[3]);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    
            // initialize drawables
            mLine = new GlLine();
            mCircle = new GlCircle(5.0f);
            mTexturedQuad = new GlTexturedQuad();
        }
    
        @Override
        public void onSurfaceChanged(GL10 unused, int width, int height) {
            GLES20.glViewport(0, 0, width, height);
    
            // calculate projection, camera matrix and MVP matrix for touch events
            Matrix.setLookAtM(mViewMatrix, 0, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);
            Matrix.orthoM(mProjectionMatrix, 0, 0.0f, width, height, 0.0f, 0.0f, 1.0f);
            Matrix.multiplyMM(mMvpMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);
            mLine.setMvpMatrix(mMvpMatrix);
            mCircle.setMvpMatrix(mMvpMatrix);
    
            // calculate projection and MVP matrix for texture
            Matrix.setIdentityM(mTextureProjectionMatrix, 0);
            Matrix.multiplyMM(mTextureMvpMatrix, 0, mTextureProjectionMatrix, 0, mViewMatrix, 0);
            mTexturedQuad.setMvpMatrix(mTextureMvpMatrix);
    
            // setup buffers for offscreen texture
            GLES20.glGenFramebuffers(1, mFrameBufferHandle, 0);
            GLES20.glGenTextures(1, mRenderTextureHandle, 0);
    
            mTexturedQuad.initTexture(width, height, mRenderTextureHandle[0]);
        }
    
        @Override
        public void onDrawFrame(GL10 unused) {
            // use offscreen texture frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFrameBufferHandle[0]);
            GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                    GLES20.GL_TEXTURE_2D, mRenderTextureHandle[0], 0);
            GlUtil.glCheckFramebufferStatus();
    
            // restore and draw saved MotionEvents onto texture if they exist
            if (mRestoreMotionEvents) {
                mRestoreMotionEvents = false;
                processStoredMotionEvents();
            }
    
            // draw current MotionEvents onto texture
            drawObjects();
    
            // use window frame buffer
            GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
            GLES20.glClearColor(COLOR_BG[0], COLOR_BG[1], COLOR_BG[2], COLOR_BG[3]);
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    
            // draw texture onto full-screen quad onto the window surface
            drawTexturedQuad();
    
            logFps();
        }
    
        /**
         * Draws any available line and circle vertex data. Objects including {@code GlCircle} and
         * {@code GlLine} are to be drawn on the offscreen texture. The offscreen texture will then be
         * drawn onto a fullscreen quad in the default window framebuffer.
         */
        private void drawObjects() {
            if (!mLineVertexData.isEmpty()) {
                drawLines();
            }
    
            if (!mCircleVertexData.isEmpty()) {
                drawCircles();
            }
        }
    
        /**
         * Draws circles. OpenGL points cannot have radii, hence we draw circles on down key events
         * instead of points.
         */
        private void drawCircles() {
            GLES20.glUseProgram(mCircle.getProgramHandle());
    
            // read offsets
            float dx = mCircleVertexData.remove(0);
            float dy = mCircleVertexData.remove(0);
            float dz = mCircleVertexData.remove(0);
            mCircle.setTranslateMatrix(dx, dy, dz);
    
            // read color
            float r = mCircleVertexData.remove(0);
            float g = mCircleVertexData.remove(0);
            float b = mCircleVertexData.remove(0);
            float a = mCircleVertexData.remove(0);
            mCircle.setColor(r, g, b, a);
    
            mCircle.draw();
        }
    
        /**
         * Draws lines from touch start points to touch end points.
         */
        private void drawLines() {
            GLES20.glUseProgram(mLine.getProgramHandle());
    
            // read offsets
            float x1 = mLineVertexData.remove(0);
            float y1 = mLineVertexData.remove(0);
            float z1 = mLineVertexData.remove(0);
            float x2 = mLineVertexData.remove(0);
            float y2 = mLineVertexData.remove(0);
            float z2 = mLineVertexData.remove(0);
            mLine.setTranslateMatrix(x1, y1, z1, x2, y2, z2);
    
            // read color
            float r = mLineVertexData.remove(0);
            float g = mLineVertexData.remove(0);
            float b = mLineVertexData.remove(0);
            float a = mLineVertexData.remove(0);
            mLine.setColor(r, g, b, a);
    
            mLine.draw();
        }
    
        /**
         * Draws the offscreen texture onto the fullscreen quad, and draws the quad onto the default
         * window framebuffer.
         */
        private void drawTexturedQuad() {
            GLES20.glUseProgram(mTexturedQuad.getProgramHandle());
            mTexturedQuad.draw();
        }
    
        /**
         * Processes MotionEvent.
         * Sets vertex and color data based on MotionEvent information.
         *
         * @param event MotionEvent to process.
         * @param store Pass true when processing fresh MotionEvents to store them to support parent
         *              activity recreations, pass false otherwise.
         */
        public void processMotionEvent(MotionEvent event, boolean store) {
            if (store) {
                mMotionEvents.add(MotionEvent.obtain(event));
            }
    
            int action = event.getActionMasked();
            switch (action) {
                case MotionEvent.ACTION_POINTER_DOWN:
                case MotionEvent.ACTION_DOWN:
                case MotionEvent.ACTION_MOVE:
                    // set centroid
                    mCircleVertexData.add(event.getX());
                    mCircleVertexData.add(event.getY());
                    mCircleVertexData.add(0.0f);
    
                    // set color
                    mCircleVertexData.add(COLOR_BRUSH[0]);
                    mCircleVertexData.add(COLOR_BRUSH[1]);
                    mCircleVertexData.add(COLOR_BRUSH[2]);
                    mCircleVertexData.add(COLOR_BRUSH[3]);
                    break;
            }
        }
    
        /**
         * Draws stored MotionEvents.
         * Required to be able to restore state of this Renderer.
         */
        private void processStoredMotionEvents() {
            for (MotionEvent event : mMotionEvents) {
                processMotionEvent(event, false);
                drawObjects();
            }
        }
    
        /**
         * Prints out current frames-per-second throughput.
         */
        private void logFps() {
            mFrameCount++;
            if (System.nanoTime() - mStartTime >= 1000000000L) {
                Log.d("GlDrawingRenderer", "FPS: " + mFrameCount);
                mFrameCount = 0;
                mStartTime = System.nanoTime();
            }
        }
    
        /**
         * Saves line and circle vertex data into the {@code Bundle} argument. Call when the parent
         * {@code GLSurfaceView} calls its corresponding {@code onSaveInstanceState()} method.
         *
         * @param bundle Destination {@code Bundle} to save the renderer state into.
         */
        public void onSaveInstanceState(Bundle bundle) {
            bundle.putParcelableArrayList(EXTRA_MOTION_EVENTS, mMotionEvents);
        }
    
        /**
         * Restores line and circle vertex data from the {@code Bundle} argument. Call when the parent
         * {@code GLSurfaceView} calls its corresponding {@code onRestoreInstanceState(Parcelable)}
         * method.
         *
         * @param bundle Source {@code Bundle} to restore the renderer state from.
         */
        public void onRestoreInstanceState(Bundle bundle) {
            ArrayList<MotionEvent> motionEvents = bundle.getParcelableArrayList(EXTRA_MOTION_EVENTS);
            if (motionEvents != null && !motionEvents.isEmpty()) {
                mMotionEvents.addAll(motionEvents);
                mRestoreMotionEvents = true;
            }
        }
    }
    

    Here is the GlTexturedQuad class:

    package com.oaskamay.whiteboard.opengl.drawable;
    
    import android.opengl.GLES20;
    
    import com.oaskamay.whiteboard.opengl.GlUtil;
    
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.FloatBuffer;
    import java.nio.IntBuffer;
    import java.nio.ShortBuffer;
    
    public class GlTexturedQuad {
    
        /*
         * Vertex metadata: we have 3 coordinates per vertex, and a quad can be drawn with 2 triangles.
         */
        private static final int VERTEX_COORDS = 3;
    
        private static final String VERTEX_SHADER_SOURCE =
                "uniform mat4 u_MvpMatrix;                              \n" +
                "attribute vec4 a_Position;                             \n" +
                "attribute vec2 a_TextureCoord;                         \n" +
                "varying vec2 v_TextureCoord;                           \n" +
                "                                                       \n" +
                "void main() {                                          \n" +
                "   v_TextureCoord = a_TextureCoord;                    \n" +
                "   gl_Position = u_MvpMatrix * a_Position;             \n" +
                "}                                                      \n";

        private static final String FRAGMENT_SHADER_SOURCE =
                "uniform sampler2D u_Texture;                           \n" +
                "varying vec2 v_TextureCoord;                           \n" +
                "                                                       \n" +
                "void main() {                                          \n" +
                "   gl_FragColor = texture2D(u_Texture, v_TextureCoord);\n" +
                "}                                                      \n";
    
        /*
         * Vertex locations. The quad will cover the whole screen, and is in normalized device
         * coordinates. The projection matrix for this quad should be identity.
         */
        private static final float[] VERTICES = {
                -1.0f, +1.0f, 0.0f,
                -1.0f, -1.0f, 0.0f,
                +1.0f, -1.0f, 0.0f,
                +1.0f, +1.0f, 0.0f
        };
    
        /*
         * Describes the order in which vertices are to be rendered.
         */
        private static final short[] VERTICES_ORDER = {
                0, 1, 2,
                0, 2, 3
        };
    
        /*
         * (u, v) texture coordinates to be sent to the vertex and fragment shaders.
         */
        private static final float[] TEXTURE_COORDS = {
                0.0f, 0.0f,
                0.0f, 1.0f,
                1.0f, 1.0f,
                1.0f, 0.0f
        };
    
        private float mMvpMatrix[];
    
        private int mRenderTexture;
    
        /*
         * FloatBuffers used to store vertices and their order to draw.
         */
        private final FloatBuffer mVertexBuffer;
        private final ShortBuffer mVertexOrderBuffer;
        private final FloatBuffer mTextureCoordsBuffer;
    
        /*
         * OpenGL handles to shader program, attributes, and uniforms.
         */
        private final int mProgramHandle;
        private final int mMvpMatrixHandle;
        private final int mPositionHandle;
        private final int mTextureHandle;
        private final int mTextureCoordHandle;
    
        /**
         * Default constructor. Refrain from calling this multiple times as it may be expensive due to
         * compilation of shader sources.
         */
        public GlTexturedQuad() {
            // initialize vertex buffer
            ByteBuffer vertexBuffer = ByteBuffer.allocateDirect(VERTICES.length * 4);
            vertexBuffer.order(ByteOrder.nativeOrder());
            mVertexBuffer = vertexBuffer.asFloatBuffer();
            mVertexBuffer.put(VERTICES);
            mVertexBuffer.position(0);
    
            // initialize vertex order buffer
            ByteBuffer vertexOrderBuffer = ByteBuffer.allocateDirect(VERTICES_ORDER.length * 2);
            vertexOrderBuffer.order(ByteOrder.nativeOrder());
            mVertexOrderBuffer = vertexOrderBuffer.asShortBuffer();
            mVertexOrderBuffer.put(VERTICES_ORDER);
            mVertexOrderBuffer.position(0);
    
            // initialize texture coordinates
            ByteBuffer textureCoordsBuffer = ByteBuffer.allocateDirect(TEXTURE_COORDS.length * 4);
            textureCoordsBuffer.order(ByteOrder.nativeOrder());
            mTextureCoordsBuffer = textureCoordsBuffer.asFloatBuffer();
            mTextureCoordsBuffer.put(TEXTURE_COORDS);
            mTextureCoordsBuffer.position(0);
    
            // compile vertex and fragment shader sources
            int vertexShader = GlUtil.glLoadShader(GLES20.GL_VERTEX_SHADER,
                    VERTEX_SHADER_SOURCE);
            int fragmentShader = GlUtil.glLoadShader(GLES20.GL_FRAGMENT_SHADER,
                    FRAGMENT_SHADER_SOURCE);
    
            // create shader program and attach compiled sources
            mProgramHandle = GLES20.glCreateProgram();
            GLES20.glAttachShader(mProgramHandle, vertexShader);
            GLES20.glAttachShader(mProgramHandle, fragmentShader);
            GLES20.glLinkProgram(mProgramHandle);
    
            // store attribute / uniform handles
            mMvpMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_MvpMatrix");
            mTextureHandle = GLES20.glGetUniformLocation(mProgramHandle, "u_Texture");
            mPositionHandle = GLES20.glGetAttribLocation(mProgramHandle, "a_Position");
            mTextureCoordHandle = GLES20.glGetAttribLocation(mProgramHandle, "a_TextureCoord");
        }
    
        /**
         * Initializes texture components.
         *
         * @param width Width of texture in pixels.
         * @param height Height of texture in pixels.
         * @param renderTexture Handle of the texture that will be rendered into.
         */
        public void initTexture(int width, int height, int renderTexture) {
            mRenderTexture = renderTexture;
    
            // allocate pixel buffer for texture
            ByteBuffer byteBuffer = ByteBuffer.allocateDirect(width * height * 4);
            byteBuffer.order(ByteOrder.nativeOrder());
            IntBuffer texturePixelBuffer = byteBuffer.asIntBuffer();
    
            // initialize texture
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mRenderTexture);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S,
                    GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T,
                    GLES20.GL_CLAMP_TO_EDGE);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER,
                    GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,
                    GLES20.GL_LINEAR);
    
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGB, width, height,
                    0, GLES20.GL_RGB, GLES20.GL_UNSIGNED_SHORT_5_6_5, texturePixelBuffer);
        }
    
        /**
         * Draws this object. The model-view-projection matrix must be set with
         * {@link #setMvpMatrix(float[])}.
         */
        public final void draw() {
            GLES20.glEnableVertexAttribArray(mPositionHandle);
            GLES20.glEnableVertexAttribArray(mTextureCoordHandle);
    
            // set vertex position and MVP matrix in shader
            GLES20.glVertexAttribPointer(mPositionHandle, VERTEX_COORDS, GLES20.GL_FLOAT,
                    false, VERTEX_COORDS * 4, mVertexBuffer);
            GLES20.glUniformMatrix4fv(mMvpMatrixHandle, 1, false, mMvpMatrix, 0);
    
            // bind texture
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mRenderTexture);
    
            // set texture data and coordinate
            GLES20.glVertexAttribPointer(mTextureCoordHandle, 2, GLES20.GL_FLOAT, false, 0,
                    mTextureCoordsBuffer);
            GLES20.glUniform1i(mTextureHandle, 0);
    
            GLES20.glDrawElements(GLES20.GL_TRIANGLES, VERTICES_ORDER.length, GLES20.GL_UNSIGNED_SHORT,
                    mVertexOrderBuffer);
            GLES20.glDisableVertexAttribArray(mPositionHandle);
            GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
        }
    
        /**
         * Sets the model-view-projection matrix in the vertex shader. Necessary to map the normalized
         * GL coordinate system to that of the display.
         *
         * @param mvpMatrix Matrix to use as the model-view-projection matrix.
         */
        public void setMvpMatrix(float[] mvpMatrix) {
            mMvpMatrix = mvpMatrix;
        }
    
        public int getProgramHandle() {
            return mProgramHandle;
        }
    }
    

    Edit (12/11/2015)

    @reto-koradi suggested a better solution: invert the V axis by changing the texture coordinates instead. The fix is just as simple:

    Change this (the initialization of the TEXTURE_COORDS array in GlTexturedQuad):

    /*
     * (u, v) texture coordinates to be sent to the vertex and fragment shaders.
     */
    private static final float[] TEXTURE_COORDS = {
            0.0f, 0.0f,
            0.0f, 1.0f,
            1.0f, 1.0f,
            1.0f, 0.0f
    };
    

    To this:

    /*
     * (u, v) texture coordinates to be sent to the vertex and fragment shaders.
     */
    private static final float[] TEXTURE_COORDS = {
            0.0f, 1.0f,
            0.0f, 0.0f,
            1.0f, 0.0f,
            1.0f, 1.0f
    };
    

    I had already solved the problem myself. The issue was with the projection matrix used for the GlTexturedQuad. The fix was simple:

    I changed this (in onSurfaceChanged(GL10, int, int) of GlDrawingRenderer):

        // calculate projection and MVP matrix for texture
        Matrix.setIdentityM(mTextureProjectionMatrix, 0);
        Matrix.multiplyMM(mTextureMvpMatrix, 0, mTextureProjectionMatrix, 0, mViewMatrix, 0);
        mTexturedQuad.setMvpMatrix(mTextureMvpMatrix);
    

    To this:

        // calculate projection and MVP matrix for texture
        Matrix.orthoM(mTextureProjectionMatrix, 0, -1.0f, 1.0f, 1.0f, -1.0f, 0.0f, 1.0f);
        Matrix.multiplyMM(mTextureMvpMatrix, 0, mTextureProjectionMatrix, 0, mViewMatrix, 0);
        mTexturedQuad.setMvpMatrix(mTextureMvpMatrix);
    

    So now mTextureProjectionMatrix accounts for the inverted V axis of the texture. Again, I am an OpenGL ES 2.0 beginner and my explanation may be wrong, but it works :)

    I hope this post helps someone out there!


    While there seem to be many solutions that fix the inverted screen, you should understand what happens behind the scenes, why it gets inverted in your case at all, and incidentally why your solution is not general.

    OpenGL buffers follow the legacy desktop coordinate system, where the bottom-left corner is the origin and height increases upward. So, because of how the image data is laid out, the raw buffer data has its first pixel at the bottom-left corner rather than the top-left corner. Therefore, if you want to draw into the top-left part of the image, you actually need to draw into the bottom-left part of the buffer (with respect to the presentation).
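
    For example, mapping a point given with a top-left origin (such as Android touch coordinates) into the buffer's bottom-left origin is just a flip of the Y value. This helper is purely illustrative and is not part of the question's code:

        // Illustrative helper (not from the question's code): convert a Y coordinate with a
        // top-left origin (e.g. MotionEvent.getY()) into a bottom-left-origin buffer coordinate.
        private static float toBufferY(float topDownY, float bufferHeight) {
            return bufferHeight - topDownY;
        }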

    So your issue is not in how you present the drawn texture, but in how you actually draw into the texture itself: your coordinate system is inverted when you draw the points. But what difference does it make where the inversion happens?

    It actually makes a huge difference. Because you invert the coordinate system when drawing into the FBO and then invert it again when drawing to the presentation buffer, you get a correct result; your inversion equation is (-1 * -1 = 1). Now what happens if you add some post-processing by inserting another FBO? Then (-1 * -1 * -1 = -1), which means you would need to restore the presentation coordinates back to normal, because they would appear inverted again.

    Another issue arises if you try to read back pixels to generate an image. In every case, if you read from the presentation buffer, the result will be upside down. But if you use an FBO and read the pixels from that buffer, the data should come out correct (which is not true in your setup).
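
    For example, if you read the presentation buffer back with glReadPixels(), the rows come out starting from the bottom, so a CPU-side row flip is needed to build a top-down image. This is only an illustrative sketch, assuming an RGBA buffer of size width x height; it is not part of the question's code:

        // Illustrative sketch: read the pixels back and reorder the rows so that row 0 of
        // 'flipped' is the top of the image (glReadPixels() returns the bottom row first).
        ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4).order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);

        byte[] flipped = new byte[width * height * 4];
        int rowBytes = width * 4;
        for (int y = 0; y < height; y++) {
            pixels.position((height - 1 - y) * rowBytes); // source row, counted from the bottom
            pixels.get(flipped, y * rowBytes, rowBytes);  // destination row, counted from the top
        }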

    So the truly general solution is to respect the orientation when drawing into anything except the presentation buffer. The FBO matrix should not invert the Y coordinate: Y should increase upward. In your case the best thing to do is use a separate ortho call: for the FBO, simply flip the top and bottom values compared to the ones used for the presentation.
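
    Applied to the code in the question, that could look roughly like this in onSurfaceChanged(...). This is only a sketch of the idea: mFboProjectionMatrix is a hypothetical extra matrix that would be used while rendering the strokes into the FBO, while mProjectionMatrix stays as it is for anything drawn directly to the window:

        // Projection for drawing directly to the presentation (window) buffer:
        // top-left origin, Y grows downward, matching the touch coordinates.
        Matrix.orthoM(mProjectionMatrix, 0, 0.0f, width, height, 0.0f, 0.0f, 1.0f);

        // Hypothetical projection for rendering strokes into the FBO: same extents, but with
        // top and bottom swapped so that Y grows upward inside the offscreen texture. The
        // fullscreen quad can then keep the original, unflipped TEXTURE_COORDS.
        Matrix.orthoM(mFboProjectionMatrix, 0, 0.0f, width, 0.0f, height, 0.0f, 1.0f);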
