OpenGL LWJGL Texture Rendering Failure

Working with LWJGL (OpenGL 1.1) and 2D textures, I find myself stuck.

For some reason, LWJGL will not render loaded textures on the 2D layer; instead, I get a white square.

I assume it's highly likely that I'm missing something somewhere in my code. Below is all of the code related to this.

Loading up the OpenGL Environment:

glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // byte-aligned rows for tightly packed RGBA data
glEnable(GL_TEXTURE_2D);
glClearColor(0, 0, 0, 1);
glClearDepth(1);
glClearStencil(0);
glColor3f(1.0f, 1.0f, 1.0f); // white base color, so textures are not tinted

Entering the 2D drawing mode (verified to work for drawing a solid-colored square, tested via glColor3f followed by glVertex2f calls for the dimensions):

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
glDisable(GL_CULL_FACE);
glDisable(GL_DEPTH_TEST);
glViewport(0, 0, w, h);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, Display.getWidth(), Display.getHeight(), 0, 0, 1); // pixel coordinates, origin at top-left
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();

Loading the image (verified to work via a Swing test harness):

this.ref = ref;
if (!fmt.isImage()) {
    return;
}
f = new File("file/path/to/image.gif");
if (!f.exists() || !f.canRead()) {
    return;
}
BufferedImage tmp = ImageIO.read(f);
// glColorModel is a ComponentColorModel with four 8-bit RGBA components, defined elsewhere
WritableRaster raster = Raster.createInterleavedRaster(DataBuffer.TYPE_BYTE,
        tmp.getWidth(), tmp.getHeight(), 4, null);
img = new BufferedImage(glColorModel, raster, false, new Hashtable<>());
Graphics g = img.getGraphics();
g.setColor(new Color(0, 0, 0, 0)); // clear to transparent black
g.fillRect(0, 0, img.getWidth(), img.getHeight());
g.drawImage(tmp, 0, 0, null);
g.dispose();

Converting the image data:

byte[] data = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();
buff = ByteBuffer.allocateDirect(data.length); // LWJGL requires a direct buffer
buff.put(data);
buff.flip();

Moving the data into the OpenGL Buffers:

id = glGenTextures();
glBindTexture(GL_TEXTURE_2D, id);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA,
             GL_UNSIGNED_BYTE, buff); // w and h must match the loaded image's dimensions

Drawing the texture on the 2D surface:

glBindTexture(GL_TEXTURE_2D, tex.getId());
glTranslatef(x, y, 0);
glBegin(GL_QUADS);
{
    glTexCoord2f(0, 0);
    glVertex2f(0, 0);
    glTexCoord2f(w, 0);
    glVertex2f(w, 0);
    glTexCoord2f(w, h);
    glVertex2f(w, h);
    glTexCoord2f(0, h);
    glVertex2f(0, h);
}
glEnd();

I'm not particularly experienced with LWJGL, or even OpenGL for that matter, but this code looks correct to me as a Java developer who has studied a few pieces of example source code.

Ultimately, my question is: how can I fix this? One note: I added the glEnable(GL_TEXTURE_2D) call while typing this, which changed the drawing area from white to a blood-red color.


In OpenGL, texture coordinates run from (0, 0) (the bottom-left corner) to (1, 1) (the top-right corner). When a texture coordinate falls outside this range and GL_TEXTURE_WRAP_S/T is set to GL_REPEAT (the default), the actual lookup position into the texture is calculated by

lookup.xy = fract(texCoord.xy)

so, for example, a coordinate of 2.3 samples the texel at 0.3. In the special case given here, the texture coordinates range from 0 to w, which results in w repetitions of the texture across the quad. Since the quad is also w pixels wide, each of these repetitions is only 1px wide.
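A minimal sketch of the corrected draw call, assuming as in the question that w and h are the quad's pixel dimensions: the vertex positions stay in pixels, while the texture coordinates stay in the [0, 1] range so the texture is drawn exactly once.

glBindTexture(GL_TEXTURE_2D, tex.getId());
glTranslatef(x, y, 0);
glBegin(GL_QUADS);
{
    glTexCoord2f(0, 0); // with the y-down ortho above, t = 0 lands at the top, matching BufferedImage row order
    glVertex2f(0, 0);
    glTexCoord2f(1, 0);
    glVertex2f(w, 0);
    glTexCoord2f(1, 1);
    glVertex2f(w, h);
    glTexCoord2f(0, 1);
    glVertex2f(0, h);
}
glEnd();

Alternatively, setting GL_TEXTURE_WRAP_S and GL_TEXTURE_WRAP_T to GL_CLAMP would suppress the repetition, but normalizing the coordinates is the natural fix when the intent is to show the whole texture exactly once.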
