glDrawArrays vs glDrawElements

Ok so I'm still struggling to get this to work. The important parts of my code are:

def __init__(self, vertices, normals, triangles):
    self.bufferVertices = glGenBuffersARB(1)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferVertices)
    glBufferDataARB(GL_ARRAY_BUFFER_ARB, ADT.arrayByteCount(vertices), ADT.voidDataPointer(vertices), GL_STATIC_DRAW_ARB)
    self.vertices = vertices
    self.bufferNormals = glGenBuffersARB(1)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferNormals)
    glBufferDataARB(GL_ARRAY_BUFFER_ARB, ADT.arrayByteCount(normals), ADT.voidDataPointer(normals), GL_STATIC_DRAW_ARB)
    self.normals = normals
    self.bufferTriangles = glGenBuffersARB(1)

    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)
    glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB, ADT.arrayByteCount(triangles), ADT.voidDataPointer(triangles), GL_STATIC_DRAW_ARB)

    self.triangles = triangles
    glDisableClientState(GL_VERTEX_ARRAY)  # (Not sure if any of the following lines influence anything)
    glDisableClientState(GL_NORMAL_ARRAY)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0)
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0)

I don't think there is anything wrong here, from what I've read so far about VBOs. So now I have my vertex, normal (not used yet) and triangle index buffers. Now for the actual draw:

def draw(self, type):
    glDisableClientState(GL_VERTEX_ARRAY)  
    glDisableClientState(GL_NORMAL_ARRAY)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0)
    glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0)
    # (Again, not sure if the lines above have any use.)
    glEnableClientState(GL_VERTEX_ARRAY)         
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferVertices)
    glVertexPointer(3, GL_FLOAT, 0, None)

    glEnableClientState(GL_NORMAL_ARRAY)
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferNormals)
    glNormalPointer(GL_FLOAT, 0, None)

    if type == GL_POINTS:    
        #glDrawArrays( GL_POINTS, 0, len(self.vertices) );    
        glDrawElements(type, len(self.vertices), GL_UNSIGNED_SHORT, 0)
    else:
        #glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)  # (If I uncomment this, it doesn't seem to make any difference?!)
        #glDrawArrays( GL_TRIANGLES, 0, len(self.triangles) );  
        glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, 0)  # (What does it draw now, since GL_ELEMENT_ARRAY_BUFFER_ARB is bound to 0?!)

Now, glDrawArrays works. But in the case where I have to draw my triangles, it doesn't draw the triangles I have defined in bufferTriangles (this is normal from what I've read, since glDrawArrays doesn't use indices? Or am I wrong here?). The problem is that if I try to use glDrawElements, everything crashes with:

Exception Type:  EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x000000003150ebbc
Crashed Thread:  0

Thread 0 Crashed:
0   com.apple.GeForce8xxxGLDriver   0x1a3e7050 gldGetTextureLevel + 743600
1   com.apple.GeForce8xxxGLDriver   0x1a3e7563 gldGetTextureLevel + 744899
2   GLEngine                        0x1a206eee gleDrawArraysOrElements_VBO_Exec + 1950

Now, what am I missing here? From what I can understand, I'm probably passing a bad pointer somewhere? Note that even if I try to use glDrawElements(type, 24, GL_UNSIGNED_INT, 0) it still crashes, even though the number of triangles defined is way larger, so I don't think it has anything to do with the size.

Regards, Bogdan

EDIT: OK, so now I've done some extra checking and here is my current situation: I've changed len(triangles) to ADT.byteCount, but no solution yet. So I checked all the data I'm getting and it's like this: the vertices array contains ~60000 * 3 = 180000 GL_FLOAT entries, as does the normals array. Since there are only < 62535 vertices, I'm using unsigned short for the triangles, so len(triangles) is ~135000. I've also changed the call to glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, 0). I've also checked that all the data in the triangles array is between 0 and 62534, as I was thinking maybe some out-of-range index slipped through. What else could be wrong here? Oh, and how does glDrawElements(GL_POINTS, ...) work? Does it also need some kind of indices?
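For reference, the checks I mean are roughly along these lines (just a sketch, assuming the arrays are numpy arrays, which is what ADT.arrayByteCount suggests):

import numpy as np

# Hypothetical sanity check: flat float32 x,y,z triples, uint16 indices.
assert vertices.dtype == np.float32
assert len(vertices) % 3 == 0              # 3 floats per vertex
vertex_count = len(vertices) // 3          # ~60000 vertices

assert triangles.dtype == np.uint16        # matches GL_UNSIGNED_SHORT
assert triangles.min() >= 0
assert triangles.max() < vertex_count      # every index hits a real vertex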

EDIT2: I've updated the code above, and as noted there, glDrawElements now draws my GL_POINTS, but I'm not sure where it gets its indices from. Or are they not needed in the case of GL_POINTS? And for GL_TRIANGLES, it works like this, with glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles) commented out, but again, what kind of indices does it use now that the element buffer is bound to 0?! Another thing is that glDrawElements will not draw all the points that glDrawArrays does. To better explain:

glDrawArrays( GL_POINTS, 0, len(self.vertices) );

This draws all my points correctly.

glDrawElements(type, len(self.vertices), GL_UNSIGNED_SHORT, 0)

This seems to draw visibly fewer points than glDrawArrays. Now, the funny thing is that if I pass a count like 10 * len(self.vertices) to glDrawElements, it draws all the points (some maybe twice or more; can I check this?), but shouldn't that be expected to crash?

Regards

EDIT3

Some more precise info about the arrays:

vertices - an array of floats:

len(vertices) = 180000, byteCount(vertices) = 720000

triangles - an array of numpy.uint16:

len(triangles) = 353439, byteCount(triangles) = 706878, min(triangles) = 0, max(triangles) = 59999, so they should be pointing to valid vertices

The drawing is done:

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, 0)

UPDATE

OK, just when I thought I understood how this should work, I tried to skip the VBO for the elements and went with just:

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, ADT.voidDataPointer(self.triangles))

Now, not only does this work and draw all my triangles perfectly, but the FPS is better. Shouldn't the VBO be faster? And what could cause the above approach to work, but the following to crash:

glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)
glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB, ADT.arrayByteCount(triangles), ADT.voidDataPointer(triangles), GL_STATIC_DRAW_ARB)
glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, 0)

I have no experience with Python GL, but I think I spotted something. You use len(self.triangles) in the call to glDrawElements, so I suppose that gives you the number of indices in the triangles array. But why, then, use len(triangles) as the size in glBufferData and not ADT.arrayByteCount like in the other calls? That makes your buffer too small, as it contains only len(triangles) bytes, although triangles contains unsigned shorts. If triangles really contained bytes (which I doubt), you would have to use GL_UNSIGNED_BYTE in glDrawElements.
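To make the bytes-versus-elements distinction concrete, a small sketch (assuming a numpy uint16 index array; .nbytes stands in for ADT.arrayByteCount here):

import numpy as np
from OpenGL.GL.ARB.vertex_buffer_object import *

triangles = np.array([0, 1, 2, 2, 3, 0], dtype=np.uint16)

index_count = len(triangles)     # 6  -> number of indices, the count for glDrawElements
byte_count  = triangles.nbytes   # 12 -> number of bytes, the size for glBufferData

bufferTriangles = glGenBuffersARB(1)
glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, bufferTriangles)
glBufferDataARB(GL_ELEMENT_ARRAY_BUFFER_ARB, byte_count, triangles, GL_STATIC_DRAW_ARB)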

EDIT: Based on your edits, a few more answers. Of course glDrawElements(GL_POINTS, ...) needs indices, too. It just uses every single index to draw a point, instead of every three indices to draw a triangle. It's just that for points you often don't need glDrawElements, as you don't reuse vertices anyway, but you still need indices for it; it doesn't magically become a glDrawArrays call under the hood.
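To illustrate, a rough sketch with client-side arrays (no VBOs, just to show where the indices come from):

import numpy as np
from OpenGL.GL import *

points  = np.array([0.0, 0.0, 0.0,  1.0, 0.0, 0.0,  0.0, 1.0, 0.0],
                   dtype=np.float32)        # 3 vertices, 3 floats each
indices = np.arange(3, dtype=np.uint16)     # one index per point

glEnableClientState(GL_VERTEX_ARRAY)
glVertexPointer(3, GL_FLOAT, 0, points)

glDrawArrays(GL_POINTS, 0, 3)                                        # sequential vertices, no indices
glDrawElements(GL_POINTS, len(indices), GL_UNSIGNED_SHORT, indices)  # same points, each looked up by index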

And keep in mind that the vertices array contains floats and glDrawArrays draws vertices, so you have to draw len(vertices)/3 vertices. Just remember: an element is an index (of a single vertex), not a triangle, and a vertex is 3 floats (or whatever you specified in glVertexPointer), not just one.

But if your triangles array really contains tuples of 3 indices (and therefore len(triangles) is the triangle count and not the index count), you would have to draw 3*len(triangles) elements (indices). And if your vertices array contains vectors and not just floats, then drawing len(vertices) vertices in the glDrawArrays call is correct. It would therefore be nice to see their declarations to be sure.
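As a concrete example of how the counts change with the layout (assuming numpy arrays, since the declarations aren't shown, and assuming the vertex pointer and element buffer are already set up as in the question):

import numpy as np
from OpenGL.GL import *

flat_vertices = np.zeros(180000, dtype=np.float32)       # one float per entry
glDrawArrays(GL_POINTS, 0, len(flat_vertices) // 3)      # 60000 vertices

vec_vertices = np.zeros((60000, 3), dtype=np.float32)    # one (x, y, z) vector per entry
glDrawArrays(GL_POINTS, 0, len(vec_vertices))            # 60000 vertices

flat_triangles = np.zeros(353439, dtype=np.uint16)       # one index per entry
glDrawElements(GL_TRIANGLES, len(flat_triangles), GL_UNSIGNED_SHORT, None)

tri_triangles = np.zeros((117813, 3), dtype=np.uint16)   # one (i, j, k) triple per entry
glDrawElements(GL_TRIANGLES, 3 * len(tri_triangles), GL_UNSIGNED_SHORT, None)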


In my experience, the Python OpenGL wrapper is very buggy once you start using some of the more advanced OpenGL calls. Many calls seem to cause a crash for no reason, and sometimes work if you replace them with an equivalent sequence of different calls... I switched programming languages for OpenGL rather than deal with these issues.


The reason why

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, ADT.voidDataPointer(self.triangles))

works and

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, 0)

doesn't is that PyOpenGL expects None as the void pointer, rather than 0. Be careful when using OpenGL examples written in C, because they use (void*)0 as the void pointer; that isn't interpreted correctly as a pointer by PyOpenGL, which instead treats 0 as a non-void value.

Instead, you should use

glDrawElements(GL_TRIANGLES, len(self.triangles) , GL_UNSIGNED_SHORT, None)

(See also https://stackoverflow.com/a/14365737/478380)
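Putting it together with the element VBO from the question, the draw path might look something like this (a sketch based on the question's code, not tested):

glEnableClientState(GL_VERTEX_ARRAY)
glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferVertices)
glVertexPointer(3, GL_FLOAT, 0, None)      # None = offset 0 into the bound vertex VBO

glEnableClientState(GL_NORMAL_ARRAY)
glBindBufferARB(GL_ARRAY_BUFFER_ARB, self.bufferNormals)
glNormalPointer(GL_FLOAT, 0, None)

glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, self.bufferTriangles)
glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, None)   # None, not 0

glBindBufferARB(GL_ELEMENT_ARRAY_BUFFER_ARB, 0)
glBindBufferARB(GL_ARRAY_BUFFER_ARB, 0)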
