Topic Title: OpenGL driver bug with GL_UNSIGNED_BYTE via glVertexAttribPointer?
Created On: 08/20/2014 01:04 PM
 08/20/2014 01:04 PM
jkenneally
Peon

Posts: 2
Joined: 08/20/2014

Hello,

I'm working on a slightly older ATI Mobility Radeon 5650 graphics card, with Catalyst drivers version 14.100.0.0. I'm targeting OpenGL 3.3 (and GLSL 330).

I have a test application that creates a Vertex Array Object with a simple set of 2 attribute arrays, one containing vertices of 3 floats each and the other containing color values of 3 unsigned bytes each. I'm using glVertexAttribPointer and telling OpenGL to normalize my unsigned byte colors into floats for the shader.
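
Roughly, the setup looks like this (just a minimal sketch of what I described; the buffer names, the attribute locations 0/1, the two-VBO tightly packed layout, and vertexCount/positions/colors are placeholders, not my exact code):

GLuint vao, vboPos, vboCol;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);

/* Attribute 0: positions, 3 floats per vertex. */
glGenBuffers(1, &vboPos);
glBindBuffer(GL_ARRAY_BUFFER, vboPos);
glBufferData(GL_ARRAY_BUFFER, vertexCount * 3 * sizeof(GLfloat), positions, GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);
glEnableVertexAttribArray(0);

/* Attribute 1: colors, 3 unsigned bytes per vertex, normalized to [0,1] for the shader. */
glGenBuffers(1, &vboCol);
glBindBuffer(GL_ARRAY_BUFFER, vboCol);
glBufferData(GL_ARRAY_BUFFER, vertexCount * 3 * sizeof(GLubyte), colors, GL_STATIC_DRAW);
glVertexAttribPointer(1, 3, GL_UNSIGNED_BYTE, GL_TRUE, 0, (void*)0);
glEnableVertexAttribArray(1);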

Now into my array I pack 6 lines representing the world axes, each using a different color. Then I pack a series of lines representing a 2D grid across the world XZ plane, all of one color.

I render in two steps using glDrawElements(). I bind my VAO, then the world axes are rendered at one line width. Next, a second glDrawElements is called for the grid lines after changing the line width. I just adjust the start offset and index count to shift where each call draws from in the VAO.
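
The two passes look roughly like this (the index counts, byte offsets, line widths, and GL_UNSIGNED_SHORT index type here are illustrative, not my actual numbers):

glBindVertexArray(vao);

/* 6 axis lines = 12 indices at the start of the index buffer. */
glLineWidth(2.0f);
glDrawElements(GL_LINES, 12, GL_UNSIGNED_SHORT, (void*)0);

/* Grid lines follow immediately after the axis indices. */
glLineWidth(1.0f);
glDrawElements(GL_LINES, gridIndexCount, GL_UNSIGNED_SHORT, (void*)(12 * sizeof(GLushort)));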

Now here is the issue:  

On some (and only some) ATI cards, when I draw the grid lines, the last 6 lines seem to get random colors. Sometimes they are black/white, sometimes they are the colors of my world axis lines. It is significant that it is exactly the last 6 lines, since that is how far into the VAO the grid set (and its colors) is offset past the 6 lines of the world axes. The rest of the grid is correct, so I know this isn't as simple as specifying the wrong start offset/index count in glDrawElements.

Now, if I skip changing line widths and just render the entire VAO in one draw call, everything appears correctly.

Also, if I change to using GL_FLOAT for my source color components instead of GL_UNSIGNED_BYTE (and set normalized to GL_FALSE when calling glVertexAttribPointer), but change nothing else, it works fine with the two draw calls.
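
That is, the color attribute becomes something like this (attribute location 1 is a placeholder again):

/* 3 GLfloats per color, already in [0,1], so no normalization requested. */
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 0, (void*)0);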

Lastly, the GL_UNSIGNED_BYTE based implementation works fine on all NVIDIA and Intel graphics adapters we tested. It even worked fine on a few desktop ATI cards.

For once, I'm starting to strongly suspect a driver bug instead of this being my fault... =P

 

 

 08/20/2014 02:49 PM
jkenneally
Peon

Posts: 2
Joined: 08/20/2014

I think I've found the solution to the issue, and I do think it is an ATI bug.

If I use 4-byte colors (i.e. add an alpha component) instead of just passing RGB triples, everything works fine.
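
In other words, the workaround boils down to something like this (buffer/variable names and attribute location 1 are placeholders again, not my exact code):

/* Pad each color to 4 unsigned bytes (RGBA) and pass size 4 instead of 3. */
glBindBuffer(GL_ARRAY_BUFFER, vboCol);
glBufferData(GL_ARRAY_BUFFER, vertexCount * 4 * sizeof(GLubyte), rgbaColors, GL_STATIC_DRAW);
glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, GL_TRUE, 0, (void*)0);
glEnableVertexAttribArray(1);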

Knowing this and digging around some more, I have found other references to ATI drivers not being happy with 3-byte color attributes.  =P
