I've gone through many tutorials and I still can't figure out what I'm doing wrong. If I disable textures, it correctly draws a white cube, but when I enable textures it draws nothing (or a black square, the same color as the background). The only thing I suspect is GL_INT, because up to this point I have only used unsigned bytes in my projects. Thanks in advance.
Edit: my window is RGB
Here is my code:
self.texture = glGenTextures(1)
glPixelStorei(GL_UNPACK_ALIGNMENT, 1)
glPixelStorei(GL_PACK_ALIGNMENT, 1)
glBindTexture(GL_TEXTURE_2D, self.texture)
pix = [255, 255, 255, 0, 0, 0, 255, 255, 255, 0, 0, 0]
glMatrixMode(GL_PROJECTION)
glLoadIdentity()
glOrtho(0.0, 640, 480, 0.0, 0.0, 100.0)
glMatrixMode(GL_MODELVIEW)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL)
glTexImage2D(GL_TEXTURE_2D, 0, 3, 2, 2, 0, GL_RGB, GL_INT, pix)
glEnable(GL_TEXTURE_2D)
glDisable(GL_LIGHTING)
glBegin(GL_QUADS)
glTexCoord2i(0, 0); glVertex2i(100, 100)
glTexCoord2i(0, 1); glVertex2i(100, 200)
glTexCoord2i(1, 1); glVertex2i(200, 200)
glTexCoord2i(1, 0); glVertex2i(200, 100)
glEnd()
glutSwapBuffers()
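For comparison, here is what I believe the same 2x2 upload would look like with unsigned bytes (a minimal sketch, not my actual code; it assumes a current GL context and an already-bound texture). If I understand the conversion rules, GL_INT data is normalized against the full signed 32-bit range, so a value of 255 lands at effectively 0.0 (black), whereas GL_UNSIGNED_BYTE maps 255 to 1.0 (white):

import numpy as np
from OpenGL.GL import *

# Same 2x2 black/white checkerboard, but as unsigned bytes:
# 255 -> 1.0 under GL_UNSIGNED_BYTE normalization.
pix = np.array([255, 255, 255, 0, 0, 0,
                255, 255, 255, 0, 0, 0], dtype=np.uint8)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 2, 2, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pix)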
OK, I found out through trial and error that the array needs to be 3D. This array (3 x 4 x 3: height, width, RGB) worked for me when passed to glTexImage2Df:
pix = [[[1.0, 1.0, 0.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]],
       [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]],
       [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]]]
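For completeness, here is a minimal self-contained sketch of how I understand the whole upload fits together. I am using the plain glTexImage2D entry point with GL_FLOAT, which should be equivalent to the glTexImage2Df convenience wrapper (my understanding is that the wrapper infers the width, height, and type from the array shape; treat that as an assumption rather than a documented signature):

import numpy as np
from OpenGL.GL import *

# 3 rows x 4 columns of RGB floats in [0.0, 1.0]; shape (height, width, 3).
pix = np.array(
    [[[1.0, 1.0, 0.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]],
     [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]],
     [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]]],
    dtype=np.float32)

# Assumes a current GL context (e.g. created via GLUT).
tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
glPixelStorei(GL_UNPACK_ALIGNMENT, 1)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
# Explicit width=4, height=3; float data in [0, 1] needs no integer normalization.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 4, 3, 0, GL_RGB, GL_FLOAT, pix)
glEnable(GL_TEXTURE_2D)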