Texture problem

Heinrich Janzing h.janzing at myrealbox.com
Mon Dec 11 13:27:53 PST 2006


Hi,

Since I failed to draw window contents using an OpenGL texture (via the 
Composite extension and the GLX texture_from_pixmap extension), I decided to 
test with a simple gray texture instead. Just as with the textures created by 
glXBindTexImageEXT, the texture isn't being applied (only the vertex color 
shows). I'm working on Xgl, linking against the Mesa libGL on Ubuntu Edgy Eft, 
with the latest ATI fglrx drivers and an X1600 Mobility. I'm using an 
orthographic projection.
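
For context, the Composite + texture_from_pixmap path I'm describing works 
roughly like this (heavily simplified; dpy, fbconfig, window and textureName 
are placeholders for values my real code obtains elsewhere):

=== START sketch: texture_from_pixmap binding (simplified) ===
// Rough sketch of the Composite + GLX_EXT_texture_from_pixmap binding path.
// dpy, fbconfig, window and textureName stand for values obtained elsewhere;
// all error handling is omitted.
#include <X11/extensions/Xcomposite.h>
#include <GL/glx.h>
#include <GL/glxext.h>

void bindWindowToTexture(Display* dpy, GLXFBConfig fbconfig,
                         Window window, GLuint textureName)
{
	// The redirected window's contents live in this pixmap (Composite).
	Pixmap pixmap = XCompositeNameWindowPixmap(dpy, window);

	const int pixmapAttribs[] = {
		GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
		GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGB_EXT,
		None
	};
	GLXPixmap glxPixmap = glXCreatePixmap(dpy, fbconfig, pixmap, pixmapAttribs);

	// The EXT entry points are resolved at runtime.
	PFNGLXBINDTEXIMAGEEXTPROC bindTexImage = (PFNGLXBINDTEXIMAGEEXTPROC)
		glXGetProcAddress((const GLubyte*) "glXBindTexImageEXT");
	PFNGLXRELEASETEXIMAGEEXTPROC releaseTexImage = (PFNGLXRELEASETEXIMAGEEXTPROC)
		glXGetProcAddress((const GLubyte*) "glXReleaseTexImageEXT");

	glBindTexture(GL_TEXTURE_2D, textureName);
	bindTexImage(dpy, glxPixmap, GLX_FRONT_LEFT_EXT, NULL);
	// ... draw the textured quad here ...
	releaseTexImage(dpy, glxPixmap, GLX_FRONT_LEFT_EXT);
}
=== END sketch: texture_from_pixmap binding (simplified) ===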

Here's the testtexture.cpp I'm currently using for testing:

=== START testtexture.cpp ===
#include <GL/gl.h>
#include <GL/glext.h> // for GL_TEXTURE_RECTANGLE_ARB
#include <iostream>
// The TestTexture and GLTexture declarations come from the project's own
// headers (omitted here).

using namespace std;

TestTexture::TestTexture(bool powerOfTwo)
 : GLTexture()
{
	int width, height;
	if (powerOfTwo)
	{
		width = 512;
		height = 512;
	}
	else
	{
		width = 500;
		height = 400;
	}
	
	int bytes = 3 * width * height;
	GLubyte* data = new GLubyte[bytes];
	for (int i=0; i < bytes; ++i)
		data[i] = 128;
	
	if (powerOfTwo)
		setTarget(GL_TEXTURE_2D);
	else
		setTarget(GL_TEXTURE_RECTANGLE_ARB);
	
	fetchUniqueName(); // This will call glGenTextures and set the name
	
	glBindTexture(getTarget(), getName());
	glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL); // DECAL: for RGB textures, the texel replaces the vertex color
	glTexParameterf(getTarget(), GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameterf(getTarget(), GL_TEXTURE_MAG_FILTER, GL_LINEAR);
	glTexImage2D(getTarget(), 0, 3 /* 3 components, i.e. GL_RGB */, width, height,
	             0, GL_RGB, GL_UNSIGNED_BYTE, data);
	glBindTexture(getTarget(), 0);
	
	setRealWidth(width);
	setRealHeight(height);
	
	delete[] data;
}


TestTexture::~TestTexture()
{
	
}

void TestTexture::enable( )
{
	glEnable(getTarget());
	glBindTexture(getTarget(), getName());
}

void TestTexture::disable( )
{
	glBindTexture(getTarget(), 0);
	glDisable(getTarget());
}

=== END testtexture.cpp ===
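
For reference, the GLTexture base class interface that testtexture.cpp relies 
on looks roughly like this (paraphrased, so treat it as approximate rather 
than the exact declaration):

=== START sketch: GLTexture interface (approximate) ===
// Approximate shape of the GLTexture base class; only the members that
// testtexture.cpp and the rendering code actually use are shown.
#include <GL/gl.h>
#include <GL/glext.h> // GL_TEXTURE_RECTANGLE_ARB

class GLTexture
{
public:
	GLTexture() : m_target(GL_TEXTURE_2D), m_name(0),
	              m_realWidth(0), m_realHeight(0) {}
	virtual ~GLTexture() {}

	virtual void enable() = 0;
	virtual void disable() = 0;

	GLenum getTarget() const { return m_target; }
	GLuint getName() const { return m_name; }
	int getRealWidth() const { return m_realWidth; }
	int getRealHeight() const { return m_realHeight; }

	// 2D textures take [0,1] coordinates; rectangle textures take texels.
	bool isNormalized() const { return m_target != GL_TEXTURE_RECTANGLE_ARB; }

protected:
	void setTarget(GLenum target) { m_target = target; }
	void setRealWidth(int w) { m_realWidth = w; }
	void setRealHeight(int h) { m_realHeight = h; }
	void fetchUniqueName() { glGenTextures(1, &m_name); }

private:
	GLenum m_target;
	GLuint m_name;
	int m_realWidth, m_realHeight;
};
=== END sketch: GLTexture interface (approximate) ===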

Of course, enable() and disable() are declared virtual at the parent class 
level. Basically I just create a flat gray RGB image in memory and upload it 
into an OpenGL texture. I have also tried GL_MODULATE instead of GL_DECAL. 
I'm currently testing with powerOfTwo = true. And here's the code where the 
rendering happens:

=== START code from ClientWindow::render() ===

	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	
	GLTexture* p_texture;
	
	glDisable(GL_DEPTH_TEST); // These three are just to be sure...
	glDisable(GL_LIGHTING);
	glDisable(GL_BLEND);

	glColor4f(0.0f, 0.0f, 0.0f, 1.0f); // black, so with GL_DECAL any gray on screen must come from the texture

	p_texture = Settings::getTestTexture();
	glGetError(); // clear any previously pending error
	p_texture->enable();
	
	double texCoordMulX, texCoordMulY;
	if (! p_texture->isNormalized())
	{
		texCoordMulX = p_texture->getRealWidth();
		texCoordMulY = p_texture->getRealHeight();
	}
	else
	{
		texCoordMulX = 1;
		texCoordMulY = 1;
	}
	
	int ymod;
	if (Settings::isYInverted())
		ymod = -1;
	else
		ymod = 1;
	
	glBegin(GL_QUADS);
		glNormal3f(0.0f, 0.0f, 1.0f);
		glTexCoord2f(0.0f, texCoordMulY);
		glVertex3f(getX(), getY(), getZ());
		glTexCoord2f(0.0f, 0.0f);
		glVertex3f(getX(), getY() + ymod * getHeight(), getZ());
		glTexCoord2f(texCoordMulX, 0.0f);
		glVertex3f(getX() + getWidth(), getY() + ymod * getHeight(), getZ());
		glTexCoord2f(texCoordMulX, texCoordMulY);
		glVertex3f(getX() + getWidth(), getY(), getZ());
	glEnd();
	
	p_texture->disable();

=== END code from ClientWindow::render() ===
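
The projection itself is set up elsewhere; it is essentially the usual 
pixel-aligned orthographic setup (screenWidth and screenHeight are 
placeholders for the real output size, and whether y runs down or up is what 
the Settings::isYInverted() flag above accounts for):

=== START sketch: projection setup (simplified) ===

	// Roughly how the orthographic projection is set up elsewhere in my code;
	// screenWidth and screenHeight are placeholders for the real output size.
	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	glOrtho(0.0, screenWidth, screenHeight, 0.0, -1.0, 1.0); // y grows downwards here
	glMatrixMode(GL_MODELVIEW);

=== END sketch: projection setup (simplified) ===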

This results in a black square. Since the texture environment is GL_DECAL, the 
gray texture should completely replace the black vertex color, so a black 
square means the texture isn't being sampled at all. (With GL_MODULATE a black 
vertex color would give a black result even when texturing works, which is why 
GL_DECAL is the meaningful test here.) No OpenGL errors are generated while 
this code runs; at least, calling glGetError() afterwards returns GL_NO_ERROR. 
I would have preferred to ask this on an OpenGL mailing list first, since I'm 
probably doing something wrong, but the only list I could find appears to have 
been dead for over a year.
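
In case it is useful, a quick sanity check I could drop in right after 
p_texture->enable() (purely hypothetical, not part of the code above; dpy and 
ctx would be the display and GLX context the renderer already holds) would be:

=== START sketch: sanity check after enable() (hypothetical) ===

	// Confirm the target is enabled, the upload took, and rendering is direct.
	GLint level0Width = 0;
	glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &level0Width);
	std::cout << "GL_TEXTURE_2D enabled: "
	          << (glIsEnabled(GL_TEXTURE_2D) ? "yes" : "no") << std::endl;
	std::cout << "level 0 width: " << level0Width << std::endl;
	std::cout << "direct rendering: "
	          << (glXIsDirect(dpy, ctx) ? "yes" : "no") << std::endl;

=== END sketch: sanity check after enable() (hypothetical) ===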

Any help is greatly appreciated!

Thanks,
Heinrich


