
Same texture is used for both meshes, OpenGL and CG



#1 Nanook   Members   -  Reputation: 494

Posted 11 September 2012 - 03:26 PM

I'm using OpenGL with Cg and its FX format.

I call Enable and Disable before and after drawing each piece of geometry, with two different textures, but both of my cube meshes are drawn with the same texture. Does the setup and Enable/Disable code look correct? I have called cgGLSetManageTextureParameters with CG_TRUE, so I shouldn't need to call cgGLEnableTextureParameter.

I have two instances of this class, and m_glTexture is two separate texture objects, but the CGparameter passed into the constructor is the same for both, since they use the same effect. I have checked that the texture data is different, so it really is creating two separate textures on the GPU, but only one of them ends up bound for both meshes.

TE::Render::APITexture2D::APITexture2D(Texture& texture, CGparameter cgParameter)
{
    texture.Prepare();

    glGenTextures(1, &m_glTexture);
    assert(glGetError() == GL_NO_ERROR);

    glBindTexture(GL_TEXTURE_2D, m_glTexture);
    assert(glGetError() == GL_NO_ERROR);

    // Point the Cg sampler parameter at this texture object.
    cgGLSetTextureParameter(cgParameter, m_glTexture);
    assert(glGetError() == GL_NO_ERROR);

    // Apply the sampler state specified in the effect.
    cgSetSamplerState(cgParameter);
    assert(glGetError() == GL_NO_ERROR);

    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 GL_RGBA,  // legacy "4" means four components, i.e. GL_RGBA
                 texture.GetImage().GetWidth(),
                 texture.GetImage().GetHeight(),
                 0,
                 APIMapping::s_colorType[texture.GetImage().GetColorType()],
                 GL_UNSIGNED_BYTE,
                 texture.GetImage().GetDataPtr());
    assert(glGetError() == GL_NO_ERROR);
}

TE::Render::APITexture2D::~APITexture2D()
{
}

void TE::Render::APITexture2D::Enable(CGparameter cgParameter)
{
    glBindTexture(GL_TEXTURE_2D, m_glTexture);
    cgSetSamplerState(cgParameter);
    assert(glGetError() == GL_NO_ERROR);
}

void TE::Render::APITexture2D::Disable()
{
    glBindTexture(GL_TEXTURE_2D, 0);
    assert(glGetError() == GL_NO_ERROR);
}



#2 Nanook   Members   -  Reputation: 494


Posted 11 September 2012 - 04:42 PM

Never mind, I found it. With cgGLSetManageTextureParameters enabled, Cg binds whatever texture the sampler parameter currently references when the pass state is set, so the plain glBindTexture in Enable() had no effect; Enable() has to re-point the shared parameter at this instance's texture instead. Here's the correct code for anyone interested :)

TE::Render::APITexture2D::APITexture2D(Texture& texture, CGparameter cgParameter)
{
    texture.Prepare();

    glGenTextures(1, &m_glTexture);
    assert(glGetError() == GL_NO_ERROR);

    glBindTexture(GL_TEXTURE_2D, m_glTexture);
    assert(glGetError() == GL_NO_ERROR);

    // Bind the texture to the Cg texture parameter.
    cgGLSetTextureParameter(cgParameter, m_glTexture);
    assert(glGetError() == GL_NO_ERROR);

    // Initialize the state specified for the sampler parameter.
    cgSetSamplerState(cgParameter);
    assert(glGetError() == GL_NO_ERROR);

    glTexImage2D(GL_TEXTURE_2D,
                 0,
                 GL_RGBA,  // legacy "4" means four components, i.e. GL_RGBA
                 texture.GetImage().GetWidth(),
                 texture.GetImage().GetHeight(),
                 0,
                 APIMapping::s_colorType[texture.GetImage().GetColorType()],
                 GL_UNSIGNED_BYTE,
                 texture.GetImage().GetDataPtr());
    assert(glGetError() == GL_NO_ERROR);
}

TE::Render::APITexture2D::~APITexture2D()
{
}

void TE::Render::APITexture2D::Enable(CGparameter cgParameter)
{
    // Re-point the shared Cg parameter at this instance's texture before
    // each draw; the managed pass state then binds the right texture.
    cgGLSetTextureParameter(cgParameter, m_glTexture);
    assert(glGetError() == GL_NO_ERROR);
}

void TE::Render::APITexture2D::Disable()
{
    glBindTexture(GL_TEXTURE_2D, 0);
    assert(glGetError() == GL_NO_ERROR);
}




