Render part of texture with shader


Hello, how do I render just part of a texture using shaders? For example, I have this texture:

 

[spritesheet image: 28s6n43.png]

And I want to render just one sprite from the spritesheet.

 

Here are my shaders:

Vertex:
 

std::string vertSource =
"#version 330 core\n"
"in vec2 position;"
"uniform vec2 trans = vec2(1.0,1.0);"
"uniform vec2 trans2 = vec2(0.0,0.0);"
"uniform vec2 objsize;"
"uniform vec2 winsize;"
"uniform float timer = 0;"
"in vec2 texCoord;"
"out vec2 texCoordV;"
"void main()"
"{"
"   texCoordV = texCoord;"
"   vec2 trans3 = ( ( trans2 * vec2( 2.0, 2.0 ) ) / winsize ) + vec2( -1.0, -1.0 );"
"   trans3 += ( objsize / winsize );"
"   vec4 test = vec4( ( position * trans ) + trans3, 0.0, 1.0 );"
"   test.xy *= "
"   mat2"
"   ("
"       vec2( cos( timer ),  -sin( timer )),"
"       vec2( sin( timer ),  cos( timer ))"
"   );"
"   gl_Position = test;"
"}";

Fragment:
 

std::string fragSource =
"#version 330 core\n"
"out vec4 out_color;"
"uniform sampler2D textureMap;"
"in vec2 texCoordV;"
"uniform float text = 0;"
"void main()"
"{"
"   if( text == 1 )"
"   {"
"       vec2 a = texCoordV;"
"       a.x = 0.335;"
//"       float time = mod(0.5, 9.0f);"
//"       a.x += (1.0f*floor(time)/9);"
"       out_color = texture2D(textureMap, texCoordV);"
"   }"
"   else"
"       out_color = vec4(0.0,1.0,0.0,1.0);"
"}";


A texture, whatever its dimensions (whether 256*1024 or anything else), always spans 1.0*1.0. Strange yet absolute. You should always have your textures be square, with sides a power of two (256/512/1024/2048); but whatever the actual size, width and height are always mapped against the 1.0/1.0 range, so express your coordinates in those terms when you map the texture onto something. And if you are remapping an arbitrary picture, compute the sample coordinate toward the 1.0/1.0 space, derived from the arbitrary texel space of the texture.


A texture, whatever its dimensions (whether 256*1024 or anything else), always spans 1.0*1.0. Strange yet absolute.

That is not strange, it's simply using the general mathematical concept of normalized coordinates.

You should always have your textures be square,

There is absolutely no requirement to have square textures. It might make your math easier but otherwise has no effect.

with sides a power of two (256/512/1024/2048); but whatever the actual size, width and height are always mapped against the 1.0/1.0 range, so express your coordinates in those terms when you map the texture onto something. And if you are remapping an arbitrary picture, compute the sample coordinate toward the 1.0/1.0 space, derived from the arbitrary texel space of the texture.

OpenGL explicitly allows you to create rectangular textures (with non-power-of-two side lengths). Some extremely old hardware had problems with that, but on anything remotely modern it just works. Rectangular textures may consume more memory (often the same as a texture with each side rounded up to the next power of two, but the details depend on the driver). Nowadays you can just use NPOT sizes for normal textures, but there is still GL_TEXTURE_RECTANGLE. It does not allow mipmapping, but it lets you address the texture using texel coordinates instead of normalized coordinates.


For example, I have this fragment shader:

std::string fragSource =
"#version 330 core\n"
"out vec4 out_color;"
"uniform sampler2D textureMap;"
"in vec2 texCoordV;"
"uniform float text = 0;"
"void main()"
"{"
" if( text == 1 )"
" {"
//" vec2 a = texCoordV;"
//" a.x = 0.335;"
//" float time = mod(0.5, 9.0f);"
//" a.x += (1.0f*floor(time)/9);"
" out_color = texture2D(textureMap, texCoordV);"
" }"
" else"
" out_color = vec4(0.0,1.0,0.0,1.0);"
"}";

and I don't know what coordinates are stored in the texCoordV variable. Can someone give me an example? Because I try to do things with [1.0,1.0] but nothing works.

Edited by povilaslt2


 

There is absolutely no requirement to have square textures. It might make your math easier but otherwise has no effect.

Now pardon me for recommending that! It is not as if I said it is impossible to sample a raster of other dimensions... though one sacrifices mipmapping and sampler sanity.

 

 

OpenGL explicitly allows you to create rectangular textures (with non-power-of-two side lengths). Some extremely old hardware had problems with that, but on anything remotely modern it just works. Rectangular textures may consume more memory (often the same as a texture with each side rounded up to the next power of two

I think that I have stressed clearly enough that the texture's absolute space of 1.0-1.0 is relative to a raster of any ratio and any dimension. Yet that does not take away my right to encourage one to use a square raster of pow2 dimensions, does it?


and I don't know what coordinates are stored in the texCoordV variable. Can someone give me an example? Because I try to do things with [1.0,1.0] but nothing works.


You will sample the same texel every time if you pass constants to the sampling function in the fragment or vertex shader.

 

You need to set the texCoordV values in the bound vertex buffer accordingly, or inspect what values they actually contain.

JohnnyCode, this is not the first time you have later claimed to have said something different from what you actually did. I stand by my corrections to your post. If that is not just a modus operandi to escape criticism, you really need to take significantly more care to express your intended meaning clearly.


Based on the image, the time-based calls in the fragment shader, and the wish to show a different part of the texture, it seems like you want to get an animation going.

 

I'm not sure what your setup is like, but if it were me I would do this all on the CPU side of things.

I'd first calculate what "frame" (set of texture coords) to use based on my animation timer. Then the quad I'm using to render would get these texture coords assigned to it as part of its vertex data, which would then eventually get used by my shaders, where the only thing the texture coords have to do is get used by my fragment shader's sampler.

 

As for what texture coords to actually use, the guys above are right: texture coordinates are normalized. Assuming the usual image convention where the first row of pixels is the top of the image (note that OpenGL's native texture origin is the bottom-left, so this depends on how you upload your image data), that basically means

0, 0 is the top left corner

1, 0 is the top right corner

1, 1 is the bottom right corner

0, 1 is the bottom left corner

 

So to only show a specific part of a texture, you just calculate which UV coordinates, values between 0 and 1, to use.

 

Now let's say we wanted to know the texture coords for your third frame (the girl looking like she is doing a flamingo stance).

Assuming that each frame is 100x125 pixels, you can get the correct texture coords by doing:

 

Vertex UV coords 1:

startPixelX/textureWidth = 300/660 ≈ 0.45

startPixelY/textureHeight = 0/755 = 0

 

Vertex UV coords 2:

(startPixelX + frameWidth)/textureWidth = 400/660 ≈ 0.61

startPixelY/textureHeight = 0/755 = 0

 

Vertex UV coords 3:

(startPixelX + frameWidth)/textureWidth = 400/660 ≈ 0.61

(startPixelY + frameHeight)/textureHeight = 125/755 = .165

 

Vertex UV coords 4:

startPixelX/textureWidth = 300/660 ≈ 0.45

(startPixelY + frameHeight)/textureHeight = 125/755 = .165

