
Sampling only PART of an image in shaders in DirectX9



#1 Muzzy A   Members   -  Reputation: 621

Posted 14 October 2012 - 08:03 PM

Hey, I'm working on a 2D project, but I'm rendering the textures onto geometry instead of using the Sprite Manager. The problem is I don't know how I would render only part of the image. With the sprite manager, the draw function had a 'SrcRect' parameter. Does anyone know how to do this? I can't find anything about it online. =\


#2 Bacterius   Crossbones+   -  Reputation: 7963

Posted 14 October 2012 - 08:26 PM

Inside your shader, you have a texture sampling function (tex2D, iirc). For a 2D texture it takes the texture sampler and a pair of "uv coordinates", which range from 0 to 1 and span the whole texture. So (0, 0) is the top-left corner of the texture, and (1, 1) is the bottom-right corner. These uv coordinates are defined for each of your geometry's vertices and interpolated across the pixels in between. (You can also define what happens when you sample the texture outside the [0..1] range; for instance, it can tile, clamp, or mirror.)

If you only want to sample, say, the top-left quarter of the texture, then you need your uv's to be between 0 and 0.5.

If your geometry were a simple rectangle, you'd have these vertices with the indicated uvs:

The whole texture is displayed:
[source]
(0, 0) ---------------- (1, 0)
  |                        |
  |                        |
(0, 1) ---------------- (1, 1)
[/source]

The top half of the texture is displayed:
[source]
(0, 0) ---------------- (1, 0)
  |                        |
  |                        |
(0, 0.5) -------------- (1, 0.5)
[/source]

The top left quarter of the texture is displayed:
[source]
(0, 0) ---------------- (0.5, 0)
  |                        |
  |                        |
(0, 0.5) -------------- (0.5, 0.5)
[/source]

These coordinates are often set inside your 3D modelling program (you might have heard the term "uv-unwrapping" before; it's a method to "wrap" a model with correct uv coordinates so the texture doesn't get distorted), but you can also set them yourself when you build your vertices in code.

There is no easy way to just say "render the texture between pixels (20, 13) and (100, 49)" in a shader, because the shader doesn't care about what size your texture is.
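If you do end up wanting to select the sub-rectangle inside the shader, one option is to pass the region in as shader constants (in uv space, not pixels) and remap the incoming uvs. Here's a rough, untested ps_2_0-style sketch of that idea; subRectOffset and subRectScale are made-up constant names you would set from your application:

[source lang="hlsl"]
// Sample only a sub-rectangle of the texture.
// subRectOffset / subRectScale are in uv space (0..1), not pixels;
// divide your pixel rectangle by the texture size on the CPU side.
sampler2D diffuseMap : register(s0);

float2 subRectOffset; // top-left corner of the region
float2 subRectScale;  // width and height of the region

float4 PS(float2 uv : TEXCOORD0) : COLOR0
{
    // The quad's uvs still run 0..1; squeeze them into the region.
    float2 remapped = subRectOffset + uv * subRectScale;
    return tex2D(diffuseMap, remapped);
}
[/source]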



#3 Muzzy A   Members   -  Reputation: 621

Posted 14 October 2012 - 08:47 PM

OK, so... I would have to create separate geometry for each part of the texture I want to use?

#4 Bacterius   Crossbones+   -  Reputation: 7963

Posted 14 October 2012 - 09:14 PM

OK, so... I would have to create separate geometry for each part of the texture I want to use?

Yes. In general, a 3D model is made of lots of little triangles, and each triangle has its very own "texture chunk" delimited by its vertices' uv coordinates. But this is usually automated through 3D modelling programs like 3ds Max, Maya, etc., so you don't have to worry about it; you just read whatever uv coordinates were provided with your model and use them in your shader.

What are you trying to do exactly?



#5 Muzzy A   Members   -  Reputation: 621

Posted 14 October 2012 - 09:36 PM

ANIMATIONS!

Well, I'm experimenting with a side scroller, and I wanted to do lighting with it. But I don't know how, or whether you even can do lighting with just the Sprite Manager that DirectX 9 gives you. So I started rendering everything to geometry in order to do the lighting. I was just having a hell of a time trying to get only part of the texture so I could draw my animations, lol.

But you have helped a lot, thanks!



#6 phil_t   Crossbones+   -  Reputation: 3107

Posted 14 October 2012 - 10:43 PM

If you're doing animations, such as from a sprite sheet, you don't necessarily need to use a different piece of geometry for each sprite. You can modify the texture coordinates that are output from the vertex shader according to some shader constant (which would define the cel of animation you want).
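For example, here's a rough, untested vs_2_0-style sketch of that; celOffset and celScale are made-up constant names that your application would update when the animation advances (e.g. with SetVertexShaderConstantF or an effect parameter):

[source lang="hlsl"]
// Pick one cel out of a sprite sheet using shader constants.
float4x4 worldViewProj; // the usual combined transform
float2   celOffset;     // top-left uv of the current cel
float2   celScale;      // uv size of one cel, e.g. (1/columns, 1/rows)

struct VS_IN
{
    float3 pos : POSITION;
    float2 uv  : TEXCOORD0; // plain 0..1 uvs on the quad
};

struct VS_OUT
{
    float4 pos : POSITION;
    float2 uv  : TEXCOORD0;
};

VS_OUT VS(VS_IN input)
{
    VS_OUT output;
    output.pos = mul(float4(input.pos, 1.0f), worldViewProj);
    // Same quad and same uvs every frame; only the constants change.
    output.uv  = celOffset + input.uv * celScale;
    return output;
}
[/source]

The geometry never changes; when the animation moves to the next cel you just update the two constants.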

#7 Muzzy A   Members   -  Reputation: 621

Posted 14 October 2012 - 11:11 PM

If you're doing animations, such as from a sprite sheet, you don't necessarily need to use a different piece of geometry for each sprite. You can modify the texture coordinates that are output from the vertex shader according to some shader constant (which would define the cel of animation you want).


I can see how you could offset the very first texCoord, but what about the others after it?

#8 phil_t   Crossbones+   -  Reputation: 3107

Posted 14 October 2012 - 11:17 PM

Just add a constant value to each one.



