Textures


Hi,

I need to apply a specific texture to my triangle primitives.

 

So, I have a texture file: [attachment=13225:256colour3.png]

 

And I have a list of triangle primitives, from which I build a surface. Each triangle vertex contains:

1) position

2) texture coordinate 

 

Depending on the texture coordinate at each point, I want to produce a color transition across the surface, like this: [attachment=13226:Capture1234.PNG]

 

But, using this shader

Texture ColorTexture;

sampler ColorTextureSampler = sampler_state
{
    texture = <ColorTexture>;
    magfilter = NONE;
    minfilter = NONE;
    mipfilter = NONE;
    AddressU = wrap;
    AddressV = wrap;
};

struct VertexShaderInput
{
    float4 Position : POSITION;
    float2 texCoord : TEXCOORD0;
};

struct VertexShaderOutput 
{
    float4 Position : POSITION;
    float2 texCoord : TEXCOORD0;
};

float4x4 worldViewProj;

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
	VertexShaderOutput output = (VertexShaderOutput)0;
	
	output.Position = mul(input.Position, worldViewProj);
	output.texCoord = input.texCoord;
	
    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR
{	
	return tex1D( ColorTextureSampler, input.texCoord.x);
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

 

I receive this: [attachment=13227:Capture1236.PNG]

 

So, how should I modify the shader to get that kind of color transition? Or should I modify the texture file instead?


How finely is that plane subdivided? The problem is that you can't get the output you want if you're using 2 triangles per quad (and it looks like 9 quads), since the pixel shader linearly interpolates the UV values passed in from the vertices. This means you need at least as many vertices as data points, and they have to be arranged to match the quad layout, which could be very complex.

 

It might be better to actually take your palette and build a 2D texture surface from your data points (interpolating those data points yourself), then simply render that onto a quad.

 

Edit: this might be possible in a compute or geometry shader, but I don't believe you can achieve this easily with just a vertex/pixel shader (although I'd love to see it if someone does have a good solution).

Edited by slicer4ever


This is actually possible but you'd need a second texture, which would be a greyscale representation of the pattern you want.  You do a standard tex2D lookup on that texture, then feed the result of that into your tex1D lookup to get the final colour.

 

Of course, by then you already have the pattern in a texture anyway so you may as well go all the way and put in the full RGB version, unless you have any specific reason not to (e.g. you want to swap out the RGB palette at runtime).

I'd also say that the 1D texture and the .x in the pixel shader are 'strange' (though probably intentional). Did you try it as tex2D? You probably also need to look up the information for a specific UV depending on the position on your plane (in the vertex shader, passing the corresponding UV texcoord through to the PS).


It seems like one vertex shared by two faces has different uv coordinates, so you cannot get a smooth face.

How finely is that plane subdivided? The problem is that you can't get the output you want if you're using 2 triangles per quad (and it looks like 9 quads), since the pixel shader linearly interpolates the UV values passed in from the vertices. This means you need at least as many vertices as data points, and they have to be arranged to match the quad layout, which could be very complex.

It might be better to actually take your palette and build a 2D texture surface from your data points (interpolating those data points yourself), then simply render that onto a quad.

Edit: this might be possible in a compute or geometry shader, but I don't believe you can achieve this easily with just a vertex/pixel shader (although I'd love to see it if someone does have a good solution).

 

You are right: here there are 9 quads = 18 triangles = 16 points (data points). Each point has a texture coordinate. The number of vertices equals the number of data points, since a vertex IS a data point. Sorry if I confused you. Each vertex contains a position and a texture coordinate.

About the 2D texture: I'm not sure how it would help, since my palette consists of vertical lines and the V (y) coordinate is not needed (at the moment).

To my mind, the problem is that the texture is applied to each separate element, not to the whole surface. But I may be wrong.

 

 

 

This is actually possible but you'd need a second texture, which would be a greyscale representation of the pattern you want.  You do a standard tex2D lookup on that texture, then feed the result of that into your tex1D lookup to get the final colour.

 

Of course, by then you already have the pattern in a texture anyway so you may as well go all the way and put in the full RGB version, unless you have any specific reason not to (e.g. you want to swap out the RGB palette at runtime).

 

This is not the final surface; it can be changed. I need a universal solution which can be applied to any surface. As I understand it, the pattern would be suitable only for this surface. Changing the surface would require changing the pattern, or am I wrong?


I don't think you quite understand (or perhaps we don't quite understand what you want to accomplish?)

 

I don't think you quite understand how the vertex → pixel shader stage works. Yes, you have 16 data/vertex points, and you could even pass in the data you're trying to interpolate across those vertices. The problem, however, is how the vertex shader output gets translated to the pixel shader: it is always done in a perfectly linear way, and your sample image does not vary across each triangle perfectly linearly. If you could control how the interpolation moves across the surface you could apply whatever formula you use, but at the moment, with strictly vertex/pixel shaders, and as far as I know (and I may be wrong, which would be pretty cool to learn), you can't control how the pixel shader receives that interpolated data.

 

This is why we suggest you construct a 2D palette of those data points translated across your 2D surface, and then render that. It can be done at initialization, or whenever there's a change in the data points. With modern computers you probably won't even notice a slowdown doing this in real time (but not every frame, just on every change; you might still get smooth results regenerating it every frame, but I would not recommend that at all).

 

 

For example, let's forget about it as a triangle and think of it as just a straight line. Looking at your image, let's take the bottom 2 vertices on the left side. For simplicity's sake, let's look at this along a single axis (say x):

The first vertex has a position of x = 0, and its uv is set to 0 as well.

The second vertex has a position of x = 1, and its uv is set to 1.

 

So the pixel shader sees there are 50 pixels between those 2 points, and pixel i gets this as its input:

position = v0.x + i*(v1.x - v0.x)/50

uv = v0.u + i*(v1.u - v0.u)/50

 

See how there's no room for creating a system which isn't perfectly linear (and, as you can see in your sample image, the colors along that bottom-left line are not equally spaced).

 

I hope this explanation helps you better understand your problem.

Edited by slicer4ever


I really don't completely understand what I should do, and it's my fault. I don't have enough knowledge and experience to even understand everything you're telling me. Sorry for that.

 

So, I modified my shader to use 2 textures for the lookup.

Texture ColorTexture;

sampler ColorTextureSampler = sampler_state
{
    texture = <ColorTexture>;
    magfilter = NONE;
    minfilter = NONE;
    mipfilter = NONE;
    AddressU = wrap;
    AddressV = wrap;
};

Texture AcrossTexture;

sampler AcrossTextureSampler = sampler_state
{
    texture = <AcrossTexture>;
    magfilter = NONE;
    minfilter = NONE;
    mipfilter = NONE;
    AddressU = wrap;
    AddressV = wrap;
};


struct VertexShaderInput
{
    float4 Position : POSITION;
    float2 texCoord : TEXCOORD0;
};

struct VertexShaderOutput 
{
    float4 Position : POSITION;
    float2 texCoord : TEXCOORD0;
};

float4x4 worldViewProj;

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
	VertexShaderOutput output = (VertexShaderOutput)0;
	
	output.Position = mul(input.Position, worldViewProj);
	output.texCoord = input.texCoord;
	
    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR
{	
	float4 TextureColor = tex2D(AcrossTextureSampler, input.texCoord);	
	float f = TextureColor.r;
	return tex1D(ColorTextureSampler, f );
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}
Texture tex1 = Texture.FromFile(d3dDevice, "vertical.png");
Texture tex2 = Texture.FromFile(d3dDevice, "across.png"); //, 100, 132, 1, Usage.None, Format.L8, Pool.Default, Filter.None, Filter.None, 1);  // L8 - grayscale format, but using these advanced options gives strange results
effect.SetTexture("ColorTexture", tex1);
effect.SetTexture("AcrossTexture", tex2);

 

//vertices
for (int j = 0; j < list.Count; j++)
{
    points.Add(new Vertex()
    {
        Coord = new Vector2(0, SomeTextureCoordinate),
        Position = new Vector3(list[j].X, list[j].Y, list[j].Z),
    });
}

 

 

AcrossTexture: [attachment=13240:across.png]

ColorTexture: [attachment=13241:256colour2.png]

Result: [attachment=13239:C.PNG] 

 


That's not how we meant using a second texture. What you did was just rotate the texture.

 

no, what we are saying is this:

 

from my understanding, you want to create some unique result from a set of data inputs.

 

so, you have to have some function for figuring out what color should be at what pixel.

 

so, if you know the size of the texture when it's rendered, do something like this:

[code]
//pseudo C code:
char *Texels = new char[TexWidth*TexHeight];
for(int x=0;x<TexWidth;x++){
  for(int y=0;y<TexHeight;y++){
    int ColorLookup = YourFormulaForLookingUpTheColorAtASpecificPoint(x, y);
    Texels[x+y*TexWidth] = Pallete[ColorLookup]; //Pallete is assumed to be your 1D array texture
  }
}
//Upload this newly created texture, and draw a single quad with only that texture, using just a pass-through shader (essentially a shader that emulates fixed-function pipeline texturing)
[/code]

 

Essentially what you're doing is creating the texture of your sample image on the CPU side, then just drawing that texture.


More information about my task:

 

   X         Y          Z        value

  -0.095     0.095      0.695    864
  -0.095     0.0475     0.695    963
  -0.095     0          0.695    963
  -0.095    -0.0475     0.695    321
... and so on; there can be an unlimited number of such points.
 
I must draw a surface from these points and make a color transition according to the value at each point (as in the picture at the beginning of this topic).
The surface may change, so I'm searching for a universal method to solve this problem.
I wrote a small method which calculates texCoords according to the value at each point. It returns a value in [0,1] for each point, which I use as the texCoord.
 
Now I'm trying to implement your suggestion... Hope it will do the trick, but I need some time to do it...


I think I understand what you're doing. The key point here is: what is the function you use to map your data points to your palette?

 

For example, if I said "give me the palette entry to use at x = 0, y = 0" (using the data set from your sample; in your sample it's red), then what is your code for figuring out to choose red?

I'm just finding the min and max value (over the points) and splitting the range into 12 levels (12 colors in the palette):
float step = (Math.Abs(min) + Math.Abs(max)) / 12;
and then I just calculate which interval the input value belongs to:
if ((val >= levels[0]) && (val < levels[1]))
{
    return 0.082f;
}
if ((val >= levels[1]) && (val < levels[2]))
{
    return 0.164f;
}
And I return a float value, which I apply to the texture above. For example, 0-0.083 in my texture is blue and 9.8-9.999 is red. I use these returned values as coordinates into my texture.

EDIT: I tried to implement your function. I created a new texture, BUT I have no idea how to calculate "YourFormulaForLookingUpTheColorAtASpecificPoint(x, y)". I have 4 points per element; for example, I'm creating a 24x24 texture. How can I choose a color for the 576 inner texels when I know only 4 of them? Edited by Yura



You interpolate between them, using some function you define. There is no "right answer". You need to figure out how you want to interpolate between them. It could be a weighted average of all the points, with a weight inversely proportional to the distance to the current point, for instance.


Sorry for the late reply, but I have to ask: how did you create that sample image? How did you choose which pixel was which color? When you can answer that question, you can do this. Otherwise I'm not really certain what you expected to be able to do from the beginning of this thread.

 

Obviously you have some points of data that mean something to you, but you seem to be avoiding how those points are to be used to actually create the representation you want.


The sample image wasn't created by me; it is sample output from a ready-made program. I don't know how that program works; I just made this sample to explain to you what I want to achieve. Those data points are pressure: X, Y, Z coordinates and PRESSURE. The surface is subjected to pressure. Where the pressure is greater the surface tends to red; where lower, to blue.

As a couple of people have said already, it looks like the problem is with the vertices.

If you were sharing vertices between quads, the colours would match across the boundary.

Doesn't matter what shader or textures you use if the vertex data is wrong, so try to fix that first.


 

Yes, I'm sharing vertices between quads, but what's the problem? For example:

 

0.9 ______ 0.5    -- texCoords on each point

      |          |

      |          |

0.2 |_____| 0.3____ 0.1

      |          |              |

      |          |              |

0.4 |_____| 0.9____| 0.3        

How about you recreate the entire texture as a lookup table in the pixel shader and then sample it like this: out.color = colors[round(texCoords.x * 11)]?

const float3 colors[] = {

   float3(0.f, 0.f, .5f),
   float3(0.f, .23f, .73f),
   float3(0.f, .45f, .95f),
   float3(0.f, .68f, 1.f),

   float3(0.f, .9f, 1.f),
   float3(0.f, 1.f, .73f),
   float3(0.f, 1.f, .27f),
   float3(.23f, 1.f, 0.f),

   float3(.63f, 1.f, 0.f),
   float3(1.f, .91f, 0.f),
   float3(1.f, .45f, 0.f),
   float3(.93f, .11f, .14f),
};
Edited by eppo

Look at the bottom right of the first quad. There are four different colours around the vertex. If those four quads are sharing a vertex, how can they get different colours at the same point?

Also, the left and right edges of each quad are the same colour, when they ought to be different.

There mustn't be 4 different colors around 1 vertex. On the contrary, there must be one color across the 4 quads around this vertex.

EDIT: Maybe there is another way to get the needed result? Maybe textures are a bad idea? Edited by Yura


How about you recreate the entire texture as a lookup table in the pixel shader and then sample it like this: out.color = colors[round(texCoords.x * 11)]?
 

const float3 colors[] = {

   float3(0.f, 0.f, .5f),
   float3(0.f, .23f, .73f),
   float3(0.f, .45f, .95f),
   float3(0.f, .68f, 1.f),

   float3(0.f, .9f, 1.f),
   float3(0.f, 1.f, .73f),
   float3(0.f, 1.f, .27f),
   float3(.23f, 1.f, 0.f),

   float3(.63f, 1.f, 0.f),
   float3(1.f, .91f, 0.f),
   float3(1.f, .45f, 0.f),
   float3(.93f, .11f, .14f),
};

 

Good idea; it works, but gives the same result as the texture (look at the picture "I receive this" at the beginning of the topic).

Is it possible to modify it to get the result I want?


[quote name='Yura' timestamp='1358855043' post='5024261']
There mustn't be 4 different colors around 1 vertex. On the contrary, there must be one color across the 4 quads around this vertex.
[/quote]

 

Perhaps it's best if you post the code that shows how you set up the quads' vertex positions and their accompanying temperature-texcoords. Like said before, it's probably something in there that's causing these unwanted discontinuities.



Of course, if it would help...

So, I initialize the VertexBuffer with the data in "points":

for (int j = 0; j < list.Count; j++)
{
    points.Add(new Vertex()
    {
        Coord = new Vector2(chooser.GetTexture1DCoordinate(elements[i].valueOnPoint[j]), 0),
        Position = new Vector3(list[j].X, list[j].Y, list[j].Z),
    });
}

 

And here is the ColorChooser.GetTexture1DCoordinate method, which takes the load on a vertex as input and returns the texture coordinate for each point:

 

public class ColorChooser
{
    // The fields below were not shown in the original post; they are inferred from usage
    // (GetTexture1DCoordinate reads levels[0]..levels[12], so 13 boundaries).
    private const int lvlCount = 13;
    private readonly float[] levels = new float[lvlCount];
    public float MinValue;
    public float MaxValue;

    public ColorChooser(float min, float max)
    {
        MaxValue = max;
        MinValue = min;

        float step = (Math.Abs(min) + Math.Abs(max)) / lvlCount;
        float nextStep = min;

        levels[0] = min;
        for (int i = 1; i < lvlCount - 1; i++)
        {
            nextStep += step;
            levels[i] = nextStep;
        }
        levels[lvlCount - 1] = max;
    }


    public float GetTexture1DCoordinate(float val)
    {
        if ((val >= levels[0]) && (val < levels[1]))   return 0.06f;
        if ((val >= levels[1]) && (val < levels[2]))   return 0.164f;
        if ((val >= levels[2]) && (val < levels[3]))   return 0.244f;
        if ((val >= levels[3]) && (val < levels[4]))   return 0.336f;
        if ((val >= levels[4]) && (val < levels[5]))   return 0.425f;
        if ((val >= levels[5]) && (val < levels[6]))   return 0.576f;
        if ((val >= levels[6]) && (val < levels[7]))   return 0.66f;
        if ((val >= levels[7]) && (val < levels[8]))   return 0.74f;
        if ((val >= levels[8]) && (val < levels[9]))   return 0.832f;
        if ((val >= levels[9]) && (val < levels[10]))  return 0.904f;
        if ((val >= levels[10]) && (val < levels[11])) return 0.986f;
        return 0.999f;  // levels[11]..levels[12] and anything out of range
    }
}
