
MePHyst0

Member
  • Content count

    223
  • Joined

  • Last visited

Community Reputation

232 Neutral

About MePHyst0

  • Rank
    Member
  1. Hi all, could anybody help me out with this one? I basically want to replace the SetTexture call made by ID3DXEffect with my own call made directly on the device. I have the shaders that will be used for drawing and the texture handles into the effect; however, I don't know which stages these textures are bound to. The following code does not work (it is called after effect->Begin and BeginPass, so the pixel shader is set correctly):

        device->GetPixelShader(&shader);
        shader->GetFunction(&buffer, &a);
        D3DXGetShaderConstantTable(buffer, &table);
        device->SetTexture(table->GetSamplerIndex(m_DiffuseMap), diffusetex);

     The D3DX return codes are OK. A rough sketch of what I am after follows below. Thanks
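     For reference, here is an untested sketch of the direct approach I am after. It assumes the sampler has to be looked up by name in the shader's constant table (rather than reusing the effect's parameter handle); BindTextureDirectly and samplerName are names I made up for illustration:

        #include <d3dx9.h>
        #include <vector>

        // Bind 'texture' directly on the device, at the stage the currently
        // set pixel shader uses for the sampler called 'samplerName'.
        void BindTextureDirectly(IDirect3DDevice9* device,
                                 IDirect3DBaseTexture9* texture,
                                 const char* samplerName)
        {
            IDirect3DPixelShader9* shader = NULL;
            device->GetPixelShader(&shader);

            UINT size = 0;
            shader->GetFunction(NULL, &size);            // query byte-code size first
            std::vector<BYTE> bytecode(size);
            shader->GetFunction(&bytecode[0], &size);    // then fetch the byte code

            ID3DXConstantTable* table = NULL;
            D3DXGetShaderConstantTable(reinterpret_cast<const DWORD*>(&bytecode[0]), &table);

            // Look the sampler up by name and set the texture on the stage
            // that the constant table reports for it.
            D3DXHANDLE hSampler = table->GetConstantByName(NULL, samplerName);
            UINT stage = table->GetSamplerIndex(hSampler);
            device->SetTexture(stage, texture);

            table->Release();
            shader->Release();
        }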
  2. Approximately 208 AddRef and Release calls.
  3. Actually it takes quite a lot of time: summed up, AddRef and Release take up to 0.317 ms, whereas all the other API calls take up to 3.3 ms, so it's roughly 10%, and I really don't want to sacrifice 10% of the frame time just to do the right thing COM-wise :)
  4. Hi everyone, recently I have been profiling our engine and noticed that the effect section takes a lot of time to complete, so I ran it through PIX and ATI's PerfStudio and found out that the biggest time eaters are AddRef and Release called from ID3DXEffect::SetTexture. Does anyone know why these two are called from SetTexture? (Actually, I can't seem to find a reason to call AddRef and Release right after it.) Thanks, Szabo Tibor
  5. Optimise for SLI

    I would recommend looking here; there are at least 2 papers dealing with SLI.
  6. Reflection issue

    Are your pBuffers (I really advise you to move on to Frame Buffer Objects) and the back buffer the same dimensions? If not, then you have to take this into consideration. Basically, what you are getting is an attempt to read the reflection texture with coordinates outside [0,1], and thanks to CLAMP_TO_EDGE you get a constant color on the borders.
  7. I'm just wondering how you can get it working without a 3.0 vertex shader, because the specification says that you can't :) Are you getting a performance penalty with a simple 3.0 VS? How big?
  8. Please Help ! (CgFx)

    I definitely agree with godmodder: implement at least the vertex program and check whether it works; if not, implement the fragment program as well, then it should definitely work.
  9. Ragged edges to water reflection

    you are welcome :)
  10. Ragged edges to water reflection

    The error comes up because you are trying to fetch the texture outside its bounds, and since the default addressing mode is repeat/wrap, you fetch texels from the other side of the texture (see, you have the hill there in the error areas). There isn't really any way to completely remove this side effect, but you can partly compensate by setting the addressing mode to mirror; in most cases the error will then be invisible. A small sketch of the state change follows below.
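     Something like this, assuming a Direct3D 9 device with the reflection texture on sampler stage 0 (the stage number is just a guess; an OpenGL renderer would use GL_MIRRORED_REPEAT on the texture instead):

        // Switch the reflection map's addressing mode to mirror so
        // out-of-range lookups reflect back instead of wrapping around.
        device->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_MIRROR);
        device->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_MIRROR);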
  11. Better Ocean Rendering

    Quote: Original post by Gagyi
        Quote: Original post by MePHyst0
            I bet that the particles in Gagyi's demo are connected via springs and integrated one after another.
        Nope, I am too lazy to make a physically correct representation, it's a crude approximation.

     If you were able to extend your theory to be physically correct and applicable to 3D as well, maybe we could make a little 3D demo: I would write the 3D back end and you would develop the theory. What do you say?
  12. Weird shader behavior

    Can you tell us what exactly stops working if you use IN.Normal.xyz? I mean, if IN.Normal.xyz really is a normalized vector, then it should definitely work. Have you tried checking the assembly of the shader? Maybe visualising the normal vector would help...
  13. Better Ocean Rendering

    I think that the problem lies in the height map and its representation of the data: you can move only on the Y axis. I think if we considered the tessellated ocean plane as a net of points joined together via springs, it would be a lot easier. I bet that the particles in Gagyi's demo are connected via springs and integrated one after another. See it? Exactly the way a vertex shader would work. There was a paper from nVidia about integrating such spring nets in the vertex shader; I think it would be a good starting point. A rough sketch of the idea follows below.
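     To illustrate what I mean by a spring net, here is a minimal CPU sketch under my own assumptions: a grid of particles, each pulled towards the height of its four neighbours, integrated with a simple explicit Euler step (the stiffness/damping values are made up). The vertex-shader version would do the same update per vertex, reading the neighbour heights from a texture:

        #include <vector>

        // One particle of the ocean net: only the height (Y) is simulated,
        // the X/Z grid position stays fixed, just like a displaced height map.
        struct Particle { float y; float vy; };

        // One explicit Euler step over a w x h grid of particles.
        void StepSpringNet(std::vector<Particle>& p, int w, int h, float dt)
        {
            const float stiffness = 40.0f;   // spring constant towards neighbours
            const float damping   = 0.98f;   // crude velocity damping

            std::vector<float> force(p.size(), 0.0f);

            for (int z = 0; z < h; ++z)
            {
                for (int x = 0; x < w; ++x)
                {
                    int i = z * w + x;
                    // Pull each particle towards the height of its 4 neighbours.
                    if (x > 0)     force[i] += stiffness * (p[i - 1].y - p[i].y);
                    if (x < w - 1) force[i] += stiffness * (p[i + 1].y - p[i].y);
                    if (z > 0)     force[i] += stiffness * (p[i - w].y - p[i].y);
                    if (z < h - 1) force[i] += stiffness * (p[i + w].y - p[i].y);
                }
            }

            for (size_t i = 0; i < p.size(); ++i)
            {
                p[i].vy = (p[i].vy + force[i] * dt) * damping;  // integrate velocity
                p[i].y += p[i].vy * dt;                         // integrate height
            }
        }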
  14. I guess that the technique described in the link that you posted is the best around.
  15. Maybe you could use the standard projection matrix, but after transforming the vertex to clip space, just discard its X and Y coordinates and replace them with the original ones (the coordinates before the perspective projection). It's just a quick thought, so I'm not sure whether it will work; a rough sketch of what I mean follows below.
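     A rough, untested sketch of that thought in plain C++ (Vec4/Mat4 are stand-ins for whatever math types the engine already uses; a row-vector convention is assumed):

        struct Vec4 { float x, y, z, w; };
        struct Mat4 { float m[4][4]; };

        // v * M with v treated as a row vector.
        Vec4 Transform(const Vec4& v, const Mat4& M)
        {
            Vec4 r;
            r.x = v.x * M.m[0][0] + v.y * M.m[1][0] + v.z * M.m[2][0] + v.w * M.m[3][0];
            r.y = v.x * M.m[0][1] + v.y * M.m[1][1] + v.z * M.m[2][1] + v.w * M.m[3][1];
            r.z = v.x * M.m[0][2] + v.y * M.m[1][2] + v.z * M.m[2][2] + v.w * M.m[3][2];
            r.w = v.x * M.m[0][3] + v.y * M.m[1][3] + v.z * M.m[2][3] + v.w * M.m[3][3];
            return r;
        }

        // Project with the standard projection matrix, then throw the
        // projected X/Y away and keep the pre-projection coordinates.
        Vec4 ProjectKeepingOriginalXY(const Vec4& viewPos, const Mat4& proj)
        {
            Vec4 clip = Transform(viewPos, proj);  // normal perspective transform
            clip.x = viewPos.x;                    // keep the original X
            clip.y = viewPos.y;                    // keep the original Y
            return clip;                           // Z and W stay projected
        }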