# Daniel Wilson

Member · 23 · 159 Neutral
1. ## Tool to mathematically create a mesh

That's what I needed, thanks!
2. ## What to do with an attenuated light vector in my shader

So I've been working on this a while longer, and the only real conclusion I've come up with is that the colour values must be wrong. For a while I thought maybe they were right and I just wasn't doing the correct thing with the result. I was given this good advice: "The scattered radiance should be added to the Phong radiance component, because the Phong model computes the radiance at the surface only; the scattered radiance is just more radiance coming from under the surface to the eye." So hopefully that might help some future Googlers. Now if I could just work out the correct values to begin with! I knew the colours were too trippy.
3. ## How to change colour based on viewing angle

Very nice, thanks for that cool answer! The effect is simple and works well for my purposes.
4. ## How to change colour based on viewing angle

I would like to fake some caustics on a glassy surface very quickly, and the idea I have is to overlay a static caustic texture that appears based on the viewing angle, so that when the angle is more grazing, more of the texture appears. How might I do this? A small piece of code would be seriously amazing. Thank you!
5. ## What to do with an attenuated light vector in my shader

Yeah, I should've mentioned that too, sorry; I am certain that data is correct. This shader is actually being used in Ogre, and it's easy to generate the texture in Ogre; the output is virtually identical to FX Composer. Each RGBA component in the floating-point texture just holds a preset scattering coefficient value taken from this table:

7. ## What to do with an attenuated light vector in my shader

Okay, so the `p_omega_out` value looks like this as a colour, applied to a simple plane. I'm not sure whether or not that's the correct output for it, but that's what I have at the moment. If I slot it into my Phong normal-mapped plane, I get this, which is pretty but obviously isn't right! So is `p_omega_out` wrong, or am I just putting it in the wrong place!? Here is the final part of the code for reference; the only change is the inclusion of the new attenuated vector:

```cpp
float3 BumpNormal = tex2D(nm, IN.texcoord) * 2.0 - 1.0;

float4 amb  = AmbientIntensity * ambient;
float4 diff = DiffuseIntensity * float4(p_omega_out, 1)
            * saturate(dot(IN.tangentSpaceLightDir, BumpNormal));

float3 R = normalize(2.0 * dot(BumpNormal, IN.tangentSpaceLightDir) * BumpNormal
                     - IN.tangentSpaceLightDir);
float3 v = normalize(IN.tangentSpaceEye);
float spec = pow(saturate(dot(R, v)), specularPower) * SpecularIntensity;

// compute final color
float4 color = tex2D(color_map, IN.texcoord);
float4 finalcolor = (amb + diff + spec) * color;
return finalcolor;
```
8. ## Force field around a model and pushing vertices outwards in shader.

Okay, I found it; sorry it wasn't on the publications page, hope I didn't waste your whole night! This link contains the PPT presentation; it's a whopping 433 MB, so thank goodness for broadband. An overview of the talk is discussed here as well, with the shield stuff on page 3. Pretty nice ideas about colourizing it with tiny colour-palette ramps, and fading out the effect as a function of the depth buffer.
9. ## What to do with an attenuated light vector in my shader

Ah okay, this is interesting, thank you. You see, the paper assumes you just know what to do with the output vector that has been attenuated by K-M-P. So theoretically it should be okay to use it in place of the hard-coded "diffuse" value I did have, e.g.:

```cpp
float4 diff = DiffuseIntensity * p_omega_out * saturate(dot(IN.worldLightDir, BumpNormal));
```

At the moment, if I output `p_omega_out` as a colour, the result is some pretty blurry bands of colour, so hopefully it won't damage the rest of the colour too much. (I'll post a screen cap in a few mins.) Thanks
10. ## Force field around a model and pushing vertices outwards in shader.

I saw a nice talk on this on the Bungie website; Halo has a LOT of force fields. I can't remember the exact talk, but it's on that page somewhere, sorry!
11. ## What to do with an attenuated light vector in my shader

So I haven't quite figured this out yet because it's quite a complex shader. I would like to implement this shader with a basic Phong model. Say I have this:

```cpp
float3 BumpNormal = tex2D(nm, IN.texcoord) * 2.0 - 1.0;

float4 amb  = AmbientIntensity * ambient;
float4 diff = DiffuseIntensity * diffuse * saturate(dot(IN.worldLightDir, BumpNormal));

float3 R = normalize(2.0 * dot(BumpNormal, IN.worldLightDir) * BumpNormal
                     - IN.worldLightDir);
float3 v = normalize(IN.eye);
float spec = pow(saturate(dot(R, v)), specularPower) * SpecularIntensity;

// compute final color
float4 color = tex2D(color_map, IN.texcoord);
float4 finalcolor = (amb + diff + spec) * color;
```

R and v are purely directional vectors, right? So my attenuated light vector must represent a direction and the intensity of the light that is returned to the viewer, I think. In the above code, the only thing that strikes me as close to this is the calculation of the diffuse value. Ignoring the bump map (I don't need bumped normals), does anyone know if it would be wrong to have:

```cpp
float4 diff = DiffuseIntensity * diffuse * p_omega_out;
```

Where `p_omega_out` is the attenuated light intensity/direction? I think my vector needs to go from the pixel to the viewer; would this vector do that?!
12. ## When to normalize vectors in a shader?

Yes, I thought so; I just figured maybe there were some rules like "you shouldn't normalize in view space". Thanks for the tips. I shall try to keep any normalizing to a minimum for now and see how it goes.
13. ## When to normalize vectors in a shader?

Hi, I am currently working on a shader and the result is not quite right. One question that bugs me, and that I cannot seem to find the answer to, is why and when exactly we normalize vectors in a shader. I understand that it seems to only be important for direction vectors such as the light, but is it important to do so in certain spaces with other vectors as well? I took a naive approach at first and normalized all the time. I am not so much concerned with efficiency, just why and when it is or isn't okay to normalize a vector. For example, the shader I am writing at the moment requires the vector from the view position to the vertex position in world space. Some shaders I see normalize this in the fragment shader, but does that not destroy the value of the vector for further calculations?
14. ## Tool to mathematically create a mesh

Hmm, this is very close to what I need, but it uses parametric equations to define the formula; e.g. a sphere would be in the form x = r cos(u) sin(v), y = r sin(u) sin(v), z = r cos(v). I need something where I can type x^2 + y^2 + z^2 = r^2 and get a sphere!? I've been looking for a way to convert the equation of the surface I have to a parametric one, but I can't really figure it out!
15. ## Tool to mathematically create a mesh

Hi, I need a good tool to mathematically create a surface and then get it into 3ds Max; for example, a sphere is defined algebraically as x^2 + y^2 + z^2 = r^2. Any examples? The only one I can find is 3D-Math Xplor, but the full version only works on a Mac and I have Windows 7. The Java version that works on all platforms does not have the export functionality!