
About enigmagame

  1. Linear gradient shader (Photoshop-like)

    With isFixed false I want the gradient to be influenced by the camera position. My shader is wrong: the gradient starts at the bottom of the window instead of the bottom of the sprite. The question is: how can I modify the shader so that the gradient starts from the bottom of the sprite? Do I need the size of the sprite in pixels, or is there a more convenient way? The other question concerns the "fixed" gradient: if I want the gradient not to be influenced by the camera position, what is the convenient way to do it? Is it possible to have both behaviors in the same shader? Thanks.
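One way to make the gradient start at the sprite's bottom edge is to evaluate it in sprite-local UV space and pass the sprite height in pixels as a shader constant. Here is a CPU sketch of that per-pixel math in Python; the function name, parameters, and the assumption that `uv_y == 1.0` at the sprite's bottom edge are all illustrative, not taken from the engine:

```python
def gradient_color(uv_y, sprite_height_px, offset_px, length_px,
                   start_color, end_color):
    """Evaluate a vertical Photoshop-style gradient in sprite-local space.

    Using the sprite's own UV (not the clip-space position) makes the
    gradient start at the bottom of the sprite, independent of the
    camera. Assumes uv_y == 1.0 at the bottom edge of the sprite (a
    guess about the engine's UV convention -- flip if needed).
    """
    # Distance in pixels from the sprite's bottom edge to this pixel.
    y_local = (1.0 - uv_y) * sprite_height_px
    t = (y_local - offset_px) / length_px
    t = max(0.0, min(1.0, t))          # clamp, like saturate() in HLSL
    return tuple(s + (e - s) * t for s, e in zip(start_color, end_color))

# Bottom edge of a 128 px tall sprite, offset 0: gradient at its start.
print(gradient_color(1.0, 128.0, 0.0, 64.0, (0, 0, 0), (255, 255, 255)))
```

Because this only uses the interpolated UV and a per-sprite constant, it gives the same result whether or not the camera moves, which may also answer the "fixed" case: the camera-independent behavior falls out for free.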
  2. I'm looking for a way to implement a linear gradient shader that behaves like the linear gradient in Photoshop (only the vertical case is needed). It will be applied to 2D sprites. Currently I'm passing these parameters to the pixel shader:

    - StartColor.
    - EndColor.
    - Offset: the gradient starting point.
    - Length: the gradient length (the range inside which the colors are interpolated).
    - isFixed: a boolean that indicates whether the gradient must be influenced by the camera position or not.

    Here is a first attempt at the vertex shader and the pixel shader that I've implemented:

```hlsl
struct VsInput
{
    float3 position : VES_POSITION;
    float2 uv       : VES_TEXCOORD0;
    float4 color    : VES_COLOR;
};

struct VsOutput
{
    float4 position : HPOS;
    float2 uv       : TEXCOORD0;
    float4 color    : COLOR0;
    float4 outPos   : TEXCOORD1;
};

VsOutput main(VsInput input)
{
    VsOutput vsOutput;
    vsOutput.position = MUL(float4(input.position, 1.0f), WorldViewProjection);
    vsOutput.uv = input.uv;
    vsOutput.color = input.color;
    vsOutput.outPos = vsOutput.position;
    return vsOutput;
}

struct PsInput
{
    float4 Position : HPOS;
    float2 UV       : TEXCOORD0;
    float4 Color    : COLOR0;
    float4 outPos   : TEXCOORD1;
};

float4 startColor;
float4 endColor;
float offset;
float len;
bool isFixed;

float4 main(PsInput psInput) : COLOR
{
    psInput.outPos = psInput.outPos / psInput.outPos.w;

    float yScreenSize = 900.0f;
    float yPixelCoordinate = 0.0f;

    if (isFixed)
    {
        yPixelCoordinate = 0.5f * (1.0f - psInput.UV.y) * yScreenSize;
    }
    else
    {
        yPixelCoordinate = 0.5f * (psInput.outPos.y + 1.0f) * yScreenSize;
    }

    float gradient = (yPixelCoordinate + offset) / len;
    gradient = clamp(gradient, 0.0f, 1.0f);
    return lerp(startColor, endColor, gradient);
}
```

    I think something's wrong: for example, with an offset of 0.0 the gradient starts from the bottom of the window and not from the bottom of the sprite. Another issue is that there is no correlation between the fixed and non-fixed modes.

    I'm pretty sure there's something correct and something wrong in my approach, so I'm here for any suggestions. Thanks.
  3. Given a light direction, how can I make it move according to the camera movement, in a shader? Suppose an artist has set up a scene (e.g., in 3ds Max) with a mesh at its center and a directional light with a position and a target. From this position and target I've calculated the light direction. Now I want to use the same direction in my lighting equation but, obviously, I want the light to move correctly with the camera. Thanks.
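The usual way to keep a directional light consistent as the camera moves is to rotate the world-space direction by the view matrix's rotation part only (i.e., treat it as a vector with w = 0, so the camera's translation is never applied). A small Python sketch of that transform, with an illustrative function name and a row-major 3x3 convention as assumptions:

```python
def transform_direction(view_rotation, d):
    """Rotate a world-space direction into view (camera) space.

    view_rotation is the upper-left 3x3 of the view matrix (row-major,
    rows dotted with the vector). Directions use w = 0, so the view
    matrix's translation must NOT be applied -- that is what keeps a
    directional light consistent while the camera moves.
    """
    return tuple(sum(view_rotation[r][c] * d[c] for c in range(3))
                 for r in range(3))

# Identity camera: the light direction is unchanged.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(transform_direction(identity, (0.0, -1.0, 0.0)))  # (0.0, -1.0, 0.0)
```

In a shader this is typically `mul((float3x3)View, lightDir)` done once per frame on the CPU or in the vertex shader, rather than per pixel.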
  4. Nik02, thanks for the hint, you're absolutely right: thinking about texture maps has completely refocused me on the problem. I've just tried with two control points and the result is correct. Just one more question: I have ramps with four, five, or six control points; what is the best way to interpolate these control points and achieve a correct result? A Bézier curve? Thanks.
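A Gradient Ramp is normally piecewise-linear between adjacent stops, so a per-segment lerp (rather than a Bézier curve) usually reproduces it; in practice the ramp is often just baked into a 1D texture and sampled with bilinear filtering. A sketch of the per-segment evaluation, with illustrative names:

```python
def sample_ramp(stops, t):
    """Sample a multi-stop gradient ramp at t.

    stops is a list of (position, color) pairs sorted by position.
    Plain linear interpolation inside each segment matches the usual
    Gradient Ramp behavior; no Bezier curve is needed.
    """
    t = max(stops[0][0], min(stops[-1][0], t))     # clamp to the ramp
    for (p0, c0), (p1, c1) in zip(stops, stops[1:]):
        if t <= p1:
            f = 0.0 if p1 == p0 else (t - p0) / (p1 - p0)
            return tuple(a + (b - a) * f for a, b in zip(c0, c1))
    return stops[-1][1]

ramp = [(0.0, (0, 0, 0)), (0.5, (255, 0, 0)), (1.0, (255, 255, 255))]
print(sample_ramp(ramp, 0.25))  # halfway into the first segment
```

Baking this into a small texture (say 256x1) and letting the GPU filter it is both cheaper and automatically handles any number of control points.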
  5. The material definition of a mesh is composed of three components: Self-Illumination, Reflection, and Refraction. Each of these components has a Gradient Ramp as a map, and the mapping mode is set to spherical environment. I'm looking for a way to reproduce these effects in a shader (the shader language doesn't matter). Is it possible? My first idea was to save the Gradient Ramp as a texture; you can see the result in the image below. It looks like a Blinn/Newell latitude map rather than a sphere map, but using the math behind the former, the result isn't correct. Thanks.
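For experimenting, here are the two lookup formulas in question sketched in Python: the classic sphere-map formula (the OpenGL GL_SPHERE_MAP one) and the Blinn/Newell latitude-longitude lookup. Whether either exactly matches 3ds Max's "spherical environment" mode is precisely what's in doubt, so treat both as candidates to compare, not as the answer:

```python
import math

def sphere_map_uv(r):
    """Classic sphere-map lookup for a view-space reflection vector r.

    This is the OpenGL GL_SPHERE_MAP formula; 3ds Max's spherical
    environment mode may instead behave like a lat-long map.
    """
    rx, ry, rz = r
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return (rx / m + 0.5, ry / m + 0.5)

def latlong_uv(d):
    """Blinn/Newell latitude-longitude lookup for a unit direction d."""
    dx, dy, dz = d
    u = math.atan2(dx, dz) / (2.0 * math.pi) + 0.5
    v = math.asin(dy) / math.pi + 0.5
    return (u, v)

# A reflection vector of (0, 0, 1) lands at the center of the sphere map.
print(sphere_map_uv((0.0, 0.0, 1.0)))  # (0.5, 0.5)
```

Plotting both lookups against the exported texture should make it obvious which parameterization 3ds Max actually used when the ramp was saved.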
  6. I'm looking for suggestions and resources on possible ways to design a character animation system. I mean a system built on top of the graphics engine (I use Ogre3D, which provides an animation layer) that must interface with the game logic. More specifically, I'm interested in action state machines (or animation state machines) built on top of the animation pipeline already provided by the graphics engine: a state-driven animation interface for use by virtually all higher-level game code. It's for a sports title, so the question may be non-trivial.
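The core of such a layer can be sketched very compactly: game code requests abstract actions, the machine validates transitions, and only legal transitions reach the engine's animation layer (in Ogre that would be `AnimationState` blending). All state and action names below are illustrative, not from any real title:

```python
class AnimationStateMachine:
    """Minimal action-state-machine sketch layered over an animation API.

    Game code asks for abstract actions ("run", "throw"); the machine
    validates transitions and, in a real system, would cross-fade the
    engine's animation clips underneath.
    """
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions    # {state: set of allowed next states}

    def request(self, next_state):
        if next_state in self.transitions.get(self.state, set()):
            # A real implementation would start blending clips here.
            self.state = next_state
            return True
        return False                      # illegal transition, ignored

fsm = AnimationStateMachine("idle", {
    "idle": {"run", "throw"},
    "run": {"idle"},
    "throw": {"idle"},
})
fsm.request("run")
print(fsm.state)  # run
```

The useful property for a sports title is that higher-level code never touches clip names or blend weights; it only issues actions, and the table of legal transitions becomes data that designers can edit.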
  7. Bowling physics parameters

    Hi guys, I'm working on a simple bowling game, using Ogre as the rendering engine and Bullet as the physics engine. I'm at the point of tuning all the physical parameters, but how? Where can I find them? For example:

    - Ball friction
    - Ball restitution
    - Ball angular damping
    - Ball linear damping
    - Any other ball parameters
    - Pin parameters
    - Lane friction
    - Lane restitution

    Thanks.
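There is no authoritative table for these; Bullet's friction and restitution are unitless coefficients, so the practical approach is to start from rough physical intuition (regulation ball and pin masses are published, lanes are oiled so lateral friction is low) and tune by eye. The values below are illustrative starting guesses only, not measured data:

```python
# Illustrative starting values for tuning -- NOT measured data.
# In Bullet these map onto btRigidBody::setFriction, setRestitution
# and setDamping(linear, angular).
bowling_params = {
    "ball": {
        "mass_kg": 7.0,          # regulation balls are roughly 2.7-7.3 kg
        "friction": 0.1,         # oiled lane: low lateral friction
        "restitution": 0.3,      # guess; bowling balls barely bounce
        "linear_damping": 0.0,
        "angular_damping": 0.1,  # small, so the ball keeps rolling
    },
    "pin": {"mass_kg": 1.6, "friction": 0.5, "restitution": 0.4},
    "lane": {"friction": 0.1, "restitution": 0.2},
}
```

Note that Bullet combines the two bodies' friction/restitution at each contact, so the effective ball-lane friction depends on both entries; tune them as a pair.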
  8. OK guys, I've found the problem, and it was a very, very, very stupid one: the size of the sphere mesh. I was convinced, absolutely convinced, that the size was 1, but I was wrong. I've created a new mesh, and these are the results: It seems correct. Can you confirm? Thanks.
  9. Sorry, but I don't understand very well; I'm a bit confused. I've understood the problem that you've explained, but not the solution.
  10. Do you mean the attenuation? Because if I use the original attenuation formula described in the tutorial:

```hlsl
// surface-to-light vector
float3 lightVector = lightPosition - position;

// compute attenuation based on distance - linear attenuation
float attenuation = saturate(1.0f - length(lightVector) / lightRadius);
```

    I obtain the same, wrong, result.
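That linear falloff reaches exactly zero at the light radius, which is worth checking numerically: if the bounding-sphere mesh really had radius equal to `lightRadius`, there should be no visible seam at its edge. A hard cut therefore points at a mismatch between the mesh size and the radius used in the shader, which is what the thread eventually confirms. A quick Python check of the formula:

```python
def linear_attenuation(distance, light_radius):
    """Linear falloff from the tutorial: saturate(1 - d / r)."""
    return max(0.0, min(1.0, 1.0 - distance / light_radius))

# Attenuation fades smoothly and hits exactly zero at the radius, so a
# visible hard edge means the sphere mesh extends past (or stops short
# of) the radius the shader uses -- a mesh-scale mismatch, not a bug in
# the formula itself.
print(linear_attenuation(5.0, 10.0))   # 0.5
print(linear_attenuation(10.0, 10.0))  # 0.0
```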
  11. Yes, but I already do that: the vertex shader receives the world matrix (for the sphere) that you wrote above. I don't think the problem is there. The problem is the clear cut that you can see in all the screenshots I've linked in the previous posts. Small sphere radius: Big sphere radius:
  12. You're right; in fact, also storing the x and y positions doesn't change the result. I don't understand very well. What do you mean?
  13. Very interesting, you're right: at the moment I store color (diffuse), normals, and depth, but not the x and y positions. Then, as you say, I get the position from the vertex shader. I'm going to store the x and y positions as well. Are you storing positions and depth in the same texture, or using two different textures?
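For context, the reason x and y usually need not be stored in the G-buffer is that they can be reconstructed from the stored depth plus the pixel's screen location. A sketch of that reconstruction for a standard symmetric perspective projection (names and the sign convention are assumptions; right-handed view spaces negate z):

```python
def reconstruct_view_pos(ndc_x, ndc_y, view_z, proj_00, proj_11):
    """Rebuild a view-space position from stored depth + pixel location.

    With view-space depth in the G-buffer, x and y follow from the
    projection: ndc_x = proj_00 * view_x / view_z, so
    view_x = ndc_x * view_z / proj_00 (and likewise for y).
    proj_00 / proj_11 are the projection matrix's focal-length terms.
    """
    return (ndc_x * view_z / proj_00,
            ndc_y * view_z / proj_11,
            view_z)

print(reconstruct_view_pos(0.5, -0.25, 10.0, 1.0, 1.0))  # (5.0, -2.5, 10.0)
```

Storing full positions works too, but it costs an extra render target; a single depth channel plus this reconstruction is the common deferred-shading layout.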
  14. It's a very interesting question; in fact, for me, something is going wrong somewhere else. But where? If I look at the data in PIX it all seems correct (diffuse, specular, normal, and depth). But, of course, I'm not totally sure.
  15. But this is the approach that I've used in the code posted above (where I posted the whole pixel shader), and as you can see the result isn't correct.

    [quote]Well, try using it like this: set the first attenuation parameter to 1.0f, the second one too, and the third one to 0.0f. Tell me how it looks; for me it's working this way.[/quote]

    This is the result: I'm very confused; I don't understand what's going wrong.