HDR lighting + bloom effect


19 replies to this topic

#1 cyberlorddan   Members   -  Reputation: 122

Posted 26 April 2009 - 05:43 AM

Hi! I want to implement HDR lighting in my engine, but I just can't find a good tutorial for Managed DirectX that shows how to do it. Basically, I understand what I have to do:

1. Render the color data to a texture (the color can exceed 1.0f).
2. Resize the texture (make it smaller).
3. Set the areas that are below a luminance level to black (0).
4. Blur the texture.
5. Scale the texture back to its original size.
6. Render the original texture combined with the resulting texture from step 5.
7. Get the maximum luminance level of the texture resulting from step 6.
8. Divide the texture colors by the maximum luminance.

1st question: are these the right steps to perform HDR rendering?

2nd question(s): if they are, how do I render the color data to a texture? How do I resize the texture? How do I get the luminance of a pixel? How do I blur the texture? How do I combine two textures? How do I get the maximum luminance level of a texture?

3rd question: can all those operations be described in one .fx file? (I mean I don't want to do any of these operations within my application code; I don't want to use the CPU.)

Thanks in advance.

#2 wolf   Members   -  Reputation: 848

Posted 26 April 2009 - 06:04 AM

You can start with the examples in the DirectX SDK. They are a good starting point for doing your own stuff.

#3 wolf   Members   -  Reputation: 848

Posted 26 April 2009 - 06:07 AM

Oops, I just saw that you are specifically looking for Managed DirectX... I think it would be good to check out the C++ stuff first.
What you describe is a good starting point. You will have to use render targets with a higher precision than 8 bits per channel, and then there are lots of small challenges attached to each of the points you mention in 1-8.
The level of detail you go into will determine how good your HDR pipeline is in the end.
Gamma correction will also be a major topic to look into.

#4 MJP   Moderators   -  Reputation: 10917

Posted 26 April 2009 - 06:50 AM

First of all... why MDX? It's a dead project that's no longer included in the SDK, and will never be updated again. These days if you want to do DirectX with a managed language your only real choices are SlimDX and XNA. The former is a (very good) wrapper of DX9/DX10/DX10.1, while the latter is a wrapper of DX9 combined with other game-related framework utilities (it's also compatible with the Xbox 360 and Zune).

Secondly, I agree with wolf that the SDK samples are a good place to start. In fact there's actually a managed port of the HDR Pipeline sample that happens to be the first hit when you search on Google for "HDR sample MDX".

You might also want to check out this; it's a good overview.

#5 simonjacoby   Members   -  Reputation: 544

Posted 26 April 2009 - 07:42 AM

EDIT: sorry, I was still writing when you guys answered :)

Hi,

you've got some of the concepts mixed up. From your description, it sounds like you're trying to do three things:

1. Render HDR
2. Perform automatic luminance adaption during tone-map pass
3. Add a bloom effect

Here's a brief explanation of how you do each step, and why:

HDR rendering: this is the source of a lot of confusion, mainly because it's one of those buzzwords that gets thrown around a lot. Here's what it means in practical terms:

When you draw stuff "regularly", you usually do that to a color buffer where each channel is eight bits (for example RGBA8). This is fine for representing colours, but when you're rendering 3D-stuff you really need more precision, because your geometry will be lit and shaded in various ways, which can cause pixels to have very high or very low brightness.

The way to fix this is simply to render to a buffer that has higher precision in each color channel. That's it. Instead of just using 8 bits, use more. One format that is easy to use and has good enough precision is half-float format, in D3D lingo D3DFMT_A16R16G16B16F. Because of limitations of the GPU, you usually can't set the backbuffer to a high-precision format. So instead, you create a texture with this format, render to it, and then copy the result to the backbuffer so it can be shown on your screen.

So, all you have to do is create a texture with this new format (for example), and bind it as the render target instead of the default back buffer. Let's call this texture the HDR render texture. When you have created it and set it as the render target, just draw as usual. When you're done rendering, copy the pixels in the HDR texture to the old back buffer to show it. The copy is usually done by drawing a full screen quad textured with the HDR render texture over the back buffer. When you've done this: voilà! Your very first HDR rendering is done :)
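In MDX the setup looks roughly like this (a rough, untested sketch; the variable names are made up and error handling is left out, so check the exact overloads against the documentation):

// create the HDR render texture once, at startup
Texture hdrTexture = new Texture(device, width, height, 1,
    Usage.RenderTarget, Format.A16B16G16R16F, Pool.Default);
Surface hdrSurface = hdrTexture.GetSurfaceLevel(0);
Surface backBuffer = device.GetBackBuffer(0, 0, BackBufferType.Mono);

// every frame: draw the scene into the HDR texture...
device.SetRenderTarget(0, hdrSurface);
device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, Color.Black, 1.0f, 0);
device.BeginScene();
// ... draw your scene as usual ...
device.EndScene();

// ...then draw a full screen quad textured with it over the back buffer
device.SetRenderTarget(0, backBuffer);
device.BeginScene();
device.SetTexture(0, hdrTexture);
// ... draw the full screen quad here ...
device.EndScene();
device.Present();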

If you've done this correctly, the first thing you will notice is that there has been no improvement at all to your regular rendering ;) This is because we haven't done any of the cool stuff that higher precision enables us to do. Some of the most common things people do are bloom, exposure, vignetting and luminance adaption (exposure, vignetting and luminance adaption are usually called tone-mapping when used together).

Here's what they are, and how you do them.

Exposure: there's a great article written by Hugo Elias that explains it much better than I could here: http://freespace.virgin.net/hugo.elias/graphics/x_posure.htm
In practice, that article boils down to a single line of code at the end of your shader:

float4 exposed = 1.0 - pow( 2.71, -( vignette * unexposed * exposure ) );

where 'unexposed' is the "raw" pixel value from your HDR texture, 'vignette' is explained below, and 'exposure' is the constant K in Hugo Elias' article. In my code it's simply declared as:

const float exposure = 2.0;

...because 2.0 makes my scene look nice. You may have to use a different value that looks good for you, if you decide to implement exposure. If you want it a bit more robust, you can make this happen automatically; this is described under 'luminance adaption' below. Also, know that there are several ways of performing the exposure, with different formulas, which result in different images. The Hugo Elias one is an easy way to get started, though.

Vignetting: because a lens in a camera has a curved shape, it lets in less light at the edges, so many photos or films (especially from cheap cameras) have noticeably darkened edges. See example here: http://mccollister.info/vignette70.jpg. This effect is called vignetting. It can be simulated with two lines of code:

float2 vtc = float2( iTc0 - 0.5 );
float vignette = pow( 1 - ( dot( vtc, vtc ) * 1.0 ), 2.0 );

...where iTc0 are the texture coordinates of the full screen quad, ranging from 0..1. The result is a factor that is 1.0 in the center of the screen and decreases towards the edges.

Luminance adaption: this is part of the exposure, but can be done separately. In Hugo's code the constant K (in my code, the variable 'exposure') is fixed, meaning that you have to tweak it manually for a scene to look good. If a level varies a lot in brightness (for example you are standing in a dark room and then walk outside into a sunny day), no single value of K will work well for both scenes (the sunny outside may be 10,000 times (or more) brighter than the dark inside). Instead, you need to measure how bright the scene is so you can adjust K accordingly.

The easiest way to do this is to take the average of all pixels in the HDR texture. A fast way to do that is to make mip-maps of the HDR texture, all the way down to a 1-pixel texture. This final one-pixel texture will then contain the average of all the pixels above, which is the same as the average scene luminance. Use this value as K when doing the exposure. You do this simply by using the 1-pixel texture as input to the exposure, instead of the hardcoded K (or 'exposure' as it's called in my code example). You will need to tweak it to look good and adapt the way you want, but when it's done your renderer can handle all kinds of brightnesses, which is very cool :)
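Just to make that concrete, the tone-map pass could look something like this in HLSL (an untested sketch; the sampler names are made up, and using 1 / average luminance as the exposure is just one simple choice):

sampler sceneSampler;    // the HDR render texture
sampler avgLumSampler;   // the 1x1 mip level holding the average luminance

float4 ToneMapPS( float2 iTc0 : TEXCOORD0 ) : COLOR0
{
    float3 unexposed = tex2D( sceneSampler, iTc0 ).rgb;
    float avgLum = tex2D( avgLumSampler, float2( 0.5, 0.5 ) ).r;

    // vignette, as described above
    float2 vtc = iTc0 - 0.5;
    float vignette = pow( 1 - dot( vtc, vtc ), 2.0 );

    // use the measured scene luminance instead of a hard-coded exposure constant
    float exposure = 1.0 / max( avgLum, 0.0001 );
    float3 exposed = 1.0 - pow( 2.71, -( vignette * unexposed * exposure ) );
    return float4( exposed, 1.0 );
}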

Finally, there's the blooming: I'm sure you already know what this is; it's simply making bright parts of the scene glow a bit. This is done by taking a copy of the current scene, blurring it, and adding the blurred version back to the original. To make this fast, you usually scale the scene down to a texture that is for example 1/4 of the original size, and then blur that. Another thing is that you usually only want the brightest pixels to glow, not the entire scene; therefore, when scaling down the scene, you usually also subtract a factor from the original pixels, for example 1.0, but you can use whatever looks good. The smaller this factor is, the more parts of your scene will glow, and vice versa.
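For example, the downsample/bright-pass and the final combine might look roughly like this (again just an untested sketch with made-up names, run while drawing full screen quads into the smaller texture and into the back buffer respectively):

sampler sceneSampler;   // full-size HDR scene
sampler bloomSampler;   // blurred, downsized bright-pass result

// bright pass: run while downsampling to the 1/4-size texture
float4 BrightPassPS( float2 iTc0 : TEXCOORD0 ) : COLOR0
{
    float3 c = tex2D( sceneSampler, iTc0 ).rgb;
    return float4( max( c - 1.0, 0.0 ), 1.0 );   // 1.0 is the subtracted factor, tweak to taste
}

// combine: add the blurred bloom back onto the scene
float4 CombinePS( float2 iTc0 : TEXCOORD0 ) : COLOR0
{
    float3 scene = tex2D( sceneSampler, iTc0 ).rgb;
    float3 bloom = tex2D( bloomSampler, iTc0 ).rgb;
    return float4( scene + bloom, 1.0 );
}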

Whoa, long post :)

While this probably seems pretty complicated, depending on what look you want and the type of game you are creating, you can decide to implement all of this or just some of it. Modern FPS and racing games implement most of the above, but if you just want to make a simple space shooter with some nice glow effects, all you have to implement is the "render to high-precision texture" part and the bloom part.

For starters, you should probably just try that, and then add the other effects as you get more comfortable.

So, to answer your questions:

1. It depends, see above :)
2. You render color data to a texture by creating a texture with usage D3DUSAGE_RENDERTARGET and a high precision pixel format, and then setting it with device->SetRenderTarget( 0, m_hdr_rt_tex );
3. You resize by creating more mip levels for your texture, and rendering to them. Don't forget to set the ViewPort to match the mip level size.
4. One simple way of getting the luminance is simply averaging the color channels together. This can be done with a dot product, like so:
float lum = dot( color.rgb, float3( 0.333, 0.333, 0.333 ) );
Some people like to weigh the different channels differently with more on green and a lot less in blue, but in practice nobody ever notices the difference unless you point it out ;) Feel free to experiment :)
5. You blur a texture by averaging several nearby samples together (see the sketch right after this list).
6. It depends, but when adding bloom you usually just add the color values.
7. Described above.
8 (3rd? :)). Yes, you can have everything in one .fx-file, create each one as a technique (downsample_technique, bloom_combine_technique etc).
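To illustrate point 5, a very simple horizontal blur pass might look like this (rough, untested sketch; you'd run a matching pass with vertical offsets afterwards, and texelSize is 1.0 / the width of the texture you're blurring):

sampler srcSampler;
float texelSize;

float4 BlurHorizontalPS( float2 iTc0 : TEXCOORD0 ) : COLOR0
{
    float3 sum = 0;
    // average a few neighbouring samples; wider kernels and proper weights give smoother results
    for( int i = -2; i <= 2; i++ )
        sum += tex2D( srcSampler, iTc0 + float2( i * texelSize, 0 ) ).rgb;
    return float4( sum / 5.0, 1.0 );
}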

Best of luck!

Simon

#6 leet bix   Members   -  Reputation: 116

Posted 26 April 2009 - 08:26 AM

Superb post, Simon.

#7 cyberlorddan   Members   -  Reputation: 122

Posted 26 April 2009 - 08:36 AM

Wow... thanks for the fast answers.

OK. So, to answer your... answers :)... I'm working on an RTS game, and right now I'm doing the map editor. The reason I'm using MDX is that I don't know how to add buttons, panels, and so on in native C++. (The game application is written in native C++; I've done the start screen and now I have to do the level editor.)

I've looked at the samples that come with the SDK, but I simply don't understand them. I started studying HLSL two days ago (I'm following the www.riemers.net tutorial).

Thanks for the 'long answer' :). It was the kind of answer I was expecting to get.
But I don't know how to do some of the things you described there... 1st, you said that I have to copy the HDR texture to the back buffer. Isn't there a simpler way of doing this? Something like device.backbuffer = hdrTexture? From what you wrote, I have to create 4 vertices and render them with the HDR texture, right?
2nd: I don't know how to scale the texture (create mipmaps).
3rd: I didn't understand how to blur the texture :(

I think that's all :)

[Edited by - cyberlorddan on April 26, 2009 3:36:37 PM]

#8 MJP   Moderators   -  Reputation: 10917

Posted 26 April 2009 - 11:27 AM

Quote:
Original post by cyberlorddan

OK. So, to answer your... answers :)... I'm working on an RTS game, and right now I'm doing the map editor. The reason I'm using MDX is that I don't know how to add buttons, panels, and so on in native C++. (The game application is written in native C++; I've done the start screen and now I have to do the level editor.)



Well, I really don't think you want to create one version of your renderer in native DX and another version in MDX. That's a disaster waiting to happen.

What you CAN do is generate managed wrappers for your native C++ classes using C++/CLI. This will allow you to write your editor in C# and use the same native rendering back-end. However, I'll warn you that although it starts out fairly simple, maintaining your wrappers can turn into a very non-trivial task. If you're not working on a bigger team, it's usually much simpler to just have everything written in managed code.

Another option is to use a C++ toolkit like Qt or WxWidgets for doing the UI. Those are generally much easier to work with than the native Windows API.


#9 leet bix   Members   -  Reputation: 116

Posted 26 April 2009 - 04:29 PM

1. What you generally do, once you have all of your final images and all that's left to do is merge them, is get the back buffer through the device, set it as the render target, then draw a full-screen quad that combines all of the intermediate images into your one final image.

2 and 3 can be answered if you take the time to look at some code from the SDK or another relevant source; for examples of how to do both, take a look at this article: http://www.gamedev.net/columns/hardcore/hdrrendering/

There is also a more in-depth and complex description of the HDR process, based on the Direct3D 10 API, here on the wiki.

#10 cyberlorddan   Members   -  Reputation: 122

Posted 27 April 2009 - 05:13 AM

Thanks for the info. I'm slowly implementing HDR in my engine (I've got everything I need to do this... thanks for the links :) ).
One more question... You said that MDX is a dead project... does that mean it won't be updated anymore, or only that its documentation won't be updated? I'm asking because I saw that tessellation was only introduced in DirectX 11, and yet while I was modifying my MDX engine I saw somewhere (I can't remember where) a struct or enumerator or something (it showed up in the IntelliSense list) named Tesselation... :| It might be a stupid question, but I really don't know very much about MDX :(

#11 MJP   Moderators   -  Reputation: 10917

Posted 27 April 2009 - 05:30 AM

Quote:
Original post by cyberlorddan
One more question... You said that MDX is a dead project... does that mean it won't be updated anymore, or only that its documentation won't be updated? I'm asking because I saw that tessellation was only introduced in DirectX 11, and yet while I was modifying my MDX engine I saw somewhere (I can't remember where) a struct or enumerator or something (it showed up in the IntelliSense list) named Tesselation... :| It might be a stupid question, but I really don't know very much about MDX :(


Yes, it won't be updated in any way. No D3D10, no D3D10.1, no D3D11. Not even bug fixes. Like I said it's not even included in the SDK anymore. The tessellation stuff you're seeing in the documentation was never actually supported in D3D9 hardware, and is completely different from the programmable tessellation available in D3D11.





#12 cyberlorddan   Members   -  Reputation: 122

Posted 27 April 2009 - 05:42 AM

Quote:
Original post by cyberlorddan
Thanks for the info. I'm slowly implementing HDR in my engine (I've got everything I need to do this... thanks for the links :) ).
One more question... You said that MDX is a dead project... does that mean it won't be updated anymore, or only that its documentation won't be updated? I'm asking because I saw that tessellation was only introduced in DirectX 11, and yet while I was modifying my MDX engine I saw somewhere (I can't remember where) a struct or enumerator or something (it showed up in the IntelliSense list) named Tesselation... :| It might be a stupid question, but I really don't know very much about MDX :(


I said I knew everything I needed (well, I was wrong). I ran into another problem :(

First of all, the format of the texture that I render the scene to. If I set it to A16B16G16R16, then semi-transparent objects are rendered wrong (instead of blending with the content that's below them, they blend with the color I set in the device.Clear() method).

Second, I have no depth buffer :| When rendering to the texture (surface) instead of the screen, the depth buffer doesn't work. Everything is rendered in the order I tell it to render (meaning that objects that are in the back are shown in front of others).

Any solutions? :(

By the way, I should have mentioned that setting the texture format to RGB8 solves the transparency problem, but that's not an HDR format, right?



#13 leet bix   Members   -  Reputation: 116

Posted 27 April 2009 - 08:20 AM

Have you looked at the HDRLighting sample in the SDK?

#14 cyberlorddan   Members   -  Reputation: 122

Posted 27 April 2009 - 08:35 AM

Quote:
Original post by leet bix
Have you looked at the HDRLighting sample in the SDK?


Yes... but when I try to execute it, it gives an error saying that the device cannot be initialized properly... :(
I've been thinking that this problem might be caused by my graphics card being incompatible (it's kind of old... a GeForce 6200. I'll replace it on the 10th of May with a 9800 - my birthday :D ).
Could this also be because I'm using MDX instead of native DirectX?

#15 cyberlorddan   Members   -  Reputation: 122

Posted 27 April 2009 - 09:48 AM

I think I'd better post the important parts of my code (I removed the parts that weren't relevant).

Let me explain first what this code should do (or what's left of it). First I initialize all the stuff I have to. The function render() is called every frame. The problems: objects are rendered in front of others (the depth buffer doesn't work), and the transparency is a mess... You can look at this image to see the results I get: http://i383.photobucket.com/albums/oo272/cyberlorddan/dxprob.png

I also explained some things in the image.


namespace LevelEditor
{
    public partial class Main : Form
    {
        int widthP;
        int heightP;

        Device motorGrafic;

        VertexBuffer vb = null;
        IndexBuffer ib = null;

        Matrix projection;
        Matrix camera;

        short[] indices = { 0, 1, 2, 2, 1, 3 };

        CustomVertex.PositionNormalTextured[] terrainTriangle;
        TerrainPointByPoint[] terrainDetailPBP; // TerrainPointByPoint is a class that holds the terrain data such as height, texture, pathing and so on....

        Texture rocksTexture;
        Texture dirtFinalTexture;

        Texture originalRenderedScene;
        Surface originalRenderSurface;
        Surface bbS;
        Surface depthStencilS;

        CustomVertex.TransformedTextured[] screenVertices = new CustomVertex.TransformedTextured[6];

        Effect effect;

        float curWindHei;
        float curWindWid;

        public Main()
        {
            InitializeComponent();
            // widthP and heightP represent the terrain size. they are initialized here (i removed the code because it was big and wasn't relevant)

            initializeMainEngine();
            initializeBuffers();
            initializeMeshes();
            initializeTextures();

            vd = new VertexDeclaration(motorGrafic, velements);

            originalRenderedScene = new Texture(motorGrafic, this.Width, this.Height, 1, Usage.RenderTarget, Format.A16B16G16R16, Pool.Default);
            originalRenderSurface = originalRenderedScene.GetSurfaceLevel(0);
        }

        void initializeBuffers()
        {
            vb = new VertexBuffer(typeof(CustomVertex.PositionNormalTextured), 4, motorGrafic, Usage.Dynamic | Usage.WriteOnly, CustomVertex.PositionNormalTextured.Format, Pool.Default);
            vb.Created += new EventHandler(this.OnVertexBufferCreate);
            OnVertexBufferCreate(vb, null);
            ib = new IndexBuffer(typeof(short), indices.Length, motorGrafic, Usage.WriteOnly, Pool.Default);
            ib.Created += new EventHandler(this.OnIndexBufferCreate);
        }

        void initializeMainEngine()
        {
            PresentParameters paramPrez = new PresentParameters();
            paramPrez.SwapEffect = SwapEffect.Discard;
            paramPrez.Windowed = true;
            paramPrez.MultiSample = MultiSampleType.FourSamples;
            paramPrez.AutoDepthStencilFormat = DepthFormat.D16;
            paramPrez.EnableAutoDepthStencil = true;
            paramPrez.BackBufferFormat = Format.X8R8G8B8;

            motorGrafic = new Device(0, DeviceType.Hardware, this.splitContainer1.Panel2, CreateFlags.SoftwareVertexProcessing, paramPrez);
            effect = Effect.FromFile(motorGrafic, "defaultEffect.fx", null, ShaderFlags.None, null);
        }

        void initializeTextures()
        {
            // removed code
        }

        void initializeMeshes()
        {
            // removed code
        }

        void initializeTerrain()
        {
            // removed code
        }

        void OnIndexBufferCreate(object sender, EventArgs e)
        {
            // removed code
        }

        void OnVertexBufferCreate(object sender, EventArgs e)
        {
            VertexBuffer buffer = (VertexBuffer)sender;
            originalRenderedScene = new Texture(motorGrafic, this.splitContainer1.Panel2.Width, this.splitContainer1.Panel2.Height, 1, Usage.RenderTarget, Format.A16B16G16R16F, Pool.Default);
            originalRenderSurface = originalRenderedScene.GetSurfaceLevel(0);

            // some code removed here
        }

        void generateTerrainData(int whichOne)
        {
            // code removed here
        }

        Point GetMouseCoordonates()
        {
            // code removed here
        }

        void render()
        {
            projection = Matrix.PerspectiveFovLH((float)Math.PI / 4, curWindWid / curWindHei, 0.1f, 50.0f);
            camera = Matrix.LookAtLH(currentCameraPosition, currentCameraTarget, currentCameraUp);

            bbS = motorGrafic.GetBackBuffer(0, 0, BackBufferType.Mono);

            motorGrafic.SetRenderTarget(0, originalRenderSurface);

            motorGrafic.Indices = ib;
            motorGrafic.VertexDeclaration = vd;
            motorGrafic.RenderState.SourceBlend = Blend.SourceAlpha;
            motorGrafic.RenderState.DestinationBlend = Blend.InvSourceAlpha;
            motorGrafic.SetStreamSource(0, vb, 0);
            motorGrafic.BeginScene();

            motorGrafic.Clear(ClearFlags.Target | ClearFlags.ZBuffer, Color.CornflowerBlue.ToArgb(), 1, 0);

            motorGrafic.RenderState.AlphaBlendEnable = true;
            motorGrafic.RenderState.ZBufferEnable = true;

            effect.SetValue("xColoredTexture", rocksTexture);
            effect.Technique = "Simplest";
            effect.Begin(0);
            effect.BeginPass(0);
            effect.SetValue("xViewProjection", Matrix.Translation(trackBar1.Value, trackBar2.Value, trackBar3.Value) * camera * projection);
            effect.SetValue("xRot", Matrix.Translation(trackBar1.Value, trackBar2.Value, trackBar3.Value));
            motorGrafic.SetTexture(0, rocksTexture);
            motorGrafic.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, 4, 0, 2); // i use this to render the position of my light

            // i render here my scene

            }
            };

            effect.EndPass();
            effect.End();
            motorGrafic.EndScene();

            motorGrafic.RenderState.Lighting = false;
            motorGrafic.SetRenderTarget(0, bbS);
            motorGrafic.SetTexture(0, originalRenderedScene);
            motorGrafic.Clear(ClearFlags.Target, Color.Red, 1, 0);
            motorGrafic.BeginScene();

            motorGrafic.VertexFormat = CustomVertex.TransformedTextured.Format;
            motorGrafic.RenderState.CullMode = Cull.None;

            motorGrafic.DrawUserPrimitives(PrimitiveType.TriangleList, 2, screenVertices);

            motorGrafic.EndScene();
            motorGrafic.Present();
        }

        void setCameraPosition(float xCamPozS, float yCamPozS, float zCamPozS)
        {
            // removed code
        }

        void setCameraTarget(float xCamTarS, float yCamTarS, float zCamTarS)
        {
            // removed code
        }
    }
}


...so how do I fix this?

#16 leet bix   Members   -  Reputation: 116

Posted 27 April 2009 - 09:45 PM


originalRenderedScene = new Texture(motorGrafic, this.Width, this.Height, 1, Usage.RenderTarget, Format.A16B16G16R16, Pool.Default);
originalRenderSurface = originalRenderedScene.GetSurfaceLevel(0);


Only the latest DirectX 10 compatible graphics cards (NVIDIA G8x) support alpha blending, filtering and multi-sampling on a 16:16:16:16 render target. Graphics cards that support the 10:10:10:2 render target format support alpha blending and multi-sampling of that format (ATI R5 series). Some DirectX 9 graphics cards that support the 16:16:16:16 format support alpha blending and filtering (NVIDIA G7x); others support alpha blending and multi-sampling but not filtering (ATI R5 series). None of the alternative color spaces (HSV, CIE Yxy, L16uv, RGBE) support alpha blending, so all blending operations still have to happen in RGB space.
An implementation of a high-dynamic-range renderer that renders into 8:8:8:8 render targets might be done by distinguishing between opaque and transparent objects. The opaque objects are stored in a buffer that uses the CIE Yxy color model or the L16uv color model to distribute precision over all four channels of this render target. Transparent objects that would use alpha blending operations would be stored in another 8:8:8:8 render target in RGB space. Therefore only opaque objects would receive better color precision.
To give transparent and opaque objects the same color precision, a multiple render target consisting of two 8:8:8:8 render targets might be used. For each color channel, bits 1-8 would be stored in the first render target and bits 4-12 in the second render target (an "RGB12AA" render target format). This way there is a 4-bit overlap that should be good enough for alpha blending.
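Just to make the overlap idea concrete, here is one possible interpretation in HLSL (my own rough, untested sketch, not from the text above; it packs a 0..1 value with 12 useful bits into two 8-bit channels that overlap by 4 bits):

// encode one channel into two 8-bit render target channels with a 4-bit overlap
void EncodeRGB12AA( float v, out float high, out float low )
{
    float v12 = saturate( v ) * 4095.0;      // 12-bit value
    high = floor( v12 / 16.0 ) / 255.0;      // top 8 bits    -> first render target
    low  = fmod( v12, 256.0 ) / 255.0;       // bottom 8 bits -> second render target
}

float DecodeRGB12AA( float high, float low )
{
    float h = floor( high * 255.0 + 0.5 );
    float l = floor( low * 255.0 + 0.5 );
    return ( h * 16.0 + fmod( l, 16.0 ) ) / 4095.0;
}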

#17 cyberlorddan   Members   -  Reputation: 122

Posted 28 April 2009 - 05:11 AM

Quote:
Original post by leet bix

originalRenderedScene = new Texture(motorGrafic, this.Width, this.Height, 1, Usage.RenderTarget, Format.A16B16G16R16, Pool.Default);
originalRenderSurface = originalRenderedScene.GetSurfaceLevel(0);


Only the latest DirectX 10 compatible graphics cards (NVIDIA G8x) support alpha blending, filtering and multi-sampling on a 16:16:16:16 render target. Graphics cards that support the 10:10:10:2 render target format support alpha blending and multi-sampling of that format (ATI R5 series). Some DirectX 9 graphics cards that support the 16:16:16:16 format support alpha blending and filtering (NVIDIA G7x); others support alpha blending and multi-sampling but not filtering (ATI R5 series). None of the alternative color spaces (HSV, CIE Yxy, L16uv, RGBE) support alpha blending, so all blending operations still have to happen in RGB space.
An implementation of a high-dynamic-range renderer that renders into 8:8:8:8 render targets might be done by distinguishing between opaque and transparent objects. The opaque objects are stored in a buffer that uses the CIE Yxy color model or the L16uv color model to distribute precision over all four channels of this render target. Transparent objects that would use alpha blending operations would be stored in another 8:8:8:8 render target in RGB space. Therefore only opaque objects would receive better color precision.
To give transparent and opaque objects the same color precision, a multiple render target consisting of two 8:8:8:8 render targets might be used. For each color channel, bits 1-8 would be stored in the first render target and bits 4-12 in the second render target (an "RGB12AA" render target format). This way there is a 4-bit overlap that should be good enough for alpha blending.


Thanks for the answer. So I'll have to wait until I get a new NVIDIA graphics card... I could use this time to move the code to C++.
Anyway, I still can't get the depth buffer to work properly. This isn't a hardware problem, because the HDRFormats sample shows the teapot as it should. Any solution for the depth buffer problem?

#18 leet bix   Members   -  Reputation: 116

Posted 28 April 2009 - 08:16 AM

I don't think you need newer hardware, just use a different render target format to store your HDR values.
I don't know what's wrong with the depth buffer, but I don't think you should have to use a floating-point buffer to hold the information; 256 deltas per channel should be plenty.

#19 cyberlorddan   Members   -  Reputation: 122

Posted 28 April 2009 - 08:28 PM

Quote:
Original post by leet bix
I don't think you need newer hardware, just use a different render target format to store your HDR values.
I don't know what's wrong with the depth buffer, but I don't think you should have to use a floating-point buffer to hold the information; 256 deltas per channel should be plenty.


I 'solved' the problem with the depth buffer. I had multisampling turned on. When I turned it off, the depth buffer worked as it should.
But now a new, probably stupid, question arises: how do I turn multisampling on without 'damaging' the depth buffer?

#20 simonjacoby   Members   -  Reputation: 544

Posted 28 April 2009 - 11:52 PM

You need to create a multisampled depth buffer to match your color buffer. The color/depth must always match in multisample modes (and no multisampling is also a multisample mode ;))
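In MDX, something along these lines should do it (an untested sketch; I haven't double-checked the exact parameter order or property names, so treat them as assumptions):

// create a depth/stencil surface whose size and MultiSampleType match the
// color surface you are rendering into
Surface hdrDepth = device.CreateDepthStencilSurface(
    width, height,
    DepthFormat.D16,
    MultiSampleType.None,   // must match the MultiSampleType of the color surface
    0,                      // multisample quality
    true );                 // discard

device.SetRenderTarget( 0, hdrSurface );
device.DepthStencilSurface = hdrDepth;
// ... render the scene ...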

/Simon



