
Necrolis

Member Since 12 May 2010
Offline Last Active Yesterday, 03:19 AM

Posts I've Made

In Topic: How many APIs are for drawing text?

26 July 2014 - 08:25 AM

In a slightly more generalized answer: if you can attach PIX, it will help you track down the specific DX API call used to perform an operation or render, as its frame-capture mode gives you a breakdown of the DX calls in a frame along with before-and-after views of the state and the rendered frame. It will also help in debugging any issues you might have (though unfortunately MS saw fit to break PIX with a certain Win7 patch...). The various GPU vendors also provide similar free tools (Nsight for NVIDIA, GPU PerfStudio for AMD, and GPA for Intel).

 

The game might possibly call D3DPERF_SetOptions() to disable PIX, but it's very easy to NOP/remove.
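For reference, that anti-profiling hook is a one-liner (a sketch; D3DPERF_SetOptions is declared in d3d9.h, and per its documentation bit 0 requests that analysis tools be blocked):

#include <d3d9.h>

// Setting bit 0 tells the runtime the app does not give permission to be
// profiled, which stops PIX from attaching; NOPing the call restores PIX.
D3DPERF_SetOptions(1);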


In Topic: Computer Graphics Stackexchange

08 June 2014 - 04:55 AM

Committed


I will commit if I can find my stack exchange login.

<offtopic> If stack exchange isn't the most annoying implementation of single-sign-on ever deployed to customers... </offtopic>

 

I bound mine to a Google account; I've never had problems since.


In Topic: Unreal Engine 4

23 April 2014 - 06:07 AM

Did they drop their dynamic octree GI thingy? Or was it not dynamic to begin with?

They are releasing the Elemental Demo with UE4.1 (for free, see here); however, I still have not been able to figure out whether it contains the SVOGI implementation, or whether it's been converted to Lightmass or the LPV tech they settled on for dynamic GI (the Fable: Legends blog had a post on this in the past week).


In Topic: Unreal Engine 4

25 March 2014 - 01:50 AM

 


They use some hard-coded code style I've not seen before to build the UI 'forms', which looks kind of like nested method calls followed by multiple nested arrays, but each section starts off with a '+'. From memory I can't recall the exact syntax, but it looks quite odd to me. Perhaps it's something new in the latest version of C++?

 

It's actually a metric ton of operator overloading done across multiple classes, from what I can tell; it's really weird to look at, but it makes sense in a visual way, as it closely matches the nesting hierarchy of the UI elements (see the sketch below).
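From memory, the pattern looks roughly like this (a sketch of UE4's Slate declarative syntax; treat the exact widget and attribute names as approximate):

// Slate builds a widget tree via overloaded operators: '+' appends a child
// slot, '[]' fills a slot with content, and attribute setters chain off SNew().
SNew(SVerticalBox)
+ SVerticalBox::Slot()
  .AutoHeight()
  [
      SNew(STextBlock)
      .Text(FText::FromString(TEXT("Hello")))
  ]
+ SVerticalBox::Slot()
  [
      SNew(SButton)
      .Text(FText::FromString(TEXT("Click me")))
  ];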

 

A question I'd like to have answered is whether or not UE4 is set up to make it relatively easy to implement your own lighting models, or if you would have to go ripping deep into the source code to get that to work. Also, is it all node based, or can you use HLSL/GLSL at all?

 

The shaders are very well structured and nicely commented, so it should be a (relatively) simple task to add in new models; with the way they segregate a lot of it, it should be easier than starting from scratch, as you get to reuse a lot of the common infrastructure. The same goes for the C++ code, and for customizing the BRDF (which already has quite a lot of options built in).


In Topic: Emulating SetGammaRamp in a shader

21 March 2014 - 07:33 AM

 

How does your pRamp/GenGammaRamp stuff get used? The D3DGAMMARAMP is based around WORD values, not doubles, plus you seem to be generating an array of 255 values instead of 256.
 

It gets mapped back to a WORD before being used to fill the D3DGAMMARAMP (the reason for the size mapping is that there is also a palette-based software renderer, though it ignores the gamma ramp -.-). As for the off-by-one error, that's probably my fault somewhere along the line, so thanks for catching it :)

 

As for how it gets mapped: it's literally cast to a WORD, as the range mapping is done inside GenGammaRamp (I folded in the range value, fMaxGamma, because I'm only concerned with the D3D gamma; originally it was a parameter):

double GammaTable[256];
D3DGAMMARAMP ramp;

// GenGammaRamp fills GammaTable with values already scaled to the WORD range
GenGammaRamp(myGammaValue, myContrastValue, GammaTable);

for(int i = 0; i < 256; i++)
{
    WORD Gamma = (WORD)GammaTable[i]; // straight truncating cast back to WORD
    ramp.red[i] = ramp.green[i] = ramp.blue[i] = Gamma;
}
pD3ddevice->SetGammaRamp(0, D3DSGR_NO_CALIBRATION, &ramp);

 

If you wanted to do this in a shader, you'd take the array of 256 'gamma' values, and store them in a 256px * 1px texture (you could use D3DFMT_R32F and store them as floats in the 0-1 range). You'd then use a post-processing shader like this:

float3 coords = myColor.rgb; // treat the colour as a texture coordinate
coords = coords * 0.99609375 + 0.001953125;
// ^^^ scale by 255/256 and offset by half a texel (0.5/256) so that 0.0 maps to
// the center of the left texel and 1.0 maps to the center of the right texel
// sample the texture 3 times to convert each channel to the value in the gamma ramp:
myColor.r = tex2D( theTexture, float2(coords.r, 0) ).r;
myColor.g = tex2D( theTexture, float2(coords.g, 0) ).r;
myColor.b = tex2D( theTexture, float2(coords.b, 0) ).r;
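
For completeness, building that 256x1 lookup texture in D3D9 might look like this (a minimal sketch, assuming a valid IDirect3DDevice9* pD3ddevice, R32F texture support, and the WORD-range GammaTable from the earlier snippet):

IDirect3DTexture9* pRampTex = NULL;
pD3ddevice->CreateTexture(256, 1, 1, 0, D3DFMT_R32F, D3DPOOL_MANAGED, &pRampTex, NULL);

D3DLOCKED_RECT rect;
pRampTex->LockRect(0, &rect, NULL, 0);

float* pTexels = (float*)rect.pBits;
for(int i = 0; i < 256; i++)
    pTexels[i] = (float)(GammaTable[i] / 65535.0); // normalize the WORD range to 0-1

pRampTex->UnlockRect(0);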

 

Ah, so it is just a straight "scaled index". Originally I had tried using "Out.Color = pow(In.Color, gGamma/2.2)", but I had no clue how to add in the contrast, and it also washed out the colors very quickly as opposed to the original ramp.

 

I'm already using 1D LUTs to emulate 8-bit palettized color, so technically I should be able to remap the palette LUT to account for the gamma, if I understand this correctly (see the sketch below); though I think it's probably best to first have it working with the double LUT. Your note about the texel centering reminds me that I didn't do this for my palette LUTs, so that fixes something else as well :)
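Folding the ramp into the palette might look something like this (a hypothetical sketch: Palette and GammaCorrectedPalette are illustrative names for 256-entry D3DCOLOR arrays, reusing the WORD-range GammaTable from above):

// Bake the gamma ramp into the palette so one palette lookup applies both
for(int i = 0; i < 256; i++)
{
    BYTE r = (BYTE)((Palette[i] >> 16) & 0xFF);
    BYTE g = (BYTE)((Palette[i] >>  8) & 0xFF);
    BYTE b = (BYTE)((Palette[i]      ) & 0xFF);

    // index each channel into the ramp, then scale the WORD result back to a byte
    r = (BYTE)((WORD)GammaTable[r] >> 8);
    g = (BYTE)((WORD)GammaTable[g] >> 8);
    b = (BYTE)((WORD)GammaTable[b] >> 8);

    GammaCorrectedPalette[i] = D3DCOLOR_XRGB(r, g, b);
}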

