JohnnyCode

Member Since 10 Mar 2008
Offline Last Active Yesterday, 04:36 PM

Posts I've Made

In Topic: Efficient array shuffling in pure HLSL

Yesterday, 04:36 PM

You need a deterministic function over at least 1 million values that returns sufficiently noisy output across them (since you have a fixed index for every pixel).

 

A trigonometric function comes pretty close to this demand for volatility: you pick a period size and a way to perturb the y values over a sufficiently long definition range.

 

If you reduce to 100,000 values and pick the definition range 0.0-100.0, then x is the index multiplied by the constant 1/1000.0. You may then decide to perturb 10 periods of the trigonometric function with a polynomial of 10th degree (20 multiplications); this randomizing polynomial is predefined and does not change.

You can pick the period size, and the ratio of polynomial length to period size, which lets you scale both the noise and the size at which a repeating pattern becomes visible.

 

There are more ways to achieve noise. You may also vary the polynomial's definition (its coefficients) based on the index (still deterministic), or you can experiment with the prime number closest to the index.
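
A minimal HLSL sketch of the idea above; the scale, the coefficients, and the DeterministicNoise name are placeholders of mine, not tuned or prescribed values:

// Deterministic per-index noise: a sine carrier whose phase is perturbed
// by a fixed polynomial, so the same index always yields the same value.
// All constants below are arbitrary placeholders; tune them for your data.
float DeterministicNoise(uint index)
{
    // Map the index into the definition range, e.g. 1/1000.0 spreads
    // 100,000 indices over 0.0 .. 100.0.
    float x = (float)index * (1.0 / 1000.0);

    // Predefined "randomizing" polynomial, evaluated with Horner's scheme.
    const float coeffs[5] = { 0.731, -1.294, 2.083, -0.577, 1.618 };
    float p = 0.0;
    [unroll]
    for (int i = 0; i < 5; ++i)
        p = p * x + coeffs[i];

    // Sine carrier perturbed by the polynomial; frac() keeps only the
    // volatile fractional part, giving a value in 0.0 .. 1.0.
    return frac(sin(x * 6.2831853 + p) * 43758.5453);
}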


In Topic: texture2D lookup bias?

28 October 2014 - 06:29 AM

I still sort of wonder whether, in case the bias is not a whole number, the sample is weighted between the two mip levels. That would be something of a performance hit, since it would require two fetches from very incoherent memory.
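
If it does work that way, the hardware would effectively be doing the equivalent of this manual two-fetch sketch using tex2Dlod; baseLod standing in for the level the hardware would otherwise pick is my own simplification, an illustration rather than the actual hardware path:

// Manual equivalent of sampling at a fractional LOD: two fetches plus a lerp.
// With a bias of 0.5 the result is an equal blend of two neighbouring levels;
// the second fetch is the extra cost wondered about above.
float4 SampleFractionalLod(sampler2D tex, float2 uv, float baseLod, float bias)
{
    float lod = baseLod + bias;
    float4 nearer = tex2Dlod(tex, float4(uv, 0.0, floor(lod)));
    float4 lower  = tex2Dlod(tex, float4(uv, 0.0, floor(lod) + 1.0));
    return lerp(nearer, lower, frac(lod));
}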


In Topic: Fix bad mesh

27 October 2014 - 03:30 PM

In 3ds Max there is a modifier called "STL Check". It will highlight those issues among others; it is up to you which ones you pick to inspect (open edges, spikes, double faces...).


In Topic: texture2D lookup bias?

27 October 2014 - 08:46 AM

Thanks Xycaleth. You seem to understand this parameter deeply. Do I now understand it correctly that a bias of 0.5 will sample from the current level + 0.5, meaning it will sample between the current level and the next lower level with those two samples weighted equally? And that a bias of 1.0 will sample exactly one level lower and return that sample?


In Topic: Is optimization for performance bad or is optimizing too early bad?

26 October 2014 - 05:36 AM

I am a total, radical fan of premature optimization. This stems from the fact that I am very much a "make it faster" hobby programmer. I realized that writing my projects optimized from the very beginning saves me from refactoring and re-debugging them if I decide to optimize later on. It actually keeps me bound to a very good design policy (OOP only for rare objects, critical data being data-oriented). But I do leave isolated parts of a project unoptimized, in a way that lets them be optimized later without refactoring the outer parts. In the end, I am very satisfied with how I designed the project: scalable, fast, and still with room to optimize safely.

