
Lightness1024

Member
  • Content Count

    243
  • Joined

  • Last visited

Community Reputation

936 Good

About Lightness1024

  • Rank
    Member


  1. Lightness1024

    A retrospective on the Infinity project

    That is the mistake Duke Nukem Forever made: don't keep mutating the technology over and over. And you're in the best position to already know that. ECS was hype; it brings very little. Just like deferred was hype, and we're back to forward+ nowadays. Take care.
  2. Lightness1024

    DirectX12 adds a Ray Tracing API

    Just putting this here to troll a bit: this is Frostbite's talk about Battlefront II's static GI.
  3. Lightness1024

    Slightly enhanced impostor technology

    Yes, this is the question. I asked yesterday and it appears there are only 4 specimens, so I'm relieved: at roughly 250 instances each, this method suddenly becomes very relevant. Though, call it YAGNI, I'd love a system that scales to 1000 specimens. It's not strictly necessary right away, but I was thinking of streaming and possibly aggressive compression, e.g. one sub-LOD with the normals stripped off...
  4. Lightness1024

    Slightly enhanced impostor technology

    I found this very interesting write-up by Ryan Brucks about the octahedral impostors in Fortnite, http://www.shaderbits.com/blog/octahedral-impostors, and I have to say it fits my inquiry quite well. This is 2.5D at its best, modern and practical. Though, for 1000 objects with 512*512 atlases, you'd need about 1 GiB of impostor data even if you store heavily compressed 8-bit color, 10-bit depth and 14-bit normals (see the sanity check below). I guess this can be streamed in and out to an extent, but it still seems a bit scary.
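    A quick sanity check of that 1 GiB figure, using the bit layout quoted above (the arithmetic is mine, not from the article):

        // 8-bit color + 10-bit depth + 14-bit normals = 32 bits per texel.
        #include <cstdio>

        int main()
        {
            const long long bitsPerTexel   = 8 + 10 + 14;                 // = 32
            const long long texels         = 512LL * 512LL;               // one atlas
            const long long bytesPerObject = texels * bitsPerTexel / 8;   // 1 MiB
            const long long total          = bytesPerObject * 1000;       // 1000 objects
            std::printf("%lld bytes ~= %.2f GiB\n",
                        total, total / (1024.0 * 1024.0 * 1024.0));
            // Prints ~0.98 GiB, matching the back-of-envelope number in the post.
            return 0;
        }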
  5. Hello gamedev, I am currently evaluating whether it is worth jumping into R&D work for an automatic impostor system in our engine. In the past I've witnessed tremendous performance increases from such a system in the LumenRT engine (which has to cope with unoptimized user-created content). We're in a somewhat similar situation right now: potentially large fields with far too much data (high-poly meshes, etc.), so it would be great if the engine supported auto-impostoring of stuff. Though, to make it a bit more modern, I was thinking we could extend the parallax validity range of billboards by storing depth as well, and render them using parallax occlusion mapping. The invalidation could then happen only after the camera has moved to a much more radical angle than with traditional impostors. There exist techniques using full volumetric billboards that I am aware of, but they need the geometry shader to generate slices and cost heavy voxel storage. I need something very light on bandwidth to cope with Switch/PS4 limitations. Can you point me to modern research on well-balanced impostor techniques along these lines? Or any idea you have on the matter. Thanks.
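    To make the parallax idea above concrete, here is a minimal CPU-side sketch of the parallax-occlusion-mapping march against a billboard's stored depth channel. Everything here (the names, the depthAt sampler, the fixed step count) is my own illustration of the standard POM loop, not an established impostor API:

        #include <functional>

        struct Vec2 { float x, y; };

        // March the view ray through the depth volume until it dips below the
        // stored depth; return the parallax-corrected UV. viewTS* is the view
        // direction in the billboard's tangent space; depthAt() samples the
        // impostor's depth channel in [0,1].
        Vec2 parallaxOcclusionUV(Vec2 uv,
                                 float viewTSx, float viewTSy, float viewTSz,
                                 float depthScale, int steps,
                                 const std::function<float(Vec2)>& depthAt)
        {
            float layerDepth = 1.0f / steps;
            // UV offset per step along the projected view direction.
            Vec2 delta { viewTSx / viewTSz * depthScale * layerDepth,
                         viewTSy / viewTSz * depthScale * layerDepth };
            float currentLayer = 0.0f;
            float sampled = depthAt(uv);
            while (currentLayer < sampled && currentLayer < 1.0f) {
                uv.x -= delta.x;
                uv.y -= delta.y;
                sampled = depthAt(uv);
                currentLayer += layerDepth;
            }
            return uv; // a real shader would also interpolate the last two taps
        }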
  6. Lightness1024

    DirectX12 adds a Ray Tracing API

    Why is it that, all of a sudden, when MS and NV speak about it, it becomes "finally" and gets all the hype, when this has been a subject ever since the Quake 3 ray-tracing demo, 12 years ago: https://www.youtube.com/watch?v=bpNZt3yDXno And I'm not even talking about Heaven Seven (http://www.pouet.net/prod.php?which=5), 18 years ago. DXRT has mercilessly copied the whole OpenRL SDK, available here: https://community.imgtec.com/developers/powervr/openrl-sdk/ And it was already hyped too: https://www.extremetech.com/extreme/161074-the-future-of-ray-tracing-reviewed-caustics-r2500-accelerator-finally-moves-us-towards-real-time-ray-tracing See any similarity in the vocabulary at the time? So, is this a fanboy effect, or historical revisionism? Even Embree has been doing real-time ray tracing for years on CPU only, as long as you keep it to the first bounce. And if you check what Epic has to say about it (their GDC talk on YouTube), you'll see they used a cluster of 4 Tesla V100s with NVLink, and they still were not able to include global illumination. They can afford just 2 rays per effect on $80k worth of hardware. I would put the horses back in the stable, but well, hype is contagious...
  7. Lightness1024

    Is there a doctor in the house?

    You need a fidget cube https://en.wikipedia.org/wiki/Fidget_Cube (to make your compulsion worse)
  8. Lightness1024

    Opinions on cryptocurrencies

    Nice, I think that's a good perspective, and a bit of fairness is overdue in this debate. Absolutely; that's what I start with in my article (linked in the OP).

    True. But there is indeed a small bias toward an increase in value. Pure speculation would leave you with no better than a 50/50 chance of profit vs loss. However, if we take BTC: it is a coin with a fixed supply, and because of people losing their keys (coin burning), and because the entropy of the real economy is "always rising" (added-value theory), BTC is a deflationary currency by nature, which means it IS an investment. There is also a bandwagon effect, which is why some people have accused it of being a Ponzi scheme (more people joining is the reason for the rise in value, which is not sustainable -> pyramid effect). But I refute this accusation in my article.

    This comment has been a subject of extensive discussion in the field, and it doesn't have to be a red flag at all. Buffett said he would not invest in Apple either. You take it or leave it depending on whether your personality matches his, that's all, but it's not an absolute red flag. Authority is not an argument (one of the rhetorical fallacies).

    This is very much, and sadly, true; not so much for BTC, because of its very large capitalization, but on the smaller coins it's the wild wild west and lots of people have been left holding the bag. I personally think this is not a problem at all (an anarcho-liberal point of view); the thing is just to be aware of it and hedge according to the risks you can take. If you can't, just don't play, but there is no need to dismiss the game altogether because the players are rough.
  9. Lightness1024

    Opinions on cryptocurrencies

    All the reasons cited above are valid concerns to me. That's why my preferred cryptocurrency, and the only one I can endorse with intellectual honesty, is Nano (ex RaiBlocks). It uses multiple blockchains and no mining: so no waste of energy, instant transactions, and no fees. The hosts of the network nodes are idealists, like the hosts of Tor nodes. There is less corruption in the Nano space thanks to the absence of a gold rush. It doesn't solve the economics, though (supply/inflation/volatility...). I've heard the opinion, though, that economics could be part of the system: since lots of these things already work with votes, we can imagine a crypto with key interest rates and all the central-bank prerogatives included in the system, with the people keeping control through votes. So, a replica of the current system, but with full democracy.
  10. Lightness1024

    Opinions on cryptocurrencies

    OK for the value. And as for the "one unified currency" that won't work: that is a very sad observation. But I could tolerate the next best thing: the minimum number of different currencies. Let's take the Euro crisis as an example of why a unified currency can't work, and let's establish the fact that we need some amount of elasticity in the value and inflation rates of currencies based on local specificities; we can still attempt to minimize how many fragments are needed, no? Your third point was usage in relation to goods. Yes, adoption is the main issue cryptocurrencies are facing. But will buy/sell transactions for material goods really create a "peg" on the value of those objects? Didn't we see hyperinflation in post-WWI Germany make the price of bread skyrocket to billions of marks? It seems the value can still be volatile even when massively adopted. But inflation is supposed to be controlled by the central banks, so in a system where the monetary base is fixed there can only be deflation. And the speed at which it happens cannot spin out of control, since the deflation is bound to the production volume of the whole economy, and not to arbitrary loans, quantitative easing, or interest rates, which are not controllable parameters in current cryptos. So the reduction in volatility would have to come from an increase in liquidity only, no?
  11. Lightness1024

    Opinions on cryptocurrencies

    It's not really related to gaming, but many games nowadays include an economy à la Second Life, with their own token/coin and downloadable content. There even exists a token designed just for games: Enjin Coin. I personally don't believe in this; I think we should have one unified currency with so much volume that its value becomes stable. Fragmenting currencies into 3000 coins like today creates volatility. I thought there were so many problems with cryptocurrencies in general that I had to write a long rant; I made a full-fledged article about the issues here: https://motsd1inge.wordpress.com/2018/02/10/cryptocurrencies-not-there-yet/ I'd love to hear your thoughts about its content, and about any points you disagree with.
  12. Lightness1024

    Dealing with frustration

    "hackers and painters" by Paul Graham What you talk about is a bit like the white page syndrome isn't it. We all go through that, and yes TODO lists only grow, rarely shrink. Especially when you are alone. To successfully get a personal project to reach a state you can be proud of, you need to keep scale down, leverage libraries, take shortcuts, try to avoid generic & robust "production-like" support of the stuff you do, go straight to your use case only. There will be time way later, to think about "but what about those IGP uses, or what about linux..." in the meantime if you have choices between "generic" and "specific", only consider cost. Sometimes though, you can get both, for example: is it better to use boost filesystem for a neat platform independent code, or Win32 API to go straight to business ? Turns out boost FS is the cheaper option, and it's more generic only as the cherry on top of the cake. But that's not the case of most choices you are going to face. If something bores you, find a library, if some specific problem is core to your passion, do it yourself.
  13. Lightness1024

    [PBR] Renormalize Lambert

    Well, apparently Disney has no qualms about just adding both. But it still appears to be a subject of debate: https://computergraphics.stackexchange.com/questions/2285/how-to-properly-combine-the-diffuse-and-specular-terms https://gamedev.stackexchange.com/q/87796/35669 This nice paper speaks of exactly what I'm concerned with, starting at section 5.1: http://www.cs.utah.edu/~shirley/papers/pg97.pdf And one page later they propose an equation (equation 5) that looks quite different from Disney's naive (as it seems to me) approach.
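    For reference, the coupled diffuse term that paper proposes (its equation 5), transcribed here from memory, so double-check against the PDF: the diffuse lobe is scaled by the energy the specular reflectance $R_s$ leaves behind, and tapers off at grazing angles on both the incoming and outgoing sides:

        f_{r,d}(\theta_i, \theta_o) =
            \frac{28 R_d}{23 \pi} (1 - R_s)
            \left[ 1 - \left( 1 - \frac{\cos\theta_i}{2} \right)^{5} \right]
            \left[ 1 - \left( 1 - \frac{\cos\theta_o}{2} \right)^{5} \right]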
  14. Lightness1024

    [PBR] Renormalize Lambert

    @FreneticPonE are you talking about this: I've never seen this magic; it seems interesting, though. Unfortunately this is just confusing me further. Let's say I choose Lambert for the diffuse and Cook-Torrance for the speculars: am I supposed to just add the two? Lambert doesn't even depend on roughness, so mirror surfaces are going to look half diffuse, half reflective if I just add both. How would one properly combine a Lambert diffuse and a PBR specular?
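    For what it's worth, here is a minimal sketch of one common coupling (my own illustration, not from this thread): weight the Lambert term by whatever energy the Fresnel term did not send into the specular lobe, so mirror-like surfaces lose their diffuse instead of looking half-and-half:

        #include <algorithm>
        #include <cmath>

        const float kPi = 3.14159265f;

        float fresnelSchlick(float f0, float cosTheta)
        {
            return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
        }

        // Single-channel for brevity. D and G are the Cook-Torrance
        // distribution and geometry terms; dot products are in [0,1].
        float shade(float albedo, float f0, float D, float G,
                    float NdotL, float NdotV, float VdotH)
        {
            float F        = fresnelSchlick(f0, VdotH);
            float specular = D * F * G / std::max(4.0f * NdotL * NdotV, 1e-4f);
            float kd       = 1.0f - F;   // energy Fresnel left for the diffuse
            float diffuse  = kd * albedo / kPi;
            return (diffuse + specular) * NdotL;
        }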
  15. Hello, I'd like to ask your take on Lagarde's renormalization of the Disney BRDF for the diffuse term, but applied to Lambert. Let me explain. In this document: https://seblagarde.files.wordpress.com/2015/07/course_notes_moving_frostbite_to_pbr_v32.pdf (page 10, listing 1), we see that he uses 1/1.51 * perceptualRoughness as a factor to renormalize the diffuse part of the lighting function. OK. Now let's take Karis's assertion at the beginning of his famous document: http://blog.selfshadow.com/publications/s2013-shading-course/karis/s2013_pbs_epic_notes_v2.pdf (page 2, diffuse BRDF). I think his premise applies and is reason enough to use Lambert (at least in my case). But from Lagarde's document, page 11, figure 10, we see that Lambert looks frankly equivalent to Disney. From that observation, the question that naturally comes up is: if Disney needs renormalization, doesn't Lambert too? And I'm not talking about 1/π (that one is obvious), but about that roughness-related factor. A wild guess would tell me that because there is no Schlick term in Lambert, and no dependence on roughness, and as long as 1/π is there, the Lambert albedo stays below 1 in all cases, so it shouldn't need further renormalization. So then, where does that extra energy appear in Disney? According to the graph, it's in the high-view-angle, high-roughness zone, so that would mean here: (cf image) This is a very small difference. To my eyes it certainly doesn't justify the huge darkening introduced by the 1/1.51 factor, which takes effect over a much wider range of the function. But this could be perceptual, or just my stupidity. Looking forward to being educated. Best
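    For readers without the PDF at hand, here is a CPU paraphrase of the renormalized Disney diffuse from Lagarde's listing 1, reproduced from memory; treat it as an approximation, not a verbatim copy of the Frostbite code. The 1/1.51 energy factor discussed above only takes full effect at high roughness:

        #include <cmath>

        // Generalized Schlick interpolation between f0 and f90.
        float schlickWeight(float f0, float f90, float u)
        {
            return f0 + (f90 - f0) * std::pow(1.0f - u, 5.0f);
        }

        float disneyDiffuseRenormalized(float NdotV, float NdotL, float LdotH,
                                        float linearRoughness)
        {
            float energyBias   = 0.5f * linearRoughness;          // lerp(0, 0.5, r)
            float energyFactor =                                   // lerp(1, 1/1.51, r)
                1.0f + (1.0f / 1.51f - 1.0f) * linearRoughness;
            float fd90 = energyBias + 2.0f * LdotH * LdotH * linearRoughness;
            float lightScatter = schlickWeight(1.0f, fd90, NdotL);
            float viewScatter  = schlickWeight(1.0f, fd90, NdotV);
            return lightScatter * viewScatter * energyFactor;      // 1/pi applied outside
        }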