

Community Reputation

3291 Excellent

About FreneticPonE

Personal Information

  • Location
    Galactic Central, Messier 81
  1. I'm confused about why you'd want to cast rays over an entire sphere. Unless you're sampling translucency, lighting should only be incoming over the hemisphere around the surface normal; sampling over the entire sphere would produce overdarkening artifacts, i.e. shadows being cast onto objects from behind them.
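For reference, the hemisphere sampling described above can be sketched as cosine-weighted sampling around the surface normal (a minimal illustration; the function and basis construction are not from any particular engine):

```python
import math, random

def sample_hemisphere_cosine(normal):
    """Return a random direction in the hemisphere around `normal`,
    cosine-weighted so grazing rays (which contribute little light)
    are drawn less often."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    # Sample in tangent space; z is aligned with the normal.
    x, y, z = r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1)
    # Build an orthonormal basis around the normal.
    nx, ny, nz = normal
    t = (0.0, 1.0, 0.0) if abs(nx) > 0.9 else (1.0, 0.0, 0.0)
    # b = normalize(cross(normal, t))
    bx = ny * t[2] - nz * t[1]
    by = nz * t[0] - nx * t[2]
    bz = nx * t[1] - ny * t[0]
    bl = math.sqrt(bx * bx + by * by + bz * bz)
    bx, by, bz = bx / bl, by / bl, bz / bl
    # t2 = cross(b, normal)
    tx = by * nz - bz * ny
    ty = bz * nx - bx * nz
    tz = bx * ny - by * nx
    return (x * tx + y * bx + z * nx,
            x * ty + y * by + z * ny,
            x * tz + y * bz + z * nz)

# Every sample lands on the normal's side of the surface,
# so nothing behind the surface can darken it.
d = sample_hemisphere_cosine((0.0, 0.0, 1.0))
assert d[2] >= 0.0
```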
  2. The black edges look like you're marching out of screenspace without falling back on any other reflection source, such as whatever the normal game uses (an environment probe, for example).
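One common fix for those black edges (sketched here with invented helper names and inputs; not any specific game's code) is to fall back to an environment probe whenever the march leaves the screen, and to fade the screenspace result near the borders instead of cutting to black:

```python
def reflect_color(ray_hit, uv, ssr_color, probe_color):
    """Blend a screenspace reflection result with a cubemap/probe fallback.
    `ray_hit`: did the screenspace march find an intersection?
    `uv`: hit coordinates in [0,1]^2; `probe_color`: prefiltered env probe.
    All inputs are illustrative placeholders."""
    if not ray_hit:
        return probe_color  # ray left the screen or missed the depth buffer
    # Fade toward the probe near screen edges to hide the transition.
    edge = min(uv[0], 1.0 - uv[0], uv[1], 1.0 - uv[1])
    fade = min(1.0, edge / 0.1)  # full SSR once 10% away from the border
    return tuple(fade * s + (1.0 - fade) * p
                 for s, p in zip(ssr_color, probe_color))
```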
  3. What's the point of PBR if you're not using an HDR pipeline?
  4. Yaaaaaaaaaaayyyy!* Looking forward to it. *Excited Kermit voice* Related, does anyone know how Reset does their variable-distance volume marching for their precipitation effect/large-scale god rays? E.g. near-camera volume marching is fine: you set your z-slices and go. Clouds are fine if they're on a defined plane/sphere. But they also seem to use it to represent rain, a very cool effect. I ask because the most dramatic scattering, or god rays, often comes between clouds/distant mountains, and somehow that seems to be accomplished here. A logarithmic march? If everything (including the clouds) was in some sort of shadowmap-like depth buffer you could build a summed-area table, then do a gradient-domain-like march, e.g. skip empty space until sunlight. Any other ideas?
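For the logarithmic march floated above, one common slice distribution (a sketch, not Reset's actual technique) spaces sample depths exponentially between the near and far distances, so the volume near the camera gets fine steps and the distant volume coarse ones:

```python
def log_slice_depths(near, far, slices):
    """Exponentially spaced sample depths from `near` to `far`:
    slice i sits at near * (far/near)^(i / (slices - 1)), so step
    size grows in proportion to distance from the camera."""
    ratio = far / near
    return [near * ratio ** (i / (slices - 1)) for i in range(slices)]

# 64 slices from 10cm out to 1km: the first steps are centimeters,
# the last steps are hundreds of meters.
depths = log_slice_depths(0.1, 1000.0, 64)
```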
  5. That would probably be because CLEAN mapping doesn't take anisotropic filtering into account, so error will increase as you get parallel to the surface, i.e. more towards the horizon for your water. And if you want a non-deferred-texturing solution for better tessellation culling (not tessellating non-visible triangles), run an async compute pass over the geometry, then discard non-visible triangles before the tessellation stage. Regardless, looking forward to the demo; the crap public wifi I'm on definitely won't download it in any reasonable time, but I'll check it out later. Glad your performance is going up!
  6. Performance is also super :( 47ms on a 1080 (at 1080p) is far too slow for anything close to a real game outside a "single room" type scenario. And it doesn't support dynamically moving objects. It's the classic "hey, over a defined sphere complexity grows more 2-dimensionally than 3-dimensionally!" global illumination trap. As long as you keep things local, GI is actually easy, right? Infinite bounces, no light leak, easy easy easy. Then you realize the whole "global" part is important, that most games aren't rendering a Cornell box, and the farther away you calculate GI the more your performance starts to tank geometrically.
  7. Ah, well then here is a more step-by-step guide with example code. DICE's stochastic SSR is more of a "what you can do once you've got the basics up and running" approach. Hope that helps.
  8. Nope, it's all screenspace raytracing. There are a dozen variations on sampling, but it's all the same basic thing. Why, any goal in particular?
  9. It's certainly some sort of virtual 64-bit precision, two 32-bit depth buffers or some such thing. A single 32-bit buffer wouldn't give any benefit; it's already common to flip it (reversed-Z) for more precision over a range. And native 64-bit is sloooooow even on most "professional" cards, let alone average consumer ones. Star Citizen does virtual 64-bit; though I never learned the details, I'd bet it's pretty much the same.
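The usual virtual-64-bit trick (sketched here in plain Python, purely as an illustration of the idea, not anyone's actual shader code) stores a value as an unevaluated sum of two 32-bit floats, hi + lo, which recovers most of the precision of a 64-bit value without native 64-bit math:

```python
import struct

def to_f32(x):
    """Round a Python float to the nearest 32-bit float value."""
    return struct.unpack('f', struct.pack('f', x))[0]

def split_ds(x):
    """Split a value into a 'double-single' pair (hi, lo) of 32-bit
    floats whose unevaluated sum hi + lo approximates x far better
    than either float alone."""
    hi = to_f32(x)
    lo = to_f32(x - hi)  # the rounding error, itself well within f32 range
    return hi, lo

z = 123456.78901234            # a depth that doesn't fit in one float32
hi, lo = split_ds(z)
single_err = abs(to_f32(z) - z)
pair_err = abs((hi + lo) - z)
assert pair_err < single_err   # the pair recovers most of the lost bits
```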
  10. Oh! In that case you want to do something completely different. What you want to do is actually simplify your ocean rendering as it gets farther away, tessellating less and less and letting normal maps, and eventually a BRDF, take over. Take a look here: http://onlinelibrary.wiley.com/doi/10.1111/j.1467-8659.2009.01618.x/abstract There are other papers on the same thing, including LEAN mapping and others that take on similar problems.
  11. It sounds to me like it's passing through the actual tessellation pipeline, in which case deferred texturing/a full "visibility buffer" would indeed help, as you're only tessellating onscreen triangles. In fact it helps so much that DICE found a full async geometry pass, drawing only visible geometry to the g-buffer (traditional deferred), significantly increased tessellation pipeline perf. So the idea is perfectly sound, though it does sound like overtessellation is going on, as you're still not getting any benefit out of tessellating to subpixel triangles.
  12. So do you want runtime generation or offline? Rather, Minecraft-style or all calculated beforehand? Those are two different solutions. I get the idea that this is a Minecraft-like, runtime-generated world, meaning your calculations are going to have to be fast and, since you want the same results every time, deterministic. The first thing you could do is generate deterministic noise, in which case you'd want to look up the Halton sequence (just Google it; I can't find a really good coding overview, but it's simple enough). This will give you an evenly spaced, random-looking distribution for your trees. Then you'd want to sample your heightmap to check the slope. The Halton sequence is deterministic and I assume your slope isn't changing, so a tree will always be there. Alternatively, you can also generate a "foliage" map every time you generate a new piece of terrain: do the above, then store the result in its own texture so you don't have to redo it every time the user comes back to that area. From there you can get fancier and fancier. Check the terrain's material to see if it's the right place to put a tree: no on rocks or water, yes on grass and dirt, for example. Elevation is another consideration often used, e.g. you could transition from broadleaf trees to pine trees above a certain elevation. Then there's multipass generation, e.g. generate boulders first, then trees where boulders aren't, then undergrowth beneath trees and grass where there aren't trees. You can go nuts: check distance from water, whatever you want. Storing the above in a "foliage object map" will also work for "offline" generation; just read your texture back to place trees. The same works for rocks, grass, whatever detail objects you want. But what you don't want is trees disappearing and reappearing as you get nearer and closer.
Distance culling is usually done uniformly by object size, so the smallest objects disappear all at once first, and so on. Also, trees usually never disappear, as they're considered "too big" by most games' standards. Star Citizen, for example, will certainly have them disappear, but then most games don't let you go from orbit down to a (somewhat) realistically scaled planet sans loading screens or travel restrictions. So to finish, I'd take a look at impostors and... well, there was also a fantastic, crazy realtime-editable simulation of entire ecosystems done at, I want to say, SIGGRAPH in the last year or two. But my Google-fu fails me; if anyone knows of the paper it'd be awesome to see it. Edit: With that said, while not as impressive as the paper I failed to find, these two recent GDC presentations cover procedural tools: Ghost Recon Wildlands and Horizon Zero Dawn.
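The Halton-based placement described above can be sketched in a few lines (the slope threshold, tile size, and `height_at` helper are invented for illustration; only the Halton sequence itself is standard):

```python
def halton(index, base):
    """The Halton low-discrepancy sequence: deterministic, evenly
    distributed, random-looking values in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def place_trees(count, tile_size, height_at, max_slope=0.3, step=1.0):
    """Scatter up to `count` trees over a square tile, skipping steep
    ground. `height_at(x, y)` is assumed to sample the heightmap."""
    trees = []
    for i in range(1, count + 1):
        x = halton(i, 2) * tile_size  # bases 2 and 3 give a 2D pattern
        y = halton(i, 3) * tile_size
        # Estimate slope with finite differences on the heightmap.
        dzdx = (height_at(x + step, y) - height_at(x - step, y)) / (2 * step)
        dzdy = (height_at(x, y + step) - height_at(x, y - step)) / (2 * step)
        if (dzdx * dzdx + dzdy * dzdy) ** 0.5 <= max_slope:
            trees.append((x, y))
    return trees

# Same inputs always give the same forest: deterministic by construction,
# so a tree is always in the same place when the player comes back.
flat = place_trees(100, 256.0, lambda x, y: 0.0)
assert flat == place_trees(100, 256.0, lambda x, y: 0.0)
```

The same loop extends naturally to the multipass idea: run it once for boulders, then again for trees with an extra rejection test against the boulder positions, and so on.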
  13. I'd add that "coherent rays" are just closer to parallel to each other, and "incoherent rays" are less so. Think of it as a 1D scale, with rays fully parallel and pointing in the same direction ("coherent") at one end, and rays perpendicular or firing in opposite directions ("incoherent") at the other.
  14. Edit: Juliean said the same thing and hit "post" first. So +1 and all that.
  15. That's my guess anyway, something something aliasing patterns?