lawnjelly

Members
  • Content count: 192
  • Joined
  • Last visited

Community Reputation
  1233 Excellent

About lawnjelly
  • Rank: Member

Personal Information
  • Location: England
  • Interests: programmer

  1. I know next to nothing about Pilobolus, but if you are interested in how life achieves complexity then I would recommend reading some Dawkins as a grounding in how it all works - 'The Selfish Gene' and 'The Blind Watchmaker'. I really don't know the extent of your biology knowledge, but imo there are two big aspects to get a grasp of. The first is evolution and genetics (which Dawkins is a good introduction to; there are more advanced books by e.g. Matt Ridley). The other is development and complexity arising from simple rules, and understanding that something apparently complex (e.g. a tree) can be built from simpler branching rules (have a look at Conway's 'Game of Life' cellular automaton for an example). Even things like human organs tend to be built in the same way - see for example the similarity between the branching in the lungs and the structure of a tree; it is a means to increase the surface area to volume ratio for gas exchange. I'm not super familiar with the specifics of development of any particular organism, but a lot of work has been done on simple organisms like fruit flies to understand how they are built; you could read about this to see how things like limbs and specialisation can happen. As you read about evolution you will see that most organisms today are built from a few body plans / phyla, and share a lot of their blueprint. I just finished reading 'Wonderful Life' by Stephen Jay Gould, which, aside from being a little rambling and overlong, suggests that during the first explosion of multicellular life there were far more body plans being experimented on by mother nature, and whether by random accident or better design, just a few of them won out and form the basis for later life on Earth. As to creating models, go for it - maybe even start with simpler models than Pilobolus. You can even add genetics to your model and let nature 'select' the best version of your species (a rough sketch of a selection loop is below). Or even compete two or more species against each other if you want to make things interesting, or have predator-prey interactions. This is all assuming you are not a religious fruitcake, of course, in which case, forget all this, and just accept that everything was created by the flying spaghetti monster, waving his noodly appendages.
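If you do go the genetics route, the selection loop really can be tiny - something like this (just a rough, untested sketch; the Creature layout and the fitness function are made up purely for illustration):

    #include <algorithm>
    #include <cstdlib>
    #include <vector>

    // One 'creature' is just a handful of genes controlling e.g. branch angle, growth rate.
    struct Creature { float genes[4]; };

    // Hypothetical fitness: how well the creature does in your simulation.
    float Fitness(const Creature &c) { return c.genes[0] * c.genes[1] - c.genes[2]; }

    float RandFloat() { return rand() / (float)RAND_MAX; }

    void Generation(std::vector<Creature> &pop)
    {
        // Sort best first, keep the top half, refill by copying survivors with mutation.
        std::sort(pop.begin(), pop.end(),
                  [](const Creature &a, const Creature &b) { return Fitness(a) > Fitness(b); });
        size_t survivors = pop.size() / 2;
        for (size_t n = survivors; n < pop.size(); n++)
        {
            pop[n] = pop[n % survivors];                    // copy a surviving parent
            pop[n].genes[rand() % 4] += RandFloat() - 0.5f; // small random mutation
        }
    }

Run Generation() once per simulated 'season' and the population drifts towards whatever your fitness function rewards.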
  2. Yes definitely, I've been finding this. It has made me so glad I went with pre-rendering the scrolling background, as rendering all those sprites every frame would have killed performance. Most of the work on a frame is just drawing one big screen-sized quad for the background. The 'big work' is done when rendering a new row or column of the background, which only happens every few frames and is limited to a small viewport, so it minimizes the fillrate requirements (rough sketch of that below). See here: https://www.youtube.com/watch?v=Xfaj4TtvjKk which shows it working on the ground texture. As well as hardware depth testing (so the particles interact with the animals), the particles and models can also do a depth check against the custom encoded RGBA depth texture for the background, so they go behind trees etc. This is an extra texture read plus extra calculations in the fragment shader, so turning it off did give a speedup. Yup, I definitely found this to be the case.
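For what it's worth, the 'small viewport' trick is roughly this (a sketch only - the framebuffer handle and the commented-out draw calls are placeholders for my actual tile rendering):

    #include <GLES2/gl2.h>

    // Hypothetical helper: redraw just the newly exposed column of the scrolling
    // background into the big background texture, rather than the whole thing.
    void RenderNewColumn(GLuint backgroundFBO, int columnX, int columnWidth, int texHeight)
    {
        glBindFramebuffer(GL_FRAMEBUFFER, backgroundFBO);

        // Restrict both the viewport and scissor to the new strip, so fill rate
        // is only paid for the pixels that actually changed.
        glViewport(columnX, 0, columnWidth, texHeight);
        glEnable(GL_SCISSOR_TEST);
        glScissor(columnX, 0, columnWidth, texHeight);

        // DrawBackgroundTiles();     // placeholder for the actual tile drawing
        // DrawBackgroundDepthPass(); // second pass writing encoded depth into the RGBA texture

        glDisable(GL_SCISSOR_TEST);
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
    }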
  3. Aside from being careful with headers, nested includes etc, I too found the biggest win was unity builds - changed my life.. :lol: (a sketch of what a unity build file looks like is below). For my current little game a full rebuild takes about 15 seconds for ~150K LOC (according to the Visual Studio trick); only about 2 seconds of that is my code, the rest is third-party stuff I couldn't get into the unity build, like Lua, plus linking. One thing on the linking: although I favour statically linking the runtime libraries, I found that linking the dynamic runtime libraries was faster, so I often do that for debug builds and static link for release. If you can, I believe building some of your stuff that is less likely to change as DLLs for development builds might help quite a bit in very large projects, in terms of speeding up iteration.
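For anyone who hasn't tried a unity build, it is literally just one translation unit that includes all the others (the file names here are made up, and you do have to watch out for clashes between file-local statics and anonymous namespaces once everything shares one translation unit):

    // unity_build.cpp - the only .cpp handed to the compiler for 'my code'.
    // Everything below is compiled as one translation unit, so shared headers
    // are parsed once instead of once per .cpp file.
    // Third-party code (e.g. Lua) stays in its own library / separate compilation.

    #include "game.cpp"
    #include "renderer.cpp"
    #include "animals.cpp"
    #include "scripting.cpp"
    #include "sound.cpp"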
  4. Hmm, I'm aiming for a scripted jungle exploration game, where you get lots of sub-missions depending on the areas you enter, mostly from natives and chiefs - a bit like RPGs such as Baldur's Gate, but simplified, without all the character stats and inventory. The animals will be there to bother you as you go about missions. There will be an overall story arc too. It is influenced a lot by things like the King Kong movies. Although the test levels here are just random, the levels can have biomes such as desert, forest, plains, rocky, volcano, lake, sea shore etc. I'm keen for it not to get out of hand, so the major technical challenges are the scrolling renderer, the animals, and hopefully some kind of geometry morphing for the natives so I can have a lot of variation. Then simple objects you can interact with / collect. I'm also planning on the player being able to pilot a canoe, swim and jump. There may also be a wrestling sub-game; it depends how difficult it would be to coordinate the animations, and how time consuming that would be. I experimented with automatic level generation in the previous version, and this works well while still having the facility to manually edit the map (in the game engine) and move things after the auto-generate. Then I would place actors and write scripts for the missions. Possibly some of the scripts could be auto-generated too.
  5. Ah yes! Good thinking :D The simplest solution is to pre-render some frames. I actually did this in the first version, long ago, but had forgotten! PCF is just basic shadow mapping but taking multiple samples: http://fabiensanglard.net/shadowmappingPCF/ (rough sketch of the sampling below). It kind of works like this already, a little more complex though, as it has a wrapping, tiling background bigger than the screen, and handles ground textures separately. This is how it does things already with the shadows, except I am not casting from the animals at this stage as I figured that would be too expensive; I'll probably just add simple blob shadows for the animals. The animals do receive shadows though, from trees etc. The shadow map only needs to be regenerated as you move across the map; it is not rendered every frame. With the dynamic shadows received on the animals turned off, the shadows on the terrain are essentially free for most frames, but they do cost when scrolling to a new tile. This afternoon I implemented the static water as part of the background (although I haven't yet done the bit that adds a blue tint to underwater animals). It doesn't look too bad on my low end phone and it is now rendering at mostly 60fps. There are occasional dropped frames during scrolling to new tiles, but I'll see if I can address that. I will also see if I can add random jitter to the terrain shadows to make them look better with fewer samples.
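For reference, the 'multiple samples' part is just this sort of thing (written out as plain C++ to show the idea - in practice it lives in the fragment shader, and SampleShadowMap is a stand-in for the shadow map texture read):

    // 3x3 percentage-closer filtering: average nine binary depth comparisons
    // instead of making one hard comparison, giving a softer shadow edge.
    float SampleShadowMap(float u, float v)
    {
        // stand-in for the texture2D() lookup into the shadow map
        return 1.0f;
    }

    float PCFShadow(float u, float v, float fragDepth, float texelSize)
    {
        float lit = 0.0f;
        for (int y = -1; y <= 1; y++)
            for (int x = -1; x <= 1; x++)
            {
                float stored = SampleShadowMap(u + x * texelSize, v + y * texelSize);
                lit += (fragDepth <= stored) ? 1.0f : 0.0f; // 1 = lit, 0 = in shadow
            }
        return lit / 9.0f; // fraction of samples that were lit
    }

Nine texture reads per fragment is exactly why dropping to a single sample speeds things up so much.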
  6. I've just got my latest build of my OpenGL ES 2.0 jungle game working on my Android devices (Nexus 7 2012 tablet, Cat B15 phone) and am currently deciding the best way to address performance (more details here: https://www.gamedev.net/blog/2199/entry-2262867-sound-particles-water-etc/). All the graphics seem to be working fine, but the biggest issue appears to be fillrate / texture samples / shader complexity. So far the biggest culprits I've identified are :blink: the dynamic water shader, the particles, and the PCF shadows. [attachment=35790:dynamic.jpg] I'm aiming for 60fps even on low end phones, if possible. It seems to me that I should have graphics options so the user can get the best graphics / performance trade-off for their device.

Some of the issues are a consequence of using a scrolling pre-rendered background, with colour and a custom depth texture (as depth textures are not supported on some devices). When rendering the background as the viewer moves around I currently use 2 passes, one for the colour and one to write the depth into an RGBA texture; then in realtime I render dynamic objects on top (e.g. the animals), read from the depth texture, decode it, and compare it with the fragment z value (a sketch of the encode / decode is below). One obvious speedup is to remove the depth comparison with the background for shaders that do not require it. The particles look much nicer when they are hidden by trees / vegetation, but still look acceptable without it.

The PCF shadows I always suspected were going to be a problem. I was using PCF shadows for the pre-rendered scrolling background (which only needs refreshing every few frames) and PCF shadows on the animals, as they get shaded by trees etc. Taking this down to a single sample greatly sped the shader up, so it is obviously a bottleneck. The single sample shadows look very bad however, so I think the options are: turning them off for animals, simplifying them for the background, or using some kind of pre-calculation. There is also the option of randomized jitter / a rotating sample window to get a softer shadow with less of a performance hit.

The biggest question I am still facing is how to do the water. :huh: Is it actually *feasible* to run a complex water shader covering the whole screen on these devices (the worst case for sea areas), or do they lack the horsepower? I am actually considering (!!) pre-rendering static water as part of the background, then bodging in some kind of depth-based blue colour for the parts of animals that are below the surface each frame. It won't look amazing but it should be super fast. I could even add some dynamic particles or something on the water surface to make it look at least a little dynamic. This is what static water might look like: :blink: [attachment=35791:simpleshader.jpg] I am currently just rendering a giant quad for the water, then using depth testing against the custom depth texture to handle visibility. But this is a bottleneck, as are the calculations of the water colour. I have already considered drawing polys for the rough area where water will be (around the shores etc) rather than the whole screen, however this will only help in best case scenarios, not in the worst cases. Maybe there is a cheaper way of deciding where to draw the water? I would use the standard z buffer, but that option does not appear to be open, given that I am using a custom encoded depth texture, and the shaders cannot write to the standard z buffer without an OpenGL extension (which may or may not be present lol :rolleyes: ).

I could maybe wangle another background luminance layer or something for where to draw realtime water, but this seems a lot of effort for not much reward (it would only save decoding the depth texture and doing a comparison). Another question that occurs is whether all of these bottlenecks are simple bottlenecks, or whether I am stalling the pipeline somewhere with a dependency, and whether I could double / triple buffer the resources to alleviate the problem.

Anyway, sorry for this long rambling post, but I would welcome any thoughts / ideas - probably along the lines of whether these should actually be causing such problems, and any ideas around them, particularly the water. In fact any suggestions for super fast, simple water shaders would be useful.. I suspect just adding 2 scrolled, tiled textures might produce something usable enough, if the texture reads are faster than calculations within the shader.
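In case it helps anyone, the encode / decode I keep referring to is the usual pack-a-float-into-RGBA8 trick, along these lines (shown here as C++ for readability; in the game the same maths is done in the shaders):

    #include <cmath>

    struct RGBA { float r, g, b, a; }; // each channel 0..1, as stored in the texture

    static float Fract(float x) { return x - std::floor(x); }

    // Pack a 0..1 depth value across the four 8-bit channels.
    RGBA EncodeDepth(float depth)
    {
        RGBA e;
        e.r = Fract(depth * 1.0f);
        e.g = Fract(depth * 255.0f);
        e.b = Fract(depth * 65025.0f);
        e.a = Fract(depth * 16581375.0f);
        // Remove the part each channel shares with the next, to avoid counting it twice.
        e.r -= e.g * (1.0f / 255.0f);
        e.g -= e.b * (1.0f / 255.0f);
        e.b -= e.a * (1.0f / 255.0f);
        return e;
    }

    // Recover the 0..1 depth value from the four channels.
    float DecodeDepth(const RGBA &e)
    {
        return e.r + e.g * (1.0f / 255.0f) + e.b * (1.0f / 65025.0f) + e.a * (1.0f / 16581375.0f);
    }

With all four channels used this gives far more precision over the 0..1 range than a single 8-bit luminance channel would, which is why the extra read and dot product are worth it when the comparison is actually needed.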
  7. It looks promising. :) Just some things I noticed: It took a while to start up, so I wasn't sure if it had frozen - maybe more animation during loading to show it is still working. The instructions weren't quite enough for me; I only managed to put some grain and cows in fields, and I couldn't figure out much else from the guide (and some of the icons could be bigger / more informative?).. Could you do a YouTube or in-game tutorial? Also moving the map took a bit to figure out - maybe some left / right / up / down arrows would show where to hover / click.
  8. Time for an update to show how I'm getting on. A lot of what I've been doing is copy-pasting and reworking code from the old version of the game, so I'm progressing more rapidly than would otherwise be the case. Some of the things I've added:

Hills
Instead of just random test heights, the landscape is now made up of distinct hills, which raise the surface from their centre. I've had to compromise a bit on the heights available (bottom of the lakes to top of the hills) because it affects a lot of aspects of the scrolling renderer and I don't want to go above hardware limits for the render target size.

Water
Just using my old dodgy shader from the old version; I'm currently drawing a big quad at the water surface level. This may be changed to a rough polygonal shape around the lakes, to save on fillrate. It has to read the custom depth buffer, so it must be moderately expensive even when no water ends up visible.

Particles
I've added a very simple particle system, for things like fire, blood, splashes etc. You can place particle systems on the map and it will intelligently turn them on / off as needed as you move around. It is currently using point sprites, so the particles flick out of view as the point centre moves off the screen. This may not happen on the OpenGL ES version - I haven't tried it yet - but if it is still a problem I'll either switch to quads or try a workaround (I did read a suggestion of changing the viewport and using glScissor). I'm also considering using something similar for things like butterflies.

Animation
Minor tweak: I now use the distance travelled to determine how far to advance the animation, instead of just time, so the footsteps are a better match to the ground instead of sliding so much (rough sketch at the end of this post).

Jumping
Added support for altitude and gravity. This is mainly used for the player but will also be used for flying creatures. The bats are not yet implemented; they are at a fixed height for now. However it works with the collision detection, so e.g. bats can fly over animals and plants, and the player can jump over low obstacles. :D

Scripting
The very basics of the Lua scripting are working again. I need to do more work on attaching it to characters when I deal with level loading. You can use the scripting to drive subtitles for the game and speech bubbles on characters, play sounds and animations, move characters etc. :ph34r:

Sound
Finally I've got the sound working again. This was initially mostly a copy-paste affair, but I added support for looping sounds, for ambience around the maps - insect noises, water rippling, fire etc. I also improved it to use basic positional audio, where each sound has a location, the listener has a location, and it smoothly interpolates the sounds in stereo as they move about relative to the listener. There is also reverb / echo and dynamic compression. I haven't tried the music tracker yet but I intend to change this considerably.

As usual any suggestions / comments are most welcome. :)
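The distance-driven animation mentioned above is about as simple as it sounds - roughly this (a sketch, with made-up names and stride length):

    #include <cmath>

    // Advance the walk cycle by distance moved rather than by elapsed time, so the
    // feet stay matched to the ground instead of sliding when speed changes.
    struct WalkAnimator
    {
        float phase = 0.0f;        // position within the walk cycle, 0..1
        float strideLength = 1.2f; // world units covered by one full cycle (made-up value)

        void Update(float distanceMovedThisFrame)
        {
            phase += distanceMovedThisFrame / strideLength;
            phase -= std::floor(phase); // wrap back into 0..1
        }
    };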
  9. Some good info there from Matias regarding the decision between branchless (more texture lookups) and branched shaders. I have the same issue in some other shaders, so I'll have to do some tests to compare performance.. it may be that the shader compiler decides it is quicker to do both lookups even in the branched shader. Using mipmaps is an interesting idea and something I hadn't thought of, but as osbios says this only gains a little extra texture space at the cost of quite a bit more complexity, so I'll probably go with the simpler approach. This is for that situation of alpha-transparent, depth-sorted sprites (trees etc in the screenshot). :)
  10. Yeah, I guess I'll try passing the texture ID from the vertex shader and compare it with just doing the work in the fragment shader. If there's not a lot of difference I might just go for the easier method so I don't have to adjust the spritesheet code. If there is a performance hit I can have 2 sets of shaders, depending on whether the hardware supports textures > 2048, so later hardware does not pay a price for the compatibility (a sketch of that check is below). :) I've also had problems with devices not supporting frame buffers > 2048 for the scrolling textures, but that's a whole other problem lol. Does make you wonder how the GPU guys decide what the limits should be... It must be intriguing what goes into GPU design, although I'm sure it's all top secret lol.
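Picking between the two shader sets at startup is cheap enough, something like this (sketch):

    #include <GLES2/gl2.h>

    // Decide at startup whether the hardware can take the full 4096-wide spritesheet,
    // or whether we need the fallback shader set that splits it across two textures.
    bool UseSplitSpritesheet()
    {
        GLint maxSize = 0;
        glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
        return maxSize < 4096;
    }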
  11. Good 'morrow, this is for an Android game (possibly iOS later): I'm currently hitting up against maximum texture size limits on some devices. I've decided not to bother trying to support 1024x1024, but I'm aiming to support 2048x2048 as a lot of devices still have this limit (including my 2013 nexus 7). Now for some games this isn't an issue, you just downsize the textures.. however I'm using a pixel perfect spritesheet for depth ordered billboards, and want to be able to draw everything with one drawcall, and I'm thinking I'm going to need at least 4096 x 2048. I can't just render with 2 drawcalls, as the sprites are interleaved randomly and have to be drawn in depth order. [attachment=35662:zorder.jpg] So some options I have: (1) half size the spritesheet on load and upsize at rendering (ugly, blocky); (2) half size the spritesheet and halve the game resolution (very involved to support two versions of the game, but would it better support different screen sizes?); or (3) use some shader jiggery pokery to get around the limit. I'm currently thinking along the lines of (3), something like: binding the 2 halves of the spritesheet to active textures 0 and 1 (each 2048 x 2048), passing the uv coords in pixel space, and in the fragment shader, if the u coord is >= 2048, subtracting 2048 and sampling from texture 1 (otherwise sampling from texture 0) - rough sketch below. I could also maybe make the spritesheet generator avoid placing sprites on the boundary, and decide which spritesheet to use in the vertex shader, if this would help. Is this feasible? What kind of penalty should I be expecting for such an approach (with a conditional in the frag shader) versus the hardware supporting it natively?
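And option (3) itself, in plain code terms, is just this per fragment (shown as C++ for readability - in reality it is a few lines of GLSL, the two Sample functions stand in for the texture reads, and the conditional is the part whose cost I'm unsure about):

    // Per-fragment selection between the two halves of the 4096-wide spritesheet.
    struct Colour { float r, g, b, a; };

    // Stand-ins for the texture reads of the two 2048-wide halves (texture units 0 and 1).
    Colour SampleSheet0(float u, float v) { return Colour{u, v, 0.0f, 1.0f}; }
    Colour SampleSheet1(float u, float v) { return Colour{u, v, 0.0f, 1.0f}; }

    // uPixels is the u coordinate in pixel space across the full 4096-wide sheet.
    Colour SampleSpritesheet(float uPixels, float v)
    {
        if (uPixels >= 2048.0f)
            return SampleSheet1((uPixels - 2048.0f) / 2048.0f, v); // right half
        return SampleSheet0(uPixels / 2048.0f, v);                 // left half
    }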
  12. From the album Explorer Game

  13. Also a very common gotcha for this is mipmaps. If you upload the texture but don't create mipmaps, and it then samples from the blank mipmap levels, you just get black (quick sketch of the fix below).
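i.e. straight after uploading level 0, either actually build the mipmap chain or tell GL never to read the smaller levels - something like this (sketch):

    #include <GLES2/gl2.h>

    // Call with the texture still bound, straight after glTexImage2D has uploaded level 0.
    void AvoidBlackTexture(bool wantMipmaps)
    {
        if (wantMipmaps)
            glGenerateMipmap(GL_TEXTURE_2D); // actually create the smaller levels
        else
            // or pick a minification filter that never reads the (empty) smaller levels
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    }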
  14. Ahh I get you, yeah then I would be looking into caching the results of the blending if you can. Don't give up though, you've obviously put in a lot of work, and it looks nice, so enjoy optimizing it! :D
  15. The important lesson here imo is, in future, to be thinking about performance from the start and as you go, and to be constantly monitoring as you add new features. The best way to work imo is to design according to your target platform, not to design and then 'hope for the best' (many products have sunk this way). Consider not only average frame rate, but also frame spikes which can cause dropped frames. That said :) , what is your target platform and graphics API? I'm guessing PC, which may mean you have more tricks available than if you were, say, targeting OpenGL ES 2. If you have access, try out a graphics debugger to get more info on what is bottlenecking and causing frame spikes. If you don't have this, try running the game at a very low resolution to see whether you are fill rate limited. If you are fill rate limited, do what you can to reduce this.. e.g. optional simpler fragment shaders etc. If it runs slowly even at low resolutions, look at how many draw calls you are making, how you are batching things, how you are optimizing static scenery, how complex your vertex shaders / models are, what method you are using for animation etc. And it almost goes without saying, of course: profile, profile, profile for all the CPU stuff.

/edit - I just had time to have a quick look, and it seems, as you say, you are using some kind of LOD system already. If you're doing a lot of texture blending for the tops of the hexagons, can you cache the results in a terrain texture for the tops (a rough render-to-texture sketch is below)? And obviously you only need low mipmapped versions in the distance. For the goblins, you could use different textures for the different types / selected ones; it could be faster, but that totally depends on the shader you are using. For distant goblins I'd look at precalculating, say, 20 frames of animation and playing them back directly, stop frame style, rather than doing any skinning / tweening on the fly, again using LODs. Or you could look at impostors (maybe for the grass too?). For the hexagons, maybe you can combine the geometry and render it in one go if you aren't already. It is not going to be super easy given the view distance, unless you add in some limitations like only allowing the viewer to look down at a certain range of angles, or having aggressive fog in the 1st person type view.
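On the caching idea: bake the blended result into a texture once, then just sample that at runtime. Assuming an OpenGL-style API (the draw call in the middle is a placeholder for whatever does the expensive blend), the one-off setup is roughly:

    #include <GLES2/gl2.h>

    // One-off: bake the expensive multi-texture blend for a hex top into a single
    // texture, so the per-frame shader is reduced to one plain texture read.
    GLuint BakeBlendedTerrain(int width, int height)
    {
        GLuint tex = 0, fbo = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex, 0);

        glViewport(0, 0, width, height);
        // DrawBlendedHexTop(); // placeholder: run the expensive blend shader once here

        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glDeleteFramebuffers(1, &fbo);
        return tex; // sample this at runtime instead of blending per frame
    }

Whether this wins depends on how much texture memory you can spare versus how heavy the blend shader is, but it turns a per-frame cost into a one-off cost.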