Showing results for tags 'Graphics'.



Found 112 results

  1. Heyo! For the last few months I've been working on a realtime raytracer (like everyone currently), but I've been trying to make it work on my graphics card, an Nvidia GTX 750 Ti - a good card, but not an RTX or anything. So I figured I'd post my results since they're kinda cool, and I'm also interested to see if anyone has ideas on how to speed it up further. Here's a dreadful video showcasing some of what I have currently. I've sped it up a tad and fixed reflections since then, but eh, it gets the gist across.

     If you're interested in trying out a demo or checking out the shader source code, I've attached a Windows build (FlipperRaytracer_2019_02_25.zip). I develop on Linux so it's not as well tested as I'd like, but it works on an iffy laptop I have, so hopefully it'll be alright. You can change the resolution and whether it starts up in fullscreen in a config file next to it. In the demo you can fly around, change the lighting setup, and adjust various parameters like the frame blending (increase samples) and disabling GI, reflections, etc. If anyone tests it out I'd love to know what sort of timings you get on your GPU.

     Currently I can achieve about 330 million rays a second, enough to shoot 3 incoherent rays per pixel at 1080p at 50fps - not too bad overall. I'm hoping to bump this up to 5 incoherent rays at 60fps, but we'll see.

     I'll briefly describe how it works now. Each render loop it goes through these steps:
       • Render the scene into a 3D texture (voxelize it)
       • Generate an acceleration structure akin to an octree from that
       • Render the GBuffer (I use a deferred renderer approach)
       • Calculate lighting by raytracing a few rays per pixel
       • Blend with previous frames to increase the sample count
       • Finally, output with motion blur and some tonemapping
     Pretty much the most obvious way to do it all. The main reason it's quick enough is the acceleration structure, which is kinda cool in how simple yet effective it is.

     At first I tried distance fields, which, while really efficient to step through, just can't be generated fast enough in real time (I could only get it down to 300ms for a 512x512x512 texture). Besides, I wanted voxel-accurate casting anyway (blocky artifacts look so good...), so I figured I'd start there. Doing an unaccelerated raycast against a voxel texture is simple enough: cast a ray and test against every voxel the ray intersects, stepping through voxel by voxel with a line-stepping algorithm like DDA. The cool thing is, by voxelizing the scene at different mipmap levels, it's possible to take differently sized steps by checking which is the lowest-resolution mipmap with empty space. This can be precomputed into a single texture, so that information is available in one sample. I've found this gives pretty similar raytracing speed to the distance fields, but can be generated in 1-2ms, ending up with a texture like this (a 2D slice).

     It also has some nice properties: if a ray is cast directly next to and parallel to a wall, instead of moving tiny amounts each step (as with a distance field saying it's super close to something), it'll move an arbitrary amount depending on where the wall falls on the grid :P. The worst case is the same as the distance field, and its best case is much better, so it's pretty neat.

     For the raytracing I use some importance sampling, directing the rays towards the lights. I find that just picking a random importance sampler per pixel and shooting towards that looks good enough, and allows as many lights as I need without changing the framerate (only the noise). Then I throw a random ray to calculate GI/other lights, and a ray for reflections. The global illumination works pretty simply too: when voxelizing the scene I throw some rays out from each voxel, and since they raycast against the voxels themselves, each frame I get an additional bounce of light.

     That said, I found that a bit slow, so I have an intermediate step where I render the objects into a low-resolution lightmap, which is where the raycasts take place; when voxelizing I just sample the lightmap. This also theoretically gives me a fallback in case a computer can't handle raytracing every pixel, or the voxel field isn't large enough to cover an entire scene (although currently the lightmap is iffy - I wouldn't use it for that yet).

     Then I use the usual temporal anti-aliasing technique to increase the sample count and anti-alias the image. I previously had a texture that kept track of how many samples had been taken per pixel, resetting when viewing a previously unviewed region, and used it to properly average the samples (so it converged much faster, and actually did converge) rather than using the usual exponential blending. However, I had some issues integrating any sort of sample discarding with anti-aliasing, so currently I just let everything smear like crazy. I think the fix is to keep temporal supersampling and temporal anti-aliasing separate, so I might try that out. That should improve the smearing and noise significantly... I think.

     Hopefully some of that made sense and was interesting! Please ask any questions you have - I can probably explain it better, haha. I'm curious to know what anyone thinks, and any ideas to speed it up or develop it further are very much encouraged. I'm also working on some physics simulations, so you can create a realtime cloth in it by pressing C - just the usual position-based dynamics stuff. Questions on that are open too. FlipperRaytracer_2019_02_25.zip
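     The unaccelerated DDA traversal mentioned above can be sketched as follows. This is plain Python for illustration rather than shader code, and the mip-based step skipping is omitted for brevity, so this is the single-level worst case; the occupancy set and function names are hypothetical:

```python
import math

def raycast_voxels(occupied, origin, direction, max_steps=256):
    """Step a ray through a voxel grid (Amanatides & Woo style DDA).
    occupied: set of integer (x, y, z) cells that contain geometry."""
    eps = 1e-9
    # Avoid division by zero on axis-aligned rays.
    d = [c if abs(c) > eps else eps for c in direction]
    # Voxel currently containing the ray origin.
    voxel = [math.floor(c) for c in origin]
    step = [1 if c > 0 else -1 for c in d]
    # t at which the ray crosses the next voxel boundary on each axis.
    t_max = [((voxel[i] + (1 if step[i] > 0 else 0)) - origin[i]) / d[i]
             for i in range(3)]
    # t needed to traverse one whole voxel along each axis.
    t_delta = [abs(1.0 / c) for c in d]
    for _ in range(max_steps):
        if tuple(voxel) in occupied:
            return tuple(voxel)           # hit this voxel
        axis = t_max.index(min(t_max))    # nearest boundary decides the step
        voxel[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None                           # no hit within max_steps

hit = raycast_voxels({(3, 0, 0)}, (0.5, 0.5, 0.5), (1.0, 0.0, 0.0))
print(hit)  # (3, 0, 0)
```

     The mip acceleration the poster describes replaces the fixed one-voxel step with a jump sized by the coarsest empty mip level at the current position, which is why its worst case degenerates to exactly this loop.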
  2. Hey, hello there! This is yet another new entry in this truly wonderful weekly update blog! This week was quite big - there's a couple of new things, so let's get right to it!

     Pooling

     First, I want to talk about a subtle upgrade. In the game, I'm using static factories to generate different game objects such as projectiles, props and entities, among other things. While this is quite handy (especially since you don't need to drag and drop any components at all), it wasn't really optimized. Some objects (especially projectiles) are frequently instantiated and destroyed. For some objects this is fine, but for others it can really affect the FPS. To fix this we need to use pools. These are essentially scalable stacks of pre-instantiated game objects that are ready to use. Instead of destroying an object, we simply disable it and store it back in the pool, recycling the game object and avoiding a lot of overhead. While on the surface this might seem odd, it's actually really useful for short-lived and numerous objects. It was pretty simple to integrate into my factories too, and it works like a charm! I've partially based my pool system off this one if you're interested.

     Spark Effect

     Next, there's a new particle effect in the game. This one is quite AESTHETIC, so be warned. Basically, it's a Memphis-design effect that is supposed to represent some kind of collision. The particles are coloured according to the colour palette too! Right now these are used on any physical hit, though I'm still trying to think of a better use case for them... Here's a video showing them off:

     New Enemy

     Finally, I'm proud to say that there's now another new enemy in the game: the archer. This enemy uses a bow, will try to keep its distance, and attacks at range. It will also try to trick its target to avoid being targeted itself. Archers have a different behaviour than the others, in that they will try to circle around their target and keep moving no matter what. When they're near their target and have a clear sight of them, they will draw an arrow, pull their bow and launch it. It's a good idea to strike while they're pulling their bow, as that's when they're most vulnerable. Like their sabre-wielding peers, they will also patrol around their room and be on the lookout for any enemies. They will also flee when weak, and can be one-shot too! Here's an encounter with an archer:

     Minor Updates

       • Big refactor of the GUI: replaced duplicated code with inheritance, optimized some GUI events, and fixed some oversights in some GUIs.
       • Added a cached state controller that can cache useful data that rarely changes.
       • Added individual limb targets to player characters. Right now there isn't any damage multiplier on the player's limbs, nor do enemies try to target specific ones; that might be an idea for later, though.
       • Made the active arrow much more flexible: replaced the unintuitive layer mask int with a LayerMask instance, which makes it so much easier to edit.
       • Made the arrow play the appropriate collision sound when colliding with objects.
       • Big code-conventions refactor.
       • Added proper animation masks to my animated models.
       • Made most behaviour tree controllers much more flexible - basically the same thing as the active arrow, replacing layer mask ints with LayerMask instances.
       • Fixed bugs with the hamburgers.
       • Fixed some oversights in physics collisions between layers.

     Next Week

     I'm still not done with the archer enemy. It still needs a bit of balancing and bug fixing, and I may need to make my behaviour tree a tad more complex. I'm slowly getting there with each iteration. It's quite interesting, but working with behaviour trees requires a lot of design and debugging too! After I'm done with the archer, I'll probably add a gunner, and afterwards I'm going to focus on the bosses. Of all the entities these are probably the dumbest and also the buggiest, so that's up next. And then there are the usual suspects (relics, capacities, items, rooms, etc.). I sure hope that with the right type of enemies the game can get better exposure... Only time will tell...
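     The pooling idea described in update #2 above can be sketched like this (Python used for illustration; `ObjectPool` and `Projectile` are hypothetical stand-ins for the game's factory types, not the author's actual code):

```python
class ObjectPool:
    """Minimal stack-backed pool: acquire() reuses a disabled object
    if one is available, otherwise asks the factory for a new one."""
    def __init__(self, factory):
        self.factory = factory   # callable that builds a fresh object
        self.free = []           # disabled objects, ready for reuse

    def acquire(self):
        obj = self.free.pop() if self.free else self.factory()
        obj.active = True
        return obj

    def release(self, obj):
        obj.active = False       # "disable" instead of destroying
        self.free.append(obj)

class Projectile:
    def __init__(self):
        self.active = False

pool = ObjectPool(Projectile)
p1 = pool.acquire()      # first call: created by the factory
pool.release(p1)         # back into the pool, not destroyed
p2 = pool.acquire()      # recycled: p2 is the same object as p1
```

     The payoff is exactly what the post says: for short-lived, numerous objects (projectiles), the allocate/destroy churn disappears and only an enable/disable toggle remains.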
  3. Hello everybody! I decided to write a graphics engine, a killer of Unity and Unreal. If anyone is interested and has free time, join in. The high-level renderer is built on low-level OpenGL 4.5 and DirectX 11. Ideally there will be PBR, TAA, SSR, SSAO, some variation of an indirect-light algorithm, and support for multiple viewports and multiple cameras. The key feature is a COM-based architecture (binary compatibility is needed). Physics, ray tracing, AI and VR will not be included. I took the basic architecture from the DGLE engine. The editor will be built on Qt (https://github.com/fra-zz-mer/RenderMasterEditor); there is already a buildable editor. The main point of the engine is maximum transparency of the architecture and high-quality rendering. For shaders there will be no new language - everything is handled through defines.
  4. I tried to implement hardware tessellation to increase the detail on my snow mesh, but when I run the code it places some of the geometry in the middle of my view and rotates it when the camera moves. This is my vertex, constant patch function, hull and domain shader code:
  5. I'm wondering if anybody has seen a glitch like the one below (the shadow strip that appears in front of the cube). I have been trying to fix it for some time, but the cause eludes me. I'm using a PCSS implementation in my fragment shader, and Parallax Occlusion Mapping to add some realism to the ground texture as well. I'm still trying to get to the root of the problem, which is to determine what kind of artifact this is, but I'm not too sure. It seems like a self-shadowing "acne" issue, but in that case why am I not seeing acne across the rest of the ground texture, and why does it only appear when panning the camera at that particular location? Any help would be greatly appreciated.
  6. I wasn't sure where to post my game (https://incredicat.com), as I'm still working on it, but I wanted to put it out there to get any useful feedback or thoughts from the experts. It's basically a game similar to 20 Questions (or Animal, Vegetable, Mineral) that asks you questions to work out an object you are thinking about. You can think of everyday items (animals, household objects, food, and quite a bit of other stuff) and it has 30 questions to try and guess the item. I've been working on it for a while but I'm not sure what to do next, so I'm interested to hear anyone's thoughts. I've finished working on the new algorithm, which is based on ID3: entropy and information gain. The link for anyone that wants to try it out is incredicat.com. Thanks in advance!
  7. Bacrylic

    Level01 - Copy.png

    From the album: The Book of Alex

  8. Hello there. I've been working lately on my 2D fishing game, but sadly I can't do everything alone. That's why I'm here looking for somebody who would want to join my project and help out with graphics. It could be realistic, cartoonish or minimalistic fantasy - I just don't want to make another pixel-art game. If you're an artist, or anybody with useful skills, and you'd like to help, write to me on Discord: MrKosiej#6327. Game details: I'm using Construct 2 to create this PC sidescroller. It's going to be a simulation and somewhat of an economy game. Later I'm planning on adding some action-adventure elements, but that's far away. In the game you're playing as a boy who got thrown out of his home by his father and now needs to find his way in life. He walks and wanders until he finds a little abandoned fishing hut. He decides to become a fisher while he figures out what to do next. Cheers.
  9. OnOrbit on Google Play
  10. Hello everyone, I want to make a steampunk adventure Jump'n'Run, but I'm missing a pixel-art artist. I have a website with posts about indie games and would like to develop one myself. Of course I have programming knowledge; unfortunately, I'm not quite as talented on the art side. I have already worked out a concept and would be very happy if someone would help me. Have a nice day, Kilian
  11. Hi, I am fairly inexperienced in DX11 and I am trying to implement a particle system that is handled purely on the GPU. To keep things simple, the particle system will have a fixed number of particles that, once dead, will be respawned at the emitter location. As such, I only have a single compute shader that updates each particle. Based on the method outlined in 'Practical Rendering and Computation in Direct3D 11', an AppendStructuredBuffer<Particle> and a ConsumeStructuredBuffer<Particle> are bound to the compute shader, so that last frame's particles can be consumed, updated and appended, ready for the next frame. As I want to minimise the data flowing back to the CPU, I want to make a call to DrawIndexedInstancedIndirect(), feeding the particle data directly from the compute shader to the vertex shader. In the vertex shader, the particle will be expanded to a quad billboard (I am trying to avoid the geometry shader for performance) and sent to the pixel shader for rasterization. As a side-effect, the primitive topology will be set to TRIANGLELIST. I am using DrawIndexedInstancedIndirect() rather than DrawInstancedIndirect() as the number of particles and, by extension, the index buffer will remain fixed. However, I am unclear on how to implement indirect draw calls and have the following questions: - How can I pass the updated particle data from the compute shader to the vertex shader without sending the data back to the CPU? As I understand, the vertex shader cannot read from the AppendStructuredBuffer that was written to by the compute shader. - How do I generate vertices without a vertex buffer bound to the input assembler? I know that vertices are identified with the semantic SV_VertexID, but I have no input layout bound to the input assembler. Any help would be greatly appreciated, as I am not very confident with indirect drawing in DX11 and I cannot find any beginner-friendly resources to point me in the right direction.
  12. Hey, I've updated my shaders to use Shader Model 2.0, with ps_2_0 and vs_2_0, but they only work on PCs where Shader Model 3.0 is available. The problem is that the shaders are compiled for SM 2.0, yet they don't work on PCs with older hardware configurations. Those PCs support SM 2.0, so I don't know why the shaders aren't working. It looks like the crash is caused in DrawIndexedPrimitive, but this function never returns anything other than S_OK, so I can't check any log or anything like that. I'm compiling shaders like:

      technique Mesh
      {
          pass P0
          {
              VertexShader = compile vs_2_0 MeshVS();
              PixelShader = compile ps_2_0 MeshPS();
          }
      }

      Initialization log from the PC with the crashing problem (all entries logged 16/04/2019 23:18:13):

      (INF) Direct3D 9 Interface Created
      (ERR) Your graphics hardware doest not support 32 bit Dynamic Texture [8876086A]
      (INF) FPU Preserve
      (INF) Vertex Processing: Software
      (INF) Device Created!
      (INF) Multi Sample: 0 0
      (INF) Resolution: 800x600
      (INF) Window Mode
      (INF) Color Depth: 32BPP
      (INF) @ Device Capabilities
      (INF) Vertex Shader Version: 0.0
      (INF) Pixel Shader Version: 2.0
      (INF) Max Vertex Blend Matrices: 0
      (INF) Max Vertex Blend Matrix Index: 0
      (INF) Max Primitive Count: 65535
      (INF) Max Vertex Index: 65534
      (INF) Max Streams: 16
      (INF) Max Streams Stride: 255
      (INF) Max Vertex Shader Constant Registers: 0
      (INF) Max VShader Instructions Executed: 0
      (INF) Max PShader Instructions Executed: 96
      (INF) Max Vertex Shader 30 Instruction Slots: 0
      (INF) Max Pixel Shader 30 Instruction Slots: 0
      (INF) Max Simultaneous Textures: 8
      (INF) Max Texture Blend Stages: 8
      (INF) Max Texture Width: 2048
      (INF) Max Texture Height: 2048
      (INF) Max Volume Extent: 256
      (INF) Max Texture Repeat: 8192
      (INF) Max Texture Aspect Ratio: 2048
      (INF) Max Anisotropy: 4
      (INF) Max Active Lights: 0
      (INF) Max User Clip Planes: 0
      (INF) Max Point Size: 256.000000
      (INF) Max Npatch Tesselation Level: 0.000000
      (INF) Num Simultaneous RTs: 1
      (INF) Using Textures Non Pow2 Conditional: Yes
      (INF) Using Textures Pow2: Yes
      (INF) Supports HW Skinning: No
      (INF) Software Skinning

      My initialization log, where the SM 2.0 shaders work perfectly (all entries logged 22/04/2019 14:45:21-22):

      (INF) Direct3D 9 Interface Created
      (INF) FPU Preserve
      (INF) Vertex Processing: Hardware
      (INF) Pure Device
      (INF) Device Created!
      (INF) Multi Sample: 0 0
      (INF) Resolution: 1024x768
      (INF) Window Mode
      (INF) Color Depth: 32BPP
      (INF) 32 bit Back Buffer
      (INF) @ Device Capabilities
      (INF) Vertex Shader Version: 3.0
      (INF) Pixel Shader Version: 3.0
      (INF) Max Vertex Blend Matrices: 4
      (INF) Max Vertex Blend Matrix Index: 0
      (INF) Max Primitive Count: 16777215
      (INF) Max Vertex Index: 16777215
      (INF) Max Streams: 16
      (INF) Max Streams Stride: 255
      (INF) Max Vertex Shader Constant Registers: 256
      (INF) Max VShader Instructions Executed: 65535
      (INF) Max PShader Instructions Executed: 65535
      (INF) Max Vertex Shader 30 Instruction Slots: 4096
      (INF) Max Pixel Shader 30 Instruction Slots: 4096
      (INF) Max Simultaneous Textures: 8
      (INF) Max Texture Blend Stages: 8
      (INF) Max Texture Width: 16384
      (INF) Max Texture Height: 16384
      (INF) Max Volume Extent: 2048
      (INF) Max Texture Repeat: 8192
      (INF) Max Texture Aspect Ratio: 16384
      (INF) Max Anisotropy: 16
      (INF) Max Active Lights: 8
      (INF) Max User Clip Planes: 8
      (INF) Max Point Size: 8192.000000
      (INF) Max Npatch Tesselation Level: 0.000000
      (INF) Num Simultaneous RTs: 4
      (INF) Using Textures Non Pow2 Conditional: No
      (INF) Using Textures Pow2: No
      (INF) Supports HW Skinning: Yes
      (INF) Hardware Skinning

      Does anyone know what might be happening? Thanks.
  13. Hello everyone. About a month ago I finished programming the lighting (or at least that's what I thought), because a few hours ago I realized that when I translate an object, the lighting breaks (not when rotating, because I multiply the normals by the transpose of the inverse of the modelMatrix) - only when I move the object away from (0,0,0). I only have one light, a directional one. Here the rock is at (0,0,0). Here the rock is at (-10,0,0). I only moved it to the left, and as you can see it looks like the normals are wrong, but I do not touch them, except to multiply them by the transpose of the inverse of the modelMatrix. (If I do not multiply the normals by that matrix, the lighting doesn't break on translation, but when the object rotates the light rotates with it.) So, is there a way to have correct normals for both translating and rotating? Either I can translate the object and the lighting doesn't break but rotation drags the light along, or I can rotate it but translation breaks the lighting. Thanks a lot!!
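     For context, the standard fix for this class of bug is to transform normals as directions (w = 0), or equivalently with only the upper-left 3x3 of the inverse-transpose, so the translation column can never touch them. A small sketch of the difference, assuming a simple row-major 4x4 model matrix with the column-vector convention (pure Python, no libraries; this illustrates the general pitfall, not the poster's exact code):

```python
def mat_vec4(m, v):
    # 4x4 matrix times a column 4-vector (row-major storage).
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# Model matrix: pure translation by (-10, 0, 0), like the rock in the post.
model = [[1, 0, 0, -10],
         [0, 1, 0,   0],
         [0, 0, 1,   0],
         [0, 0, 0,   1]]

normal = [0.0, 1.0, 0.0]

# Wrong: treating the normal as a point (w = 1) lets translation leak in.
wrong = mat_vec4(model, normal + [1.0])[:3]

# Right: a direction uses w = 0 (equivalently, only the upper-left 3x3),
# so translation cannot affect it; the inverse-transpose 3x3 then handles
# non-uniform scale on top of this.
right = mat_vec4(model, normal + [0.0])[:3]

print(wrong)  # [-10.0, 1.0, 0.0] - normal dragged along by the translation
print(right)  # [0.0, 1.0, 0.0]   - unchanged, as a normal should be
```

     With a pure rotation both variants agree, which matches the symptom described: rotation looks fine either way, translation only breaks when w = 1 sneaks in.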
  14. Hello - I recently started designing a tile-based game, but I'm not exactly sure what would be the best method for storing my tiles in the map that will be displayed. I understand that most maps are a simple 2D array with each dimension representing the different axes (x, y), but my game has massive quantities of tiles that need to be displayed at once (I don't have an exact number, but I know it will be over 60x30, which is 1800 at once). I'm fairly certain that the aforementioned construct will be too simple and costly for my needs. I'm also looking to delve into simple procedural generation in the future, meaning I want my method of storage to be able to be easily modified/populated. The best analogy of a game I could give for reference is Terraria, since it has both procedural generations and a bunch of tiles. To simplify, my question is basically: What would be an efficient way to store a map with large amounts of tiles (in the thousands), while still allowing the construct to be populated relatively easily? Thanks for any input/help.
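     One common approach for maps like the one described above (and roughly what Terraria-style games do) is to split the world into fixed-size chunks, so only chunks near the camera are touched each frame and procedural generation can fill chunks lazily on first access. A minimal sketch in Python; `TileMap`, `CHUNK`, and the generator hook are illustrative names, not a prescribed design:

```python
CHUNK = 32  # tiles per chunk side

class TileMap:
    """Sparse chunked tile storage: chunks are allocated on demand,
    so huge or procedurally generated worlds stay cheap to hold."""
    def __init__(self, generate=None):
        self.chunks = {}           # (cx, cy) -> flat list of CHUNK*CHUNK tiles
        self.generate = generate   # optional fn(x, y) -> tile, for proc-gen

    def _chunk(self, cx, cy):
        key = (cx, cy)
        if key not in self.chunks:
            if self.generate:
                # Lazily populate the chunk from the generator.
                self.chunks[key] = [self.generate(cx * CHUNK + i % CHUNK,
                                                  cy * CHUNK + i // CHUNK)
                                    for i in range(CHUNK * CHUNK)]
            else:
                self.chunks[key] = [0] * (CHUNK * CHUNK)  # 0 = empty tile
        return self.chunks[key]

    def get(self, x, y):
        chunk = self._chunk(x // CHUNK, y // CHUNK)
        return chunk[(y % CHUNK) * CHUNK + (x % CHUNK)]

    def set(self, x, y, tile):
        chunk = self._chunk(x // CHUNK, y // CHUNK)
        chunk[(y % CHUNK) * CHUNK + (x % CHUNK)] = tile

world = TileMap()
world.set(1000, 500, 3)       # touching a far-away tile allocates one chunk
print(world.get(1000, 500))   # 3
print(len(world.chunks))      # 1 - only the chunk actually used exists
```

     Rendering then iterates only the visible chunks (a 60x30 view is about four 32x32 chunks), and the generator hook gives the easy populate path the question asks about.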
  15. Hi everyone, I'm trying to implement the displacement mapping technique by William Donnelly from GPU Gems 2, Chapter 8, for a project. I'm working in XNA/MonoGame and using HLSL for my shaders. The problem I'm having is creating the distance map - I'm not really understanding how to go about it. I know that Nvidia has released the code for that chapter, but the files are in C++ and I'm not familiar with C++, so it's been throwing me off on how to recreate it in C#. Any explanation of how I should go about creating the distance map would be extremely helpful. Thanks for any help or suggestions.
  16. Hello all. I'm a newbie on this forum and my English is not very good, sorry. I created volumetric lighting with shadows for my water. Right now I need to compute a max height relative to the water (like the depth of the water volume). I only have screen-space UVs and 4 clip-space vertices (Graphics.Blit in Unity3D). How can I reconstruct the water's vertex world-space position? My water is always horizontal, like in this screenshot. I have one idea, but I don't like it: I could render the water depth to a texture and use that depth to reconstruct the world-space position in the lighting post-effect. I could also reuse this water depth for manually writing to Unity's depth buffer (for a correct depth-of-field post-effect) - and it's the only way to write to the depth texture with transparent geometry. I tried working with matrix transformations between spaces (post-effect quad -> world-space quad -> transform this quad using the TRS (model) matrix of the water -> view -> projection -> clip), but it did not work for me. Any ideas?
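     Since the water is stated to be always horizontal, one possible answer (an assumption on my part, not the poster's method) is to skip the depth texture entirely: build a world-space view ray per pixel and intersect it with the plane y = water_height. The intersection math is a one-liner; a Python sketch with an illustrative function name:

```python
def water_world_pos(cam_pos, ray_dir, water_height):
    """Intersect a per-pixel view ray with the horizontal plane
    y = water_height; returns the world-space hit point, or None
    when the ray is parallel to the plane or the plane is behind it."""
    dy = ray_dir[1]
    if abs(dy) < 1e-8:
        return None                        # ray parallel to the water
    t = (water_height - cam_pos[1]) / dy   # distance along the ray
    if t < 0:
        return None                        # water is behind the camera
    return [cam_pos[i] + t * ray_dir[i] for i in range(3)]

# Camera at y=10 looking diagonally down; water plane at y=0.
hit = water_world_pos([0.0, 10.0, 0.0], [0.0, -1.0, 1.0], 0.0)
print(hit)  # [0.0, 0.0, 10.0]
```

     In the post-effect, the per-pixel ray can come from the usual trick of interpolating frustum-corner directions across the fullscreen quad; the water-volume depth at that pixel is then the difference between this hit and the scene depth.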
  17. Hi, what is the formula to convert a z-coordinate to a depth buffer value? The values of the depth buffer go from 0.0 to 1.0, and I want to map every value on the z-axis into this range. Thanks.
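     For a standard perspective projection mapping view-space z in [near, far] to a [0, 1] depth buffer (the D3D-style convention; OpenGL maps clip z to [-1, 1] first and then remaps), the stored value is z_buf = (far / (far - near)) * (1 - near / z). A quick numeric check in Python:

```python
def depth_buffer_value(z, near, far):
    # Perspective depth: 0 at the near plane, 1 at the far plane,
    # and strongly nonlinear in between (most precision near the camera).
    return (far / (far - near)) * (1.0 - near / z)

near, far = 0.1, 100.0
print(round(depth_buffer_value(near, near, far), 6))  # 0.0  (near plane)
print(round(depth_buffer_value(far, near, far), 6))   # 1.0  (far plane)
# Halfway through eye space is already almost 1 in the buffer:
print(round(depth_buffer_value(50.0, near, far), 4))  # 0.999
```

     Note the nonlinearity in the last line: this hyperbolic distribution is why depth precision problems show up at distance, and why a plain linear remap (z - near) / (far - near) is not what the hardware depth buffer stores.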
  18. Hi, I've tried for a long time to get a skinning example to work, without much success. Does anyone know a website/tutorial that describes step by step what happens to the vertices during key-frame animations? I want to point out that there isn't a more confusing way to present it than wrapping the essential pieces of code into class methods, as is done here: http://ogldev.atspace.co.uk/www/tutorial38/tutorial38.html - that one is just not comprehensible, at least not to me.

      I'd like to reiterate what I already know. Vertices are defined in "mesh space". To transform the mesh to its proper position in the world, several chained transformations have to be applied to those vertices ("world space"). A model is (or can be) a hierarchically organized scene: for example, to put the hand of a human in its correct place, the elbow transform and shoulder transform have to be applied after the model's own transform (that is, where the model is located in the world): absolute_hand_transform = model_transform x shoulder x elbow x hand, each defined relative to its parent, of course. This is all static - no skinning there - and it's what I've already implemented.

      If you load a model using Assimp (for example), bones contain an "offset matrix" that transforms the vertices of a mesh into the local space of the bone. Then an animation produces, for each bone, a relative transformation (relative to its parent, I guess) that I can use after all the previous or "parent" transforms are applied.

      IS IT THEN CORRECT (?) that, to draw a skinned mesh, the transformations I have to calculate and send to the vertex shader are computed as follows:

      MeshSpace-To-WorldSpace = ModelToWorld-Transform x MeshSpace-To-BoneSpace_0 x AnimatedBone_0 x MeshSpace-To-BoneSpace_1 x AnimatedBone_1 x MeshSpace-To-BoneSpace_2 x AnimatedBone_2 x MeshSpace-To-BoneSpace_3 x AnimatedBone_3

      where all those "MeshSpace-To-BoneSpace" transforms are the "bone offset matrices", and each "AnimatedBone" is computed directly by interpolating the animation keys... correct??? (I assume it's not, because I often read about having to use some "inverse bone offset transform" in a way I don't get...) Thanks in advance for any answers/suggestions/clarifications!!
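     For reference, the usual Assimp-style arrangement (the common convention, not a claim about the poster's code) is one skinning matrix per bone rather than one long chain: skin_b = global_b x offset_b, where global_b = global_parent(b) x local_animated_b walks the hierarchy once, and offset_b is that bone's offset matrix (mesh space -> bone bind space). A toy two-bone 2D sketch with translations only, in Python:

```python
def mmul(a, b):
    # 3x3 homogeneous 2D matrix product.
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def translate(tx, ty):
    return [[1, 0, tx], [0, 1, ty], [0, 0, 1]]

def apply(m, p):
    x, y = p
    return (m[0][0]*x + m[0][1]*y + m[0][2],
            m[1][0]*x + m[1][1]*y + m[1][2])

# Two-bone chain: root at the origin, child bone 5 units along x in bind pose.
# The offset matrix is the inverse of the bone's global *bind* transform:
# it moves mesh-space vertices into the bone's local space.
offset_child = translate(-5, 0)           # inverse of bind transform T(5, 0)

# Animated local transform: the bind offset plus the animation (lifted 2 up).
local_anim_child  = translate(5, 2)
global_anim_child = mmul(translate(0, 0), local_anim_child)  # parent = identity

# Per-bone skinning matrix sent to the vertex shader: global_animated x offset.
skin_child = mmul(global_anim_child, offset_child)

# A vertex sitting at the child's bind position follows the bone exactly:
print(apply(skin_child, (5, 0)))   # (5, 2)
```

     The "inverse bone offset" confusion usually dissolves here: the offset matrix already is the inverse bind transform, so each vertex is first pulled back into the bone's bind-local space, then pushed out by the animated global transform.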
  19. Does D3D11 allow a texture to be bound to an SRV and a UAV simultaneously, with the SRV used in the render pipeline and the UAV in the compute pipeline? The details are as follows: 1. Create a texture 'tex', and a UAV 'uav' for the compute pipeline, bound to 'tex'. 2. In the compute shader, dispatch and change the content of 'tex' through the UAV. 3. Without unbinding the UAV from the texture, create an SRV for the render pipeline, also bound to 'tex'. 4. In the pixel shader, draw and sample the content of 'tex', rendering it to the window. I wrote a test case and found that the compute shader had written the correct content to 'tex', but the sampler seemed to sample undefined content in the pixel shader.
  20. Hi, currently I am trying to program a snow shader consisting of snow deformation and a snowfall particle effect. My problem is that I need to use a UAV, preferably an RWTexture2D, holding two 16-bit float values (R16G16_FLOAT), but only R32_UINT allows simultaneous writes. I saw in the documentation that there is a possibility to cast those in HLSL by including D3DX_DXGIFormatConvert.inl, but my program does not find the header. So this is my problem: how do I properly include the format-conversion header? Any thoughts?
  21. I am looking for some things to experiment on and research, and maybe try to find a solution. If you have any ideas about common problems or challenges that haven't been solved to date when it comes to graphics programming, shading or rendering, please post them here; I would appreciate it. Thanks in advance.
  22. Does anyone have an example of how to load a skinned mesh from a .x file with animations? I managed to load the mesh but can't figure out how to load the animations. I would be very grateful for any example, or at least some suggestion on how to deal with it. It should be done with SlimDX and DirectX 9.
  23. SmaugA

    Release V0.1.7 in Play market

    Hello everyone! The Lost Souls has been released for free on Android, and you can find it on the Play market (URL: https://play.google.com/store/apps/details?id=com.VitalyDemin.TheLostSouls ). Thanks to everyone who reads this devblog and contacts me. V0.1.7 includes: Chapter 1 with 3 levels; a new type of enemy; a manual and prologue; sound effects; and many small changes.
  24. Hi, I am using a Unity compute shader to create a vertex buffer and an index buffer. The vertex buffer is an AppendStructuredBuffer<float3> vertexBuffer; each thread calculates a vertex and then calls vertexBuffer.Append(vertex). Later in that same kernel I have a RWStructuredBuffer<int> indexBuffer, and I would like to add the index of the vertex just placed in vertexBuffer to indexBuffer. However, there doesn't seem to be any way to get the index of the last item added to an append buffer. I'm aware of CopyCount, but that doesn't seem appropriate here. I've also looked into CopyStructureCount, but I don't think that can be called from a Unity compute shader. Thanks.
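A common workaround for the problem above is to drop the AppendStructuredBuffer and reserve slots explicitly: use a plain RWStructuredBuffer for the vertices plus a one-element counter buffer, and have each thread call InterlockedAdd on the counter. The returned original value is that thread's unique vertex index, which can then be written into the index buffer in the same kernel. A minimal sketch of why this hands out unique indices, modelled in Python with OS threads standing in for GPU threads (all names hypothetical):

```python
import threading

class Counter:
    """Stand-in for a one-element RWStructuredBuffer<uint> used with InterlockedAdd."""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def interlocked_add(self, n=1):
        # Like HLSL InterlockedAdd(buf[0], n, original): atomically add,
        # returning the pre-add value, so each caller gets a unique slot.
        with self._lock:
            original = self.value
            self.value += n
            return original

def kernel_thread(counter, vertices, indices, vertex):
    # Each "GPU thread" reserves a unique slot instead of calling Append().
    i = counter.interlocked_add()
    vertices[i] = vertex
    indices[i] = i  # the index is known immediately, no CopyStructureCount needed

counter = Counter()
n = 64
vertices = [None] * n
indices = [None] * n
threads = [threading.Thread(target=kernel_thread,
                            args=(counter, vertices, indices, (float(t), 0.0, 0.0)))
           for t in range(n)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Every slot is now filled exactly once, and counter.value == n.
```

In HLSL the same idea is `InterlockedAdd(counterBuf[0], 1, original);` on a RWStructuredBuffer<uint>; Unity's ComputeBufferType.Counter together with RWStructuredBuffer.IncrementCounter() (which also returns the pre-increment value) is another way to get the same per-thread unique index.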
  25. Why hello there! It's time for your favourite Weekly Update blog again! This time it's quite heavy on visuals, so without further ado, let's dive right in!

New Shaders

First up, let's talk about shaders. Being quite active on Reddit, I get to see many different boards. One of them is the Unity3D board, r/Unity3D. To keep this brief, Unity has a visual graph interface for writing shaders called "Shadergraph". In many aspects it works like Unreal Engine's blueprints, where there are different nodes with inputs and outputs; the developer simply links inputs to outputs to build a shader. Lately a meme has been spreading around about using Shadergraph to create water shaders. It's quite interesting; I've only seen a few here and there, but it's a great meme nevertheless. The catch is that Shadergraph is only available in the pro version of Unity, and while I wouldn't mind spending the cash on it, I'm currently stuck with the free version. But that didn't discourage me in the slightest. I've decided to build my own water shader, with blackjack and hookers (but most importantly, don't forget the shader! 😉 ).

The Depth Texture

In case you don't know, the first step is to get the camera's depth texture. Basically, the depth texture is a texture holding the depth of each pixel relative to the rendering camera: the further away the pixel, the greater its depth value. This means that any geometry writing to the Z buffer will affect the depth. This is quite handy, as nice effects can be made with this information. Here's a picture of a depth texture for a given scene. Because I'm using deferred rendering, this texture is given to me for "free", but if I were using forward rendering, another rendering step would be needed. Nevertheless, getting it is quite easy, as it's available in all shaders as a global sampler called _CameraDepthTexture. Just add this sampler to your shaders and you're (almost) good to go.
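To make the depth numbers concrete, here is a small Python sketch (hypothetical near/far values; a standard non-reversed-Z projection is assumed) of how a raw [0,1] depth-buffer value maps back to an eye-space distance, and of the depth-difference gradient the water shader builds from it:

```python
def eye_depth_from_raw(d, near, far):
    """Raw [0,1] depth -> eye-space distance for a standard (non-reversed-Z)
    projection; roughly what Unity's LinearEyeDepth() computes for you."""
    return near * far / (far - d * (far - near))

def saturate(x):
    return max(0.0, min(1.0, x))

def foam_line(scene_eye_depth, surface_eye_depth, depth_factor):
    # Shore-foam gradient: eye-space depth of the geometry seen behind the
    # water minus the eye-space depth of the water-surface pixel itself,
    # sharpened by a tweakable factor and clamped to [0, 1].
    depth_distance = scene_eye_depth - surface_eye_depth
    return saturate((depth_factor / 2.0) * depth_distance)

near, far = 1.0, 100.0
print(eye_depth_from_raw(0.0, near, far))   # near plane -> 1.0
print(eye_depth_from_raw(1.0, near, far))   # far plane  -> 100.0
print(foam_line(4.25, 4.0, 2.0))            # shallow water -> thin foam, 0.25
```

Note that the exact linearization Unity applies (via _ZBufferParams) varies per platform, and modern Unity uses reversed-Z on most targets, which is why the shader calls LinearEyeDepth rather than hard-coding a formula like the one above.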
The Shader

But using the depth texture by itself isn't enough. We need to transform this data to get that sweet, sweet water shader. The first thing is to actually project the sampler onto the geometry. We then get the eye-space depth for each pixel. This, in turn, creates a kind of gradient that follows the border of any pierced geometry, creating a kind of fade-out.

float4 depthSample = SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, IN.screenPos);
float depth = LinearEyeDepth(depthSample).r;
float depthDistance = (depth - IN.screenPos.w);

Afterwards, it's a matter of getting the actual distance and tweaking the gradient a bit to make it sharper (or softer if you want).

float shoreFoamLine = saturate(UNITY_ACCESS_INSTANCED_PROP(Props, _ShoreFoamDistance) * depthDistance);
float foamLine = saturate((UNITY_ACCESS_INSTANCED_PROP(Props, _DepthFactor) / 2.0) * depthDistance);

When the gradient is sharp enough, we can create a kind of cartoonish foam around any geometry. Used properly, this can create quite the effect. The last step I took was to switch to a custom gradient instead: I basically opened my image editor and created a really pixel-ish gradient with clearly defined layers. Then it's only a matter of sampling that gradient texture with the computed gradient, and you get quite the effect. With that new gradient, we can change the material's properties any way we want. It can make various effects when using different colours and settings. Here's a quick video showing off the different possible effects the depth texture can create:

Room Redesign

Secondly, while the shaders were compiling, I had time to continue some redesigns here and there. Let's start with the bank, which is by far the oldest room. One predominant difference you'll see in most of these rooms is obviously better lighting. This was due to the room scaling its walls to match the level's wall heights.
While it created a certain level of coherence, it was really weird-looking. It was also a significant pain to light up, which resulted in badly lit rooms. Another problem came with the reflection probes, which needed to be re-scaled and re-positioned each time a special room was spawned. So I removed that wall-scaling behaviour. The effect is that now I can properly set up the lighting of each room, and it also made those reflection probes easier to manage.

The New Bank

First, let's look at the new bank! Now the counters are much more detailed, with each one having a proper computer setup with screens and a keyboard. The counter screen got a lot more detail than before: now it has metal bars instead of a solid design. Another thing to mention is those massive pillars on the sides; they're supposed to mimic classical bank designs. The lighting is also greatly updated. It's much more present, and in turn makes the scene feel much more detailed and designed than before. Lastly, I've also added a darkened corridor that is supposed to tell the player that the bank is bigger than it looks. I used another depth-texture shader to create that darkening effect too! I'm fairly content with that design right now. Can't say it's the final design, though (because of agility and whatnot), but it's really neat-looking, to be frank.

The New Diner

Next, it's all about the new diner. First, I decided to change the ceiling, which I thought was really boring and didn't make me think of diners at all. To fix this, I added a curve to the ceiling to give it a more "diner" aesthetic altogether. Another big change is the redesign of the props: now they're much more low-poly, with flattened quads and sharpened edges. Another change was the use of metallic materials. In classic diners, metallic materials were used almost everywhere; in my diner, there weren't any at all.
Now there are a bunch of metallic ridges on most geometry, which gives it quite the "diner" aesthetic I'm going for. I've also decided to change how the diner displays its menu. Previously the items were placed on the counter, but it wasn't clear enough that those food items were interactable, so I decided to make them hover above the counter instead. I've also added a constant rotation to them, which makes them more appealing to the player. SpinningFoods.mp4 I'm not done with that design, though. I still want to add something similar to the bank's dark hallway: I want to make the player believe that this room is actually just a portion of the whole thing. I still need to think about it, though.

The New Casino

Finally, let's talk about the new casino. I'm gonna be honest here: the star of the scene is mainly the ceiling. I was inspired by my city's casino design, which is actually a reused pavilion from the 1967 World's Fair. It has really funky curves and whatnot. This inspired me to create a spiral-shaped ceiling with big, wide windows all around it. I'm reusing the same "fake sun" lighting system that I used in my super shoppe room: basically, I rotate a spotlight to replicate the sun's alignment. I personally think the projected light makes quite an impression, especially with the relatively simple room design and simple low-poly geometry. I also added a bunch of lights just to make it look nice. That's about it for rooms as of right now.

Minor Updates

Refactored my enemies (while it's implemented, it's still a WIP, so no pictures, sorry!). Now they have a more "humanoid" skeleton, which effectively means I have access to things like head IK and many more advanced humanoid features, such as animation mirroring and whatnot. Updated the walk animation blend tree to be 2D; with new walking animations, enemies can now sidestep and backstep. Redesigned the model to match the new designs.
Changed the Super Shoppe floor to carve a nice glass design into the second floor's floor: the carving goes through the floor and is also visible from below; it's supposed to represent the icon for crystals. Optimized some unoptimized code here and there: I've been doing a lot of profiling and discovered that some of my GUI elements can cause lag (dropping from 60 FPS to ~15 FPS), so it's back to the drawing board again, I guess. A bunch of cleaning, refactoring and optimization.

Next Week

While redesigning the rooms is essential, I'm also worried about performance. It's a relatively simple game, and having the frame rate drop to 25% is kind of unacceptable (especially given that the geometry is stupidly low-poly compared to modern games). I have no idea why, but I've always been conservative with the game's resources. Perhaps it's because I lack confidence in the technology, or perhaps it's a lack of knowledge about the actual limits of the engine. I really have no idea, and while I'm not as knowledgeable as others on the matter, I also think it's something that can be learned. But anyway, first up is fixing that unoptimized GUI element. I don't think it'll take long, but who knows? Aside from that, I want to continue the room redesigns. Then it's enemy time, and after that, it's your usual suspects. I really think that if I keep working on it I could have something by about July or August. In hindsight, the March/April deadline was kind of presumptuous, especially knowing that the game itself wasn't even a year into development to begin with. I knew that games like these could take up to two years to finish, but I was also kinda cocky about the fact that I could do it in half the time. Needless to say, that wasn't really possible at all. But right now, let's focus on the present and slowly plan ahead as we go: the future can be really unpredictable sometimes.