Hashbrown

Member
  • Content count: 49
  • Joined
  • Last visited
  • Days Won: 1

Hashbrown last won the day on June 23

Hashbrown had the most liked content!

Community Reputation: 126 (Neutral)

About Hashbrown

  • Rank: Member

Personal Information

  • Role: DevOps
  • Interests
    Art
    Audio
    Business
    Design
    DevOps
    Programming


  1. Hey guys, thanks again for all the help! I've definitely taken all the advice given to me: I changed to 3D Simplex noise and I'm sampling it with spherical coordinates. I love the results now because there's no stretching. I had trouble understanding why 2D noise wasn't the best choice, but all the explanations helped. (Not the real colors of the planet, just marking areas.)
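     For anyone finding this later, a minimal sketch of that approach, assuming a simplex3(x, y, z) function that returns values in [-1, 1] (e.g. from a typical JS noise library): map each texel of the equirectangular texture to a point on the unit sphere and sample 3D noise there, so there's no 2D stretching and the left/right edges meet on their own.

     const width = 1000, height = 500;
     const pixels = new Uint8Array(4 * width * height);

     let p = 0;
     for (let y = 0; y < height; y += 1) {
       for (let x = 0; x < width; x += 1) {
         const theta = (y / height) * Math.PI;          // polar angle, 0..PI
         const phi = (x / width) * 2.0 * Math.PI;       // azimuth, 0..2PI
         const px = Math.sin(theta) * Math.cos(phi);    // point on the unit sphere
         const py = Math.cos(theta);
         const pz = Math.sin(theta) * Math.sin(phi);
         const n = (simplex3(px, py, pz) + 1.0) * 0.5;  // remap [-1, 1] -> [0, 1]
         pixels[p] = pixels[p + 1] = pixels[p + 2] = n * 255;
         pixels[p + 3] = 255;
         p += 4;
       }
     }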
  2. Gnoll, thanks for answering! I'm definitely going to try Simplex; I found an implementation here. Isn't Simplex under a strict patent though? I appreciate the answer to the seam question too; I'll definitely give it a try, that was really annoying me. As for cutting off certain areas of the noise, I found the answer in this video. The person who made the tutorial calls it a "falloff map" (see the sketch below). The tutorial is short too, 11 minutes long. (Could use some improvement, but I'm getting there.) I also found out why my texture looked stretched: I was actually stretching the texture! I'm creating the texture dynamically and was making it 512x512, but a sphere's equirectangular mapping wants a texture twice as wide as it is tall. So I went with 1000x500 and it looks so much nicer.
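     A hedged sketch of the "falloff map" idea from that video (the constants and names here are illustrative, not the tutorial's exact code): a gradient that grows toward the texture borders and gets subtracted from the noise, so land can only form near the middle, i.e. islands.

     function falloff (u, v) {                           // u, v in [0, 1]
       const dx = u * 2 - 1;                             // remap to [-1, 1]
       const dy = v * 2 - 1;
       const d = Math.max(Math.abs(dx), Math.abs(dy));   // square-shaped falloff
       const a = 3.0, b = 2.2;                           // curve-shaping constants
       return Math.pow(d, a) / (Math.pow(d, a) + Math.pow(b - b * d, a));
     }

     // usage: const height = noise(u, v) - falloff(u, v);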
  3. I'm trying to use Perlin noise to paint landscapes on a sphere. So far I've been able to make this: (the quad is just there to give a flatter view of the height map)

     I'm not influencing the mesh vertex heights yet, but I am creating the noise map on the CPU and passing it to the GPU as a texture, which is what you see above. I've got two issues though:

     Issue #1: If I get a bit close to the sphere, the detail in the landscape looks bad. I'm aware that I can't get too close, but I also feel I should be able to get better quality at the distance shown above. The detail in the texture looks blurry and stretched; it just looks bad. I'm not sure what I can do to improve it.

     Issue #2: I believe I know why this one occurs, but not how to solve it. If I rotate the sphere, you'll notice something. Click on the image for a better look: (notice the seam?) What I think is going on is that some land/noise reaches the edge of the UV/texture, and since texturing the sphere is pretty much like wrapping paper around it, the beginning and end of the texture map connect, and the two sides have different patterns.

     Solutions I have in mind for Issue #2:

     A) Limit the noise within a certain bounding box, making sure "land" isn't generated near the borders or poles of the texture. Think islands. I just have no idea how to do that.

     B) Find a way to make the noise wrap around to the beginning of the UV/texture once it reaches the end, so the two edges connect seamlessly. Again, I have no idea how to do that.

     I'm kind of rooting for solution A, though; I'd be able to make islands that way. Hope I was able to explain myself. If anybody needs any more information, let me know. I'll share the function in charge of making this noise below; the shader isn't doing anything special besides drawing the texture. Thanks!

     CPU noise texture:

     const width = 100;
     const depth = 100;
     const scale = 30.6;
     const pixels = new Uint8Array(4 * width * depth);

     let p = 0; // pixel write index
     for (let z = 0; z < depth; z += 1) {
       for (let x = 0; x < width; x += 1) {
         const octaves = 8;
         const persistance = 0.5;
         const lacunarity = 2.0;
         let frequency = 1.0;
         let amplitude = 1.0;
         let noiseHeight = 0.0;

         for (let i = 0; i < octaves; i += 1) {
           const sampleX = x / scale * frequency;
           const sampleZ = z / scale * frequency;
           const n = perlin2(sampleX, sampleZ); // roughly [-1, 1]
           noiseHeight += n * amplitude;
           amplitude *= persistance;
           frequency *= lacunarity;
         }

         pixels[p]     = noiseHeight * 255;
         pixels[p + 1] = noiseHeight * 255;
         pixels[p + 2] = noiseHeight * 255;
         pixels[p + 3] = 255;
         p += 4;
       }
     }

     GPU GLSL:

     void main () {
       vec3 diffusemap = texture(texture0, uvcoords).rgb;
       color = vec4(diffusemap, 1.0);
     }
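     A sketch of solution B for later readers, assuming a 3D noise function perlin3(x, y, z) is available (perlin2's 3D sibling in most noise libraries): as x walks across the texture, walk around a circle in noise space instead, so the first and last columns sample adjacent points and the left/right edges connect seamlessly.

     function seamlessX (x, z, width, scale) {
       const radius = width / (2.0 * Math.PI);      // circumference equals width
       const angle = (x / width) * 2.0 * Math.PI;
       const nx = (Math.cos(angle) * radius) / scale;
       const ny = (Math.sin(angle) * radius) / scale;
       return perlin3(nx, ny, z / scale);           // seamless in x (not in z)
     }

     // usage: replace perlin2(sampleX, sampleZ) in the octave loop with
     // seamlessX(x * frequency, z * frequency, width, scale); this stays
     // seamless as long as frequency is an integer, which powers of
     // lacunarity = 2 are.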
  4. Actually, that did work, and now I understand what I was missing. Thanks a lot Tim!
  5. I'm trying to use values generated with a 2D Perlin noise function to determine the height of each vertex on my sphere, just like a terrain height map but spherical. Unfortunately I can't seem to figure it out.

     So far it's easy to push any particular vertex along its calculated normal, and that seems to work. As you can see in the following image, I'm pulling only one vertex along its normal vector. This was accomplished with the following code (no noise yet, btw):

     // Happens after normals are calculated for every vertex in the model.
     // xLen and yLen are the segments and rings of the sphere.
     for (let x = 0; x <= xLen; x += 1) {
       for (let y = 0; y <= yLen; y += 1) {
         // Normals
         const nx = model.normals[index];
         const ny = model.normals[index + 1];
         const nz = model.normals[index + 2];

         let noise = 1.5;

         // Just pull one vert...
         if (x === 18 && y === 12) {
           // Verts
           model.verts[index]     = nx * noise;
           model.verts[index + 1] = ny * noise;
           model.verts[index + 2] = nz * noise;
         }

         index += 3;
       }
     }

     But what if I want to use 2D Perlin noise values on my sphere to create mountains on top of it? I thought it would be easy to displace the sphere's vertices using its normals and Perlin noise, but clearly I'm way off. This horrible object was created with the following code:

     // Happens after normals are calculated for every vertex in the model.
     // xLen and yLen are the segments and rings of the sphere.
     // Keep in mind I'm not using a height map image; I'm feeding the noise value in directly.
     for (let x = 0; x <= xLen; x += 1) {
       for (let y = 0; y <= yLen; y += 1) {
         // Normals
         const nx = model.normals[index];
         const ny = model.normals[index + 1];
         const nz = model.normals[index + 2];

         const sampleX = x * 1.5;
         const sampleY = y * 1.5;
         let noise = perlin2(sampleX, sampleY);

         // Update model vert heights
         model.verts[index]     = nx * noise;
         model.verts[index + 1] = ny * noise;
         model.verts[index + 2] = nz * noise;

         index += 3;
       }
     }

     I have a feeling the direction I'm pulling the vertices in is okay; the problem might be the intensity. Perhaps I need to clamp the noise value? I've seen terrain planes where they create the mesh based on the height map image dimensions. In my case, the sphere model verts and normals are already calculated and I want to add height afterwards (but before creating the VAO). Is there a way I could accomplish this so my sphere displays terrain-like geometry on it? Hope I was able to explain myself properly. Thanks!
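     A hedged sketch of the usual fix, for later readers (baseRadius and amplitude are illustrative, not from the thread): displace each vertex from the sphere's base radius instead of replacing the position with normal * noise, and keep the displacement small relative to the radius; otherwise vertices collapse toward the center wherever the noise is near zero or negative.

     const baseRadius = 1.0;   // the sphere's original radius
     const amplitude = 0.15;   // mountain height, small relative to the radius
     let index = 0;

     for (let x = 0; x <= xLen; x += 1) {
       for (let y = 0; y <= yLen; y += 1) {
         const nx = model.normals[index];
         const ny = model.normals[index + 1];
         const nz = model.normals[index + 2];

         const n = perlin2(x * 1.5, y * 1.5);     // roughly [-1, 1]
         const r = baseRadius + n * amplitude;    // per-vertex radius

         model.verts[index]     = nx * r;
         model.verts[index + 1] = ny * r;
         model.verts[index + 2] = nz * r;

         index += 3;
       }
     }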
  6. Thank you so much! Removing the fract function was exactly the solution! Why was this occurring, though? Could it be due to the welding issue, or maybe floating-point precision? I'll figure out another way to dynamically tile textures. I appreciate all the answers suggested, thanks all. Update: Hodgman's answer above explains why this is happening, for those of you wondering.
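     For later readers, the usual explanation: fract() makes the UV derivatives jump at every tile boundary, and since the GPU picks mipmap levels from those derivatives, it momentarily selects the smallest mip along the boundary, which shows up as a seam line. One common way to tile without fract(), assuming the texture's wrap mode is configurable, is to let the sampler repeat instead:

     gl.bindTexture(gl.TEXTURE_2D, texture);
     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.REPEAT);
     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.REPEAT);

     // ...then in the fragment shader, scale the coordinates and sample directly:
     // vec3 diffusemap = texture(texture0, uvcoords * uvtile).rgb;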
  7. Hey everybody, still no solution. Zakwayda is right: even though I tested with a Blender model, I'm sure there's something wrong with my implementation. I've actually uploaded the entire engine to GitHub; hopefully it's not too large of a project. [GitHub link] I'll leave a little guide here so nobody has to guess where everything is.

     By the way, if you want to run the project, you can either host the client folder or, if you have node installed:

     $ npm install
     $ node server   # after the dependencies download

     If you're using nodejs, the game will be served at localhost:8080.

     I also tried drawing the scene with no mipmapping but got the following warning: "texture bound to texture unit 2 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering." Which is true; my planet texture is w > h. I tried it with mipmapping off and min and mag filters on nearest.

     Guide

     If you press the lockmouse button, you enter fps mode, enabling you to navigate around using your mouse and arrow keys (not WASD).

     CORE
     src/engine/loop.js: Game loop. Physics and render loops are called from here.

     MAIN ENTRY
     src/main.js: This is where I create the planet entities, textures, and meshes.
     Game.load: calls a function in src/game/loader.js and loads all objs, images, etc.
     Game.ready (line 26): called once all assets are ready, before the game starts.

     UV SPHERE, RENDERING, MESH AND TEXTURE CONSTRUCTORS
     src/engine/systems/drawscene.js (line 109): Entities are rendered here.
     src/engine/texture/texture.js: The Texture object for creating new Textures.
     src/engine/data/model.js (line 216): The UV Sphere function is located here; it's the very last static function.
     src/engine/data/mesh.js: The Mesh object for new Meshes.

     SHADER
     src/shaders/phong.frag: The only shader I'm using aside from the post-processing ones. You'll notice I'm feeding vec4 color the diffuse texture map only, ignoring my lighting algorithm for now.

     MATH
     src/engine/math: Here you'll find matrices, vectors, and also a class for calculating normals.
  8. WebGL2 is what I'm using. Actually, I'm also using CLAMP_TO_EDGE; sorry, I should have been more specific. The API I wrote just needs the word "clamp":

     if (value === "clamp") wrap = gl.CLAMP_TO_EDGE;

     Gnoll, I'm actually going to try that, thanks for your suggestion!
  9. Hey, what's up Gaxio. No padding as far as I can see; in fact, this is what it looks like: Good point, but I actually tested exporting a UV sphere from Blender and I get the same issue. Here's the UV sphere code, which I got from the following tutorial.
  10. I've been taking a WebGL2 tutorial on how to make UV spheres, but noticed a one-pixel-wide line running from the north to the south pole of my UV sphere: (you'll have to click on the images to get a better look)

     Before posting this question, I tried exporting a UV sphere primitive made in Blender, unwrapped with Blender's sphere unwrapping function, and I still got the same line. I wanted to rule out the possibility of the line being caused by my own UV sphere implementation, but I get the same situation with a UV sphere generated by Blender. I'm also not lighting the scene, just using the diffuse texture map. I also drew a straight line from beginning to end on the diffuse map to make sure the UVs are lined up, and everything is okay... but at some point you'll see that minor one-pixel separation mentioned above. Would anybody have any clue what might be happening? I can't seem to figure out why this is occurring.

     Some extra info. I thought I'd mention some of my texture settings just in case it helps:

     mipmapping: true
     wrap: "clamp"
     minfilter: "linearmipmap"
     magfilter: "linear"
  11. Scouting Ninja, thanks a lot! That was exactly what was going on. I had no idea what anisotropic filtering does; now I know. I had to set this first:

     minfilter = gl.LINEAR_MIPMAP_LINEAR;
     magfilter = gl.LINEAR;

     and of course:

     const max = gl.getParameter(ext.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
     gl.texParameterf(gl.TEXTURE_2D, ext.TEXTURE_MAX_ANISOTROPY_EXT, max);

     Now my frames are rendering very nicely. For those of you looking for more details on how to do this in WebGL or OpenGL ES, you can check out this tutorial. It's a very brief but informative article. Thanks again Ninja!
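     One gotcha worth flagging for later readers (a hedged aside, not from the thread): texParameterf only affects the texture currently bound to TEXTURE_2D, so the anisotropy call has to run once per texture, e.g. in the texture constructor:

     function setAnisotropy (gl, texture) {
       const ext = gl.getExtension('EXT_texture_filter_anisotropic');
       if (!ext) return;   // extension not supported on this device
       gl.bindTexture(gl.TEXTURE_2D, texture);
       const max = gl.getParameter(ext.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
       gl.texParameterf(gl.TEXTURE_2D, ext.TEXTURE_MAX_ANISOTROPY_EXT, max);
     }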
  12. I've found two particular visual issues in my renders and can't seem to find a way to solve them. Just to set up some context: floors, walls, and roof all use diffuse, specular, and normal maps, and all three are tiled in the fragment shader. If you don't mind, take a look at the video below.

     If you look at the roof area of the room, the pixels morph depending on the camera movement (the light source is not moving): it looks horrible. I've also noticed that the less I tile the roof, the less morphing and fewer dots I see. Also, if you check out the floor, you'll notice strange artifacts around the lines of the floor. In fact, I'll show you a zoomed-in picture (click for a larger view).

     I'm not sure what to do. I've checked whether my video card supports anti-aliasing:

     function antialiasSamples () {
       const antialias = gl.getContextAttributes().antialias; // true if the context was created with antialiasing
       return gl.getParameter(gl.SAMPLES);
     }

     ...and it does: 8 samples. I also have a post-processing pass, but I'm blitting the framebuffer and using a multisample texture (8 samples):

     static toMultisample (input, output) {
       gl.bindFramebuffer(gl.READ_FRAMEBUFFER, input.buffer);
       gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, output.buffer);

       gl.blitFramebuffer(
         0, 0, input.width, input.height,
         0, 0, output.width, output.height,
         gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT,
         gl.NEAREST
       );

       gl.bindFramebuffer(gl.READ_FRAMEBUFFER, null);
       gl.bindFramebuffer(gl.DRAW_FRAMEBUFFER, null);
     }

     ...and this definitely works, yet I still get the same jagged edges and artifacts on the floor. I even added a feature to my textures which I honestly don't understand very well:

     const ext = (
       gl.getExtension('EXT_texture_filter_anisotropic') ||
       gl.getExtension('MOZ_EXT_texture_filter_anisotropic') ||
       gl.getExtension('WEBKIT_EXT_texture_filter_anisotropic')
     );

     if (ext) {
       const max = gl.getParameter(ext.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
       gl.texParameterf(gl.TEXTURE_2D, ext.TEXTURE_MAX_ANISOTROPY_EXT, max);
     }

     ...still not much difference. Are there any tips some of you can offer to remove these issues? Below I'll share some information about my rendering pipeline and even share my shaders. Thank you very much for reading.

     Shader Pastebin links: VERTEX SHADER, FRAGMENT SHADER

     WebGL2. There's only one point light in the scene, at vec3(0.0, 3.0, 2.0). Textures are tiled in the fragment shader (vec2 uvs = fract(uvcoords).xy * uvtile). Mipmapping is on for all textures.

     By the way, I thought I'd share a video of a closer look at objects with normal mapping, so you all have an idea of how it's working so far. I'm moving the light left and right, btw.
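     For context on the blitting step above, a hedged sketch of how the multisampled side of such a pipeline is typically built in WebGL2 (illustrative, not this engine's actual code): the scene renders into an FBO backed by multisampled renderbuffers, which is then blitted into a regular texture FBO to resolve the samples before post-processing.

     function createMSAAFramebuffer (gl, width, height, samples) {
       const fbo = gl.createFramebuffer();
       gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);

       // multisampled color attachment
       const color = gl.createRenderbuffer();
       gl.bindRenderbuffer(gl.RENDERBUFFER, color);
       gl.renderbufferStorageMultisample(gl.RENDERBUFFER, samples, gl.RGBA8, width, height);
       gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.RENDERBUFFER, color);

       // multisampled depth attachment
       const depth = gl.createRenderbuffer();
       gl.bindRenderbuffer(gl.RENDERBUFFER, depth);
       gl.renderbufferStorageMultisample(gl.RENDERBUFFER, samples, gl.DEPTH_COMPONENT24, width, height);
       gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, depth);

       gl.bindFramebuffer(gl.FRAMEBUFFER, null);
       return fbo;
     }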
  13. Hey Randy, thanks a lot for the great answer, very much appreciated. I'm definitely changing the algorithm to the simpler version you're suggesting and changing my character to a capsule collider. Thanks again!
  14. Hey Randy, thanks for the share, I appreciate it. I forgot to mention, though, that I'm working on a fast-paced 3D FPS that uses some very basic physics. I read that the Minkowski difference was a nice candidate; is that true? I know I'll have to modify my code to avoid tunneling and to consider the colliders' velocities, but I wanted to understand the algorithm in its simplest form. By the way, I finally understood how to get the "penetration vector". I like calling it an offset vector instead, since I use the returned vector to push the collider out. Would this algorithm (plus modifications) still be a good idea for a fast-paced FPS game with physics? Or maybe a better question: what situations is Minkowski-difference collision good for? Thanks!
  15. I've been reading this article on using Minkowski differences for game collision, but unfortunately I'm stuck on a particular part. I managed to create the Minkowski-differenced AABB, and collision detection works on any side, but I fail to understand (intuitively) how to get the penetration vector of said collision. Please note that the article uses an inverted Y axis; I re-wrote the code and, as I mentioned, got collision detection working. In my implementation, up is +y.

     According to the article, I'm guessing he's referring to the distance from the origin in Minkowski space to the Minkowski-differenced box? I really can't seem to wrap my mind around this penetration vector code, to be honest. He describes a function called closestPointOnBoundsToPoint, which does exactly that, but I'm not getting it. A Vector.zero is passed to the function, so in what situation would you pass something different? If I can give a wild guess, I'm assuming it's because, if there's a collision, the Minkowski AABB will always surround the Minkowski-space origin. I don't know if I'm right; I'm assuming. Hopefully somebody can point me in the right direction. Below I'll share the code I have so far for getting the Minkowski-differenced AABB. If any more information or code is needed, let me know. Thank you very much in advance!

     // AABB constructor arguments: center, size of AABB
     // Btw, this particular example collides.
     const a = new AABB(new Vec2(0, 0), new Vec2(1, 1));
     const b = new AABB(new Vec2(2, 0), new Vec2(1, 1));

     const diff = new Vec2();
     diff.x = a.min.x - b.max.x;
     diff.y = a.min.y - b.max.y;

     const size = new Vec2();
     size.x = a.size.x + b.size.x;
     size.y = a.size.y + b.size.y;

     // Minkowski AABB
     const md = new AABB();
     md.center.x = diff.x + size.x;
     md.center.y = diff.y + size.y;
     md.size = size;

     // Check for collision
     if (md.min.x <= 0 && md.min.y <= 0 &&
         md.max.x >= 0 && md.max.y >= 0) {
       // Collision, but I need the penetration vector so I can push the object out :(
     } else {
       // No collision.
     }

     Edit: I tried visualizing the AABBs and noticed that the Minkowski AABB is a mirror of the point of collision, but in some other space around the origin; let me call it Minkowski space for now. I still don't understand how to get the penetration vector, though. I'm guessing that since the Minkowski AABB is a mirror of the point of collision, we can use the origin (Vector.zero) in Minkowski space to calculate the vector I can use as an offset. By the way, I'm sorry if I'm not using the right terminology.
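     A hedged reconstruction of what the article's closestPointOnBoundsToPoint does when passed Vector.zero (pieced together from the description above, not copied from the article): it returns the closest point on the Minkowski AABB's boundary to the origin. When the boxes overlap, the origin lies inside the Minkowski AABB, and that closest boundary point is the smallest translation that separates the boxes, i.e. the penetration/offset vector. You'd pass a point other than Vector.zero only if you wanted the closest boundary point to some other location.

     // Assumes md.min and md.max are Vec2s, as in the snippet above.
     function penetrationVector (md) {
       let minDist = Math.abs(md.min.x);
       let out = new Vec2(md.min.x, 0);   // push out through the left face

       if (Math.abs(md.max.x) < minDist) {
         minDist = Math.abs(md.max.x);
         out = new Vec2(md.max.x, 0);     // ...or the right face
       }
       if (Math.abs(md.min.y) < minDist) {
         minDist = Math.abs(md.min.y);
         out = new Vec2(0, md.min.y);     // ...or the bottom face
       }
       if (Math.abs(md.max.y) < minDist) {
         minDist = Math.abs(md.max.y);
         out = new Vec2(0, md.max.y);     // ...or the top face
       }
       return out; // offset to push the collider out (sign depends on A - B vs B - A)
     }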