denisve

Member
Content Count: 37
Community Reputation: 182 Neutral
  1. Could you elaborate a bit more? I'm not too good at math, so if you could put it as formulas, or any kind of code, that would be great. Also, sampling three noise values and normalizing the vector just produces nice colourful noise, which is cute, but not really what I need :) I'm aiming for the bluish look of a normal map.
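     As formulas: treat the noise as a height function h(u, v) and build the tangent-space normal from its gradient, N = normalize(-s * dh/du, -s * dh/dv, 1), approximating the partials with central differences. A minimal GLSL sketch of that idea; the noise function is assumed to exist elsewhere, and eps and the strength s are tuning knobs:

        // prototype for whatever scalar noise is already in use (assumed)
        float noise(vec2 p);

        float height(vec2 uv) {
            return noise(uv); // the heightfield h(u, v)
        }

        vec3 normalFromHeight(vec2 uv, float eps, float s) {
            // central differences approximate dh/du and dh/dv
            float hL = height(uv - vec2(eps, 0.0));
            float hR = height(uv + vec2(eps, 0.0));
            float hD = height(uv - vec2(0.0, eps));
            float hU = height(uv + vec2(0.0, eps));
            // multiplying through by 2*eps avoids the division
            return normalize(vec3((hL - hR) * s, (hD - hU) * s, 2.0 * eps));
        }

        // packed to [0, 1] this gives the bluish normal-map look:
        // vec3 rgb = normalFromHeight(vUv, 0.001, 1.0) * 0.5 + 0.5;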
  2. Thanks, I've read it, but there's one catch: I'm using WebGL, which means no dFdxFine, which the fragment shader code there relies on. Is there any way around that in GLSL ES?
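     For reference: GLSL ES 1.00 has no dFdxFine/dFdyFine at all, but the plain dFdx/dFdy/fwidth functions are exposed in WebGL 1 through the OES_standard_derivatives extension (enabled from JS with gl.getExtension('OES_standard_derivatives')), and in practice they substitute for the Fine variants. A minimal fragment shader showing the usage:

        #extension GL_OES_standard_derivatives : enable
        precision highp float;

        varying vec2 vUv;

        void main() {
            // screen-space derivatives of the interpolated UVs
            vec2 ddx = dFdx(vUv);
            vec2 ddy = dFdy(vUv);
            gl_FragColor = vec4(abs(ddx) * 100.0, abs(ddy) * 100.0); // debug view
        }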
  3. Hi, all.

     I'm working on large landmass generation. Right now I'm using pregenerated normal maps, but that is getting costly, and the goal is good quality at any zoom level, so the best approach would be procedural. Presume the mesh is already generated and that I need to render it using predefined colors and a normal map. I was thinking of writing a GLSL function that returns a smooth normal for given UV coords, much like the GLSL implementations of noise work. What I have "invented" so far is generating a height map from noise and then converting it to a normal map. However, all implementations of such a conversion that I have found rely on sampling the heightmap texture's neighbouring pixels to calculate normals. That is problematic in GLSL, since noise doesn't really have "pixels"; it's generated on the fly, per fragment, from the UV coords.

     So how would you approach this problem? I think it boils down to knowing the pixel size in UV coords, so that I can compute the neighbouring fragments' UV coords and calculate a normal per fragment. Is this realistic? Is it problematic performance-wise?

     Thanks, Denis.
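     On the "pixel size in UV coords" part: with the OES_standard_derivatives extension mentioned in the replies above, fwidth(vUv) gives exactly that, the UV footprint of one fragment, so the neighbour offset can adapt to any zoom level. A sketch; everything except fwidth itself is illustrative:

        #extension GL_OES_standard_derivatives : enable
        precision highp float;

        varying vec2 vUv;

        void main() {
            // fwidth = |dFdx| + |dFdy|: how far the UVs move across one
            // screen pixel, i.e. the "pixel size in UV coords"
            vec2 footprint = fwidth(vUv);
            float eps = 0.5 * max(footprint.x, footprint.y);
            // ...use eps as the offset for the height-difference samples;
            // the cost is about four extra noise evaluations per fragment
            gl_FragColor = vec4(vec3(eps * 100.0), 1.0); // debug view only
        }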
  4. I actually found an interesting solution: find the center of each field and translate it to screen coordinates, then render to a buffer and run a simple flood fill starting at the center of the field, bounded by the yellow wireframe pixels. That way I can account for every pixel in every face.

     But then I ran into this problem -> http://www.gamedev.net/topic/673088-missing-pixels-when-rendering-lines-with-webgl/
  5. I can, but then I need to know which face that particular pixel belongs to.
  6. Hey, all.

     I've got some weird behavior when rendering non-antialiased lines with WebGL. Basically I have a mesh, and I create a _gl.LINES buffer with one line per mesh edge. I should get continuous geometry, but when I zoom in I can clearly see missing pixels and unconnected lines. Most of the missing pixels are on the very edge of the geometry, with a few inside the geometry as well.
  7. "When constructing the landscape structure, what would restrict you from storing the land-type information right there, on every face/hex?"

     I generate the landscape as an image, using Perlin noise. My landscape mesh is also non-uniform: it's the surface of a sphere, and the hexagons are of non-uniform size too. I'm projecting the generated image onto the mesh's UV map, but I can't really interpolate between the image and the faces of the mesh; the mesh is too arbitrary in size, shape, etc.
  8. Basically I have a hexagonal mesh on the XY plane, upon which I draw a pseudo-randomly generated landscape.

     To decide which face is going to be water and which land, I count white pixels per face: if white pixels > black pixels, it's land; otherwise it's water.

     The way I do it right now is to render the buffer offscreen, and then for each pixel on the canvas I raycast to find which face the pixel belongs to, and sum up the pixels per face. The problem is that the canvas is 1000x700 pixels, and it takes AGES to raycast 700,000 pixels in JS.

     So the question is: is there a faster way to know which face is located at an arbitrary (x, y) pixel on the canvas, without raycasting the entire mesh to death?
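     A common alternative that avoids raycasting entirely, sketched in GLSL under the assumption of a per-vertex faceId attribute (with vertices not shared between faces): render the mesh offscreen once with each face's ID encoded in its colour, then map every canvas pixel to its face in a single pass.

        // vertex shader
        precision highp float;
        attribute vec3 position;
        attribute float faceId; // same value on all vertices of a face
        uniform mat4 modelViewMatrix;
        uniform mat4 projectionMatrix;
        varying float vFaceId;

        void main() {
            vFaceId = faceId;
            gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }

        // fragment shader: pack the ID into RGB (room for 2^24 faces)
        precision highp float;
        varying float vFaceId;

        void main() {
            float id = floor(vFaceId + 0.5);
            gl_FragColor = vec4(mod(id, 256.0),
                                mod(floor(id / 256.0), 256.0),
                                floor(id / 65536.0),
                                255.0) / 255.0;
        }

     With antialiasing off (so edge pixels never blend two IDs), one gl.readPixels call returns the whole buffer, and decoding id = r + 256 * g + 65536 * b while tallying land/water counts is a single linear pass over the 700,000 pixels.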
  9. Thanks for your input, guys.

     In any case, I'm going to implement this differently, with no need for ifs: I'll pass a map via a texture, sample it according to the UV coords, and then use the index I receive to sample the atlas at the corresponding location (sketched below).

     The only thing remaining is mipmaps, though.
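     A rough GLSL shape of that plan; the uniform names, the red-channel index encoding, and the square layouts are illustrative assumptions:

        precision highp float;

        uniform sampler2D indexMap;      // one texel per 1x1 UV square, tile index in red
        uniform sampler2D atlas;         // all tiles packed into a square grid
        uniform float atlasTilesPerSide; // e.g. 8.0 for an 8x8 atlas
        uniform float indexMapSize;     // e.g. 256.0 for a 256x256 index map
        varying vec2 vUv;

        void main() {
            vec2 cell = floor(vUv);  // which UV square this fragment is in
            vec2 local = fract(vUv); // position inside that square

            // fetch the tile index for this square (stored as index / 255)
            float index = floor(texture2D(indexMap, (cell + 0.5) / indexMapSize).r * 255.0 + 0.5);

            // corner of the tile in the atlas, then offset into the tile
            vec2 tile = vec2(mod(index, atlasTilesPerSide),
                             floor(index / atlasTilesPerSide));
            gl_FragColor = texture2D(atlas, (tile + local) / atlasTilesPerSide);
        }

     The mipmap caveat is real, though: fract() makes the UV derivatives jump at square borders, so mip selection will bleed across tiles there unless the atlas tiles are padded or the lookups are clamped.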
  10. That actually solved it! Thanks a lot! Could you perhaps link me to an article that explains why texture sampling shouldn't be done inside an if? Also, what other functionality should be avoided?
  11. Mipmaps are an issue I'll have to solve separately: since I'm going to use an atlas texture, I'll have to implement something to avoid pixel bleeding. I'm not there yet, though, far from it. I'm just trying to get basic code working, to see if it will work at all.
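     The simplest mitigation, assuming the atlas layout from the sketch above: inset every lookup by half a texel so bilinear filtering never reads a neighbouring tile (for deeper mip levels, duplicating each tile's border pixels in the atlas is the more robust fix). atlasSize, the atlas width in pixels, is an assumed parameter:

        // clamp an atlas lookup to at least half a texel inside its tile
        vec2 atlasUvClamped(vec2 tile, vec2 local,
                            float atlasTilesPerSide, float atlasSize) {
            float tileSpan = 1.0 / atlasTilesPerSide;
            float halfTexel = 0.5 / atlasSize;
            vec2 lo = tile * tileSpan + vec2(halfTexel);
            vec2 hi = (tile + 1.0) * tileSpan - vec2(halfTexel);
            return clamp((tile + local) * tileSpan, lo, hi);
        }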
  12. "Where are these artifacts?"

     See the images I linked above in my first post: the "desert" square has a border of weird pixels around it; you can also see them in the magnified image.

     And yes, I'm planning to have 40 different textures, or even more. This was just an example, so I passed 2 uniforms; the real implementation is going to be one big texture atlas that I'll sample from directly.
  13. I expect it to look as it does now, just without the artifacts on the seams.

     The idea is very simple: I have a UV-mapped mesh and a set of textures. Depending on the UV coords, I sample from different textures; each texture covers a UV square of size 1 x 1.

     See the image below; I hope it explains things better. Yellow lines denote faces; the red grid is the UV grid, where each square is 1 x 1. What I want is that for each square on the grid, the shader samples from a different texture. See the fragment shader code in the OP.
  14. Hi, all.

     I'm working on a terrain editor, and the idea so far is to have a set of textures I can place on each 1x1 square in UV coords. I wrote a simple shader that samples from 2 different textures depending on the UV coords, but for some reason, even though the textures should connect seamlessly, I'm getting strange artifact pixels on the seams. Now the really funny part: it works perfectly on a crappy Intel on-board video card, and doesn't work on my GTX500 with drivers up to date (!!!)

     For instance, on an arbitrary hexagonal surface I sample everything from the grass texture, except for one square, which is sampled from the desert texture.

     Magnified (where are those pixels sampled from, exactly? o.O)

     Vertex shader code:

        precision highp float;
        precision highp int;

        uniform mat4 modelViewMatrix;
        uniform mat4 projectionMatrix;

        attribute vec3 position;
        attribute vec2 uv;

        varying vec2 vUv;

        void main() {
            vUv = uv;
            vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
            gl_Position = projectionMatrix * mvPosition;
        }

     Fragment shader code:

        precision highp float;
        precision highp int;

        uniform vec3 diffuse;
        uniform float opacity;
        uniform sampler2D map;
        uniform sampler2D map1;

        varying vec2 vUv;

        void main() {
            gl_FragColor = vec4( diffuse, opacity );
            vec4 texelColor;
            if (vUv.x >= 0.0 && vUv.x < 1.0 && vUv.y >= 0.0 && vUv.y < 1.0) // <- sample from a different texture for that square
                texelColor = texture2D( map1, vUv );
            else
                texelColor = texture2D( map, vUv );
            gl_FragColor = gl_FragColor * texelColor;
        }

     JSFiddle link (sorry for the big chunk of code; the embedded textures take some space). The shader code is in the first 100 lines; everything below is THREE.js code to render the scene (scroll wheel: dolly in/out, left click: rotate, right click: pan) -> https://jsfiddle.net/denisve/aucxz90s/

     What am I doing wrong? The code is pretty straightforward: just sample from a different texture depending on the UV.

     Thanks in advance, Denis.
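     For completeness, the fix the replies above converged on, sketched as a reworked fragment shader: perform both texture2D calls unconditionally and select afterwards. Sampling inside a non-uniform if leaves the implicit derivatives used for mip selection undefined at the branch boundary, which plausibly explains why the Intel and NVIDIA drivers disagree.

        precision highp float;

        uniform vec3 diffuse;
        uniform float opacity;
        uniform sampler2D map;
        uniform sampler2D map1;
        varying vec2 vUv;

        void main() {
            // both fetches run for every fragment, so mip-level selection
            // never happens inside divergent control flow
            vec4 texel0 = texture2D( map, vUv );
            vec4 texel1 = texture2D( map1, vUv );

            // the original condition, expressed as a 0.0/1.0 selector
            float useMap1 = float(vUv.x >= 0.0 && vUv.x < 1.0 &&
                                  vUv.y >= 0.0 && vUv.y < 1.0);

            gl_FragColor = vec4( diffuse, opacity ) * mix(texel0, texel1, useMap1);
        }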