dpadam450

Member

  1. dpadam450

    Texture lagging

    Your shader never assigns output.tex. Not even sure what we're looking at with your model or the texture, probably because you never gave the pixel shader a valid texture coordinate.
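    A minimal sketch of what "give the pixel shader a valid texture coordinate" means, written as GLSL (the names here are mine, not from your code):

        #version 330 core
        uniform mat4 u_mvp;      // assumed model-view-projection matrix
        in vec3 in_position;
        in vec2 in_uv;
        out vec2 v_uv;           // the equivalent of output.tex
        void main() {
            v_uv = in_uv;        // if this assignment is missing, the sampler reads garbage
            gl_Position = u_mvp * vec4(in_position, 1.0);
        }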
  2. Let's look at your portfolio real quick. Metal Deer? What is this? I press a button and random objects move? Kevo? I can jump onto platforms with an underlying game engine that already supports physics and assets? These 'games' look like basic framework demos anyone could make in about 9 minutes. If you want to stand out, make something cool that you can actually talk about: what you built and how. These demos are not impressive. The site also says "We", so I don't know who worked on what. Your site's initial navigation is confusing and not personal to you; I'm not even sure what it really is. The simplest way to get a job is to make a decent game demo and put up a video of it. Nobody will want to download and play the games themselves.
  3. Have you tried the full pipeline with a single triangle or quad with a texture on it? Rotate the UVs 90 degrees and see if it still works... You have to debug this step by step. There are lots of things that could go wrong, so start with something very simple. As stated, we don't know what we are looking at. Get a simple normal map and simple geometry and figure it out. I've had to do the same kinds of things in my own project.
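    The 90-degree rotation test can be a one-line change in the fragment shader; a sketch, assuming a u_diffuse sampler and a v_uv input (both names are hypothetical):

        #version 330 core
        uniform sampler2D u_diffuse;   // hypothetical sampler name
        in vec2 v_uv;
        out vec4 fragColor;
        void main() {
            vec2 uv = vec2(1.0 - v_uv.y, v_uv.x);  // UVs rotated 90 degrees
            fragColor = texture(u_diffuse, uv);
        }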
  4. You can copy the final image to a buffer. Render the water plane with its four corners converted to screen space so that it lines up with that texture exactly, then run a distortion pass to distort the underlying objects. This can be as simple as a black-and-white noise texture: offset the current pixel's lookup by the noise amount. As long as you keep depth testing on, it will only distort pixels where the water renders. Honestly, it depends on what technique you want to mimic. I would look at some games or YouTube water-rendering demos and pick whichever one you like that looks easiest.
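    A minimal sketch of the noise-offset idea as a fullscreen fragment shader, assuming the copied final image and a grayscale noise texture are bound (all names are illustrative):

        #version 330 core
        uniform sampler2D u_sceneColor;  // copy of the final image
        uniform sampler2D u_noise;       // black-and-white noise texture
        uniform float u_time;            // scroll the noise so the water animates
        uniform float u_strength;        // distortion amount, e.g. 0.02
        in vec2 v_uv;
        out vec4 fragColor;
        void main() {
            // offset the current pixel's lookup by the noise amount
            float n = texture(u_noise, v_uv * 4.0 + vec2(u_time * 0.05)).r;
            vec2 offset = vec2(n - 0.5) * u_strength;
            fragColor = texture(u_sceneColor, v_uv + offset);
        }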
  5. Since a one-pixel peak is a point, not a surface, you should perform the surface-normal calculation I described. If you want the peak's normal to accurately point at the sky, average the normals of the surrounding pixels. That's how per-vertex normals are calculated for a mesh; the same thing applies here.
  6. First, understand that an engine can be a simple while() loop that polls for input, moves an object, and draws it. You can build on that, continually adding new features, and refactor when you need to. I wouldn't worry about finding the perfect organization up front. I could post a long reply, but if you are starting your own engine from scratch, just implement the features you need or want to work on. Get it working and keep building new features. A minimal sketch of that starting loop is below.

     From a more advanced standpoint, the renderer typically runs on its own thread. It stores positions/locations separately from the ones the gameplay/physics side has. When a new rock is spawned that doesn't move, you could tell the renderer "here is an object, the 3D mesh it belongs to, and its location". For something like a movable character, the gameplay/physics thread will have some kind of pointer/handle so it can tell the render thread "hey, that object, this is where it is now". Things like that.

     Render class, Camera class; the Camera could do the rendering, and you can render shadows from a camera or the main player's view from a camera. You just have to start somewhere. If you just want to learn rendering tricks and shaders, then work towards that.
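     To make the first point concrete, a bare-bones sketch of that while() loop in C++; PollInput() and Draw() are stand-ins for whatever your platform layer provides:

        #include <cstdio>

        struct Player { float x = 0.0f; };

        // stand-ins for a real platform/input layer
        struct Input { bool quit; float moveDir; };
        Input PollInput() { return { true, 1.0f }; }  // stub: quits after one frame
        void Draw(const Player& p) { std::printf("player at x = %f\n", p.x); }

        int main() {
            Player player;
            const float speed = 5.0f, dt = 1.0f / 60.0f;
            bool running = true;
            while (running) {                         // the entire "engine" to start with
                Input in = PollInput();               // poll for input
                if (in.quit) running = false;
                player.x += in.moveDir * speed * dt;  // move an object
                Draw(player);                         // draw it
            }
            return 0;
        }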
  7. Sorry, that should be: (right - current) gives one vector, (upper - current) gives the other. Cross the two.
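     As a GLSL sketch, assuming a heightmap sampler, a texel size, and a y-up world (all names are mine):

        uniform sampler2D u_heightmap;
        uniform vec2  u_texelSize;     // 1.0 / heightmap resolution
        uniform float u_heightScale;   // world height per heightmap unit

        vec3 heightmapNormal(vec2 uv)
        {
            float h  = texture(u_heightmap, uv).r;                             // current
            float hR = texture(u_heightmap, uv + vec2(u_texelSize.x, 0.0)).r;  // right
            float hU = texture(u_heightmap, uv + vec2(0.0, u_texelSize.y)).r;  // upper
            vec3 right = vec3(1.0, (hR - h) * u_heightScale, 0.0);  // (right - current)
            vec3 upper = vec3(0.0, (hU - h) * u_heightScale, 1.0);  // (upper - current)
            return normalize(cross(upper, right));  // order chosen so +y comes out up
        }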
  8. Not sure what I am looking at. I can see the shadow is wrong, but what is the giant beam going up? It looks like a strange height map or something. Also odd: you are calculating normals without using the actual current pixel, just the surrounding ones. I would take the current pixel, sample only the right pixel and the upper pixel, and use those to calculate the normal and tangents. I think if you cross the right sample with the upper sample, you will get a more proper normal.
  9. I didn't down-vote, but UVs can go to the edge and beyond. Again, there is no limit to UV space. It just wraps around unless you request clamp-to-edge.
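     In OpenGL terms, the two behaviors look like this (standard texture-parameter calls):

        // default: UVs outside [0,1] wrap around
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

        // or: request clamping instead
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);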
  10. dpadam450

    Distance Fields

    I guess if you generate them per instance and pre-compute the distances offline, it would make sense. Each instance would have a unique 3D volume.
  11. dpadam450

    Distance Fields

    I briefly looked at some of the material. What I'm wondering is, for a screen-space pixel: how does a pixel determine which distance-field volumes overlap it and need to be raycast against for the AO? Does it have a reference to all DistanceField volumes and run a compute shader with a BSP tree or something?
  12. dpadam450

    Distance Fields

    http://advances.realtimerendering.com/s2015/DynamicOcclusionWithSignedDistanceFields.pdf I get the idea they are using for the distance fields: they are essentially a 3D texture/volume with varying resolution per object. But how are they used/rendered? Are the cubes rendered? How/when/where does a shader evaluate them? Is it a fullscreen shader?
  13. dpadam450

    oscillating a 2D object in opengl

    glVertex3f(x1 + r1 * cos(t), y1 + r1 * sin(t), 0.0); You may want to read up a bit more on OpenGL and how transforms work: glLoadIdentity(), glRotatef(), glScalef(). You shouldn't be recalculating vertices this way.
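    A sketch of the transform-based version using those fixed-function calls (angleDegrees and drawObject() are placeholders):

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(x1, y1, 0.0f);                 // center of the oscillation
        glRotatef(angleDegrees, 0.0f, 0.0f, 1.0f);  // animate this angle each frame
        glTranslatef(r1, 0.0f, 0.0f);               // radius; ends up at (x1 + r1*cos, y1 + r1*sin)
        drawObject();                               // fixed, untransformed vertex data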
  14. Trying to get texture buffer objects working. I'm planning on updating a buffer every frame with instance matrices; for now I have stored a single color. I stripped my code down to a simple example. Right now the color I am getting is black. I'm wondering if there is something dumb I need to know about TBOs. I have zero glGetError() issues. The buffer should definitely contain data, so I wonder if there isn't something wrong with how I bind the texture. ***I missed it in my example, but I am calling glUniform1i(glGetUniformLocation(program, "instanceMatrixBuffer"), 11); to bind texture unit 11 to the sampler in the shader.

        glGenVertexArrays(1, &VAO_Handle);
        glBindVertexArray(VAO_Handle);
        glBindBuffer(GL_ARRAY_BUFFER, VBO_Handle);
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void*)this->m_vboLayout.positionDataOffset);
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, VBO_Index_Handle);
        glBindVertexArray(0);

        // ...later
        glBindVertexArray(VAO_Handle);
        static bool doOnce = true;
        if (doOnce)
        {
            doOnce = false;
            // create and fill the buffer backing the TBO
            glGenBuffers(1, &TBO_Buffer_Handle);
            glBindBuffer(GL_TEXTURE_BUFFER, TBO_Buffer_Handle);
            float data[4] = {1.0, 0.0, 1.0, 1.0};
            glBufferData(GL_TEXTURE_BUFFER, sizeof(float) * 4, data, GL_STATIC_DRAW);
            // create the texture and attach the buffer to it
            glGenTextures(1, &TBO_Texture_Handle);
            glBindTexture(GL_TEXTURE_BUFFER, TBO_Texture_Handle);
            glTexBuffer(GL_TEXTURE_BUFFER, GL_RGBA32F, TBO_Buffer_Handle);
        }
        glActiveTexture(GL_TEXTURE11);
        glBindTexture(GL_TEXTURE_BUFFER, TBO_Texture_Handle);
        glDrawElementsInstanced(GL_TRIANGLES, meshToRender->num_faces * 3, GL_UNSIGNED_INT, 0, instanceCount);

     GLSL:

        vec4 Color = texelFetch(instanceMatrixBuffer, 0);
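     For reference, a sketch of the matching GLSL declaration, assuming this is the part that was cut from the paste:

        #version 330 core
        uniform samplerBuffer instanceMatrixBuffer;  // bound to texture unit 11 via glUniform1i
        out vec4 fragColor;
        void main() {
            vec4 Color = texelFetch(instanceMatrixBuffer, 0);  // first (and only) RGBA32F texel
            fragColor = Color;
        }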
  15. dpadam450

    Contract Work

    He suggested that nobody can live off 100K. I was just stating that not every job is a high-paying software job. I mean, you have experience in Seattle as a game dude. Did you start at 100K? Just curious.