
Arjan B

Member

  • Content Count: 115
  • Joined
  • Last visited

Community Reputation: 1136 Excellent

About Arjan B

  • Rank: Member

Personal Information

  • Interests: Programming

  1. Thanks for the suggestions! Sadly, setting iterator debug level to 0 made no difference. Seems that range checking on the iterators is not the bottleneck.
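     For context, a minimal sketch of what "setting iterator debug level to 0" means on MSVC; how it was actually configured for the test above is an assumption:

     // MSVC-specific: disables checked/debug iterators in a Debug build.
     // Must appear before any standard library header and be identical in every
     // translation unit (often set project-wide via /D_ITERATOR_DEBUG_LEVEL=0 instead).
     #define _ITERATOR_DEBUG_LEVEL 0
     #include <vector>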
  2. I started working on a raytracer project (again) and ran into a problem when compiling a Debug configuration. All I have at the moment is a set of pixels in float RGB format that I convert to unsigned char RGBA format that SFML wants. This happens once per frame, running at 200+ FPS in Release mode, but 1 FPS in Debug mode. Please have a look at the attached profiling result. It seems to spend almost all of its time in std::vector::push_back(). Is there any way to speed up this process? Could I create all elements in a batch and then start filling in values? Is there some handy use of std::transform that I could apply? Thank you in advance!

     std::vector<sf::Uint8> SfmlFilm::ToRgba(const std::vector<Spectrum>& image)
     {
         std::vector<sf::Uint8> rgba;
         rgba.reserve(GetWidth() * GetHeight() * 4);
         for (auto spectrum : image)
         {
             const auto max = 255;
             rgba.push_back(static_cast<sf::Uint8>(spectrum.r * max));
             rgba.push_back(static_cast<sf::Uint8>(spectrum.g * max));
             rgba.push_back(static_cast<sf::Uint8>(spectrum.b * max));
             rgba.push_back(max);
         }
         return rgba;
     }
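     A minimal sketch of the "create all elements in a batch, then fill in the values" idea from the post above, assuming Spectrum is a simple struct with float r, g, b in [0, 1] (the struct layout is an assumption, not taken from the project). In a Debug build every push_back() still runs its own checks, so pre-sizing the vector once and writing by index avoids most of that per-element work:

     std::vector<sf::Uint8> SfmlFilm::ToRgba(const std::vector<Spectrum>& image)
     {
         // One allocation and one size change up front, no per-element push_back().
         std::vector<sf::Uint8> rgba(image.size() * 4);
         for (std::size_t i = 0; i < image.size(); ++i)
         {
             const auto& s = image[i];
             rgba[i * 4 + 0] = static_cast<sf::Uint8>(s.r * 255);
             rgba[i * 4 + 1] = static_cast<sf::Uint8>(s.g * 255);
             rgba[i * 4 + 2] = static_cast<sf::Uint8>(s.b * 255);
             rgba[i * 4 + 3] = 255;
         }
         return rgba;
     }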
  3. Arjan B

    'Remove' direction from velocity

    Just to add to Alvaro's comment: projection of A onto B gives you all of A in the direction of B. This is why he/she subtracts that projection from A. The Wikipedia page on vector projection calls this the rejection of A from B: https://en.wikipedia.org/wiki/Vector_projection
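    A minimal sketch of the projection/rejection split described above; Vec2, dot() and the arithmetic operators are assumed helpers for illustration:

    // All of A in the direction of B (the projection) ...
    Vec2 projection = (dot(a, b) / dot(b, b)) * b;
    // ... and what is left of A once that direction is removed (the rejection).
    Vec2 rejection = a - projection;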
  4. Wow, thanks a lot guys!

     I ended up doing what Juliean said, which brought me right to the glUniform1f() function. And exactly as Nanoha stated, it generated a GL_INVALID_OPERATION error, which was fixed by simply replacing the 'f' with an 'i'. Wish I'd posted here sooner, I spent tons of hours sadly staring at my screen as well.

     Thanks again!
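     A minimal sketch of the fix described above, using the names from the question quoted in the next post: a sampler uniform stores a texture unit index, so it has to be set with the integer variant.

     glUseProgram(shaderProgram);
     // glUniform1f() here raised GL_INVALID_OPERATION; samplers take an integer unit index.
     glUniform1i(glGetUniformLocation(shaderProgram, "tf0"), 17);  // texture unit GL_TEXTURE17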
  5. Hi! I first create a lookup-table for a transfer function and then try to upload it as a 1D texture as follows:

     for (unsigned i = 0; i < 1024; i++)
         tfm0[i] = qfeGetProbability(tf0, (float)i / 1023.f);

     glActiveTexture(GL_TEXTURE17);
     if (glIsTexture(tfmTex0))
         glDeleteTextures(1, &tfmTex0);
     glGenTextures(1, &tfmTex0);
     glBindTexture(GL_TEXTURE_1D, tfmTex0);
     glTexImage1D(GL_TEXTURE_1D, 0, GL_R16F, 1024, 0, GL_RED, GL_FLOAT, tfm0);

     Right before the rendering call I make sure all the textures I need are bound to the right texture units:

     glActiveTexture(GL_TEXTURE17);
     glBindTexture(GL_TEXTURE_1D, tfmTex0);

     Then I set my uniform variable for the 1D texture:

     glUniform1f(glGetUniformLocation(shaderProgram, "tf0"), 17);

     And this is how the 1D texture is defined in the fragment shader, where sampleNorm is a value between 0 and 1:

     uniform sampler1D tf0;
     vec4 tfValue = texture1D(tf0, sampleNorm);

     Somehow, all of the tfValues end up being (0, 0, 0, 1), which I suspect is a default fallback value.

     To be sure that I uploaded the values to the graphics card correctly, I also have this check right before the draw call:

     float values[1024];
     glActiveTexture(GL_TEXTURE17);
     glGetTexImage(GL_TEXTURE_1D, 0, GL_RED, GL_FLOAT, values);

     It retrieves the values in the texture I uploaded back to "normal" memory, and they turn out to be exactly the values I expect them to be.

     Does anyone have an idea of where things might be going wrong? What would cause the sampler in the fragment shader to return (0, 0, 0, 1), when it should be returning my values in the R-channel?

     Thank you in advance, Arjan
  6. Arjan B

    Is ray tracing hard or is it just me?

    I think it's appropriate here to link to Bacterius' journal: http://www.gamedev.net/blog/2031-ray-tracing-devlog/. I think he does a good job at thoroughly explaining the process of writing a raytracer.
  7. Arjan B

    The Rendering Equation

    Loving this blog. Keep up the good work!
  8. Depth of field

     My friend had added depth of field to the path tracer, which shows some nice results.

     Without DoF:

     With DoF:

     This effect was achieved by picking a focal point on the focal plane for every pixel, and then jittering our camera rays to go through this focal point.

     Finished report

     After some significant revisions of our two reports for the two subjects for which we did this project, we are finally finished. I feel like I've learned an awful lot more about rendering, mostly due to looking at it from a different angle than the approach I'm used to (rasterization). Working on this project has been a joy for me and I'm happy with the results.

     Having finished the report does not mean that we're finished with this project. We do intend to find some time to add more features, but, in reality, time might be sparse. My interests have shifted to learning how to implement these kinds of effects (AA, DoF, GI) in a rasterization setting.

     I hope people enjoyed having a look at this series of blog posts. Maybe there'll be more. Thanks for reading!
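     A minimal sketch of the focal-point jitter described above; Ray, Vec3, dot(), normalize(), randInLensDisk() and the camera parameters are illustrative assumptions, not the project's actual code:

     // Depth of field: keep the point where the primary ray crosses the focal plane,
     // but start the new ray from a jittered position on the lens/aperture.
     Ray MakeDofRay(const Ray& primary, const Vec3& camPos, const Vec3& camDir,
                    float focalDistance, float apertureRadius)
     {
         // Assumes primary.origin == camPos and normalized directions.
         float t = focalDistance / dot(primary.direction, camDir);
         Vec3 focalPoint = primary.origin + t * primary.direction;

         // randInLensDisk() is assumed to return a random offset within the lens plane.
         Vec3 newOrigin = camPos + apertureRadius * randInLensDisk();
         return Ray{ newOrigin, normalize(focalPoint - newOrigin) };
     }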
  9. For a school project, I'm implementing SPH fluid simulation according to the 2003 paper by Müller et al., "Particle-Based Fluid Simulation for Interactive Applications". I've implemented the calculation of density and pressure, the pressure forces, viscosity forces and gravity. Now that I'm adding a bounding box, things start going wrong.

     My response to a collision is to move the particle back to the contact point, reflect its velocity around the normal of the box at the contact point, and damp the magnitude a bit by some bounce factor.

     Now I have the following scenario. Two particles, p2 above p1, start by floating somewhere in the bounding box. They are too far away from each other for the pressure or viscosity forces to work on them, so gravity starts pulling them both down. p1 reaches the bottom of the bounding box, bounces a little bit and then stays on the bottom. p2 is still too far away for pressure/viscosity forces and then, within one timestep, it hits the bounding box as well and is placed at the bottom. Now p1 and p2 are incredibly close to each other, causing the pressure force to be incredibly large. This makes the particles propel away from each other with extreme speed.

     What kind of solution would you suggest? Just decrease the timestep? Use penalty forces instead of projection?

     Thanks in advance!
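     A minimal sketch of the collision response described above (move back to the contact point, reflect around the normal, damp); Particle, Vec3, dot() and the bounce factor are illustrative assumptions:

     void ResolveBoxCollision(Particle& p, const Vec3& contactPoint, const Vec3& normal,
                              float bounce)
     {
         // Put the particle back on the boundary.
         p.position = contactPoint;
         // Reflect the velocity around the surface normal: v' = v - 2(v.n)n ...
         Vec3 reflected = p.velocity - 2.0f * dot(p.velocity, normal) * normal;
         // ... and damp it a bit so some energy is lost on each bounce.
         p.velocity = bounce * reflected;
     }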
  10. Arjan B

    Beginning GLSL - Quick Question

    Good luck! This online book helped me greatly: http://www.arcsynthesis.org/gltut/index.html
  11. Arjan B

    Beginning GLSL - Quick Question

    You calculate your view matrix and you calculate your projection matrix. Instead of telling your shader "here are both of them", you simply multiply them once on the CPU and tell your shader "here is the view*projection matrix". There's no need to work out how to calculate both of them in one go.

    Since the result of that matrix multiplication is the same for the whole draw call anyway, you might as well do it just once on the CPU, instead of doing it again and again for every vertex in the shader.

    Yes, every time you update your view matrix, you will have to multiply it with the projection matrix again and then feed that to your shaders.
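    A minimal sketch of the "multiply once on the CPU" idea, assuming GLM and a uniform named "viewProjection" (both are assumptions for illustration, not from the post):

    // Combine once on the CPU ...
    glm::mat4 viewProjection = projection * view;  // GLM/OpenGL convention: projection * view
    // ... then upload the single combined matrix and draw.
    glUseProgram(shaderProgram);
    glUniformMatrix4fv(glGetUniformLocation(shaderProgram, "viewProjection"),
                       1, GL_FALSE, glm::value_ptr(viewProjection));
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);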
  12. Arjan B

    Beginning GLSL - Quick Question

    Well, uniform variables in GLSL remain constant over a drawing call, like glDrawArrays() for example. So by a separate call, I meant that you set your uniform matrix for drawing the background tiles, draw them, then set the uniform matrix for your character and draw that one.

    If you were to use a different shader for drawing your character, one that uses an extra matrix that you don't want in the drawing of your background tiles, then you would have to create a separate shader and program for it. You then just bind the right program right before you draw.

    Now, my initial suggestion of using an extra matrix might be overkill for simple quads (I assume you draw 2D sprites on quads). If all your sprites need is different positions, then you could use some uniform offset/position variable: for each character, first set the uniform position variable, then draw that character. This might be very similar to what you are doing now, I guess.

    I'd like to add that I'm by no means an expert, I'm just putting my thoughts out there.
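    A minimal sketch of the per-sprite uniform idea above: set the offset, draw, repeat. The uniform name "spriteOffset", the Sprite struct and the sprite list are assumptions for illustration:

    glUseProgram(spriteProgram);
    GLint offsetLoc = glGetUniformLocation(spriteProgram, "spriteOffset");
    for (const Sprite& s : sprites)
    {
        // The uniform stays constant for this one draw call ...
        glUniform2f(offsetLoc, s.x, s.y);
        // ... which draws a single quad at that position.
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    }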