dietrich

Member
  • Content count: 22
  • Joined
  • Last visited
  • Community Reputation: 770 Good

About dietrich

  • Rank
    Member

Personal Information

  • Role
    Programmer
  • Interests
    Art
    Design
    Programming


  1. Hi, Hashbrown! The math actually looks fine to me, are you sure there's a problem? I've always thought that this kind of distortion is to be expected from a perspective projection. Something to do with the fact that we're projecting a 3D scene onto a 2D plane. In the second image the spheres are closer to the edge of the screen, which makes the distortion more apparent, but they aren't perfectly circular in the first image either. Here's an illustration: https://en.wikipedia.org/wiki/File:Fig_3._After_pivoting_object_to_remain_facing_the_viewer,_the_image_is_distorted,_but_to_the_viewer_the_image_is_unchanged..png Shaarigan, could you please explain what the a and b coefficients in your implementation do? I'm using basically the same matrix as Hashbrown, so I'm guessing yours is a more general form? I mean, if left == -right and top == -bottom, then both a and b are 0, yielding the same matrix as Hashbrown has. Setting them to different values results in some kind of skewed/asymmetrical frustum?
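    For reference, a minimal sketch of an off-center, glFrustum-style perspective matrix in plain C++ (my assumptions: column-major storage, column-vector convention, clip-space z in [-1, 1]; Shaarigan's layout may well differ). The a and b terms vanish when left == -right and top == -bottom:

        struct Mat4 { float m[16]; };  // column-major storage, m[col * 4 + row]

        // Off-center perspective projection; l/r/b/t describe the near-plane window.
        Mat4 FrustumMatrix(float l, float r, float b, float t, float n, float f)
        {
            Mat4 M = {};
            M.m[0]  = 2.0f * n / (r - l);
            M.m[5]  = 2.0f * n / (t - b);
            M.m[8]  = (r + l) / (r - l);      // "a": zero for a symmetric frustum
            M.m[9]  = (t + b) / (t - b);      // "b": zero for a symmetric frustum
            M.m[10] = -(f + n) / (f - n);
            M.m[11] = -1.0f;
            M.m[14] = -2.0f * f * n / (f - n);
            return M;
        }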
  2. That's strange, we do the same thing in our project (assign a UStaticMeshComponent to RootComponent), and it works fine. UStaticMeshComponent inherits from USceneComponent, so such an assignment is valid. Does it work if you do an explicit cast? RootComponent = (USceneComponent*)PickupMesh; I don't know if it has anything to do with the problem, but it looks like you're using an older version of Visual Studio (2013?), while they recommend VS2017 for UE4.18 (although we use VS2015 without any issues). UPDATE: I guess it doesn't immediately follow that the assignment of a UStaticMeshComponent* to a USceneComponent* is valid, only that a cast from UStaticMeshComponent* to USceneComponent* is. And unfortunately I'm not the right person to explain whether such a cast will happen implicitly, and what kind of constructor you need in order for it to happen.
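    For what it's worth, in plain C++ a pointer to a publicly derived class converts to a pointer to its base class implicitly, with no cast or special constructor needed. A tiny sketch with hypothetical stand-in types (not the actual UE4 classes):

        struct SceneComponentLike {};                            // stand-in for USceneComponent
        struct StaticMeshComponentLike : SceneComponentLike {};  // stand-in for UStaticMeshComponent

        int main()
        {
            StaticMeshComponentLike* mesh = new StaticMeshComponentLike();
            SceneComponentLike* root = mesh;  // implicit upcast, valid for public inheritance
            (void)root;
            delete mesh;
            return 0;
        }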
  3. All you need is a unit-length normal; you can just add v_Normal = normalize(v_Normal); to your fragment shader :)
  4. Ah, well, that happens too, glad to hear it's fixed :) One more thing: you're currently using v_Normal as is, but you really want to normalize it again in the fragment shader before doing any computations with it (the tutorial seems to be missing this step). Each fragment receives an interpolated normal, and a linear interpolation of unit vectors is not necessarily a unit vector itself; here's a nice illustration (image via google), and a quick numeric check below:
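    The check, in plain C++ (my own example, not from the tutorial): the average of two unit vectors is generally shorter than unit length, which is exactly what happens to the normal between vertices.

        #include <cmath>
        #include <cstdio>

        int main()
        {
            // Two unit vectors 90 degrees apart.
            float ax = 1.0f, ay = 0.0f, az = 0.0f;
            float bx = 0.0f, by = 1.0f, bz = 0.0f;

            // Linear interpolation at t = 0.5, i.e. what the rasterizer hands the
            // fragment halfway between the two vertices.
            float mx = 0.5f * (ax + bx), my = 0.5f * (ay + by), mz = 0.5f * (az + bz);
            float len = std::sqrt(mx * mx + my * my + mz * mz);

            std::printf("length of interpolated normal: %f\n", len);  // ~0.707, not 1.0
            return 0;
        }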
  5. Then I'd suggest simplifying things a bit by moving the lighting computations into world space, and seeing if that changes anything. v_Position will then become v_Position = vec3(worldMatrix * object_space_pos); and the light position will remain unchanged, v_LightPos = u_LightPos;
  6. No problem, @lonewolff. That looks interesting :) worldMatrix is an identity matrix at the moment, correct?
  7. Hi, lonewolff! My initial guess would be that your v_Position and v_LightPos are in different coordinate spaces, which makes the distance and light direction computations meaningless. v_Position is in view space: v_Position = vec3(WV * object_space_pos); Can you make sure that you also transform v_LightPos into view space somewhere in your app? Does it help if you comment out the attenuation? diffuse_light = diffuse_light * (1.0 / (1.0 + (0.25 * dist * dist))); Also, probably just a formality, but directional lighting doesn't use a light position or attenuation; it's an approximation for the case when the light source is very far away relative to the scale of the scene, e.g. the sun illuminating a building. In that case we assume that the light rays all travel in the same direction and that the light's intensity falloff with distance is negligible. Point lighting would be a better name for what you have here.
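    To illustrate the point about spaces, a small sketch in plain C++ (my own example, mirroring the attenuation term in your shader): both positions have to be expressed in the same space, here view space, before the distance and the falloff mean anything.

        #include <cmath>

        struct Vec3 { float x, y, z; };

        // Both arguments must already be in view space (or both in world space);
        // mixing spaces is what makes the result meaningless.
        float Attenuation(const Vec3& fragPosView, const Vec3& lightPosView)
        {
            float dx = lightPosView.x - fragPosView.x;
            float dy = lightPosView.y - fragPosView.y;
            float dz = lightPosView.z - fragPosView.z;
            float dist = std::sqrt(dx * dx + dy * dy + dz * dz);

            // Same falloff as in the shader: 1 / (1 + 0.25 * d^2).
            return 1.0f / (1.0f + 0.25f * dist * dist);
        }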
  8. dietrich

    Horrible texture/object popping

    While not exactly blending, several modern games do try to smooth out LOD popping by using a dithering-like effect. Here's a blog post on how Assassin's Creed 3 does it; Far Cry 4 uses a similar effect, and maybe GTA V too?
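    A rough sketch of the dithering idea (often called a screen-door or stippled cross-fade), written here as plain C++ for illustration; in practice the test lives in the pixel shader and the fade factor is driven by the distance to the LOD switch point. This is my own sketch, not the exact technique from the blog post:

        #include <cstdint>

        // 4x4 Bayer threshold matrix, values in [0, 1).
        static const float kBayer4x4[4][4] = {
            { 0.0f / 16.0f,  8.0f / 16.0f,  2.0f / 16.0f, 10.0f / 16.0f},
            {12.0f / 16.0f,  4.0f / 16.0f, 14.0f / 16.0f,  6.0f / 16.0f},
            { 3.0f / 16.0f, 11.0f / 16.0f,  1.0f / 16.0f,  9.0f / 16.0f},
            {15.0f / 16.0f,  7.0f / 16.0f, 13.0f / 16.0f,  5.0f / 16.0f},
        };

        // Returns true if the incoming LOD should keep this pixel; the outgoing LOD
        // uses the inverted test, so the two levels appear to cross-dissolve instead
        // of popping.
        bool KeepPixel(uint32_t x, uint32_t y, float fade /* 0 = old LOD only, 1 = new LOD only */)
        {
            return fade > kBayer4x4[y % 4][x % 4];
        }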
  9. I believe font height isn't actually the height of the capital letters; it's the entire vertical extent of the font (consider, e.g., the glyphs "Ć" and "g"). So the height of the glyph "C" will be less than 32; in your bitmap it's roughly 18 pixels. If you set the height to 18, it should fix the vertical stretching. Then there's the problem of the rendering being too small. Will it help if I point out that it's exactly half the size of what you expect? :)
  10. dietrich

    Is Sublime Text a valid option for C++ development?

    I use Sublime for hobby projects (C-style C++ and shaders mostly) and I find it much more comfortable, fluid and frustration-free than Visual Studio's built-in text editor, which I use at work. For me that was enough to justify having to set up command-line compilation. Agree 100%, but using an external text editor doesn't really prevent one from debugging in VS, so why not enjoy the benefits of both? One can always edit code in Sublime, compile, and Alt-Tab into Visual Studio to do some debugging.
  11. First you would load your image into an array of bytes representing the pixels of the image, essentially a bitmap. Then you can manipulate this array any way you like, including extracting portions of it to form a new texture. A pixel at location {x, y} would be addressed as data[y*imageWidth + x]. To load a bitmap you could write your own parser by looking at the specs of a specific file format (BMP is fairly straightforward to load, other formats are more challenging), or you could save yourself some time and use a library that does it for you. I prefer stb_image, it's lightweight and easy to use. After that it's simply a matter of using the DirectX API to initialize a Texture2D with your data. IIRC, you can pass a pointer to your bitmap as the pSysMem member of D3D11_SUBRESOURCE_DATA when calling ID3D11Device::CreateTexture2D (see the sketch below).

    Another option would be to preprocess your font into a set of textures, one per character. Again, stb_truetype by the very same Sean Barrett could do that for you. Yet another option is to use a single font texture and appropriate UV coordinates to draw the portion of the texture containing the desired character. Personally, I would go with this option (I have tried both recently, and having a single texture just meant that much less bookkeeping, although it may well be different with your project), and since you already know the texcoords of each character in your texture, it shouldn't be too hard to implement.
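    A rough sketch of the stb_image + CreateTexture2D path (my own code, error handling omitted; device is assumed to be a valid ID3D11Device*, and the define belongs in exactly one .cpp file):

        #define STB_IMAGE_IMPLEMENTATION
        #include "stb_image.h"
        #include <d3d11.h>

        ID3D11Texture2D* LoadTexture(ID3D11Device* device, const char* path)
        {
            int width = 0, height = 0, channels = 0;
            // Force 4 channels so the pixel data matches an RGBA8 texture format.
            unsigned char* pixels = stbi_load(path, &width, &height, &channels, 4);
            if (!pixels) return nullptr;

            D3D11_TEXTURE2D_DESC desc = {};
            desc.Width            = width;
            desc.Height           = height;
            desc.MipLevels        = 1;
            desc.ArraySize        = 1;
            desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
            desc.SampleDesc.Count = 1;
            desc.Usage            = D3D11_USAGE_IMMUTABLE;
            desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

            D3D11_SUBRESOURCE_DATA initData = {};
            initData.pSysMem     = pixels;      // pointer to the bitmap
            initData.SysMemPitch = width * 4;   // bytes per row

            ID3D11Texture2D* texture = nullptr;
            device->CreateTexture2D(&desc, &initData, &texture);

            stbi_image_free(pixels);
            return texture;
        }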
  12. Not an expert on the topic myself, but here is a series of blog posts you may find useful if you haven't seen it yet: link.
  13. dietrich

    XMMatrixRotationRollPitchYaw

    Matrices m1 and m3 are more or less equal, as expected, and look something like this if you print them out:

    |-0.00  0.00 -1.00  0.00|
    | 0.71  0.71 -0.00  0.00|
    | 0.71 -0.71 -0.00  0.00|
    | 0.00  0.00  0.00  1.00|

    DirectXMath uses the row-vector convention under the hood, so you would transform a vector like so: v*M1*M2*M3. The matrix concatenation order is then M1*M2*M3. Since XMMatrixRotationRollPitchYaw rotates first around X, then around Y, your m3 matrix holds the expected transform. Here is a great article about the row- vs column-vector conventions: link.

    Now to that "more or less" part. Transforming a vector by either m1 or m3 should produce visually indistinguishable results; however, these matrices aren't equal down to their binary representation. That's easy to see if we inspect the memory where the matrices are located, or even simply add a few decimal places to the printout:

    m1
    | 0.000000089 -0.000000030 -0.999999821  0.000000000|
    | 0.707106709  0.707106829 -0.000000089  0.000000000|
    | 0.707106650 -0.707106709  0.000000119  0.000000000|
    | 0.000000000  0.000000000  0.000000000  1.000000000|

    m3
    |-0.000000119  0.000000000 -0.999999881  0.000000000|
    | 0.707106709  0.707106709 -0.000000084  0.000000000|
    | 0.707106650 -0.707106769 -0.000000084  0.000000000|
    | 0.000000000  0.000000000  0.000000000  1.000000000|

    I believe that is because m1 and m3 were constructed differently and hence went through a different number of different floating-point operations. XMMatrixRotationRollPitchYaw doesn't actually use matrix multiplication at all; it goes through an intermediate quaternion representation to compute the resulting rotation matrix. And floating-point operations tend to accumulate error, which eventually leads to slightly different results.
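    To reproduce the comparison, a minimal sketch with DirectXMath (my own code; the 45-degree pitch and 90-degree yaw are assumptions that match the printout above):

        #include <DirectXMath.h>
        #include <cstdio>
        using namespace DirectX;

        int main()
        {
            const float pitch = XM_PIDIV4;  // 45 degrees around X, applied first
            const float yaw   = XM_PIDIV2;  // 90 degrees around Y, applied second

            // Row-vector convention: v * Mx * My applies the X rotation first.
            XMMATRIX m1 = XMMatrixMultiply(XMMatrixRotationX(pitch), XMMatrixRotationY(yaw));
            XMMATRIX m3 = XMMatrixRotationRollPitchYaw(pitch, yaw, 0.0f);

            XMVECTOR v  = XMVectorSet(0.0f, 0.0f, 1.0f, 0.0f);
            XMVECTOR v1 = XMVector3Transform(v, m1);
            XMVECTOR v3 = XMVector3Transform(v, m3);

            // The two results agree visually but not necessarily bit-for-bit.
            std::printf("m1: %f %f %f\n", XMVectorGetX(v1), XMVectorGetY(v1), XMVectorGetZ(v1));
            std::printf("m3: %f %f %f\n", XMVectorGetX(v3), XMVectorGetY(v3), XMVectorGetZ(v3));
            return 0;
        }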
  14. If you're looking for a way to render your scene at a fixed resolution no matter the window size, I think it will be best to always resize your swap chain's buffers, but only render to a portion of the back buffer by keeping the fixed width and height in the D3D11_VIEWPORT. Keep in mind, though, that this may not be what the user expects, especially if the window becomes smaller than the specified resolution and part of the rendered scene gets cut off.
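    A minimal sketch of that idea (my own code; context is assumed to be a valid ID3D11DeviceContext*): after resizing the swap chain to match the window, keep rendering into a fixed-size region by setting the viewport yourself.

        #include <d3d11.h>

        void SetFixedViewport(ID3D11DeviceContext* context, float fixedWidth, float fixedHeight)
        {
            D3D11_VIEWPORT vp = {};
            vp.TopLeftX = 0.0f;
            vp.TopLeftY = 0.0f;
            vp.Width    = fixedWidth;   // render resolution, independent of the window size
            vp.Height   = fixedHeight;
            vp.MinDepth = 0.0f;
            vp.MaxDepth = 1.0f;
            context->RSSetViewports(1, &vp);
        }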
  15. I've just been experimenting with window resizing, and I'm fairly sure that what you're seeing isn't the correct behavior. If you don't do any handling of the WM_SIZE message, the swap chain should just stretch the back buffer to match the front buffer, i.e. your window's client area, when presenting it to the screen. If you're seeing your window background instead, it probably means that the IDXGISwapChain::Present call isn't working correctly. Could you take a look at the HRESULT it's returning, or post your rendering code here? Also, are you using the debug DirectX runtime (creating the d3d device with the D3D11_CREATE_DEVICE_DEBUG flag)? It may report some relevant issues too.
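    A quick sketch of both checks (my own code; swapChain is assumed to be a valid IDXGISwapChain*, and the device-creation call is shown only as a comment):

        #include <d3d11.h>
        #include <cstdio>

        // At device creation time, enable the debug layer:
        //   UINT flags = D3D11_CREATE_DEVICE_DEBUG;
        //   D3D11CreateDeviceAndSwapChain(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
        //                                 flags, nullptr, 0, D3D11_SDK_VERSION,
        //                                 &swapChainDesc, &swapChain, &device,
        //                                 nullptr, &context);

        void PresentAndCheck(IDXGISwapChain* swapChain)
        {
            HRESULT hr = swapChain->Present(1, 0);
            if (FAILED(hr))
            {
                // DXGI_ERROR_DEVICE_REMOVED and friends end up here; with the debug
                // layer enabled you'll also get a more detailed message in the
                // Visual Studio output window.
                std::printf("Present failed with HRESULT 0x%08lX\n", static_cast<unsigned long>(hr));
            }
        }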