Vladimir128

Member

  • Content Count: 12
  • Joined
  • Last visited
  • Community Reputation: 229 Neutral

About Vladimir128

  • Rank: Member

Personal Information

  • Role: Programmer
  • Interests: Programming

  1. Hm... It seems that I have the same problem, but only with certain resolutions. I'll also try to find a solution :))
  2. Vladimir128

    Metal without Mac

    Well, actually, we shouldn't worry about a macOS port; OpenGL will probably keep working there for many years, and if not, there are MoltenGL and MoltenVK. But what about someone who intends to learn Metal itself and doesn't have a Mac?
  3. Vladimir128

    Metal without Mac

    Hi everyone. As we know, OpenGL is now deprecated on macOS and iOS, and developers are advised to use Metal instead. The problem is that it requires modern hardware. There are two fairly obvious ways to use it: a real Mac or a Hackintosh. But are there any ways to run Metal in a virtual machine? I used VMware for testing OpenGL on macOS; the virtual machine's video card supports OpenGL 2, and that is enough for making a game engine cross-platform, since the main issues are creating a window, handling events, and so on, not modern OpenGL features. For Metal, though, it's definitely not enough, even to run a simple application that draws one polygon. And it seems that VMware cannot use the real video card for rendering. It's probably possible to build a Metal application in a virtual machine and then test it with a friend who has a real Mac, but that's far too slow and inconvenient for both of us. Maybe someone has ideas, or even experience, with Metal in a virtual machine?
  4. Vladimir128

    Problems with vsync on Linux

    Thanks for the reply. I followed your advice, and over the past few days we tested different applications with friends who run Linux. VSync works fine, so it looks like it was a problem with the virtual machine. But there is another problem, with extensions. There is a lot of similar code, so I'll just show pseudocode, a few lines per file, of how it is structured. Something like this:

    File GLXExtensions.h:

        class GLXExtensions {
        public:
            GLXExtensions(void);
            ...
        private:
            bool Initialize(void);
            ...
        };

    File GLXExtensions.cpp:

        #include "GLXExtensions.h"

        static PFNGLXSWAPINTERVALSGIPROC glXSwapIntervalSGI = NULL;

        GLXExtensions::GLXExtensions(void)
        {
            Initialize();
        }
        ...
        bool GLXExtensions::Initialize(void)
        {
            glXSwapIntervalSGI = (PFNGLXSWAPINTERVALSGIPROC)glXGetProcAddress(
                reinterpret_cast<const GLubyte*>("glXSwapIntervalSGI"));

            // CASE 1

            return true;
        }

    File GLExtensions.h:

        extern PFNGLXSWAPINTERVALSGIPROC glXSwapIntervalSGI;

    File GLRenderDevice.h:

        #include "GLExtensions.h"

        class GLRenderDevice {
        ...
            bool initialize(void);
        ...
            GLXExtensions *glxExt;
        };

    File GLRenderDevice.cpp:

        #include "GLRenderDevice.h"

        bool GLRenderDevice::initialize(void)
        {
            glxExt = new GLXExtensions();

            // CASE 2

            return true;
        }

    The problem is: if I call glXSwapIntervalSGI at the "CASE 1" line, everything is fine, but if I call it at the "CASE 2" line, the program crashes, even though the pointer is not null. Also, if the pointer is declared without the "static" keyword, just as "PFNGLXSWAPINTERVALSGIPROC glXSwapIntervalSGI = NULL;", then the program crashes even at "CASE 1". The same thing happens with other extensions, for example with VBOs. So the question is: how should these pointers be initialized properly? By the way, everything works fine in the virtual machine; it crashes only on real Linux. Also, the same code with wglGetProcAddress works on Windows, even without "static". Does anyone have suggestions?
  5. It's difficult to recommend the NeHe tutorials, because they don't use shaders, except in one lesson. So don't worry about that. GLUT is a library for studying OpenGL, and as far as I know it is not recommended for serious applications; FreeGLUT is probably also a little dated. GLU is a library of functions that can be written with just standard GL, so it's helpful while you study OpenGL, but not necessary either. To create a window and initialize OpenGL you should probably use GLFW, or the API of your OS if you prefer low-level programming. SDL is also a popular library. A minimal GLFW example is sketched below.
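    For illustration, a minimal GLFW setup might look like this (a rough sketch, not production code; assumes GLFW 3 is installed and linked):

        #include <GLFW/glfw3.h>

        int main(void)
        {
            // Initialize the library and create a window with an OpenGL context
            if (!glfwInit())
                return -1;

            GLFWwindow* window = glfwCreateWindow(640, 480, "Hello OpenGL", NULL, NULL);
            if (!window) {
                glfwTerminate();
                return -1;
            }

            glfwMakeContextCurrent(window);
            glfwSwapInterval(1);   // vsync, handled portably by GLFW

            while (!glfwWindowShouldClose(window)) {
                glClear(GL_COLOR_BUFFER_BIT);   // draw here
                glfwSwapBuffers(window);
                glfwPollEvents();
            }

            glfwTerminate();
            return 0;
        }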
  6. Hi everyone. I've got some trouble with OpenGL on Linux in my graphical application: I cannot turn vsync on, and glXSwapInterval returns GLX_BAD_CONTEXT. So I took the Linux code from the NeHe tutorial as a simple example, but the situation is the same. I tried calling the function both after glXMakeCurrent and before it, but either way it returns an error. There is probably no need to show the code, because it is available on the NeHe website and you can try it yourself. Does anyone have suggestions about what the reason might be and how to fix it? P.S. There are a few glXSwapInterval functions, for example glXSwapIntervalSGI. I run Linux in a virtual machine with a simple video card, so the function available there is glXSwapIntervalMESA. Maybe this is the reason :) and the other functions would work on a real PC. But I'm not sure. The call sequence I use is sketched below.
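    Roughly, the sequence is this (simplified from the NeHe Linux code; display, window, and visualInfo come from the usual X11 setup, and error handling is omitted):

        #include <GL/glx.h>
        #include <GL/glxext.h>

        bool EnableVSync(Display* display, Window window, XVisualInfo* visualInfo)
        {
            // Create and activate the GLX context, as in the NeHe example
            GLXContext ctx = glXCreateContext(display, visualInfo, NULL, GL_TRUE);
            glXMakeCurrent(display, window, ctx);

            // Fetch the extension entry point at run time
            PFNGLXSWAPINTERVALMESAPROC swapInterval =
                (PFNGLXSWAPINTERVALMESAPROC)glXGetProcAddress(
                    reinterpret_cast<const GLubyte*>("glXSwapIntervalMESA"));

            // Try to enable vsync; this is the call that fails for me
            return swapInterval && swapInterval(1) == 0;
        }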
  7. Vladimir128

    What happened to Rastertek?

    First, some strange lines:

        #include <DirectXMath.h>
        using namespace DirectX;

    OK, I replaced them with:

        #include <xnamath.h>

    Second, D3DCompileFromFile is not found. MSDN says it is declared in the d3dcompiler.h file, but there is no such function there. Of course, as a last resort, shaders can be loaded manually and compiled with D3DCompile :) (see the sketch below)
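    For reference, a minimal sketch of that manual route, assuming an SDK whose d3dcompiler.h still provides D3DCompile (the file is read with standard C++, and error handling is kept short):

        #include <d3dcompiler.h>
        #include <fstream>
        #include <sstream>
        #include <string>

        // Read HLSL source from disk and compile it in memory with D3DCompile
        ID3DBlob* CompileShaderFromFile(const char* path,
                                        const char* entryPoint,   // e.g. "main"
                                        const char* target)       // e.g. "ps_4_0"
        {
            std::ifstream file(path);
            std::stringstream buffer;
            buffer << file.rdbuf();
            std::string source = buffer.str();

            ID3DBlob* code = NULL;
            ID3DBlob* errors = NULL;
            HRESULT hr = D3DCompile(source.c_str(), source.size(), path,
                                    NULL, NULL,         // no macros, no #include handler
                                    entryPoint, target,
                                    0, 0, &code, &errors);
            if (FAILED(hr)) {
                if (errors) errors->Release();   // error text is in errors->GetBufferPointer()
                return NULL;
            }
            return code;
        }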
  8. Vladimir128

    What happened to Rastertek?

    Has anyone got the executable files that were available with the tutorials?
  9. Recalculating the VB on every draw is probably not really necessary, because you can add something like a "dirty" flag and rebuild the buffer only when the text has changed (see the sketch below).
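    A minimal sketch of the idea (class and method names are illustrative; the actual buffer update depends on your graphics API):

        #include <string>

        class TextLabel {
        public:
            void SetText(const std::string& text)
            {
                if (text == m_text)
                    return;          // nothing changed, keep the cached buffer
                m_text = text;
                m_dirty = true;      // mark the vertex buffer as stale
            }

            void Draw()
            {
                if (m_dirty) {
                    RebuildVertexBuffer();   // expensive step, done only after a change
                    m_dirty = false;
                }
                // ... issue the draw call with the cached vertex buffer ...
            }

        private:
            void RebuildVertexBuffer()
            {
                // fill the vertex buffer from m_text via your graphics API
            }

            std::string m_text;
            bool m_dirty = true;     // force a build on the first draw
        };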
  10. Vladimir128

    DX11 Fullscreen.

    Yes, try to do that, and when you get the message, check whether your application was activated and call IDXGISwapChain::SetFullscreenState with the Fullscreen parameter set to TRUE. That should work :) A sketch is below.
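    Assuming the message in question is WM_ACTIVATE, the handler in the window procedure could look roughly like this (a sketch; g_swapChain stands for your IDXGISwapChain pointer):

        case WM_ACTIVATE:
            if (LOWORD(wParam) != WA_INACTIVE) {
                // The window was activated: switch to fullscreen
                g_swapChain->SetFullscreenState(TRUE, NULL);
            } else {
                // Focus lost: drop back to windowed mode
                g_swapChain->SetFullscreenState(FALSE, NULL);
            }
            return 0;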
  11. It looks like in this case you won't be able to check different multisample quality levels, because ID3D11Device::CheckMultisampleQualityLevels has to be called on an already created device, but before the swap chain is created. The usual ordering is sketched below.
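    One common ordering (a sketch; error handling omitted) is to create the device alone first, query the quality levels, and only then create the swap chain through DXGI:

        #include <d3d11.h>

        ID3D11Device* device = NULL;
        ID3D11DeviceContext* context = NULL;

        // 1. Create the device without a swap chain
        D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                          NULL, 0, D3D11_SDK_VERSION,
                          &device, NULL, &context);

        // 2. The device now exists, so the quality levels can be queried
        UINT quality = 0;
        device->CheckMultisampleQualityLevels(DXGI_FORMAT_R8G8B8A8_UNORM,
                                              4,           // sample count to test
                                              &quality);   // 0 means unsupported

        // 3. Fill DXGI_SWAP_CHAIN_DESC with the chosen sample count/quality, then
        //    create the swap chain via the device's factory:
        //    IDXGIDevice -> IDXGIAdapter -> IDXGIFactory::CreateSwapChain.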
  12. Vladimir128

    Using two shaders at the same time

    Try again to combine the shaders in the HLSL code and make that work. Even if you successfully apply two shaders to one object, all pixels of the object will be processed twice, and that's not very good for performance, especially when there are a lot of objects.