
Aks9

Member Since 10 Jul 2009

Posts I've Made

In Topic: Knowing my REAL OpenGL version - RESOLVED

08 October 2014 - 11:38 AM

Well, now I am on the trail of trying to disable Optimus... and it looks like it might not be possible on my Dell XPS 15Z. :\ I tried disabling my Intel chip in Device Manager, and that pretty much breaks everything. Resolution drops to 800x600, and I'm guessing it's not GPU accelerated.

 

Going to follow up with NVIDIA... man, it's really going to suck if this is a dead end for my machine, after all this work!

AFAIK, you cannot disable Optimus. Intel's GPU is the only way NVIDIA's GPU can communicate with a display.

Nsight (and PerfKit, which Nsight relies on) really had a problem with Optimus, and it probably still does (I haven't tried the latest version yet).

Btw, you should know how to activate NVIDIA's GPU in your application. :)

By default, Intel's GPU is used. Fortunately, activating the NVIDIA GPU is quite easy with Optimus.
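The usual mechanism (a minimal sketch for a Windows/C++ executable; the symbol is the Optimus rendering-policy export documented by NVIDIA) is to export a global variable from the .exe so the driver prefers the discrete GPU:

#include <windows.h>
 
// Exporting this symbol from the executable tells the Optimus driver to run
// the application on the NVIDIA GPU (a value of 0 keeps the default behavior).
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

Alternatively, the preferred GPU can be chosen per application in the NVIDIA Control Panel.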


In Topic: Knowing my REAL OpenGL version - RESOLVED

05 October 2014 - 05:41 AM

 

I even tried older versions (like 2.0) and NSight still insists my OpenGL version is 3.0.

 

 

What does glGetString(GL_VERSION) say?

That should return the highest GL version supported by the driver.
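A minimal check (just a sketch; it assumes a GL context is already current on the calling thread and <GL/gl.h> plus <cstdio> are included):

const GLubyte* version = glGetString(GL_VERSION);
printf("GL_VERSION: %s\n", version ? (const char*)version : "(no current context)");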

 

The specification is clear:

 

 

The attribute names WGL_CONTEXT_MAJOR_VERSION_ARB and WGL_CONTEXT_MINOR_VERSION_ARB request an OpenGL context supporting the specified version of the API. If successful, the context returned must be backwards compatible with the context requested.

 

 

So, if you request a GL 2.0 context, you could legitimately get a GL 4.4 compatibility profile context, since it is backward compatible with 2.0.
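On a 3.0+ context you can also query the numeric version directly and see exactly which backwards-compatible version the driver actually gave you (a small sketch, assuming a current context; GL_MAJOR_VERSION and GL_MINOR_VERSION require GL 3.0 headers or glext.h):

GLint major = 0, minor = 0;
glGetIntegerv(GL_MAJOR_VERSION, &major);   // e.g. 4
glGetIntegerv(GL_MINOR_VERSION, &minor);   // e.g. 4
printf("Actual context version: %d.%d\n", major, minor);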

 

P.S. My browser, or the engine that powers this site, or the two in combination, are acting strangely. Everything I typed was in the same font and size, but the outcome is ridiculous.


In Topic: What do with compute shaders?!

04 October 2014 - 10:34 AM

 

Aks9 - Geometry shaders are useful too! No need to skip them. You just need to know where to use them.

 

 

I know. ;)

I'm sorry if my previous post caused confusion. I skipped them in the previous count because of general performance, not because of missing functionality.


In Topic: Knowing my REAL OpenGL version - RESOLVED

04 October 2014 - 09:48 AM

"Father, forgive them, for they do not know what they are doing." sad.png 

 

That is bad advice, aks9. There's nothing to learn when it comes to context creation, other than what a nightmare it can be if you have older (or buggy) drivers. 

 

 

This is a typical agnostic claim. Everything is a source of knowledge. Creating a rendering context is the first thing one should learn when starting with computer graphics.

But OK, I have neither the time nor the will to argue about that.

 

Could you post a link, an example, or anything else to illustrate "the nightmare"? I've been creating GL contexts myself for about 18 years and have never had a problem. Problems can arise if you create a GL 3.0+ context and hope everyone supports it. Well, that is not a driver problem; older drivers cannot anticipate what might happen in the future.

 

If the drivers are buggy, there is no workaround for the problem! 

 

I just look at the whole thing as risky, since if you take that code with you to other projects, one day one of the people trying the game/program out will simply not be able to run it because their driver requires a workaround.

 

I really don't understand this. What kind of workaround? The way a GL context is created is defined by the specification. Why is it risky?

 

 

 

You can do the same exact things with SDL or any other library, except you will have less grief during the process. Most of the time, anyways.

 

 

I want to have control in my own hands, so no intermediary layers are welcome. It is harder at the start, but the feeling of freedom is priceless.

 

 

 

This link is totally out of context. The guy is frustrated by something, but gives no arguments for his claims.

Regarding the platform-specific APIs used to bind OpenGL to each window system, there was an initiative to unify them: Khronos started development of EGL, but it has not been adopted for desktop OpenGL yet.

 

 

 

EDIT: I downvoted you, aks9, but I can't undo it. Your post is helpful, so I'm sorry.

 

Don't be sorry. That was your opinion and you have the right to express it through (down)voting. Points really mean nothing to me.

Forums should be a way to share knowledge and opinions. Some of them are true, some are not. I hope the right advice still prevails, for the benefit of users.


In Topic: Knowing my REAL OpenGL version - RESOLVED

04 October 2014 - 05:40 AM

 

I'm going to give SDL a shot and see if it fixes my problems!

 

 

I'm horrified by suggestions to use any library/wrapper for OpenGL. :(

It is not easy, but it is always better to understand what's happening under the hood than to be helplessly dependent on others.

Despite its imperfections, OpenGL is still the best 3D graphics API for me, since I can do whatever I want after installing just the latest drivers and nothing more.

Of course, I also need a development environment (read: Visual Studio). Nobody wants to code in Notepad and compile from the command prompt.

 

Let's get back to your problem. Before going any further, revise your pixel format; it is not correct. The consequence is that HW acceleration is turned off and you fall back to OpenGL 1.1.

The following code snippet shows how to create a valid GL context:

// hDC, major, minor and nProfile are assumed to be set by the caller.
// Describe and set a hardware-accelerated pixel format for the window's DC.
PIXELFORMATDESCRIPTOR pfd;
memset(&pfd, 0, sizeof(PIXELFORMATDESCRIPTOR));
pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DOUBLEBUFFER | PFD_SUPPORT_OPENGL | PFD_DRAW_TO_WINDOW;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;
pfd.iLayerType = PFD_MAIN_PLANE;
 
int nPixelFormat = ChoosePixelFormat(hDC, &pfd);
if (nPixelFormat == 0)
{
    strcat_s(m_sErrorLog, LOGSIZE, "ChoosePixelFormat failed.\n");
    return false;
}
 
DWORD error = GetLastError();
BOOL bResult = SetPixelFormat(hDC, nPixelFormat, &pfd);
if (!bResult)
{
    error = GetLastError();
    strcat_s(m_sErrorLog, LOGSIZE, "SetPixelFormat failed.\n");
    return false;
}
 
// Create a temporary legacy context first; a current context is required
// before wglGetProcAddress can return wglCreateContextAttribsARB.
HGLRC tempContext = wglCreateContext(hDC);
wglMakeCurrent(hDC, tempContext);
 
int attribs[] =
{
    WGL_CONTEXT_MAJOR_VERSION_ARB, major,
    WGL_CONTEXT_MINOR_VERSION_ARB, minor,
    WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_DEBUG_BIT_ARB,  // I suggest using a debug context in order to know what's really happening and easily catch bugs
    WGL_CONTEXT_PROFILE_MASK_ARB, nProfile,            // nProfile = WGL_CONTEXT_CORE_PROFILE_BIT_ARB or WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB
    0
};
 
// Create the real context with the requested version and profile.
HGLRC context = NULL;
PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
    (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
if (wglCreateContextAttribsARB != NULL)
{
    context = wglCreateContextAttribsARB(hDC, 0, attribs);
}
 
// The temporary context is no longer needed.
wglMakeCurrent(NULL, NULL);
wglDeleteContext(tempContext);
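For completeness, a sketch of what typically follows the snippet above (assuming context is the HGLRC returned by wglCreateContextAttribsARB): make the new context current and verify the version you actually got.

if (context == NULL)
{
    strcat_s(m_sErrorLog, LOGSIZE, "wglCreateContextAttribsARB failed.\n");
    return false;
}
wglMakeCurrent(hDC, context);
// glGetString(GL_VERSION) should now report at least the requested major.minor.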
