Knowing my REAL OpenGL version - RESOLVED

They shouldn't affect NSight. But are you using the latest headers anyway? You need glext.h and wglext.h: https://www.opengl.org/registry/

I am using GLEW version 1.9.0 and including gl.h. I'm not exactly sure what GL version that gl.h corresponds to; I never had any issues with it before now.
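For what it's worth, the version baked into gl.h matters less than what the current context actually reports. Here's a minimal sketch (assuming a context is current and glewInit() has already succeeded) for dumping that information:

```cpp
#include <GL/glew.h>   // must be included before gl.h
#include <cstdio>

// Print what the driver reports for the *current* context - the renderer
// string also tells you which GPU (Intel vs. NVIDIA) the context lives on.
void PrintContextInfo()
{
    printf("GL_VENDOR   : %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER : %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION  : %s\n", (const char*)glGetString(GL_VERSION));

    // Numeric query, available on GL 3.0+ contexts.
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    printf("Context version: %d.%d\n", major, minor);
}
```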


You're absolutely correct. I used to create a dummy window and take responsibility for loading the wgl functions myself, then create the second window and call the GLEW functions from there.

I need to rethink this - is calling glewInit() safe on two separate contexts? I suspect so with 'normal' back buffers...

According to the GLEW documentation it's safe as long as the pixel formats in use have the same capabilities.
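For anyone following along, here's roughly what that dummy-context dance looks like on Windows - just a sketch, assuming SetPixelFormat() has already been called on both device contexts and that a GL 3.3 core context is the target (the function and parameter names here are illustrative):

```cpp
#include <windows.h>
#include <GL/glew.h>
#include <GL/wglew.h>

// Sketch of the "dummy context first" pattern: a throwaway legacy context is
// used to load the WGL extension entry points, then the real context is made.
HGLRC CreateRealContext(HDC dummyDC, HDC realDC)
{
    // 1) Legacy context on the dummy window, just so GLEW can load
    //    wglCreateContextAttribsARB and friends.
    HGLRC dummyRC = wglCreateContext(dummyDC);
    wglMakeCurrent(dummyDC, dummyRC);
    glewInit();

    // 2) Create the context we actually want on the real window.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0
    };
    HGLRC realRC = wglCreateContextAttribsARB(realDC, NULL, attribs);

    // 3) Drop the dummy, switch to the real context, and re-run glewInit()
    //    (per the GLEW docs this is fine as long as the pixel formats have
    //    the same capabilities).
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(dummyRC);
    wglMakeCurrent(realDC, realRC);
    glewInit();

    return realRC;
}
```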

So I finally get pissed enough to contact NVidia directly. Their response:

Hi thegeneralsolution,

It seems you are using an Optimus system. The OGL 3.3 requirement is for Nsight in Visual Studio, not for your sample [your sample is running under 4.4, per the value returned by glGetIntegerv]. This is because Nsight in Visual Studio uses OGL 3.3 to render some textures, geometry, etc. On your Optimus system the Intel GPU is chosen as Visual Studio's render device, and it may only support OGL 3.0 on your machine.

Please try disabling the Intel GPU in the BIOS, or force Visual Studio to use the NV GPU as its render device. That will solve your issue.

Thanks
An

It's true my system has an integrated chip alongside an NVidia chip, so this is a very interesting and promising lead. I hate it when they say "This will solve your issue" though. So presumptuous!

I will love them forever if this fixes my problem though...

That's because every problem with NVIDIA on Linux is Optimus, and Optimus alone. :)

At least according to google:

"optimus linux problem" --> About 4,020,000 results

Hahaha, that's terrifying!

But I'm actually using Optimus on a Windows machine (or at least that's what that NVidia rep seems to be telling me; I wasn't familiar with Optimus before this). I'll have to wait till later tonight to try this out. Hopefully I can finally resolve this...

Well, now I am on the trail of trying to disable Optimus... and it looks like it might not be possible on my Dell XPS 15Z. :\ I tried disabling my Intel chip in Device Manager, and that pretty much breaks everything. The resolution drops to 800x600 - and I'm guessing it's not GPU accelerated.

Going to follow up with NVidia... man, it's really going to suck if this is a dead end for my machine, after all this work! :(

AFAIK, you cannot disable Optimus. Intel's GPU is the only way NVIDIA's GPU can communicate with a display.

Nsight (and PerfKit, since Nsight relies on it) really had a problem with Optimus, and it probably still does (I haven't tried the latest version yet).

Btw, you should know how to activate NVIDIA's GPU in your application. :)

By default, Intel's GPU is used. Fortunately, it is so easy with Optimus.
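If it helps, the usual way to do that (taken from NVIDIA's Optimus rendering-policy guidelines) is to export a single global from the executable; a minimal sketch:

```cpp
#include <windows.h>

// Exporting this symbol from the .exe asks the Optimus driver to run the
// application on the discrete NVIDIA GPU instead of the integrated Intel one.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
```

Selecting the NVIDIA GPU for the program in the NVIDIA Control Panel's per-application settings works too, if you'd rather not touch code.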


Btw, you should know how to activate NVIDIA's GPU in your application.

Well I know my application definitely runs on the NVidia GPU (at least partly, not sure now that I know about this Optimus stuff) - I had problems long ago when I was getting started and I didn't realize I was running off the Intel chip.

So Aks, you think I should conclude that using NSight on this machine is a hopeless case? :(

-Michael

*emerges from dark cavern, tattered clothing, singed hair, scarred, and bleeding* I AM VICTORIOUS! *holds up disembodied head of Optimus*

So it looks like Optimus refuses to be disabled for my laptop screen - if I disable my Intel chip, everything goes black. HOWEVER! If I then connect an external monitor, that display is driven entirely by the NVidia GPU and I can run NSight's debugger without issue!

Finally I can mark this thread as resolved. Man, what a bitch this was!

