Can't run GL_NV_path_rendering extension demos

Started by
2 comments, last by wintertime 8 years, 8 months ago

Been trying to mess with this extension, but even the demos won't run. They say the extension is unsupported. I have an NVIDIA 980M in my laptop, and the extension should be supported on any CUDA-capable video card. Anyone know of anything that would help me get stuff running?

https://developer.nvidia.com/nv-path-rendering


Strange. Perhaps the demos are run on the Intel graphics by default, if you have auto-switching graphics on your laptop. Try setting the default to the NVIDIA GPU in the NVIDIA Control Panel.

Or print all available extensions to see if everything looks right.


#include <fstream>
#include <string>
#include <algorithm>
#include <windows.h> // must be included before gl/gl.h on Windows
#include <gl/gl.h>

...

// A context must be current before glGetString returns anything useful.
wglMakeCurrent(hDC, hGLRC);
{
	std::ofstream file("extensions.txt", std::ios::trunc);
	file << glGetString(GL_VENDOR) << '\n' << glGetString(GL_VERSION) << "\n\n";
	// Write one extension name per line for easy searching.
	std::string str(reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS)));
	std::replace(str.begin(), str.end(), ' ', '\n');
	file << str;
}

And if it's still missing make sure you have the latest drivers (nvidia.com).


Oh shit, this is exactly what is happening. Didn't think to check it, I just assumed it was running on my 980M. Now I just need to figure out how to force GLFW to use the 980M.

Edit: This was caused by NVIDIA's Optimus software for laptops. The solution is super simple: just export a symbol from your executable on Windows. Add this to your main file along with windows.h and it will work great.


extern "C"
{
	// Exported symbol read by the NVIDIA Optimus driver; a non-zero value
	// asks it to run this process on the discrete GPU.
	__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

There seems to be some support for forcing the better GPU in GLFW already: https://github.com/glfw/glfw/issues/520
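For completeness (not mentioned in the thread): AMD's switchable-graphics driver honors an analogous exported symbol, so projects that need the discrete GPU on both vendors' laptops often export both. A minimal sketch, Windows-only by nature:

```cpp
#include <windows.h> // for DWORD

extern "C"
{
	// NVIDIA Optimus: non-zero requests the discrete GPU.
	__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
	// AMD PowerXpress/Enduro counterpart.
	__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```

Both symbols must be exported from the executable itself, not from a DLL, for the drivers to pick them up.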
