Problem loading a shader

Quote:Original post by JavaCoolDude
In the example code that I hacked together in a rush, I certainly do release the char stream every time I allocate memory for it.
The problem lies somewhere else, I'm positive.


ahhh, I hadn't seen the first code you posted, only the second one... sorry :)
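For readers hitting this thread later: a minimal sketch of the load-and-release pattern JavaCoolDude is describing, using plain C stdio. The file name and error handling here are illustrative, not taken from the original code:

#include <cstdio>
#include <cstdlib>

// Read an entire shader source file into a heap-allocated,
// null-terminated char buffer; the caller must free() it.
char* loadShaderSource(const char* filename)
{
    FILE* file = fopen(filename, "rb");
    if (!file)
        return NULL;

    fseek(file, 0, SEEK_END);
    long size = ftell(file);
    fseek(file, 0, SEEK_SET);

    char* source = (char*)malloc(size + 1);
    fread(source, 1, size, file);
    source[size] = '\0';   // GL expects a null-terminated string
    fclose(file);
    return source;
}

// Usage: glShaderSource copies the string, so the buffer can be
// released immediately afterwards (the "release" step above).
// char* src = loadShaderSource("basic.vert");   // hypothetical file
// glShaderSource(shader, 1, (const char**)&src, NULL);
// free(src);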

Quote:Original post by _the_phantom_
Quote:Original post by TheSeb
I would like to try it myself first. Does nobody else know where it comes from? Maybe it's something that is not in my code? (I'm just guessing.)


OK, my last shot in the dark: how are you setting up the function pointers for the extension?

@_GLoom_
I very much doubt that's going to be the problem; if you look at the classes I linked to above, I use a local stream object and don't have any problems like that.


Sorry, what do you mean by "the function pointers for the extension"?
If you're talking about what I think, I'm using GLEW. I have just put glew.h in the include directory of Visual C++ 6, I have also put glew.lib in the lib directory, and I use it with this line:
#pragma comment(lib, "glew32.lib")
I have also put the GLEW .dll in the system32 directory.
I have done nothing else. Did I forget something?
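One thing worth noting about the steps described above: glew.h has to be included before gl.h (or any header that pulls it in, such as glut.h), since GLEW redeclares the GL entry points. A minimal sketch of the compile-time setup, with the MSVC-specific pragma from the post:

#include <GL/glew.h>               // must come before <GL/gl.h>
#pragma comment(lib, "glew32.lib") // link against the GLEW import library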
Now, I'm not familiar with GLEW since I'm a diehard GLEE fan, but isn't there an init function that you have to call right after creating your GL context, before using said extensions?
Indeed, there is a call to glewInit();
and now I see the result :-)
Thanks, guys, for your help and your code (it helped me write mine).
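For anyone who finds this thread later, a minimal sketch of the initialization order being discussed. It assumes a GL context has already been created and made current (via GLUT, wgl, etc.); the extension check at the end is illustrative:

#include <GL/glew.h>
#include <cstdio>

// glewInit() must run after a GL context is current and before any
// extension entry points (such as the shader functions) are called.
bool initExtensions()
{
    GLenum err = glewInit();
    if (err != GLEW_OK)
    {
        fprintf(stderr, "glewInit failed: %s\n",
                (const char*)glewGetErrorString(err));
        return false;
    }
    // Optionally confirm the extensions you rely on are present.
    if (!GLEW_ARB_shader_objects)
        return false;
    return true;
}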
HAHAHA (dances around the campfire chanting ancient texts).
I want a cookie, with chocolate chips please :P
lol
;-)

