
Foxito Foxeh

Member

  • Content Count: 12
  • Joined
  • Last visited
  • Community Reputation: 115 Neutral

About Foxito Foxeh

  • Rank: Member
  1. Well, I don't really care if someone knows I can handle buffers, memory, handles, pointers, etc. Actually your answer is out of context here; I would appreciate it if you first read the post at the top. I'm not learning OpenGL as a developer, which is what most people do, I'm learning it as a PROGRAMMER. I don't like developing things, I like programming. It's easy for a developer to call "frameworkInit();" and let the framework/machine do the dirty job. Well, that's what a developer does, and as a developer you never learn to solve simple low-level problems and you get stuck with badly designed software. My mentality is that good software is software that can reach as many users as possible, and to achieve that I have to know what's going on down there. Obviously developers don't care about this; they just use the tools they have at hand to finish something quickly that works, and in the end you get heavy software with bad performance and end up demanding really high specs to run it. My mind says this is NOT how it has to be, but most people, like you, think: "don't waste your time (I have to mention I'm a student) learning complex/difficult things, use frameworks." Well yeah, I must recognize this is how small and mid-sized software companies work, but we have to emphasize that there are different kinds of programmers. Obviously, if the people who wrote SDL and GLFW read your comment they wouldn't feel happy with your response lol. This is not about whether people who try to dig deep to understand how and why something happens in software are "idiots". This is about what they can learn and do after understanding it. You think as a developer, I think as a programmer; that's the difference. Don't mix them up and call others "idiots" because they don't do what you do. I have my way of doing things, you have yours. It depends on what you're focusing on, and I'm focusing on different things lol
  2. Jaja wow, I never thought OpenGL would become such a mess. I wonder why they didn't just add a new function in each driver release to create that context directly and pass the attributes as parameters, but anyway, it seems I'll need to study it a bit to find a workaround. Thanks Promit! :)
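     The attribute-based creation path this reply seems to be alluding to is the WGL_ARB_create_context extension. What follows is only a minimal sketch under assumptions, not code from the thread: it assumes hDC already has a pixel format set, it hard-codes the standard constant values from wglext.h, and it omits all error checking. The throwaway 1.x context is needed because wglGetProcAddress only returns useful pointers while some context is current.

     /* Sketch only: assumes hDC already has a pixel format set and that the
        driver exposes WGL_ARB_create_context. Error checking omitted. */
     #include <windows.h>
     #include <GL/gl.h>

     #define WGL_CONTEXT_MAJOR_VERSION_ARB    0x2091
     #define WGL_CONTEXT_MINOR_VERSION_ARB    0x2092
     #define WGL_CONTEXT_PROFILE_MASK_ARB     0x9126
     #define WGL_CONTEXT_CORE_PROFILE_BIT_ARB 0x00000001

     typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

     HGLRC CreateModernContext(HDC hDC)
     {
         /* 1. Create a throwaway legacy context so wglGetProcAddress works. */
         HGLRC temp = wglCreateContext(hDC);
         wglMakeCurrent(hDC, temp);

         /* 2. Grab the extension entry point from the driver. */
         PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
             (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

         /* 3. Ask for the version/profile you actually want. */
         const int attribs[] = {
             WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
             WGL_CONTEXT_MINOR_VERSION_ARB, 3,
             WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
             0
         };
         HGLRC modern = wglCreateContextAttribsARB(hDC, NULL, attribs);

         /* 4. Swap the temporary context for the real one. */
         wglMakeCurrent(NULL, NULL);
         wglDeleteContext(temp);
         wglMakeCurrent(hDC, modern);
         return modern;
     }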
  3. Hi, I've read on StackOverflow and other websites that using wglCreateContext is obsolete. I don't know if this is correct, since ALL the OpenGL examples I've seen use this method. I've always created my OpenGL context this way, by calling wglCreateContext and wglMakeCurrent. Is this correct? Does this change across OpenGL versions? Is context creation the same in 1.x, 2.x, 3.x and 4.x? I'm talking about doing it raw, not using extensions. Will it always be the same for any OpenGL context, regardless of the version?
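     For reference, a minimal sketch of the "raw" path the question describes, assuming a plain double-buffered RGBA window; pixel-format fields are trimmed and error checks omitted. Drivers keep this path working for compatibility, but it hands back whatever legacy/compatibility context the driver chooses rather than a specific requested version.

     /* Classic (pre-extension) context creation, as in the old tutorials.
        Sketch only, assuming hWnd is a valid window; error checks omitted. */
     #include <windows.h>
     #include <GL/gl.h>

     HGLRC CreateLegacyContext(HWND hWnd)
     {
         HDC hDC = GetDC(hWnd);

         PIXELFORMATDESCRIPTOR pfd = {0};
         pfd.nSize      = sizeof(pfd);
         pfd.nVersion   = 1;
         pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
         pfd.iPixelType = PFD_TYPE_RGBA;
         pfd.cColorBits = 32;
         pfd.cDepthBits = 24;

         SetPixelFormat(hDC, ChoosePixelFormat(hDC, &pfd), &pfd);

         /* This returns whatever compatibility context the driver provides,
            exposed through the 1.1 entry points in opengl32.dll. */
         HGLRC hRC = wglCreateContext(hDC);
         wglMakeCurrent(hDC, hRC);
         return hRC;
     }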
  4. Foxito Foxeh

    About different OpenGL Versions

    Ohh, that's what I was trying to understand @mhagain. I didn't know how the OpenGL driver worked, or that OpenGL32.DLL is basically an invoker for the GPU driver. Performance-wise, I really haven't noticed issues, and I have achieved a lot of functionality with 1.1 on modern (2013 - 2015) nVidia, Radeon and Intel APUs/GPUs. It's not "one frame per second" as you said, and I have done it all by hand. Maybe there are some things you can't achieve so easily in 1.1, but I'll discover that later. I haven't tried calling OpenGL on a machine without GPU drivers to see whether it falls back to software emulation or just doesn't work. From what BitMaster said, I now understand that the DLL's exported functions are a common interface to the GPU drivers. Thanks, that was really helpful from both of you. It was an interesting doubt I had, but I couldn't find information about it on Google :P Thanks!

    So, if I understood correctly: when I call opengl32.dll I'm actually invoking the GPU's OpenGL driver, not specifically version 1.1 but whatever version my GPU driver supports, and that version still supports the old API functions (wglCreateContext, etc.), so I'm really using 1.1 API functions on top of a different OpenGL version? And that's why a lot of people on the net say "do not use the 1.1 API functions": recent OpenGL drivers still support the old 1.1 functions, but they are deprecated and may eventually be removed. So in summary: I'm not actually running OpenGL 1.1, I'm running whatever version X my GPU supports, and that version X still supports the old 1.1 functions I'm calling. Is that correct?
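     A quick way to check this, as a sketch that assumes a context is already current: opengl32.dll only exports the 1.1 entry points, but glGetString reports the version the driver actually implements.

     /* With any context current, ask the driver what it actually implements.
        glGetString has been in the API since 1.0, so this works everywhere. */
     #include <stdio.h>
     #include <windows.h>
     #include <GL/gl.h>

     void PrintDriverInfo(void)
     {
         printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
         printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
         printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
         /* Typically prints something like "4.5.0 ..." even though the program
            was linked against the 1.1 headers and opengl32.lib. */
     }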
  5. I've been working for at least half a year in raw OpenGL by making calls to OpenGL32.DLL in the Windows System32 folder. The problem is: I know there are different GPU drivers. Some support one version of OpenGL, others support higher versions, and I know all of them support 1.x.

    So which version does Windows actually use in OpenGL32.DLL? I've noticed there are a lot of versions of this library: when I used Windows 7 the file version was 5 or 6, and now on Windows 10 it is 10.0. Are there changes between OpenGL32.DLL versions? Since I program raw, from scratch, with library calls, can I use later OpenGL versions (2.0 or 3.x) through this DLL? How do I know which functions OpenGL32.DLL exports in each version? Does the Windows 10 OpenGL32.DLL version 10.0.1 have OpenGL 3 or 4 functions?

    What are the disadvantages of using the OS core opengl library? Why is it not recommended to use OpenGL 1.x? Will GPU companies stop supporting it? What will happen to applications made by calling OpenGL32.DLL?

    I understand very well how OpenGL works, but not how drivers and extensions work (I have always used the standard 1.1, no extensions). I tried searching for information about the different OpenGL32.DLL versions, but everything seems to point to old 2.0-era documentation that only covers the basic exported functions from gl.h. Where is the code for the newer versions? Are there other OpenGL DLLs to use for extensions? For example, GLU32.dll is another one that offers quadrics and some complex 3D shapes, and it also comes natively in the Windows System32 libraries. Are there other DLLs that enable OpenGL 2.x and above? I read something about GLEW. Does GLEW have a GLEW32.DLL that comes natively with Windows? Would that let me use different OpenGL versions? Do I have to install more stuff to enable it? That's what I find annoying about OpenGL: I don't want my app's users to install extra stuff, everything should already be managed by the OS, the way OpenGL 1.1 is in OpenGL32.DLL on Windows. For example, DirectX natively comes with Windows in different versions (9, 10, 11), so I can program directly against it, but with OpenGL I still can't understand how to use different versions, and it's kind of annoying :c

    I'd like to make raw calls to OpenGL but, where or how do I use the new OpenGL 2.x, 3.x or 4.x functions?
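     To make the loading question concrete: on Windows, anything newer than 1.1 is reached by asking the driver for a function pointer at runtime via wglGetProcAddress, which is exactly what loaders like GLEW automate (GLEW is a library you link or ship yourself; Windows does not include it). Below is a minimal sketch, assuming a context is already current; glActiveTexture is used only as an example of a post-1.1 entry point.

     /* Fetch a post-1.1 entry point directly from the driver. GLEW and
        similar loaders just do this for every function in the registry.
        Sketch only; a real loader checks for NULL and falls back gracefully. */
     #include <windows.h>
     #include <GL/gl.h>

     /* glActiveTexture appeared in OpenGL 1.3, so opengl32.dll does not
        export it; this typedef mirrors the one found in glext.h. */
     typedef void (APIENTRY *PFNGLACTIVETEXTUREPROC)(GLenum texture);

     static PFNGLACTIVETEXTUREPROC pglActiveTexture = NULL;

     int LoadGL13(void)
     {
         /* Must be called with an OpenGL context current on this thread. */
         pglActiveTexture = (PFNGLACTIVETEXTUREPROC)wglGetProcAddress("glActiveTexture");
         return pglActiveTexture != NULL;
     }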
  6. Foxito Foxeh

    Drawing Crosshair in perspective?

      Quote:

      You switch to an ortho view, then draw your crosshair. With this kind of setup, each frame looks like this (I note that you tagged GL1.1 so I'll use the typical calls that you'd use with that version):

          glMatrixMode (GL_PERSPECTIVE);
          glLoadIdentity ();
          gluPerspective (blah, blah, blah, blah);
          glMatrixMode (GL_MODELVIEW);
          glLoadIdentity ();
          // draw 3D stuff in perspective projection here...

          glMatrixMode (GL_PERSPECTIVE);
          glLoadIdentity ();
          glOrtho (blah, blah, blah, blah);
          glMatrixMode (GL_MODELVIEW);
          glLoadIdentity ();
          // draw 2D stuff in ortho projection here...

          SwapBuffers (blah);

      If you have followed typical online tutorials you might be under the impression that you can't do this, perhaps because they (assuming GLUT) only reset the projection matrix in the reshape func. But of course you can: you can change the projection matrix mid-frame as many times as you wish (just don't do it between a glBegin and its corresponding glEnd). With this kind of setup your 3D stuff is still drawn in perspective, but your crosshair is drawn in 2D and you can use screen co-ords for positioning it (depending on the parameters to your glOrtho call, of course).

      Performance: it's not a performance optimization to only change the projection matrix when the screen size changes (e.g. in a reshape func if using GLUT). Calculating a new projection matrix is just a dozen or so floating point ops, compared to the thousands, tens of thousands or hundreds of thousands that you may be doing elsewhere. It's not even going to register as noise on any kind of performance graph.

      Wow thanks dude! This is really working and has no performance issues :)) Thanks a lot, this helped me create the crosshair ^_^ By the way, there's no GL_PERSPECTIVE in OpenGL 1.1 (I don't know about later versions), it's GL_PROJECTION :)
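     Pulling the correction together, here is a minimal per-frame sketch using the correct GL_PROJECTION enum. It assumes a double-buffered 1.1 context with GLU available; the actual drawing calls are placeholders.

     /* One frame: 3D scene in perspective, then the crosshair in ortho.
        GL_PROJECTION is the correct matrix-mode enum; GL_PERSPECTIVE does not exist. */
     #include <windows.h>
     #include <GL/gl.h>
     #include <GL/glu.h>

     void DrawFrame(HDC hDC, int width, int height)
     {
         glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

         /* 3D pass: perspective projection */
         glMatrixMode(GL_PROJECTION);
         glLoadIdentity();
         gluPerspective(60.0, (double)width / height, 0.1, 1000.0);
         glMatrixMode(GL_MODELVIEW);
         glLoadIdentity();
         /* ... draw the world here ... */

         /* 2D overlay pass: screen coordinates, depth test off */
         glMatrixMode(GL_PROJECTION);
         glLoadIdentity();
         glOrtho(0.0, width, height, 0.0, -1.0, 1.0);
         glMatrixMode(GL_MODELVIEW);
         glLoadIdentity();
         glDisable(GL_DEPTH_TEST);
         /* ... draw the crosshair at (width/2, height/2) here ... */
         glEnable(GL_DEPTH_TEST);

         SwapBuffers(hDC);
     }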
  7. Foxito Foxeh

    Drawing Crosshair in perspective?

      "So, how to do that?"

      Thanks! I will try it later and see how it works; I guess the wall bug is what could happen. Thanks! :3
  8. Hi! I'm trying to draw a crosshair in the middle of the screen in a perspective view... I've seen a lot of code on Google, but it all just draws in ortho. How can I draw a crosshair in a perspective view while moving/rotating the camera?
  9. Foxito Foxeh

    Antialiasing in 3D with Core OpenGL (Windows)

    Thanks, I will try accumulation, I read the post and it looks interesting! I had found how to do motion blur, but there was no explanation of how accumulation worked, and the FRONT and BACK buffer tips are awesome too! Thanks kolrabi! And Erik, GL_LINE_SMOOTH is not working when the depth test is enabled, and anyway GL_LINE_SMOOTH is not good antialiasing. About the tag... yeah, it's Windows desktop, but I didn't find another OpenGL tag... jajajaja :P
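     Since accumulation came up: a common way to use the accumulation buffer for full-scene antialiasing on a plain 1.1 context is to render the scene a few times with a sub-pixel jitter on the projection and average the passes. The sketch below assumes the pixel format was created with accumulation bits; DrawSceneWithJitter is a hypothetical placeholder for the application's own scene rendering with an offset projection.

     /* Accumulation-buffer supersampling, usable on a plain 1.1 context.
        Requires cAccumBits > 0 in the PIXELFORMATDESCRIPTOR.
        DrawSceneWithJitter() is a placeholder for your own code that applies
        a small sub-pixel offset to the projection and draws the scene. */
     #include <windows.h>
     #include <GL/gl.h>

     void DrawSceneWithJitter(float dx, float dy);  /* hypothetical helper */

     void RenderAntialiasedFrame(HDC hDC)
     {
         /* Sub-pixel offsets for 4 passes (in pixels). */
         static const float jitter[4][2] = {
             { 0.25f, 0.25f }, { 0.75f, 0.25f },
             { 0.25f, 0.75f }, { 0.75f, 0.75f }
         };
         int i;

         glClear(GL_ACCUM_BUFFER_BIT);

         for (i = 0; i < 4; ++i)
         {
             glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
             DrawSceneWithJitter(jitter[i][0], jitter[i][1]);
             /* Add 1/4 of this pass into the accumulation buffer. */
             glAccum(GL_ACCUM, 0.25f);
         }

         /* Copy the averaged image back to the colour buffer and present it. */
         glAccum(GL_RETURN, 1.0f);
         SwapBuffers(hDC);
     }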
  10. Foxito Foxeh

    C# - glu32 DLL reference?

    Thanks to both, I found what I needed in the MinGW include headers. In GL/ there is a GLU prototype header. I used it and it has been working fine.
  11. Hi, I've been searching for a way to do antialiasing in core OpenGL (OpenGL 1.1). I need antialiasing, but when I looked on other sites like SO or StackExchange I saw answers like "implement your own antialiasing", and I was just like... I want to learn HOW to create antialiasing. So my question is: what do I need to learn to write my own antialiasing implementation? Since there is no antialiasing in GL 1.1 and I strictly have to use 1.1, HOW do I create antialiasing? What do I need to read and learn? Is there some code to learn from? I need to understand what AA is and how it works in OpenGL so I can make my own implementation. I'm using the depth test, so please don't suggest GL_LINE_SMOOTH, and don't suggest multisampling either because, again, I strictly need to use OpenGL 1.1, not 1.3 or above. Thanks, I'd really appreciate it if you can help.
  12. Hi, I'm trying to use both OpenGL32.DLL and GLU32.DLL in C# WinForms (or WPF using the Windows Forms host integration). What I can't find is the glu32.dll reference. I found the opengl32.dll reference on MSDN, but I want to use some of the glu32 functions. Is there any site that has the GLU reference for Windows?