Kincaid

OpenGL Enable Antialiasing


Hello. There is a simple option in my video card driver panel to set antialiasing (2x, 4x, 8x, 16x, application-specific), and it affects all my (self-written GL) apps, of course. I would like my app to offer this option itself. I've searched for antialiasing in OpenGL and found things like simply setting glHint, but nothing relating to the 2x/4x/8x modes. (I understand what antialiasing is and what it's supposed to do, but I don't get how it's implemented in normal 3D games.) How can I get these options in my app, like normal games have? Is there something like EnableAntiAliasing(8x)? (I figure this is too simple and a stupid question... so sorry :) ) Regards

I'm not an OpenGL person, but this NeHe tutorial seems to explain how to do it. Judging from the comments, it checks for four samples and drops back to two if four aren't available, so perhaps this will help.

http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=46

I have come across this article in the past, and I'm bound to read up on it eventually :)

Though it's exactly this sort of thing that leaves me wondering about another, possibly simpler, call.
In the case of this article, the app itself has to do considerable coding/work to get antialiasing working, yet it also works fine without any of that, just by enabling the option in my video card driver panel...
Somehow the driver panel simply enables some global option (e.g. antialias everything at 16x), affecting/overriding all apps, even though they themselves contain no supporting code for it...


Quote:
Original post by Kincaid
I have come across this article in the past, and I'm bound to read up on it eventually :)

Though it's exactly this sort of thing that leaves me wondering about another, possibly simpler, call.
In the case of this article, the app itself has to do considerable coding/work to get antialiasing working, yet it also works fine without any of that, just by enabling the option in my video card driver panel...
Somehow the driver panel simply enables some global option (e.g. antialias everything at 16x), affecting/overriding all apps, even though they themselves contain no supporting code for it...


Umm, you know, that button on the driver panel is not some magical "Anti-Aliasing On!" button. It does a lot of work behind the scenes... the SAME work you have to code manually to enable it in your application.

That's what I'm asking: what does it do? Any app with NONE of the code mentioned in the lesson (or anywhere else) is perfectly antialiased just by enabling this option. Certainly it doesn't go around injecting code into all these programs. :)

So it seems it boils down to some general EnableAntiAlias(mode) call (which might be complex to the bone in its workings) that I should be able to call too...

Quote:
Original post by Kincaid
That's what I'm asking: what does it do? Any app with NONE of the code mentioned in the lesson (or anywhere else) is perfectly antialiased just by enabling this option. Certainly it doesn't go around injecting code into all these programs. :)

Actually, that's pretty much exactly what it does. All 3D-related commands inevitably go through the graphics driver, regardless of the application or game, and the driver is free to modify those commands, for example by injecting FSAA-related state.

Quote:
Original post by Kincaid
So it seems it boils down to some general EnableAntiAlias(mode) call (which might be complex to the bone in its workings) that I should be able to call too...

Yep, it's the one the posters above already mentioned.

Quote:
Original post by sybixsus
this Nehe tutorial seems to explain how to do it [...]

This is indeed how 99% of all developers do it, and although it works fine, it should be noted that, technically speaking, it is wrong. But I guess there is no better way.

Function pointers and the extension string are only valid for the context they were obtained from (or a context that is identical in every respect).
Thus, you have no guarantee that any function pointer, or any extension advertised in the extension string you got from "some accelerated context", works for any other context.
Luckily, reality is much more forgiving than theory; in practice it "just works" anyway.

Quote:
Original post by Kincaid
thats what im asking...
what does it do ?


If you don't override it in the panel, you'll just get a normal window back, and you need to do all the querying yourself to get a window with a multisampled pixel format. Presumably, if you enable it in the panel, you get such a window right from the start, without having a say about it, and the GL_MULTISAMPLE state is automatically enabled for all draw calls. So no, there is no magical "enable FSAA now" button, unfortunately, but it shouldn't be hard to implement this yourself.
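The querying described above corresponds roughly to the sketch below (Windows/WGL, the setup the NeHe lesson walks through). This is a non-runnable outline rather than a complete program: it assumes a dummy GL context is already current (because `wglGetProcAddress` only returns useful pointers with a current context), assumes `hdc` comes from the window you are creating, and omits all error handling. The attribute constants are taken from the WGL_ARB_pixel_format and WGL_ARB_multisample extension specs.

```c
#include <windows.h>
#include <GL/gl.h>

/* Constants from WGL_ARB_pixel_format / WGL_ARB_multisample. */
#define WGL_DRAW_TO_WINDOW_ARB 0x2001
#define WGL_SUPPORT_OPENGL_ARB 0x2010
#define WGL_DOUBLE_BUFFER_ARB  0x2011
#define WGL_SAMPLE_BUFFERS_ARB 0x2041
#define WGL_SAMPLES_ARB        0x2042

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)
    (HDC, const int *, const FLOAT *, UINT, int *, UINT *);

/* Returns a multisampled pixel format index, or 0 if unavailable.
 * Assumes a dummy GL context is current so wglGetProcAddress works. */
int pick_multisampled_format(HDC hdc, int samples)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)
        wglGetProcAddress("wglChoosePixelFormatARB");
    if (!wglChoosePixelFormatARB)
        return 0; /* extension missing: no multisampling */

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, TRUE,
        WGL_SUPPORT_OPENGL_ARB, TRUE,
        WGL_DOUBLE_BUFFER_ARB,  TRUE,
        WGL_SAMPLE_BUFFERS_ARB, TRUE,
        WGL_SAMPLES_ARB,        samples, /* e.g. 2, 4, 8 */
        0
    };
    int format = 0;
    UINT count = 0;
    if (!wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &count)
        || count == 0)
        return 0;
    /* Caller then uses SetPixelFormat() with this format on a fresh
     * window, creates the real context, and enables GL_MULTISAMPLE. */
    return format;
}
```

Modern windowing libraries hide all of this; with GLFW, for instance, calling `glfwWindowHint(GLFW_SAMPLES, 4)` before creating the window requests a multisampled framebuffer in one line.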

