OpenGL Getting started with OpenGL development in Linux


Recommended Posts

Hi everyone,

 

I have a few questions about getting started with OpenGL development under Linux. Please bear in mind that I had never used Linux in my life until two days ago. I've done a lot of research, but I'm still confused about a few things.

 

1- What is Mesa? Is it a driver, or something like GLEW? If it is a driver, why would you use it rather than the Nvidia or AMD driver? Also, how can I use OpenGL without installing the libgl1-mesa-dev package? Doesn't libGL come with the Nvidia or AMD drivers, or even with Linux itself?

 

If I try to compile my code without installing the libgl1-mesa-dev package, I get an error saying "cannot find -lgl". Doesn't Linux already have the latest version of OpenGL? And if it does, how can I link to it? Where is it located?

 

2- Why would you use GLEW under Linux? I thought GLEW's only job was to implement the functions that link to the driver under Windows, and the reason for that is that Windows only supports OpenGL 1.0, so GLEW gives access to OpenGL 1.0+ functions. Linux doesn't need that, since Linux supports every version of OpenGL natively. Doesn't it?

 

3- What does the "-l" stand for in "-lgl" or "-lX11"? Does it stand for "library"?

 

What I want to do is basically create a window and an OpenGL 3.2+ context under Linux, so I assume I only need GLX and OpenGL. I don't want to use any other dependency that I don't have to. Can I do that under Linux? I know that on Windows all I need is GLEW, WGL, and the Windows API.

 

So my final question is: how can I find glx.h and link to OpenGL under Linux without using Mesa or GLEW?


Windows is capable of other versions of OpenGL, and GLEW is a library that helps make dealing with extensions easier (on many/all platforms). I don't have much time at the moment, but getting an OpenGL 3.2 context in Linux was similar to the process in Windows: create a legacy context, init GLEW, check for the create-context ARB extension, and create another context that meets the 3.2 requirements.
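To make that concrete, here is a minimal sketch of that two-step process in C (my own illustration, not code from this thread; it assumes an X display, a GL/glx.h that carries the ARB create-context tokens, and a driver exposing GLX_ARB_create_context, and it skips window creation and most error handling):

    /* Sketch: assumes GL/glx.h provides the GLX_ARB_create_context tokens. */
    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <stdio.h>

    /* Like any extension function, glXCreateContextAttribsARB is fetched at runtime. */
    typedef GLXContext (*CreateContextAttribsProc)(Display*, GLXFBConfig,
                                                   GLXContext, Bool, const int*);

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open X display\n"); return 1; }

        /* Pick a framebuffer config. */
        static int visual_attribs[] = { GLX_RENDER_TYPE, GLX_RGBA_BIT,
                                        GLX_DOUBLEBUFFER, True,
                                        GLX_DEPTH_SIZE, 24, None };
        int count = 0;
        GLXFBConfig *fbc = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                             visual_attribs, &count);
        if (!fbc || count == 0) { fprintf(stderr, "no FBConfig\n"); return 1; }

        /* Step 1: legacy context, used only to bootstrap the ARB entry point. */
        GLXContext legacy = glXCreateNewContext(dpy, fbc[0], GLX_RGBA_TYPE,
                                                NULL, True);

        CreateContextAttribsProc glXCreateContextAttribsARB =
            (CreateContextAttribsProc)glXGetProcAddress(
                (const GLubyte *)"glXCreateContextAttribsARB");

        /* Step 2: the context you actually want, requesting 3.2 core. */
        static int ctx_attribs[] = {
            GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
            GLX_CONTEXT_MINOR_VERSION_ARB, 2,
            GLX_CONTEXT_PROFILE_MASK_ARB, GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
            None };
        GLXContext core = glXCreateContextAttribsARB
            ? glXCreateContextAttribsARB(dpy, fbc[0], NULL, True, ctx_attribs)
            : NULL;
        glXDestroyContext(dpy, legacy);

        printf("3.2 core context %s\n", core ? "created" : "not available");
        if (core) glXDestroyContext(dpy, core);
        XFree(fbc);
        XCloseDisplay(dpy);
        return 0;
    }

Build with something like gcc context.c -o context -lGL -lX11. A real program would also create a window, call glXMakeCurrent, and only then initialize GLEW.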

 

The -l is a build option telling the linker to link against a library, so -lX11 will link the X11 library into your code when creating the executable.
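For example, a typical build line for a plain Xlib + OpenGL program would be:

    gcc main.c -o myapp -lX11 -lGL

(Library name casing matters here; see the replies below about -lgl versus -lGL.)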

 

I can't really answer on the drivers, but I believe I had to grab some Mesa packages as well; these may actually be the drivers. Linux is very new territory for me, so I could be wrong. Sorry I don't have more time to get the names correct above, but that should help you a little. Good luck!


Some links that might be useful:

 

https://www.opengl.org/wiki/Platform_specifics:_Linux (don't use the code at the end)

 

https://www.opengl.org/wiki/Programming_OpenGL_in_Linux:_GLX_and_Xlib (you can use this code to create a window). But you can also use GLUT (check for freeglut in your distro's packages).

 

http://www.mesa3d.org/intro.html

 

On nVidia at least, and as far as I know, you're not obliged to fetch the GL function/extension pointers yourself. But it seems that everyone does this, since it is more portable.

 

Only nVidia for now (and probably the latest drivers for the latest AMD graphics cards) lets you do OpenGL without requiring Mesa. You will then need to install the official drivers, just like under Windows. Sometimes (even often) they are provided with your distribution as separate packages.

Edited by _Silence_


2- Why would you use GLEW under Linux? I thought GLEW's only job was to implement the functions that link to the driver under Windows, and the reason for that is that Windows only supports OpenGL 1.0, so GLEW gives access to OpenGL 1.0+ functions. Linux doesn't need that, since Linux supports every version of OpenGL natively. Doesn't it?

 

A misunderstanding on your part.

 

Windows supports OpenGL versions from 1.0 to 4.5, without issues.

 

However, the headers and libraries supplied with the Windows SDK only support up to OpenGL 1.1, with a smattering of extensions.  That's an important distinction, because it's perfectly possible for other SDKs or build systems that support all current GL versions to exist; nothing about Windows prevents that.

 

So you use the extension loading mechanism (another important distinction; this is not the same as using extensions) to access higher functionality.
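For illustration, this is roughly what the extension loading mechanism looks like done by hand on a GLX platform (a sketch; on Windows the equivalent query is wglGetProcAddress, and a wrangler like GLEW just automates this for every known entry point):

    #include <GL/glx.h>    /* glXGetProcAddress */
    #include <GL/glext.h>  /* PFNGLCREATESHADERPROC and friends */

    /* Illustration only: post-1.1 functions are not resolved at link
       time; each entry point is fetched from the driver at runtime. */
    static PFNGLCREATESHADERPROC pglCreateShader;

    void load_gl_entry_points(void)
    {
        pglCreateShader = (PFNGLCREATESHADERPROC)
            glXGetProcAddress((const GLubyte *)"glCreateShader");
        /* ...and so on for every other modern function you use. */
    }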

 

The key thing to realise is that the same constraints can exist on Linux.  Depending on your build tools and/or the SDK used, you may have headers and libraries for all current GL versions, or you may also need to use something like GLEW.

 

It's not the OS, it's the tools.


I have a few questions about getting started with OpenGL development under Linux. Please bear in mind that I had never used Linux in my life until two days ago. I've done a lot of research, but I'm still confused about a few things.
 
1- What is Mesa? Is it a driver, or something like GLEW? If it is a driver, why would you use it rather than the Nvidia or AMD driver? Also, how can I use OpenGL without installing the libgl1-mesa-dev package? Doesn't libGL come with the Nvidia or AMD drivers, or even with Linux itself?

Mesa is a big amorphous thing.

In Mesa you will find a software implementation of OpenGL and OpenGL|ES, the official Intel video drivers, the official Gallium video drivers, and an unofficial implementation of the OpenGL development libraries (unofficial because the Khronos Group, which owns the OpenGL brand, requires a fee of US$10,000 per year for official recognition, and Mesa is a penniless Free Software project).

Generally speaking, on desktop-oriented Linux-based and BSD-based OSes, you develop against Mesa and you run against the official OpenGL or OpenGL|ES drivers, which may or may not include Mesa, Free drivers, or proprietary binary-blob drivers from chip vendors like AMD or nVidia.  What your OS ships in its userspace is up to the distributor, but libGL.so is usually defaulted to the one supplied by Mesa and symlinked to the AMD or nVidia binary blobs during driver setup.
 
You still need an extension wrangler library like GLEW because OpenGL uses plenty of extensions.
 

If I try to compile my code without installing the libgl1-mesa-dev package, I get an error saying "cannot find -lgl". Doesn't Linux already have the latest version of OpenGL? And if it does, how can I link to it? Where is it located?

The classic Linux desktop OS, like all POSIX systems, has a case-sensitive filesystem. The correct linker switch to pull in libGL.so is "-lGL".

 

2- Why would you use GLEW under Linux? I thought GLEW's only job was to implement the functions that link to the driver under Windows, and the reason for that is that Windows only supports OpenGL 1.0, so GLEW gives access to OpenGL 1.0+ functions. Linux doesn't need that, since Linux supports every version of OpenGL natively. Doesn't it?

The job of an OpenGL extension-wrangler library like GLEW is to wrangle extensions for OpenGL. Not only do different versions of OpenGL have different sets of optional functionality (extensions), but different vendors supply different extensions to take advantage of their specialized hardware designs. The use of GLEW to wrangle extensions is independent of the OS on which you are using OpenGL.  If you have access to a shell, try typing 'glxinfo | less' to see, among other things, a list of the extensions the driver you are currently using offers.
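In code, the GLEW side of this boils down to a single call made after a context is current; a minimal sketch (assuming the GLEW development package is installed and you link with -lGLEW):

    #include <GL/glew.h>  /* include before any other GL header */
    #include <stdio.h>

    /* Call once, after a GL context has been created and made current. */
    int init_extensions(void)
    {
        GLenum err = glewInit();
        if (err != GLEW_OK) {
            fprintf(stderr, "glewInit: %s\n",
                    (const char *)glewGetErrorString(err));
            return 0;
        }
        /* Entry points are now resolved, and individual extensions
           can be tested at runtime: */
        if (!GLEW_ARB_vertex_array_object)
            fprintf(stderr, "VAOs not supported by this driver\n");
        return 1;
    }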

 

3- What does the "-l" stand for in "-lgl" or "-lX11"? Does it stand for "library"?

Yes, the "-l" command-line switch tells the compiler driver (which in turn drives the symbolic linker) to look for a certain library in its library search path and use it to resolve any outstanding unresolved symbols. A switch of "-lGL" would tell it to look for a library named "libGL.so" which would be installed by the development package for GL. On Ubuntu, that would be the package called libgl1-mesa-dev but your OS or distribution may differ.

 

What I want to do is basically create a window and an OpenGL 3.2+ context under Linux, so I assume I only need GLX and OpenGL. I don't want to use any other dependency that I don't have to. Can I do that under Linux? I know that on Windows all I need is GLEW, WGL, and the Windows API.

Yes, you can just assume that Linux == X11 and write software like it's 1999. But I'd strongly recommend that if you're going to write software for Linux, you target Linux and not X11, because the assumption that they're the same thing has not been valid for years.  If you're just writing for yourself and don't plan on distributing your code or upgrading your system, of course you can stick with GLX.  It's bundled as a part of Mesa.

I would recommend that if you want to write software for Linux and need an OpenGL context (or better yet, an OpenGL or OpenGL|ES context), you use libSDL to handle your context creation and input, as sketched below. The overhead is not large, the code is minimal, and your software will also work on a modern Linux OS running a Mir or Wayland display server, on desktops, laptops, tablets, phones, refrigerators, and so on.  Also, it will work on Windows and Mac OS.  One code base to rule them all.  That's a good thing.
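For comparison with raw GLX, the whole window-plus-context setup collapses to a few calls with SDL2 (a sketch assuming the libsdl2-dev package, linked with -lSDL2):

    #include <SDL2/SDL.h>
    #include <stdio.h>

    int main(void)
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
            return 1;
        }
        /* Request a 3.2 core profile before creating the window. */
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
        SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK,
                            SDL_GL_CONTEXT_PROFILE_CORE);

        SDL_Window *win = SDL_CreateWindow("GL 3.2",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            800, 600, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);
        if (!ctx) { fprintf(stderr, "%s\n", SDL_GetError()); return 1; }

        /* An extension wrangler (glewInit) and the render loop go here. */

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }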

 

So my final question is: how can I find glx.h and link to OpenGL under Linux without using Mesa or GLEW?

You can't. Well, you could grab the header from khronos.org, but you're still going to need the development libraries to link to.  Using Mesa is how you do it.

 

You will need to install the development packages required to develop against OpenGL.  I really strongly recommend not using GLX directly, but using libSDL2 instead.  If you install the libSDL2 development packages (on Ubuntu, that's 'sudo apt install libsdl2-dev'), it should pull in all the other required development packages as dependencies; that's how Linux distribution package managers work.


Windows 7 will run any version of OGL, and Vulkan too. I've compiled and run code for OGL 4.5 and Vulkan. (The Vulkan stuff was someone else's code that I compiled, as I'm in the process of trying to learn it, not my own code. But the point is that it ran on Windows.)

 

I use GLFW with OGL to communicate with Windows, not GLEW. I use GLEW to get the full power of the graphics card (to wrangle extensions). So I'm using both GLEW and GLFW, but it's GLFW that communicates with Windows, as in the sketch below. (For DirectX, I use the Windows SDK to communicate with Windows through Win32, but for OGL I want my code to be more portable, so I use GLFW to communicate with whatever the OS is, although I have to admit I've only done Windows with OGL so far. I hope to do Linux with OGL pretty soon; I've just had too much going on.)
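As a rough sketch of that division of labour (my illustration, assuming GLFW 3 and GLEW, linked with -lglfw -lGLEW -lGL):

    #include <GL/glew.h>    /* GLEW: wrangles the extensions */
    #include <GLFW/glfw3.h> /* GLFW: talks to the OS (window, context, input) */

    int main(void)
    {
        if (!glfwInit()) return 1;

        /* GLFW creates the window and the 3.2 core context... */
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        GLFWwindow *win = glfwCreateWindow(800, 600, "GL", NULL, NULL);
        if (!win) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(win);

        /* ...then GLEW resolves the entry points once the context is current. */
        glewExperimental = GL_TRUE; /* needed by older GLEW with core profiles */
        if (glewInit() != GLEW_OK) { glfwTerminate(); return 1; }

        while (!glfwWindowShouldClose(win)) {
            glClear(GL_COLOR_BUFFER_BIT);
            glfwSwapBuffers(win);
            glfwPollEvents();
        }
        glfwTerminate();
        return 0;
    }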

Edited by BBeck
