axel1994

OpenGL
OpenGL on Linux

10 posts in this topic

So, this is a pretty dumb question, but how do I get OpenGL 3.3+ running on Linux?

 

At the moment glxinfo reports: 3.0 Mesa 8.0.5.
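A quick way to confirm what your driver stack currently reports (this assumes the mesa-utils package, which provides glxinfo, is installed):

```shell
# Print only the relevant lines from glxinfo's long output
glxinfo | grep "OpenGL version string"
glxinfo | grep "OpenGL renderer string"
```

If the renderer string mentions llvmpipe or a software rasterizer, you are not using the Nvidia hardware at all.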

My graphics card is an Nvidia GeForce GT 630M (2 GB).

 

I've tried (and am still trying) to get Mesa 9.2.3 running, but I'm having problems installing it.

 

I do know that Mesa 10, which supports OpenGL 3.3, will be released quite soon. But there must be other options.

 

I also found the Linux x64 (amd64/em64t) display driver on Nvidia's website, but I don't know whether it supports OpenGL 3.3+.

 

What I do know is that on my Windows partition I have OpenGL 4.0 running.


I'm not sure why you would want to use Mesa if you have a good graphics card. Just install Nvidia's graphics drivers, preferably through your distro's package manager.


The process for getting OpenGL running is pretty much the same on any platform; however, a GL 3.2+ core context requires a little more work:

1. Create a window using the system-provided API.

2. Create an OpenGL context from that window's device context.

There are plenty of examples online showing how to create an OpenGL context.


Mesa is a software renderer http://www.mesa3d.org/

Getting OpenGL to work on Linux can be an adventure sometimes, since as a developer you likely want the latest.

In my case I went the hardcore route and failed many times (I had to use a terminal shell to fix xorg after completely breaking it).

 

So here is my warning: installing the Nvidia Linux driver manually IS NOT WORTH IT.

 

Fortunately, if you have Ubuntu, there are some simple and safe ways:

 

1. Use the nvidia-current branch (which is new enough for most cases)

You will find it in the Ubuntu Software Center

 

2. Use the edgers repo (from terminal):

a) Add the edgers repo

sudo apt-add-repository ppa:xorg-edgers/ppa

b) Update index files

sudo apt-get update

c) Install any of the nvidia packages, such as the very latest (as of this post):

sudo apt-get install nvidia-331

Edited by Kaptein

Some day I will run into the problems people describe with Nvidia's installer and I'll share their pain, but for me, right now, it can't get easier than sh'ing the installer and letting it do its job.

 

What has actually never worked for me is DKMS, which should, in theory, automagically recompile the driver each time I upgrade the kernel. I'm probably missing something, though...
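If DKMS is set up, you can at least check whether the nvidia module is registered and force a rebuild for the running kernel. A minimal sketch, assuming the installer registered the module with DKMS (module names and versions vary by driver release):

```shell
# List every module DKMS knows about and its build status per kernel
dkms status

# Rebuild and install all registered modules for the currently
# running kernel (requires the matching linux-headers package)
sudo dkms autoinstall
```

If `dkms status` shows nothing for nvidia, the installer never registered the module, which would explain the driver not being rebuilt on kernel upgrades.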


Thanks for all the responses.

 


I'm using Debian wheezy.

And yes, I tried to install manually. (So bad.)

Somehow I messed up my whole system: everything was broken, and I could barely log in (even through recovery mode).

 

I had to reinstall the system.

 

I'm searching around the web for how to install the drivers.
But each time I try something, X won't run.
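On Debian wheezy, a safer route than Nvidia's .run installer is the packaged driver from the non-free section. A rough sketch, assuming contrib and non-free have been enabled in your apt sources; exact package names can differ between releases:

```shell
# 1) Enable contrib and non-free in /etc/apt/sources.list, e.g.:
#    deb http://http.debian.net/debian wheezy main contrib non-free

# 2) Refresh the package index
sudo apt-get update

# 3) Install headers for the running kernel, then the DKMS driver
#    package (the GLX libraries come in as dependencies)
sudo apt-get install linux-headers-$(uname -r) nvidia-kernel-dkms

# 4) Reboot so the switch from nouveau to nvidia takes effect
```

You may also need a minimal xorg.conf snippet selecting the nvidia driver; the Debian wiki's NvidiaGraphicsDrivers page walks through that step for wheezy.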


Hmm, those PPA Nvidia repos never worked for me. I go with the default and update manually later. Currently Nvidia is at OpenGL 4.3 or even 4.4.

Edited by FGFS

 

Mesa is a software renderer http://www.mesa3d.org/

Mesa does 3D acceleration for many chips.

 

Um, no. Well, sorta, in that it can use LLVMpipe to JIT the software rendering. Generally, on Linux, the hardware does the hardware acceleration (using the proprietary blobs or the free driver equivalents). As on Windows, you link against the GL API provided by Mesa, but the runtime is provided by the hardware vendors (through kernel DRM, the Direct Rendering Manager). That's one reason why installing the Nvidia drivers from Nvidia is bad: they blow away the Mesa libraries, which means if you're a developer and upgrade the -dev package you will experience open running sores and purulent boils.


Um, no. Well, sorta, in that it can use LLVMpipe to JIT the software rendering. Generally, on Linux, the hardware does the hardware acceleration (using the proprietary blobs or the free driver equivalents).


Um, yes. Well, sorta. Most of the open source hardware 3D drivers use Mesa as a front end, a bit like Microsoft writing D3D and the hardware manufacturers only needing to write a relatively small hardware-specific bit, rather than a full 3D stack like they do with OpenGL.
