Scribe

SDL ogl rendering in GDI



Hi, I'm initializing OpenGL through SDL and it's rendering with the GDI Generic software renderer, not hardware, and it's damn slow. My code is as follows:
    //////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // Set OpenGL variables
	SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );									// Set red to use 1 byte of memory
	SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );								// Set green to use 1 byte of memory
	SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );									// Set Blue to use 1 byte of memory
	SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );								// Set Alpha to use 1 byte of memory
	SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 32 );								// Request a 32-bit depth buffer
	SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );								// Set the double buffer

    //////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // Activate main screen
    Screen = SDL_SetVideoMode                                                   // Set as displayed screen
    ( ScreenX, ScreenY, BitDepth,                                               // Insert size and depth info.
     SDL_OPENGL);											                    // Request an OpenGL rendering
                                                                                // context for this window
                                                   
    if ( Screen == NULL )                                                       // Confirm screen has been
    {                                                                           // initialized checking for error
    
         printf("Error 2;", SDL_GetError());                                    // Print record of error
         
         return 0;                                                              // Return main ( end program )
    }

    //////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // Load OpenGL
	glViewport( 0,0, ScreenX, ScreenY );										// Setup the OpenGL Window
	glMatrixMode( GL_PROJECTION );												// Set OpenGL for 2d Projection mode
	glLoadIdentity();															// Set the default Screen Matrix
	glOrtho( 0.0f, 1024.0f, 768.0f, 0.0f, -1.0f, 1.0f );						// Apply the Matrix to the Screen
	//gluPerspective(45.0f,(GLfloat)ScreenX/(GLfloat)ScreenY,0.1f,100.0f);
	glMatrixMode( GL_MODELVIEW );												// Set OpenGL for Model mode
	glLoadIdentity();															// Set the default Screen Matrix
	glClearColor( 0.0f,0.0f,0.0f,0.0f );										// Create a Black clear colour
	glClearDepth( 1.0f );														// Create the default depth at 1
	glDepthFunc( GL_LEQUAL );													// Allows only images in front to show
	glEnable( GL_DEPTH_TEST );													// Activates the above system
	glShadeModel( GL_SMOOTH );													// Allow pixels to blend together
	glDisable( GL_CULL_FACE );													// Prevent removal of back faces
	glHint( GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST );						// Ask for highest-quality perspective correction
	glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );										// Set the current colour to white
	glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );						// Set Alpha and pixel Blending
	glEnable( GL_BLEND );														// Activates the above system
	glEnable(GL_TEXTURE_2D);

Any idea how to force it into hardware, guys? (I get this on both nVidia and ATI systems.) I'm compiling with VS.NET, by the way, as a command-line project.
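For reference, a minimal check (just a sketch; it assumes the GL headers my code already pulls in) is to print the standard GL strings right after SDL_SetVideoMode() succeeds. GL_RENDERER coming back as "GDI Generic" with GL_VENDOR "Microsoft Corporation" is how I can tell it's the software fallback:

    //////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // Report which OpenGL implementation is active (the software fallback prints "GDI Generic")
    printf( "GL_VENDOR:   %s\n", (const char *)glGetString( GL_VENDOR ) );     // Driver vendor string
    printf( "GL_RENDERER: %s\n", (const char *)glGetString( GL_RENDERER ) );   // Renderer / card name
    printf( "GL_VERSION:  %s\n", (const char *)glGetString( GL_VERSION ) );    // Supported GL version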

Try getting rid of SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 ) - I don't know if current hardware supports destination alpha.
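Something like this, with just that one attribute changed and everything else left alone (only a sketch, I haven't tried it against your setup):

	// Either delete the SDL_GL_ALPHA_SIZE request entirely, or ask for 0 bits of destination alpha
	SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 0 );								// No framebuffer alpha requested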

Hope this helps,

Pete

Try setting the depth buffer to 16 bits instead of 32. It might also help to switch Windows itself to 16-bit color (I can't remember exactly why that tends to make a difference).
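In code that would look roughly like this (just a sketch; the 5-6-5 colour split is my assumption for a 16-bit mode, and Screen/ScreenX/ScreenY are the variables from your post):

	SDL_GL_SetAttribute( SDL_GL_RED_SIZE,   5 );								// 16-bit colour, 5-6-5 split
	SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 6 );
	SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE,  5 );
	SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 16 );								// 16-bit depth buffer
	SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );

	Screen = SDL_SetVideoMode( ScreenX, ScreenY, 16, SDL_OPENGL );				// Ask for a 16-bit display mode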

I wasn't aware OpenGL had anything to do with the GDI. Are you in windowed mode or fullscreen? And how do you know how it's rendering?

Guest Anonymous Poster
I don't know if there is a way to speed it up. OpenGL itself is usually slower to render than Direct3D, at least in my experience. Changing the buffer depth, or whether the hardware supports destination alpha, shouldn't have too much of an effect on game performance, since most of this should be handled in OpenGL, unless you're forcing things on that should be left to the OS to decide. What is your definition of slow? Do you mean it's like 1 fps or something?

As others have said, first try changing
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 32 );
to
SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, 24 );

Very few cards support a 32-bit z-buffer.
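If you want to be defensive about it, you can also retry with progressively smaller depth buffers until SDL gives you a surface. A rough sketch, using the variables from the original post (behaviour when an attribute can't be satisfied does vary between drivers):

	const int depthSizes[] = { 24, 16 };										// Depth sizes to try, best first

	Screen = NULL;
	for ( int i = 0; i < 2 && Screen == NULL; ++i )
	{
		SDL_GL_SetAttribute( SDL_GL_DEPTH_SIZE, depthSizes[i] );				// Request the next depth size
		Screen = SDL_SetVideoMode( ScreenX, ScreenY, BitDepth, SDL_OPENGL );	// NULL means this attempt failed
	}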
