opengl + sdl: is hardware accel automatic?

This topic is 3865 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.


I've got an app that runs at 25 to 33 fps under freeglut on an Ubuntu system; I'm trying to port it to Mac OS X. Building it with freeglut there gave me an fps of < 3, so I thought "maybe it's glut", since this Mac is new, has a beast of a video card in it, and 2x the RAM of the Ubuntu box... so I moved it over to SDL. I'm still getting < 3 fps.
void InitVideo()
{
	// INIT VIDEO SDL SIDE

	cout << "Video init..."; cout.flush();

	if (SDL_Init(SDL_INIT_VIDEO) < 0)
    {
      cout << "Unable to start SDL Video: " << SDL_GetError() << endl;
      exit(1);
    }
	atexit(SDL_Quit);

	const SDL_VideoInfo *vidinfo = SDL_GetVideoInfo();
	if (!vidinfo)
	{
		cout << "Video query failed: " << SDL_GetError() << endl;
		exit(1);
	}

	vidflags = SDL_OPENGL | SDL_FULLSCREEN;// | SDL_RESIZABLE | SDL_ANYFORMAT; // don't seem to need these?

	// graphics hardware check
//	if (vidinfo->hw_available)
//	{
//		vidflags |= SDL_HWSURFACE;
//		cout << "Hardware detected." << endl;
//	}
//	else
//		vidflags |= SDL_SWSURFACE;

	// hardware blits
//	if (vidinfo->blit_hw)
//	{
//		vidflags |= SDL_HWACCEL;
//	}
  
	SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   8);
	SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
	SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  8);
	SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);		// 16bit depth buffer size (enough?)

	SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);		// Turn on double buffering
	
	SDL_GL_SetAttribute(SDL_GL_ACCUM_RED_SIZE, 8);	// This should be at least 2x as deep as depth buffer
	SDL_GL_SetAttribute(SDL_GL_ACCUM_GREEN_SIZE, 8);	// ...which I believe means the entire accum buffer?
	SDL_GL_SetAttribute(SDL_GL_ACCUM_BLUE_SIZE, 8);	// (which is 16 and this totals at 32)
	SDL_GL_SetAttribute(SDL_GL_ACCUM_ALPHA_SIZE, 8);

	// changed from SCREEN_BPP to 0: a bpp of 0 lets SDL fall back to the
	// display's current format (SDL_ANYFORMAT above is commented out, so this
	// is what actually provides the fallback)
	draw_surface = SDL_SetVideoMode(WINDOW_WIDTH, WINDOW_HEIGHT, 0, vidflags);
	if (!draw_surface)
	{
		cout << "Unable to set video mode: " << SDL_GetError() << endl;
		exit(1);
	}

	cout << "done." << endl;

	// Set up keyboard with a repeat rate (so you can hold the button down for continuous effect)
	if (SDL_EnableKeyRepeat(SDL_DEFAULT_REPEAT_DELAY, SDL_DEFAULT_REPEAT_INTERVAL) == -1)
    cout << "NOTICE: Couldn't init keyboard repeat rate (should still run)." << endl;

	SDL_WM_SetCaption("Orbital Laser Array Vs. Giant Ants", "");
 // _SDL_resize(WINDOW_WIDTH, WINDOW_HEIGHT);

  cout << "SDL Inits complete." << endl;
  
  // INIT OPENGL
  //glClearDepth(1.0f);		// Sets depth buffer's "depth"
  //glEnable(GL_DEPTH_TEST);	// Enable it
  //glDepthFunc(GL_LESS);		// Set style of depth testing (GL_LESS is default.)
  
  //glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);	// gonna need this somewhere


}

There's my init code, with some of the stuff I've ruled out commented out (but left in for your evaluation). When the hardware-surface flags were being used they made zero difference. I'm not sure where to go from here, so if anyone has any thoughts at all to help me along, I would really appreciate it.

In further news, according to a video info query, my computer has no graphics hardware =(

This is on my new computer, a MacBook Pro - brand spankin' new (I opened the box a week ago). I got SDL via Fink and it's up to date (1.2.11, which is the most recent Mac version on the SDL website), so I'm not sure if there's something I should be doing that I'm not.

I built and tried the OpenGL example for Xcode off the website, and it too reports no video hardware found... however, it runs very smoothly for there being no video card detected, which leads me to believe the video query may be wrong?


const SDL_VideoInfo *vidinfo = SDL_GetVideoInfo();
if (!vidinfo)
{
	cout << "Video query failed: " << SDL_GetError() << endl;
	exit(1);
}

cout << "Hardware detected: " << (vidinfo->hw_available ? "yes" : "no") << endl;
cout << "VRAM: " << vidinfo->video_mem << endl;



Thanks again.

The video info you get back from that call relates to SDL's 2D blitting specifics rather than OpenGL. I haven't played with it on a Mac, but the latest SDL on Windows uses GDI (the non-DirectX, non-hardware-accelerated graphics API) and reports its values accordingly.

This thread will prove useful to you: similar question


