Flustration

Members
  • Content count: 6

Community Reputation

101 Neutral

About Flustration

  • Rank: Newbie
  1. Well, the app starts at the appropriate resolution and everything... I just lose input.
  2. Okay, I see that the red pixels were actually ENTIRELY by my own hand... however, I cannot get blending to actually work. When I swap the values in the glBlendFunc call, I get a black screen, BTW. [img]http://i.imgur.com/vblF2.png[/img]
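For reference, my understanding is that the conventional straight-alpha setup is just these two calls:

[CODE]
// Conventional straight-alpha blending: the incoming color is weighted
// by its alpha, and the existing color by the remainder.
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
[/CODE]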
  3. All right... now that I have seen your edit, I have removed the alpha testing stuff altogether. Here is what I get now: [img]http://i.imgur.com/2zYZ6.png[/img]
  4. All right, here goes. First, I tried just changing that line:

[CODE]
glEnable( GL_TEXTURE_2D );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_BLEND);
glAlphaFunc(GL_NOTEQUAL, 1);
glEnable(GL_ALPHA_TEST);
[/CODE]

I got this output (there are barely-visible blue pixels where there should be transparency):

[img]http://i.imgur.com/QVBMj.png[/img]

So then I changed the alpha func around:

[CODE]
glAlphaFunc(GL_GREATER, 0);
[/CODE]

But I got this output, totally black:

[img]http://i.imgur.com/QVBMj.png[/img]
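For comparison, the usual cutout pattern I have seen uses a midpoint threshold. The glAlphaFunc reference value is a float clamped to [0, 1], so if I read the spec correctly, GL_NOTEQUAL with 1 passes only the texels whose alpha is NOT 1 and therefore discards the fully opaque ones:

[CODE]
// Typical alpha-test cutout: keep texels that are more than half opaque.
// The reference value is a clamped float; GL_NOTEQUAL with 1.0 would
// instead drop exactly the fully opaque texels.
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);
[/CODE]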
  5. Hello, all. I am getting red backgrounds on my textures, which should be transparent... Here are my source image files:

[url="http://dl.dropbox.com/u/31186025/slider_bar.tga"]http://dl.dropbox.co.../slider_bar.tga[/url]
[url="http://dl.dropbox.com/u/31186025/slider_slider.tga"]http://dl.dropbox.co...ider_slider.tga[/url]

Here are all my OpenGL options:

[CODE]
glEnable( GL_TEXTURE_2D );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glBlendFunc(GL_ONE_MINUS_SRC_ALPHA, GL_SRC_ALPHA);
glEnable(GL_BLEND);
glAlphaFunc(GL_NOTEQUAL, 1);
glEnable(GL_ALPHA_TEST);
glClearColor( 0.0f, 0.0f, 0.0f, 0.0f );
glViewport( 0, 0, 900, 600 );
glClear( GL_COLOR_BUFFER_BIT );
glMatrixMode( GL_PROJECTION );
glLoadIdentity();
glOrtho(0.0f, 900, 600, 0.0f, -1.0f, 1.0f);
glMatrixMode( GL_MODELVIEW );
glScalef(1, -1, 1);
glDisable(GL_DEPTH_TEST);
glLoadIdentity();
[/CODE]

Here is my texture loading/initialization code:

[CODE]
texture::texture(SDL_Surface *rawSurface)
{
    if(listInitialized == false)
    {
        initList();
    }

    // Grab the first free slot in the texture bank.
    for(int a = 0; a < 256; a++)
    {
        if(textureFree[a] == true)
        {
            glTexture = a;
            numTextures++;
            textureFree[a] = false;
            break;
        }
    }

    SDL_PixelFormat pixF = *rawSurface->format;
    std::cout << "BPP : " << (int) pixF.BytesPerPixel;

    SDL_Surface *surface;
    textureRect.w = nextPowerOfTwo(rawSurface->w);
    textureRect.h = nextPowerOfTwo(rawSurface->h);
    textureRect.x = 0;
    textureRect.y = 0;
    imageRect.w = rawSurface->w;
    imageRect.h = rawSurface->h;
    imageRect.x = 0;
    imageRect.y = 0;
    paddingW = textureRect.w - imageRect.w;
    paddingH = textureRect.h - imageRect.h;

    // Where within the texture surface is the actual image?
    SDL_Rect rectDestination;
    rectDestination.w = imageRect.w;
    rectDestination.h = imageRect.h;
    rectDestination.x = paddingW;
    rectDestination.y = paddingH;

    GLint nOfColors = rawSurface->format->BytesPerPixel;
    GLenum texture_format;
    if (nOfColors == 4)      // contains an alpha channel
    {
        if (rawSurface->format->Rmask == 0x000000ff)
        {
            std::cout << "format RGBA\n";
            texture_format = GL_RGBA;
        }
        else
        {
            std::cout << "format BGRA\n";
            texture_format = GL_BGRA;
        }
    }
    else if (nOfColors == 3) // no alpha channel
    {
        if (rawSurface->format->Rmask == 0x000000ff)
        {
            std::cout << "format RGB\n";
            texture_format = GL_RGB;
        }
        else
        {
            std::cout << "format BGR\n";
            texture_format = GL_BGR;
        }
    }

    surface = SDL_CreateRGBSurface(SDL_SWSURFACE, textureRect.w, textureRect.h, 32,
                                   rawSurface->format->Rmask, rawSurface->format->Gmask,
                                   rawSurface->format->Bmask, rawSurface->format->Amask);
    SDL_FillRect(surface, &textureRect, SDL_MapRGBA(surface->format, 255, 0, 0, 9));

    SDL_SetAlpha(rawSurface, 0, SDL_ALPHA_OPAQUE);
    SDL_SetAlpha(surface, 0, SDL_ALPHA_OPAQUE);
    SDL_BlitSurface(rawSurface, &imageRect, surface, &rectDestination);
    SDL_SetAlpha(rawSurface, SDL_SRCALPHA, 0);
    SDL_SetAlpha(surface, SDL_SRCALPHA, SDL_ALPHA_OPAQUE);

    glBindTexture(GL_TEXTURE_2D, textureBank[glTexture]);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, 4, surface->w, surface->h, 0,
                 texture_format, GL_UNSIGNED_BYTE, surface->pixels);

    if(surface) { SDL_FreeSurface( surface ); }
}
[/CODE]

and finally, rendering...
[CODE]
void texture::drawTexture(int x, int y)
{
    x = x - paddingW;

    // Bind the texture to which subsequent calls refer.
    glBindTexture( GL_TEXTURE_2D, textureBank[glTexture] );
    glColor4f(1, 1, 1, 1);

    glBegin( GL_QUADS );
        // Bottom-left vertex (corner)
        glTexCoord2i( 0, 0 );
        glVertex3f( x, (y + textureRect.h), 0.0f );
        // Bottom-right vertex (corner)
        glTexCoord2i( 1, 0 );
        glVertex3f( (x + textureRect.w), (y + textureRect.h), 0.f );
        // Top-right vertex (corner)
        glTexCoord2i( 1, 1 );
        glVertex3f( (x + textureRect.w), y, 0.f );
        // Top-left vertex (corner)
        glTexCoord2i( 0, 1 );
        glVertex3f( x, y, 0.f );
    glEnd();
}
[/CODE]

And my output...

[img]http://i.imgur.com/AS6RM.png[/img]

Halp :) Thanks
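For reference: since the loader blits the image at an offset of (paddingW, paddingH) inside the power-of-two surface, the texture coordinates could be limited to that sub-rectangle instead of spanning the whole padded texture. A rough, untested sketch keeping drawTexture's vertex ordering (it would also make the x = x - paddingW shift unnecessary):

[CODE]
// Sample only the image's sub-rectangle of the padded texture.
// The image occupies u in [paddingW/texW, 1] and v in [paddingH/texH, 1],
// because the blit destination above is (paddingW, paddingH).
float u0 = (float) paddingW / textureRect.w;
float v0 = (float) paddingH / textureRect.h;

glBindTexture( GL_TEXTURE_2D, textureBank[glTexture] );
glBegin( GL_QUADS );
    glTexCoord2f( u0,   v0   ); glVertex3f( x,               y + imageRect.h, 0.0f );
    glTexCoord2f( 1.0f, v0   ); glVertex3f( x + imageRect.w, y + imageRect.h, 0.0f );
    glTexCoord2f( 1.0f, 1.0f ); glVertex3f( x + imageRect.w, y,               0.0f );
    glTexCoord2f( u0,   1.0f ); glVertex3f( x,               y,               0.0f );
glEnd();
[/CODE]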
  6. Hello! My OpenGL application is based on SDL. For some reason, I receive input fine via SDL_Event, unless I add the SDL_FULLSCREEN flag. In full-screen mode, strangely, I get no input events at all.

[CODE]
// Start SDL
if( SDL_Init( SDL_INIT_EVERYTHING ) != 0 )
{
    printf("SDL failed to initialize: %s\n", SDL_GetError());
    return 1;
}

// Set up the screen
SDL_Surface* screen;
SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);

if((screen = SDL_SetVideoMode( 900, 600, 32, SDL_OPENGL | SDL_FULLSCREEN )) == NULL)
{
    printf("Error creating SDL surface: %s\n", SDL_GetError());
}
[/CODE]

Thanks much!
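For context, the input handling is a standard SDL 1.2 polling loop; a minimal sketch of the pattern (not my exact code) looks like this:

[CODE]
// Minimal SDL 1.2 event loop: poll all pending events every frame.
SDL_Event event;
bool running = true;
while (running)
{
    while (SDL_PollEvent(&event))
    {
        if (event.type == SDL_QUIT)
            running = false;
        else if (event.type == SDL_KEYDOWN && event.key.keysym.sym == SDLK_ESCAPE)
            running = false;   // handy escape hatch when testing fullscreen
    }

    // ... render the frame here ...
    SDL_GL_SwapBuffers();
}
[/CODE]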