Textures messed up! (Scaled)

Started by
25 comments, last by Riddle 18 years, 9 months ago
Quote:Original post by Riddle
128/128
128/128
800/640 ( Same problem with all resolutions )


If I understood you correctly, you made this mistake:

800/640 (window size) - this is not a resolution.
128/128 (render target size) - this is your resolution.

All internal rendering is done using the render target (buffer), and after everything is rendered (and in this case, somewhat screwed up, as you say), it's scaled to fit the window size (it's too late by then).

Am I right?
/def
Hmm! I thought by render target you meant the object the texture is put on!

I don't really understand what you mean by render target!



And here's one more picture (so you can see what I mean):

[image hosted on ImageShack]

[Edited by - Riddle on June 24, 2005 5:20:50 PM]
Quote:Original post by Riddle
I don't really understand what you mean by render target!


Basically it's the pixel buffer that the card is writing on. All rendering is performed on it. At the very end of the process, its contents are scaled and put into the window.

Quote:
How do I setup and get the render target size?

((quoted phrase magically disappeared))

It depends on what you use as a layer between the app and GL. I'm using GLFW, so it does this for me, but only as the app starts. I cannot change it later (at least not by enlarging the window).
How do you init the window and GL in the first place?
/def
Quote:
Basically it's the pixel buffer that the card is writing on. All rendering is performed on it. At the very end of the process, its contents are scaled and put into the window.


I didn't know this was the rendering target! Heh. (:
Quote:
It depends on what you use as a layer between the app and GL. I'm using GLFW, so it does this for me, but only as the app starts. I cannot change it later (at least not by enlarging the window).
How do you init the window and GL in the first place?

I use Tao.Sdl to create my window:


if( Sdl.SDL_Init( Sdl.SDL_INIT_VIDEO ) < 0 )
{
    throw new Water.Exceptions.WaterException( SDLStrings.CouldntInitSDLX, Sdl.SDL_GetError() );
}

int flags = Sdl.SDL_OPENGL;
if( isFullScreen == true )
{
    flags |= Sdl.SDL_FULLSCREEN;
}

IntPtr videoPtr = Sdl.SDL_GetVideoInfo();
if( videoPtr == IntPtr.Zero )
{
    throw new Water.Exceptions.WaterException( SDLStrings.AttemptToGetSDLVideoInfoFailed );
}

Sdl.SDL_VideoInfo videoInfo = (Sdl.SDL_VideoInfo)System.Runtime.InteropServices.Marshal.PtrToStructure(
    videoPtr, typeof( Sdl.SDL_VideoInfo ) );

if( videoInfo.hw_available != 0 )
    flags |= Sdl.SDL_HWSURFACE;
else
    flags |= Sdl.SDL_SWSURFACE;

if( videoInfo.blit_hw != 0 )
    flags |= Sdl.SDL_HWACCEL;

Sdl.SDL_GL_SetAttribute( Sdl.SDL_GL_RED_SIZE, 8 );
Sdl.SDL_GL_SetAttribute( Sdl.SDL_GL_GREEN_SIZE, 8 );
Sdl.SDL_GL_SetAttribute( Sdl.SDL_GL_BLUE_SIZE, 8 );
Sdl.SDL_GL_SetAttribute( Sdl.SDL_GL_DEPTH_SIZE, 16 );
Sdl.SDL_GL_SetAttribute( Sdl.SDL_GL_DOUBLEBUFFER, 1 );

surface = Sdl.SDL_SetVideoMode( width, height, colorDepth, flags );
Quote:Original post by Riddle
I use Tao.Sdl to createmy window:
*** Source Snippet Removed ***


Looks fine to me. I think I can assume the render target is the same as the window size at creation time.

How about rendering a larger texture? How does that look? Gather information. Try to narrow this flaw down a bit...
/def
What are those red lines in the last image?

Anyway Riddle, there's no problem. You just get aliasing in the edges of the characters. If your card supports anti-aliasing, turn it on and you'll probably see some improvement.

Quote:
Quote:Original post by Riddle
I don't really understand what you mean by render target!


Basically it's the pixel buffer that the card is writing on. All rendering is performed on it. At the very end of the process, its contents are scaled and put into the window.


Quote:
How do I setup and get the render target size?



((quoted phrase magically disappeared))

It depends on what you use as a layer between the app and GL. I'm using GLFW, so it does this for me, but only as the app starts. I cannot change it later (at least not by enlarging the window).
How do you init the window and GL in the first place?
/def


Are you sure you're not confusing some things? The term "render targets" is used in an entirely different context, when you want to render to surfaces other than the framebuffer (to a texture, for example). Riddle is definitely not using render targets. When antialiasing is on, things get rendered into a bigger buffer than the window (2x, 4x, 8x...) and then get downsampled, but I imagine that's not what you mean. You don't set up a "render target size"; you just set up the window size and the viewport size with glViewport(). What do you mean, "I cannot change it later"? Sure you can. Resize the window, resize the viewport and you're done.
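[Editor's note: the "resize the window, resize the viewport" step can be sketched against the underlying C API. This is an illustrative SDL 1.2-era fragment, not code from the thread; `flags` is assumed to be the same flags value passed to the original SDL_SetVideoMode call.]

#include <SDL/SDL.h>
#include <GL/gl.h>

/* Sketch only: after a window resize, re-create the GL drawable at the
   new size and make the viewport cover it, so rendering is no longer
   stretched from the old resolution. */
void on_resize(int w, int h, Uint32 flags)
{
    SDL_SetVideoMode(w, h, 0, flags);  /* drawable now matches the window */
    glViewport(0, 0, w, h);            /* map clip space onto the new size */
}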
Quote:
What are those red lines in the last image?


I drew them to show you the lines in the original (left) image.

Quote:
Anyway Riddle, there's no problem. You just get aliasing in the edges of the characters. If your card supports anti-aliasing, turn it on and you'll probably see some improvement.

FSAA only improves the edges. The problem is inside the object!



I just tried three different objects... and no problems!
1. Very small texture: 16/16, no alpha
2. Very big texture: 512/512, no alpha
3. Very big texture: 512/512, with alpha

I don't know why this texture gives me problems... maybe because it has only a few
pixels and then alpha again... hmm!


Quote:
FSAA only improves the edges. The problem is inside the object!


FSAA = Full-Screen Anti-Aliasing. It doesn't matter whether the jaggies are on the polygon edges or inside the texture. Have you tried it? Anyway, as I said, the result you get is perfectly normal.
Quote:Original post by Riddle
Hi!

I added:
Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MIN_FILTER, Gl.GL_LINEAR );
Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MAG_FILTER, Gl.GL_LINEAR );
Looks like the normal effects of a bilinear filter to me.

Quote:
FSAA- Full Screen Anti Aliasing. It doesn't matter if the jaggies are the polygon edges or inside the texture. Have you tried it? Anyway, as I said, the result you get is perfectly normal.


Oh, thank you for this info! I will try it now.
::edit::
Hah! I activated FSAA... the strange lines didn't go away,
but now the object itself looks ugly in identity rotation too, because FSAA
screwed up the pixels...


Quote:
Looks like the normal effects of a bilinear filter to me.


So how can I make it look better?
Different values for:
Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MIN_FILTER, Gl.GL_LINEAR );
Gl.glTexParameteri( Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MAG_FILTER, Gl.GL_LINEAR );
?
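[Editor's note: a suggestion, not an answer from the thread — if the goal is crisp, unblended texels (e.g. for pixel-art sprites), the standard alternative value is GL_NEAREST, which picks the single closest texel instead of blending four. In the raw C API, which the Tao binding mirrors one-to-one as `Gl.GL_NEAREST`:]

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

GL_NEAREST trades the blur for blockiness; which looks better depends on the art.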

This topic is closed to new replies.
