sigbus in SDL_FreeSurface (when debugging)


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

1 reply to this topic

#1 yogo1212   Members   -  Reputation: 106


Posted 26 April 2014 - 04:21 PM

Hi,

 

I'm currently experiencing a really weird problem with SDL2, and maybe one of you is brighter than I am...

 

Basically, I have this function:



function GetString(fontid: byte; thing: string): TEngineString;
var surf: PSDL_Surface;
begin
  Result := EngineString(thing, dispStringXSalt, dispStringDSalt);

  if not GameResourceHas(Result, grtTexture) then
  begin
    // So, why is this method still a catastrophe?

    surf := TTF_RenderUTF8_Blended(fonts[fontid], PChar(thing), foregroundColour);

{$IFDEF ENGINEDEBUG}
        if surf^.format^.BitsPerPixel = 32 then
{$ENDIF}
            GameResourceAdd(GameTextureLoader(Result, surf^.w, surf^.h,
                GL_RGBA, GL_UNSIGNED_BYTE, GL_RGBA, surf^.pixels), Result)
{$IFDEF ENGINEDEBUG}
        else
            raise Exception.Create('invalid pixelformat for font-rendering. ouch.');
{$ENDIF}


    SDL_FreeSurface(@surf);
  end;
end;  

And at runtime it raises a SIGBUS (inside SDL_SetPixelFormatPalette, called from SDL_FreeSurface)?!

This happens only when debugging - it doesn't matter whether through the IDE or with gdb.

If I don't debug, no error occurs and my little simulation works as normal...

 

I recently rewrote the whole method; this is the original:

function GetString(fontid: byte; thing: string): TEngineString;
var width, height: Longint; surf: PSDL_Surface;
begin
  Result := EngineString(thing, dispStringXSalt, dispStringDSalt);

  if not GameResourceHas(Result, grtTexture) then
  begin
    // So this method is a catastrophe

    TTF_SizeUTF8(fonts[fontid], PChar(thing), @width, @height);

    surf := TTF_RenderUTF8_Blended(fonts[fontid], PChar(thing), foregroundColour);

    if surf^.format^.BitsPerPixel = 32 then
      GameResourceAdd(GameTextureLoader(Result, width, height, GL_RGBA,
        GL_UNSIGNED_BYTE, GL_RGBA, surf^.pixels), Result)
    else
      raise Exception.Create('invalid pixelformat for font-rendering. ouch.');

    SDL_FreeSurface(@surf);
  end;
end;  

Surprisingly, this method works without raising an error when debugging :-/

But GetString will be called a lot and I want to optimise it.

 

I am totally puzzled, but I found this: "EDIT: Strange, I recompiled and it worked! EDIT: Recompiled again; it failed. EDIT: I just didn't check the debugger the first time :P"

(here http://www.cplusplus.com/forum/general/59965/)

and this is just how I feel.

 

kind regards, yogo1212


Edited by yogo1212, 26 April 2014 - 04:22 PM.



#2 yogo1212   Members   -  Reputation: 106


Posted 27 April 2014 - 05:07 AM

Update:

This version works - even with debugging enabled (note the extra declared variables):

function GetString(fontid: byte; thing: string): TEngineString;
var
	width, height: longint; surf: PSDL_Surface;
begin
	Result := EngineString(thing, dispStringXSalt, dispStringDSalt);

	if not GameResourceHas(Result, grtTexture) then
	begin
		// So, is this method still a catastrophe?

		//TTF_SizeUTF8(fonts[fontid], PChar(thing), @surf, @surf);

		surf := TTF_RenderUTF8_Blended(fonts[fontid], PChar(thing), foregroundColour);

{$IFDEF ENGINEDEBUG}
		if surf^.format^.BitsPerPixel = 32 then
{$ENDIF}
			GameResourceAdd(GameTextureLoader(Result, surf^.w, surf^.h,
				GL_RGBA, GL_UNSIGNED_BYTE, GL_RGBA, surf^.pixels), Result)
{$IFDEF ENGINEDEBUG}
		else
			raise Exception.Create('invalid pixelformat for font-rendering. ouch.');
{$ENDIF}

		SDL_FreeSurface(@surf);
	end;
end;

This makes things really interesting...

Because width and height are not used, they don't end up in the symbol table - but they are still allocated on the stack!

 

So, could the SIGBUS be a SIGSEGV in disguise?

Addresses on x86_64 are byte-aligned, and a memory mapping from one of the virtual addresses contained in the PixelFormat to a physical address on the stack is very unlikely...

 

I'm installing the SDL sources and I'll try to find the line responsible.

 

Has none of you ever heard of something like this?


Edited by yogo1212, 27 April 2014 - 05:08 AM.




