
roby65

Member Since 08 Aug 2007
Offline Last Active Jan 07 2013 11:22 AM

Posts I've Made

In Topic: Detecting game state from rendering calls within directx hook

06 January 2013 - 06:12 AM

Quote:
Hmm. Never did anything like this myself. Depending on what's loaded and the version of the application (which determines the offset in the heap where the menu-related state is kept), there's bound to be a byte that changes back and forth when you enter/exit the menu.
Try grabbing a dump of the memory allocated to the application, both inside and outside the menu, several times to find that byte, then look into ways of reading that location from your app.

But I've got no idea whether it would work. It just seems less hackish than looking at loaded textures...

That was going to be my second idea, but it's not reliable either:

this game has a lot of versions and the user could be running any of them, so I would have to find the offset for every game version, and if the game gets a new update, that would break the state detection until I find the new offset.

Also, the UK, EU and USA releases have different offsets, so every offset hunt would have to be done three times.

 

Roby


In Topic: [C++] Name above other players

29 August 2012 - 02:06 PM

Quote:
Yes, why not, with a little upscaling according to linear Z.

Otherwise the position of the point in 2D is (you can use the glm library):

vec4 proj = mul(vec4(vec3(worldPosition), 1), viewProjectionMatrix);
proj *= (1. / 2 * proj.w);
proj += vec4(0.5, 0.5, 0, 0);
vec2 screenPos = vec2(proj) * resolution; // resolution is a vec2.


I liked your code, but it's not working; maybe I did something wrong?

D3DXVECTOR3 worldPosition(100, 100, 500);
D3DMATRIX viewProjectionMatrix = lastProjection;
D3DXVECTOR4 proj;
D3DXVec4Transform(&proj, &D3DXVECTOR4(worldPosition, 1), &(D3DXMATRIX(viewProjectionMatrix)));
proj *= (1. / 2 * proj.w);
proj += D3DXVECTOR4(0.5, 0.5, 0, 0);
D3DXVECTOR2 screenPos = D3DXVECTOR2(proj.x, proj.y) * 1680 * 1050;
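For what it's worth, there are at least three problems in the code as posted: operator precedence makes `(1. / 2 * proj.w)` evaluate to `0.5 * proj.w` instead of `1 / (2 * proj.w)`; the final scale multiplies both coordinates by the single scalar `1680 * 1050` instead of scaling each axis by its own resolution; and the matrix must be the combined view × projection, not the projection alone. A corrected sketch of the usual world-to-screen transform, with `Vec4`/`Transform`/`WorldToScreen` as plain stand-ins for the D3DX types (D3DX multiplies a row vector by a row-major matrix):

```cpp
#include <array>

struct Vec4 { float x, y, z, w; };

// Row vector times row-major 4x4 matrix, as D3DXVec4Transform does.
Vec4 Transform(const Vec4& v, const std::array<float, 16>& m)
{
    return { v.x*m[0] + v.y*m[4] + v.z*m[8]  + v.w*m[12],
             v.x*m[1] + v.y*m[5] + v.z*m[9]  + v.w*m[13],
             v.x*m[2] + v.y*m[6] + v.z*m[10] + v.w*m[14],
             v.x*m[3] + v.y*m[7] + v.z*m[11] + v.w*m[15] };
}

// World position -> pixel coordinates. The perspective divide is 1/w,
// then the [-1,1] NDC range maps to [0,resolution] per axis; screen-space
// Y grows downward, hence the flip on the Y axis.
void WorldToScreen(const Vec4& world, const std::array<float, 16>& viewProj,
                   float resX, float resY, float& outX, float& outY)
{
    Vec4 p = Transform(world, viewProj);
    float invW = 1.0f / p.w;            // perspective divide
    float ndcX = p.x * invW;            // now in [-1, 1]
    float ndcY = p.y * invW;
    outX = (ndcX * 0.5f + 0.5f) * resX;
    outY = (1.0f - (ndcY * 0.5f + 0.5f)) * resY;
}
```

Points with `p.w <= 0` are behind the camera and should be rejected before the divide.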

In Topic: [C++]2D Game engine, global objects problem

12 June 2011 - 10:59 AM

Thanks for the replies!
I tried to think of a way not to couple the animation and texture managers, but they really are linked and I can't see another way.

Using the first method I changed my code, but now I'm stuck again and can't figure out how to fix this error:

uint cAnimationManager::CreateAnimation(std::vector<uint> Frames, Uint32 Delay)
{
    cAnimation anim(m_TextureManager);
    for (size_t i = 0; i < Frames.size(); i++)
    {
        anim.SetTexture(Frames[i], i);
        anim.SetDelay(Delay, i);
    }
    uint newid = m_Animations.size();
    m_Animations.insert(std::pair<uint, cAnimation>(newid, anim));

    return newid;
}


I get a "no appropriate default constructor available" error; how do I fix that?

EDIT: Maybe I should switch to storing pointers instead, right?
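The `CreateAnimation` body shown only needs a copy constructor, so the error most likely comes from some other access to the map, typically `std::map::operator[]`, which default-constructs the mapped value before assigning to it. `insert`/`find`/`at` avoid that requirement, so switching to pointers isn't necessary. A minimal sketch with a stand-in `Animation` type (not the real `cAnimation`):

```cpp
#include <map>

// A value type with no default constructor, like cAnimation(m_TextureManager).
struct Animation
{
    explicit Animation(int textureManager) : tm(textureManager) {}
    int tm;
};

// insert() only copies the value, so this compiles.
// animations[newId] = Animation(tm); would NOT, because operator[]
// default-constructs the mapped value first.
unsigned AddAnimation(std::map<unsigned, Animation>& animations, int textureManager)
{
    unsigned newId = static_cast<unsigned>(animations.size());
    animations.insert(std::make_pair(newId, Animation(textureManager)));
    return newId;
}
```

Lookups then go through `find()` or `at()` rather than `operator[]` for the same reason.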

In Topic: [SDL+OGL]Blend not working

20 September 2010 - 08:33 AM

Quote:
Original post by AndyEsser
You never clear the depth buffer either, change to this:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

Does the screen actually clear to red as you have it now?


Yes.

Anyway, I found the problem.
In case someone hits the same issue: it was the way the surface was being loaded into the texture, which didn't handle alpha correctly.
New code:


GLuint Common_CreateTextureFromSurface(SDL_Surface *surface)
{
    GLenum texture_format = GL_RGBA; // initialized so the fallthrough below is defined
    GLuint texture;
    int nOfColors = surface->format->BytesPerPixel;

    if (nOfColors == 4) // contains an alpha channel
    {
        if (surface->format->Rmask == 0x000000ff)
            texture_format = GL_RGBA;
        else
            texture_format = GL_BGRA;
    }
    else if (nOfColors == 3) // no alpha channel
    {
        if (surface->format->Rmask == 0x000000ff)
            texture_format = GL_RGB;
        else
            texture_format = GL_BGR;
    }
    else
    {
        printf("warning: the image is not truecolor.. this will probably break\n");
        MessageBox(0, "warning: the image is not truecolor", "Common_CreateTextureFromSurface", 0);
        // this error should not go unhandled
    }

    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);

    if (surface->format->Amask)
        gluBuild2DMipmaps(GL_TEXTURE_2D, 4, surface->w, surface->h, texture_format, GL_UNSIGNED_BYTE, surface->pixels);
    else
        gluBuild2DMipmaps(GL_TEXTURE_2D, 3, surface->w, surface->h, texture_format, GL_UNSIGNED_BYTE, surface->pixels);

    return texture;
}
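The format decision above depends only on `BytesPerPixel` and `Rmask`, so it can be checked without a GL context. A small sketch; the enum values here are illustrative stand-ins, not the real GL constants:

```cpp
#include <cstdint>

enum PixelLayout { LAYOUT_RGBA, LAYOUT_BGRA, LAYOUT_RGB, LAYOUT_BGR, LAYOUT_UNKNOWN };

// Same decision the texture loader makes: 4 bytes per pixel means an alpha
// channel is present; the red mask distinguishes RGB(A) from BGR(A) byte order.
PixelLayout ClassifySurface(int bytesPerPixel, uint32_t rmask)
{
    if (bytesPerPixel == 4)
        return rmask == 0x000000ff ? LAYOUT_RGBA : LAYOUT_BGRA;
    if (bytesPerPixel == 3)
        return rmask == 0x000000ff ? LAYOUT_RGB : LAYOUT_BGR;
    return LAYOUT_UNKNOWN; // palettised or low-depth surface: convert it first
}
```

For the `LAYOUT_UNKNOWN` case, converting the surface to a known 32-bit RGBA format before upload is usually more robust than warning and uploading anyway.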


In Topic: [SDL+OGL]Blend not working

20 September 2010 - 06:57 AM

Quote:
Original post by AndyEsser
Try changing your glClearColor() function to this:

*** Source Snippet Removed ***


Nope, that doesn't work either.

Current code:

void CEngine::InitGL(int w, int h)
{
    glEnable(GL_TEXTURE_2D);

    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);

    glViewport(0, 0, 640, 480);

    glClear(GL_COLOR_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();

    glOrtho(0.0f, w, h, 0.0f, -1.0f, 1.0f);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}

void Common_DrawTextureOffset(GLuint texture, int w, int h, float x, float y, float x2, float y2)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

    glBegin(GL_QUADS);
        // Bottom-left vertex (corner)
        glTexCoord2f(x, y);
        glVertex3f(0.0f, 0.0f, 0.0f);

        // Bottom-right vertex (corner)
        glTexCoord2f(x2, y);
        glVertex3f(w, 0.0f, 0.0f);

        // Top-right vertex (corner)
        glTexCoord2f(x2, y2);
        glVertex3f(w, h, 0.0f);

        // Top-left vertex (corner)
        glTexCoord2f(x, y2);
        glVertex3f(0.0f, h, 0.0f);
    glEnd();
}

void CEngine::BlitToScreen()
{
    SDL_GL_SwapBuffers();
    glClear(GL_COLOR_BUFFER_BIT);
}
