

Member Since 01 Aug 2004
Offline Last Active Jan 05 2015 01:29 AM

Topics I've Started

Memory allocation in changing a struct to a class...

14 January 2008 - 06:47 PM

So, I've been playing around with optimizing some older code (in C++ / DirectX 9) and came across a bunch of struct declarations I had for vertex types. Each of them shared some common attributes (an x/y/z for the vertex, and a method to get the size of the struct), so I did the logical (in my mind) thing and simplified by creating a base class with the common attributes and deriving the other vertex types from it. That in turn would eliminate a lot of nearly duplicate functions for setting up vertex buffers, etc. So essentially, I went from the following example (simplified for illustration):
struct SingleVertNoColor {
  float x, y, z;
  size_t getSize() { return sizeof(SingleVertNoColor); }
};

struct SingleVertWithColor {
  float x, y, z;
  int32 color;
  size_t getSize() { return sizeof(SingleVertWithColor); }
};

... to the following code:
class IVertexType {
public:
  float x, y, z;
  virtual size_t getSize() = 0;
};

class SingleVertNoColor : public IVertexType {
public:
  virtual size_t getSize() { return sizeof(SingleVertNoColor); }
};

class SingleVertWithColor : public IVertexType {
public:
  int32 color;
  virtual size_t getSize() { return sizeof(SingleVertWithColor); }
};

Now, in the struct versions, sizeof(SingleVertNoColor) is 12 and sizeof(SingleVertWithColor) is 16, which is expected. In the class versions there is, of course, the overhead of the vtable, so sizeof(SingleVertNoColor) becomes 16 and sizeof(SingleVertWithColor) jumps to 20. The extra size by itself isn't a problem, but when I lock my LPDIRECT3DVERTEXBUFFER9 and copy my vertices into it, the compiler always seems to put the vtable pointer at the beginning of the memory layout of the derived classes.

Is there any way to tell the compiler to push all member variables to the front of the class's memory layout? Or, since I'm declaring my vertices with customized D3DVERTEXELEMENT9 items (i.e. non-FVF), is there any way to account for the extra vtable pointer in the declaration (with its usage set to ignore / nothing)? Even with that second idea, would I be guaranteed that my members would stay in the same order? Any thoughts? Much appreciated in advance.

DirectX matrix problem - what am I missing?

17 February 2006 - 09:54 AM

I'm missing something here. I'm too used to doing things in OpenGL, but I'm trying my hand at some DirectX now, and I'm perplexed by this. I >think< it should work, but I seem to be missing something. Say I have a simple triangle with vertices:
  v1 = {250.0f, 100.0f, 0.5f}
  v2 = {400.0f, 350.0f, 0.5f}
  v3 = {100.0f, 350.0f, 0.5f}
The X and Y values of each vertex are in screen coordinates, and without touching the world, view, or projection matrices, the triangle is drawn exactly at those screen positions, which is what I expect. Now, I tried applying a simple view matrix (based on a camera at (0, 3, -5), looking at (0, 0, 0)) and a projection matrix based on my screen, and the triangle is still in the same place (i.e. it doesn't look like any transform took place!). Here are the relevant code snippets (I know it's not nearly optimal, but it's just a test):
// assume this is created correctly elsewhere
IDirect3DDevice9 *d3dDevice;

void initialize() {
  // this function is only called once
  // at the beginning of the program
  D3DXMATRIX identity;
  D3DXMatrixIdentity( &identity );
  // set the world matrix to the identity matrix
  d3dDevice->SetTransform( D3DTS_WORLD, &identity );
  // set the view matrix to a default too...
  d3dDevice->SetTransform( D3DTS_VIEW, &identity );
  // set our projection matrix
  float screenWidth  = 1024.0f;
  float screenHeight = 768.0f;
  float screenAspect = screenWidth / screenHeight;
  D3DXMATRIX projectionMatrix;
  D3DXMatrixPerspectiveFovLH(&projectionMatrix, D3DX_PI/4, screenAspect, 0.5f, 1000.0f);
  d3dDevice->SetTransform(D3DTS_PROJECTION, &projectionMatrix);
}

void render() {
  // this function is called once per frame
  D3DXMATRIX matView;
  D3DXMatrixLookAtLH(&matView, &D3DXVECTOR3(0.0f, 3.0f, -5.0f),
                               &D3DXVECTOR3(0.0f, 0.0f, 0.0f),
                               &D3DXVECTOR3(0.0f, 1.0f, 0.0f));
  d3dDevice->SetTransform(D3DTS_VIEW, &matView);

  d3dDevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, D3DCOLOR_COLORVALUE(0, 0, 0, 1), 1.0f, 0);

  d3dDevice->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME);
  d3dDevice->SetRenderState(D3DRS_LIGHTING, TRUE);
  d3dDevice->SetRenderState(D3DRS_AMBIENT, D3DCOLOR_XRGB(255, 255, 255));
  // draw my vertex buffer object here...
  d3dDevice->Present(NULL, NULL, NULL, NULL);
}
Any ideas? I must be missing something... it looks like Direct3D is always interpreting the vertices I pass in as already-transformed vertices...

I need some of what they're smoking...

14 February 2006 - 01:17 PM

These are videos from (I believe) a show on television in Amsterdam (correct me if I'm wrong)... I'm not sure what's more odd, the opening song with children dancing like they're in the "Thriller" video, a song where Big (the pig) is farting throughout, the "Barney-esque" sing along with more fart humor, or what sounds like a techno-remixed Wiggles' song (complete with live video of them recording the song) ... just thought I'd share the crack of the day...

sizeof() returning strange value?

14 July 2005 - 04:29 PM

Hi all. I've come across something that is stumping me to no end at the moment. I'm working in C++ (using Visual Studio .NET 2003), and while debugging I came across something odd that I can't explain. Given the following code:
#include <xmmintrin.h> // for __m128

typedef unsigned long int32;

__declspec(align(16)) union float4 {
	__m128 m;
	float v[4];
	struct {
		float x;
		float y;
		float z;
		float w;
	};
};

struct SingleVertWithColor {
	float4	vert;
	int32	color;
};

Shown above are just the data types. Anyhow, when I use the debugger and check sizeof(float4), it returns 16, and sizeof(int32) returns 4. So sizeof(SingleVertWithColor) should return 24, correct? For whatever reason, sizeof(SingleVertWithColor) returns 32. I've tried taking out the __declspec and the __m128 pieces, and in my project settings the struct member alignment is set to the default; same results no matter what. Any idea what I may be missing here? Thanks!

Slow DirectX Performance

04 March 2005 - 01:38 PM

Just got a more general DirectX-related question. The laptop I'm currently developing a small 3D / 2D GUI engine on has an ATI Radeon Mobility 9600 with 128 MB of RAM (with the latest drivers installed). It's a pretty quick machine, with a Pentium M 1.5 GHz and 512 MB of PC2700 DDR RAM. Most 3D / FPS games run quite fine on it, but in my engine application, all I currently have is a single window drawing 2 sets of triangle strips without using vertex buffers (i.e. the stream is set to 0, with localized vertex buffers). Anyhow, in both release and debug mode, I'm only getting around 40 FPS! I think I might be running with the DirectX debug runtime libs, which might be the problem. Is there an easy way to switch between the debug and release runtime libraries? BTW, I'm using DirectX 9.0 (the Feb '05 release). I remember from the last time I touched DirectX (somewhere around version 6 or 7) there was a "switch" application installed with it; where is that for 9? Am I missing something? Thanks much.