

About CheeseMonger

  1. CheeseMonger

    A Simple Address Book

    SQLite might be ideal for this sort of thing. From the website: "SQLite is a small C library that implements a self-contained, embeddable, zero-configuration SQL database engine." http://www.sqlite.org/ There are bindings available for many languages, including ADO.NET, here: http://www.sqlite.org/cvstrac/wiki?p=SqliteWrappers
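    As a rough illustration of how little setup SQLite needs, here is a minimal sketch using the C API (assuming the sqlite3 development headers are installed; the table and column names are invented for this example, and ":memory:" is used instead of a file so the example is self-contained):

    ```cpp
    #include <sqlite3.h>
    #include <cassert>
    #include <cstdio>

    int main()
    {
        /* Open (or create) the address book database. For a real app  */
        /* you would pass a filename; ":memory:" keeps this example    */
        /* self-contained with no file on disk.                        */
        sqlite3 *db = 0;
        int rc = sqlite3_open(":memory:", &db);
        assert(rc == SQLITE_OK);

        /* Create a table and insert a contact through one API call. */
        const char *sql =
            "CREATE TABLE contacts (name TEXT, phone TEXT);"
            "INSERT INTO contacts VALUES ('Alice', '555-0100');";
        rc = sqlite3_exec(db, sql, 0, 0, 0);
        assert(rc == SQLITE_OK);

        sqlite3_close(db);
        std::printf("ok\n");
        return 0;
    }
    ```

    Note there is no server to install or configure; link against libsqlite3 and the whole engine lives inside your process.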
  2. CheeseMonger

    News of SGI selling OpenGL?

    The ARB is already an independent consortium that includes board members from both ATI and nVidia. Presumably, SGI means to sell off the trademarks and licensing rights associated with OpenGL that they protect so vehemently.
  3. CheeseMonger

    Is this evil?

    I know what the warning is there for, I explain as much in my original post. It just prompted me to wonder if it was good practice to use what should be (and indeed is) a valid use of initialiser lists like the example I gave despite it yielding a warning. The fact that there's a proviso for it in the standard is good enough for me.
  4. CheeseMonger

    Is this evil?

    Thanks SiCrane and Fruny. Exactly what I wanted to hear. Cheers.
  5. CheeseMonger

    Is this evil?

    Node::Node() : m_Parent(0), m_Child(0), m_Next(this), m_Prev(this) { }

    I get a warning for the above because by using the this pointer in an initialiser list, I'm passing a pointer to a not-yet-constructed object to the constructor of another object. However, I'm just storing the pointer and not doing anything with it, so I believe it's safe. Is this an evil and frowned upon idiom, or should I not worry about it? Thanks.
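    For what it's worth, the idiom does behave as expected at runtime. Here is a minimal self-contained sketch (the class is reduced to just the link pointers for illustration):

    ```cpp
    #include <cassert>
    #include <cstdio>

    struct Node
    {
        Node *m_Parent;
        Node *m_Child;
        Node *m_Next;
        Node *m_Prev;

        /* Storing 'this' in the initialiser list: the pointer value  */
        /* itself is valid even though the object is not yet fully    */
        /* constructed, as long as nothing dereferences it before     */
        /* construction finishes.                                     */
        Node() : m_Parent(0), m_Child(0), m_Next(this), m_Prev(this) { }
    };

    int main()
    {
        Node n;
        /* The node starts out as a self-linked circular list of one. */
        assert(n.m_Next == &n && n.m_Prev == &n);
        std::printf("self-linked\n");
        return 0;
    }
    ```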
  6. CheeseMonger

    Getting key fields in mysql

    Quote:Original post by sanscrit
    Ok, so here is the deal: I have a table in mysql that holds player data. The primary key for this table is playerid that is an auto-incremented value assigned by the db. When a character is created, they are added to this table as well as a few others. The problem is since this is an automatically generated field I don't know what it is until I add a player. The mechanic that I am using to do this now is to insert a row in the player table, then query that table for the playerid, and use it to add the player to other tables. This seems inefficient. Is there a better way to do this? Thanks

    Yes, there is a special function in MySQL especially for this purpose. You can either use the C API function mysql_insert_id() or run a normal SQL query:

    SELECT LAST_INSERT_ID()

    For example, you can insert the auto-generated id from the foo table into the bar table using the following idiom:

    INSERT INTO foo (auto,text) VALUES(NULL,'text')
    INSERT INTO bar (foo_id,text) VALUES(LAST_INSERT_ID(),'text')

    Refer to the manual for more info: http://dev.mysql.com/doc/refman/4.1/en/getting-unique-id.html
  7. CheeseMonger


    This article illustrates how to do what you are trying to do: http://www.matbooth.co.uk/articles/20050310/ It has sample source code, too.
  8. CheeseMonger

    Porting OpenGL Apps to Linux

    Aha, you're a genius, thanks! I guess the old method creates a single buffered display by default. It still didn't work in my real application though. That is, until I realised I was swapping win instead of gwin. How foolish of me. Thanks again. Works great now.
  9. CheeseMonger

    problem with writing a text

    If you look up the CreateFont() function that you have used on MSDN you will find this: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/gdi/fontext_8fp0.asp

    Quote:fdwCharSet
    [in] Specifies the character set. The following values are predefined:
    ANSI_CHARSET, BALTIC_CHARSET, CHINESEBIG5_CHARSET, DEFAULT_CHARSET, EASTEUROPE_CHARSET, GB2312_CHARSET, GREEK_CHARSET, HANGUL_CHARSET, MAC_CHARSET, OEM_CHARSET, RUSSIAN_CHARSET, SHIFTJIS_CHARSET, SYMBOL_CHARSET, TURKISH_CHARSET, VIETNAMESE_CHARSET
    Korean language edition of Windows: JOHAB_CHARSET
    Middle East language edition of Windows: ARABIC_CHARSET, HEBREW_CHARSET
    Thai language edition of Windows: THAI_CHARSET

    Try changing ANSI_CHARSET to HEBREW_CHARSET.
  10. CheeseMonger

    Porting OpenGL Apps to Linux

    Thanks for trying. And for spotting that mistake, I've corrected the original post. In that case, can anyone see where I've gone wrong? As far as I can see, I've followed the specification exactly... [Edited by - CheeseMonger on July 11, 2006 9:21:25 AM]
  11. CheeseMonger

    Porting OpenGL Apps to Linux

    I'm porting some of my OpenGL apps to Linux using Xlib and GLX and I've come across some strangeness that I can't account for in GLX. If you don't know already, some major changes were made in how GLX works to accommodate some new features when version 1.3 was introduced. It remains backwards compatible with GLX 1.2, of course, so there are now two distinct ways to initialise an OpenGL drawing surface. The GLX specs can be found here and IBM have some excellent documentation here if you need to re-familiarise yourself.

    Now, I can get the old 1.2 method to work fine, but I can't get the 1.3 method to work at all. I can't just use the old method, for two very good reasons:

    1. One of my apps uses an off-screen frame buffer, support for which was introduced in 1.3.
    2. I really, really, really want to know why the sodding thing doesn't work so I can tell myself the time I've spent puzzling over it wasn't wasted... ;-)

    I've spent hours today trying to figure it out and I can see absolutely no reason why it shouldn't work, so I want to see if it works on another machine, but I don't have a spare Linux PC handy to test it on. Here are a couple of proof-of-concept programs I've prepared to illustrate the problem:

    GLX 1.2 program: http://www.littlewhitecat.com/cheesemonger/glx12.out
    GLX 1.3 program: http://www.littlewhitecat.com/cheesemonger/glx13.out

    The source code to these programs follows below; save it as "glx.cpp" if you wish to attach a debugger. Both programs should create a yellow filled window for 5 seconds, but the 1.3 program doesn't even draw over what's underneath it. I've really no idea where it's failing because all the data seems to be present and correct according to the documentation and no X protocol errors are being thrown.

    My guess is it's something to do with the drawable surface created by glXCreateWindow(), but since I can't find any sample code that uses the 1.3 method that works on my system, I've no context for comparison. If it fails to work on everybody else's machines I can at least eliminate the possibility of my machine being at fault and having just spent half a day trying to debug correct code...

    Thank you for your time, especially if you've managed to test the programs I linked to above.

    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <GL/gl.h>
    #include <string.h>
    #include <unistd.h>

    /* Both methods create an X window in the same way */
    Window create_win(Display *dpy, XVisualInfo *vi)
    {
        XSetWindowAttributes swa;
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen), vi->visual, AllocNone);
        swa.border_pixel = 0;
        swa.event_mask = StructureNotifyMask;

        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen), 0, 0, 200, 200, 0,
                                   vi->depth, InputOutput, vi->visual,
                                   CWBorderPixel|CWColormap|CWEventMask, &swa);
        XMapRaised(dpy, win);
        return win;
    }

    /****************************/
    /* Old style GLX 1.2 method */
    /****************************/
    void setup_glx12(Display *dpy)
    {
        /* Find an RGBA visual */
        int attr[] = { GLX_RGBA, 0 };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attr);

        /* Create a GLX context */
        GLXContext cx = glXCreateContext(dpy, vi, 0, true);

        /* Create an X window */
        Window win = create_win(dpy, vi);

        /* Connect the context to the window */
        glXMakeCurrent(dpy, win, cx);
    }

    /****************************/
    /* New style GLX 1.3 method */
    /****************************/
    void setup_glx13(Display *dpy)
    {
        /* Find an RGBA frame buffer config; no attribute list is */
        /* needed since GLX_RGBA_BIT is a default attribute.      */
        int n;
        GLXFBConfig *fbc = glXChooseFBConfig(dpy, DefaultScreen(dpy), 0, &n);
        XVisualInfo *vi = glXGetVisualFromFBConfig(dpy, fbc[0]);

        /* Create a GLX context using the best matched frame buffer config. */
        GLXContext cx = glXCreateNewContext(dpy, fbc[0], GLX_RGBA_TYPE, 0, true);

        /* Create an X window */
        Window win = create_win(dpy, vi);

        /* Create a GLX window using the same frame buffer config that we */
        /* used for the GLX context.                                      */
        GLXWindow gwin = glXCreateWindow(dpy, fbc[0], win, 0);

        /* Connect the context to the window for read and write */
        glXMakeContextCurrent(dpy, gwin, gwin, cx);
    }

    int main(int, char **)
    {
        /* Connect */
        Display *dpy = XOpenDisplay(0);

        /* Set up an OpenGL capable window using one of the following methods: */
        /* Uncomment only one at a time. */
        /* setup_glx12(dpy);  /* Old style GLX 1.2 method */
        setup_glx13(dpy);     /* New style GLX 1.3 method */

        /* Clear the buffer to yellow */
        glClearColor(1, 1, 0, 1);
        glClear(GL_COLOR_BUFFER_BIT);
        glFlush();

        /* Pause before exiting */
        sleep(5);
    }

    [Edited by - CheeseMonger on July 11, 2006 8:20:10 AM]
  12. CheeseMonger

    SHA1 padding

    You should indeed let the padding spill over into the next 512-bit block. In the C reference implementation of SHA-1 in section 7 of RFC 3174, the comments in the SHA1PadMessage() function on page 17 state:

    /* Check to see if the current message block is too small to hold
     * the initial padding bits and length. If so, we will pad the
     * block, process it, and then continue padding into a second
     * block.
     */

    Hope this helps.
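    To make the boundary concrete, here is a small sketch (my own helper, not part of RFC 3174) that computes how many 512-bit blocks a padded message occupies. A 64-byte block has room for at most 55 message bytes once the mandatory 0x80 padding byte and the 8-byte message length are appended:

    ```cpp
    #include <cassert>
    #include <cstddef>
    #include <cstdio>

    /* Number of 64-byte SHA-1 blocks needed for a message of 'len' bytes, */
    /* counting the mandatory 0x80 padding byte and the 8-byte bit length. */
    std::size_t sha1_blocks(std::size_t len)
    {
        return (len + 1 + 8 + 63) / 64;
    }

    int main()
    {
        assert(sha1_blocks(55) == 1);  /* just fits in one block           */
        assert(sha1_blocks(56) == 2);  /* padding spills into a second     */
        assert(sha1_blocks(64) == 2);  /* a full block always spills over  */
        std::printf("%zu %zu\n", sha1_blocks(55), sha1_blocks(56));
        return 0;
    }
    ```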
  13. CheeseMonger

    XServer and OpenGL

    Some further reading reveals:

    Quote:Bool glXMakeCurrent(Display *dpy, GLXDrawable drawable, GLXContext ctx)
    BadMatch is generated if drawable was not created with the same X screen and visual as ctx. It is also generated if drawable is None and ctx is not NULL.

    So my guess is that specifying an attribute list for an X visual that cannot be met (causing drawable to be None) is the cause of the BadMatch errors. The glxinfo output you posted does not indicate any visuals with a 32-bit buffer (they are all 16-bit), so assuming the WX_GL_RGBA attribute requests a 32-bit buffer, the BadMatch error it causes backs up this analysis. You could try explicitly setting the buffer size to 16 with the WX_GL_BUFFER_SIZE attribute instead. I hope this helps...

    As an aside, invoking "glxinfo -t" gives a much more readable output.
  14. CheeseMonger

    XServer and OpenGL

    Some information on X protocol errors: http://www.rahul.net/kenton/perrors.html

    Quote:BadMatch errors occur when only specific values are acceptable, but another value is provided. The valid values may be a small set of enumerated integers or they may be a relation between other arguments, e.g., a graphics context in a drawing request must have the same depth as the drawing window. There is rarely more than one possible BadMatch error for any particular request type, so identifying the problem is usually straightforward. In my experience, most BadMatch errors are related to drawable depths. Make sure your windows, pixmaps, visual types, colormaps, etc. have the correct depths in your X requests.

    You could be passing invalid data to glXMakeCurrent(). Make sure calls to XOpenDisplay(), XCreateWindow() and glXCreateContext() are not failing for any reason.
  15. CheeseMonger

    Delphi Object + WndProc

    I don't know Delphi, but from what I've been reading, I get the impression that "self" is already a pointer, like the "this" pointer in C++. This would mean that "@self" would return a pointer to a pointer, which is probably not what you want and could be the cause of an access violation somewhere.
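    The C++ analogue of the suspected bug would look something like this sketch (the Widget class and its remember() setter are invented stand-ins for a SetWindowLong-style call that stores an opaque pointer):

    ```cpp
    #include <cassert>
    #include <cstdio>

    struct Widget
    {
        /* Stand-in for an API that stores an opaque pointer which a */
        /* window procedure would later cast back to the object.     */
        void *stored;
        void remember(void *p) { stored = p; }
    };

    int main()
    {
        Widget w;
        Widget *self = &w;   /* 'self' is already a pointer...          */

        w.remember(self);    /* right: passes the object's address      */
        assert(w.stored == &w);

        w.remember(&self);   /* wrong: the '@self' equivalent, a        */
                             /* pointer to the pointer, i.e. the        */
                             /* address of a local stack variable       */
        assert(w.stored != static_cast<void *>(&w));

        std::printf("ok\n");
        return 0;
    }
    ```

    Casting that stack address back to an object later would read garbage, which is consistent with an access violation.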