GDNet+ Basic
Community Reputation

624 Good

About cow_in_the_well

  1. Can anyone remember this old game?

    This may be a long shot, but I remember playing a really good indie-made shmup (circa 2004?). I can't remember its name or who made it, but I have a recollection it was announced by someone here. The clues I can give: it was retro pixel-art style, ran at an actual low old-school resolution (i.e. not just emulated pixeliness), and had some kind of radioactive/blue piece of toast as the developer's logo.

    I've been really wanting to play it again, but no amount of googling for radioactive toast has turned up anything. So I thought I'd see if anyone out there may remember it and have a copy available.

    Also, wow, it's been a long time since I've posted anything here. I guess I had a craving for nostalgia on all fronts.

    Thanks!

    -Thomas
  2. Bitmap fonts with GDI+

    Hmm, thanks for the idea - I might experiment with that... -Thomas
  3. Bitmap fonts with GDI+

    I'm trying to get ClearType to work when generating bitmap font glyphs (using GDI+ and copying the generated glyphs into a texture). It all works, except when I try to use TextRenderingHintClearTypeGridFit (via Graphics::SetTextRenderingHint), it renders the text with strange colours on the fringes of the characters and no alpha blending.

    General info:
    - The GdiPlus::Bitmap is PixelFormat32bppARGB
    - A Graphics object is created from the bitmap, and has TextRenderingHintClearTypeGridFit set as the text rendering hint
    - Graphics::Clear is called before calling DrawString
    - The compositing mode on the Graphics object is set to CompositingModeSourceOver

    I don't suppose anyone has encountered this before? :) I have tried with standard antialiasing (TextRenderingHintAntiAliasGridFit) and it works; however, some fonts don't seem to support anti-aliasing (e.g. SimSun), so I'm trying to get ClearType to work as well... Thanks. -Thomas
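    [Editor's note: the strange fringe colours are ClearType's subpixel RGB output, which GDI+ writes without a usable alpha channel. A common workaround (not from the original post) is to render the glyph twice - once over black, once over white - and recover per-channel coverage from the difference. A minimal sketch of just that arithmetic:]

```cpp
#include <cassert>
#include <cstdint>

// Recover per-channel alpha for a pixel rendered twice: once over a
// black background (b) and once over a white background (w).
//   over black:  b = a*c              (c = text colour channel, a in [0,1])
//   over white:  w = a*c + (1-a)*255
// Subtracting:   w - b = (1-a)*255   =>   a*255 = 255 - (w - b)
inline std::uint8_t recoverAlpha(std::uint8_t onBlack, std::uint8_t onWhite)
{
    int diff = int(onWhite) - int(onBlack); // (1-a)*255, clamped for safety
    if (diff < 0)   diff = 0;
    if (diff > 255) diff = 255;
    return std::uint8_t(255 - diff);
}
```

    [Applied per colour channel this yields a per-subpixel alpha; whether the extra texture memory is worth it over plain grayscale anti-aliasing is a judgment call.]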
  4. Mouse position + raw input

    Hey, thanks for the replies - thought I'd let you know what I ended up doing.

    a. For mouse buttons, I switched to simply using WM_LBUTTONDOWN/UP, etc. These messages pass the mouse position when the event occurred, and moving to these messages was pretty trivial (the only complication was I have to SetCapture to make sure the game client gets the mouse-ups).
    b. I still use WM_INPUT for mouse moves (to get high-res input); however, to attach a mouse position to the mouse-move events (with proper ballistics applied), I grab WM_MOUSEMOVE and attach the provided position to the most recent mouse-move event from WM_INPUT (since multiple WM_INPUT raw mouse-move events received in a single frame are accumulated into a single MouseEvent structure, this works well).
    c. For keyboard button events I use GetMessagePos (thanks, it was just what I was looking for!). I do this since mouse button and keyboard button events share the same KeyEvent structure.

    So it's a mix of raw input and Windows messages, which I think gives the best of both worlds. Cheers, Thomas
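    [Editor's note: the per-frame accumulation described in (b) can be sketched platform-independently. The MouseEvent name comes from the post; the function names are hypothetical:]

```cpp
#include <cassert>

// Per-frame mouse event: raw WM_INPUT deltas are summed, and the
// ballistics-applied position from WM_MOUSEMOVE is attached to it.
struct MouseEvent {
    int dx = 0, dy = 0;   // accumulated raw deltas (high-res)
    int x  = 0, y  = 0;   // last known cursor position (post-ballistics)
};

// Called for each WM_INPUT raw mouse packet received this frame.
void accumulateRawDelta(MouseEvent& e, int rawDx, int rawDy)
{
    e.dx += rawDx;
    e.dy += rawDy;
}

// Called for each WM_MOUSEMOVE; overwrites the attached position so the
// event always carries the most recent cursor coordinates.
void attachCursorPos(MouseEvent& e, int x, int y)
{
    e.x = x;
    e.y = y;
}
```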
  5. Hi, I've got raw input working for mouse input in my thing, and it works great for delta-based input (high-res mouse movement). For the GUI system, I'm just using GetCursorPos in the mouse button event handler to get the position of the hardware mouse (to do hit tests against buttons, etc.). This works; however, there is often a problem with the delay between the user actually pressing the button and when the event is pulled off the event queue by the scripts. This especially manifests itself when the FPS drops a bit and the player tries to drag and drop a UI element - they click on the icon in the GUI and start dragging, but by the time the GUI event handler script gets called for mouse-down, the actual position of the hardware mouse cursor is now outside the icon, so the hit test fails (and the drag operation doesn't start). In other words, the GetCursorPos result is different to when the physical event actually occurred.

    Now, WM_LBUTTONDOWN solves this problem by passing in the mouse position for when the event actually occurred, so I wanted to use this paradigm and pass position info with the button event to the scripts. Unfortunately, WM_INPUT is pre-ballistics, so calling GetCursorPos at that time isn't going to be accurate. In any case, there can still be a processing delay between the actual time the user pressed the button and when the WM_INPUT is pumped out of the window's message queue...

    So - my question is, is there any way to get a message-queue-synchronised position of the mouse in the Windows API? Kind of like how GetKeyState is synchronised, but instead it would be for mouse position... or am I barking up the wrong tree here? Cheers, Thomas
  6. WM_INPUT for keyboard - worth it?

    :D! My suspicions paid off - WM_CHAR/WM_DEADCHAR was still being called even though I was returning 0 from WM_INPUT. I set the RIDEV_NOLEGACY flag on my call to RegisterRawInputDevices and ToUnicodeEx works perfectly, just as expected :). Thanks guys. Everything is now going to plan :).
  7. WM_INPUT for keyboard - worth it?

    Nah, that doesn't make any difference unfortunately. Just as proof that it's not handling the dead keys for me, I press "'" twice in a row:

    ToUnicodeEx: result is -1 (39 0 8339 126)
    ToUnicodeEx: result is -1 (39 0 8339 126)

    You would have thought the second call would have returned 1, since pressing ' twice should display two ' characters (as it does in Notepad in Spanish mode). Hmm. Just thinking - where is the dead-key state held? I'm guessing the Windows keyboard layer holds it. How does it get reset? I'm wondering if DefWindowProc or something is calling ToUnicode itself and then resetting the state for when _I_ call ToUnicodeEx on the second key... investigating now...
  8. WM_INPUT for keyboard - worth it?

    I do receive WM_CHAR; it's just that I need to store which key event generated the char event (so I can decide later on whether I want to allow the char to go through to the scripts). Anyway, looking at the documentation, that's how I thought it'd behave. However, here's some output I'm getting from ToUnicodeEx (the four numbers are the resulting buffer; the first number is the only pertinent one).

    When I just press "e":
    ToUnicodeEx: result is 1 (101 0 0 0)

    When I press "'" and then "e" (with Spanish keyboard layout):
    ToUnicodeEx: result is -1 (39 0 28858 122)
    ToUnicodeEx: result is 1 (101 0 28860 122)

    (The last two values in the buffer seem to be garbage, as they change every time; in any case the return value is 1, indicating the first element should be considered.)

    The code:

    HKL kblayout = GetKeyboardLayout(0);
    uchar kbstate[256];
    if ( GetKeyboardState(kbstate) )
    {
        wchar_t buff[4] = {0,0,0,0};
        int res = ToUnicodeEx( vkey, rkb->MakeCode, kbstate, buff, ARRAY_SIZE(buff), 0, kblayout );
        INFO_MSG("ToUnicodeEx: result is %d (%d %d %d %d)\n", res, buff[0], buff[1], buff[2], buff[3]);
        ...

    Any ideas? Thanks :).
  9. WM_INPUT for keyboard - worth it?

    Cool - I am now mapping scan code to VK using MapVirtualKey, thanks Steve. I can distinguish between all lefts/rights now. My problem now is I need to associate the WM_CHAR with the WM_KEYDOWN that generated it, so I can no longer rely on just using WM_CHAR. :(

    I'm using ToUnicodeEx to determine the character based on keyboard layout, which works fine; however, I'm having a bit of a hard time with the dead keys. In the CEGUI link you posted they're manually mapping the ASCII key to a Unicode diacritic (the switch statement). This seems really horrible, but I can't find any Win32 functions in the keyboard layout stuff for doing this for me (the full list of possible diacritics is rather large). Anyone have any suggestions on how to do a proper diacritic key -> Unicode codepoint conversion without setting up a huge LUT myself (the actual conversion would surely be keyboard-layout specific anyway...)?

    Steve, re: VK_SHIFT - if you press left shift and then right shift (while still holding left shift), you only get the VK_SHIFT event for the first shift. You only get a WM_KEYUP for VK_SHIFT once you have released both shifts. Cheers.
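    [Editor's note: for illustration, the kind of dead-key composition table the post is trying to avoid looks roughly like this. This tiny sketch covers only the acute accent; a real table is keyboard-layout specific and much larger, which is why letting ToUnicodeEx track the dead-key state is preferable:]

```cpp
#include <cassert>

// Tiny illustrative dead-key composition table (acute accent only).
// Returns the precomposed codepoint, or the base character unchanged
// when no composition exists for the pair.
wchar_t composeDeadKey(wchar_t dead, wchar_t base)
{
    if (dead == L'\'') {
        switch (base) {
            case L'a': return 0x00E1; // á
            case L'e': return 0x00E9; // é
            case L'i': return 0x00ED; // í
            case L'o': return 0x00F3; // ó
            case L'u': return 0x00FA; // ú
        }
    }
    return base; // no composition: fall back to the base character
}
```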
  10. WM_INPUT for keyboard - worth it?

    Okay - found a reason to use WM_INPUT for keyboard. I want to get events for individual left shift/right shift. Unfortunately, WM_KEYDOWN/KEYUP is only sent once for VK_SHIFT even if you press both keys. WM_INPUT gets the keydown event for both (I still need to use GetAsyncKeyState to determine left or right, because it doesn't set RI_KEY_E0 in the flags).

    Also, since I need to remap the virtual keys to DIK (for compatibility purposes), I was at first excited because it appeared that RAWKEYBOARD::MakeCode was mapping to the DIK scan codes. Unfortunately, that doesn't seem to be the case for some keys - for example, the down arrow was being reported as Numpad 2 (yes, I know numpad 2 is down arrow, but I would have thought the two keys would be distinct at this low level). Am I missing something here? There doesn't seem to be much documentation on the MakeCode field. It'd be nice to not have to have a giant LUT to convert from the VK to the DIK :). Thanks. -Thomas
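    [Editor's note: the arrow/numpad collision happens because MakeCode is the base scan code; the keyboard sends an E0 prefix for the extended key, reported separately in RAWKEYBOARD::Flags as RI_KEY_E0. DirectInput's DIK_* codes mark extended keys by setting bit 7, so a plausible mapping (an assumption to verify per key, not a documented rule) is:]

```cpp
#include <cassert>

// DIK scan codes mark extended keys by setting bit 7 (0x80).
// Both keys below share make code 0x50; only the E0 prefix differs.
constexpr unsigned DIK_NUMPAD2 = 0x50; // numpad 2, no E0 prefix
constexpr unsigned DIK_DOWN    = 0xD0; // down arrow, E0 prefix

// Combine RAWKEYBOARD::MakeCode with the RI_KEY_E0 flag state.
unsigned makeCodeToDik(unsigned makeCode, bool e0)
{
    return e0 ? (makeCode | 0x80) : makeCode;
}
```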
  11. WM_INPUT for keyboard - worth it?

    Thanks Matt, much appreciated.
  12. WM_INPUT for keyboard - worth it?

    Ah yes. I actually need WM_CHAR for text input (one of the main reasons for dropping DInput -_^), so I might as well use WM_KEY* for all keyboard stuff. There's no real issue with doing mouse and keyboard in different ways. Thanks. -Thomas
  13. Hi, so DirectInput has been mostly deprecated (for keyboard and mouse) and everyone is saying to use WM_INPUT instead. So this is what I've done, for both keyboard and mouse. However, I'm wondering: is there actually any real advantage to using WM_INPUT for keyboard, rather than just listening for WM_KEYDOWN/KEYUP? I've been poking around and can't find much information on this. Mouse gets the benefit of high-res input, but does keyboard get any? I can't imagine that Windows processing the device data into WM_KEYDOWN/KEYUP etc. causes any significant extra overhead.

    As an aside, I've only implemented unbuffered input (i.e. using GetRawInputData). According to MSDN, buffered input should be used for devices that create large amounts of data. Unbuffered has worked fine for me so far, so I'm just wondering if anyone has had any problems with it for mouse (the only HID I'm supporting that would create any sizable amount of data). Cheers, Thomas
  14. Hi, my problem is thus: I'm representing a GUI window in -1 to 1 clip-space coordinates, which is a quad. Now, I want to clip child GUI components to the region of the parent window, so I'm using a scissor rectangle, which somewhat works. The problem is that I'm converting the clip-space coordinates of the parent window's quad vertices to screen pixels so I can set up the scissor RECT (which takes integer coordinates). However, the way I'm converting from clip -> pixels doesn't always match up to how D3D is rendering the quad, so I sometimes get the scissor region off by one.

    I looked up Direct3D's rasterization rules and set up my clip -> screen mapping function like so:

    float rx1 = (clipRect.x * halfScreenWidth + halfScreenWidth);
    float ry1 = (clipRect.y * -halfScreenHeight + halfScreenHeight);
    float rx2 = (clipRect.z * halfScreenWidth + halfScreenWidth);
    float ry2 = (clipRect.w * -halfScreenHeight + halfScreenHeight);

    D3DRECT region;
    region.x1 = (LONG)floor( rx1 + 0.5f );
    region.y1 = (LONG)floor( ry1 + 0.5f );
    region.x2 = (LONG)rx2;
    region.y2 = (LONG)ry2;

    Which, from what I can tell, matches up to D3D's top-left-inclusive rule. However, I'm still getting problems :/. Any advice? Thanks in advance. -Thomas
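    [Editor's note: one likely culprit in the code above is that x1/y1 are rounded with floor(x + 0.5) while x2/y2 are truncated by the plain cast, so the two edges use different conventions and can disagree by one pixel. A sketch of the mapping with the same rounding applied to every edge - a plausible fix, not the poster's confirmed solution:]

```cpp
#include <cassert>
#include <cmath>

struct Rect { long x1, y1, x2, y2; };

// Map a clip-space rect (cx1,cy1 = top-left, cx2,cy2 = bottom-right,
// y pointing up in clip space) to integer pixel coordinates, using the
// same round-half-up convention for all four edges so the scissor rect
// can't disagree with itself by one pixel.
Rect clipToPixels(float cx1, float cy1, float cx2, float cy2,
                  float halfW, float halfH)
{
    float rx1 = cx1 *  halfW + halfW;
    float ry1 = cy1 * -halfH + halfH;
    float rx2 = cx2 *  halfW + halfW;
    float ry2 = cy2 * -halfH + halfH;
    Rect r;
    r.x1 = (long)std::floor(rx1 + 0.5f);
    r.y1 = (long)std::floor(ry1 + 0.5f);
    r.x2 = (long)std::floor(rx2 + 0.5f); // round the bottom/right edges
    r.y2 = (long)std::floor(ry2 + 0.5f); // with the same convention
    return r;
}
```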
  15. Animation in Assassin's Creed

    Probably more important than the raw animation data itself is how these animations blend together and transition from one to another (e.g. only transition from A to B when the right foot has hit the ground at the correct angle). We've had mo-cap in games for ages, but only recently has in-game animation (non-cut-scene) started to look really good, which, I believe, is largely due to smarter transitioning (as well as being able to store the metric crap-ton of individual animations required for something like Assassin's Creed). CoD4 is another game that has fantastic animation transitions, IMO. NaturalMotion Morpheme is some pretty nifty middleware for tackling this problem specifically.
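    [Editor's note: at its simplest, the transition described above is a crossfade: a blend weight ramps from 0 to 1 over the transition window. Real systems blend full joint transforms (quaternions) and gate the transition on sync events like foot plants; this sketch shows only the weight-ramp idea, with hypothetical names:]

```cpp
#include <cassert>

// Blend one channel of two poses by a crossfade weight t in [0, 1].
// t = 0 gives the outgoing pose, t = 1 the incoming pose.
float blendChannel(float fromPose, float toPose, float t)
{
    // clamp the ramp so the transition never over- or undershoots
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return fromPose * (1.0f - t) + toPose * t;
}
```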