About Stephany

  1.   Right, that's what I meant by "iterating through a loop and applying small amounts of experience per iteration". I was just hoping there would be something more advanced that wouldn't require iteration. If there is no such thing, then there's not much that can be done.   I'm open to a completely different algorithm if anyone has an idea that would result in a similar outcome.   Thanks again
  2.   Why not eliminate character_state as an input to your calculations?     It represents the character's ability to learn and develop skills. Things like death and head injuries will reduce it in the game. It is the only major consequence of death, which makes it a pretty important aspect of the game. I'm still working on the exact details, but death may cut it in half, or it may subtract some from it. But the only way to improve or repair it is through earning experience. Players that avoid death are rewarded by earning more experience.   So you believe the math itself is impossible? Is that why you're suggesting I remove it?   Thanks
  3.   The character state decreases for certain reasons in the game, and increases when the character earns experience. There's not much else to the concept. It doesn't need to increase linearly with the experience, but I want something that grants the same total experience and repair for +5 +5 +5 as it does for a single +15, and so on.     If it increases at different rates based on the size of the increments, players will catch on to it, and it adds an unwanted gameplay incentive - if the player's state is low, they must engage in small accomplishments before large ones if they want to earn the most experience. This is something I want to avoid, and the reason I'm asking for help. I'm having trouble coming up with something that produces the same outcome with different-sized increments, since the state changes along with them.   Is it not possible, maybe? I was thinking it would work similarly to the math for frame-rate-independent movement friction. But I haven't been able to make anything work.
  4. Hey everyone,   I'm attempting to write a seemingly simple routine that filters character experience through an intelligence-like character state. The state value is a scalar ranging from 0 to 1. When experience is earned, the scalar determines how much of that experience is granted to the character, and the remaining experience goes toward repairing the state (scaled by a hardcoded value). Here's some quick code to demonstrate:

```c
// some defines
real REPAIR_SCALER = xxxx;

// some known values
real input_experience = ?;
real char_state = 0 to 1;

// the (incorrect) code
real earned_experience = input_experience * char_state;
real repair_experience = input_experience - earned_experience;
char_state = min( 1.0, char_state + repair_experience * REPAIR_SCALER );
AddCharExperience( earned_experience );
SetCharState( char_state );
```

The problem with this approach is that there is a significant difference between earning many small amounts of experience and earning a single large amount. Since the state that scales the experience increases each time experience is earned, each increment of experience changes how much the next increment is worth. I could mostly mitigate this by iterating through a loop and applying small amounts of experience per iteration, but I was hoping there is a more elegant solution. I've been playing around with pow() and exp(), but math isn't my strongest skill, and I'm having trouble coming up with anything.   If anyone has any advice, it would be greatly appreciated. Thank you.
  5. Projected Decals Stretching

      I considered something like this. I ran a small test where I performed an entire quaternion rotation for each triangle to move the projection direction onto its surface, and the result wasn't that bad. The only issue was that the texture was split up between the triangles, which might have been solved with some UV welding, but I figured it was too much (rotations + welding) and moved on. However, computing the shared/average normals for each vertex would basically do the same thing without that problem.       Yeah, my normal decals are not this large. I'm trying to work something out to be used for explosion scorch marks, and possibly some cast shadows. This type of clipping and stretching isn't that noticeable on bullet decals.   Thanks for the advice, guys
  6. I've been using a projected decal system (very similar to the one described in Game Programming Gems 2), where the decal is projected along a vector onto the triangles that intersect the projected 3D box. I'm able to find and clip the triangles without much issue, but I'm having some problems generating UV coordinates in a way that doesn't stretch significantly across surfaces that run parallel to the projection vector.   This makes sense - I'm doing a 2D planar map onto 3D geometry, so the 3D parts are not going to come out looking all that great. But I'm trying to figure out a way to improve the situation.   To compute the UV coordinates, I find the local/relative position of each vertex between the clipping planes. For example, if a vertex is very close to the right clipping plane, it gets a horizontal UV coordinate very close to 1.0. The problem is that the front and back clipping planes have no influence on the UV coordinates, allowing the decal to span any distance along that axis without the UV coordinates changing at all. Here are some images. The top one is just a simple projection onto a (mostly) flat surface. The bottom two show the stretching problem:   I've been trying to come up with something that at least improves this, like using the depth (Z +/-) of the vertices to push the UV (x/y) coordinates farther outward in whatever direction they were already closest to. It helped a little with the stretching, but caused some pretty bad distortion. The idea still seems sound in my mind (the UV coordinates move closer to the UV edge as the vertices extend along the Z axis) - but if so, I haven't been able to apply it correctly.   One other possibility is to (somehow) compute the average normal of all the triangle surfaces and project the decal along that average. Then the stretching is spread somewhat evenly across the whole decal. But I have no idea how to compute an average normal that weights each triangle's normal by its surface area (smaller triangles should have less influence on the average than larger ones).   Does anyone have any ideas on how to solve this issue? Some way to UV-map vertices that allows the Z axis to influence the coordinates? Or just any idea to help prevent the stretching?   Thanks for any advice
  7. I did manage to get things working well enough to move on. I use GetForegroundWindow() on startup to detect how things start. If my window starts in the background, then my game internally unfocuses (without any changes to the actual window - it still thinks it's focused), and I keep checking GetForegroundWindow() until it becomes the foreground window. Once it does, I stop checking it altogether and deal with things as normal. As far as I know, there is no way for this situation to occur after the window is created, so it should hold up. It's not a solution to the problem, but it works around it well enough.
  8.   The window still receives those messages, even if it is a fullscreen borderless window.   Yes, but they don't need to worry about the cursor wandering off of the screen while it's hidden. Their window covers the entire desktop/display, so there is no danger of accidentally hovering over or clicking something in the background.   Or at least there shouldn't be. In all honesty, I believe there is a slight danger of it happening, even then. I remember Windows unfocusing my fullscreen app while I clicked inside my game, before I started locking the mouse to the center. I believe it is still possible to scroll over the edge of your window region in fullscreen - although it may only happen when the game switches to a lower-resolution display. I'm not sure, as it's been a while since I experienced it.     I don't see anything in the docs for WM_SHOWWINDOW regarding z-order, only minimizing, maximizing or restoring under specific circumstances (and a specific call to ShowWindow). Can you provide a reference?   No, you are right. I got that mixed up. It mentions that it sends messages when the "window is being uncovered" and "window is being covered", but it appears that it only sends these for maximized windows for some reason. That explains why it didn't work.
  9. This seems like a good idea. I still have hope of making something like this work. But currently, unless I'm doing something wrong, it seems that Windows will allow my app to "capture the mouse" as long as the cursor is hovering over it. It doesn't seem to matter if another window is above it in the Z-stack. So if the user accidentally slides the cursor into the rectangular region of my game window's visibility, Windows still allows me to capture it. And as far as my game knows, my window is top-most and focused, so there seems to be no way to ignore the situation.   Yes, I may have to. But I'm worried that I'll forget most of this confusion when it comes time to fix it again. This is probably the 3rd or 4th time I've attempted to fix this focus problem, and each time, I forget most of the issues involved with it. I rarely mess with Windows API functions, so it's easy (and nice) to forget them when I'm working on other things.   Yes, I agree. It probably wouldn't cause any problems with normal applications. It also wouldn't cause any serious issues with full-screen applications, because they don't really need to lock/center the cursor when controlling a 3D camera. The reason this is giving my app so much grief is that it's windowed and trying to make use of the Windows hardware cursor instead of generating its own cursor position via DirectInput. If I don't lock it, the cursor will fly off the window into other places while the user is trying to look around in the game. Then as soon as they click, they accidentally click some random thing in the background and lose game focus.
  10.   I apologize if I confused/mixed WM_ACTIVATE into this situation. I mentioned it above because I wanted to show that windows is also sending that message, but I'm actually relying on WM_SETFOCUS and WM_KILLFOCUS to detect focus changes. My app doesn't actually use WM_ACTIVATE*. According to Microsoft, WM_SETFOCUS is sent after a window gains keyboard focus, and WM_KILLFOCUS is sent "immediately before it loses keyboard focus". When I start my window behind another window, it receives WM_SETFOCUS, then just sits there. It comes down to this: For some reason, Windows wants my application to think it has input focus, even though it doesn't. And as far as I can tell, there is no way to ask windows if it is lying to me.     Yes, this threw me off at first. However, there doesn't appear to be any type of GetRealInputFocusWindow() function. Calling GetFocus() returns the handle to my window, as if the user is focused on my window. The only way to notice the problem is by calling GetForegroundWindow() - and even that doesn't help, because once you know there is a problem, there is no way to fix it without messing things up more (such as by calling SetFocus(...), which makes windows do even stranger things later on).     That's the problem - there is no way to detect when your window becomes the input-focus window (or even the foreground window) via messaging. Or if there is, I haven't yet discovered it. WM_SHOWWINDOW seems to indicate that it will tell you when your app becomes top-most, but after testing, it doesn't seem to be reliable - I get wparam={some_window}, even when my window is on top.   My game uses the mouse to rotate a 3D camera, so it hides and locks the real cursor to the center of the screen so it won't get stuck against a display edge (where it can't generate correct delta values in that direction), or just wander off to click outside of the window and disturb gameplay. 
So when the game window loses focus, I need to restore the cursor back to normal and unlock it. Otherwise, the user is literally stuck in the center - they can't even click on my game's window to fix the problem (they have to alt+tab to it, etc.).   If I could think of some alternate way to handle the cursor issue, I would just ignore this problem and let my game run its pants off in the background. It might bog down the system, but at least they would be able to use their cursor.
  11. After further testing, it seems the only way I can find to solve this is to call GetForegroundWindow() every frame and change the focus accordingly. Basically, I'm handling WM_SETFOCUS and WM_KILLFOCUS like normal, and have this code in my update loop to handle abnormal situations like starting behind another window:

```c
BOOL fore_state = GetForegroundWindow() == Handle;
if( GameFocusState != fore_state )
    SetFocus( fore_state ? Handle : NULL ); // triggers a focus change and sends WM_SETFOCUS or WM_KILLFOCUS
```

I don't like this solution, but it is working. I don't know how expensive GetForegroundWindow() is, or can get, on other OS versions. I would definitely prefer a solution that relied only on messaging, if anyone has any ideas.   The reason GetForegroundWindow() has to be called every frame is that calling SetFocus(NULL) (to recover from starting unfocused) prevents Windows, for some odd reason, from sending WM_SETFOCUS or WM_ACTIVATEAPP* messages the next time the app is brought back to the foreground. This makes no sense to me, but there is probably some reason for it.
  12. I'm receiving a WM_SETFOCUS message at startup, and then nothing after that to indicate that focus has been lost.     I'm not trying to change the focus at all. I'm just trying to detect what state the focus is in. My game locks the mouse when it's focused, so if I don't notice that they clicked on something while my game was opening, their cursor gets frozen while my game is in the background. It's very annoying, and it happens quite a lot while debugging as well.   edit: Here are some other people explaining this problem. They may explain it better:
  13. I'm trying to simulate the rare situation where a user starts my game, but then gets impatient and/or clicks on something before the window appears. To debug this problem, I placed a Sleep(1000) right before my game window is created to provide a delay. Then I start my game, and click on another (folder) window to bring it to the foreground before the game window opens.   What happens is that my app gets the typical WM_ACTIVATEAPP + WM_SETFOCUS messages of a newly created window, and that's it. It never receives a WM_KILLFOCUS, or any other message that differs from starting the game with normal focus. To check this, I logged every message type the window received in the foreground and background, then compared them.   To be clear about this behavior, once my game receives WM_SETFOCUS and is behind another window, I can click on buttons and type into text boxes in that front window without my game window ever getting a WM_KILLFOCUS message.   I'm currently fixing this problem by comparing my window handle to GetForegroundWindow() after I create the window - and if it doesn't match, I call SetFocus(null) to remove input focus and trigger WM_KILLFOCUS. This seems to get around the issue by allowing Windows and my program to *notice* it is unfocused. But it feels like a hack. I'm hoping there is a better solution. edit: This temporary fix causes another problem: the next time my game is actually focused (such as by clicking on it in the task bar or using alt+tab to switch to it), Windows never sends it a WM_SETFOCUS message, even though the window is brought to the foreground. However, the next time I click or alt+tab to it, it does send the message. So it's as if the first focus is ignored because I called SetFocus(null).   Could any of these issues be related to my window's creation styles/options/flags? The style flags I use are (WS_POPUP | WS_CAPTION | WS_SYSMENU | WS_MINIMIZEBOX | WS_MAXIMIZEBOX | WS_VISIBLE). 
Are there any other things I may be doing wrong to cause this to happen? Or is this just a limitation/bug/design of Windows?   edit: it looks like this poster had the same problem:   Thanks for any advice
  14. double WM_ACTIVATE on task-bar minimize

      I think you probably laid out the answer right there. I'm guessing, but their re-focus code may have checked for reasons to reallocate/resize video resources, and by calling SetFocus(null/hWnd), they're just forcing their app to lose (on resize-start) and regain (on resize-exit) focus, to reuse the same restoration code. Basically like having a CheckVideoResources()-type function and calling it from WM_SETFOCUS as well as WM_EXITSIZEMOVE.   Anyway, thanks again for your help. I haven't run into any window focus problems since I switched to WM_SET/KILLFOCUS.
  15. double WM_ACTIVATE on task-bar minimize

      I was aware of these and thought I tried to go this route once, but I can't remember what went wrong. After trying again, it seems to be working correctly so far. Thanks a bunch for recommending it.   Two things I was curious about. First, when researching these issues, I came across someone that mentioned using WM_ENTERSIZEMOVE (to call SetFocus(Hwnd)) and WM_EXITSIZEMOVE (to call SetFocus(null)) with WM_SETFOCUS and WM_KILLFOCUS. They mentioned that DirectX (maybe samples?) used this approach. Is there a good reason for this? Perhaps because resizing/moving is disabled or causes issues with full screen focus? Second, is there any reason to catch/use the WM_ACTIVATE(APP) messages when using these two messages? Something it is still needed for, or can I just ditch them?   Wow, thank you so much for clearing that up for me. You have no idea how long I've been struggling/ignoring window focus issues in my application because I didn't know this.