Community Reputation

100 Neutral

About blaze02

  • Rank
    Advanced Member
  1. Just check the D3DPMISCCAPS_CLIPPLANESCALEDPOINTS bit of the PrimitiveMiscCaps field of your D3DCAPS9. The new ATI 2900 HD does not correctly clip scaled points, so I'm guessing none of the ATI cards do this correctly. Is there any way to enable it, or do very few cards support it? It may be too much overhead to disable point scaling and calculate the point size on the CPU. I have another caps problem, if anybody can help: I can't figure out how to convert MaxPointSize (from the D3DCAPS9) into pixels. NVIDIA cards like to return 8192. How does that translate to a maximum point size? ATI likes to return 64, 128, or 256, so I was assuming it would always be a size in pixels. Thanks for any help.
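The caps check described above can be sketched as a small helper. The flag value is copied here from d3d9caps.h so the sketch compiles without the DirectX SDK headers; in real code you would include d3d9.h and use the named constant directly.

```cpp
#include <cstdint>

// Flag value as defined in d3d9caps.h (duplicated here as an assumption
// so this sketch is self-contained without the DirectX SDK).
const uint32_t kClipPlaneScaledPoints = 0x00000100; // D3DPMISCCAPS_CLIPPLANESCALEDPOINTS

// Given the PrimitiveMiscCaps field of a filled-in D3DCAPS9, report
// whether the driver claims to clip scaled point sprites correctly.
bool ClipsScaledPoints(uint32_t primitiveMiscCaps) {
    return (primitiveMiscCaps & kClipPlaneScaledPoints) != 0;
}
```

With the real SDK you would fill the caps with `IDirect3D9::GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)` and pass `caps.PrimitiveMiscCaps` to the helper.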
  2. When the center of a point sprite is just barely offscreen, the entire point sprite is clipped away. Shouldn't the guard band keep this from happening? The D3DRS_CLIPPING render state is TRUE. Everything else is supposed to be automatic. Which of my assumptions is wrong? Thanks. [Edited by - blaze02 on August 15, 2007 6:02:57 PM]
  3. DirectInput Deadlock

    I think it's fixed, or at least worked around. Here's the problem. To reproduce: 1) Start the app (the compiler auto-focuses the window). 2) Unfocus the window before the app finishes initializing, by clicking on another window or hitting the Windows key. 3) Refocus the window just before the game finishes initializing. If the timing is right, the app freezes and uses no CPU. The app stays frozen until the window loses focus, at which point it continues as if nothing happened. Instead of the window losing focus, the ESCAPE key or numpad ENTER key will also unfreeze the app. I'm not sure what is special about those keys. Also, the window's registered message proc does NOT receive the key press that unfreezes the app. The fix/workaround: originally, D3D graphics would initialize and call ShowWindow, and then DirectInput would initialize. Now, input is initialized before graphics. Recalling some info from the article (the one I can't find, or maybe the one I dreamed up): ShowWindow triggers the Win32 WM_ACTIVATE message, which is first sent to parent windows. Some guessing: some part of DInput must be locking the child window and then trying to lock the parent window, while WM_ACTIVATE is locking the parent window and trying to lock the child (to deliver the WM_ACTIVATE message?). I still have no idea why ESC or ENTER ends the deadlock.
  4. I stumbled onto an article that talked about this, but my recent attempts to find the article produced nothing. When the app is starting up, if the window loses/gains focus at the right time, DirectInput will deadlock. The deadlock lasts until the window loses/gains focus again; then the app runs normally. Sorry for the lack of details. I'm hoping somebody will link me to an article describing the problem and how to fix it. If I'm talking gibberish, I'll debug and post more details later.
  5. Lost Device Driver Bugs?

    S1CA, I think you misunderstood me. The code is fine, but the older DirectX SDK reports errors. Newer SDKs do not see a problem.
  6. Lost Device Driver Bugs?

    I ran into the type of issue I was worried about. It wasn't a big deal, but I was a bit confused at first. On older DirectX SDKs (specifically Dec. 2004), the debug DirectX DLL will fail for no apparent reason. If the program creates queries, the Reset call will fail and the output window will claim the queries are not being released. This guy gets the same problem on the Oct. 2004 SDK.
  7. Optimizing stdext::hash_map

    Hash maps usually grow at some threshold, so I would assume the amount of space to reserve would be something like: map.size() * 1.25 // threshold = 80%. But that won't help if we can't tell the container to reserve a specific size. As for reserving the space, the container must be smart enough to optimize large inserts. It looks like the loading algorithm needs to load the hash_map entries into a vector with reserved size, then insert them into the hash_map all at once. Quote (Scott Meyers, Effective STL): "most standard containers never ask their associated allocator for memory. Never." Where are the boost evangelists when you need them?
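The reserve-then-bulk-insert idea can be sketched with std::unordered_map, used here as a modern stand-in for stdext::hash_map (which has no reserve()). The entry type and function name are made up for illustration.

```cpp
#include <string>
#include <unordered_map>
#include <utility>
#include <vector>

// Sketch: build the entries in a vector first, then pre-size the hash
// table so the bulk insert never triggers a rehash. reserve(n) sizes
// the bucket table taking max_load_factor into account.
std::unordered_map<std::string, int> LoadVariables(
    const std::vector<std::pair<std::string, int>>& entries) {
    std::unordered_map<std::string, int> map;
    map.reserve(entries.size());  // one allocation up front
    for (const auto& e : entries) {
        map.insert(e);
    }
    return map;
}
```

For the 40,000-variable case from the original question, the win is that the table is sized once instead of being rehashed repeatedly as it crosses each load-factor threshold.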
  8. I had this idea about optimizing our app. The first thing it does is load 40,000+ variables into a hash_map. The idea is to remember how much space is needed to load everything and reserve that space before attempting to load all the variables. If I were working with a vector, this would be easy: size() gives the amount of memory and reserve() allocates it. But with a hash_map, it is proving much harder. Is it possible to reserve space in a hash_map? Is it possible to determine how much space a hash_map has currently allocated? Thanks.
  9. XInput Without Multiple EXEs

    Quote: Original post by Saruman: "can make a very small version with only the components you need (such as XInput)." I don't think they let you do that. If we were to force users to have a recent DirectX runtime, our download size would only increase by 44k for the web installer. Then, if necessary, the rest of DirectX would get downloaded when installing our app. I'm not too sure about the forced update; it's really nice to be able to run the app on any computer with XP. Decisions, decisions... Thanks for the comments.
  10. Loop Problem

    Check your logic. You want to "AND" them together. ... quit != 'n' && quit != 'N'...
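The corrected condition above can be captured in a tiny helper: the loop should keep running only while the answer is neither 'n' nor 'N', so the two tests are combined with &&.

```cpp
// Minimal sketch of the fixed loop condition: both tests must hold,
// so they are ANDed together. The function name is illustrative.
bool KeepRunning(char quit) {
    return quit != 'n' && quit != 'N';
}
```

Written with ||, the condition would always be true (every character differs from at least one of 'n' and 'N'), which is the bug the post is pointing at.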
  11. Currently we have two EXEs for our app. One EXE is compiled with the Dec. 2004 DirectX SDK, the other with the most recent SDK. The first EXE allows anybody with WinXP SP2 to run the app, but it cannot use XInput. Option #1: Is there any way to accomplish this with a single EXE? If the user has a recent DirectX runtime, the app loads the XInput library and detects Xbox 360 controller input; otherwise, the app skips the Xbox 360 input code. Option #2: If that doesn't work out, we will probably ship a front-end GUI that detects the DirectX version and runs the appropriate EXE. Option #3: Another plan is to cut XInput and detect 360 controller input as if it were a normal joystick. I'm guessing that would not allow us to vibrate the controller, which may be a deal-breaker. Option #4: The other plan (the reason nobody else runs into this problem): we can force the user to download the newest DirectX runtime. This would solve a lot of problems, but it may not be acceptable to make the end user do more downloading and installing. Help, suggestions, and small amounts of flaming are welcome.
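Option #1 is usually done by probing for the XInput DLL at runtime with LoadLibrary/GetProcAddress instead of linking against the import library. A hedged sketch, Windows-only: the exact DLL name varies by runtime version (an assumption here), and the real XInputGetState takes an XINPUT_STATE*, elided to void* so the sketch needs no XInput headers.

```cpp
#include <windows.h>

// Signature of XInputGetState, with the state struct type erased so
// this sketch compiles without the XInput headers.
typedef DWORD (WINAPI *XInputGetStateFn)(DWORD dwUserIndex, void* pState);

static XInputGetStateFn g_XInputGetState = nullptr;

// Returns true when the user's DirectX runtime provides XInput.
// If it returns false, skip the Xbox 360 controller code path.
bool TryLoadXInput() {
    // DLL name is an assumption; older runtimes shipped other versions.
    HMODULE module = LoadLibraryA("xinput9_1_0.dll");
    if (!module) {
        return false;  // old runtime: fall back to DirectInput only
    }
    g_XInputGetState = reinterpret_cast<XInputGetStateFn>(
        GetProcAddress(module, "XInputGetState"));
    return g_XInputGetState != nullptr;
}
```

Because nothing is statically linked, the single EXE still loads on a bare WinXP SP2 install; the 360-controller path is simply disabled when TryLoadXInput() fails.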
  12. Disabling Texturing ?

    Quote: Original post by Namethatnobodyelsetook: "The results are undefined, so no, not right. And it does vary by card, so you're just lucky so far." Are you sure about this? I remember from somewhere that the fixed pipeline always returns black for null textures. It is, however, undefined when using pixel shaders.
  13. Help: Low-level timing oddities.

    Quote: Original post by AndreiVictor: "1000ms / ( 60 frames / 1000ms) does not result to a single ms unit right?" You aren't doing that, though, are you? In SetFPS, the code executes "1000 / 60" and sets m_FPSControl to 16. So your spin loop is waiting 16ms instead of 16.7ms. If anything, this would speed up your FPS.
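The truncation being described comes from integer division; a minimal sketch (function names are made up for illustration):

```cpp
// "1000 / 60" is integer division, so the fractional part is thrown
// away: the per-frame budget comes out as 16ms, not ~16.67ms.
int IntegerFrameBudgetMs(int fps) {
    return 1000 / fps;    // truncates: 1000 / 60 == 16
}

double ExactFrameBudgetMs(int fps) {
    return 1000.0 / fps;  // keeps the fraction: ~16.67 for 60 fps
}
```

Losing the 0.67ms each frame makes the lock run slightly faster than 60 fps, which matches the "this would speed up your fps" observation.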
  14. Help: Low-level timing oddities.

    I'm thinking that your graphics API is waiting for VSync. The SwapBuffers call may be waiting for the next VSync... but it is doing so AFTER you wait 16ms to lock to 60 fps. See if you can set the presentation interval to immediate; that is what D3D calls it, and finding the OpenGL equivalent shouldn't be difficult. Hope that helps.
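In D3D9 terms, the "immediate" interval is requested through the present parameters at device creation. A sketch, assuming the standard d3d9.h definitions; field values are the usual windowed-mode defaults:

```cpp
#include <d3d9.h>

// Sketch: ask D3D9 not to block Present() on the vertical retrace.
// Pass the filled-in struct to IDirect3D9::CreateDevice.
void FillPresentParams(D3DPRESENT_PARAMETERS& pp) {
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;  // match the current display mode
    pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE;  // no vsync wait
}
```

On the OpenGL side, the usual equivalent is the swap-control extension (e.g. wglSwapIntervalEXT(0) on Windows), though checking for the extension before calling it is left as an assumption here.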