Uphoreum
Member · 498 posts · Community Reputation: 216 Neutral
  1. The XML serialization/deserialization framework included in .NET will automatically deserialize the text content of an element into most fundamental types (bool, int, string, etc.), but I'm wondering if there's a way to extend that so that I could turn an element like this:

     <Vector>0, 0, 0</Vector>

     into a custom Vector object, for example. I've read through a lot of the documentation and found how to deserialize complex objects with child elements, but I haven't seen a way to parse the inner text differently, as I would need to for the above. The only ways I currently know are to deserialize it as a string and manually parse it after the fact for each Vector element, or to do this:

     <Vector>
       <X>0</X>
       <Y>0</Y>
       <Z>0</Z>
     </Vector>

     I know I need to write the code that splits up and parses the values; I'm just wondering if I can set it up so that this happens automatically whenever a Vector object is being deserialized. Anyone know about this? Thanks!
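One route this question usually leads to (a sketch, not the only option): implement IXmlSerializable on the type, so XmlSerializer hands the element to your own ReadXml and you parse the inner text yourself. The Vector type and its "x, y, z" format below are assumptions matching the post, not code from it:

```csharp
using System;
using System.IO;
using System.Xml;
using System.Xml.Schema;
using System.Xml.Serialization;

// Hypothetical Vector type; IXmlSerializable lets it take over parsing of
// its own element, so "<Vector>1, 2, 3</Vector>" round-trips without
// child <X>/<Y>/<Z> elements.
public class Vector : IXmlSerializable
{
    public float X, Y, Z;

    public XmlSchema GetSchema() { return null; }

    public void ReadXml(XmlReader reader)
    {
        // Consumes the whole element and returns its inner text.
        string[] parts = reader.ReadElementContentAsString().Split(',');
        X = float.Parse(parts[0]);
        Y = float.Parse(parts[1]);
        Z = float.Parse(parts[2]);
    }

    public void WriteXml(XmlWriter writer)
    {
        writer.WriteString(X + ", " + Y + ", " + Z);
    }
}

public static class Demo
{
    public static void Main()
    {
        var serializer = new XmlSerializer(typeof(Vector));
        var vector = (Vector)serializer.Deserialize(
            new StringReader("<Vector>1, 2, 3</Vector>"));
        Console.WriteLine(vector.X + " " + vector.Y + " " + vector.Z);
    }
}
```

This also kicks in when Vector appears as a field of a larger type, since XmlSerializer defers to IXmlSerializable wherever the type shows up in the object graph.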
  2. So, it seems that adding 1 to the width and height of each sprite quad did the trick. I don't really understand why, except that I know there's some weirdness about Direct3D and exclusive coordinates.
  3. That looked promising, but I tried shifting everything 0.5 units and got the same result. I determined that the broken area lies along the edge the two triangles share: it runs from the upper left to the bottom right, following that edge. I tried reordering the vertices, and the misaligned area changed direction to follow the shared edge again. So the problem is that the texels along the edge aren't necessarily mapping to the right triangle... or something like that. Any suggestions on what might fix this? EDIT: I just noticed that if I make the back buffer twice the size of the window, the problem goes away. [Edited by - Uphoreum on December 15, 2010 10:01:33 PM]
  4. I'm mapping some "old-school" low-resolution images onto textured quads for use as sprites/HUD elements, but the texture coordinates are resulting in misaligned textures (see example image below). An important note, I think, is that since these aren't power-of-two textures, I'm only using a fraction of the full texture. In this example it goes from 0.0 to 0.625 in both dimensions. I tried using the entire image, and exactly half the image, and it worked in both cases, but not with a UV coordinate value like 0.625. Is there a good way to fix this problem? I assumed it was a decimal truncation problem at first, but in this case no digits are lost, so I don't know why it's mapping the texture like this. The horizontal lines should be straight and unbroken.

     (0,0)-------------(.625,0)
     (0,.625)----------(.625,.625)
  5. Uphoreum

    [Ruby] Garbage Collection Stutter

    Yea, 1.9 added a lot of performance enhancements -- but the GC is still mark-and-sweep. I tried disabling the GC, but memory builds up really quickly. I guess you could try to completely avoid allocating anything temporary (that you would just free in C++) in the main loop, but when you start managing memory like this it kind of ruins a big part of the benefit of Ruby (or any other memory-managed language). It looks like you need to use a different Ruby implementation if you want to do this, such as JRuby, but that adds a whole other layer of dependencies. I think I'm going to call Ruby unsuitable in its current version and play with Python or Java as alternative languages.
  6. I'm playing around with Ruby and trying to see if it's viable for making small games. A pretty major issue I've run into is that garbage collection runs about once a second and causes a severe pause. I've confirmed that it's the GC, because the stutter goes away if I disable it. I was thinking I could manually enable/disable the GC so that it would only run occasionally, but I'm guessing that just means really long pauses occasionally rather than short ones frequently. Are there any ways to optimize so that collection is less frequent, other than allocating fewer objects? I don't think I can allocate much less than I already do. Also, is this particularly bad in Ruby, or is it noticeably bad in all garbage-collected languages?
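The manual-scheduling idea floated in this thread can be sketched with MRI's standard GC module (GC.disable / GC.enable / GC.start); the frame counts below are made up for illustration:

```ruby
# Sketch: run Ruby's mark-and-sweep GC at points we choose instead of
# letting it trigger mid-frame. Frame counts are hypothetical.
GC.disable            # suppress automatic collections

frames = 0
while frames < 600    # stand-in for the real game loop (~10 s at 60 FPS)
  # ... update / render, allocating temporaries ...
  frames += 1
  if frames % 200 == 0
    GC.enable
    GC.start          # one collection, at a moment we picked
    GC.disable
  end
end
```

As the post predicts, this trades frequent short pauses for rarer long ones; the cost is heap growth between the collections we allow.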
  7. Uphoreum

    How Much Does School Choice Matter?

    Quote: Original post by Tom Sloper
      Quote: Original post by Uphoreum
        What I seem to be getting from reading various articles is that by far the most important things are experience, a good portfolio, and having and being able to show that you have the required knowledge for the job.
      You should read even more articles, then. Some more articles you should read:
      http://www.igda.org/games-game-january-2005
      http://www.igda.org/games-game-november-2005
      http://www.igda.org/games-game-june-2009

    Were those intended to counter my statement? They seem to just reinforce that the school/degree choice is not a big deal.
  8. Uphoreum

    How Much Does School Choice Matter?

    What I seem to be getting from reading various articles is that by far the most important things are experience, a good portfolio, and having and being able to show that you have the required knowledge for the job. In other words, it doesn't matter so much how you got the knowledge, more that you have it. Agree? I mean, if I have an interview, are they going to focus on my education background or are they going to focus more on any cool stuff I show them?
  9. I'm wondering if someone could analyze my current situation and see how easy or difficult it would be for me to get into the game industry (programming/development side). I've been self-teaching programming for about six years. As a result, I've spent two (paid) summers as a software developer at a major tech company and am currently working part-time as a software developer at a local non-tech company (writing software for them). My question is: knowing this, how much does it matter where I go to school if I want to eventually get into the industry? Wherever I go, I'll be going for a CS degree. That is, unless you think that's a bad idea for some reason. Also, it is not very important that I get into games right out of school; I'm fine with working as an SDE somewhere else for a while first. Feel free to add any other comments or advice as well. Thanks!
  10. I'm interested in writing an engine where all the rendering and other engine-type work is done in C, and the game(s) would be written in Python. So I'm wondering: would this best be done by wrapping a C DLL with ctypes, by writing a C extension module for Python, or something else?
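The ctypes route the post asks about looks roughly like this. The C math library stands in for the hypothetical engine DLL, since the pattern is the same either way: load the shared library, declare each function's C signature, then call it:

```python
import ctypes
from ctypes.util import find_library

# Stand-in for the engine DLL: the C math library. With a real engine you
# would load your own build instead, e.g. ctypes.CDLL("./engine.dll")
# (the engine library name here is hypothetical).
libm = ctypes.CDLL(find_library("m"))

# Declare the C signature so ctypes marshals doubles correctly; without
# this, ctypes assumes int arguments and an int return value.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(9.0))  # 3.0
```

One design note on the choice the post raises: ctypes keeps the C side a plain DLL with no Python headers, at the cost of per-call marshalling overhead, while a C extension module binds tighter to CPython but must be compiled against it.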
  11. Uphoreum

    ID3DXSprite Coordinate System

    I just used:

    D3DXMatrixOrthoLH(&projection, 640, 480, -1.0f, 1.0f);

    and then set the projection matrix on the device to the above. I did this before calling BeginScene and ID3DXSprite->Begin, but, like I said, nothing is displayed.
  12. Uphoreum

    ID3DXSprite Coordinate System

    Well I took a shot at setting up the projection matrix, but now nothing gets rendered. I just left the World and View matrices alone. Do I need to do something with them even if I don't necessarily want the camera moved?
  13. I'm wondering if there is a way to change the coordinate system used by ID3DXSprite. In OpenGL, you can use glOrtho and pass values for the left, right, top, and bottom bounds. ID3DXSprite uses the display size as the default, but I'm guessing there has to be a way to change that. Anyone know?
  14. Uphoreum

    SlimDX + Intel GMA 950

    Ah, that was it. The backbuffer format apparently needs to be X8R8G8B8. Thanks!
  15. I have some SlimDX code that was previously working on a machine with an ATI Radeon Xpress 200M, but I'm having issues creating the device on my other machine with an Intel GMA 950. Here is where I set up the present parameters:

      presentParameters = new PresentParameters();
      presentParameters.BackBufferWidth = displayMode.Width;
      presentParameters.BackBufferHeight = displayMode.Height;
      presentParameters.BackBufferFormat = Format.X8B8G8R8;
      presentParameters.Windowed = !displayMode.Fullscreen;
      presentParameters.SwapEffect = SwapEffect.Discard;
      presentParameters.Multisample = displayMode.Multisamples > 0 ? MultisampleType.NonMaskable : MultisampleType.None;
      presentParameters.MultisampleQuality = displayMode.Multisamples;
      presentParameters.PresentationInterval = displayMode.Synchronized ? PresentInterval.One : PresentInterval.Immediate;

      And I've tried creating the device as Hardware, Software, and Reference, as well as with HardwareVertexProcessing and SoftwareVertexProcessing. No matter what, I get an Invalid Call error. When it is run, Multisample is equal to None. There must be something the GMA 950 doesn't support that I have enabled, but I'm not sure what it is. Any ideas? Thanks.