
About BattleMetalChris


  1. BattleMetalChris

    Interfaces and code duplication

    So, the GameObject itself contains almost no data or functionality, except what's needed to request and store pointers to the various components? It's basically just a collection (small 'c') of components? Almost everything it can 'do' is actually done by the components, which are themselves stored and assigned as necessary by the relevant logic units (renderer, physics iterator, game logic, etc.)? So if I had, say, a ball: the 'Ball' object needs to be drawn, needs to move in the world, and needs to react to its surroundings. When it's created, it would request a 'Renderable' component from the Renderer, which would create one, store it, and hand a pointer back to the new Ball object. The ball could then configure this accordingly and store the pointer. Now the renderer can happily get on with drawing whatever it has in its list, without ever needing to bother with the actual GameObject to which each renderable is attached. The same applies to the physics and game logic components. Am I getting closer?
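    The ownership scheme described above can be sketched in C++ (the poster's home language). Renderer, Renderable and Ball follow the names in the post; everything else - the fields, drawAll(), the 'drawn' flag - is invented purely for illustration:

    ```cpp
    #include <cassert>
    #include <memory>
    #include <vector>

    // Component owned and iterated by the Renderer; the GameObject only
    // ever holds a pointer to it.
    struct Renderable {
        float x = 0, y = 0;   // position to draw at (illustrative)
        bool drawn = false;   // set when the renderer processes it
    };

    // The Renderer owns every Renderable and draws them without ever
    // touching the GameObjects they belong to.
    class Renderer {
    public:
        Renderable* create() {
            components.push_back(std::unique_ptr<Renderable>(new Renderable()));
            return components.back().get();
        }
        void drawAll() {
            for (auto& r : components) r->drawn = true; // stand-in for real drawing
        }
    private:
        std::vector<std::unique_ptr<Renderable>> components;
    };

    // The GameObject is little more than a bag of component pointers.
    struct Ball {
        explicit Ball(Renderer& renderer) : visual(renderer.create()) {}
        Renderable* visual; // configured by the ball, drawn by the renderer
    };

    int main() {
        Renderer renderer;
        Ball ball(renderer);
        ball.visual->x = 5.0f;  // the ball configures its component...
        renderer.drawAll();     // ...the renderer never sees the Ball
        assert(ball.visual->drawn);
        return 0;
    }
    ```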
  2. BattleMetalChris

    Best comment ever

    In a VBA project at work:   ' Bear in mind that every time the middleware gets updated, the API changes; I presume simply to keep VBA developers on their toes.
  3. BattleMetalChris

    Interfaces and code duplication

    Thanks for the advice and the quick replies :) Riiiiight... I think I'm getting there. The concepts are still floating in my head just out of reach, but they're beginning to brush against my fingertips. So, I could make a class called Renderable, which contains all the basic things such as position and graphics (or the location of the graphics, at least). To make an object renderable, I'd just add a pointer to an instance of the Renderable class, rather than directly inheriting from it. My Renderer would need to take an object of type 'Renderable' - I think I see now where interfaces fit in. I could make my object implement a 'Renderable' interface (but I'll not call it IRenderable, for fear of angering TheChubu... :wink: ) and implement a method to direct calls to the Renderable component. True, I'd need to duplicate those couple of lines, but the actual heavy lifting would be handled by the Renderable component. Better? Excellent read, thank you.
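    The interface-plus-delegation idea in this post can be sketched as follows, using a C++ pure-virtual base class in place of a Java interface. The names beyond Renderable are hypothetical; the point is the one-line forwarding method next to a component that does the real work:

    ```cpp
    #include <cassert>

    // Data-holding component that does the heavy lifting.
    struct RenderableComponent {
        float x = 0, y = 0;
        int drawCalls = 0;
        void draw() { ++drawCalls; } // stand-in for the real rendering work
    };

    // The "interface": in C++ this is a pure-virtual base class.
    struct Renderable {
        virtual ~Renderable() {}
        virtual void draw() = 0;
    };

    // The game object implements the interface with a couple of
    // forwarding lines; the component does the actual work.
    class Ball : public Renderable {
    public:
        void draw() override { visual.draw(); } // the duplicated "couple of lines"
        RenderableComponent visual;
    };

    int main() {
        Ball ball;
        Renderable* r = &ball;  // a renderer only ever sees Renderable*
        r->draw();
        assert(ball.visual.drawCalls == 1);
        return 0;
    }
    ```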
  4. BattleMetalChris

    Interfaces and code duplication

    To start, my background is C++ - just tinkering really; a few simple 3D games, nothing professional. Anyway, I'm starting to develop for my Android phone, which realistically means learning Java. It's been pretty easy so far, as the two languages are similar.

    However, I'm getting a little lost on the use of interfaces. Every Java resource I've read goes crazy on the idea of interfaces and uses them for almost everything. What I don't get, however, is how they are better than plain old inheritance. The only thing they seem to be able to do is allow (a form of) multiple inheritance, which you're not allowed to do in Java just by extending classes, and in the cases I've been sketching out in readiness for coding, it appears to mean a lot of redundant code.

    Say I have a class GameObject, the abstract base class for everything in my game. Something in the game might be a visible physics object, so if I want an object that does that using interfaces, I would write a Renderable interface and a Physics interface and implement both. But say I have an object which doesn't use physics - I would have it implement just the Renderable interface. Now I have two objects which I can store in a collection and pass to a Renderer, the interfaces guaranteeing that both objects have the appropriate methods to be drawn. However, because interfaces only contain method declarations, not any actual functionality, in this case I'd have to write the drawing methods TWICE: once for the renderable physics object and once for the plain non-physics object.

    In C++, I'd just have them inherit from Renderable and Physics base classes to give them all the functionality they need, but in Java I can't do that, as you can only extend one class. Given this restriction, I'm currently looking at having a Renderable class extend the base class, and the Physics class extend Renderable, without using interfaces at all. But given how much they're pushed in the various coding resources, I can't help but think I'm missing something. Why are interfaces used instead of just inheritance alone?
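    For comparison, here is a minimal sketch of the C++ multiple-inheritance approach the post describes, with the shared functionality written once in the base classes. The concrete class names (Crate, Billboard) are invented for illustration:

    ```cpp
    #include <cassert>
    #include <string>

    // Base classes carrying real functionality -- the C++ approach the
    // post contrasts with Java's single inheritance.
    struct GameObject {
        std::string name;
    };
    struct Renderable {
        int drawCalls = 0;
        void draw() { ++drawCalls; } // shared drawing code, written once
    };
    struct Physics {
        float vx = 0, vy = 0;
        void step() { /* integrate velocity here */ }
    };

    // A visible physics object inherits both behaviours...
    struct Crate : GameObject, Renderable, Physics {};
    // ...while a purely visual object inherits only Renderable.
    struct Billboard : GameObject, Renderable {};

    int main() {
        Crate crate;
        Billboard sign;
        crate.draw();
        sign.draw(); // same draw() implementation, no duplication
        assert(crate.drawCalls == 1 && sign.drawCalls == 1);
        return 0;
    }
    ```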
  5. BattleMetalChris

    Function "Overload" Question

    Could you not just pass a reference to a vector of the parameters? This also lets you use templates if you plan on having several versions for different parameter types. [source]#include <iostream>
#include <string>
#include <vector>

typedef std::vector<std::string> ParamsVec;

// Reads a word from stdin and returns its index in params, or -1 if absent.
int foo(ParamsVec& params)
{
    std::string str;
    std::cin >> str;
    int count = 0;
    for (ParamsVec::iterator it = params.begin(); it != params.end(); ++it)
    {
        if (*it == str) return count;
        ++count;
    }
    return -1; // without this, falling off the end is undefined behaviour
}[/source] EDIT: aargh, stupid source tags, why do they cut off angle brackets if I specify a language type?...
  6. BattleMetalChris

    SampleLevel not returning alpha component

    haha, I feel like an idiot, I can't believe that didn't occur to me.
  7. BattleMetalChris

    SampleLevel not returning alpha component

    I'm sampling a texture in a geometry shader. As you can't use Sample() in a GS, I'm using SampleLevel() and specifying the mip level (0, as I need to sample at full resolution). It's not picking up the alpha component of the texture, though. The statement float4 sample1 = Image.SampleLevel(samPoint, sampleCoords[0], 0, 0); returns the correct RGB components, but the alpha component is always 1. The texture is a 32-bit .bmp which I created using another application I've written. I've checked the texture using the DirectX Texture Tool and the alpha channel is definitely present in the .bmp file. I've also checked in the debugger, and checked the format of the texture resource in PIX, and the texture is definitely being loaded as R8G8B8A8_UNORM. EDIT: Actually, looking at the resource in PIX, the alpha channel is completely white, so it looks like the problem is with loading the texture rather than with sampling it. Is there anything special I need to specify when I load the texture so it'll pick up the alpha channel from a 32-bit .bmp file? This is how I load my texture: ID3D10ShaderResourceView* texView = 0; HRESULT hr = D3DX10CreateShaderResourceViewFromFile(m_device, fileName.c_str(), loadInfo, 0, &texView, 0);
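    One hedged suggestion (an assumption, not a verified fix): the loadInfo parameter passed above can explicitly request an RGBA format via D3DX10_IMAGE_LOAD_INFO, in case the default load path converts the BMP and drops its alpha channel. Whether the D3DX BMP codec preserves 32-bit BMP alpha at all is uncertain; converting the texture to .dds is the commonly recommended workaround. The fragment below only shows how such a load-info structure is typically filled:

    ```cpp
    // Sketch: force an RGBA format at load time. Field values here are
    // illustrative defaults, not a guaranteed fix for BMP alpha.
    D3DX10_IMAGE_LOAD_INFO loadInfo;
    loadInfo.Width          = D3DX10_DEFAULT;
    loadInfo.Height         = D3DX10_DEFAULT;
    loadInfo.Depth          = D3DX10_DEFAULT;
    loadInfo.FirstMipLevel  = 0;
    loadInfo.MipLevels      = 1;   // full-resolution only, as in the post
    loadInfo.Usage          = D3D10_USAGE_IMMUTABLE;
    loadInfo.BindFlags      = D3D10_BIND_SHADER_RESOURCE;
    loadInfo.CpuAccessFlags = 0;
    loadInfo.MiscFlags      = 0;
    loadInfo.Format         = DXGI_FORMAT_R8G8B8A8_UNORM; // request alpha explicitly
    loadInfo.Filter         = D3DX10_FILTER_NONE;
    loadInfo.MipFilter      = D3DX10_FILTER_NONE;
    loadInfo.pSrcInfo       = 0;

    ID3D10ShaderResourceView* texView = 0;
    HRESULT hr = D3DX10CreateShaderResourceViewFromFile(
        m_device, fileName.c_str(), &loadInfo, 0, &texView, 0);
    ```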
  8. BattleMetalChris

    Using strings in HLSL

    Given that 32 bits seems to be the smallest integer data type HLSL can use, would it be prudent to pack a string of 64 chars into 16 ints, pass them like that, and then use bit-shifting to unpack them in the shader? What about packing into four int[4]s (I know HLSL internally packs everything into arrays of four 32-bit values), or would that be exactly the same as passing 16 ints?
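    The CPU-side packing could look like the sketch below, with four 8-bit chars per 32-bit int (so a 64-char string fits exactly 16 ints, as the post says); the shader-side unpack would mirror the same shift-and-mask. This is a standalone illustration, not tied to any particular effect framework:

    ```cpp
    #include <cassert>
    #include <cstdint>
    #include <string>
    #include <vector>

    // Pack 4 chars into each 32-bit int, lowest byte first.
    std::vector<std::uint32_t> packString(const std::string& text)
    {
        std::vector<std::uint32_t> packed((text.size() + 3) / 4, 0);
        for (std::size_t i = 0; i < text.size(); ++i)
            packed[i / 4] |= std::uint32_t(static_cast<unsigned char>(text[i]))
                             << (8 * (i % 4));
        return packed;
    }

    // The inverse of the shift/mask a shader would do to read character i.
    char unpackChar(const std::vector<std::uint32_t>& packed, std::size_t i)
    {
        return char((packed[i / 4] >> (8 * (i % 4))) & 0xFF);
    }

    int main() {
        std::string text = "Hello, HLSL!";
        std::vector<std::uint32_t> packed = packString(text);
        for (std::size_t i = 0; i < text.size(); ++i)
            assert(unpackChar(packed, i) == text[i]); // round-trips exactly
        return 0;
    }
    ```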
  9. BattleMetalChris

    Using strings in HLSL

    I'm making a shader that will draw a line of text to the screen using a bitmap containing all the characters needed (mainly because I can't get DrawText() to work at all). The way I intend to implement it: I've used a little tool I've written to encode the bounding-box coords (in UV space) of each character into the pixels of the top few lines of the bitmap. I then pass the bitmap and my line of text into the shader - for each character in the text string, the geometry shader draws a quad and uses the ASCII value of the character to look up into the texture and get the UV coords for that character. Now that I've come to write the .fx file, I've come up against a problem I'd expected to be trivial: how can I pass a string into the shader and then access it? The HLSL reference says there is a string type, but that 'There are no operations or states that accept strings, but effects can query string parameters and annotations.' Does this mean I can pass one in but can't read it? Or define one within the shader but not pass one in? The (untested) code I have to pass the string into the shader is: m_textStringVar->SetRawValue((void*)text.c_str(), 0, text.length() * sizeof(char)); where 'text' is a std::string and m_textStringVar is an ID3D10EffectStringVariable*
  10. BattleMetalChris

    Alternative to singleton

    Antheus, that's a fantastic explanation, thank you
  11. BattleMetalChris

    Alternative to singleton

    After reading this thread, I started considering the singletons I use in my own project (my own DX10 renderer). I currently have a few manager classes (mainly for loading, storing and distributing resources) implemented using the Curiously Recurring Template Pattern. This was done to stop me having to pass them as references to everything that might need a resource: if anything needs a resource, it can just ask for one by calling a static function, which calls the non-static one on the instance, and the manager takes care of creation, reference counting and destruction. This struck me as nicely elegant, and it stripped out lots of cases where I was passing two or three references in every time I created something like a game entity (MeshManager, TextureManager, InputManager, etc.), but reading up, I've been seeing that this might not be as good an idea as I thought. Why is reference passing 'better', and is there a better way of doing it than just putting references in the constructor arguments of the objects that need them? Or is it OK to continue as I am?
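    For reference, the static-accessor pattern described here can be sketched as below. This is a minimal, assumption-laden version: a function-local static instance stands in for the post's explicit creation and reference counting, and MeshManager's load() is a fake named after the post's examples:

    ```cpp
    #include <cassert>
    #include <string>

    // Curiously Recurring Template Pattern: Manager<MeshManager> exposes a
    // static accessor that forwards to a single shared instance, so callers
    // never need a reference passed in.
    template <typename Derived>
    class Manager {
    public:
        static Derived& instance() {
            static Derived inst; // created on first use
            return inst;
        }
    protected:
        Manager() {}
    };

    // Hypothetical resource manager for illustration only.
    class MeshManager : public Manager<MeshManager> {
        friend class Manager<MeshManager>;
    public:
        int load(const std::string& /*file*/) { return ++loaded; } // fake handle
    private:
        int loaded = 0;
    };

    int main() {
        // Any code, anywhere, can reach the manager without a reference:
        int handle = MeshManager::instance().load("crate.mesh");
        assert(handle == 1);
        assert(&MeshManager::instance() == &MeshManager::instance()); // one instance
        return 0;
    }
    ```

    The trade-off the thread is circling: the static accessor hides the dependency, which is exactly what makes it convenient and exactly what makes it hard to test or swap out, whereas constructor-passed references make every dependency explicit.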
  12. BattleMetalChris

    Why does my win32 application terminate early?

    Apoch, you genius. It was Avast, sandboxing the exe. Added it to the list of exceptions and everything works perfectly.
  13. BattleMetalChris

    Why does my win32 application terminate early?

    I don't have that kind of 'antivirus'. I think you're confusing me with my parents. Even so, if a reinstall of VS doesn't work, I'll run a scan and see what happens.
  14. BattleMetalChris

    Why does my win32 application terminate early?

    I don't think it's a code issue; I think it's a problem with VS2010. Even a new, empty project with just

    #include <windows.h>

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int showCmd)
    {
        MessageBox(0, "Hello World!", "Message", MB_OK);
        return 0;
    }

    which is about the simplest Win32 program I can think of, does exactly the same thing. The message box appears for about 15 seconds, then the program exits. Message boxes are modal, so I don't think there's any message my application could send that would cause it to exit. I'm trying a reinstall.
  15. BattleMetalChris

    Why does my win32 application terminate early?

    This is driving me insane. I've had some time off my main project - my own DX10 rendering system comprising some 20,000 lines of code - and have come back to it this week. I've found that, for no reason I can work out, it's terminating around 15 seconds into execution. With some murderous application of comment tags, I've stripped the entire thing back to just window creation and a message loop in one main.cpp file (I've commented out all the preprocessor lines in there too) and it's STILL doing it. It's not a quit message - I've placed breakpoints after the message loop terminates and it never reaches them. For what it's worth, this is the current program as it's set to compile, in its entirety:

    #include <windows.h>

    LRESULT CALLBACK WinProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam);

    int g_clientWidth = 800;
    int g_clientHeight = 600;

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, int showCmd)
    {
        WNDCLASS wc;
        wc.cbClsExtra = 0;
        wc.cbWndExtra = 0;
        wc.hbrBackground = (HBRUSH)::GetStockObject(BLACK_BRUSH);
        wc.hCursor = ::LoadCursor(0, IDC_ARROW);
        wc.hIcon = ::LoadIcon(0, IDI_APPLICATION);
        wc.hInstance = hInstance;
        wc.lpfnWndProc = WinProc;
        wc.lpszClassName = "Poo\n";
        wc.lpszMenuName = NULL;
        wc.style = CS_HREDRAW | CS_VREDRAW;
        RegisterClass(&wc);

        HWND hWnd = ::CreateWindow(wc.lpszClassName, "Caption", WS_POPUP, 0, 0, g_clientWidth, g_clientHeight, NULL, NULL, hInstance, 0);
        ShowWindow(hWnd, showCmd);
        UpdateWindow(hWnd);

        MSG msg;
        ZeroMemory(&msg, sizeof(MSG));
        while (msg.message != WM_QUIT)
        {
            if (PeekMessage(&msg, 0, 0, 0, PM_REMOVE))
            {
                TranslateMessage(&msg);
                DispatchMessage(&msg);
            }
            else
            {
                // game loop
            }
        }
        MessageBox(0, "Finished", "FPS", MB_OK);
        return 0;
    }

    LRESULT CALLBACK WinProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg)
        {
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        case WM_KEYDOWN:
            switch (LOWORD(wParam))
            {
            case VK_ESCAPE:
                DestroyWindow(hWnd);
                return 0;
            }
            break;
        }
        return DefWindowProc(hWnd, msg, wParam, lParam);
    }

    Does anyone have ANY idea how I might go about getting to the bottom of this? Various attempts at googling just find tutorials about terminating an application, not ways of stopping it terminating.