
About thebjoern

  1. D3DXSPRITE problem - wrongly scaled?

    Ok, got it, thanks again! Yeah, I was really wondering which part made it resize... that it was the filter wasn't intuitive to me. Thanks!
  2. D3DXSPRITE problem - wrongly scaled?

    Ok, finally I understand, thanks again Steve! It seems that D3DXCreateTextureFromFileEx not only loads the texture and fits it into an area the size of the next higher power of two (in my case my 424x69 texture becomes 512x128), but it actually stretches it too. My original expectation was that it would place the 424x69 image into the 512x128 texture and leave the excess area zeroed out or uninitialized, so that when I draw my sprite and pass in the image size 424x69 it would just copy that section of the original image to the screen. But given that it stretches the image to fill the 512x128 texture, it also makes sense that it gets cut off. So the D3DX_DEFAULT_NONPOW2 setting actually fixed this. Is that what the MSDN documentation means by "Mipmapped textures automatically have each level filled with the loaded texture."? Even though I only specified one level... Sorry if this thread seems kinda silly and newbie-ish, but I am a newbie in this ;-) Anyway, it works now, so I can move on :-) cheers, Bjoern
  3. D3DXSPRITE problem - wrongly scaled?

    Here is the above-mentioned screenshot. Now, with the source rectangle set to the size of the original image file, it just clips the parts that have been scaled up.
  4. D3DXSPRITE problem - wrongly scaled?

    Hi Steve, thanks for your reply. I was already loading with D3DXCreateTextureFromFileEx, but I didn't set the D3DX_DEFAULT_NONPOW2 flag. However, I just changed my code for testing: I now check the size of the texture (which, as you said, is rounded up to the next power of two) and store the size of the original image from the file. For example, the buttons are 29x29 and therefore end up 32x32, which makes sense. So when rendering, I now use that rectangle (i.e. 0,0,29,29) as the source rectangle, but I still get the scaling problem. The difference compared to before is that the textures now fill the expected area of the screen, but because they are still scaled they are now also cut off. Please see the screenshot. So somewhere there must still be some scaling happening, but I don't know where, because I am not setting any view matrices or anything. Hmm, I guess I must still be doing something stupid somewhere...
  5. D3DXSPRITE problem - wrongly scaled?

    Anyone got an idea? I still can't make sense of it... :-/
  6. D3DXSPRITE problem - wrongly scaled?

    I just wanted to illustrate the size problem with a screenshot where you can see the file displayed in 100% and behind that how it is actually rendered
  7. Hi, I am using D3D9 for rendering some simple things: a movie as the backmost layer, then on top of that some text messages, and now I wanted to add some buttons. Before adding the buttons everything seemed to work fine, and I was using a D3DXSPRITE for the text as well (D3DXFONT). Now I am loading some graphics for the buttons, but they seem to be scaled to something like 1.2x their original size. (In my test window I centered the graphic, but being too big it just doesn't fit well: the client area is 640x360 and the graphic is 440 wide, so I expect 100 pixels on the left and right. The left side is fine [I took a screenshot and "counted" the pixels in Photoshop], but on the right there are only about 20 pixels.) My rendering code is very simple (I am omitting error checks etc. for brevity):

    // initially the viewport was set to width/height of the client area
    // clear device
    m_d3dDevice->Clear( 0, NULL, D3DCLEAR_TARGET|D3DCLEAR_STENCIL|D3DCLEAR_ZBUFFER, D3DCOLOR_ARGB(0,0,0,0), 1.0f, 0 );
    // begin scene
    m_d3dDevice->BeginScene();
    // render movie surface (just two triangles to which the movie is rendered)
    m_d3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, false);
    m_d3dDevice->SetSamplerState( 0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR ); // bilinear filtering
    m_d3dDevice->SetSamplerState( 0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR ); // bilinear filtering
    m_d3dDevice->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TEXTURE );
    m_d3dDevice->SetTextureStageState( 0, D3DTSS_COLORARG2, D3DTA_DIFFUSE ); // ignored
    m_d3dDevice->SetTextureStageState( 0, D3DTSS_COLOROP, D3DTOP_SELECTARG1 );
    m_d3dDevice->SetTexture( 0, m_movieTexture );
    m_d3dDevice->SetStreamSource(0, m_displayPlaneVertexBuffer, 0, sizeof(Vertex));
    m_d3dDevice->SetFVF(Vertex::FVF_Flags);
    m_d3dDevice->DrawPrimitive(D3DPT_TRIANGLELIST, 0, 2);
    // render sprites
    m_sprite->Begin(D3DXSPRITE_ALPHABLEND | D3DXSPRITE_SORT_TEXTURE | D3DXSPRITE_DO_NOT_ADDREF_TEXTURE);
    // text drop shadow
    m_font->DrawText( m_playerSprite, m_currentMessage.c_str(), m_currentMessage.size(),
                      &m_playerFontRectDropShadow, DT_RIGHT|DT_TOP|DT_NOCLIP, m_playerFontColorDropShadow );
    // text
    m_font->DrawText( m_playerSprite, m_currentMessage.c_str(), m_currentMessage.size(),
                      &m_playerFontRect, DT_RIGHT|DT_TOP|DT_NOCLIP, m_playerFontColorMessage );
    // control object
    m_sprite->Draw( m_texture, 0, 0, &m_vecPos, 0xFFFFFFFF ); // draws a few objects like this
    m_sprite->End();
    // end scene
    m_d3dDevice->EndScene();

    What did I forget to do here? Except for the control objects (play button, pause button etc., which are placed on a "panel" about 440 pixels wide), everything seems fine: the objects are positioned where I expect them, just too big. Btw, I loaded the images using D3DXCreateTextureFromFileEx (resizing the window and reacting to a lost device etc. works fine too). For experimenting, I added some code to take an identity matrix and scale it down on the x/y axes to 0.75f, which then gave me the expected result for the controls (but also made the text smaller and out of position); however, I don't know why I would need to scale anything. My rendering code is so simple; I just wanted to draw my 2D objects 1:1 at the size they came from the file... I am really very inexperienced in D3D, so the answer might be very simple... thanks Bjoern
  8. Ok, thanks for the information! Nevertheless, I would still like to find out if there is an easy way of determining whether DX11 (10) or DX9 is installed on the system, BEFORE loading any of the DX dlls. I don't even need the exact version of DX, just whether 9/10/11 is available, so I know which of my dlls I have to load... thanks, Bjoern
  9. Hi Matt, thanks for your quick reply. Basically what I meant is that I have an exe which then loads a dll with my code, depending on the setting in my config file (i.e. DX9/10/11/OpenGL). So in my dlls I just have different approaches to how I render the (simple) graphics that we require. With that approach I was hoping not to make the user download/install DirectX (on Vista/Win 7 we can already assume a minimum of DX10, and for XP we provide, if necessary, a link to install DX9c), and to keep the installation as simple and straightforward as possible. [So the installer only checks on XP for the minimum requirement of DX9c; on the newer OS versions we don't check.] Therefore I was hoping that when I start my exe I can detect what's already available on the system (i.e. DX9c, DX10 or DX11), and accordingly load the appropriate dll of our code. That's why I was trying to find out if there is a simple programmatic way of doing that, with some code that doesn't need to be specifically installed but can somehow determine which version of DX the user has [and I am only using D3D and D3DX, no XInput or XAudio or anything]. So far I only found DirectXSetupGetVersion(), which seems to be from DX9 times and no longer supported? Does that make sense what I am saying here? cheers, Bjoern
  10. Hi all, I have created a simple application which loads from a cfg file whether I want to use DX9, DX10, or DX11, and then proceeds to load the dlls with the corresponding functionality, which all works and renders fine. Now I want to move on to determining which DX I can use, but the only way I know of is using DSetup by calling DirectXSetupGetVersion(). But it seems that is outdated? And I don't even have DSetup.dll on my machine (even though I had the .lib in my June 2010 SDK lib folder). So what is the correct way of finding out, other than loading a dll and trying to create a device of that version (i.e. CreateDevice) and, if that fails, falling back to a lower version until I get one working (or falling back to OpenGL, which I support too)? That doesn't seem like a great way of doing it. Thanks for your help, Bjoern
  11. Thanks for your replies. So I suppose I will have to go and change the design a bit, so that just the decoding happens on the other thread, and the copying of the data will happen on the main rendering thread. I had just hoped I could do it the same way as I did it for DX11, that way I could reuse more of the code... anyway, thanks for your help, Bjoern
  12. Thanks for your replies. The reason I wanted to go that route was that I already had a working system and class framework in place from when I first implemented it for DX11 (which seems to be quite friendly with threaded environments and allows me to create and modify resources on the other thread quite easily). So all I needed to do was to derive from those same base classes and create OpenGL specific implementations of them. However, as you both seem to be so strictly against it, I might have to reconsider I suppose. But can you please explain to me what the reason is that I should try to avoid multiple contexts at all cost? P.S. in case you wonder why I want to make an OpenGL version of the renderer: I want to get it to work on Mac too... Again thanks for your time and comments! Bjoern
  13. Hi, thanks for your replies. I have played around with that today and almost got there, except for the shared lists... So what I am doing now on entering the decoding thread is: first I create a new context:

    m_hdc = m_window->GetHdc();
    m_hrc = wglCreateContext(m_hdc);

    Then I set the same pixel format as for my "main context":

    SetPixelFormatGL(m_hdc);

    Then I make it current:

    wglMakeCurrent(m_hdc, m_hrc);

    And then finally I try to share the lists:

    wglShareLists(mainContext->m_hrc, m_hrc);

    Note that I have omitted the error-checking code here for brevity. The "mainContext" is the context that was created first on the rendering thread (i.e. the only thread that does actual rendering; the decoder thread only wants to copy the new video frame data to one of the textures, which is not actively used for rendering with glBindTexture() during that time). And my problem is that the wglShareLists() call always returns FALSE, but I don't know what to change to make it work... any ideas? Btw: as the decoder is created on demand by the application, the mainContext in most cases has already created other textures and things (for the UI). Might this be the reason for the failure? thanks, Bjoern
  14. Hi, I have recently created an application which can decode video files and then render the video to the screen. I did that in DX11, and now I want to do it for OpenGL as well, but I am new to OpenGL (and generally to rendering), though I have programmed for many years. Basically my main thread does the rendering of all the objects, and the decoder runs on another thread. I have a class RenderObject which represents anything that can be rendered, and derived from that I have a DoubleBufferSurface class, which has an "active surface" (the one being rendered by the rendering thread) and a "buffer surface" (the one the decoder writes its current frame to). Once the buffer surface has been updated, the "activeSurface" pointer is switched to that surface and vice versa, so that the new video frame is rendered on the next frame. This all works well in DX11, but when I started implementing it in OpenGL, I found that I cannot call any OpenGL functions on my decoder thread (they all return errors). So my question is: is it easily possible to implement this for OpenGL? The decoder thread does no rendering; all it wants to do is write the data from the video to the buffer (or inactive) surface, using glTexSubImage2D(). What are the steps necessary to make this work? thanks, Bjoern
  15. Ok, thanks for your help Jason and Dieter, much appreciated, as it helped me to finally get it working (together with the help of PIX, which is a great tool). A few small issues are left to fix, then I can move on to the next challenge...