
sunnysideup

Member
  • Content count

    20
  • Joined

  • Last visited

Community Reputation

105 Neutral

About sunnysideup

  • Rank
    Member
  1. I'm a little confused. Whenever I call wglChoosePixelFormatARB(), what exactly am I supposed to do with the returned pixel format? Should I call SetPixelFormat(), passing in the returned pixel format as the second parameter? If so, should I pass NULL for the PIXELFORMATDESCRIPTOR parameter? Here's the bit of code in question:
[source lang="cpp"]UINT pixFormatCount = 0;

// Specify the desired pixel attributes for the frame buffer.
int pixAttribs[] =
{
    WGL_DRAW_TO_WINDOW_ARB, 1,                       // Can be drawn to a window.
    WGL_DEPTH_BITS_ARB, 24,                          // 24 bits for the depth buffer.
    WGL_STENCIL_BITS_ARB, 8,                         // 8 bits for the stencil buffer.
    WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB, // Use hardware acceleration.
    WGL_SWAP_METHOD_ARB, WGL_SWAP_EXCHANGE_ARB,      // Exchange front and back buffers instead of copying.
    WGL_SAMPLES_ARB, 4,                              // 4x MSAA.
    WGL_SUPPORT_OPENGL_ARB, 1,                       // Support OpenGL rendering.
    WGL_DOUBLE_BUFFER_ARB, 1,                        // Enable double-buffering.
    WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,           // RGBA color mode.
    WGL_COLOR_BITS_ARB, 32,                          // 32-bit color.
    WGL_RED_BITS_ARB, 8,                             // 8 bits for red.
    WGL_GREEN_BITS_ARB, 8,                           // 8 bits for green.
    WGL_BLUE_BITS_ARB, 8,                            // 8 bits for blue.
    WGL_ALPHA_BITS_ARB, 8,                           // 8 bits for alpha.
    0
};

int pixelFormat = 0;

// Find a pixel format that matches our pixel attributes.
BOOL result = wglChoosePixelFormatARB(m_hDC, &pixAttribs[0], NULL, 1, &pixelFormat, &pixFormatCount);

// <-- What do I do now?
SetPixelFormat(m_hDC, pixelFormat, NULL); // <-- Is this correct?[/source]
    The book I'm reading explains how to choose a pixel format, how to enumerate available pixel formats, and how to query individual attributes, but it doesn't explain what I'm supposed to do with the pixel format.
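    [edit] For anyone who finds this later: the pattern I've since seen recommended (I'm not 100% sure it's the only way) is to fill in a PIXELFORMATDESCRIPTOR with DescribePixelFormat() for the format index that wglChoosePixelFormatARB() returned, and pass that descriptor to SetPixelFormat() rather than NULL:
[source lang="cpp"]// Sketch: set the pixel format chosen by wglChoosePixelFormatARB().
// SetPixelFormat() expects a valid PIXELFORMATDESCRIPTOR, so let
// DescribePixelFormat() fill one in for the chosen format index.
if (result && pixFormatCount > 0)
{
    PIXELFORMATDESCRIPTOR pfd = {};
    DescribePixelFormat(m_hDC, pixelFormat, sizeof(pfd), &pfd);

    if (!SetPixelFormat(m_hDC, pixelFormat, &pfd))
    {
        // Keep in mind a window's pixel format can only be set once,
        // so this fails if SetPixelFormat() was already called on this HDC.
    }
}[/source]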
  2. OpenGL: Questions about OpenGL

    There's an NVIDIA Nsight Eclipse Edition, but as far as I can tell it's only available for Mac and Linux. I don't know why they couldn't release a Windows version. It's frustrating for hobbyists on Windows who don't have the money to spend on Visual Studio.   [edit] Having a second look at the Nsight Eclipse Edition, I'm not sure it can be used to debug OpenGL applications. Oh well. I just came across http://apitrace.github.io/. Has anyone used it? Is it any good?
  3. Now I get the following error whenever I try to ALT+ENTER: DXGI ERROR: IDXGISwapChain::GetContainingOutput: The swapchain's adapter does not control the output on which the swapchain's window resides.
  4. Thank you, ajmiles. Fixing my code solved the problem, as long as I only specify D3D_FEATURE_LEVEL_11_1 and D3D_FEATURE_LEVEL_11_0 in the feature level array. But if I add D3D_FEATURE_LEVEL_10_1 to the array, it still selects that feature level. I guess if I actually started using some Direct3D 11-only features then it would work?
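    [edit] For reference, a quick way to confirm which feature level the runtime actually picked (assuming device is the ID3D11Device that D3D11CreateDevice filled in):
[source lang="cpp"]// Query the feature level the runtime selected for this device.
D3D_FEATURE_LEVEL chosen = device->GetFeatureLevel();

// The enum values are ordered, so a simple comparison works.
if (chosen >= D3D_FEATURE_LEVEL_11_0)
{
    OutputDebugStringA("Device created at feature level 11_0 or higher.\n");
}
else
{
    OutputDebugStringA("Device fell back to a pre-11 feature level.\n");
}[/source]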
  5. I have a GTX 570 graphics card which SHOULD support DirectX 11, but the highest feature level that D3D11CreateDevice selects is 10_1. I had a look around on these forums and it seems that other people have had this problem, and that it might be NVIDIA Optimus selecting the Intel integrated graphics chipset instead of the dedicated NVIDIA GPU. So I added my .exe to the list of programs in the NVIDIA control panel and then used the following code to iterate through the available adapters and hopefully select the correct one. Here's the code:
[source lang="cpp"]/* Create the DXGI factory. */
Microsoft::WRL::ComPtr<IDXGIFactory1> dxgi_factory;
HRESULT hr = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&dxgi_factory);
if (FAILED(hr))
{
    throw std::runtime_error("Failed to create dxgi factory.");
}
if (FAILED(dxgi_factory.As(&m_dxgi_factory)))
{
    throw std::runtime_error("Failed to obtain IDXGIFactory2 interface.");
}

D3D_FEATURE_LEVEL feature_levels[] =
{
    D3D_FEATURE_LEVEL_11_1,
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_3
};
int num_feature_levels = ARRAYSIZE(feature_levels);

Microsoft::WRL::ComPtr<ID3D11Device> device;
Microsoft::WRL::ComPtr<ID3D11DeviceContext> context;

/* Iterate through the available adapters. D3D11CreateDevice picks the
   highest feature level in the array that the adapter supports, so a
   single pass over the adapters is enough. */
for (UINT a = 0; m_dxgi_factory->EnumAdapters1(a, &m_dxgi_adapter) != DXGI_ERROR_NOT_FOUND; ++a)
{
    hr = D3D11CreateDevice(m_dxgi_adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, 0,
                           create_device_flags, feature_levels, num_feature_levels,
                           D3D11_SDK_VERSION, &device, &m_feature_level, &context);

    /* On success, keep this adapter and stop looking. */
    if (SUCCEEDED(hr))
        break;
}[/source]
    This still doesn't work. Someone on these forums also suggested looking in the DirectX Caps Viewer, and in the DXGI 1.1 > NVIDIA GeForce GTX 570 > Direct3D 11 folder it only lists D3D_FEATURE_LEVEL_10_1. It's like my GPU isn't being recognized as a DirectX 11 capable device.   Does anyone have any suggestions? Thanks.
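    [edit] Thinking about it some more: since the whole point is to avoid the Intel chipset, maybe I should pick the adapter by its description instead of just taking whichever one creates a device first. Here's a rough sketch of what I mean (it assumes the dedicated card reports "NVIDIA" in DXGI_ADAPTER_DESC1::Description, which I haven't verified):
[source lang="cpp"]#include <wrl/client.h>
#include <dxgi.h>
#include <cwchar>

/* Sketch: walk the adapter list and keep the first adapter whose
   description string mentions "NVIDIA", so the Intel chipset is skipped. */
Microsoft::WRL::ComPtr<IDXGIAdapter1> PickNvidiaAdapter(IDXGIFactory1* factory)
{
    Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (wcsstr(desc.Description, L"NVIDIA") != nullptr)
        {
            return adapter; /* Found the dedicated card. */
        }
        adapter.Reset();
    }
    return nullptr; /* No NVIDIA adapter found; caller falls back to the default. */
}[/source]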
  6. My lowly triangle will not render :(

    I never did figure out what was wrong, but I rewrote the whole thing and got it working.
  7. Hello, I'm having some trouble getting a simple triangle to render. I've done this before (yesterday, actually, with a cube), but I must have lost the code or something. Looking in PIX, it looks like the vertices all get set to zero after the vertex shader stage. I don't understand why, but then I'm a noob, so there's a lot I don't understand yet. I was hoping somebody might take a look at the code and see if they can help me out. I'd really appreciate it. Thanks.
  8. [quote name='NumberXaero' timestamp='1352188795' post='4997917'] Are you updating your projection matrix? [/quote] There's no projection transformation. This is actually just a slightly modified version of the triangle tutorial in the SDK docs. I (perhaps stupidly) thought that resizing the swap chain's buffers would cause the triangle to retain its shape when the window is resized. I'm guessing that would require a projection transformation? If I wanted to make a 2D game, for example, where the sprites were rendered as squares, would I need to apply a projection transformation to the squares? Pardon my ignorance.
  9. I think I might be able to better clarify what I'm getting at. I've rendered an equilateral triangle with vertices (0.0, 0.5, 0.5), (0.5, -0.5, 0.5), and (-0.5, -0.5, 0.5) without any projection transformation. How can I make it so that the triangle remains equilateral whenever the user resizes the window? Isn't this what ResizeBuffers() is supposed to do, or would this require a projection transformation or something like that?
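    [edit] For anyone with the same question: as far as I can tell now, ResizeBuffers() only resizes the back buffer; clip space always maps to the whole viewport, so the distortion has to be corrected with an aspect-aware projection. A minimal sketch using DirectXMath (uploading the matrix to a constant buffer and applying it in the vertex shader isn't shown):
[source lang="cpp"]#include <DirectXMath.h>
using namespace DirectX;

// Rebuild the projection whenever the window size changes so that one
// unit along x covers the same number of pixels as one unit along y.
XMMATRIX MakeAspectCorrectedProjection(UINT width, UINT height)
{
    float aspect = (float)width / (float)height;

    // A 2-unit-tall orthographic volume whose width grows with the
    // window, so geometry keeps its proportions instead of stretching.
    return XMMatrixOrthographicLH(2.0f * aspect, 2.0f, 0.0f, 1.0f);
}[/source]
    With this in place, the triangle keeps its proportions at any window size instead of stretching with the client area.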
  10. Eeek. It doesn't look like the code highlighting came out right. Sorry about that, guys :/
  11. I've programmed a simple little app that renders a triangle to the screen. Whenever I maximize the window or drag the resize bars, the triangle squishes and stretches along with the window. I wrote a function that uses IDXGISwapChain::ResizeBuffers() to resize the swap chain's buffers in response to a WM_SIZE message, but it doesn't seem to fix the distortion. Am I misunderstanding the purpose of IDXGISwapChain::ResizeBuffers()? Here's my OnResize() function. I'm either doing something wrong, or ResizeBuffers doesn't do what I thought it did. Sorry if this is a dumb question.
[source lang="cpp"]HRESULT OnResize(LPARAM lParam)
{
    // Unbind the render target before releasing it.
    g_pImmediateContext->OMSetRenderTargets(0, 0, 0);

    // Release any extraneous references to the swap chain's buffers. The render
    // target view typically holds references to the swap chain's buffers.
    g_pRenderTargetView->Release();

    UINT width = LOWORD(lParam);
    UINT height = HIWORD(lParam);

    // Resize the swap chain's buffers.
    HRESULT hr = g_pSwapChain->ResizeBuffers(1, width, height, DXGI_FORMAT_R8G8B8A8_UNORM, 0);
    if(FAILED(hr))
    {
        MessageBox(NULL, "[ERROR] Failed to resize buffers.\nOnResize() in renderer.cpp",
                   "ERROR", MB_ICONERROR | MB_OK);
        return hr;
    }

    // Get the back buffer and create the render target view.
    ID3D11Texture2D* pBackBuffer = nullptr;
    hr = g_pSwapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), (void**)&pBackBuffer);
    if(FAILED(hr))
    {
        MessageBox(NULL, "[ERROR] Failed to get back buffer.\nOnResize() in renderer.cpp",
                   "ERROR", MB_ICONERROR | MB_OK);
        return hr;
    }

    hr = g_pD3DDevice->CreateRenderTargetView(pBackBuffer, NULL, &g_pRenderTargetView);
    pBackBuffer->Release();
    if(FAILED(hr))
    {
        MessageBox(NULL, "[ERROR] Failed to create render target view."
                   "\nOnResize() in renderer.cpp", "ERROR", MB_ICONERROR | MB_OK);
        return hr;
    }

    g_pImmediateContext->OMSetRenderTargets(1, &g_pRenderTargetView, NULL);

    // Set up the viewport to match the new client area.
    D3D11_VIEWPORT vp;
    vp.Width = (float)width;
    vp.Height = (float)height;
    vp.MinDepth = 0.0f;
    vp.MaxDepth = 1.0f;
    vp.TopLeftX = 0;
    vp.TopLeftY = 0;
    g_pImmediateContext->RSSetViewports(1, &vp);

    return S_OK;
}[/source]
    And here's the little snippet inside of WndProc() where I call it.
[source lang="cpp"]case WM_SIZE:
    if(g_pSwapChain)
    {
        OnResize(lParam);
    }
    return 0;[/source]
    Thanks for the help.
  12. It was a joke, guys. Lighten up a little... :)
  13. Oh, fudge it... I'll just disable the warning for now and deal with it later. I haven't been using Direct3D long enough to start losing hair over it :D
  14. Should I just use the Windows SDK version of d3d11.h? I'm confused.
  15. [quote]Remove all references to DXGIType.h in your project. This header doesn't exist in the Windows SDK, and the DirectX SDK version conflicts with the new winerror.h.[/quote] That seems to be my problem. How do I "remove all references to DXGIType.h"?