mattropo1is

Members
  • Content count: 24
  • Joined
  • Last visited

Community Reputation

148 Neutral

About mattropo1is

  • Rank: Member
  1. eglChooseConfig() allows you to specify a set of config attributes for color depths (EGL_RED_SIZE, etc.) for your frame buffer. However, there doesn't seem to be a way to specify a component ordering such as ARGB, RGBA, XRGB, or RGBX, as is possible in DirectX. You seem to be able to do this for textures, but not during initial frame buffer/context setup for OpenGL ES 3+. Is that correct? Is there no way to specify XRGB vs. ARGB explicitly? I know I can practically work around this by creating an RGBA config and then always assuming A is 255 in shaders, but my goal is a true XRGB format.
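     For context, a minimal sketch of the kind of attribute list eglChooseConfig() takes; it only exposes per-channel bit sizes, which is exactly the limitation described above (the ES3 renderable bit assumes EGL 1.5 or the EGL_KHR_create_context extension):
[code]
// Minimal sketch: requesting a config for an OpenGL ES 3 window surface.
// EGL lets you ask for per-channel bit depths, but not for ARGB vs. RGBA
// vs. XRGB ordering - that part is up to the implementation.
#include <EGL/egl.h>

EGLConfig ChooseConfig(EGLDisplay display)
{
    const EGLint attribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES3_BIT,   // EGL 1.5 / EGL_KHR_create_context
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_RED_SIZE,        8,
        EGL_GREEN_SIZE,      8,
        EGL_BLUE_SIZE,       8,
        EGL_ALPHA_SIZE,      0,   // "no alpha" requested; the driver may still return an RGBA config
        EGL_NONE
    };

    EGLConfig config = nullptr;
    EGLint numConfigs = 0;
    eglChooseConfig(display, attribs, &config, 1, &numConfigs);
    return (numConfigs > 0) ? config : nullptr;
}
[/code]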
  2. Thanks guys. Yes, I can confirm if you just lock DXT compressed textures, the data you get back is compressed. As for D3DXLoadSurfaceFromSurface(), you might also find (as I did) that D3DX11LoadTextureFromTexture() actually works better/easier. Yes, it is deprecated for Win8, but it was ok for our purposes. However, there is a new MS DirectXTex library for doing these operations that is compatible with Win8. Thanks for the help!
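     For anyone finding this later, a rough sketch of CPU-side decompression with the DirectXTex library mentioned above (the file path and target format are just examples; error checks omitted for brevity):
[code]
// Sketch: load a DXT1/BC1 .dds file and decompress it to RGBA8 with DirectXTex.
#include <DirectXTex.h>

DirectX::ScratchImage LoadAndDecompress( const wchar_t* path )
{
    DirectX::TexMetadata  metadata;
    DirectX::ScratchImage compressed;
    DirectX::LoadFromDDSFile( path, DirectX::DDS_FLAGS_NONE, &metadata, compressed );

    DirectX::ScratchImage decompressed;
    DirectX::Decompress( compressed.GetImages(), compressed.GetImageCount(),
                         compressed.GetMetadata(), DXGI_FORMAT_R8G8B8A8_UNORM,
                         decompressed );
    return decompressed;   // GetPixels() now points at uncompressed RGBA8 texels
}
[/code]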
  3. Hi guys, I have some textures that were created as DXT1 compressed. I'd like to map them and get the data back out. To that end, I create a staging buffer, copy the original resource into it, then map the staging buffer (see the sketch below). This works just fine for non-DXT-compressed textures, and the data all looks correct with the test textures I used. Questions: 1. When I map a DXT1-compressed texture, am I seeing compressed bits? 2. If they are not the compressed bits, what format are they in (RGB8)? 3. If they ARE compressed bits: 3a. Can I give the staging buffer a different format such as RGB8 to get the non-compressed bits (i.e. use it as a decompressor)? 3b. If 3a isn't possible, are there software decompressors in a library somewhere that I could use to get the raw bits back out? Thanks in advance.
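     For later readers, a minimal sketch of the staging-copy-then-Map approach described above (device/context/resource names are placeholders; per the follow-up post, for BC1/DXT1 the mapped data is still compressed):
[code]
// Sketch: copy a GPU texture into a CPU-readable staging texture and map it.
// For BC1/DXT1 the mapped memory holds the compressed 4x4 blocks, not RGB texels.
// Error handling omitted for brevity.
D3D11_TEXTURE2D_DESC desc;
pSourceTexture->GetDesc(&desc);
desc.Usage          = D3D11_USAGE_STAGING;
desc.BindFlags      = 0;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.MiscFlags      = 0;

ID3D11Texture2D* pStaging = nullptr;
pDevice->CreateTexture2D(&desc, nullptr, &pStaging);
pContext->CopyResource(pStaging, pSourceTexture);

D3D11_MAPPED_SUBRESOURCE mapped;
pContext->Map(pStaging, 0, D3D11_MAP_READ, 0, &mapped);
// mapped.pData    -> raw subresource data (compressed blocks for BC formats)
// mapped.RowPitch -> bytes per row (per row of blocks for BC formats)
pContext->Unmap(pStaging, 0);
pStaging->Release();
[/code]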
  4. Well, after much investigation and experimentation, it comes down to much of what has been said.
     - The internal DX texture functions do NOT seem to understand or honor sRGB/gamma properly in png/jpg/dds files. They pretty much load everything, assuming it is non-sRGB, into UNORM/standard RGB-component internal formats.
     - The nVidia Photoshop DDS plugin doesn't seem to mark things as sRGB properly. Sadly, this doesn't really matter, as it's probably not honored anyway.
     - You will have to have some sort of meta-data file, or a policy, that tells you which files need to be created using an sRGB texture surface/buffer.
     - Make sure your content generation tools are outputting to sRGB. Adobe has a color space setting for this; some tools do NOT have sRGB output. Make sure you can get your favorite content into the sRGB format you want before you make the jump.
     - If you wish to use the DX texture loading routines, you need to use the sRGB load flags on D3DXGetImageInfoFromFile() (i.e. D3DX11_IMAGE_LOAD_INFO with Info.Filter = D3DX11_FILTER_NONE | D3DX11_FILTER_SRGB_IN | D3DX11_FILTER_SRGB_OUT). This performs the correct interpretation of the data.
     - Both your back buffer AND the textures need to be a DXGI_FORMAT_xxxx_SRGB format type.
     - There are very few DXGI_FORMAT_xxxx_SRGB formats - only 6 to be exact, and 4 are BCx compressed types (see the small helper sketch below). If you're not very careful, odds are very good you'll pay for a painful load-time conversion, because most of the old formats are ARGB but the SRGB formats are RGBA. If you're looking for single-channel formats, you'll be sorely disappointed.
     - It's useful at first to change your pixel shader to do a pow(2.2) on each sample (using the old textures) to see what you SHOULD be getting. Then convert the textures, make the format and loading changes, and compare the results; they should look nearly the same.
     It certainly does appear that at some point you'll likely want to come up with your own texture format, or get by using some extra meta-data file, to specify whether an image is sRGB (diffuse channels, etc.) or not (normal maps, data maps, etc.). Thanks for the help.
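     As a small illustration of how few _SRGB variants there are, a sketch of a helper that promotes the common UNORM formats (the exact set available depends on your DXGI/Direct3D version, so treat this as an example rather than an exhaustive list):
[code]
// Sketch: map a UNORM format to its _SRGB variant where one exists.
// Only a handful of formats have sRGB equivalents; everything else falls through.
DXGI_FORMAT MakeSRGB(DXGI_FORMAT fmt)
{
    switch (fmt)
    {
    case DXGI_FORMAT_R8G8B8A8_UNORM: return DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
    case DXGI_FORMAT_B8G8R8A8_UNORM: return DXGI_FORMAT_B8G8R8A8_UNORM_SRGB;
    case DXGI_FORMAT_BC1_UNORM:      return DXGI_FORMAT_BC1_UNORM_SRGB;
    case DXGI_FORMAT_BC2_UNORM:      return DXGI_FORMAT_BC2_UNORM_SRGB;
    case DXGI_FORMAT_BC3_UNORM:      return DXGI_FORMAT_BC3_UNORM_SRGB;
    case DXGI_FORMAT_BC7_UNORM:      return DXGI_FORMAT_BC7_UNORM_SRGB;
    default:                         return fmt;   // no sRGB variant available
    }
}
[/code]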
  5. [quote name='MJP' timestamp='1323130627' post='4890914'] ... The loader [i]will[/i] properly handle DDS files, so you also have the option of creating a preprocessing pipeline that converts and compresses your textures in advance so that you don't need to do anything fancy at load time (or using a plugin for your content creation software that will save as DDS). [/quote] I have Photoshop CS5 and used the nVidia DDS plugin to save the file. I had Photoshop set to sRGB space, and when I went to save using the nVidia DDS plugin, it didn't show any obvious checkboxes to create an sRGB DDS texture. I saved it, and D3DX11CreateShaderResourceViewFromFile still created the non-sRGB UNORM texture. Have you gotten D3DX11CreateShaderResourceViewFromFile to properly return an sRGB texture? What was your tool/process for creating the sRGB texture? M
  6. [quote name='MJP' timestamp='1323127197' post='4890888'] Yeah, it won't load as an sRGB format by default. You have to instruct the function to load it using an sRGB format (and not convert it) using the loadInfo parameter. Something like this should work: [code]
D3DX11_IMAGE_LOAD_INFO loadInfo;
loadInfo.Width = D3DX11_DEFAULT;
loadInfo.Height = D3DX11_DEFAULT;
loadInfo.Depth = D3DX11_DEFAULT;
loadInfo.FirstMipLevel = D3DX11_DEFAULT;
loadInfo.MipLevels = D3DX11_DEFAULT;
loadInfo.Usage = D3D11_USAGE_IMMUTABLE;
loadInfo.BindFlags = D3D11_BIND_SHADER_RESOURCE;
loadInfo.CpuAccessFlags = 0;
loadInfo.MiscFlags = 0;
loadInfo.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
loadInfo.Filter = D3DX11_FILTER_SRGB_IN | D3DX11_FILTER_SRGB_OUT | D3DX11_FILTER_NONE;
loadInfo.MipFilter = D3DX11_DEFAULT;
loadInfo.pSrcInfo = NULL;

D3DX11CreateShaderResourceViewFromFile(pd3dDEvice, pFilename, &loadInfo, NULL, &pTextureResourceView, NULL);
[/code] [/quote] Thanks. Yes, this does get the desired result for this file, but the problem is that this code *forces* the jpg into sRGB space (it does look correct). But what I want is for D3DX11CreateShaderResourceViewFromFile to create the *correct* resource view based on what the texture file *says* it is. If the texture is not sRGB (i.e. normal map/albedo/etc.), then it should load as RGBA8_UNORM. If it IS saved as sRGB, then I'd want the RGBA8_UNORM_SRGB version... Any ideas on that one?
  7. I've got a block of code that's loading what I'm pretty sure is an sRGB jpg (the file properties say it is encoded in the sRGB color format, and I selected the sRGB profile when saving it in Photoshop) that I wish to use as a texture. [code]
hResult = D3DX11CreateShaderResourceViewFromFile( pd3dDevice, pFilename, NULL, NULL, &pTextureResourceView, NULL );
pTextureResourceView->GetDesc( &desc );
desc.Format;
[/code] When I run the above code on my texture, it always loads as DXGI_FORMAT_R8G8B8A8_UNORM, not DXGI_FORMAT_R8G8B8A8_UNORM_SRGB as I would expect. Further, when I use the texture, it's washed out (too bright) as if it's been loaded as RGB. If I do a pow(sampleColor, 2.2) in the pixel shader, then the color looks right. Am I missing something on loading the texture correctly to get it to recognize it as sRGB?
  8. [quote name='ApochPiQ' timestamp='1319694027' post='4877471'] Maybe I'm just confused (or sleep deprived!) but I don't see how this is a bug necessarily. You allocate memory pre-main() via invoking string constructors, via your static object. So if you check for leaks at the end of main(), you're not going to see that memory freed. You have to check for leaks after static destruction occurs, which is the purpose of the _CrtSetDbgFlag() call; if you don't get a report at program exit, it means you have no leaks - which, in this case, is exactly what I'd expect, after you've seen that result from other tools. [/quote] In almost every other case, I would say you're right. But the problem is this is an infrastructure library that many people will be using. I can imagine people using the CRT leak detection tools, seeing those two 'leaks', and assuming I'd been too lazy to clean up a couple of trivial memory leaks. It's a professionalism issue IMHO. My other motivation is that if you don't REALLY know what is causing the leak, you could be masking much bigger problems later. Because I root-caused it, changed the code, and that fixed it legitimately, I can NOW say with 100% confidence that it wasn't a real bug and have left it. But before, when I didn't know 100%, I was acting on faith. And in my experience, faith tends to blow up on you at 1am the night of code freeze. Further, the MS documentation EXPLICITLY calls out the STL containers as having had this problem fixed. This is one case where it's clearly not fixed. I'll be submitting a bug tomorrow on it. That, and it's just good nerdy fun to sort these things out. Now I can run all the configurations of my program without a peep out of the leak reporter and say that 'it doesn't leak' without the 'well, besides this leak that's not really a leak'. Thanks guys for all the help!
  9. SOLUTION!!! Wow. So, I started searching my code for static variables to see if I wasn't leaving one or two around, paying very close attention to any STL objects I was using (primarily std::vector and std::string). In searching for the keyword static, I found this: [code]
class foo {
private:
    static bar m_myBar;
};
[/code] Looking at bar, I see among the many members: [code]
class bar {
private:
    std::string m_name;
    std::string m_value;
};
[/code] BLAMO! I changed foo's m_myBar to: [code]static bar* m_myBar;[/code] then created a global initializer to new/delete the m_myBar object at startup/shutdown (see the sketch below). Leak goes away! I double-verified this was the culprit by initializing m_name and m_value to actual strings, and sure enough, those strings are dumped by _CrtDumpMemoryLeaks(). The reason they were 8/16 bytes is that an 'empty' std::string object contains a pointer and an int length: 8 = sizeof(int) + sizeof(void*) on x86, and 16 = sizeof(int) + sizeof(void*) on x64. Long story short: *static* STL container objects are NOT fixed as the MS doc indicates, and DO show up as leaks using _CrtDumpMemoryLeaks(). Time to file a bug report. Holy cow. And the crowd goes wild.
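     For completeness, a minimal sketch of that workaround (the Startup()/Shutdown() hooks are my own naming; call them from wherever your application initializes and tears down):
[code]
// Sketch of the workaround: hold the static by pointer and new/delete it
// explicitly, so the std::string members live and die during the program's run
// instead of inside a static object that outlives the leak check.
#include <string>

class bar {
private:
    std::string m_name;
    std::string m_value;
};

class foo {
public:
    static void Startup()  { m_myBar = new bar(); }
    static void Shutdown() { delete m_myBar; m_myBar = nullptr; }
private:
    static bar* m_myBar;
};

bar* foo::m_myBar = nullptr;   // definition of the static pointer (no pre-main string allocations)
[/code]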
  10. [quote name='ApochPiQ' timestamp='1319677394' post='4877413'] Remove the explicit call to _CrtDumpMemoryLeaks(), and let it run at program exit (I forget the flag for that offhand, check MSDN). You won't get false-positives from the CRT that way. [/quote] According to the Microsoft docs, it's: _CrtSetDbgFlag( _CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF ); However, I don't get a report at exit as the docs say I should... Hmmmm. MORE data. I was checking the Microsoft docs ([url="http://msdn.microsoft.com/en-us/library/x98tx3cf.aspx"]http://msdn.microsof...y/x98tx3cf.aspx[/url]) and see that you ALSO need to #define _CRTDBG_MAP_ALLOC_NEW: [code]
#define _CRTDBG_MAP_ALLOC
#define _CRTDBG_MAP_ALLOC_NEW
#include <stdlib.h>
#include <crtdbg.h>

#ifdef _DEBUG
#ifndef DBG_NEW
#define DBG_NEW new ( _NORMAL_BLOCK , __FILE__ , __LINE__ )
#define new DBG_NEW
#endif
#endif // _DEBUG
[/code] Adding THIS gives me my answer: [code]
Detected memory leaks!
Dumping objects ->
c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\include\crtdbg.h(1116) : {268} normal block at 0x0000000001E32A20, 16 bytes long.
 Data: < ? > F0 A1 B5 3F 01 00 00 00 00 00 00 00 00 00 00 00
c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\include\crtdbg.h(1116) : {267} normal block at 0x0000000001E2FF80, 16 bytes long.
 Data: < ? > C0 A1 B5 3F 01 00 00 00 00 00 00 00 00 00 00 00
[/code] which is this code in crtdbg.h: [code]
_Ret_bytecap_(_Size) inline void * __CRTDECL operator new(size_t _Size)
{
    return ::operator new(_Size, _NORMAL_BLOCK, __FILE__, __LINE__);
}
[/code] However, many people seem to report that crtdbg.h gets blamed for an allocation it didn't really make when the configuration is wrong, so I'm a little skeptical. Instead, what I did was set a memory checkpoint at the beginning and end of my program (roughly sketched below). It reports that I'm not leaking any memory: [code]
Deleting: 0 bytes in 0 Free Blocks.
0 bytes in 0 Normal Blocks.
12448 bytes in 7 CRT Blocks.
0 bytes in 0 Ignore Blocks.
0 bytes in 0 Client Blocks.
[/code] One also sees this in the Microsoft notes: [quote]In some cases, [b]_CrtDumpMemoryLeaks[/b] can give false indications of memory leaks. This might occur if you use a library that marks internal allocations as _NORMAL_BLOCKs instead of [b]_CRT_BLOCK[/b]s or [b]_CLIENT_BLOCK[/b]s. Older versions of the Standard Template Library, earlier than Visual Studio .NET, caused [b]_CrtDumpMemoryLeaks[/b] to report such false positives, but this has been fixed in recent releases.[/quote] I believe I DID see this before when I was using static std::string objects. They were reported as leaking (so maybe Microsoft HASN'T completely fixed it?). However, I can't find any more static std::strings anywhere. Perhaps it's another STL container object that I'm using(?)
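     For readers who haven't used the checkpoint API, the begin/end comparison referred to above looks roughly like this (variable names are mine; debug builds only):
[code]
// Sketch: compare CRT memory checkpoints taken at the start and end of the
// program; a significant difference is dumped as a statistics report similar
// to the one quoted above.
#include <windows.h>
#include <crtdbg.h>

int WINAPI wWinMain( HINSTANCE hInstance, HINSTANCE hPrevInstance, LPWSTR lpCmdLine, int nCmdShow )
{
    _CrtMemState stateBegin, stateEnd, stateDiff;
    _CrtMemCheckpoint(&stateBegin);

    // ... run the application ...

    _CrtMemCheckpoint(&stateEnd);
    if (_CrtMemDifference(&stateDiff, &stateBegin, &stateEnd))
        _CrtMemDumpStatistics(&stateDiff);   // leaked Normal Blocks here would point to a real leak
    return 0;
}
[/code]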
  11. [quote name='ApochPiQ' timestamp='1319676162' post='4877405'] Best I can suggest is to start stripping bits of functionality out of the program until you can pinpoint what causes the reported leak. You might have quicker results starting from an empty project and adding in code until it shows up. Depending on your compiler version it may also be a known CRT implementation bug. [/quote] Fascinatingly enough, I get the leak with this code: [code]
int WINAPI wWinMain( HINSTANCE hInstance, HINSTANCE hPrevInstance, LPWSTR lpCmdLine, int nCmdShow )
{
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF); // can comment this line in/out with same results
    _CrtDumpMemoryLeaks();
    return 0;
}
[/code] This tells me it's happening before the *start* of the program. The size of the 'leak' depends on whether I'm doing an x86 or x64 build, but in both cases Visual Leak Detector detects no leaks (I added one intentionally to make sure it was working, and it reported that leak correctly). x86 build: [code]
Detected memory leaks!
Dumping objects ->
{270} normal block at 0x0093E270, 8 bytes long.
 Data: < E > C8 9B 45 01 00 00 00 00
{269} normal block at 0x0093E228, 8 bytes long.
 Data: < E > A8 9B 45 01 00 00 00 00
[/code] x64 build: [code]
Detected memory leaks!
Dumping objects ->
{268} normal block at 0x0000000001E02A20, 16 bytes long.
 Data: < ? > F0 A1 EB 3F 01 00 00 00 00 00 00 00 00 00 00 00
{267} normal block at 0x0000000001DFFF80, 16 bytes long.
 Data: < ? > C0 A1 EB 3F 01 00 00 00 00 00 00 00 00 00 00 00
Object dump complete.
[/code] Furthermore, when I add back in the int* p = new int 'leak', its address is usually within about +0x80 bytes of allocation 268, which may imply it's getting allocated before my program starts...
  12. [quote name='ApochPiQ' timestamp='1319663817' post='4877341'] Static objects (e.g. Foo object; at global scope in a file) will not allocate on the free-store and will not show up as leaks. If [i]those[/i] objects in turn make allocations, e.g. Foo's constructor calls new, then you will get stuff pre-main that shows up as leaks. Chief suspects here are singletons of any flavor. In my experience, if you're not getting file/line information from the CRT debugger, one of three things is happening: [list=1][*]You didn't instrument the right file (doing this can be hard in large projects, so I feel your pain there)[*]There's some evil going on with other macros replacing new out from under you[*]Someone is calling malloc() or something else which allocates via the CRT but doesn't do it via overloaded new[/list] Best of luck; I've found these kinds of things hideously frustrating. If you're on a suitable version of Windows, you might try Application Verifier as well - I found it a little easier to get reliable callstacks etc. from when tracing leaks. [/quote] I'm pretty sure that #2 isn't happening. I wrote all the code originally and never used any new/malloc overloading. For the most part, I'm using standard C calls and not linking to other libraries. I might do more checking on #1 (there are a few more places I can add the instrumentation code to), and maybe search for a malloc() - but I'm pretty sure I only use new.
  13. [quote name='SiCrane' timestamp='1319663960' post='4877343'] Another option is Visual Leak Detector which goes to extreme lengths to replace every single instance of just about every memory allocation function you can think about with it's own definitions. [/quote] So, tried Visual Leak Detector. Nice product by the way. It actually reports NO leaks (unless I add back in the one that I intentionally added). Other ideas?
  14. Hi guys. I am cleaning up code and got my leaks down to only two 16-byte leaks (x64 build). Visual Studio's CRT memory leak detector gives me the allocation #'s, and I've had great luck cleaning up all the other leaks using _CrtSetBreakAlloc() with this method (roughly sketched below). However, with these two leaks it never hits the breakpoint and just starts up. So, right in my main, I do this: int WINAPI wWinMain( HINSTANCE hInstance, HINSTANCE hPrevInstance, LPWSTR lpCmdLine, int nCmdShow ) { int* pi = new int(); } which obviously leaks, but it reports that leak as allocation #261. However, the reported leaks are allocations #259 and #260, which is before main starts. Looking at the dumped data, I previously had 5 additional leaks of static global std::string objects that showed up as allocated before main started. Since I had the memory dump and could read the strings, I was able to find them. But with these two, I can't get a line reference number, and they appear to be holding pointer data (possibly?). I'm using the #define DBG_NEW new ( _NORMAL_BLOCK , __FILE__ , __LINE__ ) to retrieve file/line numbers in the output dump, but am not getting a report. My guess is that static allocations don't go through the overloaded new()? Is there some cool trick of CRT memory functionality that could report back where these are happening? Am I just not instrumenting the right file(s) to report the line number back (I have a lot of files in this solution), or do static allocations like this not get line numbers? Thanks in advance.
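     For reference, the break-on-allocation-number technique mentioned above looks roughly like this (261 is just the example allocation number from this post; as noted, allocations made before wWinMain() starts will never reach a breakpoint set inside it):
[code]
// Sketch: break into the debugger when a specific CRT allocation number is hit.
// Allocation numbers from a previous run are only stable if the allocation
// pattern doesn't change between runs. Debug builds only.
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>
#include <windows.h>

int WINAPI wWinMain( HINSTANCE hInstance, HINSTANCE hPrevInstance, LPWSTR lpCmdLine, int nCmdShow )
{
    _CrtSetBreakAlloc(261);      // debugger breaks when allocation #261 occurs
    int* pi = new int();         // the intentional leak from the post; reported as #261
    _CrtDumpMemoryLeaks();
    return 0;
}
[/code]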
  15. [quote name='clb' timestamp='1319457632' post='4876278'] If you converted a triangle list of an object to a line list, to render a wireframe display of the same object, you can simply use the vertex normals from the triangle list for the lines as well, and use a (vertex normal) dot (camera direction) >= 0 test to determine which color to pass out from the vertex shader to pixel shader for drawing. [/quote] Duh - surprised I didn't think of this. Great answer. I'll give it a try when I get a chance as the points of the line list have normals that are VERY easy to calculate.
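     For anyone implementing clb's suggestion, a rough CPU-side sketch of the facing test (in a real renderer this would live in the vertex shader; the function name and colors are my own illustration, not from the thread):
[code]
// Sketch: choose a wireframe line color based on whether the vertex normal
// faces the camera, per the (vertex normal) dot (camera direction) test above.
#include <DirectXMath.h>
using namespace DirectX;

XMFLOAT4 WireColor(const XMFLOAT3& normal, const XMFLOAT3& vertexPos, const XMFLOAT3& cameraPos)
{
    XMVECTOR n        = XMVector3Normalize(XMLoadFloat3(&normal));
    XMVECTOR toCamera = XMVector3Normalize(XMVectorSubtract(XMLoadFloat3(&cameraPos), XMLoadFloat3(&vertexPos)));

    float facing = XMVectorGetX(XMVector3Dot(n, toCamera));

    // Front-facing edges get the bright color, back-facing edges a dim one.
    return (facing >= 0.0f) ? XMFLOAT4(1.0f, 1.0f, 1.0f, 1.0f)
                            : XMFLOAT4(0.3f, 0.3f, 0.3f, 1.0f);
}
[/code]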