

Promit

Member Since 29 Jul 2001

#5173492 Alternative to FBX for Animation Export ?

Posted by Promit on 13 August 2014 - 09:15 PM

Collada seems like the obvious first stop?




#5173489 What software(libraries, apis) should i use for my game?

Posted by Promit on 13 August 2014 - 09:09 PM

SDL 2.x is excellent. In retrospect, I wish it had been available when we started work on our engine, because it would probably have saved me a bucketload of work. The 1.x branch available at the time was not suitable for modern development, but they've made great strides with the new version. My experience so far is that I'd probably be okay with shipping it in a pro-level title, though saying it and doing it are two separate things.

 

You can make a game without any libraries at all, if you so desire. (Apart from the underlying operating system and graphics API stuff, which is necessary.) There are reasons to do this. They probably do not apply to you.
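To give a feel for what SDL 2.x buys you, here's a minimal window-plus-event-loop sketch. This is just an illustration of the API, not code from our engine; the title, size, and flags are arbitrary and most error handling is omitted:

    #include <SDL.h>

    int main(int argc, char* argv[])
    {
        // Initialize just the video subsystem; add SDL_INIT_AUDIO etc. as needed.
        if (SDL_Init(SDL_INIT_VIDEO) != 0)
            return 1;

        SDL_Window* window = SDL_CreateWindow("My Game",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            1280, 720, SDL_WINDOW_OPENGL);

        bool running = (window != nullptr);
        while (running)
        {
            SDL_Event event;
            while (SDL_PollEvent(&event))
            {
                if (event.type == SDL_QUIT)
                    running = false;
            }
            // Update and render here.
        }

        SDL_DestroyWindow(window);
        SDL_Quit();
        return 0;
    }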




#5172984 Game Institute (again)

Posted by Promit on 11 August 2014 - 07:46 PM

 

 


> I would not bother paying them.

> Both GameInstitute and 3DBuzz?

 

Both of them look like junk to me. I saw the GameInstitute code back in the day when it was still free content, and there's nothing in there I'd pay for. Doesn't look like he's changed anything, just added a handful more. 3DBuzz looks even less useful.

 

In the past fifteen years, I've gone from a 12-year-old dreamer to a 27-year-old professional with title credits. I've been on GameDev.Net for pretty much the entire journey, and never once have I felt that paying for online content was a good idea. The times I got real value for my money were when I bought books written by knowledgeable professionals focused on specific subjects. GameInstitute and 3DBuzz are nowhere near that bar.




#5172922 Game Institute (again)

Posted by Promit on 11 August 2014 - 03:32 PM

I would not bother paying them.




#5172336 cbuffer per object or share one cbuffer

Posted by Promit on 08 August 2014 - 01:52 PM

In general, if you repeatedly map/discard the same buffer, the driver is going to be forced to silently allocate in the background. After all, there's a potentially long delay between draw submission and hardware processing. Better to do it explicitly up front, IMO. Most of the documentation now seems to recommend triple buffering for every dynamic buffer to avoid stalls -- that's a three frame latency. Anything that involves mapping any given buffer at high frequency makes me nervous.

 

That said, I get the impression that drivers handle cbuffers quite differently from the other, more traditional buffer types. I suspect the high-frequency Map/Discard pattern is common enough to justify significant optimization of that code path inside the driver. But that basically has to involve hundreds of shadow copies internally, so once again I wouldn't do this if at all possible. The memory savings are likely to be an illusion due to the internal copies.

 

(One alternative possibility is that many cbuffers are a waste of memory if the entire buffer is simply copied into the command buffer. I have no evidence to suggest this actually happens, but it might be a reasonable optimization for small high frequency buffers.)
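For illustration, here's a rough sketch of the explicit triple-buffered approach in D3D11. All the names and the layout of FrameConstants are mine, purely illustrative, and error handling is trimmed:

    #include <d3d11.h>
    #include <cstring>

    struct FrameConstants { float data[16]; }; // matrices, lighting, etc.

    ID3D11Buffer* g_cbuffers[3] = {};

    void CreateBuffers(ID3D11Device* device)
    {
        D3D11_BUFFER_DESC desc = {};
        desc.ByteWidth      = sizeof(FrameConstants);
        desc.Usage          = D3D11_USAGE_DYNAMIC;
        desc.BindFlags      = D3D11_BIND_CONSTANT_BUFFER;
        desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
        for (int i = 0; i < 3; ++i)
            device->CreateBuffer(&desc, nullptr, &g_cbuffers[i]);
    }

    void UpdateAndBind(ID3D11DeviceContext* context, UINT frameIndex,
                       const FrameConstants& constants)
    {
        // Rotate through the three buffers so the one we map was last
        // touched by the GPU three frames ago, minimizing stall/rename risk.
        ID3D11Buffer* cb = g_cbuffers[frameIndex % 3];

        D3D11_MAPPED_SUBRESOURCE mapped;
        if (SUCCEEDED(context->Map(cb, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
        {
            memcpy(mapped.pData, &constants, sizeof(constants));
            context->Unmap(cb, 0);
        }
        context->VSSetConstantBuffers(0, 1, &cb);
    }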




#5172200 Best Direct3D9 tutorial for beginners

Posted by Promit on 07 August 2014 - 11:00 PM

http://www.drunkenhyena.com/cgi-bin/dx9.pl




#5171780 OpenGL ES on desktops

Posted by Promit on 05 August 2014 - 09:19 PM

We run the same render path for desktop and mobile GL. I have a "proxy" header which defines functions like gl::Clear, gl::VertexAttribPointer, etc. in place of the raw GL entry points. The proxy changes a few function names as needed, and a few things are switched on either compile or runtime flags. (Not many things.) I don't try to fully emulate correct GL ES behavior on desktop; I just want my ES code to run. This proxy does that and works across Windows, Linux, Mac, and iOS.
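In rough outline it looks something like this. USE_GLES is a made-up compile flag, the function list is obviously trimmed, and none of this is verbatim from our engine:

    // gl_proxy.h -- one namespace, with the few divergent entry points
    // patched per platform.
    #pragma once

    #if defined(USE_GLES)
        #include <GLES2/gl2.h>
    #else
        #include <GL/gl.h>   // or a loader like glad/glew on desktop
    #endif

    namespace gl
    {
        inline void Clear(GLbitfield mask) { glClear(mask); }

        inline void VertexAttribPointer(GLuint index, GLint size, GLenum type,
            GLboolean normalized, GLsizei stride, const void* pointer)
        {
            glVertexAttribPointer(index, size, type, normalized, stride, pointer);
        }

        // Example of a renamed function: ES 2.0 has glClearDepthf, while
        // desktop GL (pre-4.1) only has glClearDepth.
        inline void ClearDepthf(GLfloat depth)
        {
    #if defined(USE_GLES)
            glClearDepthf(depth);
    #else
            glClearDepth(depth);
    #endif
        }
    }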

 

As for shaders, I found that trying to share them across desktop and mobile was a catastrophe. I now use hlsl2glsl (linked above) with some custom patches for full Mac and ES support. I should remember to assemble those changes into a pull request some day.




#5168425 Textures tear at distance

Posted by Promit on 22 July 2014 - 11:42 AM


> I just realized playing with znear seems to help a lot. I've moved it to .5 from .01 and the z-fighting seems to have almost vanished! =D

Correct - the overall Z precision is a function of (zfar - znear) / znear. The presence of znear in the denominator means it has a huge impact on your available depth precision. The larger you can make it, the better.
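To put numbers on it (taking zfar = 1000, purely for illustration):

    znear = 0.01:  (1000 - 0.01) / 0.01 ≈ 100,000
    znear = 0.5:   (1000 - 0.5)  / 0.5  ≈   2,000

Raising znear from 0.01 to 0.5 shrinks that ratio by roughly a factor of 50, which is why the z-fighting all but vanished.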


#5168068 Prevent Paging

Posted by Promit on 20 July 2014 - 11:04 PM

Run a 64 bit process, memory map everything in your game using the default settings (ie don't lock pages), and then let the OS do its job. Interference with the underlying memory management systems will only make things worse. Memory mapping frees you from the problem of worrying about what's allocated in the first place.
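For concreteness, here's roughly what memory-mapping a file with default settings looks like on Windows. The API calls are standard Win32; everything else (names, lack of error handling) is just a sketch:

    #include <windows.h>

    const void* MapAssetFile(const wchar_t* path, size_t* outSize)
    {
        HANDLE file = CreateFileW(path, GENERIC_READ, FILE_SHARE_READ,
                                  nullptr, OPEN_EXISTING,
                                  FILE_ATTRIBUTE_NORMAL, nullptr);
        if (file == INVALID_HANDLE_VALUE)
            return nullptr;

        LARGE_INTEGER size;
        GetFileSizeEx(file, &size);
        *outSize = (size_t)size.QuadPart;

        HANDLE mapping = CreateFileMappingW(file, nullptr, PAGE_READONLY,
                                            0, 0, nullptr);
        CloseHandle(file);    // the mapping holds its own reference

        const void* view = MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0);
        CloseHandle(mapping); // the view holds its own reference

        // The OS pages this in and out as it sees fit -- no locking, no
        // second-guessing the memory manager.
        return view;
    }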

 

If things start paging, buy a bigger server.

 

And against my better judgement, I am going to point out the Windows function VirtualLock.




#5168047 Get a IDirect3DVertexShader9 from a ID3DXEffect interface

Posted by Promit on 20 July 2014 - 07:18 PM

Call GetVertexShader? Presumably you need to provide a handle to a pass (GetPass/GetPassByName), though the documentation doesn't really bother to say.
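Something like this, perhaps. Untested, and as noted the docs are vague about exactly which handle GetVertexShader wants, so treat it as a starting point:

    #include <d3dx9.h>

    IDirect3DVertexShader9* GetShaderFromEffect(ID3DXEffect* effect,
                                                const char* techName,
                                                const char* passName)
    {
        D3DXHANDLE technique = effect->GetTechniqueByName(techName);
        D3DXHANDLE pass      = effect->GetPassByName(technique, passName);

        IDirect3DVertexShader9* shader = nullptr;
        effect->GetVertexShader(pass, &shader); // check the HRESULT in real code
        return shader;
    }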




#5168036 Patenting an Algorithm?

Posted by Promit on 20 July 2014 - 05:28 PM


> In other domains, things are different. Take, for example, video coding standards like H.264. These are not self-contained. You can't release a new standard, or a small update to the standard, every couple of months and declare all Blu-ray players sold up to that date deprecated. When you create the standard, it must be top notch, state of the art. Also, you can't just show a couple of PowerPoint slides to convey the general idea, because you actually want every implementation of that standard ever built to behave exactly the same. You have to provide a reference implementation, which shows exactly every single operation.

If only that were consistent with reality.

 

Most industry-developed standards, including H.264, are developed in committee by a wide group of participating members. These members typically hold patents on some aspect of the technology, and they agree to make that technology available to the standardization group. Why? Because everyone with an IP stake in the standard agrees that everyone else with a stake can use it, putting all of the members on an even footing without anyone paying large license fees. This part makes sense.

 

BUT: Let's say you hold a patent whose invention doesn't make it into the published standard. At that point you no longer have a stake contributed to the working group, which means you're no longer part of the royalty deal. You have to buy a license! Which is expensive. Oops. So what happens when a group like MPEG-LA gets together is not only about choosing the most competent technology. It is also about many parties vying to get as much of their IP into the standard as possible, regardless of its technical merits. Somewhere at the intersection of that technical and financial back-and-forth is where the final standard gets set.

 

Vorbis, Dirac, Theora, etc. are developed through collaboration between open source volunteers and groups who were never party to these big standards discussions. Their argument is that by focusing strictly on technical excellence, rather than engaging in proxy patent battles, they can produce a superior final product. I don't know to what extent that holds up in reality, but Vorbis has certainly held its own technically against AAC, never mind the aging and relatively poor MP3 standard. Theora doesn't seem to fare as well. We also have the unusual case of WebP/WebM, where an independent proprietary technology was opened up after the fact. Layered on top of all this is the reality that none of the open codecs have any traction in hardware decoders, which has become more and more of a problem over time.

 

MPEG-LA has also wielded its pooled patent chest as a weapon against competitors, notably Microsoft's VC-1 in the HD-DVD era.

 

At the end of the day, patents significantly distort how things are created, shared, and licensed. Whether those distortions are positive or negative depends on what you're talking about and the perspective you're looking at it from. There's a real sense in the software industry that patents as a whole have created more problems than they've solved.




#5167632 How to pack your assests into one binary file (custom file format etc)

Posted by Promit on 18 July 2014 - 10:35 AM

All you need is a file with a header section, an index section, and all the actual file data following. The index can be as simple as a list of pairs: file name and offset. To build the file, take all your input files and write them into a buffer, making an index entry that records the name and offset of each. Then build your header and index, and dump the whole thing to disk.

 

(You can improve this process by writing placeholder header and index blocks first, then writing the files directly to the target without an intermediate buffer. Then seek back to the beginning of the file and write the correct header and index.)

 

Reading is simple: load the index, use it to find the data. Maybe even memory map the whole thing.
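As a sketch, the reading side might look like this. The field sizes and the magic value are arbitrary choices for illustration, not a real format:

    #include <cstdint>
    #include <cstdio>
    #include <string>
    #include <vector>

    struct PackEntry
    {
        std::string name;
        uint64_t    offset; // absolute offset of this file's data
        uint64_t    size;
    };

    // Reads just the header and index; actual file data is fetched later
    // by seeking to entry.offset (or via a memory-mapped view).
    std::vector<PackEntry> ReadIndex(FILE* pack)
    {
        char     magic[4];
        uint32_t count;
        fread(magic, 1, 4, pack);              // e.g. "PAK0"
        fread(&count, sizeof(count), 1, pack); // number of index entries

        std::vector<PackEntry> index(count);
        for (PackEntry& entry : index)
        {
            uint16_t nameLen;
            fread(&nameLen, sizeof(nameLen), 1, pack);
            entry.name.resize(nameLen);
            fread(&entry.name[0], 1, nameLen, pack);
            fread(&entry.offset, sizeof(entry.offset), 1, pack);
            fread(&entry.size,   sizeof(entry.size),   1, pack);
        }
        return index;
    }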

 

You can layer compression and more complex file arrangements on top of this scheme, but it's not really necessary. This is effectively how zip and tar files are laid out. Personally, I just use a library like PhysicsFS instead of developing my own formats.




#5167516 Patenting an Algorithm?

Posted by Promit on 17 July 2014 - 08:49 PM

I have a close friend who spent some time in patent law. Regardless of the issues you're asking about, there's something very important to understand: patents have no value if you can't defend/prosecute them. Let's say you have a patent on your algorithm and somebody is using it without your permission. What are you going to do about it? Nothing, that's what. Because you don't have the quarter million dollars to start the case.




#5166579 obj file format

Posted by Promit on 13 July 2014 - 08:50 AM

Personally I'd re-export them as triangles in a modeling tool, if possible.




#5166314 is this much importatnt to have a strong fixed gdd?

Posted by Promit on 11 July 2014 - 04:44 PM

I am of the opinion that the best place to put a GDD is a paper shredder. Prototype ideas and see if they work. Writing up a whole concept serves no purpose.





