Download the UDK, CryEngine 3, Unity3D, Ogre3D, and Panda3D and start looking at how they work. Some will come with source, some will just be a pipeline exposing a scripting or bridge language, but they will give you some insight into modern engines and how they are designed.
Game development has become so large that people usually specialize these days. Architecture, networking, gameplay, graphics, physics, audio, animation, tools, etc.. those are the usual "big" fields people specialize in. You can try to do it all, but the truth is companies are not looking for people who can do "it all"; they are looking for experts who are the "best" in one or two fields. This applies to large companies, of course; smaller companies might have different priorities. Focus on what you like to do and become an "expert" in that field, learn good programming practices, expand your mind, and stay curious.
Hodgman: fantastic! What wonderful examples - I'm really keen on that Nvidia one.
ddn3: Windows only! I was unable to open the exe, but working with Unity sounds fantastic.
Seeing Arauna's feature list, though: do I need any special features to make the light curve through the glass? I have no experience with that, just a lot of curiosity.
Yep, Arauna is a full path tracer, which will reproduce the proper caustics of glass. That is the fun thing about raytracers: no tricks are required to reproduce these complex light-object interactions and reflections. There was an older version of Arauna; see if it's still out there.
Scripting languages were meant to be special-purpose, domain-specific languages, but as games became more and more complex and flexible, complete, high-level embeddable solutions (Lua, Python, etc.) became available, people migrated to those or branched implementations of them for their own purposes.
Interpreted versus compiled is more a matter of performance requirements, but "scripting languages" are still domain-specific, and you can very well have several scripting languages within a single game ( one for the AI, for example, another for the camera, and another for gameplay logic, etc ).
At a high level you can use any language as a "scripting" language, but usually there is some requirement for high productivity, which leads developers to create a domain-specific language, or an API with an embeddable language, and that is what eventually evolves into their "scripting" language.
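To make the embedding idea above concrete, here is a minimal sketch of exposing a small engine API to a script. In a C++ engine this boundary is usually a Lua or Python binding layer; here plain Python `exec()` stands in for it, and all the names (`GameAPI`, `spawn`, `play_sound`) are invented for illustration.

```python
# Minimal sketch: exposing a small game API to an embedded script.
# In a real engine this role is usually filled by Lua or Python bindings;
# here Python's exec() stands in for the embedding boundary.

class GameAPI:
    """The handful of engine calls the script is allowed to touch."""
    def __init__(self):
        self.log = []

    def spawn(self, kind, x, y):
        self.log.append(("spawn", kind, x, y))

    def play_sound(self, name):
        self.log.append(("sound", name))

SCRIPT = """
# Gameplay logic lives here, not in engine code.
spawn("goblin", 10, 4)
play_sound("roar")
"""

def run_script(source, api):
    # Only the names we explicitly export are visible to the script.
    env = {"spawn": api.spawn, "play_sound": api.play_sound}
    exec(source, {"__builtins__": {}}, env)

api = GameAPI()
run_script(SCRIPT, api)
print(api.log)  # the engine saw two scripted calls
```

The key design point is the `env` dictionary: the script only ever sees the functions the engine chooses to hand it, which is exactly how embedded Lua is usually sandboxed.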
UI : handles logic and events from the user and handles stateful transitions of UI views.
Persistence and Packing : handles the serialization and deserialization of game states and objects.
AI : handles higher-level, intensive AI logic ( visibility checks, A*, optimization algorithms, behavior trees, etc.. ), whichever is appropriate for your game.
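One common way modules like these hang together is as subsystems registered with a central game object and ticked once per frame. A hypothetical sketch, with all class names invented for illustration:

```python
# Hypothetical sketch: subsystems registered with a central game object,
# each updated once per frame. Names are illustrative, not from any engine.

class Subsystem:
    def update(self, dt):
        raise NotImplementedError

class UISystem(Subsystem):
    def __init__(self):
        self.state = "main_menu"
    def update(self, dt):
        pass  # would poll input events and drive view transitions

class AISystem(Subsystem):
    def __init__(self):
        self.ticks = 0
    def update(self, dt):
        self.ticks += 1  # would run behavior trees, pathfinding, etc.

class Game:
    def __init__(self, systems):
        self.systems = systems
    def frame(self, dt):
        for s in self.systems:
            s.update(dt)

ai = AISystem()
game = Game([UISystem(), ai])
for _ in range(60):          # one second of simulation at 60 fps
    game.frame(1.0 / 60.0)
print(ai.ticks)  # 60
```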
Making a "full" game engine is a herculean task these days; even pros don't do that anymore. They usually license a 3rd-party engine like CryEngine, UDK, Unity, etc.. The ones who do go for it usually spend years on their engine only to see tech eclipse them and pass them by ( unless you're R*, EA, or Bungie, which can spend lots of $$$$ to evolve with the times ).
If this is just for learning, go for it, but don't try to make the be-all engine.. Just get a handle on the overall picture..
If the bulk of the stats are the same across all types of creatures, it's best to make a table-based format which can be imported / exported from common spreadsheet programs like Google Docs or Access. This allows you to keep the data organized. If a creature needs a custom, non-standard attribute or property, you can probably extend your system to support custom scripts ( Lua ) in place of a standard property / stat..
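A minimal sketch of that idea: load the shared stats from spreadsheet-exported CSV, and let a "custom" column hold a script snippet for the odd cases. The column names and the script hook are assumptions for illustration (a real engine would likely run Lua here; a Python snippet stands in).

```python
# Sketch: creature stats from a spreadsheet-exported CSV, with an optional
# "custom" column holding a script snippet for non-standard attributes.

import csv
import io

CSV_DATA = """name,hp,attack,custom
goblin,10,2,
dragon,200,35,self['hp'] += 50  # ancient dragon bonus
"""

def load_creatures(text):
    creatures = {}
    for row in csv.DictReader(io.StringIO(text)):
        creature = {"hp": int(row["hp"]), "attack": int(row["attack"])}
        # Non-standard behavior falls back to a script hook
        # (Lua in a real engine; a Python snippet stands in here).
        if row["custom"].strip():
            exec(row["custom"], {}, {"self": creature})
        creatures[row["name"]] = creature
    return creatures

creatures = load_creatures(CSV_DATA)
print(creatures["dragon"]["hp"])  # 250
```

The nice property is that designers edit the table while programmers only maintain the loader and the script hook.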
-unlimited version control : version control which can handle unbounded branch management and code variations on a complete, system-wide level. It can guarantee 100% stateful restoration of code across supported classes and platforms, allowing for unfettered experimentation and code sharing.
-smart code auto-generator : uses advanced AI heuristics to analyze the user's code library and does code auto-completion not at a basic syntax level but at a higher level.
-meta everything : all code, support tools, interfaces, and APIs are meta-tagged and allow for meta-analysis and offline refactoring.
-runtime visualization tools : advanced visualization routines allow the user to see the allocations, processes, information flow, higher-order logic, etc.. of their program in action.
-smart syntax coloring : just better versions of what we have now
If you want the server to be authoritative, then you can't initiate the actual thrust until 1/2 the ping time has passed locally (assuming symmetrical latency both ways). At that point you can assume that the packet has arrived on the server, which will then initiate the thrust command as well and propagate that command + state snapshot to the other clients.
If it's just a simple 2D physics game, you should be able to project the current state of the object to timeCurrent using the state snapshot + action command for the other clients. This is the same for any action you take, such as releasing the thrust, etc..
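The timing and projection described above can be sketched in a few lines. This is a simplified model under the stated assumption of symmetric latency; the function names and all the numbers are illustrative.

```python
# Sketch: delay the local thrust by half the round-trip time so the server
# fires it at roughly the same moment, and dead-reckon remote objects
# forward from their last snapshot + command.

def local_fire_time(send_time, rtt):
    # Assume symmetric latency: the packet reaches the server rtt/2 later.
    return send_time + rtt / 2.0

def project(snapshot, t_now):
    # Project a simple 2D body from its last known state and acceleration.
    x, y, vx, vy, ax, ay, t0 = snapshot
    dt = t_now - t0
    return (x + vx * dt + 0.5 * ax * dt * dt,
            y + vy * dt + 0.5 * ay * dt * dt)

# The client presses "thrust" at t=1.0s with a 100 ms ping, so it starts
# thrusting locally at t=1.05s, when the server is assumed to have it too.
t_fire = local_fire_time(1.0, 0.100)
print(t_fire)  # 1.05

# Another client got snapshot (x=0, y=0, vx=10, vy=0, thrust ax=2) taken
# at t=1.05 and projects the position forward to t=1.55.
print(project((0.0, 0.0, 10.0, 0.0, 2.0, 0.0, 1.05), 1.55))
```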
It's focus: you have the skills to execute, but you're not focusing on the details to see the ramifications. Create and foster an attentive pause in your programming routine where you ask and answer questions about what you're doing and how you're doing it. How will it affect other submodules? Is there a better object model than this? Is this the fastest / easiest way to my goal? etc.. Just asking questions will force you out of the autopilot mode you're in, and that will expand the depth of your understanding and hopefully catch these design issues early on.
I used to think that scripting wasn't worth the effort for "small" projects, but it turns out those are probably the best projects for scripting. Why?
Scripting is a work multiplier: creating a domain-specific language for your game makes every line you write do more work than trying to do it in something like C++ / C# / Java, etc.. Small projects are usually limited by manpower; getting more work done in fewer lines of code results, over the long run, in a massive increase in productivity..
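As a toy illustration of the work-multiplier point, here is a tiny made-up DSL for spawning enemy waves: each one-word command replaces several lines of host-language setup. The syntax and commands are invented for this sketch.

```python
# Toy domain-specific language for wave spawning. One DSL line replaces
# several lines of host-language setup code. Syntax is invented.

LEVEL_SCRIPT = """
wave 3 goblin
wave 1 dragon
delay 5
wave 2 goblin
"""

def run_level(script):
    events = []   # (time, creature, count) spawn events
    clock = 0.0
    for line in script.strip().splitlines():
        parts = line.split()
        if parts[0] == "wave":
            count, kind = int(parts[1]), parts[2]
            events.append((clock, kind, count))
        elif parts[0] == "delay":
            clock += float(parts[1])
    return events

print(run_level(LEVEL_SCRIPT))
# [(0.0, 'goblin', 3), (0.0, 'dragon', 1), (5.0, 'goblin', 2)]
```

Designers can now author level pacing without recompiling anything, which is where the multiplier really pays off on a small team.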
2nd, for small projects it's less risky to try new things. Learning what to do and what not to do when integrating and building a scripting framework is the same for a small project as for a big one. The only difference is that the mistakes you make on a "big" project will cost you much more $$$$ and manpower to fix. Better to learn on a small project..
3rdly, the more languages you learn, the more out-of-the-box thinking you will do beyond your dominant language, giving you better insight and a more rounded approach to programming in general.
Any of those engines could work, but Unity 3D is the best choice. Even so, what you want will require some level of expertise, maybe more than a single person would have, so my guess is multiple developers and, depending on the scope of the project, maybe several months to a year..
Unity 3D has a web player which you can use to publish your Unity 3D content. Of course, you will have to set up the web server and hosting backend yourself; none of those game engines do that for you.
Otherwise known as comprehensive unit tests and functional tests, both of which you should have already. I hate to harp on it, but code without tests is broken code.
I think TDD has its place, but for smaller development houses where it's just 2-3 programmers, you have to make do with what time you have. Sure, for larger teams where it's critical that components function to specification and teams are separated by timezones, you don't want to mess around with email tag trying to get core critical systems working. For individual developers it's up to you whether to do TDD and to what extent. I find that for individual developers code maintenance becomes a much larger problem; try looking at code you wrote 3 years ago, for example..
The core of a data-driven system is that objects are initialized by a data descriptor object, whether the data comes from XML, scripting, internal definitions, etc.. This initialization chain follows all the way down to all the child objects. Objects themselves are not usually created on demand but rather by a factory, due to the complexity of creation in data-driven designs.
A data descriptor is just a unified data accessor class which makes data access and accounting simple. Imagine, if you will, that an object definition is a tree and you are iterating down the tree to create all the components and child components; you'll need to keep track of that iteration as well as have data accessors to make sure data conforms to both read / write access.
Data-driven design can be as simple or complex as you like, but the drawback with such a system is that you have to have runtime checks for data validity and class integrity, which can increase the burden of testing and maintenance. So people usually write tools to manage this for them when the data set becomes too large. Another common design pattern for data-driven systems is to use object composition vs inheritance to create complex objects, but that's really just user preference, and both can co-exist.
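The factory, descriptor, composition, and runtime-check ideas above can be sketched together in one small example. The descriptor is an inline dict here (in practice it would be parsed from XML or similar), and the component names and registry are hypothetical.

```python
# Sketch of a data-driven factory: entities are built from descriptor dicts,
# using a registry of component classes (composition, not inheritance).
# A real system would parse the descriptor from XML/JSON.

REGISTRY = {}

def component(cls):
    """Register a component class under its name for descriptor lookup."""
    REGISTRY[cls.__name__] = cls
    return cls

@component
class Health:
    def __init__(self, hp=100):
        self.hp = hp

@component
class Sprite:
    def __init__(self, image="missing.png"):
        self.image = image

class Entity:
    def __init__(self, components):
        self.components = components

def build(descriptor):
    comps = {}
    for name, params in descriptor["components"].items():
        if name not in REGISTRY:            # runtime validity check
            raise ValueError("unknown component: " + name)
        comps[name] = REGISTRY[name](**params)
    return Entity(comps)

orc = build({"components": {"Health": {"hp": 30},
                            "Sprite": {"image": "orc.png"}}})
print(orc.components["Health"].hp)  # 30
```

The `if name not in REGISTRY` branch is exactly the kind of runtime integrity check mentioned above, and the reason tools usually validate the data before it ships.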
It's a question of scale; sure, Python is fast enough for a subset of Dwarf Fortress.. To simulate DF in its entirety in Python? Maybe; it depends on a lot of things: how well DF is written, the bottlenecks of DF, how well you know Python, and whether you can leverage optimized implementations like Cython or IronPython. One thing is for sure: a high-level language like Python makes writing complex simulations like DF easier..
DF looks like a lot of cellular automata and control logic running in a micro-threaded, message-passing scheme is my guess. Something like that shouldn't be too hard to code up in Python; give it a try. I love those old-school sim games like RollerCoaster Tycoon, etc.., which are very similar to DF in ways..
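A tiny sketch of that guess: a 1D cellular automaton (water spreading along a strip) where each cell is a cooperatively scheduled generator "micro-thread". The spreading rule is invented for illustration and is not taken from Dwarf Fortress.

```python
# Sketch: a cellular automaton driven by generator micro-threads.
# Each cell yields once per tick; the scheduler steps them in order.
# The water-spreading rule is invented for illustration.

def water_cell(grid, i, steps):
    for _ in range(steps):
        if grid[i] > 1:                     # spread excess to the right
            if i + 1 < len(grid):
                grid[i] -= 1
                grid[i + 1] += 1
        yield                               # cooperative tick boundary

def run(grid, steps):
    threads = [water_cell(grid, i, steps) for i in range(len(grid))]
    for _ in range(steps):
        for t in threads:
            next(t)                         # advance each cell one tick
    return grid

# Four units of water dropped on the left cell level out over three ticks.
print(run([4, 0, 0, 0], 3))  # [1, 1, 1, 1]
```

Generators make this kind of scheme pleasant in Python: thousands of cells can cooperate without OS threads, which fits the micro-threaded, message-passing shape guessed at above.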