

Bacterius

Posted 03 June 2014 - 08:46 AM


For simplicity I made one big header in the root of the folder tree that includes all the small headers down the folder tree (so I have one summary header with 400 lines of includes).

 

Are you sure this really simplifies things? I have found common headers like these are a mistake, as they obscure the relationships between modules, and also happen to increase build times by a lot. I would strongly recommend the traditional approach of including only what your module depends on. That way you can even feed your code to e.g. doxygen or some other tool and it will generate a dependency graph which will show which modules depend on which, aiding in refactoring and design.

 

And maybe try to reduce the number of "modules"? I have a hard time believing you have 400 pieces of independent functionality in your code that can't be logically grouped in some fashion. A common separation is math/graphics/window/physics/input/network for games, but that's not the only one possible of course. But 400? Come on, not every function needs its own module.

On the opposite side, each module includes that summary header. There is a small inconvenience, because I must write something like

c:\mingw\bin\g++ -O3 -w -c phys_scanline.c -fno-rtti -fno-exceptions -I ..\..\..\

This "-I ..\..\..\" is needed to reach the main summary header up the folder tree.

 

This is kind of unusual. Typically your build system keeps track of the root source directory, and all includes are done relative to it. This matters especially if your headers are going to be consumed by others: they won't be including your headers from your directory layout, so if your headers #include other headers relative to their own location in the header tree, it isn't going to work. Also, may I suggest compiling C code with an actual C compiler, or, if it's (some subset of) C++, marking it as such? People are going to be confused if they find C++ code in a .c file while trying to understand why your project isn't building for them.

 

 

 


I have the same trouble managing the .o files for the linker: all my .o files lie in my big folder tree, so on the linker command line I must build a very long list of paths to these object files (as I said, it is close to 400 modules, though not all are active yet, because I'm rewriting my old framework).

Managing these 400 header references and 400 object references is burdensome (adding one module may not be much work, but as I said the modules are small and often added, and refactoring, especially changing the folder structure, is then extremely burdensome) and consumes a significant amount of energy. On the other hand, this multilevel-folder style of project has some advantages: it is well ordered, with graphics modules in one folder and window modules in another. Finally, I don't know whether I can keep the tree structure of the project while minimizing the large amount of work of managing all the references, and maybe make it even clearer (I need maximum possible clarity in all sources and the tied management scripts and binaries).

 

So don't reference every single source file in the root makefile. It seems to me you aren't managing your dependencies properly: why does the root makefile (or batch file, or whatever) need to know that you added a file somewhere deep down in the graphics module? In most projects I've seen and worked on, each module is compiled down to a static library by its own makefile (for instance graphics.a, window.a, etc.), with possible dependencies between modules (graphics.a may depend on math.a, say), and the root makefile simply links those libraries together with the main program to build the final executable, without any knowledge of the individual source files involved, just the major modules. Basically, implement the modules as libraries. Even if they aren't going to be released separately, developing them separately helps a lot with modularity and is just better practice across the board.

 

In other words, the graphics module knows how to build itself, and the root makefile simply asks the graphics module to do so and then links the result with the other modules. No need to overcomplicate things with 400 batch files. :)
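The per-module library layout could be sketched like this (a hedged sketch for GNU make only; module names, paths, and variables are all hypothetical):

```makefile
# Root Makefile (sketch): knows only the modules, never individual
# source files inside them.
MODULES := math graphics window
LIBS    := $(foreach m,$(MODULES),$(m)/lib$(m).a)

game: main.o $(LIBS)
	$(CXX) -o $@ main.o $(LIBS)

# Delegate: each module's own Makefile knows its source files.
$(LIBS):
	$(MAKE) -C $(@D)

# --- graphics/Makefile (sketch) ---
# OBJS := raster.o sprite.o text.o
# libgraphics.a: $(OBJS)
# 	$(AR) rcs $@ $^
```

Adding a source file then touches only that one module's Makefile; the root Makefile, and the linker command line it generates, never change.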

