
Member Since 21 Jul 2012

Posts I've Made

In Topic: Stir, a C/C++ build tool

08 September 2014 - 01:59 PM

I often build projects that depend on external packages, for example FreeType2. Such packages typically don't install their headers in the standard systemwide directory but in a package-specific subdirectory, so I have to add extra search paths on the command line (eg. -I/usr/include/freetype2). Similarly, the names of link libraries in dependent packages need to be set externally (eg. -lfreetype). On GNU/Linux systems, the pkg-config tool has become the usual way to determine those values, and both CMake and the autotools integrate with it. Before pkg-config existed, the autotools were developed to autodiscover the presence and location of required dependencies.
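For example, a Makefile can pull those values in via pkg-config rather than hard-coding them (a sketch; it assumes pkg-config and the freetype2 development package are installed):

```make
# hypothetical Makefile fragment: query FreeType2's flags from pkg-config
FT_CFLAGS := $(shell pkg-config --cflags freetype2)   # e.g. -I/usr/include/freetype2
FT_LIBS   := $(shell pkg-config --libs freetype2)     # e.g. -lfreetype
CFLAGS    += $(FT_CFLAGS)
LDLIBS    += $(FT_LIBS)
```

If FreeType2 moves its headers again, only pkg-config's metadata changes, not the build file.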

Additionally, being able to specify the location of external dependencies allows staging builds of more complex systems. If, for example, I wanted to test my project against a new upstream release of FreeType2 without upgrading my system (because users of Freetardix 7.1 found it fails to build on their favourite system, while I'm still developing on Linux Julep "Mostly Moron"), I can build the new FreeType2 locally and point my project at it.


The user can set include_dirs as well as libs (anything without .a gets a -l), although I now realize library search directories (-L) somehow got left behind. I forgot about pkg-config (thank you), but it should (keyword "should" :)) currently work by just putting the back-tick-quoted command in the respective cflags and ldflags settings. Maybe I should come up with specific settings (pkgs, pkg_libs, pkg_cflags, etc) to make it less error-prone though.


Can always write a tweaked config file and use a command-line option to change the build directory (compiling into the source tree is not yet supported).


A DSO is a "dynamic shared object" -- a .so file on ELF systems, a .DLL on COFF systems, a .dylib on Mach-O systems, and so on. Building such things can sometimes get pretty hairy (eg. on a COFF system the linker generates both a .LIB import library and a .DLL).


Currently the link command depends on the file extension of the configured output file: Stir seeing .so will trigger a bunch of "-shared"-style flags. I haven't really got around to testing that yet. I admit COFF may well take some extra work (it's been a while since I've dealt with .DLLs).


I understand Mac OS (MACH-O, not ELF) is fairly popular. Also, CMake and the autotools support cross-building, so you can host on one platform and target another (like embedded, or Android, or iOS).


Not sure about Mac OS; if the compilers work the same way -- generating one object per source file and then linking -- things should work out. (The main porting problems are that file extensions are hard-coded and that command-line options assume - rather than being configurable to use /.)

As for cross-building, the compiler and linker are settings, so you can still have the system compiler be "gcc" but also have "gcc_elf_x86-64" for osdev kernel development, or an Atmel toolchain, etc. Not sure it would matter to Stir.


Well, on a Unixlike system (eg. Linux), typically you build locally as a user, then install into your system (make; sudo make install). Few people actually build their own software (probably fewer than on Microsoft Windows); most people just install packages. Packaging requires the same "make; make install" sequence, except instead of installing into the real system the software is installed into a staging area and the package is created from there -- no root privileges required. With the autotools, you just type, for example, "make install DESTDIR=/tmp/package". That's what creating a .deb or an .rpm does behind the scenes.
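The DESTDIR convention can be demonstrated in miniature with a one-file "project" (a hypothetical example; it assumes GNU make >= 3.82 for .RECIPEPREFIX and GNU coreutils for install -D):

```shell
# DESTDIR staging in miniature: "install" writes under the staging
# directory instead of the live filesystem, so no root is needed
demo=$(mktemp -d)
cd "$demo"
printf '%s\n' \
  '.RECIPEPREFIX := >' \
  'prefix = /usr/local' \
  'install:' \
  '> install -D -m 755 hello $(DESTDIR)$(prefix)/bin/hello' > Makefile
printf '#!/bin/sh\necho hello\n' > hello
make install DESTDIR="$demo/stage"   # nothing touches the real /usr/local
ls "$demo/stage/usr/local/bin/hello"
```

A package builder would then archive the contents of the staging tree; that is essentially what dpkg-buildpackage or rpmbuild does.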

Also, staging installation is useful for the above-described staging builds.


In Stir, I have several "pseudo-targets" ('clean' and 'all') that make calling Stir look like common usage of Make. I could add an 'install' target that runs either a default platform-specific install pattern or a section of the config file (I already have sections called before/after that can run shell scripts before the project builds and after a successful link).
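Something like the following, where the 'install' section is the proposed addition (this is a hypothetical sketch -- the thread only confirms that settings like include_dirs/libs and before/after sections exist; the exact syntax and the 'install' section are guesses):

```
# hypothetical Stir config sketch
compiler = g++
include_dirs = /usr/include/freetype2
libs = freetype

before:
    ./generate_version_header.sh

after:
    strip game

install:
    install -D -m 755 game $DESTDIR/usr/local/bin/game
```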


You might also be interested in Meson (developed by a foaming-at-the-mouth CMake evangelist) and quagmire (a defunct project by one of the autotools maintainers) as inspiration.


Thanks, I'll have a look at those later tonight.

In Topic: Stir, a C/C++ build tool

08 September 2014 - 10:37 AM

How do I extend stir with custom rules (resource compiler, Qt moc, gir generator)?


Planned. I concentrated on getting the building of a basic C/C++ source project working first and adding complexity from there.

The first thing to implement would be one-to-one rules, where files of one extension are generated from files of another (as C source to object files).
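For comparison, Make expresses exactly this kind of one-to-one mapping as a pattern rule (the recipe line must begin with a tab):

```make
# one-to-one rule: any .o is produced from the .c of the same name
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```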


Do I need to modify the binary itself if I need to support additional languages (eg. generate documentation, bindings autogeneration)?


Like I said, it's just starting to become usable: for small C/C++ projects [without many extras]. It is by no means complete.


How do I do localized configuration (eg. pick up header and library paths from external dependencies)?

Will DSOs be supported?


Could you explain these?


Do you intend to make this portable to non-POSIX platforms? Non-ELF platforms?


Windows. Where else would you see a tool like this usefully ported?


Will this build tool support making source distribution tarballs (or the equivalent)?  Build artifact staging and installation? Integrated test infrastructure?


Tarballs are planned. Could you explain the second one? I don't have a whole lot of experience with testing; I would probably need help meeting user expectations when it comes to testing.


What is your intended differentiating factor to distinguish stir from existing, widely-used, understood, and supported tools, ie why would I want to use stir instead of the autotools, CMake, kbuild, etc?


At least right now, simplicity for simple projects. The kind of thing where you could "g++ *.cpp" plus list the includes/libraries, but want faster compile times (or not to have to remember the command once the terminal history is gone). For a first-person-shooter game project I started with Make, then went to CMake. CMake with globbed files was OK, but I thought its configuration should be simpler for simpler tasks, and that at some point there's no reason for it to generate a makefile rather than do the build itself.




Thank you for your response; those questions were quite helpful :)

In Topic: (Super) Smart Pointer

11 July 2014 - 10:15 AM

Read Buckeye's original post again. If the AI has a pointer *to the pointer* to the enemy, they can all check with a single if-statement whether the enemy object still exists or the pointer has been set to NULL. No looping. The AI just sees the lack of an enemy the next time it gets updated.

In Topic: Terrain - map editor

10 July 2014 - 08:00 PM

I would go with a height map for a terrain editor. Bullet's height map object (btHeightfieldTerrainShape) allows the map array to be dynamically modified. Creating a new btBvhTriangleMeshShape every time the terrain is modified may or may not be too slow, but using the height map wouldn't require quite as much Bullet-related code.


On the other hand, do you really need physics *while* the terrain is being altered/extruded/etc? You could stop physics, allow the user to extrude (terminology?) a hill, and then start simulating again after building a new btBvhTriangleMeshShape. If your editor includes placing other objects (like debris), you might need to use an invisible sphere to push other objects away before creating a new btBvhTriangleMeshShape. Or even try to detect which non-static objects would need to be moved, warp them above the new highest point in the area, and let them fall back down (the user will have to re-place them, but hey, they raised a hill there).


Disclaimer: I've never written such an editor; I use Blender (poorly).

In Topic: some assembly unknowns

08 July 2014 - 02:29 PM

If you use NASM to output a raw binary (-f bin), it will assume 16-bit code, which is what a DOS .com executable needs (along with an "org 0x100" directive, since DOS loads .com programs at that offset).