What you propose is a little old school: a step up from CPU-based ROAM, but a step down from modern geoclipmapping (a single monolithic draw for all the terrain). At the newer end of the spectrum, you could have a single grid that uses tessellation based on distance and rockiness. It depends on whether you want to support DX9.0, 10, 11.1 or 12.
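To make the "distance and rockiness" idea concrete, here is a minimal sketch of the kind of per-patch factor a hull shader would compute. The function name, constants and falloff curve are my own illustrative assumptions, not a reference implementation:

```cpp
#include <algorithm>
#include <cmath>

// Sketch of distance/rockiness based tessellation (illustrative only):
// more triangles for rough, nearby patches; fewer for smooth, distant
// ones. Rockiness is assumed pre-normalised to 0..1.
double TessellationFactor(double distanceToCamera, double rockiness,
                          double maxTess = 64.0, double minTess = 1.0) {
    // Simple 1/distance falloff; a real engine would tune this curve.
    double falloff = 1.0 / std::max(distanceToCamera, 1.0);
    double factor = maxTess * rockiness * falloff * 100.0;
    return std::clamp(factor, minTess, maxTess);
}
```

You would evaluate something like this per patch edge and feed it to the tessellation factors: a smooth patch far away ends up near the minimum, while a rocky patch at the camera pins at the maximum.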
If you plan to spend most of your time close to the ground, don't try to use a cube map. I've done a few planet-to-ground renderers, and you can pregenerate a torus, which will allow you to pregenerate the stitching. If you just build a grid by VertexID, you can stitch different-sized grids together by passing a stitch flag in the constant buffer. So, similar to weighting bones in animation, each grid chunk can be passed a scale and position value to tell it where to go, plus a flag to say which edges are stitched. The height is set (on a 2D plane) by a monolithic height map. Each tile or chunk in the height map is scaled differently, depending on the grid size. Alternatively, you can multisample like in Microsoft's geoclipmap example. Either way, you then just wrap the grid with a spherical equation.
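A minimal CPU-side sketch of the pieces above: rebuilding a grid coordinate from a flat vertex index (as a vertex shader would from SV_VertexID), placing the reusable chunk with a per-chunk scale and offset (the constant-buffer values), and wrapping the flat plane with a spherical equation. All names, and the particular longitude/latitude mapping, are my own assumptions:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Rebuild a 0..1 grid coordinate from a flat vertex index, the way a
// vertex shader would from SV_VertexID. gridDim is vertices per side.
void GridCoordFromVertexID(unsigned id, unsigned gridDim,
                           double& u, double& v) {
    u = double(id % gridDim) / double(gridDim - 1); // across the row
    v = double(id / gridDim) / double(gridDim - 1); // down the column
}

// Per-chunk placement: the same grid mesh is reused everywhere, scaled
// and offset into its tile of the monolithic height map (these two
// values are what the constant buffer would carry).
void PlaceInChunk(double u, double v, double chunkScale,
                  double chunkU, double chunkV,
                  double& mapU, double& mapV) {
    mapU = chunkU + u * chunkScale;
    mapV = chunkV + v * chunkScale;
}

// Wrap the flat (u,v) plane with a spherical equation: u -> longitude,
// v -> latitude, displaced outward by the height-map sample.
Vec3 WrapToSphere(double mapU, double mapV, double radius, double height) {
    const double pi = 3.14159265358979323846;
    double lon = mapU * 2.0 * pi;    // 0..2*pi around
    double lat = (mapV - 0.5) * pi;  // -pi/2..pi/2 pole to pole
    double r = radius + height;
    return { r * std::cos(lat) * std::cos(lon),
             r * std::sin(lat),
             r * std::cos(lat) * std::sin(lon) };
}
```

A real implementation would do this in the vertex shader; note also that this simple lon/lat wrap pinches at the poles, so the exact spherical equation you choose matters.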
If you want random terrain, you can do rolling hills etc. easily enough. The problem is pathing water for rivers. That, and an endless world of boring. I'm a fan of random terrain with predetermined major geography (i.e. a continent-sized height map) and crafted points of interest.
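As a sketch of that hybrid approach (function names and constants are my own illustrative choices): a few octaves of summed sines give you endless rolling hills cheaply, layered as detail on top of a coarse, predetermined continent height map that supplies the major geography:

```cpp
#include <cmath>

// "Rolling hills" as a few octaves of summed sine waves (illustrative
// only -- real terrain would use proper noise). Easy endless terrain,
// but as noted above: no rivers, and no points of interest.
double RollingHills(double x, double z) {
    double h = 0.0;
    double amp = 1.0, freq = 0.02;
    for (int octave = 0; octave < 4; ++octave) {
        // Offset each octave so the peaks don't line up.
        h += amp * std::sin(x * freq + octave * 1.7)
                 * std::sin(z * freq + octave * 2.3);
        amp *= 0.5;   // each octave contributes half as much...
        freq *= 2.0;  // ...at twice the frequency
    }
    return h;
}

// The hybrid: the predetermined continent-sized height map provides
// the major geography, random detail is just added on top.
double CombinedHeight(double continentHeight, double x, double z) {
    return continentHeight + 0.1 * RollingHills(x, z);
}
```

Crafted points of interest would then be authored patches that override or blend into this combined height.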
There are different variants of Linux and Linux-like OSes which would be more appropriate than generic Fedora or whatever. IPTables is just one system; something like OpenBSD might be more appropriate. Either way, you would still be fiddling with the OS networking in addition to the programming. The old-school way would be to use SNORT to read packet contents and make routing decisions. Personally, if all you are doing is routing, and not adding value like building a firewall with a GUI, I'd just use a routing-specific variant of Linux: https://en.wikipedia.org/wiki/List_of_router_and_firewall_distributions
Posted by TeaTreeTim on 28 February 2016 - 06:56 AM
In the mid-1980s my father, a programmer, said his job would soon be replaced by software that could write software. Sure, today Visual Studio can create a program from a wizard that would have served the needs of that time, but things became more complicated, so we still need programmers.
In 1988, soon after Unix was announced as mathematically proven correct, my Software Engineering 101 lecturer said to the class: "in the future all software will be proven mathematically correct". Microsoft never proved Windows correct, and again things became more complex.
I guess my point is I wouldn't look too far into the crystal ball.
Back to the here and now: what others didn't mention is that you will need to refactor, not just because of changing requirements, but because as your skill and experience increase you will find better ways to do things.