
#ActualJTippetts

Posted 30 August 2012 - 02:25 PM

It's not really practical to think of such a thing as a single model, and impose the restriction that it be handled as a single model. It's not a single model. On modern hardware, it cannot be done with a single model, not even remotely. Consider an 8000x6000 mile heightmap sampled at 1 yard resolution (which is actually relatively coarse; 1 ft resolution would be even better). That equates to a heightmap 14,080,000x10,560,000 in size. That, my friend, is a gigantic freakin' heightmap. You are talking hundreds of terabytes of data just to store the raw heights, and over a petabyte for a full vertex buffer. The index buffer would be another huge chunk of data. Every additional vertex attribute, such as normals, would contribute hundreds of terabytes more. No computer could keep that much data in memory at once, and certainly no consumer-grade video card exists with petabytes of video RAM that could keep it as a single model. No, a world that size absolutely MUST be split up in a sensible manner.
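To put rough numbers on that, here's a back-of-the-envelope sketch; the bytes-per-sample figures are assumptions (2 bytes for a raw height, 12 for a float3 position, 12 more for a normal):

```python
# Back-of-the-envelope storage estimate for an 8000 x 6000 mile
# heightmap sampled at 1 yard resolution.
YARDS_PER_MILE = 1760

width = 8000 * YARDS_PER_MILE    # 14,080,000 samples across
height = 6000 * YARDS_PER_MILE   # 10,560,000 samples down
samples = width * height         # ~1.5e14 height samples total

TB = 1024 ** 4
print(f"{samples:,} samples")
print(f"raw 16-bit heights:      {samples * 2 / TB:,.0f} TB")
print(f"float3 positions:        {samples * 12 / TB:,.0f} TB")
print(f"positions + normals:     {samples * 24 / TB:,.0f} TB")
```

Even the cheapest representation, two bytes per height and nothing else, lands in the hundreds of terabytes.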

And it's not even just the impracticality of handling that as a single model. You also have to consider the impracticality of moving that world dataset around. Any time someone connects to your world, they need the world data streamed to them. Are you going to stream the entire freakishly gigantic single-model world to them? How many days do you want them to wait while it downloads? No, you need to have the data partitioned so that you only move around the data that is required for a given player. Anything else would be such an irresponsible waste.
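One common way to do that partitioning: divide the world into a grid of fixed-size chunks, and only stream the chunks within some radius of the player. A minimal sketch (the chunk size and radius here are arbitrary placeholders, not recommendations):

```python
CHUNK_SIZE = 256  # world units per chunk side (arbitrary choice)

def chunks_around(player_x, player_y, radius_chunks=2):
    """Return grid coordinates of the chunks a player needs streamed."""
    cx = int(player_x // CHUNK_SIZE)
    cy = int(player_y // CHUNK_SIZE)
    return [(cx + dx, cy + dy)
            for dy in range(-radius_chunks, radius_chunks + 1)
            for dx in range(-radius_chunks, radius_chunks + 1)]

# The server sends only these chunks, never the whole world:
needed = chunks_around(1000.0, 500.0)
print(len(needed))  # 25 chunks for a radius of 2
```

As the player moves, you diff the needed set against what the client already has and stream only the new chunks.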

And how would you construct such a world in the first place? Here is an exercise for you. Sit down with a 3D modeling package and construct a heightmap at 1 yard resolution that is 1 mile by 1 mile in size. Measure how long it takes you to sculpt and texture it so that it looks good. Now, multiply that effort by 48 million, and you'll have a reasonable guess at how long it'll take you to construct the larger terrain by hand.

Typically, for planet-scale terrain, you are going to have to rely on procedural generation to construct the vast majority of the data set. There simply is not enough time for you to do it any other way. And given that, you can effectively store an entire world as a single unsigned integer seed to feed to the generator. The generator can be set up to spit out chunks of the world at a time, rather than the whole shebang.
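The key property is determinism: the same world seed plus the same chunk coordinates must always produce the same terrain, so any machine can regenerate any chunk on demand instead of storing it. A sketch of the idea (the random heights stand in for a real noise function, and the mixing constants are just a common spatial-hash trick):

```python
import random

WORLD_SEED = 12345  # the "entire world", stored as one integer

def chunk_heights(world_seed, chunk_x, chunk_y, size=16):
    """Deterministically generate a size x size height grid for one chunk."""
    # Derive a per-chunk seed so each chunk is independent but reproducible.
    # The large primes are a standard spatial-hash mix.
    seed = world_seed ^ (chunk_x * 73856093) ^ (chunk_y * 19349663)
    rng = random.Random(seed)
    return [[rng.uniform(0.0, 100.0) for _ in range(size)]
            for _ in range(size)]

a = chunk_heights(WORLD_SEED, 3, 7)
b = chunk_heights(WORLD_SEED, 3, 7)
assert a == b  # same seed + same coords -> identical terrain, every time
```

A real generator would evaluate layered noise (Perlin, simplex, etc.) at each sample point, but the contract is the same: chunk data is a pure function of the seed and the coordinates.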

Consider a world like World of Warcraft, to give yourself a sense of scale. The game consists of 4 continents. I read somewhere that the largest, Kalimdor, was approximately 60 sq. km. in size. The world you propose is 48,000,000 sq. miles, or approximately 124,000,000 sq. km. That's roughly 2 million times bigger. How many people are you going to hire to populate that space?

As a final thing to think about: exactly how many players will you have, and exactly how much space will they need to occupy? Because your proposal would be sufficient to give 48 million players each a full square mile to play with. If you have fewer players, there will be vastly more space for each player. The effect of this would be a world that feels very, very empty and dead. If a player has to travel hundreds of miles to encounter another player, they might as well be playing single player. And if they all congregate in certain areas, that leaves huge swathes of the land unoccupied; essentially, those areas of land represent wasted resources that do not contribute to the final experience. So many man hours of time, so many computer hours of time, wasted on creating content that nobody will ever see.
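The density argument in numbers (the player counts here are hypothetical):

```python
WORLD_SQ_MILES = 8000 * 6000  # 48,000,000 square miles

# Even a wildly successful game leaves each player an absurd amount
# of space in a world this size.
for players in (1_000, 100_000, 1_000_000):
    per_player = WORLD_SQ_MILES // players
    print(f"{players:>9,} players -> {per_player:,} sq. miles each")
```

At a million concurrent players, each one still averages 48 square miles to themselves, an area most of them will never cross paths in.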

My personal opinion is for you to aim smaller. Instead of a world 8000x6000 miles in size, shoot for one 8x6 miles. That's still a pretty freaking big place to explore, and yet is vastly more manageable to handle and build.
