
#4978761 Drawing Hardware for 2D and 3D

Posted by JTippetts on 10 September 2012 - 08:33 PM

Not 100% sure what it is you're asking for. You're looking for a drawing tablet? What does that have to do with baking textures? Baking is a function of software, and Blender can do that just fine. You're looking at ZBrush, but you don't need sculpting?

To be honest, under $200 is a pretty small budget if you're serious about a tablet. You can cover the software bases with Gimp, Blender and Sculptris for free, and if you're not going to be hand-drawing a bunch of stuff, you can get by with just that. But I highly recommend that you spend the money to get a decent quality tablet, if you insist you need a tablet. Trying to go cheap will just net you frustration. I've worked with the cheaper tablets before, and their surface area can get pretty cramped. A good, quality tablet with a nice big surface area is a dream to work with. You certainly won't obtain a "professional level" tablet for under $200. Even a small Intuos will run you $230 or so, and like I said before, you really want to have a bigger tablet.

Additionally, graphics work tends to be fairly heavyweight. If your laptop is weak, you're probably just going to have to learn to deal with long render times. Rendering is processing heavy, and while renderers such as Blender Cycles can really save the day by offloading onto the GPU, it takes a pretty good quality graphics card (preferably an NVidia) to make it happen; otherwise you have to fall back to the CPU. Laptops suffer especially with this kind of work due to their relatively poor heat dissipation. If you've got all your cores running at 100% doing a render, that generates a lot of heat. Also, doing texture work can really hit the RAM, so having a decent base of installed RAM will do wonders when you have multiple 2048x2048 texture sheets open in Gimp, with an instance of Blender running so you can see it on the model. These days, even lower end machines can come installed with 6 gigs or so, so the more the merrier.

#4978135 Best language to be "versatile" in game making.

Posted by JTippetts on 08 September 2012 - 06:25 PM

Guys, let's not turn this into a language flame war thread, okay?

@OP: As you can see, it's not really about the tools so much as it is about the craftsman. Any of the languages mentioned will be more than enough for your needs, and any language you choose is going to offer up plenty of learning opportunities. There is a lot of argument and discussion about the easiest language, the hardest language, etc., but rest assured that all of them offer challenges aplenty, and all of them offer ways to both create awesome things and shoot yourself in the foot many times in the process. The important thing is that whatever you choose, be it C#, C++, Python, Lua or something more esoteric like D or even (don't do this) Brainf**k, stick with it and give yourself plenty of time to learn it and gain experience. A good programmer can change languages very quickly; after all, most languages have far more in common with one another than they have differences.

#4976811 Geologically Accurate Video Games

Posted by JTippetts on 05 September 2012 - 07:31 AM

Some aspects of change could be slow-paced enough that you wouldn't have to spend game time on them every single frame, or the updates could be spread out across multiple frames. For example, consider hydraulic erosion. The Navier-Stokes equations for hydraulic flow and erosion have already been approximated on the GPU. That could be an ongoing task that is incremented in small bits each frame, and could even be implemented on the GPU using OpenCL if desired. Thermal erosion could be done in a similar manner. As Bacterius indicated, though, it would be tricky having "persistent" erosion in a large-scale world. Procedural methods compress very well; you can represent an entire world using a single integer seed to give to the generator. However, when changes are imposed upon the generated data set, there needs to be a record made of those changes stored to disk. Something like erosion causes changes across the entirety of the data set, which can result in a very large chunk of change data that must be stored to disk in order to maintain persistence. This could easily result in hundreds of gigabytes of data, even compressed, with a very large world.
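To give a flavor of what "incremented in small bits" could look like, here's a toy single pass of thermal erosion in Python. The talus threshold and the 0.25 transfer factor are arbitrary choices for the sketch, not from any particular paper; a real implementation (let alone the GPU Navier-Stokes work) is considerably more involved.

```python
# Toy thermal erosion: material above a talus slope slips to lower neighbors.
# One call = one relaxation pass; a game could run one pass (or a sub-region
# of one pass) per frame to amortize the cost.
TALUS = 0.5   # maximum stable height difference between neighbors

def thermal_erosion_step(heights):
    """One pass over a 2D height grid; returns a new grid, input untouched."""
    h = [row[:] for row in heights]
    rows, cols = len(heights), len(heights[0])
    for y in range(rows):
        for x in range(cols):
            for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < rows and 0 <= nx < cols:
                    diff = heights[y][x] - heights[ny][nx]
                    if diff > TALUS:
                        # move a fraction of the excess downhill;
                        # subtract/add in equal amounts, so mass is conserved
                        move = (diff - TALUS) * 0.25
                        h[y][x] -= move
                        h[ny][nx] += move
    return h

grid = [[2.0, 0.0], [0.0, 0.0]]
eroded = thermal_erosion_step(grid)
```

Repeated passes flatten the spike toward the talus angle; the total amount of material never changes, which is the property you'd want to preserve in any incremental scheme.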

BTW, someone earlier in the thread made mention of a plate tectonics thread. That thread is http://www.gamedev.net/topic/623145-terrain-generation-with-plate-tectonics/ and it looks mightily interesting. The guy even posted a link to source code. I haven't fiddled with it yet, but I might have to download it and tinker for a while. This thread has got my brain fired up for procedural stuff again.

#4976683 Geologically Accurate Video Games

Posted by JTippetts on 04 September 2012 - 10:12 PM

If you read the book Texturing and Modeling: A Procedural Approach, in Chapter 14, F. Kenton Musgrave talks a bit about the differences between teleological and ontogenetic modeling. Specifically, teleological is "a) the study of evidences of design in nature... c) a doctrine explaining a phenomena by final causes" whereas ontogenetic is "based on visible morphological characters". In essence, it is the difference between generating, say, a terrain using precise modeling of real-world tectonic and erosive forces, as opposed to modeling a terrain using random crap that happens to create something similar to what you are trying to achieve; i.e., noise fractals and other techniques.

It's a discussion that has been occurring for a long time now, since the earliest days when modern graphical techniques were being pioneered in the 70s and 80s. While advances in hardware and computation speeds have been astounding, there are still very real, practical limitations on the available computing power. There are other constraints as well, especially in a game, where the (im)patience of the player is the key constraint. The aforementioned book has a number of quotes that I find to still be relevant, even in this day and age.

"Scientific models, or 'physical' models in computer graphics parlance, do not generally map well into efficient algorithms..."
"When you use a physical model... you often waste time computing the answers to questions about which you care not."
"Ontogenetic models tend to be--indeed ought to be--simpler and more efficient, both in programming and execution time, than corresponding physical models."

I remember once generating a new world in Dwarf Fortress. I love that game, but sweet hell was that generation process an onerous and frustrating expenditure of time. It even "failed" a couple times, forcing me to restart the process. I can only imagine how much longer it would have taken to generate it using more physically accurate processes, rather than the crude and error-prone processes being used.

Of course, for offline tools (such as those employed in making a game like Skyrim), I am certainly very much in favor of more accurate simulations. However, you also have to understand that extreme "realism" might ultimately run completely counter to the core idea of games, which is fun. Certainly, detailed and precise physical models are kind of cool, but you have to wonder if it is worth pouring hundreds of hours of design and execution time (detailed physical models can be VERY computationally expensive) for something that the vast majority of players are just going to charge across, guns blazing, far more intent on gibbing the enemy noobs than in studying the terrain around them for physical accuracy. That time might have been better spent adding more areas or maps, or in other ways enhancing the gameplay and "fun".

Eventually, I think, a team can abstract away a lot of the design for the physical processes, and amortize that expenditure across several projects, but it will always be a balancing act, and one in which fun and gameplay must always take precedence over physical accuracy. What you want to achieve is a worthy goal, though, and I do recommend that book. It's a very enlightening read.

#4976670 Generating Large Maps Using Libnoise (Limits?)

Posted by JTippetts on 04 September 2012 - 09:25 PM

Something that might work would be a hybrid algorithm. The world is generated procedurally on-the-fly, but any changes or edits done to the world are stored as diffs, similar to the process of applying a patch to source code. A particular world chunk would be generated using the pre-determined seed, then the diff files would be checked and any relevant changes made before presenting the chunk to the player. That way, you don't have to store the whole world, just enough data to represent the changes made to it. To be even smarter about it, you could track the number of changes made to a given chunk, and once it reaches a certain arbitrary threshold, then the chunk could be stored as-is, rather than as diffs. This could make heavily-modified chunks more compact and quicker to load.
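A rough sketch of that hybrid scheme in Python. All the names and the hash-based "generator" here are invented for illustration (a real project would use libnoise or similar); the point is that only the diffs need to hit the disk:

```python
# Hybrid world storage: chunks regenerate deterministically from the world
# seed, and player edits are kept as sparse diffs applied on top at load time.
import hashlib

CHUNK_SIZE = 4

def generate_chunk(seed, cx, cy):
    """Deterministically generate a chunk of height values from the seed."""
    chunk = []
    for y in range(CHUNK_SIZE):
        row = []
        for x in range(CHUNK_SIZE):
            # stand-in noise source: repeatable hash of seed + coordinates
            h = hashlib.md5(f"{seed}:{cx}:{cy}:{x}:{y}".encode()).digest()
            row.append(h[0] / 255.0)
        chunk.append(row)
    return chunk

def load_chunk(seed, cx, cy, diffs):
    """Regenerate the chunk, then patch in any stored edits."""
    chunk = generate_chunk(seed, cx, cy)
    for (x, y), value in diffs.get((cx, cy), {}).items():
        chunk[y][x] = value
    return chunk

# A player digs a hole at local (1, 2) in chunk (0, 0); only that one
# edit is stored, not the whole chunk.
diffs = {(0, 0): {(1, 2): 0.0}}
patched = load_chunk(12345, 0, 0, diffs)
```

The threshold idea from the post drops in naturally: once `len(diffs[(cx, cy)])` passes some cutoff, serialize the patched chunk as-is and stop diffing it.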

#4976237 Generating Large Maps Using Libnoise (Limits?)

Posted by JTippetts on 03 September 2012 - 06:02 PM

You understand that a 100000x100000 array of floats is roughly 37 gigabytes of memory, right? That's not a limitation of libnoise, that's a limitation of modern PC computing. If you absolutely must have that whole chunk of data available at once, your best bet is to break it up into blocks and save it to disk a block at a time; after asking the player if it is okay to use up 37 GB of their hard drive space, of course...
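The arithmetic, for anyone who wants to check it (assuming 4-byte floats):

```python
# 100000 x 100000 grid of 4-byte floats
width = height = 100_000
bytes_total = width * height * 4        # 40,000,000,000 bytes
gib = bytes_total / (1024 ** 3)         # ~37.25 GiB
print(f"{bytes_total} bytes = {gib:.2f} GiB")
```

Doubles would be twice that, and every extra per-sample attribute (normals, materials) scales it again.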

#4975582 Kind of stuck in learning.

Posted by JTippetts on 01 September 2012 - 05:10 PM

For what it's worth, learning the deprecated stuff won't necessarily be wasted time. Many of the concepts transfer over directly; most of the differences come in the application of those concepts. It is not as if GL3+ is any kind of radical departure from deprecated GL. The basic ideas have been there for a long time.

#4975551 The Novices Guide to becoming a game Programmer and artist!.

Posted by JTippetts on 01 September 2012 - 03:34 PM

There doesn't necessarily need to be a progression. For instance, in 18-some years of programming, I've never found a need to muck with Java. What you should worry about, instead of language progression, is learning one language really well. Once you have a strong grasp on programming, you will understand that the differences between most languages are very superficial (barring, of course, the differences between imperative languages such as C/C++ and functional languages like Lisp or Erlang). If you know programming, you can pick up a new language in a matter of days and become proficient with it within weeks.

Focus less on specific languages and more on becoming a proficient programmer.

#4974906 Best 3D Landscape SW for exchange

Posted by JTippetts on 30 August 2012 - 02:19 PM

It's not really practical to think of such a thing as a single model, and impose the restriction that it be handled as a single model. It's not a single model. On modern hardware, it cannot be done with a single model, not even remotely. Consider an 8000x6000 mile heightmap, sampled at 1 yard resolution (which is actually relatively coarse; even better would be 1 ft resolution). That equates to a heightmap 14080000x10560000 in size. That, my friend, is a gigantic freakin' heightmap. You are talking multiple gigabytes of data, just to store the vertex buffer. The index buffer would be another huge chunk of data. Every additional vertex attribute such as normals would contribute multiple gigabytes of data. Very few computers could keep that much data in memory at once, and certainly no consumer grade video cards are available with terabytes of video RAM that could keep it as a single model. No, a world that size absolutely MUST be split up in a sensible manner.

And it's not even just the impracticality of handling that as a single model. You also have to consider the impracticality of moving that world dataset around. Any time someone connects to your world, they need the world data streamed to them. You going to stream the entire freakishly gigantic single-model world to them? How many days do you want them to wait while it downloads? No, you need to have the data partitioned so that you only move around the data that is required for a given player. Anything else would be such an irresponsible waste.

And how would you construct such a world in the first place? Here is an exercise for you. Sit down with a 3D modeler and construct a heightmap at 1 yard resolution that is 1 mile by 1 mile in size. Take a measure of how long it takes you to sculpt and texture so that it looks good. Now, multiply that effort by 48 million, and you'll have a reasonable guess at how long it'll take you to construct the larger terrain by hand.

Typically, for planet-scale terrain, you are going to have to rely on procedural generation to construct the vast majority of the data set. There simply is not enough time for you to do it any other way. And given that, you can effectively store an entire world as a single unsigned integer seed to feed to the generator. The generator can be set up to spit out chunks of the world at a time, rather than the whole shebang.

Consider a world like World of Warcraft, to give yourself a sense of scale. The game consists of 4 continents. I read somewhere that the largest, Kalimdor, was approximately 60 sq. km. in size. The world you propose, at 48 million square miles, is approximately 124,000,000 sq. km. in size; roughly 2 million times bigger. How many people are you going to hire to populate that space?

As a final thing to think about: exactly how many players will you have, and exactly how much space will they need to occupy? Because your proposal would be sufficient to give 48 million players each a full square mile to play with. If you have fewer players, there will be vastly more space for each player. The effect of this would be a world that feels very, very empty and dead. If a player has to travel hundreds of miles to encounter another player, they might as well be playing single player. And if they all congregate in certain areas, that leaves huge swathes of the land unoccupied; essentially, those areas of land represent wasted resources that do not contribute to the final experience. So many man hours of time, so many computer hours of time, wasted on creating content that nobody will ever see.

My personal opinion is that you should aim smaller. Instead of a world 8000x6000 miles in size, shoot for one 8x6 miles. That's still a pretty freaking big place to explore, and yet is vastly more manageable to handle and build.

#4974181 Story for the Picture

Posted by JTippetts on 28 August 2012 - 11:27 AM

Remy was dead.

You get used to death, living in a place like this. The Nationalists don't give two shits for the life of a poor, rag-cloaked frog like Remy, not with old Pierre Whats-is-name around, the guy with the pinstripe suit and the golden teeth whose ugly mug you see on all the vidscreens from here to Saint-Denis, telling us all to be good, obedient little citizens. When the Nationalists come, with their lazerifles and burst grenades, death is just a quick flashburst away. Life is cheap.

I've seen it all, a thousand times, even though Kansas was only five years ago and Paris has been home for less than half that time. The shiny face masks, the clear plexi riot shields, the flashbursts. The screams, the flames, the busted out windows and smoking holes where wooden walls used to be. The blood, the pain, the fear. Seen it all, a goddamn surfeit of it in my life, but nothing like Remy. Nothing like seeing that beatific pacifist's face striped with blood, perched atop a slumping sack that used to be a person. A person, by god, and the only person in my life who ever meant anything, at least for a long, long time.

Kate and I, we ran from the smoke and the blood, jostled about by the crowd. She held my hand and pulled me along, swearing at me to move faster or by hell she'd cut my throat and dump me in the Seine. I followed, blindly. Not only the smoke and the smog caused my eyes to tear and my nose to run.


We came to a square, where Nationalists with rifles stood sentry around a milling press of people. The vidscreen there was alight, the smiling clown face of Pierre Mulleneaux leering down at the lot of us with that cheery jes' folks grin that, even in my relatively short time in Paris, I've come to hate. Makes me want to ball up my fist and put it right through the screen. Better; ball up my fist and cram it right down his damned throat in real life. A worthy goal, but laughable. Think the Nationalists are bad? The Mulleneaux Guard are even worse. They'll stare you coldly in the eye while they lift a boot and kick your kidneys out through your piss-hole.

If I could get my hands on a lazerifle, I'd give it a shot. Better to die trying to take a shot at the Great Father, than live with the knowledge that Remy is dead, that I'm alone once more. But to do that, to have that hilariously slim chance at revenge, I had to live. Kate was a tough one, hard as you could ask for underneath that child-like face smudged with dirt and blood. She hauled me along, snarling for all the world like she was ready to follow through on her threat to dump me with the rest of the bodies floating down the Seine. But she tugged me along, through the press, away from the Nationalists and the clownish japing of the asshole on the vidscreen, toward a place where we could be, if not safe, then at least temporarily secure while we figured things out. Figured out what we would do without Remy, what would happen to the movement without the one whose serene peace kept us all from turning blindly into bloodthirsty savages, monsters as bad as those we sought to resist.

Without Remy, I feared, we would be lost. For who was there left to remind us of how it once was? To remind us of the green fields, the tall trees, the sky unblighted by smog? To read to us from strange books about the voice of the people, to give us hope, to make the vision real in our minds? What do we know of green fields and tall trees? Kate, she was a Yorker, she'd never seen a tree in her life before the Turkish Army blew the southern half of her enviro-pod to hell and gone, killing half her family in the process and spilling the rest of the refugees out into the blighted and poisoned English countryside to try to scratch out their survival. The smart ones like Kate, they found rusty old barges and headed for the mainland. The dumb ones... well, you know what happened to them.

And me? Kansas was no paradise either. It was peaceful, though, at least relatively so. I could remember nights spent in safety, at least, behind the shimmering glow of the Faraday towers. But how could I ever think to take Remy's place? How could I lead anyone, as incapable as I was of even governing myself? For it was in me to shake off Kate's firm guiding hand, to snatch the lazerifle from the hands of the nearest Nationalist thug and to go down, like the old mopics say, in a blaze of glory. It was my driving urge, my only coherent thought. Just grab a rifle and blast away until the flashburst came and I didn't have to worry about it anymore, didn't have to see Remy's body so blasted and broken, didn't have to see those staring eyes. Didn't have to endure that dancing jackass on the vidscreens telling me what a goddamn paradise Paris was, what a brave and wonderful future we all faced if we would only accept into our hearts the laws of the Great Father.

But if I didn't lead them, then who else was there?

#4974163 Point inside (or outside) 3d object

Posted by JTippetts on 28 August 2012 - 10:40 AM

One possible means of testing whether a point is inside an arbitrarily complex object is to pick a second point somewhere that you know is not inside the object, then test the line segment formed by the two points against the object, counting the number of times that it intersects the shell or mesh of the object. If the number of intersections is odd, the point is inside; otherwise, the point is outside. Consider a simple circle. For a segment from a point inside the circle to a point outside it, the line will cross the shell exactly once. For a segment between two points that are both outside, the line will either not cross the shell at all, or will pass across the circle, crossing once as it enters and once again as it leaves.
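In 2D, this parity test is the classic even-odd point-in-polygon rule. A minimal Python sketch, casting a horizontal ray toward +infinity (it ignores degenerate cases such as the ray grazing a vertex, which a robust implementation has to handle):

```python
# Even-odd rule: count how many polygon edges a horizontal ray from the
# query point crosses; an odd count means the point is inside.
def point_in_polygon(px, py, polygon):
    """polygon: list of (x, y) vertices in order. Returns True if inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # does this edge straddle the ray's horizontal line?
        if (y1 > py) != (y2 > py):
            # x coordinate where the edge crosses that line
            cross_x = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if cross_x > px:
                inside = not inside  # each crossing flips the parity
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
```

The 3D mesh version is the same idea with segment-triangle intersections in place of segment-edge crossings.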

#4973117 Fullscreen mode not working with signle buffer

Posted by JTippetts on 24 August 2012 - 03:01 PM

Might want to post some code so we can see what you're doing. Also...

even though I used delay functions after and before glutSwapBuffers.

... this statement makes me very uneasy. It makes me suspect you are doing something wrong; like, say, putting some kind of sleep() or busy-wait before and after swapping. Which probably isn't what you really want to do.

#4972795 Lua: A return keyword followed by a table

Posted by JTippetts on 23 August 2012 - 05:01 PM

Each file in Lua is compiled as a chunk and executed as if it were an anonymous function. So the return statement essentially boils down to a single anonymous function that returns a table. The returned table contains 4 inner tables, named description, metrics, texture and chars.

If the above source file were named script.lua for example, and you executed the line t=dofile("script.lua") then after execution, the variable t would hold a reference to the table created and returned by the above. This is one trick for using Lua as a data description language.
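To make the trick concrete, such a file might look like the following. The field contents here are invented for illustration (the original source file isn't shown); only the four top-level table names come from the post:

```lua
-- script.lua: the whole file is one chunk, and the chunk returns a table.
return {
    description = { name = "MyFont", version = 1 },
    metrics     = { height = 16, spacing = 2 },
    texture     = { file = "font.png", width = 256, height = 256 },
    chars       = { A = { x = 0, y = 0, w = 8, h = 16 } },
}
```

Then `t = dofile("script.lua")` gives you the table, and `t.metrics.height` and friends read like plain data access.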

#4972480 Wrapping my head around a hex tile grid

Posted by JTippetts on 22 August 2012 - 10:39 PM

My current game project, Goblinson Crusoe, is a hex-based game. The way I handle it, though, is that the underlying scene manager is just a basic rectangular grid, for simplicity. The hex system is just a layer that sits on top of it. In this layer, I can convert a World coordinate to a Hex coordinate, and vice versa (although, when converting from Hex to World, I just calculate the World coord of the hex center). Objects, then, don't need to track their current Hex coordinate, but can just call the hex layer to calculate it when needed. The hex layer also provides other operations such as listing all neighbors of a given hex, listing all hexes within a given radius of a hex coordinate, and so forth; pretty much any kind of hex operation.

Movement and animation are all done in world coordinates. Pathfinding is done on the hex grid. So when a character wants to move, the pathfinder will try to find a valid path on the hex grid between the character's hex tile and the destination tile. This path is generated and returned as a list of hex coordinates. Since the game is a turn-based RPG fixed to the hex grid, this set of hexes is simply converted to a set of world coords representing the centers of each tile, and the character is progressed along the vector path thus produced.
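For the curious, the hex-to-world conversion and neighbor listing such a layer provides might look like this in Python, using standard axial coordinates on pointy-top hexes. Goblinson Crusoe's actual layer isn't shown in the post, so treat this as illustrative:

```python
# A minimal hex layer: hex coordinates are axial (q, r); the scene
# underneath can stay a plain rectangular grid in world space.
import math

HEX_SIZE = 1.0  # distance from a hex center to one of its corners

def hex_to_world(q, r):
    """World coordinate of the center of hex (q, r), pointy-top layout."""
    x = HEX_SIZE * (math.sqrt(3) * q + math.sqrt(3) / 2 * r)
    y = HEX_SIZE * 1.5 * r
    return x, y

def hex_neighbors(q, r):
    """The six hexes adjacent to (q, r), in axial coordinates."""
    offsets = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]
    return [(q + dq, r + dr) for dq, dr in offsets]
```

Pathfinding then runs over `hex_neighbors`, and the resulting path of hex coords maps through `hex_to_world` to get the vector path the character actually walks, exactly as the post describes.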

Since the underlying scene structure is a simple rectangular grid, there is nothing weird to worry about as far as trying to manage the scene as a hex grid. Objects don't belong to a given hex, but they are maintained in the spatial grid. Frustum culling using regular grid cells is quicker than trying to do it with hex-shaped cells. Hexes are an appropriate abstraction for the game board, but not necessarily appropriate for the scene structure, and you just compound your work by trying to force the issue.

#4972425 Is python fast enough for a simulation heavy game similar to Dwarf Fortress?

Posted by JTippetts on 22 August 2012 - 05:48 PM

Washu hit on the key points for making sure that your game is fast when using Python, at least in my experience. Some parts should be offloaded to C++ modules. I always tended to look at it on the basis of abstraction. Some things operate best at a low level of abstraction. Bare-metal stuff, like physics and rendering. Other stuff works well at a higher level of abstraction, such as logic and AI. The logic and AI can live well inside Python, and in fact is very well served by living in Python, but the rendering, collision, physics and pathfinding stuff would probably be best handled by C++, built as a layer upon which the higher levels of abstraction can operate.