Milcho's Journal

2D Skeleton Woes

Posted 02 February 2013 · 1,136 views
2d animation
About two months ago, I started writing a 2D game. Given that my previous work was on 3D deformable terrain, I figured a 2D game would be a nice change of pace and give me less hassle. I was right... mostly.

Character animation in 3D is not a simple task. There's some great software out there to help you animate it; heck, two years ago even I wrote a simple character animation program that could automatically attach a mesh to a skeleton. But enough nostalgia!

It seems that skeleton animation in 2D should be easier. You only have one simple rotation angle to worry about, and no need to work around gimbal lock with those pesky yet mathematically beautiful quaternions.

Here's a simple 2D textured skeleton in my game:
Posted Image
Seems straightforward enough to animate. You don't necessarily have to worry about vertex attachments; you can give each bone its own individual sprite and design them so that they blend together.

But you don't want to design your animations twice, do you? A walking animation should be playable facing both LEFT and RIGHT, so I need a way to easily flip animations. Unlike in 3D, where you can rotate your animation around some axis to orient it in the proper direction, in 2D you have to actually mirror the animation.

I'm cheap, so I decided to go for a cop-out - I'm only going to flip the sprites instead of the whole skeleton. Sound good? Yea. But you can't just flip the image in the sprite - you have to take the actual sprite rectangle and mirror all its vertices across a certain axis. Since SFML doesn't support this, it was time to write my own CustomSprite class.
Still, not the hardest thing. The simple code for flipping a vertex horizontally, across a vertical line at x = m_axisIntersect:
sf::Vector2f CustomSprite::FlipHorizontal ( const sf::Vector2f &point ) const
{
  sf::Vector2f pt = point;
  pt.x = m_axisIntersect - ( pt.x - m_axisIntersect );
  return pt;
}
The variable m_axisIntersect specifies where the mirror line sits. The same one-liner, applied to pt.y instead, flips vertically across a horizontal line.
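To show what "mirroring all the vertices of the sprite rectangle" amounts to, here's a minimal sketch of building a mirrored quad with SFML. The names and structure are illustrative only, not the actual CustomSprite internals:

#include <SFML/Graphics.hpp>

// Sketch: mirror a sprite's quad across the vertical line x = axisIntersect.
// The positions are mirrored vertex by vertex, while the texture coordinates
// are left alone - that's what makes the image appear flipped.
sf::VertexArray MakeMirroredQuad ( const sf::FloatRect &rect, float axisIntersect )
{
  sf::VertexArray quad( sf::Quads, 4 );
  const sf::Vector2f corners[4] = {
    sf::Vector2f( rect.left,              rect.top ),
    sf::Vector2f( rect.left + rect.width, rect.top ),
    sf::Vector2f( rect.left + rect.width, rect.top + rect.height ),
    sf::Vector2f( rect.left,              rect.top + rect.height )
  };
  for ( unsigned int i = 0; i < 4; ++i )
  {
    sf::Vector2f pos = corners[i];
    pos.x = axisIntersect - ( pos.x - axisIntersect ); // same formula as FlipHorizontal
    quad[i].position  = pos;
    quad[i].texCoords = corners[i] - sf::Vector2f( rect.left, rect.top );
  }
  return quad;
}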
So, here are the results:
Posted Image

Ok, the actual bones (which may be a little hard to notice - they're the thin blue lines) aren't flipped, but the sprite flip seems to have worked fine. The results look promising so far.

Except I forgot: my character isn't always going to stand oriented straight up. Due to the physics of the game, he will lean on slopes and corners. Here's an example:
Posted Image

So, wait, what happens if I use the direction flip on a slope? Well...
Posted Image

Oh, right. I'm flipping across the axis-aligned line that passes through the center of the character, so of course - the program is doing exactly what I told it to do, even if what I told it to do was wrong.

It looks like I'm going to have to mirror across an arbitrarily oriented line now. Mirroring around an arbitrary line isn't that bad, though it's certainly more involved.
Suppose we have a line passing through a point p, across which we want to mirror. The basic steps (sketched in code below) are:
1. Translate all points by a vector -p - so now the origin of the line matches the global origin.
2. Rotate all points so that the line you want to mirror by is aligned with one of the axes
3. Mirror around that axis using the same method above
4. Undo step 2
5. Undo step 1
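A minimal sketch of those five steps, assuming the line's orientation is given as an angle; treat it purely as an illustration, since the game stores its orientation differently:

#include <SFML/System/Vector2.hpp>
#include <cmath>

// Mirror 'point' across a line that passes through 'p' and makes an angle
// 'lineAngle' (in radians) with the x-axis.
sf::Vector2f MirrorAcrossLine ( sf::Vector2f point, const sf::Vector2f &p, float lineAngle )
{
  point -= p;                                        // 1. move the line to the origin
  float c = std::cos( -lineAngle ), s = std::sin( -lineAngle );
  sf::Vector2f r( point.x * c - point.y * s,         // 2. rotate so the line lies on the x-axis
                  point.x * s + point.y * c );
  r.y = -r.y;                                        // 3. mirror across the x-axis
  c = std::cos( lineAngle ); s = std::sin( lineAngle );
  sf::Vector2f out( r.x * c - r.y * s,               // 4. undo the rotation
                    r.x * s + r.y * c );
  return out + p;                                    // 5. undo the translation
}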

While I was thinking about this, I realized that I actually have all my sprites on the model in local coordinates already - they store their positions relative to the model's origin, which is the center point through which the mirroring line will have to pass. And I'm already setting the model's orientation when it touches a slope, so I already have a function that rotates it.
In fact, I was setting the rotation like this:
float angle = atan2( -up.x, up.y )*180.f/(float)PI;
m_rootBone.SetRotation( angle );
However, I knew that when I mirrored the model, I could simply re-adjust the 'up' vector the model received so that it faced the right direction:
float angle = atan2( -up.x, up.y )*180.f/(float)PI;
if ( m_rootBone.GetSpriteFlip().first == CustomSprite::xAxisFlip )
{
  m_rootBone.SetRotation( 360.f - angle );
}
else
{
  m_rootBone.SetRotation( angle );
}
And the results now looked good:
Posted Image

Sure, the actual skeleton was nowhere near where the displayed sprites were, but that doesn't matter. The skeleton is only used to draw sprites, not for collision or any other purpose.

In the end, making a 2D skeleton was overall easier than a full 3D skeleton, but it had some challenges that you don't face when dealing with 3D.


Today's Lesson (for me): Pointers that mysteriously get deleted!

Posted 31 January 2013 · 656 views

I'm posting this as an exercise/lesson; hopefully it's useful to someone.
Eight years after I started learning C++, I was still caught off guard by this.

Basically I had code like this: (ignore the LOG_DEBUG - that was just put there when I was testing this)
struct ViewMember
{
  ViewMember(Widget *wid, int wei) : widget(wid), weight(wei) { }
  ~ViewMember() { LOG_DEBUG << "calling View Member destructor"; delete widget; }
  Widget *widget;
  int weight;
};
Now, in a class called View, I have a member widgetList of type std::vector<ViewMember>:
View& View::AddWidget (Widget *toAdd, int weight)
{
  if (weight < 1)
    weight = 1;
  if (Contains(toAdd))
    return (*this); // won't add same widget again
  widgetList.push_back(ViewMember(toAdd, weight));
  needToReorganize = true;
  return (*this);
}
This code (barring typos I may have made in copy/re-arrange) compiles fine, without warnings or errors.

However, once I actually tried to access something in widgetList, I got a crash - it turns out my widgetList[i].widget was an invalid pointer... as if something had deleted it.

I went through this, and I've figured it out, but I thought I'd share since I think it's somewhat important/interesting.

The Problem
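In short: ViewMember owns a raw pointer and defines a destructor that deletes it, but no copy constructor or copy assignment operator - the classic Rule of Three trap. The call widgetList.push_back(ViewMember(toAdd, weight)) copies a temporary ViewMember into the vector; when that temporary dies at the end of the statement, its destructor runs and deletes the very widget that the copy inside the vector still points to. The same thing happens whenever the vector reallocates and copies its elements around, which is exactly why the LOG_DEBUG line fires even though no ViewMember was "really" removed.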


The Solution
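One way to fix it (a sketch, not necessarily the exact fix described in the original post) is to make ownership explicit, so only one object is ever responsible for the delete - for example by storing the widget in a std::unique_ptr. Writing a proper copy constructor and copy assignment, or simply not deleting in ViewMember's destructor and letting View do the cleanup, works too.

#include <memory>
#include <utility>
#include <vector>

struct Widget { /* ... */ };

struct ViewMember
{
  ViewMember( std::unique_ptr<Widget> wid, int wei )
    : widget( std::move(wid) ), weight( wei ) { }

  std::unique_ptr<Widget> widget; // deleted exactly once, by the element that owns it
  int weight;
};

std::vector<ViewMember> widgetList;

// inside View::AddWidget:
//   widgetList.push_back( ViewMember( std::unique_ptr<Widget>(toAdd), weight ) );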



Water, water, everywhere...

Posted 21 November 2011 · 1,065 views

I've been working on water, slowly progressing forward. To those who might wonder: keeping track of, generating, storing and updating water when you're dealing with a 5km planetoid (our current test planet) isn't quite straightforward. This is sort of a back-post, since I already had basic water in my last post, but this one goes a bit more in depth.

The water simulation we went with is not like anything I've read about. There were several methods I considered before going with what we have now.

First was particle water.
The pros: Good water simulation. Realistic waves, breaking etc. possible.
The cons: Hard to extract surface. Impossible to keep track of all particles on any significant planetary scale.
This was obviously not going to work for us.


Height-field based water.
The pros: Significantly less storage. Easy surface extraction. Decent water simulation.
The cons: Breaking waves are harder (though not impossible). And how do you do height-field water on a spherical planet? The answer: not well. You can either split it into 6 separate height fields, or try to create one with polar coordinates based on an even point distribution.
This is too bad, because back before we went for an actual planet, on flat 2D terrain, this was my top choice.

What I went with:
Storing water in a 3d voxel density grid. Much like terrain.
Pros: Storage concerns were already figured out - water can be stored in the same data blocks as the terrain - so it's possible on a planetary scale.
Cons: It's not a very realistic simulation. It's hard to make huge waves.


There was also one other pro, which I didn't realize until later: updating water was made somewhat easier by the fact that I stored it on a grid. Of course, the grid is NOT oriented with the surface, yet thanks to the range of densities [-127, 127] it was possible to achieve a perfectly smooth water surface anywhere on the planet, despite the grid being all squirrelly.
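To illustrate why the density range matters: the water surface sits wherever the density crosses zero, and that crossing can fall anywhere between two neighbouring samples, so the surface isn't snapped to the grid. A tiny sketch of that interpolation, assuming (as with the terrain voxels) that the surface is the zero crossing of the signed density:

#include <cstdint>

// Where does the surface cross the edge between two adjacent samples?
// dA and dB are densities in [-127, 127] with opposite signs; the result is
// a fraction in [0, 1] along the edge from sample A to sample B.
float SurfaceOffset ( std::int8_t dA, std::int8_t dB )
{
  return static_cast<float>( dA ) / static_cast<float>( dA - dB );
}

// e.g. SurfaceOffset( -40, 90 ) is roughly 0.31 - the surface sits about a
// third of the way from A to B, regardless of how the grid is oriented.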

Here are some screenshots of the apparently misaligned grid and the nonetheless smooth water surface:

Posted Image Posted Image




And here is a video of the new water shader:

http://www.youtube.com/watch?v=DxkK226C4pY

And a video with the older shader, but the only video of water spreading in a huge hole.

http://www.youtube.com/watch?v=c7V9EFp9XnQ
Update: video of the water on a small planet (200m radius)


http://www.youtube.com/watch?v=a_vlkAcodl8



For more info, and a demo of the project, you can visit http://blog.milchopenchev.com.

Thanks.


Grass, water and detail textures

Posted 29 October 2011 · 684 views

Here's a video of the work we've done on water, grass and detail textures. There's also a new build with these features on the blog: http://blog.milchopenchev.com

We haven't really had time to post any detailed description on the technicals behind the water or detail maps, but hopefully soon we will. As always, thanks for reading.






Triplanar texturing and normal mapping

Posted 10 October 2011 · 3,843 views



How we handled doing normal maps when also doing tri-planar texturing.

Note: this is a duplicate post from our project blog: http://blog.milchopenchev.com - the formatting may be a bit off, sorry.

For our texturing, we had no choice but to use tri-planar texture mapping - since we generate an actual planet, the terrain can be oriented in any direction. Combine that with the fact that the terrain is diggable, and we had to make the texturing adapt to any angle. Triplanar mapping was the perfect solution.

Doing normal mapping on top of triplanar mapping may seem hard at first, but it's just a little harder than triplanar texture mapping.


Posted Image


To obtain the final fragment color for triplanar mapping, you basically sample the same texture as though it were oriented along each of the three planes (see the diagram on the right).

Once you have a sample from each of these planar projections, you combine the three samples depending on the normal vector of the fragment. The normal vector essentially tells you how much each planar projection contributes. So if you have a mostly horizontal surface, the normal vector will be vertical, and thus you sample mostly from the horizontal projection.

The same principle can be used to compute the normal from a normal map. Instead of sampling from the texture, you sample from the normal map; the RGB color you get gives you the normal vector as seen in that plane. Then you combine these normals using the same weights you used to blend the texture samples.

Posted Image



Basically you obtain three normal vectors, one for each plane, each in a coordinate system aligned with the texture on that side.

In the picture on the right, the red, green and blue lines are the axes of each projection of the texture, while the dark purple is a sample normal vector. You can imagine that the closer the fragment's normal is to a plane's own normal, the more it samples from that plane. One difference from plain texture mapping is that when the fragment's normal is close to a plane's normal but facing the opposite direction, you have to reverse the normal map's result.

This is what the code for obtaining the normal of one texture from its three normal projections looks like in our terrain shader:



vec4 bump1 = texture2DArray(normalArray, vec3(coordXY.xy, index));
vec4 bump2 = texture2DArray(normalArray, vec3(coordXZ.xy, index));
vec4 bump3 = texture2DArray(normalArray, vec3(coordYZ.xy, index));

vec3 bumpNormal1 = bump1.r * vec3(1, 0, 0) + bump1.g * vec3(0, 1, 0) + bump1.b * vec3(0, 0, 1);
vec3 bumpNormal2 = bump2.r * vec3(0, 0, 1) + bump2.g * vec3(1, 0, 0) + bump2.b * vec3(0, 1, 0);
vec3 bumpNormal3 = bump3.r * vec3(0, 1, 0) + bump3.g * vec3(0, 0, 1) + bump3.b * vec3(1, 0, 0);

return vec3(weightXY * bumpNormal1 + weightXZ * bumpNormal2 + weightYZ * bumpNormal3);


Where weightXY, weightXZ and weightYZ are determined like so from the normal that's calculated at that fragment:
weightXY = fNormal.z;
weightXZ = fNormal.y;
weightYZ = fNormal.x;
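These weights use fNormal, the coarse per-fragment normal. As noted just below, we obtain it from density differences between the voxels; here is a minimal central-difference sketch of that idea (sampleDensity is a made-up accessor, and the exact scheme is an assumption rather than the engine's actual code):

#include <cmath>

struct Vec3 { float x, y, z; };

float sampleDensity( int x, int y, int z ); // hypothetical accessor into the voxel grid

// Approximate normal at a voxel from density differences (central differences).
// Depending on the sign convention of the densities, the result may need negating.
Vec3 DensityNormal( int x, int y, int z )
{
  Vec3 n = {
    sampleDensity( x + 1, y, z ) - sampleDensity( x - 1, y, z ),
    sampleDensity( x, y + 1, z ) - sampleDensity( x, y - 1, z ),
    sampleDensity( x, y, z + 1 ) - sampleDensity( x, y, z - 1 )
  };
  float len = std::sqrt( n.x * n.x + n.y * n.y + n.z * n.z );
  if ( len > 0.f ) { n.x /= len; n.y /= len; n.z /= len; }
  return n;
}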

I realize it sounds a bit counter-intuitive that we need a normal before we can calculate the per-fragment normals, but this normal can simply be obtained by other means, such as per-vertex normal calculations. (We obtain it through density difference calculations of the voxels.)

Finally, to get good results you need a good normal texture. We only had time to create one (neither of us is a graphics designer), so here's a video of the rock triplanar normal map, with a short day length on our planet:

http://www.youtube.com/watch?v=lEE5mOwYni8





PrEdiTer project intro

Posted 24 September 2011 · 643 views
procedural, editable, terrain
The Procedural Editable Terrain project is just what it sounds like - a project to make an engine for terrain that is both procedurally generated and allows for editing functionality (lowering, raising, etc.).

The project evolved from my previous, purely procedural terrain project. One other person has joined me and has been helping with various tasks on the project.

Currently the project has the basic functionality described above, plus some additional things. The major features are:

  • Persistent perlin-noise based terrain generation
  • Data stored in discrete voxels, allowing for modification of the terrain
  • Planet generation of variable radius, currently tested with a 100km radius, theoretically supporting much larger.
  • Planet-based biomes - desert, savanna, temperate, polar - distributed with variation across the planet.
  • Basic physics from extracted terrain information - accurate collision detection with terrain
  • Different LODs; more powerful PCs comfortably support a half-kilometer viewing distance
  • Custom terrain shader supporting custom lighting, triplanar texturing and blending between any two textures
  • Custom sky shader displaying the moving sun and related effects.
So, in an effort to increase the number of people aware of my work to 4, I'm going to be posting some blog entries describing some ideas.

You can read more technical stuff on the blog, where you can also download the current version of the program: http://blog.milchopenchev.com
Currently, a lot of the options are not exposed to the user through a nice interface, but some options are accessible via a console (~).


Here are some screen shots:

Posted Image Posted Image

Posted Image Posted Image

Posted Image Posted Image

Posted Image Posted Image

Posted ImagePosted Image

http://www.milchopenchev.com/RandomPics/prediter/thumbs/screen_polar_1.jpg http://www.milchopenchev.com/RandomPics/prediter/thumbs/screen_polar_2.jpg





Terrain generator resumed as Procedural Editable Terrain

Posted 09 September 2011 · 579 views

Well, it's been a long time since I've posted here.

The terrain project I've been working on has resumed, under the name PrEdiTer - Procedural Editable Terrain.

Its new blog is here: http://blog.milchopenchev.com

---


Procedural Terrain + Texturing (with screenshots)

Posted 17 February 2011 · 850 views

The terrain is finally starting to look like..well.. terrain. With normal-based texture coordinate assignment and 3d texturing, the results are promising. The updated version is also up on my site, available for download, and in fact, I encourage you to try it and would appreciate any feedback.

The main problem was how to assign texture coordinates properly. Part of the solution was to use the normal vector and the elevation of the vertex. If the terrain at a given point is not approximately horizontal (based on the normal vector), then grass or snow can't hold on to it, so it's set to the rock texture.

The height is just used for determining how the horizontally-oriented faces are textured - with snow or grass. However, if only the height is used, then there's a clear cut-off for the switch, making it seem unnatural. So, I just added a (consistent) variance of +/- 100 m to the height, and the clear-cut line was dissolved.
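A rough sketch of that texture-pick logic - the thresholds and the variance helper are made-up values for illustration, not the ones actually used:

enum TextureId { Rock, Grass, Snow };

float ConsistentVariance( float x, float z ); // hypothetical noise, roughly in [-100, 100] m

TextureId PickTexture( float normalUp, float height, float x, float z )
{
  // Steep faces (normal far from vertical) can't hold grass or snow.
  if ( normalUp < 0.7f )
    return Rock;

  // Roughly horizontal faces: height decides grass vs. snow, with a consistent
  // +/- 100 m variance so the snow line isn't a clean cut.
  float snowLine = 800.f + ConsistentVariance( x, z );
  return ( height > snowLine ) ? Snow : Grass;
}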

The visibility remains ~1.5km, which is illustrated in the last screenshot. That mountain is 1km high.

I also increased the amount of detailed perlin noise (I use 2 perlin noise functions), which now generates some interesting looking formations, like the overhang and the archway seen below.

So, here are the screenshots:

Posted Image

Posted Image

Posted Image

Posted Image

Posted Image

Posted Image




Procedural terrain generation progress

Posted 15 February 2011 · 1,924 views

I've completed the start of the procedural terrain. You can download the test on my site here. I know not a lot of people read this, but if you are one of the few, and you decide to download it, I'd appreciate some feedback on how well it runs.

If you do download it, you can move with WASD and by holding down the right mouse button. There's no loading screen right now, so initially please wait and don't move until the terrain loads (~10-20 seconds). After that you can move as far as you want, and the program should generate the terrain around you.

On my two-year-old PC I get around 200 fps (shown in the bottom left corner). I can crank the terrain resolution higher and push visibility past 0.5 km, but it would probably have a very adverse effect on performance.

Right now, I use two layers for the different LODs. The high resolution is generated in close proximity to the camera (it's colored blue for testing purposes), and everything else is the low resolution. I'm thinking I might get better performance if I add a medium resolution and decrease the low resolution's sampling density.

Update: Added a third LOD, which means there's visibility of 1.5km now, and I actually gained a small speed boost in rendering.

I also added collision with the terrain, which prevents the camera (and any future objects) from going inside it. Since the terrain is density-function based, there is an actual 'inside', so collision isn't based on polygon intersection. The method I'm using allows for both elastic and inelastic collision. It also has the nice side effect of being decoupled from the rendered polygons, allowing any object to collide properly without the ground needing to be rendered at the object's location at all.
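The idea, roughly sketched below under the assumption that 'inside' means positive density (the Density and DensityGradient samplers are hypothetical helpers): sample the density at the object's position, and if it ends up inside, push it back out along the local surface normal and either remove (inelastic) or reflect (elastic) the normal component of its velocity.

#include <cmath>

struct Vec3 { float x, y, z; };

float Density( const Vec3 &p );          // hypothetical: > 0 inside the ground
Vec3  DensityGradient( const Vec3 &p );  // hypothetical: points toward increasing density

void ResolveCollision( Vec3 &pos, Vec3 &vel, bool elastic )
{
  if ( Density( pos ) <= 0.f )
    return; // still in the air

  Vec3 g = DensityGradient( pos );
  float len = std::sqrt( g.x * g.x + g.y * g.y + g.z * g.z );
  if ( len == 0.f )
    return;
  Vec3 n = { -g.x / len, -g.y / len, -g.z / len }; // outward surface normal

  // Push the point out by a small fixed step (a real implementation would
  // search for the actual zero crossing of the density).
  const float push = 0.1f;
  pos.x += n.x * push; pos.y += n.y * push; pos.z += n.z * push;

  // Remove (inelastic) or reflect (elastic) the velocity going into the surface.
  float vn = vel.x * n.x + vel.y * n.y + vel.z * n.z;
  if ( vn < 0.f )
  {
    float k = elastic ? 2.f : 1.f;
    vel.x -= k * vn * n.x; vel.y -= k * vn * n.y; vel.z -= k * vn * n.z;
  }
}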

Updated version has replaced the old download.

Anyway, here's a screenshot. Though the best way to experience the procedural (theoretically infinite, though 'arbitrarily large' is a better term) terrain is to download the program from the link above.




Posted Image




New screenshots, with 3 LODs

Posted Image

Posted Image

Edit: new screenshot. I might switch to a texture atlas, because 3D texturing has its quirks. Alternatively, I could make my own mipmaps for the 3D texture.

Posted Image




Density-based Terrain Screenshots mini-post

Posted 09 February 2011 · 936 views

Today, in a mini-update, some screenshots of terrain generated by a combination of the Marching Cubes algorithm and a perlin-noise density function.

The images below have 2 parts: a small grid in a 64x64x64 meter cube, sampled at 2 samples per meter, and a large grid in a 512x512x512 cube sampled at 0.25 samples per meter (1 sample every 4 meters). Internally both are represented by the same number of "blocks" - a 4x4x4 grid of "blocks", each block consisting of 32x32x32 samples of the same density function.
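(The numbers check out: 64 m at 2 samples per meter and 512 m at 0.25 samples per meter both give 128 samples per side, i.e. 4 blocks of 32 samples in each direction.)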

The denser grid is supposed to be around the character; however, there are some decisions still to make about this method, so no specifics yet.

The whole thing contains something on the order of 80,000 faces.

First an overview of the whole 512x512x512 m cube

Posted ImagePosted Image

And how it looks when the camera is standing at eye-level in the middle of the denser patch:

Posted ImagePosted Image




Note that this method of generating terrain can easily produce overhangs and caves, unlike a heightmap-based approach. There's still a lot of work to do to make it completely usable, but for now it's a solid start for my engine's terrain generator.







