Fost - Mr. Robot Art
The Ripple Effect
For shaders in the game, I didn't really want to go beyond the vertex shader 1.1 spec if possible. Keeping to that means more hardware T&L cards will be supported, but VS1.1 only allows 128 instruction slots. Of course, vertex shaders can be emulated on the CPU where the hardware doesn't support them, and for a game like Mr. Robot we can probably get away with a fair amount of that, but every little bit helps.
This, however, proved difficult with the coolant shader, which was already doing the lighting, texture scrolling, and texture scaling in the vertex shader. The coolant object in Mr. Robot is a one-tile unit cube scaled to match the size set in the editor, so I use a multiple of the vertex's world position to scale the texture coordinates - that way, the texturing is at the right scale no matter what size you set the water to be. After all that, there wasn't a lot left to play with, but I wanted to have a mess with putting a 'wobble' into the vertices just to make it a tiny bit more interesting.
A pretty simple way to do this, I thought, would be to apply a sine wave to the vertex height, based on the vertex's horizontal position plus time.
The first thing I did was subdivide the top of the coolant object quite a bit (without any vertices to wobble, it wouldn't look very good at all!). Then, after much messing around, I finally got something running at the right scale (initial tests resulted in screen-sized tidal waves :( ). The problem was, it was a little too regular. The solution to that is more waves: if you run lots of waves through a mesh, all at different amplitudes, wavelengths and frequencies, you can make a pretty nice irregular wave pattern. I've actually done this before, many years ago, with multiple octaves of noise for a pre-rendered water animation. Pretty quickly, though, I hit the vertex shader instruction limit, so whilst I was testing I dropped out some of the point lights (Mr. Robot supports ambient light, a single directional light, and up to 5 point lights per room).
The problem with this kind of thing is that it's always a matter of adjusting loads of parameters which all affect each other, so something pretty simple ends up taking all day :( . Finally, I whittled it down to 3 sine waves - one running the length of the water, one along the width, and a much finer one at 45 degrees. With a bit of tidying up of the HLSL code, I managed to keep it all inside 128 instructions with all the lights turned on.
The code below is what I'm using - projectedVertex is the vertex transformed ready for output (in camera space, I think; not sure I've got the terminology right there :? ) and worldSpaceVertex is the vertex in world space. There's a slight issue in that there's also a small amount of horizontal wobble, but it's so small you can barely see it, and it's fine for the game. It's probably caused by using the wrong matrices to work out the output position, or by adding a value to the vertex after it's already in camera space - bear in mind that I'm an artist and not a programmer, so I don't really understand what I'm doing :)
//Sine Wave X
tempY = 0.4*worldSpaceVertex.x; //Scale wavelength (bigger number = smaller wavelength)
tempY = sin(g_time + tempY); //can multiply g_time to increase frequency
tempY *= 2; //Scale Amplitude
projectedVertex.y += tempY; //Add result to vertex position
//Sine Wave Z
tempY = 0.4*worldSpaceVertex.z;
tempY = sin(g_time + tempY);
tempY *= 2;
projectedVertex.y += tempY;
//Sine Wave X+Z
tempY = worldSpaceVertex.x + worldSpaceVertex.z;
tempY = sin(g_time + tempY);
tempY *= 1; //Scale Amplitude (1 = unchanged, kept for symmetry)
projectedVertex.y += tempY;
//END WAVE EFFECT//
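For anyone who wants to play with the numbers outside a shader, here's a small Python sketch of the same three-wave sum (the `wave_height` function is my own naming; the constants mirror the HLSL above). Displacing the world-space height with a function like this before projecting, rather than adding to the already-projected vertex, should also avoid the small horizontal wobble:

```python
import math

def wave_height(x, z, t):
    """Sum of three sine waves, using the same constants as the
    shader snippet: 0.4 wavelength scale, amplitudes 2, 2 and 1."""
    h = 2.0 * math.sin(t + 0.4 * x)   # wave running along X
    h += 2.0 * math.sin(t + 0.4 * z)  # wave running along Z
    h += 1.0 * math.sin(t + x + z)    # finer diagonal wave at 45 degrees
    return h
```

Because the three waves have different wavelengths and directions, their sum never lines up into an obviously repeating pattern over the size of a room, which is what breaks up the regularity.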
This was a fun little effect to make for the teleporter. I used to love the really simple teleport effect in Lunar Jetman on the Spectrum, so this is my mini-homage to that. It's just a point emitter with very slow moving particles, so they have some velocity and can be velocity aligned. They are then scaled along one axis - and since they are velocity aligned and all moving away from a point, that turns the whole system into a big spiky orange ball.
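A minimal sketch of the idea in Python (function and field names are all made up for illustration): slow particles leave a point emitter, and each one is stretched along its own velocity, so the stretched axes all point away from the centre.

```python
import math
import random

def emit_spike_particles(n, speed=0.05, stretch=8.0, seed=1):
    """Point emitter: n slow particles moving out from the origin,
    each stretched along its own velocity. Because every velocity
    points away from the emitter, the stretched quads together form
    a spiky ball."""
    rng = random.Random(seed)
    particles = []
    for _ in range(n):
        # pick a random direction on the unit sphere by rejection sampling
        while True:
            v = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
            length = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
            if 1e-6 < length <= 1.0:
                break
        direction = tuple(c / length for c in v)
        velocity = tuple(c * speed for c in direction)
        # velocity-aligned scale: the particle's long axis follows its velocity
        long_axis = tuple(c * stretch for c in direction)
        particles.append({"velocity": velocity, "long_axis": long_axis})
    return particles
```

The key property is simply that each particle's long axis stays parallel to its velocity, which a real velocity-aligned billboard system gives you for free.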
You'll have to excuse the 'home movie' shaky camera in the flash anim; it's an artefact of scrolling around for the video whilst the game camera is also trying to smoothly track the player position. Anyway, it should be possible to see what's going on here. It's an 'X-Ray' type effect, along the lines of many recent RTS games that allow you to see your units through buildings. It works by rendering the robot mesh once, looking red and partially transparent, with 'zenable' set to false (so that pass ignores the Z buffer), and then again in a second pass using the standard shader (with the Z buffer enabled). A little messing around with the order in which things are drawn means we can control whether or not the X-Ray effect will be seen on a per-shader basis.
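As a rough sketch of that draw ordering (all names invented; this isn't the game's actual renderer), the two passes can be modelled as entries in a draw queue: occluders go in first, then the Z-test-disabled silhouette pass, then the normal pass.

```python
def build_xray_passes(mesh):
    """Two-pass 'X-Ray' draw: first a red translucent silhouette with
    the Z test disabled, then the normal pass with the Z buffer on."""
    return [
        {"mesh": mesh, "shader": "xray_red", "z_enable": False, "alpha": 0.4},
        {"mesh": mesh, "shader": "standard", "z_enable": True,  "alpha": 1.0},
    ]

def build_frame(occluders, player):
    """Draw occluders first; because the silhouette pass ignores the
    Z buffer, it shows through anything drawn before it, giving the
    per-shader draw-order control described above."""
    frame = [{"mesh": m, "shader": "standard", "z_enable": True, "alpha": 1.0}
             for m in occluders]
    frame.extend(build_xray_passes(player))
    return frame
```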
At this point, I'm not 100% sure we'll be able to use it, as I'd need to re-export any models that we don't want to see the effect on. Robots, crates and consoles should probably still obscure the player. Thankfully it's not strictly necessary, as all the rooms are designed so you don't get confused by too much view-obscuring geometry, but it's a nice little feature for tracking exactly where you are.
More Particle Effect Shots
Asimov taking damage from a hostile robot.
Active waypoint looping effect.
Asimov taking damage in shallow coolant.
Poo Bear - Mr. Robot Programming
Lots of work on ghost hack gaming this month. I'm trying to get all the core functionality working so I can play through it all and finalise the design. To that end I'm trying to get the first part of the game fully working and integrated with ghost hacking. We have two modes that I call "reality" and "hacking". It's very important that they connect together properly so you can move easily between them.
- Picking up ghost hack items in the real world and being able to configure them on your avatar.
- Other robots giving you objectives that require you to hack into something, the game recognising when you've done it, and the appropriate dialogue firing.
- Being blocked by a door in reality, performing a hack, the door opening as a result.
The first ~30 rooms introduce various characters and ease you into learning how the game works, both in reality and in hacking. This is the bit I want to get completely working. Even though we don't have all the animations, models, items or frontend screens completed, it is important to show the core systems running and meshing together. This is a very powerful development technique in an environment where specifications are nebulous. Some people might find it strange, but developers rarely understand how the entire game will work in detail until it is almost complete. This is usually because the game is implementing ideas that are new to the developers. Note that I didn't say the ideas themselves are new; that happens very rarely indeed. Unless we just make Starscape sequels or games based on our old mainstream titles, we will be working with ideas and concepts we haven't implemented before. In that situation you really need to focus on getting things working quickly so you can review them.
This prototyping technique continues right through development until all the unknowns are implemented and working. In an ideal world you would create all the prototypes at the start, and only when you were happy with everything would you begin production, at which point there would be no more unknowns at all. The reality (especially on a tight budget) is that the prototypes start out big and flaky, and over the course of the development they become smaller and more focused as the game locks in on its final form. What I'm doing now with hacking will probably be the last prototype phase in this development. I'm not trying to prove whether hacking works; I've already done that in a much rougher, earlier prototype.
I now want questions answered like:
- Do all my item ideas work, and have I forgotten anything? 'Items' refers to the things you use to modify your ghost hacking avatars, like attack programs, defensive upgrades, restoratives, etc.
- Do the systems I've designed to synchronise reality and hacking work as intended? This includes things like when all your avatars die during a hack and you get thrown back into reality, and must then search real space for energon pickups you can trade in to buy their runtime lives back.
One of the assumptions I made is that when you complete your hacking objective you'll have to work your way back to an exit to get out. There was no way of testing what that would really be like unless I had a rough prototype.
I needed to know:
- How long the hack would take.
- How many resources I'd need to get in and out, which corresponds to how much preparation is needed before taking on the hack.
- How thrilling it would be "running" for the exit with a damaged party and limited remaining resources.
- How annoying it would be to get killed next to the exit.
If this assumption is proved wrong then I have two choices:
- Use playtesting to make it easier to get in and out, reducing the chance of you being killed near the exit. This means putting things off until the playtesting phase in beta.
- Throw the player out of hacking mode when his objective is complete.
An auto exit on objective complete has an interesting side effect in that it reduces your journey time by 50% and therefore reduces your random encounters by the same amount. This means the hacking part of the game would take half as long. If that was a problem you could just double the size of the circuit maps, but they are created by hand using a time-consuming editor. This is why we need a prototype now, because we must know how big the hacking maps need to be. Especially as some have already been made ;)
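The 50% figure is just linearity of expectation: if encounters happen with some fixed chance per map tile, expected encounters scale with tiles walked, so cutting the return leg halves them and doubling the map restores the original total. A tiny illustrative calculation (all numbers invented):

```python
def expected_encounters(tiles_walked, encounter_chance):
    """Random encounters are roughly linear in distance travelled,
    so expected encounters = tiles walked * chance per tile."""
    return tiles_walked * encounter_chance

# A hack that is 100 tiles in and 100 tiles back out again:
round_trip = expected_encounters(200, 0.05)   # walk in, then walk out
one_way = expected_encounters(100, 0.05)      # auto-exit on objective
doubled_map = expected_encounters(200, 0.05)  # one-way trip on a 2x map
```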
Save & Load
I don't actually want players to have to save their game. Eh? Currently we have in-game objects called backup points where you can save your "brain" and restore it if anything goes wrong. A slight twist of logic, but I think it works. So the player makes progress up to the next backup point (or waypoint), jumps on it, and the game saves automatically. So you don't have multiple saves, and you don't need to remember where you got to last time. You just start up the game and there you are.
As you strike out for the next waypoint you will have 3 lives which can be exhausted by falling in water or being caught by enemy robots. If you run out of lives you automatically reset to the last waypoint. You can recover your "life" if it gets low by picking up energon units and if you get stuck you can initiate a local reset of the room you are currently in (at any time, for free).
It's another set of ideas that really need prototyping. I'm trying to prevent people going through a loop of explore->die->reload->repeat. Yet I want them to feel the tension of being down to their last nugget of health and desperately searching for that elusive waypoint. A fine balance to achieve indeed.
The way this is implemented requires a copy of your current game save to be loaded into memory. Then, as you progress, the in-memory save is updated with what you have done. If you die or elect to go back to the last waypoint then the in-memory save is thrown away. If you hit a new waypoint then the in-memory save is written to disk and becomes permanent. The amount of information that needs to be recorded is huge. For example: if you move something from one position to another then that needs recording. Anything and everything you can interact with and change in any way needs recording and it all needs to be put back where it was and in its original state if a reset occurs.
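A minimal Python sketch of that commit/rollback scheme (class and method names are my own; the real game obviously records far more state): progress mutates a working copy, a waypoint commits it, and death or a voluntary reset throws the working copy away.

```python
import copy

class SessionSave:
    """Working copy of the last committed save. Progress mutates the
    in-memory copy; hitting a waypoint commits it, dying discards it."""

    def __init__(self, saved_state):
        self.committed = copy.deepcopy(saved_state)
        self.working = copy.deepcopy(saved_state)

    def record(self, key, value):
        # e.g. record("crate_7_pos", (3, 4)) when something is moved
        self.working[key] = value

    def reset_to_waypoint(self):
        # death or a voluntary reset: discard uncommitted progress
        self.working = copy.deepcopy(self.committed)

    def commit_at_waypoint(self):
        # new waypoint reached: the working copy becomes permanent
        # (the real game would also write it to disk at this point)
        self.committed = copy.deepcopy(self.working)
```

The deep copies matter: if the committed and working states shared mutable objects, resetting to the last waypoint couldn't put things back in their original state.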
I thought it would be interesting to post a shot of the editor in use just to show how truly awful the game can look whilst we are making it :) . We have found that a really good way for us to work is as follows:
- I spec out a room design, and start to block it in.
- If there are any scenery objects that I require for the room that don't exist yet, I tell Fost, and after we've discussed what it might look like he exports a quick placeholder object of the right size which I use to continue building the level.
- As soon as Fost gets round to it, he exports the finished object, and because we've already defined the object (but used a placeholder visual representation) every room that needs it automatically starts looking good the next time the game is fired up.
This saves a lot of time, because we don't have to go through and re-edit the rooms re-inserting finished art. The flipside is it requires a lot of imagination to visualise what the final room could look like.
The picture below is of the 'Hub' room work in progress. It's the central command area for the entire ship, and all sections are accessed from it. Hopefully, Fost will be able to post a comparison shot next month and we'll get to see what it's all supposed to look like - even I don't know yet! If we manage to ship the editor with the game, everyone will have a far better experience than I have, because they'll be able to build rooms using finished artwork, and each room will look exactly the same in the game as it does while you are building it in the editor.
The Eidolon's central Hub built from placeholder blocks. Note: in case you hadn't guessed, it's not supposed to look this bad!
Tune in next month for the exciting conclusion to 'building the Hub room' ... ;)