
DaggerXL Continues

I'll take a quick break from the Alien Awakening posts to post about DaggerXL.

I've been busy with various work, holiday, and home life issues, so I haven't been spending much time on my hobby projects the last couple of weeks, but that is changing now that Thanksgiving is finished. The next build is planned soon, and the speed of development should pick back up after that, more like it was initially. :)

That said, I'll talk about some of the developments that have been completed. Soon I'll put up a similar update regarding DarkXL. Note, there are lots of images in this post.


The map functionality is nearly finished, allowing you to pick various provinces and then fast travel to their locations. Not all the UI functions will be implemented yet - there will be a full UI pass for the next build - but the location filter, arrows, exit button, and province selection all work.

There isn't much information available on mapping location longitude and latitude to the close-up views. To accomplish this I track the range of location coordinates in a province and match it to the extent of the province in the close-up textures themselves, which is found by examining the textures' borders and colors on load. In the screenshots below you can see some red lines that show the province area in the texture - just for debugging, of course. I still have to tweak the location rendering on the map - there appears to be a small offset in some cases - but it's probably good enough for the next build.
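The coordinate matching described above boils down to a rectangle-to-rectangle remap. Here's a minimal sketch of the idea in Python - the function name and bound tuples are hypothetical, not DaggerXL's actual data layout:

```python
def location_to_pixel(loc_x, loc_y, province_bounds, texture_bounds):
    """Map a location's world coordinates into a close-up map texture.

    province_bounds: (min_x, min_y, max_x, max_y) - the range of location
        coordinates observed in this province.
    texture_bounds: (min_px, min_py, max_px, max_py) - the pixel extents of
        the province region, found by scanning the texture borders/colors.
    """
    min_x, min_y, max_x, max_y = province_bounds
    min_px, min_py, max_px, max_py = texture_bounds
    # Normalize the location into [0, 1] within the province's coordinate range...
    u = (loc_x - min_x) / (max_x - min_x)
    v = (loc_y - min_y) / (max_y - min_y)
    # ...then rescale into the province's rectangle in texture space.
    return (min_px + u * (max_px - min_px),
            min_py + v * (max_py - min_py))
```

A constant offset in either bounds estimate would produce exactly the kind of small location offset mentioned above.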

Approximate Cubic Texture Filtering

When I was at work one day (working late), it occurred to me that I could approximate bicubic texture filtering with 4 bilinear texture samples (instead of 16 point samples) and very little ALU work. It's approximate since I can't use the exact weights without taking all the samples (16 texture samples would be too expensive), but it actually works surprisingly well. In addition, I apply a small luminance-based contrast adjustment when this feature is enabled, as well as a larger alpha-based contrast adjustment to make flats look sharper. I implemented the feature in the engine as an optional extended feature; the basic implementation only took about 15 minutes or so. Some people may not like the look, and 4 texture samples instead of 1 for all the base textures may cause a noticeable hit on older hardware, so the feature will probably default to off initially. Make sure to click on the images to see the large versions for better comparison.

Note that this is a runtime filter that replaces plain bilinear when enabled. In the future, more advanced texture upscaling will be implemented in addition.
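The post doesn't give the exact math, and the author notes his weights are approximate. The standard version of this trick - folding pairs of cubic taps into single bilinear fetches, which is exact for a cubic B-spline - can be sketched in 1D like this (hypothetical Python illustration, not the actual shader):

```python
import math

def cubic_weights(f):
    # Cubic B-spline weights for a fractional texel offset f in [0, 1).
    w0 = (1 - f) ** 3 / 6.0
    w1 = (3 * f ** 3 - 6 * f ** 2 + 4) / 6.0
    w2 = (-3 * f ** 3 + 3 * f ** 2 + 3 * f + 1) / 6.0
    w3 = f ** 3 / 6.0
    return w0, w1, w2, w3

def bilinear(tex, x):
    # Emulate a hardware bilinear fetch in 1D (texel centers at integer x).
    i = int(math.floor(x))
    a = x - i
    return (1 - a) * tex[i] + a * tex[i + 1]

def cubic_via_bilinear(tex, x):
    """Cubic B-spline filtering with 2 bilinear fetches instead of 4 taps.

    Each bilinear fetch is positioned between a pair of texels so that the
    hardware's lerp reproduces the pair's combined cubic weight. Applied on
    both axes of a 2D texture this is 4 bilinear samples instead of 16 taps.
    """
    i = int(math.floor(x))
    f = x - i
    w0, w1, w2, w3 = cubic_weights(f)
    g0, g1 = w0 + w1, w2 + w3
    off0 = -1.0 + w1 / g0   # fetch lands between texels i-1 and i
    off1 = 1.0 + w3 / g1    # fetch lands between texels i+1 and i+2
    return g0 * bilinear(tex, i + off0) + g1 * bilinear(tex, i + off1)
```

A sharper kernel like Catmull-Rom has negative lobes that bilinear fetches can't represent exactly, which is presumably why a 4-tap version of it must approximate the weights.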


Approximate BiCubic (six comparison screenshots).

Atmosphere Tests

I've done some initial color tests for the atmosphere and terrain. I wanted to see what a yellow atmosphere with a blue star would look like and how the atmospheric scattering would affect red-colored plant life. The sunlight has a slight bluish color (rather than pure white or yellow/red), the "red sky" effect is blue, and the "blue sky" color is now yellow. When using semi-physically based rendering methods, this naturally generates interesting gradients and sunsets.

In this blog post you will see screenshots that I generated using an existing program, so you can think of these as programmer "concept art." The terrain engine for this project hasn't even been started yet - I'm still working on getting my other projects to a certain stage first. However, I will continue to do concept work on this project with existing code from my other projects, programs, and small test apps, so when I do start I'll have everything well planned - essential for a project this large.

Atmosphere test screenshots:

Mid-Morning. The sky is yellow and the sun blue.

Approaching midday.

You can see the atmospheric scattering mixing with the red "grass" and rock. Note that this is a color test - the final texturing will look much better. Also, this is not final terrain, just a mix of two noise-based fractals at different scales for now.

Ground Color Test.

This is the quality level that I expect to achieve for the sky and atmospheric scattering, though as I mentioned above the terrain texturing should look better.

It's interesting to see that physically based algorithms can generate an alien atmosphere. This has been a short post this time, but I'll talk more about the world and of course the AI in future posts.

As usual, any comments are welcome.

These blog posts are also mirrored on the Alien Awakening wordpress blog.


So far I've talked about Language and Thought, Emotions and Behavior, as well as some goals such as giving the appearance of intelligent individuals living in groups and societies. In order to allow entities to "think" in unique ways - they must have different personalities, skills, goals, and memories - there must be unique data about every agent (at least every agent that the player has come in contact with, or who is part of the local group the player is interacting with). In addition, entities must be able to learn, or give the appearance of learning, as well as being given a set of skills and knowledge ahead of time. To accomplish this, there is a single brain or cognitive architecture, and the knowledge, personality, experience, memories and so on are data that feeds this architecture. In other words, THOUGHT is derived from DATA plus ARCHITECTURE.

While this architecture isn't based on a single Cognitive Architecture, it is inspired by many well-known architectures such as SOAR, ACT-R, EPIC, and CLARION. So some of the concepts contained within may seem familiar - however, all aspects will be tuned and implemented to match the task at hand. In other words, some of the concepts are familiar but the implementation and final design are my own.

Emotional states will have a large impact on the final thought process of an agent and will play a much larger role than in most (if not all) existing Cognitive Architectures. In this sense, the final architecture may not be considered a true Cognitive Architecture - but the primary goal is to provide the appearance of intelligent and emotional beings, not to put forth a new theory of Cognition.


Memory is divided into Long Term Memory and Short Term Memory (also known as Working Memory). Long Term Memory is like a hard drive for a computer: it stores all the facts, memories, rules and so on that a person knows and remembers over the long term. Short Term Memory serves as a work area, where facts go when recalled and where short term sensory data is stored. If someone tells you a phone number, it goes into Short Term Memory, where it can be used in some mental processing (such as dialing the number or writing it down) and may be forgotten afterwards. Data in short term memory decays and is forgotten after a time unless it is refreshed (repeating a phone number to yourself in order to remember it long enough to get to the phone) or committed to long term memory. This short term memory also serves as a "scratch pad" of sorts for the thinking process.

Long Term Memory is split into several categories. ACT-R uses 2 categories: Declarative memory and Procedural memory. SOAR uses these same categories (though it calls Declarative memory Semantic memory) and adds a third: Episodic Memory. I take a similar approach. Many of these cognitive architectures do not include emotional state in memory; however, the Cognitive Architecture in Alien Awakening will include emotional affect as discussed in the last post. In addition, a belief system has been added to the Declarative Memory - it stores not only facts but beliefs as well. It is possible for an entity to be unsure about an absolute fact; thus even facts are beliefs in this system.

Declarative Memory: Declarative or Semantic memory stores facts and beliefs. Examples include: 1 plus 1 = 2, California is a state in the United States, Smith's hair is black, and so on. A belief rating is stored along with each entry, indicating the strength of the entity's belief. If an entity is looking for Smith in a crowd, they will look up pertinent information about Smith (such as black hair) and try to find a match to the information stored in working memory. If Smith is found and it turns out that his hair is brown - maybe the description was incorrect - then the belief that his hair is black can be replaced by the belief (with a very high belief rating) that it is brown. Declarative memories can also elicit an emotional response, depending on the emotional state when the memory was obtained. For example, a fact such as that the sun is red when setting may be stored along with an emotion of awe - since that was the emotional state the entity was experiencing when acquiring the fact. This would result in an increase in awe or associated emotions when thinking about a sunset or the color red.

Procedural Memory: Procedural memory is basically a set of rules in the general form "if x then y." External stimuli and memories are eventually distilled into rules using learning processes. Internal learning is possible, converting facts, beliefs, or episodic memories into general or specific rules. In addition, rules can be explicitly learned, such as from direct instruction. Like beliefs in Declarative Memory, rules also have a belief rating - how likely the entity feels that the rule is correct and accurate.

Episodic Memory: Episodic memory stores small memories or episodes in an entity's life. Things such as "I caught a large bass when fishing in the river, it was difficult because the pole was too short." This could give rise to learning, such as picking a longer pole when fishing for bass next time. Emotions are also attached to these memories, which allows the emotional state to be modified when recalling specific events. This should also give rise to generalized emotional categories, in this case about fishing.
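To make the memory categories above concrete, here is a hypothetical data layout - my own illustration, not the author's actual structures - showing a declarative fact with a belief rating and attached emotion, a procedural rule, and the hair-color revision example:

```python
from dataclasses import dataclass

@dataclass
class DeclarativeFact:
    # e.g. subject="Smith", attribute="hair_color", value="black"
    subject: str
    attribute: str
    value: str
    belief: float = 1.0                # 0..1 strength; even "facts" are beliefs
    emotion: tuple = (0.0, 0.0, 0.0)   # emotional state when the fact was acquired

@dataclass
class ProceduralRule:
    condition: str      # the "x" in "if x then y"
    action: str         # the "y"
    belief: float = 1.0 # how correct the entity feels the rule is

def revise(fact: DeclarativeFact, observed_value: str, confidence: float):
    """Replace a belief when observation contradicts it (the Smith example)."""
    if observed_value != fact.value:
        fact.value = observed_value
        fact.belief = confidence
    return fact
```

An episodic entry would bundle a small narrative plus an emotion in the same way, which is what lets recalling it shift the current emotional state.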

In this way there is a goal-oriented use of memory, attempting to generate and use rules and strong beliefs in order to decide on actions (more on this to follow), as well as emotional responses to stimuli and memories. An object, person, smell or sight may elicit a memory that generates an emotional response regardless of the overall goal or task. If the emotional response is strong enough, such as seeing a place where a grave tragedy occurred, the emotional behavior may override the cognitive behavior or affect it in some way. Emotional as well as logical cognitive responses work together to determine the final actions taken by an entity.

Stay tuned for Part II which will talk about group cognition, group memory, shared memory and more. It should be obvious that each entity cannot have completely unique memories about everything, such as how to use objects or facts - there must be a way to share the burden and yet keep the entities unique. More on that next time...

The Blue Star

Solar System

Alien Awakening will take place on a roughly Earth-sized planet in a star system similar to Vega - modeled after that system to some degree, even. The system has a blue main sequence star (class A), roughly 2.5 times as large as our sun (Sol), near the middle of its life. The star is almost 70 times as bright as our sun, with a proportional increase in energy and radiation output, so the planet that Alien Awakening takes place on orbits at about 7.1 AU - far enough out for liquid water and thus life. The solar system also includes a Jupiter-sized planet at about 16 AU and a larger gas giant, about 10 times the size of Jupiter, at about 65 AU. Like Vega, there is also a debris field starting at 80 AU and extending out to about 120 AU. Comets are a common sight within the inner solar system, though the larger gas giant does a pretty good job of sweeping out a path and keeping the inner solar system clear of debris.
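As a quick sanity check on those numbers (my own back-of-envelope arithmetic, using the inverse-square law, not anything from the post), a star about 70 times as luminous as Sol seen from 7.1 AU delivers somewhat more flux than Earth receives:

```python
def relative_flux(luminosity_rel_sol, distance_au):
    # Inverse-square law: stellar flux at the planet, relative to what
    # Earth receives from Sol (luminosity in Sols, distance in AU).
    return luminosity_rel_sol / distance_au ** 2

# With the post's numbers: ~70x Sol's brightness at 7.1 AU.
flux = relative_flux(70.0, 7.1)   # ~1.39x Earth's insolation
```

That works out to roughly 1.4 times Earth's insolation (equal flux would sit near sqrt(70) ≈ 8.4 AU), which fits the later statement that the planet tends to be hotter than Earth.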

To get an idea of what the solar system might look like, I found an artist depiction of the Vega System. Click on the image to get a larger view. Below the image is a link to the image source.

Vega System
Artist Impression of the Vega System.

Image Source

Atmosphere and Fauna

Due to the atmospheric makeup of the planet and the spectrum of light from the blue star, red light is not the most abundant color of light to hit the surface. What this means is that the best wavelengths to absorb for photosynthesis are not red and blue, but rather blue and green. The result is that red, rather than green, is the predominant color of photosynthesizing plants. Below you can see an image of what that may look like; click on it to link to the NASA article (where the image originates) describing the possibility of non-green plant life on other planets.

Non-green plants possible on other planets.

This also means that the aliens must be adapted to readily distinguish between different shades of red instead of green. The result is that alien vision is more sensitive to red light, making green and blue somewhat muted in comparison. In addition, different wavelengths of light are scattered in the atmosphere - the sky is not the same blue color as on Earth. Instead it is a yellow color, deepening to blue as the sun sets. The sun is smaller in the sky: the star is 2.5 times bigger than Sol but also about 7 times further away. However, it is also much brighter, causing the atmospheric scattering to be stronger. The planet has one moon, like Earth, though it is about 2 times larger than ours, and that moon has a smaller satellite in orbit around it as well. The moon has an atmosphere, due to its increased mass, though its smaller satellite does not.


The larger moon has a much greater effect on tidal movements; the difference between "low tide" and "high tide" is profound - settlements would not survive well near the coasts. The amount of light seen at night also varies more: during full moons the nights are much brighter than on Earth. The star, and thus the planet, are younger than their Sol counterparts, and the planet tends to be hotter as well. The result is a hotter, much more humid environment - very rich in the diversity of both plants and animals. Fortunately, life also formed earlier than on Earth, including intelligent life.

The alien world will be fleshed out more in future posts, thanks for reading. :)

Emotions and Behavior

Why simulate emotions?

Language and "conscious" thought alone are not enough to portray an intelligent alien individual. Pure cognitive systems will not give rise to emotions, and without them a simulated being can never seem real or believable. Emotions affect the behavior of many living beings, such as humans and animals (and in this case aliens), and differentiate them from purely mechanical or computer controlled devices. In other words, a simulated being without emotions will seem artificial at best. However, it should be obvious that having a huge list of emotional labels and transition states between them would be a daunting - perhaps even impractical - task. A system is needed that can not only describe the current emotional state but also describe the personality and moods of an entity. It must be possible for the environment, situations, events and actions to modify the current emotional state - even mood. It must also be possible to transition between emotional states in a sensible and believable manner, and finally for the personality of an entity to change slowly over time. For example, a child is naturally curious - easily aroused, easily stimulated and sometimes fearless. As they grow older that curiosity often becomes muted, for some replaced by fear and anxiety.

Modeling Emotion

There are a variety of ways that humans have come up with to categorize personality, using some sort of testing and finally a ranking in various categories. Probably the most famous is The Big Five: Openness, Conscientiousness, Extraversion, Agreeableness and Neuroticism. The Revised NEO Personality Inventory then subdivides each broad category into 6 sub-categories - for Openness, these are Fantasy, Aesthetics, Feelings, Actions, Ideas and Values. As you can see, this leads to 30 different categories. How do we track an emotional state with these systems? How do we deal with the fact that these categories often have large amounts of overlap - i.e. they are not orthogonal? Can we break this down into something simpler, while still being able to reconstruct these features? It turns out that we can indeed.


The PAD emotion model represents the space of emotions as a set of three (approximately) orthogonal axes: Pleasure/Displeasure (P), Arousal/Non-arousal (A) and Dominance/Submissiveness (D). Note that each axis ranges from positive to negative values, which will be normalized between -1 and +1 for my purposes. So pure pleasure would be +1 P, where pure displeasure would be -1 P. So a given emotional state can be given as a three dimensional value within this space. Some examples:

Anger = -0.51*P + 0.59*A + 0.25*D = (-0.51, 0.59, 0.25)
Fear = -0.64*P + 0.60*A - 0.43*D = (-0.64, 0.60, -0.43)

In order to "label" your current emotional state, you'd find the closest matching point. Obviously the actual emotional state is not a discrete label but a seamless range of values - meaning it's possible to be "sort of happy" or "a little" sad but "very" angry. What's more, we have implicit transitions: we can start at a given emotional state and have things affect that state in a believable way.
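The nearest-point lookup described above can be sketched directly with the PAD coordinates given for anger and fear (a minimal illustration; a real table would carry many more labeled emotions):

```python
import math

# PAD coordinates from the post: (Pleasure, Arousal, Dominance), each in [-1, 1].
EMOTIONS = {
    "anger": (-0.51, 0.59, 0.25),
    "fear":  (-0.64, 0.60, -0.43),
}

def label(state):
    """Name a continuous PAD state by its nearest labeled emotion."""
    return min(EMOTIONS, key=lambda name: math.dist(state, EMOTIONS[name]))
```

Because the state is continuous, the distance to the chosen label could also serve as an intensity cue - "a little" versus "very" angry.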

There are a variety of things that can affect a being's current emotional state:

  • Situational Effects (someone steals your favorite hat, you eat a hot meal, etc.)

  • Personality or Temperament and Mood.

  • Biological effects (are you sick, do you have a headache?)

  • Drug intake

  • Emotional changes due to an entity's own behavior. (Feeling shame for stealing)

Personality or Temperament

We can use the same emotional space to model an individual's personality. This personality defines their average emotional state over a variety of conditions - basically their emotional equilibrium. In between personality (or temperament), which changes slowly - more slowly as you get older, it seems - and an entity's moment-to-moment emotional state are moods. The types of moods that will be experienced are affected by both personality and exterior events. For example, if a disaster occurs (house destroyed in a fire, friend becomes deathly ill, etc.), then the entity may become depressed. This doesn't necessarily permanently change their personality - they can snap out of it with time - but it does change their emotional equilibrium for a long period. Shorter moods may also occur, such as being in a bad mood because of a hard day at work or in an especially good mood because of a raise. The personality or mood will have a large impact on the current emotional state of an entity, as well as affecting how quickly that state can change. For example, if an entity is depressed it is difficult to feel happy - and even when it does happen, it is short lived. Emotional states will tend to decay toward equilibrium, as defined by the mood and personality, so the more extreme the personality or mood, the quicker emotions decay from the opposite end of the spectrum.
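One simple way to get the decay-toward-equilibrium behavior described above is an exponential pull of the PAD state toward the personality/mood point - a sketch under my own assumptions, not the author's actual update rule:

```python
import math

def decay_toward_equilibrium(state, equilibrium, rate, dt):
    """Move the current PAD state toward the mood/personality equilibrium.

    state, equilibrium: (P, A, D) tuples in [-1, 1].
    rate: per-second decay constant. A depressed mood could use a larger
    rate on the pleasure axis, making happiness short-lived as described.
    """
    k = math.exp(-rate * dt)  # fraction of the offset surviving this timestep
    return tuple(eq + (s - eq) * k for s, eq in zip(state, equilibrium))
```

Making `rate` depend on how far the state sits from equilibrium (or on which side of it) would capture the idea that extreme personalities pull harder against opposite emotions.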

The Effect of Emotion on Behavior

As I've stated above, behavior affects emotion and emotion affects behavior. The emotional state of an entity will affect its speech, from the tone of voice to what it says. Emotion affects and sometimes even hinders thought, and can cause the entity to be reactionary. For example, if an alien sees something that really scares them, they may run away without thinking. Or if something is very funny, they may laugh out loud without meaning to. The difficulty of overriding an instinctive, emotional behavior depends on the intensity of that emotion. For example, a soldier may run from their post: they know that they should stay and fight, they may even want to stay and fight, but the fear is too overwhelming. Aliens will have some measure of willpower that indicates their ability to control their emotions. Emotions will also have more subtle effects on behavior, helping to govern how they feel about an individual - thus how friendly they are, how trusting they are. They will affect how well an entity does a task or how quickly they run an errand. They will affect the relationships between aliens, animals and the player.


Obviously, rigorously modeling emotion for all entities in the world all the time is impractical. Thus groups will also have an emotional state - essentially what can be described as the "median" state. For example, a group that is constantly under attack may generally feel afraid and anxious. This will allow for group dynamics at a macro level as well as individual dynamics at an individual level. While a group may have a general emotional state, the state of an individual will vary depending on personality and personal events. A group may dislike the player, but an individual of that group who has had nothing but good experiences with them may love the player. At all times every entity should feel like a unique individual, yet groups must still interact in a meaningful way.

Later I'll discuss the emotional modeling in more detail as well as fleshing out the Cognitive/Emotional/Behavior links.

Thanks for reading, any comments are welcome.

Language and Thought

One of the key elements of Alien Awakening is the portrayal of intelligent alien individuals who appear to think on their own and have the ability to converse with each other and the player. One of the most important aspects of this is language. One of the most important things that separate human beings from other, non-sentient animals such as dogs, is our language. Language is not just a tool for communication but a tool for thought itself. Our language defines the elemental concepts of conscious thought - that which lies beyond the basic qualia of feeling and need. By changing the language, not just the language we are speaking but the language that we use to form our conscious thoughts, we change the way we think. As an example, one could imagine a species like the Borg from Star Trek - their language would have no words for I or self. As a result they cannot form thoughts of individuality or self - it is a completely foreign concept. Before they could think of themselves as individuals, they would need to learn how to think of individuality.

It follows, then, that the language in Alien Awakening performs several very important tasks: it forms a means of communication between the player and aliens, it forms a visible form of communication between the aliens themselves to make them feel more alive, and finally it forms a framework for the alien thought process itself. This thought process would be inherently alien - the concepts expressible in the alien language would certainly be different than our own - and yet understandable if you learn to think in the alien language (or at least understand the concepts that it can express). This means that the alien AI must express "conscious" thought in the aliens' own language - if it is not expressible, then it cannot be thought of at a conscious level. Of course, at first this seems crazy, but the AI would need to be scalable. When dealing with individuals, each individual has its own thoughts and perceptions. When talking about groups acting out of sight or far away, a group AI would take over. It would still track the individuals - similar to particles in a massive particle system - but work at a much higher level. This would allow individual behavior when needed as well as group behavior for the rest of the world.

So, then, the language should be relatively simple, but it must be complex enough to give the aliens adequate - even rich - expressiveness of thought and action. First, the basic rules of the language must be simplified. For example, in English there are a large number of phonemes (groups of sounds that serve the same purpose in a language), each of which has a number of allophones (individual sounds that belong to the same phoneme). To add additional complexity, there is a poor match between spelling and phonemes. Several things will be simplified here: each phoneme will have exactly one allophone - one sound - and each phoneme will be represented by a single symbol or letter. In addition, the number of unique phonemes will be reduced, allowing players and aliens to "speak" directly in phonemes - greatly reducing the complexity at the base level.

There is a lot more to talk about when it comes to the language system but that will have to wait for future posts. Feel free to let me know what you think.
Hello, I've written some journal entries here in the past in addition to my wordpress blogs. However, the response has always been much less than on those blogs and forums, so I lost interest in updating my blog here. Sorry. :(

Anyway, I'm in the planning stages of my next project, Alien Awakening. It is my most ambitious project ever, so I'm still planning and will be until DarkXL reaches beta (3 full projects is too much for me). I've created a wordpress blog for the project where I can put up my thoughts, plans and concepts - but some of the posts tend to be longer and more technical than my other blogs, so it would also make a good fit here.

I'm going to mirror my blog posts here, so the first couple of posts will be spent "catching up." I'll probably also mention important events for DarkXL and DaggerXL here as well.

So what is Alien Awakening?

Alien Awakening is a game/simulation of an alien world. You play as a human stranded on this world, somehow inhabiting the body of an alien. When you speak, the sounds are alien to your ears and completely foreign. You can spend your time in this world trying to figure out how you got there, what happened to your real body and how to get home. As you attempt to unravel those mysteries you discover at least one sentient alien race, primitive but unique. You'll discover, though, that there is more to this world - to these people - than you'd ever expect. But this isn't a linear game or adventure; at its heart it is a simulation. The aliens speak and understand their own language; you can learn to converse with them - even to live with them. You can lead or follow, help or hinder, learn or destroy. The goal is to make you feel like you are on a real world - everything is simulated or procedural. Everything changes: the seasons turn, animals migrate, alliances form and shatter, creatures grow old and die, new life is born and raised. The game never ends; even if you find out how to go home, you may decide that you'd rather stay and continue to live with these people.

I'll also include the opening story for the game - as much as I have so far anyway. :)

"The Awakening"

The Cave

You blink, attempting to shake off the grogginess; it feels as if you've been asleep for days. As your vision clears, it slowly dawns on you that you are not where you should be. You vaguely remember a flashing of lights and blaring alarms, people running about in confusion. Of what? You don't remember, it's such a blur. Then blackness - were you knocked out? Around you there is nothing but glistening rock illuminated by the soft glow of the strange looking platform on which you now lie. Something feels strange... you look down upon your body and scream - the sound echoing off the cavern walls. Where your body once was, you see another body - you have 2 arms and 2 legs, but they are completely alien, unlike any creature on Earth. After lying, confused and shocked, for what seems like an eternity, you finally decide to try and figure out what's going on. Surprisingly, you have no problem controlling your new body and get up with little effort.

As you look around you see evidence of other machines - at least you think they are machines, but nothing you're familiar with. You start looking for a way out; maybe you can get some answers outside, maybe even find out who's responsible and get them to change you back. The cavern seems to have a simple layout, and it doesn't take long to find the lone tunnel that leads out of the room you started in. You see small machines and a few strange plants - well, you assume they are plants anyway, no point in getting too close just in case - but nothing else other than the glittering stone walls. There's not much light, but enough thanks to the machines.

Eventually you see a single book lying on the ground, an oddly familiar yet strange sight in these caves. It's a little damp, but the pages are thick and whole. You open it to finally see something you understand - it appears to be a journal, though you suspect the author may not be entirely sane. He claims that this cave is not on Earth at all! "Yeah, so I woke up on some alien world." But you continue to read... waking up in an alien body... a cave with strange machines and even stranger plant life... It claims that there is a settlement not far from here. Then it goes on to describe some of the basics of their language. "The aliens here have never heard any English, I have to speak their language for them to understand me..." This must be some kind of mind game; you feel the need to get out of this cave and find out what is really going on.

The World

You finally stumble into the sunlight, it takes your eyes a while to adjust but when they do... Where are you? You look around stunned, unable to move as you stare at the landscape... So alien... As you begin to regain your composure it dawns on you that the journal was right after all. You are on an alien world, in an alien body with no idea how you got there or how you can get home. With nothing else to go on, you head in the direction of the alien settlement.

And so begins the Alien Awakening.

A short update.

This is just a short update. I wanted to mention that I set up a wordpress blog/site for DaggerXL. General progress and eventually demos/builds will be hosted there. I'll talk more about DarkXL when I have more time to write another post. :)

As you can see from the wordpress blog, decent progress has been made. Textures and lighting from the camera are now supported, which means that proper normals are now set up for models. Also, since Daggerfall stores uv coordinates in subtexel coordinates (16 units per texel), the uv coordinates used for the 3D hardware must be scaled based on the resolution of the texture used on the surface. Anyway, here are some screenshots of these changes in action, which you can also see on the wordpress blog.
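The subtexel-to-hardware conversion described above is just a per-axis divide. A sketch (the function name is mine, but the 16-units-per-texel constant is from the post):

```python
SUBTEXELS_PER_TEXEL = 16  # Daggerfall stores UVs at 16 subtexel units per texel

def daggerfall_uv_to_hardware(u_sub, v_sub, tex_width, tex_height):
    """Convert stored subtexel UVs to the normalized [0, 1] UVs GPUs expect.

    The divisor depends on the texture's resolution, which is why the same
    stored UVs must be rescaled per-surface based on the texture applied.
    """
    return (u_sub / (SUBTEXELS_PER_TEXEL * tex_width),
            v_sub / (SUBTEXELS_PER_TEXEL * tex_height))
```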

Last entry I talked a little about my project DarkXL and about the full screen glow. This time I'm going to talk a little about the new dynamic lighting (one of the "extended" features in the upcoming build).

DarkXL adds optional dynamic lighting that affects things like powerups, various props and weapons fire. As you can see from the following screenshot (a wireframe view of DarkXL), the vertex density in the environment is fairly low.

This means that vertex lighting is right out. Outlaws used vertex lighting in the hardware version for the headlamp... unfortunately, it quite often looked awful because of the low vertex density. Jedi Knight suffered from this problem a lot too, although most of its lights were static, so it was easier to deal with. Because of this I implemented per-pixel lighting. I do 2 lights per pass in order to support shader model 2 (dynamic lighting isn't available on lower end cards, other than the headlamp/flash when you fire, like in the original), and just zero out the color for the second light if only 1 light is to be rendered in that pass.

Here are some screenshots with dynamic lighting highlighted:

The lighting uses a smoothstep for attenuation, like so:

attenuation = smoothstep(0.0f, 1.0f, 1.0f-saturate(d));
where d = scaled distance from the light source (scaled by 1/light_radius).

This allows light attenuation to reach exactly zero at the radius while still having a smooth, non-linear falloff. By lowering the 1.0f to some smaller value, you can get an "inner radius": the attenuation stays at 1.0 inside the inner radius and falls off towards the outer radius.
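Here is a standalone C++ sketch of that attenuation, with smoothstep and saturate written out since they are HLSL intrinsics; the innerFrac parameter is my own addition to show the inner-radius variant (it lowers the upper smoothstep edge):

```cpp
float Saturate(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

// Hermite interpolation, matching the HLSL smoothstep intrinsic.
float SmoothStep(float edge0, float edge1, float x)
{
    float t = Saturate((x - edge0) / (edge1 - edge0));
    return t * t * (3.0f - 2.0f * t);
}

// dist is the world-space distance from the light; radius is the
// light radius. innerFrac (0..1) is the fraction of the radius
// that stays fully lit. (innerFrac is illustrative, not from the
// original snippet.)
float Attenuation(float dist, float radius, float innerFrac)
{
    float d = Saturate(dist / radius);
    return SmoothStep(0.0f, 1.0f - innerFrac, 1.0f - d);
}
```

With innerFrac = 0 this reduces to the original formula; with innerFrac = 0.3 the light stays fully bright over the inner 30% of its radius and still hits exactly zero at the radius.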

I've also started a "secret" project recently that I haven't talked about until now: DaggerXL - a Daggerfall port/remake similar to DarkXL. The project takes a back seat to DarkXL until at least beta, but it lets me work on something else with my free time (maybe once a week) so I don't get burned out - and a Daggerfall remake is something I've wanted to do for a long time. Like DarkXL, it will use the original game data and emulate the original gameplay, and it will add easy moddability. Where it will diverge somewhat is that the extended features will include a lot of gameplay elements that were intended for the original and got canned. Of course there will be a "classic" mode for people who want a truly faithful experience. The enhancements will include things like being able to dock your ship in a town with docks, sailing your ship, proper shape-changing spells, being able to store items in your house (in the furniture) and more. :) So far I've got the basic engine up and running, using code from DarkXL - core rendering, player physics, scripting, control mapping and so on. I've got 3D objects rendering, although I'm not loading the proper textures yet. And I'm parsing and building dungeon blocks, although the positions/orientations aren't right yet - it does load and cache all the models and render them.

3D models with a default texture (not a Daggerfall texture):

So stay tuned for more progress on both these projects. I'm going to try to post a journal entry at least once a week. Next time I'll talk about the visibility system in DarkXL, which is surprisingly different from what you'd expect from a modern engine due to things like overlapping sectors. I also expect to get dungeon blocks working in DaggerXL, possibly with textures. We'll see how it goes. Any comments are welcome. :)
I started this journal as a way of talking about hobby development projects such as DarkXL, but I haven't been posting any entries, so I want to start up again. This entry won't be a very long one, but I hope to begin posting entries about the development of DarkXL so far and then about future developments.

For those of you who don't know, DarkXL is a sort of remake of Dark Forces, but in the spirit of a port - I guess "recreation" is a little more accurate. Unlike Doom, Duke Nukem 3D, Quake I-III, and other popular FPSs, the source code for Dark Forces was never released. However, before I started the project I found an old community - df-21.net - centered around Dark Forces. I really enjoyed Dark Forces back in the day, and I think it deserves to be preserved just like Doom and Duke Nukem 3D. So I started building this project from scratch. At the time I was entertaining the idea of doing something similar for Daggerfall, but for various reasons decided on Dark Forces instead (in the short term).

Anyway, DarkXL requires the original data files to run and attempts to faithfully emulate the gameplay, sound, music, game flow, etc. of the original game. Ultimately it will fully support all the user levels and mods that people have made over the years. But it is a Windows app that uses hardware acceleration and supports new features to enhance the original experience - including higher resolutions and color depths, texture filtering, improved controls and mouse look, and various optional "extended features."

Today I will talk about one of those extended features - the "glow" effect - and how I tried to make it enhance the original look in a way that is consistent with the original game's intent.

The original game had special colors in the palette that were treated as full bright, so you could get bright lights in dark areas. Looking at the movies and later games, you'll see a nice soft glow (when done well) around really bright light sources, such as blaster shots, wall lights, engine glow on ships, etc. Some older (pre-HDR) games used simple intensity thresholding to determine which parts of the scene glow. As you can imagine, this would have been really simple to implement without changing the pipeline: copy the backbuffer and downsample, apply your threshold value (so pixels below the threshold become black), blur the result, and add it on top. But I wanted to capture the intent of the original artists, and then if you ask "which pixels should glow?" the answer becomes really obvious - the full-bright pixels, of course.

Since the alpha value is already used, I had to generate a "glow mask" in order to know which pixels are emissive and by how much (to support true-color textures and multiple levels of transparency and emissiveness for DarkXL mods). So I set up 2 MRTs: one for the main render and one with 4 "effect channels." The first channel (the only one used right now) holds the glow mask. As the geometry is rendered, it writes out its emissiveness to the glow mask. This allows the mask to be generated without rendering the geometry again and opens up the possibility of other post-process effects later using the remaining channels.
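Conceptually, the mask-driven glow then selects emissive pixels by artist intent instead of by intensity. A toy sketch of the two per-pixel steps, with the blur elided and all names made up for illustration:

```cpp
struct Color { float r, g, b; };

// Step 1: select the glow source using the per-pixel mask written
// during the geometry pass (1.0 = full bright, 0.0 = not emissive).
Color MaskedEmissive(Color scene, float glowMask)
{
    return { scene.r * glowMask, scene.g * glowMask, scene.b * glowMask };
}

// Step 2 (after blurring the masked image): additively blend the
// blurred glow back over the scene. Non-emissive pixels are
// untouched because their mask was zero.
Color CompositeGlow(Color scene, Color blurredGlow)
{
    return { scene.r + blurredGlow.r,
             scene.g + blurredGlow.g,
             scene.b + blurredGlow.b };
}
```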

Here are some screenshots with the glow, which I hope show how it enhances the original intent rather than drastically changing the look and feel. I've shown some of these screenshots before in the Image of the Day.

before (with older light settings, another extended feature)


Other screenshots of the glow effect.

Alpha Demo!

I've finally released the first alpha demo! The demo requires the full version of Dark Forces, since it uses the game's data. In a later post I'll talk about the technical challenges faced with getting the game this far and future developments.

If you want to give the demo a try, you can view the instructions and downloads here:

Catching up....

Here is the second dev post as I catch up to my current progress. :) It will be especially long, but it should bring us up to where I am now. Any comments are welcome.

3DO objects are loading and rendering. Solid-color polygons, textured polygons and point rendering are all supported. I had to cheat to view the ship, since I don't have VUEs working yet: Kyle's ship starts in a sector you can never get to, so I had to turn off sector rendering and fly over there to get a screenshot.

3DO is a text-based model format used by Dark Forces (and other games such as Jedi Knight). The format supports subobjects, each of which can reference a texture. In addition, either flat shading (with or without a texture) or gouraud shading can be specified. The format supports triangles, quads and point rendering, all of which are used by Dark Forces.

For these objects I create a vertex/index buffer for each and store the surface color data in the vertices. Models can have subobjects that animate (rotation and/or translation), but skinning is not supported.

The "INF" system has also been implemented. "INF" serves a similar function to Doom's sector and linedef flags, but allows for much more flexibility - you can think of it as a primitive scripting language. Each level has an "INF" file that essentially acts as the level script. It works on the concept of "triggers", which can be adjoins (a wall that connects two sectors, essentially a 2D portal) that fire when crossed or nudged (used), or sectors themselves, which fire when entered, left or landed on. The other major objects are "elevators", which have multiple "stops" where actions can occur, such as moving floor or ceiling heights, playing sounds, sending messages to other elevators or triggers, completing a mission goal, displaying text, etc. Since Dark Forces supports VUEs (animated in-game sequences), moving floors/ceilings, moving subsectors and even rotating subsectors, a lot can be accomplished with this system.
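To make the elevator/stop idea concrete, here is a toy model (entirely my own structure, just to illustrate the concept - the real INF format supports many more actions per stop):

```cpp
#include <cstddef>
#include <vector>

// An INF-style elevator: a list of stops, each a target floor
// height. Each update moves the floor toward the current stop;
// on arrival it advances to the next stop. (Illustrative only;
// real stops also play sounds, send messages, hold for delays, etc.)
struct Elevator
{
    std::vector<float> stops;
    size_t current = 0;
    float floorHeight = 0.0f;
    float speed = 1.0f;          // height units per second

    // Returns true when the current stop has just been reached.
    bool Update(float dt)
    {
        float target = stops[current];
        float step = speed * dt;
        if (floorHeight < target)
            floorHeight = (floorHeight + step < target) ? floorHeight + step : target;
        else if (floorHeight > target)
            floorHeight = (floorHeight - step > target) ? floorHeight - step : target;

        if (floorHeight == target)
        {
            current = (current + 1) % stops.size();  // advance to next stop
            return true;
        }
        return false;
    }
};
```

Arriving at a stop is where the real system would fire off its actions - playing a sound, sending a message to another elevator or trigger, completing a goal, and so on.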

All the doors, elevators, switches, etc. run through this system, making level design much more flexible than in the original Doom. Here are some shots of various working doors, switches and elevators:

The 2D and 3D sound system is now "finished." The player makes jumping/landing sounds; shooting (and hitting) makes sounds; punching makes sounds (with separate hit and miss sounds); picking up items makes sounds; so does hitting/killing enemies, etc. This, of course, includes playing sounds from logic scripts. All the elevator sounds and pages are implemented as well.

I've also replaced Dark Forces' hardcoded logics with logic scripts, which can be combined just like in the original. Logics are linked to objects in the object definition file (".O"), which decouples objects from their logic and allows multiple logics to run on the same object, in order. This is great for code reuse and flexibility.

All logics are currently defined in script files, which are loaded, parsed and compiled at runtime - allowing easy and fast changes with no more work than simply editing a text file. Currently there is a CoreLogics file that contains all the "core DF logics." Modders will be able to specify their own script files to add or override core behaviors.

There is a set of functions for each logic; for example, the Storm1 logic has L_Storm1_Setup(), L_Storm1_Update(), L_Storm1_AcceptMessage(), etc. When a logic is read from the level's ".O" file, the engine immediately creates the logic and attempts to link it to the properly named functions in the script file(s). If no function is found, execution of that stage is simply skipped - which may be desirable, since not all logics need to implement all possible functionality.

So to put it simply, to create a new logic simply define a new name in the .O file, such as Logic: Storm2, and setup the functions for that logic in the script file. That's it. It literally takes seconds to add new logics if the functionality is simple :)

Here's an abbreviated sample:

/* Setup logic message masks; this determines which messages
   this logic is interested in. */
void L_Storm1_SetupLogic()
{
    // Example setup, from the Storm1 logic.
}

/* Setup the object that uses this logic; sets up HP,
   object flags and other things. */
void L_Storm1_SetupObj()
{
    // Example setup, partial listing from Storm1.
    obj_HP = 30;
}

/* Main update, called every frame if the object is activated
   or in an active sector. */
void L_Storm1_Update()
{
    // Partial listing from Storm1, near the end.
    // Most functionality is left out for space.
    float ground_height = Map_GetFloorHeight();
    if ( obj_loc_z > ground_height )
        obj_loc_z -= 0.1f;
    if ( obj_loc_z < ground_height )
        obj_loc_z = ground_height;
}

/* Messages that this logic can receive. Includes things like
   damage and sound/player notifications. */
void L_Storm1_SendMsg()
{
    // Shows the general structure from Storm1, but the
    // implementation is left out for space.
    if ( obj_Alive == 1 )
    {
        switch ( obj_uMsg )
        {
            case MSG_DAMAGE:    // message name shown for illustration
                obj_HP -= msg_nVal;
                if ( obj_HP <= 0 )
                {
                    obj_HP = 0;
                    obj_Alive = 0;
                    //...more stuff...
                }
                break;
            //...state/anim stuff...
            //...similar to above...
        }
    }
}

For those interested, I actually use AngelScript for the scripting backend. But the end user just has to edit these files; everything else is taken care of automatically. The goal is to make modding easy, without having to worry about compilers, getting pointers or handles to objects, etc.
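Host-side, the name-based linking can be sketched generically like this (using plain std::function instead of AngelScript function handles; all names are illustrative):

```cpp
#include <functional>
#include <string>
#include <unordered_map>
#include <utility>

// Maps "L_<Name>_<Stage>" to a callable, mimicking how a logic's
// functions are resolved by naming convention when the logic is
// created. Missing stages simply aren't called. (Sketch only.)
class LogicRegistry
{
public:
    void Register(const std::string& name, std::function<void()> fn)
    {
        m_funcs[name] = std::move(fn);
    }

    // Returns false if the stage isn't implemented; execution of
    // that stage is then skipped, which may be desirable.
    bool Call(const std::string& logic, const std::string& stage)
    {
        auto it = m_funcs.find("L_" + logic + "_" + stage);
        if (it == m_funcs.end())
            return false;
        it->second();
        return true;
    }

private:
    std::unordered_map<std::string, std::function<void()>> m_funcs;
};
```

Calling a stage that was never registered simply returns false and is skipped, matching the behavior described above - so a new logic only needs to provide the functions it actually uses.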

So now I'm reworking the visibility/portal system, which has unique issues in Dark Forces that don't exist in modern games. I'll be talking about that, and about the first Alpha Demo, in my next post.


As promised, my first dev post... expect more later :)

I've been working on this project for a little while, writing a hardware-accelerated version of Dark Forces from scratch. It's still in fairly early form, but in the near future I will be releasing an Alpha Demo featuring the first level of the game, so I wanted to make this known before I get to that point.

The game requires the original Dark Forces data. It will allow you to play through the complete game, and I'm planning on making sure it works with most (or all) of the released levels and mods.

The point of this project is to allow people to play Dark Forces in high resolution on Windows, in either "classic" mode (no new features) or "extended" mode. All modes will benefit from newer control methods like mouse look, as well as higher resolutions and color depths. The extended version will get new features such as real-time lighting, slopes, dual adjoins, vertical adjoins, better model support (MD2 and/or MD3) and more.

Enough rambling; here are some early screenshots of the first level. The following things are already functioning properly: the HUD (though I need to replace the fonts with Dark Forces fonts), controls (walking, running, jumping, strafing, mouse look, etc.), objects (including collecting weapons and powerups), collision and "physics", inventory, ammo and weapons, lighting, the headlamp/IR goggles, and more.

Screenshots are resized by Photobucket...


I decided to start a journal on GameDev to talk about the progress of a project called DarkXL. I've been working on it for a while, so in order to catch up I'll be making several posts that talk about the development so far...