Hey,
So, I'm using Unity to make a small FPS demo. A while ago I wrote the accuracy values for my weapons, and I figured I'd make accuracy work like this: the crosshair's size in pixels is (100 - Accuracy). So 98 accuracy is a 2-pixel crosshair (I know that's tiny, but still).
But now I'm wondering if maybe I should be doing it a different way. I'm trying to make firing work so that it raycasts to a random position within your crosshair, then creates a bullet in front of your gun, and that bullet's position is smoothed toward the point where the raycast hit over roughly 0.1 seconds.
However, I'm not sure if the accuracy will remain covering the same amount of game-space regardless of your resolution. Should I be doing a percentage of your screen 'area' instead of raw pixels as the size of the crosshair? Should I just use game units instead of pixels? What's the most common way for this to be done, and is that the best way?
Any help would be appreciated,
Thanks!
FPS "Accuracy" value that affects the size of your crosshair
I think the best way would be to stay resolution-agnostic: use game units. Offset the path of each bullet by a random angle, using your accuracy metric (weapon precision, player's current speed, etc.) to control the size of the offset. You'd then work backwards into screen space to get the final size of the crosshair, which I suppose should be the angular diameter of the area in which the bullet might hit.
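To make the "work backwards into screen space" step concrete, here's a rough sketch (in Python, outside Unity) of projecting a spread half-angle into a pixel radius. It assumes a standard symmetric perspective camera with a known vertical FOV; the function name and parameters are my own, not anything from the engine:

```python
import math

def crosshair_radius_px(spread_deg, screen_height_px, vertical_fov_deg):
    """Screen-space radius (pixels) of the cone a shot can deviate into.

    spread_deg is the half-angle of the cone of fire; the math assumes
    a symmetric perspective projection with a vertical field of view.
    """
    half_fov = math.radians(vertical_fov_deg) / 2.0
    half_spread = math.radians(spread_deg)
    # Distance from the eye to the projection plane, in pixel units.
    plane_dist = (screen_height_px / 2.0) / math.tan(half_fov)
    return plane_dist * math.tan(half_spread)
```

For example, a 1-degree spread on a 1080-pixel-tall screen with a 60-degree vertical FOV comes out to roughly a 16-pixel crosshair radius, and it stays the same angular size at any resolution because the pixel count scales with the screen height.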
Rather than approaching accuracy this way, why not approach it from the user's perspective? To the user, the "accuracy" of the weapon equates to how far away from an enemy they can be for all rounds aimed at the target to hit a part of the target. Weapons have a cone of fire within this accuracy range. So, if I aim perfectly at the center of the torso of a guard standing 100m away, a weapon which will ensure that every round I fire will hit the guy's torso is accurate to 100m.
If your game has a headshot mechanic, then you could instead use the head of the average target as the accuracy figure: a weapon accurate to 100m means that up to 100m away, any shot aimed perfectly (i.e. without user error) at the centre of a target's head will hit its head.
You could use "minute of arc" (MoA) as your input accuracy value; it's what rifle manufacturers use to measure their accuracy: http://en.wikipedia....of_arc#Firearms
This is far better than a value which has mathematical meaning to the programmer but is totally meaningless to the player, and which will force you to use trial and error while balancing the accuracy of your game's weapons. What you then do is add to this value (high values are worse) if the player is moving, subtract from it if they are crouched, etc.
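The MoA idea converts to game units with a little trigonometry (one minute of arc is 1/60 of a degree, which works out to roughly 2.9cm at 100m). A sketch in Python, with made-up function names, of both directions of that conversion:

```python
import math

MOA_IN_RADIANS = math.radians(1.0 / 60.0)  # one minute of arc

def spread_radius_m(moa, distance_m):
    """Radius of the circle shots can land in at distance_m, for a
    cone of fire of +/- moa around the aim direction."""
    return distance_m * math.tan(moa * MOA_IN_RADIANS)

def moa_for_target(target_radius_m, distance_m):
    """Inverse: the MoA figure that makes every perfectly aimed shot
    hit a target of the given radius at the given distance."""
    return math.atan2(target_radius_m, distance_m) / MOA_IN_RADIANS
```

So a weapon that must always hit a 25cm-radius torso at 100m needs to be accurate to about 8.6 MoA, and you'd then add to that figure while moving, subtract while crouched, and so on, exactly as described above.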
[edit]
Sorry, I misunderstood the intent of the OP. Still, my answer to this is simple: make the gap in the crosshairs range between 0 MoA (crosshair totally closed) and an arbitrary value somewhere in the high MoAs (say, 10 MoA), and use your currently calculated MoA to interpolate between those two positions.
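That interpolation can be sketched like so (Python for illustration; max_moa and max_gap_px are arbitrary tuning values, as the edit above says):

```python
def crosshair_gap_px(current_moa, max_moa=10.0, max_gap_px=64.0):
    """Map the currently calculated MoA (0 = perfectly accurate) onto
    a crosshair gap between fully closed and an arbitrary max size.
    max_moa and max_gap_px are made-up tuning constants."""
    t = max(0.0, min(current_moa / max_moa, 1.0))  # clamp to [0, 1]
    return t * max_gap_px
```

Anything at or past max_moa just pins the crosshair fully open, which keeps extreme modifiers (sprinting + jumping, say) from blowing the gap off-screen.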
So, I took your advice, johnchapman, and went with game units:
// Ray through the centre of the viewport.
var ray : Ray = movement.CharacterCamera.ViewportPointToRay(Vector3(0.5, 0.5, 0.70));
var hit : RaycastHit;
weaponry.Text.transform.position = Vector3(0.5, 0.5, 1);
if (Physics.Raycast(ray, hit))
{
    // Random spread of up to +/-2.2 world units on Y and Z (left/right, up/down).
    var offset : Vector3 = Vector3(0, Random.Range(-2.2, 2.2), Random.Range(-2.2, 2.2));
    pos = hit.point + offset;
    // Spawn the bullet at the gun; its own script moves it to 'pos'.
    weaponry.currentProjectile = Instantiate(weaponry.Bullet, transform.position, transform.rotation);
}
That pretty much just raycasts to the center of the screen, then offsets the hit point by a random number from -2.2 to 2.2 game units on both its Y and Z (left/right and up/down). It then creates a bullet, which has its own code that moves itself to 'pos' over 0.1 seconds. It works very well and offsets shots evenly regardless of how far you are from your target (I tested on a wall). Of course, I'm going to change 2.2 to something based on your accuracy value in the near future.
However, now I'm wondering how to find the proper calculation for sizing the crosshair appropriately based on your accuracy. I could figure out what percentage of your screen (in pixels) the maximum offset covers by using one of the WorldToScreenPoint functions (which translate world space into screen space), then size the width/height at double that value...? Think it'd work, or have any suggestions?