Bombshell93

Members
  • Content count

    207
  • Joined

  • Last visited

Community Reputation

245 Neutral

About Bombshell93

  • Rank
    Member
  1. So I've made a second attempt at a CV. This time I've tried to focus on what I can demonstrate: skills-wise I have stuck to examples of how I've used my skills, as opposed to what skills I suppose I have. It feels somewhat empty; I only have 1 piece of accompanying material at the moment demonstrating what I've actually made, but I have 3 game projects demonstrable in UE4 and 2 demonstrations in Unity, 1 for mobile and 1 for PC, plus a tutorial game for Game Maker. I made that one to explain to my class the basic concepts behind development in Game Maker and how they can approach various problems; as an added bonus it feels quite nice to control. http://i.imgur.com/qcB1De2.gif I really need to get a more impressive project in my portfolio and available to show; I have plenty of documentation, but my on-hand projects are kind of lackluster. Comments and criticisms are greatly appreciated. Thanks, Scott R Howell
  2. A note on the role: ideally I'd be a graphics programmer, and I'd be happy as an engine or tools programmer, but I'll accept any offer of a programming role. I don't expect miracles; it's my first job in the industry after all, so any programming role to garner the experience is my goal. Any decent CV examples or sources that come to mind? I've so far been given CVs meant for a single page, CVs closer to booklets, trim and documentation-like CVs, and pretty, heavily designed CVs; the only consistent information I see is name and contact details.
  3. A side step for a moment here to thank you all for the advice, by the way. @Gibbon, that seems about my best bet at the moment. I'd hoped that my Game Design course would help with that, but I've been working with teams who are programming-illiterate and with tools where visual scripting is mandatory; the result has been technically simple projects with little if any programming of any kind. In terms of a game to put in a portfolio, is it worth spending the time to make it in a programming-heavy engine, or to make it from scratch, or would it be more impressive to spend more time polishing it in an existing engine (such as Unity, for example)?
  4. Just wrote this out and the page crashed; happy thoughts, happy thoughts, happy thoughts. A quick word on the portfolio piece: it's pants, I know it is. It's so far 4 days' worth of work, 1 of which was writing out the maths library; the rest has been establishing a good workload distribution and communication method for JavaScript's workers. Laying out stuff I've actually done, I'm not sure how to present it in a CV. Summing up what I've done (split up as early, learning, and recent):

Early: a handful of half-complete Game Maker games (where the interest started), a few toolsets, and being part of a collaborative MMORPG project in Game Maker using 39dll. I then moved on to XNA, where I made a handful of unpublished games: FPSs, RPGs, platformers, mostly generic representations of genres. I also got interested in graphics programming, making my first deferred renderer in XNA, and I made baby's first 2D physics engine in Java along with an unpublished 2D side-scrolling platformer.

Learning: I wrote rendering pipelines in C++ and C#, using OpenGL and DirectX, covering asset management, mesh batching, atlas batching, light propagation volumes, morph animation, skeletal animation, reflective materials, material editors, particle engines (because there's always at least 1 particle engine), voxel terrain, boxel terrain, height-map terrain editors, point cloud rendering, z-prepassing, tangent mapping, parallax mapping, displacement mapping, tiled light sorting, shadow mapping, ambient occlusion, bloom, micro-faceted light approximation, etc. I attempted to make a few engines, where I expanded my programming knowledge tackling multithreading, game object relationships, networking, low-latency game loops, and loading and saving files of various types, proprietary and existing. I wrote a few 3D physics engines which grew increasingly complex, but I rarely used them for anything outside of a few playful experiments. I also spent a lot of time developing my skills as a 2D and 3D artist.

Recent: talking about the past 3-4 years, wherein I decided this is what I wanted to do as my career. At this point I became far more confident in my ability to self-teach through issues and to learn new languages, tools and programming subject matter. As for things I did: I made a handful of Unity and Unreal games, becoming particularly familiar with Unity, and I continued my experiments with game engine and tool development while applying for my current Game Design degree. The course has seen me make design documentation, understand the design process for gameplay far better, get invested in the academia, make a handful of more complete though unimpressive games, and become far more aware of networking opportunities in the games industry. I've also written 3 academic papers, the third of which is being expanded for my dissertation: "What can we learn from Super Mario 64 about Game Feel and its Components", "Is Design For Intuition An Inherent Advantage Of Video Games", and "The Professional Utility Of Game Jams".

tl;dr? I've made a lot of half-finished games and have invested a lot of time into expanding my knowledge of graphics programming and games programming, but most of these projects have been scrapped between computers, and I'm currently going through a degree in game design where chances for programming portfolio pieces are not as abundant. How could I best convey that I have this experience and am capable of using it, in a way that would interest an employer?

EDIT: In terms of time, I have a year of university left, during which I intend to be on the hunt, and after university I'd hope to either have a job lined up or a good lead on the hunt.
  5. Hi, my name is Scott R Howell; I am a programmer / game designer just entering the industry. I've not been on this site in 3 years, but the community was a great help during my stupid years learning programming and the technical side of game development. In my time away I've been putting myself through a Game Design degree, of which I'm approaching my final year, and over the course of it I've become convinced I am a capable programmer, trading conversation with some of the university's tutors and while networking at development events. ORIGINAL: EDIT: So I've made a second attempt at a CV. This time I've tried to focus on what I can demonstrate: skills-wise I have stuck to examples of how I've used my skills, as opposed to what skills I suppose I have. Any and all comments, criticisms and opinions are welcome and greatly appreciated. Thank you for your time, Scott R Howell (Bombshell)
  6. [SOLVED] MinGW, Glut, Glew problems

    I've opened msys and I have the source for GLEW, but I've not used command-line build tools before (at least not without an IDE between me and them), and Google isn't helping, as each page I find assumes the reader has used such tools before; I haven't a clue what I'm doing, to be honest. EDIT: Sorted, thanks for the help guys. For those unaware or in need of the help, using MSYS: open msys, type cd "Path to Makefile", then make. A lot easier than I was making it.
  7. [SOLVED] MinGW, Glut, Glew problems

    Well, I've now tried every lib and dll file available from the GLEW pre-compiled binaries (x86 and x64). The only thing giving me a different error (which, by the way, is also the error I get when I don't link any GLEW libraries at all) is Win32/glew32s.lib:

21:10:10 **** Incremental Build of configuration Debug for project Sample ****
Info: Internal Builder is used for build
g++ -static-libgcc -static-libstdc++ "-LC:\\MinGW\\lib\\Release\\win32" -o Sample.exe main.o -lopengl32 -lglu32 -lfreeglut -lglew32s
Warning: corrupt .drectve at end of def file
C:\MinGW\lib\Release\win32/glew32s.lib(tmp/glew_static/Release/Win32/glew.obj):(.text$mn+0x7): undefined reference to `_imp__wglGetProcAddress@4'
C:\MinGW\lib\Release\win32/glew32s.lib(tmp/glew_static/Release/Win32/glew.obj):(.text$mn+0x4): undefined reference to `_imp__wglGetProcAddress@4'
c:/mingw/bin/../lib/gcc/mingw32/4.8.1/../../../../mingw32/bin/ld.exe: C:\MinGW\lib\Release\win32/glew32s.lib(tmp/glew_static/Release/Win32/glew.obj): bad reloc address 0x4 in section `.text$mn'
c:/mingw/bin/../lib/gcc/mingw32/4.8.1/../../../../mingw32/bin/ld.exe: final link failed: Invalid operation
collect2.exe: error: ld returned 1 exit status

As for the DLLs: with x64/glew32.dll the build says it is an unknown file type and stops immediately, and with Win32/glew32.dll I get the same output as above.
    Hi, I'm trying to get started with OpenGL in C++, but I'm having constant issues with it. For an IDE I have chosen Eclipse, with MinGW as my toolchain. While using MinGW and GLUT works fine, I want to make use of the greater functionality OpenGL offers, such as VBOs, shader programs, etc. All signs seem to point to GLEW being the best option, with some sources claiming it's my only option, but for the life of me I keep running into errors. At the moment this is how the console reports; I am using a MinGW-built library, but I've also tried the DLL as a library and the pre-built libraries from the GLEW SourceForge:

18:15:36 **** Incremental Build of configuration Debug for project Sample ****
Info: Internal Builder is used for build
g++ -static-libgcc -static-libstdc++ -o Sample.exe main.o -lopengl32 -lglu32 -lfreeglut -lglew32
main.o: In function `ZN4MeshC2Ev':
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:4: undefined reference to `__glewGenBuffers'
main.o: In function `ZN4MeshD2Ev':
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:11: undefined reference to `__glewDeleteBuffers'
main.o: In function `ZN4Mesh4FillEP6VertexPjjjj':
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:29: undefined reference to `__glewBindBuffer'
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:31: undefined reference to `__glewBufferData'
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:32: undefined reference to `__glewBindBuffer'
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:34: undefined reference to `__glewBufferData'
main.o: In function `ZN4Mesh4DrawEv':
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:37: undefined reference to `__glewBindBuffer'
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:38: undefined reference to `__glewBindBuffer'
main.o: In function `main':
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:62: undefined reference to `glewInit@0'
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:65: undefined reference to `glewGetErrorString@4'
C:\Users\Bombshell\EclipseCpp\TTEngine\Sample\Debug/../main.cpp:67: undefined reference to `glewGetString@4'
collect2.exe: error: ld returned 1 exit status
18:15:36 Build Finished (took 127ms)

and here are the files:

[spoiler]
header.h

#ifndef SAMPLE_HEADER_H
#define SAMPLE_HEADER_H

#include <GL/glew.h>
#include <GL/glut.h>
#include <stdio.h>
#include <windows.h>

struct Vertex
{
public:
    float px, py, pz;
    Vertex() { px = py = pz = 0; }
    Vertex(float pX, float pY, float pZ) { px = pX; py = pY; pz = pZ; }
};

class Mesh
{
private:
    unsigned int* bufferObjects;
    Vertex* vertexData;
    unsigned int* indexData;
    unsigned int vertexCount;
    unsigned int indexCount;
public:
    Mesh();
    ~Mesh();
    void Fill(Vertex* vertices, unsigned int* indices, unsigned int vertCount, unsigned int indCount, GLenum usage);
    void Draw();
};

#endif /* HEADER_H_ */

main.cpp

#include "header.h"

Mesh::Mesh()
{
    glGenBuffers(2, bufferObjects);
    vertexData = 0;
    indexData = 0;
    vertexCount = 0;
    indexCount = 0;
}

Mesh::~Mesh()
{
    glDeleteBuffers(2, bufferObjects);
    delete bufferObjects;
    bufferObjects = 0;
    delete vertexData;
    vertexData = 0;
    delete indexData;
    indexData = 0;
}

void Mesh::Fill(Vertex* vertices, unsigned int* indices, unsigned int vertCount, unsigned int indCount, GLenum usage)
{
    if (vertexData != 0) delete vertexData;
    vertexData = vertices;
    if (indexData != 0) delete indexData;
    indexData = indices;
    vertexCount = vertCount;
    indexCount = indCount;
    glBindBuffer(GL_ARRAY_BUFFER, bufferObjects[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * vertexCount, vertexData, usage);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, bufferObjects[1]);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(unsigned int) * indexCount, indexData, usage);
}

void Mesh::Draw()
{
    glBindBuffer(GL_ARRAY_BUFFER, bufferObjects[0]);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, bufferObjects[1]);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), ((void*) 0));
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, ((void*) 0));
    glDisableClientState(GL_VERTEX_ARRAY);
}

void display()
{
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_QUADS);
    glColor3f(1.0f, 0.0f, 0.0f);
    glVertex2f(-0.5f, -0.5f);
    glVertex2f(0.5f, -0.5f);
    glVertex2f(0.5f, 0.5f);
    glVertex2f(-0.5f, 0.5f);
    glEnd();
    glFlush();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("OpenGL Setup Test");
    GLenum err = glewInit();
    if (GLEW_OK != err)
    {
        /* Problem: glewInit failed, something is seriously wrong. */
        fprintf(stderr, "Error: %s\n", glewGetErrorString(err));
    }
    fprintf(stdout, "Status: Using GLEW %s\n", glewGetString(GLEW_VERSION));
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
[/spoiler]

As far as C++ goes I'm fairly new; I've used it in the past for basic DirectX projects, like a model viewer, but not extensively and not with any issues like this which weren't already heavily documented. I've tried to Google all I can but I just can't seem to get around it. Any and all help would be greatly appreciated. Thanks for reading, Bombshell
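For anyone landing here with the same errors, a configuration sketch that has worked for MinGW + GLEW setups; treat it as an assumption-laden sketch rather than a verified fix. The `undefined reference to __glew*` errors usually mean the GLEW library isn't being linked at all (or is placed before the object files), and the later `_imp__wglGetProcAddress` / bad-reloc errors are consistent with feeding MinGW's linker an MSVC-built .lib, which it generally cannot consume — building GLEW with MinGW via MSYS (as the solved thread above does) sidesteps that.

```cpp
// Configuration sketch (assumptions, not a verified fix):

#define GLEW_STATIC      // needed when linking the *static* glew32s library;
                         // omit it when linking against the glew32 DLL import lib
#include <GL/glew.h>     // glew.h must be included before glut.h / gl.h
#include <GL/glut.h>

// Link the GLEW library after your object files and before the GL libraries
// it depends on, since MinGW's ld resolves symbols left to right, e.g.:
//
//   g++ main.o -lglew32s -lfreeglut -lopengl32 -lglu32 -o Sample.exe
//
// and use a MinGW-built libglew32.a (MSYS + make); in my experience the
// MSVC-built .lib files from the SourceForge binaries won't link under MinGW.
```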
  9. I wasn't entirely sure where to put this, but I need a concept check. I've been bending my mind around the concept of 4 spatial dimensions and what it would mean in a gaming environment. After some thought I managed to conceive of 4 spatial dimensions; my friend got lost when I was trying to explain it, so I'll try to keep it clean so you get the concept before I pose my question.

A 1D line is a series of possible states of a 0D point, from the line's lower limit / start to its upper limit / end. A 2D shape is a series of possible states of a 1D line. A 3D shape is a series of possible states of a 2D shape. So, by extension, a 4D area is a series of possible states of a 3D shape.

So if I were to take a 4D area with the 4th dimension ranging from 0 to 1, am I right in saying that, if W is my 4th dimension, I may accurately represent the concept of 4 spatial dimensions by lerping 3D points between a position regarded as W of 0 and a position regarded as W of 1? If so, am I also right in saying that for a 4th dimension with a range exceeding 1, the 3D points would follow a curve defined by the points' positions per W unit? To my understanding these points would actually represent the shape of 3D space, as opposed to the objects within 3D space, meaning an object moving at, for example, 10 mph in W0 may in W1 be moving at 100 mph, and then in another location 10 mph in W1 could be 100 mph in W0, via relative stretching and squeezing. I hope I didn't just confuse myself just to sound like an idiot XD. I figured the idea of 4 dimensions in which to interact would be an interesting mechanic, specifically for puzzlers, but I suppose the concept in other genres would allow for interesting 4-dimensional level design, literally adding a new depth to the game.

Regarding the title about interactivity: the idea is that just as WASD have you moving in X and Y, I'd imagine Q and E moving you through W. Only being able to see the world in 3 dimensions, it would just look like things being warped, when, if my concept is right, it would be movement through a 4th dimension. Thanks for reading. Any and all information, examples or documentation on this concept or similar concepts would be greatly appreciated. Bombshell
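The "series of possible states" construction above can be sketched directly in code. This is a toy illustration under the post's own assumption of a linear W range of [0, 1]; the names Vertex4D and sliceAtW are invented for the example.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// A "4D vertex" in the post's sense: its 3D state at W = 0 and at W = 1.
struct Vertex4D { Vec3 atW0, atW1; };

// Take the 3D slice of the 4D object at a given W by lerping each
// vertex between its two recorded states.
Vec3 sliceAtW(const Vertex4D& v, float w) {
    return { v.atW0.x + (v.atW1.x - v.atW0.x) * w,
             v.atW0.y + (v.atW1.y - v.atW0.y) * w,
             v.atW0.z + (v.atW1.z - v.atW0.z) * w };
}
```

A W range beyond 1 would, as the post suggests, want more than two recorded states, with the slice following a curve (a spline through per-W-unit keyframes, say) rather than a single lerp.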
  10. Depth pre pass worth it ?

    Well, I imagine each model would have a limited number of textures, so as long as they're not 1024x or bigger you could merge them into a big texture and have a UV offset as part of the instance, but this is limited by the texture size and count. Texture arrays I've had no experience with, but this might be a good starting point: http://www.rastertek.com/dx10tut17.html I used his tutorials a while ago and they were fairly easy to follow, so I'll go out on a limb and say this one probably follows the trend. As for how useful texture arrays are, I've heard of them but, again, I've never used them myself, so I don't quite know. And to put what I initially said in a nutshell: replace the diffuse and specular buffers with a MaterialID / UV buffer and light the screen per material, sampling the diffuse and specular from textures passed in by the material. That pretty much sums it up in a much less ballsed-up way.
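The per-instance UV offset idea could be sketched as follows, assuming the merged texture is divided into equal square cells; atlasUV, cellIndex and cellsPerRow are illustrative names, not from any particular engine.

```cpp
#include <cassert>
#include <cmath>

struct UV { float u, v; };

// Remap a mesh's local [0,1] UV into the atlas cell its texture occupies;
// each instance would carry cellIndex (or the resulting offset directly).
UV atlasUV(UV local, int cellIndex, int cellsPerRow) {
    float cellSize = 1.0f / cellsPerRow;
    float offsetU = (cellIndex % cellsPerRow) * cellSize;  // column within the atlas
    float offsetV = (cellIndex / cellsPerRow) * cellSize;  // row within the atlas
    return { offsetU + local.u * cellSize,
             offsetV + local.v * cellSize };
}
```

A real atlas would also want padding between cells (and care with mipmaps) to avoid filtering bleed across neighbouring textures.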
  11. Depth pre pass worth it ?

    Materials with deferred? How about an 8/16-bit (depending on your needs) MaterialID buffer (to make room you can throw out the diffuse and specular buffers and replace them with UVs). Divide the screen into a 16x16 grid (or whatever size; fiddling with it you'd probably find a better resolution) of partial screen quads. In whatever way you see fit, for each material fill a list of which cells contain it, and fill each material's instance buffer. For each shader, for each material that uses that shader, set the shader parameters from the material and draw the partial screen quads from the material's instance buffer (I'm wording this like crap, but I hope you understand). In the shader, which as well as the material's parameters has been passed the material's ID, if the pixel is not the current material, return junk; if the material is the right one, get the diffuse and specular via the texture the material passed and the UV buffer. Normals you should have in a buffer, so you don't have to transform the normals out of tangent space every time. The rest should explain itself. I've probably spoken gibberish or gotten something horribly broken; I've not tested this, it pretty much came to me on the spot.
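The "for each material, fill a list of what cells contain it" step might look like this on the CPU side, assuming each material's screen coverage has been reduced to a bounding box in normalized [0,1] screen coordinates. All names here are mine and untested against any real renderer.

```cpp
#include <cassert>
#include <vector>

// List the cells of a gridDim x gridDim screen grid overlapped by a
// bounding box given in normalized [0,1] screen coordinates.
std::vector<int> touchedCells(float minX, float minY,
                              float maxX, float maxY, int gridDim) {
    auto toCell = [gridDim](float t) {
        int c = static_cast<int>(t * gridDim);   // which cell this coordinate falls in
        if (c < 0) c = 0;
        if (c >= gridDim) c = gridDim - 1;       // clamp to the screen
        return c;
    };
    std::vector<int> cells;
    for (int y = toCell(minY); y <= toCell(maxY); ++y)
        for (int x = toCell(minX); x <= toCell(maxX); ++x)
            cells.push_back(y * gridDim + x);    // row-major cell index
    return cells;
}
```

Each listed cell would then get an entry in that material's instance buffer, so only the partial screen quads the material can actually appear in get drawn.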
  12. I think I'm still suffering from haloing and some artifacts on spheres causing dark / light rings, but I'm still over the moon that I've got it working to some degree. SSAO baffled me until a moment of doing nothing and it fit itself together! Best feeling in the world is jumping over a programming hurdle. Old version:

[spoiler]
#define SAMPLECOUNT 8

float PI = 3.14159265f;
float4x4 WVP;  //View Projection Matrix
float4x4 WVPI; //Inverse View Projection Matrix
float3 sampleVectors[SAMPLECOUNT];
float sampleRange;
float depthBias = 0.002f;
float2 GBufferSize;

sampler NormalSampler : register(s2);
sampler NoiseSampler : register(s3);

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float2 UV : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float2 UV : TEXCOORD0;
    float4 ProjectPos : TEXCOORD1;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    output.Position = input.Position; //using a Full Screen Quad which does not require projection
    output.UV = (output.Position.xy / output.Position.w) * float2(0.5f, -0.5f) + 0.5f; //get UV for GBuffers via screen position
    output.UV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f; //half pixel offset
    output.ProjectPos = output.Position;
    return output;
}

float unpack(float2 packed)
{
    const float2 conversion = float2(1.0f, 1.0f / 256.0f);
    return dot(packed, conversion);
}

float3 decode(float2 enc)
{
    float4 nn = float4(enc, 0, 0) * float4(2,2,0,0) + float4(-1,-1,1,-1);
    float l = dot(nn.xyz, -nn.xyw);
    nn.z = l;
    nn.xy *= sqrt(l);
    return nn.xyz * 2 + float3(0,0,-1);
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float4 normalSample = tex2D(NormalSampler, input.UV);
    float noiseSample = tex2D(NoiseSampler, input.UV * (float2(GBufferSize.x, GBufferSize.y) / 8)).r;
    float3 normal = decode(normalSample.xy); //Spheremap Transform normal compression, not my own work, based off a Cry3 implementation
    float depth = unpack(normalSample.zw); //nature of the game does not require a wide depth range so it has been compressed into 16-bit
    float4 position = float4(input.ProjectPos.xy / input.ProjectPos.w, depth, 1); //reconstruct position from depth
    position = mul(position, WVPI);
    position /= position.w;
    float4 output = float4(1,1,1,1);

    float angle = noiseSample * PI * 2; //convert [0-1] range noise into radians
    float cosAngle = 1 - cos(angle);
    float sinAngle = sin(angle);
    float3 unit = normalize(normal); //I used its own variable in case I'd need to change it.
    float3x3 rotationMat = float3x3( //Rotation matrix to rotate the sample vector by angle
        1 + cosAngle * (unit.x * unit.x - 1),
        -unit.z * sinAngle + cosAngle * unit.x * unit.y,
        unit.y * sinAngle + cosAngle * unit.x * unit.z,
        unit.z * sinAngle + cosAngle * unit.x * unit.y,
        1 + cosAngle * (unit.y * unit.y - 1),
        -unit.x * sinAngle + cosAngle * unit.y * unit.z,
        -unit.y * sinAngle + cosAngle * unit.x * unit.z,
        unit.x * sinAngle + cosAngle * unit.y * unit.z,
        1 + cosAngle * (unit.z * unit.z - 1)
    );

    for (int i = 0; i < SAMPLECOUNT; i++)
    {
        //make sure sample vector is within surface normal hemisphere
        float3 sampleVector = dot(sampleVectors[i], normal) < 0 ? -sampleVectors[i] : sampleVectors[i];
        sampleVector *= sampleRange;
        //transform sample vector by angle around normal
        sampleVector = mul(sampleVector, rotationMat);
        //get sample vector's world position > projected position > UV
        float4 samplePosition = mul(float4(position.xyz + sampleVector, 1), WVP);
        float2 sampleUV = (samplePosition.xy / samplePosition.w) * float2(0.5f, -0.5f) + 0.5f;
        sampleUV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f;
        //sample depth
        float sample = unpack(tex2D(NormalSampler, sampleUV).zw);
        //modify final value by dot product
        float mod = dot(normalize(sampleVector), normal);
        //if sample is closer to view than origin calculate occlusion value
        if (sample < depth - depthBias)
            output -= (saturate(1 - ((depth - depthBias) - sample)) * mod) / SAMPLECOUNT;
    }
    return output;
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}
[/spoiler]

EDIT: Okay, so I kept working on it and I've got it looking better, without the bugs too. I made some silly mistakes; correcting them, I've now had to go down to 6 samples, but I'm more than happy with the quality of the occlusion. I'm using a Min BlendState so only the darkest comes through, and I'm running multiple passes, giving a smoother-looking final occlusion. As you can see, the haloing issue isn't as glaring, and the sample normals are now actual normals and not the sample's direction (a mistake on my part), which has fixed the issue with artifacts on rounded surfaces.

#define SAMPLECOUNT 6

float4x4 WVP;  //View Projection Matrix
float4x4 WVPI; //Inverse View Projection Matrix
float3 sampleVectors[SAMPLECOUNT];
float sampleRange;
float depthBias = 0.0f;
float PI = 3.14159265f;
float2 GBufferSize;

sampler NormalSampler : register(s2);
sampler NoiseSampler : register(s3);

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float2 UV : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float2 UV : TEXCOORD0;
    float4 ProjectPos : TEXCOORD1;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    output.Position = input.Position; //using a Full Screen Quad which does not require projection
    output.UV = (output.Position.xy / output.Position.w) * float2(0.5f, -0.5f) + 0.5f; //get UV for GBuffers via screen position
    output.UV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f; //half pixel offset
    output.ProjectPos = output.Position;
    return output;
}

float unpack(float2 packed)
{
    const float2 conversion = float2(1.0f, 1.0f / 256.0f);
    return dot(packed, conversion);
}

float3 decode(float2 enc)
{
    float4 nn = float4(enc, 0, 0) * float4(2,2,0,0) + float4(-1,-1,1,-1);
    float l = dot(nn.xyz, -nn.xyw);
    nn.z = l;
    nn.xy *= sqrt(l);
    return nn.xyz * 2 + float3(0,0,-1);
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float4 normalSample = tex2D(NormalSampler, input.UV);
    float noiseSample = tex2D(NoiseSampler, input.UV * (float2(GBufferSize.x, GBufferSize.y) / 8)).r;
    float3 normal = decode(normalSample.xy); //Spheremap Transform normal compression, not my own work, based off a Cry3 implementation
    float depth = unpack(normalSample.zw); //nature of the game does not require a wide depth range so it has been compressed into 16-bit
    float4 position = float4(input.ProjectPos.xy / input.ProjectPos.w, depth, 1); //reconstruct position from depth
    position = mul(position, WVPI);
    position /= position.w;
    float4 output = float4(1,1,1,1);

    float angle = noiseSample * PI * 2; //convert [0-1] range noise into radians
    float cosAngle = 1 - cos(angle);
    float sinAngle = sin(angle);
    float3 unit = normalize(normal); //I used its own variable in case I'd need to change it.
    float3x3 rotationMat = float3x3( //Rotation matrix to rotate the sample vector by angle
        1 + cosAngle * (unit.x * unit.x - 1),
        -unit.z * sinAngle + cosAngle * unit.x * unit.y,
        unit.y * sinAngle + cosAngle * unit.x * unit.z,
        unit.z * sinAngle + cosAngle * unit.x * unit.y,
        1 + cosAngle * (unit.y * unit.y - 1),
        -unit.x * sinAngle + cosAngle * unit.y * unit.z,
        -unit.y * sinAngle + cosAngle * unit.x * unit.z,
        unit.x * sinAngle + cosAngle * unit.y * unit.z,
        1 + cosAngle * (unit.z * unit.z - 1)
    );

    for (int i = 0; i < SAMPLECOUNT; i++)
    {
        float3 sampleVector = sampleVectors[i];
        //transform sample vector by angle around normal
        sampleVector = mul(sampleVector, rotationMat);
        //make sure sample vector is within surface normal hemisphere
        sampleVector = dot(sampleVector, normal) < 0 ? -sampleVector : sampleVector;
        sampleVector *= sampleRange;
        //get sample vector's world position > projected position > UV
        float4 samplePosition = mul(float4(position.xyz + sampleVector, 1), WVP);
        float2 sampleUV = (samplePosition.xy / samplePosition.w) * float2(0.5f, -0.5f) + 0.5f;
        sampleUV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f;
        //sample depth
        float sample = unpack(tex2D(NormalSampler, sampleUV).zw);
        //modify final value by the sample's own normal
        float mod = 1 - dot(decode(tex2D(NormalSampler, sampleUV).xy), normal);
        //if sample is closer to view than origin calculate occlusion value
        if (sample < depth - depthBias)
            output -= (saturate(1 - ((depth - depthBias) - sample)) * mod) / SAMPLECOUNT;
    }
    return output;
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}

I'm mostly concerned about the haloing artifacts, but the odd occlusion on spheres is a bit worrying too. Any and all comments are greatly appreciated. Thanks for reading, Bombshell
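A side note on the shader's unpack() helper above: it reconstructs depth as hi + lo/256, so the pack side must have split depth into a coarse channel plus a fractional remainder. A C++ sketch of one matching pair follows; this packing is my reconstruction from the unpack code, not necessarily what the engine actually does.

```cpp
#include <cassert>
#include <cmath>

struct Packed { float hi, lo; };

// Split a [0,1) depth so that unpackDepth(p) = p.hi + p.lo / 256,
// the same dot product the HLSL unpack() computes.
Packed packDepth(float d) {
    float scaled = d * 256.0f;
    return { std::floor(scaled) / 256.0f,    // coarse step (high channel)
             scaled - std::floor(scaled) };  // remainder   (low channel)
}

float unpackDepth(Packed p) {
    return p.hi + p.lo / 256.0f;
}
```

Storing each channel in an 8-bit texture quantizes both values, so the round trip is only accurate to roughly 16-bit-ish precision, which is what the comment in the shader is relying on.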
  13. So I figured making another thread would be pointless, as it'd likely die after 1 or 2 replies. I'm adding SSAO, and I have the random sample vectors transforming their way to screen-space UVs fine, but I need to rotate them randomly per pixel so I don't get the ambient occlusion looking like messed-up shadows. From what I understand, I would transform the sample vector into the normal's space, rotate its X and Y by sampling a noise texture, and transform it back into world space, but how would I do this? (If anyone has a link to an article or something, that would be great, as this seems like something I would want to learn for later use too.) [b]EDIT:[/b] I eventually found something for this; though the maths is not as firmly in my head as I'd like, I have managed to make the rotation matrix, and the AO is looking good now.
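The per-pixel rotation described above amounts to an axis-angle rotation about the surface normal, and Rodrigues' formula is the same operation without materializing the matrix. A small C++ sketch of it (the vector type and names are mine):

```cpp
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };

static V3 cross(V3 a, V3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
static float dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Rodrigues' rotation: rotate v about the unit axis n by angle radians.
// v' = v cos(t) + (n x v) sin(t) + n (n . v)(1 - cos(t))
V3 rotateAroundAxis(V3 v, V3 n, float angle) {
    float c = std::cos(angle), s = std::sin(angle);
    V3 nxv = cross(n, v);
    float k = dot(n, v) * (1.0f - c);
    return { v.x * c + nxv.x * s + n.x * k,
             v.y * c + nxv.y * s + n.y * k,
             v.z * c + nxv.z * s + n.z * k };
}
```

Expanding this formula per basis vector is exactly how the axis-angle rotation matrix in the SSAO shader is built; in a shader, though, building the matrix once and reusing it for every sample is the cheaper option.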
  14. Well, the sampler state is set; I just don't think it works on cubemaps properly. I've fiddled around and nothing I do changes this (possibly a limitation of XNA), which is a bit of a pain, as I'd intended to stick to Shader Model 2 but had to move to 3 in order to use texCUBElod.
  15. That did the trick nicely, thanks so much. I thought it was a problem sampling from the GBuffers and that the shader using texCUBE had nothing to do with it.