OculusVR just open sourced RakNet, a C++ multiplatform network library used, for example, by Unity and Gamebryo. It's totally free even for commercial use (BSD license and all patents granted). It's also a great reference for writing your own multiplayer transport layer.
Krzysztof Narkowicz
Member Since 13 Aug 2010 · Last Active Today, 09:23 AM
Topics I've Started
OculusVR acquired and open sourced RakNet
07 July 2014  03:07 PM
Generate random perpendicular unit vector
26 January 2013  04:33 PM
Hi,
I have a unit vector v1 and would like to generate a random perpendicular unit vector v2 with uniform or close to uniform distribution.
A potential solution is to generate a random unit vector and cross it with v1 to get v2. However, this has two issues: the distribution is non-uniform, and for some inputs it generates degenerate vectors, so it requires running in a loop until a valid vector is returned.
My current solution is to generate a random angle and build a rotation matrix M from v1 and two perpendicular vectors. Those vectors are generated by crossing v1 with the axis of its smallest component, and by crossing the result with v1. Finally, I transform the vector [ cos(randomAngle), sin(randomAngle), 0 ] using the rotation matrix M.
It works, but I feel that I'm doing an excessive amount of calculation and that there is a much simpler way to solve it.
Gamma correct gradient
03 October 2012  11:21 AM
I'm wondering why sRGB gradients look "better" than linear space gradients. For example, the gradient from black (0x000000) to white (0xFFFFFF) clearly lacks some dark shades: http://scanline.ca/gradients. I tried checking it on different LCDs, but the sRGB one looks better everywhere. Additionally, the CIELAB gradient looks just like the sRGB one. On the other hand, examples like this one (http://filmicgames.com/archives/354) prove that the color in the middle of the gradient should be ~0.73 and not 0.5. So I'm quite lost here.
Perspective correct depth interpolation
07 September 2010  06:32 AM
I'm writing a depth only software rasterizer and have a few questions about perspective correct depth interpolation.
Vertex position in homogeneous coordinates in clip space: [x, y, z, w]
Vertex position after projection: [x', y', z'] = [x/w, y/w, z/w]
1. Some articles say that you should interpolate z/w and 1/w, and later divide them to calculate the perspective-correct depth. Others (C. Hecker's texture mapper docs) say that you can just interpolate 1/z'. These two approaches lead to two different equations, so which one is correct?
2. If it's OK to interpolate 1/z', then what should I do when z' is 0? For example, a DirectX-style projection matrix returns z' in the [0;1] range. Should I tweak it so z' lies in the (0;1] range, or am I missing something?
Aggregated deferred lighting
13 August 2010  09:21 PM
1st pass: output scene depth
2nd pass: lighting pass. Output light color * light attenuation to the first render target, and light direction * intensity( light color * light attenuation ) to the second render target.
3rd pass: usual scene rendering, which uses the aggregated lights for lighting. Here we can also add a standard forward directional light.
There are three main benefits:
1) You can use almost any lighting model.
2) It's faster. You don't need to render a G-buffer or normals + specular exponent in one pass, and you don't need to encode/decode normals, the exponent, etc.
3) It's decoupled from scene normals (high-frequency details). This means that you could render the lighting at a lower resolution.
I posted more detailed info on my blog.
Sadly I don't have any time to code a demo right now. I'll do it in a week or so and present the results.
What do you think about this? Has anyone done something like this before?
[Edited by  KriScg on August 14, 2010 5:26:30 AM]