
Yann L

Member Since 06 Feb 2002
Offline Last Active Mar 30 2012 02:53 PM

Topics I've Started

Your sixth sense

01 January 2010 - 04:15 PM

Well, not really an additional sense, but almost. And it's kind of useless. But it's a fun little thing to try if you're bored.

Did you know that most people can physiologically sense the polarization of light? Not merely its effects (such as reduced reflections), but the actual plane in which the electromagnetic field oscillates. We apparently have dedicated sensors for this on our retina. Evolution decided we didn't really need them in our everyday lives, so we don't perceive the effect consciously. But with a little training, you can reactivate it. And the precision is quite impressive: you can distinguish linear from circular polarization, and you can (approximately) see the angle of linear polarization, for both the electric and the magnetic field components.

Introducing Haidinger's brush. It's easy to try right there, if you're looking at an LCD screen. After only a few minutes, I could easily see the yellow bar (apparently the magnetic plane of the light wave). The blue bar, the electric field component, takes a bit more time to become clear for me. Due to the lack of a circularly polarized source, I couldn't test that one. I'll try to see the natural polarization of the sky tomorrow at sunset.

So, can you see polarization? And making use of your new sense, how is your TFT polarized? Mine is at about 45 degrees.

Quadbuffered stereo in D3D10?

06 July 2008 - 03:39 AM

Hey there, my first post ever in this forum :)

Well, as some might know, I'm coming from the evil side (OpenGL), but I'm currently converting a GL-based engine to D3D10, mainly due to ATI/AMD's inability to deliver functional OpenGL drivers... The conversion was surprisingly painless and works quite flawlessly, with one exception: quad-buffered stereo support.

A quick explanation for people unfamiliar with the term: the idea is to have two separate framebuffers, both double-buffered, where each one is displayed on a separate DVI (or dual-DVI) output of your graphics card. While rendering your frame, you specify which framebuffer the output should go to (typically called left and right). The two or four DVI outputs are then usually connected to a polarization-based stereoscopic display or projection system. Both outputs display an exact 1:1 copy of your desktop, except for the quadbuffer-enabled 3D windows, where the left or right framebuffers are displayed respectively.

So, essentially: does anyone know of a way to get such quad-buffered stereo in D3D10? Has native support been introduced since version 10? Of course, there is a way to fake it by (ab)using dual screens. However, this will only work in fullscreen mode, and with a custom (software) mouse pointer. Since this is an industrial visualization application rather than a game, that is not an option for our software.

Any ideas from the D3D pros?
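For what it's worth, whether the two images come from a real quad buffer (OpenGL's GL_BACK_LEFT/GL_BACK_RIGHT) or from a faked dual-screen setup, the per-eye camera setup is the same. Here's a minimal sketch of the usual parallel-axis asymmetric-frustum method; all parameter names are illustrative, not from any particular API:

```python
import math

def stereo_frustums(fov_y_deg, aspect, near, far, eye_sep, convergence):
    """Compute asymmetric (off-axis) frustum bounds for the left and
    right eye, suitable for a glFrustum-style projection setup.

    Returns two (left, right, bottom, top, near, far) tuples.
    The eyes are assumed shifted horizontally by +/- eye_sep/2; the
    frustums are sheared so both image planes coincide at the
    'convergence' distance (zero parallax plane).
    """
    top = near * math.tan(math.radians(fov_y_deg) / 2.0)
    bottom = -top
    half_w = top * aspect
    # Horizontal shear of the frustum, measured at the near plane.
    shift = (eye_sep / 2.0) * near / convergence
    left_eye  = (-half_w + shift,  half_w + shift, bottom, top, near, far)
    right_eye = (-half_w - shift,  half_w - shift, bottom, top, near, far)
    return left_eye, right_eye
```

Each eye then renders with its own frustum after translating the view by ∓eye_sep/2; with a true quad buffer you would select the left or right back buffer before each of the two passes.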

Deferred shading without MRTs and with MSAA

12 April 2007 - 12:16 AM

I need some ideas from you guys with the creative minds here!

OK, here's the situation: I'm currently rendering a very complex 3D scene to a 16-bit floating point offscreen surface, and I'm highly fragment limited. Now, I've recently developed a new kind of dynamic soft shadow algorithm that looks really good and isn't too expensive, so I'd like to add it to the engine. Unfortunately, doing so in the traditional way (by simply including it in the main shaders) forces me to re-render parts of the scene several times, which is a performance killer. So I'd like to use partially deferred shading to add the shadows after the main render pass.

So far, so good. Essentially, the information I need after the main render pass is complete would be:

* The (floating point) pixel colour. Doh, that's easy, it's just the output of the shaders as they are right now.
* The pixel position in world space. The position should be reasonably accurate.
* The pixel normal, also in world space. This one can be less accurate.

This would be easy with MRTs, but unfortunately I can't use them. The main reasons are the lack of MSAA support with MRTs on most current GPUs, and the huge memory requirements of the additional buffer (since all MRT colour buffers need the same internal format, I'd have to go with a second FP16 RGBA buffer - ugh). So MRT is not an option. What to do?

After my main render pass, I have two valid buffers: an RGBA FP16 buffer, where the alpha component is currently unused, and a 24-bit depth buffer. Given these two buffers, I need to somehow convey the pixel world-space position and the normal. The position is easy: I can simply unproject it from the depth buffer and the inverse camera matrix in a pixel shader. The normal, however, is tricky. I only have one additional free channel, the alpha channel. To store a full-featured normal, I'd need at least two.
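The unprojection from the depth buffer can be sketched like this (a minimal numpy sketch of the standard technique, here on the CPU for clarity; the column-vector matrix convention and the [0,1] depth range are assumptions, so adapt to your own conventions):

```python
import numpy as np

def world_pos_from_depth(u, v, depth, inv_view_proj):
    """Reconstruct a world-space position from a depth-buffer sample.

    u, v          : texture coordinates in [0, 1]
    depth         : depth-buffer value in [0, 1]
    inv_view_proj : inverse of (projection @ view), 4x4,
                    column-vector convention: clip = view_proj @ world
    """
    # Texture coords and depth -> normalized device coords in [-1, 1].
    ndc = np.array([u * 2.0 - 1.0, v * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0])
    world = inv_view_proj @ ndc
    return world[:3] / world[3]   # undo the perspective divide
```

In the actual deferred pass this is just a matrix multiply and a divide per pixel, with the inverse view-projection matrix passed in as a shader uniform.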
I could also use finite differencing on the depth buffer to recover the normal, but that would lead to incorrect data on the object silhouette edges. I've also thought about quantizing the normal hemisphere (only the normals pointing towards the camera are visible, everything pointing back is culled anyway) into 256 directions, and using lookup tables to encode the pixel normal into a single byte. That wouldn't be very accurate, but still OK (although I'm afraid of banding artifacts). The problem is that it will completely fail when MSAA is enabled: as soon as the MSAA buffer is resolved, my alpha values (which are now actually indices into the quantized normal hemisphere) will be blended on the edges, and *BAM*!

Unfortunately, I cannot afford to render the entire scene twice per frame, even with simplified shaders (so as to write only the normal to a separate buffer). There's just too much geometry. Well, I'll probably still do it this way if nothing else helps, but it's really a last resort.

Any ideas? Maybe a way to encode the normal as a polar angle to the pixel's view direction, and recover the missing second angle by differencing the depth map? Or somehow encoding a correction factor in the alpha component of the colour buffer that compensates for the erroneous depth-map differencing on polygon edges? I'm open to all weird suggestions!

Edit: oh yeah, I'm using OpenGL btw, but that shouldn't really matter.

Thanks, Yann
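The single-byte hemisphere quantization described above could be prototyped along these lines (a hedged sketch: the 16x16 elevation/azimuth split is an arbitrary choice of mine, and, as noted, the scheme breaks the moment an MSAA resolve blends the indices):

```python
import math

def build_table(bands=16, steps=16):
    """Build a 256-entry lookup table: 16 elevation bands x 16 azimuth
    steps over the camera-facing hemisphere (view-space z >= 0)."""
    table = []
    for i in range(bands):
        theta = (i + 0.5) / bands * (math.pi / 2.0)    # angle from +z
        for j in range(steps):
            phi = (j + 0.5) / steps * (2.0 * math.pi)  # azimuth
            table.append((math.sin(theta) * math.cos(phi),
                          math.sin(theta) * math.sin(phi),
                          math.cos(theta)))
    return table

def encode(n, bands=16, steps=16):
    """Quantize a unit, camera-facing normal into a single byte."""
    theta = math.acos(max(-1.0, min(1.0, n[2])))
    phi = math.atan2(n[1], n[0]) % (2.0 * math.pi)
    i = min(bands - 1, int(theta / (math.pi / 2.0) * bands))
    j = min(steps - 1, int(phi / (2.0 * math.pi) * steps))
    return i * steps + j

def decode(index, table):
    return table[index]
```

The round-trip error stays within a few degrees for most directions, which illustrates both the appeal (one byte) and the banding risk the post worries about.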

Lucid dreams

04 October 2006 - 01:53 PM

So, to change a little from all those political and whatnot threads in the lounge... ;)

We've discussed dreams here in the lounge a few times in the past, and some people mentioned lucid dreams. I always had a lot of very vivid dreams, often with some semi-conscious parts, so I thought I knew what they were talking about. Well, now I know better. Two days ago, I had my first real lucid dream. And it was absolutely amazing. This was one of the coolest things I've ever experienced. I guess it's one of those things you can't understand until you've actually experienced it yourself. Have you always dreamed of having a Star Trek-like holodeck? Or some kind of ultra-advanced "Matrix"-like virtual reality system to play with? Well, this is it.

It started slowly, without me even realizing what was about to happen. I "awoke" in the middle of the night. Well, at least I thought I was awake. Later on, when searching the net about lucid dreams, I discovered that this is commonly called a "false awakening". Basically, you dream that you wake up, without knowing that you are dreaming. So far so good. So I get up, and realize everything is dark. I can still see, since strange moving lights are glowing in from the window. I try to switch on the light, but nothing happens (again, I later read on the net that dysfunctional light switches are typical of lucid dreams). I go to the window and look outside. The sky is full of very bright stars, and some kind of coloured nebulae. Hundreds of triangular objects, like distant spaceships, fly across it. And this is the moment I realize I'm dreaming. I often had dreams like this, nothing unusual until now. But I never made it across this critical point. Two days ago, I did. And that's where the adventure begins :)

Basically, once you realize you are dreaming, you begin to consciously live your dream. It really becomes a virtual-reality kind of setup from this moment on. This is very hard to explain, but your mind operates as if you were awake, i.e. it is perfectly conscious and clear, with a feeling of time and event causality as in real life.

OK, so I was in my room, quite amazed. I remembered that I had read about something like this some time ago, and realized that I must be experiencing a lucid dream. I went downstairs and looked around me. Everything was extremely realistic: the visuals, sound, smells, even touch. There were small anomalies, like the walls being a different colour. But all in all, it was an almost perfect reproduction of my own house. I went outside, wandered around in my garden, and into the fields behind. Everything was there, although in a slightly different layout. Oh yeah, and the sky was full of triangular spaceships, but whatever ;) So I wandered around for a few more minutes, into the woods behind my house. The level of realism and control was insane. I could touch and smell the trees, I could feel the cold touch of water drops on the grass. The feeling of time and causality was perfectly normal too, as in real life. Unbelievable.

Suddenly, things started to glitch. The "simulation" became unstable, and I found it harder and harder to keep control. Then I woke up, for real this time. I could remember every single detail of that trip. Later, I retraced in reality the path I had taken in my dream, so as to estimate its duration. It took around 5 minutes.

So yeah, I was left very impressed. I searched around the net and found several resources on the subject. It seems that with practice, people are able to do amazing things in lucid dreams, essentially controlling them completely: from flying, through practicing martial arts and composing music, to having sex! Of course, I want to have one again, as soon as possible. I read about several induction techniques, but they seem complex and require a lot of practice. I tried one last night (the WBTB technique mentioned here), but failed. Not surprising, considering my lack of practice.

So, a few questions:

* Did anyone here experience such a real lucid dream? Did you find it as amazing as I did?
* Does anyone have experience with artificial induction? What techniques do you use? How long did it take to master them?
* What level of control do you have in your dream? Can you build your own worlds, or change dream environments on the fly? Can you summon objects or people at will? Can you talk to people, and do they respond and react in a realistic way?
* Did you manage to stabilize the "simulation" and keep it running for a longer time? How long can you run it without major and annoying glitches?

That's some real cool stuff. I never realized what a superb kind of VR system we had hidden in our brains ;)

Macs, xcode, and the switch...

28 June 2006 - 11:46 AM

I'm really fed up with both Windows and Linux. Both are mediocre operating systems, each with its very own set of flaws, bugs, inefficiencies and other annoyances that have been driving me insane lately. And Windows Vista is a complete abomination. So I decided to switch over to the Dark Side™, and got myself a MacBook Pro this morning. I've been playing around with it for the entire day.

I'm not terribly impressed, unfortunately. It's not bad at all, and certainly better than the other bunch (Windows, Linux) in many aspects, but it's horribly lacking in others. I don't know if my expectations are really that outlandish, or if I'm getting oversensitive lately, but well - I just want a stable, intuitive and powerful OS to work with. And OS X isn't quite there, at least from what I could gather during my first impressions.

Anyway, here's a small review of my first experience with Mac OS X. I know that there are quite a few Mac users here, so maybe you can give me some advice on how to tweak OS X to make it more efficient, or shout at me when some of these points are due to user stupidity ;)

Notebook:

* Really nice design. Probably the best of all notebooks I've seen so far.
* Screen: a little too panoramic, and lacks height even though it's a 17". Good image quality though.
* Where is the HDD activity LED? It's extremely annoying not to know what your hard drive is doing.
* The trackpad lacks a right mouse button. No, single-button mice and trackpads are not cool and/or "easy". They're just annoying.
* Magnetic power plug snap: funny. I hope this thing is far enough away from the HDD...

OSX:

* Nice visuals. Much more elegant than the cheap "in-your-face" Vista eye candy. But still too much in places, and things like minimize/maximize animations cannot be turned off.
* Multi-display support is excellent.
* The entire GUI feels sluggish, much slower than Windows or KDE. It's as if there were a slight lag between screen updates and the mouse pointer. Scrolling in Finder windows, Safari or Firefox is slow. Considering that this notebook is the current top-of-the-line product, such sluggishness is simply unacceptable. A GUI must be fast and responsive.
* The Dock is the spawn of Satan. PLEASE, tell me there is a third-party taskbar-like replacement somewhere out there...
* Application folders are cool. And the DMG install process is really nice, IMO (I know many will disagree, but you'll love this after fighting for ages with corrupt MSI packages fucking up the registry...)
* Text rendering and antialiasing is not nice at all. It really doesn't look good, especially on a large LCD. I don't understand this: Apple has always been the platform of choice for publishing and design, so how can their font rendering be so inferior to Microsoft's, or even FreeType's? Is it possible to replace their font rendering engine with FT2 or similar?
* I'm not sure if I like the Finder yet. I find Windows Explorer more intuitive, but that might be mental conditioning. Can anyone recommend a good file manager for OS X? Some Norton/Midnight Commander clone would be nice. Yeah, I know that MC does in fact exist for OS X, but it seems GTK+-based, which is just - eww. I tried Disk Order, which looks cool, but is very buggy and crashes every two minutes.
* How do I change the font used in context menus, the desktop, etc.?
* How on earth do I access the OpenGL settings of the onboard ATI X1600?
* How can I quickly switch to the desktop if it's covered with windows?

XCode 2.3:

* Mixed feelings so far. Much better than every Linux IDE I've ever tried, but inferior to Visual Studio.
* The layout is almost non-customizable, at least compared to VS2005. Yeah, you have three basic layouts to choose from, but all three suck. An IDE where you cannot fully customize the layout is completely unacceptable nowadays.
* The debugger is weird. It seems to lack a lot of functionality, or it hides it so well I cannot find it: Where can I access the stack memory? How can I set the instruction pointer to the current cursor position (e.g. to skip instructions)? "Step into" doesn't seem to work at all if the function is in a different source file.
* I've managed to crash it twice while simply stepping through a (rather complex) program.

I haven't really tried the multimedia options yet. This is where the machine should really excel over the PC, so I'm curious. I guess if I use it more regularly, I'll eventually get used to some of the differences from Windows. Except for the Dock...

Updates will probably follow... :)