

Member Since 29 Aug 2004


Posted by on 17 April 2014 - 08:11 PM

The OpenGL wiki has me confused here:

Index buffers

Indexed rendering, as defined above, requires an array of indices; all vertex attributes will use the same index from this index array. The index array is provided by a Buffer Object bound to the GL_ELEMENT_ARRAY_BUFFER binding point. When a buffer is bound to GL_ELEMENT_ARRAY_BUFFER, all rendering commands of the form gl*Draw*Elements* will use indexes from that buffer. Indices can be unsigned bytes, unsigned shorts, or unsigned ints.

The index buffer binding is stored within the VAO. If no VAO is bound, then you cannot bind a buffer object to GL_ELEMENT_ARRAY_BUFFER.


You CAN'T bind to GL_ELEMENT_ARRAY_BUFFER unless you have already bound a VAO?! This makes no sense to me, as one would naturally want to bind GL_ELEMENT_ARRAY_BUFFER when filling it in, without having to worry about binding an associated VAO. Is there another way to fill an index buffer? Does having 0 bound as the VAO count as having a VAO bound?


#5078886 Stable Cascaded Shadow Maps have made me lose all my hair

Posted by on 19 July 2013 - 04:29 AM

What is the actual problem? Screenshots? 

#4834580 jointOrientX vs rotateX

Posted by on 12 July 2011 - 06:59 PM

Possibly a pre-rotate or a post-rotate. I recognise the naming style; it might be from 3ds? Anyway, checking the specific COLLADA exporter's documentation for info might be a good place to start.

#4833873 Shadowmapping Geometry Clipmaps

Posted by on 11 July 2011 - 12:01 PM

That would be really nice. I am at work so couldn't check the video. But I can see how that will work, as we can assume the directional light is a point light, calculate the coefficients, and then only use the coefficients depending upon the direction of the light.

Don't need to assume it is a point light at all! Spherical harmonics are great at encoding complex lighting environments (not as great as Haar wavelets apparently, but I haven't looked into them). Think of it as compressing a full environment map into just a few numbers (massively lossy, of course). Another way to think of an environment map is "what colour/brightness is the incoming light from each possible direction?".

So you can reverse this and instead encode into an environment map, for a single point (vertex or texel), what colour/brightness that point is when a directional light is cast on it from each possible direction. In some cases it will be lit, in some it will be shadowed by other geometry, and in some it will have secondary illumination from ambient lighting and light bounces. Then you can encode this environment into an SH with a limited number of coefficients, and hard-code it into vertex data or textures.

Then when you want to simulate a directional light you can encode the light into the same number of SH coefficients and simply multiply all the environment coefficients by these, like a mask, in your shaders. The directional light can be created by taking a cardinal-axis SH and rotating it (there is a fairly easy way to rotate SH) to the direction of the light. If you want you can also create much more complex lighting environments and apply them instead.
Google for precomputed radiance transfer (PRT) and spherical harmonics and it throws up a few papers.

#4829065 Beginning a large Game Dev project; 2nd attempt

Posted by on 29 June 2011 - 07:28 AM

I started working on a project just like this for Android, and by far the most time-consuming things were AI and GUI. If you can find a handy library that will manage most UI components for you, that should make the task a lot easier. A cheating AI will make that part a lot easier as well, but I was restricting my NPCs to the same ship physics model and inputs as the player, which was simply a thrust button and left and right turning. Obviously the actual graphics side of things is pretty easy when everything is sprites and in 2D, as is simple collision (which is good enough for a game like this imo). It is still worth using a grid for spatial partitioning, speeding up rendering, and collision detection.
As far as learning C++ is concerned, it's difficult to say. I'm tempted to say do it in C#, but I can't vouch for Linux C# support. I know there is a .NET implementation for Linux, but not how good it is, or what there is in the way of IDE support for C#. I guess Java is another option, but personally I hate it so can't really suggest it myself.

#4828254 How do the maps work in Civilization?

Posted by on 27 June 2011 - 08:26 AM

In Civilization V there are no "sides" to the map (except for the top and bottom, of course), so if you keep moving your camera, let's say to the left, you keep on circling around the map. How is this achieved? I was going to wrap my terrain around a cylinder, but I don't think that would produce the same effect, as the map in Civilization looks flat.

It's called "wrapping" or "toroidal indexing". Basically just take the absolute (unwrapped) xy coordinate of the tile you want to draw (it can be any value, including outside the range of valid tile indexes), then use an unsigned mod operation with the map width and height to determine the "wrapped" index. e.g.:

x = -47
width = 20
wrappedx = x % width;
if (wrappedx < 0) wrappedx += width; // -47 % 20 == -7, so wrappedx == 13

#4828225 Projection and Shadow Mapping

Posted by on 27 June 2011 - 07:21 AM

Sounds right to me. But if you are going to rescale z, don't use a matrix in the fragment program; just rescale the z value (* 2 - 1). But why not just adjust your projection matrix instead? If you are writing custom fragment depth (which you will need to, as you are transforming z in your fragment program) you are losing your early-z check.

#4827756 VC++ 2010, glew will not link

Posted by on 25 June 2011 - 08:35 PM

I don't agree with those two guides; my opinion is you should keep glew in its own folder with an include and a lib directory, and copy the DLL to the same directory as your program's exe file. So:
  • Create a directory called glew wherever you usually put SDKs/libraries (e.g. programming/libs or programming/sdks). Inside it create a directory called include, with one called GL inside that (so glew/include/GL), and one called lib (glew/lib).
  • Copy the .h file to the include/GL directory, and the .lib file to the lib directory.
  • Copy the .dll file to the same directory as your exe file.
  • In your project settings for your exe project, add the lib directory to Configuration Properties->Linker->General->Additional Library Directories, and add the include directory to Configuration Properties->C/C++->General->Additional Include Directories.*
  • Still in the project settings: add glew32.lib to Configuration Properties->Linker->Input->Additional Dependencies. Alternatively use #pragma comment(lib, "glew32.lib") in a code file in your exe project.
* You should try to organise your code base so that you can use relative directories rather than full paths, as it makes moving your code around and changing your directory structure easier. e.g.:
Main directory C:\Programming.
Under that you have: SDKs and src.
Under SDKs you have glew and other libraries that you haven't written yourself. i.e. external dependencies of your code.
Under src you have all your own projects.
So if under src you have a project called "hello_world", you can add the glew directories using $(SolutionDir)..\..\SDKs\glew\include and $(SolutionDir)..\..\SDKs\glew\lib. Then if you were to copy your entire Programming directory to another location, or rename it, these relative directories would still work.

#4822176 Probably overly complicated template class not working

Posted by on 11 June 2011 - 03:29 PM

This isn't a nice design. For one thing, dynamic_casts are costly, and you certainly wouldn't want one every time you call any function in your entire game. Don't add runtime complexity just to get out of having to type some extra text! And don't add complex design to get out of it either. If you want your Object class to be able to access these different systems, either put them into a globally accessible object or pass them to the Object in its constructor. But my opinion is that it is better to separate your operations from your data:
  • Object contains the specification for an object (i.e. the data: position, colour, etc.).
  • Another class (e.g. Renderer) contains the methods to draw an Object.
  • Another class (e.g. Scene) contains the set of Object instances you wish to draw.
So you create a bunch of Objects, add them to a Scene, and then pass the Scene to the Renderer to actually do the drawing.

#4821448 Need help drawing floors in ray caster

Posted by on 09 June 2011 - 01:25 PM

I may be missing something here, but fundamentally what makes floors any different from anything else your ray caster will be drawing?

If he means ray caster as in Doom, then the walls and floors use different techniques. In a Doom-style engine the walls are drawn by intersecting a ray in 2D with a wall section, then scaling a 1D slice of texture onto the screen. Drawing the floor obviously needs a completely different technique. My guess is that you simply project each pixel onto the floor plane and then mod the plane coordinates by the texture size, and that's your pixel colour.

#4821433 GLSL Directional Light

Posted by on 09 June 2011 - 12:53 PM

Can't see anything immediately wrong (assuming the various matrix*normal calculations have the .w component set to 0, to prevent translation). As usual, de-construct your shader, outputting each stage as colour until you find where the values aren't correct. Start by making sure your material and light parameters are getting into the shader correctly by just outputting their colours to all pixels.

#4820604 Strange issue with Sean O'Neil's atmospheric scattering (ground).

Posted by on 07 June 2011 - 12:04 PM

Yeah, I don't see anything wrong with the atmospheric scattering part. I think you should go back and double-check that the camera position is really what it is supposed to be. Just because your lighting works doesn't mean everything is definitely correct as far as the scattering code expects it to be. In fact something MUST be wrong, because the scattering code is copy-pasted from a working example. You just need to diagnose until you find the incorrect parameters.

#4820559 Strange issue with Sean O'Neil's atmospheric scattering (ground).

Posted by on 07 June 2011 - 09:50 AM

In that case, please post the code where you are setting all the uniforms for your shader.


vec3 v3Pos = ;
What is going on here?
I assume you removed the interpolated vertex position variable from here for some reason?
Could you please post your vertex shader as well?

#4820470 Strange issue with Sean O'Neil's atmospheric scattering (ground).

Posted by on 07 June 2011 - 05:20 AM

How are you ensuring that your input data is definitely within the inner-to-outer atmosphere range? If, for instance, you are using the default or improved Perlin noise algorithm, note that neither produces numbers clamped between -1 and 1; they can exceed those bounds. How about doing some shader diagnostics? Output height / (OuterRadius - InnerRadius) as the colour. Add branches to output red if the value escapes the 0 to 1 bounds. Simplify the shader, then build it back up, adding each stage and testing values against what they should be by outputting them as colour.

#4815650 C++ Change screen resolution

Posted by on 25 May 2011 - 10:54 AM

Your graphics card and monitor will only support a limited set of possible resolutions. You should use EnumDisplaySettings (http://msdn.microsoft.com/en-us/library/dd162611(v=vs.85).aspx) to determine the supported modes, then change mode to the one you want, while sizing and positioning your window to fill the screen. You should read the MSDN page for ChangeDisplaySettings (http://msdn.microsoft.com/en-us/library/dd183411(v=vs.85).aspx) carefully and check the return value for the specific error.