Hi again. Sorry for the late reply. Thank you to everyone who's trying to help me.
Do you have any immediate mode code at all?
I had to google a bit to remember what that meant. No, I don't have any immediate mode code.
Are you checking for errors in Ubuntu, even though it looks correct?
I'm not checking for errors through glGetError yet (out of laziness), but I'm using GL_KHR_debug, which reports even performance warnings.
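For reference, a minimal sketch of how a GL_KHR_debug callback might be hooked up. The callback name, the loader header, and the setup function are my assumptions, not the engine's actual code:

```c
#include <stdio.h>
#include <glad/glad.h>  /* assumption: any loader exposing the KHR_debug symbols works */

/* Hypothetical callback: KHR_debug delivers errors and performance
 * warnings here with source/type/severity already classified. */
static void APIENTRY on_gl_debug(GLenum source, GLenum type, GLuint id,
                                 GLenum severity, GLsizei length,
                                 const GLchar *message, const void *user)
{
    (void)source; (void)id; (void)length; (void)user;
    fprintf(stderr, "[GL] type=0x%x severity=0x%x: %s\n",
            (unsigned)type, (unsigned)severity, message);
}

/* Once, after context creation (and after confirming GL_KHR_debug is present): */
void install_debug_callback(void)
{
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); /* messages fire inside the faulty call */
    glDebugMessageCallback(on_gl_debug, NULL);
}
```

Enabling GL_DEBUG_OUTPUT_SYNCHRONOUS makes a breakpoint inside the callback point straight at the offending GL call, which pairs well with the per-call checking discussed further down.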
Can you compile and/or run it as OpenGL 3.1 under Ubuntu to see if you get similar results as Windows?
That's tricky. The Ubuntu driver only supports 3.2 and 3.3 when using a core context. The highest non-core version available is 3.0, so the best I could do would be creating a 3.0 context and loading the 3.1 extensions. I'll try that if nothing else works.
It sounds like your Intel chip is missing some extensions you are using that the Ubuntu driver supports, so you may need to check the extension string
That's true. The Windows driver doesn't support geometry shaders, for example. But my engine checks if all needed extensions are present, so unless I'm missing an extension (which I don't think I am) that's also not it. I'm going to take a look at it anyway.
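As an aside for anyone reading along: on a core profile the old `glGetString(GL_EXTENSIONS)` call is itself an error, so the check has to enumerate extensions by index. A sketch, with the helper name being my own:

```c
#include <stdbool.h>
#include <string.h>
#include <glad/glad.h>  /* assumption: any GL 3.x loader */

/* On a core profile, glGetString(GL_EXTENSIONS) is invalid;
 * extensions must be queried one index at a time instead. */
static bool has_extension(const char *name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext != NULL && strcmp(ext, name) == 0)
            return true;
    }
    return false;
}
```

For example, a `has_extension("GL_ARB_geometry_shader4")` check before enabling a geometry-shader path would catch exactly the Windows/Intel gap mentioned above.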
and the shader compiler output; even if compilation doesn't fail, there may be warnings in it.
My engine always does that.
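For completeness, this is roughly what "always does that" looks like; the function name is mine, and the fixed-size buffer is a simplification:

```c
#include <stdio.h>
#include <glad/glad.h>  /* assumption: any GL loader */

/* Print the info log even when compilation succeeds: many drivers put
 * warnings there that never raise a GL error. */
static void dump_shader_log(GLuint shader)
{
    GLint status = GL_FALSE, len = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    if (len > 1) {  /* length includes the terminating NUL */
        char buf[4096];
        glGetShaderInfoLog(shader, (GLsizei)sizeof buf, NULL, buf);
        fprintf(stderr, "shader %u (%s):\n%s\n", (unsigned)shader,
                status == GL_TRUE ? "compiled" : "FAILED", buf);
    }
}
```

The same pattern applies to program linking via glGetProgramiv and glGetProgramInfoLog.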
At least check the error code from GetLastError() in the Win32 API.
I don't know what GetLastError has to do with anything, but I'll take a look at that.
Remember that glGetError's state may be reset when running any GL function after the one that fails, so you need to double-check this too.
I don't think so. According to the reference "No other errors are recorded until glGetError is called, the error code is returned, and the flag is reset to GL_NO_ERROR.". Also, ApiTrace always lists all errors and warnings.
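Since an implementation may keep several error flags, the only subtlety is that one glGetError call returns one recorded error. A small sketch (helper name mine) that drains everything pending:

```c
#include <stdio.h>
#include <glad/glad.h>  /* assumption: any GL loader */

/* Each glGetError call returns one recorded error and resets that flag,
 * so looping drains everything recorded since the last check. */
static void drain_gl_errors(const char *where)
{
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        fprintf(stderr, "GL error 0x%04x at %s\n", (unsigned)err, where);
}
```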
Do yourself a favor and test it on a proper windows gaming pc (nvidia or amd card) to verify.
If I had a "proper Windows gaming PC" I wouldn't be using a laptop. Decent computers are pretty expensive where I live, so it's not that simple. Remember: just because you can afford a good computer it doesn't mean everybody can. :)
The biggest hurdle I had when porting my XNA engine to OpenGL was remembering to check glGetError after each call in DEBUG mode, so I knew exactly which lines were failing.
Yeah, that's something I should do now. Even though ApiTrace verifies errors after each call, it's better to do it on my own.
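A common way to do this is a debug-only macro; this one is a sketch of my own, not anyone's engine code:

```c
#include <stdio.h>
#include <glad/glad.h>  /* assumption: any GL loader */

/* Hypothetical debug-only wrapper: check glGetError after every call so the
 * exact failing line is known. Compiles down to the bare call in release. */
#ifndef NDEBUG
#define GL_CHECK(call)                                                  \
    do {                                                                \
        call;                                                           \
        for (GLenum e_; (e_ = glGetError()) != GL_NO_ERROR; )           \
            fprintf(stderr, "%s:%d: GL error 0x%04x after %s\n",        \
                    __FILE__, __LINE__, (unsigned)e_, #call);           \
    } while (0)
#else
#define GL_CHECK(call) call
#endif

/* Usage: GL_CHECK(glBindTexture(GL_TEXTURE_2D, tex)); */
```

The `#call` stringizing means the log names the offending call verbatim, which is the "exactly which lines were failing" part of the advice above.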
My issue was driver defaults. On Windows, my driver defaulted some state (blend state and cull state) in ways that weren't matched on OS X and Linux. I had to write a state manager for my engine to maintain its own state and not rely on OpenGL's.
My engine is designed not to assume any kind of default value.
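In practice "not assuming defaults" usually means setting every relied-upon state explicitly at startup. A sketch; the exact set of states here is illustrative, not exhaustive:

```c
#include <glad/glad.h>  /* assumption: any GL loader */

/* Set every piece of state the renderer depends on explicitly, instead of
 * trusting driver defaults (which, per the post above, differed between
 * Windows and OS X/Linux for blend and cull state). */
static void set_baseline_state(void)
{
    glDisable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ZERO);   /* the spec default, but stated anyway */
    glEnable(GL_CULL_FACE);
    glCullFace(GL_BACK);
    glFrontFace(GL_CCW);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glDepthMask(GL_TRUE);
}
```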
Also, test with something basic and small: set the transforms to an identity matrix, test across all platforms, and see if the result is the same. A simple cube at (0, 0, 0) should suffice. This will let you start to pinpoint where the differences are.
THIS. For some reason, I haven't even thought about using a simpler scene. Doing this will probably make finding the problem much easier.
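For the identity-matrix test, the 4x4 identity can be written out once and uploaded for model, view, and projection alike; the `u_*` uniform names below are placeholders:

```c
/* Column-major 4x4 identity: with this as model, view and projection,
 * vertex positions pass through to clip space unchanged, so a unit cube
 * at (0, 0, 0) must look identical on every platform. */
static const float identity4[16] = {
    1.0f, 0.0f, 0.0f, 0.0f,
    0.0f, 1.0f, 0.0f, 0.0f,
    0.0f, 0.0f, 1.0f, 0.0f,
    0.0f, 0.0f, 0.0f, 1.0f,
};

/* glUniformMatrix4fv(u_model, 1, GL_FALSE, identity4); */
/* glUniformMatrix4fv(u_view,  1, GL_FALSE, identity4); */
/* glUniformMatrix4fv(u_proj,  1, GL_FALSE, identity4); */
```

Any remaining cross-platform difference in that scene then has to come from state or driver behavior, not from the math.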
Again, thank you to everyone who replied. Have a nice day.