Kazade

OpenGL Deferred Shading Engine test


Hi, could some of you test my deferred shading engine for me? It needs a lot of work still, but I just want to make sure it runs on other graphics cards. It can be downloaded from here at the bottom of the page. I just need to know:

1) whether it runs,
2) whether or not it looks like it should, and
3) the frame rate and the specs of your machine.

I want to make sure it basically runs before I continue adding to it.

Cheers, Luke.

P.S. Needs OpenGL 2.0 support and floating point textures.

It looked ok to me. It only ran at about 19fps though.

AMD64 3000+ @ 2.3 GHz
1GB RAM
ATI Radeon 9800 128MB

I don't think it's working correctly for me.

It ran at a constant 60 FPS (might have had v-sync switched on), but all I could see on the models appeared to be normal maps.

I could see what looked like a light moving around the room, but the textures applied to the surfaces were just the purply blue of a normal map, and they didn't change at all as the light moved.

My system:

OS : Win XP Professional
Graphics : NVIDIA Geforce 6800 GT 256MB
1.5 GB RAM, 2.6 GHz HT Pentium 4.

Quote:
Original post by weasalmongler
I don't think it's working correctly for me.

It ran at a constant 60 FPS (might have had v-sync switched on), but all I could see on the models appeared to be normal maps.

I could see what looked like a light moving around the room, but the textures applied to the surfaces were just the purply blue of a normal map, and they didn't change at all as the light moved.

My system:

OS : Win XP Professional
Graphics : NVIDIA Geforce 6800 GT 256MB
1.5 GB RAM, 2.6 GHz HT Pentium 4.


One of my friends has had the same problem...

I know the cause of it, but I don't know why it's happening: in error.txt there should be a line saying that the variable normalMap can't be found. A call to glGetUniformLocation is failing, but I have absolutely no idea why it would fail only on some computers. Any ideas?
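
For anyone who wants to dig into it, this is roughly how the failing lookup could be probed (a sketch only; the program handle and the logging are placeholders, not the engine's actual code). A return of -1 from glGetUniformLocation means the uniform isn't active in the linked program, and glGetActiveUniform will list what the driver actually kept:

    #include <cstdio>
    #include <GL/glew.h>

    // Sketch: dump whatever uniforms the driver kept after linking, and check
    // whether "normalMap" is among them. "program" is assumed to be the linked
    // G-buffer program handle.
    void dumpActiveUniforms(GLuint program)
    {
        GLint location = glGetUniformLocation(program, "normalMap");
        if (location == -1)
        {
            // -1 means the uniform is not active: either the name differs, or the
            // compiler optimised it away because its value never contributes to
            // the shader's output.
            GLint count = 0;
            glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);
            for (GLint i = 0; i < count; ++i)
            {
                char name[256];
                GLsizei length = 0;
                GLint size = 0;
                GLenum type = 0;
                glGetActiveUniform(program, (GLuint)i, sizeof(name), &length, &size, &type, name);
                printf("active uniform %d: %s\n", i, name);
            }
        }
    }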

Well, it did not work for me. The log says:

Attempted addition of NULL font!
Window Initialised Successfully
YOUR CARD DOES NOT SUPPORT OPENGL 2.0! SORRY.

I have a GeForce 6200 (which *does* support OpenGL 2.0, at least according to the box), a Pentium 4 at 3.0 GHz, and 1 GB of RAM. I have the latest drivers for the graphics card.

Quote:
Original post by Ezbez
Well, it did not work for me. The log says:

Attempted addition of NULL font!
Window Initialised Successfully
YOUR CARD DOES NOT SUPPORT OPENGL 2.0! SORRY.

I have a GeForce 6200 (which *does* support OpenGL 2.0, at least according to the box), a Pentium 4 at 3.0 GHz, and 1 GB of RAM. I have the latest drivers for the graphics card.


Now that is weird...

This is the code that checks for OpenGL 2.0:

    // GLEW_VERSION_2_0 is GLEW's flag for whether the driver reports GL 2.0 support
    if (!GLEW_VERSION_2_0)
    {
        MessageBox(NULL, "OpenGL 2.0 support IS REQUIRED!", "ERROR!", MB_OK);
        MSG("YOUR CARD DOES NOT SUPPORT OPENGL 2.0! SORRY.");
        return false;
    }

So GLEW is saying that it's not... I'll look into it.

Luke.

Quote:

One of my friends has had the same problem...

I know the cause of it, but I don't know why it's happening: in error.txt there should be a line saying that the variable normalMap can't be found. A call to glGetUniformLocation is failing, but I have absolutely no idea why it would fail only on some computers. Any ideas?


Yeah, that's the error that I got in error.txt. Must be the same problem.

It ran, but this is what the monsters and spotlit walls look like (normal maps apparently):
http://cwiki.org/index.php/Image:Kengine.jpg

Specs:
AMD Athlon 1800+
Windows XP
GeForce 6800 GT 256MB
ForceWare 77.77

The 6200 does not support fp16 linear blend (only nearest neighbour), but everything 6600 and higher does. Is fp16 linear blend needed?

I am writing GLSL 1.0 / OpenGL 2.0 software myself, so I don't believe it is a driver problem?

Hmm, strange...

It says my card doesn't support OpenGL 2.0.

But...

I can run games like The Elder Scrolls IV: Oblivion?

ATI Radeon Xpress 200 series integrated graphics

AMD Athlon 64 3500+

1GB RAM

OK, so there are two problems:

1) Normal maps being shown instead of diffuse.
2) OpenGL 2.0 detection not working.

Number 1 is a big problem. The GLSL shaders are compiling and running fine; the reason it's not working is that a uniform variable won't set. This is really odd, as the shader only uses two samplers: the first works fine, but the second can't be found. I don't understand why. The variable is definitely there and the shader works fine on other cards; it just seems to fail on the GeForce 6800 GT?

Number 2 is nothing to do with my code; it's something to do with GLEW. I'm sure GLEW just reads the version string from the drivers. This happened to one of my friends, but he updated his drivers and it worked fine. Maybe the drivers aren't reporting support?

If anyone has any ideas why I would be having these problems, please let me know; I'm stumped. The shader with the problem is gBufferFillFrag.frag in the shaders folder, if anyone wants to see if they can spot the mistake.
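
To show the kind of thing I mean, here's a made-up fragment shader (this is NOT gBufferFillFrag.frag, just an illustration): if the second sampler's result never reaches an output, the compiler is allowed to drop it as inactive, and glGetUniformLocation then returns -1 for it on drivers that do that optimisation.

    // Hypothetical shader, not the engine's. Two samplers are declared; if the
    // second write below is removed, normalMap no longer affects the output and
    // can be optimised away, so looking up its location fails.
    uniform sampler2D diffuseMap;
    uniform sampler2D normalMap;

    varying vec2 texCoord;

    void main()
    {
        vec4 diffuse = texture2D(diffuseMap, texCoord);
        vec3 normal  = texture2D(normalMap, texCoord).rgb * 2.0 - 1.0;

        gl_FragData[0] = diffuse;                          // keeps diffuseMap active
        gl_FragData[1] = vec4(normal * 0.5 + 0.5, 1.0);    // remove this and normalMap goes inactive
    }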

Cheers

Luke.

My card doesn't support OpenGL 2.0 so I skipped testing this, but couldn't you just check with glGetString (or whatever it is) instead of using GLEW's function to do this? Or at least as a fallback if GLEW fails.
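
Something like this, maybe (a rough sketch of the fallback idea, not tested against the engine; it's only valid once a GL context is current):

    #include <cstdio>
    #include <GL/gl.h>

    // Parse the GL_VERSION string directly instead of relying on GLEW_VERSION_2_0.
    bool hasOpenGL20()
    {
        const char* version = (const char*)glGetString(GL_VERSION);
        int major = 0, minor = 0;
        if (!version || sscanf(version, "%d.%d", &major, &minor) != 2)
            return false;      // no current context, or an unexpected string
        return major >= 2;     // "2.0.x" and above report OpenGL 2.0 support
    }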

Quote:
Original post by Kazade
OK, so there are two problems:

1) Normal maps being shown instead of diffuse.
2) OpenGL 2.0 detection not working.

Number 1 is a big problem. The GLSL shaders are compiling and running fine; the reason it's not working is that a uniform variable won't set. This is really odd, as the shader only uses two samplers: the first works fine, but the second can't be found. I don't understand why. The variable is definitely there and the shader works fine on other cards; it just seems to fail on the GeForce 6800 GT?

Number 2 is nothing to do with my code; it's something to do with GLEW. I'm sure GLEW just reads the version string from the drivers. This happened to one of my friends, but he updated his drivers and it worked fine. Maybe the drivers aren't reporting support?

If anyone has any ideas why I would be having these problems, please let me know; I'm stumped. The shader with the problem is gBufferFillFrag.frag in the shaders folder, if anyone wants to see if they can spot the mistake.

Cheers

Luke.


Try using GLee instead of GLEW. That might solve the 2.0 issue for some people.

I have a Radeon 9600 and it looked a little weird. There's no ambient lighting on the walls, so they are completely black until light shines on them.
Also, objects disappear too quickly (too small zFar?).
It looks like this:


On the first shot you can see that the walls are completely black, but those guys on the left are lit even when there's no light shining on them.

Also, the first time I ran it I got this log file:

Attempted addition of NULL font!
Window Initialised Successfully
OPENGL 2.0 SUPPORTED!
Starting task initialisation....
---- Input
---- System
---- Game
Could not open texture file: noshader.tga
Could not open texture file: noshader_bump.tga
*************Initialised models***********
----- Start of Deferred Shading Initialisation -----
----- Initialising shading buffers.
----- Buffers initialised successfully.
----- Loading G-buffer shader programs.
Link successful. The GLSL vertex shader will run in hardware. The GLSL fragment shader will run in hardware. Validation successful.
uniform variable name not found! normalMap
----- G-Buffer shader programs loaded.
----- Loading light shader programs.
Link successful. The GLSL fragment shader will run in hardware. Validation successful.
[.\KDeferredShader.cpp line 179] GL Error:invalid operation
Link successful. The GLSL fragment shader will run in hardware. Validation successful.
----- Light shader programs loaded.
----- Deferred Shading Initialisation Complete -----
*************Physics System Loaded*************
Tasks Initialised!
uniform variable name not found! normalMap
uniform variable name not found! normalMap
uniform variable name not found! normalMap

... this goes on and on (one missing uniform per frame?) ...



So I copied those TGAs and renamed them to noshader.tga and noshader_bump.tga.
It doesn't report the missing textures any more, but it still reports the missing uniforms.

Doesn't work correctly; what shows on my screen looks a lot like the screenshot posted above: http://cwiki.org/index.php/Image:Kengine.jpg

Specs:
AMD64 3000+ Venice
512 MB RAM
GeForce 6600 GT 128 MB
ForceWare 84.21

Well, that serves me right for rushing it! The uniform variable message in the log comes up on my computer too; it's just where I added something in the wrong place, and I've fixed it now. I also just noticed I was including OLD versions of the shaders in the uploaded version. I've updated it now, so it SHOULD be all right now *fingers crossed*.

Could all those who had the normal map problem please retry? Cheers guys!

Luke.

@b2b3

I think the reason the guys look like they're lit incorrectly is that it's a point light, so anything within the light's 'sphere' gets lit. Where there's no point of reference for the centre it's difficult to tell, but I think it's working all right. It'll look better when it's finished, but it looks how it's supposed to at the moment, if that makes sense :)
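
Very roughly, the idea is something like this (a sketch, not the engine's actual light shader; the simple linear falloff here is just for illustration):

    // Point-light contribution for a fragment reconstructed from the G-buffer.
    uniform vec3  lightPos;      // light centre in world space
    uniform vec3  lightColour;
    uniform float lightRadius;

    vec3 pointLight(vec3 worldPos, vec3 normal)
    {
        vec3  toLight = lightPos - worldPos;
        float dist    = length(toLight);
        float atten   = max(0.0, 1.0 - dist / lightRadius);   // 1 at the centre, 0 at the edge
        float ndotl   = max(0.0, dot(normal, toLight / dist));
        return lightColour * ndotl * atten;                    // everything inside the radius gets some light
    }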

As for the black walls, I think they're just REALLY dark; in the second shot you can just make out the wall texture to the right of the light.

It works perfectly now. Coloured monsters, one red spot, one blue spot, and a bunch of faint grey bricks in the unlit areas. Nice work!

Again,

AMD 1800+, 2GB RAM
Windows XP SP2
GeForce 6800 256MB
Driver: 77.77

Runs fine on AMD64 3500+, Radeon X800 Pro 256MB, 2GB RAM. 50-60 fps.

Texture filtering seems to be a weak point. Perhaps it is because of the Radeon board having problems filtering float textures; the mipmaps get really blocky.

[Edited by - Demus79 on April 4, 2006 3:02:56 PM]


