m41q

Member Since 01 Mar 2012
Offline Last Active Mar 31 2013 05:42 PM
-----

Topics I've Started

Shaders: Lighting has different brightness on different PCs

15 December 2012 - 05:46 AM

Hi,

I'm currently developing an online multiplayer game using OpenGL.
Until recently I used the fixed OpenGL rendering pipeline, but I have now (nearly) successfully switched to shaders (it looks MUCH better).

But there is an issue with the lighting:
On my computer everything is fine (I guess because I picked suitable values for the lights),
but on two other computers the lighting is much brighter than on mine. Because of that, we can't settle on values that look good on every machine...
Could it have something to do with me having an ATI card and my friend a GeForce?
Anyway, I'll attach two images of the situation to this post.

Now for the code:
As far as the shader goes, here is the code of the fragment shader:
[source lang="java"]varying vec4 diffuse, ambient;
varying vec3 normal, lightDir, halfVector;
varying vec4 glColor;

uniform sampler2D tex0;
uniform sampler2D tex1;
uniform sampler2D tex2;
uniform int lighting;
uniform int texturing;
uniform int alphamap;

void main(void)
{
    vec3 n, halfV;
    float NdotL, NdotHV;
    vec4 color = glColor;

    if (lighting == 1) {
        color *= 2 * ambient;
        color.a = 1;
    } else if (lighting == 2) {
        color *= ambient;
        color.a = 1;
        n = normal;
        NdotL = dot(n, lightDir);
        if (true) {
            NdotL += 0.6f;
            NdotL /= 1.6f;
            color += diffuse * NdotL;
        }
    }

    switch (texturing) {
        case 0:
            gl_FragColor = color;
            break;
        case 1:
            gl_FragColor = color * texture2D(tex0, gl_TexCoord[0].st);
            break;
        case 2:
            vec4 texture0 = texture2D(tex0, gl_TexCoord[0].st);
            vec4 texture1 = texture2D(tex1, gl_TexCoord[1].st);
            vec4 texture2 = texture2D(tex2, gl_TexCoord[2].st);
            if (alphamap == 0) {
                gl_FragColor = color * mix(texture1, texture2, texture0.g);
            } else if (alphamap == 1) {
                gl_FragColor = color * mix(texture1, texture2, texture0.r);
            } else {
                gl_FragColor = vec4(1, 1, 1, 1);
            }
            break;
    }
}[/source]
Usually lighting equals 2 and texturing equals 1.
There are some calculations in the vertex shader as well:

[source lang="java"]if (lighting >= 1) {
    ambient  = gl_FrontMaterial.ambient * gl_LightSource[0].ambient;
    ambient += gl_LightModel.ambient * gl_FrontMaterial.ambient;
}
if (lighting == 2) {
    normal = normalize(gl_NormalMatrix * gl_Normal);
    lightDir = normalize(vec3(gl_LightSource[0].position));
    halfVector = normalize(gl_LightSource[0].halfVector.xyz);
    diffuse = gl_FrontMaterial.diffuse * gl_LightSource[0].diffuse;
}[/source]

Finally, I set up the lighting in the program itself:

[source lang="java"]// Lighting
GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_SPOT_DIRECTION, 180);
GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_CONSTANT_ATTENUATION, 1);
GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_LINEAR_ATTENUATION, 0.0f);
GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_QUADRATIC_ATTENUATION, 0f);

FloatBuffer lightPosition = BufferUtils.createFloatBuffer(4);
lightPosition.put(0.0f).put(1.0f).put(0.0f).put(0.0f).flip();
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_POSITION, lightPosition);

FloatBuffer whiteLight = BufferUtils.createFloatBuffer(4);
whiteLight.put(0.7f).put(0.7f).put(0.7f).put(1.0f).flip();
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_SPECULAR, whiteLight);
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_DIFFUSE, whiteLight);

FloatBuffer lModelAmbient = BufferUtils.createFloatBuffer(4);
lModelAmbient.put(1.4f).put(1.4f).put(1.4f).put(1.0f).flip();
GL11.glLightModel(GL11.GL_LIGHT_MODEL_AMBIENT, lModelAmbient);

GL11.glEnable(GL11.GL_LIGHTING);
GL11.glEnable(GL11.GL_LIGHT0);
GL11.glEnable(GL11.GL_COLOR_MATERIAL);
GL11.glColorMaterial(GL11.GL_FRONT_AND_BACK, GL11.GL_AMBIENT_AND_DIFFUSE);[/source]
Basically, the scene consists of a simple ambient light and one directional light (representing the sun).
Now, I get that 1.4f may be a pretty high value for the ambient light, but when I turn it down, the scene gets very dark (at least on my computer).
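Re-reading the setup code, one thing I notice is that GL_AMBIENT of GL_LIGHT0 is never set, so gl_LightSource[0].ambient in the vertex shader falls back to the OpenGL default of (0, 0, 0, 1). A minimal sketch of setting it explicitly, so that every card computes the ambient term from the same numbers (the 0.2f values are only an example, not what I actually use):

[source lang="java"]// Sketch: set GL_LIGHT0's ambient colour explicitly instead of relying on the
// driver default of (0, 0, 0, 1). The 0.2f values are just placeholders.
FloatBuffer lightAmbient = BufferUtils.createFloatBuffer(4);
lightAmbient.put(0.2f).put(0.2f).put(0.2f).put(1.0f).flip();
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_AMBIENT, lightAmbient);[/source]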

Do you have a clue where to look for the mistake?
Is there some kind of function or calculation that may be treated differently by different graphics cards, or even by different versions of GLSL?
It's as if one computer adds a certain value to the light and the other doesn't...

Thanks in advance for any clues

Using shaders vs. the fixed pipeline

13 October 2012 - 09:51 AM

Hey,

When I wanted to implement texture splatting for the terrain in my game, I found a post which suggested using shaders.
Since I had almost no experience with shaders until then, I thought it would be nice to learn more about them.
I implemented a simple shader and noticed that the lighting wasn't there anymore. It looks like one has to program lighting, fog and all that stuff oneself again in the shader. Is that true?
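For reference, my current shader pair is essentially just a passthrough like the sketch below (roughly reconstructed from memory, so take the details with a grain of salt; tex0 is the only texture unit I use):

[source lang="java"]// vertex shader - no lighting calculation at all
void main(void)
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_FrontColor = gl_Color;
}

// fragment shader - just vertex colour times the texture
uniform sampler2D tex0;
void main(void)
{
    gl_FragColor = gl_Color * texture2D(tex0, gl_TexCoord[0].st);
}[/source]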

It would be nice if I could just change the texturing, for example, and leave the lighting as it is in the standard pipeline.

The only thing I want to do is a nice transition between two or three textures. I know it is possible to do this with the fixed pipeline as well, but I have also heard that shaders are much faster at this. Can I use shaders AND the functionality of the default fixed pipeline, or do I have to either do it without shaders or program this functionality myself?
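To make it concrete, this is roughly the kind of blend I have in mind (just a sketch; the sampler names and the choice of the green channel as blend weight are arbitrary):

[source lang="java"]uniform sampler2D alphaMap; // blend weights, e.g. painted into the green channel
uniform sampler2D grass;
uniform sampler2D rock;

void main(void)
{
    float a  = texture2D(alphaMap, gl_TexCoord[0].st).g;
    vec4  c1 = texture2D(grass, gl_TexCoord[0].st);
    vec4  c2 = texture2D(rock,  gl_TexCoord[0].st);
    gl_FragColor = mix(c1, c2, a); // linear blend between the two textures
}[/source]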

thanks in advance

Problems using JBullet engine

28 May 2012 - 03:53 AM

Hi,

I was searching for a good physics engine when I discovered JBullet, a Java port of Bullet that is compatible with LWJGL.
The problem is that LWJGL has kept getting updates while JBullet has not for a long time:
LWJGL no longer supports "indirect buffers", but JBullet uses them. For that problem I found this solution:
http://blog.lolyco.com/sean/2011/05/14/trying-out-jbullet-intbuffer-is-not-direct/
This guy fixed the problem and recompiled JBullet.jar.
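(For context, the "indirect buffer" issue is only about how the NIO buffers are allocated; newer LWJGL versions insist on direct buffers. A minimal sketch of the difference, nothing JBullet-specific, all classes from java.nio:)

[source lang="java"]// Indirect (heap-backed) buffer - this is what newer LWJGL versions reject:
IntBuffer indirect = IntBuffer.allocate(16);

// Direct buffer - allocated outside the Java heap, this is what LWJGL expects:
IntBuffer direct = ByteBuffer.allocateDirect(16 * 4)
                             .order(ByteOrder.nativeOrder())
                             .asIntBuffer();[/source]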

When I tried to do the same, I got some errors and was able to solve most of them, but now I get an error when compiling with the Ant script that ships with JBullet. I don't understand it, and I couldn't find a solution on the internet either:
BUILD FAILED
D:\workspace\J-workspace\JBullet\build.xml:77: java.lang.IllegalStateException: can't find set method for class java.lang.Object
The "build.xml:77" in the error refers to the Ant script; the relevant section is here:
[source lang="xml"]<target name="instrument-classes">
    <taskdef name="instrument-stack"
             classname="cz.advel.stack.instrument.InstrumentationTask"
             classpath="${run.classpath}">
    </taskdef>

    <!-- Next line is important: -->
    <instrument-stack dest="${build.classes.dir}" packageName="com.bulletphysics" isolated="true">
        <fileset dir="${build.classes.dir}" includes="**/*.class"/>
    </instrument-stack>
</target>[/source]

I have never worked with Ant files before and couldn't find a solution to this error.

Now my question is: could anyone
- tell me what I got wrong with the Ant file, OR
- tell me how I can get JBullet working in another way, OR
- suggest another physics engine that works with Java?

(By the way, I know the issue would be solved if I downgraded LWJGL from 2.8.3 to 2.2.2, but that is currently not an option.)

So, has anyone got an idea, please?
Thx in advance

UDP - Packet loss, client and server on same computer

03 April 2012 - 07:16 AM

Hi,

My multiplayer game ran quite well until I increased the view range for the forests quite a bit.
When testing this, I got incredibly high packet loss.

Usually this would not confuse me, since I'm using UDP and packet loss is to be expected.
The problem is that
  • the loss is quite high: about 17% of packets are lost (without the increased view range it was at most 0.8%, even over the internet!), and
  • I'm running server and client on the same machine, so how can this loss happen at all?!
I thought the client's receive buffer might not be big enough, but changing its size from 8 kB to 50 kB doesn't have any effect on the loss...
So I cannot think of any way a UDP packet could get lost here, let alone that many.
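For reference, enlarging the OS-level receive buffer (SO_RCVBUF) looks roughly like this (a sketch using the socket from the initialization code below; 50 * 1024 stands for the 50 kB I mentioned, and getReceiveBufferSize() only shows what the OS actually granted):

[source lang="java"]// socket is the DatagramSocket from the initialization code below
socket.setReceiveBufferSize(50 * 1024);
System.out.println("actual receive buffer: " + socket.getReceiveBufferSize() + " bytes");[/source]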

I'm programming in Java, using the standard UDP sockets: http://docs.oracle.com/javase/1.4.2/docs/api/java/net/DatagramSocket.html
My initialization code (client) looks like this:
[source lang="java"]// CPORT is the port I'm using
socket = new DatagramSocket(CPORT);
// set how long the receive call blocks (in milliseconds)
socket.setSoTimeout(1);[/source]
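And the receive side is roughly this (a simplified sketch; BUFFER_SIZE and handlePacket are placeholders for my actual code):

[source lang="java"]byte[] buf = new byte[BUFFER_SIZE];
DatagramPacket packet = new DatagramPacket(buf, buf.length);
try {
    // blocks for at most 1 ms because of setSoTimeout(1)
    socket.receive(packet);
    handlePacket(packet);
} catch (SocketTimeoutException e) {
    // nothing arrived within the timeout - just continue with the game loop
}[/source]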

So, do you have any idea how those UDP packets can get lost?
thx in advance

Game causes lag on VoIP

08 March 2012 - 01:05 PM

Hi,

the thing is this:
The networking of my game works relatively well (using Java and UDP (DatagramSocket)), but I have an issue when testing it with my friends over Skype or TeamSpeak, and I have no idea why.
We are connected using Hamachi (it simulates a LAN connection) and there is no lag at all in the game itself, but every time the client sends data to the server, Skype skips a few milliseconds, causing so much lag that I can't understand what my friend is saying at all.

To me it looks like sending a packet over UDP takes relatively long and blocks Skype from using the connection during that time.
If you need some code, the sending call looks like this:
[source lang="java"]data = new byte[....];
....
DatagramPacket sendPacket = new DatagramPacket(data, data.length, HOST_IP, SERVER_PORT);
socket.send(sendPacket);[/source]
The maximum size of a UDP packet in my application is 1024 bytes; is this too big?
Or is there another, better way of sending packets over UDP?
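To check whether send() really is the slow part, I guess I could time the call, something like this (a quick sketch):

[source lang="java"]long t0 = System.nanoTime();
socket.send(sendPacket);
long t1 = System.nanoTime();
System.out.println("send() took " + ((t1 - t0) / 1000000.0) + " ms");[/source]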


Hope you have suggestions on this,
thx in advance
