Transmission latency - OpenGL



#1 donguow   Members   -  Reputation: 173


Posted 30 January 2012 - 09:21 AM

Hi all,

I have a problem and hope someone here can help me out. I have a server that performs rendering according to requests from the client and then sends the rendered image back. Currently, the server simply sends each rendered image to the client one at a time, which introduces significant delay. Is there any way to stream data from the server to the client so as to reduce the transmission latency?

I would really appreciate your help!
Dong


#2 Brother Bob   Moderators   -  Reputation: 8201


Posted 30 January 2012 - 09:37 AM

Unless you are talking about the client and server as defined by the OpenGL specification (which I assume you're not, since then you wouldn't be sending images), this has nothing to do with OpenGL. Moving to the networking forum.

#3 Antheus   Members   -  Reputation: 2397


Posted 30 January 2012 - 10:03 AM

Latency cannot be changed; it's limited by the speed of light these days.

What can be increased is bandwidth.

Instead of requesting a single image and waiting for the response, request 10 images and, as they are received, keep requesting more. Such a mechanism is used even in local applications: both OpenGL and DirectX support multiple buffers. While one is displayed on screen, another is waiting, and a third is being drawn to.
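
Here is a minimal sketch of that idea in C++: the sender keeps a fixed number of requests outstanding while a receiver thread consumes frames as they arrive. The transport calls are just stubs (a sleep simulating one round trip); swap in your real client I/O.

#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>

// Stubs that simulate the transport; replace with the real client I/O.
void sendFrameRequest() { /* would write a small request message to the socket */ }
void receiveFrame() {     // pretend one server round trip takes 50 ms
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
}
void displayFrame(int n) { std::printf("displayed frame %d\n", n); }

int main() {
    const int kMaxInFlight = 3;    // like double/triple buffering
    const int kTotalFrames = 30;
    int inFlight = 0;
    std::mutex m;
    std::condition_variable cv;

    // Receiver: whenever a frame arrives, free a request slot so the
    // sender can keep the pipe full instead of idling on each round trip.
    std::thread receiver([&] {
        for (int got = 0; got < kTotalFrames; ++got) {
            {
                std::unique_lock<std::mutex> lock(m);
                cv.wait(lock, [&] { return inFlight > 0; });
            }
            receiveFrame();
            displayFrame(got);
            {
                std::lock_guard<std::mutex> lock(m);
                --inFlight;
            }
            cv.notify_one();
        }
    });

    // Sender: always keep kMaxInFlight requests outstanding.
    for (int sent = 0; sent < kTotalFrames; ++sent) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return inFlight < kMaxInFlight; });
        ++inFlight;
        lock.unlock();
        sendFrameRequest();
        cv.notify_one();
    }
    receiver.join();
}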

If you cannot request more, then there is no solution, except changing the physical configuration of the server.

#4 turch   Members   -  Reputation: 590


Posted 30 January 2012 - 10:21 AM

You can encode the raw frames with some sort of video compression. How well that works depends on what you're rendering. Most video compression is based on differences between frames - if you have a fairly static scene with little movement (like a cursor on the desktop) it will help dramatically. If you are rendering something like a first-person point of view, not so much (because moving even slightly will change the entire frame).
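
As a rough illustration of why (not a real codec): the sketch below XORs each frame against the previous one and deflates the result with zlib. A mostly static frame turns into almost all zeros and shrinks to almost nothing, while a fully changed frame barely shrinks at all. The frame contents here are synthetic, just to show the two extremes.

#include <cstdio>
#include <vector>
#include <zlib.h>   // link with -lz

// Deflate a buffer and return the compressed size in bytes.
static unsigned long deflatedSize(const std::vector<unsigned char>& data) {
    uLongf outLen = compressBound(data.size());
    std::vector<unsigned char> out(outLen);
    compress(out.data(), &outLen, data.data(), data.size());
    return outLen;
}

int main() {
    const size_t kFrameBytes = 1024 * 768 * 4;   // one raw RGBA frame

    // "Previous" frame filled with pseudo-random pixels (stands in for a
    // rendered image; real frames compress somewhere between the extremes).
    std::vector<unsigned char> prev(kFrameBytes);
    unsigned int x = 123456789u;
    for (size_t i = 0; i < kFrameBytes; ++i) {
        x ^= x << 13; x ^= x >> 17; x ^= x << 5;   // xorshift32
        prev[i] = (unsigned char)x;
    }

    // "Current" frame: identical except for a small region (a moved cursor, say).
    std::vector<unsigned char> curr = prev;
    for (size_t i = 0; i < 64 * 64 * 4; ++i)
        curr[i] ^= 0xFF;

    // Difference frame: zero wherever nothing changed between frames.
    std::vector<unsigned char> diff(kFrameBytes);
    for (size_t i = 0; i < kFrameBytes; ++i)
        diff[i] = (unsigned char)(curr[i] ^ prev[i]);

    std::printf("raw frame:      %zu bytes\n", kFrameBytes);
    std::printf("deflated frame: %lu bytes\n", deflatedSize(curr));   // barely shrinks
    std::printf("deflated diff:  %lu bytes\n", deflatedSize(diff));   // tiny
    // The receiver rebuilds the frame with curr[i] = prev[i] ^ diff[i].
}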

#5 Antheus   Members   -  Reputation: 2397


Posted 30 January 2012 - 10:32 AM

Quote (turch):
Most video compression is based on differences between frames - if you have a fairly static scene with little movement (like a cursor on the desktop) it will help dramatically. If you are rendering something like a first-person point of view, not so much (because moving even slightly will change the entire frame).


Video compression will drastically increase latency. The higher the compression, the worse the latency.

Ideal temporal video compression would take all frames into consideration, which means waiting for all frames to be generated before they can be analyzed for compression.

Size of each image itself doesn't impact latency beyond some basic factors; it's mainly a question of bandwidth.

#6 turch   Members   -  Reputation: 590


Posted 30 January 2012 - 11:15 AM


Quote (Antheus):
Video compression will drastically increase latency. The higher the compression, the worse the latency.

Ideal temporal video compression would take all frames into consideration, which means waiting for all frames to be generated before they can be analyzed for compression.

Size of each image itself doesn't impact latency beyond some basic factors; it's mainly a question of bandwidth.


You are correct, I misread the question as one of bandwidth :D

#7 hplus0603   Moderators   -  Reputation: 5309


Posted 30 January 2012 - 11:37 AM

It depends on whether the delay is introduced by bandwidth limitations or by latency limitations.
If the delay exists because the image takes a long time to send over a limited-bandwidth link, then compression can decrease latency. (Note: B-frames, which need to see "the future", are unlikely to be a good match for real-time video compression.)
However, the problem is so vague that I don't quite understand how to answer it. What is the service? What does the client do, and what does the server do? What data does the client send to the server? What does the client do with the data it receives from the server?
enum Bool { True, False, FileNotFound };

#8 donguow   Members   -  Reputation: 173


Posted 30 January 2012 - 10:22 PM

Hi all,

What I am testing now runs on a single machine: both the client application and the server application are on the same PC, so bandwidth is not really a big issue, I think. Of course, with better hardware the processing time could be reduced.

To make it a little bit clearer, I will briefly describe my system as follows:

Suppose I have two machines, one acting as the server and the other as the client. The server stores a database of 3D models and is responsible for rendering whenever the client requests it. For example, thin clients often lack graphics processing units and cannot render by themselves, and for very large and complex models rendering on the client may take quite a long time, so it can be better to offload the computational workload to the server.

My idea is that the server renders the 3D model, saves the result as an image file (let's say a bitmap), and then sends it to the client. The client doesn't do anything except send requests to the server and display the returned image on screen. This is quite simple to implement but yields quite low performance: for every move the user makes, the client has to send a request to the server and then wait for the update.
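
Roughly, my current client loop looks like the sketch below (the helpers are just stand-ins for my actual input, networking, and display code):

#include <chrono>
#include <cstdint>
#include <thread>
#include <vector>

// Stubs standing in for the real input, networking, and display code.
std::vector<std::uint8_t> waitForUserInput() { return {0}; }       // e.g. a camera move
void sendRequest(const std::vector<std::uint8_t>&) {}              // would write to the socket
std::vector<std::uint8_t> receiveRenderedBitmap() {                // pretend render + transfer = 100 ms
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    return std::vector<std::uint8_t>(1024 * 768 * 4);
}
void showBitmap(const std::vector<std::uint8_t>&) {}               // would blit to the window

int main() {
    for (int frame = 0; frame < 10; ++frame) {
        std::vector<std::uint8_t> input = waitForUserInput();
        sendRequest(input);
        // The client now stalls for the whole round trip: request up,
        // server-side render, bitmap back down. Nothing overlaps, so every
        // user move pays the full latency before anything updates on screen.
        std::vector<std::uint8_t> bitmap = receiveRenderedBitmap();
        showBitmap(bitmap);
    }
}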

I just think there should be a smarter way to do this. I have read a paper in which the authors generate a live video while the server renders and then stream the video data to the client for display. The paper did not describe the implementation in much detail, so it only gave me the basic idea. To be frank, I am not familiar with video encoding/decoding, so I am hoping someone here can help me figure it out.

Thanks for your help,
Dong

#9 Antheus   Members   -  Reputation: 2397


Posted 31 January 2012 - 08:41 AM

Quote (donguow):
This is quite simple to implement but yields quite low performance: for every move the user makes, the client has to send a request to the server and then wait for the update.

Yes, there is no way around it. To render a new image, you need to wait for input.

Quote (donguow):
I have read a paper in which the authors generate a live video while the server renders and then stream the video data to the client for display.


Games keep going whether the user presses a button or not, so to render the next frame they don't need to wait for user input. But reactions to key presses will still be delayed by the time needed to send the events and update the state.


To understand your poor performance, measure the time needed for each of the following steps (a rough timing sketch follows the list):
1) register input on client
2) send this input to server
3) generate the image
4) send this image to client
5) display image on client
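
Something like this is enough to get rough numbers on the client side. The helpers are stubs standing in for the real calls; from the client alone, steps 3 and 4 show up as a single wait unless the server sends its own timestamps.

#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

// Stubs for the real client code; the sleeps just simulate work so the
// harness runs. Replace each with the actual call you want to measure.
void registerInput()     { std::this_thread::sleep_for(std::chrono::milliseconds(1)); }
void sendInputToServer() { std::this_thread::sleep_for(std::chrono::milliseconds(2)); }
void waitForImage()      { std::this_thread::sleep_for(std::chrono::milliseconds(40)); }
void displayImage()      { std::this_thread::sleep_for(std::chrono::milliseconds(5)); }

static double msSince(Clock::time_point t0) {
    return std::chrono::duration<double, std::milli>(Clock::now() - t0).count();
}

int main() {
    Clock::time_point t = Clock::now();
    registerInput();
    std::printf("1) register input on client: %6.2f ms\n", msSince(t));

    t = Clock::now();
    sendInputToServer();
    std::printf("2) send input to server:     %6.2f ms\n", msSince(t));

    t = Clock::now();
    waitForImage();   // steps 3 (render) and 4 (send image) are only
                      // visible as one combined wait from the client side
    std::printf("3+4) render + receive image: %6.2f ms\n", msSince(t));

    t = Clock::now();
    displayImage();
    std::printf("5) display image on client:  %6.2f ms\n", msSince(t));
}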

Sending a raw 1024*768*4 image over a 52 Mbit/s LAN (wifi, a typical home router) takes ~0.5 seconds.
Sending the same image compressed as JPEG (~200 kB) takes only 0.03 seconds, but it also requires the server to compress the image and the client to decompress it, which takes time as well. So the compressed path costs T_compress + 0.03 s + T_decompress per frame.

Steps 1-5 together must take less than what is perceptible to the human eye. 30 FPS would be very good, i.e. 1 s / 30 = 0.033 s, so all of steps 1-5 must fit within 0.033 seconds. In the example above, transmitting the JPEG alone nearly uses up that budget, so despite compression the image is still too big unless every other step takes essentially zero time.
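
Spelled out, the arithmetic behind those estimates (nominal link rate, ignoring protocol overhead):

#include <cstdio>

int main() {
    const double linkBitsPerSec = 52e6;              // nominal wifi-class LAN
    const double rawBytes  = 1024.0 * 768 * 4;       // uncompressed RGBA frame
    const double jpegBytes = 200e3;                  // ~200 kB JPEG of the same frame

    std::printf("raw frame:   %.2f s on the wire\n", rawBytes * 8 / linkBitsPerSec);   // ~0.48 s
    std::printf("jpeg frame:  %.3f s on the wire\n", jpegBytes * 8 / linkBitsPerSec);  // ~0.031 s
    std::printf("30 FPS budget for ALL five steps: %.3f s\n", 1.0 / 30.0);             // ~0.033 s
}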


These are the hard limits that must be addressed first. Once you ensure your server can complete the full round trip at the desired frame rate, you can start thinking about other optimizations, should they still be needed.

OnLive had to develop custom hardware to tackle this problem, and they also have to deal with the high latency of a WAN. But it is doable.

What such services cannot do is render in advance. Anything that depends on user input (a key press, a mouse move) cannot be reliably predicted, so those paths will always be constrained by physical limits.

#10 hplus0603   Moderators   -  Reputation: 5309


Posted 31 January 2012 - 11:17 AM

Are you familiar with services that already do this, such as OnLive?
enum Bool { True, False, FileNotFound };

#11 donguow   Members   -  Reputation: 173


Posted 01 February 2012 - 05:58 AM

Thanks Antheus, the steps above make it very straightforward to locate the bottleneck. I will check it out. I did think of using compression techniques to deal with transmission latency; I would highly appreciate it if you could point me to some tutorials or examples of how to implement it.



