Using Apache, a custom server, and PHP for a real-time game?

10 comments, last by slicer4ever 11 years, 1 month ago
This might be a bit of a weird way to go about things, but let me put it this way: I have a public web server and I can run custom applications on it, but I can't open public ports for my own programs (so all I have is Apache + PHP).

Here's what I want to do (and I've done it for non-persistent games in the past): use PHP to communicate with my custom server software (I know how to do this), keep a connection open through Apache/PHP to my clients, and stream data over that connection. What I don't know is: if I keep these connections open, will Apache start to drop client requests while other clients are connected? My PHP scripts talk to my server by opening a localhost connection to it. Is Apache/PHP intelligent enough to put a connection to "sleep" while it's waiting for a response from the server? I don't even know what to google to find information about doing such a thing. Has anyone tried this before?
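For concreteness, the glue layer described above might look roughly like the sketch below: a PHP script that forwards the incoming request to the custom server over a localhost socket and then blocks until the server answers. This is only a sketch; the port (4000), the player/data parameters, and the newline-delimited protocol are assumptions invented for the example, not details from the thread.

<?php
// Rough sketch of the PHP "glue" script: forward the client's request to the
// custom game server over localhost, then block until the server replies.
// The port, the request parameters, and the line-based protocol are invented.

set_time_limit(0);  // let the script wait as long as the long-poll needs

$playerId = isset($_GET['player']) ? $_GET['player'] : 'anonymous';
$payload  = isset($_POST['data'])  ? $_POST['data']  : '';

$sock = stream_socket_client('tcp://127.0.0.1:4000', $errno, $errstr, 5);
if ($sock === false) {
    header('HTTP/1.1 503 Service Unavailable');
    exit("game server unavailable: $errstr");
}

// Hand the request to the custom server...
fwrite($sock, $playerId . ' ' . $payload . "\n");

// ...and sit here until it has something to send back. Apache keeps the
// client's HTTP connection open while this script blocks in fgets().
$response = fgets($sock);
fclose($sock);

header('Content-Type: text/plain');
echo $response;

Whether this scales is exactly the question asked above: each waiting client ties up one Apache/PHP worker for the duration of the wait.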
Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.
Yes, you can have connections open for a long time.

They are now called "comet" techniques. This could serve as a good jumping-off point: http://en.m.wikipedia.org/wiki/Comet_(programming)
Thanks frob, long-polling looks like what I want. However, looking over the wiki article on how it's implemented, from what I understand, when the client receives a response it opens a new connection to the server. Is that the only practical way to continuously stream data? I've been running a lot of tests, and it would seem so, since I can't force Apache to flush small bits of data on demand (the only way I've found to get Apache to send a stream of data is to pad it with a lot of blank data).
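For reference, the "one long response" experiment mentioned here boils down to something like the sketch below. Whether the client actually sees each chunk promptly depends on every buffer between PHP and the wire: PHP's output_buffering and zlib.output_compression settings, and Apache modules such as mod_deflate, can all hold small writes until enough data accumulates, which is why padding the output with filler appears to help.

<?php
// Sketch of streaming small updates over a single long HTTP response.
// flush() only pushes data out of PHP; Apache (e.g. mod_deflate) or PHP's
// own output_buffering / zlib.output_compression may still hold small
// writes, so the client may not see anything until a buffer fills.

set_time_limit(0);
header('Content-Type: text/plain');

// Drop PHP's own output buffers so echo goes straight to the web server.
while (ob_get_level() > 0) {
    ob_end_flush();
}
ob_implicit_flush(true);

for ($i = 0; $i < 10; $i++) {
    echo "update $i\n";
    flush();    // request an immediate send; later buffers may still delay it
    sleep(1);
}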
Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

This indeed looks messy. Are you sure you want to support that? :D

You're trying to force a square peg through a round hole.

If you need a persistent connection with real-time data updates, you need a TCP connection, not an HTTP connection. If you HAVE to do it with servers and clients you don't have full control over, look into WebSockets and see if they are supported in your environment. Without WebSockets, COMET-style requests are your best bet.

Typically, you'll want to double-buffer the requests.
enum Bool { True, False, FileNotFound };


Yeah, I know this isn't the ideal scenario, but I don't have access to a dedicated public server at the moment, so I'm trying to cut the sides off that square peg to make it fit.

This is a C# application, not a browser application, so I have full control over how the client connects.

I like the idea of double-buffering requests. If I understand what you're saying correctly, I try to maintain two active connections with the server, so that when I get a response on one, the other connection is still there while I re-create the first.

Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

Yup. It works best when the server can know that there are two requests incoming, and returns the data for the first request when the second request comes in. If you're using plain PHP, you may need to use shared state through something like memcache to make that work, and you'd need to be polling or something to actually sequence it right. It's a right mess when all you have is a "single process per request" model.
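As a crude illustration of the "shared state plus polling" part of that (not the full two-request hand-off), a parked long-poll request could sit polling a per-player key in memcache that some other process fills in. The key name, the player parameter, the timeout, and the use of the Memcached extension are all assumptions made for the example; the naive get/delete pair also ignores the race between two parked requests, which a real version would have to resolve with Memcached::cas() or a lock.

<?php
// Sketch: a long-poll PHP request waits for data by polling shared state in
// memcache. Key names, parameters and timings are invented for illustration.

set_time_limit(0);

$playerId  = isset($_GET['player']) ? $_GET['player'] : 'anonymous';
$outboxKey = 'outbox:' . $playerId;

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$deadline = time() + 30;               // give up after ~30s; the client re-polls
while (time() < $deadline) {
    $pending = $mc->get($outboxKey);
    if ($pending !== false) {          // something was queued for this player
        $mc->delete($outboxKey);       // naive: races with a second parked request
        header('Content-Type: text/plain');
        echo $pending;
        exit;
    }
    usleep(100000);                    // poll roughly ten times per second
}

header('Content-Type: text/plain');    // nothing happened; empty response
echo '';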
enum Bool { True, False, FileNotFound };


Well, what I was thinking is something like this (a rough sketch of the server-side bookkeeping follows the list):

1. The client makes a request; PHP takes the request and passes it to my custom server.

2. The custom server receives the request from PHP (which then waits for a response from the server before responding to the client).

3. My custom server checks who is requesting the data and adds the request to that user's queue.

4. Once the server has data to send, it selects the first available PHP request and submits the data (then the PHP request hands that back to the client).

5. If no request is available, the data is buffered while waiting for a request to come in.

Do you see anything wrong with this approach?
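For concreteness, the per-player bookkeeping behind steps 3-5 might look like the sketch below. It is written in PHP only for consistency with the rest of the thread; the custom server could be in any language. The class and method names are invented, and the accept/select loop that actually reads the PHP-side sockets is left out; only the pairing of parked requests with buffered data is shown.

<?php
// Sketch of the custom server's per-player bookkeeping for steps 3-5.
// $socket is the connection from a waiting PHP request; names are invented.

class PlayerChannel
{
    private $parkedRequests = array();  // PHP requests waiting for data (step 3)
    private $outgoing       = array();  // data waiting for a request (step 5)

    // Step 3: a PHP request arrived on behalf of this player.
    public function onRequest($socket)
    {
        if (!empty($this->outgoing)) {
            // Step 4: data was already buffered, so answer immediately.
            fwrite($socket, array_shift($this->outgoing));
            fclose($socket);
        } else {
            // Otherwise park the request until the game produces something.
            $this->parkedRequests[] = $socket;
        }
    }

    // Steps 4-5: the game has produced data for this player.
    public function push($data)
    {
        if (!empty($this->parkedRequests)) {
            // Hand it to the first available parked request.
            $socket = array_shift($this->parkedRequests);
            fwrite($socket, $data);
            fclose($socket);
        } else {
            // No request is parked right now; buffer until one shows up.
            $this->outgoing[] = $data;
        }
    }
}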

Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.


Everything.

Ok, let's be blunt: I would not allow this approach in my office. It is crap. So many things can go wrong it is scary.

Communicate over TCP. Find another hosting option.

Sorry for not helping with this response.


Well, perhaps if you'd actually point out why and how "so many things can go wrong", or provide some examples, I'd have a better idea of why, instead of you just walking in and saying "Nope, find another way, but I'm not going to give you any reason why it won't work." I think I've made it clear that I'd choose another path if I could, so instead of reiterating a known fact, you might actually be... helpful, and tell me why this path is so prone to failure.

Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

This topic is closed to new replies.
