#2 DavidGArce1337

Posted 11 November 2012 - 05:24 AM

That's a lot of info, but I understand it better now. (Excuse my lateness!)

One thing this brought to mind: with "cloud" gaming, the amount of data sent to the user depends purely on the resolution, no? (If not, do correct me! haha)

Meaning, say, if I can't stream a 240p video in real time, then I can't play a 240p cloud game in real time either? And Flash video runs at 24 FPS? Or used to...?
My math sucks, and I know there's some computation behind this... I guess my questions would be:

1- How does one convert a resolution (240p, 360p, 480p, 720p, 1080p, etc.) into a data rate, to figure out how much download speed a user needs to run the "game" at 24, 30, or 60 FPS?
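Trying to work part of it out myself: here's a back-of-the-envelope sketch of the *uncompressed* bandwidth each resolution/FPS combo would need. (The widths and the 3-bytes-per-pixel figure are my own illustrative assumptions; real streaming services compress with codecs like H.264, which cuts these numbers down enormously, so this is only an upper bound, not what any actual service sends.)

```python
# Back-of-the-envelope: raw (uncompressed) video bandwidth per resolution,
# assuming 24-bit RGB (3 bytes per pixel). Real cloud-gaming streams are
# compressed, so actual required download speeds are far lower than this.

RESOLUTIONS = {
    "240p": (426, 240),
    "360p": (640, 360),
    "480p": (854, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

def raw_mbps(width, height, fps, bytes_per_pixel=3):
    """Uncompressed bandwidth in megabits per second (Mbit/s)."""
    return width * height * bytes_per_pixel * fps * 8 / 1_000_000

for name, (w, h) in RESOLUTIONS.items():
    for fps in (24, 30, 60):
        print(f"{name} @ {fps} FPS: {raw_mbps(w, h, fps):,.1f} Mbit/s raw")
```

For example, raw 1080p at 60 FPS works out to roughly 2,986 Mbit/s, which is exactly why video is never sent uncompressed.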

And off the "cloud" topic:

2- How much data is usually sent to users in MMOs and FPS games? I can play MMOs and such on my slow connection. Heck, I could play some MMOs over 32K dial-up...
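My guess at why that works: the server sends compact state updates (positions, actions), not pixels. Here's a rough sketch of that kind of estimate; every number in it is a made-up illustrative assumption, not a measurement from any real game.

```python
# Rough sketch: downstream bandwidth for game-state updates, as opposed to
# streamed video. All figures below are hypothetical illustration values.

def update_kbps(entities, bytes_per_entity, ticks_per_second):
    """Downstream bandwidth in kilobits per second (kbit/s)."""
    return entities * bytes_per_entity * ticks_per_second * 8 / 1000

# e.g. 30 visible entities, ~20 bytes each (id, position, action),
# 10 server updates per second:
print(update_kbps(30, 20, 10), "kbit/s")  # → 48.0 kbit/s
```

A few dozen kbit/s is in the same ballpark as dial-up, which would explain why a sparse MMO scene was playable on a slow line while even 240p video was not.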

3- Adding to (2), HTML5 games run client-side, correct? As in, the bandwidth calculation would be the same, for the same game complexity, whether the "game" runs in HTML5 (in the browser) or in a C++, C#, Java, etc. client? Because they are all client-based systems, right?


(I will reply sooner this time! Thank you all!)

