
Archived

This topic is now archived and is closed to further replies.

leehairy

How can I control a connection timeout on a TCP socket?


Recommended Posts

I have an "isAlive" function that simply connects to an open port on a server via TCP (I don't have the server code, so I can't change anything on that side).

1) If the server is up and accepting connections, I can find out immediately.
2) If the server is down, the timeout on my client connection takes an inordinate amount of time to complete.

My client code simply opens a socket and calls connect(); if that fails, the server is deemed dead. I have tried select(fd, NULL, writefds, NULL, timeval) with the timeval set to 3 seconds, but it makes no difference. The BSD setsockopt() takes a READTIMEVAL parameter, but the one on Windows does not. Can anyone help me or point me in the right direction? Thanks
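
One common reason select() seems to make no difference here is that a blocking connect() stalls before select() is ever reached. Below is a minimal sketch (untested, assuming Winsock 2 and that WSAStartup has already been called) of putting the socket into non-blocking mode first, so select() can enforce the 3-second limit; is_alive(), host_ip and port are placeholder names for illustration, not anything from the original post.

    /* Non-blocking connect() + select() so the caller controls the timeout. */
    #include <winsock2.h>
    #include <string.h>

    int is_alive(const char *host_ip, unsigned short port, long timeout_secs)
    {
        SOCKET s = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
        if (s == INVALID_SOCKET)
            return 0;

        /* Non-blocking mode: connect() returns immediately instead of
           blocking for the stack's default timeout. */
        u_long nonblocking = 1;
        ioctlsocket(s, FIONBIO, &nonblocking);

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family      = AF_INET;
        addr.sin_port        = htons(port);
        addr.sin_addr.s_addr = inet_addr(host_ip);

        int alive = 0;
        if (connect(s, (struct sockaddr *)&addr, sizeof(addr)) == 0) {
            alive = 1;                              /* connected at once */
        } else if (WSAGetLastError() == WSAEWOULDBLOCK) {
            fd_set writefds, exceptfds;
            FD_ZERO(&writefds);  FD_SET(s, &writefds);
            FD_ZERO(&exceptfds); FD_SET(s, &exceptfds);

            struct timeval tv;
            tv.tv_sec  = timeout_secs;              /* e.g. 3 seconds */
            tv.tv_usec = 0;

            /* Writable => connect succeeded; exception set or timeout
               => treat the server as down. */
            if (select(0, NULL, &writefds, &exceptfds, &tv) > 0 &&
                FD_ISSET(s, &writefds))
                alive = 1;
        }

        closesocket(s);
        return alive;
    }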

I'm pretty sure that the timeout flags for setsockopt() were implemented in Winsock 2.0. Check your documentation for the SO_RCVTIMEO and SO_SNDTIMEO flags.
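
For reference, a minimal sketch of how those options are set on Winsock, where the option value is a DWORD in milliseconds (on BSD systems it is a struct timeval); set_io_timeouts is a placeholder name. Note that these options bound recv()/send(), not the initial connect().

    #include <winsock2.h>

    void set_io_timeouts(SOCKET s, DWORD timeout_ms)
    {
        /* recv() calls that exceed timeout_ms fail with WSAETIMEDOUT. */
        setsockopt(s, SOL_SOCKET, SO_RCVTIMEO,
                   (const char *)&timeout_ms, sizeof(timeout_ms));

        /* Likewise for send(). */
        setsockopt(s, SOL_SOCKET, SO_SNDTIMEO,
                   (const char *)&timeout_ms, sizeof(timeout_ms));
    }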
