dmreichard

[.net] C# Asynchronous Sockets


dmreichard    395
Hey all,

I was wondering what the preferred methods are for thread safety when working with the asynchronous socket/stream methods. More specifically, what is the "right" way to block all other threads AFTER the EndRead method call returns, so that a client's OnReceiveData() method doesn't clash with the main server thread and the other client threads?

Thank you for your time,
David

CadetUmfer    234
If you'd provide more detail on the general architecture (how many threads, what their general role is, how they communicate), I can be more help.

What you're probably looking for are [url="http://www.albahari.com/threading/part2.aspx#_Signaling_with_Event_Wait_Handles"]Event Wait Handles[/url]. Anyone working with threads in .NET should read that entire ebook.
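For what it's worth, here's a minimal sketch of the kind of signaling that chapter covers. The class and field names are just illustrative; the only real API used is System.Threading:

```csharp
using System;
using System.Threading;

class SignalDemo
{
    // An AutoResetEvent starts unsignaled; Set() releases exactly one
    // waiting thread, then automatically resets to unsignaled.
    static readonly AutoResetEvent dataReady = new AutoResetEvent(false);
    static string message;

    static void Main()
    {
        var worker = new Thread(() =>
        {
            dataReady.WaitOne();         // block until another thread calls Set()
            Console.WriteLine(message);  // safe: WaitOne/Set establishes ordering
        });
        worker.Start();

        message = "data is ready";
        dataReady.Set();                 // wake the waiting thread
        worker.Join();                   // prints "data is ready"
    }
}
```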

XXChester    1364
I think what you are looking for is the lock keyword in C#. It will give you thread safety, but you have to be careful: careless locking can lead to deadlocks.
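For example, a minimal sketch (the Counter class here is just an illustration):

```csharp
using System;
using System.Threading;

class Counter
{
    // Lock on a private, dedicated object rather than on 'this',
    // so no outside code can accidentally take the same lock.
    private readonly object sync = new object();
    private int count;

    public void Increment()
    {
        lock (sync)   // only one thread at a time may be inside this block
        {
            count++;
        }
    }

    public int Count
    {
        get { lock (sync) { return count; } }
    }
}

class Program
{
    static void Main()
    {
        var counter = new Counter();
        var threads = new Thread[4];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(() =>
            {
                for (int j = 0; j < 100000; j++)
                    counter.Increment();
            });
            threads[i].Start();
        }
        foreach (var t in threads) t.Join();

        // With the lock, the total is exact; without it, increments get lost.
        Console.WriteLine(counter.Count); // prints 400000
    }
}
```

The classic deadlock trap is two threads taking two locks in opposite orders; if you ever need more than one lock, acquire them in a consistent order everywhere.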

dmreichard    395
[quote name='typedef struct' timestamp='1307380151' post='4820171']
If you'd provide more detail on the general architecture (how many threads, what their general role is, how they communicate), I can be more help.

What you're probably looking for are [url="http://www.albahari.com/threading/part2.aspx#_Signaling_with_Event_Wait_Handles"]Event Wait Handles[/url]. Anyone working with threads in .NET should read that entire ebook.
[/quote]

Absolutely!

Basically, the main server thread sits in a loop that first checks for any pending connections; if there are any, it accepts each one as a TcpClient and passes it into the constructor of a custom class named "Client".
Client contains an OnRead(IAsyncResult ar) method which is passed as the callback delegate to the TcpClient's network stream's BeginRead() method.

OnRead() calls the stream's BeginRead() method at the end to continuously grab new incoming data. Effectively, there is one server thread plus however many clients, each of which has a thread receiving data via the NetworkStream's BeginRead() method.

Now, assuming I'm doing this correctly up to this point: how would I ensure that when data IS available (upon EndRead() returning), the client's OnRead() method blocks all other threads UNTIL it has finished processing the incoming packet? This method can and will modify data in the global model that the main server thread also has access to.
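To make the setup above concrete, the receive loop being described looks roughly like this (a sketch only; the buffer size, field names, and the packet-processing step are placeholders):

```csharp
using System;
using System.Net.Sockets;

class Client
{
    private readonly NetworkStream stream;
    private readonly byte[] buffer = new byte[4096];

    public Client(TcpClient tcp)
    {
        stream = tcp.GetStream();
        // Kick off the first asynchronous read; OnRead is the callback.
        stream.BeginRead(buffer, 0, buffer.Length, OnRead, null);
    }

    private void OnRead(IAsyncResult ar)
    {
        int bytesRead = stream.EndRead(ar);  // runs on a thread-pool thread
        if (bytesRead == 0)
            return;                          // remote end closed the connection

        // ... turn buffer[0..bytesRead) into a packet and hand it off ...

        // Queue up the next read so data keeps flowing.
        stream.BeginRead(buffer, 0, buffer.Length, OnRead, null);
    }
}
```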

Hope this helps, and thank you both for your help thus far!

CadetUmfer    234
Ok, that's what I thought. What you can do here is buffer these packets.

Let's say that your main loop is in a class called Server, and that each Client has a reference to it. Then, in your OnRead method, after you've processed the packet, you would add it to a queue to be handled by the server.

Client
[code]void OnRead(IAsyncResult r) {
    Packet p = // process packet
    server.Receive(p);
    // read again
}[/code]

Server
[code]public void Receive(Packet p) {
    lock (packets) {
        packets.Enqueue(p);
    }
}

void YourMainLoop() {
    lock (packets) {
        while (packets.Count > 0) {
            Packet p = packets.Dequeue();
            // handle packet
        }
    }
    // your code
}

Queue<Packet> packets = new Queue<Packet>();[/code]

The lock (packets) statements ensure that only 1 thread at a time can Enqueue or Dequeue.

dmreichard    395
[quote name='typedef struct' timestamp='1307399703' post='4820278']
Ok, that's what I thought. What you can do here is buffer these packets. [...]
[/quote]

This is exactly what I was looking for. Thank you very much!

CadetUmfer    234
[quote name='dmreichard' timestamp='1307400172' post='4820280']
This is exactly what I was looking for. Thank you very much!
[/quote]

No problem. The important thing with locks is to have as few of them as possible, and to hold them for as short a time as possible. So where I said "handle packet" in the code above, you really don't want to be doing any heavy lifting while the lock is held. To get around that, just add another layer of caching. [i]"All programming is an exercise in caching." -- [/i]Terje Mathisen

[code]void YourMainLoop() {
    Queue<Packet> newPackets;
    lock (packets) {
        // Queue<T> has no Clone(); copy-construct a new queue instead
        newPackets = new Queue<Packet>(packets);
        packets.Clear();
    }
    while (newPackets.Count > 0) {
        Packet p = newPackets.Dequeue();
        // handle packet
    }
    // your code
}[/code]

So you add a slight overhead in that you're creating a copy of the queue and handling that, but you're also holding the lock for as little time as possible, which is far more important when multithreading.
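For what it's worth, a common variation on that copy is to swap in a fresh queue instead of copying the old one, which avoids touching the elements at all. Note that this version locks on a separate sync object rather than on the queue itself, since the packets reference gets replaced (again, just a sketch with made-up names, and a stub Packet type):

```csharp
using System;
using System.Collections.Generic;

class Packet { }  // stand-in for whatever your packet type is

class Server
{
    // Lock on a dedicated object, not on the queue: the 'packets'
    // reference is replaced below, but the lock object never changes.
    private readonly object sync = new object();
    private Queue<Packet> packets = new Queue<Packet>();

    public void Receive(Packet p)
    {
        lock (sync) { packets.Enqueue(p); }
    }

    public void YourMainLoop()
    {
        Queue<Packet> drained;
        lock (sync)
        {
            drained = packets;               // take the whole queue...
            packets = new Queue<Packet>();   // ...and hand writers a fresh one
        }
        // Process outside the lock, so receiving threads are never
        // blocked by packet handling.
        while (drained.Count > 0)
        {
            Packet p = drained.Dequeue();
            // handle packet
        }
        // your code
    }
}
```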

dmreichard    395
[quote name='typedef struct' timestamp='1307454745' post='4820508']
No problem. The important thing with locks is to have as few of them as possible, and to hold them for as short a time as possible. [...]
[/quote]



Thank you for the additional comment; that is very helpful, as I have little experience managing multiple threads at the moment. I was always a select() kind of guy. :P

