Well, the thing is, I know it is not reasonable to call EndRead right after BeginRead, since I will have to wait. But I have been trying to track down the memory leak in my rather large application: when I reduced the application to only reading data, it still leaked, but when I reduced it further so that it makes no asynchronous call at all except accepting connections, it does not leak. With only accepting connections plus the three lines of code I posted, it leaks; without them, it does not.
This is how I accept connections:
while (m_tcpListener.Pending()) // accept new requests
{
    CClient cl = new CClient();
    cl.m_pConnectionHandle = m_tcpListener.BeginAcceptTcpClient(null, null);
}
and then in my manager thread I finish the accept attempts like this:
fclient.m_bConnected = true;
fclient.m_pClient = m_tcpListener.EndAcceptTcpClient(fclient.m_pConnectionHandle);
fclient.m_pStream = fclient.m_pClient.GetStream();
So far, this works without losing memory over time, even when many connections are fired at my server.
But, if for each such connected client I add this single operation before I release it:
byte[] buff = new byte[4000];
client.m_pRHandle = client.m_pStream.BeginRead(buff, 0, buff.Length, null, null);
int readBytes = client.m_pStream.EndRead(client.m_pRHandle);
memory grows permanently. I keep no reference to buff or anything else; these really are the only lines I add.
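For reference, the usual non-blocking pattern is to pass a callback to BeginRead and call EndRead inside that callback, rather than calling EndRead immediately on the same thread. A minimal sketch, reusing the field names from my post (CClient, m_pStream; the StartRead helper is just for illustration):

```csharp
// Sketch, not tested in my server: EndRead is called from the completion
// callback, so the calling thread never blocks waiting for data.
private void StartRead(CClient client)
{
    byte[] buff = new byte[4000];
    client.m_pStream.BeginRead(buff, 0, buff.Length, ar =>
    {
        // Completes this read; returns the number of bytes received.
        int readBytes = client.m_pStream.EndRead(ar);
        if (readBytes > 0)
        {
            // ... process buff[0..readBytes) here ...
            StartRead(client); // issue the next read
        }
    }, null);
}
```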
As Apochpiq has mentioned, I believe I really am leaking memory: I ran the server for an hour and it went from 32 MB to around 128 MB. Can the GC really behave like this?
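One way to tell uncollected garbage apart from a true leak is to force a full collection and compare heap sizes; if the number drops back down, the growth was just garbage the GC had not bothered to collect yet. A small diagnostic sketch (System namespace only):

```csharp
// Diagnostic sketch: compare managed heap size before and after a forced
// full collection. A real leak will NOT shrink noticeably here.
long before = GC.GetTotalMemory(false);
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect(); // second pass picks up objects freed by finalizers
long after = GC.GetTotalMemory(true);
Console.WriteLine($"Heap before: {before} bytes, after full GC: {after} bytes");
```

Note that Task Manager shows the process working set, which the .NET runtime does not necessarily return to the OS even after a collection, so GC.GetTotalMemory is a more honest measure of the managed heap.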
Hi, thanks so far.