This is a very strange problem I recently stumbled upon, and I have neither an explanation nor a remedy for it. The situation is the following:
The main thread runs the UI and the application logic (the toolkit has its own Display connection).
The render thread is purely OpenGL (its own contexts, its own Display connection).
With one window, rendering works fine and no memory is leaked.
With two windows, each render pass makes each window current with glXMakeCurrent, renders it, and later swaps its buffers. So each frame looks basically like this:
glXMakeCurrent(display, window1, context1)
render window 1
glXMakeCurrent(display, window2, context2)
render window 2
wait for the data needed to render the next frame
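In real code the loop looks roughly like the sketch below. This is a condensed illustration, not my actual source; renderDisplay, window1/window2, ctx1/ctx2, renderScene1/renderScene2 and waitForFrameData are placeholder names:

    /* Per-frame loop in the render thread. Each window has its own
     * GLX context; renderDisplay is the thread's private X connection. */
    for (;;) {
        glXMakeCurrent(renderDisplay, window1, ctx1);
        renderScene1();                     /* GL calls for window 1 */
        glXSwapBuffers(renderDisplay, window1);

        glXMakeCurrent(renderDisplay, window2, ctx2);
        renderScene2();                     /* GL calls for window 2 */
        glXSwapBuffers(renderDisplay, window2);

        waitForFrameData();                 /* block until the next frame's data arrives */
    }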
The interesting thing is that, while everything runs fine, the application slowly leaks memory. If I skip the glXMakeCurrent calls, the leak goes away; as soon as I call glXMakeCurrent again, the leak is back. How fast it leaks differs from computer to computer.
Any idea what could be wrong here? OpenGL runs entirely in the render thread; the main thread has no connection to OpenGL at all, only the UI toolkit, so to speak. Also, the render thread has its own Display connection, so the two threads never share an X connection.
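For completeness, the threading setup follows the standard Xlib pattern: XInitThreads() as the very first Xlib call, then one XOpenDisplay() per thread. A minimal sketch with placeholder names (the real code creates the windows and GLX contexts where the comment sits):

    #include <pthread.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    /* Render thread: opens a private X connection that no other
     * thread ever touches; all GLX calls go through it. */
    static void *renderThreadMain(void *arg) {
        (void)arg;
        Display *renderDisplay = XOpenDisplay(NULL);
        /* ... create windows and GLX contexts, run the frame loop ... */
        XCloseDisplay(renderDisplay);
        return NULL;
    }

    int main(void) {
        XInitThreads();   /* must precede every other Xlib call */

        pthread_t renderThread;
        pthread_create(&renderThread, NULL, renderThreadMain, NULL);

        /* ... main thread runs the UI toolkit on its own connection ... */

        pthread_join(renderThread, NULL);
        return 0;
    }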
Ideas are welcome, since I'm out of them right now.
EDIT: Note that I also tried calling only glXMakeCurrent, with no rendering between the calls, and the leak stays the same. So rendering itself is not the culprit.
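In other words, the reduced loop below, with no GL calls at all between the switches, still leaks (same placeholder names as above):

    /* Stripped-down loop that still reproduces the leak: no drawing,
     * no buffer swaps, just switching the current context every frame. */
    for (;;) {
        glXMakeCurrent(renderDisplay, window1, ctx1);
        glXMakeCurrent(renderDisplay, window2, ctx2);
        waitForFrameData();
    }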