I just don't get it... I've created a fairly efficient way of rendering thousands of faces, and even without updating the instance buffer, the framerate is still incredibly erratic! What really bugs me is that Java is supposedly a slower language than C#, yet Java can render faster... Is it possible that Java using OpenGL is faster at rendering than XNA is?
Java and C# are about equally fast. Some benchmarks give Java a slight edge and some give it to C#; if you have better data, you can point me to it. Minecraft uses LWJGL, which is a straight binding to OpenGL, so there is no overhead other than a small JNI cost. XNA, on the other hand, has a lot more stuff sitting between the GPU and your code.
But ultimately this has nothing to do with the language. It's all about data structures.
Only render what you have to, with the minimal amount of data. One example: you waste a lot of data using floats as vertex colors. Unsigned bytes are enough. That shrinks the color from 3*4 bytes of floats down to 3-4 bytes per vertex.
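Here's a minimal sketch of such an interleaved layout against LWJGL 3's bindings (the attribute locations and field order are my assumptions, not anything from your code):

```java
import static org.lwjgl.opengl.GL11.GL_FLOAT;
import static org.lwjgl.opengl.GL11.GL_UNSIGNED_BYTE;
import static org.lwjgl.opengl.GL20.glEnableVertexAttribArray;
import static org.lwjgl.opengl.GL20.glVertexAttribPointer;

public final class CompactVertexLayout {
    // position: 3 floats (12 bytes) + color: 4 normalized unsigned bytes (4 bytes)
    // = 16 bytes per vertex instead of 28 with float RGBA colors.
    static final int STRIDE = 3 * 4 + 4;

    /** Call once with the VBO bound; locations 0 and 1 are assumed. */
    static void apply() {
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, false, STRIDE, 0L);

        glEnableVertexAttribArray(1);
        // normalized = true: the shader reads the bytes as floats in 0.0..1.0
        glVertexAttribPointer(1, 4, GL_UNSIGNED_BYTE, true, STRIDE, 3L * 4L);
    }
}
```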
Also, if you don't have to support many textures, you can replace the UV coordinates with an unsigned byte that you use as an index. Then use that index with a uniform vec2 array to look up the right texture coordinates. This works for 256 unique texture coordinates; if that's not enough, there's always unsigned short, but remember that uniform buffer size is limited to some GPU-dependent value.
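Roughly like this, again sketched with LWJGL 3 (the names texIndex and uTexCoords are made up for illustration, and the projection is omitted for brevity):

```java
import static org.lwjgl.opengl.GL20.glGetUniformLocation;
import static org.lwjgl.opengl.GL20.glUniform2fv;
import static org.lwjgl.opengl.GL20.glUseProgram;

public final class TexCoordLookup {
    // Vertex shader sketch: the per-vertex byte selects one of up to 256
    // texture coordinates from a uniform array.
    static final String VERTEX_SHADER =
        "#version 330 core\n"
        + "layout(location = 0) in vec3 position;\n"
        + "layout(location = 2) in float texIndex; // uploaded as GL_UNSIGNED_BYTE\n"
        + "uniform vec2 uTexCoords[256];\n"
        + "out vec2 vUv;\n"
        + "void main() {\n"
        + "    vUv = uTexCoords[int(texIndex)];\n"
        + "    gl_Position = vec4(position, 1.0); // projection omitted\n"
        + "}\n";

    /** Upload the lookup table once after linking; coords holds 256 x,y pairs. */
    static void uploadTable(int program, float[] coords) {
        glUseProgram(program);
        glUniform2fv(glGetUniformLocation(program, "uTexCoords"), coords);
    }
}
```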