(void*)(offsetof(struct TFontVertex, v)));
Anyway, you could change
glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, sizeof(VertexFormat), (void*)(offsetof(VertexFormat, VertexFormat::color)));
to
glVertexAttribPointer(1, 4, GL_FLOAT, GL_FALSE, sizeof(VertexFormat), (void*)(offsetof(struct VertexFormat, color)));
and use
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(VertexFormat), (void*)(offsetof(struct VertexFormat, position)));
But don't enable the attribute arrays (glEnableVertexAttribArray) until after you've specified these pointers.
glGenVertexArrays does not generate GL_INVALID_ENUM, so the problem is probably somewhere else (before that line of code).
Whenever you call glGetError it returns the oldest recorded error and clears that error flag, so if you call glGetError right before glGenVertexArrays you should get the same GL_INVALID_ENUM.
So: load all your shaders and check with glGetError, then create the VBO and check glGetError again. If it's still GL_INVALID_ENUM after glGenVertexArrays, something is wrong elsewhere; that call shouldn't raise it.
By the way, which version of OpenGL does your card support? Maybe you should try deleting the #version directive in your shaders.
Like I wrote before: did you make these vbo and vao GLuints global?
If yes, I could talk about that.
Since I'm on an old notebook I can't test this, so I'm guessing.
std::vector<VertexFormat> vertices;
vertices.push_back(VertexFormat(glm::vec3(0.25, -0.25, 0.0),
                                glm::vec4(1, 0, 0, 1)));
vertices.push_back(VertexFormat(glm::vec3(-0.25, -0.25, 0.0),
                                glm::vec4(0, 1, 0, 1)));
vertices.push_back(VertexFormat(glm::vec3(0.25, 0.25, 0.0),
                                glm::vec4(0, 0, 1, 1)));
That looks like how you build your array; check it in the debugger and see if the vector really contains these values.
glGenBuffers(1,&vbo);
glBindBuffer(GL_ARRAY_BUFFER,vbo);
glBufferData(GL_ARRAY_BUFFER,sizeof(VertexFormat) * 3, &vertices[0],GL_STATIC_DRAW);
About &vertices[0]: I wasn't sure at first, but it is correct (vertices.data() is equivalent). Don't use &vertices though, since that's the address of the vector object itself, not of its element data.
You could also make that vector global, to rule out a scope/lifetime problem with the data before it reaches the GPU.
I reckon I had problems with vectors in GL buffers too, and I switched back to raw pointers.
You could post the whole project code; that could help.