
og knuckles

Member
  • Content Count

    12
  • Joined

  • Last visited

Community Reputation

125 Neutral

About og knuckles

  • Rank
    Member
  1. og knuckles

    WinAPI Keyboard Input

    It does output a non-NULL handle, and GetConsoleWindow() also works for getting the handle to the window. Currently it is a command-line program. I have thought about going the input-buffer route; I am just trying to avoid looping constantly and want something more like an event-driven listener, so it only runs when there is input.
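    For reference, a minimal sketch of that input-buffer route: ReadConsoleInput blocks until the console has at least one event, so there is no busy loop, and each key press arrives as its own KEY_EVENT record. This is only an illustration of the idea, not code from the thread.

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <stdio.h>

    int main()
    {
        HANDLE in = GetStdHandle(STD_INPUT_HANDLE);   // the console input buffer

        INPUT_RECORD records[32];
        DWORD count;
        for (;;)
        {
            // Blocks until at least one input event is available.
            if (!ReadConsoleInput(in, records, 32, &count))
                break;

            for (DWORD i = 0; i < count; ++i)
            {
                if (records[i].EventType == KEY_EVENT &&
                    records[i].Event.KeyEvent.bKeyDown)
                {
                    printf("KEYDOWN: virtual key %u\n",
                           (unsigned)records[i].Event.KeyEvent.wVirtualKeyCode);
                }
            }
        }
        return 0;
    }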
  2. og knuckles

    WinAPI Keyboard Input

    Currently I am working on an event-driven keyboard input system with the Windows API, without much success. I am not fluent in the Win API, so I am sure I am either not using it correctly or I am missing something, possibly both.

    #define _WIN32_WINNT 0x0500
    #include "windows.h"

    int main()
    {
        MSG msg;
        //HWND consoleWindow = GetForegroundWindow();
        while (true)
        {
            if (PeekMessage(&msg, NULL, WM_KEYFIRST, WM_KEYLAST, PM_REMOVE))
            {
                TranslateMessage(&msg);
                DispatchMessage(&msg);
            }
        }
        return 0;
    }

    Obviously while (true) is just for debugging purposes. What I am trying to do is pull the application's keyboard messages, from the first key message to the last. However, this code never gets past the PeekMessage if statement. I have tried passing the consoleWindow handle as well, and still nothing.

    - This is a command-line window.
    - I am trying to get every key that is being pressed, not just the last one, so if I press 10 keys at once I need to be able to see 10 instances of KEYDOWN.
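    For contrast with the console setup described above, here is a minimal sketch of the route Windows normally uses to deliver keyboard messages: a window created by the calling thread plus a window procedure. The class name "KeyListener" and the printf reporting are invented for the example; a console window is created by the system rather than by this thread, which is why its key messages do not show up in this kind of loop.

    #define WIN32_LEAN_AND_MEAN
    #include <windows.h>
    #include <stdio.h>

    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_KEYDOWN)                        // one message per pressed key
            printf("KEYDOWN: virtual key %u\n", (unsigned)wParam);
        if (msg == WM_DESTROY)
            PostQuitMessage(0);
        return DefWindowProcA(hwnd, msg, wParam, lParam);
    }

    int main()
    {
        WNDCLASSA wc = {};
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = GetModuleHandleA(NULL);
        wc.lpszClassName = "KeyListener";             // hypothetical class name
        RegisterClassA(&wc);

        // The window must exist (and have focus) for key messages to arrive.
        CreateWindowA("KeyListener", "keys", WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                      0, 0, 200, 100, NULL, NULL, wc.hInstance, NULL);

        MSG msg;
        while (GetMessageA(&msg, NULL, 0, 0) > 0)     // blocks until a message arrives
        {
            TranslateMessage(&msg);
            DispatchMessageA(&msg);
        }
        return 0;
    }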
  3. og knuckles

    C++ engine hosting LUA scripts

    OK, perfect. I have been working with LuaJIT and it seems to be exactly what I want. Now I just need to write an interface for rendering. :P Thanks, guys.
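    As a rough idea of what such an interface could look like, here is a minimal sketch using the Lua C API (which LuaJIT exposes as well): a C++ function is registered under a global name that scripts can call. The function name set_specular and the script name scene.lua are invented for the example, not taken from the thread.

    extern "C" {
    #include <lua.h>
    #include <lualib.h>
    #include <lauxlib.h>
    }
    #include <cstdio>

    // Called from Lua as set_specular(value); forwards the number to the engine.
    static int set_specular(lua_State *L)
    {
        double value = luaL_checknumber(L, 1);
        std::printf("engine: specular = %f\n", value);   // hand off to the renderer here
        return 0;                                        // no values returned to Lua
    }

    int main()
    {
        lua_State *L = luaL_newstate();
        luaL_openlibs(L);                                // standard libraries

        lua_pushcfunction(L, set_specular);
        lua_setglobal(L, "set_specular");                // expose it to scripts

        if (luaL_dofile(L, "scene.lua") != 0)            // run the script
            std::printf("lua error: %s\n", lua_tostring(L, -1));

        lua_close(L);
        return 0;
    }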
  4. og knuckles

    C++ engine hosting LUA scripts

    Thank you for all your replies. Should I use LuaJIT?
  5. og knuckles

    C++ engine hosting LUA scripts

    Yeah, I can agree with that. However, I view this as more of a utility bar without the limitations or the need to program one: like a bar you can slide back and forth in game to change specular values, except with Grimrock's implementation there are no limitations and it can be used for a lot more. I am willing to do just as they did, with only the audio and render components in C++.
  6. og knuckles

    C++ engine hosting LUA scripts

    Yeah, overwrite that file and read it again. The developer of Grimrock has a page stating he was able to do this at run time, i.e. changing the AI for monsters right in front of him in game. To me this is huge; I would love to be able to do this rather than constantly recompiling the whole program.

    http://www.grimrock.net/2012/07/25/making-of-grimrock-rapid-programming/
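    A minimal sketch of that "overwrite the file and read it again" idea, assuming scripts are plain files driven through the Lua C API: poll the script's modification time each frame and re-run it when the file changes. The file name monster_ai.lua and the helper names are invented for the example.

    extern "C" {
    #include <lua.h>
    #include <lualib.h>
    #include <lauxlib.h>
    }
    #include <sys/stat.h>
    #include <cstdio>
    #include <ctime>

    // Returns the file's last-modified time, or 0 if it cannot be read.
    static std::time_t modificationTime(const char *path)
    {
        struct stat st;
        return (stat(path, &st) == 0) ? st.st_mtime : 0;
    }

    // Call once per frame: re-runs the script whenever it has been overwritten.
    static void reloadIfChanged(lua_State *L, const char *path, std::time_t &lastSeen)
    {
        std::time_t now = modificationTime(path);
        if (now != 0 && now != lastSeen)
        {
            lastSeen = now;
            if (luaL_dofile(L, path) != 0)               // re-run the edited script
                std::printf("reload failed: %s\n", lua_tostring(L, -1));
        }
    }

    // Usage inside the game loop: reloadIfChanged(L, "monster_ai.lua", lastSeen);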
  7. Hello,

     So as the title states, I'd like to be able to run Lua scripts in my C++ engine. I currently have a C++ OpenGL rendering engine with enough graphical effects to start making games. I have looked at embedding Lua, and it does look fairly simple; however, there are a few things I am unclear about. How do I make an editor for Lua? I understand that once Lua is embedded, I only have to run my .exe and then edit the Lua scripts to modify what the game is doing in real time. How am I able to make something that can edit the script while the .exe is running and then say, OK, send the new script to the engine? If you could provide resources, tutorials, or a direction for me to head in, that would be great! Thanks for your time.
  8. og knuckles

    issues with binary files to VBO

    Guys, I am such an idiot. I wrote a quick camera class, hacked the view matrix to make it do what I wanted, and changed a scale value that, when multiplied by a translation matrix, sends the w element, data[15], right out of whack. -.- I am sorry for wasting people's time. On the plus side, I am not crazy, lol; I went through so much data and everything was working perfectly.
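    For context on why a stray value in the w element wrecks things: a homogeneous position (x, y, z, w) stands for the 3D point (x/w, y/w, z/w), and the GPU performs exactly that divide on clip-space coordinates, so a w that drifts away from its intended value rescales every vertex. The numbers below are purely illustrative.

    #include <cstdio>

    int main()
    {
        // Illustrative only: w accidentally ends up 0.01 instead of 1.0.
        float clip[4] = { 1.0f, 2.0f, 5.0f, 0.01f };
        float ndc[3]  = { clip[0] / clip[3],    // = 100
                          clip[1] / clip[3],    // = 200
                          clip[2] / clip[3] };  // = 500 -> far outside the [-1, 1] clip volume
        std::printf("ndc = (%f, %f, %f)\n", ndc[0], ndc[1], ndc[2]);
        return 0;
    }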
  9. og knuckles

    issues with binary files to VBO

    The shaders are correct; they have a built-in error checking system. :/ Maybe it is my drawing. I really am at a loss right now.
  10. og knuckles

    issues with binary files to VBO

    So I wrote a little code in the engine to cast the bytes back to float, and the data looks perfect. :/ Here is the code:

    for (size_t i = 0; i + 3 < buffer.size(); i += 4)
    {
        // Reinterpret each group of four bytes as one float.
        char tempArray[4] = { buffer[i], buffer[i + 1], buffer[i + 2], buffer[i + 3] };
        float f;
        memcpy(&f, tempArray, sizeof(f));
        packedData.push_back(f);
    }
  11. og knuckles

    issues with binary files to VBO

    I took the char * data, wrote it to a file, and compared the two files; they are identical, so I believe the file is being read in correctly.

    glGetError() returns 0 after every VBO and VAO GL call.

    I have rendered things to the screen before, so I know my pipeline is at least somewhat correct.

    My render code is:

    void renderManager::render()
    {
        SHADER_MANAGER->sendToShader("frustrum", CAMERA->perspectiveMatrix);
        SHADER_MANAGER->sendToShader("view", CAMERA->viewMatrix);
        for (unsigned int i = 0; i < renderingBuffer.size(); i++)
        {
            SHADER_MANAGER->sendToShader("model", renderingBuffer[i]->PositionMatrix);
            glBindVertexArray(renderingBuffer[i]->m_VAO);
            glDrawArrays(GL_TRIANGLES, 0, renderingBuffer[i]->m_BufferSize);
        }
        glBindVertexArray(0);   // unbind the VAO (the argument is a GLuint, so 0 rather than NULL)
    }

    My shaders and program are already linked and bound. In this snippet I am passing my view, projection, and model matrices to the shader. After that I bind the object's VAO and call draw, then unbind the VAO.

    As for the hex editor, would you be able to recommend one? I have Notepad++ but I am not sure whether it has a hex editor plugin. Thanks for your time.

    **So I found the hex editor plugin for Notepad++; however, when I selected the file and ran it, it reported a bad allocation. The data was originally floats that I wrote out through a union of a float and a char[4], so it is a little odd that it would be corrupted. :/ Here is the file I am trying to render; it is just a small triangulated cube (raw binary pasted as text):

    ÉÞMÁÉÞMÁÉÞMA €? À> €?ÉÞMAÉÞMÁÉÞMA €? ? €?ÉÞMÁÉÞMAÉÞMA €? À> €> €?ÉÞMÁÉÞMAÉÞMA €? À> €> €?ÉÞMAÉÞMÁÉÞMA €? ? €?ÉÞMAÉÞMAÉÞMA €? ? €> €?ÉÞMÁÉÞMAÉÞMA €? À> €> €? ÉÞMAÉÞMAÉÞMA €? ? €> €? ÉÞMÁÉÞMAÉÞMÁ €? À> ? €? ÉÞMÁÉÞMAÉÞMÁ €? À> ? €? ÉÞMAÉÞMAÉÞMA €? ? €> €? ÉÞMAÉÞMAÉÞMÁ €? ? ? €? ÉÞMÁÉÞMAÉÞMÁ €? À> ? €¿ÉÞMAÉÞMAÉÞMÁ €? ? ? €¿ÉÞMÁÉÞMÁÉÞMÁ €? À> @? €¿ÉÞMÁÉÞMÁÉÞMÁ €? À> @? €¿ÉÞMAÉÞMAÉÞMÁ €? ? ? €¿ÉÞMAÉÞMÁÉÞMÁ €? ? @? €¿ÉÞMÁÉÞMÁÉÞMÁ €? À> @? €¿ ÉÞMAÉÞMÁÉÞMÁ €? ? @? €¿ ÉÞMÁÉÞMÁÉÞMA €? À> €? €¿ ÉÞMÁÉÞMÁÉÞMA €? À> €? €¿ ÉÞMAÉÞMÁÉÞMÁ €? ? @? €¿ ÉÞMAÉÞMÁÉÞMA €? ? €? €¿ ÉÞMAÉÞMÁÉÞMA €? ? €? ÉÞMAÉÞMÁÉÞMÁ €? `? €? ÉÞMAÉÞMAÉÞMA €? ? €> €? ÉÞMAÉÞMAÉÞMA €? ? €> €? ÉÞMAÉÞMÁÉÞMÁ €? `? €? ÉÞMAÉÞMAÉÞMÁ €? `? €> €? ÉÞMÁÉÞMÁÉÞMÁ €? > €¿ ÉÞMÁÉÞMÁÉÞMA €? À> €¿ ÉÞMÁÉÞMAÉÞMÁ €? > €> €¿ ÉÞMÁÉÞMAÉÞMÁ €? > €> €¿ ÉÞMÁÉÞMÁÉÞMA €? À> €¿ ÉÞMÁÉÞMAÉÞMA €? À> €> €¿
  12. So I have been struggling with this problem for a while and have tried many different ways of solving it. Currently I am loading a binary file that contains raw bytes and sending it to OpenGL, but nothing renders.

      This is how I have been creating the binary file:

      binaryFloat x, y, z, w, u, v, normx, normy, normz;
      x.f = PD[i].x;
      y.f = PD[i].y;
      z.f = PD[i].z;
      w.f = PD[i].w;
      u.f = PD[i]._U;
      v.f = PD[i]._V;
      normx.f = PD[i].norm.X;
      normy.f = PD[i].norm.Y;
      normz.f = PD[i].norm.Z;
      fout << x.container[0] << x.container[1] << x.container[2] << x.container[3]
           << y.container[0] << y.container[1] << y.container[2] << y.container[3]
           << z.container[0] << z.container[1] << z.container[2] << z.container[3]
           << w.container[0] << w.container[1] << w.container[2] << w.container[3]
           << u.container[0] << u.container[1] << u.container[2] << u.container[3]
           << v.container[0] << v.container[1] << v.container[2] << v.container[3]
           << normx.container[0] << normx.container[1] << normx.container[2] << normx.container[3]
           << normy.container[0] << normy.container[1] << normy.container[2] << normy.container[3]
           << normz.container[0] << normz.container[1] << normz.container[2] << normz.container[3];

      binaryFloat is a union containing a float and a char[sizeof(float)]. PD is a packedData vector that contains pre-parsed data.

      Here is where I load it up in my engine:

      std::ifstream objFile(filename, std::ios::binary);
      if (objFile.is_open())
      {
          unsigned int length;
          char *data;
          objFile.seekg(0, std::ios::end);
          length = objFile.tellg();
          data = new char[length];
          objFile.seekg(0, std::ios::beg);
          objFile.read(data, length);
          objFile.close();

          e->m_BufferSize = length / (sizeof(float) * 9);   // vertex count: 9 floats per vertex

          glGenVertexArrays(1, &e->m_VAO);
          glBindVertexArray(e->m_VAO);
          glGenBuffers(1, &e->m_Buffer);
          glBindBuffer(GL_ARRAY_BUFFER, e->m_Buffer);
          glBufferData(GL_ARRAY_BUFFER, length, data, GL_STATIC_DRAW);
          glEnableVertexAttribArray(0);
          glEnableVertexAttribArray(1);
          glEnableVertexAttribArray(2);
          glVertexAttribPointer(0, 4, GL_FLOAT, GL_FALSE, sizeof(float) * 9, BUFFER_OFFSET(0));
          glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, sizeof(float) * 9, BUFFER_OFFSET(sizeof(float) * 4));
          glVertexAttribPointer(2, 3, GL_FLOAT, GL_FALSE, sizeof(float) * 9, BUFFER_OFFSET(sizeof(float) * 6));
          glBindVertexArray(0);

          delete[] data;   // glBufferData has already copied the data
      }

      In this snippet I am loading the file, sending it to OpenGL, and setting the VBO attribute layout in the VAO. When I run this code, data does contain the file, so I am not sure why it is not rendering.

      Thanks for your help and insight in advance. :)
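      As an aside on the file handling described above, here is a minimal sketch of a simpler route that avoids the per-byte union writes: write the packed floats with ostream::write and read them straight back into a std::vector<float>. The file name cube.bin and the function names are invented; the 9-floats-per-vertex layout follows the post, and everything else is an assumption rather than the engine's actual code.

      #include <cstddef>
      #include <fstream>
      #include <vector>

      // Writes the packed vertex floats as raw bytes.
      bool writeVertices(const char *path, const std::vector<float> &packed)
      {
          std::ofstream out(path, std::ios::binary);
          if (!out) return false;
          out.write(reinterpret_cast<const char *>(packed.data()),
                    packed.size() * sizeof(float));
          return bool(out);
      }

      // Reads the whole file back into a float vector.
      bool readVertices(const char *path, std::vector<float> &packed)
      {
          std::ifstream in(path, std::ios::binary | std::ios::ate);
          if (!in) return false;
          std::streamsize length = in.tellg();
          in.seekg(0, std::ios::beg);
          packed.resize(static_cast<std::size_t>(length) / sizeof(float));
          return bool(in.read(reinterpret_cast<char *>(packed.data()), length));
      }

      // Usage sketch: after readVertices("cube.bin", packed), upload with
      // glBufferData(GL_ARRAY_BUFFER, packed.size() * sizeof(float), packed.data(), GL_STATIC_DRAW)
      // and draw packed.size() / 9 vertices for the 4 + 2 + 3 float layout above.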