stdlib

Member

  • Content Count: 14
  • Joined
  • Last visited

Community Reputation: 131 Neutral

About stdlib

  • Rank: Member
  1. stdlib

    Repeating sprites?

      Could you elaborate on how they would simplify a fence background that is already a single 32x32 graphic, as simplified as it can get, and is repeated across the screen by a loop? Thanks!
  2. stdlib

    Repeating sprites?

      https://en.wikipedia.org/wiki/Parallax_scrolling

      Some background layers are composed of repeating parts. Tiles are not necessary for this.
  3. stdlib

    Repeating sprites?

    Thanks for the suggestions, everyone! Having worked with OpenGL a lot before, I had thought the same thing. However, I'm using the hardware-accelerated 2D rendering built into SDL 2.0, so as far as I know none of those techniques are available to me.

    What I have found is that rendering the repeated sprites to a texture once, and then rendering that texture to the screen, seems to be faster. Since it works I can't complain, but now I'm wondering why. Is it because everything stays on the GPU, with nothing re-submitted to the renderer/window until it is actually needed?
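    For anyone curious, here is a rough sketch of what I mean by rendering to a texture first (untested; the pixel format, sizes, and names are placeholders, error checking is omitted, and the renderer needs the SDL_RENDERER_TARGETTEXTURE flag):

        #include <SDL2/SDL.h>

        /* Tile the fence sprite into an offscreen texture once, then reuse it. */
        SDL_Texture *BuildBackground(SDL_Renderer *renderer, SDL_Texture *fence,
                                     int screen_w, int screen_h)
        {
            SDL_Texture *bg = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888,
                                                SDL_TEXTUREACCESS_TARGET,
                                                screen_w, screen_h);

            SDL_SetRenderTarget(renderer, bg);      /* draw into the texture       */
            SDL_Rect dst = { 0, screen_h - 32, 32, 32 };
            while (dst.x < screen_w) {              /* same tiling loop, done once */
                SDL_RenderCopy(renderer, fence, NULL, &dst);
                dst.x += dst.w;
            }
            SDL_SetRenderTarget(renderer, NULL);    /* back to the window          */
            return bg;
        }

    After that, each frame is a single SDL_RenderCopy(renderer, bg, NULL, NULL) instead of the whole loop.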
  4. I'm using SDL 2.0 to write a game engine, and in many places sprites repeat, especially in backgrounds (for example, one fence sprite repeated all the way across the screen, as opposed to one long graphic rendered a single time). I do this by calling the SDL 2.0 rendering functions in a loop. Here is some pseudocode:

         // Render the source sprite at dest repeatedly until the whole width is covered.
         while (dest.w > 0) {
             Render(sprite, &dest, &clip);
             dest.w -= clip.w;
             dest.x += clip.w;
         }

     It works perfectly, but it causes the program to use more and more CPU power the longer it runs. When the player leaves the part of the screen with the repeating sprites, usage drops back below 10% of the CPU. The loop also never tries to render sprites that are offscreen, so it baffles me why this is happening. Any idea how to repeat sprites in SDL 2.0 without this happening? Thanks!

     Edit: I'm wondering if rendering the background to a texture, then rendering that texture to the screen, would be better?
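     In actual (non-pseudo) code the loop looks roughly like this; just a sketch, where renderer, fence, the 32x32 sprite size, and the row position are placeholder assumptions:

         #include <SDL2/SDL.h>

         /* Tile a 32x32 sprite across one row of the screen. */
         void RenderFenceRow(SDL_Renderer *renderer, SDL_Texture *fence,
                             int screen_w, int y)
         {
             SDL_Rect src = { 0, 0, 32, 32 };      /* whole 32x32 sprite       */
             SDL_Rect dst = { 0, y, 32, 32 };      /* first tile on the screen */

             while (dst.x < screen_w) {            /* one tile width per step  */
                 SDL_RenderCopy(renderer, fence, &src, &dst);
                 dst.x += src.w;
             }
         }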
  5. That's odd. The Irrlicht demo I compiled works with OpenGL, but the Direct3D 9 demo doesn't, and the simple OpenGL code from the first post still doesn't work either. On top of that, the Irrlicht demo uses a skeletally animated 3D model, so unless Irrlicht did all of that on the CPU, it had to be using VBOs. What am I missing here?
  6. You're absolutely right, and it is still crashing even after he installed the Nvidia drivers; now GLView won't open either. Wow. xD

     Do you have any suggestions for a light 3D engine that can compile for DirectX as well as OpenGL? Irrlicht comes to mind, but it seems to include collision code and everything, and I have already written a collision system and a custom object format for loading, so I really don't need any of that bloating the .exe. All I need is a renderer I can hand eye coordinates and "look at" coordinates, similar to gluLookAt. Any suggestions? Thanks!

     In C, preferably. xD I have already written code to load my custom format. It used legacy OpenGL before, and porting the intermediate CPU-side format to VBOs was easy, so converting it to whatever format a downloaded engine's API expects shouldn't be hard either. The question is which engine can do DirectX and OpenGL without all of the bulk.
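     For reference, by "similar to gluLookAt" I just mean the usual eye/center/up view matrix. A minimal C sketch of that math (the vec3 type and helper functions are made up for this example; the output is column-major, ready for glLoadMatrixf or a shader uniform):

         #include <math.h>

         typedef struct { float x, y, z; } vec3;

         static vec3  v_sub(vec3 a, vec3 b)   { vec3 r = { a.x - b.x, a.y - b.y, a.z - b.z }; return r; }
         static float v_dot(vec3 a, vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
         static vec3  v_cross(vec3 a, vec3 b) { vec3 r = { a.y * b.z - a.z * b.y,
                                                           a.z * b.x - a.x * b.z,
                                                           a.x * b.y - a.y * b.x }; return r; }
         static vec3  v_norm(vec3 a)          { float l = sqrtf(v_dot(a, a));
                                                vec3 r = { a.x / l, a.y / l, a.z / l }; return r; }

         /* Build the same view matrix gluLookAt produces. */
         void look_at(float m[16], vec3 eye, vec3 center, vec3 up)
         {
             vec3 f = v_norm(v_sub(center, eye));   /* forward */
             vec3 s = v_norm(v_cross(f, up));       /* right   */
             vec3 u = v_cross(s, f);                /* true up */

             m[0] =  s.x;  m[4] =  s.y;  m[8]  =  s.z;  m[12] = -v_dot(s, eye);
             m[1] =  u.x;  m[5] =  u.y;  m[9]  =  u.z;  m[13] = -v_dot(u, eye);
             m[2] = -f.x;  m[6] = -f.y;  m[10] = -f.z;  m[14] =  v_dot(f, eye);
             m[3] =  0;    m[7] =  0;    m[11] =  0;    m[15] =  1;
         }

     Any engine (or plain OpenGL/Direct3D code) that accepts a 4x4 view matrix would work with this.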
  7. Updated the first post with more information. ;)
  8. It is my friend's computer and he is running it directly. After some Googling, it looks like it may be driver-related: http://forum.lwjgl.org/index.php?topic=3144.0

     Any suggestions for workarounds? Since glGenBuffers doesn't do anything other than hand out unused integer buffer names, I could probably write a custom function that does the same thing, as long as Nvidia's cards still work with glBindBuffer and the rest.
  9. I wrote a simple program using OpenGL, SDL 2, and GLEW. It works properly on Linux, under Wine, and on every other Windows system it has been tested on. On one Windows computer, however, it crashes as soon as it reaches glGenBuffers, even though it reports that OpenGL 2.1 is available. Here is the code:

         #define GLEW_STATIC
         #include <stdio.h>
         #include <GL/glew.h>
         #include <GL/glu.h>     /* for gluOrtho2D */
         #include <SDL2/SDL.h>

         SDL_Window *window;
         SDL_GLContext context;
         GLuint vbo;

         void Init()
         {
             SDL_Init(SDL_INIT_EVERYTHING);
             SDL_GL_LoadLibrary(NULL);

             /* Print the default context version, then request a 2.1 context. */
             int min, max;
             SDL_GL_GetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, &max);
             SDL_GL_GetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, &min);
             printf("Default OpenGL version %d.%d\n", max, min);

             SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 2);
             SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);
             SDL_GL_GetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, &max);
             SDL_GL_GetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, &min);
             printf("OpenGL version %d.%d\n", max, min);
             SDL_GL_SetAttribute(SDL_GL_ACCELERATED_VISUAL, 1);

             window = SDL_CreateWindow("", 0, 0, 500, 500, SDL_WINDOW_OPENGL);
             if(window == NULL)
             {
                 printf("Could not create window: %s\n", SDL_GetError());
             }
             context = SDL_GL_CreateContext(window);

             glewExperimental = 1;
             GLenum err = glewInit();
             if(GLEW_OK != err)
             {
                 printf("Error: %s\n", glewGetErrorString(err));
             }

             glClearColor(1.0f, 1.0f, 1.0f, 0.0f);
             glShadeModel(GL_FLAT);
             glEnableClientState(GL_VERTEX_ARRAY);

             /* Upload one triangle into a VBO. The crash happens at glGenBuffers. */
             float data[][2] = {{50, 50}, {100, 50}, {75, 100}};
             glGenBuffers(1, &vbo);
             glBindBuffer(GL_ARRAY_BUFFER, vbo);
             glBufferData(GL_ARRAY_BUFFER, sizeof(data), data, GL_STATIC_DRAW);
         }

         void Render()
         {
             glViewport(0, 0, (GLsizei) 500, (GLsizei) 500);
             glMatrixMode(GL_PROJECTION);
             glLoadIdentity();
             gluOrtho2D(0.0f, (GLdouble) 500, 0.0f, (GLdouble) 500);

             glClear(GL_COLOR_BUFFER_BIT);
             glColor3f(0.0f, 0.0f, 0.0f);
             glBindBuffer(GL_ARRAY_BUFFER, vbo);
             glVertexPointer(2, GL_FLOAT, 2 * sizeof(float), 0);
             glDrawArrays(GL_TRIANGLES, 0, 3);
             SDL_GL_SwapWindow(window);
         }

         int main(int argc, char **argv)
         {
             Init();
             Render();
             SDL_Delay(2000);
             glDeleteBuffers(1, &vbo);
             return 0;
         }

     Here is some more information I have added since initially posting this question:

       • Printing the address of glBindBuffer with %p gives 0.
       • Both (GLEW_ARB_vertex_buffer_object == GL_TRUE) and (GLEW_ARB_vertex_array_object == GL_TRUE) evaluate to 0.
       • glGetString(GL_RENDERER) returns "GDI Generic".
       • Setting the depth size to 16 doesn't help either.
       • The graphics card is an Nvidia 750 Ti.
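     A rough sketch of the check I'd add right after glewInit() so the program fails gracefully on the "GDI Generic" context instead of crashing (CheckVBOSupport is just a name made up for this example):

         #define GLEW_STATIC
         #include <stdio.h>
         #include <GL/glew.h>

         /* Returns 1 if the current context looks like it supports VBOs, 0 otherwise.
            Must be called with a context current and after glewInit(). */
         int CheckVBOSupport(void)
         {
             const char *renderer = (const char *) glGetString(GL_RENDERER);
             const char *version  = (const char *) glGetString(GL_VERSION);
             printf("GL_RENDERER: %s\nGL_VERSION:  %s\n",
                    renderer ? renderer : "(null)", version ? version : "(null)");

             /* VBOs are core in GL 1.5; otherwise the ARB extension must be there.
                GLEW's glGenBuffers is a function pointer behind a macro, so it can
                also be tested directly before the first call. */
             if ((GLEW_VERSION_1_5 || GLEW_ARB_vertex_buffer_object) && glGenBuffers)
                 return 1;

             printf("No VBO support in this context; skipping glGenBuffers.\n");
             return 0;
         }

     On the machine in question this would print "GDI Generic" and return 0, which at least confirms that no hardware-accelerated pixel format was chosen, rather than crashing inside glGenBuffers.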