scgames

Member Since 23 Oct 2003
Offline, Last Active: Private

Topics I've Started

[BMFont] Alpha blending

24 March 2011 - 03:37 PM

This may be user error, but I've run into a little problem with alpha blending when using bitmap fonts generated by BMFont.

To export, I'm using a = glyph, rgb = one. Rather than setting the entirety of the r, g, and b channels to one, this appears to create only a block of white around each glyph. That works fine when glyphs are rendered at a one-to-one pixel ratio, but if the font is scaled when rendered, dark outlines appear around the glyphs because texture filtering bleeds between the white pixels and the neighboring black pixels.

This is easy enough to fix in an image editor by filling the rgb layers with solid white, but I'm guessing there's a reason for the way it's done currently.
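
In case it's useful, here's a minimal sketch of applying the same fix at load time instead of in an image editor. This assumes the texture has been loaded as tightly packed 8-bit RGBA; pixels, width, and height are placeholders for whatever your image loader provides:

// Force the RGB channels to full white so that filtering against
// neighboring texels can't bleed black into scaled glyph edges.
// The alpha channel (the glyph coverage) is left untouched.
void ForceWhiteRGB(unsigned char* pixels, int width, int height)
{
    for (int i = 0; i < width * height; ++i) {
        pixels[i * 4 + 0] = 255; // R
        pixels[i * 4 + 1] = 255; // G
        pixels[i * 4 + 2] = 255; // B
        // pixels[i * 4 + 3] is the glyph alpha
    }
}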

Am I setting things up wrong? Is there a particular reason the rgb channels aren't just set entirely to one?

Display mode options

24 February 2011 - 01:09 PM

How important is it to allow users to select from all video modes their hardware supports?

In particular, I'm wondering about color depth and refresh rate. Some games allow you to select from every resolution/color depth/refresh rate combination supported by the hardware, while other games (and game engines) don't expose color depth and/or refresh rate as options.

Selecting color depth and refresh rate automatically and presenting the user with only a list of resolutions has the advantage of simplifying things technically while offering a smaller, more comprehensible set of options. Most casual users aren't going to be too concerned with color depth and refresh rate, I would guess.
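
For what it's worth, here's a minimal sketch of the resolution-only approach using SDL 1.2 (the same library as in my test programs). SDL_ListModes reports available fullscreen resolutions, and passing 0 as the depth to SDL_SetVideoMode later lets SDL choose the color depth itself:

#include "SDL.h"
#include <stdio.h>

int main(int, char*[])
{
    SDL_Init(SDL_INIT_VIDEO);

    // Passing NULL for the format queries modes for the current display
    // format; the flags should match what you plan to pass to
    // SDL_SetVideoMode.
    SDL_Rect** modes = SDL_ListModes(NULL, SDL_FULLSCREEN | SDL_OPENGL);

    if (modes == (SDL_Rect**)0) {
        printf("No fullscreen modes available\n");
    } else if (modes == (SDL_Rect**)-1) {
        printf("Any resolution is allowed\n");
    } else {
        // The list is sorted largest to smallest and NULL-terminated
        for (int i = 0; modes[i] != NULL; ++i) {
            printf("%d x %d\n", modes[i]->w, modes[i]->h);
        }
    }

    SDL_Quit();
    return 0;
}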

But would this alienate other users? Would this be an "I'll uninstall your game instantly if I can't choose the color depth and refresh rate" type of thing?

VBOs and wireframe mode on a GeForce2

06 February 2010 - 05:30 AM

I'm doing some development on some old hardware (a GeForce2 MX TwinView, under OS X 10.5.8 on a PPC Mac), and have run into a couple of odd glitches that appear to be related to using VBOs. One of them is that when drawing in wireframe mode using glPolygonMode(GL_FRONT, GL_LINE), nothing is rendered. The same code works fine on my Windows machine (which has equally old graphics hardware, an Intel 910GL Express chipset), and it also works fine on both machines when using plain vertex arrays. It's only when using VBOs on the GeForce2 that nothing shows up.

Here's my test program, which does nothing but render a single wireframe triangle using a VBO:
#include "SDL.h"
#include "GLee.h"

int main(int, char*[])
{
    SDL_Init(SDL_INIT_EVERYTHING);
    SDL_SetVideoMode(400, 400, 0, SDL_OPENGL);

    // A single triangle: two floats (x, y) per vertex
    float verts[] = { -1, -1, 1, -1, 0, 1 };

    // Create a VBO, upload the vertex data, and point the
    // fixed-function vertex array at the buffer
    GLuint vbo;
    glGenBuffersARB(1, &vbo);
    glBindBufferARB(GL_ARRAY_BUFFER_ARB, vbo);
    glBufferDataARB(GL_ARRAY_BUFFER_ARB, sizeof(verts), verts, GL_STATIC_DRAW_ARB);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, 0);

    // Simple orthographic projection that contains the triangle
    glMatrixMode(GL_PROJECTION);
    glOrtho(-1, 1, -1, 1, -1, 1);
    glPolygonMode(GL_FRONT, GL_LINE); // <--Renders correctly without this line

    bool quit = false;
    while (!quit) {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT) {
                quit = true;
            }
        }

        glClear(GL_COLOR_BUFFER_BIT);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        SDL_GL_SwapBuffers();
    }

    SDL_Quit();
    return 0;
}
Does the above code appear to be correct? If anyone would be willing to look it over and confirm that it *should* be rendering a wireframe triangle onscreen, I think I could just write this off as a driver bug and move on.

Thanks,
Jesse

D3DXSaveSurfaceToFile and JPEG quality

13 December 2009 - 09:34 AM

This is sort of a random question, but is there any way to specify the compression quality that D3DXSaveSurfaceToFile() uses when saving images in JPEG format? And if not, does anyone know what quality setting (and scale) it uses?
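
For reference, here's a minimal sketch of the kind of call I'm making (the file name and surface are just placeholders); as far as I can tell, nothing in the signature accepts a quality or compression setting:

#include <d3dx9.h>

// Save a Direct3D 9 surface to disk as a JPEG. Note that there's no
// parameter anywhere for controlling the JPEG quality.
void SaveSurfaceAsJpeg(IDirect3DSurface9* surface)
{
    D3DXSaveSurfaceToFile(
        TEXT("screenshot.jpg"), // destination file
        D3DXIFF_JPG,            // image file format
        surface,                // source surface
        NULL,                   // palette (unused for JPEG)
        NULL);                  // source rect (NULL = entire surface)
}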

OpenGL on Windows 7

20 November 2009 - 09:34 AM

I'm working on a game that uses OpenGL for both the OS X and Windows versions. From what I'm reading online, support for OpenGL in Windows 7 is a bit spotty; it seems it's basically left up to the hardware vendors, and in many cases users are having to jump through some hoops - installing drivers manually and so on - in order to get OpenGL working on their systems.

My game leans more toward the casual side, and I think a lot of potential users might be deterred by having to install drivers or deal with other similar technical issues. With this in mind, I'm thinking of switching to Direct3D for the Windows version.

This will take some work, so I thought I'd post here first to make sure my appraisal of the situation isn't off base. Would switching APIs be the best bet in this case? Or is the OpenGL situation on Windows 7 not as bad as it seems?
