ahhh writing to a buffer, void pointer troubles

This topic is now archived and is closed to further replies.

Yesterday I decided it would be fun to make a program that does some per-pixel effects. I made one, and it was really slow because I was calling glDrawPixels and glRasterPos* for each pixel, so on the IRC channel someone (jargon) suggested rendering the pixel data to a buffer and then drawing that once you're done, so you only have to call glDrawPixels and glRasterPos once per frame. It's a good idea, but for some reason I'm having a ton of trouble with it. Anyway, here is the declaration of my void pointer:

void* pixel; // i don't see much wrong with this

and here is what I'm trying to do with it:

(COLORRGB)pixel[y*320+x] = pixelcolor;  // pixelcolor is of type COLORRGB (a struct I made; it's just an array of 3 floats)
 
and last, my error messages:
C:\WINDOWS\Desktop\C++\OpenGl C++\Copy of Wrapper\oglwindow.cpp(406) : error C2036: 'void *' : unknown size
C:\WINDOWS\Desktop\C++\OpenGl C++\Copy of Wrapper\oglwindow.cpp(406) : error C2440: 'type cast' : cannot convert from 'void' to 'struct COLORRGB'
        Expressions of type void cannot be converted to other types
 

Sam Johnston (Samith)

First off, make your buffer of type unsigned char* instead of void*.

And you have to allocate the buffer first:

unsigned char buffer[screen_x * screen_y * (screen_bpp / 8)]; // screen_bpp in bits, so /8 gives bytes per pixel

You should declare your pointer with the type it will point to, not void! With void*, the compiler has no idea what pixel[10] would mean. If you want it to mean pixel + 10 bytes, then you should declare it as char *pixel;

// This could work
COLORRGB *pixel;
*pixel[y*320+x] = pixelcolor;

// Or this (not sure though)
void *pixel;
*((COLORRGB *)pixel)[y*320 + x] = pixelcolor;

But why do you want to store each pixel as three floats? I don't know the first thing about OpenGL, so I might be wrong, but 12 bytes per pixel sounds like a lot to me.

daerid: it has to be a void*, because that's the type glDrawPixels takes as its fifth parameter.

And CWizard: if I do this:
((COLORRGB *)pixel)[y*320+x] = pixelcolor;

it compiles, but when I run the program it immediately does an illegal operation.
EDIT: CWizard, if I try *((COLORRGB*)pixel) then I get an error saying I have illegal indirection or something like that :|

And I use floats for each component because that's just what I'm used to doing with OpenGL. Now that you mention it, though, I think I'm going to switch over to using bytes, since floats aren't going to get me any more accuracy when I'm in 16-bit color mode...



Sam Johnston (Samith)

[edited by - Samith on September 1, 2002 5:34:49 PM]

I don't think you can do it like that. If you are going to put something at the address your pointer is pointing to, you write it like this:

    char cSome;
    char *pChar = &cSome;
    *pChar = 'a';

Hmm... now that I think about your stuff, I'm a bit unsure though. Perhaps the brackets [] already perform the indirection in your example. But in your case, you should rather cast the pointer to void * in the function call instead.

Something like this then?

COLORRGB *pixel;
// Not using the indirection here...
pixel[y*320+x] = pixelcolor;
glFunction((void *)pixel, ...);

CWizard: ok, that worked, almost.

Typecasting it to void* was genius and just made my life about 20 times better, but I still get a problem. It compiles just fine, but when I run the program I get an illegal op; if I comment out the pixel[y*320+x] = pixelcolor; line it runs just fine, only it doesn't display anything... :\

Thanks for your help

Please post more context of what you're doing, perhaps the entire function you're doing it in, and perhaps, if you can find one, a reference to that gl function you're calling, as I'm getting a bit puzzled here.

What format should the data be in when you pass it to the OpenGL function? Try putting just one pixel at the base address (0,0) of your block and see if it displays. Note that if pixel points to, say, 0x00050000, then &pixel[10] is address 0x00050078, because pixel is declared as a pointer to COLORRGB, which is of size 12. Is this what you want?

The gl function isn't the problem; I know that, since I've tried it both commented out and not commented out, and it's the same either way.

anyway, here is my function:


    
bool COGLWindow::RenderScene()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // Clear Screen And Depth Buffer
    glLoadIdentity();                                   // Reset The Current Modelview Matrix

    float dist1, dist2, dist3, avgdist;
    float xdiff1, xdiff2;
    float ydiff1, ydiff2;

    int x, y; // screen coords

    for(y = 0; y <= height; y++)
    {
        for(x = 0; x <= width; x++)
        {
            xdiff1 = x - Point1->x;
            xdiff2 = x - Point2->x;

            ydiff1 = y - Point1->y;
            ydiff2 = y - Point2->y;

            dist1 = sqrt(xdiff1*xdiff1 + ydiff1*ydiff1);
            dist2 = sqrt(xdiff2*xdiff2 + ydiff2*ydiff2);

            avgdist = (dist1 + dist2)*0.5f;
            if((int)avgdist % 2 > 0)
            {
                pixel[y*320 + x] = pixcol; // this is it, this is what causes all the problems; if i comment it out the program runs just fine, only i don't get any pixels displayed
            }
        }
    }

    glRasterPos2i(0, 0);
    glDrawPixels(width, height, GL_RGB, GL_FLOAT, (void *)pixel); // this is the function, it doesn't cause any problems though

    MovePoint(Point1, 4, 3); // these functions just move the points around
    MovePoint(Point2, 8, 6);
    MovePoint(Point3, 2, 5);

    return true; // Everything Went OK
}


I know, I know, the function probably isn't the most efficient thing you will ever see, but it *in theory* gets the job done.

[edited by - Samith on September 1, 2002 6:11:14 PM]

I'm currently looking at your function, but what I was primarily interested in seeing was the declaration/definition/allocation of your buffer (pixel). I guess it is a member var, but the related lines would be helpful.

And for the function, glDrawPixels(), I was interested in what format it expects your buffer to be in. You said something about 16-bit; is pixcol a short, then?

I was going to suggest this before, but I wasn't sure what you were drawing then. If you are going to process all pixels linearly, it's much more efficient to just increment your pointer, like *pixel++ = pixcol. You avoid the multiplication then.

Ok, well, I changed it to just use incrementation (is that even a word?) but still no luck; I get the illegal op.

anyway, the declarations:
COLORRGB pixcol; // pixelcolor for short
COLORRGB *pixel;

Ok. Are you sure glDrawPixels() wants a buffer of COLORRGB? And where do you allocate the memory pointed to by pixel?

Does OpenGL convert the pixels from your type (COLORRGB) to whatever is used (16-bit?)? Otherwise you must/should use the same format and declare your buffer as such. As I said before, I don't know the first thing about OpenGL, so I don't know what pixel-type structs there are, but I would do something like this:

unsigned short pixcol = 0xffff; // White
unsigned short *pixel = (unsigned short *)malloc(width * height * sizeof(unsigned short));
// or... (little unsure about this new thingy here though)
unsigned short *pixel = new unsigned short[width * height];
...
// Make a copy so we can free the memory and reset it etc.
unsigned short *pAddress = pixel;
// Plot a pixel
*pAddress++ = pixcol;
// Blit the buffer
glDrawPixels(width, height, GL_RGB, GL_FLOAT, (void *)pixel);

glDrawPixels just takes raw data, and the 3rd and 4th params decide how that raw data is used. In my case I have GL_RGB and GL_FLOAT, which tells OpenGL to count every 3 floats as a pixel: the first float is red, the 2nd green, the 3rd blue.

COLORRGB is just my way of organizing things; in the end it isn't anything special, it's just raw data that's read by glDrawPixels.

My problem is just that I crash my program whenever I try to assign anything to pixel. I tried changing pixel into an array:

COLORRGB pixel[76800];

so that I wouldn't be assigning through a pointer that doesn't point to anything, but no, it still crashed; anything I try crashes :\
