
Archived

This topic is now archived and is closed to further replies.

arunvb

Help wanted in 32 bit bitmap loading


Recommended Posts

I am trying to load a 32-bit image from a file and display it on the screen. The message is lengthy, but please bear with me. These are the steps I am doing:

1. I create an offscreen surface with the dimensions of the bitmap and the following pixel format:

    // ddsd.ddpfPixelFormat.dwRGBAlphaBitMask = 0xFF << 24;
    ddsd.ddpfPixelFormat.dwRBitMask    = 0xFF << 16;
    ddsd.ddpfPixelFormat.dwGBitMask    = 0xFF << 8;
    ddsd.ddpfPixelFormat.dwBBitMask    = 0xFF;
    ddsd.ddpfPixelFormat.dwRGBBitCount = 32;

2. I load the image into the offscreen surface with the following code:

    bool BitmapSurface::LoadBmpData()
    {
        HANDLE              fileHandle;
        DWORD               bytesRead;
        DDSURFACEDESC2      ddsd;
        LPDIRECTDRAWPALETTE lpddpal = NULL;
        UCHAR*              bmpData;

        // Open the file
        fileHandle = CreateFile(bmpfile, GENERIC_READ, 0, NULL, OPEN_EXISTING,
                                FILE_ATTRIBUTE_NORMAL | FILE_FLAG_SEQUENTIAL_SCAN, NULL);
        if (fileHandle == INVALID_HANDLE_VALUE)
            return FALSE;

        // Seek to the start of the pixel data (byteSize bytes before the end of the file)
        SetFilePointer(fileHandle, -(int)(byteSize), NULL, FILE_END);

        if (!surface7)
            return FALSE;

        // Lock the surface to get at its memory
        if (!(Lock(&ddsd, NULL)))
        {
            TraceToFile("Lock of the bmpsurface failed");
            return FALSE;
        }
        ZeroMemory(ddsd.lpSurface, byteSize);

        // Allocate a buffer for the file data
        bmpData = (UCHAR*) malloc(byteSize);

        // Read the image data from the file
        if (!(ReadFile(fileHandle, (void*) bmpData, byteSize, &bytesRead, NULL)))
        {
            TraceToFile("ReadFile failed", GetLastError());
            return FALSE;
        }

        // Convert the 24 BPP data to 32 BPP in a temporary buffer
        ULONG* tempPtr = (ULONG*) malloc(byteSize * 4 / 3);
        for (int curpixel = 0; curpixel < width * height; curpixel++)
        {
            UCHAR red   = bmpData[curpixel * 3 + 0],
                  green = bmpData[curpixel * 3 + 1],
                  blue  = bmpData[curpixel * 3 + 2];

            tempPtr[curpixel] = _RGBA32(red, green, blue, 0);
            // _RGBA32 definition is:
            // #define _RGBA32(r,g,b,a) ((b) + ((g) << 8) + ((r) << 16) + ((a) << 24))
        } // end for

        // Copy the converted data into the surface; surface memory is non-linear
        // (lPitch is in bytes), so copy line by line
        ULONG* surfacePtr = (ULONG*) ddsd.lpSurface;
        for (int curline = 0; curline < height; curline++)
        {
            memcpy((void*) surfacePtr, (void*) &tempPtr[curline * width], width * 4);
            surfacePtr += ddsd.lPitch / 4;
        }

        free(tempPtr);
        free(bmpData);

        // Unlock the surface
        UnLock(NULL);

        // Close the file
        CloseHandle(fileHandle);
        return TRUE;
    } /* BitmapSurface::LoadBmpData */

3. I blit to the back buffer with the following code:

    bool DirectDrawSurface::Blt(RECT& drect, DirectDrawSurface* source, RECT& srect)
    {
        DDBLTFX blitFX;
        DWORD   Flags = DDBLT_WAIT;
        HRESULT hRetVal;

        if (!created)
        {
            TraceToFile("Blt - surface not created");
            return FALSE;
        }

        // if the source surface is lost, restore it
        if (source->surface7->IsLost())
        {
            if (FAILED(source->surface7->Restore()))
            {
                TraceToFile("Restore source->surface7 failed");
                return FALSE;
            }
        }

        // if the destination surface is lost, restore it
        if (surface7->IsLost())
        {
            if (FAILED(surface7->Restore()))
            {
                TraceToFile("Restore surface7 failed");
                return FALSE;
            }
        }

        ZeroMemory(&blitFX, sizeof(blitFX));
        blitFX.dwSize = sizeof(blitFX);

        // Set the color key flags
        // if (colorKey == SOURCEBLT)
        //     Flags |= DDBLT_KEYSRC;
        // else if (colorKey == DESTBLT)
        //     Flags |= DDBLT_KEYDEST;

        hRetVal = surface7->Blt(&drect, source->surface7, &srect, Flags, &blitFX);
        if (FAILED(hRetVal))
        {
            if (hRetVal == DDERR_UNSUPPORTED)
            {
                TraceToFile("DDERR_UNSUPPORTED");
            }
            TraceToFile("Blt failed", GetLastError());
            return PrintDDError(hRetVal);
        }
        return TRUE;
    } /* DirectDrawSurface::Blt */

This is where I run into the problem: the Blt call above fails with DDERR_UNSUPPORTED, and I am not able to figure out why. Can someone help, please? Thanks in advance for any help.

Arun
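For reference, the pixel-format snippet in step 1 only shows the masks; before CreateSurface will honour an explicit format, the surface description also needs its size, flag, and caps fields filled in. Below is a minimal sketch of one way the description could be completed. The name lpdd7 stands for the IDirectDraw7 interface and is hypothetical, and the exact flag combination is an assumption rather than the code from the post.

    DDSURFACEDESC2 ddsd;
    ZeroMemory(&ddsd, sizeof(ddsd));
    ddsd.dwSize   = sizeof(ddsd);                 // required by every DirectDraw call
    ddsd.dwFlags  = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
    ddsd.dwWidth  = width;                        // bitmap dimensions from the post
    ddsd.dwHeight = height;
    ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN;

    // Explicit 32-bit XRGB format (alpha mask left commented out, as in the post)
    ddsd.ddpfPixelFormat.dwSize        = sizeof(DDPIXELFORMAT);
    ddsd.ddpfPixelFormat.dwFlags       = DDPF_RGB;
    ddsd.ddpfPixelFormat.dwRGBBitCount = 32;
    ddsd.ddpfPixelFormat.dwRBitMask    = 0xFF << 16;
    ddsd.ddpfPixelFormat.dwGBitMask    = 0xFF << 8;
    ddsd.ddpfPixelFormat.dwBBitMask    = 0xFF;

    // lpdd7 is assumed to be a valid IDirectDraw7 interface (hypothetical name)
    LPDIRECTDRAWSURFACE7 offscreen7 = NULL;
    HRESULT hr = lpdd7->CreateSurface(&ddsd, &offscreen7, NULL);

Leaving DDSD_PIXELFORMAT out entirely makes the offscreen surface inherit the primary surface's format, which sidesteps format-mismatch blit failures altogether.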

Guest Anonymous Poster
quote:
Original post by arunvb
I am trying to load a 32-bit image from a file and display it on the screen. The message is lengthy, but please bear with me.



Are you sure your video card supports that pixel format exactly? If you are using hardware rendering, the format must match the primary surface EXACTLY.

You can also render 24-bit textures directly instead of
32-bit btw.

Also, if you are not using the blit FX, pass NULL for that parameter instead of a pointer to an empty structure.
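If the color keys and effects are not needed yet, the call from the post can be reduced to something like the sketch below (surface and variable names as in the post). A source color key set with SetColorKey still works without a DDBLTFX, so the structure only has to come back when an actual effect is used.

    // Plain copy blit: no DDBLTFX, so pass NULL as the last parameter
    hRetVal = surface7->Blt(&drect, source->surface7, &srect, DDBLT_WAIT, NULL);

    // With a source color key (set earlier via SetColorKey), still no DDBLTFX needed:
    // hRetVal = surface7->Blt(&drect, source->surface7, &srect, DDBLT_WAIT | DDBLT_KEYSRC, NULL);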

My video card uses an SGI Cobalt chipset. It supports 32-bit color; if it did not, SetDisplayFormat should have returned an error.
Also, all my surfaces (primary and offscreen) have a 32-bit pixel format, set when they were created.

Thanks
Arun
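One way to verify that both surfaces really ended up with the same 32-bit format is to ask DirectDraw for the formats it actually assigned, rather than relying on creation having succeeded. A small sketch, assuming backBuffer7 and bmpSurface7 are the back buffer and the loaded bitmap surface (hypothetical names):

    DDPIXELFORMAT pfBack, pfBmp;
    ZeroMemory(&pfBack, sizeof(pfBack));  pfBack.dwSize = sizeof(pfBack);
    ZeroMemory(&pfBmp,  sizeof(pfBmp));   pfBmp.dwSize  = sizeof(pfBmp);

    // Query the format DirectDraw actually gave each surface
    backBuffer7->GetPixelFormat(&pfBack);
    bmpSurface7->GetPixelFormat(&pfBmp);

    if (pfBack.dwRGBBitCount != pfBmp.dwRGBBitCount ||
        pfBack.dwRBitMask    != pfBmp.dwRBitMask    ||
        pfBack.dwGBitMask    != pfBmp.dwGBitMask    ||
        pfBack.dwBBitMask    != pfBmp.dwBBitMask)
    {
        TraceToFile("Pixel formats differ - a blit between them can fail as unsupported");
    }

If the two formats really do match, the DDBLTFX pointer mentioned above is the next thing to try.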
