[java] Bit depth in Java

2 comments, last by Jim_Ross 24 years, 2 months ago
Is there any way to specify the bit depth of a component? I thought there was, but when I went looking for it I couldn't find it. Or if that's not possible, is there a way to make the Image objects a certain bit depth, maybe by messing with their raster, to make things faster? Also, would it have any effect on the speed of a program, or is the default OS bit depth going to be optimal for Java?
I could be wrong, but AFAIK Java apps are just like any other windowed app and must use the bit depth currently set on the user's machine.
Nope, there is no way to specify the bit depth of a component.

What you can do is create an image out of an integer array (using MemoryImageSource) with either java.awt.image.IndexColorModel or java.awt.image.DirectColorModel. In that case you can specify the palette bit depth. You can even change the palette on the fly (when using IndexColorModel) with the newPixels(int[] newpix, ColorModel newmodel, int offset, int scansize) method of MemoryImageSource.
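A minimal sketch of the paletted approach described above — an 8-bit IndexColorModel (here a grayscale ramp, purely for illustration) wrapped in a MemoryImageSource. The class and variable names are my own; setAnimated(true) is what later allows the newPixels() palette swap mentioned above:

```java
import java.awt.image.IndexColorModel;
import java.awt.image.MemoryImageSource;

public class PalettedImageDemo {
    public static void main(String[] args) {
        int w = 64, h = 64;

        // 8-bit palette: 256 grayscale entries (illustrative choice)
        byte[] r = new byte[256], g = new byte[256], b = new byte[256];
        for (int i = 0; i < 256; i++) { r[i] = g[i] = b[i] = (byte) i; }
        IndexColorModel palette = new IndexColorModel(8, 256, r, g, b);

        // pixel data: each int is an index into the palette
        int[] pixels = new int[w * h];
        for (int i = 0; i < pixels.length; i++) pixels[i] = i % 256;

        MemoryImageSource source = new MemoryImageSource(w, h, palette, pixels, 0, w);
        source.setAnimated(true); // enables later newPixels(...) updates

        // To get an actual Image you would hand 'source' to a Component:
        // Image img = someComponent.createImage(source);
        System.out.println("palette pixel size = " + palette.getPixelSize() + " bits");
    }
}
```

Swapping the palette later is then a call like `source.newPixels(pixels, newModel, 0, w)` with a different IndexColorModel, as described above.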

But palettes can't be used anywhere else, and you can't specify the bit depth anywhere else in the AWT classes. This is the reason I dropped paletted-mode support from GameFrame.

As for making things faster: nope, I don't think you have to, because the Image implementation actually hides the bit depth from you, and if the people over at Sun are as brilliant as they say they are, they have optimized the pixel format "under the hood". If you take a PixelGrabber and grab some pixels, they always come back in the default 32-bit AARRGGBB format, but you don't actually know what pixel format the image itself uses internally.
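This behavior is easy to see. A small sketch (using BufferedImage for a self-contained example, though the same applies to any Image): the image below stores its pixels internally as 16-bit 565 RGB, yet PixelGrabber hands back packed 32-bit AARRGGBB ints:

```java
import java.awt.image.BufferedImage;
import java.awt.image.PixelGrabber;

public class GrabDemo {
    public static void main(String[] args) throws InterruptedException {
        // internally stored as 16-bit 565 RGB, not 32-bit
        BufferedImage img = new BufferedImage(4, 4, BufferedImage.TYPE_USHORT_565_RGB);
        img.setRGB(0, 0, 0xFFFF0000); // pure red (round-trips exactly through 5 bits)

        int[] pixels = new int[4 * 4];
        PixelGrabber grabber = new PixelGrabber(img, 0, 0, 4, 4, pixels, 0, 4);
        grabber.grabPixels();

        // ...but the grabbed pixels are always default 32-bit AARRGGBB
        System.out.printf("pixel 0 = 0x%08X%n", pixels[0]);
    }
}
```

The internal 16-bit storage is invisible to the caller; the grabber converts to the default RGB color model on the way out.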

I have no experience with the speed impact on programs, but at least if you have a good graphics card in your machine (one that accelerates GDI stuff under Win32) you shouldn't see big differences between 8-bit, 16-bit, and 32-bit performance.
-Pasi Keranen
Theoretically, since the internal Java Image format corresponds to how most graphics cards store 32-bit image data, running a Java applet on a 32-bit display should be faster (it doesn't require any run-time modification of the image data).

However, 32-bit mode requires twice as much data to be moved compared to 16-bit, and since the CPU is MUCH faster than the available system-to-video-memory bandwidth, I believe that 16-bit is actually faster.

I have no hard evidence to support this, but it should be simple to write an applet that draws an image through a MemoryImageSource 10000 times, and then run it in both display modes. (A plain Image is not enough, since it will most likely have its data stored internally in the correct format already.)
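The benchmark suggested above can be approximated without switching the display mode, as a rough sketch: blit a 16-bit image and a 32-bit image into a 32-bit target and compare, so the 16-bit path pays a format conversion on every draw. This uses BufferedImage (a Java 2 class, newer than the plain-AWT approach discussed in this thread), and the sizes and repetition count are arbitrary choices of mine:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class BlitBenchmark {
    // draw src into dst 'reps' times and return elapsed milliseconds
    static long time(BufferedImage src, BufferedImage dst, int reps) {
        Graphics2D g = dst.createGraphics();
        long start = System.currentTimeMillis();
        for (int i = 0; i < reps; i++) {
            g.drawImage(src, 0, 0, null);
        }
        g.dispose();
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        int w = 640, h = 480, reps = 200; // far fewer than 10000, just for a quick run
        BufferedImage src32 = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        BufferedImage src16 = new BufferedImage(w, h, BufferedImage.TYPE_USHORT_565_RGB);
        BufferedImage target = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);

        // matching formats: no per-pixel conversion needed
        long same = time(src32, target, reps);
        // mismatched formats: each blit converts 16-bit pixels to 32-bit
        long converted = time(src16, target, reps);
        System.out.println("32->32: " + same + " ms, 16->32: " + converted + " ms");
    }
}
```

This only measures in-memory blits, not the system-to-video-memory transfer the argument above is really about, so treat the numbers as suggestive rather than conclusive.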

/Niels

