maya18222

Image compression

19 posts in this topic

Can anyone suggest any quick image compression techniques for RGB888 data? So far I'm using RLE and quantizing each pixel down to 565.

Short of going full scale with a JPEG library, is there anything else worth investigating?
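For context, the idea is roughly this (an illustrative Python sketch of 565 quantization plus RLE, not my actual code):

```python
def pack_565(r, g, b):
    """Quantize an 8-bit RGB triple into one 16-bit RGB565 value."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rle_encode(values):
    """Run-length encode a sequence as (count, value) pairs, capping runs at 255."""
    runs = []
    for v in values:
        if runs and runs[-1][1] == v and runs[-1][0] < 255:
            runs[-1][0] += 1
        else:
            runs.append([1, v])
    return [(c, v) for c, v in runs]

def rle_decode(pairs):
    """Expand (count, value) pairs back into the original sequence."""
    out = []
    for count, v in pairs:
        out.extend([v] * count)
    return out
```

RLE only pays off on images with flat runs of identical pixels (UI art, masks); on photographic data the runs are too short to help much.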
This is a hobby project, so ideally something I could implement myself, purely for the learning experience.
[quote name='maya18222' timestamp='1340561308' post='4952383']
This is a hobby project, so something I could implement myself really, purely for the learning experience
[/quote]

In that case I recommend going for an algorithm in the LZ* family (LZ77, LZ78, LZW, LZMA, in order of difficulty), since they are fairly simple and achieve good compression on general data. This is what Zip and friends use. Even the worst LZ algorithm should beat RLE in almost every case as far as compression goes.
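For a flavor of the simplest member of the family, a greedy LZ77 encoder fits in a few dozen lines (an illustrative Python sketch; the naive window search makes it slow, but it shows the idea):

```python
def lz77_compress(data, window=4096, max_match=255):
    """Greedy LZ77: emit (offset, length, next_byte) triples.
    Naive O(n * window) match search, for clarity rather than speed."""
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            length = 0
            while (length < max_match and i + length < len(data)
                   and data[j + length] == data[i + length]):
                length += 1
            if length > best_len:
                best_len, best_off = length, i - j
        if best_len >= 3:  # only matches of 3+ bytes are worth encoding
            nxt = data[i + best_len] if i + best_len < len(data) else None
            out.append((best_off, best_len, nxt))
            i += best_len + 1
        else:
            out.append((0, 0, data[i]))  # literal byte
            i += 1
    return out

def lz77_decompress(triples):
    """Replay the triples; copying one byte at a time handles
    overlapping matches (offset < length) correctly."""
    out = bytearray()
    for off, length, nxt in triples:
        for _ in range(length):
            out.append(out[-off])
        if nxt is not None:
            out.append(nxt)
    return bytes(out)
```

A real implementation would also pack the triples into a compact bitstream instead of Python tuples.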

If you want something image-specific (and lossy), I guess you could go for JPEG or a subset of it. However, lossy algorithms are, ironically, a lot more complex than lossless ones, so prepare for a big learning experience. ;)
I think a fun project would be to implement image compression using wavelets. If you aren't willing to learn some heavy math, though, maybe you should stick to lossless algorithms.
I can appreciate the value of DIY for learning, so this might not apply to the OP, but here's my recommendation: the [url="http://code.google.com/p/crunch/"]crunch[/url] library. It combines lossy and lossless compression techniques and decodes directly into DXT (= BCx/S3TC) formats. Decoding is extremely fast, disk sizes are smaller than what .zip/.7z offer, and because of DXT the output is ready to use on the GPU. The library is contained in three .h files, and integrating it (into both my asset build toolchain and my runtime loader) was only a few hours of work, since I already had existing DXT format support. I can't recommend it enough for anyone looking for GPU-friendly 3D texture formats.
I think the first step would be to try to reduce the image data (lossy):
- convert 4x4 pixel blocks to YCbCr colors; save Y for every pixel, but Cb and Cr just once per block -> 18 bytes per 16 pixels, and you might not see any difference
- apply delta compression, e.g. for the Y and Cb/Cr lines calculate the difference to the previous line; on photographs this usually reduces a lot of the data as a pre-step for further compression
- although RLE is a nice start, move on to LZ77; LZ78 and LZSS are not that different but can increase the compression ratio
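The first two steps can be sketched roughly like this (illustrative Python, assuming BT.601 coefficients for the YCbCr conversion and averaging the chroma over each block):

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr conversion."""
    y  = round( 0.299 * r + 0.587 * g + 0.114 * b)
    cb = round(-0.169 * r - 0.331 * g + 0.500 * b) + 128
    cr = round( 0.500 * r - 0.419 * g - 0.081 * b) + 128
    return y, cb, cr

def encode_block(block):
    """block: 16 RGB triples (a 4x4 tile). Keep Y per pixel but only
    the block-average Cb and Cr -> 18 bytes of data per 16 pixels."""
    ys, cbs, crs = [], [], []
    for r, g, b in block:
        y, cb, cr = rgb_to_ycbcr(r, g, b)
        ys.append(y); cbs.append(cb); crs.append(cr)
    return ys, sum(cbs) // 16, sum(crs) // 16

def delta_line(line, prev):
    """Replace each sample with its difference (mod 256) from the row
    above; smooth gradients collapse into long runs of small values."""
    return [(a - b) % 256 for a, b in zip(line, prev)]
```

The delta step does not shrink anything by itself, but it skews the byte distribution toward zero, which is exactly what RLE or an LZ pass compresses well.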
My engine uses a proprietary format with the extension .LSI.
It is fully lossless and:
[b]Always[/b] Smaller Than the Same Image in:[list]
[*].BMP
[*].TGA
[*].GIF
[*].DDS
[/list]
[b]Sometimes[/b] Smaller Than the Same Image in:[list]
[*].PNG
[*].JPG (and is not lossy)
[/list]
I documented the routine [url="http://lspiroengine.com/?p=38"]here[/url]. Note that while I mentioned a lossy version of the routine in the blog, all notes above are in regards to the lossless version.
Index compression is what usually wins, because the number of bits used per index is not fixed. If you have only a few colors in your image, each index could easily end up being only 5 or 6 bits or so. The index data is then compressed using a modified LZW routine, and the look-up table is also compressed.
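To illustrate the variable-bit-width index idea (a rough Python sketch, not the actual .LSI code): the palette size determines the index width, and the indices are packed into a bitstream MSB first.

```python
def palettize(pixels):
    """Build a palette of unique colors and pack each pixel's index
    with the minimum number of bits needed to address the palette."""
    palette = sorted(set(pixels))
    index = {c: i for i, c in enumerate(palette)}
    bits = max(1, (len(palette) - 1).bit_length())
    packed, acc, nbits = bytearray(), 0, 0
    for p in pixels:
        acc = (acc << bits) | index[p]
        nbits += bits
        while nbits >= 8:          # flush full bytes
            nbits -= 8
            packed.append((acc >> nbits) & 0xFF)
    if nbits:                      # pad the final partial byte
        packed.append((acc << (8 - nbits)) & 0xFF)
    return palette, bits, bytes(packed)

def depalettize(palette, bits, packed, count):
    """Unpack `count` indices from the bitstream and map them back."""
    out, acc, nbits = [], 0, 0
    for byte in packed:
        acc = (acc << 8) | byte
        nbits += 8
        while nbits >= bits and len(out) < count:
            nbits -= bits
            out.append(palette[(acc >> nbits) & ((1 << bits) - 1)])
    return out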


This should give you plenty of ideas for your own format.


L. Spiro
[quote name='L. Spiro' timestamp='1340621655' post='4952612']
...

L. Spiro
[/quote]
why don't you upload some sample pictures, so we can see with our own eyes what ratio you get and what the usual png optimizers deliver? I'd be curious to validate the results.
There is no public-domain decoder for .LSI images, so you would not be able to actually open the files, but when converting models it shows me the size of the .LSI file as a percentage of the original file.
Here are some results.
TGA/BMP files can see significant size decreases (in the second screenshot there was a TGA amongst all the PNGs, compressed to 0.804% of its original size, from 1,048,620 bytes to 8,431 bytes, but I did not include it in the shot).
[attachment=9648:LSTGARes.png]

Here are .PNG files recompressed as .LSI. When the .LSI file is larger than the .PNG file (or whatever the source format is), the converter uses the original file, so the percent is 100%. That means .PNG won.
[attachment=9649:LSPNGRes.png]

With .PNG it is often close, and .PNG sometimes wins, but .LSI wins often as well, and sometimes by quite a large margin (as low as 25% in this case).



L. Spiro
Here are some images and their ratios.

.PNG = 40,725, .LSI = 45,094 = 110.72%
[attachment=9654:CORVETTEZ06 RIM.png]


.PNG = 32,865, .LSI = 29,990 = 91.25%
[attachment=9655:CORVETTEZ06 BADGING MASK.png]


.PNG = 14,460, .LSI = 5,209 = 36.02%
[attachment=9656:TIREBACK.png]


.PNG = 55,564, .LSI = 44,546 = 80.16%
[attachment=9657:TIRE BUMP.png]


.PNG = 99,420, .LSI = 130,098 = 130.86%
[attachment=9658:CORVETTEZ06 BRAKELIGHT.png]


Photographic images tend to be larger than .PNG, but I haven’t seen any go above 175% yet, which is still quite acceptable given how small .PNGs are.

On the other hand, cartoony images and masks compress very well compared to .PNG.

Normal maps are generally similar in size.


L. Spiro
I also tried it with optipng; similar to what L. Spiro does, it 'losslessly' reduces the images to as few colors as possible before it starts the binary compression.

in cases where this is not possible (photographs), png was better anyway. which leads to: the LZW algorithm used in LSI is not reaching the compression ratio of deflate in png, which shouldn't be a surprise; the way it combines LZ77 with Huffman is very smart, IMHO.
[quote name='Steveway' timestamp='1340792648' post='4953267']
Even though 2 of the examples are still bigger than yours (though we have to take your word for it, since you posted neither those pictures nor any other proof), png still wins in most cases and has the benefit of being a free specification.
EDIT: Sounds a little bit provocative; don't take it too seriously, I just like the topic.
[/quote]
I admit that I just used the .PNG images as they were in the model packages I got. I assume these represent the average use case.

I am not too disappointed at slightly losing to .PNG in lossless form (lossy compression still wins handily with pretty nice results). I will have to update my posts and blog, but the main point is that these are practical and easy-to-implement ideas for getting small image files while maintaining losslessness.

I have a few more ideas that were more complicated to implement, so I have been putting them off, but maybe I will implement them soon and see how they fare against aggressively optimized .PNG files.


Also, my images will be “free specification” soon. I am not keeping secrets etc., which is why I documented the algorithms roughly in the first place. I will get around to making Photoshop plug-ins and an official specification eventually.


L. Spiro
[quote name='L. Spiro' timestamp='1340862656' post='4953562']
I admit that I just used the .PNG images as they were in the model packages I got. I assume these represent the average use case.[/quote]yeah, you are right, and it's sad that most tools don't have even minimal optimizations for png, mostly just vanilla libpng!

[quote name='L. Spiro' timestamp='1340862656' post='4953562']I am not too disappointed at slightly losing to .PNG in lossless form ... but the main point is that these are practical and easy-to-implement ideas for getting small image files while maintaining losslessness.[/quote]it's easy, but from the algorithm point of view it's also inferior, as you use very basic compression ideas. you should read up on how to combine LZ* algorithms with entropy coders like Huffman and arithmetic coding. additionally, some transformations like the Burrows-Wheeler transform, move-to-front, and prediction coding are very simple to implement and can be done as a pre-step to optimize the data for better compression ratios.
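as an example of how simple such a pre-step can be, here is move-to-front in a few lines (illustrative Python): recently seen bytes get small codes, which skews the output toward zero and helps a following entropy coder.

```python
def mtf_encode(data):
    """Move-to-front transform: emit each byte's position in a
    recency-ordered table, then move that byte to the front."""
    table = list(range(256))
    out = []
    for b in data:
        i = table.index(b)
        out.append(i)
        table.pop(i)
        table.insert(0, b)
    return out

def mtf_decode(codes):
    """Exact inverse: look up the position, then move to front."""
    table = list(range(256))
    out = bytearray()
    for i in codes:
        b = table.pop(i)
        out.append(b)
        table.insert(0, b)
    return bytes(out)
```

on its own it compresses nothing, but after a Burrows-Wheeler transform the output is full of zeros and tiny values, which RLE or an entropy coder then crushes.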
besides compression ratio, it's also important how fast compression/decompression is and how much memory you need. png is very memory friendly: it works in tiny chunks of 64kb, which can be implemented on the SPUs of the Cell (PS3) and also on embedded boards; even on the GBA some games used it.


[quote name='L. Spiro' timestamp='1340862656' post='4953562']...(lossy compression still wins handily with pretty nice results). ...[/quote]feel free to post a png image of a previously lossy compressed LSI. I'm sure png will win once again.


[quote]
I have a few more ideas that were more complicated to implement, so I have been putting them off, but maybe I will implement them soon and see how they fare against aggressively optimized .PNG files.[/quote]you should first find out how exactly png works, to avoid reinventing the wheel; the first few times you will be inferior if you don't learn from the previous work of others, my 2 cents.

[quote]
Also, my images will be “free specification” soon. I am not keeping secrets etc., which is why I documented the algorithms roughly in the first place. I will get around to making Photoshop plug-ins and an official specification eventually.
[/quote]
may I suggest that you stop writing a proprietary format that probably nobody but you will ever use, and invest your time into a good png optimizer instead? your quantization algorithms will probably work just as well when afterwards compressed with png as they do with LSI, and the ratio will probably be better. besides that, png still has a lot of potential to reach higher compression ratios, and if you do it, your png-L.spiro plugin might be used by a lot of people. Isn't that the whole point of making something public?

I don't want to kill your fun or motivation, just some suggestions ;)
I think we have a misunderstanding.
My goal was never to beat .PNG. I know how it works (I wrote the decoder I am using for it), but that was too much trouble considering my goal: a format suited specifically for games, but still small in size for mobile devices and lossless for quality.

That means direct support for a lot of common formats used in games such as R5G6B5, A8, A8L8, R8G8, R16G16B16A16, etc.
That also means embedded mipmaps.
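For illustration only, a container along those lines might carry a header like this (a purely hypothetical layout, not the actual .LSI format; the `TEX0` magic and field order are made up):

```python
import struct

# Hypothetical little-endian header for a game texture container:
# 4-byte magic, pixel-format id, width, height, and a mipmap count
# so all mip levels can live in one file.
HEADER = struct.Struct("<4sHHHH")

def write_header(magic, fmt, width, height, mips):
    """Pack the header fields into 12 bytes."""
    return HEADER.pack(magic, fmt, width, height, mips)

def read_header(blob):
    """Unpack the header fields from the start of a file blob."""
    return HEADER.unpack_from(blob)
```

The mip count up front lets a loader stream the exact levels it needs without parsing the whole file.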


As for who may or may not end up using my plug-in or format, the goal of my engine is commercial and if it does well enough I will use it to start a middleware company. So it would be unhealthy for me to doubt whether or not people will end up using it.
Right or wrong, everything I make is made with the assumption that people will be using it someday.


Maybe someday I will get around to writing an encoder for .PNG and start applying it to my format, but I have a lot to do—I haven’t even had the time to finish my DXT compressor and make a release—so it is not a high priority right now.


L. Spiro
