Compressing LDraw database

One of the main goals for QLMesh was to add some new formats I have been working with quite often, like Photoshop files or BDF fonts.

For 3D these are the LDraw formats and DAZ Studio models.

LDraw is one of my favourites. I am currently working on extending Assimp to support .ldr and .mpd files. One of the major challenges is actually not the drawing but embedding the library definitions into the plugin. The original library is about 250MB (compressed to ~40MB). That's quite large for a QuickLook plugin. I started to work on some heavy compression/optimization and the current result is:

-rw-r--r--  1 piecuchp  staff    40M May 12 17:18 parts.db
-rw-r--r--  1 piecuchp  staff   2.2M May 12 17:18 parts.db.gz

That's much better. 2MB can be easily embedded into the plugin, e.g. using an assembler module like this:

bits 64

section .rodata

global _ldrawlib
global _ldrawlib_end
global _ldrawlib_size

_ldrawlib:      incbin "parts.db.gz"
_ldrawlib_end:
_ldrawlib_size: dd $-_ldrawlib

and later build it with e.g. nasm:

/opt/local/bin/nasm -fmacho64 ldraw_lib.asm -o ldraw_lib.o



Sometimes less is more. While working on reading the gzip stream, I had to remove one of the compression optimizations. The uncompressed file is slightly bigger, but the compressed one is much smaller:

-rw-r--r--  1 piecuchp  staff    41M Jun 17 12:03 parts.db
-rw-r--r--  1 piecuchp  staff   1.5M Jun 17 12:03 parts.db.gz


Sadly, this is not the end of the story :) I had to increase the precision of the float numbers in the database (it is now 17 bits: sign:8bit:8bit). This increased the size but also significantly hurt the compression ratio:

-rw-r--r--  1 piecuchp  staff    67M Jul 11 08:55 parts.db
-rw-r--r--  1 piecuchp  staff    41M Jul 11 08:55 parts.db.gz

Seems like I am going to have to live with such a database for a while.


