[java] Compiling 3d-objects in jdk118??

Started by jonn · 4 comments, last by jonn 22 years, 10 months ago
I made a file converter from .obj to a Java file, so that it simply makes three lists of vector components - double[] verts, double[] uv, int[] polys - which worked fine on small files. But with bigger models (2000+ points) the jdk118 compiler chokes and gives an OutOfMemoryError?? I tried to increase the max heap size, but it doesn't seem to have any effect, as the task manager (I'm on w2k) indicates that the memory consumption rises only a few megs when compiling (*not* as a result of increasing the heap size).

Has anybody had similar problems? Could the problem be that I declare the arrays implicitly, as in int[] tris = { 1, 2, 3, 4, ..., n };? Maybe the interpreter reads it in in some awkward way and runs out of memory? How else could I store the vertex data? How do you usually load 3d objects into your engines?

Jonn :-?

Edited by - jonn on May 30, 2001 7:56:26 AM
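[Editor's note: a hedged sketch of the kind of generated file described above; the class and field names are illustrative, not from the thread. The relevant point is that the compiler turns an array initializer into bytecode that stores each element one by one inside a single initializer method, so a 2000+ point model yields one enormous method:]

```java
// Illustrative sketch of a converter-generated data class.
// javac compiles the initializer below into a single <clinit> method
// containing one store instruction sequence per element, so thousands
// of components produce one very large method body.
public class CubeData {
    public static final double[] verts = {
        -1.0, -1.0, -1.0,   1.0, -1.0, -1.0,
         1.0,  1.0, -1.0,  -1.0,  1.0, -1.0
        // ... a real generated file would continue for thousands of components
    };
    public static final int[] polys = { 0, 1, 2, 3 /* ... */ };
}
```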
Are you using recursion? If so, it might be the stack running out of memory.

John
no, I've made the arrays into an object, and access them through that object in normal loops..
One of the problems may be the fact that the Java bytecode verifier ensures that several different things are limited to 64k. This includes the runtime stack and each method. I don't know where you're putting the data, so I'm not sure how much of a problem this might be.
Ack... Not the runtime stack. The operand stack. My mistake there. This is a case where doing some research into the JVM specs would help quite a bit. I'm doing some of said research because my compilers course is having us target the JVM.
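[Editor's note: a common workaround for the per-method bytecode size limit is to split the initialization across several helper methods, so no single method exceeds the limit. A minimal sketch with illustrative names, not the poster's actual code:]

```java
// Sketch: instead of one giant array initializer (which compiles into
// one oversized method), fill the array in several smaller static
// methods, each well under the JVM's per-method code size limit.
public class SplitData {
    public static final double[] verts = new double[6];

    static {
        initPart0();
        initPart1();
    }

    private static void initPart0() {
        verts[0] = -1.0; verts[1] = -1.0; verts[2] = -1.0;
    }

    private static void initPart1() {
        verts[3] = 1.0; verts[4] = 1.0; verts[5] = 1.0;
    }
}
```

A converter can emit as many `initPartN` methods as the model needs, chunking a few hundred assignments into each.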
hmm.. what's the operand stack? I don't know that much about compilers/interpreters, only some vague memories from the compulsory 'scheme' course (which seems to be a universal plague :-P). Anyway, it seems to me that if there's a 64k limit, shouldn't there be some way the JVM handles situations where over 64k is needed? But in general, if there's some 64k limit, it seems possible the verifier chokes on the implicitly defined arrays.. gotta check if there's any indication that, for example, exceeding 64k in an array makes any difference...

And another thing: the class containing the vertex data compiles OK by itself; the problem occurs when I try to compile the calling class. In the calling class I create only one instance of that vertex-data class, and access it in a very straightforward way, so I really can't figure out what's choking the compiler.. I found something on this kind of a bug in jdk118, but the workaround was to increase the max heap size of the interpreter, which didn't do anything in my case.. :-(

maybe I should just try storing the data in some different way..
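[Editor's note: the usual "different way" is to not compile the data at all - store the converted model as a binary file and read it at runtime. A hedged sketch; the file layout below (vertex count, then doubles, then index count, then ints) is an assumption for illustration, not a format from the thread:]

```java
import java.io.*;

// Sketch: load mesh data from a stream at runtime instead of baking it
// into a .java file, sidestepping compiler and method-size limits.
// Assumed layout: int vertCount, vertCount doubles, int polyCount,
// polyCount ints.
public class MeshLoader {
    public double[] verts;
    public int[] polys;

    public void load(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(new BufferedInputStream(in));
        int vertCount = data.readInt();
        verts = new double[vertCount];
        for (int i = 0; i < vertCount; i++) {
            verts[i] = data.readDouble();
        }
        int polyCount = data.readInt();
        polys = new int[polyCount];
        for (int i = 0; i < polyCount; i++) {
            polys[i] = data.readInt();
        }
        data.close();
    }
}
```

The converter would then write the matching binary file with `DataOutputStream` instead of generating Java source. This also works under JDK 1.1.8, since `DataInputStream` has been in java.io since JDK 1.0.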

