[java] Compiling 3d-objects in jdk118??



#1 jonn   Members   -  Reputation: 122

Posted 30 May 2001 - 12:37 AM

I made a file converter from .obj to a Java source file, so that a model is simply three arrays of vector components: double[] verts, double[] uv, int[] polys. This worked fine on small files, but with bigger models (2000+ points) the jdk118 compiler chokes and gives an OutOfMemoryError.

I tried to increase the max heap size, but it doesn't seem to have any effect; the Task Manager (I'm on W2K) indicates that memory consumption rises only a few megs when compiling, *not* as a result of increasing the heap size. Has anybody had similar problems?

Could the problem be that I declare the arrays implicitly, as in int[] tris = { 1, 2, 3, 4, ... , n };? Maybe the compiler reads them in some awkward way and runs out of memory. How else could I store the vertex data? How do you usually load 3D objects into your engines?

Jonn :-?

Edited by - jonn on May 30, 2001 7:56:26 AM
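For reference, a minimal sketch of the kind of generated source file being described here (the class name MeshData and the values shown are hypothetical, not taken from the post):

// Hypothetical output of the .obj-to-Java converter described above.
public class MeshData {
    // x, y, z components per vertex
    public static double[] verts = {
        -1.0, -1.0, -1.0,
         1.0, -1.0, -1.0
        // ... thousands more components for a 2000+ point model
    };
    // u, v texture coordinates per vertex
    public static double[] uv = {
        0.0, 0.0,
        1.0, 0.0
        // ...
    };
    // vertex indices per polygon
    public static int[] polys = {
        0, 1, 2, 3
        // ...
    };
}

The detail that matters later in the thread is that each element of an initializer like this is compiled into bytecode that stores one entry at a time, so the generated class initializer grows with the model size.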

#2 lilspikey   Members   -  Reputation: 122

Posted 30 May 2001 - 01:16 AM

Are you using recursion? If so, it might be the stack running out of memory.

John

#3 jonn   Members   -  Reputation: 122

Posted 30 May 2001 - 05:58 AM

No, I've made the arrays into an object, and access them through that object in normal loops...

#4 c_wraith   Members   -  Reputation: 122

Posted 30 May 2001 - 06:52 AM

One of the problems may be the fact that the Java bytecode verifier ensures that several different things are limited to 64K. This includes the runtime stack and each method. I don't know where you're putting the data, so I'm not sure how much of a problem this might be.
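To make that limit concrete: an array initializer is compiled into bytecode that stores each element individually inside the class initializer, so the method body grows roughly linearly with the array length. The generator below is a sketch (it is not code from the thread) that produces a source file big enough to trip the 64K-per-method limit; a modern javac rejects the result with "code too large", while jdk118, per this thread, runs out of memory instead.

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

// Emits Big.java containing one huge array initializer. Each element
// costs several bytes of bytecode in the class initializer, so tens of
// thousands of elements exceed the 64K-per-method limit.
public class GenBig {
    public static void main(String[] args) throws IOException {
        PrintWriter out = new PrintWriter(new FileWriter("Big.java"));
        out.println("public class Big {");
        out.println("    public static int[] tris = {");
        for (int i = 0; i < 40000; i++) {
            out.print(i);
            out.print(i % 20 == 19 ? ",\n" : ", ");
        }
        out.println("    };");
        out.println("}");
        out.close();
    }
}

Splitting the data across several smaller classes keeps each initializer under the limit; loading the data from a file at runtime (see the sketch at the end of the thread) avoids the problem entirely.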

#5 c_wraith   Members   -  Reputation: 122

Posted 30 May 2001 - 06:54 AM

Ack... Not the runtime stack. The operand stack. My mistake there. This is a case where doing some research into the JVM specs would help quite a bit. I'm doing some of said research because my compilers course is having us target the JVM.

#6 jonn   Members   -  Reputation: 122

Posted 31 May 2001 - 11:05 PM

Hmm... what's the operand stack? I don't know that much about compilers/interpreters, only some vague memories from the compulsory 'Scheme' course (which seems to be a universal plague :-P). Anyway, it seems to me that if there's a 64K limit, shouldn't there be some way the JVM handles situations where more than 64K is needed? But in general, if there is such a limit, it seems possible the verifier chokes on the implicitly defined arrays. I've got to check whether, for example, exceeding 64K in an array makes any difference...

Another thing: the class containing the vertex data compiles OK by itself; the problem occurs when I try to compile the calling class. In the calling class I create only one instance of that vertex-data class and access it in a very straightforward way, so I really can't figure out what's choking the compiler. I found something about this kind of bug in jdk118, but the workaround was to increase the interpreter's max heap size, and that didn't do anything in my case... :-(

Maybe I should just try storing the data in some different way...
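One common way to do that, sketched below under assumed names (the class MeshLoader and the file layout are illustrative, not from the thread): have the converter write a flat binary file instead of Java source, and read it back at runtime with DataInputStream. The vertex data then never has to pass through the compiler at all.

import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Loads the three arrays from a binary file laid out as:
// [int n][n doubles verts][int m][m doubles uv][int k][k ints polys]
public class MeshLoader {
    public double[] verts;
    public double[] uv;
    public int[] polys;

    public void load(String fileName) throws IOException {
        DataInputStream in = new DataInputStream(
                new BufferedInputStream(new FileInputStream(fileName)));
        verts = new double[in.readInt()];
        for (int i = 0; i < verts.length; i++) verts[i] = in.readDouble();
        uv = new double[in.readInt()];
        for (int i = 0; i < uv.length; i++) uv[i] = in.readDouble();
        polys = new int[in.readInt()];
        for (int i = 0; i < polys.length; i++) polys[i] = in.readInt();
        in.close();
    }
}

The matching writer uses DataOutputStream with the same layout; both classes exist in Java 1.1, so this approach works under jdk118.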



