The issue here isn't obtaining the tokens; it's associating the tokens with their respective classes. Since each class is potentially different, you need to deal with each on a case-by-case basis. Thus, as an extremely simple example, you'd want to do something like:
import java.io.File;
import java.io.FileNotFoundException;  // thrown by the Scanner constructor below
import java.util.Scanner;

final Scanner read = new Scanner(new File("inventory.txt"));
String[] temp;
while (read.hasNextLine()) {
    // one item per line; the amount of whitespace between tokens is irrelevant
    temp = read.nextLine().trim().split("\\s+");
    // the first token should always be the class name ...
    if (temp[0].equals("Sword")) {
        // generally you'd want this in a try/catch block of some kind
        final int number0 = Integer.parseInt(temp[1]);
        // create your object, etc.
    }
    else if (temp[0].equals("Shield")) { ... }
    // etc.
}
You'll need to do this even if you leverage
Scanner's regex capabilities, since you have no way of generically creating classes with correctly numbered and typed arguments on the fly. If you include the class name as an identifier in the file, you could potentially use reflection, although that is something of a dubious solution here.
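To illustrate the reflection idea, here is a minimal sketch. The nested Sword class and the one-int-constructor convention are assumptions made for the example; in a real program each item type would be its own class and you'd prefix the token with the right package name.

```java
import java.lang.reflect.Constructor;

public class ReflectiveFactory {
    // Hypothetical item class; stands in for whatever classes your file names.
    public static class Sword {
        public final int damage;
        public Sword(int damage) { this.damage = damage; }
    }

    // Look the class up by the name token from the file and invoke its
    // (int) constructor. This only works if every item class follows the
    // same constructor convention - hence the "dubious" caveat.
    static Object create(String name, int arg) throws Exception {
        Class<?> cls = Class.forName("ReflectiveFactory$" + name);
        Constructor<?> ctor = cls.getConstructor(int.class);
        return ctor.newInstance(arg);
    }
}
```

A call such as `ReflectiveFactory.create("Sword", 7)` then builds the object directly from the tokens, but any typo in the file surfaces only at runtime as a ClassNotFoundException.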
If your items are implemented as simple POJOs, I would use object serialization with
ObjectInputStream instead.
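A minimal sketch of that approach, using byte-array streams so the example is self-contained (with a real file you would substitute FileOutputStream/FileInputStream; the Sword POJO is again hypothetical):

```java
import java.io.*;

public class SerializationDemo {
    // Hypothetical POJO; any Serializable item class works the same way.
    static class Sword implements Serializable {
        private static final long serialVersionUID = 1L;
        final int damage;
        Sword(int damage) { this.damage = damage; }
    }

    // Write the object out and read it back - no hand-written parsing involved.
    static Sword roundTrip(Sword s) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(s);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return (Sword) in.readObject();
        }
    }
}
```

The trade-off is that the inventory file is no longer human-readable or hand-editable.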
To state the obvious, using a well-known format such as JSON or XML frees you from the arduous task of parsing the file directly, which may be helpful if you find yourself generating a lot of files with variable syntax.
Other than that, an easier method is to separate the file I/O from the object creation - read your data into a table (or
Map) of some kind and create your objects only when you need them. Use some kind of ID system so you can query for specific item properties. That way, you don't have to neurotically parse the file on a token-by-token basis.
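That separation might look something like the sketch below. The ItemTable class and its use of the first token as the ID are assumptions for the example; in the real program the lines would come from a Scanner over the file rather than being passed in directly.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ItemTable {
    // Raw parsed rows keyed by the first token; no objects are built yet.
    private final Map<String, String[]> rows = new HashMap<>();

    ItemTable(List<String> lines) {
        for (String line : lines) {
            String[] fields = line.trim().split("\\s+");
            rows.put(fields[0], fields);  // fields[0] serves as the item ID
        }
    }

    // Query a single property without constructing the full object;
    // object creation can happen later, only for the items actually used.
    String property(String id, int index) {
        String[] fields = rows.get(id);
        return fields == null ? null : fields[index];
    }
}
```

For example, `new ItemTable(List.of("Sword 10", "Shield 5")).property("Sword", 1)` yields "10" without ever instantiating a Sword.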
"StringTokenizer is deprecated and you should use the split method of the String class instead."
While not the case here, StringTokenizer still has its uses in select circumstances (strictly speaking, it is a legacy class rather than a deprecated one).
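One concrete difference worth knowing: StringTokenizer silently skips empty tokens, whereas split keeps them, which matters for delimiter-separated data with empty fields. A small sketch (the tokenize helper is just for the example):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class TokenDemo {
    // Collect all tokens; adjacent delimiters yield no empty entries.
    static List<String> tokenize(String s, String delim) {
        List<String> out = new ArrayList<>();
        StringTokenizer st = new StringTokenizer(s, delim);
        while (st.hasMoreTokens()) {
            out.add(st.nextToken());
        }
        return out;
    }
}
```

Given "a,,b", the helper returns [a, b], while "a,,b".split(",") returns a three-element array containing an empty string in the middle.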