This is taking premature optimization to a whole new level.
I know people often bring up that Knuth quote... but in my experience, while you do need to beware the pitfalls, it can be a very good idea to think about an efficient approach to your whole problem from the start. Plenty of projects and businesses have failed because nobody took the time to think carefully about the best way of solving a problem before diving in.
Alright, you probably won't get it right the first time, and you'll typically end up redoing it several times before you arrive at your current 'best solution', but you can still save yourself a lot of wasted effort. The big point the 'premature optimization' argument misses is that in many business environments you might never get the opportunity to refactor (try explaining software development to a bunch of clueless number crunchers) — you might only get one shot.
Typically you'd want to liaise with a programmer and have them write a small benchmark for the kind of calculations you'll be doing on each unit, to get some idea of how many units are practical, and design around that.
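A minimal sketch of that kind of benchmark, in Python — the `update_unit` function here is a made-up placeholder; you'd substitute whatever calculation each of your units actually performs:

```python
import time

def update_unit(state):
    # Placeholder for the real per-unit calculation --
    # swap in whatever maths each unit actually does.
    x, y, vx, vy = state
    return (x + vx, y + vy, vx * 0.99, vy * 0.99)

def units_per_frame(frame_budget_s=1 / 60, sample=100_000):
    """Estimate how many units fit in one frame's time budget."""
    states = [(0.0, 0.0, 1.0, 1.0)] * sample
    start = time.perf_counter()
    for s in states:
        update_unit(s)
    elapsed = time.perf_counter() - start
    per_unit = elapsed / sample
    return int(frame_budget_s / per_unit)

print(units_per_frame())
```

The number you get back is only an order-of-magnitude estimate (interpreter overhead, cache effects, and the real calculation all change it), but even that rough figure is enough to tell you whether your design is off by a factor of ten.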
But other than that, your problem domain is a bit too vague for specific recommendations. For instance: is this a scientific application? Do the results for each unit need to be exact, or can they be estimated? Is it running on one machine or several? GPU, multithreading, etc.?