You can't scale all aspects, unfortunately, unless you create some kind of super prediction rules, and even then memory will limit you. You just can't run Doom 3 on a Game Boy :-) If something runs at 28 Hz instead of 60, though, you could potentially scale it by taking larger time jumps or similar, but everything has a minimum performance level below which the user just gets garbage to look at.
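A minimal sketch of that kind of time scaling (all names here are mine, not from any particular engine): a fixed-timestep accumulator that caps the number of simulation substeps per frame, so a slow ~28 Hz frame takes fewer effective steps instead of spiraling, at the cost of dropping simulated time.

```python
# Sketch (hypothetical names): fixed-timestep accumulator with a substep
# cap, so a slow renderer takes fewer steps instead of spiraling.

SIM_DT = 1.0 / 60.0      # the solver's preferred step
MAX_STEPS = 4            # minimum-quality floor: beyond this we drop time

def advance(state, frame_dt):
    """Advance the simulation by up to frame_dt seconds; returns steps taken."""
    state["accumulator"] += frame_dt
    steps = 0
    while state["accumulator"] >= SIM_DT and steps < MAX_STEPS:
        state["t"] += SIM_DT          # stand-in for the real physics step
        state["accumulator"] -= SIM_DT
        steps += 1
    # if we hit the cap, discard leftover time -- the "wrong state" trade-off
    if steps == MAX_STEPS:
        state["accumulator"] = 0.0
    return steps

state = {"t": 0.0, "accumulator": 0.0}
advance(state, 1.0 / 28.0)   # a slow 28 Hz frame -> 2 substeps
```

The cap is the key design choice: without it, a frame that takes longer than `MAX_STEPS * SIM_DT` of simulation would queue up even more work for the next frame.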
You can scale in space (resolution) and time (samples), but memory and interaction are hard to scale, as they usually require a certain number of solver iterations before they "settle".
I wish there were a function f(time) that just magically gave the answer, but most of these systems require some kind of integration over time.
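To illustrate why (a sketch with made-up constants, not any real engine's code): a falling body with quadratic air drag has no convenient closed-form position f(time), so the state at time t is only reachable by stepping through everything before it.

```python
# Sketch (hypothetical names/values): a falling body with quadratic drag.
# No simple f(t) hands you the position; you have to integrate step by step.

G = 9.81        # gravity, m/s^2
DRAG = 0.05     # quadratic drag coefficient (made-up value)
DT = 1.0 / 60.0

def step(pos, vel):
    """One semi-implicit Euler step."""
    accel = -G - DRAG * vel * abs(vel)   # drag always opposes motion
    vel += accel * DT
    pos += vel * DT
    return pos, vel

pos, vel = 100.0, 0.0
for _ in range(60):          # simulate one second of falling
    pos, vel = step(pos, vel)
```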
Your best bet is to limit the number of objects and to solve costly interaction operations less frequently (for example, by doing more dynamics for closer objects and less for those far away). This can yield wrong states, but that's what you pay for ultimate scalability.
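That near/far split can be sketched as a simple distance-based level of detail for the solver (everything below is hypothetical illustration, not a real API): nearby objects get a dynamics update every frame, while far ones are staggered to update every Nth frame.

```python
# Sketch (hypothetical names): distance-based LOD for dynamics updates.
# Close objects step every frame; far objects step once every Nth frame,
# staggered by id so the work spreads evenly across frames.

NEAR_DIST = 20.0
FAR_INTERVAL = 4        # far objects update once every 4 frames

def objects_to_update(objects, camera_pos, frame_index):
    """Pick which objects get a full dynamics step this frame."""
    selected = []
    for obj in objects:
        dist = abs(obj["pos"] - camera_pos)   # 1-D stand-in for distance
        if dist <= NEAR_DIST:
            selected.append(obj)              # close: every frame
        elif frame_index % FAR_INTERVAL == obj["id"] % FAR_INTERVAL:
            selected.append(obj)              # far: 1-in-4 frames, staggered
    return selected

objs = [{"id": i, "pos": float(i * 10)} for i in range(8)]
due = objects_to_update(objs, camera_pos=0.0, frame_index=0)
```

The staggering by `id % FAR_INTERVAL` matters: without it, all far objects would update on the same frame and you would just move the spike around instead of flattening it.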