thedodgeruk

finding speed on different machines?

Recommended Posts

I am doing a uni project to make an engine. There are several machines we use, all with different specs. I can output how long it takes to render a frame in milliseconds, but on different machines the same scene will take a different amount of time.

I'm in the middle of optimising it and making changes.

Is there a figure I can work out, using the internal clock speed, that gives me a standard amount of time (real or imaginary) on every machine?

I.e. the same imaginary time would pass on every machine for the same frame render, so I can figure out whether or not I've made an improvement.

Generally speaking, I don't think there is a way to do this. Even if you try to do some hocus-pocus with the CPU clock speed, one CPU may be far more efficient than another at the same clock speed. You may also run into issues with power saving, overclocking, and any number of other effects that modify CPU performance. Then there's memory throughput and so on.

Usually what you do is write a [i]benchmark[/i] - a standard load of work that will be done the same on every machine. First you run the benchmark to see how much work a particular machine can do; say it runs your test workload in 43.6 seconds. Then you write optimized versions of the code and compare how fast [i]those[/i] run to the benchmark; if your optimized version runs in 41.0 seconds, you made an improvement; if it goes to 55 seconds, something went wrong.
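If it helps, here is a rough sketch of the kind of timing harness you would run on each machine (in C++, assuming that's what your engine is written in). The renderScene() stub is only a placeholder for a call into your own engine that renders one frame of a fixed test scene, and kFrames is just a number I picked; the point is that the workload is identical everywhere and you compare each optimised build against the baseline recorded on that same machine.

[code]
// Minimal benchmark harness sketch (C++). renderScene() below is only a
// placeholder; the timing code around it is the part that matters.
#include <chrono>
#include <cmath>
#include <cstdio>

// Placeholder workload - replace with your engine rendering one frame of the
// agreed test scene.
void renderScene()
{
    volatile double sink = 0.0;
    for (int i = 1; i < 100000; ++i)
        sink = sink + std::sqrt(static_cast<double>(i));
}

int main()
{
    const int kFrames = 1000; // identical frame count on every machine

    // Warm-up frames so cache/driver start-up doesn't skew the first samples.
    for (int i = 0; i < 10; ++i)
        renderScene();

    const auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < kFrames; ++i)
        renderScene();
    const auto end = std::chrono::steady_clock::now();

    const double totalMs =
        std::chrono::duration<double, std::milli>(end - start).count();
    std::printf("total: %.1f ms, average per frame: %.3f ms\n",
                totalMs, totalMs / kFrames);
    return 0;
}
[/code]

Run it a few times on each machine and take the median, since other processes on shared lab machines add noise, and always compare an optimised build against the baseline from the [i]same[/i] machine rather than across machines.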
