Storyyeller

strange performance patterns


I have a python script which does some calculations, and I am trying to figure out how fast it is. I invoke it from the command line and it prints out the time taken after completing.

However, there is a mysterious pattern to the timings. When I first run it, it starts out relatively fast. However, subsequent runs become slower and slower. If I leave the computer idle for a while, it becomes fast again. I can't really think of any explanation for this. What is going on?
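(For context, the setup described presumably looks something like the minimal timing wrapper below - a sketch, not the actual script; `calculate` is a hypothetical stand-in for the real calculations.)

```python
import time

def calculate():
    """Hypothetical stand-in for the script's calculations."""
    return sum(i * i for i in range(200000))

# Time one full run and report it, as the script described above does.
start = time.time()
result = calculate()
elapsed = time.time() - start
print("took %.3f seconds" % elapsed)
```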

What is the difference in time? Have you run your code through a profiler? (Have a look at cProfile: http://docs.python.org/library/profile.html.)

What OS?
What version of Python?
Can you give a basic description of what your code is doing?
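(Illustrating the suggestion above: one way to profile a run with cProfile and capture a sortable report is sketched below; `busy_work` is a hypothetical stand-in for the script's calculation, not code from the thread.)

```python
import cProfile
import io
import pstats

def busy_work(n):
    """Hypothetical stand-in for the real script's calculation."""
    total = 0
    for i in range(n):
        total += i * i
    return total

# Profile one run and capture the stats report as text.
profiler = cProfile.Profile()
profiler.enable()
result = busy_work(100000)
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats('cumulative').print_stats()
report = buf.getvalue()
print(report)
```

Profiling both a fast run and a slow run this way and diffing the reports is usually enough to show where the extra time goes.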


However, there is a mysterious pattern to the timings. When I first run it, it starts out relatively fast. However, subsequent runs become slower and slower. If I leave the computer idle for a while, it becomes fast again. I can't really think of any explanation for this. What is going on?

Does your script use vast amounts of memory? We sometimes observe this kind of behaviour in scripts that consume 8GB+ of memory (and thus spill into virtual memory) - it seems the OS often takes a while to release all that memory, which slows down the next invocation.

My OS is Windows 7 and I'm using Python 2.6.

As for memory, according to Task Manager the python.exe process peaks at around 6 MB, so I doubt that's the problem.


If the timing of the overall script with a simple timer is inconsistent, how would in-depth profiling help?

You profile several runs, making sure to profile both a quick and a slow run. Compare where the majority of time is spent in each, and you may just have your culprit...

While profiling, I noticed that a large amount of time is spent finding the set bits in an integer. Does anyone know the most efficient way to do this? Here's what I have right now. Ideally there would be a C implementation, but there doesn't appear to be one in the standard library, and I don't want to require other libraries.

[source lang='python']def toList(bs):
    assert bs >= 0  # negative numbers have infinitely many bits set
    return [i for i, b in enumerate(reversed(bin(bs))) if b == '1'][/source]

Read all of BitManipulation. In particular, you might be able to build some variation of Kernighan/Knuth's bitCount() method, to iterate bits instead.

Your current solution uses a string conversion, a reversal, an enumeration and a list comprehension all together - that can't be very fast.
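(For reference, the Kernighan/Knuth trick clears the lowest set bit each pass via `bs & (bs - 1)`. The sketch below adapts it to list set-bit indices - a hedged alternative, not code from the thread; note `int.bit_length()` only exists from Python 2.7 onward, so it wouldn't run on the OP's 2.6.)

```python
def set_bits_kernighan(bs):
    """List the indices of set bits, looping once per set bit
    rather than once per bit position."""
    assert bs >= 0  # negative numbers have infinitely many bits set
    indices = []
    while bs:
        low = bs & -bs                        # isolate the lowest set bit
        indices.append(low.bit_length() - 1)  # its index
        bs &= bs - 1                          # clear it
    return indices
```

For sparse integers this does far fewer iterations than a per-bit loop, though in CPython the string-based version can still win because the work happens in C.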

That page looks out of date and inaccurate. Most of the methods involving manual loops are very slow in Python. (This is actually what I tried before I came up with the faster version posted above).
