RSA Computational Complexity

5 comments, last by rip-off 13 years, 3 months ago
Hi Everyone,

I am looking for an explanation of the RSA crypto algorithm's computational complexity on an 8- or 16-bit processor. How much time will it take to generate a 512-bit key on an 8/16-bit processor? Where can I find an explanation of that? Any help would really be appreciated. Waiting for an urgent reply.
The computational complexity has nothing to do with the processor's word size. What exactly are you trying to do?

Generating a key mostly consists of trying to find primes of a certain size. The primality check can be done in O(log(n)^3) using the probabilistic Miller-Rabin test without a fancy multiplication algorithm, and you have to check about log(n) numbers, so the total complexity is about O(log(n)^4).
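If it helps to see it concretely, here is a minimal Python sketch of the Miller-Rabin test (the function name and the fixed round count are my own choices for illustration, not from any particular library):

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin probabilistic primality test.

    Returns False if n is definitely composite, True if n is
    probably prime (error probability at most 4**-rounds).
    """
    if n < 2:
        return False
    # Handle a few small primes directly.
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)          # one modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # a is a witness that n is composite
    return True
```

The pow(a, d, n) call is one modular exponentiation, roughly log(n) multiplications of log(n)-bit numbers, which is where the log(n)^3 per test comes from with schoolbook multiplication.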
OK, and what will be the value of "n" in the logs that you have stated? Is that the size of the prime number in bits?
And secondly, you said that the time taken to generate keys is independent of the underlying processor. I'm a bit skeptical of that. I would think the time taken would differ significantly between an 8-bit and a 32-bit processor, because one operates on 8 bits at a time while the other can handle 32. How would you account for that?

Computational complexity is usually stated in terms of the input size as a bound on the number of operations.
When you have an O(n) operation, it runs in c1 * n + c2 time. Changing the processor's word size changes c1. That changes the ACTUAL time your application takes, but it doesn't change the O-notation time. That's an important distinction. Let's say going from 8-bit to 32-bit is a 16x speed improvement: if your algorithm runs in O(n^2) time, the 16x gain only lets you process 4x more items in the same timespan; if your algorithm runs in O(2^n) time, you only get through 4 more items. That's why O-notation is so prevalent: it ignores those constant speed-up factors. Moving from an O(2^n) algorithm to an O(n^2) algorithm will gain you more than any constant speed-up factor.
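To put concrete numbers on that (taking the hypothetical 16x figure from above), here's the arithmetic as a quick Python check:

```python
import math

speedup = 16  # hypothetical constant-factor gain (e.g. 8-bit -> 32-bit word size)

# O(n^2): time grows as n^2, so a 16x time budget handles sqrt(16) = 4x larger n.
print(math.sqrt(speedup))   # 4.0 -> 4x more items in the same timespan

# O(2^n): time grows as 2^n, so a 16x budget buys only log2(16) = 4 extra items.
print(math.log2(speedup))   # 4.0 -> just 4 additional items
```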
Do me a favour, please: can you work out the computational complexity of 512-bit RSA on an 8-bit processor for me?
A bit of stepwise detail, please.
The computational complexity does not change.

Computational complexity is designed to abstract away exactly these kinds of constant factors, because you are only concerned with the "big picture" when you are using such analysis.

To find out how fast it will be on an 8 bit processor, find or write an implementation and measure. If the processor is simple enough that every instruction has a fixed execution time and there are no page faults to worry about, and if you can fit your working set in the processor cache, you might be able to derive execution time from doing some math on the instructions that would be executed.
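For instance, here's a rough sketch of that kind of measurement in Python (this assumes the is_probable_prime sketch from earlier in the thread; on a real 8-bit part you would port it to C and read a hardware timer instead):

```python
import random
import time

def random_prime(bits=512):
    # Draw random odd candidates with the top bit set until one passes the test.
    while True:
        candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(candidate):  # Miller-Rabin sketch from earlier
            return candidate

start = time.perf_counter()
p = random_prime(512)
print(f"512-bit probable prime found in {time.perf_counter() - start:.3f} s")
```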

This topic is closed to new replies.
