# RSA Computational Complexity


## Recommended Posts

Hi Everyone,

I am looking for an explanation of the RSA algorithm's computational complexity on an 8- or 16-bit processor. How much time will it take to generate a 512-bit key on an 8- or 16-bit processor, and where can I find an explanation of that? Any help would really be appreciated.

##### Share on other sites
The computational complexity has nothing to do with the processor's word size. What exactly are you trying to do?

Generating a key mostly consists of trying to find primes of a certain size. A primality check can be done in O(log(n)^3) using the probabilistic Miller-Rabin test without a fancy multiplication algorithm, and by the prime number theorem you have to check about log(n) candidates before hitting a prime, so the total complexity is about O(log(n)^4).
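The prime search described above can be sketched as follows. This is a minimal illustration, not production crypto code (a real implementation would use a cryptographically secure random source and a vetted library):

```python
import random

def is_probable_prime(n, rounds=40):
    """Probabilistic Miller-Rabin primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)          # modular exponentiation: O(log(n)^3) with schoolbook multiply
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False          # a is a witness that n is composite
    return True

def random_prime(bits):
    """Sample random odd candidates of the given bit length until
    one passes Miller-Rabin; on average about log(n) tries are needed."""
    while True:
        c = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(c):
            return c
```

For a 512-bit RSA modulus you would call `random_prime(256)` twice and multiply the results; each candidate test is dominated by the modular exponentiations inside Miller-Rabin.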

##### Share on other sites
Ok, and what will be the value of "n" in the logs that you have stated? Is that the size of the prime number in bits?

##### Share on other sites
And secondly, you said that the time taken to generate keys is independent of the underlying processor. I am a bit concerned about that. I would think the time taken would differ significantly between an 8-bit and a 32-bit processor, because one can operate on 8 bits at a time and the other on 32. How would you approach that?

##### Share on other sites
> And secondly, you said that the time taken to generate keys is independent of the underlying processor. I am a bit concerned about that. I would think the time taken would differ significantly between an 8-bit and a 32-bit processor, because one can operate on 8 bits at a time and the other on 32. How would you approach that?

Computational complexity is usually stated in terms of the input size as a bound on the number of operations.
When you have an O(n) operation, it runs in c1 * n + c2 time. Changing the processor's word size changes c1. That changes the *actual* time your application takes, but it doesn't change the O-notation complexity. That's an important distinction: say going from 8-bit to 32-bit gives a 16x speed improvement. If your algorithm runs in O(n^2) time, the 16x gain only gets you 4x more items done in the same timespan. If your algorithm runs in O(2^n) time, you get only 4 more items done. That's why O-notation is so prevalent: it ignores those constant speed-up factors. Moving from an O(2^n) algorithm to an O(n^2) algorithm will gain you more than any constant speed-up factor.
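The arithmetic behind that claim can be checked in a couple of lines (the 16x figure is the hypothetical speedup from the post above, not a measured number):

```python
import math

# Hypothetical constant-factor speedup from a wider word size.
speedup = 16

# O(n^2): time grows like n^2, so a 16x speedup lets the
# solvable problem size grow by a factor of sqrt(16) = 4.
quadratic_gain = math.sqrt(speedup)      # multiplicative: 4x more items

# O(2^n): time grows like 2^n, so a 16x speedup lets n
# grow only by log2(16) = 4 additional items.
exponential_gain = math.log2(speedup)    # additive: 4 more items
```

The multiplicative-versus-additive difference is the whole point: for the exponential algorithm the hardware upgrade buys a fixed handful of extra items, no matter how large the problem already is.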

##### Share on other sites
Do me a favour, please: can you work out the computational complexity of 512-bit RSA on an 8-bit processor for me, with a bit of stepwise detail?

##### Share on other sites
The computational complexity does not change.

Computation complexity is designed to eliminate these kinds of values, because you are only concerned with the "big picture" when you are using such analysis.

To find out how fast it will be on an 8 bit processor, find or write an implementation and measure. If the processor is simple enough that every instruction has a fixed execution time and there are no page faults to worry about, and if you can fit your working set in the processor cache, you might be able to derive execution time from doing some math on the instructions that would be executed.
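The "find or write an implementation and measure" advice can be captured in a small timing harness. This is an illustrative sketch (the `keygen_fn` parameter is a placeholder for whatever key-generation routine you are testing; on a real 8-bit target you would read a hardware timer instead):

```python
import time

def time_keygen(keygen_fn, trials=5):
    """Run keygen_fn several times and return the best wall-clock
    time in seconds. Taking the minimum over trials filters out
    interruptions from the OS scheduler and other noise."""
    best = float("inf")
    for _ in range(trials):
        start = time.perf_counter()
        keygen_fn()
        elapsed = time.perf_counter() - start
        best = min(best, elapsed)
    return best
```

Measuring on one processor and scaling by a constant factor to estimate another only works if the big-integer routines are the same; different word sizes often mean entirely different inner loops.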
