Looking for some math ideas regarding a crypto protocol

I've run into a roadblock while trying to understand the math behind a specific crypto protocol I am studying for educational purposes.

First, this is how it's set up. The client generates two values, A and B, using two equations built from:

- 4 (in this specific case; I think this is an important part of it)
- a large 512-bit fixed number (assumed prime)
- a second large 512-bit fixed number (assumed prime)
- a client-generated 512-bit number

Now, the client keeps B secret but sends A to the server. The server already knows the three fixed values above (the 4 and the two 512-bit numbers).

From here, the server processes A and does some unknown operations with it. This is the process that I am currently trying to understand.

After the server does its work, it sends some data back to the client (a 20-byte stream), which the client XORs byte by byte with the first 20 bytes of B and then passes into a function that performs some additional operations on it to come up with the final key. This key is then used to build the encryption and decryption keys through a "scramble" algorithm (i.e. like a sliding block puzzle).
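The client-side derivation described above can be sketched like this. The 20-byte reply, the XOR against the first 20 bytes of B, and the existence of a final "scramble" step come from the post; the `scramble` body below is a hypothetical stand-in, since the real algorithm is unknown.

```python
def scramble(data: bytes) -> bytes:
    # Hypothetical stand-in: rotate the bytes by one position.
    # The real "sliding block puzzle" algorithm is unknown.
    return data[1:] + data[:1]

def derive_key(server_reply: bytes, B: bytes) -> bytes:
    """Mix the 20-byte server reply with the first 20 bytes of B."""
    assert len(server_reply) == 20
    # XOR the server's reply byte by byte with the first 20 bytes of B
    mixed = bytes(r ^ b for r, b in zip(server_reply, B[:20]))
    # The post says "some additional operations" follow; stand-in used here.
    return scramble(mixed)

key = derive_key(bytes(range(20)), bytes(64))  # placeholder inputs
```

Whatever the real scramble step is, the interesting part is the inputs: everything fed into it is either the server's 20 bytes or B, which only the client knows.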

Considering all of this, this is what I've come to understand from my research:

* The discrete logarithm problem tells us that the secret value the client generates cannot realistically be recovered by the server from what it receives. Even if we could brute-force every exponent from 0 to the maximum 512-bit number, the modulus wraps around, so multiple exponents produce the same result.
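The wrap-around effect is easy to see with toy numbers (these stand in for the 512-bit prime and the generator 4; they are illustration only):

```python
# Toy illustration: modular exponentiation is easy, inverting it is not.
p, g = 23, 4          # tiny stand-ins for the 512-bit prime and the "4"
a_secret = 7          # the client's secret exponent
A = pow(g, a_secret, p)   # what the client would send

# Brute force finds *an* exponent, but the modular reduction means
# many exponents map to the same A:
solutions = [a for a in range(100) if pow(g, a, p) == A]
```

With real 512-bit parameters the brute-force range is astronomically larger, which is exactly why the server cannot recover the client's exponent from A alone.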

* According to RSA, encryption works as follows. Sender A does the following:
1. Obtains the recipient B's public key (n, e).
2. Represents the plaintext message as a positive integer m [see note 4].
3. Computes the ciphertext c = m^e mod n.
4. Sends the ciphertext c to B.

However, in this case the client generates the 'e', and the 'm' is a fixed value known by both parties. The client's 'e' (the 'a' above) could not possibly be a private or public key. (Referring to RSA via this link as it keeps it pretty simple.)
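For contrast, the quoted RSA steps look like this with the standard toy numbers (real keys are hundreds of digits; these are illustration only):

```python
# Toy RSA following the quoted steps, with classic small primes.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent, part of (n, e)
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

m = 65                              # step 2: plaintext as a positive integer
c = pow(m, e, n)                    # step 3: c = m^e mod n
recovered = pow(c, d, n)            # recipient uses the private key
```

The key point for this thread: in RSA, e is part of a published key pair tied to d. In the observed protocol the client invents its exponent fresh, so it cannot be playing the role of either key.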

* In a Diffie-Hellman key exchange, which is what the process most resembles at first, the server would send back a B value of its own (reference); here it does not. In addition, no K value is ever calculated on the client side. I've worked with DH a bit in the past, so from my current research I'm fairly sure it is not being used.
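For comparison, a textbook Diffie-Hellman exchange (toy numbers) has both sides send a public value and both compute the same shared K, which is exactly what the observed traffic lacks:

```python
# Textbook Diffie-Hellman with toy parameters.
p, g = 23, 5                 # public prime and generator
a, b = 6, 15                 # each side's secret exponent

A = pow(g, a, p)             # client sends A to server
B = pow(g, b, p)             # server sends B back -- the step missing here
K_client = pow(B, a, p)      # client computes K from B and its secret a
K_server = pow(A, b, p)      # server computes K from A and its secret b
assert K_client == K_server  # both sides derive the same shared key K
```

In the observed protocol the server's 20-byte reply does not look like a DH public value, and the client never performs the final `pow(B, a, p)` style computation.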

* In the Three-pass protocol, I see a lot of correlations with what goes on here, but no such full process takes place: the client sends 64 bytes, receives 20 bytes in return, and the entire exchange is over, with both sides then ready to send and receive encrypted messages.
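For reference, a full three-pass exchange (sketched here Massey–Omura-style with toy numbers, one standard instantiation) needs three messages before the secret is shared, so the single round trip described above rules it out:

```python
# Massey-Omura style three-pass protocol with toy numbers.
p = 257                      # toy prime; exponents must be invertible mod p-1
m = 42                       # secret message to transfer

# Each party picks an encryption exponent coprime to p-1 and its inverse.
e1, e2 = 9, 15
d1 = pow(e1, -1, p - 1)      # A's decryption exponent
d2 = pow(e2, -1, p - 1)      # B's decryption exponent

msg1 = pow(m, e1, p)         # pass 1: A locks m and sends it to B
msg2 = pow(msg1, e2, p)      # pass 2: B adds its lock and sends it back
msg3 = pow(msg2, d1, p)      # pass 3: A removes its lock and sends again
recovered = pow(msg3, d2, p) # B removes its own lock and recovers m
```

Three passes are unavoidable here because the locks commute but each side can only remove its own, which doesn't match the one-out, one-back traffic pattern in the post.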

* In general public-key crypto, which I believe is something along the lines of what's going on here:
"Not all asymmetric key algorithms operate in precisely this fashion. The most common ones have the property that Alice and Bob each own two keys, one for encryption and one for decryption. In a secure asymmetric key encryption scheme, the private key should not be deducible from the public key. This is known as public-key encryption, since an encryption key can be published without compromising the security of messages encrypted with that key."

That much is understandable, but it still doesn't answer how the server could know anything about the client's B given only A, without knowing a, or how the two sides could even "sync" values.

All of these points lead me to the following:
1. The server cannot generate 'a' in this situation, correct?
2. Given A, there is no way to deduce B without knowing 'a', correct?
3. Assuming I did not miss anything and 1 & 2 are indeed infeasible, what other math concepts am I missing here?

What's driving me crazy is that, for any A sent to the server (with the properly generated B stored locally), the server responds with a different 20-byte sequence each time, and that sequence is what generates the final set of values used to seed the encryption/decryption tables.

Everything points toward some sort of public-key system, but the setup doesn't appear to be public-key related, since the exponent a is calculated by the client. It looks as if the server is calculating 'a' itself, but the research shows this shouldn't be possible. Lastly, a lookup table is impossible, since it would require (256^20) * 64 bytes of storage.

Does anyone have any insights or other reference material that could help me out here? I'm basically just trying to understand what else could go on in the server given the apparently limited information it receives and sends back.


