LarryKing

Posted 27 November 2013 - 04:04 PM

So, I've made a few changes that seem to fix the two biggest issues:

I've introduced a "Turbulence" phase to combat repeating patterns,

and I've also added an "Avalanche" phase to increase sensitivity to input changes.
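(The actual phases live in the attached PNWL.h, which isn't shown here; purely as an illustration of the avalanche idea, here is a toy two-sweep diffusion pass, so that a change to any one byte influences every byte of the buffer. The rotation amounts and seed constant are arbitrary choices for the sketch, not values from the real algorithm.)

```cpp
#include <cstdint>
#include <cstddef>

// Toy avalanche-style diffusion pass (NOT the PNWL.h code): a forward
// sweep propagates a change to all later bytes, and a backward sweep
// then propagates it to all earlier bytes as well.
void avalanche_pass(uint8_t* buf, size_t len) {
    if (len == 0) return;
    uint8_t carry = 0xA5;                                   // arbitrary seed constant
    for (size_t i = 0; i < len; ++i) {                      // forward sweep
        carry = static_cast<uint8_t>((carry << 1) | (carry >> 7)) ^ buf[i];
        buf[i] = carry;
    }
    for (size_t i = len; i-- > 0; ) {                       // backward sweep
        carry = static_cast<uint8_t>((carry << 3) | (carry >> 5)) ^ buf[i];
        buf[i] = carry;
    }
}
```

With the two sweeps combined, flipping a single bit anywhere in the input changes every byte of the output, which is the property the test below demonstrates.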

 

The result:

A difference of a single bit in the input now produces dramatic variation in the output.

 

Small test: a 256-byte file was created with all bytes = 0xFF, and a second, similar file was created that is identical except for the last byte, which = 0xFE.
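The two test inputs described above are easy to reproduce in memory (buffers rather than files, for brevity):

```cpp
#include <array>
#include <cstdint>

// File A: 256 bytes, all 0xFF.
std::array<uint8_t, 256> make_file_a() {
    std::array<uint8_t, 256> a;
    a.fill(0xFF);
    return a;
}

// File B: identical to A except the final byte, 0xFE instead of 0xFF.
// 0xFF ^ 0xFE == 0x01, so the two inputs differ in exactly one bit.
std::array<uint8_t, 256> make_file_b() {
    auto b = make_file_a();
    b[255] = 0xFE;
    return b;
}
```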

 

Here are the visualizations of the two files (note: they are 16 x 16 pixels and quite difficult to see on a white background):

File A : wbmp_zps08624d9f.png

File B : wpbmp_zpsc6f1c551.png

 

Both files were encrypted using the same passcode:

Result A : wbmp1_zpsf36db3db.png

Result B : wpbmp1_zps69c1ebcf.png

 

Both results appear as noise; however, each is different.
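"Each is different" can be made quantitative by counting the bits that differ between the two ciphertexts. For a strong cipher, a one-bit plaintext change should flip roughly half of the output bits (this helper isn't part of PNWL.h, just a measurement tool):

```cpp
#include <bitset>
#include <cstddef>
#include <cstdint>

// Hamming distance in bits between two equal-length byte buffers:
// XOR each byte pair and count the set bits of the result.
size_t hamming_bits(const uint8_t* x, const uint8_t* y, size_t len) {
    size_t bits = 0;
    for (size_t i = 0; i < len; ++i)
        bits += std::bitset<8>(static_cast<uint8_t>(x[i] ^ y[i])).count();
    return bits;
}
```

For the 256-byte test above, a result near 1024 differing bits (half of 2048) is what one would hope to see.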

 

And finally we have the "worst case scenario" triangle image/data test:

ColorsV2_zpsf6e1642f.png

 

Please note that the vertical lines are the result of blocks of data being EXACTLY the same in the original data.

As in, one row of pixels being exactly the same as another:

ColorsLines_zps13d663bf.png

Note: each row of pixels in the solid green block is exactly the same, as is the case wherever else there are vertical lines; encrypting identical data returns identical results.

In other words, please don't point out the "clear pattern of vertical lines" :)
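Worth flagging anyway: "identical input blocks encrypt to identical output blocks" is the same structural leak that ECB block-cipher mode exhibits, and it is exactly what an attacker looks for; chained or counter-based modes exist to eliminate it. A simple way to check a ciphertext for this leak is to count repeated fixed-size blocks (again just a measurement helper, not PNWL.h code):

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <map>
#include <vector>

// Counts how often the most-repeated fixed-size block occurs in the
// data. Any result > 1 in a ciphertext means plaintext structure
// (like the vertical lines above) is visible through the encryption.
size_t max_block_repeats(const std::vector<uint8_t>& data, size_t block) {
    std::map<std::vector<uint8_t>, size_t> seen;
    size_t worst = 0;
    for (size_t i = 0; i + block <= data.size(); i += block) {
        std::vector<uint8_t> b(data.begin() + i, data.begin() + i + block);
        worst = std::max(worst, ++seen[b]);
    }
    return worst;
}
```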

 

Now, adding both the turbulence and avalanche phases to the algorithm results in a ~40% increase in computation time.

 

The new code is attached (PNWL.h).

 

Also, I'm still looking for a good passcode hashing function.

If you feel like testing the new algorithm, use a longer password; or, better yet, to emulate a hash code, randomly generate a 24-byte null-terminated string and use that.
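One way to generate such a test key (a sketch, not PNWL.h code; for the real passcode-hashing question, a standard key-derivation function like PBKDF2 or scrypt is the usual answer rather than a hand-rolled hash):

```cpp
#include <cstdint>
#include <random>
#include <string>

// Emulates a hashed passcode: 24 random bytes drawn from 1..255, so no
// embedded 0 byte truncates the string early; std::string supplies the
// terminating NUL itself. Seeded explicitly so tests are reproducible.
std::string random_key24(uint32_t seed) {
    std::mt19937 gen(seed);
    std::uniform_int_distribution<int> dist(1, 255);
    std::string key(24, '\0');
    for (auto& c : key)
        c = static_cast<char>(dist(gen));
    return key;
}
```

For real use you would seed from an OS entropy source (e.g. `std::random_device`) rather than a fixed integer.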

 

Again thank you.

 

Edit: Forgot to attach the new code XD

