So, I've made a few changes that seem to fix the two biggest issues:
I've introduced a "Turbulence" phase to break up patterns,
and I've also added an "Avalanche" phase to increase sensitivity:
a difference of a single input bit now results in dramatic variation in the output.
Small test: I created a 256-byte file with every byte set to 0xFF, and a second, nearly identical file where every byte is 0xFF except the last, which is 0xFE (a single-bit difference).
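For anyone who wants to reproduce the two test files, here's a minimal sketch (the helper names and file paths are my own, not anything from PNWL.h):

```cpp
#include <cassert>
#include <cstdio>
#include <vector>

// Build a 256-byte buffer of 0xFF; if flip_last is true, the final
// byte becomes 0xFE, which differs from 0xFF by exactly one bit.
static std::vector<unsigned char> make_test_buffer(bool flip_last) {
    std::vector<unsigned char> buf(256, 0xFF);
    if (flip_last) buf.back() = 0xFE;  // 0xFF ^ 0xFE == 0x01
    return buf;
}

// Write a buffer to disk as a raw binary file.
static bool write_buffer(const char* path,
                         const std::vector<unsigned char>& buf) {
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    std::fwrite(buf.data(), 1, buf.size(), f);
    std::fclose(f);
    return true;
}
```

Then `write_buffer("fileA.bin", make_test_buffer(false))` and `write_buffer("fileB.bin", make_test_buffer(true))` give you the two inputs.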
Here are the visualizations of the two files: (Note they are 16 x 16 pixels and are quite difficult to see on a white background)
File A :
File B :
Both files were encrypted using the same passcode:
Result A :
Result B :
Both results appear as noise; however, each is different.
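"Each is different" can also be quantified rather than eyeballed: for a strong avalanche effect, roughly half of the output bits should differ between the two ciphertexts. This is just a test helper I'd suggest, not part of the algorithm:

```cpp
#include <cassert>
#include <cstddef>

// Count the differing bits between two equal-length buffers.
// For ~ideal avalanche behavior on a 256-byte output, expect a
// result near 1024 (half of 2048 bits).
static std::size_t hamming_distance(const unsigned char* a,
                                    const unsigned char* b,
                                    std::size_t n) {
    std::size_t bits = 0;
    for (std::size_t i = 0; i < n; ++i) {
        unsigned char x = a[i] ^ b[i];
        while (x) {            // count set bits of the XOR
            bits += x & 1u;
            x >>= 1;
        }
    }
    return bits;
}
```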
And finally we have the "worst case scenario" triangle image/data test:
Please note that the vertical lines are the result of blocks of data being EXACTLY the same in the original data.
As in one row of pixels being exactly the same as another:
Note: Each row of pixels in the solid green block is exactly the same, as is the case wherever else there are vertical lines; encrypting identical data returns identical results.
In other words, please don't point out the "clear pattern of vertical lines"
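If you want to check for that leak mechanically instead of visually, you can scan a ciphertext for repeated fixed-size blocks; any cipher that processes blocks independently under the same key (ECB-style) will show duplicates exactly where the plaintext rows repeat. A rough sketch (block size 16 matches one 16-pixel row here; the function name is mine):

```cpp
#include <cassert>
#include <cstddef>
#include <cstring>

// Count pairs of identical fixed-size blocks in a buffer. A nonzero
// count on ciphertext means identical plaintext blocks encrypted to
// identical ciphertext blocks -- the source of the vertical lines.
static std::size_t count_duplicate_blocks(const unsigned char* buf,
                                          std::size_t len,
                                          std::size_t block) {
    std::size_t dups = 0;
    for (std::size_t i = 0; i + block <= len; i += block)
        for (std::size_t j = i + block; j + block <= len; j += block)
            if (std::memcmp(buf + i, buf + j, block) == 0)
                ++dups;
    return dups;
}
```

The usual fix is to make each block's encryption depend on its position or on the previous block (a counter or chaining mode), so identical rows stop lining up.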
Now, adding both the turbulence and avalanche phases to the algorithm results in a ~40% increase in computation time.
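In case anyone wants to reproduce that measurement, a small timing wrapper is enough; whatever PNWL entry point you benchmark goes in the callable (I don't know the actual function names in PNWL.h, so nothing here is taken from it):

```cpp
#include <cassert>
#include <chrono>

// Time a callable and return elapsed milliseconds. Run the old and
// new code paths on the same input and compare the two numbers.
template <typename Fn>
static double time_ms(Fn&& fn) {
    auto t0 = std::chrono::steady_clock::now();
    fn();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}
```

For stable numbers, encrypt a reasonably large buffer in a loop and take the best of several runs.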
The new code is attached ( PNWL.h )
Also, I'm still looking for a good passcode hashing function.
If you feel like testing the new algorithm, use a longer password; or better yet, to emulate a hash, randomly generate a 24-byte null-terminated string and use that.
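Something like the following works for generating that test passcode. I'm reading "24-byte null-terminated string" as 23 random bytes plus the terminator; adjust the length if you meant 24 random bytes. The bytes are kept non-zero so the string isn't cut short by an embedded null, and note that `std::mt19937` is fine for test data but is not a cryptographic RNG:

```cpp
#include <cassert>
#include <cstddef>
#include <random>

// Fill `out` (capacity n) with n-1 random non-zero bytes and a
// trailing '\0', emulating a fixed-length hash used as a passcode.
static void random_passcode(char* out, std::size_t n) {
    std::mt19937 gen(std::random_device{}());
    std::uniform_int_distribution<int> dist(1, 255);  // avoid '\0'
    for (std::size_t i = 0; i + 1 < n; ++i)
        out[i] = static_cast<char>(dist(gen));
    out[n - 1] = '\0';
}
```

Usage: `char pass[24]; random_passcode(pass, sizeof pass);` and feed `pass` to the encrypt call.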
Again thank you.
Edit: Forgot to attach the new code XD