Is it possible to store shaders in a binary format?

Started by
9 comments, last by FXACE 12 years, 4 months ago

That actually sounds like the best idea yet. Simple, effective and quick to implement.


Search for ROT13 mangling. It's anything but a secure encryption scheme, but it sounds as if your audience just needs a little discouragement.
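For what it's worth, here is a minimal sketch of what that could look like on the host side (my own illustration in C++, not from this thread; the function name is made up). Apply it once before writing the shader to disk and once again at load time to get the original source back:

#include <string>

// Rotate letters by 13 places; running the function twice restores the text.
// Digits, punctuation and whitespace pass through untouched.
std::string rot13(std::string text)
{
    for (char& c : text) {
        if (c >= 'a' && c <= 'z')
            c = static_cast<char>('a' + (c - 'a' + 13) % 26);
        else if (c >= 'A' && c <= 'Z')
            c = static_cast<char>('A' + (c - 'A' + 13) % 26);
    }
    return text;
}

It is trivially reversible by anyone who cares to look, but as noted, that is about the level of discouragement being aimed at here.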
Alternatively, stuff them into TGAs. Construct a TGA header with the correct properties, and each byte in the shader source code becomes a byte in the TGA data. To the casual meddler it will look like some kind of noise texture you're using for something (and if they try to change that the program will break on them, so they'll shy away from it after the first attempt).
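To make that concrete, here is a rough sketch of writing the source out as such a TGA (my own illustration; the 256-pixel width, space padding and function name are arbitrary choices, and error handling is minimal):

#include <cstdint>
#include <cstdio>
#include <cstring>
#include <string>
#include <vector>

// Pack GLSL source into an uncompressed 8-bit greyscale TGA so it looks like
// a noise texture on disk. Loading it back is just the header in reverse.
void writeShaderAsTga(const std::string& source, const char* path)
{
    const uint16_t width  = 256;
    const uint16_t height = static_cast<uint16_t>((source.size() + width - 1) / width);

    // Pad the pixel buffer with spaces so width * height covers every byte.
    std::vector<uint8_t> pixels(static_cast<size_t>(width) * height, ' ');
    std::memcpy(pixels.data(), source.data(), source.size());

    // 18-byte TGA header: image type 3 = uncompressed greyscale, 8 bits/pixel.
    uint8_t header[18] = {0};
    header[2]  = 3;
    header[12] = width & 0xFF;
    header[13] = (width >> 8) & 0xFF;
    header[14] = height & 0xFF;
    header[15] = (height >> 8) & 0xFF;
    header[16] = 8;

    FILE* f = std::fopen(path, "wb");
    if (!f) return;
    std::fwrite(header, 1, sizeof(header), f);
    std::fwrite(pixels.data(), 1, pixels.size(), f);
    std::fclose(f);
}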

Edit to add: it occurs to me that you may want to semi-obfuscate the error message your program gives if a shader fails to compile, of course.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Nice discussion: "how to hide shader code from the user's eyes". The same goes for ideas like keeping the shaders encrypted in files. But you're forgetting about OpenGL sniffers (GLIntercept, for example). Libraries like that replace functions such as glShaderSource, and however you encrypt your code you will always have to decode it before handing it to OpenGL. As a result, such a library gives the user all your GLSL code exactly as it is, no matter where it came from.
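To illustrate the point, here is my own simplified sketch of what such a sniffer does (the type definitions and hook mechanism are stand-ins, not GLIntercept's actual code): it only has to wrap glShaderSource, so whatever decrypted source the application passes to the driver gets dumped in plain text.

#include <cstdio>

// Simplified stand-ins for the GL types; a real hook uses the platform headers.
typedef unsigned int GLuint;
typedef int          GLsizei;
typedef int          GLint;
typedef char         GLchar;

// Pointer to the real driver entry point, filled in by the hooking layer.
void (*driverShaderSource)(GLuint, GLsizei, const GLchar* const*, const GLint*) = nullptr;

void hookedShaderSource(GLuint shader, GLsizei count,
                        const GLchar* const* strings, const GLint* lengths)
{
    // Dump every source string before forwarding the call to the driver.
    for (GLsizei i = 0; i < count; ++i)
        std::printf("captured shader %u:\n%s\n", shader, strings[i]);

    driverShaderSource(shader, count, strings, lengths);
}

No amount of on-disk encryption changes what arrives at this call.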

If you want to protect your idea, write the algorithm in a form that is hard for others to follow (for example: no comments, no tidy code, identifiers like "int jhfgdjfgjhg; vec4 __04129adasuh8hfasf215;"). Of course, only do this once your code is finished and works correctly.

Also, regarding clCreateProgramWithBinary():

Don't assume that the code generated by the OpenCL compiler is really "binary"... It has just been translated from a C-style language into a GPU assembly-style language (which a human can read too).
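For reference, here is a rough sketch of the round trip (my own illustration; single device, no error handling): pull the "binary" back out with clGetProgramInfo() after a normal build, store it, and recreate the program from it later with clCreateProgramWithBinary().

#include <CL/cl.h>
#include <vector>

// Fetch the program "binary" for a program built for a single device.
std::vector<unsigned char> getProgramBinary(cl_program program)
{
    size_t size = 0;
    clGetProgramInfo(program, CL_PROGRAM_BINARY_SIZES, sizeof(size), &size, nullptr);

    std::vector<unsigned char> binary(size);
    unsigned char* ptr = binary.data();
    clGetProgramInfo(program, CL_PROGRAM_BINARIES, sizeof(ptr), &ptr, nullptr);
    return binary;
}

// Recreate and build a program from a previously stored blob.
cl_program createFromBinary(cl_context ctx, cl_device_id device,
                            const std::vector<unsigned char>& binary)
{
    const unsigned char* ptr  = binary.data();
    size_t               size = binary.size();
    cl_int status = CL_SUCCESS, err = CL_SUCCESS;

    cl_program program = clCreateProgramWithBinary(ctx, 1, &device, &size,
                                                   &ptr, &status, &err);
    clBuildProgram(program, 1, &device, nullptr, nullptr, nullptr);
    return program;
}

If you open the stored blob in a text editor you will often find readable IR or assembly, which is exactly the caveat above.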

Best wishes, FXACE.

This topic is closed to new replies.
