Shading a cloud is a complex topic: it is normally viewpoint-dependent and involves casting multiple rays inside the fog volume to compute the scattering. Since the sun will be moving in real time in my sky, I cannot afford such expensive calculations.
Instead, I decided to generate a normal for each voxel and to shade the voxel with a standard dot product against the sun direction. But a voxel isn't a surface: how do you determine its orientation, and consequently its normal?
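The shading step itself is just a clamped Lambert dot product. A minimal sketch (the function name, the ambient term, and the vector values are my own illustration, not from the original post):

```python
def shade(normal, sun_dir, ambient=0.2):
    """Clamped Lambert shading: ambient plus a diffuse term.

    Voxels whose normal faces away from the sun receive only the
    ambient term, so the dot product is clamped at zero.
    """
    d = sum(a * b for a, b in zip(normal, sun_dir))
    return ambient + (1.0 - ambient) * max(0.0, d)

print(shade((0.0, 1.0, 0.0), (0.0, 1.0, 0.0)))  # top voxel, sun overhead: 1.0
print(shade((0.0, 1.0, 0.0), (0.0, -1.0, 0.0)))  # facing away: ambient only, 0.2
```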
What I did was use the cloud densities of the neighboring voxels to generate a fake normal. If the voxel on the right has a high density and the voxel on the left has a low density, we are on the "left" side of the cloud's envelope, and the normal points primarily to the left. Repeat this idea for the Y and Z axes, and you get a normal.
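This is effectively a negated central-difference gradient of the density field. A sketch under my own assumptions (a padded nested-list grid and this particular function name; the post doesn't give code):

```python
import math

def fake_normal(density, x, y, z):
    """Fake normal from the density of the six axis neighbors.

    `density` is a 3D nested list indexed [x][y][z]. The normal points
    from high density toward low density, i.e. out of the cloud.
    """
    nx = density[x - 1][y][z] - density[x + 1][y][z]
    ny = density[x][y - 1][z] - density[x][y + 1][z]
    nz = density[x][y][z - 1] - density[x][y][z + 1]
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    if length == 0.0:
        return (0.0, 0.0, 0.0)  # deep inside (or outside) the cloud: no preferred direction
    return (nx / length, ny / length, nz / length)

# Toy 3x3x3 grid: dense voxel on the +x side only, so the center voxel
# sits on the "left" side of the envelope and its normal points toward -x.
density = [[[0.0 for _ in range(3)] for _ in range(3)] for _ in range(3)]
density[2][1][1] = 1.0
print(fake_normal(density, 1, 1, 1))  # (-1.0, 0.0, 0.0)
```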
If you try to shade the voxel directly with this normal, the lighting will probably look pretty bad because of the abrupt normal changes between nearby voxels. So the next step before shading is, for a given voxel, to average the normals of all the neighboring voxels. I used a box filter with a kernel size of 7 (going from -3 to +3 in each direction). The result is shown in the screenshots below. The first one shows a single cloud made of 2517 particles; the second one, a 128x20x128 voxel space filled with 40 random clouds.
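The averaging step could look like the sketch below. It assumes the per-voxel normals are stored in a dict keyed by voxel coordinates (my choice, so missing neighbors are simply skipped); the kernel is the same 7-wide box, offsets -3 to +3 on each axis:

```python
import math

def smooth_normal(normals, x, y, z, radius=3):
    """Box-filter the normals in a (2*radius+1)^3 neighborhood, then renormalize.

    `normals` maps (x, y, z) tuples to (nx, ny, nz) tuples; voxels with
    no stored normal contribute nothing to the average.
    """
    sx = sy = sz = 0.0
    for dx in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            for dz in range(-radius, radius + 1):
                n = normals.get((x + dx, y + dy, z + dz))
                if n is not None:
                    sx += n[0]
                    sy += n[1]
                    sz += n[2]
    length = math.sqrt(sx * sx + sy * sy + sz * sz)
    if length == 0.0:
        return (0.0, 0.0, 0.0)
    return (sx / length, sy / length, sz / length)

# Two adjacent voxels with perpendicular normals average to a diagonal,
# which is exactly the smoothing effect we want between hard voxel faces.
normals = {(0, 0, 0): (1.0, 0.0, 0.0), (1, 0, 0): (0.0, 1.0, 0.0)}
print(smooth_normal(normals, 0, 0, 0))  # roughly (0.707, 0.707, 0.0)
```

Note that renormalizing after the sum makes an explicit division by the kernel volume unnecessary: only the direction matters for the dot product.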
I'm still not happy with the shading on the underside of the clouds (it uses a vertical gradient mixed with the sun shading).
I've also played with many particle textures, and noticed that the framerate dropped significantly with textures larger than 64x64 (by a factor of 4 at 256x256). But that's not really a problem: given the blurry nature of clouds, a small texture looks good enough.