Considering Warcraft 3 is heavily scripted (using its own scripting language, JASS, if memory serves), I wouldn't be surprised if each of those objects actually executed a script to create a particle system when spawned and kept it synced with its world position.
Texture animations are a thing. Particle systems another. Let me elaborate a bit.
Texture animations.
If memory serves, those textures are not animated, they are scrolled. The easiest way to do this at the time War3 was released was to supply a non-identity texture transform matrix; in D3D there was a texture stage state (D3DTSS_TEXTURETRANSFORMFLAGS, if memory serves) to enable this functionality.
This causes each vertex's (s,t) coordinates to "move" across the texture and fetch different texels.
With vertex shaders that's even easier, as we can pull in the matrix (or just a scroll offset) as a shader constant and apply it to the texcoords ourselves.
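To make the idea concrete, here is a minimal CPU-side sketch of what the fixed-function texture transform effectively does to each vertex's (s,t) before sampling. This is not actual D3D code; the names (TexCoord, scrollUV) and the translation-only transform are my own illustration.

```cpp
#include <cassert>
#include <cmath>

struct TexCoord { float s, t; };

// Illustrative helper: equivalent to multiplying (s,t) by a translation-only
// texture matrix whose offset grows with elapsed time.
// fmod keeps coordinates in [0,1); with a wrapping sampler the result is the
// same either way, this just keeps the numbers readable.
TexCoord scrollUV(TexCoord uv, float speedS, float speedT, float timeSec)
{
    TexCoord out;
    out.s = std::fmod(uv.s + speedS * timeSec, 1.0f);
    out.t = std::fmod(uv.t + speedT * timeSec, 1.0f);
    return out;
}
```

A vertex shader version is the same arithmetic: upload the offset (or full matrix) as a constant and add/multiply it into the texcoord output.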
Texture animations, in the sense of multi-frame textures, actually require no modifications at all in a low-level renderer. What it takes is a special "texture sequence" resource and a system (likely part of the high-level renderer) which keeps track of the passing time and can therefore feed the correct frame to the low-level renderer. As long as you can figure out whether a texture resource is "plain 2D" or "sequenced 2D", it could theoretically be implemented without even modifying the model format itself.
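A sketch of what such a "sequenced 2D" resource could look like, assuming the high-level renderer tracks elapsed time. The names (TextureSequence, frameAt) are illustrative, not from any real engine; the low-level renderer only ever sees a plain 2D texture handle.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical "sequenced 2D" resource: a list of plain 2D texture handles
// plus a playback rate. The high-level renderer asks it which frame to bind.
struct TextureSequence {
    std::vector<std::string> frames;   // handles/names of per-frame 2D textures
    float framesPerSecond = 10.0f;

    // Map elapsed time to a frame index, looping over the sequence.
    const std::string& frameAt(float timeSec) const
    {
        std::size_t index =
            static_cast<std::size_t>(timeSec * framesPerSecond) % frames.size();
        return frames[index];
    }
};
```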
Particle systems.
Those are much more complicated. As the above post notes, the model will have to be enriched with supporting information. Personally I don't think a model should ever have knowledge of a particle system attached to it. In my opinion, it would be best to export joints/sockets/connection points and then use those in a script to spawn a particle system.
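To sketch the socket approach, assume the model only exports named attachment points whose world-space transforms are updated by the animation system; a script then spawns the emitter and re-syncs it every frame. All names here (Model, attachEmitter, syncEmitter) are illustrative.

```cpp
#include <cassert>
#include <string>
#include <unordered_map>

struct Vec3 { float x, y, z; };

struct Model {
    // Socket name -> current world-space position, updated by animation.
    // The model knows nothing about particle systems.
    std::unordered_map<std::string, Vec3> sockets;
};

struct ParticleEmitter { Vec3 worldPos{}; };

// Script-facing call: attach an emitter to a named socket once at spawn.
// Returns false if the socket doesn't exist (assumed error handling).
bool attachEmitter(const Model& model, const std::string& socket, ParticleEmitter& em)
{
    auto it = model.sockets.find(socket);
    if (it == model.sockets.end()) return false;
    em.worldPos = it->second;
    return true;
}

// Per-frame call: keep the emitter synced with the animated socket.
void syncEmitter(const Model& model, const std::string& socket, ParticleEmitter& em)
{
    auto it = model.sockets.find(socket);
    if (it != model.sockets.end()) em.worldPos = it->second;
}
```

The point of the design is that the particle system lives entirely in script/engine land; the model format only grows a list of named transforms, which is useful for other things (weapon attachments, hit markers) anyway.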
I have been told most DCC file formats support particle systems in model files. I don't think that's a good reason to make the runtime model format more complicated than it should be (particle systems will have to be iterated on in-engine a few times anyway), but I suppose everyone is free to have his/her own opinion.