Recently I wrote a sample to measure the time the GPU spends rendering, and in the code I also printed out the GPU frequency. Surprisingly, the frequency I got did not match the actual clock speed of the GPU (which I know for sure).
A disjoint query just gives you a value you can use to convert the ticks from a timestamp query into seconds. It's essentially the GPU counterpart of QueryPerformanceFrequency. The docs never claim it is the GPU's actual physical clock speed.
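For reference, here is a minimal sketch of the intended usage (assuming D3D11; error handling and real synchronization omitted, and busy-waiting on GetData is only acceptable in a sample). The Frequency field from the disjoint query is only ever used as a divisor for the tick delta:

```cpp
#include <d3d11.h>

// Measures GPU time between two timestamps, in seconds.
double MeasureGpuTimeSeconds(ID3D11Device* device, ID3D11DeviceContext* ctx)
{
    ID3D11Query *disjoint = nullptr, *tsBegin = nullptr, *tsEnd = nullptr;
    D3D11_QUERY_DESC qd = {};
    qd.Query = D3D11_QUERY_TIMESTAMP_DISJOINT;
    device->CreateQuery(&qd, &disjoint);
    qd.Query = D3D11_QUERY_TIMESTAMP;
    device->CreateQuery(&qd, &tsBegin);
    device->CreateQuery(&qd, &tsEnd);

    ctx->Begin(disjoint);
    ctx->End(tsBegin);               // timestamp before the work
    // ... issue the draw/dispatch calls you want to measure ...
    ctx->End(tsEnd);                 // timestamp after the work
    ctx->End(disjoint);

    // Spin until the results are available (sample-quality only).
    D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj = {};
    while (ctx->GetData(disjoint, &dj, sizeof(dj), 0) != S_OK) {}
    UINT64 t0 = 0, t1 = 0;
    while (ctx->GetData(tsBegin, &t0, sizeof(t0), 0) != S_OK) {}
    while (ctx->GetData(tsEnd, &t1, sizeof(t1), 0) != S_OK) {}

    double seconds = 0.0;
    if (!dj.Disjoint)                // timestamps are only valid if the interval wasn't disjoint
        seconds = double(t1 - t0) / double(dj.Frequency);

    disjoint->Release();
    tsBegin->Release();
    tsEnd->Release();
    return seconds;
}
```

Note that dj.Frequency is whatever tick rate the driver reports for its timestamp counter; nothing requires it to equal the shader core clock, so printing it as "the GPU frequency" is where the confusion comes from.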