Also, it appears that Unity has a "weird" way of showing your FPS.
It looks at which component takes the longest each frame (CPU or GPU) and then divides one second by that time (at least, so it seems). So if your CPU needs 10 ms and your GPU 1 ms each frame, your framerate will be 1000/10 = 100 FPS.
This works, I guess, until you find out that time spent waiting because of v-sync gets blamed on the CPU. So if you throttle the GPU and thereby get closer to the v-sync target, your CPU has to wait less, and your "FPS" INCREASES.
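Since this is all guesswork, here's a tiny sketch (plain Python, just for illustration; the 5 ms wait value and the numbers are made up, this is not Unity's actual code) of how the displayed number would behave if it really is computed that way:

```python
# Speculative sketch: the displayed FPS appears to be derived from whichever
# of the per-frame CPU or GPU time is longest.

def reported_fps(cpu_ms, gpu_ms):
    # Whichever component took longer this frame determines the number shown.
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU 10 ms, GPU 1 ms -> 1000 / 10 = 100 "FPS"
print(reported_fps(10.0, 1.0))                  # 100.0

# Now suppose v-sync makes the CPU wait 5 ms and that wait is counted as CPU time.
vsync_wait_ms = 5.0
print(reported_fps(10.0 + vsync_wait_ms, 1.0))  # ~66.7 "FPS", the wait blamed on the CPU

# Throttle the GPU (hypothetical 8 ms instead of 1 ms) so the CPU only waits 1 ms:
# the reported number goes UP even though nothing actually got faster.
print(reported_fps(10.0 + 1.0, 8.0))            # ~90.9 "FPS"
```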
Note that I don't have much experience with Unity, and this is all speculation based on the little experience I do have, which is purely VR development, and that most definitely changes things. (In the profiler, for example, what seems to be v-sync according to the graphs is called VR.WaitForGPU in the hierarchy.)