Performance in pitch discrimination tasks is limited by variability intrinsic to listeners, which may arise from peripheral auditory coding limitations or from more central noise sources. The present study aimed to quantify this “internal noise” by estimating the amount of harmonic roving required to impair pitch discrimination performance. Fundamental-frequency difference limens (F0DLs) were obtained in normal-hearing listeners with and without musical training for complex tones filtered between 1.5 and 3.5 kHz, with F0s of 300 Hz (resolved harmonics) and 75 Hz (unresolved harmonics). The harmonicity of the tone complexes was varied by systematically roving the frequency of each individual harmonic, which was drawn from a Gaussian distribution centered on the nominal harmonic frequency on every stimulus presentation. The amount of roving was determined by the standard deviation of this distribution, which varied between 0% and 16% of the tested F0. F0DLs for resolved harmonics remained unaffected by up to 6% roving and increased thereafter. For unresolved harmonics, performance remained stable up to larger roving values. The results demonstrate a systematic relationship between F0DLs and stimulus variability that could be used to quantify the internal noise and provide strong constraints for physiologically inspired models of pitch perception.
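The roving procedure described above can be illustrated with a minimal sketch. The function below generates a harmonic complex in which each harmonic's frequency is drawn from a Gaussian centered on its nominal value, with the standard deviation set as a percentage of F0; only components falling inside the 1.5–3.5 kHz passband are kept, as a crude stand-in for the band-pass filtering. All parameter names, the sampling rate, and the duration are assumptions for illustration, not the paper's exact stimulus parameters.

```python
import numpy as np

def make_roved_complex(f0, sigma_pct, fs=48000, dur=0.5,
                       band=(1500.0, 3500.0), rng=None):
    """Harmonic complex with Gaussian frequency roving.

    Each harmonic's frequency is drawn from a Gaussian centered on the
    nominal harmonic frequency (n * f0) with SD = sigma_pct% of f0.
    Components whose roved frequency falls outside `band` are dropped,
    a crude approximation to the 1.5-3.5 kHz band-pass filtering.
    Illustrative sketch only; fs, dur, and the filtering shortcut are
    assumptions, not the study's actual stimulus generation.
    """
    rng = np.random.default_rng() if rng is None else rng
    t = np.arange(int(fs * dur)) / fs
    sigma = (sigma_pct / 100.0) * f0  # roving SD in Hz
    x = np.zeros_like(t)
    n = 1
    while n * f0 < fs / 2:  # stop at the Nyquist frequency
        f = n * f0 + rng.normal(0.0, sigma)  # rove each harmonic independently
        if band[0] <= f <= band[1]:
            x += np.sin(2.0 * np.pi * f * t)
        n += 1
    return x

# e.g. 300-Hz F0 (resolved harmonics) with 6% roving:
stim = make_roved_complex(300.0, 6.0, rng=np.random.default_rng(1))
```

With `sigma_pct = 0` the function returns an unroved (perfectly harmonic) complex, corresponding to the 0% baseline condition; increasing `sigma_pct` toward 16 reproduces the roving range tested in the study.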