Interferometric crosstalk is a major performance-limiting factor in all-optical WDM transmission networks. Arising from imperfections in optical components, it can introduce large power penalties and bit-error-rate floors. Optical amplifiers are often used to raise the signal level incident on the detector so that high receiver sensitivity can be obtained. We investigate, theoretically and experimentally, the performance of optically preamplified, direct-detection receivers in the presence of interferometric crosstalk. The model provides an accurate description of filtered interferometric crosstalk by means of a so-called maximum-entropy approach. Experimental results, obtained with both directly and externally modulated light sources, are in good agreement with theory.
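The maximum-entropy approach mentioned above selects, among all probability densities consistent with a set of known moment constraints, the one with the largest entropy. The sketch below is a generic illustration of that principle, not the paper's receiver model: it numerically solves the dual problem for a density on a grid constrained to have a given mean and second moment (illustrative values, not taken from the paper), recovering the Gaussian that maximum entropy predicts in that case.

```python
import numpy as np
from scipy.optimize import minimize

# Grid over a normalized decision variable (illustrative range).
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]

# Target moments E[x], E[x^2]; placeholder values, not from the paper.
moments = np.array([0.0, 1.0])

def dual(lam):
    """Convex dual of the max-entropy problem with moment constraints.

    Minimizing log Z(lam) + lam . m yields the density
    p(x) ~ exp(-(lam[0]*x + lam[1]*x**2)) matching the moments.
    """
    expo = -(lam[0] * x + lam[1] * x**2)
    mx = expo.max()                      # numerical stabilization
    z = np.sum(np.exp(expo - mx)) * dx
    return mx + np.log(z) + lam @ moments

res = minimize(dual, x0=np.array([0.0, 0.1]), method="Nelder-Mead")
lam = res.x

# Reconstruct and normalize the maximum-entropy density.
expo = -(lam[0] * x + lam[1] * x**2)
expo -= expo.max()
p = np.exp(expo)
p /= np.sum(p) * dx

# Moments of the recovered density should match the targets.
m1 = np.sum(x * p) * dx
m2 = np.sum(x**2 * p) * dx
```

With mean 0 and second moment 1 as the only constraints, the recovered density is the standard Gaussian (`lam[1]` converges to 1/2); the paper's model applies the same principle to the statistics of filtered interferometric crosstalk, whose constraints and resulting density differ from this toy case.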