After looking back through some of my research on this topic, I recall another method of calculating the ADC input level in dBm at 0 dBFS.
The generic formula is "Full Scale Signal Power Level (in Watts) = (Vrms^2)/Rin", where Vrms is 0.3889 V (the full-scale 1.1 V p-p divided by 2 and then multiplied by 0.707) and Rin is the stated datasheet input resistance of 100 ohms.
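Just to sanity-check the arithmetic, here is a quick sketch of that calculation (using the 1.1 V p-p full scale and 100 ohm input resistance quoted above; these are my assumed inputs, not values verified against the datasheet):

```python
# Full-scale power into a purely resistive ADC input.
v_pp = 1.1                   # assumed full-scale input, volts peak-to-peak
r_in = 100.0                 # assumed datasheet input resistance, ohms

v_rms = (v_pp / 2) * 0.707   # peak -> RMS for a sine wave (1/sqrt(2) ~ 0.707)
p_watts = v_rms ** 2 / r_in  # P = Vrms^2 / Rin

print(v_rms)                 # ~0.3889 V
print(p_watts)               # ~1.51e-3 W, i.e. ~1.51 mW
```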
If this is correct, then I guess it works for ADCs with Rin values in the 50 to 100 ohm range. I'm not sure what you'd do for ADCs with more complex or time-varying input impedances (due to the switched-capacitor input variations during track-and-hold)?
So then, to express this power level in dBm, the formula appears to be "Full Scale Signal Power Level (in dBm) = 10log10[(((0.3889 Vrms)^2)/100 ohms) x (1000 mW/W)]", which comes out to +1.8 dBm. Does this reasoning look correct, and does it apply correctly to the AD9625? If so, I'm still not sure how it works for ADCs with more complex input impedances.
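And the dBm step, again just as a sketch of the arithmetic above (same assumed 0.3889 Vrms and 100 ohm values):

```python
import math

v_rms = 0.3889                        # assumed full-scale RMS voltage, volts
r_in = 100.0                          # assumed input resistance, ohms

p_mw = (v_rms ** 2 / r_in) * 1000.0   # watts -> milliwatts (x 1000 mW/W)
p_dbm = 10 * math.log10(p_mw)         # dBm is referenced to 1 mW

print(round(p_dbm, 1))                # 1.8
```

which matches the +1.8 dBm figure.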
Anyway, I thought I'd throw this out as further food for thought on the topic.