Has a complicated spec starting on page 15. The specification is quite difficult; fortunately, there's an example worked out on page 19. The only problem is that it seems very, very wrong. I was trying to estimate the frequency error for a 100 MHz signal. If I use that example, it says the 90% confidence interval is an error of +/- 0.1176 s. So, for a 100 MHz signal, that puts my signal period between -0.1176 and +0.1176 seconds. In frequency, that says my signal is between -8.503 and 8.503 Hz.
It looks like the timebase uncertainty is computed incorrectly. I think it should be (1/10 MHz)*1.7 ppm, which is 0.17 ps. That would shift the basic accuracy at the 90% confidence interval to +/- 50.94 ps. In frequency, that puts the interval between 99.493181 MHz and 100.512008 MHz. Now, this is closer, but it still puts my error at 5120 ppm. Is this correct? I thought I should be able to get to within a few ppm.
Which counter can I use to get less than 1 ppm measurement uncertainty at 100 MHz? I'll use a rock-solid reference oscillator.
Hi, I noticed that Agilent has fixed the data sheet to correct the errors in Appendix A. It now makes much more sense.
By the way, I didn't see any reply to my message, nor any revision notes on the data sheet. That makes me nervous, since the data sheet seems to be the only place the accuracy specs are published. It seems Agilent can change the specs at any time without letting customers know. Perhaps some notes about revisions would be helpful.
IANAAE (I Am Not an Agilent Employee), but just to give you some insight into their likely thought process, there are a couple of ways you can measure frequency with high end counters like these. One is to just hook up the DUT and hit the "Frequency" button. The measurement will be dominated (by 5+ orders of magnitude) by the counter's TCXO, just as with any cheap counter. If you measure 100 MHz with a 1-ppm TCXO you will end up somewhere within +/- 100 Hz, obviously not too impressive.
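The arithmetic behind that +/- 100 Hz figure is just the timebase's fractional error scaled by the input frequency. A minimal sketch (the function name is mine, not from any Agilent document):

```python
# Sketch: how the counter's internal timebase error maps directly onto a
# frequency reading. Values (100 MHz DUT, 1 ppm TCXO) are from the post above.

def freq_uncertainty_hz(f_input_hz: float, timebase_error_ppm: float) -> float:
    """Worst-case frequency error contributed by the timebase alone."""
    return f_input_hz * timebase_error_ppm * 1e-6

err = freq_uncertainty_hz(100e6, 1.0)
print(f"Reading uncertainty: +/- {err:.0f} Hz")  # +/- 100 Hz
```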
The other method is to use the counter in time-interval mode, where a known-accurate external reference at (e.g.) 1 PPS drives the START channel. After the count is started, the next edge from your 100 MHz DUT will arrive at the STOP channel somewhere between 0 and 10 nanoseconds later. Any change in the observed start-to-stop interval from one second to the next represents a nonzero phase slope, which is the same as the DUT's frequency error versus the expected 100 MHz.
That method is more complicated but it can also be more "certain," because the counter's internal timebase is in play for only ~5 nanoseconds per second in this example. In other words, almost all of the phase-slope uncertainty comes from the period of the 1 Hz START signal. The systematic error is nearly constant from one tick to the next, so there's not much left to degrade the measurement but the counter's 2E-12 random uncertainty.
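To make the phase-slope idea concrete, here's a rough sketch of the post-processing. The readings below are made-up illustrative numbers, not measurements from a real 53230A, and the helper function is my own invention:

```python
# Sketch of the phase-slope method described above: the change in the
# start-to-stop interval from one 1 PPS tick to the next is the DUT's
# fractional frequency offset from nominal.

def fractional_freq_error(ti_readings_s, tau_s=1.0):
    """Average phase slope (delta-TI per tick interval) = fractional frequency offset."""
    deltas = [b - a for a, b in zip(ti_readings_s, ti_readings_s[1:])]
    return sum(deltas) / (len(deltas) * tau_s)

# Hypothetical TI readings in seconds, one per 1 PPS tick: the interval
# grows ~2 ps per second, i.e. the DUT is off by ~2 parts in 1e12.
readings = [5.000e-9, 5.002e-9, 5.004e-9, 5.006e-9]
y = fractional_freq_error(readings)
print(f"Fractional frequency error: {y:.2e}")   # ~2e-12
print(f"Error at 100 MHz: {y * 100e6:.2e} Hz")  # ~2e-4 Hz
```

Note that averaging the slope over many ticks is what buries the counter's per-shot random uncertainty; a single pair of readings would be limited by trigger jitter and quantization.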
I've never used a 53230A but on earlier counters with similar specs like the 5370A/B, the real-world TI uncertainty is down at the 3E-12 to 6E-12 per second level. This is mostly a combination of trigger jitter and quantization error.
Now, you can get similar uncertainty in an ordinary frequency measurement if you plunk down the shekels for the ovenized timebase option when you buy the counter. (On the earlier counters this was still not as effective as deriving the frequency error from phase-slope measurements, but it may be now.) The reason why it makes sense for Agilent to ship a cheap TCXO in their high-end counters is simple: the users who care about this sort of thing will either be making TI measurements that don't depend on the counter's timebase anyway, or they'll use a 5/10 MHz house clock as the counter's external reference.
There's no point making them pay for an expensive OCXO that they probably won't use.... except for the unfortunate fact that the TCXO makes the spec sheet look bad. It's the marketing folks' job to make sure that customers understand that, and they may not have quite succeeded here. A page full of equations doesn't tell you what to expect at different timescales, or help with the choice of methodologies. Residual Allan deviation plots for TCXO, OCXO, and external reference options would be better for that purpose, but only the ultra-high end TI analyzers ever advertise those for some reason.
Executive summary: this counter should be trustworthy down to the 1E-10/second level or better, but not if you just buy the base model and hit the "Frequency" button.
The basic accuracy (measurement uncertainty) calculated using the example found on page 22 of the latest 53200A series data sheet is applied to the measurement, and not to the time base reference. Thus, for the calculated basic accuracy of 1.7000566E-6, the measurement uncertainty for a 100 MHz input signal is ±170 Hz.
Note that the example and calculated accuracy are based on the counter's standard TCXO reference. Substituting the corresponding OCXO time base uncertainty specifications into the time base uncertainty equation (aging + temperature + calibration uncertainty) results in a time base uncertainty of 65E-9. Using the same values for random uncertainty and systematic uncertainty as in the example, and using the time base uncertainty of the OCXO, the calculated basic accuracy is approximately 65.025300E-9. The measurement uncertainty for the 100 MHz signal is then ±6.5 Hz. This is less than the 1 ppm accuracy you required at 100 MHz. (This is almost achieved using the TCXO.) Note also that the calibration uncertainty term in the time base uncertainty equation assumes the factory calibration value. Recalibrating the instrument at an Agilent Service Center, or locking to an external reference and using its uncertainty values, will make time base uncertainty less dominant in the accuracy equation.
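The last step of that data-sheet arithmetic is simply scaling the fractional basic accuracy by the input frequency. A quick sketch using the figures quoted above (the function name is mine; the fractional accuracies are the ones from the data-sheet example and the OCXO substitution):

```python
# Scale the quoted fractional basic accuracy by the input frequency to get
# the measurement uncertainty in Hz, as in the 53200A data sheet example.

def measurement_uncertainty_hz(f_input_hz: float, basic_accuracy: float) -> float:
    """Measurement uncertainty in Hz for a given fractional basic accuracy."""
    return f_input_hz * basic_accuracy

f_in = 100e6
tcxo = measurement_uncertainty_hz(f_in, 1.7000566e-6)  # standard TCXO
ocxo = measurement_uncertainty_hz(f_in, 65.0253e-9)    # OCXO option
print(f"TCXO: +/- {tcxo:.1f} Hz, OCXO: +/- {ocxo:.2f} Hz")  # ~170 Hz vs ~6.5 Hz
```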
The question then becomes: 53210A, 53220A, or 53230A? The answer depends on the additional functionality or resolution required.