This has nothing to do with the ASV, but it's something I discovered while fiddling with channel impedance. When I first turn on the scope, the FFT that was configured previously launches automatically. When I change the channel impedance from 50 ohms to 1 Mohm, the Max measurement changes from dBm to dBV. It stays that way when I toggle the impedance back to 50 ohms. When I go to FFT/More FFT/Vertical Units, the default is Decibels. When I toggle the menu to V RMS and back to Decibels, the Max measurement is locked to dBm.
After fiddling with FFT/More FFT/Vertical Units again, it's in a mode where, when I select V RMS, the FFT disappears and the Max measurement shows 0 dBm. When I launch FFT/More Auto Setup, the measurement is back to 0 dBm, but when I select V RMS the measurement is 223 mV, and the noise floor and the signal level drop. Back to Decibels and the measurement is correct.
Things aren't consistent, and it appears the measurements are all out of whack. There's no clear way to correct the problem, and changing other settings can shut the FFT down.
I have an MSO-X 3034A, and I tried to reproduce the problem you're having. I found that with the vertical units set to decibels, I consistently got vertical scaling in dBm for a 50 ohm input impedance setting, and dBV for a 1 Mohm input impedance setting. I could switch back and forth between the two input impedances and the units were always correct.
When I switched the vertical units to V RMS, my vertical scaling was always in volts, no matter which input impedance was selected.
I think this is the correct behavior. What you're seeing when the vertical units are set to V RMS is a linear (vs. logarithmic) vertical scale, so the display appears much less sensitive (and much less noisy) than with a logarithmic scale. This is the same behavior as on a traditional spectrum analyzer.
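For what it's worth, the readings you listed are self-consistent: 0 dBm into 50 ohms works out to roughly 223.6 mV RMS, which is why the Max measurement shows ~223 mV when you switch the vertical units to V RMS. Here's a quick sketch of the standard unit conversions (not scope-specific code, just the textbook dBm/dBV definitions):

```python
import math

def vrms_to_dbv(vrms):
    # dBV: decibels relative to 1 V RMS
    return 20 * math.log10(vrms)

def vrms_to_dbm(vrms, r_ohms=50):
    # dBm: decibels relative to 1 mW, dissipated in the given load
    p_watts = vrms ** 2 / r_ohms
    return 10 * math.log10(p_watts / 1e-3)

# 0 dBm into 50 ohms corresponds to sqrt(1 mW * 50 ohms) RMS:
v = math.sqrt(1e-3 * 50)
print(round(v * 1000, 1))        # 223.6 (mV RMS)
print(round(vrms_to_dbm(v), 3))  # 0.0 (dBm into 50 ohms)
print(round(vrms_to_dbv(v), 2))  # -13.01 (dBV)
```

This also explains why the units flip between dBm and dBV with input impedance: dBm needs a known load resistance to convert voltage to power, so it only makes sense at 50 ohms, while dBV is a pure voltage ratio and is used at 1 Mohm.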
What firmware version are you using? I'm using 2.12 (the newest).