For cheapness, I would imagine it would probably be done with an LED/resistor/diode combination that would not be happy to be supplied with double the normal running current.
Possibly.
However, I think it's very unlikely that either the LED or the diode (only one of which would carry current, depending on the polarity of the DC IR test voltage) would be unhappy with double the normal current - which would presumably be very low anyway, since one is not aiming for Blackpool Illuminations! Nor, given the very brief duration of an IR test, do I think it likely that the resistor would suffer from 'double the normal current' (i.e. four times the normal power dissipation). As for the diode (assuming it were reverse-biased by the IR test voltage), I'm not sure that exceeding its PIV during the test would do it any harm - it would probably just behave (reversibly, and without damage) like a high-voltage zener, with, again, a very low current, limited by the resistor.
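For what it's worth, the 'double current = four times the dissipation' point follows directly from P = I²R. A quick sketch, using purely illustrative (assumed) component values for a hypothetical LED indicator chain:

```python
# Illustrative sketch only - the current and resistor values are
# assumptions, not figures from the discussion.
I_normal = 1e-3      # assumed normal indicator current: 1 mA
R = 220e3            # assumed dropper resistor: 220 kohm

P_normal = I_normal ** 2 * R            # P = I^2 * R
P_doubled = (2 * I_normal) ** 2 * R     # double the current

print(P_normal)                 # 0.22 W at normal current
print(P_doubled)                # 0.88 W at double current
print(P_doubled / P_normal)     # 4.0 - four times the dissipation
```

Even quadrupled, a fraction of a watt for the few seconds of an IR test is unlikely to trouble a resistor.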
So, whilst you might be right, I somewhat doubt that the LED/resistor/diode would, in practice, be at any appreciable risk.
Of course, the resistor would be plenty low enough in value to completely screw up the IR measurement, so the indicator circuit would need to be disconnected in some way (switching off the switch?) before testing. However, as I said before, the same problem exists with a neon. I've just tested an FCU with a neon and found that (with one polarity of test voltage, but not with the other) the indicated IR is 0.54 MΩ at 250 V, 0.33 MΩ at 500 V and 0.30 MΩ at 1000 V - so, again, necessitating disconnection (or 'switching off', e.g. by removal of the FCU fuse) of the indicator circuit before any meaningful IR test can be undertaken.
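To illustrate why the indicator swamps the reading: the meter sees the indicator circuit in parallel with the genuine insulation, and the parallel combination is dominated by the lower resistance. A rough sketch (the 200 MΩ insulation figure is an assumption, and modelling the neon circuit as a fixed resistor is itself a simplification - the readings above were clearly voltage- and polarity-dependent):

```python
# Assumed values: genuine insulation of 200 Mohm, indicator circuit
# modelled crudely as a fixed resistance equal to the ~0.33 Mohm
# reading observed at 500 V.
R_insulation = 200e6    # ohms (assumed)
R_indicator = 0.33e6    # ohms (from the 500 V reading)

# The meter measures the parallel combination: R1*R2 / (R1 + R2)
R_measured = R_insulation * R_indicator / (R_insulation + R_indicator)

print(round(R_measured / 1e6, 3))  # ~0.329 Mohm - essentially just the indicator
```

In other words, with the indicator connected the meter tells you almost nothing about the actual insulation - hence the need to disconnect it first.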
Kind Regards, John