I just got an insulation resistance tester to pre-check some hardware I'm developing (formal testing will be done elsewhere, but we want to make sure it succeeds). Specifically I got http://cpc.farnell.com/1/1/91037-tenma-72-9400-tester-insulation-resistance.html
The tester is supposed to have a range of 2 megohm to 2000 megohm
Anyway, having got the tester I tested a few things: stuff I'd built, power bricks (testing both P+N to E and P+N+E to output), various leads, and the inputs of a couple of different multimeters.
The multimeter inputs unsurprisingly read around the 10 megohm mark. The multimeters also indicated that the voltage from the IR tester was slightly over nominal.
One old set of test leads gave readings ranging from about 100 megohm to off-scale depending on how I held them.
Everything else I tested gave an off-scale reading. In one sense off-scale is good, but in another sense it leaves you worrying whether you actually connected the tester properly. Do people find it common/normal to get off-scale (i.e. greater than 2 gigohm) readings when IR testing stuff?
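For context on what "off-scale" implies, here's a rough back-of-envelope calculation of the leakage current at the tester's 2 gigohm upper limit. The 500 V test voltage is my assumption (the post doesn't state which range was used); substitute whatever voltage you actually selected:

```python
# Back-of-envelope: leakage current at the tester's 2 GOhm upper limit.
# test_voltage_v is an ASSUMPTION, not taken from the post -- use your
# actual selected test voltage.
test_voltage_v = 500.0      # assumed test voltage (V)
max_reading_ohm = 2e9       # tester's stated 2 gigohm upper limit

min_detectable_leakage_a = test_voltage_v / max_reading_ohm
print(f"Leakage at the 2 GOhm limit: {min_detectable_leakage_a * 1e9:.0f} nA")
# -> 250 nA at 500 V
```

So any insulation leaking less than a couple of hundred nanoamps at that voltage would read off-scale, which is plausible for healthy modern insulation.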