As you can see, if a lamp was designed to have a life of 1,000 hours at 230V then at 260V (+13%) ...
And the electricity bills for the user whose supply is at 260V will also be higher than those of the user whose supply is at 230V.
But as much as 13% more?
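The lamp-life point in the quote above can be sketched numerically. A widely quoted rule of thumb (an assumption here, not stated in the thread) is that tungsten-filament life falls off as roughly the 13th power of the over-voltage ratio:

```python
# Rule-of-thumb sketch (assumed exponent, not from the thread): tungsten
# filament life scales as roughly (V_rated / V_actual) ** 13.
V_RATED = 230.0
V_ACTUAL = 260.0
RATED_LIFE_HOURS = 1000.0

ratio = V_ACTUAL / V_RATED                      # ~1.13, i.e. +13% voltage
expected_life = RATED_LIFE_HOURS * ratio ** -13

print(f"Voltage up {ratio - 1:.0%}; expected life roughly {expected_life:.0f} h")
```

On those assumed figures, a 13% over-voltage cuts the nominal 1,000-hour life to only a few hundred hours, which is why running lamps over-voltage matters far more for lamp life than for the electricity bill.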
All true. I'm not sure about the motors, either. There would usually be a bit more total energy consumption at 260V as compared with 230V, but nowhere near the 'theoretical' 28% (nor even 13%). Some loads might even use slightly less energy at 260V, since if the kettle, tank of water or whatever heats up more quickly, the losses might be less.

This time you have made a mistake and you are wrong. Power increases in proportion to the square of the voltage increase, so it would be 28%, not 13%. ... BUT. Any modern equipment with a switch-mode power supply (TV, HF fluorescent light, electronic 12V transformers, CFL lights, etc.) is constant wattage over a wide voltage range. Boiling a kettle of water uses the same amount of energy, it just happens quicker. The same applies with a fan heater: the room heats up quicker, then the thermostat kicks in. Tungsten bulbs will use more and be so much brighter that one would be tempted to use the next size down. Not sure about the motors in a fridge, washing machine etc.

And the electricity bills for the user who has 260V will also be higher than those of the user whose supply is at 230V. ... But as much as 13% more?
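The disputed arithmetic is easy to check for the simplest case: a purely resistive load with fixed resistance (a sketch only; as the posts above note, switch-mode supplies, thermostats and tungsten filaments all deviate from this):

```python
# For a fixed resistance R, P = V^2 / R, so power scales with the
# square of the voltage: a 13% voltage rise is roughly a 28% power rise.
V_NOMINAL = 230.0
V_HIGH = 260.0

power_ratio = (V_HIGH / V_NOMINAL) ** 2
print(f"Power increase at 260 V: {power_ratio - 1:.1%}")  # about +27.8%
```

So the 28% figure is correct for instantaneous power into an ideal resistive element; whether the *energy* (and hence the bill) rises by that much depends on whether the load runs for a fixed time or until a fixed job (boiling, reaching the thermostat setpoint) is done.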
Yes, it's 'normal'. It's just due to the tiny amount of electricity which 'gets through' because of such things as capacitance between the (open) switch contacts. Nothing to worry about. With a different type of meter, you might well not see any voltage at all.

Having read through this interesting thread, I took voltage readings at some of my sockets. All read 241V to 242V (AC), although I was surprised to see a very small voltage when the switch was turned off (but the wires still connected). This varied between 0.2V and 1.5V (AC) at different sockets. Is this normal? If so, how is it caused?
Indeed, but I didn't want to confuse the OP with detail (someone asking the question he asked would very probably not understand the meaning and significance of the input impedance of a meter), so merely wrote:

One thing to remember about modern digital multimeters is that they have a high input impedance; 10 megohms is typical.
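Those two facts together account for the phantom readings. A rough sketch: a stray capacitance across the open switch contacts (the 1 pF figure below is an illustrative assumption, not measured) forms a voltage divider with the meter's 10-megohm input impedance:

```python
import math

# Illustrative divider: stray capacitance across the open switch contacts
# in series with the DMM's high input impedance. The 1 pF value is an
# assumption chosen to show the effect; real stray capacitance varies.
FREQ_HZ = 50.0
SUPPLY_V = 240.0
METER_IMPEDANCE = 10e6          # 10 megohms, typical DMM input impedance
STRAY_CAPACITANCE = 1e-12       # 1 pF (assumed)

x_c = 1 / (2 * math.pi * FREQ_HZ * STRAY_CAPACITANCE)   # capacitive reactance
reading = SUPPLY_V * METER_IMPEDANCE / math.hypot(x_c, METER_IMPEDANCE)

print(f"Reactance of stray capacitance: {x_c / 1e9:.1f} gigohms")
print(f"Meter would read about {reading:.2f} V")
```

With these assumed figures the meter reads well under a volt, which sits comfortably in the 0.2V-1.5V range reported above; a low-impedance analogue meter would load the divider down and show essentially nothing.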
With a different type of meter, you might well not see any voltage at all.
Yep, any of those, or even their little sibling, an example of which is still in service on my bench: http://dsa.ebay.co.uk/sch/i.html?_odkw=%28avo%2Cavometer%29+%28%22model+8%22%2Cmk8%2C%22mk+8%22%29&_osacat=0&_from=R40&_trksid=p2045573.m570.l1313.TR0.TRC0.X%28avo%2Cavometer%29+%28%22model+8%22%2Cmk8%2C%22mk+8%22%2C%22avo+8%22%29&_nkw=%28avo%2Cavometer%29+%28%22model+8%22%2Cmk8%2C%22mk+8%22%2C%22avo+8%22%29&_sacat=0
That's a pity. Mine still works fine, although the original leads and the plastic 'pouch' died decades ago. I've had mine since the mid-60s, so nearly 50 years old.

Aah, that takes me back. An Avo Minor. I think the battery leaked in mine.
Are analogue meters (like Avo meters) more accurate than modern digital meters?