House Voltage Help

Just to be 100% pedantic, I assumed that, like 99.4% of UK households, the OP's house would have been wired to the 16th Edition or earlier, when 4% was the allowance.

The 17th Edition, and thus the 3%/5% limits, became the requirement for wiring installed after 1 July 2008.
It also became the standard for what's allowed now.

As discussed many times here, if you were doing a PIR you'd do it to the current standards, not what was in force whenever the installation was done. If, for some reason, you checked voltage drop on a lighting circuit and found it was 4% you'd code it 2 or 4.

If a long run to an outbuilding had been done under the 16th in 10mm², was currently on a 20A breaker from the CU and you were asked to move it to a 50A switchfuse because the outbuilding was to become a sauna and hot tub you'd do the VD calculations to the 17th, not to the 16th.
 
OK, looking at Appendix 12, it reads 3% and 5% of nominal voltage (called U₀), so the minimum voltage is 204.7 V, except during inrush on starting, where it is allowed to be lower.
Maximum voltage is 253 V, i.e. nominal 230 V plus 10%.
For lighting the minimum voltage is 209.3 V, and the minimum supply voltage is 216.2 V, i.e. 230 V minus 6%.
Since the maximum cable length is calculated at nominal voltage, not minimum voltage, it is possible that in some extreme cases we may get a lower voltage, but it is unlikely.
Reading regulation 442.3, it is also possible for the voltage to rise to 398.37 V (230 V × √3), and insulation must be designed to withstand being temporarily stressed to this voltage; however, it does not say items must not burn out, only that the insulation must survive.
So, is 218–221 V at the consumer unit cause for concern? Well, 218 V is 1.8 V inside the limit, but one would not expect to see voltage this low on a regular basis, and a call to the supplier would not go amiss. However, there is no way you can force them to do anything about it.
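Those limits can be checked with quick arithmetic. A minimal sketch in Python, assuming the 230 V nominal, the +10%/−6% supply tolerance, and the 3%/5% volt-drop allowances quoted above:

```python
U0 = 230.0  # nominal single-phase voltage (U0)

supply_min = U0 * (1 - 0.06)    # lowest permitted supply: 216.2 V
supply_max = U0 * (1 + 0.10)    # highest permitted supply: 253 V

# Appendix 12 volt-drop allowances, taken as a percentage of nominal:
vd_lighting = 0.03 * U0         # 6.9 V
vd_power    = 0.05 * U0         # 11.5 V

# Worst case at the load: minimum supply minus the full allowance.
min_lighting = supply_min - vd_lighting   # 209.3 V
min_power    = supply_min - vd_power      # 204.7 V

for v in (218.0, 221.0):        # the readings asked about
    print(f"{v} V within supply limits: {supply_min <= v <= supply_max}")
```

Both readings come out inside the supply band, if only just at the lower end.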
I lived for a time on a caravan site where the supply came from a pole-mounted step-down transformer. There were more caravans than the supply was designed for, hence more than the design current being drawn, and at peak times the voltage was under 200 V. The only problems I had were that I was transmitting mains hum on my two-way radio, so I had to revert to battery power to ensure a smooth supply. Sometimes hum would be present on the ordinary radio, and the TV screen size would shrink. We worried about the fan stalling in the fan heater, so I changed to an oil-filled radiator for safety. But this went on for 4 years, and no caravan fires or any other real problems resulted from running on reduced voltage.
The proprietor had issues with the electricity meter. It is supposed to read kilowatt-hours, but most meters rely on a nominal 230 V and really read ampere-hours: at 230 V, 1 kWh = 4.347826 A·h, but at 200 V, 1 kWh = 5 A·h, so he was being overcharged for his power. A bill for £2000 should have been for £1739.13, but his attempts to claim back the £260 did not work. Got to give him full marks for trying, though. Their answer was to ask him if he wanted an upgraded supply, at a cost well in excess of his losses due to volt drop. He had a split-phase supply, and they offered him a three-phase supply instead.
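For what it's worth, the arithmetic behind that claim would run as follows, assuming (as the post does, and as later replies dispute) a meter that counts ampere-hours and scales them by a nominal 230 V:

```python
NOMINAL_V = 230.0   # voltage the meter is assumed to be calibrated for
ACTUAL_V  = 200.0   # voltage actually present on the site

bill = 2000.0       # what the (allegedly amp-hour-counting) meter billed

# If the meter counted ampere-hours and scaled by 230 V while the true
# voltage was 200 V, it would overstate energy by 230/200, so the fair
# bill is the billed amount scaled back down.
fair_bill = bill * ACTUAL_V / NOMINAL_V
overcharge = bill - fair_bill
print(f"fair bill ~ £{fair_bill:.2f}, overcharge ~ £{overcharge:.2f}")
```

This reproduces the £1739.13 figure in the post, but note the premise itself is contested further down the thread.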
As usual, in any argument with the supplier, the user rarely wins.
Eric
 
The proprietor had issues with the electricity meter. It is supposed to read kilowatt-hours, but most meters rely on a nominal 230 V and really read ampere-hours: at 230 V, 1 kWh = 4.347826 A·h, but at 200 V, 1 kWh = 5 A·h, so he was being overcharged for his power.

Not true.

The meters measure true kWh; they have a voltage winding as well as a current "coil" if you can call one turn a coil. And they measure kWh not kVAh, so power factor is irrelevant. You can't get charged for bad power factor (unless you have a meter that is made specifically to measure kVArh). Electronic meters are similar: they measure kWh.

There are regulations for meter accuracy — I think something like +2.5% or -3.5%, which would be impossible to meet if the meter relied on nominal voltage.

The nominal voltage changed from 240V to 230V some 15 years ago. Meters were not recalibrated, so if they worked as ericmark thinks, suppliers would have enjoyed a windfall of over 4%. Don't you think there would have been a political row over this if it were true?
 
Today, maybe the new electronic meters do measure true watts, but how about the old meters with the revolving disc? Just look at how they are made. No way could those have corrected for volt drop.
As to the voltage change, let's be real: 240 V ±6% and 230 V +10%/−6% meant the voltage could be lowered but didn't have to be. Even today, if you stick your meter in the sockets, 240 V is far more common than 230 V.
Eric
 
For many years the supply voltage for single-phase supplies in the UK was 240 V ±6%, giving a possible spread of voltage from 226 V to 254 V. For three-phase supplies the voltage was 415 V ±6%, the spread being from 390 V to 440 V. Most continental voltage levels were 220/380 V.

In 1988 an agreement was reached (CENELEC Harmonisation Document HD472) that voltage levels across Europe should be unified at 230V single phase and 400V three-phase with effect from January 1st, 1995. Those countries with a nominal voltage of 240V (like the UK) were obliged to move to 230V +10% -6%, and those on 220V moved to 230V +6% -10%.
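The old and harmonised tolerance bands can be compared with a quick sketch, using the figures given above:

```python
# Tolerance bands quoted in the thread (volts).
ranges = {
    "240 V +/-6% (old UK)":             (240 * 0.94, 240 * 1.06),
    "230 V +10%/-6% (harmonised UK)":   (230 * 0.94, 230 * 1.10),
    "230 V +6%/-10% (ex-220 V states)": (230 * 0.90, 230 * 1.06),
}
for label, (lo, hi) in ranges.items():
    print(f"{label}: {lo:.1f} to {hi:.1f} V")
```

Note the large overlap between the old and harmonised UK bands: a supply sitting at 240 V was already within the new limits, which is why nothing physical had to change at the substation.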

It was proposed that on January 1st, 2003 the tolerance levels would be widened to ±10%, and then that was pushed back to 2005, and then in July 2001 the CENELEC Technical Board decided to continue with the existing tolerances until 2008.

I've no idea whether this change did finally get implemented, was delayed again, or will ever happen, but in any event Europe-wide harmonisation is not being achieved by having common supply voltages, but by requiring manufacturers to make products which operate over a much wider range. Not too much of an issue, apart from incandescent lamps.
 
Today, maybe the new electronic meters do measure true watts, but how about the old meters with the revolving disc? Just look at how they are made. No way could those have corrected for volt drop.

There are two coils in the electromechanical meter: a shunt (voltage) coil, which creates a magnetic flux that varies with the voltage, and a series (current) coil, which creates a magnetic flux that varies with the current. The torque on the disc is proportional to the product of the two, so the disc speed tracks true power.
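Because the flux from each coil tracks the instantaneous voltage and current, the disc effectively integrates their product, i.e. true energy. A minimal numerical sketch of that principle (illustrative only; sample count and waveforms are made up for the demonstration):

```python
import math

# True energy is the time-integral of instantaneous v(t) * i(t).
# Sketch: sample sinusoidal waveforms over one 50 Hz cycle,
# integrate numerically, then scale up to the full duration.
def energy_wh(v_rms, i_rms, phase_rad, hours, samples=10000):
    f = 50.0                      # mains frequency, Hz
    dt = (1.0 / f) / samples      # time step over one cycle
    e_cycle = 0.0                 # joules accumulated over one cycle
    for n in range(samples):
        t = n * dt
        v = math.sqrt(2) * v_rms * math.sin(2 * math.pi * f * t)
        i = math.sqrt(2) * i_rms * math.sin(2 * math.pi * f * t - phase_rad)
        e_cycle += v * i * dt
    cycles = hours * 3600 * f     # number of cycles in the period
    return e_cycle * cycles / 3600.0   # joules -> watt-hours

# 1 A at a true 240 V, unity power factor, for one hour registers
# ~240 Wh -- no "nominal" 230 V figure enters into it.
print(round(energy_wh(240, 1, 0.0, 1)))  # -> 240
```

The `phase_rad` argument also shows why power factor drops out of a kWh reading: a lagging load simply registers less true energy, not more.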

As to the voltage change, let's be real: 240 V ±6% and 230 V +10%/−6% meant the voltage could be lowered but didn't have to be. Even today, if you stick your meter in the sockets, 240 V is far more common than 230 V.
Eric

That's the point — nothing changed except for the nominal voltage declaration. So if what you say were true, after the change new meters would be calibrated assuming a voltage of 230V, whereas older meters already on circuit would be calibrated assuming 240V. For the same current, the new meters would run slower than the old.

For example, if the current were 1A, the new meter would be calibrated to record 230W whereas the old meter would be set to run faster to record 240W. In fact, the old meter would be running 4% fast.
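That hypothetical discrepancy is easy to put numbers on. A sketch of the amp-hour-counting meter the thread is arguing against (explicitly not how real meters work):

```python
CURRENT_A = 1.0   # amps drawn
HOURS = 1.0

def amp_hour_meter_wh(assumed_nominal_v):
    # Hypothetical meter that merely counts ampere-hours and scales
    # them by an assumed nominal voltage (NOT how real meters work).
    return CURRENT_A * HOURS * assumed_nominal_v

old_reading = amp_hour_meter_wh(240)   # calibrated pre-harmonisation
new_reading = amp_hour_meter_wh(230)   # calibrated post-harmonisation
fast_pct = (old_reading / new_reading - 1) * 100
print(f"old meter would run {fast_pct:.1f}% fast")  # -> 4.3% fast
```

A 4.3% systematic discrepancy between neighbouring meters would have been far outside the accuracy limits mentioned above, which is the point of the rebuttal.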

Of course, ericmark is wrong and this does not happen.
 
