Voltage drop

Hopefully a simple question!

I've often pondered the fact that the deemed-to-satisfy maximum permissible voltage drops (525.3 & Appendix 12) are lower for lighting circuits than everything else. On the face of it, I might have expected the reverse. Certainly with incandescent bulbs (admittedly, almost a thing of the past), the worst that happens with undervoltage is that the light gets a bit dimmer, and I thought (perhaps wrongly) that most other forms of lighting were fairly tolerant of undervoltage (within reason) - whereas it's quite possible that far more voltage-sensitive things will be run off non-lighting circuits.

What am I missing? Are some forms of lighting perhaps far more sensitive to undervoltage than I thought?

Kind Regards, John.
 
One possible reason is related to the "loss" of power caused by the voltage drop and dissipated as heat in the cables.

On a lighting circuit it will be limited to a maximum of volt drop times 6 amps, while on a power circuit the maximum power dissipated in the cable will be limited to volt drop times 20 amps in a radial (or times 32 amps in a ring final).

Or it could be something entirely different.
 
One possible reason is related to the "loss" of power caused by the voltage drop and dissipated as heat in the cables.
On a lighting circuit it will be limited to a maximum of volt drop times 6 amps, while on a power circuit the maximum power dissipated in the cable will be limited to volt drop times 20 amps in a radial (or times 32 amps in a ring final).
I don't really understand that argument. The power dissipated in the cable with 20A or 32A loads and a 5% voltage drop would be 230W and 368W respectively (if the entire load were connected at the same point, where that 5% drop existed). With 6A and the maximum permitted voltage drop for lighting circuits (3%), the power dissipated would only be 41.4W.
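
For anyone who wants to check those figures, here is a minimal Python sketch of the same arithmetic (assuming the 230V nominal supply; the rest is just volt drop times current):

```python
# Power dissipated in the cable is (volt drop) x (load current), assuming
# the entire load is connected at the point where that drop exists.
U_NOMINAL = 230.0  # nominal supply voltage (V)

cases = [
    ("radial at 20A, 5% drop", 0.05, 20.0),
    ("ring final at 32A, 5% drop", 0.05, 32.0),
    ("lighting at 6A, 3% drop", 0.03, 6.0),
]

for name, drop_fraction, current_a in cases:
    v_drop = drop_fraction * U_NOMINAL   # volts lost along the cable
    p_cable = v_drop * current_a         # watts dissipated in the cable
    print(f"{name}: drop = {v_drop:.1f} V, cable loss = {p_cable:.1f} W")
```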

Or it could be something entirely different.
I think it must, but I don't know what. I suppose it might be due to a consideration of energy (rather than power) loss, taking into account the fact that lighting is likely to be switched on for far longer periods of time than most significant loads on other circuits - although I suspect that the arithmetic of that would not be convincing.

Kind Regards, John.
 
Best reason I could find is that lighting equipment (with electronic controllers, 'ballasts' etc) is fussy about low voltages these days.
 
Best reason I could find is that lighting equipment (with electronic controllers, 'ballasts' etc) is fussy about low voltages these days.
Yes, that remains a possibility, although most of the specs I could find with a quick search seemed to indicate that they are pretty tolerant of undervoltage. Of course, there is any amount of electronic equipment plugged into power circuits these days, but I suppose a high proportion of it probably uses switched-mode PSUs, and is therefore very tolerant of supply voltage variation.

One thing which might give us a clue is the history of this. Does anyone know how far back the differential maximum voltage drops for lighting and 'other' circuits (with less drop permitted for lighting circuits) have existed in the regs?

Kind Regards, John
 
Until the 17th edn regs the max volt drop was 4% within the consumer's installation for both lighting and power.
The logical explanation for changing the max VD for lighting may be the change in nominal supply voltages, albeit that doesn't really explain why power applications were allowed to go lower.
Remembering the regs cover a vast range of installations including industrial: SON lamps do not like too much of a voltage drop, as it causes them to extinguish, and they then have to cool before they will restrike. If you added the VD within the consumer's installation to the external VD you might exceed the threshold for the lamp.
I remember seeing ballasts with different tappings where you selected the closest voltage.
 
Old fluorescent fittings are very critical on voltage. I have had problems with a festoon of lamps in a tunnel where the original string of 25 x 60W 110-volt lamps was tripping the 16A RCD. The tapping was changed from 110V to 127V, and then the last few would not strike, so we had the first 20 at 127V and the last 5 at 110V.

When I took one fitting and tested it on the bench, the results were a surprise. At the 110V tapping they were using 0.8A, not the just over 0.5A expected, and on the 127V tapping it was under 0.6A.

Today, with electronic ballasts, there is not the same problem. Often they will have massive input-voltage latitude, from 200 to 300 volts AC or DC, but in general the HF ballast is very expensive at around £50, and only industry uses them to reduce the lighting bill. Domestic fittings normally use the old type.

So I would guess the reduction to 3% is more to do with the move to discharge lighting to reduce costs.

However, if one works out the earth loop impedance and the volt drop, the cable lengths they each allow are nearly the same, so in real terms it makes no difference to the length of cable that can be used pre-2008 and post-2008.
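
As an illustration of volt drop being the cable-length-limiting factor, here is a rough Python sketch using the Appendix 4 mV/A/m method; the mV/A/m figures below are typical values for 1.0mm² and 2.5mm² two-core cable, but take the real numbers from the tables for an actual design:

```python
# Rough maximum cable length set by volt drop alone (ignoring Zs), using
# drop (V) = mV/A/m x design current (A) x length (m) / 1000.
# The mV/A/m figures are illustrative - check the tabulated values.
U_NOMINAL = 230.0

def max_length_m(permitted_fraction, design_current_a, mv_per_a_per_m):
    permitted_drop_v = permitted_fraction * U_NOMINAL
    return permitted_drop_v * 1000.0 / (mv_per_a_per_m * design_current_a)

# Lighting: 1.0mm2 cable at 6A with the 3% limit
print(f"lighting, 1.0mm2 at 6A: {max_length_m(0.03, 6.0, 44.0):.0f} m")
# Power radial: 2.5mm2 cable at 20A with the 5% limit
print(f"radial, 2.5mm2 at 20A: {max_length_m(0.05, 20.0, 18.0):.0f} m")
```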
 
Until the 17th edn regs the max volt drop was 4% within the consumer's installation for both lighting and power.
The logical explanation for changing the max VD for lighting may be the change in nominal supply voltages,
Yes, that would make some sense. Although, as far as I can make out, nothing actually changed in the UK (so most supplies are probably still well above the 'old' declared minimum of 226V), there is now at least the theoretical possibility that the supply could be nearly 10V lower than that (216.2V).
.....albeit that doesn't really explain why power applications were allowed to go lower.
Indeed, far from it! I know it's 'never' likely to happen, but the voltage after a 5% drop from the theoretical floor of 216.2V would be barely over 205V.

As you go on to say, and others have said, perhaps the main reason was the move to more voltage-sensitive (non-incandescent) lighting - maybe in combination with the change in nominal supply voltage.

A little while back I was talking about this to an electrical engineer (not an electrician!), explaining that the voltage drop limit for lighting was a bit of a pain, because it was often the cable-determining factor with a long sub-main. His response was "what's the problem? - let the voltage drop in the cable and stick a constant-voltage transformer on the lighting circuit at the far end"! Quite apart from the difficulty and cost of sourcing a suitable animal, I'd love to know what the regs would have to say about that one!!

Kind Regards, John
 
Just a question, is the permissible 3% volt drop referenced to: -

The maximum supply voltage
or
The nominal supply voltage
or
The minimum supply voltage
or
The actual supply voltage at the time it got measured (as in fact it varies throughout the day).

As it seems to me that you could get the situation of a supply voltage of 216.2V - 6.5V (3%) = 210V (figures rounded)

This would be permissible and the lowest acceptable.

So anything above that must by definition be acceptable up to a maximum of 253V
 
It's the nominal voltage with the current demand (of current-using equipment) after diversity or the design current of the circuit.

The notes in Appendix 4 do concede that this method may be rather pessimistic so it would appear that lower voltage will have been considered.
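
As a sketch of that design check (circuit details below invented purely for illustration; 3% of the 230V nominal gives the 6.9V lighting limit):

```python
# Design check: compare the drop at design current, calculated from the
# tabulated mV/A/m figure, against 3% (lighting) or 5% (other) of the
# 230V NOMINAL voltage. Circuit details are made up for illustration.
U_NOMINAL = 230.0

def volt_drop_v(mv_per_a_per_m, design_current_a, length_m):
    return mv_per_a_per_m * design_current_a * length_m / 1000.0

drop = volt_drop_v(mv_per_a_per_m=44.0, design_current_a=6.0, length_m=20.0)
limit = 0.03 * U_NOMINAL  # 6.9 V for a lighting circuit
print(f"drop = {drop:.2f} V, limit = {limit:.2f} V:",
      "OK" if drop <= limit else "FAILS")
```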
 
What an absolute **** ache of a calculation we used to do when designing LV networks! (nice computer programs do it for us now)

It's also IMO pretty meaningless.
As the regs are the minimum design requirements, anything above the minimum permissible should, and could be argued to, be acceptable. Which is no different to the DNO situation, where anything between 216.2V and 253V is legal and permissible (though in truth anything below 220V is rubbish).
 
Just a question, is the permissible 3% volt drop referenced to: -
The maximum supply voltage
or The nominal supply voltage
or The minimum supply voltage
or The actual supply voltage at the time it got measured (as in fact it varies throughout the day).
Presumably the last one. The regs say that the voltage drop is measured from 'the origin of the installation' - which in practice presumably means the 3% is 3% of the supply voltage pertaining at the time of the measurement.

As you go on to say, this means that the actual voltage drop (in volts) will vary according to the supply voltage - in theory from 3% of 216.2V (6.5V) to 3% of 253V (7.6V). However, since the larger permissible drops only arise when the supply voltage is higher, that's not really a problem - the voltage supplied to the light will still be higher than it would be with the smaller permissible drops that go with lower supply voltages.
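
To put numbers on that, a quick Python sketch (216.2V and 253V being the statutory 230V -6%/+10% limits):

```python
# If the 3% is taken from the actual supply voltage, the permitted drop
# (in volts) and the worst-case voltage at the luminaire both scale
# with the supply voltage.
for supply_v in (216.2, 230.0, 253.0):
    permitted_drop_v = 0.03 * supply_v
    luminaire_v = supply_v - permitted_drop_v
    print(f"supply {supply_v:5.1f} V: max drop {permitted_drop_v:.1f} V, "
          f"worst-case luminaire voltage {luminaire_v:.1f} V")
```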

Kind Regards, John
 
Just being pedantic, but if the supply voltage was 250V at the time of measurement, the minimum permissible voltage on the lighting circuit is 250 - 7.5 = 242.5V.

All things being equal, if actual measurements were taken at a supply voltage of 220V and the drop exceeded 6.6V, the design could be questioned!
 
