Voltage drop

What John seemed to be implying was that we could have a voltage of 216.2V with no load.
There may be no load in your house, but if there is a load next door and you are at the end of a long feeder, then the voltage supplied to your property will drop significantly.
Indeed. In fact, if the house (or group of houses) 'next door' in the upstream direction is drawing the ~100A it takes to bring 253V down to 216.2V, then one might well have 216.2V or less at the origin of one's installation, even with zero load on that installation.
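The arithmetic behind that '~100A' is just Ohm's law across the loop impedance. A minimal sketch, assuming the round 0.35Ω loop impedance figure used later in the thread:

```python
# Rough single-phase feeder calculation using figures from the thread.
# Assumes a simple lumped L-N loop impedance; a real feeder is distributed.
V_MAX = 253.0   # permitted maximum supply voltage (230V +10%)
V_MIN = 216.2   # permitted minimum supply voltage (230V -6%)
Z_LOOP = 0.35   # assumed L-N loop impedance, ohms

# Current needed to drop the whole permitted range across the loop impedance
i_drop = (V_MAX - V_MIN) / Z_LOOP
print(f"{i_drop:.1f} A")  # ~105A, i.e. 'fractionally over 100A'
```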

I still don't really understand why we don't see much more dramatic supply voltage variations than we do. With a group of a dozen or two houses, I would think that the difference between the peak and minimum instantaneous demands would be very large - probably considerably more than 100A ... yet no installation that I know sees anything other than very modest variations in supply voltage.

Kind Regards, John
 
Don't forget that it's relatively rare for more than a very small number of homes (often just one or two) to be fed from a simple 2-wire 240V system. Most British homes are on a 3-phase 240/415V network, or in some rural areas a single-phase 240/480V network, in which the current drawn by homes on one phase or pole is offset by current being drawn by other homes on the other phases or pole, reducing the current (and therefore the voltage drop) along the neutral.
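The offsetting effect described here can be illustrated with phasors: on a 3-phase network the phase currents are 120° apart, so with balanced loads they sum to (near) zero in the neutral. A sketch, with purely illustrative current figures:

```python
import cmath
import math

# Why balanced 3-phase loading reduces neutral current: each phase current
# is a phasor rotated 120 degrees from the next, and balanced phasors cancel.
def neutral_current(i_l1, i_l2, i_l3):
    """Magnitude of the neutral current for three phase-current magnitudes."""
    a = cmath.exp(1j * 2 * math.pi / 3)  # 120-degree rotation operator
    total = i_l1 + i_l2 * a + i_l3 * a**2
    return abs(total)

print(round(neutral_current(30, 30, 30), 9))  # balanced -> 0.0 in the neutral
print(round(neutral_current(30, 0, 0), 9))    # one phase loaded -> full 30.0
```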
 
Don't forget that it's relatively rare for more than a very small number of homes (often just one or two) to be fed from a simple 2-wire 240V system. Most British homes are on a 3-phase 240/415V network, or in some rural areas a single-phase 240/480V network, in which the current drawn by homes on one phase or pole is offset by current being drawn by other homes on the other phases or pole, reducing the current (and therefore the voltage drop) along the neutral.
Yes, that's all true, but (although I may be wrong) my understanding is that it's not uncommon to have, say, 40-70 consumers fed by one 3-phase main - so a pretty substantial number of consumers on each phase and, I would have thought, very substantial VD variations in the phase conductors alone.
 
yet no installation that I know sees anything other than very modest variations in supply voltage.
It is to the credit of the supply network that the impedances are low enough that voltage drops due to loading are minimised. But the variations in voltage can be more than insignificant. Most people would not notice a slow rate of change in voltage; it takes a 24-hour, 7-day log of voltage to get the full picture.

The more consumers there are on a feeder, the less the rate of change in voltage, as the variations in each consumer's load will not all occur at the same time. That said, it isn't totally true. The power generation people can tell when the advert break occurs in TV soaps, as thousands of kettles all switch on within seconds of each other. Engineers in a substation can often tell when the ad break occurs, as the transformer's hum gets louder. In fact, the main generating control room knows the TV schedules in advance, so they can have generators spinning up ready to take up the rapid increase in load.

Useless factoid: in 1970s Germany, some areas used variations in power consumption as the voting procedure in television talent shows. People were asked "To vote for Fritz Schmidt, turn on your appliances NOW", and the electricity supply company for the area would report the increase per contestant.
 
It is to the credit of the supply network that the impedances are low enough that voltage drops due to loading are minimised.
Yes, there was either a lot of inspired forethought or 'luck' (based on a lot of over-engineering) - since much of the LV network has been there for decades, during which there has been a major evolution in demand.
But the variations in voltage can be more than insignificant. Most people would not notice a slow rate of change in voltage; it takes a 24-hour, 7-day log of voltage to get the full picture.
True, but I'm still very surprised that we don't see much wider variation than we do. I don't think I've ever seen my supply voltage above ~248V (certainly never as high as 250V), even in the middle of the night, nor do I recall having seen it appreciably (if at all) below 240V at times of the day when one might expect demand on the network to be particularly high.
The power generation people can tell when the advert break occurs in TV soaps, as thousands of kettles all switch on within seconds of each other. Engineers in a substation can often tell when the ad break occurs, as the transformer's hum gets louder. In fact, the main generating control room knows the TV schedules in advance, so they can have generators spinning up ready to take up the rapid increase in load.
Yes, but, given the limits on permissible supply voltage, there's a definite limit to what they can do. AIUI, they monitor frequency and wind their generating capacity up/down so as to keep that very close to 50Hz, and by so doing will maintain the 'status quo'. And that status quo will presumably include the voltages (in and out) of the HV/LV transformers.

What they cannot do to any significant extent is increase the voltage leaving their LV transformers to compensate for increased VD in the LV network at times of high demand. If they did that, the voltage supplied to some customers close to the transformer would rise above the permitted maximum.

Kind Regards, John
 
The monitoring and adjustments certainly also play a big part in minimising voltage variations caused by large variations in load over time, even though they might not be able to do anything for one specific LV feed without messing up many others in the area. It's actually an even more amazing feat that the frequency is kept as constant as it is, given the variations. I'm not sure if the standard has changed at all, but the last I heard, for the UK it is supposed to be maintained within 1%, i.e. +/-0.5Hz. And it's compensated over a period of 24 hours, so that any reduction of frequency due to load is made up with a corresponding slight increase later, in order to keep synchronous clocks on time - i.e. the average over a 24-hour period will always be precisely 50Hz.
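The 24-hour compensation amounts to cycle counting: a synchronous clock advances one step per mains cycle, so the grid only has to deliver the right total number of cycles per day. Illustrative arithmetic (the 49.95Hz/50.05Hz figures are invented for the example):

```python
# A synchronous mains clock effectively counts cycles, so keeping the
# 24-hour average at exactly 50Hz keeps it on time.
SECONDS_PER_DAY = 24 * 3600
target_cycles = 50 * SECONDS_PER_DAY  # 4,320,000 cycles per day

# An hour running slightly slow at 49.95Hz leaves the grid
# about 180 cycles 'behind' schedule...
deficit = (50.0 - 49.95) * 3600
# ...which an hour at 50.05Hz later in the day makes up, so the daily
# average (and any synchronous clock) comes out right.
surplus = (50.05 - 50.0) * 3600
print(target_cycles, round(deficit), round(surplus))  # 4320000 180 180
```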

It's perhaps not such an issue these days with the nation's viewing spread across hundreds of different channels, but back in the days of just the three networks, apparently there was a lot of preparation in getting ready for extra generating capacity when approaching a commercial break or the end of some show or film which was expecting huge audiences, in anticipation of all those kettles and other appliances being switched on across the country.
 
The monitoring and adjustments certainly also play a big part in minimising voltage variations caused by large variations in load over time, even though they might not be able to do anything for one specific LV feed without messing up many others in the area.
As I wrote in my last post, the rules under which they work would appear to very seriously restrict what they can do in attempts to compensate for substantially increased load in any LV network. They cannot increase the LV voltage coming out of the transformer very much, or else a consumer with little/no load close to the tranny would get a voltage exceeding the permitted maximum of 253V - so one is at the mercy of the VD in the main to maintain the supply voltage above the permitted minimum (216.2V) for high-load consumers at the far end of the main. As eric has demonstrated, with an average L-N loop impedance of 0.35Ω, it takes only fractionally over 100A to drop 253V to 216.2V - and I would have thought that the instantaneous total load of 'a dozen or two' houses would quite often appreciably exceed 100A ('two showers and a kettle'!) ... so, as I've been saying, I really don't understand why those consumers fairly distant from transformers (mine is about 100 yards/metres away) don't see much larger voltage variations than they do.
It's actually an even more amazing feat that the frequency is kept as constant as it is, given the variations. I'm not sure if the standard has changed at all, but the last I heard, for the UK it is supposed to be maintained within 1%, i.e. +/-0.5Hz. And it's compensated over a period of 24 hours, so that any reduction of frequency due to load is made up with a corresponding slight increase later, in order to keep synchronous clocks on time - i.e. the average over a 24-hour period will always be precisely 50Hz.
Indeed. My understanding is that it is frequency (not voltage) that they primarily monitor very closely, adjusting their generating activities to keep it within very tight limits - and also that, as you say, there is a requirement for exactly the correct number of cycles in a 24h period.

Kind Regards, John
 
As eric has demonstrated, with an average L-N loop impedance of 0.35Ω, it takes only fractionally over 100A to drop 253V to 216.2V - and I would have thought that the instantaneous total load of 'a dozen or two' houses would quite often appreciably exceed 100A
A transformer supplying a decent number of properties will not have an L-N impedance of 0.35Ω; it will be significantly smaller.
 
A transformer supplying a decent number of properties will not have an L-N impedance of 0.35, it will be significantly smaller.
It would have to be, if the figures were to make much sense and correspond with our experiences. Having said that, the measured L-N loop impedance in my installation (near the end of the LV run) is about 0.41Ω (on each of the three phases). However, as has been discussed, there are at least a couple of factors which make things more complicated, and make the VD situation 'not as bad' as it might otherwise be:
  • The load, and hence the components of overall VD, are distributed along the length of the LV feed. Even a high load connected close to the tranny will result in relatively little VD (for the installation concerned or anyone else).
  • Given fairly balanced loading, the current, hence VD, in the neutral of the 3-phase feed should be pretty low - so the actual VDs experienced could approach half of the figures obtained by simplistic ('single phase') calculations.
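The first bullet can be put into numbers: with the same total load spread evenly along the main, the far-end drop works out at roughly half the 'everything at the far end' figure. A sketch using assumed round figures (10 consumers, 100A total, 0.35Ω total loop resistance):

```python
# Voltage drop at the far end of a feeder: all load lumped at the end,
# versus the same load spread evenly along the main. Assumed figures,
# single-phase and purely resistive for simplicity.
R_TOTAL = 0.35    # total loop resistance of the main, ohms (assumed)
I_TOTAL = 100.0   # total load current, amps (assumed)
N = 10            # consumers evenly spaced along the main

# All load at the far end: full current through the whole resistance
vd_lumped = I_TOTAL * R_TOTAL

# Load spread evenly: section k only carries current for consumers beyond it
r_section = R_TOTAL / N
i_per_consumer = I_TOTAL / N
vd_distributed = sum((N - k) * i_per_consumer * r_section for k in range(N))

# Distributed drop is (N+1)/2N of the lumped figure - about half for large N
print(vd_lumped, vd_distributed)
```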

Kind Regards, John
 
OK - not been following too closely, so probably talking an irrelevance. I was never able to reconcile the measured supply resistance with the obvious requirement to supply everyone in the street. But this is probably related to the fact that there is only a marginally adequate cable between the street and the CU. This must be where most of the measured supply resistance comes from.
 
OK - not been following too closely, so probably talking an irrelevance. I was never able to reconcile the measured supply resistance with the obvious requirement to supply everyone in the street.
Indeed. That same difficulty of reconciliation is essentially what I have been talking about.
But this is probably related to the fact that there is a merely sufficiently thick cable between the street and the CU. This must be where most of the measured supply resistance comes from.
Eric suggested that, but I'm not sure how relevant it usually is. Certainly in urban scenarios, the runs of cable (which, as you say, may be only marginally adequate) from houses to the main are generally very short, so I doubt that they are responsible for a substantial proportion of the total observed VD. What we need to know is the typical CSA of the actual main - then we could do some sums!

In any event, there are situations which would negate that argument, at least on a local basis. One of my immediate neighbours derives their supply from the supply side of the DNO fuse on one of my phases in my house. Both they and we have 80A service fuses. My neighbour therefore shares all of my path back to the tranny (plus a bit more cable) - so, no matter which cables are responsible for the VD, the VDs due to the two installations will be additive.

Kind Regards, John.
 
I haven't had a chance for more than a cursory glance as yet, but this network design manual from one DNO appears to have some interesting information:
http://www.eon-uk.com/downloads/network_design_manual.pdf
The introductory notes plus the section on LV distribution and voltage regulation might prove interesting.
Thanks - there's a lot of reading there. Watch this space (probably for a while)!

Kind Regards, John
 
I haven't had a chance for more than a cursory glance as yet, but this network design manual from one DNO appears to have some interesting information:
http://www.eon-uk.com/downloads/network_design_manual.pdf The introductory notes plus the section on LV distribution and voltage regulation might prove interesting.
Thanks - there's a lot of reading there. Watch this space (probably for a while)!
Right ... for starters .... The formulae for calculating VD in sections of an LV main (section 6.1.2 of document) are pretty confusing ....

Goodness knows what ‘L’ is. I suspect that it’s probably the length of the section, with Rp being the Phase Resistance per metre (not just “Phase Resistance”, as defined).

More confusing is that, whilst they define Rn as being Neutral Resistance, it does not appear in any of their expressions! However, their calculations of VD boil down, as expected, to “resistance times current times various correction factors”, and one of those correction factors (F1, “Unbalance Correction Factor”) becomes 5.14 for a single consumer/service (i.e. no ‘looping’) situation. Hence the VD calculated by multiplying current by just the phase resistance then gets multiplied by 5.14! The other correction factors (Diversity Correction Factors) reflect the fact that one can rely increasingly less on the concept of diversity as the number of connected consumers decreases - the calculated VD increases quite rapidly if the number of connected consumers (I presume across all 3 phases) falls below about 12. I presume that the ‘a/3’ term reflects the fact that they are talking about the usual situation in which the loads of the total number of connected consumers are spread across the three phases.
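For what it's worth, my reading of their calculation can be sketched as below. This is NOT the document's exact formula (as noted, its definitions are unclear); every name and figure here is an illustrative assumption.

```python
# A hedged sketch of the VD calculation as I read it: resistance times
# current times correction factors. All names and values (rp_per_m,
# length_m, the 40A figure) are illustrative assumptions, not the DNO's.
def section_vd(current_a, rp_per_m, length_m, f1=1.0, f_div=1.0):
    """Voltage drop for one section of main, with correction factors applied."""
    return current_a * rp_per_m * length_m * f1 * f_div

# Single consumer/service case: the unbalance factor F1 becomes 5.14,
# so the simple 'current x phase resistance' figure gets multiplied by 5.14
vd = section_vd(current_a=40, rp_per_m=0.0006, length_m=30, f1=5.14)
print(f"{vd:.2f} V")
```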

Whatever, these calculations are obviously crucially dependent on the statistical concept of diversity. As we’ve discussed before, the ‘After Diversity Maximum Demands’ (ADMDs) they work with are extremely low - just 2.0 kW for the average non-electrically-heated house (per 1.2.4.1 of the document) - and not really a ‘maximum’ in the normal sense; rather they are ‘peak expected averages’, averaged over both time and consumers. Such ‘averaging’ would be expected to work reasonably well for most of the time, but one would equally expect that there would occasionally be times when the total instantaneous demand (hence VD) rose to considerably more than those ADMD figures - such is the nature of statistics! For example, the 24 kW total ‘ADMD’ of 12 connected properties could well be exceeded, for at least a short time, if one or two of those consumers switched on 10.5 kW showers at roughly the same time. It would still probably average out at 2 kW/property over a ‘reasonable period of time’, but for 5 or 15 minutes the demand (hence VD) could be something like double that average - and it’s that sort of voltage fluctuation that I personally have not noticed.
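The 'two showers' scenario in numbers (the shower rating and ADMD figures are as discussed above; the rest follows from them):

```python
# How far a brief peak can exceed the ADMD-based design figure.
ADMD_PER_HOUSE = 2.0e3   # W, per 1.2.4.1 of the document
N_HOUSES = 12

expected_demand = N_HOUSES * ADMD_PER_HOUSE  # the 24 kW 'ADMD' total

# Two 10.5kW showers on top of the remaining houses' average demand:
peak_demand = 2 * 10.5e3 + (N_HOUSES - 2) * ADMD_PER_HOUSE  # 41 kW

# For a few minutes, demand (hence VD) is ~1.7x the design figure
print(peak_demand / expected_demand)
```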

It is also interesting to read in 5.4.1 of the document that they work to a recommendation that the ‘step voltage change’ in response to switching on a single-phase load of 7.2 kW should not exceed 3%. Although they describe 7.2kW as “i.e. an electric shower”, this implies that they would presumably regard a voltage drop (in their main) of about 4.4% (aka ~10.1V) when someone switched on a 10.5 kW shower as ‘acceptable’. Those are the sort of sudden drops in voltage that I’ve personally never observed or been aware of.
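Scaling the 5.4.1 figure linearly with load, as above:

```python
# The 'step voltage change' recommendation scaled linearly with load:
# 7.2kW -> 3%, so a 10.5kW shower -> ~4.4%. Assumes drop scales with power.
V_NOM = 230.0
step_7k2 = 0.03                       # recommended limit for a 7.2kW load
step_shower = step_7k2 * 10.5 / 7.2   # implied figure for a 10.5kW shower
print(f"{step_shower:.1%}, {step_shower * V_NOM:.1f} V")  # 4.4%, 10.1 V
```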

... but I’m still reading!

Kind Regards, John
 
