Not really a DIY question - Capacitive Loads Query

Hi,

I was hoping someone on here may be able to help me. My office is on a drive to be green and energy efficient. We are looking at the impact of unplugging laptop chargers vs keeping them plugged in. From internet research we have found that our modern laptop chargers go into a standby mode when they are either unplugged from the laptop or the laptop battery is fully charged. The manufacturers state that the power draw should be lower than 0.5 W. Now whilst this is still a power saving, it isn't really that big, even when you multiply it by the number of desks on site.

One of our team then came on site with a Fluke multimeter and found that the power supplies were actually drawing ~35 W despite being unplugged from the laptop. His view was that the manufacturers' tests do not state at what phase they were measuring, and that the difference is a result of capacitive load being different to inductive load.

I am no expert and, despite researching the above, am still unclear whether we are saving the world in a large way or only a small way by unplugging the devices. I know that the supplies are completely cold to the touch, which in my basic way makes me think that the real energy they are using is fairly low.

Could someone enlighten me? Does a high capacitive load still use up a lot of energy when the inductive load is so small?

Thanks for your help, I am really looking forward to being educated :)
 
There is no way your colleague took that measurement correctly.

Okay, the reason I am asking is that it sounds fishy, but what would your reasoning be? Ignore the 0.5 W stated by the manufacturer for a moment. Could a modern power supply that is cold to the touch really be giving a 35 W reading?
 
Here are the readings he got: [readings attached as an image; not shown here]
This is what he said:

As I have learnt from my Pioneer plasma TV at home, the value quoted in watts often does not state at what phase they are measuring it!
e.g. capacitive load versus inductive load, and as we have more high-current inductive load devices in our houses there is some trade-off in those.

Because I thought my TV was drawing a lot of power on standby, I measured it and contacted Pioneer directly to confirm what I was seeing. Measuring the same panel, they agreed that the power usage was higher than quoted, but said it was a capacitive load, so to the electricity meter in your house it would appear lower.
 
Roughly speaking we can talk about three types of "power".

Real power is where the current is in phase with the voltage. It represents a net transfer of power.
Reactive power is where the current is 90 degrees out of phase with the voltage, as would be seen with a capacitor or inductor. It represents no net transfer of power. Capacitive and inductive reactive loads cancel each other out; conventionally, capacitors are said to generate reactive power while inductors are said to consume it (traditionally, inductive loads were far more common than capacitive ones, and capacitors were often deliberately added to balance them).
Harmonic power is where the current is at a harmonic of the voltage frequency. It is generated by nonlinear loads (which pretty much means rectifiers). Assuming the voltage waveform itself has no harmonics, it again represents no net transfer of power.

These types combine in a nonlinear way to produce what we call "apparent power": the RMS voltage times the RMS current.
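To put rough numbers on that, here's a minimal sketch (all waveform amplitudes invented for illustration) of how a small in-phase current plus larger reactive and harmonic components produce a low power factor:

```python
import numpy as np

# One 50 Hz mains cycle, finely sampled (illustrative values throughout).
f = 50.0
t = np.linspace(0, 1 / f, 10_000, endpoint=False)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f * t)   # 230 V RMS mains

# A made-up PSU-like current: a small in-phase component (real power),
# a larger 90-degree "capacitive" component, and a 3rd harmonic.
i = (0.002 * np.sin(2 * np.pi * f * t)             # in phase with V
     + 0.014 * np.cos(2 * np.pi * f * t)           # 90 deg out of phase
     + 0.008 * np.sin(2 * np.pi * 3 * f * t))      # 3rd harmonic

real_power = np.mean(v * i)                                           # watts
apparent_power = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))  # VA
print(f"Real: {real_power:.2f} W, apparent: {apparent_power:.2f} VA, "
      f"PF: {real_power / apparent_power:.2f}")
# Prints roughly: Real: 0.33 W, apparent: 2.64 VA, PF: 0.12
```

Only the in-phase component contributes to the real power; the other two just inflate the RMS current, and hence the apparent power, which is all a plain multimeter can see.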

With a multimeter you can only measure apparent power, because you can only measure voltage and current separately. To get the full picture you really need an oscilloscope with mains-rated differential probes. Unfortunately, those don't come cheap.

Since your guy was almost certainly measuring apparent power, it's possible the discrepancy could be explained by the manufacturer specifying real power, but frankly, for an unloaded laptop PSU, 35 VA of apparent power still seems very high.

It's also possible that the meter is not capable of accurately measuring currents at frequencies other than 50 Hz. I'd be especially suspicious of this if the current measurement was made with a clamp meter.
 
Simple - he's measured the current and made the incorrect assumption that the power factor is unity (which it won't be).
Buy a plug in energy meter that includes power factor and real power - it'll tell you that the power factor is "poor" and the actual power drawn is "quite low".
 
Buy a plug in energy meter that includes power factor and real power - it'll tell you that the power factor is "poor" and the actual power drawn is "quite low".
Whilst I'm inclined to agree in concept, I suspect that one might struggle to find such a meter which was even half-accurate at the sort of low currents we're talking about.

Kind Regards, John
 
Now you could find a sufficiently large thermos flask and a waterproof sealed bag, immerse the PSU in water in the flask, and measure the temperature rise over time to discover the approximate energy loss...
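A rough sketch of the expected numbers, assuming 1 litre of water and taking the claimed 35 W at face value:

```python
# If the PSU really dissipated 35 W into 1 litre of water for an hour,
# the temperature rise would be hard to miss (Q = m * c * dT).
power_w = 35.0       # claimed dissipation
mass_kg = 1.0        # 1 litre of water (assumed)
c_water = 4186.0     # specific heat of water, J/(kg*K)
time_s = 3600.0      # one hour

delta_t = power_w * time_s / (mass_kg * c_water)
print(f"Temperature rise after an hour: {delta_t:.0f} K")  # ~30 K
```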
 
If the adapter were actually dissipating 35 W or more with no load, it would be very noticeably warm. Simple enough to test.
 
0.5 W, 24/7, is about 50p a year. Multiply that by the number of chargers. Not much, but if it makes them feel good, unplug them.
As a bonus, some cheap chargers can spontaneously catch fire or explode - but only if they are plugged in.
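For what it's worth, the arithmetic behind that 50p (the unit price is an assumption, ~12p/kWh):

```python
# Back-of-envelope cost of a 0.5 W standby draw running all year.
standby_w = 0.5
hours_per_year = 24 * 365       # 8760 h
price_per_kwh = 0.12            # GBP per kWh, assumed

kwh = standby_w * hours_per_year / 1000    # 4.38 kWh
print(f"{kwh:.2f} kWh/year, about {kwh * price_per_kwh * 100:.0f}p")
```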
 
Whilst I'm inclined to agree in concept, I suspect that one might struggle to find such a meter which was even half-accurate at the sort of low currents we're talking about.
True, but it would have to be spectacularly poor not to show the difference between (say) 35 VA of apparent power and (say) 0.5 W of real power with a "poor" power factor.

0.5 W, 24/7, is about 50p a year. Multiply that by the number of chargers. Not much, but if it makes them feel good, unplug them.
And how many man-minutes does that 50p buy you? And what's that per day, assuming about 250 working days a year?
 
And how many man-minutes does that 50p buy you? And what's that per day, assuming about 250 working days a year?

Not really relevant to my point, which was:

If it makes them feel good, unplug them.
 
Simple - he's measured the current and made the incorrect assumption that the power factor is unity (which it won't be).
Buy a plug in energy meter that includes power factor and real power - it'll tell you that the power factor is "poor" and the actual power drawn is "quite low".

Does anyone have examples of such devices? Looking on eBay and Amazon, it was difficult to tell whether the devices could measure power factor and real power.

Great responses from everyone. Thanks so much for your help and insight.

Jon
 
Intrigued by this and having the test equipment to hand I did a quick test on my laptop power supply.

Testing a Lenovo 90 W 20 V AC adaptor with a Voltech PM100 I got:

Off load: 230.6 V, 16.4 mA, 500 mW, 3.8 VA, PF 0.13
On load (battery fully charged): 230.1 V, 160 mA, 13.8 W, 36 VA, PF 0.37

The on-load readings were difficult to take as they fluctuated quite a lot - probably due to the battery management system.

As you can see, the PF is far from 1, especially off load. The ~35 W off-load reading taken by the OP is wildly high. Something is wrong there, I think.
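As a quick sanity check on those readings (power factor is just real power over apparent power, PF = P / (V × I)):

```python
# Verify the quoted power factors from the raw V, I and W figures above.
readings = {
    "off load": (230.6, 0.0164, 0.5),   # volts, amps, watts
    "on load":  (230.1, 0.160, 13.8),
}
for name, (volts, amps, watts) in readings.items():
    apparent_va = volts * amps
    print(f"{name}: {apparent_va:.1f} VA, PF = {watts / apparent_va:.2f}")
# off load: 3.8 VA, PF = 0.13 ; on load: 36.8 VA, PF = 0.37
```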
 
