# Calculating power

It isn't always as simple as adjusting the current - modern dimmers usually regulate the power by clipping either the leading or falling edge of the voltage supply sine wave.
Lamps are also a bit more technical in that they have a positive temperature coefficient, so the current drawn isn't linearly related to the supply voltage.

It isn't always as simple as adjusting the current - modern dimmers usually regulate the power by clipping either the leading or falling edge of the voltage supply sine wave.
i.e. they vary the voltage, and therefore the current flowing in the load.

Lamps are also a bit more technical in that they have a positive temperature coefficient
Most normally conductive materials do.

so the current drawn isn't linearly related to the supply voltage.
No, but it is still proportional to it - reduce the voltage and you reduce the current.

It isn't always as simple as adjusting the current - modern dimmers usually regulate the power by clipping either the leading or falling edge of the voltage supply sine wave.
Lamps are also a bit more technical in that they have a positive temperature coefficient, so the current drawn isn't linearly related to the supply voltage.

Yes, of course modern dimmers tend to switch rather than provide a constant supply - otherwise we would not be able to fit a pair of 1 kW dimmers in a standard switch box.

The thing that makes the bulb glow is current, and to make it glow brighter the current needs to be increased. The shape of the current waveform is almost irrelevant; it's the amount of area under the curve that does the work, averaged out by the speed at which the lamp brightness can vary. We don't tend to refer to average voltage or average current, but it's pretty much comparable to (although not necessarily the same as) RMS.
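The "area under the curve" point can be made concrete with a quick numerical sketch (Python, purely illustrative; the variable names are mine): for a roughly 325 V peak mains sine wave, the average of the rectified waveform and the RMS value come out differently, and it is the RMS value that relates directly to power in a resistive load.

```python
import math

V_PEAK = 325.0            # approximate peak of a 230 V RMS mains sine wave
N = 100_000               # samples across one half cycle

# Sample one half cycle of the sine wave.
samples = [V_PEAK * math.sin(math.pi * i / N) for i in range(N)]

# "Area under the curve" per half cycle -> average (rectified) voltage.
v_avg = sum(samples) / N                            # = 2*Vpeak/pi, about 207 V

# RMS: square, mean, root - this is what relates to power in a resistor.
v_rms = math.sqrt(sum(v * v for v in samples) / N)  # = Vpeak/sqrt(2), about 230 V

print(f"average |v| = {v_avg:.1f} V, RMS = {v_rms:.1f} V")
```

So "average" and RMS are indeed comparable but not the same: the average of the rectified sine is about 207 V, while the RMS is about 230 V.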

The temperature coefficient is also bordering on irrelevant in this situation, as it's an insignificant variation compared with the light output and the eye's perception. Please don't all jump on this point - I know it's calculable and measurable and creates all sorts of inrush problems, especially with bigger lamps.

Regardless of the temperature coefficient, the current is still directly related to the resistance and voltage at the point of measurement, and if the voltage is increased the current will also rise, even though the resistance also rises.

Hi BAS

Your post wasn't there when I started writing mine. It's good to see we are saying the same thing in different ways.

the shape of the current is almost irrelevant

Of course it is not. Don't be ridiculous.

The current needs to have the same cross section as the conductor down which it travels. That is why when a cable gets trapped and misshapen, the current flow stops.

Thanks SS

It isn't always as simple as adjusting the current - modern dimmers usually regulate the power by clipping either the leading or falling edge of the voltage supply sine wave.
i.e. they vary the voltage, and therefore the current flowing in the load.
Not quite - they vary the time for which the voltage is applied across the load, and hence the proportion of each cycle during which power is delivered.

If anyone is interested I can send a bit of a write up on dimmers (specifically betapack 1) which I wrote as part of a project.

oops - I thought I had edited - must have pressed the wrong button

The thing that makes the bulb glow is current, and to make it glow brighter the current needs to be increased. The shape of the current waveform is almost irrelevant; it's the amount of area under the curve that does the work, averaged out by the speed at which the lamp brightness can vary. We don't tend to refer to average voltage or average current, but it's pretty much comparable to (although not necessarily the same as) RMS.

Sort of - the time for which voltage is applied across the load, and current flows, before switching off is what regulates the amount of power to the load - as you say, it is the area under the curve which is important.
RMS of anything other than a sine wave makes my head hurt!

I wasn't aware that a stable output voltage from a varying input voltage was a requirement of a switch mode PSU.
Most electronic 'transformers' used for ELV lighting are more akin to SMPSs, capable of delivering a stable output over a range of input voltages, so the simple model of a fixed resistance load goes out the window.

So, just because you quote a singular source of opinion on a DIY forum, that becomes the de facto definition of a switch mode power supply? It may well be the case with many SMPSUs, but it is not a requirement in order for a power supply to be classified as switch mode. As the name suggests, it's called Switch Mode because of the high frequency switching method employed to regulate the output voltage.

Anyway, a modern dimmer doesn't vary the voltage to a load (be it a lamp or transformer),
Of course it does - if it didn't then a directly connected incandescent lamp would not dim.

I think you'll find it would, and it works because you're varying the amount of time the filament is powered ON for vs the amount of time it's OFF for. It works because the switching happens many times a second (100 times on a 50 Hz supply, once every half cycle), such that it cannot be detected by the human eye. The fact that the filament does not have enough time to cool down between cycles also hides the way the dimmer is really working.

It could be, but is it?

Do mass-market consumer dimmers use PWM, or just a simple thyristor circuit which fires at a given point in the cycle?

A thyristor firing at a certain point in the cycle is a form of PWM. The timing of the firing vs the zero crossing point is varied to give different levels of dimming.

Also worth noting that PWM dimming will also work on a DC supply (although of course it will not have to be in sync with any zero crossing point). Having a sine wave of varying voltage is not a requirement in the dimming process, as the on/off period of the lamp is what's important, not variation of current in the load.
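The DC case is the easiest to quantify: with an ideal switch into a resistive load, the average power is just the duty cycle times the full-on power, regardless of switching frequency. A minimal sketch (illustrative only; the function name and values are my own):

```python
def pwm_average_power(v_supply: float, r_load: float, duty: float) -> float:
    """Average power into a resistive load PWM-switched on a DC supply.

    duty is the fraction of each period the switch is ON (0..1); only the
    on/off ratio matters, not the switching frequency or waveform shape.
    """
    return duty * v_supply ** 2 / r_load

# 12 V supply, 6 ohm lamp, 25% duty cycle -> a quarter of the full 24 W.
print(pwm_average_power(12.0, 6.0, 0.25))   # 6.0
```

This is the sense in which the on/off period is what matters: halve the duty cycle and you halve the delivered power, with the supply voltage untouched.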

In fact, thyristor/SCR/triac dimmers have to be in sync with zero crossing on an AC supply because the control circuit is only able to turn the thyristor ON, after that it is latched, and will only stop conducting once the zero crossing point is reached and no current is passing through the thyristor.
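For an ideal leading-edge dimmer of this kind, firing at angle alpha into each half cycle, the output RMS voltage actually has a closed form. A small sketch (my own notation, assuming an ideal 230 V sine supply and a lossless switch):

```python
import math

def phase_cut_rms(v_rms_full: float, alpha: float) -> float:
    """RMS output of an ideal leading-edge dimmer firing at angle
    alpha (radians, 0..pi) into each half cycle of a sine supply."""
    x = 1 - alpha / math.pi + math.sin(2 * alpha) / (2 * math.pi)
    return v_rms_full * math.sqrt(max(x, 0.0))  # guard tiny negative rounding

# Firing 90 degrees into each half cycle removes half the energy:
print(round(phase_cut_rms(230.0, math.pi / 2), 1))   # about 162.6 V
```

At alpha = 0 the full 230 V comes through, and at alpha = pi the output is zero; in between the RMS falls smoothly, which is exactly how varying the firing point dims the lamp.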

Some more modern designs employ IGBTs and the like, which can be turned off at any point in the cycle. These designs are preferred as they allow the load current to gradually ramp up as the supply voltage in each half cycle rises, avoiding the RFI (and the requirement for large chokes) generally associated with switching large loads on mid-cycle.

so a SMPSU would be able to look at the mark/space ratio of the incoming PWM modulated AC mains, and approximate this to the required output on the ELV side for the lamp.
It could, but does it?

Do mass-market consumer ELV supplies do that?

Good question. While I have a fair amount of experience with dimmers, having built computer controlled dimming units in the past, I confess to not knowing huge amounts about the workings of mass-market dimmable SMPSUs as used to supply ELV lighting.

Equally, in contradiction to my last post, it could work merely based on the fact that the incoming AC is rectified to DC inside the SMPSU before being chopped, which would produce a varying DC voltage based on the duty cycle of the incoming PWM modulated 230 V from the dimmer.

If you have any info yourself on the exact workings of a dimmable switching transformer then I'd be interested to know more.

I suppose you could argue that if you approximate the output of a PWM dimmer to an RMS voltage, you could say that the output voltage is being varied. However, as the dimmer can only turn its output on and off at varying speeds and durations, the peak voltage at the output will still remain the same, and the dimming is achieved by only passing current through the load for a set portion of each half cycle.
In other words the voltage is reduced.

The only voltage we ever talk about is the RMS one. The value comes from integrating the square of the waveform - if parts of it are chopped off, the area under the curve is reduced, i.e. the RMS voltage is reduced, even though the peak stays the same.

Agreed, but let's keep in context - strictly speaking, the dimmer still isn't reducing the voltage in the same way as a variac. With a scope, you'd see that the variac is reducing the overall amplitude of the sine wave, whereas the PWM dimmer is cutting away parts of the wave.


I think you'll find it would, and it works because you're varying the amount of time the filament is powered ON for, vs the amount of time it's OFF for.
If you do that then you are varying the voltage.

Having a sine wave of varying voltage is not a requirement in the dimming process,
Indeed not - and you do end up with a non-sine wave, which makes RMS calculations a lot harder, but if you do them, or simply superimpose your chopped output waveform over the sine wave input, you'll see that the output voltage is lower.

as the on/off period of the lamp is what's important, not variation of current in the load.
If it's on for less time and off for more then the current is reduced.

Good question. While I have a fair amount of experience with dimmers, having built computer controlled dimming units in the past, I confess to not knowing huge amounts about the workings of mass-market dimmable SMPSUs as used to supply ELV lighting.
It's pretty much guaranteed that they are made cheaper and nastier than a cheap and nasty thing which is made very cheaply and nastily...

Agreed, but let's keep in context - strictly speaking, the dimmer still isn't reducing the voltage in the same way as a variac.
Agreed - it's reducing it in a different way to a variac.

With a scope, you'd see that the variac is reducing the overall amplitude of the sine wave, whereas the PWM dimmer is cutting away parts of the wave.
With a scope, you'd see that the variac is reducing the voltage by reducing the peak voltage of the sine wave, whereas the PWM dimmer is reducing the voltage by cutting away parts of the wave.

If you had a square wave of 325V and cut bits off the wave to make it a sine wave, you'd say its voltage was 230V, the RMS value and not 325V, the peak value.

It follows that a 230V RMS sine wave with bits chopped out of it should be described by its RMS value, which would be lower than 230V.

So when BAS describes a dimmer that chops down the sine wave as reducing the voltage, his description is apt.

If you had a square wave of 325V and cut bits off the wave to make it a sine wave, you'd say its voltage was 230V, the RMS value and not 325V, the peak value.
Not always - we use peak values on instrumentation.
Like I said above, RMS for anything other than a sine wave makes my head hurt!
Start clipping part of the sine wave out and the resulting waveform is no longer a pure sine wave, so working out the RMS analytically is difficult; working out the area under the curve is a bit easier.
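That "area under the curve" approach works well numerically: sample the chopped waveform, accumulate the squares, and take the root of the mean. A throwaway Python sketch (my own numbers, assuming an ideal leading-edge cut 60 degrees into each half cycle):

```python
import math

V_PEAK = 325.0           # roughly a 230 V RMS sine wave
ALPHA = math.pi / 3      # dimmer fires 60 degrees into each half cycle
N = 200_000              # samples across one half cycle

total = 0.0
for i in range(N):
    theta = math.pi * i / N
    # Leading edge clipped: output is zero until the firing angle.
    v = V_PEAK * math.sin(theta) if theta >= ALPHA else 0.0
    total += v * v       # accumulate the area under the v^2 curve

v_rms = math.sqrt(total / N)
print(f"{v_rms:.1f} V")  # below 230 V, even though the peak is still 325 V
```

No closed-form algebra needed: the loop is just the integral done by brute force, and it shows the RMS dropping while the peak stays put.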
