Vphase VX2

As pointed out in the video, when this snake oil device is reducing the voltage to the appliance, the meter is still using the un-reduced voltage in its calculation of the power being used. Hence for some loads the meter will record a higher consumption than the load actually takes.
I noted that in the video, but it was actually drawn wrong: it showed the device as a choke rather than a transformer. Whether buck or auto, they really do the same thing. I have used many autotransformers and they work exactly like an isolation transformer but without the isolation. So with an output of 220 volts at 8 amps, the input will be about 240 volts at 7.4 amps. The transformer clearly gets warm, so it must use some power; the output may be 1760 watts but the input is likely around 1780 watts, as there will be some losses.
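
To put rough numbers on that, here is a minimal Python sketch using the figures above; the 20 watt loss figure is my assumption for illustration, not a measurement:

# Autotransformer power-balance sketch using the figures quoted above.
# The 20 W loss figure is an assumption for illustration, not a measurement.
V_IN = 240.0    # supply voltage (V)
V_OUT = 220.0   # reduced output voltage (V)
I_OUT = 8.0     # load current at the output (A)
LOSSES = 20.0   # assumed transformer losses (W)

p_out = V_OUT * I_OUT        # power delivered to the load: 1760 W
p_in = p_out + LOSSES        # power drawn from the supply: 1780 W
i_in = p_in / V_IN           # supply-side current: about 7.4 A

print(f"Load:   {p_out:.0f} W at {I_OUT:.1f} A")
print(f"Supply: {p_in:.0f} W at {i_in:.2f} A")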

I was working on the Sizewell power station build and we had a maze of tunnels to put temporary lighting in. My boss decided it would be better to use a festoon of fluorescent lamps, so we had a load of 110 volt fluorescent fittings which were really standard 240 volt ones with an autotransformer to step the voltage up. The transformer was marked 110 - 0 - 127 volt and the supply was connected to the 110 volt tapping. Quick calculation: 60W at 110 volt is roughly two lamps per amp, so on a 16A supply, with a safety margin, 25 lamps per festoon.

Within a couple of hours I was called to failed lamps. After the third call I took a clamp-on ammeter and found they were drawing 25A, so no wonder the 16A trip was opening. At first we swapped them all to the 127 volt tapping instead, but the end lamps were not starting, so we returned the last 5 to the 110 volt tapping. The measured current was down to 16A or just below, so that extra 17 volts actually reduced the current drawn by about 9A. Afterwards I took a lamp and experimented with it and realised that removing the PF correction capacitor also made a huge difference: with the input on the 110 volt tapping and the capacitor removed it was drawing over an amp, nearly double its rating.
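
For what it's worth, the back-of-envelope arithmetic I was doing at the time looks something like this (a Python sketch only; the 25A figure is the clamp-meter reading above):

# Festoon loading sketch based on the figures in the story above.
LAMP_WATTS = 60.0      # nominal lamp rating
SUPPLY_VOLTS = 110.0   # festoon supply voltage
TRIP_RATING = 16.0     # circuit breaker rating (A)
LAMPS_FITTED = 25

# Naive estimate: 60 W at 110 V is roughly half an amp per lamp,
# hence the "two lamps per amp" rule of thumb.
i_per_lamp_est = LAMP_WATTS / SUPPLY_VOLTS
print(f"Estimated: {LAMPS_FITTED * i_per_lamp_est:.1f} A for {LAMPS_FITTED} lamps")

# Measured: the festoon actually drew 25 A on the 110 V tapping, i.e. about
# 1 A per lamp once ballast losses and poor power factor are included.
i_per_lamp_meas = 25.0 / LAMPS_FITTED
print(f"Measured:  {LAMPS_FITTED * i_per_lamp_meas:.1f} A "
      f"against a {TRIP_RATING:.0f} A trip")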

So there is no question that with standard ballasts, controlling the voltage and correcting the power factor if required, a unit like the Vphase can save a lot of money. It is NOT snake oil; they CAN work. But for them to work they need to be supplying items like fluorescent fittings, and good modern LED lighting with PWM drivers built in will not let the unit save money. However, how many LED lamps actually have a PWM driver, and how many use a simple capacitor as a dropper? If they use a capacitor, what will the power factor be? In fact, how does the transformer in the Vphase alter the power factor, and is the power factor corrected within the unit?

Don't get me wrong, I would never fit a Vphase unit; however, I have fitted power factor correction units and they did reduce the bill. They were properly tested and they did show a saving, but the 5% was for a house with no electric heating of any type (and where there was, it was corrected for), so we are looking at houses with rather low electricity bills to start with. So at £1 per month it would take 9 years to pay for itself.
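
For context, that payback works out roughly like this (the installed cost is only implied by the figures above, not stated):

# Implied payback arithmetic from the figures above: roughly £1 per month
# saved and a nine-year payback imply an installed cost of about £108.
MONTHLY_SAVING = 1.0   # £ per month, from the measured saving
PAYBACK_YEARS = 9

implied_cost = MONTHLY_SAVING * 12 * PAYBACK_YEARS
print(f"Implied installed cost: about £{implied_cost:.0f}")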
 
No bernard, although the meter is recording the higher voltage, it is recording a lower current.

Anyone who wants to know more about the circuits should google for "voltage sag compensation". This is the exact opposite problem - boosting the voltage when it is temporarily below spec - but the technique used to fix it, a PWM autotransformer, is essentially the same.
 
[Attached image: upload_2016-3-7_1-54-20.png]
Now that makes a lot of sense. Boosting the voltage by adding to it with an inverter would clearly work. But although you could reduce the voltage in the same way, you would be using energy to do it, so that does not make sense. The line diagram looks simple, but it would be quite a complex bit of kit. Since it is mainly the fluorescent lights which use the extra power when the voltage is high, it would have made more sense to connect it only to the lighting circuit. Inverters are so cheap today that it would likely be safer and easier to combine it with a UPS, so 220 volts is always supplied to the lights even during a power cut. However, as already said, events have overtaken the device: HF ballast units can also work with a built-in UPS, and as well as precisely controlling the current they increase tube life, increase tube output, and remove the stroboscopic effect. So today the device is no longer going to help compared with other devices now on the market.
 
No bernard, although the meter is recording the higher voltage, it is recording a lower current.

[Attached image: V_snake_1.jpg]


As there is no third connection to the voltage adjuster, Im (the meter current) will be the same as Il (the load current).

If the voltage adjuster reduces the voltage, then the power dissipated in the load is less than the power recorded by the meter. A resistive load will take less current due to the reduced voltage, but the meter will still be recording a higher power. If the load is a switched-mode supply, it is likely to pull more current at the lower voltage in order to achieve its required power output.

The voltage adjuster could be a resistance in which case the meter will record the sum of the power to the load and the power converted to heat in the resistor.

Even if the voltage adjuster is a winding on a transformer, the load current into that winding is still the same as the current leaving it.
 
This is what the video showed, but I don't think the coil is wired quite that way. I believe it is either a bucking or autotransformer arrangement, so it will reduce the current. As an autotransformer it will reduce the current seen at the meter. What the device seems to assume is that reducing the voltage will not affect operation, and so it will save money.

However, as admitted in the trials, this will not work with items using electricity for heating, nor will it work where the items have some form of switch-mode power supply, which today covers 90% of household electrical goods.
 
No bernard, although the meter is recording the higher voltage, it is recording a lower current.
As there is no third connection to the voltage adjuster, Im will be the same as Il ... If the voltage adjuster reduces the voltage, then the power dissipated in the load is less than the power recorded by the meter. A resistive load will take less current due to the reduced voltage, but the meter will still be recording a higher power. ... The voltage adjuster could be a resistance, in which case the meter will record the sum of the power to the load and the power converted to heat in the resistor.
Indeed - IF the 'voltage adjuster' is just a two-terminal passive device such as a resistor. In that case the meter will (correctly) record, and the customer will (correctly) pay for, the entire amount of power used - both that supplied to the load and that 'wasted' in the 'voltage-adjusting' resistor. If the load is passive (e.g. resistive), that total power will be less than it would have been without the 'voltage-adjusting' resistor, but one will still be paying for a substantial amount of wasted (but metered) power usage.

However, if the 'voltage-adjustment' is achieved by a transformer (or an electronic equivalent), then, other than for losses in the voltage-adjustment process, there would be no wasted power and, in the case of a passive load, the current metered (hence energy paid for) would be reduced. As you say, if the load is a 'self-adjusting' one such as an SMPSU, then 'voltage adjustment' ought not to appreciably affect what is metered and paid for.
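
Putting rough numbers to that distinction (a Python sketch only; the load resistance is just picked so the load draws 8A at 220V, the transformer is treated as ideal, and the 240V/220V figures are the ones being discussed):

# Contrast of the two 'voltage adjusters' described above, for a purely
# resistive load. Component values are illustrative; the transformer is ideal.
V_SUPPLY = 240.0
V_LOAD = 220.0
R_LOAD = 27.5          # ohms, chosen so the load draws 8 A at 220 V (assumption)

# No adjuster: the load sees the full supply voltage.
p_direct = V_SUPPLY**2 / R_LOAD

# Series resistor dropping 20 V: the load power falls, but the meter also
# records the heat wasted in the dropper resistor.
i_series = V_LOAD / R_LOAD
p_load_series = V_LOAD * i_series
p_metered_series = V_SUPPLY * i_series        # load power + resistor loss

# Ideal transformer: the load power falls AND the metered power falls with it,
# because the supply-side current is reduced in proportion.
i_primary = (V_LOAD / V_SUPPLY) * (V_LOAD / R_LOAD)
p_metered_tx = V_SUPPLY * i_primary           # equals the load power

print(f"Direct at 240 V:        metered {p_direct:7.0f} W")
print(f"Series resistor to 220: metered {p_metered_series:7.0f} W "
      f"(load {p_load_series:.0f} W, {p_metered_series - p_load_series:.0f} W wasted)")
print(f"Ideal transformer:      metered {p_metered_tx:7.0f} W (no waste)")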

Kind Regards, John
 
the current metered (hence energy paid for) would be reduced.
But the current would be metered at a voltage higher than that applied to the load, so the power measured at the meter would be higher than that consumed by the load.

Say the current is 1 amp, the incoming voltage is 240 volts, and the reduction brings it down to 220 volts.

The meter records it as 240 VA; the load is consuming 220 VA.

So where is the "missing" 20 VA going? A phase change in the voltage adjuster altering the power factor seen at the meter is the most likely explanation.
 
Now I am not sure exactly what happened, but many years ago we found that every time we welded, the three single-phase meters would run backwards. They were replaced with a single three-phase meter, which stopped this from happening. So when we talk about what is metered it really can't be guessed at; no one expected the meters to reverse when we welded, but it happened.
 
the current metered (hence energy paid for) would be reduced.
But the current would be metered at a voltage higher than that applied to the load, so the power measured at the meter would be higher than that consumed by the load.
Not if the current is 'transformed'
Say the current is 1 amp, the incoming voltage is 240 volts, and the reduction brings it down to 220 volts. ... The meter records it as 240 VA; the load is consuming 220 VA. ... So where is the "missing" 20 VA going?
You're being too vague by saying "current is 1 amp". If the voltage is transformed from 240V to 220V by a transformer (or electronic equivalent), then if the current through the load (at 220V) is 1A (i.e. 220 VA), the current on the primary/input of the transformer/whatever (at 240V) will be about 0.92A (220 / 240 x 1A), hence metered consumption ~0.92A x 240V = 220 VA. There is no "missing 20 VA"!
A phase change in the voltage adjuster altering the power factor seen at the meter is the most likely explanation.
As above, there is no need to contemplate such mechanisms if a transformer (or equivalent) is doing the 'voltage adjustment'.
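
Putting the numbers into a quick sketch (an ideal, loss-free transformer is assumed):

# The '1 A' example above, re-done with an ideal transformer so the
# VA balance is explicit. Losses are ignored for clarity.
V_IN, V_OUT = 240.0, 220.0
I_LOAD = 1.0                       # current in the load at 220 V

va_load = V_OUT * I_LOAD           # 220 VA consumed by the load
i_in = va_load / V_IN              # ~0.92 A drawn from the supply
va_metered = V_IN * i_in           # 220 VA seen by the meter

print(f"Supply current: {i_in:.2f} A")
print(f"Metered: {va_metered:.0f} VA, load: {va_load:.0f} VA "
      f"-> missing: {va_metered - va_load:.0f} VA")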

Kind Regards, John
 
Transformer theory is quite complex, and I will hold up my hands and admit that without pulling out my university books I can't remember how you work out how many turns a transformer needs. However, I would expect that, to ensure as much saving as possible, the power factor is corrected within the device. There are clearly some losses, but the basic theory is 1800 watts in equals 1800 watts out. So at the 8A limit we should be looking at a gain of around 210 watts, and I would allow 6 watts for losses. If this maximum were saved continuously, at 15p per kWh the most it could save would be around £275 per year.
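
A quick sketch of that best case (Python; it assumes the full saving is achieved 24 hours a day all year, which no real household would manage):

# Best-case annual saving sketch using the figures above. Assumes the
# saving is made continuously, 24 hours a day, all year round.
SAVING_W = 210.0   # assumed gain at the 8 A limit
LOSSES_W = 6.0     # allowance for losses in the unit itself
TARIFF = 0.15      # £ per kWh

hours_per_year = 24 * 365
kwh_saved = (SAVING_W - LOSSES_W) * hours_per_year / 1000.0
print(f"Energy saved: {kwh_saved:.0f} kWh per year")
print(f"Money saved:  £{kwh_saved * TARIFF:.0f} per year, absolute best case")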

However, as I have already said, with fluorescent lamps even a 17 volt difference jumped the current from 0.6A to 0.8A at 110 volts. Sorry for using 110, but that was the voltage I was working with when I did the tests. In my house there are just 5 fluorescent lamps: two in the garage, two in the kitchen and one at the top of the stairs. The one at the top of the stairs is HF, so no saving there; of the pair in the kitchen, one has been swapped for LED and the other is a 240 volt fitting which fails to strike with any volt drop; and the garage is hardly used. I tested an electronic CFL and was pleasantly surprised to find that when it says 11W it actually uses 11W; the electronics seem to compensate for volt drop, so they are out too. So in real terms, in my house, to be able to drop the volts the kitchen lamp would need changing first, and if I changed it I would now fit either an HF ballast or a LED tube.

To have any benefit, the house would need to be fitted with 220 volt wire-wound ballast units. But at around £200 for the Vphase and £20 for an HF ballast unit, clearly swapping to HF ballasts would be the way to go.

But really we are looking at a house with a Vphase already fitted, and the cost of repair versus the cost of the alternative. With no circuit diagram I would think repair is a non-starter, so we are looking at the cost of removal plus fitting HF ballast units. I would allow around £50 to £100 to remove it and around £40 per fluorescent fitting to swap to the HF type. So removing it and updating the lighting would cost about the same as a service exchange replacement, if one were available. I would say the options are clear: remove it.
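
As a sanity check on those numbers (all of them rough estimates from above; the number of fittings is just an example):

# Rough cost of removing the unit and converting the lighting instead.
# All figures are the ballpark estimates quoted above; the fitting count
# is purely an example.
REMOVAL_COST = (50, 100)   # £ range to remove the Vphase unit
HF_BALLAST_COST = 40       # £ per fitting swapped to an HF ballast
FITTINGS = 5               # example house with five fluorescent fittings

low = REMOVAL_COST[0] + FITTINGS * HF_BALLAST_COST
high = REMOVAL_COST[1] + FITTINGS * HF_BALLAST_COST
print(f"Remove and convert {FITTINGS} fittings: £{low} to £{high}")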
 
