104 of 12V LED G4 Bulbs wired in parallel

Hi,

I'm making a display light set up like this kind of thing:

[attached image: example display light setup]


There will be 104 bulbs in total so I have opted to use 12V G4 DC LED bulbs in G4 lampholders.

It doesn't need to be super bright, so I was going to use 1W LEDs rated at 12V.

Obviously I will wire them in parallel.

Just wondering, will a 100W 12V DC adaptor do it even though there will be 104W of bulbs? I was looking at these bulbs (http://www.lightinthebox.com/g4-3w-...1448877.html?pos=ultimately_buy_9&prm=1.3.5.0) which have a wattage range of 0.5W to 2.5W. Does that mean there may be a total wattage of anywhere between 52W and 260W? If I got a 100W adaptor would it be likely to blow? Or if I got a 250W adaptor would it blow all the bulbs?!

Also, how important is cable size when it comes to 12V DC?

Cheers.
 
I was looking at these bulbs (http://www.lightinthebox.com/g4-3w-...1448877.html?pos=ultimately_buy_9&prm=1.3.5.0) which have a wattage range of 0.5W to 2.5W. Does that mean there may be a total wattage of anywhere between 52W and 260W?
Goodness knows what it means - it makes no sense (for a given voltage, which is specified as 12V)! Unless you want to go to the hassle of trying to get clarification from them, I would suggest that you go for LEDs for which (as is usual) there is just a single power specified at the specified operating voltage.

Kind Regards, John
 
An LED is a current-dependent device; when housed in a voltage-dependent package there has to be some electronics to convert it from current- to voltage-dependent. This could be a simple resistor or a complex pulse-width-modulated (PWM) chip. The two converters respond in opposite ways to over- and under-voltage: with a resistor, under-voltage will cause the current to drop; with a PWM chip the current will increase. This LED driver is designed to power a string of 340mA LEDs in series; you will note the voltage is 10 to 44 volts - the current is regulated, not the voltage. At around 3 volts each for white LEDs it could power anywhere from 4 to 14 LEDs, it really does not matter; with red LEDs at 1.2 volts it could power 36. Each colour has a different forward voltage, so how many depends on the colour. Most drivers seem to stop short of 50 volts - not sure why, as 75 volts is the limit for extra-low voltage with DC.
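The LED-count arithmetic above can be sketched briefly. This is a hypothetical helper, not from the driver's datasheet; the 3V and 1.2V forward voltages are the nominal assumptions used in the post:

```python
import math

def led_count_range(v_min, v_max, vf):
    """Min/max number of series LEDs a constant-current driver can run,
    given its output compliance window v_min..v_max (volts) and an
    assumed nominal forward voltage vf per LED."""
    return math.ceil(v_min / vf), math.floor(v_max / vf)

print(led_count_range(10, 44, 3.0))   # white LEDs, ~3 V each -> (4, 14)
print(led_count_range(10, 44, 1.2))  # red LEDs, ~1.2 V each -> (9, 36)
```

This matches the "4 to 14 white / 36 red" figures quoted: the driver holds the current constant and simply needs the total string voltage to sit inside its compliance window.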

It is likely the LED you are looking at is current-regulated by a simple resistor. However, we normally consider an LED to give out around 100 lumens per watt; with a simple resistor this is reduced, because some of the power goes into the resistor, so 75 lumens per watt is common. Yet what you have selected is claimed to give 110 lumens per watt, which points to a pulse-width-modulated controller.

On the other hand 0.5 to 2.5 watts points to them using a simple resistor and adjusting voltage will adjust output. Of course, at 12 volts with 12 LEDs it could be 3 strings of 4 LEDs, in which case the voltage would be critical, or 4 strings of 3 LEDs plus a resistor, in which case the voltage would be less critical. The latter is more likely.

Even with 3 LEDs and a resistor, voltage is still very important. Think of 9 volts fixed across the LEDs and 3 volts across the resistor, with 1/4 watt on the resistor: R = V²/W, so the resistor is 36Ω and the current 0.0833A. Now change the resistor voltage to 3.2 volts (12.2 volts across the whole device) and the current jumps to 0.0889A; drop it to 2.8 volts and the current falls to 0.0778A. So the total wattage goes from 0.917W to 1.084W with the supply just 0.2 volts out. This is because the voltage across the LEDs stays static; only the voltage on the resistor alters. A little more complex than simple Ohm's law.
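That sensitivity is easy to check numerically. A minimal sketch, using the same assumption as the worked example (a constant 9V drop across the three LEDs, 36Ω series resistor):

```python
VF_STRING = 9.0   # assumed constant drop across the three series LEDs, volts
R = 36.0          # series resistor, ohms

def string_power(v_supply):
    """Current through, and total power drawn by, the string at a given
    supply voltage. All the supply excess over VF_STRING appears across R."""
    i = (v_supply - VF_STRING) / R
    return i, v_supply * i   # amps, watts

for v in (11.8, 12.0, 12.2):
    i, p = string_power(v)
    print(f"{v:4.1f} V: {i * 1000:5.1f} mA, {p:.3f} W")
```

A 0.2V swing either side of nominal moves the total from about 0.92W to about 1.08W, as in the example - roughly 8% power change for under 2% voltage change, because only the resistor's share of the voltage varies.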

There are two ways to work out what will happen: one, ask the supplier; two, buy a few and test them. There are also two ways to work out the lumen output: one is to measure each LED and multiply by the number of LEDs used; the other is to measure the lamp as a whole. Since 110 lumens per watt is around the maximum output for white LEDs, I would guess they have done the former and not even taken into account the resistor used to limit the current. But I can only guess.
 
This [attached image of an illuminated letter B] has 18 cheap LED elements (no driver or separate resistor) connected in parallel to a 3 volt battery supply (a pair of AAA cells).

Total current taken from the battery is around 20mA (it varies with temperature), so each LED is taking a bit more than 1mA at 3 volts, which is about 3mW per LED. This is more than enough for the lights to be seen clearly in a daylit room, and a bit too bright in a dimly lit room.

The LED element has a resistive layer of semiconductor to limit the current; this will deteriorate over time and eventually the LED will fail. Short lifetime.

I intend to replace the LED elements with something like this https://www.rapidonline.com/truopto-5mm-white-leds-64846 with an individual resistor per LED: a 1K resistor and a 9 to 12 volt variable supply (about 6 to 9mA per LED, once the roughly 3V forward drop is allowed for).
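A quick sketch of that per-LED resistor sizing. The 3V forward voltage here is an assumption for a 5mm white LED (check the actual datasheet); the point is that the current is (Vsupply − Vf)/R, not simply Vsupply/R:

```python
VF = 3.0     # assumed forward voltage of a 5 mm white LED, volts
R = 1000.0   # series resistor per LED, ohms

def led_current_ma(v_supply):
    """Current through one LED + 1k resistor, in milliamps."""
    return (v_supply - VF) / R * 1000.0

print(f"{led_current_ma(9.0):.0f} mA at 9 V")    # -> 6 mA
print(f"{led_current_ma(12.0):.0f} mA at 12 V")  # -> 9 mA
```

With one resistor per LED, a failing LED doesn't change the current through its neighbours - a useful property when paralleling many elements.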
 
On the other hand 0.5 to 2.5 watts points to them using a simple resistor and adjusting voltage will adjust output.
Indeed - but the stated specification for the LEDs in question seemed to be indicating "0.5 to 2.5W" at 12V DC - which obviously makes no sense.

Kind Regards, John
 
On the other hand 0.5 to 2.5 watts points to them using a simple resistor and adjusting voltage will adjust output.
Indeed - but the stated specification for the LEDs in question seemed to be indicating "0.5 to 2.5W" at 12V DC - which obviously makes no sense.

Kind Regards, John
I would agree; however, everything needs a tolerance of some sort. I have seen bulbs designed for 12 volt DC marked as 10-30 volts DC, with the current draw stipulated at 12.7 volts. We refer to a car as having 12 volt electrics, but in real terms we know the voltage ranges from 11 volts to 14.8 volts with modern stage charging. The old two-bobbin regulator used with dynamos could reach 16 volts.

With AC LED packages we expect the voltage to be stable: rated at 12 volts RMS, we would expect +/- 0.1 volt. With DC, however, especially at 12 or 24 volts, we tend to treat it as a nominal voltage, with the actual voltage varying according to whether the battery is about to switch off through being fully discharged (normally considered 11 volts with lead acid) or is in stage 2 of a three-stage charge, which for open cells is 14.8 volts and for VRLA (valve-regulated lead acid) 14.4 volts.

We should not, however, need to assume this; it should be quoted in the spec. It seems likely that, if the power supply has a voltage adjustment, a simple reduction of say 0.2 volts should be enough to drop the 4 watts required. However, without testing or getting better info this is a guess.

So, out of interest, would a string of 3 white LEDs and a resistor likely draw 1W at 12 volts, 0.5W at 11 volts and 2.5W at 14.4 volts? Assuming the voltage on the LEDs is a constant 9 volts, the resistor will be 36 ohms by Ohm's law (3²/0.25, with 0.25W on the resistor). So at 11 volts the resistor dissipates 0.111W, and 0.111 x 4 = 0.444W - reasonably close to 0.5W. Now for 14.4 volts: 5.4 volts across the resistor gives 0.81W, and 0.81 x 4 = 3.24W, so maybe they are working at 13.4 or 13.8 volts, the range of float voltages used? That gives 0.538 x 4 = 2.15W, or 2.56W at 13.8 volts. Now we are getting close. So it seems very likely the voltage range of 11 to 13.8 volts is being used to give us the 0.5 to 2.5 watts. The voltage on an LED is not completely static, and they are not exactly 3 volts each, but the calculation is close enough to make it a reasonable assumption. Glad I have a JavaScript program to work out Ohm's law, however.

You are correct we should not have to assume 11 to 13.8 volt gives 0.5 to 2.5 watt with 1 watt at 12 volt this should be in the spec. But it does seem very likely that my guess is correct. Unless you have some other idea?
 
I would agree, however everything needs a tolerance of some sort... in real terms we know the voltage to range from 11 volt to 14.8 volt, with modern stage charging. The old two bobbin regulator used with dynamos could reach 16 volt. ............ So out of interest would a string of 3 white LED's and resistor likely draw 1W at 12 volt, 0.5W at 11 volt and 2.5W at 14.4 volt? Assuming voltage on the LED's is 9 volt constant then the resistor will be 36 ohm ..... You are correct we should not have to assume 11 to 13.8 volt gives 0.5 to 2.5 watt with 1 watt at 12 volt this should be in the spec. But it does seem very likely that my guess is correct. Unless you have some other idea?
I get somewhat different figures from you:
[attached table of calculations]


As you can see, if you/they are talking about the total power dissipation of the package (LEDs plus resistor), then, if I've done my sums right, that would imply a voltage range of 10.685 to 15.0 volts to achieve a power range of 0.5 to 2.5 W and a total dissipation of 1W at 12V. If, on the other hand, one was talking about the power dissipation of just the LEDs, then the implied voltage range would be 11V to 19V. Over the range 11V-14.4V, the power dissipated in the LEDs (which would be reflected in light output) would vary from 0.5W to 1.35W.
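Those voltage ranges can be reproduced by solving the same model (9V fixed across the three LEDs, 36Ω resistor) for the supply voltage. A minimal sketch, under those same assumptions:

```python
import math

VF = 9.0   # assumed fixed drop across the three series LEDs, volts
R = 36.0   # series resistor, ohms

def v_for_total_power(p):
    """Supply voltage giving total package power p: solve V*(V - VF)/R = p,
    i.e. the positive root of V**2 - VF*V - R*p = 0."""
    return (VF + math.sqrt(VF**2 + 4 * R * p)) / 2

def v_for_led_power(p):
    """Supply voltage giving LED-only power p: solve VF*(V - VF)/R = p."""
    return VF + R * p / VF

print(f"total 0.5-2.5 W: {v_for_total_power(0.5):.3f} to {v_for_total_power(2.5):.1f} V")
# -> 10.685 to 15.0 V
print(f"LED-only 0.5-2.5 W: {v_for_led_power(0.5):.0f} to {v_for_led_power(2.5):.0f} V")
# -> 11 to 19 V
```

The quadratic arises because total power is supply voltage times current, and the current itself grows with supply voltage; the LED-only case is linear because the LED voltage is held fixed.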

Whatever, my point is that if (as is probably the case) the wide range of quoted powers relates to a range of supply voltages, then they should specify what that range of voltages is (not just specify it as "12V").

This also illustrates the folly/inefficiency of relying on a simple resistor for current control when there is likely to be an appreciable variation in supply voltage (unless the resistor is very large in value, which results in serious inefficiency). After all, one would probably not usually be happy to accept the variation in light output that would result from the power variations discussed above.

Kind Regards, John
 
Yes, I see my error, but it still shows why it had a range of wattage on the package. I have been pointing out for some time that an LED lamp using pulse-width-modulated current control will normally give around 100 lumens per watt, whereas there are many lamps down at 75 lumens per watt which use either a capacitor with AC or a resistor with DC.

Lamps from this outlet seem all to have a pulse width modulated controller built in but the price is much higher than other suppliers as a result. You get what you pay for.
 
Yes, I see my error, but it still shows why it had a range of wattage on the package. I have been pointing out for some time that an LED lamp using pulse-width-modulated current control will normally give around 100 lumens per watt, whereas there are many lamps down at 75 lumens per watt which use either a capacitor with AC or a resistor with DC.
I somewhat doubt that you'd even get anything like that much (in terms of overall package power) with the example we're considering at 14.4V - since nearly 40% of the applied power is lost in the resistor.

I'm not exactly sure what the 'lumens' (hence also 'lumens/watt') figure means when one uses PWM - does it relate to the peak luminous flux or to some sort of average over time?

Kind Regards, John
 
I realise that the lumens-per-watt figure is a bit rough, as it varies with how it is measured and some LEDs are better than others. But as a general rule I have found LEDs controlled using a PWM system are quoted at around 100 lumens per watt, and other systems much lower. The replacement LED tube I have is quoted at 100 lumens per watt, as is the replacement for the R7s tube in my outside light; the latter has a voltage range of 85~265V, which clearly points to PWM regulation. Most of my other LED units are rated at around 70 to 85 lumens per watt and are dimmable, which points to using simple capacitors (since AC) to regulate. Looking at colour-changing LED strips, the lumens per watt often drops to around 25, due to the losses in the resistors used to regulate.

I looked at a standard fluorescent tube with an HF ballast; these are rated at around 95 lumens per watt, so producing an LED replacement at less than 100 lumens per watt would be pointless. Since the replacements can be fitted leaving the old magnetic ballast in place, they have to work with reduced voltage, and clearly there would be losses in the ballast, so likely no real saving - except that in general they give out half the lumens of the fluorescent, so half the wattage. The saving comes from using less light, not from a more efficient light. My son found at work that they were OK in corridors, but rooms needed double fittings instead of single, so rather pointless.

Personally I have swapped compact fluorescents for LED, as the LED lasts longer and compact fluorescents with the folded tube give rather poor lumens per watt, so swapping "bulbs" is well worthwhile. The LED strip light in the kitchen was only swapped to LED because the fat tubes are no longer made and I was lazy and did not want to swap the fitting, although I did remove the ballast. There is a marked reduction in light, but in that area of the kitchen it is not really needed, plus the fast switch-on is an advantage.

In the case in question I don't for one minute think it really gives out 110 lumens per watt; the LED may, but I don't think the package will, given the losses in the resistor. I would expect more like 70 lumens per watt. Using raw LEDs and a proper driver would use less power and give more lumens per watt, but it would also require more work setting it all up. So using the lamp linked to would likely do the job nicely; the question is whether you can reduce the voltage output of the power supply slightly, to bring the draw down to match what the power supply can deliver. If so, I am sure it will all work OK; if not, there might be an overload. I can't be 100% sure, as there is still some guesswork, but I am 95% sure that a small reduction in voltage will allow all the lamps to be used.
 
I realise that the lumens-per-watt figure is a bit rough, as it varies with how it is measured and some LEDs are better than others. But as a general rule I have found LEDs controlled using a PWM system are quoted at around 100 lumens per watt, and other systems much lower.
It obviously makes sense that the lumens/watt (in terms of overall 'package' watts) will be greater with PWM than with a series resistor; 'perfect' (100% efficient) PWM would not consume any of the power, whereas a resistor obviously does, thereby making less of the total 'package power' available for the LEDs.
Most of my other LED units are rated at around 70 to 85 lumens per watt and are dimmable, which points to using simple capacitors (since AC) to regulate.
I'm less clear about series capacitors. Unlike a resistor, a 'perfect' capacitor would not dissipate any power/energy, so it's less obvious why such a setup should be inefficient.

Also, if, as you say, "dimmable points to using simple capacitors", I wonder why dimmable LEDs are generally appreciably more expensive than non-dimmable ones? One can't get much simpler/cheaper than a 'simple capacitor'!

However, my point/question related to the meaning and interpretation of 'lumens' (hence 'lumens/watt') when one uses PWM. From what I recall, the physiology of visual perception is such that an intermittent ('chopped') light source (hence with higher peak levels) will be perceived as brighter than a constant source with the same average light level. It therefore could be that a PWM'd source appears brighter than a non-PWM'd one delivering the same electrical power to the LED. Is that the case?

Kind Regards, John
 
Yes it is, or at least that is what seemed to be the case when we experimented with flashing LEDs at university. The flashing LED had a higher power when on but the same overall wattage over the mark/space time. We could see they were brighter, but the lux meter showed they were not.

When we tried to read a book under the light, it was the same with flashing light as with constant light - so, usage-wise, no advantage.

For automotive lights the output is measured after they have been on for one hour, but that is not required for domestic lights. Also, it seems measuring one LED and multiplying by how many are used is permitted, but really the lamp should be measured as a whole. However, how do you measure a lamp shining through 360°? Most lux meters don't totally enclose the lamp.

There need to be some regulations soon to ensure all are singing from the same book, but as it stands it's a complete mess.
 
Yes it is, or at least that is what seemed to be the case when we experimented with flashing LEDs at university. The flashing LED had a higher power when on but the same overall wattage over the mark/space time. We could see they were brighter, but the lux meter showed they were not.
As I said, that's what I thought. That therefore presumably means that one cannot necessarily use lumens/watt as a valid measure of "perceived brightness/watt".

Kind Regards, John
 
Yes it is, or at least that is what seemed to be the case when we experimented with flashing LEDs at university. The flashing LED had a higher power when on but the same overall wattage over the mark/space time. We could see they were brighter, but the lux meter showed they were not.
As I said, that's what I thought. That therefore presumably means that one cannot necessarily use lumens/watt as a valid measure of "perceived brightness/watt".

Kind Regards, John
Not sure about that: although the LEDs appeared brighter, using that light was no better than non-pulsed light - try reading a book, and although the bulbs seem brighter you can't see the writing on the page any better. I found this at home when I swapped 10 compact fluorescent globe bulbs for 10 candle LED bulbs. The room seemed brighter, the lights seemed brighter, but when I came to read a book I had to use a reading lamp. It was a huge drop in wattage, from 80 watts to 30 watts, and also about two-thirds of the lumens, but it looked brighter.
 