As I've said, if they are 'bare LEDs' then they cannot sensibly be connected directly to a 4V supply. As above, neither one nor three of them can sensibly be connected straight to a voltage source without anything to control the current.
In fact, no number of them can sensibly be connected directly to any ostensibly 'correct' voltage supply.
Even if you managed to arrange/tweak things so that you got roughly the desired current at 12V, an increase to 13.8V (without anything to restrict current rise) would very probably increase the current to above the LEDs' 'absolute maximum' - the current/voltage curve for an LED is very steep.
Even if you managed to arrange/tweak things so that you got roughly the desired current at 4V, an increase to 4.6V (without anything to restrict current rise) would very probably increase the current to above the LEDs' 'absolute maximum' - the current/voltage curve for an LED is very steep.
Exactly. To the best of my knowledge, there is no such thing as a bare LED (without an integral resistor or equivalent) which is designed/intended to be connected directly to a fixed voltage source. The current would be too unpredictable, and catastrophic changes in current could occur if the voltage changed even a little.
And what of Bernard's suggestion to use two LEDs plus a resistor? If that were tweaked to allow the right current at 12V, what would happen if the voltage went up to 13.8V?
Let's do some sums and see. Design current 20mA. Two bare LEDs with a Vf of 4V at 20mA. Hence, designing for 12V, the resistor would be (12 - 8)/0.02 = 200Ω. Increase the voltage to 13.8V. If the Vf remained at 4V per LED, the current would rise to (13.8 - 8)/200 = 29mA. In practice, Vf would rise a bit because of the increasing current, so the voltage across the resistor would be a bit less than (13.8 - 8)V, and the actual current would therefore be a little less than 29mA.
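For anyone who wants to play with the numbers, here are those sums as a few lines of Python, using the same fixed-Vf simplification as above - the 4V Vf and 20mA design current are the assumed figures from this example, not data for any particular LED:

```python
VF_TOTAL = 2 * 4.0   # total forward drop of two LEDs, assumed fixed at 4V each
I_DESIGN = 0.020     # design current, amps (20mA)

def series_resistor(v_supply):
    """Resistor needed to set the design current at a given supply voltage."""
    return (v_supply - VF_TOTAL) / I_DESIGN

def chain_current(v_supply, r):
    """Current through LEDs + resistor for a given supply (fixed-Vf model)."""
    return (v_supply - VF_TOTAL) / r

r = series_resistor(12.0)
print(f"Resistor for 12V: {r:.0f} ohms")                           # 200 ohms
print(f"Current at 13.8V: {chain_current(13.8, r) * 1000:.0f}mA")  # 29mA
```

As noted, a real LED's Vf creeps up a little with current, so the true figure would be slightly under 29mA.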
In contrast, suppose the two LEDs were connected to an 8V constant-voltage source whose internal resistance was such that 20mA flowed. If you then increased that source to 9.2V (0.6V more per LED), then with typical LED I/V curves you'd probably see the current rising to 60mA or more, probably shortly followed by a 'small puff of smoke'!
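To illustrate just how steep that curve is, here's a rough sketch using the textbook Shockley exponential model. The n·Vt figure of 0.1V per LED is an illustrative assumption rather than a datasheet value, and the model ignores the LED's internal series resistance, which in practice would soften (though not remotely tame) the rise:

```python
import math

# Illustrative exponential LED model: I = I_REF * exp((Vf - V_REF) / N_VT).
N_VT  = 0.10    # ideality factor x thermal voltage, volts (assumed)
I_REF = 0.020   # reference point: 20mA ...
V_REF = 4.0     # ... at an assumed Vf of 4V

def led_current(vf):
    """Current at forward voltage vf, per this simple exponential model."""
    return I_REF * math.exp((vf - V_REF) / N_VT)

# 0.6V extra per LED (8V -> 9.2V across the pair):
print(f"{led_current(4.6):.1f} A")  # ~8 A in this idealised model!
```

The real figure would be nothing like 8A, because bulk resistance takes over, but it makes the point that '60mA or more' is, if anything, an understatement.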
Exactly - which is why one should use a voltage source appreciably greater than 4V coupled with a resistor (or more complex circuitry) to control current.
Is 15% "appreciably greater"?
The greater the difference between the supply voltage and the total Vf of the LEDs, the closer the PSU-plus-resistor combination comes to being a constant-current source, and hence the less the current will change when the supply voltage changes.
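One can quantify that. With the fixed-Vf model, I = (Vs - Vf)/R, so the fractional current change for a given supply change is ΔI/I = ΔVs/(Vs - Vf). A quick sketch comparing the 12V case above with a hypothetical 24V supply (same assumed 4V-per-LED, 20mA figures as before):

```python
VF_TOTAL = 8.0     # two LEDs at an assumed 4V each
I_DESIGN = 0.020   # 20mA

def current_rise_percent(v_nominal, v_actual):
    """% current rise when the supply moves from v_nominal to v_actual,
    with the resistor sized for the design current at v_nominal."""
    r = (v_nominal - VF_TOTAL) / I_DESIGN
    i = (v_actual - VF_TOTAL) / r
    return 100 * (i - I_DESIGN) / I_DESIGN

print(f"12V -> 13.8V: +{current_rise_percent(12.0, 13.8):.1f}%")   # +45.0%
print(f"24V -> 27.6V: +{current_rise_percent(24.0, 27.6):.1f}%")   # +22.5%
```

Both supplies rise by the same 15%, but the extra headroom roughly halves the resulting current rise.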
I don't understand what you mean by a "4V constant current supply".
Badly phrased. If the "LED thingy" needs a constant current then there will be a voltage drop across it, and I meant that this would be about 4V - otherwise the "4V" descriptor is meaningless.
As you've said, we don't really know, but, assuming that it's a bare LED, I think the "4V descriptor" can only be the Vf at normal operating current – there's not really anything else it could be. As I've said, if it's not a 'bare LED', then all bets are off.
I seriously doubt that a resistor sized to drop 12V to 8V at the diodes' rated forward current (with 4V across each) would limit the current enough when the voltage increased to 13.8V, given the non-linear characteristics of the diodes - their forward resistance would drop by too much.
I suggest that you revisit the sums above and then reconsider your serious doubts.
If the Vf of the LEDs were 4V, three LEDs plus a resistor on 12V would be very iffy.
Not if you can guarantee that they'd never be given more than 12V.
With the total Vf equal to the supply voltage, your main problem would be in the other direction. If the supply fell appreciably below 12V, the current (and the light) would more-or-less vanish.
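The same illustrative exponential model as before shows why (again, the n·Vt ≈ 0.1V per LED is an assumed figure, not a datasheet value): with three LEDs and essentially no resistor headroom, a mere 5% supply sag from 12V to 11.4V leaves only 3.8V per LED:

```python
import math

# Same assumed exponential-model figures as the earlier sketch.
N_VT, I_REF, V_REF = 0.10, 0.020, 4.0

def led_current(vf):
    return I_REF * math.exp((vf - V_REF) / N_VT)

# 11.4V across three LEDs = 3.8V each:
print(f"{led_current(3.8) * 1000:.1f}mA")  # ~2.7mA - the light has largely gone
```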
Kind Regards, John