Looks like a referee may be needed ...
Quote:
Err... What he [bernardgreen] said was "If the resistance increases and the voltage remains constant then the wattage reduces." Ahem... anybody who can read should know that. But the current doesn't remain constant, does it? Anybody with GCSE physics should know that. If the resistance increases and the current remains constant then the wattage increases.

Bernard did indeed say that, but he really shouldn't have done, because in the situation being discussed the relevant voltage (that across the length of cable in question) would not remain constant.

Well, what Stoday said was far closer to the situation under discussion than what Bernard had written. As Stoday said, the current through the entire circuit (including the bit of cable of interest) would be almost entirely dictated by factors other than the impedance of the bit of cable in question. Hence, in practical terms, the current through (not the voltage across) the length of cable in question would be essentially constant. That being the case, increasing the impedance of that bit of cable would result in an increase in voltage (not unchanged voltage) across it, and hence an increase, not a decrease, in the power dissipated in that bit of cable.
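For what it's worth, the point is easy to check numerically. A minimal sketch (the 230 V supply and 23 ohm load figures are hypothetical, chosen only so the load dominates the circuit): the current is set almost entirely by the load, so it barely changes as the cable resistance rises, while the power dissipated in the cable (P = I²R) roughly doubles each time its resistance doubles.

```python
# Hypothetical figures for illustration: a 230 V supply feeding a load
# whose 23 ohm resistance dominates the circuit, in series with a short
# cable run whose resistance we vary.
V_SUPPLY = 230.0   # supply voltage (volts) - assumed for illustration
R_LOAD = 23.0      # load resistance (ohms) - dominates the circuit

for r_cable in (0.1, 0.2, 0.4):          # cable resistance, doubling each step
    i = V_SUPPLY / (R_LOAD + r_cable)    # circuit current (amps), nearly constant
    v_cable = i * r_cable                # voltage across the cable run - rises
    p_cable = i * i * r_cable            # power dissipated in the cable - rises
    print(f"R={r_cable:.1f} ohm: I={i:.2f} A, "
          f"V_cable={v_cable:.2f} V, P_cable={p_cable:.1f} W")
```

Running this shows the current falling by well under two per cent across the range while the cable dissipation roughly quadruples, which is exactly Stoday's constant-current argument rather than Bernard's constant-voltage one.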
So I reckon that, on this occasion, Stoday wins, by a relatively large head, and even you are only second.
Kind Regards, John.