That is flawed logic. At a guess, a 100 W bulb will be about 10% efficient at producing light, with the other 90%, or 90 W, coming out as heat. 90 W of heat in a room will make very little difference in autumn, and electricity costs around four times as much as gas for heating.
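The arithmetic behind that "four times" figure can be sketched quickly. The unit prices below are illustrative assumptions, not current tariffs; only the 4:1 ratio matters:

```python
# Cost of 90 W of incidental heat from a bulb versus the same
# heat from gas. Unit prices are illustrative assumptions.
ELEC_PRICE = 0.28  # £/kWh, assumed
GAS_PRICE = 0.07   # £/kWh, assumed

bulb_power_w = 100
light_fraction = 0.10
heat_kw = bulb_power_w * (1 - light_fraction) / 1000  # 0.09 kW of heat

cost_elec_per_hour = heat_kw * ELEC_PRICE
cost_gas_per_hour = heat_kw * GAS_PRICE

print(f"Heat output: {heat_kw * 1000:.0f} W")
print(f"Electric: £{cost_elec_per_hour:.4f}/h, gas: £{cost_gas_per_hour:.4f}/h")
print(f"Cost ratio: {cost_elec_per_hour / cost_gas_per_hour:.0f}x")
```

Whatever the actual tariffs, the point stands: heating with a bulb costs whatever the electricity/gas price ratio is, per unit of heat.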
The efficiency of a device is the ratio of useful energy out to energy in; you can't destroy energy, so the operative word is "useful". If the heat is useful, then it is included in the efficiency rating.
There will be losses, as with the bulb mounted so high not all the heat will reach the part of the room occupied by people. However, the big advantage of radiant heat is that it is nearly instant, so there is no need to pre-heat the room. But that instant response is also its downfall: you can't use a mark/space ratio to regulate it, so the only real way to use it is to regulate the background heat instead.
In summer, however, the radiant heat is not only wasted, it could mean extra energy being used to cool the room. This is not limited to tungsten bulbs; the same applies to cooking. Whether it's a large fan or a full-blown AC unit, if using a gas hob instead of an induction hob means you also spend money cooling the room, then that needs to be taken into account too. We want the heat to go into the food, not the room.
This is repeated again and again. It seems a great idea to have double-glazed windows with K glass, until you realise the room with the bay window is now sitting at 32°C in winter, because the morning sun has heated the room and the TRV has been too slow to act, so the sun and a hot radiator have worked together. Maybe the K glass was fitted the wrong way around; maybe it should stop the sun heating the room?
But the whole point is that a home is a complete unit. It's no good quoting efficiency figures comparing a condenser dryer with a vented dryer without considering the humidity of the air drawn in, and the heat thrown out when it draws air from the room.
So we can gain an advantage by controlling the energy used, but only when ALL things are considered, not just taken in isolation.
So, returning to the optimiser: I used this method years ago with a power supply for a radio, before switch mode. The transformer for the power supply had tappings at 10-volt increments, and I fitted a relay that switched between tappings as the incoming voltage varied, to limit how much heat the 2N3055 pass transistor had to dissipate when regulating the output. It was not that successful, but the principle was good.
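The reason tap switching helps a linear regulator is simple: the pass transistor burns off the difference between input and output voltage, multiplied by the load current. A quick sketch, with illustrative voltages and load current (not figures from the original supply):

```python
# A linear regulator's pass transistor (e.g. a 2N3055) dissipates
# (Vin - Vout) * I as heat. Keeping Vin only just above the
# regulated output via transformer taps reduces that heat.
# Voltages and load current below are illustrative assumptions.
def pass_dissipation_w(v_in, v_out, i_load):
    """Power burned in the pass transistor, in watts."""
    return (v_in - v_out) * i_load

v_out = 13.8   # regulated output, typical for a radio supply
i_load = 3.0   # assumed load current in amps

for v_in in (30.0, 20.0):  # higher tap versus lower tap
    w = pass_dissipation_w(v_in, v_out, i_load)
    print(f"Vin {v_in:.0f} V -> {w:.1f} W in the transistor")
```

Dropping to the lower tap here roughly halves the heat the transistor has to shed, which is exactly what the relay was trying to achieve.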
I also had a problem with 110-volt 58-watt fluorescent fittings on the build of Sizewell 'B' power station. My quick mental calculation was: at 116 volts, 58 watts would be 0.5 amps, so on a 16 A MCB, 25 fittings should be comfortably under 16 amps — so 25 fittings per string.
However, the lights tripped; a clamp-on ammeter showed the actual current was 20 amps. So I took one fitting to experiment with. The fitting had an auto-transformer for the 230-volt supply, not quite centre-tapped — it was marked 110 – 0 – 127 volts from memory — and moving the tapping reduced the current from 0.8 amps to 0.6 amps per fitting. So on the first 20 fittings the tapping was changed, and the last 5 were left alone because of volt drop along the cable, which brought the total down to 15 amps.
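The sums behind that story, using the per-fitting currents quoted above. Note the simple sum for the re-tapped string comes out at 16 A; the measured 15 A reflects volt drop along the cable, which this back-of-envelope calculation does not model:

```python
# String-current sums for the Sizewell 'B' lighting story.
# Per-fitting currents are the figures quoted in the text.
fittings = 25
naive_a = fittings * 0.5          # mental estimate: 58 W / 116 V = 0.5 A
measured_a = fittings * 0.8       # what the clamp meter implied per string
retapped_a = 20 * 0.6 + 5 * 0.8   # 20 fittings re-tapped, 5 left alone

print(f"Estimated: {naive_a:.1f} A")
print(f"Measured:  {measured_a:.1f} A")
print(f"Re-tapped: {retapped_a:.1f} A by simple sum (15 A measured)")
```

The gap between 12.5 A estimated and 20 A measured is the ballast and power-factor losses the watts-divided-by-volts estimate ignores.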
So in the days of wire-wound ballasts the voltage was very important, but the place to adjust it is at the supply to the fluorescents, not the whole house.
As to LED bulbs, it is common to use a capacitor to limit current, so it would be interesting to see how they behave with varying voltage. Some have a regulator built in, marked 10 to 30 volts DC; as to mains types, I suspect the fluorescent-tube replacements use a switch-mode form of regulation. So reducing the voltage will increase the current with some items and reduce it with others. And as to waveform distortion when the optimiser is not directly connected to the mains incomer, I really don't know.
There would need to be a modern study to see the effects, but I suspect there is little point. Today I would not dream of building my own power supply for my radio; I would just buy a switch-mode supply, as they are now far better than the old ones and can cope with the sudden increase in current when the PTT is pressed. With the old one I gave up and fitted a lead-acid battery to stop the effects of volt drop (mains hum on transmit).
Also, the voltage has dropped. When we nominally went from 240 to 230 volts, in real terms nothing happened; but when people started to fit solar panels, the panels would trip on over-voltage, so at long last the actual voltage was dropped. Now we really do have a 230-volt supply, so there is no real point in the optimiser.