Has electric demand gone up or down over the years?

Back on topic, looking at my electricity consumption since I bought my first house in 1976, it has risen by a large amount. ... In 1976 I didn't have a freezer, washing machine, microwave or any boxes under the TV. The TV had an on/off switch rather than a standby button. I didn't possess a computer like the one I am now typing on, either. My cooker and oven were gas, rather than the electric oven and induction hob I use now.
I imagine that's much the same for most of us over that period, although of all the things you mention, the 'voluntary' change to electric cooking is probably the one which has the greatest impact. On the other side of the equation, lighting often represented a significant proportion of the total 'back then', but has probably fallen to near insignificance in many homes these days.

It might be more interesting to think about the changes since a slightly later date, by which time most people had freezers, washing machines, dryers, microwaves and maybe dishwashers, and ignoring those who voluntarily changed to or from electric cooking. From that point in time onwards, I doubt that there has been major change, maybe even a modest decrease, primarily because of the lighting. The graphs I presented above certainly seem to indicate an appreciable progressive reduction over the last 10 years (well, as far as the data goes - to 2012).

Kind Regards, John
 
I remember up until the mid-2000s the power consumption of computers was going one way: UP. Compare the heatsink on your 486 to the heatsink on your Pentium 4. Go back even further and the CPU didn't even have a heatsink. Power management was something that was in its infancy.

Since then, while the peak power consumption of computers has stayed fairly stable (and has even gone up for top-end gaming rigs), power management has got much better, so average power consumption has trended downwards again.

I wonder if this attitude change happened in other industries as well. The graphs definitely seem to show a peak in the mid-2000s.
 
I remember up until the mid-2000s the power consumption of computers was going one way: UP. Compare the heatsink on your 486 to the heatsink on your Pentium 4. Go back even further and the CPU didn't even have a heatsink. Power management was something that was in its infancy. ... Since then, while the peak power consumption of computers has stayed fairly stable (and has even gone up for top-end gaming rigs), power management has got much better, so average power consumption has trended downwards again.
All true.
I wonder if this attitude change happened in other industries as well. The graphs definitely seem to show a peak in the mid-2000s.
Maybe, but I would suspect that many of the worst energy-guzzling appliances/equipment in most homes do not offer much scope for 'power management' in terms of design. It's probably more a question of user behaviour, and it wouldn't surprise me if there has been an appreciable increase in 'energy-awareness' (probably actually 'cost-awareness'!) over recent years.

Kind Regards, John
 
I remember up until the mid-2000s the power consumption of computers was going one way: UP. Compare the heatsink on your 486 to the heatsink on your Pentium 4. Go back even further and the CPU didn't even have a heatsink. Power management was something that was in its infancy.
Top-bin Haswells are getting on for 1W/mm².

What's the output/mm² of a nuclear power station? ;)
 
I would suspect that many of the worst energy-guzzling appliances/equipment in most homes do not offer much scope for 'power management' in terms of design.
Fridges, freezers and washing machines use very much less power now than old ones used to. Even vacuum cleaners are being forced down in power usage, and, apart from people with downlights, energy-saving lamps have cut lighting usage.
 
I would suspect that many of the worst energy-guzzling appliances/equipment in most homes do not offer much scope for 'power management' in terms of design.
Fridges, freezers and washing machines use very much less power now than old ones used to. Even vacuum cleaners are being forced down in power usage, and, apart from people with downlights, energy-saving lamps have cut lighting usage.
That's certainly all true in comparison with the 'old ones'. However, plugwash was talking about changes in the past decade ("since the mid-2000s") and I'm not sure that the changes to which you refer have been all that great during that period, have they?

Kind Regards, John
 
In the last ten years I've chucked out two old fridges and one old freezer, two old TVs, one old washing machine, one old drier, and one dwr. My average electricity usage per day has dropped quite noticeably.

Appliances have quite long lives, so it takes a while for the old ones to be replaced.

I've also got a modern boiler and programmable stat, and I've reduced my usage of electric bedroom heaters on freezing nights.
 
Appliances have quite long lives, so it takes a while for the old ones to be replaced.
I think that is a major factor. We have six TVs in the house; only two are LED, the rest are cathode ray tubes. Only two of them are in regular use.

Last year both the fridge/freezer and the freezer went to the wall and were replaced with inverter-motor ones of A+ and A++ rating, but the old ones were 7 and 20 years old; having done the calculations, it's not worth changing them until they fail.

Moving from a desktop to a laptop computer has likely reduced the power used, as laptops tend to want some battery life.

But the big electricity user in my house is the tumble drier; compared with that, the rest is nothing. The washer uses less water and washes at a lower temperature, but one big change is that it's now cold fill, so the old washer actually used less electric power. The dishwasher does not use much.

I fail to work out how one woman can dirty so many clothes every day. If my wife left, I am sure the power used would drop to a quarter of what is used now. OK, I may start looking like a tramp, but the dryer is the largest electric power user.

As to TVs, mine is rated around 100W. I remember my first large TV: the voltage dropper cracked and broke in two, and I could not find a replacement. However, I worked out that a 240 volt 750W device was around the right size of resistor, so I fitted a 13A socket where the resistor should have been and plugged in the iron.

So the old TV was using around 750 watts. It was a real pain if the iron was set to a low temperature: the programme always seemed to reach the exciting bit just before the iron switched off.

Although it still has a cathode ray tube, the rest of the TV is transistorised and uses a switched-mode power supply, not a voltage dropper.

I think the biggest waste of power is the apps on my wife's phone. Without those apps she would have enough time to hang out clothes to dry.
 
As to TVs, mine is rated around 100W. I remember my first large TV: the voltage dropper cracked and broke in two, and I could not find a replacement. However, I worked out that a 240 volt 750W device was around the right size of resistor, so I fitted a 13A socket where the resistor should have been and plugged in the iron.

So the old TV was using around 750 watts. It was a real pain if the iron was set to a low temperature: the programme always seemed to reach the exciting bit just before the iron switched off.

Only a sparks would do something like that!

However, I question your figures. An old TV typically took 200W, not 750.
Your iron would have a resistance of around 77 ohms. TV droppers typically carried the 300mA valve heater current, so your iron would drop about 23 V and dissipate around 7 W.
Maybe the dropper carried more current, but obviously not the full 3A of the iron, as there would be no volts left for the TV.

Congrats for an ingenious repair however.
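For anyone who wants to check that arithmetic, here is a quick sketch, assuming a 240 V supply, the iron's 750 W nameplate rating, and the 300 mA valve-heater chain current mentioned above (all figures from the posts, not measured):

```python
# Sanity-check the dropper figures: a 750 W iron on a 240 V supply,
# carrying a typical 300 mA valve-heater chain current.
SUPPLY_V = 240.0   # mains voltage (assumed)
IRON_W = 750.0     # iron's nameplate rating
HEATER_A = 0.3     # typical heater-chain current

# Iron element resistance from its nameplate rating: R = V^2 / P
r_iron = SUPPLY_V ** 2 / IRON_W          # ~76.8 ohms

# Voltage dropped and power dissipated at 300 mA: V = I*R, P = I^2*R
v_drop = HEATER_A * r_iron               # ~23 V
p_diss = HEATER_A ** 2 * r_iron          # ~6.9 W

print(f"R = {r_iron:.1f} ohm, drop = {v_drop:.1f} V, dissipation = {p_diss:.1f} W")
```

At 300 mA the iron would only drop about 23 V and dissipate around 7 W, nowhere near enough to bring it up to ironing temperature, which is the point being made here.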
 
As to TVs, mine is rated around 100W. I remember my first large TV: the voltage dropper cracked and broke in two, and I could not find a replacement. However, I worked out that a 240 volt 750W device was around the right size of resistor, so I fitted a 13A socket where the resistor should have been and plugged in the iron.

So the old TV was using around 750 watts. It was a real pain if the iron was set to a low temperature: the programme always seemed to reach the exciting bit just before the iron switched off.

Only a sparks would do something like that!

However, I question your figures. An old TV typically took 200W, not 750.
Your iron would have a resistance of around 77 ohms. TV droppers typically carried the 300mA valve heater current, so your iron would drop about 23 V and dissipate around 7 W.
Maybe the dropper carried more current, but obviously not the full 3A of the iron, as there would be no volts left for the TV.

Congrats for an ingenious repair however.

As you say, the TV would have used some power. Either the toaster or the iron would do the job, but the toaster tended to pop up quicker than the iron reached temperature, so the iron was the favourite, though it did need turning to maximum temperature.

So the TV used less than 750 watts. Out of interest I tried different heater-chain voltages: 12 volts gives 712 watts, 50 volts gives 590 watts, and 100 volts gives 437 watts; nothing as low as 200 watts. So, considering how long it took before the thermostat tripped, I am thinking around 590 watts with 50 volt heaters.
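Those figures follow if the iron's ~77 ohm element sits in series with the heater chain, so the whole set draws whatever current the iron passes. A rough sketch of that calculation, assuming a 240 V supply and the heater voltages tried above:

```python
# Total set consumption with a 750 W iron (~76.8 ohm) standing in for
# the dropper, in series with a heater chain dropping v_heater volts.
SUPPLY_V = 240.0                          # mains voltage (assumed)
R_IRON = SUPPLY_V ** 2 / 750.0            # ~76.8 ohms, from the 750 W rating

for v_heater in (12, 50, 100):
    current = (SUPPLY_V - v_heater) / R_IRON   # series current through iron, amps
    total_w = SUPPLY_V * current               # total power drawn from the mains
    print(f"{v_heater:3d} V heaters -> {total_w:.0f} W total")
```

This reproduces the 712/590/437 W figures quoted (within a few watts). It also shows the disagreement with the previous post: in this arrangement the heater current would be nearer 2-3 A than the usual 300 mA of a real dropper chain.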
 
Equipment may use less power now, but surely there are more of us using it?
Our population is rising, so there are more people using electricity.
 