Commando socket installation.

I can't see how. It's really the other way around - the power consumption results in the heat, not the converse.
Is the cooling apparatus sensitive to the CPU temperature? If so, less energy will be required if the cooling medium (air or liquid) is cooler. Maybe cooling the coolant, or avoiding overheating it, could reduce the cost of cooling.

BTW, lots of farms have chunky power supplies to redundant or seasonally used buildings. Maybe the OP could do a deal with one if the infrastructure supplier turns him down.
 
Is the cooling apparatus sensitive to the CPU temperature? If so, less energy will be required if the cooling medium (air or liquid) is cooler. Maybe cooling the coolant, or avoiding overheating it, could reduce the cost of cooling.
Are you talking about the energy required to run the "cooling apparatus" (which will be trivial in terms of the big picture)? The actual (IT) equipment will draw as much current as it needs to work, which will result in a certain amount of heat being generated, regardless of what cooling is done.

Kind Regards, John
 
Are you talking about the energy required to run the "cooling apparatus" (which will be trivial in terms of the big picture)?
Not if it is trivial.

The actual (IT) equipment will draw as much current as it needs to work, which will result in a certain amount of heat being generated, regardless of what cooling is done.
So there is no thermostatic control of the CPU temperature?
 
Out of interest, I started logging various data a long time ago at work. Amongst this data was the load on the UPS, and various temperatures. I found there was a distinct correlation between the inlet temperatures of the servers and the load. Not a massive change, but I estimated that on a 7kW base load (give or take a bit), I could see something like a 400W variation with temperature.
I did consider that it might be variations in load due to variations in workload on the servers - but I could definitely see a difference between hot and cold days.
Mind you, with a bit of experience you could hear the temperature in the server room, with all the servers ramping their fans up when hot. So while each fan was only a watt or two, when you add a few hundred of them together it does make a difference - both to the power consumed and the noise made.
 
It depends - 800 or 1250 - however you can overclock them, and that in turn consumes more power!
I'm thinking of getting six of the 800 and six of the 1250, but I don't want loads of sockets - that's why I thought of the Commando idea, so I can run some PDU strips.

So that's 10kW+ of heat that is going to be dissipated continuously into your garage (just about ALL the energy used ends up as heat in the room - even the sound and light energy eventually hits the building walls and converts to heat).

" I have several windows , a few fans plus the aeration is great due to a metal roof and many gaps!"

isn't going to work. You need to be able to remove that 10kW of low-grade heat whilst keeping the servers at a reasonable operating temperature, and you need to be able to do it in the height of summer, when the ambient temperature could be 35°C. That means you need air conditioning, and not just a little domestic unit. That will itself need a few kW to run. That metal roof with the sun beating down on it won't help then, either.
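
To put rough numbers on that - a minimal sketch in Python, assuming the "800" and "1250" above are watts drawn per unit (the thread doesn't spell out the units):

[code]
# Rough heat-load sketch. Assumes "800" and "1250" are watts per miner,
# which the thread doesn't confirm - treat them as placeholders.
miners = {800: 6, 1250: 6}  # watts per unit -> quantity

total_w = sum(watts * qty for watts, qty in miners.items())
print(f"Continuous load: {total_w} W = {total_w / 1000:.1f} kW")
# Continuous load: 12300 W = 12.3 kW - and virtually all of it ends up
# as heat in the garage, before any overclocking.
[/code]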
 
And expect a knock from the local police after their helicopter has spotted a big heat signature on the roof and a visual identifies some extraction. :)
Also expect a visit from the local scallies, who will want those high-spec computers; you'd need them behind very strong walls, doors and roofing, as they will rip anything off to get to them.
 
So there is no thermostatic control of the CPU temperature?
Yes, there is, in the sense that the cooling fans only switch on when needed to restrict the CPU temperature to a safe level.

Doing as you suggested and reducing the temperature of the 'input' air (or other coolant) being drawn through the equipment would therefore reduce the amount of time that the fans were on. However, I would still say that, in terms of the big picture, the amount of energy used by the fans would be fairly trivial, and I also imagine that any such benefit would be more than cancelled by the energy required to cool the ingoing coolant.

What cannot be changed (for a given piece of equipment) is the amount of heat generated by the CPU (and other things) when the computer is doing a given amount of 'work'.

As I said before, the most energy-efficient approach would be to recover as much of the heat as possible and put it to 'good use'. In winter, the most obvious use would be space heating, but water heating could be done throughout the year. That way, one would save at least some of the energy/cost that would otherwise have been needed for space/water heating/whatever.

Kind Regards, John
 
However, I would still say that, in terms of the big picture, the amount of energy used by the fans would be fairly trivial
As I wrote above, my experience suggests that while small, it's not trivial. In our case, with around 7kW total load, there could be in the order of 300-400W difference between a cold day and hot day - I estimate that we were only shifting about half the air we needed to, and we also suffered from external factors allowing the hot extracted air to pool outside in the vicinity of the intake on calm days.
And of course, when the temperature went up, the server fans ramped up, so the airflow through them went up - and so a higher percentage of the hot exhaust air would find its way round to the front of the racks, further increasing the intake temperature and then the server fan speeds.

, and I also imagine that any such benefit would be more than cancelled by the energy required to cool the ingoing coolant.
Even with passive (outside air) cooling that can be considerable. Going from memory, we had 315mm radial flow inline duct fans taking several amps each. Originally we used to vary the speed, so in very cold weather we'd dial them back a bit - but by the time we got to 7kW they ran full speed almost all year round.

The specific heat capacity of dry air (at standard density) is 1.01 kJ/kg·K, or about 1.23 kJ/m³·K (can't recall where I got those figures from). So if you want to keep the delta-T down to (say) 10°C, you need 10k/(1.23k × 10) m³/s of airflow - call it 0.8 m³/s. To put that in perspective, if it all has to come through a 0.1m² duct, the air velocity is over 8m/s, or about 18 mile/hour :cool:
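
For anyone who wants to play with the numbers, here's the same arithmetic as a minimal Python sketch:

[code]
# Airflow needed to carry a heat load away on outside air.
# Uses the volumetric heat capacity of dry air quoted above, ~1.23 kJ/(m^3.K).
AIR_HEAT_CAPACITY = 1.23  # kJ/(m^3.K)

def airflow_m3_per_s(heat_kw: float, delta_t_k: float) -> float:
    """Volume flow (m^3/s) needed to remove heat_kw at a delta_t_k temperature rise."""
    return heat_kw / (AIR_HEAT_CAPACITY * delta_t_k)

flow = airflow_m3_per_s(heat_kw=10.0, delta_t_k=10.0)
print(f"{flow:.2f} m^3/s")                      # ~0.81 m^3/s
print(f"{flow / 0.1:.1f} m/s through 0.1 m^2")  # ~8.1 m/s, about 18 mph
[/code]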

You also have to bear in mind that with passive air cooling, you bring in all the "dirt" that's in the air. You can filter the larger stuff out (at the cost of filters, and of the restriction to airflow and hence the power needed to shift the air), but realistically you cannot filter out the very fine dust. Your servers WILL soon end up with a nice coating of fine "dust" in them.

An "air source heat pump" could recover the heat (in ASHP terms, it's actually relatively high grade heat) - but think in terms of around 3 to 3.5 kW of power needed to run the heat pumps, and then what could the average house do with 10-13kW of heat in summer (which is when the servers are hardest to cool) :whistle:
 
As I wrote above, my experience suggests that while small, it's not trivial. In our case, with around 7kW total load, there could be in the order of 300-400W difference between a cold day and hot day ...
Yes, I read that, and was rather surprised. Did you really have 300-400W worth of fans?

In any event, I think my other point remains. In what you describe above, you got the cooler air (on cooler days) for nothing, since it was merely the consequence of the weather! If you had had to explicitly cool down air to the same extent, I imagine that the energy required to do that would have taken a very large chunk out of that 300-400W saving, if not turning the 'saving' into the converse!

Kind Regards, John
 
I've read, perhaps somewhat anecdotally, about people who do heat their houses in the winter with bitcoin miners and offset the electric cost on the heating savings.

Not sure what they do in the summer months though...
They probably lose money.

You cannot - simply cannot - turn a profit doing bitcoin mining without leading-edge, purpose-designed hardware. The value of bitcoins is high right now, that is true, which does alleviate the problem, but as they get harder and harder to find, the cost of finding them goes up and up, and it is very easy to burn more in electricity than you make from mining.

I think (not sure) I've read that you struggle now using GPUs, and that it's bitcoin ASICs or nothing.

gauldicus - what are you mining for, and with what?
 
Out of interest, I started logging various data a long time ago at work. Amongst this data was the load on the UPS, and various temperatures. I found there was a distinct correlation between the inlet temperatures of the servers and the load. Not a massive change, but I estimated that on a 7kW base load (give or take a bit), I could see something like a 400W variation with temperature.
I did consider that it might be variations in load due to variations in workload on the servers - but I could definitely see a difference between hot and cold days.
Did you have servers where the clock speed and/or the VM loads on them were variable, and controlled by temperature? Could the variation not have been because with a lower inlet temperature the CPUs could be driven harder?
 
OK - it's not a subject I know much about - I didn't know how long ago GPUs fell out of favour.
 
I see that water-cooled bitcoin ASICs are now available. The OP will need an endless supply of people who want a nice hot bath!

Come to think of it, my first computer was water-cooled.
 
Yes, I read that, and was rather surprised. Did you really have 300-400W worth of fans?
Hard to say - I've never really looked at the power consumption of a single fan, but I assume it's not a lot. But when a single 1U server can have as many as a dozen little fans in it, the numbers soon add up.

Hmm, just had a quick look to see, here's a small fan that takes over 6W at full power which is "a bit more" than I was expecting. Put half a dozen of those in a small box, and you've got a not insignificant additional load that will vary with intake temperature.
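
Putting rough numbers on that - a sketch in which the server count is purely a guess, not a figure from this thread:

[code]
# How small fan loads add up across a room full of servers.
# 6 W/fan and 12 fans per 1U server are the figures guessed above;
# the server count is purely illustrative.
WATTS_PER_FAN = 6
FANS_PER_SERVER = 12
SERVERS = 25  # hypothetical

full_speed_w = WATTS_PER_FAN * FANS_PER_SERVER * SERVERS
print(f"All fans flat out: {full_speed_w} W")  # 1800 W
# Even a modest speed swing with intake temperature therefore moves
# hundreds of watts - in line with the 300-400 W variation seen on a
# 7 kW base load.
[/code]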

In any event, I think my other point remains. In what you describe above, you got the cooler air (on cooler days) for nothing, since it was merely the consequence of the weather! If you had had to explicitly cool down air to the same extent, I imagine that the energy required to do that would have taken a very large chunk out of that 300-400W saving, if not turning the 'saving' into the converse!
Indeed. Typical COP for cooling plant is around 3:1 (ish) - i.e. one unit of electricity in for three units of heat removed. When heating, you also get the unit of energy you put in as heat out, so for heating uses you can see up to around 4:1. It depends a bit on the engineering in the plant, but mostly it's determined by the properties of the refrigerant and the operating conditions. This is something I've actually done - got the manual for an aircon system and looked at the tables of performance vs indoor/outdoor conditions (IIRC, at the time I was arguing with an idiot technician who insisted it was "the wrong sort of room" rather than fixing an obvious fault).

Assuming more or less constant "indoor" conditions (which you mostly can for a server room cooling application), as you raise the outside air temperature, you increase the head pressure the compressor has to work against (the boiling point vs pressure curve for the refrigerant), and increase the work the compressor needs to do. The refrigerant flow rate and compressor suction pressure will remain the same, being fixed by the heat input power (cf. latent heat of evaporation × flow rate) and the indoor temperature (again, the boiling point vs pressure curve) respectively. At some point the system will hit a limit - either the mechanical design of the compressor will limit the pressure, or it will have to shut down to avoid overloading the drive motor and mechanical components.
As a secondary thing, there will most likely be an increase in power to the condenser coil cooling fans as ambient temperature rises.

Conversely, if you lower the indoor temperature, you will reduce the COP (lower temperature -> lower suction pressure -> higher pressure differential across the compressor) and increase the running costs for a given heat load. So running the server room at "put on a thick coat before entering" temperatures adds to operating costs.
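
The same trend shows up in the idealised (Carnot) limit - only a theoretical upper bound, and the coil temperatures below are illustrative guesses, but it makes the point:

[code]
# Idealised (Carnot) cooling COP: T_evap / (T_cond - T_evap), in kelvin.
# A hotter outdoor coil means a bigger temperature lift and a lower COP.
# The coil temperatures are illustrative assumptions, not from the thread.
def carnot_cooling_cop(evap_c: float, cond_c: float) -> float:
    return (evap_c + 273.15) / (cond_c - evap_c)

EVAP_C = 7  # evaporator coil, roughly fixed by the indoor setpoint
for outdoor_c in (20, 30, 40):
    cond_c = outdoor_c + 15  # assume the condenser runs ~15 K above ambient
    print(f"outdoor {outdoor_c}C -> Carnot COP {carnot_cooling_cop(EVAP_C, cond_c):.1f}")
# 20C -> 10.0, 30C -> 7.4, 40C -> 5.8. Real plant achieves only a
# fraction of the ideal figure, hence the ~3:1 rule of thumb above.
[/code]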

Did you have servers where the clock speed and/or the VM loads on them were variable, and controlled by temperature? Could the variation not have been because with a lower inlet temperature the CPUs could be driven harder?
Well, the actual computing load wasn't weather dependent - it showed a regular daily variation that roughly followed business hours. These weren't "churn away at a dataset and finish it as soon as performance allows" loads; they were all "process something when a user does something" loads (web-based business applications, mostly).
It wasn't easy to spot, as you have to compare graphs from different days and estimate the difference. I'm sure someone with both the skills and the inclination could have done a proper analysis to separate the temperature correlation from the normal daily cycle - I had neither. But it did look to be in the order of 300-400W difference - the UPS only reported load to a resolution of 100W. And a roughly 5% variation in power consumption doesn't seem unreasonable when you consider how much air needs to be shifted through some of these compact boxes when they're fed with high-temperature inlet air.
 
