[CentOS] OT: hardware question

m.roth at 5-cent.us
Tue Sep 16 21:28:38 UTC 2014


Warren Young wrote:
> On 9/16/2014 14:39, m.roth at 5-cent.us wrote:
>> Warren Young wrote:
>>> On 9/16/2014 13:29, m.roth at 5-cent.us wrote:
>>>>
>>>>      Opinions on which slot to use?
>>>
>>> My opinion is that you should read "Hot Air Rises and Heat Sinks:
>>> Everything You Know About Cooling Electronics Is Wrong" by Tony
>>> Kordyban.  It is quite readable, for all that it is a serious EE book.
>>
>> My degree's in CIS, not EE, so I never got into that. Should I assume
>> that you are an EE?
>
> No, I just play one on the Internet.
>
>> If so, you could give me an opinion....
>
> I'm trying to tell you that a true EE would not give you an "opinion."

Fine, but the admins here might have practical experience on which to base
an opinion....
<snip>
> - What is the operating environment temperature now?
>
> - What will the temp be on the day when the site power goes down,
> cutting the aircon, while the server room keeps running on UPS, and the
> electronic door locks stay locked because *that* UPS is separate and
> someone forgot to check the battery, so it fell over as soon as the wall
> outlet flatlined?

Wrong scenario: if the site power goes down, most of the servers in the
room will very shortly shut themselves down, thanks to firmware protection
against overheating.

And if it's not during regular business hours, my manager will be notified
and will come in, and he's got this odd thing called a "key" to get into
the room and shut things down.

Assuming that the giant UPS next door doesn't kick in.
>
> - How many fans do you have in the case?

Don't remember what it comes with. It's a standard Dell server.
>
> - Is it better if you add another, or worse?

It's a rack-mount server, not someone's tower workstation. There's no
place for more fans.
<snip>
> Another war story, from personal experience: a workstation from a
> big-name manufacturer which ran wonderfully with the case closed up, but
> when you let all that cool outside air in through the side panel, it
> went into thermal lock-up because you'd disrupted the carefully-designed
> airflow channels.

Right. I had the ignorant vendor of the video cameras and card that I
bought last year ask me if I could run the server with the case open....
They really had *no* idea what an actual server was.
>
> The correct answer isn't always intuitively correct.
>
>>> Or, you can do the experiment yourself.  You have the equipment, you
>>> have the environment, and you have lm_sensors.  Try it both ways, and
>>
>> No, I cannot "do the experiment". I've got to get these racked and up
>> and running, for my users to use. They're not my toys....
>
> You can't spend two hours to run them under load, one hour in each
> configuration?  These servers have to be production ready two minutes
> after first power-on?

I could... if I wanted the extra work. As it is, I may get that experiment
anyway: I realized that the right-hand slot is *only* for a short adapter.
Now, there are two servers and two RAID boxes, and one of the RAID boxes
came with an adapter to fit the back of a short slot... and for the life
of me, I can't find one for the other....
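
If anyone does want to run the comparison Warren describes, here is a
minimal sketch of the logging side, assuming the `sensors` command from
the lm_sensors package is on the PATH; the log path, interval, and sample
count below are arbitrary placeholders, not anything from this thread:

#!/usr/bin/env python3
# Sample temperatures via lm_sensors' `sensors` command at a fixed
# interval while the box runs under load, appending each reading to a
# log so two slot configurations can be compared afterwards.
import subprocess
import time

LOG = "/tmp/slot-test.log"   # hypothetical log location
INTERVAL = 60                # seconds between samples
SAMPLES = 60                 # one hour per configuration

with open(LOG, "a") as log:
    for _ in range(SAMPLES):
        stamp = time.strftime("%Y-%m-%d %H:%M:%S")
        # `sensors` prints all detected chip readings as plain text
        out = subprocess.run(["sensors"], capture_output=True, text=True)
        log.write("--- %s ---\n%s\n" % (stamp, out.stdout))
        log.flush()
        time.sleep(INTERVAL)

Run it once per slot configuration under the same load, then compare the
hottest readings from the two logs.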

       mark



