Stephen Harris wrote:
> On Tue, May 29, 2007 at 02:55:02PM -0400, Jim Perrin wrote:
>> On 5/29/07, Mark Hull-Richter <mhullrich at gmail.com> wrote:
>>
>>> Getting a little OT here, but how does one match up the VA ratings of
>>> the UPSs (and battery backups) with the power supply wattage ratings?
>>
>> Watts = Volts x Amps. I usually use a sliding scale based on # of
>> machines.
>
> Which is correct for DC circuits, but not for AC. A 1500VA UPS does not
> provide 1500 watts of power. And given that the output of a UPS isn't
> necessarily a perfect sine wave, it gets more complicated.
>
> A typical APC 1500VA UPS provides around 860-900 watts of power.
>
> The answer to Mark's question is "check the manufacturer's web site" :-)
>

Wow! What a smart bunch you all are. I was about to reply to the initial
post and explain VA vs. watts, but I'm way too slow. (Well, to be fair, it
takes longer for the bits and bytes to get here (down under).)

Seriously though... I worked for a company which moved buildings, didn't
spec the power right, and ended up in a situation where the servers would
run fine from the UPSes, with the UPSes consuming about 70% of the
available power (I think 42A from the 60A outlet, or something like that),
*BUT* when a power cut happened and the UPSes took over, they needed to
recharge when the power came back. 5 UPSes x ~15A charging current =
immediately tripping the fuse, and starting a cycle that can only be
broken by turning off half the UPSes until the other half have charged.
My numbers may not be exact, but this is the gist of it.

The solution they should have used: spec your UPSes based on the load from
the servers (don't even *tell* the power/building engineers how much power
the servers consume - they *don't* need to know, as the servers will
*never* draw power from the grid). Then tell the engineers (or work it out
yourself) what the power requirements of your *UPSes* are when they are
CHARGING FLAT OUT. This will always be far higher than any normal running
situation.

Regards,
MrKiwi
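
P.S. For anyone who wants to do the arithmetic themselves, here is a rough
back-of-the-envelope sketch in Python of the two calculations discussed
above. The power factor, charging current, and breaker rating are made-up
illustrative numbers, NOT real specs - pull the actual figures from your
UPS manufacturer's data sheet:

    # Back-of-the-envelope UPS sizing sketch.
    # All numbers are illustrative assumptions, not specs.

    def real_power_watts(va_rating, power_factor=0.6):
        """Real (usable) power of a UPS: watts = VA x power factor.
        Plain W = V x A only holds for DC; for AC you multiply by the
        power factor (assumed ~0.6 here, typical of consumer UPSes)."""
        return va_rating * power_factor

    def breaker_ok(n_upses, charge_amps_each, breaker_amps):
        """True if the circuit can carry every UPS recharging flat out
        at once - the worst case right after an outage ends."""
        return n_upses * charge_amps_each <= breaker_amps

    # A 1500 VA unit at an assumed 0.6 power factor: ~900 W usable,
    # consistent with the 860-900 W figure quoted above.
    print(real_power_watts(1500))    # -> 900.0

    # The scenario from the story: five UPSes drawing ~15 A each while
    # charging flat out, all on one 60 A circuit.
    print(breaker_ok(5, 15, 60))     # -> False: 75 A trips a 60 A fuse

In other words: spec the circuit for the UPSes' worst-case charging draw,
not for the servers' running draw.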