Disclaimer: I am not a licensed electrician. But I can install new circuits and I have a healthy respect for electrical current. Plus, the math is pretty easy to do.

15 amps at 120 volts = 120 × 15 = 1800 watts (the breaker's full rating).

Of which, only 80% should be in use for lengthy periods of time (the continuous-load rule).

So 1440 watts can safely be used on a 15 amp circuit for extended periods of time, and 1920 watts on a 20 amp. To power all of my machines, I ran 3 extra 20 amp lines (plus the existing 15 amp line). HOWEVER - this is overkill. I would have been fine with 1 extra 20 amp line, and I suspect most of you here will be too unless you wind up with something crazy.
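If you want to sanity-check the numbers for your own panel, here's a minimal Python sketch of the arithmetic above, assuming standard 120 volt US circuits and the 80% continuous-load rule:

```python
# A sketch of the math above, assuming standard 120 V US circuits and
# the 80% continuous-load rule. Adjust volts for your region.
def safe_continuous_watts(breaker_amps, volts=120, continuous_factor=0.8):
    """Max watts you should pull from a circuit for extended periods."""
    return breaker_amps * volts * continuous_factor

print(safe_continuous_watts(15))  # 1440.0 - a 15 amp circuit
print(safe_continuous_watts(20))  # 1920.0 - a 20 amp circuit
```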

Keep in mind that a 1000 watt power supply WILL NOT continually pull 1000 watts. I have one powering a quad core with 3x 8800GTXes, and at IDLE it pulls 400 watts from the wall. Also note that power supplies waste roughly 20% of the power they draw as heat (none can reach 100% efficiency), so the draw at the wall is higher than what the components actually use.
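To get a rough feel for that overhead, here's a small sketch that estimates wall draw from the DC load, assuming a ballpark 80% efficiency; the 320 watt load below is just a made-up example, not a measurement:

```python
# Rough sketch: estimate draw at the wall from the DC load your parts use,
# assuming a ballpark 80% PSU efficiency (real efficiency varies by load).
def wall_draw_watts(dc_load_watts, efficiency=0.80):
    """Watts pulled from the wall to deliver dc_load_watts to the parts."""
    return dc_load_watts / efficiency

# Hypothetical example: components that actually use 320 W of DC power
print(round(wall_draw_watts(320)))  # 400 - watts at the wall
```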

I have spoken a bit about this in my FAQ - take a look there. There are also cheap devices like the Kill-A-Watt that can read power consumption at the wall. Buy one and measure. If you do go the extension cord route (I don't suggest it, but it is possible), make SURE you get one PROPERLY rated, or it could overheat and cause a fire. Meaning: DO NOT run a 100' 20 gauge cord on a circuit with a 20 amp breaker - thin wire over a long run has enough resistance to get dangerously hot.
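As a rough guide to matching cord gauge to breaker rating, here's a sketch with ballpark ampacities for common copper cord gauges - these are typical figures, not code values, so always check the rating printed on the cord itself:

```python
# Ballpark ampacities for common copper extension-cord gauges (AWG -> amps).
# These are typical figures only - always check the rating printed on the
# cord itself, and derate for very long runs.
CORD_AMPACITY = {16: 13, 14: 15, 12: 20, 10: 30}

def cord_is_adequate(awg, breaker_amps):
    """True if the cord gauge can carry the breaker's full rating."""
    return CORD_AMPACITY.get(awg, 0) >= breaker_amps

print(cord_is_adequate(12, 20))  # True: 12 AWG handles a 20 amp breaker
print(cord_is_adequate(20, 20))  # False: 20 AWG isn't even in the table
```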

If in doubt - consult a qualified electrician, please.