When assessing a computer that needs a second video card, how should I determine the power requirements?
Specifically, if both video cards recommend a 400W minimum, does that mean I need at least an 800W power supply?
I have two computers with an AGP FX5200, and I was looking to add a PCI GeForce 8400GS to each. From what I have read, both cards recommend a minimum power supply of 300W, but on a different page I saw a 400W minimum recommended for the 8400GS. I don't know which figure to believe or what power supply to upgrade to... the machines have stock low-wattage power supplies (280W or 305W, I can't remember off the top of my head).
I would try it first, and if the system is unstable, then worry about it. I don't think low-end cards like that require an auxiliary power connection, so the PSU that came with the system should be adequate... assuming there isn't a bunch of other cards in the system.
If you want to be sure the computer has enough power, consider upgrading the PSU along with the graphics card. You should be able to get a decent 500W or so PSU for under $100. I personally prefer to stick to Corsair or PC Power & Cooling... you'll pay a premium, but the power they supply is cleaner than what you get from a really cheap unit (it and your parts will last longer too). Thermaltake, Antec, and Enermax are not bad choices either. Look for a unit with a single 12V rail, and if the computer is a Dell, you'll need to check whether the PSU is standard ATX or a proprietary design (they were doing that a while back, and I'm not sure whether they still are).
Yes, I will be upgrading the PSU along with the video card. But I do want to make sure I was clear in my original question: I will be running two video cards, not replacing the old one with a new one. That being said, do you still think 500W is sufficient to run dual video cards? This power stuff is messing with my mind (or what's left of it)!
If you type [psu calculator] into a search engine, you'll find tools that estimate the power requirements of a given system configuration.
I would take the resulting wattage figure and add 20% as a safety factor and as a power buffer for future expansion.
Edit to add:
I'd also try several of the tools to check whether they're at least somewhat consistent in the final figure, and (being my very technically conservative self) probably take the highest "reasonable" value as the resulting wattage number.
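To make that arithmetic concrete, here's a minimal sketch of the "take the highest reasonable estimate and add 20%" approach. The wattage figures are placeholder examples, not results from any actual calculator:

```python
# Hypothetical estimates (in watts) from several online PSU calculators.
# Replace these with the numbers the tools actually report for your system.
estimates = [310, 340, 325]

# Take the highest "reasonable" figure...
baseline = max(estimates)

# ...then add a 20% safety/expansion buffer.
recommended = baseline * 1.20

print(f"Highest estimate: {baseline}W")
print(f"Recommended PSU:  {recommended:.0f}W")
```

So with estimates around 310–340W, you'd be shopping for roughly a 400W+ unit, which is consistent with the 500W suggestion above.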
I know there are hacks out there that let people do it, but in my experience ATI and NVIDIA drivers do not play nicely together. If you're going to run two cards, save yourself the headaches and run two from the same GPU vendor.
I also question the purpose of running such a setup... get a single, more powerful GPU and be done with it. As for running two low-end GPUs on a [good] 500W PSU... that shouldn't be an issue.
I have four computers in an office environment that already run dual monitors. They each require a third, and potentially a fourth, monitor. Three of the four have existing dual-head NVIDIA GeForce cards; the other has an ATI Radeon. Since I only have PCI slots available for expansion, I was looking to add an additional card to meet the extra monitor requirement. (This is not a graphically intense environment, just stock business apps and browsers.)
In today's economy, I am of course expected to do this on a minimal budget. That is why I was looking at cheaper cards and power supply upgrades.
But that leads to another question: do you see a better way to accomplish the mission without spending a pile of money? If anyone has a better idea, I am certainly open to options.
And thanks to everyone for your input thus far... I appreciate the help.
If your video card doesn't have enough heads to drive the number of monitors you need, you're doing the right thing by buying more video cards... that's the only way I know of to accomplish what you want to do.