GPU card update/upgrade H3LL !           
This is what GAMERS do (and others too), and many do it wrong; this page covers all the ways and cures, with a fast short list at the end.

You are here because you have replaced or added a new GPU card, or plan to (dead GPU, dead PC, or some features dead), or just want to do it once, the correct way (yes).
GPU means Graphics Processing Unit (a chip, a card, or hidden deep inside your Core(tm) processor: Intel HD GPU "inside").
No bitcoin mining here, go away please.
There are 2 issues here: one is falling short on power (watts), and the second is PCI-Express communication limits.

I will now teach you how to do a power budget for your new card and decide if you need a new PSU.
The faster the PC, GPU, or CPU (all 3), the more current it uses, and it does burn power. (Some PCs can burn over 500 watts at full load: a gaming PC or a powerful video-rendering workstation.)
All that power ends up as HEAT expelled from the rear of most PCs. (Desktops here, not laptops, as most laptops have no graphics card to upgrade.)
However, most PCs are limited to a 95-watt CPU and a 75-watt GPU, or both way less, using modern small-feature-size chip manufacturing: 14 nm, with Intel now heading fast toward 5 nm (five nanometers!).

There are 2 ways to do the power budget (max watts): one is to read the specs and add them up, and way #2 is to use an AC ammeter (and measure the PC before adding the new card).

The PC has many power-eating devices; in electronics this is wattage, also known as power = V times A. Volts times amps is power, also called VA power.
The PSU must handle all that power or the PC will crash.
The CPU and GPU chips all use lots of power (less today), but many high-performance PCs use a 95-watt CPU and a GPU anywhere from 15 watts to 300 watts, for just 1 card (the PCI-Express x16 video card (GPU)).
The total power is the SUM of all loads in the PC; the mobo (slang for motherboard) alone can be 25 watts with no CPU and no RAM. The chipset and other chips on the board use power, even the DDR RAM.
Let me show how I did this the safe and easy way (not guessing), but you can put in a 700-watt PSU and in most cases that will work (no magic, just guess high). No shame in guessing, so do it.

The modern PSU is about 90% efficient, so at a 200-watt load roughly 10% of the input power is lost as heat inside the PSU. (Whatever PSU you have, the maker has specs on that to read, plus rail limits and efficiency curves.)
So a PSU delivering 200 watts at full load sucks about 222 watts from the WALL AC jack (200 / 0.90). OK, is this clear and simple? I do hope so...
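If you like, here is that same wall-draw arithmetic as a tiny Python sketch (my own illustration, not from any PSU datasheet); plug in your own load and the efficiency number from your PSU's spec sheet.

    # Wall-draw estimate: a PSU delivering dc_load_w at a given efficiency
    # pulls dc_load_w / efficiency from the AC jack. Example values are the
    # 200 W / 90% case discussed above.
    def wall_watts(dc_load_w, efficiency=0.90):
        return dc_load_w / efficiency

    print(round(wall_watts(200)))   # about 222 W from the wall for a 200 W DC load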
Many cheap PCs have a stock PSU (meaning the OEM original part): a 200-, 230-, or 290-watt unit.
The 200-watt version is a gutless wonder and is the first thing to change out (in almost every case).
They use this PSU for one simple reason: it is cheap. Next is myth busting.
A 500-watt PSU does not use 500 watts if the load is 200 watts, so end that lie and the green-shaming lies now.
See my GTX power hogs page from hell here.
If you do not want to buy a new PSU, avoid all the GTX cards on the page linked above; that is the whole purpose of that page.

DOING a POWER BUDGET NOW:
The steps I take are easy and really simple, and I believe in measuring things, not taking other folks' word or using a spec sheet for one of the 10 versions of the card that is totally different from yours (a fact unknown to you).

I use a simple amp meter (AC amperes) on the AC line, as seen below.
I take 2 readings, one at IDLE and one with the CPU/GPU under full load using free benchmark utilities (PASSMARK and FURMARK).
With those 2 free programs I can load up the CPU and the GPU chips easily and see the power used under all conditions.

Here are some simple tests and simple math (even grade-school addition works here, OK?)

I connect my ammeter and it shows 0.2 amps on the hot line (using the GPU inside my Intel i5-4xxx gen-4 processor, rated at 84 watts); the max was 0.3 amps during extensive load testing. <<< all before modifications.
0.2 A x 120 V = 24 watts; my PC uses only 24 watts total doing nothing at all, just Windows 10 64-bit at idle, before any modifications.
I then run the PASSMARK CPU Mark test and see 100 watts (all metered), again with the PC unmodified.
Try to know the 10 or so processor options that work in your PC; they all use different power levels in WATTS.
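For the record, the math behind those meter readings, as a small Python sketch (assuming 120 VAC mains; use 230 if that is your line voltage):

    # Clamp-meter amps to watts: P = V x A. The readings are the ones above,
    # taken on my unmodified PC with the onboard Intel GPU.
    LINE_VOLTS = 120

    def watts(amps):
        return amps * LINE_VOLTS

    print(watts(0.2))   # 24 W - Windows 10 idle desktop
    print(watts(0.3))   # 36 W - worst idle reading seen
    # PASSMARK CPU load pushed the metered draw to roughly 100 W (about 0.83 A).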
NOW THE MODS:
Other things to consider that add wattage to the budget:
30 watts per HDD (times the number of drives added); I have 3, so 90 watts for my HDD set (to add).
The SSD uses only 2 watts, so it can be skipped.
To take this next step you may have to upgrade the PSU first; if yours is weak or marginal, below a 300-watt PSU, do that now.
I next add the new GTX650 card (rated 125 W TDP) to top slot 1, x16 lanes, with the AUX power jack wired directly to the PSU ATX12V cables.
My GTX650 draws 0.4 amps at the Windows 10 64-bit idle desktop, and that is 48 watts; then I load it up with FURMARK and the current quickly jumps to 1.6 amps (192 watts at full tilt).
If you upgraded to a high-powered USB-C PCIe card to fast-charge a cell phone, add 60 to 100 watts more (the wiki shows this).
As you can see, the power is about 1/2 PC and 1/2 GTX.
These facts and measurements tell me I need a 300-watt PSU minimum. (So I popped in a 500-watt; I keep spare PSUs on a closet shelf, just in case one blows up.)
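Here is my budget tallied in one place, a minimal sketch using the measured and spec numbers quoted above (I am treating the 1.6 A FURMARK reading as whole-PC wall draw); swap in your own meter readings and drive count.

    # Worst-case power budget: metered whole-PC draw under FURMARK with the
    # GTX650 installed, plus an allowance for the 3 HDDs I plan to add.
    LINE_VOLTS = 120

    furmark_amps  = 1.6                        # measured, GTX650 flat out
    worst_case_w  = furmark_amps * LINE_VOLTS  # 192 W
    hdd_allowance = 3 * 30                     # three spinners at 30 W each
    budget_w      = worst_case_w + hdd_allowance

    print(budget_w)   # 282 W -> a 300 W PSU is the bare minimum; I used a 500 W spare.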



If your GPU card has no AUX jacks, then it is not one of the cards listed on my HOGs page linked above, and it uses at most 75 watts from PCI-Express slot #1 (top).
AUX is my name for the ATX12V cable set (v2.3 and up); some HOG cards suck at or near 300 watts all by themselves.
If the 6- or 8-pin AUX is missing, you have at most a 75-watt card (or way less; some are 15 watts only). SEE THE WIKI.

That set of AUX jacks yells: over 75 watts needed, for sure.

The correct PSU plugs are here.


The video card can use any of these but never that EPS 8-pin, OK? ATX12V is mostly for the CPU VRMs and plugs into all modern motherboards (MOBO).


This next drawing shows what many GPU card makers use for power feeds and jacks (it is no law, just guidelines). Keep in mind that the 6-pin is really 3: only 3 pins are 12 VDC.
Each Molex pin is rated at 13 amps, but skip this... (the wire is the real limit here, as are the PSU rail limits per wire set). See how OCP (over-current protection) works, using 1 rail to make 3 (virtual) rails.
Some PSUs have 3 ATX12V jacks and each is 20-amp limited (20 x 3 = 60 amps).
A typical rail uses a very, very small resistor (0.1 ohm) to measure current; this resistor is called a shunt, and the electronics measure the current via the voltage present across the shunt (differentially). At 0.1 ohm it trips at 2 VDC across it.
Some nasty cheap or old PSUs have no OCP at all and can in fact catch fire if shorted (seen it many a time, too many).
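The shunt math above in a few lines of Python, a sketch using the 0.1-ohm / 2-volt figures quoted in the text (real shunts are usually milliohms, but the Ohm's-law arithmetic is identical):

    # OCP trip point: the supervisor trips when the voltage across the shunt
    # reaches the threshold. I = V / R.
    shunt_ohms = 0.1
    trip_volts = 2.0
    print(trip_volts / shunt_ohms)   # 20.0 A rail limit, matching the 20-amp jacks above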
The actual wire itself is rated too: 16 AWG is really good for 9 amps in a 6-wire bundle 18 inches long, and 18 AWG for 7 amps; more than that and it burns up.
(Keep in mind noise is the limit here, not heat: push too much current per wire and the voltage drops (sags), and the noise this causes is too much for many computers.)
Clearly the limit below is 75 watts divided by 2 yellow wires (above the missing pin), which is 37.5 watts per wire; divided by 12 V that is a 3.125-amp limit,
or this:
The 150-watt jack has 3 yellow wires, which computes to 50 W per wire, or 4.2 amps per wire.
The PSU will feed this set of 3 pins with up to 20 amps (the ATX spec max; 240 watts is the rail limit). Look at a 750-watt PSU for a prime example: 60 amps total across 3 jacks, 20 amps each.
No way will 60 amps run down any 1 jack shorted to ground without causing a fire, so each is limited to 20 amps by OCP. The safe limits are fire code, plus the facts of wire gauge and NEMA/NEC rules.
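The per-wire figures above are simple division; here is the same arithmetic as a short sketch, assuming a 12 V rail and the 2-wire / 3-wire counts described above:

    # Amps carried by each live 12 V wire on an AUX connector.
    def amps_per_wire(total_watts, live_wires, volts=12.0):
        return total_watts / live_wires / volts

    print(amps_per_wire(75, 2))              # 3.125 A per wire (75 W over 2 yellow wires)
    print(round(amps_per_wire(150, 3), 2))   # about 4.17 A per wire (150 W over 3 wires)
    # Both sit well under the 7-9 A wire ratings quoted above.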
The cards using 75 W or less have no AUX jack, as seen at the top of the table below. (These comments are just my opinion.)
Most PSU makers do NOT publish full specifications, leaving out vast details, like what protection sits on each rail's pins. (Dang me.)


Next are the PSU jacks seen on all modern PSUs; but forget the floppy plug, and there are no more 4-pin DVD/HDD PATA relic IDE power plugs.
The first 4 plugs below are the prime ones now, per the ATX SPEC.


LAST IS MY METER. (NOTHING TELLS THE TRUTH BETTER THAN A REAL METER: no hype, no errors, no rumors, no wild forum opinions, no lame consensus and the like.)
Amps times volts (120 VAC) is watts (VA); this is all you need, and starting with at least a 500-watt PSU makes doing it all in one go vastly easier.
Then if you must use a lower-watt PSU (why, I haven't one clue), do so, using your factual power budget as your guide. Good luck to you!

The trick here is that only the hot (live) wire can be measured; do not clamp around both wires or the currents will cancel to zero.
This is real current flow. A Kill A Watt meter can do the same thing and report actual watts, a $10 tool.
With old meters the line must be broken (cut) to measure current, but not with this tool's induction clamp, which uses Faraday's law to measure current via its magnetic field.
The clamp actually forms an air-core transformer (the meter then converts that AC output to amps). The best meter is 10 amps full scale if you can find it, or one that can resolve 0.1-amp changes.

Do not bother with COST; it is tiered everywhere in the USA, from 5 cents to 25 cents per kWh, or worse with time-of-day rates (or far worse, swimming-pool rates as seen in CALIF).
All we need is A (amps = amperes), and 1.0 amp is 120 watts (1 A x 120 V = 120 watts, VA). This tool is about $15.




The hit list of all the things you can do wrong in any GPU upgrade. PCI-Express only; forget old PCI, OK? (Years 2003 and up are PCI-Express.)
  • The card will not fit the case, too tall or too long (use a ruler first, before you buy the card; any caveman can do this, ruler in hand, case opened first, no less).
  • Do not attempt to plug the new card into a PCI slot (relic, pre-2003), an AGP slot, or an ISA slot (Jurassic era).
  • The top PCI-Express slot #1 is the correct slot, not any other (your mobo manual tells you that).
  • Not learning that your new GPU draws 300 watts all by itself and overloads the gutless cheap 200-watt PSU, so you get only a black screen. Try 550 watts or more, or buy a GTX1050 that needs only 75 watts.
  • Not turning off the onboard GPU chip first in BIOS; on my Dell it's on the BIOS VIDEO settings page (a 1-second fix). (On some PCs it's marked as BOOT PCIe VIDEO CARD FIRST.)
  • In the same vein, you left cables or, worse, adapters (or dead-head plugs) connected to the old onboard video port (VGA/DVI/HDMI/DP); remove all that and connect only to the new PCIe GPU card.
  • Using UEFI SECURE BOOT mode in BIOS makes all older cards useless, or worse: on Dell PCs the BIOS will block all non-Dell cards that lack the Dell certificate. GAMERS are smart and do not use SECURE BOOT at all.
  • A very old GPU in a new PC may rarely not work (a 2003 card in a 2019+ PC can fail because early PCIe v1.0 cards were not truly x16-spec compliant).
  • The reverse of the above (rare too): a 2003 PC motherboard may not like PCIe version 3 cards, even though the card is backward compatible.
  • The myth that PCIe has 5-volt versions is only that, a myth (this is old-PCI confusion, or the person fails to know that power is not voltage, it's watts).
Factoids #1 (the spec dreams and the facts; most of this is legacy junk, no longer valid):
PCI-Express x16 versions are all backward compatible, so as long as a card is PCI-Express x16 compliant (but power = watts is always suspect in the GPU world of REALITY),
it will work in any slot supporting any PCI-Express version, and it works the other way round too: a v1.1-compliant motherboard with a first-generation PCI-Express x16 slot will run any PCI-Express x16 graphics card.
But sometimes those dreams were broken, for sure, back on 2003 v1 boards (and cards) of any kind.
No lie, some board makers tell you "our board only does v2 and up" (and that means it is NOT x16 compliant!). That is 2003 history now, but some folks do like using 16+ year old mobos.
PCI-SIG stated at v2 this:
"PCIe 2.0 motherboard slots are fully backward compatible with PCIe v1.x cards. PCIe 2.0 cards are also generally backward compatible with PCIe 1.x motherboards, using the available bandwidth of PCI Express 1.1. Overall, graphic cards or motherboards designed for v2.0 will work with the other being v1.1 or v1.0a". (seen here) 2007 facts. and UP.
Last, the power horror of old PCI Express:
"However, the speed is the same as PCI Express 2.0. The increase in power from the slot breaks backward compatibility between PCI Express 2.1 cards and some older motherboards with 1.0/1.0a, but most motherboards with PCI Express 1.1 connectors are provided with a BIOS update by their manufacturers to support backward compatibility of cards with PCIe 2.1." (The old BIOS does not like seeing the new card added; that is a BUG.)
The early boards (mobos) were power-limited back then (not 75 watts like now); the limit can be BIOS plug-and-play limits (bugs) or copper limits, weak traces on the power pins, as seen below.
Here is a Dell SFF 960 slot with the wrong PCIe limit, circa 2008: it is off spec, limited to 35 watts, so any card spec'd at 75 watts can crash because power to the slot is limited (weak PCB traces for power!).
Optiplex 960 SFF; next time buy the MT, with a far better mobo.
This will fail only if you are gaming, not while reading email or using Notepad.
Learn to read the spec on your mobo first; dig in and read, then learn if it really is PCI-Express x16 compliant, as the above is NOT.
That 35 watts only matters for GPU cards, as nearly every other PCIe card on Earth never goes that high. (See the Nvidia GPU card wiki; it is clear as day which cards stay under 35 watts TDP, for sure.)
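If you suspect your slot is power-starved like that SFF Dell, the check is a one-line comparison. A sketch only; the 35 W figure is the one quoted above, and a card's slot draw comes from its spec sheet or the Nvidia wiki list:

    # Will a slot-powered card stay inside what the motherboard can really deliver?
    def slot_ok(card_slot_draw_w, slot_limit_w=75):
        return card_slot_draw_w <= slot_limit_w

    print(slot_ok(75, 35))   # False - a full 75 W slot-powered card can crash this slot under load
    print(slot_ok(30, 35))   # True  - a low-TDP card stays inside the limit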

The wiki shows:
Power
8-pin (left) and 6-pin (right) power connectors used on PCI Express cards

All sizes of x4 and x8 PCI Express cards are allowed a maximum power consumption of 25 W.
All x1 cards are initially 10 W; full-height cards may configure themselves as 'high-power' to reach 25 W, while half-height x1 cards are fixed at 10 W.
All sizes of x16 cards are initially 25 W; like x1 cards, half-height cards are limited to this number while full-height cards may increase their power after configuration.
They can use up to 75 W (3.3 V × 3 A + 12 V × 5.5 A), though the specification demands that the higher-power configuration be used for graphics cards only, while cards of other purposes are to remain at 25 W.[12][13]

Optional connectors add 75 W (6-pin) or 150 W (8-pin) of power, for up to 300 W total (2 × 75 W + 150 W).
Some cards use two 8-pin AUX connectors, but this has not been standardized yet, therefore such cards must not carry the official PCI Express logo!
This configuration would allow 375 W total (1 × 75 W + 2 × 150 W) and will likely be standardized by PCI-SIG with the PCI Express 4.0 standard.
The 8-pin PCI Express connector could be mistaken for the EPS12V connector (if you do that, BOOM, it blows up), which is mainly used for powering SMP and multi-core systems and servers.


Then there are PCs and mobos that are not truly compliant with this. (Power compliance and data-communication compliance are part of the same spec; the power limits can be wrong, or ignored by the maker.)
  • PCI Express x16 Graphics 150 W-ATX Specification—Published in October 2004, this standard defines a six-pin (2x3) auxiliary power connector capable of delivering an additional 75 W to a graphics card directly from the power supply, for a total of 150 W to the card.
  • PCI Express 225 W/300 W High Power Card Electromechanical Specification—Published in March 2008, this standard defines an eight-pin (2x4) auxiliary power connector capable of supplying an additional 150 W of power, for a total of either 225 watts (75+150) or 300 watts (75+150+75) of available power.
The spec is here, but you need GOD status to read it; note the new specs, 2008.
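To put the spec numbers above in one place: a card's specified power ceiling is just the slot plus its AUX jacks. A minimal sketch using the 75 W slot, 75 W 6-pin, and 150 W 8-pin figures quoted above:

    # Spec power ceiling for a PCIe graphics card, from its connector mix.
    SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

    def card_ceiling(six_pin=0, eight_pin=0):
        return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

    print(card_ceiling())                        # 75 W  - no AUX jacks at all
    print(card_ceiling(six_pin=1))               # 150 W
    print(card_ceiling(eight_pin=1))             # 225 W
    print(card_ceiling(six_pin=1, eight_pin=1))  # 300 W
    print(card_ceiling(eight_pin=2))             # 375 W - dual 8-pin, not yet official per the quote above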


If all this is confusing, why not just buy a fully compliant motherboard from ASUS (Z270 Prime), MSI, or Gigabyte, and stop using an OEM mobo that does not meet the specs?
YMMV: your motherboard may vary, so join the maker's forum and ask what yours can do.



Version 2, 5-1-2019: how to make the GPU card work right, and for sure the HOGS!