I’ve had this question in my head for a while, and was finally prompted to ask after reading this question, which partly hinges on the answer to mine.
My question is: does the amount of power used—and thus load on the power supply—vary between different add-on cards and other components within the same socket type?
For example, suppose I had a low-performance graphics card plugged into a PCI-E 2.0 slot and replaced it with a substantially faster card in the same slot. Would the new card draw more power and put more strain on my power supply than the old one? What about hard drives: would replacing an older drive with a newer, higher-RPM model, again using the same port, increase the power draw?
I know (or at least believe) that the voltage available at each port/connector is standardized, but does the actual wattage or amperage drawn vary depending on the device attached?
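To make the arithmetic behind the question concrete, here is a minimal sketch of the voltage/current/power relationship. The current figures are invented for illustration only, not measurements of real cards:

```python
# Power (watts) = voltage (volts) x current (amps).
# The connector standardizes the voltage; each device decides
# how much current it pulls from that fixed rail.

PCIE_12V = 12.0  # PCIe slots supply a standardized 12 V rail


def power_draw(volts: float, amps: float) -> float:
    """Watts drawn by a device pulling `amps` of current at `volts`."""
    return volts * amps


# Hypothetical numbers: a low-end card pulling 3 A versus a
# high-end card pulling 12.5 A from the same 12 V rail.
low_end_watts = power_draw(PCIE_12V, 3.0)    # 36.0 W
high_end_watts = power_draw(PCIE_12V, 12.5)  # 150.0 W

print(low_end_watts, high_end_watts)
```

So even with the voltage fixed by the slot, the wattage (and thus the load on the power supply) scales directly with the current the device chooses to draw.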