My friend had a small fire break out last night where the PSU connects to his graphics card. He thought the smell was from cooking bacon, until nothing could turn the fire alarms off. In the server room he found a small fire that had ignited the PCB of the graphics card. The only casualty was the graphics card.

There are two likely culprits for what started the fire. It was either:

  • The graphics card
  • The power supply
  • Some combination of both

The only recent change was swapping in a 1600W power supply; previously the build ran on a 1000W unit. The components in the build, including a second GPU, probably max out around 1030 Watts at full load. However, since they weren't doing any calculations at the time, they were probably drawing much less power. The build also has many fans and a lot of water cooling.
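
For a rough sense of the numbers, here is a back-of-the-envelope power budget. The individual component draws below are made up for illustration; only the roughly 1030W total comes from the actual build:

```python
# Rough power-budget sketch (hypothetical per-component figures;
# the question only gives the ~1030 W total for the real build).
components_w = {
    "GPU 1": 350,
    "GPU 2": 350,
    "CPU": 200,
    "fans + pumps": 80,
    "motherboard + drives": 50,
}

full_load = sum(components_w.values())   # ~1030 W at full tilt
idle_load = full_load * 0.15             # idle draw is far lower

for psu in (1000, 1600):
    print(f"{psu} W PSU headroom at full load: {psu - full_load} W")
```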

My friend thinks it could be the power supply that caused the issue, because another Amazon listing for the same 1600W power supply warns that it can "burn" unless it is run at a lower wattage:

"if you want to use for gaming pls change the voltage to 600-800w otherwise it will burn!!!Pls assure it's in the low watt before you use it."

However, the listing for the supply he actually bought doesn't say that, which is why he bought it.

Is this a common thing, where using too powerful a power supply can cause a card to burn?

  • Most likely a poor quality voltage regulator inside the PSU. It supplied too high a voltage to the GPU and the magic smoke escaped. Commented Sep 7, 2020 at 14:38
  • It was a real fire; SWIM had to turn off the power supply and bang the graphics card around to put out the fire. Commented Sep 7, 2020 at 14:58
  • What is "SWIM"? All I can find is a possibility that you mean "Someone Who Isn't Me"...
    – Mokubai
    Commented Sep 7, 2020 at 15:00
  • I'm not saying it wasn't a fire; I'm saying what the likely cause is/was. Commented Sep 7, 2020 at 15:05
  • Also, your quote from Amazon says "pls change the voltage to 600-800w". You don't measure voltage in watts, so that warning can be safely dismissed. Commented Sep 7, 2020 at 15:06

1 Answer

A 1600W PSU should make no difference compared to a 1000W PSU of identical quality.

A 1600W PSU may have a different efficiency curve compared to a 1000W PSU, possibly resulting in one or the other running warmer, and will be able to deliver far more power if so required.
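
To illustrate why the efficiency curve matters, a lightly loaded oversized PSU can dissipate noticeably more heat internally. The efficiency figures below are illustrative, not measurements of either unit:

```python
# Heat dissipated inside the PSU itself at a 500 W DC load,
# for two illustrative (made-up) efficiency figures.
load_w = 500.0
for name, eff in (("1000 W unit near its sweet spot", 0.92),
                  ("1600 W unit at light load", 0.85)):
    wall_draw = load_w / eff        # power pulled from the wall
    heat = wall_draw - load_w       # difference ends up as heat in the PSU
    print(f"{name}: {heat:.0f} W of heat in the PSU")
```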

What it will not do is force current onto the devices drawing power, unless the PSU is faulty or has poor voltage regulation, in which case it could be supplying too high a voltage and overstressing components in the receiving device.

Poor quality power supplies may not clamp their maximum voltage properly and could require a minimum load in order to keep the output voltage at the correct value. I would consider this a power supply design fault: a well designed supply should not have a significant problem here, and a supply that does expect a minimum load should clearly state exactly what that minimum load is. Given the broad range of computer components and power saving states, I would be shocked at any power supply that had such an issue. A good quality SMPS should be able to vary its power output by adjusting its own internal switching frequency to compensate for load.
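
As an illustration of that minimum-load problem, here is a toy model of a badly clamped rail. The numbers are invented for the sketch and are not measurements of any real supply:

```python
# Toy model of a poorly regulated supply that needs a minimum load
# to hold its output at the set point (all numbers illustrative).
V_NOMINAL = 12.0   # volts, the rail's set point
I_MIN = 2.0        # amps of minimum load the design assumes
DRIFT = 0.4        # volts of overshoot per amp below the minimum

def output_voltage(load_amps: float) -> float:
    """Below the assumed minimum load, the rail drifts high."""
    shortfall = max(0.0, I_MIN - load_amps)
    return V_NOMINAL + DRIFT * shortfall

for load in (0.0, 0.5, 1.0, 2.0, 10.0):
    print(f"{load:>4.1f} A load -> {output_voltage(load):5.2f} V")
```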

On the other side, the graphics card's circuitry should be able to cope with some variation in input voltage; usually you would expect a tolerance of around ±10% of nominal.
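
Worked out for a 12V rail, that rule of thumb gives the following window (a simple calculation, not a quotation from any specification):

```python
# The ±10% tolerance above, applied to a nominal 12 V rail.
nominal = 12.0
tolerance = 0.10
low, high = nominal * (1 - tolerance), nominal * (1 + tolerance)
print(f"Acceptable window: {low:.1f} V to {high:.1f} V")  # 10.8 V to 13.2 V
```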

It is entirely possible that the PSU connector wasn't making good contact with the mating socket on the graphics card. Or something got knocked or damaged while fitting the new supply. Or a piece of grit or dirt was in the connector.

Or, lastly, it could simply have been age. Components do sometimes fail suddenly.

Once something does fail, though, a 1600W PSU will be able to supply significantly more current, and for a longer period, and thus feed a more serious fire.
