
My Dell XPS 15z has the following power requirements:

  • Power: 90 W
  • Input: AC 100-240 V, 50/60 Hz
  • Output: DC 19.5 V, 4.62 A

I have an HP adapter with the following specs:

  • Power: 90 W
  • Input: AC 100-240 V, 50/60 Hz
  • Output: DC 19 V, 4.72 A

As you can see, there is a mismatch of 0.5 V and 0.10 A at the output.
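
To put the mismatch in perspective, here is a quick sanity check of the figures above (plain arithmetic, no assumptions beyond the numbers quoted):

    # Compare the two adapters' output specs (figures from above).
    dell_v, dell_a = 19.5, 4.62   # Dell XPS 15z requirement
    hp_v, hp_a = 19.0, 4.72       # HP adapter output

    print(f"Voltage difference: {dell_v - hp_v:.2f} V")   # 0.50 V
    print(f"Current difference: {hp_a - dell_a:.2f} A")   # 0.10 A
    print(f"Dell power: {dell_v * dell_a:.1f} W")         # ~90.1 W
    print(f"HP power:   {hp_v * hp_a:.1f} W")             # ~89.7 W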

Is it okay to use this HP adapter with my Dell laptop?

  • Check your Dell laptop power plug. I am not sure about your model, but my Dell has a small pin in the centre of the plug. If the signal on this pin is not present, the laptop will not recognise the adapter and will not charge, although it will power up and run. The difference in voltage will not cause damage, but I would check the polarity before connecting it.
    – Tog
    Commented Feb 23, 2012 at 8:45

5 Answers


Using an HP adapter on a Dell could cause minor issues: most Dell BIOSes communicate with the AC adapter (there's a Dallas chip inside the adapter) and will not recognize the HP adapter. It is therefore possible that the laptop will not allow the battery to charge.

It should, however, power the notebook.

The output voltage and amperage are so close that this is not an issue.
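
To illustrate what that BIOS handshake amounts to, here is a minimal sketch; the chip IDs, function name, and return values are all hypothetical, intended only to show the run-but-don't-charge logic described above, not Dell's actual 1-Wire protocol:

    # Hypothetical sketch of a BIOS-style adapter check (not Dell's real protocol).
    # Dell adapters embed a Dallas/Maxim 1-Wire chip that reports the adapter's
    # identity over the centre pin of the plug.

    KNOWN_DELL_IDS = {"DELL-90W", "DELL-65W"}  # made-up IDs for illustration

    def read_adapter_id():
        """Stand-in for the 1-Wire read over the centre pin.
        Returns None when nothing answers (e.g. a third-party adapter)."""
        return None  # an HP adapter has no Dallas chip to reply

    adapter_id = read_adapter_id()
    if adapter_id in KNOWN_DELL_IDS:
        print("Adapter recognised: run the laptop AND charge the battery.")
    else:
        print("Unknown adapter: run the laptop, but refuse to charge.")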

  • AFAIK some HP models also have a communication channel with the charger (3 wires), probably via the I2C protocol. Commented Apr 26, 2023 at 16:02

It's always okay to use a power supply that can supply more current than your laptop needs. Voltage tolerances are around 10%, so 0.5 V shouldn't make any difference.
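
To put numbers on that claim (the 10% figure is this answer's rule of thumb, not a published Dell spec), a quick check:

    # Is a 0.5 V shortfall within a ~10% voltage tolerance?
    nominal = 19.5     # Dell's rated input, in volts
    supplied = 19.0    # HP adapter's output, in volts
    tolerance = 0.10   # assumed +/-10% tolerance (ballpark, not a Dell spec)

    deviation = abs(supplied - nominal) / nominal
    print(f"Deviation: {deviation:.1%}")   # ~2.6%
    print("Within tolerance" if deviation <= tolerance else "Out of tolerance")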


Yes, you can use it.

Since the current rating is higher for the HP adapter, there is no risk.

(If it were lower, that would have been a problem, because your laptop would have tried to draw more current than the adapter could supply, potentially destroying the adapter and, in turn, letting excessive current into your laptop.)
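
The rule this answer applies boils down to a one-line comparison; here is a minimal sketch of it (a deliberate simplification; a later answer argues that the current rating alone is not the whole story):

    # Rule of thumb this answer applies (a simplification):
    # the adapter must supply AT LEAST the current the laptop asks for.
    def adapter_ok(laptop_amps, adapter_amps):
        return adapter_amps >= laptop_amps

    print(adapter_ok(laptop_amps=4.62, adapter_amps=4.72))  # True: HP >= Dell
    print(adapter_ok(laptop_amps=4.62, adapter_amps=3.34))  # False: under-rated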

A friend of mine did this on my laptop (a Dell Studio 1555): he used an HP charger. I was skeptical, but since we had no other option, we tried it, and it worked.

(Note: the other risk involved in scenarios like this is polarity.)

  • In this case your answer is right, but in general it is wrong! The amperage must be high enough, but if the voltage is too far off, the laptop may not work or may be damaged, so the voltage is really more of an issue than the current. Commented Sep 23, 2017 at 17:54

This is a really old question, but since it is on the first page of results for a search with the terms "hp", "charger", and "dell", I felt it should be noted that there is definitely a risk in using a higher-current power supply. It's a very common misconception that as long as the adapter provides more current than the device's rated draw, it's OK. While it's not as consistently dangerous as using a lower-rated adapter, it's still taking a risk.

These devices are designed with safety measures to prevent dangerous conditions that could result in electric shock or overheating. But some of that circuitry, along with the voltage-conversion circuitry that goes with it, can require certification from safety organizations such as UL and take up a large amount of space, so parts of both are relocated to the "brick" of the adapter (this also lets manufacturers buy pre-certified bricks).

The problem with using a higher-current adapter is that it's like removing part of those safety features. Imagine that a part such as the laptop's backlight inverter is short-circuited. The laptop was (hopefully) designed so that even if the maximum amount of available current is passed to the inverter, it can't (or at least shouldn't) overheat to unsafe levels.

But that design also relies on the fact that even if the inverter is somehow shorted straight to the power source, the most current it should ever see is around 4.62 A. Any higher than that, and the circuitry in the adapter should stop the flow of power immediately. You might have seen this before if you've ever shorted a laptop adapter: even though the components inside the adapter would probably take a few seconds to fail, and a small object causing a short would probably start to heat up and maybe even melt, the adapter immediately stops providing power (before any of those things can happen).

Using an adapter that's rated too high removes one of the assumptions the system is built around. The inverter can now receive enough current to overheat and ignite, since the higher-rated adapter will not treat the increased draw as a fault. In reality there are more systems capable of failing in over-current between those two points, but those systems are also designed around these assumptions, and they too are subject to failing dramatically if they're provided even more current than they were designed to deal with in over-current conditions.
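
To make that concrete, here is a toy simulation of the same failure under a correctly rated adapter and an over-rated one; the trip thresholds and fault current are invented for illustration and do not come from any real datasheet:

    # Toy model of why a higher-rated adapter weakens over-current protection.
    # All numbers are invented for illustration, not from a real datasheet.

    def shorted_component(adapter_trip_amps, fault_draw_amps):
        """Return what happens when a failed part tries to draw fault current."""
        if fault_draw_amps > adapter_trip_amps:
            return "adapter cuts power: fault contained"
        return f"adapter keeps supplying {fault_draw_amps} A: part overheats"

    FAULT_DRAW = 6.0  # amps a hypothetical shorted inverter tries to pull

    # Correctly rated adapter trips on the fault:
    print(shorted_component(adapter_trip_amps=5.0, fault_draw_amps=FAULT_DRAW))
    # Over-rated adapter happily feeds the fault:
    print(shorted_component(adapter_trip_amps=8.0, fault_draw_amps=FAULT_DRAW))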

I often hear water analogies for electrical current, so one you could use here is an imaginary house's super-simplified plumbing system. The plumbing is designed to operate with water coming into the house at a certain pressure, which is then boosted a little by a pump (disclaimer: I'm not a plumber, and this hypothetical plumbing is pretty wonky, so please don't build a house from this description...):

  • Providing too little pressure causes the pump to run out of water to pressurize. Once it runs out of water it will start pumping air and burn out. This is analogous to using an adapter that provides less current than the device is rated to use: the device uses more power than the adapter can handle, and eventually the adapter ("pump") or its connections start to burn out.

  • Providing too much pressure will leave the pump adding excess pressure to the plumbing. Imagine a dishwasher on this system. It's a smart dishwasher and only takes as much water as it needs. Its designers were also smart, so just in case something goes wrong, the dishwasher has a built-in drain near the top so that even if a leak occurs, the dishwasher won't flood.

    With the excess pressure, the dishwasher still only uses as much water as it needs, so normally there's no problem. But if the dishwasher springs a leak now, it can no longer drain away the excess water in time; instead of the leak running out of pressure because the water is draining away, the pump will keep adding as much water as the dishwasher can leak.

    This is analogous to using an adapter that provides more current than your device is rated to use. Usually the device only uses as much current as it needs. But when a part has failed, the adapter will be able to provide much more current than the "drain" (or fail-safes) were meant to deal with (and a failed part in this context is one that's no longer regulating how much current it uses).

While in this question the differences in ratings might be within the tolerances the device was designed for, it's definitely not something to rely on. Often those tolerances are for peak values that aren't expected to exist for more than a few milliseconds at a time (for example, the time it takes for the correct adapter to respond to an increase in current draw). Taking advantage of those tolerances for longer than they were designed for will cause them to fail, resulting in the same situation.

And on the subject of voltage, going solely by the numbers mentioned here isn't a good idea, since different pin configurations exist. The pin configuration was not mentioned in this question, and a replacement charger with the wrong one could cause serious damage.

Although I'd expect laptop chargers with the same connector to use the same configuration, different manufacturers are free to use different pinouts on identical-looking parts, and they often do because of the special identification protocols used to recognize approved chargers. In fact, even if the device doesn't use those identification schemes, it's quite likely it will notice the difference in voltage and may not charge as a result (only draw power to run).

Additionally, while 0.5 V might not sound like much, if the laptop was designed with a tolerance of 0.5 V, it's because the chargers are expected to provide the right voltage, give or take 0.5 V. Using the wrong charger uses up the margin for error on the laptop's side, while the charger can still be off by the same amount again. Even if the difference doesn't end up exceeding that tolerance, it can strain the components in the charger. These components are designed and tested to achieve certain average lifetimes, but those figures are based on operating at nominal voltages (going outside both the nominal voltage and the tolerance will almost certainly shorten a component's lifetime).
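
Worked through with the numbers from this question (treating both 0.5 V figures as illustrative tolerances rather than published specs):

    # How tolerances stack when the charger's nominal voltage is already off.
    # The +/-0.5 V figures below are illustrative, not published specs.
    laptop_nominal = 19.5   # volts the laptop was designed around
    laptop_margin = 0.5     # slack the laptop designers allowed for the charger

    charger_nominal = 19.0  # the HP charger's own nominal output
    charger_tolerance = 0.5 # the charger may itself drift this much

    worst_case = charger_nominal - charger_tolerance              # 18.5 V
    print(f"Worst-case input: {worst_case} V")
    print(f"Laptop's design floor: {laptop_nominal - laptop_margin} V")  # 19.0 V
    # 18.5 V < 19.0 V: the stacked tolerances eat the entire design margin.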

Having said all that, I'm not claiming these circuits aren't rugged; I've simplified these examples to help get the idea across. For example, the device should have an internal fuse that will blow before too much damage occurs. But the extra stress on components adds up over time, and not going up in smoke the moment you plug it in doesn't mean everything is OK. It just means the design and components are a little forgiving... or something small is already broken, and sooner or later it will cause a domino effect leading to a catastrophic failure of the entire system, or sporadic issues that are practically impossible to track down but end up making the device unusable because of reliability problems (that last one is very common). There's very rarely a good reason to risk these kinds of things.

I used to occasionally use a Dell charger with some of my HP laptops (even though the differences in voltage and current ratings were within 10% at most, the HP laptops would run but not charge with my Dell chargers, and some Dell models wouldn't even run on an HP charger). But after fixing the third non-mechanical "mysterious power jack problem", I decided to buy some cheap, properly rated replacements and haven't had any issues since beyond the usual mechanical ones.

Also note that the problems stemming from wrongly rated chargers don't have to build up over a long period of use. As soon as the wrong charger is plugged in, the device can suffer far more damage than it otherwise would if anything goes wrong that depended, even in part, on the correct adapter recognizing an over-current condition and cutting off power, or on the charger providing a certain voltage on exact pins.

TL;DR: The cost of the right charger will almost always be less than the damage the wrong one can cause.

  • The problem with this entire answer is that it's wrong; if the laptop is only drawing X amps, then even if the charger supports X+1, it will only output X.
    – Ramhound
    Commented Sep 25, 2014 at 1:43
  • You're using the same reasoning that misconception is based on. The problem is not nominal or rated amps; it's the amps the laptop can draw during a fault. Imagine your average holiday lights with a fuse. The idea is that if you fray the wires and they short, the fuse will blow before the wires start to melt. But if you replace the fuse with one that allows more current, the lights will still work, and they'll still only draw as much current as they did, but now when the wires are shorted the fuse won't blow before the wires start to melt and become a real fire hazard. Commented Sep 25, 2014 at 1:57
  • 1
    You might be getting caught up in semantics. Of course a device will only use as many amps as it's drawing. But that sentence doesn't mean anything, it's like saying that a human being only breathes as much air as a human being breathes as opposed to somehow breathing all the air there is in the world. The problem is that narrow reasoning is over-extended into the misconception I stated in the question. Maybe you should explain exactly what about my explanatory comment you disagree with since it presents the same problem with a simpler example... Commented Sep 25, 2014 at 2:14
  • 1
    You're starting to get it. This answer is addressing this single question, it's addressing the people who Google to see if they can use charger X with laptop (or device) Y and find this answer on page 1. In this case there is a small difference in amperage, but a person searching isn't going to compare these same numbers (that's I didn't mention the amperage the of the hypothetical higher amperage charger in my example). The gist of the answers was a charger of a higher amperage never does anything no matter how much higher it is. This one says there is a risk in some situations. Commented Sep 25, 2014 at 17:55
  • 1
    You might say it's pedantic to go over that slight risk with hypothetical very worst cases that are rare, but absolutes like "..**no risk**.." made it reasonable to give people an equally strong counterpoint. I'd now expect the end result of someone who searched for this and read all the answer to become "I'll use the somewhat similar charger I have one hand for a day, and buy the right one ASAP" instead of "I'll buy the somewhat similar charger and apparently there is no risk ever" Commented Sep 25, 2014 at 18:01

Interesting thread. My answer is "it depends." My example involves two HP-branded power supplies specified for the same HP laptop (4540S).

The HP site specifies a 90 W adapter that delivers 19 V for configurations with discrete graphics, or a 65 W adapter that delivers 18.5 V for configurations with integrated graphics.

HP also offers another adapter for the same laptop model, rated at 65 W and 19.5 V, with no configuration specified - only the laptop model.

This supports the "rely on the manufacturer's spec" approach, while also leaving open the question about raw electrical specs.

