This is a really old question, but since it's on the first page of results for a search with the terms "hp", "charger", and "dell", I felt it should be noted that there is definitely a risk in using a higher-current power supply. It's a very common misconception that as long as the adapter provides more current than the device's rated draw, everything is O.K. While it's not as consistently dangerous as using a lower-rated adapter, it's still taking a risk.
These devices are designed with safety measures to prevent dangerous conditions that could result in electric shocks and overheating. But some of that circuitry, along with the voltage transformation circuitry that goes with it, can require certification from safety organizations such as UL and take up large amounts of space, so parts of both are relocated to the "brick" in an adapter (this also allows manufacturers to buy pre-certified bricks).
The problem with using a higher-current adapter is that it's like removing parts of those safety features. Imagine a part such as the laptop's backlight inverter is short-circuited. The laptop was (hopefully) designed so that even if the maximum amount of available current is passed to the inverter, it can't (or at least shouldn't) overheat to unsafe levels.
But that design also relies on the fact that even if the inverter is somehow shorted straight to the power source, the most current it should ever see is around 4.26A. Any higher than that, and the circuitry in the adapter should stop the flow of power immediately. You might have seen this before if you've ever shorted a laptop adapter: even though the components inside the adapter would probably take a few seconds to fail, and a small object causing the short would probably start to heat up and maybe even melt, the adapter immediately stops providing power (before any of those things can happen).
Using an adapter that's rated too high removes one of the assumptions the system is built around. The inverter can now receive enough current to overheat and ignite, since the higher-rated adapter won't treat the increased draw as a fault. In reality there are more systems between those two points capable of failing under over-current, but they too are designed around these assumptions, and they too can fail dramatically if they're fed even more current than they were designed to handle in over-current conditions.
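If it helps, here's the same idea as a toy sketch in Python. The 4.26A figure is the adapter rating mentioned above; everything else (the `adapter_output` function, the 7.5A "oversized" rating, the 6A fault current) is invented purely for illustration, and real adapters of course do this in hardware, not software:

```python
# Toy model of an adapter's over-current cutoff (illustrative only; real
# power supplies implement this in analog/switching hardware).
def adapter_output(demanded_amps, rated_amps=4.26):
    """Deliver the demanded current, or cut power entirely on over-current."""
    if demanded_amps > rated_amps:
        return 0.0  # protection trips: adapter stops providing power
    return demanded_amps

# Normal operation: the device draws what it needs.
print(adapter_output(2.5))                  # -> 2.5

# A shorted inverter demanding 6A:
print(adapter_output(6.0))                  # correct adapter -> 0.0 (shuts off)
print(adapter_output(6.0, rated_amps=7.5))  # higher-rated one -> 6.0 (keeps feeding the fault)
```

The point of the last line is the whole problem: the oversized adapter sees 6A as a perfectly normal load, so the "fault" current just keeps flowing into the failed part.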
I often hear analogies based on water for current, so one you could use here is an imaginary house's super-simplified plumbing system. The plumbing would be designed to operate with water coming into the house at a certain pressure, then given a little more pressure with a pump (Disclaimer: I'm not a plumber, and this hypothetical plumbing is pretty wonky, so please don't build a house using this description...):
Providing too little pressure causes the pump to start running out of water to pressurize. Once it runs out of water it will start pumping air and burn out.
This is analogous to using an adapter that provides less current than the device is rated to use. The device draws more power than the adapter can handle, and eventually the adapter ("pump") or its connections start to burn out.
Providing too much pressure will leave the pump adding excess pressure to the plumbing. Imagine a dishwasher on this system. It's a smart dishwasher and only takes as much water as it needs. Its designers were also smart, so just in case something goes wrong, the dishwasher has a built-in drain near the top so that even if a leak occurs, the dishwasher won't flood.
With the excess pressure, the dishwasher still only uses as much water as it needs, so normally there's no problem. But if the dishwasher gets a leak now, it can no longer drain away the excess water in time, and instead of running out of pressure because all that water is draining away, the pump will keep on adding as much water as the dishwasher can leak.
This is analogous to using an adapter that provides more current than your device is rated to use. Usually the device only uses as much current as it needs. But when a part fails (in this context, a failed part is one that's no longer regulating how much current it uses), the adapter will be able to provide much more current than the "drain" (or fail-safes) were meant to deal with.
While in this question the differences in ratings might be within the tolerances the device was designed for, it's definitely not something to rely on. Often those tolerances are for peak values that aren't expected to exist for more than a few milliseconds at a time (for example, the time it takes for the correct adapter to respond to an increase in current draw). Taking advantage of those tolerances for longer than they were designed for will cause components to fail, resulting in the same situation.
And on the subject of voltage, going based solely on the numbers mentioned isn't a good idea, since different pin configurations exist. In this answer the pin configuration was not mentioned, and the replacement charger could have caused serious damage.
Although I'd expect laptop chargers with the same connector size to have the same configurations, different manufacturers are free to use different pinouts with identical parts, and they often do because of special identification protocols used to recognize approved chargers. In fact, even if the device doesn't use those identification schemes, it's quite likely to recognize the difference in voltage and may refuse to charge as a result (only drawing enough power to run).
Additionally, while 0.5V might not sound like much, if the laptop was designed with a tolerance of 0.5V, it's because the chargers are expected to provide the right voltage, give or take 0.5V. Using the wrong charger cancels out the margin for error on the laptop's side, but the charger can still be off by the same amount. Even if the difference doesn't end up going over that tolerance, it can strain the components in the charger. These components are designed and tested to achieve certain average lifetimes, but those figures are all based on operating at nominal voltages (going outside both the nominal voltage and the tolerance will almost certainly shorten a component's lifetime).
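The way those tolerances stack is easy to miss, so here's the arithmetic spelled out. All the numbers are hypothetical (a made-up 19V laptop and a 19.5V "close enough" charger); the point is only how the margins add up:

```python
# Worked example of tolerance stacking (all numbers hypothetical).
laptop_nominal = 19.0    # volts the laptop was designed around
laptop_tolerance = 0.5   # the laptop's design margin, +/- volts

charger_nominal = 19.5   # a "close enough" charger, already 0.5V off
charger_tolerance = 0.5  # the charger itself may legitimately drift this much

worst_case = charger_nominal + charger_tolerance            # 20.0V at the jack
margin_left = (laptop_nominal + laptop_tolerance) - worst_case

print(worst_case)   # 20.0
print(margin_left)  # -0.5 -> the laptop's design margin is exceeded
```

With the correct charger, the worst case would be 19.5V, which is exactly at the edge of the laptop's margin. With the mismatched one, a perfectly in-spec drift from the charger pushes the laptop 0.5V past what it was designed for.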
Having said all that, I'm not claiming these circuits aren't rugged; I've simplified these examples to help get the idea across. For example, the device should have an internal fuse that would blow before too much damage could occur. But extra stress on components adds up over time, and not going up in smoke the moment you plug it in doesn't mean everything is O.K. It just means the design and components are a little forgiving ... or something small is already broken, and sooner or later it's going to cause a domino effect leading to a catastrophic failure of the entire system, or cause sporadic issues that are practically impossible to track down but end up making the device unusable because of reliability problems (that last one is very common). There's very rarely a good reason to risk these kinds of things.
I used to occasionally use a Dell charger with some of my HP laptops (even though the differences in voltage and current ratings were within 10% at most, the HP laptops would run but not charge on my Dell chargers, and some of the Dell models wouldn't even run on an HP charger). But after fixing the 3rd non-mechanical "mysterious power jack problem", I decided to buy some cheap replacements that were properly rated, and I haven't had any more issues past the usual mechanical ones.
Also note that the problems stemming from wrongly rated chargers don't have to build up over a long period of use. From the moment the wrong charger is plugged in, any fault that depends even in part on the correct adapter recognizing an over-current condition and cutting power, or on the charger providing a specific voltage on specific pins, can do far more damage than it otherwise would.
TL;DR: The cost of the right charger will almost always be less than the damage the wrong one can cause.