
I'm about to buy an external battery with these output specs:

  • AC 230 V / 50 Hz
  • 65 W (max. 85 W)

It would be similar to this one (see the product's pictures for reference).

I'm used to seeing an amperage output, so knowing the amperage input I can have some control, but this one lists watts. I suppose that's because of the AC output? I don't really know. Given watts, how can I work out how many devices I can connect to this battery?

For example, my laptop has input specs like this. This is the input of the transformer, the one I usually connect to the wall socket, just like I'm going to do with the battery:

  • 100-240V~ 1.7A 50-60Hz

I see that the voltage and the hertz are fine, but it shows amperage instead of watts.

Now, setting the battery aside for a moment: my laptop transformer has these specs:

  • Input: 100-240V~ 1.7A
  • Output: 19.5V 6.15A 120W

When I apply Ohm's law, I see that the input can range from 58 W to 141 W. So when I look at the maximum watts a power supply can offer, which watts should I check? The 120 W the laptop draws, or the watts the transformer draws depending on the voltage?

  • Power (watts) = voltage x current. Divide the watts by the battery's voltage to get the amps.
    – fixer1234
    Commented Nov 29, 2014 at 19:53
  • @fixer1234 So this battery only outputs 0.3A? Commented Nov 29, 2014 at 20:08
  •
    That's not a battery, it's a power supply. One of the product pictures shows the back, where it lists the specs on the USB ports (5V, 2.4A for that model). It looks like it also has an AC output (which would be for something very low-powered). You need to go by the product specs rather than calculations. There is circuitry between the battery and the outside that manipulates what the battery puts out, and that circuitry has its own limits.
    – fixer1234
    Commented Nov 29, 2014 at 20:22
  • @fixer1234 Oh, OK. I thought it was an external battery, so it's like a UPS with an internal battery, isn't it? I went to the product specs and saw the 65 watts; that's why I asked. I have a final question. You can add your comment as a reply and I'll accept it. Commented Nov 29, 2014 at 20:48
  • @fixer1234 Question updated. I appreciate your help! Commented Nov 29, 2014 at 21:02

1 Answer


There are several things going on, it isn't just a matter of applying Ohm's law.

First, that's not a battery, it's a power supply. You're not dealing with the battery directly, but rather with the output of various circuitry that uses the battery. The circuitry manipulates the voltage and current and has its own limits. So in terms of how much current you can get out, you need to look at the specifications for the device.

One of the product pictures shows the back, where it lists the specs on the USB ports (5V, 2.4A for that model). It also has an AC output (which would be for something very low-powered), and you would need to look at the specs for its limits. It would not exceed the power rating for the product. Using what you listed in the question, the maximum power would be what it can handle momentarily, like the surge when you first turn on whatever is plugged into it. The continuous rating is the 65W and as you calculated, that would be just under 0.3A at 230V. However, the power rating is typically for the whole device, so you probably couldn't demand that much AC power if you were also siphoning power on the USB ports.
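The current calculation mentioned above can be sketched as a quick check (the 65 W and 230 V figures come from the product listing quoted in the question):

```python
# Continuous AC output rating from the product listing in the question.
power_w = 65.0     # continuous rating, watts
voltage_v = 230.0  # AC output voltage

# P = V * I, so the maximum continuous current is I = P / V.
max_current_a = power_w / voltage_v
print(f"Max continuous AC current: {max_current_a:.2f} A")  # prints about 0.28 A
```

Keep in mind this is the combined budget for the whole unit; anything drawn on the USB ports comes out of the same total.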

Which brings us to whether it can power your laptop. Laptop power adapters are "universal" in the sense that they aren't manufactured specifically for your laptop. The same adapter is probably used with a whole range of laptops, so it is made to support the heaviest demands it will be associated with. It is also designed to work on a wide range of voltages. The only starting point for calculations is the output, because the adapter produces that from anything in the input range.

It will put out 19.5V with as much current as the laptop needs as long as it is less than 6.15A. That doesn't tell you much about what your laptop actually needs. You can get better information from the laptop's battery.

The battery voltage is likely to be a little lower (to charge the battery, you need a higher voltage than the battery puts out). The battery probably lists a mAh rating (1/1000ths of an amp times how long it will provide output). The values are typically over 1,000; divide the number by 1,000 to get the amp-hours. The user manual may tell you how much run time you should get on a full charge, or just use the run time you achieved when the battery was new. Divide the amp-hours by the hours of run time and that gives you the amps. Multiply the amps by the battery's voltage and that gives you a ballpark approximation of the watts used by the laptop. There is a big margin of error and you don't want to operate right at the limits of that external power supply, so leave some cushion between your laptop watts calculation and the power supply's rating.
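The estimation steps above can be written out as a calculation. All the numbers below are hypothetical examples for illustration, not figures from any specific laptop:

```python
# Hypothetical laptop battery figures, for illustration only.
battery_mah = 4400        # capacity printed on the battery, in mAh
battery_voltage_v = 14.8  # battery pack voltage
runtime_hours = 3.0       # observed run time on a full charge

amp_hours = battery_mah / 1000                    # 4.4 Ah
average_amps = amp_hours / runtime_hours          # average current draw
approx_watts = average_amps * battery_voltage_v   # ballpark laptop power

print(f"Ballpark laptop draw: {approx_watts:.1f} W")  # prints 21.7 W for these numbers
```

Remember the big margin of error: treat the result as a rough floor, and leave a healthy cushion below the supply's 65 W rating.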

Another consideration. Just as the laptop battery has a mAh rating, so does the battery inside the power supply. The power supply's wattage rating just tells you how fast you can pull power from it without burning it out or popping a circuit breaker. It doesn't tell you how long you can operate. Inside its cabinet, there is a lot of circuitry in addition to the battery, so the battery takes up only a portion of the space. Compare some arbitrary fraction of the unit's size to the size of your laptop's battery. It could be using a battery with higher energy density than your laptop (like lithium vs. NiCad), but if it is a lot smaller, you can probably expect less run time. Also, run time isn't linear with load: you will get fewer amp-hours out of the battery if you drain it at full load than if you nurse power out of it. One other consideration: if you plug the laptop AC adapter into this power supply, the AC adapter will waste some power, so the full rated power won't be available for the laptop.
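A rough run-time estimate along the lines above might look like this (every figure here is an assumed placeholder; the real stored energy and losses depend on the specific product):

```python
# Rough run-time estimate; all figures are hypothetical assumptions.
supply_battery_wh = 90.0  # energy stored in the power supply's battery, Wh
laptop_load_w = 40.0      # laptop draw, measured or estimated as above
conversion_loss = 0.20    # assumed 20% lost in the inverter + AC adapter

usable_wh = supply_battery_wh * (1 - conversion_loss)
runtime_hours = usable_wh / laptop_load_w
print(f"Estimated run time: {runtime_hours:.1f} hours")  # prints 1.8 hours here
```

Because discharge isn't linear with load, treat this as optimistic at heavy loads.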

  • That helped me a lot. I also have a good multimeter, and I can use it to see how many amps my laptop draws at full capacity (with a USB device connected that I'm going to use regularly, full brightness, some CPU- and GPU-intensive task, etc.) and add some error margin, as you say. About the transformer, if I understood correctly, it will draw as many amps as the laptop draws, won't it? I mean, when I connect the transformer to the power supply it will get 230V, because that is what the power supply sends, but how many amps will it draw? I need this to calculate watts. Commented Nov 29, 2014 at 22:37
  • The current is determined by the voltage and the load. The laptop has a certain effective or equivalent resistance. You apply a voltage to it and that defines how much current flows. The transformer basically just changes voltage levels (and wastes a little power in the process). If there were no losses, the current flowing through the input side of the transformer would just be a ratio (P=VI; if the power stays the same, the voltage x current stays the same). If the power loss is 15%, that would show up as 15% more current on the input side.
    – fixer1234
    Commented Nov 29, 2014 at 22:57
  • @JorgeFuentesGonzález - Calculate watts at the location you're interested in and just be consistent to apply voltage and current for the same place. If you are interested in laptop usage, measure at the laptop. If you are interested in what the external power supply will see, measure at the input to the AC adapter. If you have to measure things somewhere else, throw in estimates for losses. For example, measure the input to the AC adapter and then subtract a little to get what the laptop uses. Or, measure what the laptop uses and add a little for what the AC adapter needs to supply that.
    – fixer1234
    Commented Nov 29, 2014 at 23:03
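The input/output relationship described in the comments above can be sketched numerically. The 19.5 V and 6.15 A come from the adapter label in the question; the actual laptop draw and the 15% loss figure are assumed examples:

```python
# Adapter input current from output power plus conversion losses.
output_voltage_v = 19.5  # adapter output, from the label in the question
output_current_a = 4.0   # hypothetical actual laptop draw (under the 6.15 A max)
input_voltage_v = 230.0  # what the external power supply provides
loss_fraction = 0.15     # assumed 15% loss, as in the comment's example

output_power_w = output_voltage_v * output_current_a  # 78 W delivered
# Dividing by (1 - loss) is close to "15% more current" for small losses.
input_power_w = output_power_w / (1 - loss_fraction)
input_current_a = input_power_w / input_voltage_v
print(f"Input side draws about {input_current_a:.2f} A at {input_voltage_v:.0f} V")
```

The input power here (~92 W) is the number to compare against the supply's 65 W continuous rating, per the comment about being consistent on where you measure.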
