
So I'm trying to work out my daily Ah usage for an off-grid setup (and to see what my power bill at this new place might be like).

Context: I've gone through and created a sheet with the appliances I use. I've read up for a few hours on the P = IV relationship, and I remember the V = IR relationship well from school. I've gone through and done lots of calculations.

Problem is I suck at math, so I don't know if I'm right or wrong. Once I understand something I'm OK, and I can often spot the outlier/extreme muck-ups, but until then... My first calculations came out at 711 Ah/day. I'm now down to around 238 Ah/day after some corrections, which seems more realistic, as the $/day figure estimated using provider prices is about $14/day.

Question(s): Take one of the devices I've calculated for, say the laptop charger (and yes, I've read a few electronics.stackexchange posts on those too). Its label says 150 W, with 240 V AC input and 2.5 A. But 2.5 A × 240 V = 600 W, so am I missing something?

  • Which is the determining factor in the calculation?
  • They all change in relation to each other, but when reading the specs on a device, if it says 240 V, 2.5 A, 200 W (made-up numbers, if you can't tell), which factor determines the other two?
  • Is the reason they list two factors on appliances because the amperage/wattage specified is the determining factor?
  • Also does a little a mean milliamps and large A mean amps?

Formulas I've been using: I = P/V and P = I × V
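As a cross-check on the spreadsheet arithmetic, here is a minimal sketch of that kind of calculation in Python. The appliance wattages, run times, and the 24 V battery voltage are made-up placeholders, not values from the actual sheet.

```python
# Rough daily energy budget from power ratings.
# All wattages, hours, and the battery voltage below are made-up placeholders.
appliances = [
    # (name, power in watts, hours used per day)
    ("laptop charger", 150, 4),
    ("fridge", 120, 8),
    ("lights", 40, 6),
]

battery_voltage = 24  # assumed nominal battery bank voltage

total_wh = sum(watts * hours for _, watts, hours in appliances)  # energy = P * t
total_ah = total_wh / battery_voltage                            # I = P / V, so Ah = Wh / V

print(f"Daily energy: {total_wh / 1000:.2f} kWh/day")
print(f"Daily draw at {battery_voltage} V: {total_ah:.1f} Ah/day")
```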


Edit:
Average annual household use in my country is 8492 kWh, or about 23 kWh a day. I'm a below-average user, but my calculations show almost double the national average?
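For what it's worth, an Ah/day figure only has a meaning at a stated voltage, so a quick conversion shows what 238 Ah/day implies in kWh/day at a few possible reference voltages. The 12 V and 24 V battery-bank voltages are assumed examples; 240 V is the mains figure quoted above.

```python
# Sanity check: convert Ah/day to kWh/day at a stated voltage.
daily_ah = 238          # figure from the question
national_avg_kwh = 23   # quoted national average, kWh/day

for volts in (12, 24, 240):  # assumed reference voltages; 240 V is the mains value above
    kwh = daily_ah * volts / 1000  # kWh = Ah * V / 1000
    ratio = kwh / national_avg_kwh
    print(f"{daily_ah} Ah/day at {volts:>3} V = {kwh:5.1f} kWh/day "
          f"({ratio:.1f}x the national average)")
```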

Link to spreadsheet: https://docs.google.com/spreadsheets/d/1RxyESS4RZUqBwq8omk89D9vGgJLgF6uPt123cP8iDIs/edit?usp=sharing

  • >"does a little a mean milliamps and large A mean amps?" - no. Milliamps are mA. The A is always capitalized because the unit is named after a person (André-Marie Ampère).
    – brhans
    Commented Jul 1 at 3:38
  • P = U×I holds for purely resistive loads. Your household does not have only purely resistive loads: some loads are inductive and some are capacitive, so you likely need to take the power factor of each device into account too. For example, it takes current to charge up a capacitive load on the positive mains half-cycle, but that charge has to be moved out again to charge it the other way on the negative half-cycle; so while a capacitor consumes no power, you still need to move current in and out, and that current must be provided by your source supply. And things like switch-mode power supplies only top up their caps at the mains peaks. (See the sketch after these comments.)
    – Justme
    Commented Jul 1 at 4:08
  • @brhans I should have clicked to that, since I know it from headlamp batteries being 3700 mAh facepalm
    – theYnot
    Commented Jul 1 at 4:50
  • @Justme - is there an appropriate power factor to compensate for that? Some of the articles I read suggested that things are typically 80% efficient at max. Is that a good rule of thumb to work with, or is there a better one?
    – theYnot
    Commented Jul 1 at 4:52
  • @theYnot It is not about efficiency (power in vs power out vs power lost in heat in your laptop charger, for example) but power factor of reactive loads.
    – Justme
    Commented Jul 1 at 4:56
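As a rough illustration of the power-factor point in Justme's comment: multiplying RMS volts by RMS amps gives apparent power (VA), and real power (W) is that times the power factor. A minimal sketch follows, where the 0.6 power factor is an arbitrary example value, not a measurement of any particular device.

```python
# Apparent power (VA) is what V * I gives; real power (W) also depends on the
# power factor of the load. The 0.6 here is an arbitrary example value.
voltage = 240.0     # RMS mains voltage
current = 2.5       # RMS nameplate current
power_factor = 0.6  # assumed example; real devices vary widely

apparent_power_va = voltage * current            # S = V * I
real_power_w = apparent_power_va * power_factor  # P = S * PF

print(f"Apparent power: {apparent_power_va:.0f} VA")
print(f"Real power:     {real_power_w:.0f} W")
```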

1 Answer


Its label says 150 W, with 240 V AC input and 2.5 A. But 2.5 A × 240 V = 600 W, so am I missing something?

Several somethings, but the big one is that the ratings you find on the labels or nameplates of most devices are the maximum current draw. They won't pull that much all the time, even when the device is running. In many cases (anything with a motor, for example), they only draw that much for a few moments when first turning on.

For something like that laptop "charger", it's just converting power, and it will only supply as much as the laptop needs (plus a bit because it's not 100% efficient). Probably that's 5-10 watts when the laptop is running with a full battery, and maybe as much as 70-80 watts when charging a near-empty battery.

And that charger is probably also "universal voltage", meaning it accepts 100-240 V input, and it needs to be rated for the current it might draw when fed 100 V. 250 W in for 150 W out still seems a bit inefficient, but there's probably some padding in there. It's a lot less absurd than 600 W in for 150 W out, right?

In short, the thing you're trying to do will basically always give you numbers that are way too big, and you can't really know how wrong they are without actually measuring current. Otherwise, you can only guess.
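To put numbers on how far apart a nameplate-based estimate and an actual-draw estimate can be, here is a small sketch using the rough figures from this answer. The usage hours and the "measured" wattages are illustrative guesses, not real meter readings.

```python
# Compare a nameplate-based estimate with a rough actual-draw estimate for
# the laptop charger discussed above. The hours and wattages below are
# illustrative guesses, not real meter readings.
nameplate_va = 240 * 2.5            # 600 VA worst case implied by the label
idle_w, charging_w = 8, 75          # roughly 5-10 W idle, 70-80 W charging
hours_idle, hours_charging = 6, 2   # made-up daily usage pattern

nameplate_wh = nameplate_va * (hours_idle + hours_charging)
estimated_wh = idle_w * hours_idle + charging_w * hours_charging

print(f"Nameplate-based estimate:   {nameplate_wh / 1000:.2f} kWh/day")
print(f"Rough actual-draw estimate: {estimated_wh / 1000:.2f} kWh/day")
```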

  • OK, thank you. It's super helpful to know that actual usage differs (significantly?) from the stated maximum. Does that mean that, assuming everything were running at max draw, my calcs appear to be correct, and it's just a case of measuring actual draw?
    – theYnot
    Commented Jul 1 at 3:08
  • Yes; get one of those inline power meters and go to town. I can't speak to your calculations since I don't know what the devices are specifically, but that's a moot point when you're measuring actual current.
    – vir
    Commented Jul 1 at 7:11
