So I'm trying to work out my daily Ah usage for an off-grid setup (and to see what my power bill at this new place might be like).
Context: I've created a sheet listing the appliances I use. I've read up for a few hours on the P=IV relationship, and I remember the V=IR relationship well from school. I've gone through and done lots of calculations.
Problem is I suck at math, so I don't know if I'm right or wrong. Once I understand something I'm OK, and I can often spot the outlier/extreme muck-ups, but until then I can't tell. My first calcs came out at 711 Ah/d. After some corrections I'm now down to around 238 Ah/d, which seems more realistic, as the $/d figure estimated using provider prices is about $14/d.
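To show the arithmetic I'm doing (with made-up example appliances and run times, not my actual spreadsheet, and assuming a nominal 12 V battery bank), it's basically this:

```python
# Per-appliance daily energy, then converted to Ah.
# Numbers below are invented examples; 12 V is an assumed system voltage.
BATTERY_V = 12

appliances = [
    # (name, watts, hours used per day)
    ("fridge", 100, 8),
    ("laptop charger", 150, 4),
    ("lights", 40, 5),
]

wh_per_day = sum(watts * hours for _, watts, hours in appliances)  # watt-hours/day
ah_per_day = wh_per_day / BATTERY_V  # from I = P/V, so Ah = Wh / V

print(wh_per_day, round(ah_per_day, 1))  # 1600 Wh/d, 133.3 Ah/d
```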
Question(s): Take my laptop charger as an example (and yes, I've read a few electronics.stackexchange posts on those too). Its label says 150W, with 240V AC input and 2.5A. But 2.5A * 240V = 600W, not 150W - am I missing something?
- Which is the determining factor in the calculation?
- They all change based on their relation to the others, but when reading the specs on a device, if it says 240V, 2.5A, 200W (making the numbers up, if you can't tell), which factor determines the other two?
- Is the reason appliances list two figures that the amperage/wattage specified is the determining factor?
- Also, does a lowercase "a" mean milliamps and a capital "A" mean amps?
Formulas I've been using: I = P/V and P = I*V
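For reference, here's how I've been applying those two formulas to the charger label above (240V, 2.5A, 150W) - this just shows the two answers I get depending on which figure I start from, which is exactly what's confusing me:

```python
# Laptop charger label figures from the question.
V, A_rated, W_rated = 240, 2.5, 150

# Starting from the wattage: I = P/V
I_from_watts = W_rated / V   # 0.625 A

# Starting from the amperage: P = I*V
P_from_amps = A_rated * V    # 600 W

print(I_from_watts, P_from_amps)  # 0.625, 600
```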
Edit:
Average annual household use in my country is 8492 kWh, or about 23 kWh a day. I'm a below-average user, but my calculations show almost double the national average?
Link to spreadsheet: https://docs.google.com/spreadsheets/d/1RxyESS4RZUqBwq8omk89D9vGgJLgF6uPt123cP8iDIs/edit?usp=sharing