So people say it's more efficient to send electricity through power lines at very high voltage and then have a transformer step that voltage back down.
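(Just to state the claim as I've heard it, with rough numbers, not because I follow it: the idea seems to be that for the same delivered power $P = VI$, a higher voltage means a lower current, and the heat lost in the line goes as $I^2 R_{\text{line}}$. So for 10 kW sent through a line with 1 Ω of resistance:

$$V = 1000\,\text{V}:\quad I = \tfrac{10\,000\,\text{W}}{1000\,\text{V}} = 10\,\text{A},\qquad P_{\text{loss}} = 10^2 \cdot 1\,\Omega = 100\,\text{W}$$
$$V = 100\,000\,\text{V}:\quad I = 0.1\,\text{A},\qquad P_{\text{loss}} = 0.1^2 \cdot 1\,\Omega = 0.01\,\text{W}$$

That's the version of the claim I'm questioning below.)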
But isn't voltage just the potential difference between the two terminals/electrodes that the electricity flows between? In other words, no matter how many transformers you put in the way, shouldn't you only be able to decrease the current by adding resistance, not change the voltage, since that's how voltage is defined in the first place?
But then again, why do people say the voltage is decreased? The voltage should stay fixed no matter what, unless you change how the electricity is produced all the way back at the battery/power plant.
Also, wouldn't putting a lot of resistance in the way waste a lot of energy as heat?
To take this further, a device further down the line shouldn't be able to measure the true voltage of anything, because it has no way of knowing what the potential difference at the source is or how much resistance was put in the way. The only thing it should be able to detect is how much current passes through it, which is determined by Ohm's law, but that leaves two missing variables: it has no way of finding out the source voltage and no way of knowing the resistance put in the way (unless of course you supply the device with that data). If it had one of them, it could work out the other, but it has neither.
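To spell out the underdetermination I mean: if all the device can measure is the current through itself, then by Ohm's law there are infinitely many source-voltage/resistance pairs that would produce the same reading, for example:

$$I = \frac{V_{\text{source}}}{R_{\text{total}}}\quad\Rightarrow\quad 2\,\text{A} = \frac{10\,\text{V}}{5\,\Omega} = \frac{100\,\text{V}}{50\,\Omega} = \frac{1000\,\text{V}}{500\,\Omega}$$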
So what does it even mean when a device is said to operate at a known voltage?
What am I misunderstanding here?
Edit:
Now I'm confused about what voltage actually is. Isn't it a measure of the difference in the number of electrons? And what about current, isn't it a measure of how many electrons flow per second?
What does a "unit" of charge physically represent, and how would that unit of charge have more energy (does it increase its velocity, or what)?
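(For reference, these are the definitions I'm trying to reconcile, as I understand them:

$$Q\ \text{in coulombs},\qquad 1\,\text{C} \approx 6.24\times 10^{18}\ \text{electrons' worth of charge}$$
$$I = \frac{\Delta Q}{\Delta t}\ \left[\tfrac{\text{C}}{\text{s}} = \text{A}\right],\qquad V = \frac{\Delta E}{\Delta Q}\ \left[\tfrac{\text{J}}{\text{C}} = \text{V}\right]$$

So current would be charge per second, and voltage would be energy per unit of charge, which is where the questions below come from.)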
Since I've thought of voltage and amperage as a matter of numbers, if the difference in the number of electrons is higher, then they flow more violently; in other words, more electrons move per second (more amps). But I'm being told that the number stays the same, and it's just that each electron has more energy. That raises the question of why more voltage would have anything to do with more current at all (since it would only represent the amount of energy each electron has, not the number of them, so you could have a lot of voltage but very little current, and I'm saying that without even factoring resistance in).
If we go with the idea that voltage is how much energy each "unit" has, and current is how many "units" pass per second, then what's really stopping something from having a very low number of "units" that each carry really high energy? It's not like the "units" having more energy would change the number of "units", so why would a higher voltage even mean more amps?
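Putting numbers on what I mean: if power is just (energy per unit) times (units per second), then nothing in the units themselves seems to forbid trading one for the other:

$$P = VI:\qquad 10\,000\,\text{V}\times 0.1\,\text{A} = 1000\,\text{W} \quad\text{and}\quad 100\,\text{V}\times 10\,\text{A} = 1000\,\text{W}$$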
If we instead assume higher voltage pushes electrons more violently, then why is that? Isn't it due to a bigger difference in the number of electrons (compared to protons) on the two sides, causing them to flow more violently (in greater numbers, and therefore with higher current)?
You could technically have very little charge available to travel, but with a very high difference in number between the two sides; that would just mean you get a really high current for a really short time. So again, current alone would be enough to measure the power.
The only case in which current would not be enough to measure the power is if the number of "units"/electrons/whatever that are moving doesn't change, yet somehow their energy does (but again, Ohm's law says that any increase in voltage causes an increase in current too, so what gives?).
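The way I read Ohm's law, for a fixed resistance the voltage and current are locked together, which is exactly the part I can't square with the "same number of electrons, more energy each" picture:

$$I = \frac{V}{R}\quad\Rightarrow\quad P = VI = \frac{V^2}{R} = I^2 R$$
$$\text{e.g. with } R = 10\,\Omega:\quad V = 10\,\text{V} \Rightarrow I = 1\,\text{A},\ P = 10\,\text{W};\qquad V = 20\,\text{V} \Rightarrow I = 2\,\text{A},\ P = 40\,\text{W}$$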