In the top answer to the question *Why do we use Root Mean Square (RMS) values when talking about AC voltage*, the following was stated:

> This RMS is a mathematical quantity (used in many math fields) used to compare both alternating and direct currents (or voltage). In other words (as an example), the RMS value of AC (current) is the direct current which when passed through a resistor for a given period of time would produce the same heat as that produced by alternating current when passed through the same resistor for the same time.
The RMS value, for a sinusoidal voltage with peak amplitude $V_\mathrm{p}$, is given by:
$$V_{\mathrm{RMS}} = {V_\mathrm{p} \over {\sqrt 2}}$$
Here is where my intuition conflicts with this (and probably goes astray).

I'd imagine the average voltage "felt" by the circuit (direction insignificant, i.e. the absolute value) would equal the true average value of the voltage, found by integrating over a half period and dividing by the length of that half period (the standard mathematical procedure for finding the average height of a function over an interval).

That is, the average voltage, it seems to me, should be given by:
$$V_{\mathrm{avg.}} = {2V_\mathrm{p} \over {\pi}}$$
I know that the two conversion coefficients are close, but I simply cannot see why the RMS value is the one that conforms to reality. Please enlighten me! :)
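To make the comparison concrete, here is a small numerical sketch (the values $V_\mathrm{p} = 1\,\mathrm{V}$ and $R = 1\,\Omega$ are arbitrary choices for illustration). It samples one period of a sinusoid and computes the rectified average, the RMS value, and the actual average heating power dissipated in the resistor:

```python
import math

# Assumed illustration values: peak voltage (V) and resistance (ohms).
V_p, R = 1.0, 1.0
N = 1_000_000  # samples over one full period

# One period of v(t) = V_p * sin(2*pi*t/T), sampled uniformly.
samples = [V_p * math.sin(2 * math.pi * k / N) for k in range(N)]

# Rectified average: mean of |v(t)|, which approaches 2*V_p/pi.
v_avg = sum(abs(v) for v in samples) / N

# RMS: square root of the mean of v(t)^2, which approaches V_p/sqrt(2).
v_rms = math.sqrt(sum(v * v for v in samples) / N)

# True average heating power: mean of the instantaneous power v(t)^2 / R.
p_avg = sum(v * v for v in samples) / (N * R)

print(v_avg, 2 * V_p / math.pi)           # both ~0.6366
print(v_rms, V_p / math.sqrt(2))          # both ~0.7071
print(p_avg, v_rms**2 / R, v_avg**2 / R)  # ~0.5, ~0.5, ~0.405
```

The sketch shows why RMS is the physically relevant quantity for heating: dissipated power goes as the square of the instantaneous voltage, so the averaging must happen *after* squaring. The DC voltage that produces the same power as `p_avg` is `v_rms`, while `v_avg**2 / R` under-predicts the actual dissipation.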