I am computing the energy efficiency of a wireless system, given by the ratio of the number of bits received to the amount of energy consumed:
$$EE=\dfrac{bits}{joules}$$
For a given scenario, I have an algorithm that outputs the value of \$bits\$ and the value of \$joules\$. My issue arises when I compute the average \$EE\$ across different (independent) scenarios.
Assume I run the algorithm for \$1000\$ scenarios and I obtain:
- \$bits = 1\ \mathrm{Mb}\$ and \$joules = 1\ \mathrm{J}\$ for \$999\$ scenarios; and
- \$bits = 1\ \mathrm{Mb}\$ and \$joules = 10^{-5}\ \mathrm{J}\$ for \$1\$ scenario.
Now, on average, I have \$\bar{bits} = 1\ \mathrm{Mb}\$ and \$\bar{joules} \approx 999\times10^{-3}\ \mathrm{J} \approx 1\ \mathrm{J}\$. Taking the ratio of these averages gives an average energy efficiency of \$\bar{EE} \approx 1\ \mathrm{Mb/J}\$.
If I instead compute the \$EE\$ for each scenario (which I think is the correct way to do this), I get:
- \$EE = 1\ \mathrm{Mb/J}\$ for \$999\$ scenarios; and
- \$EE = 10^5\ \mathrm{Mb/J}\$ for \$1\$ scenario.
This gives an average of \$\bar{EE} \approx 101\ \mathrm{Mb/J}\$.
Should I compute the average \$EE\$ as the average of \$bits\$ divided by the average of \$joules\$, or as the average of the individual \$EE\$ values computed for each scenario?
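For reference, here is a minimal Python sketch reproducing the two computations above (the array contents are just the example values from this question, not the output of my actual algorithm):

```python
import numpy as np

# Hypothetical per-scenario results matching the example above:
# 999 scenarios with 1 Mb / 1 J, plus 1 scenario with 1 Mb / 1e-5 J.
bits = np.full(1000, 1.0)     # Mb received in each scenario
joules = np.full(1000, 1.0)   # J consumed in each scenario
joules[-1] = 1e-5

# Method 1: ratio of the averages (equivalent to total bits / total energy)
ee_ratio_of_means = bits.mean() / joules.mean()

# Method 2: average of the per-scenario ratios
ee_mean_of_ratios = (bits / joules).mean()

print(f"ratio of averages : {ee_ratio_of_means:.3f} Mb/J")   # ~1.001
print(f"average of ratios : {ee_mean_of_ratios:.3f} Mb/J")   # ~100.999
```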