From the Stefan-Boltzmann Law ($j^*=\sigma T^4$), the luminosity of the Sun is $L_{sun}=j^*A=[\sigma T^4][4\pi R_{sun}^2]=3.85\times 10^{26}$ W.
If I assume the Sun is in a steady state, then the power exiting the core (I will assume $R_{core}\sim 0.25R_{sun}$) should equal the power the Sun radiates into space at its surface.
If I set the power exiting the Sun's core (heading into the radiative zone) equal to the Sun's luminosity and again apply $j^*=\sigma T^4$ and $L=j^*A$, I find the temperature at the boundary between the core and the radiative zone is about 11,558 K (much lower than the $\sim 7{,}000{,}000$ K value quoted online).
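To show the numbers I'm using, here is a quick sketch of the calculation (the solar radius and Stefan-Boltzmann constant are standard values I'm supplying myself, since they aren't stated above):

```python
import math

# Standard constants (my assumed inputs)
sigma = 5.670e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
L_sun = 3.85e26         # solar luminosity, W
R_sun = 6.96e8          # solar radius, m
R_core = 0.25 * R_sun   # assumed core radius

# Invert L = sigma * T^4 * (4 pi R_core^2) for T
A_core = 4 * math.pi * R_core**2
T_boundary = (L_sun / (sigma * A_core)) ** 0.25
print(f"{T_boundary:.0f} K")  # roughly 1.2e4 K, nowhere near ~7e6 K
```

This is just the blackbody inversion described in the question, nothing more sophisticated.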
I realise that a more accurate picture would be to solve the equations of hydrostatic equilibrium, but I would think that if all the Sun's energy is created in the core, and the Sun operates in a steady state, then energy conservation would require that the energy exiting the core equal the luminosity of the Sun.
Is there something wrong in my reasoning about energy conservation, or are there ways the Sun emits energy formed in the core other than by blackbody radiation? Am I wrong to assume the Sun's core emits as a black body? (If so, I don't think an emissivity less than 1 would matter much, since I'm off by almost 3 orders of magnitude.)