I am trying to calculate the surface temperature of the Sun using a copper strip. I have a temperature sensor to measure the temperature of the strip, and that's it. I assume the rate at which energy is supplied by the Sun is constant at my location and time of day. I can plot the temperature of the strip at various times and get the gradient, which (assuming the plot is linear) will be
\begin{equation} \frac{\mathrm{d}T}{\mathrm{d}t} = \text{constant}. \end{equation}
Since I know that
\begin{equation} \mathrm{d}Q = mc\,\mathrm{d}T, \end{equation}
I can differentiate it with respect to time, and since I know the rate of change of temperature,
\begin{equation} \frac{\mathrm{d}Q}{\mathrm{d}t} = mc\frac{\mathrm{d}T}{\mathrm{d}t}. \end{equation}
Thus
\begin{equation} \frac{\mathrm{d}Q}{\mathrm{d}t} = mc\cdot\text{constant}. \end{equation}
From here I can equate this to the Stefan–Boltzmann law and solve for the temperature of the Sun:
\begin{equation} mc\cdot\text{constant} = \varepsilon \sigma A T^{4}. \end{equation}
This is where I get stuck. I am including the emissivity constant because the atmosphere absorbs some of the power. I am also taking the surface area of the Sun as a known constant, so I can rearrange and solve for $T$. Do you think this approach is valid, or are there mistakes in it? Are there any factors I am neglecting? I am fine with higher-level calculus if necessary. Sorry for my formatting; I am new to this site.
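For reference, here is a rough sketch of the calculation I am planning to run once I have my data. All the numbers below (mass, emissivity, temperature gradient) are placeholders rather than real measurements; it just rearranges my final equation for $T$:

```python
# Sketch of the planned calculation: solve m*c*(dT/dt) = eps*sigma*A*T^4 for T.
# All input values are placeholders, not measurements.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W m^-2 K^-4

# Copper strip (placeholder values)
m = 0.05               # mass of the strip, kg
c = 385.0              # specific heat of copper, J kg^-1 K^-1
dT_dt = 0.02           # measured temperature gradient, K s^-1 (placeholder)

# Radiating body (placeholder values)
emissivity = 0.7       # guess at effective emissivity / atmospheric absorption
A_sun = 6.09e18        # surface area of the Sun, m^2 (4*pi*R_sun^2)

# Rate at which the strip absorbs heat
dQ_dt = m * c * dT_dt  # W

# Rearranged Stefan-Boltzmann relation from my last equation
T = (dQ_dt / (emissivity * SIGMA * A_sun)) ** 0.25
print(f"Implied temperature: {T:.2f} K")
```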