$\begingroup$

I've been trying to understand how we can equate the Boltzmann entropy $k_B \ln \Omega$ with the entropy from thermodynamics. I'm following the approach found in the first chapter of Pathria's statistical mechanics text, and in many other texts. Many other questions on Stack Exchange come close to addressing this problem, but I don't think any of the answers get at my specific question.

So, we're considering two isolated systems 1 and 2, which are brought into thermal contact and allowed to exchange energy (let's assume for simplicity that they can only exchange energy). On the thermodynamic side of the problem, we have the necessary and sufficient condition for thermal equilibrium

$$T_1=\frac{\partial E_1}{\partial S_1}=T_2=\frac{\partial E_2}{\partial S_2},$$

where the temperatures $T_1$ and $T_2$, the internal energies $E_1$ and $E_2$, and the entropies $S_1$ and $S_2$ are all defined appropriately in operational, thermodynamic terms. On the other hand, we can show that the necessary and sufficient condition for equilibrium from the standpoint of statistical mechanics is given by

$$\beta_1 \equiv \frac{\partial \ln \Omega_1}{\partial E_1}= \beta_2 \equiv \frac{\partial \ln \Omega_2}{\partial E_2}.$$

Here, $\Omega_1$ and $\Omega_2$ are the number of microstates associated with the macrostate of each system. Now, since both of these relations are necessary and sufficient for equilibrium, one equality holds if and only if the other also holds. My question is: How can we proceed from here to show that $S=k_B \ln \Omega$, without limiting our scope to specific examples (like an ideal gas)? In Pathria's text and in other treatments, I don't see much explanation for how this step is justified.

My possibly wrong thoughts are: It seems like we first need to show that $\beta$ is a function of $T$ alone (and indeed the same function of $T$ for both systems), and then show that the form of this function is in fact $\beta \propto T^{-1}$. But I'm not sure how to prove either of those claims.
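To make the setup concrete, here is a quick numerical sketch (my own toy example, not from Pathria) using two Einstein solids exchanging energy quanta. The most probable split of the energy is the one where the discrete analogues of $\beta_1$ and $\beta_2$ agree:

```python
import math

# Toy check of the equilibrium condition beta_1 = beta_2 (my own
# illustration): two Einstein solids with N1 and N2 oscillators share
# q_total energy quanta; Omega(N, q) = C(q + N - 1, q).
def ln_omega(N, q):
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

N1, N2, q_total = 300, 600, 900

# The most probable split of the energy maximizes ln(Omega1 * Omega2).
q1_star = max(range(q_total + 1),
              key=lambda q1: ln_omega(N1, q1) + ln_omega(N2, q_total - q1))

# Discrete analogue of beta = d(ln Omega)/dE, measuring E in quanta.
def beta(N, q):
    return ln_omega(N, q + 1) - ln_omega(N, q)

print(q1_star)  # close to 300: energy divides in proportion to system size
print(abs(beta(N1, q1_star) - beta(N2, q_total - q1_star)) < 0.01)  # True
```

The most probable macrostate puts equal energy per oscillator in both solids, and at that split the two finite-difference $\beta$'s coincide to within the discreteness of the model.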

$\endgroup$
  • $\begingroup$ I think the reasoning is: if you define the temperature $T = \frac 1{k_B\beta}$, then the entropy can be $k_B\ln \Omega$. $\endgroup$
    – user115350
    Commented Jul 24, 2016 at 18:12
  • 2
    $\begingroup$ @user115350 I agree that that would be enough. So, put another way, my question is: how can we know that? How can we prove the relation $\beta = 1/k_B T$, when $T$ is defined in thermodynamic terms and $\beta$ is defined in terms of microstate numbers? $\endgroup$ Commented Jul 24, 2016 at 20:52
  • 1
    $\begingroup$ Based on statistical theory, entropy is defined first: $S=k_B \ln \Omega$. Then the thermodynamic temperature is defined as $\frac1{T}=\frac{\partial S}{\partial E}$, which leads to $T=\frac1{k_B\beta}$. The path is very clear. I guess your question might be: how can one prove that the statistical entropy is the classical entropy? I think there is a lot of discussion of this in textbooks. $\endgroup$
    – user115350
    Commented Jul 25, 2016 at 19:38
  • 2
    $\begingroup$ @user115350 Right, I'm trying to show that the statistical (Boltzmann) entropy and the thermodynamic entropy are the same (and I'm most interested in a very general proof of this equality, one which isn't restricted to specific systems like ideal gases). We can certainly define the statistical mechanical entropy as $S= k_B \ln \Omega$ if we so choose, but this definition will be worthless if we can't show how it is linked to the thermodynamic entropy, a quantity that we can actually measure in terms of energy flows and temperature changes. $\endgroup$ Commented Jul 25, 2016 at 20:01
  • $\begingroup$ @user115350 And to add a little more motivation: I've seen several proofs of the equality of the statistical mechanical and thermodynamic entropies in various works, but so far I've never been fully convinced by any of them. All of them seem to apply only in limited cases, and not to be fully general. The reason I posted this is that the line of argument outlined in my post looks like a very airtight proof of the equality in the general case, except for that last step. So I wanted to try to clarify the logic that goes into making the final step of the proof. $\endgroup$ Commented Jul 25, 2016 at 20:12

6 Answers

$\begingroup$

$\newcommand{\mean}[1] {\left< #1 \right>}$ $\DeclareMathOperator{\D}{d\!}$ $\DeclareMathOperator{\pr}{p}$

Proof that $\beta = \frac{1}{k T}$ and that $S = k \ln \Omega$

This proof follows from only classical thermodynamics and the microcanonical ensemble. It makes no assumptions about the analytic form of statistical entropy, nor does it involve the ideal gas law.

Pressure $P$ in the microcanonical ensemble

First recall that in a system described by the microcanonical ensemble, there are $\Omega$ possible microstates of the system. The pressure of an individual microstate $i$ is given from mechanics as:

\begin{align} P_i &= -\frac{\D E_i}{\D V} \end{align}

When assuming only $P$-$V$ mechanical work, the energy of a microstate $E_i(N,V)$ is only dependent on two variables, $N$ and $V$. Therefore, at constant composition $N$,

\begin{align} P_i &= -\left( \frac{\partial E_i}{\partial V} \right)_N \end{align}

The energy change of an individual microstate is trivially independent of the number of microstates $\Omega$ in the ensemble. Therefore, the pressure of an individual microstate can also be expressed as

\begin{align} P_i &= -\left( \frac{\partial E_i}{\partial V} \right)_{\Omega,N} \end{align}

According to the Gibbs postulate of statistical mechanics, the macroscopic pressure of a system is given by the statistical average of the pressures of the individual microstates:

\begin{align} P = \mean{P} &= \sum_i^\Omega \pr_i P_i \end{align}

where $\pr_i$ is the equilibrium probability of microstate $i$. For a microcanonical ensemble, all microstates have the same energy $E_i = E$, where $E$ is the energy of the system. Therefore, from the fundamental assumption of statistical mechanics, all microcanonical microstates have the same probability at equilibrium

\begin{align} \pr_i = \frac{1}{\Omega} \end{align}

It follows that the pressure of a microcanonical system is given by

\begin{align} P = \mean{P} &= -\sum_i^\Omega \frac{1}{\Omega} \left( \frac{\partial E_i}{\partial V} \right)_{\Omega,N} \\ &= -\left( \frac{\partial \left( \frac{\sum_i^\Omega E_i}{\Omega} \right) }{\partial V} \right)_{\Omega,N} \tag{1}\label{eq1} \end{align}

The macroscopic energy $E$ of a microcanonical system is given by the average of the energies of the individual microstates in the ensemble:

\begin{align} E = \mean{E} &= \frac{\sum_i^\Omega E_i}{\Omega} \end{align}

Substituting this into $\eqref{eq1}$ above, we see that the pressure of a microcanonical system $P$ can also be expressed as

\begin{align} P &= -\left( \frac{\partial E}{\partial V} \right)_{\Omega,N} \tag{2}\label{eq2} \end{align}

This expression for the pressure $P$ of a microcanonical system (derived wholly from statistical mechanics) can be compared to the classical expression

\begin{align} P &= -\left( \frac{\partial E}{\partial V} \right)_{S,N} \end{align}

which immediately suggests a functional relationship between entropy $S$ and $\Omega$.
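As an illustrative numerical check of this identification (a toy model of my own, not part of the derivation above): take microstates with $E_i(V) = c_i V^{-2/3}$, as for particles in a box, and compare the ensemble average of the microstate pressures with $-\partial \langle E \rangle / \partial V$:

```python
import math

# Numeric sketch of Eq. (2) (illustrative model, not from the derivation):
# microstates with E_i(V) = c_i / V**(2/3), as for particles in a box.
# The ensemble pressure <P> = sum p_i * (-dE_i/dV), with p_i = 1/Omega,
# should equal -d<E>/dV at fixed Omega.
c = [1.0, 2.0, 3.0, 4.0]            # per-microstate coefficients c_i
Omega = len(c)

def mean_E(V):
    return sum(ci / V ** (2 / 3) for ci in c) / Omega

V, h = 2.0, 1e-6
P_micro = sum((2 / 3) * ci / V ** (5 / 3) for ci in c) / Omega  # <P>
P_macro = -(mean_E(V + h) - mean_E(V - h)) / (2 * h)            # -d<E>/dV
print(abs(P_micro - P_macro) < 1e-8)  # True: the two pressures agree
```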

Identification of $\frac{1}{\beta} \D \ln\Omega$ with $T \D S$

In the microcanonical ensemble, energy $E$ is a function of $\Omega$, $V$, and $N$. We now take the total differential of the energy $E(\Omega, V, N)$ for a microcanonical system at constant composition $N$:

\begin{align} \D E = \left(\frac{\partial E}{\partial \ln \Omega}\right)_{V, N} \D \ln\Omega + \left(\frac{\partial E}{\partial V}\right)_{\ln \Omega, N} \D V \tag{3}\label{eq3} \end{align}

As stated in the OP, for the microcanonical ensemble, the condition for thermal equilibrium is sharing the same value of $\beta$:

\begin{align} \beta &= \left( \frac{\partial \ln \Omega}{\partial E} \right)_{V,N} \tag{4}\label{eq4} \end{align}

After substituting \eqref{eq2} and \eqref{eq4} into \eqref{eq3} we have:

\begin{align} \D E = \frac{1}{\beta} \D \ln\Omega - P \D V \end{align}

Compare with the classical first law of thermodynamics for a system at constant composition $N$:

\begin{align} \D E = T \D S - P \D V \end{align}

Comparing these two expressions for $\D E$ term by term, we see that

\begin{align} T \D S &= \frac{1}{\beta} \D \ln\Omega \\ \D S &= \frac{1}{T \beta} \D \ln\Omega \\ \D S &= k \D \ln\Omega \tag{5}\label{eq5} \end{align}

where

\begin{align} k &= \frac{1}{T \beta} \end{align}

$k$ is a universal constant independent of state and composition

At this point it is not immediately obvious that $k$ is a constant. In $\eqref{eq5}$, $k$ could be a function of $\ln \Omega$ and/or a function of $N$ (since $N$ was held constant in the derivation of $\eqref{eq5}$). In this section we show that $k$ is a universal constant, independent of all state functions, including $N$.

Recall from the derivation of $\beta$ that when two or more systems are in thermal equilibrium they necessarily share the same $\beta$ and the same $T$, yet they generally share no other state function (i.e., each system can have its own $E$, $\Omega$, $V$, $N$, composition, etc.). Thus $\beta(T)$ is an invertible function of $T$ and only $T$. More formally, we have

\begin{align} \beta(T,X) = \beta(T) \end{align}

for all state functions $X$.

It follows that $k = \frac{1}{T \beta(T)}$ likewise can be a function of only $T$ ($k$ could also of course be a constant); $k$ is not a function of $N$. From $\eqref{eq5}$, both $\D S$ and $\D \ln\Omega$ are exact differentials, so $k$ must be either a function of $\Omega$ or a constant. But we have already established that $k$ cannot be a function of any state variable other than $T$, so therefore $k$ must be a constant, independent of $\Omega$, $N$, and all other state functions.

Alternatively, since $\D S$ and $\D \ln\Omega$ are both extensive quantities, $k$ cannot depend on $\Omega$ and must be a constant (see Addendum below for detailed proof).

Therefore

\begin{align} \beta &= \frac{1}{k T} \end{align}

where $k$ is a universal constant that is independent of composition and state.

Integration and third law give $S = k \ln\Omega$

By integrating $\eqref{eq5}$ over $E$ and $V$, we have

\begin{align} S &= k \ln\Omega + C(N) \end{align}

where $C$ is a constant that is independent of $E$ and $V$, but may depend on $N$.

By invoking the third law (essentially $S=0$ when $T=0$ for all pure perfect crystalline systems) we conclude that $C$ must be independent of $N$, composition, and all other state functions. Since the constant $C$ is empirically vacuous, we can thus freely choose to set $C=0$ and arrive at the famous Boltzmann expression for the entropy of a microcanonical system:

\begin{align} S &= k \ln\Omega \end{align}
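As a numerical sanity check of the result (a sketch in units where $k = \hbar\omega = 1$, using the Einstein solid as a test system, which is my own choice and not part of the derivation): the microcanonical $\beta = \partial \ln\Omega / \partial E$ should agree with the $\beta$ in the canonical occupation relation $q = N/(e^{\beta} - 1)$.

```python
import math

# Illustrative cross-check (units k = hbar*omega = 1): for an Einstein
# solid of N oscillators holding q quanta, Omega = C(q + N - 1, q). The
# microcanonical beta = d(ln Omega)/dE should match the beta obtained by
# inverting the canonical-ensemble relation q = N / (exp(beta) - 1).
def ln_omega(N, q):
    return math.lgamma(q + N) - math.lgamma(q + 1) - math.lgamma(N)

N, q = 100_000, 50_000
beta_micro = ln_omega(N, q + 1) - ln_omega(N, q)   # discrete d(lnOmega)/dE
beta_canon = math.log(1.0 + N / q)                 # from q = N/(e^beta - 1)
print(abs(beta_micro - beta_canon) < 1e-3)         # True for large N, q
```

The agreement improves as $N$ and $q$ grow, consistent with the derivation being a statement about macroscopic systems.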


ADDENDUM:

Proof that $k=\frac{1}{T \beta}$ is a constant and cannot be a function of $\Omega$

This follows directly from the definition of an extensive function.
Let's start with this relation

\begin{align} \D S &= \frac{1}{T \beta} \D \ln\Omega \end{align}

and express it for clarity as

\begin{align} \D S &= k \D Z \end{align}

where $Z = \ln\Omega$ and $k = \frac{1}{T \beta}$ is either a constant or a function of $Z$ (and consequently of $\Omega$). Because this equation expresses a total differential for two exact differentials, $S$ is a function of only $Z$. We rewrite this equation to explicitly incorporate these features:

\begin{align} \D S(Z) &= k(Z) \D Z \tag{6}\label{eq6} \end{align}

Now from the definition of an extensive function we know that

\begin{align} \D S(\lambda Z) &= \lambda \D S(Z) \end{align}

where $\lambda$ is an arbitrary constant scalar factor.

Also,

\begin{align} \D S(\lambda Z) &= k(\lambda Z) \D (\lambda Z) \\ \lambda \D S(Z) &= \lambda k(\lambda Z) \D Z \\ \D S(Z) &= k(\lambda Z) \D Z \end{align}

by comparison with equation $\eqref{eq6}$ above we see that this implies

\begin{align} k(\lambda Z) &= k(Z) \end{align}

and so we have shown that $k$ is a constant independent of $Z$ and $\Omega$.
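A tiny numeric illustration of this scaling argument (my own example): a linear $S(Z)$, for which $k(Z) = \D S / \D Z$ is constant, passes the extensivity test $S(\lambda Z) = \lambda S(Z)$, while a quadratic $S(Z)$, whose $k(Z) = 2Z$ is non-constant, fails it.

```python
# Toy illustration of the addendum (my own example): if dS = k(Z) dZ with
# S extensive, k must be constant. Linear S(Z) = 2Z (constant k) satisfies
# S(lam*Z) = lam*S(Z); S(Z) = Z**2 (k(Z) = 2Z) does not.
def extensive(S, Z=5.0, lam=3.0, tol=1e-12):
    return abs(S(lam * Z) - lam * S(Z)) < tol

print(extensive(lambda Z: 2 * Z))   # True
print(extensive(lambda Z: Z ** 2))  # False
```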

$\endgroup$
  • 1
    $\begingroup$ I appreciated your answer. However, looking again at it, I got some doubts about the step where you conclude that $\frac{1}{\beta T}$ should be constant. I can imagine a non-linear relation between $S$ and $\ln \Omega$ even if $dS$ and $d \ln \Omega$ are exact differentials and extensive. Could you add some more details about this crucial step? $\endgroup$ Commented Nov 20, 2022 at 22:55
  • $\begingroup$ @GiorgioP Please see the addendum to my original post above. $\endgroup$
    – ratsalad
    Commented Nov 22, 2022 at 14:13
  • 1
    $\begingroup$ Ok, now that step is clear. I think that for a complete proof, a couple of additional points should be addressed. The first is the dependence of the entropy on $N$. But that is a minor point. The second is the justification of the formula $P=-\frac{\partial E}{\partial V}$ in the microcanonical ensemble. You seem to assume the equality between that derivative and the derivative of $E_i$. However, energies of different configurations do not have the same volume derivative (they also depend on the atomic coordinates). I think the solution requires using the virial theorem to get the pressure. $\endgroup$ Commented Nov 22, 2022 at 15:58
  • $\begingroup$ @GiorgioP Thanks for the comments. Re: the second point, you are correct that $\frac{\partial E}{\partial V} \neq \frac{\partial E_i}{\partial V}$. I have made a simple modification to the derivation (a bit more convoluted now) that avoids referencing the virial theorem. $\endgroup$
    – ratsalad
    Commented Nov 22, 2022 at 20:00
  • $\begingroup$ Hi @ratsalad! Thanks for your thoughtful work on this answer. I see a potential issue with your addendum. You only derive the equation $dS = \frac{1}{T\beta} d \ln \Omega$ for fixed $N$ (for example, you invoke $dE =T dS - P dV + \mu dN$ with $dN = 0$). Therefore, $dS = \frac{1}{T\beta} d \ln \Omega$ tells us that at constant $N$, $S$ is purely a function of $\Omega$. However, this functional relationship between $S$ and $\Omega$ need not be the same for each $N$; in other words, $S$ may be a function of $N$ as well. This seems to spoil the rest of the proof. Curious to hear your thoughts. $\endgroup$ Commented Jan 16, 2023 at 4:43
$\begingroup$

As discussed in the comments, your proof needs to show that: $$ \beta = \frac{1}{kT} $$ Following Gaskell's "Introduction to Thermodynamics", I think that this is a definition. The rationale comes from looking at $\beta$ as a parameter which controls the shape of the Boltzmann distribution of energy among particles: $$ n_{i}=\frac{ne^{-\beta E_i}}{P} $$ where $n$ is the total number of particles, $E_i$ is the $i^{th}$ energy level, $n_i$ is the occupation of the $i^{th}$ energy level, and $P$ is the partition function.

Having $\beta$ and $T$ inversely proportional makes sense because, as the plot of occupation vs. energy below shows, you would expect the higher energy states to become more occupied when the temperature is raised. This happens when $\beta$ is lowered.

[Plot: occupation $n_i$ versus energy $E_i$ for two values of $\beta$; the smaller-$\beta$ (higher-$T$) curve puts more occupation in the high-energy states.]
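The behavior shown in the plot can be sketched numerically (hypothetical energy levels of my own choosing, $k$-free units):

```python
import math

# Sketch of the plot described above (hypothetical levels): occupation of
# energy levels E_i = 0..4 under n_i/n = exp(-beta*E_i)/P for two betas.
levels = [0, 1, 2, 3, 4]

def occupations(beta):
    w = [math.exp(-beta * E) for E in levels]
    P = sum(w)                      # the partition function
    return [x / P for x in w]

cold = occupations(beta=2.0)        # large beta  <-> low temperature
hot = occupations(beta=0.5)         # small beta  <-> high temperature

# Every excited level is more occupied when beta is lowered (T raised):
print(all(h > c for h, c in zip(hot[1:], cold[1:])))  # True
```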

$\endgroup$
  • $\begingroup$ Good answer, but I really would like someone to say that we're not "proving" anything here but simply giving a motivation: we can make Boltzmann's / Shannon's conception of entropy behave in important ways very like the macroscopic, Clausius definition by postulating that the Lagrange multiplier $\beta$ is the reciprocal temperature (modulo a scale factor): this is what you are showing here. We then find that statistical mechanics foretells experimental results if we assume this. So it's really a theoretical hunch backed by experiment .... $\endgroup$ Commented Dec 12, 2016 at 9:25
  • $\begingroup$ ...See my answer here for more information. $\endgroup$ Commented Dec 12, 2016 at 9:26
  • 1
    $\begingroup$ @WetSavannaAnimalakaRodVance Hey, thanks for the response. I did some more research after asking this, and I'm now of the tentative opinion that it is possible to prove the connection. It's impossible to lay out my whole argument in this comment, but basically the idea is: since we can prove that $\beta = 1/T$ ($T$ here is the ideal gas temperature) for the ideal gas case, and we can show empirically that the ideal gas $T$ is proportional to the thermodynamic $T$, we can use the ideal gas as a "bridge" between thermo and stat mech to show that the thermodynamic $T$ and $1/\beta$ must be proportional for any system $\endgroup$ Commented Dec 12, 2016 at 21:12
$\begingroup$

$\newcommand{\mean}[1] {\left< #1 \right>}$ $\DeclareMathOperator{\D}{d\!}$

Proof that $\beta = \frac{1}{k T}$ for the canonical ensemble

This proof assumes only classical thermodynamics and the Boltzmann distribution of microstates. It assumes nothing about statistical entropy.

First recall the statistical mechanics expressions for the pressure $P$ and internal energy $E$ of a system

\begin{align} \label{eq:sm_pressure} P = \mean{P} = \frac{1}{\beta} \left( \frac{\partial \ln Z}{\partial V} \right)_{\beta, N} \end{align}

\begin{align} \label{eq:sm_energy} E = \mean{E} = -\left( \frac{\partial \ln Z}{\partial \beta} \right)_{V, N} \end{align}

where $Z$ is the partition function. These can both be derived from the Boltzmann distribution of microstates.
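As a quick numerical sanity check of the energy relation (a toy two-level system of my own, in $k$-free units): a finite-difference derivative of $\ln Z$ should reproduce the direct Boltzmann average of the energy.

```python
import math

# Toy check (two-level system, energies 0 and 1): the statistical
# mechanics energy E = -d(ln Z)/d(beta), evaluated here by a central
# finite difference, should match the direct Boltzmann average <E>.
ENERGIES = (0.0, 1.0)

def ln_Z(beta):
    return math.log(sum(math.exp(-beta * e) for e in ENERGIES))

def mean_E(beta):
    w = [math.exp(-beta * e) for e in ENERGIES]
    return sum(wi * e for wi, e in zip(w, ENERGIES)) / sum(w)

beta, h = 1.5, 1e-6
E_from_lnZ = -(ln_Z(beta + h) - ln_Z(beta - h)) / (2 * h)
print(abs(E_from_lnZ - mean_E(beta)) < 1e-6)  # True
```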

Let's take the partial derivative of the pressure with respect to $\beta$, holding $V$ and $N$ constant.

\begin{align} \left( \frac{\partial P}{\partial \beta} \right)_{V,N} &= \frac{1}{\beta} \left( \frac{\partial^2 \ln Z}{\partial \beta ~\partial V } \right)_{N} -\frac{1}{\beta^2} \left( \frac{\partial \ln Z}{\partial V} \right)_{\beta, N} \\ &= \frac{1}{\beta} \left( \frac{\partial^2 \ln Z}{\partial \beta ~\partial V } \right)_{N} -\frac{1}{\beta} \mean{P} \end{align}

Now we take the partial derivative of the energy with respect to volume $V$, holding $\beta$ and $N$ constant.

\begin{align} \left( \frac{\partial E}{\partial V} \right)_{\beta, N} &= -\left( \frac{\partial^2 \ln Z}{\partial \beta ~\partial V} \right)_{N} \end{align}

Combining these two partial derivatives gives

\begin{align} \label{eq:sm_pressure_eq} -\mean{P} &= \left( \frac{\partial E}{\partial V} \right)_{\beta, N} + \beta \left( \frac{\partial P}{\partial \beta} \right)_{N,V} \end{align}

This equation, which was derived completely from statistical mechanics assumptions, can be compared with a famous analogous equation from classical thermodynamics "the thermodynamic equation of state":

\begin{align} \label{eq:ct_pressure_eq} -P &= \left( \frac{\partial E}{\partial V} \right)_{T, N} - T \left( \frac{\partial P}{\partial T} \right)_{N,V} \end{align}

Because $\mean{P} = P$, we can combine these two equations:

\begin{align} \left( \frac{\partial E}{\partial V} \right)_{\beta, N} + \beta \left( \frac{\partial P}{\partial \beta} \right)_{N,V} &= \left( \frac{\partial E}{\partial V} \right)_{T, N} - T \left( \frac{\partial P}{\partial T} \right)_{N,V} \\ \label{eq:eq1} \left( \frac{\partial E}{\partial V} \right)_{\beta, N} + \left( \frac{\partial P}{\partial \ln \beta} \right)_{N,V} &= \left( \frac{\partial E}{\partial V} \right)_{T, N} - \left( \frac{\partial P}{\partial \ln T} \right)_{N,V} \qquad \textrm{because $\D \ln x = \frac{\D x}{x}$} \end{align}

These two expressions can agree identically only when

\begin{align} \D \ln \beta &= - \D \ln T \end{align}

(at constant composition $N$).

Integrating both sides:

\begin{align} \ln \beta &= - \ln T -\ln k(N) \\ \beta(T) &= \frac{1}{k T} \end{align}

Because $\beta$ is a function of only $T$ and is independent of composition (i.e., $\beta(T,N) = \beta(T)$), the integration constant $k$ is a universal constant which is also independent of composition and thermodynamic state.

$\endgroup$
  • $\begingroup$ A late comment. Your "proof" goes in the right direction, but there is a problem. Your starting formulae for $E$ and $P$ are valid in the canonical ensemble, while Boltzmann's expression for the entropy, as in the question, is valid in the microcanonical ensemble. $\endgroup$ Commented Mar 24, 2021 at 7:21
  • $\begingroup$ My proof is more general, and applies to the OP's question, since Boltzmann's entropy can be derived from the canonical ensemble for the special case when all microstates have the same energy. $\endgroup$
    – ratsalad
    Commented Mar 25, 2021 at 12:14
  • $\begingroup$ This is not correct. The microstates of the canonical ensemble do not have the same energy. The fact that a large majority of them has energy very close to the average energy is not a good reason for ignoring the deviations. As a matter of fact, the correct expression for the specific heat in the canonical ensemble directly exploits the non-zero variance of the energy distribution. $\endgroup$ Commented Mar 25, 2021 at 14:59
  • $\begingroup$ Imagine a quantum mechanical canonical ensemble, in which we lower the temperature until only the ground states are accessible. Then the energy variance is zero, as well as the heat capacity, as is well known experimentally. All microstates have the same energy, and the entropy is given by an expression equivalent to the Boltzmann expression, dependent only on the log of the redundancy of the ground state. $\endgroup$
    – ratsalad
    Commented Mar 26, 2021 at 15:23
  • $\begingroup$ @GiorgioP You motivated me to produce a similar "proof" specifically for the microcanonical ensemble. It also addresses the OP's desire to derive the Boltzmann entropy equation. See my latest answer. $\endgroup$
    – ratsalad
    Commented Mar 27, 2021 at 21:25
$\begingroup$

I will present the logic, omitting mathematical details which you can fill in from a textbook.

First establish that the Boltzmann distribution is the one which maximises number of microstates under constraints of fixed total energy and particle number. (A convenient method is the Lagrange multiplier method). In this distribution, $\beta$ is a parameter whose physical interpretation is next to be discovered.

Now take two separate systems and treat them first as independent, with $Z = Z_1 Z_2$, and then as a single larger system with $Z$ to be discovered. For this you allow the systems to be weakly interacting which means they can exchange energy but their energy levels and degeneracies are not affected by the interaction. You thus discover that the equilibrium of the joint system (the macrostate with the most microstates) is the one where $\beta_1 = \beta_2$ and the result is otherwise general.

It follows from this that $\beta$ must be a function of temperature alone, and furthermore a universal function (same for all systems). These facts are thus derived not assumed. To find out which function it is, you can study any one system and you soon find that $\beta = 1 / k_{\rm B} T$.

OK so far so good.

Now return to $$\beta = \frac{\partial \ln \Omega}{\partial E}$$ and recall the thermodynamic $$ \frac{1}{T} = \frac{\partial S}{\partial E}. $$ Now that we know $\beta = 1 / k_{\rm B} T$ these results strongly suggest the association $$ S = k_{\rm B} \ln \Omega . $$ To make sure this is indeed implied one needs to check some details, such as what is held fixed in the partial derivatives, and that the reasoning does not require any one particular type of physical system.
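The first step above, that the Boltzmann distribution maximizes the number of microstates (equivalently, the Shannon entropy) at fixed mean energy, can be checked numerically in a small example (levels and $\beta$ chosen arbitrarily by me):

```python
import math

def shannon(p):
    # Shannon entropy -sum p ln p (k = 1)
    return -sum(x * math.log(x) for x in p if x > 0)

# Boltzmann distribution for levels E = (0, 1, 2) at beta = 1:
E = (0.0, 1.0, 2.0)
beta = 1.0
w = [math.exp(-beta * e) for e in E]
Z = sum(w)
p_boltz = [x / Z for x in w]

# The direction v = (1, -2, 1) preserves both constraints: sum(v) = 0
# (normalization) and sum(v_i * E_i) = 0 (mean energy).
perturbed = [[pb + t * vi for pb, vi in zip(p_boltz, (1.0, -2.0, 1.0))]
             for t in (-0.05, -0.02, 0.02, 0.05)]

# Any constraint-preserving perturbation lowers the entropy:
print(all(shannon(p) < shannon(p_boltz) for p in perturbed))  # True
```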

$\endgroup$
  • $\begingroup$ Why would you assume that the $\beta$ from the Boltzmann distribution is the same $\beta$ from the microcanonical (i.e. $\beta = \frac{\partial \ln \Omega}{\partial E}$)? That seems to be a rather bold postulate. $\endgroup$
    – ratsalad
    Commented Jun 21, 2023 at 17:50
  • $\begingroup$ @ratsalad that need not be assumed; to show they agree one invokes long-standing standard statistical mechanics arguments which are not the central point here $\endgroup$ Commented Jun 21, 2023 at 19:25
  • $\begingroup$ In the Lagrange multiplier method you suggest, $\beta$ is a Lagrange multiplier associated with the constraint of constant average energy for a canonical system. It is far from obvious that this Lagrange multiplier should be exactly the same function as the partial change in $\ln \Omega$ with respect to the energy of a microcanonical ensemble. You and I both know, a posteriori, that they are in fact equivalent, but deriving that equivalence is precisely what must be done in your argument if we are to conclude that $\beta = 1/kT$. $\endgroup$
    – ratsalad
    Commented Jun 21, 2023 at 19:51
$\begingroup$

There is no well-defined "thermodynamic entropy" outside of the Shannon or Von Neumann entropy because there is no well-defined concept of temperature at all without entropy. Entropy is foundational, and temperature is derived from it. And entropy is fundamentally statistical in nature.

In fact, it is entropy that should have its own base unit, with temperature having a derived unit. If entropy had a unit, $B$, then temperature would have the unit $J/B$.

$\endgroup$
  • $\begingroup$ I agree with the last comment and add that the proper unit of entropy, following Shannon/Von Neumann, should be unitless. The first part is iffy because there is an entropy in classical thermodynamics defined in terms of heat, temperature and reversible paths. How "well defined" this definition is is indeed questionable because classical $T$ is defined operationally, for example, via the ideal gas law, not in a fundamental way. Ultimately, thermodynamics is statistical. $\endgroup$
    – Themis
    Commented Jun 10, 2023 at 19:47
  • $\begingroup$ @Themis ideal gas temperature is defined via the ideal gas law, thermodynamic temperature is defined via the Carnot efficiency formula based on 2nd law, and absolute thermodynamic scale is thermodynamic scale zero fixed to be the lowest temperature (3rd law). $\endgroup$ Commented Aug 22, 2023 at 2:23
$\begingroup$

To go from statistical mechanics to thermodynamics, we assume that the quantity $\frac{\partial S}{\partial E}$ is equal to the inverse of the temperature. As for Boltzmann's relation, it can be verified by considering the case of coin flips.

$\endgroup$
  • 2
    $\begingroup$ Please refrain from using short forms and use MathJax to format equations. $\endgroup$
    – wavion
    Commented Apr 18, 2020 at 6:46
