$\begingroup$

I am not able to see why work wouldn't increase the entropy of a system. If I compress an ideal gas adiabatically and reversibly, the gas pressure inside increases, meaning more particles hit a unit area per unit time, and that seems to me a more chaotic state of the gas than before the compression. So why doesn't the entropy increase?

And the terms 'chaos' and 'randomness' that we use to describe entropy: what do they depend on? Velocity, pressure, or something else?

My understanding of entropy is that it is a measure of how unpredictable the system is, of how difficult it is to pin down a particular gas particle. Am I right in my understanding?

$\endgroup$
  • $\begingroup$ Your question seems sort of unclear. Are you asking about work and entropy or heat and entropy? $\endgroup$ Commented Jun 26, 2016 at 12:12

3 Answers

$\begingroup$

Reversible adiabatic expansions and compressions involve no change in entropy because no heat is exchanged during the process. Work done is not directly related to entropy. The relationship between temperature, heat exchange, and entropy in a reversible process is as follows:

$$TdS=\delta Q$$

Since a reversible adiabatic expansion occurs with no heat transfer, $\delta Q = 0$, there is by definition no change in entropy, $dS = 0$. In fact, reversible adiabatic processes are more properly called isentropic processes.

The key is reversibility. The general relation between entropy, temperature, and heat is

$$TdS \ge \delta Q$$

When the process is reversible, the $\ge$ sign becomes $=$, giving the first equation above. Truly reversible processes are rare, so there are real expansions and compressions in which no heat is transferred and yet the entropy does change.
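As a concrete sketch, assuming $n$ moles of an ideal gas with constant heat capacities (the standard textbook case), combining the first law with $TdS = \delta Q$ gives

$$dS = \frac{\delta Q_{\mathrm{rev}}}{T} = \frac{dU + P\,dV}{T} = nC_V\,\frac{dT}{T} + nR\,\frac{dV}{V},$$

so that

$$\Delta S = nC_V\ln\frac{T_2}{T_1} + nR\ln\frac{V_2}{V_1}.$$

Along a reversible adiabat $TV^{\gamma-1} = \text{const}$ with $\gamma = C_P/C_V$, so $\ln(T_2/T_1) = -(\gamma-1)\ln(V_2/V_1)$; since $C_V(\gamma-1) = C_P - C_V = R$, the two terms cancel exactly and $\Delta S = 0$. The temperature rise during the compression raises the entropy by precisely as much as the shrinking volume lowers it.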

$\endgroup$
$\begingroup$

Ben Norris gives a very clear answer. However, you use the words 'chaos' and 'randomness' to describe entropy, and while this is widespread, it is ultimately misleading because those terms are not precise enough.

You need instead to think about the number of energy levels that can be populated at a given energy, and the number of ways these can be occupied. For example, suppose there are two equal but separate fixed volumes, each containing 1 mole of an inert ideal gas at the same temperature and pressure. If you now allow the two gases to mix (each by free expansion into the other) so that the total volume is doubled (pressure and temperature unchanged), then the entropy is increased.

But is the gas more random after mixing than before? Both states would seem to be equally 'random'. However, the extra volume allows the molecules of the gas to move in a larger space, and this means that at a given energy the number of energy levels available to a molecule is increased. You can think of this in the same way that the energy levels of a 'particle in a box' decrease as the box increases in size, meaning that at a given temperature (or energy) more levels are populated. As more levels are populated, there are more ways of filling them, and so the entropy increases.

(For completeness, the entropy change for each gas in the example above is given by $\Delta S = R\ln(V_f/V_i)$, and if the initial volume is $V_i = v$ and $V_f = 2v$, then the total entropy change is $2R\ln(2)$, which is positive.)
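For anyone who wants the number, here is a minimal numerical check of that figure, a sketch assuming 1 mol of each gas and $R \approx 8.314\ \mathrm{J\,mol^{-1}\,K^{-1}}$:

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

# Two equal volumes v, each holding 1 mol of a different inert ideal gas at
# the same temperature and pressure; removing the partition lets each gas
# expand from v into the full volume 2v.
n = 1.0               # moles of each gas (assumed)
V_i, V_f = 1.0, 2.0   # only the ratio V_f/V_i matters

dS_per_gas = n * R * log(V_f / V_i)   # Delta S = n R ln(V_f/V_i) for each gas
dS_total = 2 * dS_per_gas             # both gases expand, so the two terms add

print(f"per gas: {dS_per_gas:.3f} J/K")   # ~5.763 J/K  = R ln 2
print(f"total:   {dS_total:.3f} J/K")     # ~11.526 J/K = 2R ln 2 > 0

# (Particle-in-a-box picture: level spacings scale as 1/L^2, so enlarging the
# box lowers every level and more levels sit below kT.)
```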

$\endgroup$
  • $\begingroup$ "If you now allow the two gases to mix (each by free expansion into the other) so that the total volume is doubled (pressure and temperature unchanged) then the entropy is increased. ". I think you are wrong with the pressure part, even in free expansion pressure decreases if volume increases. $\endgroup$
    – user47024
    Commented Jul 11, 2018 at 13:34
  • $\begingroup$ I see what you mean. I had supposed that the pressure was the same as each gas filled the other half, as it were, while doubling its volume. $\endgroup$
    – porphyrin
    Commented Jul 11, 2018 at 14:30
  • $\begingroup$ Your example of the removable-partition set-up has been quite handy to me. I was replying to a question with a similar background, and then I stumbled upon the pressure part, did some digging, and found a P–V graph for free expansion in which the end states were joined by a dotted line (indicating an irreversible process, I suppose), with the final pressure less than the initial pressure. $\endgroup$
    – user47024
    Commented Jul 11, 2018 at 14:41
$\begingroup$

I'm not an expert, but here's my two cents:

1) If a gas gets compressed by work, I think it makes sense that its energy is more concentrated rather than dispersed. The total energy depends only on its temperature. You're right that a lower volume means more pressure, but as long as the temperature is the same before and after compression, the total energy is also the same. The difference is that in the more compressed state there is less dispersal of that energy: it's all bunched up in a smaller volume, and hence the entropy is lower.

2) Another way to understand entropy is as a measure of how much of the system's energy can NOT be used to do work. So when a gas is all bunched up in a small volume, more of its energy is available to do work (for example, to power a heat engine), while if its energy is all dispersed in a large volume, less of it is available to do useful work.
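As a rough numerical illustration of point 1), assuming the compression is isothermal so that the ideal gas's total energy (which depends on $T$ only) stays fixed, the standard ideal-gas result $\Delta S = nR\ln(V_2/V_1)$ comes out negative for a compression:

```python
from math import log

R = 8.314  # gas constant, J/(mol K)

def delta_S_isothermal(n_mol, V1, V2):
    """Entropy change of n_mol of ideal gas taken isothermally from V1 to V2."""
    return n_mol * R * log(V2 / V1)

# Compress 1 mol of ideal gas to half its volume at constant temperature:
# the total energy (a function of T only) is unchanged, but it is "bunched up"
# in a smaller volume and the entropy drops.
print(delta_S_isothermal(1.0, 2.0, 1.0))   # ~ -5.76 J/K (negative)
```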

$\endgroup$
  • $\begingroup$ I think the verbiage "useful work" is completely circular. Can you define it in any way other than just "work"? $\endgroup$
    – hyportnex
    Commented Jun 28, 2016 at 0:18
  • $\begingroup$ Yeah, I just meant "work"; the adjective "useful" is not needed. So if a gas is compressed and could do work on a piston, that part of its energy is not entropy. $\endgroup$
    – Daniel
    Commented Jun 29, 2016 at 3:35
  • $\begingroup$ Your use of the words conflates "energy" with "entropy". While it is true that there is a quantity called "availability" (sometimes called "exergy") that quantifies what amount of internal energy is available for work, it is definitely not entropy; it depends directly on entropy, but also on temperature, pressure, etc. $\endgroup$
    – hyportnex
    Commented Jun 29, 2016 at 14:02
  • $\begingroup$ Thanks for your comment. It seems that you know much more about this than me. I got my idea from Wikipedia: 'One dictionary definition of entropy is that it is "a measure of thermal energy per unit temperature that is not available for useful work".' $\endgroup$
    – Daniel
    Commented Jun 29, 2016 at 20:41
  • $\begingroup$ You should study the excellent books by Pippard, Atkins, or Callen; all three are superbly written and you should have no problem finding any of them. $\endgroup$
    – hyportnex
    Commented Jun 29, 2016 at 23:06
