• Entropy: the Nothing That Always Wins

    From Pentcho Valev@21:1/5 to All on Sat Feb 12 08:03:00 2022
    Arthur Eddington: "The law that entropy always increases—the Second Law of Thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations—then so much the worse for Maxwell's equations. If it is found to be contradicted by observation—well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." https://todayinsci.com/E/Eddington_Arthur/EddingtonArthur-Entropy-Quotations.htm

    Athel Cornish-Bowden: "The concept of entropy was introduced to thermodynamics by Clausius, who deliberately chose an obscure term for it, wanting a word based on Greek roots that would sound similar to "energy". In this way he hoped to have a word that
    would mean the same to everyone regardless of their language, and, as Cooper [2] remarked, he succeeded in this way in finding a word that meant the same to everyone: NOTHING. From the beginning it proved a very difficult concept for other
    thermodynamicists, even including such accomplished mathematicians as Kelvin and Maxwell; Kelvin, indeed, despite his own major contributions to the subject, never appreciated the idea of entropy [3]. The difficulties that Clausius created have continued
    to the present day, with the result that a fundamental idea that is absolutely necessary for understanding the theory of chemical equilibria continues to give trouble, not only to students but also to scientists who need the concept for their work."
    https://www.beilstein-institut.de/download/712/cornishbowden_1.pdf

    Scientific American: "When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.'" https://www.esalq.usp.br/lepse/imgs/conteudo_thumb/Energy-and-Information.pdf

    See more here: https://twitter.com/pentcho_valev

    Pentcho Valev

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Pentcho Valev@21:1/5 to All on Sat Feb 12 14:59:45 2022
    "Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics." https://ipfs.io/ipfs/QmXoypizjW3WknFiJnKLwHCnL72vedxjQkDDP1mXWo6uco/wiki/Entropy.html

    It was Clausius who "noticed" that the entropy is a state function. Here is the story:

    If you define the entropy S as a quantity that obeys the equation dS = dQ_rev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius was very impressed by this state-function behavior and decided to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be subdivided into small Carnot cycles, and nowadays this deduction remains the only justification of "Entropy is a state function":
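    The ideal-gas case mentioned above can be checked numerically. A minimal sketch, assuming one mole of a monatomic ideal gas and two illustrative paths between the same pair of states, showing that the integral of dQ_rev/T is the same along both:

```python
import math

# For an ideal gas, each reversible leg's entropy change follows from
# integrating dQ_rev / T along that leg:
#   isothermal at T:  Q_rev = n*R*T*ln(V2/V1)  =>  dS = n*R*ln(V2/V1)
#   isochoric:        dQ_rev = n*Cv*dT         =>  dS = n*Cv*ln(T2/T1)
# The numbers below (n, Cv, the end states) are illustrative assumptions.
R = 8.314          # J/(mol K)
n = 1.0            # mol
Cv = 1.5 * R       # monatomic ideal gas

def dS_isothermal(V1, V2):
    return n * R * math.log(V2 / V1)

def dS_isochoric(T1, T2):
    return n * Cv * math.log(T2 / T1)

# Path A: isothermal expansion at 300 K, then isochoric heating to 400 K
A = dS_isothermal(0.01, 0.02) + dS_isochoric(300, 400)
# Path B: isochoric heating to 400 K, then isothermal expansion at 400 K
B = dS_isochoric(300, 400) + dS_isothermal(0.01, 0.02)

print(A, B)  # equal: the integral depends only on the end states
```

Each leg's entropy change is computed from that leg's own reversible heat, yet the totals agree, which is exactly the "state function FOR AN IDEAL GAS" observation.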

    "Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes
    over all cycles is zero." http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf
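    The quoted claim that the entropy change around an individual Carnot cycle is zero can likewise be sketched for an ideal-gas working substance; the reservoir temperatures and volumes below are illustrative assumptions:

```python
import math

# One ideal-gas Carnot cycle: isothermal expansion at Th, adiabatic
# expansion to Tc, isothermal compression at Tc, adiabatic return to Th.
R, n = 8.314, 1.0
gamma = 5.0 / 3.0            # monatomic ideal gas
Th, Tc = 500.0, 300.0        # hot / cold isotherm temperatures (K)
Va, Vb = 0.010, 0.025        # volumes bounding the hot isotherm (m^3)

# Along a reversible adiabat T * V**(gamma - 1) is constant, which fixes
# the volumes bounding the cold isotherm.
Vc = Vb * (Th / Tc) ** (1.0 / (gamma - 1.0))
Vd = Va * (Th / Tc) ** (1.0 / (gamma - 1.0))

Qh = n * R * Th * math.log(Vb / Va)   # heat absorbed at Th
Qc = n * R * Tc * math.log(Vd / Vc)   # heat at Tc (negative: rejected)
total = Qh / Th + Qc / Tc             # adiabatic legs contribute Q = 0

print(total)  # ≈ 0 (up to floating-point rounding)
```

Since Vd/Vc equals Va/Vb by the adiabatic relations, the two terms cancel and the sum of Q/T around the cycle vanishes, which is the property the lecture's decomposition argument relies on.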

    The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of
    conservative forces CANNOT be subdivided into small Carnot cycles.

    Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.

    More here: https://twitter.com/pentcho_valev

    Pentcho Valev

  • From Pentcho Valev@21:1/5 to All on Sun Feb 13 01:27:06 2022
    The version of the second law of thermodynamics stated as "Entropy always increases" (a version which, according to A. Eddington, holds "the supreme position among the laws of Nature") is in fact a theorem deduced by Clausius in 1865:

    Jos Uffink, Bluff your Way in the Second Law of Thermodynamics, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state." http://philsci-archive.pitt.edu/archive/00000313/

    Clausius' deduction was based on three postulates:

    Postulate 1 (implicit): The entropy is a state function.

    Postulate 2: Clausius' inequality (formula 10 on p. 33 in Uffink's paper) is correct.

    Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.
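    Postulate 2 can be written out explicitly; assuming that formula 10 in Uffink's paper is the standard textbook form of Clausius' inequality for an arbitrary cycle, it reads:

```latex
% Clausius' inequality for any cyclic process (equality for a
% reversible cycle); Q is the heat absorbed by the system at
% temperature T of the reservoir supplying it:
\oint \frac{\delta Q}{T} \le 0
```

Combining this inequality with Postulates 1 and 3 is what yields the entropy principle quoted above for adiabatically isolated systems.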

    All three postulates remain unjustified even today. Postulate 1 can easily be disproved by considering cycles (heat engines) converting heat into work in ISOTHERMAL conditions. Postulate 3 is almost obviously false:

    Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible]
    process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for
    processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."

    Note that, even if Clausius' theorem were correct (it is not), it only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that all applications of "Entropy always increases" to processes which do not begin and end in equilibrium would still be unjustified!

    See more: https://twitter.com/pentcho_valev

    Pentcho Valev
