Discussion:
THE QUEST FOR THE ULTIMATE THEORY OF TIME, AN INTRODUCTION TO IRREVERSIBILITY
"Juan R." González-Álvarez
2010-05-21 17:00:40 UTC
Permalink
##########################################################
THE QUEST FOR THE ULTIMATE THEORY OF TIME, AN INTRODUCTION
TO IRREVERSIBILITY
##########################################################

I do not believe that we scientists can develop any "ultimate theory",
and I do not claim to have one. I have chosen the above title for this
event, in direct reference to the latest book by Sean Carroll —The quest
for the ultimate theory of time— because I wanted to be a bit ironic here.

I smile every time arrogant string theorists tell me about their
"theory of everything", especially because their approach is built on
several approximations and outdated formalisms. Somewhat related is the
recent perplexity of mathematical physicists such as John Baez, who has
reported his shocking finding of "a curious generalization of Hamiltonian
mechanics that involves entropy" without being aware that more fundamental
and elegant formalisms, such as the canonical one, have been known to
physical chemists and chemical physicists for more than thirty years now!

My comments on entropic generalizations of mechanics are available in This
Week's Finds in Mathematical Physics (Week 295). For a basic review of the
outdated aspects of both string and M theory see Canonical science: its
history, goals, and future.

I applaud Carroll's valiant attempt to consider irreversibility seriously,
but his work falls far short of the usual standards of the specialized
literature. In fact, whereas several of the authors cited below have
advanced our understanding of irreversibility, and some of them have even
provided us with irreversible equations that explain a definite range of
phenomena, there is no such thing as a Carroll equation in use in our
laboratories. Carroll's scientific contributions to the theory of
irreversible thermodynamics, nonequilibrium statistical mechanics, or
kinetic theory are easily summarized: zero. Moreover, he is misguided on
many points. I have made some comments and corrections on Carroll's blog,
for instance in Chapter Eight, Chapter Eleven, Chapter Eleven (b), Chapter
Twelve, Chapter Fourteen, Chapter Fourteen (b), and Chapter Fourteen (c)
(external hyperlinks), but the list is far from exhaustive; what is more,
I have decided to ignore both his unscientific speculations about the
multiverse and his reflections on the philosophy of science.

The problem of the "arrow of time" is one of the oldest fundamental
problems of physics. The world we observe around us is irreversible:
ice cubes melt and the water gets slightly cooler, broken eggs cannot be
recovered, paper ages, people pass away, etc. All those observations
finally led to the celebrated second law of thermodynamics, for which
no violation is known [1,2]. However, the fundamental laws of mechanics
are time-reversible. There are three attitudes toward this confrontation
between mechanics and observations: the coarse-grained, the pragmatic, and
the fine-grained.


*The coarse-grained attitude*

The authors who share this attitude affirm that, at some fundamental
level, the Universe is time-reversible and that irreversibility is only an
illusion arising from our ignorance. This misguided attitude was initiated
by Ludwig Boltzmann, who was the first to try to obtain the second law
of thermodynamics from mechanics. A short but beautiful history of the
early objections raised against his work is given in the book by Laidler,
Meiser & Sanctuary [3]. For instance, the authors report how Boltzmann's
1877 paper used the probability argument, contradicting his 1872
article, and add:

Much confusion resulted from the fact that for many years Boltzmann
was inconsistent in his statements about the matter, sometimes using
the probability argument and sometimes saying that it was not
necessary to invoke probability theory. When pressed, Boltzmann
invoked the probability argument to justify his arguments but
sometimes insisted that he had arrived at his function H on the basis
of pure dynamics.

I agree with the authors that Boltzmann never understood that he had made
some hidden assumptions that left out the terms that would lead to the
wrong conclusions and included only the terms that led to the second law.
Rather than giving the second law a purely mechanical origin, he used
extra-mechanical assumptions to break the time-symmetry of mechanics,
forcing it to match the observations described by the law.
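For reference, the quantity at issue is Boltzmann's H-function; in modern textbook notation (not Boltzmann's original), it and the claimed theorem read:

```latex
H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\; d^3v ,
\qquad
\frac{dH}{dt} \le 0 ,
```

where f is the one-particle velocity distribution. The inequality follows from the kinetic equation only after the Stosszahlansatz (molecular chaos) is assumed, which is precisely one of those hidden, extra-mechanical, time-asymmetric assumptions.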

As Nico G. van Kampen —considered one of the "most outstanding theoretical
physicists of the second half of the 20th century" [4]— has remarked:
"Obviously it is a logical impossibility to deduce from reversible
equations an irreversible consequence", emphasizing that "One cannot
escape from this fact by any amount of mathematical funambulism". Many
important physicists, chemists, and mathematicians have stated similar
views. For instance, Landau & Lifshitz speculated, in their celebrated
course in theoretical physics [5], that the second law of thermodynamics
would be the macroscopic form of some fundamental law of irreversibility
associated with the measurement problem in quantum mechanics —recent
research partially confirms their suspicion, though along the line of
solving the measurement problem of quantum mechanics by means of a
microscopic version of the second law!—. Truesdell's criticism is much
more acid; in his classic monograph Rational Thermodynamics, he writes
about "microscopic reversibility" in the following terms [6]:

In fact, it requires no great mathematician to see that the
reversibility theorem and Poincaré's recurrence theorem make
irreversible behavior impossible for dynamical systems in the
classical sense. Something must be added to the dynamics of
conservative systems, something not consistent with it, in order to
get irreversibility at all. Everyone competent in mechanics has known
this for a long time.

Unfortunately, some modern physicists —especially those "who would never
dream of trying to go through the mathematics and check the "proofs"",
again in the acid words of Truesdell— continue to repeat both the
ambiguities and the subtle mistakes made by Boltzmann, and even add some
recent ones. I recall that on one occasion I was so perplexed after
reading Science of Chaos or Chaos in Science? that I asked the Nobel
laureate Ilya Prigogine for his opinion; he just confirmed my findings
with his concise "the work of Bricmont is completely wrong".
Unfortunately, J. Bricmont's work has indeed been influential, especially
among non-experts [7].

Moreover, the amounts of "mathematical funambulism" do not seem to be
enough, and some authors are resorting to metaphysical and religious
arguments to sustain their coarse-graining credo. In a recent monograph
[8], Robert Zwanzig affirms that an ice cube, melted in water, will
eventually reappear, but "we never see it happen". Nevertheless, even
while ignoring the experimental basis of the scientific method, Zwanzig
is still obliged to conclude that such a reappearance would violate the
second law of thermodynamics. It is at this point that Zwanzig reveals
the reach of his faith: "what we know about time irreversibility is
obtained by experiments on a human time scale". For authors like him, the
paradox of time is gone; the second law of thermodynamics, the laws of
chemical kinetics, and the evolution law of biology are only illusions
obtained by humans, mere mortals, whereas the time-reversible and
deterministic laws of mechanics have a divine origin and are thus
atemporal, valid for scales of time beyond our experience [9]!

It was not my objective to give my readers here a detailed and complete
criticism of the coarse-grained attitude, nor of the epistemological
fallacies traditionally associated with it. I plan to write a report about
all this.


*The pragmatic attitude*

This attitude starts from the failure of the coarse-grained one.
Irreversible equations of motion are postulated, which are then justified
a posteriori by comparison with experimental data measured in the
laboratory or observed in nature. This is the attitude taken by Byung
Chan Eu in his interesting formulation of nonequilibrium statistical
mechanics [10]:

We take the viewpoint that dynamical systems consisting of 10^23
particles require dynamical equations of broken time reversal symmetry
which are not derivable from the time reversal invariant Newtonian or
quantum equations of motion for the systems.

Another example is Zubarev's theory, in which an infinitesimal source
term is added to the Liouville equation, breaking its time-symmetry.
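In one common notation (a sketch of the form found in the literature on the nonequilibrium statistical operator), Zubarev's modified Liouville equation reads:

```latex
\frac{\partial \rho}{\partial t} + iL\rho
= -\varepsilon\,\bigl(\rho - \rho_{\mathrm{q}}\bigr),
\qquad \varepsilon \to +0 ,
```

where L is the Liouville operator, \rho_q is the relevant (quasi-equilibrium) distribution, and the limit \varepsilon \to +0 is taken after the thermodynamic limit; the source term selects retarded solutions and thereby breaks time-reversal symmetry.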

The equations proposed have limited validity —for instance, the Eu
equation uses a Markovian approximation— and the question of the origin of
irreversibility is not even posed in this attitude.

This is also the case in Lindblad's semigroup approach [11]: "Thus the
present formalism has nothing to say about the origin of time's arrow".
Lindblad's theory has been criticized on the basis that it was originally
obtained from purely mathematical assumptions, and also for the unphysical
behaviors that have been detected in it. Recent fine-grained developments
—see below— confirm Ulrich Weiss's assessment that "one should not
attribute fundamental significance to the Lindblad master equation". In
the fine-grained attitude we can amend the equation with physical terms,
study more general dynamical regimes, and correct Lindblad's qualitative
assertions about irreversibility.
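For reference, the Lindblad master equation under discussion has the standard form:

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho]
+ \sum_k \Bigl( L_k \rho L_k^\dagger
- \tfrac{1}{2}\,\bigl\{ L_k^\dagger L_k ,\, \rho \bigr\} \Bigr),
```

where the L_k are the Lindblad (jump) operators; it is the most general Markovian generator preserving trace, Hermiticity, and complete positivity of the density matrix \rho.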


*The fine-grained attitude*

This is the most physical, modern, and sophisticated attitude of the
three. Its goal is to find the microscopic source of irreversibility
and to give a rigorous derivation of the macroscopic irreversible
equations, providing us with their microscopic generalizations too. As a
bonus, this attitude tries to solve other fundamental problems of physics
and of philosophy, such as the problem of free will or the problem of
measurement in quantum mechanics. This attitude, and the recent research
advances made at this Center, will be discussed in the second part:
Trajectory branching in Liouville space as the source of irreversibility.


*References and notes*

[1] New articles claiming to report a real violation are published
regularly, but each time it is shown that the violation exists only in the
title or abstract of the article. This was also the case for the latest
claims of violations of the second law, made around the year 2002. See
the next reference for details.

[2] Has been thermodynamics violated? 2003: CPS:physchem/0309002.
González-Álvarez, Juan R.

[3] The Approach to Equilibrium In Physical Chemistry 2003: Houghton
Mifflin Company, Boston; 4th Ed. Laidler, Keith J.; Meiser, John H.;
Sanctuary, Bryan C.

[4] Nico van Kampen: charlatans beware! 2001: Physics World, January, 46.
ter Haar, D.

[5] Pag. 32 In Course of Theoretical Physics Volume 5, Statistical Physics
Part 1 1980: Pergamon Press, Oxford; 3rd Ed; Lifshitz, E. M.; Pitaevskii,
L.P. (Revision and Enlargement); Sykes, J. B.; Kearsley, M. J.
(Translation from Russian). Landau, L. D.; Lifshitz, E. M.

[6] Pag. 121 In Rational Thermodynamics 1968: McGraw-Hill Book Company,
New York. Truesdell, C.

[7] The names Cosma Rohilla Shalizi and Carlos Zuppa come to mind now,
basically because they are the last authors whom I have read. It is
rather amazing to read such misguided claims and unfair accusations
against Prigogine, especially when the authority of important —fictitious—
people such as "Osanger [sic]", "Feymann [sic]", and "Keiser [sic]" is
invoked as an argument, for instance by Zuppa; which gives a good idea of
how terribly ill-informed he is.

[8] The Paradoxes of Irreversibility In Nonequilibrium Statistical
Mechanics 2001: Oxford University Press, Oxford. Zwanzig, Robert.

[9] In their work Entropy: A dialogue, Joel L. Lebowitz & Christian Maes
reproduce a conversation between a physicist and an... angel who has
perfect knowledge of the past and present state of a perfectly isolated
macroscopic system of particles moving according to Hamiltonian dynamics.
It is worth asking if this is the same angel who convinced Zwanzig that
the second law was obtained from experiments performed by humans and,
therefore, should not be trusted.

[10] Pag. 55 In Nonequilibrium Statistical Mechanics, Ensemble Method
1998: Kluwer Academic Publishers, Dordrecht. Eu, Byung Chan.

[11] Pag. 19 In Non-equilibrium Entropy and Irreversibility 1983: D.
Reidel Publishing Company, Dordrecht. Lindblad, Göran.


##############
NEWS AND BLOG:
##############

http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html

http://www.canonicalscience.org/publications/canonicalsciencetoday/20100521.html
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Dirk Bruere at NeoPax
2010-05-21 23:37:12 UTC
Permalink
Post by "Juan R." González-Álvarez
##########################################################
THE QUEST FOR THE ULTIMATE THEORY OF TIME, AN INTRODUCTION
TO IRREVERSIBILITY
##########################################################
I do not believe that we, scientists, can develop any "ultimate theory"
and I do not claim to have one. I have chosen the above title for this
event, in direct relation to the last book by Sean Carroll —The quest for
the ultimate theory of time— because I wanted to be a bit ironic here.
So you post your physics theory to chemistry news groups in the hope
that nobody here knows enough to rip you to shreds?
Try sci.physics.research
--
Dirk

http://www.transcendence.me.uk/ - Transcendence UK
http://www.blogtalkradio.com/onetribe - Occult Talk Show
"Juan R." González-Álvarez
2010-05-24 15:26:44 UTC
Permalink
Post by Dirk Bruere at NeoPax
########################################################## THE QUEST
FOR THE ULTIMATE THEORY OF TIME, AN INTRODUCTION TO IRREVERSIBILITY
##########################################################
I do not believe that we, scientists, can develop any "ultimate theory"
and I do not claim to have one. I have chosen the above title for this
event, in direct relation to the last book by Sean Carroll —The quest
for the ultimate theory of time— because I wanted to be a bit ironic
here.
So you post your physics theory to chemistry news groups in the hope
that nobody here knows enough to rip you to shreds? Try
sci.physics.research
It is difficult to be more wrong with so few words! You are close to a
new crackpot record. Congrats!

Anyone who reads the message will see that it only gives an introduction to
different views of irreversibility, and references to the relevant work. But
it is difficult to find a troll like you without a reading disability.

Anyone who knows the subject will understand that chemists know a lot about
this. The main theories of irreversible statistical mechanics have been
developed by chemists (one of whom won a Nobel Prize for his extension of
thermodynamics). But you are evidently ignorant of all this.

Anyone familiar with spr knows that most of the authors therein are not
experts in irreversibility. My corrections to others' posts are archived in
spr. But that would require a troll like you to be able to search my posting
history in spr!

Of course, spr cannot be used as an alternative to formal peer review by real
experts in the topic, which seems to be your ridiculous suggestion :-D
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Dirk Bruere at NeoPax
2010-05-24 20:44:11 UTC
Permalink
Post by "Juan R." González-Álvarez
Post by Dirk Bruere at NeoPax
########################################################## THE QUEST
FOR THE ULTIMATE THEORY OF TIME, AN INTRODUCTION TO IRREVERSIBILITY
##########################################################
I do not believe that we, scientists, can develop any "ultimate theory"
and I do not claim to have one. I have chosen the above title for this
event, in direct relation to the last book by Sean Carroll —The quest
for the ultimate theory of time— because I wanted to be a bit ironic
here.
So you post your physics theory to chemistry news groups in the hope
that nobody here knows enough to rip you to shreds? Try
sci.physics.research
It is difficult to be more wrong with so few words! You are close to a
new crackpot record. Congrats!
Anyone who read the message will see that it only gives an introduction to
different views about irreversibility and references to their work. But it
is difficult to find a troll as you without reading disability.
Anyone who know the subject, will understand that chemists know a lot of about this.
The main theories of irreversible statistical mechanics have been developed by chemists
(one of them won a nobel Prize for his extension of thermodynamics).
But you are evidently an ignorant in all this.
Anyone familiar with spr knows that most of authors therein are not experts in irreversibility.
My corrections to other's post are archived in spr. But this mean that a troll as you would be
able to perform a search in my historial of posting in spr!
Of course, spr cannot be used as an alternative to formal peer-review by real experts
in the topic, which seems to be your ridiculous suggestion :-D
But they do weed out the cranks.
If you are looking for the root of irreversibility then it might be
worth starting with the time asymmetry of the quantum measurement process.
--
Dirk

http://www.transcendence.me.uk/ - Transcendence UK
http://www.blogtalkradio.com/onetribe - Occult Talk Show
"Juan R." González-Álvarez
2010-05-25 09:55:08 UTC
Permalink
Post by Dirk Bruere at NeoPax
Post by "Juan R." González-Álvarez
Post by Dirk Bruere at NeoPax
########################################################## THE QUEST
FOR THE ULTIMATE THEORY OF TIME, AN INTRODUCTION TO IRREVERSIBILITY
##########################################################
I do not believe that we, scientists, can develop any "ultimate
theory" and I do not claim to have one. I have chosen the above title
for this event, in direct relation to the last book by Sean Carroll
—The quest for the ultimate theory of time— because I wanted to be a
bit ironic here.
So you post your physics theory to chemistry news groups in the hope
that nobody here knows enough to rip you to shreds? Try
sci.physics.research
It is difficult to be more wrong with so few words! You are close to a
new crackpot record. Congrats!
Anyone who read the message will see that it only gives an introduction
to different views about irreversibility and references to their work.
But it is difficult to find a troll as you without reading disability.
Anyone who know the subject, will understand that chemists know a lot
of about this. The main theories of irreversible statistical mechanics
have been developed by chemists (one of them won a nobel Prize for his
extension of thermodynamics). But you are evidently an ignorant in all
this.
Anyone familiar with spr knows that most of authors therein are not
experts in irreversibility. My corrections to other's post are archived
in spr. But this mean that a troll as you would be able to perform a
search in my historial of posting in spr!
Of course, spr cannot be used as an alternative to formal peer-review
by real experts in the topic, which seems to be your ridiculous
suggestion :-D
But they do weed out the cranks.
If you are looking for the root of irreversibility then it might be
worth starting with the time asymmetry of the quantum measurement process.
And I emphasize that READING a message is a basic requirement before
replying to it. Anyone who READS my message knows that I cited Landau &
Lifshitz's thoughts about this and explained that "recent research
partially confirms their suspicion but going on the line of solving the
measurement problem of quantum mechanics by using a microscopic version of
the second law!"...
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
"Juan R." González-Álvarez
2010-05-25 18:48:06 UTC
Permalink
Dirk Bruere at NeoPax wrote on Sat, 22 May 2010 00:37:12 +0100:

(snip straw-man)
Post by Dirk Bruere at NeoPax
Try
sci.physics.research
Another thing you ignored is that the message was sent to physics groups,
including sci.physics.research.

The moderators approved it, and it is available there today for everyone
to comment on.
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Dirk Bruere at NeoPax
2010-05-25 21:55:29 UTC
Permalink
Post by "Juan R." González-Álvarez
(snip straw-man)
Post by Dirk Bruere at NeoPax
Try
sci.physics.research
Another stuff that you ignored is that the message was sent to physics groups
including sci.physics.research.
Moderators approved and is today available to everyone there to comment.
Well, you will get far better feedback there than in the chem NGs.
Time asymmetry is a huge theoretical problem in physics.
--
Dirk

http://www.transcendence.me.uk/ - Transcendence UK
http://www.blogtalkradio.com/onetribe - Occult Talk Show
"Juan R." González-Álvarez
2010-05-26 08:27:41 UTC
Permalink
Post by Dirk Bruere at NeoPax
Post by "Juan R." González-Álvarez
(snip straw-man)
Post by Dirk Bruere at NeoPax
Try
sci.physics.research
Another stuff that you ignored is that the message was sent to physics
groups including sci.physics.research.
Moderators approved and is today available to everyone there to comment.
Well, you will get far better feedback there than in the chem NGs.
I doubt it. I have already explained to you why.
Post by Dirk Bruere at NeoPax
Time
asymmetry is a huge theoretical problem in physics.
"All of chemistry deals with irreversible processes"

Preface to "Resonances, Instability, and Irreversibility"
1997: Adv. Chem. Phys. 99.
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Dirk Bruere at NeoPax
2010-05-26 12:11:19 UTC
Permalink
Post by "Juan R." González-Álvarez
Post by Dirk Bruere at NeoPax
Post by "Juan R." González-Álvarez
(snip straw-man)
Post by Dirk Bruere at NeoPax
Try
sci.physics.research
Another stuff that you ignored is that the message was sent to physics
groups including sci.physics.research.
Moderators approved and is today available to everyone there to comment.
Well, you will get far better feedback there than in the chem NGs.
I doubt. I already explained why to you.
Post by Dirk Bruere at NeoPax
Time
asymmetry is a huge theoretical problem in physics.
"All of chemistry deals with irreversible processes"
"All of life deals with irreversible processes"
Might as well post into biology NGs
--
Dirk

http://www.transcendence.me.uk/ - Transcendence UK
http://www.blogtalkradio.com/onetribe - Occult Talk Show
"Juan R." González-Álvarez
2010-05-26 19:02:49 UTC
Permalink
Post by "Juan R." González-Álvarez
Post by Dirk Bruere at NeoPax
Post by "Juan R." González-Álvarez
(snip straw-man)
Post by Dirk Bruere at NeoPax
Try
sci.physics.research
Another stuff that you ignored is that the message was sent to
physics groups including sci.physics.research.
Moderators approved and is today available to everyone there to comment.
Well, you will get far better feedback there than in the chem NGs.
I doubt. I already explained why to you.
Post by Dirk Bruere at NeoPax
Time
asymmetry is a huge theoretical problem in physics.
"All of chemistry deals with irreversible processes"
You snipped the reference to this. It was a chemistry journal.
"All of life deals with irreversible processes" Might as well post into
biology NGs
I do not know of any biologist who has made advances in the theory of
irreversibility.

I do not know of conferences organized by them where the latest
advances and theories are discussed.

I do not know of any Nobel Prize awarded to a biologist for his
contribution to the understanding of irreversible phenomena.

======

I know many chemists who have made advances in the theory of
irreversibility. I have given several names.

I know of conferences organized by them where the latest
advances and theories are discussed, e.g. the reference that you snipped
above.

I know of two Nobel Prizes in Chemistry awarded to chemists for their
contributions to the understanding of irreversible phenomena.

Why do you attack chemists? What is your problem with them?
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-05-26 16:06:32 UTC
Permalink
You left out an important aspect of the discussion, namely the QM nature
of entropy as such. To derive entropy through enumeration of the states
in the phase space, you need quantization; otherwise your enumeration
will be infinite!

Look, for example, at the ab-initio formula for the absolute entropy of
ideal gases (the simplest possible object!) derived by Sackur and Tetrode
through quantization of the phase space, which corresponds closely to
experimentally measured S(T) functions: it contains the constant "h".
The same is true for the entropy derived for crystalline solids; they all
contain h. Which means entropy is fundamentally a QM-dependent quantity,
and so there is no surprise that to derive laws involving entropy, such
as the second law of thermodynamics, you need to use the QM formalism.

This has recently been done by generalizing the fluctuation theorem of
Searles and Evans to assign a probability to a spontaneous decrease of
entropy for an object of arbitrary size:
http://en.wikipedia.org/wiki/Fluctuation_theorem#Statement_of_the_fluctuation_theorem

The key point is that this probability is inversely proportional to the
absolute entropy S of the system. Since absolute entropy depends on
quantization, e.g. S = f(h, size), the QM nature of the second law of
thermodynamics is not only apparent but is quantitatively represented by
this probability relation.
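For reference, the fluctuation theorem linked above is usually stated, for the dimensionless entropy production Sigma_t accumulated over a time t, as:

```latex
\frac{P(\Sigma_t = A)}{P(\Sigma_t = -A)} = e^{A} ,
```

so trajectories with negative entropy production are exponentially suppressed, with the suppression growing with system size and observation time.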

Regards,
Yevgen


--
Tune in to "Strange Drawing of the Day" buzz:
http://www.google.com/profiles/100679771837661030957#buzz
"Juan R." González-Álvarez
2010-05-26 19:15:00 UTC
Permalink
You left out an important aspect of the discussion, namely QM nature of
entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carroll's blog (links given).
Which means entropy is fundamentaly a QM-dependent quantity,
Nope, classical entropy is not quantum (h=0).
and so
there is no surprize that to derive
laws involving entropy, such as second law of thermodynamics, you need
to use QM formalizm.
Plain wrong again: classical statistical mechanics.
This has been recently done by generalizing fluctuation theorem by
Searles and Evans
to assign probability to spontaneous decrease of entropy for an object
This was already done much earlier by Einstein and others:

P = (1/Z) exp(S - S_0)

where Z is a normalization constant (in units with k_B = 1).
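A minimal numeric sketch of that formula (restoring k_B in SI units, so P is proportional to exp(delta_S / k_B); the delta_S value below is purely illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def relative_prob(delta_S):
    """Probability of a fluctuation with entropy change delta_S, relative
    to delta_S = 0 (Einstein fluctuation formula, P proportional to
    exp(delta_S / k_B); the normalization Z cancels in the ratio)."""
    return math.exp(delta_S / k_B)

# Even a microscopic entropy *decrease* of 1e-21 J/K (about 72 k_B)
# is already suppressed by roughly 30 orders of magnitude:
print(relative_prob(-1e-21))  # ~ 3.5e-32
```

This makes quantitative the qualitative point above: macroscopic entropy decreases (delta_S of order J/K) are not forbidden, just absurdly improbable.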

(...)
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-05-27 13:51:41 UTC
Permalink
Post by "Juan R." González-Álvarez
You left out an important aspect of the discussion, namely QM nature of
entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" with empirical representations. Of course,
empirical representations of absolute entropy used in thermodynamic
relationships do not require enumeration of states. It can be measured,
for example, by integrating Cp/T (Cp being the heat capacity) from
absolute zero to the present temperature.
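A minimal sketch of that calorimetric route, using a hypothetical Debye-like low-temperature solid with Cp = a*T^3 (the coefficient a is illustrative, not data for any real material), for which the integral S(T) = integral of Cp/T dT is analytic and can be checked:

```python
# Third-law (calorimetric) entropy: S(T) = integral of Cp(T')/T' from 0 to T.
# For Cp = a*T^3 the integrand Cp/T = a*T^2 is finite at T = 0, and the
# exact answer is S(T) = a*T^3 / 3, which lets us check the numerics.
a = 1.944e-3      # J/(mol*K^4), illustrative Debye-like coefficient
T_max = 20.0      # K
n = 20000
dT = T_max / n

S_numeric = 0.0
for i in range(n):                       # trapezoidal rule on a*T^2
    T0, T1 = i * dT, (i + 1) * dT
    S_numeric += 0.5 * (a * T0**2 + a * T1**2) * dT

S_exact = a * T_max**3 / 3.0
print(S_numeric, S_exact)                # both ~ 5.184 J/(mol*K)
```

The same numeric integration works with tabulated Cp(T) data in place of the analytic model.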

However, to derive the value of entropy "from scratch", without any
measurements, you will end up needing to quantize the phase space.
In a classical model your number of states will be infinite.

That is why you need normalization constants etc. to derive entropy from
classical considerations, while the QM treatment does not need them; the
only constant the result includes is "h", and it is always the same, so
no additional measurements are needed.
Post by "Juan R." González-Álvarez
Which means entropy is fundamentaly a QM-dependent quantity,
Nope, classical entropy is not quantum (h=0).
and so
there is no surprize that to derive
laws involving entropy, such as second law of thermodynamics, you need
to use QM formalizm.
Plain wrong again: classical statistical mechanics.
This has been recently done by generalizing fluctuation theorem by
Searles and Evans
to assign probability to spontaneous decrease of entropy for an object
This was already done much before by Einstein and others.
P = 1/Z exp(S-S_0)
Z is normalization constant.
The ab-initio derived value of entropy does not include any
"normalization constants" (which would need to be empirically
calibrated); it only includes the Planck constant "h".

The fact that these ab-initio equations for ideal gases and crystalline
solids coincide very well with experimental data on S(T) is a triumph of
the QM description of entropy and thermodynamics. I am surprised that you
chose to overlook this large and promising area of research.

Regards,
Yevgen
Post by "Juan R." González-Álvarez
(...)
http://www.canonicalscience.org/
BLOG:http://www.canonicalscience.org/publications/canonicalsciencetoday/ca...
--
Tune in to "Strange Drawing of the Day" buzz:
http://www.google.com/profiles/100679771837661030957#buzz
Yevgen Barsukov
2010-05-27 14:35:52 UTC
Permalink
Post by Yevgen Barsukov
Ab-initio derived value of entropy of does not include any
"normalization constants"
(that need to be empirically calibrated), it only includes Plank
constant "h".
The fact that these ab-initio derived equations for ideal gases and
kristalline solids
coincide very well with experimental data on S(T) is a triumph of QM
description of entropy and thermodynamics.
Here is a pointer to this area of research (this is classic stuff,
but research continues on more complex systems of practical importance):
http://en.wikipedia.org/wiki/Sackur-Tetrode_equation
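As a quick numeric check of that claim, here is a sketch evaluating the Sackur-Tetrode expression for argon (a monatomic ideal gas) at 298.15 K and 1 bar; the tabulated standard molar entropy of Ar is about 154.8 J/(mol K):

```python
import math

# CODATA constants
k_B = 1.380649e-23     # Boltzmann constant, J/K
h   = 6.62607015e-34   # Planck constant, J*s
N_A = 6.02214076e23    # Avogadro number, 1/mol
R   = k_B * N_A        # gas constant, J/(mol*K)
amu = 1.66053907e-27   # atomic mass unit, kg

def sackur_tetrode(M_amu, T, p):
    """Molar entropy of a monatomic ideal gas, J/(mol*K):
    S = R * (ln(v / lambda^3) + 5/2), where v is the volume per particle
    and lambda the thermal de Broglie wavelength (which contains h)."""
    m = M_amu * amu
    v = R * T / (p * N_A)                           # m^3 per particle
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength, m
    return R * (math.log(v / lam**3) + 2.5)

S_Ar = sackur_tetrode(39.948, 298.15, 1e5)  # argon at 298.15 K, 1 bar
print(round(S_Ar, 1))  # ~ 154.8, matching the tabulated standard entropy
```

Note how the result depends explicitly on h through the thermal wavelength, which is exactly the point being argued.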

Regards,
Yevgen
"Juan R." González-Álvarez
2010-05-28 10:10:23 UTC
Permalink
Post by Yevgen Barsukov
Ab-initio derived value of entropy of does not include any
"normalization constants"
(that need to be empirically calibrated), it only includes Plank
constant "h".
The fact that these ab-initio derived equations for ideal gases and
kristalline solids
coincide very well with experimental data on S(T) is a triumph of QM
description of entropy and thermodynamics.
Here are some pointers to these area of research (this is classic stuff,
but more research is continuing on more complex system of practical
http://en.wikipedia.org/wiki/Sackur-Tetrode_equation
An expression which is valid only under restrictions, such as the gas
being at equilibrium...

That was obtained by the well-known naive approach (sometimes called the
semi-classical approach), where such nonsense is often stated as that in
classical mechanics one *works* with a phase-space structure with cells of
size h^3, which arise from the 'quantum uncertainty' DqDp = h, while at
the same time the quantum Hamiltonian is /ad hoc/ substituted [*] by a
classical Hamiltonian H(q,p), which evidently only makes sense when q and
p are both defined and measurable at the instant t, etc.

Even if one closes one's eyes to all the physical nonsense and the
mathematical funambulism, the resulting expression (containing h) does not
give the correct value, and textbooks then just ignore some terms for
comparison with the correct value given by classical thermodynamics.

The rigorous computation of the CLASSICAL entropy (i.e. without h) for an
ideal gas at equilibrium, from the definition of entropy, was given in my
other post (see also the references cited).


[*] Traditionally the 'demonstration' consists of the author writing something like

"H_quantum --> H(q,p)"

in the hope that readers will say: "Wow, what an impressive demonstration! I am convinced."
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
"Juan R." González-Álvarez
2010-05-28 09:41:12 UTC
Permalink
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
You left out an important aspect of the discussion, namely QM nature
of entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" with empirical representations. Of course,
empirical representations of absolute entropy used in thermodynamical
relationships do not require enumeration of states. It can be measured,
for example, by integrating Cp (heat capacity) from absolute zero to the
present temperature.
And instead of reading the references pointed out to you, you continue making
mistakes such as confusing the *definition* of entropy with formulae valid *only* in
special cases.

Integrating Cp is one of those *special* cases.
Post by Yevgen Barsukov
However, to derive value of entropy "from scratch" without any
measurements,
you will end up with need to do quantization of the phase space.
Plain wrong again.
Post by Yevgen Barsukov
In
classical model your number of states will be infinite.
Only if you do not take the quantum-classical correspondence in rigorous form
and instead apply naive approaches such as those appearing in *elementary* textbooks
on statistical thermodynamics. Even if one turns a blind eye to the mathematical
funambulism in those approaches, the resulting formulae can often be
reinterpreted/truncated to give the correct values.
Post by Yevgen Barsukov
That is why you will need normalization constants etc to derive entropy
from classic considerations, while QM treatment does not need it and the
only constant the result will include is "h", and it is always the same
so no
additional measurements are needed.
That is all wrong again.

The definition of entropy was pointed out before (the link was given). Like many
people, you confuse the definition of entropy with the expression S = k ln W,
which is only valid under certain restrictions (in the link I showed
how S = k ln W is *obtained from* the definition of entropy).

The definition I gave was the well-known quantum one (I repeat it here)

S = -k Tr ρ ln ρ

In the classical limit, this *rigorously* reduces to (you can use Wigner
formalism to prove this in the limit h --> 0)

S = -k \int f ln f dqdp

Adding or subtracting a constant does not change the result, so some authors,
such as Keizer, define the entropy of a classical system as

S = -k \int f ln f dqdp + constant

And some authors give the value of the constant as

constant = k

I.e. they work with the expression

S = -k \int f (ln f -1) dqdp [*]

This is the classical entropy. It contains no h anywhere because it is
CLASSICAL. It is equation (4.1) in reference [10], given in my original post.

For a classical isolated ideal gas at equilibrium f = f_eq is well-known since
Maxwell. Substituting Maxwellian f_eq into the above expression gives the
value of the classical entropy for the gas, which, of course, coincides with
the entropy given by classical thermodynamics. See ref [10] for details.

Ref [10] is an advanced monograph and is probably not at hand in your
favourite library. All this material is also found in other books, for instance
in the well-known book "Non-equilibrium Thermodynamics" by de Groot and Mazur.

The definition of entropy [*] is equation (31) on page 170 of their book.
The Maxwellian f is given in (43). Combining both, they obtain the standard result
for the CLASSICAL entropy -see (48)-

S = 1/T (U + pV - N mu)

This is the equation (4.22) in the monograph [10]. Again this is a classical
expression (no "h" appears in any part).
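The quantum definition S = -k Tr ρ ln ρ quoted above is easy to check numerically. Here is a minimal editorial sketch (not part of the original post), working in units of k_B, that evaluates the von Neumann entropy of a small density matrix via its eigenvalues:

```python
import numpy as np

k = 1.0  # work in units of k_B

def von_neumann_entropy(rho):
    """S = -k Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    # Terms with p = 0 contribute nothing (p ln p -> 0), so skip them.
    return -k * sum(p * np.log(p) for p in evals if p > 1e-12)

# A mixed state with probabilities 0.7 and 0.3 on the diagonal.
rho_mixed = np.diag([0.7, 0.3])
# A pure state has zero entropy.
rho_pure = np.diag([1.0, 0.0])

print(von_neumann_entropy(rho_mixed))  # ≈ 0.6109 (in units of k_B)
print(von_neumann_entropy(rho_pure))   # 0.0
```

Diagonalizing first is the standard trick: for any density matrix, Tr(ρ ln ρ) reduces to a sum of p ln p over the eigenvalues.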
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-06-01 21:13:02 UTC
Permalink
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
You left out an important aspect of the discussion, namely QM nature
of entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" from empirical representations. Of cause
empirical representations of absolute entropy used in thermodynamical
relationships does
not require enumerations of states. It can be measured for example by
integrating
Cp (thermal capacity) from absolute zero to present temperature.
And instead reading the references pointed to you, you continue doing mistakes
such as confounding the *definition* of entropy with formulae valid *only* in
special cases.
Integrating Cp is one of those *special* cases.
Post by Yevgen Barsukov
However, to derive value of entropy "from scratch" without any
measurements,
you will end up with need to do quantization of the phase space.
Plain wrong again.
Post by Yevgen Barsukov
In
classical model your number of states will be infinite.
Only if you do not take the quantum-classical correspondence in rigorous form
and apply naive approaches such as those appearing in *elementary* textbooks
in statistical thermodynamics. Even if one closes the eyes to the mathematical
funambulism in those approaches, the resulting formulaes may be often
reinterpreted/truncated to give the correct values.
Post by Yevgen Barsukov
That is why you will need normalization constants etc to derive entropy
from classic considerations, while QM treatment does not need it and the
only constant the result will include is "h", and it is always the same
so no
additional measurements are needed.
That is all wrong again.
The definition of entropy was pointed before (the link was given). As many
people you confound the definition of entropy with the expression S = k lnW
which is only valid under certain restrictions (In the link I showed
how S = k lnW is *obtained from* the definition of entropy).
The definition I gave was the well-known quantum one (I repeat it here)
S = -k Tr ρ ln ρ
In the classical limit, this *rigorously* reduces to (you can use Wigner
formalism to prove this in the limit h --> 0)
S = -k \int f ln f dqdp
Adding or substracting a constant does not change the result, thus some authors
as Keizer define the entropy for a classical system as
S = -k \int f ln f dqdp  +  constant
And some authors give the value of the constant as
constant = k
I.e. they work with the expression
S = -k \int f (ln f  -1) dqdp   [*]
This is the classical entropy. It contains not h in any part because it is
CLASSICAL. This is the equation (4.1) in the reference [10] given in my original post.
For a classical isolated ideal gas at equilibrium f = f_eq is well-known since
Maxwell. Substituting Maxwellian f_eq into the above expression gives the
value of the classical entropy for the gas, which, of course, coincides with
the entropy given by classical thermodynamics. See ref [10] for details.
Ref [10] is a advanced monograph and probably it is not at hand at your
favourite library. All this material is also found in other books. For instance
in the well-known book "Non-equilibrium Thermodynamics" by deGroot an Mazur.
The definition of entropy [*] is the equation (31) in page 170 of their book.
The Maxwell f is given in (43). Combinign both they give the standard result
for the CLASSICAL entropy -see (48)-
S = 1/T (U + pV - N mu)
This is the equation (4.22) in the monograph [10]. Again this is a classical
expression (no "h" appears in any part).
http://www.canonicalscience.org/
BLOG:http://www.canonicalscience.org/publications/canonicalsciencetoday/ca...
First of all, I would like to object to the terms "quantum" or
"classical" entropy.
Entropy is a thermodynamical parameter that is uniquely defined; it is
basically a number in Watt/C. This number can be either right or wrong,
and no matter how it is derived, it has to correspond to the
experimentally measured value.

Empirically, entropy can be obtained using the framework of
thermodynamical potentials developed by Gibbs for any arbitrary
combination of isothermal, isochoric, adiabatic, etc. processes. It
requires measuring various extensive quantities such as heat capacities
and integrating them from 0 to the present temperature. There is nothing
interesting about this very mature area of technology. There are tables
of absolute entropies S(T) for various elements which are widely used in
chemistry (they are needed to determine the direction of various
reactions). You will get an actual measured number for S(T) for any
particular gas or crystal.
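The Cp-integration route just described can be sketched numerically. The following is an editorial illustration (not from the original post): for a solid obeying the Debye T^3 law well below its Debye temperature, S(T) = ∫_0^T Cp/T' dT' = a T^3/3; the coefficient a used here is hypothetical:

```python
import numpy as np

# Hypothetical Debye coefficient a (J mol^-1 K^-4); Cp = a*T^3 holds
# only well below the Debye temperature.
a = 1.0e-3
T_max = 20.0

# Start just above 0 to avoid 0/0 in Cp/T (the integrand itself -> 0).
T = np.linspace(1e-6, T_max, 200001)
Cp = a * T**3

# Third-law entropy S(T_max) = integral of Cp/T' from 0 to T_max,
# via the trapezoidal rule.
y = Cp / T
S = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(T))

print(S)                   # numerical integral
print(a * T_max**3 / 3.0)  # analytic result a*T^3/3
```

The numerical integral agrees with the closed form a T^3/3, which is exactly why tabulated S(T) values can be built from measured heat capacities.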

Now, when we are talking about ab-initio derivations of this number
(not experimental measurement!), we can talk about different strategies.
Note that we want to get an actual number for the absolute entropy at
the present temperature, so no tricks with "constants" are allowed.

But - there is no classical strategy to do that! Any "classical"
derivation will only give you a delta between S(T1) and S(T2); it will
not give you an absolute value of S(T), hence the need for "constants".

There is only one way to derive absolute entropy without any
experimentally measured constant, and it is the way derived by Sackur
and Tetrode, using quantization of the phase space:
http://en.wikipedia.org/wiki/Sackur-Tetrode_equation
and the result will include the Planck constant.
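As a concrete illustration of this claim, the Sackur-Tetrode expression can be evaluated for a monatomic ideal gas. This sketch is an editorial addition, not part of the original post; it computes the molar entropy of argon at 298.15 K and 1 atm, which tabulated data put near 154.8 J/(mol K):

```python
import math

# Physical constants (SI)
k  = 1.380649e-23    # Boltzmann constant, J/K
h  = 6.62607015e-34  # Planck constant, J s
NA = 6.02214076e23   # Avogadro constant, 1/mol

# Argon at standard conditions
m = 39.948e-3 / NA   # mass of one Ar atom, kg
T, P = 298.15, 101325.0

v = k * T / P                                 # volume per particle (ideal gas)
lam = h / math.sqrt(2 * math.pi * m * k * T)  # thermal de Broglie wavelength
# Sackur-Tetrode: S/(N k) = ln(v / lam^3) + 5/2
S_molar = NA * k * (math.log(v / lam**3) + 2.5)

print(S_molar)  # ≈ 154.7 J/(mol K), close to the tabulated ~154.8
```

Note that h enters only through the thermal wavelength lam; with h absent the argument of the logarithm would be dimensionful and the absolute value undefined, which is the point under dispute in this thread.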

If you want to challenge this assertion and say that there is a
classical way to derive the ab-initio absolute entropy S(T) (not a
delta!) of any ideal gas without any experimental constant, and that the
result will not include "h" (so QM is not involved), please provide a
link.

Regards,
Yevgen


--
Tune in to "Strange Drawing of the Day" buzz:
http://www.google.com/profiles/100679771837661030957#buzz
"Juan R." González-Álvarez
2010-06-02 08:12:55 UTC
Permalink
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
You left out an important aspect of the discussion, namely QM
nature of entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" from empirical representations. Of
cause empirical representations of absolute entropy used in
thermodynamical relationships does
not require enumerations of states. It can be measured for example by
integrating
Cp (thermal capacity) from absolute zero to present temperature.
And instead reading the references pointed to you, you continue doing
mistakes such as confounding the *definition* of entropy with formulae
valid *only* in special cases.
Integrating Cp is one of those *special* cases.
Post by Yevgen Barsukov
However, to derive value of entropy "from scratch" without any
measurements,
you will end up with need to do quantization of the phase space.
Plain wrong again.
Post by Yevgen Barsukov
In
classical model your number of states will be infinite.
Only if you do not take the quantum-classical correspondence in
rigorous form and apply naive approaches such as those appearing in
*elementary* textbooks in statistical thermodynamics. Even if one
closes the eyes to the mathematical funambulism in those approaches,
the resulting formulaes may be often reinterpreted/truncated to give
the correct values.
Post by Yevgen Barsukov
That is why you will need normalization constants etc to derive
entropy from classic considerations, while QM treatment does not need
it and the only constant the result will include is "h", and it is
always the same so no
additional measurements are needed.
That is all wrong again.
The definition of entropy was pointed before (the link was given). As
many people you confound the definition of entropy with the expression
S = k lnW which is only valid under certain restrictions (In the link I
showed how S = k lnW is *obtained from* the definition of entropy).
The definition I gave was the well-known quantum one (I repeat it here)
S = -k Tr ρ ln ρ
In the classical limit, this *rigorously* reduces to (you can use
Wigner formalism to prove this in the limit h --> 0)
S = -k \int f ln f dqdp
Adding or substracting a constant does not change the result, thus some
authors as Keizer define the entropy for a classical system as
S = -k \int f ln f dqdp  +  constant
And some authors give the value of the constant as
constant = k
I.e. they work with the expression
S = -k \int f (ln f  -1) dqdp   [*]
This is the classical entropy. It contains not h in any part because it
is CLASSICAL. This is the equation (4.1) in the reference [10] given in
my original post.
For a classical isolated ideal gas at equilibrium f = f_eq is
well-known since Maxwell. Substituting Maxwellian f_eq into the above
expression gives the value of the classical entropy for the gas, which,
of course, coincides with the entropy given by classical
thermodynamics. See ref [10] for details.
Ref [10] is a advanced monograph and probably it is not at hand at your
favourite library. All this material is also found in other books. For
instance in the well-known book "Non-equilibrium Thermodynamics" by
deGroot an Mazur.
The definition of entropy [*] is the equation (31) in page 170 of their
book. The Maxwell f is given in (43). Combinign both they give the
standard result for the CLASSICAL entropy -see (48)-
S = 1/T (U + pV - N mu)
This is the equation (4.22) in the monograph [10]. Again this is a
classical expression (no "h" appears in any part).
http://www.canonicalscience.org/
BLOG:http://www.canonicalscience.org/publications/canonicalsciencetoday/ca...
First of all I would like to object to the terms "quantum" or
"classical" entropy.
And you would also 'object' to terms such as quantum or classical state, and you
would object to terms such as "classical thermodynamics" or "quantum thermodynamics",
but they are well-established and your misconceptions easily ignored...
Post by Yevgen Barsukov
Entropy is a thermodynamical parameter that is uniquely defined and it
is basically
a number in Watt/C.
This is plain NONSENSE. First, entropy is not a thermodynamical parameter but
a physical quantity. Second, C is not a unit but ºC is. Third, watts are joules
per second, but the units of entropy are joules per kelvin:

[S] = J K^-1
Post by Yevgen Barsukov
This number can be either right or wrong, and no
matter
how it is derived, it has to correspond to experimentally measured
value.
Empyrically entropy can be obtained using the framework of
thermodynamical potentials
developed by Gibbs for any arbitrary combination of isothermal,
isohoric, adiabatic etc.
processes. It requires measurement of various extensive quantities such
as thermal capacities
and integrating them from 0 to present temperature. There is nothing
interesting about
this very mature area of technology. There are tables of absolute
entropies S(T) for various
element which are widelly used in chemistry (they are needed to
determine direction of various reactions). You will get an actual
measured number for S(T) for ahy particular gas or kristall.
And you show your ignorance again, because you AGAIN confuse the
general concept of entropy with special IDEALIZED cases such as those listed above.

As is well known, the "thermodynamical potentials developed by Gibbs" have
only a limited field of applicability, and as a consequence the general
theory of thermodynamics cannot be built on them. It is well known that the
Gibbs energy is not always a thermodynamic potential.

All this is well known. There are even two Nobel Prizes in chemistry awarded
to people who extended Gibbs' theory beyond those idealized special cases
(e.g. crystals) that you repeatedly cite.
Post by Yevgen Barsukov
Now, when we are talking about ab-initio derivations of this number (not
experimental measurement!)
we can talk about different strategies. Note that we want to get an
actual number of absolute
entropy for present temperature, so not tricks with "constants" are
allowed.
Adding/subtracting a constant to quantities such as energy or entropy does
not change the physics. This is not a trick but a 'theorem'.
Post by Yevgen Barsukov
But - there is no classical strategy to do that! Any "classical"
derivation will
only give you a delta between S(T1) and S(T2) but it will not give you
an absolute value of S(T),
therefore need for "constants".
This is ALL PLAIN WRONG. First, entropy is not a natural function of T. Second,
the definition of classical entropy is not "a delta between"...
Post by Yevgen Barsukov
There is only one way to derive absolute entropy without any
experimentally measured constant,
and it is the way derived by Sackur and Tetrode, using quantization of
http://en.wikipedia.org/wiki/Sackur-Tetrode_equation and result will
include Plank constant.
Your misguided claim has already been corrected. It was already explained to you
that the ST expression has only LIMITED validity (it does
not give the entropy of a gas in the general case) and that the ST is usually
obtained in a truly inconsistent way, where such nonsense is stated as
the following:

"In classical mechanics, if a system possesses f degrees of freedom then
phase-space is conventionally subdivided into cells of arbitrarily
chosen volume h_0^f

[...]

in classical mechanics by setting the cell size in phase-space equal
to Planck's constant, so that h_0 = h. This automatically enforces the
most restrictive form of the uncertainty principle, \delta q_i \delta p_i = h."
Post by Yevgen Barsukov
If you want to challenge this assertion and say that there is a
classical way to derive
ab-initio absolute entropy S(T) (not a delta!) of any ideal gas without
any experimental
constant, and result will not include "h" (so QM was not involved),
please provide a link.
Better than that. I gave you two common references, one of them very
well-known (the classical monograph by de Groot and Mazur), where the
DEFINITION of classical entropy is given (it contains no arbitrary constant,
contrary to what you insist).

Moreover, I explained to you how the classical expression for the entropy
arises from the quantum one in the limit h --> 0. Of course, the
quantum definition of entropy is not S = k ln W.

Don't insist on repeating the same mistakes. Stop making bogus
claims. Stop writing as if you know what you are talking about, when
you do not even know the units of entropy!

Just open the literature and learn!
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
"Juan R." González-Álvarez
2010-06-02 08:19:57 UTC
Permalink
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
You left out an important aspect of the discussion, namely QM
nature of entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" from empirical representations. Of
cause empirical representations of absolute entropy used in
thermodynamical relationships does
not require enumerations of states. It can be measured for example
by integrating
Cp (thermal capacity) from absolute zero to present temperature.
And instead reading the references pointed to you, you continue doing
mistakes such as confounding the *definition* of entropy with formulae
valid *only* in special cases.
Integrating Cp is one of those *special* cases.
Post by Yevgen Barsukov
However, to derive value of entropy "from scratch" without any
measurements,
you will end up with need to do quantization of the phase space.
Plain wrong again.
Post by Yevgen Barsukov
In
classical model your number of states will be infinite.
Only if you do not take the quantum-classical correspondence in
rigorous form and apply naive approaches such as those appearing in
*elementary* textbooks in statistical thermodynamics. Even if one
closes the eyes to the mathematical funambulism in those approaches,
the resulting formulaes may be often reinterpreted/truncated to give
the correct values.
Post by Yevgen Barsukov
That is why you will need normalization constants etc to derive
entropy from classic considerations, while QM treatment does not
need it and the only constant the result will include is "h", and it
is always the same so no
additional measurements are needed.
That is all wrong again.
The definition of entropy was pointed before (the link was given). As
many people you confound the definition of entropy with the expression
S = k lnW which is only valid under certain restrictions (In the link
I showed how S = k lnW is *obtained from* the definition of entropy).
The definition I gave was the well-known quantum one (I repeat it here)
S = -k Tr ρ ln ρ
In the classical limit, this *rigorously* reduces to (you can use
Wigner formalism to prove this in the limit h --> 0)
S = -k \int f ln f dqdp
Adding or substracting a constant does not change the result, thus
some authors as Keizer define the entropy for a classical system as
S = -k \int f ln f dqdp  +  constant
And some authors give the value of the constant as
constant = k
I.e. they work with the expression
S = -k \int f (ln f  -1) dqdp   [*]
This is the classical entropy. It contains not h in any part because
it is CLASSICAL. This is the equation (4.1) in the reference [10]
given in my original post.
For a classical isolated ideal gas at equilibrium f = f_eq is
well-known since Maxwell. Substituting Maxwellian f_eq into the above
expression gives the value of the classical entropy for the gas,
which, of course, coincides with the entropy given by classical
thermodynamics. See ref [10] for details.
Ref [10] is a advanced monograph and probably it is not at hand at
your favourite library. All this material is also found in other
books. For instance in the well-known book "Non-equilibrium
Thermodynamics" by deGroot an Mazur.
The definition of entropy [*] is the equation (31) in page 170 of
their book. The Maxwell f is given in (43). Combinign both they give
the standard result for the CLASSICAL entropy -see (48)-
S = 1/T (U + pV - N mu)
This is the equation (4.22) in the monograph [10]. Again this is a
classical expression (no "h" appears in any part).
http://www.canonicalscience.org/
BLOG:http://www.canonicalscience.org/publications/canonicalsciencetoday/ca...
First of all I would like to object to the terms "quantum" or
"classical" entropy.
And you would also 'object' to terms such as quantum or classical state,
and you would object to terms such as "classical thermodynamics" or "quantum
thermodynamics", but they are well-established and your misconceptions
easily ignored...
Post by Yevgen Barsukov
Entropy is a thermodynamical parameter that is uniquely defined and it
is basically
a number in Watt/C.
This is plain NONSENSE. First, entropy is not a thermodynamical
parameter but a physical quantity. Second, C is not a unit but ºC is.
Third, watts are joules per second, but the units of entropy are joules
per kelvin:
[S] = J K^-1
Post by Yevgen Barsukov
This number can be either right or wrong, and no matter
how it is derived, it has to correspond to experimentally measured
value.
Empyrically entropy can be obtained using the framework of
thermodynamical potentials
developed by Gibbs for any arbitrary combination of isothermal,
isohoric, adiabatic etc.
processes. It requires measurement of various extensive quantities such
as thermal capacities
and integrating them from 0 to present temperature. There is nothing
interesting about
this very mature area of technology. There are tables of absolute
entropies S(T) for various
element which are widelly used in chemistry (they are needed to
determine direction of various reactions). You will get an actual
measured number for S(T) for ahy particular gas or kristall.
And you show your ignorance again, because you AGAIN confuse the general
concept of entropy with special IDEALIZED cases such as those listed above.
As is well known, the "thermodynamical potentials developed by Gibbs"
have only a limited field of applicability, and as a consequence the
general theory of thermodynamics cannot be built on them. It is
well known that the Gibbs energy is not always a thermodynamic potential.
All this is well known. There are even two Nobel Prizes in chemistry
awarded to people who extended Gibbs' theory beyond those idealized
special cases (e.g. crystals) that you repeatedly cite.
Post by Yevgen Barsukov
Now, when we are talking about ab-initio derivations of this number
(not experimental measurement!)
we can talk about different strategies. Note that we want to get an
actual number of absolute
entropy for present temperature, so not tricks with "constants" are
allowed.
Adding/subtracting a constant to quantities such as energy or
entropy does not change the physics. This is not a trick but a 'theorem'.
Adding/subtracting a constant to classical quantities such as energy or
entropy does not change the physics. This is not a trick but a 'theorem'.
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
But - there is no classical strategy to do that! Any "classical"
derivation will
only give you a delta between S(T1) and S(T2) but it will not give you
an absolute value of S(T),
therefore need for "constants".
This is ALL PLAIN WRONG. First, entropy is not a natural function of T.
Second, the definition of classical entropy is not "a delta between"...
Post by Yevgen Barsukov
There is only one way to derive absolute entropy without any
experimentally measured constant,
and it is the way derived by Sackur and Tetrode, using quantization of
http://en.wikipedia.org/wiki/Sackur-Tetrode_equation and result will
include Plank constant.
Your misguided claim has already been corrected. It was already explained to
you that the ST expression has only LIMITED validity
(it does not give the entropy of a gas in the general case) and that
the ST is usually obtained in a truly inconsistent way, where such
nonsense is stated as the following:
"In classical mechanics, if a system possesses f degrees of freedom then
phase-space is conventionally subdivided into cells of arbitrarily
chosen volume h_0^f
[...]
in classical mechanics by setting the cell size in phase-space equal
to Planck's constant, so that h_0 = h. This automatically enforces
the most restrictive form of the uncertainty principle, \delta q_i
\delta p_i = h."
Post by Yevgen Barsukov
If you want to challenge this assertion and say that there is a
classical way to derive
ab-initio absolute entropy S(T) (not a delta!) of any ideal gas without
any experimental
constant, and result will not include "h" (so QM was not involved),
please provide a link.
Better than that. I gave you two common references, one of them very
well-known (the classical monograph by de Groot and Mazur), where
the DEFINITION of classical entropy is given (it contains no arbitrary
constant, contrary to what you insist).
Moreover, I explained to you how the classical expression for the
entropy arises from the quantum one in the limit h --> 0. Of
course, the quantum definition of entropy is not S = k ln W.
Don't insist on repeating the same mistakes. Stop making
bogus claims. Stop writing as if you know what you are talking
about, when you do not even know the units of entropy!
Just open the literature and learn!
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-06-02 14:54:52 UTC
Permalink
Post by "Juan R." González-Álvarez
Adding/substrating a constant to classical quantities as energy or
entropy does not change the physics. This is not a trick but a 'theorem'.
It is getting better and better. Now you want to add/subtract arbitrary
constants to energy??

What happened to the law of energy conservation? You got tired of
bashing the 2nd law of thermodynamics, and are now tackling the first,
bringing "the physics" as your witness? Unfortunately, "the physics" is
not likely to be very supportive. The only way to violate the energy
conservation law would be to go to general relativity, and even there
the energy-momentum tensor is still conserved.

Regards,
Yevgen

--
Tune in to "Strange Drawing of the Day" buzz:
http://www.google.com/profiles/100679771837661030957#buzz
"Juan R." González-Álvarez
2010-06-05 11:32:16 UTC
Permalink
Post by "Juan R." González-Álvarez
Adding/substrating a constant to classical quantities as energy or
entropy does not change the physics. This is not a trick but a
'theorem'.
It is getting better and better. Now you want to add/subtract arbitrary
constants
to energy??
I agree that the funniness of your postings is increasing :-D
What happened with the law of energy conservation?
Consider an isolated system. Conservation reads

dU = 0

Now add/subtract a constant (U' = U + constant); it follows

dU' = dU + d(constant) = d(constant)

Now it remains to compute the differential of a constant. I will
leave this to you as an exercise [*]

(snip further misunderstandings)


[*] Solution: it is zero and dU' = 0
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-06-02 14:48:41 UTC
Permalink
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
You left out an important aspect of the discussion, namely QM
nature of entropy as such.
To derive entropy through enumeration of the states in the phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" from empirical representations. Of
cause empirical representations of absolute entropy used in
thermodynamical relationships does
not require enumerations of states. It can be measured for example by
integrating
Cp (thermal capacity) from absolute zero to present temperature.
And instead reading the references pointed to you, you continue doing
mistakes such as confounding the *definition* of entropy with formulae
valid *only* in special cases.
Integrating Cp is one of those *special* cases.
Post by Yevgen Barsukov
However, to derive value of entropy "from scratch" without any
measurements,
you will end up with need to do quantization of the phase space.
Plain wrong again.
Post by Yevgen Barsukov
In
classical model your number of states will be infinite.
Only if you do not take the quantum-classical correspondence in
rigorous form and apply naive approaches such as those appearing in
*elementary* textbooks in statistical thermodynamics. Even if one
closes the eyes to the mathematical funambulism in those approaches,
the resulting formulaes may be often reinterpreted/truncated to give
the correct values.
Post by Yevgen Barsukov
That is why you will need normalization constants etc to derive
entropy from classic considerations, while QM treatment does not need
it and the only constant the result will include is "h", and it is
always the same so no
additional measurements are needed.
That is all wrong again.
The definition of entropy was pointed before (the link was given). As
many people you confound the definition of entropy with the expression
S = k lnW which is only valid under certain restrictions (In the link I
showed how S = k lnW is *obtained from* the definition of entropy).
The definition I gave was the well-known quantum one (I repeat it here)
S = -k Tr ρ ln ρ
In the classical limit, this *rigorously* reduces to (you can use
Wigner formalism to prove this in the limit h --> 0)
S = -k \int f ln f dqdp
Adding or subtracting a constant does not change the result, thus some
authors, such as Keizer, define the entropy for a classical system as
S = -k \int f ln f dqdp  +  constant
And some authors give the value of the constant as
constant = k
I.e. they work with the expression
S = -k \int f (ln f  -1) dqdp   [*]
This is the classical entropy. It contains no h anywhere because it is
CLASSICAL. This is equation (4.1) in the reference [10] given in my
original post.
For a classical isolated ideal gas at equilibrium, f = f_eq has been
well known since Maxwell. Substituting the Maxwellian f_eq into the
above expression gives the value of the classical entropy for the gas,
which, of course, coincides with the entropy given by classical
thermodynamics. See ref. [10] for details.
Ref. [10] is an advanced monograph and probably not at hand in your
favourite library. All this material is also found in other books, for
instance in the well-known book "Non-equilibrium Thermodynamics" by
de Groot and Mazur.
The definition of entropy [*] is equation (31) on page 170 of their
book. The Maxwellian f is given in (43). Combining both, they obtain
the standard result for the CLASSICAL entropy -see (48)-
S = 1/T (U + pV - N mu)
This is equation (4.22) in the monograph [10]. Again, this is a
classical expression (no "h" appears anywhere).
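The classical reduction can also be checked numerically. A sketch (my own illustration, in k = 1 units with arbitrary grid parameters) evaluates -\int f ln f dv for a 1D Maxwellian velocity marginal, i.e. a Gaussian of variance sigma^2 = kT/m, against the analytic value (1/2) ln(2 pi e sigma^2); no Planck constant enters anywhere:

```python
import numpy as np

# Numerical sketch (illustrative, k = 1 units): the classical integral
# -\int f ln f dv for a 1D Maxwellian velocity marginal, i.e. a Gaussian
# with variance sigma^2 = kT/m.  sigma is a made-up value.
sigma = 2.0
v = np.linspace(-12*sigma, 12*sigma, 200001)
f = np.exp(-v**2 / (2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)

integrand = -f * np.log(f)
dv = v[1] - v[0]
S_numeric = float(np.sum((integrand[:-1] + integrand[1:]) * dv / 2))

S_analytic = 0.5 * np.log(2*np.pi*np.e*sigma**2)   # differential entropy
print(S_numeric, S_analytic)                       # the two agree
```

Tail truncation at 12 sigma is negligible here, so the trapezoid sum matches the closed form to many digits.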
http://www.canonicalscience.org/
BLOG:http://www.canonicalscience.org/publications/canonicalsciencetoday/ca...
First of all I would like to object to the terms "quantum" or
"classical" entropy.
And you would also 'object' to terms such as quantum or classical state,
and you would object to terms such as "classical thermodynamics" or
"quantum thermodynamics", but they are well established and your
misconceptions easily ignored...
Post by Yevgen Barsukov
Entropy is a thermodynamical parameter that is uniquely defined and it
is basically a number in Watt/C.
This is plain NONSENSE. First, entropy is not a thermodynamical
parameter but a physical quantity. Second, C is not a unit but ºC is.
Third, Watts are Joules per second, but the units of entropy are
[S] = J K^-1
I meant to say Wh/C (technical notation) which is the same as J / K if
you make the conversion.
While focusing on a misprint, you are trying to distract from the main
point - the absolute entropy for a given material at given molar
amount, temperature, and pressure is A NUMBER.
Which means, you can not sneak into it any arbitrary constant.
Here is a link to a table of experimentally measured molar entropy
values, so that you can get used to the idea that entropy is a NUMBER:
http://chemed.chem.wisc.edu/chempaths/GenChem-Textbook/Standard-Molar-Entropies-984.html

While this number can be either measured using the thermodynamical
potential formalism or derived from ab-initio considerations using
quantum mechanics (it has even been done for black holes, for Christ's
sake, and you still can not agree with the ideal gas case done in
1933!), it is still going to be a FIXED NUMBER without any arbitrary
constants.
Post by "Juan R." González-Álvarez
  "In classical mechanics, if a system possesses f degrees of freedom then
   phase-space is conventionally subdivided into cells of arbitrarily
   chosen volume h_0^f
   [...]
   in classical mechanics by setting the cell size in phase-space equal
   to Planck's constant, so that h_0 = h. This automatically enforces the
   most restrictive form of the uncertainty principle, \delta q_i \delta p_i = h."
That is hilarious. If you use any limitation on the size of the chosen
volume, it is no longer classical. Moreover, if you use the particular
limitation "Planck constant = h", it is now a quantum mechanical
treatment. Which is proving my point - you CAN NOT get to a NUMBER
representing absolute entropy without quantum mechanical quantisation.
I rest my case.
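A toy calculation (my own; k = 1 units, with a hypothetical phase-space volume) makes the disputed point concrete either way: counting cells of size h_0^f turns the phase-space volume into a NUMBER of states, but changing the arbitrary cell size h_0 only shifts S by a state-independent constant:

```python
import numpy as np

# Toy sketch (illustrative, k = 1): counting cells of size h0^f in a fixed
# phase-space volume gives S(h0) = ln(V_phase / h0**f).  Changing the
# arbitrary cell size h0 -> h0' shifts S by the CONSTANT f*ln(h0'/h0),
# independent of the state, so it never affects entropy differences.
f_dof = 3                      # degrees of freedom (made-up)
V_phase = 1.0e-60              # hypothetical accessible phase-space volume

def S_cells(h0):
    return np.log(V_phase / h0**f_dof)

h  = 6.62607015e-34            # Planck's constant, one possible cell size
h2 = 2*h                       # any other cell size
shift = S_cells(h) - S_cells(h2)
print(shift, f_dof*np.log(2))  # equal: a state-independent constant
```

Whether fixing h_0 = h counts as "still classical" is exactly what the two posters disagree about; the arithmetic itself is neutral on that.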

Regards,
Yevgen


--
Tune in to "Strange Drawing of the Day" buzz:
http://www.google.com/profiles/100679771837661030957#buzz
"Juan R." González-Álvarez
2010-06-05 11:49:05 UTC
Permalink
Yevgen Barsukov wrote on Wed, 02 Jun 2010 07:48:41 -0700:

And again you ignore my advice to open the textbooks and learn. Instead
you add more mistakes and nonsense to your first ones.
Post by Yevgen Barsukov
(...)
I meant to say Wh/C (technical notation) which is the same as J / K if
you make the conversion.
I am replying to what you wrote (i.e. "Watt/C"), not to what you
"meant" nor to what you supposedly meant.
Post by Yevgen Barsukov
While focusing on a misprint, you are trying to distract from the main
point - the absolute entropy for a given material at given molar
amount, temperature, and pressure is A NUMBER.
That is all plain wrong again. Entropy is a physical quantity!

Physical quantities (entropy, energy, temperature...) are not given by
numbers but by the *product* of a numerical value and a unit. The value
does not need to be a single number; it can be a scalar function, for
instance.

This is all elementary stuff and it is well covered in elementary
textbooks on physics or chemistry. Open some of them.
Post by Yevgen Barsukov
Which means, you can not sneak into it any arbitrary constant. Here is
a link to a table of experimentally measured molar entropy values, so
that you can get used to the idea that entropy is a NUMBER:
http://chemed.chem.wisc.edu/chempaths/GenChem-Textbook/Standard-Molar-Entropies-984.html
That is only a table of standard molar entropies at 298.15 K. That is,
it is a table of numbers.

I already said to you that one must not confuse a number with a function.

Above I wrote (several posts ago) the entropy for a classical gas

S = 1/T (U + pV - N mu)

This is a function S = S(U,V,N), as is also explained in textbooks. If
you select a specific value for T (e.g. 298.15 K) and molar
quantities... then you get numbers (plus units), but do NOT confound
those numbers (plus units) with the entropy function S.
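The function-versus-number point can be made concrete. A sketch (my own; k = 1 units, with an arbitrary constant c deliberately left free) takes a classical ideal-gas entropy function S(U,V,N), obtains T, p, and mu from its numerical partial derivatives, and checks the Euler relation S = (U + pV - N mu)/T, which holds whatever c is:

```python
import numpy as np

# Sketch (illustrative; c is an arbitrary constant, left symbolic on
# purpose): for a first-degree homogeneous entropy function S(U,V,N),
# here the classical ideal-gas form S = N*(ln(U**1.5 * V / N**2.5) + c),
# the Euler relation S = (U + p*V - N*mu)/T holds identically.
c = 7.3                            # arbitrary additive constant

def S(U, V, N):
    return N * (np.log(U**1.5 * V / N**2.5) + c)

def partial(fn, args, i, h=1e-6):
    a1, a2 = list(args), list(args)
    a1[i] += h; a2[i] -= h
    return (fn(*a1) - fn(*a2)) / (2*h)   # central finite difference

U, V, N = 3.7, 2.1, 1.4            # made-up state (k = 1 units)
invT = partial(S, (U, V, N), 0)    # 1/T  =  dS/dU
p_T  = partial(S, (U, V, N), 1)    # p/T  =  dS/dV
mu_T = -partial(S, (U, V, N), 2)   # mu/T = -dS/dN

T = 1.0 / invT
print(S(U, V, N))
print((U + (p_T*T)*V - N*(mu_T*T)) / T)   # same value: Euler relation
```

Only once T, p, and the molar amount are pinned down does the function S collapse to a number; the function itself carries the arbitrary constant c around harmlessly.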

Also, I already explained to you that those numerical entropies are
only *approximate* because they are based on several approximations.

Moreover, *standard* molar entropies are based on conventions. The
well-known textbook by Klotz & Rosenberg (Chemical Thermodynamics; 7th
Ed.) is very clear on this point:

"In the statement that we have adopted for the third law, it is assumed
(arbitrarily) that the entropy of each element in some crystalline state
is zero at 0 K."

"The molar entropy so obtained frequently is called the "absolute"
entropy and is indicated as S_mT^0. However, in no sense is S_mT^0
truly an absolute entropy, because Equation (11.21) is based on the
convention that zero entropy is assigned to each element in some state
at 0 K."

"Entropies obtained from Equation (11.21) properly are called
conventional entropies or standard entropies."

Apart from conventions, their (11.21) is not the general expression for
the (function) entropy, but a special form valid only under
*approximations*. The GENERAL expression was already given to you,
including relevant literature such as the monograph on statistical
mechanics by Eu and the celebrated textbook on thermodynamics by
de Groot and Mazur.

You do not read. You do not open the books... I will repeat this stuff again for
you, but this is the last time...

As noted by Keizer, the definition of classical entropy is

S = -k \int f ln f dqdp  +  constant

Authors such as Eu, de Groot, and Mazur choose k as the constant. Thus
the definition of classical entropy they are using is

S = -k \int f (ln f  -1) dqdp

This is equation (4.1) in Eu's monograph and equation (31) in the
classic textbook by de Groot and Mazur.

In their "Course of Theoretical Physics", volume 10, Landau and
Lifshitz use

S = \int f ln(e/f) dqdp

This is their equation (4.1) in the section "The entropy of an ideal
gas", but in different notation.

I leave it to you as an exercise to verify the relation of the Landau
and Lifshitz expression to the one given above by Eu, de Groot, and
Mazur :-D
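For readers skipping the exercise, the verification is immediate (Landau and Lifshitz set k = 1; recall ln e = 1):

```latex
S_{LL} = \int f \ln(e/f)\, \mathrm{d}q\,\mathrm{d}p
       = \int f\,(1 - \ln f)\, \mathrm{d}q\,\mathrm{d}p
       = -\int f\,(\ln f - 1)\, \mathrm{d}q\,\mathrm{d}p
```

which is the Eu / de Groot-Mazur expression S = -k \int f (ln f - 1) dqdp once the factor k is restored.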

(...)
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
  "In classical mechanics, if a system possesses f degrees of freedom
  then
   phase-space is conventionally subdivided into cells of arbitrarily
   chosen volume h_0^f
   [...]
   in classical mechanics by setting the cell size in phase-space
   equal to Planck's constant, so that h_0 = h. This automatically
   enforces the most restrictive form of the uncertainty principle,
   \delta q_i \delta p_i = h."
That is hilarious.
Of course it is. As I SAID in my message, the above is NONSENSE, but
*you* truncated that important part of my message...

(...)
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-06-07 17:44:57 UTC
Permalink
Post by "Juan R." González-Álvarez
And again you ignore my advice to open the textbooks and learn. Instead
you add more mistakes and nonsense to your first ones.
(...)
Moreover, *standard* molar entropies are based on conventions. The
well-known textbook by Klotz & Rosenberg (Chemical Thermodynamics; 7th
Ed.) is very clear on this point:
  "In the statement that we have adopted for the third law, it is assumed
   (arbitrarily) that the entropy of each element in some crystalline state
   is zero at 0 K."
As you already attacked in passing both the second and first laws of
thermodynamics, I should have seen this coming. Now the third law of
thermodynamics is a mere "convention"?

In reality it is of course not a "convention" but the only possible
definition of entropy for an object that is in its ground state:
"As a system approaches absolute zero, all processes cease and the
entropy of the system approaches a minimum value."

For more details on justifications of the third law of thermodynamics,
for the audience that has been paying attention:
http://en.wikipedia.org/wiki/Third_law_of_thermodynamics

The minimum value here deviates from zero if (again, from quantum
mechanical considerations) the system can have more than one state
without thermal energy. Such cases exist (for example, solid hydrogen
would still have spin), but again the entropy would be clearly defined
by QM means even for such cases. So, again, NO ARBITRARY CONSTANTS even
at zero.
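As a concrete instance of that residual-entropy case (my numbers, assuming exactly W = 2 equally likely ground-state spin configurations per molecule), the molar residual entropy is fixed by the state count at R ln 2:

```python
import numpy as np

# Illustration (my assumption, not Yevgen's numbers): if each molecule
# retains W = 2 equally likely spin configurations as T -> 0, the residual
# molar entropy is S_0 = R*ln(W), fixed by counting, with no free constant.
R = 8.314462618          # gas constant, J mol^-1 K^-1
W = 2                    # assumed ground-state configurations per molecule
S_residual = R * np.log(W)
print(S_residual)        # about 5.76 J mol^-1 K^-1
```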

I guess we are through all the laws of thermodynamics already, so in
the absence of objects to attack I presume you will take on something
more fundamental - QM itself? Or maybe general relativity?

Regards,
Yevgen
"Juan R." González-Álvarez
2010-06-07 19:29:19 UTC
Permalink
Post by Yevgen Barsukov
(...)
As you already attacked in passing both the second and first laws of
thermodynamics,
On the contrary, it was you who made the silly claims about the
violation of the first law because, according to you, the differential
of a constant is not zero!
Post by Yevgen Barsukov
I should have seen this coming. Now the third law of thermodynamics is a
mere "convention"?
Not a surprise that you pretend to misunderstand this too.

(...)
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-06-10 16:51:02 UTC
Permalink
Post by "Juan R." González-Álvarez
And again you ignore my advice to open the textbooks and learn. Instead
you add more mistakes and nonsenses to your first ones.
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
Post by Yevgen Barsukov
You left out an important aspect of the discussion, namely QM
nature of entropy as such.
To derive entropy through enumeration of the states in the
phase space,
This mistake was corrected in Sean Carrol's blog (links given).
You are mixing up "ab initio" from empirical representations. Of
cause empirical representations of absolute entropy used in
thermodynamical relationships does
not require enumerations of states. It can be measured for example
by integrating
Cp (thermal capacity) from absolute zero to present temperature.
And instead reading the references pointed to you, you continue
doing mistakes such as confounding the *definition* of entropy with
formulae valid *only* in special cases.
Integrating Cp is one of those *special* cases.
Post by Yevgen Barsukov
However, to derive value of entropy "from scratch" without any
measurements,
you will end up with need to do quantization of the phase space.
Plain wrong again.
Post by Yevgen Barsukov
In
classical model your number of states will be infinite.
Only if you do not take the quantum-classical correspondence in
rigorous form and apply naive approaches such as those appearing in
*elementary* textbooks in statistical thermodynamics. Even if one
closes the eyes to the mathematical funambulism in those approaches,
the resulting formulaes may be often reinterpreted/truncated to give
the correct values.
Post by Yevgen Barsukov
That is why you will need normalization constants etc to derive
entropy from classic considerations, while QM treatment does not
need it and the only constant the result will include is "h", and
it is always the same so no
additional measurements are needed.
That is all wrong again.
The definition of entropy was pointed before (the link was given).
As many people you confound the definition of entropy with the
expression S = k lnW which is only valid under certain restrictions
(In the link I showed how S = k lnW is *obtained from* the
definition of entropy).
The definition I gave was the well-known quantum one (I repeat it here)
S = -k Tr ρ ln ρ
In the classical limit, this *rigorously* reduces to (you can use
Wigner formalism to prove this in the limit h --> 0)
S = -k \int f ln f dqdp
Adding or substracting a constant does not change the result, thus
some authors as Keizer define the entropy for a classical system as
S = -k \int f ln f dqdp  +  constant
And some authors give the value of the constant as
constant = k
I.e. they work with the expression
S = -k \int f (ln f  -1) dqdp   [*]
This is the classical entropy. It contains not h in any part because
it is CLASSICAL. This is the equation (4.1) in the reference [10]
given in my original post.
For a classical isolated ideal gas at equilibrium f = f_eq is
well-known since Maxwell. Substituting Maxwellian f_eq into the
above expression gives the value of the classical entropy for the
gas, which, of course, coincides with the entropy given by classical
thermodynamics. See ref [10] for details.
Ref [10] is a advanced monograph and probably it is not at hand at
your favourite library. All this material is also found in other
books. For instance in the well-known book "Non-equilibrium
Thermodynamics" by deGroot an Mazur.
The definition of entropy [*] is equation (31) on page 170 of
their book. The Maxwellian f is given in (43). Combining both, they give
the standard result for the CLASSICAL entropy -see (48)-
S = 1/T (U + pV - N mu)
This is equation (4.22) in the monograph [10]. Again, this is a
classical expression (no "h" appears anywhere).
http://www.canonicalscience.org/
BLOG:http://www.canonicalscience.org/publications/canonicalsciencetoday/ca...
First of all I would like to object to the terms "quantum" or
"classical" entropy.
And you would also 'object' to terms such as quantum or classical state,
and you would object to terms such as "classical thermodynamics" or "quantum
thermodynamics", but they are well established and your misconceptions
easily ignored...
Post by Yevgen Barsukov
Entropy is a thermodynamical parameter that is uniquely defined and
it is basically a number in Watt/C.
This is plain NONSENSE. First, entropy is not a thermodynamical
parameter but a physical quantity. Second, C is not a unit, but ºC is.
Third, a Watt is a Joule per second, but the units of entropy are Joules
per Kelvin:
[S] = J K^-1
I meant to say Wh/C (technical notation) which is the same as J / K if
you make the conversion.
I am replying to what you wrote (i.e. "Watt/C"), not to what you "meant" nor
to what you supposedly meant.
Post by Yevgen Barsukov
While focusing on a misprint, you are trying to distract from the main
point - the absolute entropy for a given material at a given molar
amount, temperature, and pressure is A NUMBER.
That is all plain wrong again. Entropy is a physical quantity!
Physical quantities (entropy, energy, temperature...) are not given by
numbers but by the *product* of a quantity and a unit. The quantity does
not need to be a number, it can be a scalar function, for instance.
This is all elementary stuff and it is well covered in elementary
textbooks on physics or chemistry. Open some of them.
Post by Yevgen Barsukov
Which means, you cannot sneak any arbitrary constant into it. Here is a
link to a table of experimentally measured molar entropy values:
http://chemed.chem.wisc.edu/chempaths/GenChem-Textbook/Standard-Molar...
That is only a table of standard molar entropies at 298.15 K. That is,
it is a table of numbers.
I already told you that one must not confuse a number with a function.
Above I wrote (several posts ago) the entropy for a classical gas
S = 1/T (U + pV - N mu)
This is a function S = S(U,V,N), as is also explained in textbooks. If you select
a specific value for T (e.g. 298.15 K) and molar quantities... then you get numbers
(plus units), but do NOT confuse those numbers (plus units) with the entropy function S.
You keep mixing up the ab-initio derivation of something from basic
principles (what I am talking about) with a function that relates one
physical parameter, in this case entropy, to other physical
parameters (in this case U).
An ab-initio derivation allows you to get the value of a physical
parameter without making any measurements. That is what I was asking
you to demonstrate for entropy, using only classical mechanics.

And yet you keep coming back with functions connecting entropy with
other parameters, functions which require physically measured values
of U. I am surprised at your persistence in this fallacy. I have
absolutely no disagreement with you that it is very straightforward
to physically measure entropy, or to relate it to other values such
as U, as I mentioned in every one of my messages. However, it does
not bring you any closer to the challenge I posed - to show how you
can derive entropy values, for even the simplest object such as an
ideal gas, that will be close to experimental values (such as in this
table:
http://chemed.chem.wisc.edu/chempaths/GenChem-Textbook/Standard-Molar...
) WITHOUT measuring any physical quantities, i.e. ab-initio.

Just make sure to run a simple test before you post the next function
- substitute all the parameters into it (none is allowed to be
experimentally measured!) and see if you can get a J/K value of
entropy. If you happen to find that some experimentally measured
parameter is needed, don't bother to post it - it is irrelevant to our
discussion about the ab-initio derivation of entropy.
If you do find a function that satisfies the above criteria, my
statement is: it WILL include the Planck constant h, which proves that
I am right and entropy has a QM background.
If you do NOT find an ab-initio entropy function that satisfies the
above criteria and does not contain "h", then go ahead and admit that
you are wrong.

Regards,
Yevgen

--
Tune in to "Strange Drawing of the Day" buzz:
http://www.google.com/profiles/100679771837661030957#buzz
"Juan R." González-Álvarez
2010-06-11 23:58:03 UTC
Permalink
Post by Yevgen Barsukov
Post by "Juan R." González-Álvarez
And again you ignore my advice to open the textbooks and learn. Instead
you add more mistakes and nonsense to your first ones.
And again you ignore my advice to open the textbooks and learn. Instead
you add more mistakes and nonsense to your first ones.
[... earlier quoted text snipped; see the previous messages above ...]
Post by Yevgen Barsukov
You keep mixing up the ab-initio derivation of something from basic
principles (what I am talking about) with a function that relates one
physical parameter, in this case entropy, to other physical
parameters (in this case U).
You did not understand anything. I wrote the fundamental definition
of entropy and the ab-initio derivation of the *approximate*
expressions that you use.
Post by Yevgen Barsukov
An ab-initio derivation allows you to get the value of a physical
parameter without making any measurements. That is what I was asking
you to demonstrate for entropy, using only classical mechanics.
You did not understand anything. The definition of the entropy given
yields the value without any measurement.
Post by Yevgen Barsukov
And yet you keep coming back with functions connecting entropy with
other parameters,
You did not understand anything. The definition of the entropy does not use
parameters. Nor is entropy a parameter, which was another of your silly
claims in the past.
Post by Yevgen Barsukov
and which require physically measured values of U. I am surprised at
your persistence in this fallacy.
You did not understand anything. The expression given above

S = 1/T (U + pV - N mu)

was derived ab-initio from the definition of classical entropy (which you
still do not understand).

The value of U is not measured but computed ab-initio.

You insist on confounding that theoretical expression with the related
phenomenological expression of macroscopic thermodynamics

S_th = 1/T_th (U_th + p_th V_th - N_th mu_th)

where U_th, V_th, N_th, etc. are obtained from measurements.
Post by Yevgen Barsukov
I have absolutely no disagreement with you that it is very
straightforward to physically measure entropy, or to relate it to
other values such as U, as I mentioned in every one of my messages.
You did not understand anything.
Post by Yevgen Barsukov
However, it does not bring you any closer to the challenge I posed -
to show how you can derive entropy values, for even the simplest
object such as an ideal gas, that will be close to experimental
values (such as in this table:
http://chemed.chem.wisc.edu/chempaths/GenChem-Textbook/Standard-Molar...
) WITHOUT measuring any physical quantities, i.e. ab-initio.
You did not understand anything. You have been shown how to obtain the entropy
of an ideal gas from the definition. No measurement of anything is needed.
The result is purely ab-initio, and its value can be obtained with nothing more than a computer.

You did not understand anything. As has been repeated and repeated and repeated
and repeated and repeated and repeated and repeated, the definition of the entropy
given is the *general* form, whereas your tables only contain an approximate entropy
valid only under certain *restrictions*.
Post by Yevgen Barsukov
Just make sure to run a simple test before you post the next function
- substitute all the parameters into it (none is allowed to be
experimentally measured!) and see if you can get a J/K value of
entropy. If you happen to find that some experimentally measured
parameter is needed, don't bother to post it - it is irrelevant to our
discussion about the ab-initio derivation of entropy.
You did not understand anything. The expressions given to you are ab-initio.

You did not understand anything. The expressions given to you do not contain
parameters.

You did not understand anything. The value of the entropy is computed from
the expression given without measurement of parameters.
Post by Yevgen Barsukov
If you do find a function that satisfies the above criteria, my
statement is: it WILL include the Planck constant h, which proves
that I am right and entropy has a QM background.
You did not understand anything. The expression for the classical entropy
was given to you. The expression for the *classical* entropy does not
contain the Planck constant, because classical quantities do not contain
the Planck quantum constant.

You did not understand anything. The expression for the classical entropy
can be obtained from the quantum entropy by taking h --> 0 in it. Taking h --> 0
means that h vanishes. Your silly claim is like claiming the idiocy that
Newton's laws of classical mechanics "WILL include Planck constant h" (sic)

You were also given three textbooks, including the famous "Course of
Theoretical Physics" by Landau, who also gives the same expression for the
classical entropy that I gave to you half a dozen times. But you did not understand anything.

I gave you homework, but you did not do it. It was too difficult for you. I will give you the solution now :-D

The definition of classical entropy (which does not include Planck constant h, of course) is

S = -k \int f (ln f  -1) dqdp

This is equation (4.1) in Eu's monograph and equation (31) in the
classic textbook by de Groot and Mazur. The references were given to you, but
you did not even open them.

In their "Course of Theoretical Physics", volume 10, Landau and Lifshitz
use

S = \int f ln(e/f) dqdp

This is their equation (4.1) in the section "The entropy of an ideal gas",
but in other notation. Nor did you open this book.

I invited you to verify, as an exercise, the relation between their equation
and that given by Eu and by de Groot and Mazur. This is easy. First note that

ln(e/f) = ln e - ln f = 1 - ln f

Substituting back gives

S = \int f (1 - ln f) dqdp

which can be rewritten as

S = - \int f (ln f  -1) dqdp

Now you can see that the expression given by Eu and by de Groot and Mazur was

S = -k \int f (ln f  -1) dqdp

The only difference between the two is the presence of k. But do not worry:
both expressions are the same.

It happens that Landau is using a system of units where k=1.

I am sorry if this homework was too difficult for you.
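The algebra in the homework above is also easy to check numerically. The following is my own minimal sanity check (not part of the thread), verifying pointwise that Landau's integrand f ln(e/f) matches the Eu / de Groot-Mazur integrand -f (ln f - 1):

```python
import math

# ln(e/f) = ln e - ln f = 1 - ln f, so the two integrands must agree
# for every positive f. Check a few sample values.
for f in (1e-3, 0.1, 0.5, 1.0, 2.0, 10.0):
    landau = f * math.log(math.e / f)       # Landau-Lifshitz form (k = 1)
    eu     = -f * (math.log(f) - 1.0)       # Eu / de Groot-Mazur form, k = 1
    assert abs(landau - eu) < 1e-12
print("integrands agree")
```

With k restored, multiplying both sides by k reproduces the identity of the two entropy expressions discussed above.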
Post by Yevgen Barsukov
If you will NOT find an ab-initio entropy function that satisfies above
criteria and does not
contain "h", than go ahead and admit that you are wrong.
I admit that I was wrong in some respect. I supposed that by using dialogue,
mathematical derivations, and citations of three textbooks on the topic, you could
learn something new and stop saying nonsense. How wrong I was!

Please, if you feel the need to invent some new nonsense and to share it
with us, do it :-D
--
http://www.canonicalscience.org/

BLOG:
http://www.canonicalscience.org/publications/canonicalsciencetoday/canonicalsciencetoday.html
Yevgen Barsukov
2010-07-04 17:02:10 UTC
Permalink
On Jun 11, 6:58 pm, "Juan R." González-Álvarez
Post by "Juan R." González-Álvarez
[... full quote of the previous message snipped; see above ...]
Too bad you wasted so much writing without making the simple check
that I suggested - make sure to substitute all values and get an
answer in J/K before you bother to post a new equation. This simple
check lets you verify that an equation is indeed ab-initio and does
not require anything experimentally measured.
If you tried to do that with
S = -k \int f (ln f -1) dqdp
...you would quickly discover that you don't have dqdp - it is either
an experimentally measured value or needs an ab-initio derivation of
its own.
In contrast, if you try to get the entropy value from quantum
mechanical considerations, as done by Sackur and Tetrode,
http://en.wikipedia.org/wiki/Sackur-Tetrode_equation
you don't need to substitute anything except volume, total energy,
atomic mass, and number of atoms.
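For what it's worth, the Sackur-Tetrode formula can indeed be evaluated with nothing but fundamental constants plus T, p (equivalently V/N), and the atomic mass. The following sketch (my own, not from the thread) runs the kind of check being proposed here, comparing against the tabulated standard molar entropy of argon, about 154.8 J mol^-1 K^-1 at 298.15 K and 1 bar:

```python
import math

# CODATA constants
h   = 6.62607015e-34   # Planck constant, J s
k_B = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23    # Avogadro constant, 1/mol
u   = 1.66053907e-27   # atomic mass unit, kg

def sackur_tetrode_molar(mass_kg, T, p):
    """Molar entropy of a monatomic ideal gas, in J/(mol K).

    S/(N k) = ln(V / (N lambda^3)) + 5/2, where lambda is the thermal
    de Broglie wavelength -- note that h enters through lambda.
    """
    lam = h / math.sqrt(2.0 * math.pi * mass_kg * k_B * T)  # thermal wavelength, m
    v_per_atom = k_B * T / p                                # V/N for an ideal gas, m^3
    return N_A * k_B * (math.log(v_per_atom / lam**3) + 2.5)

# Argon (39.948 u) at 298.15 K and 1 bar:
S_ar = sackur_tetrode_molar(39.948 * u, 298.15, 1.0e5)
print(S_ar)   # ≈ 154.8 J/(mol K), close to the tabulated value
```

Taking h --> 0 inside the logarithm makes lambda --> 0 and the result diverge, which is exactly the constant-ambiguity the two sides of this thread are arguing about.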

Leo Tolstoy once wrote in his letters, "sorry I did not have enough
time to write you a short letter."
I would recommend thinking about that. The key to success as a
theorist is to see the underlying concept and not get drowned in the
many layers of equations - after all, mathematics is just a tool. In
this particular discussion the key question is: can absolute entropy
be found without quantum mechanical considerations or not? I say no,
you say yes.
I already provided an equation that gives an answer in J/K for absolute
entropy without experimentally measured quantities (link above) using
quantum mechanics; you did NOT do the same for a "classical only"
ab-initio derivation. Please do so (a classical equation for S using
only the same "inputs" as the Sackur-Tetrode equation) and there will
be no need for many pages of pondering.

Regards,
Yevgen
