Problematic thermodynamics: A new beginning?
Below is a post on thermodynamics from Kent Mayhew. I don’t necessarily agree or disagree with his views, but I do agree that new ideas on entropy are needed. Please comment and let us know your thoughts.
When seriously learning thermodynamics we often start with the random walk, and from that we learn probabilities and how the interaction of matter and energy can best be summed up by probabilities. Few ever question this approach, forgetting that probabilities give results, not reasons. Deal a hand of cards and the results of a pair, a flush, a full house, etc. all come into play, yet the reason remains that the cards were dealt; only the results are defined by probabilities.
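As a concrete illustration of that point, a short simulation (the step count and walk count are my own illustrative choices, not figures from the post) shows that statistics describe where the walks end up, while the "reason" each walk ends where it does is simply the sequence of steps it was dealt:

```python
import random

random.seed(1)  # fixed seed so the run is repeatable

def walk(n_steps):
    # one 1-D random walk: the sum of +1/-1 steps
    return sum(random.choice((-1, 1)) for _ in range(n_steps))

# The statistics predict the spread of endpoints; each individual
# endpoint is just the result of the steps actually taken.
endpoints = [walk(100) for _ in range(10_000)]
mean = sum(endpoints) / len(endpoints)
print(f"mean endpoint over 10,000 walks: {mean:.2f}")
```

The mean endpoint comes out near zero, exactly as the probabilities predict, yet no individual walk ended where it did "because of" that prediction.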
To say that thermodynamics needs a rewrite because probabilities give results and not reasons is an ambiguous argument that would not alter anyone’s mindset. However, with this in the back of our minds, let us reconsider some aspects of thermodynamics.
If one accepts that our atmosphere exerts a pressure then one must accept that the atmosphere consists of mass located in a gravitational field. Now consider an expanding system; in order to experience a volume increase, it must upwardly displace our atmosphere’s mass. And this constitutes work (W=PdV)!
Sure, such an expansion may simply cause a regional pressure increase, but in an open system, once mechanical equilibrium is attained, the net result will be the upward displacement of Earth’s atmosphere, which required isobaric work: W = PdV. Furthermore, you cannot simply recover this work, hence it becomes “lost work”/“lost energy”. This is discussed in my paper titled “Second law and lost work”, Physics Essays (2016).
See:
- http://www.physicsessays.org/browse-journal-2/product/1173-24-kent-w-mayhew-second-law-and-lost-work.html
- This paper is also posted on my website: http://www.newthermodynamics.com/#!mypapers/c1qsm
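To put a rough number on the isobaric work W = PdV described above, here is a minimal sketch using standard sea-level pressure; the 1-litre expansion is an assumed illustrative figure, not one from the papers:

```python
# Isobaric work done against the surroundings: W = P * dV.
# Standard sea-level pressure; the 1-litre expansion is assumed.
P_ATM = 101_325.0   # Pa
dV = 1.0e-3         # m^3 (1 litre)

W = P_ATM * dV
print(f"W = P*dV = {W:.1f} J")  # 101.3 J
```

So even a modest 1-litre expansion against the atmosphere costs on the order of a hundred joules.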
Sadly, the 19th century scientific greats, i.e. Clausius, Maxwell, Kelvin etc., all failed to understand the above simple premise. Some may call this a minor oversight, but the reality is that this is perhaps the biggest blunder the sciences have ever endured. Just think of the consequences.
Clausius realized that something when multiplied by temperature (T) defined energy, and named that something entropy (S). Hence the basis of the following fundamental isothermal isobaric thermodynamic relation was realized:
TdS = dE + PdV (1)
Eqn (1) states that isothermal entropy change (TdS) equals the change in internal energy (dE) plus the isobaric volume change (PdV).
All who have studied thermodynamics have accepted the above relation (1) without any real consideration. Furthermore, we have all thought that the lost work (PdV) experienced by expanding systems can be best explained in terms of entropy change (dS). For example latent heat of vaporization (AKA enthalpy of vaporization) involves PdV, which is often discussed in terms of changes to the liquid molecule’s entropy/randomness. Engineers often describe this as “non-sensible” work.
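The PdV share of the latent heat of vaporization mentioned above can be estimated with standard textbook figures. In this sketch the vapour is treated as ideal and the liquid volume is neglected, so PdV ≈ RT per mole; the numbers are illustrative:

```python
# Split of water's molar enthalpy of vaporization at 100 degrees C into
# internal-energy change and PdV work. Treats the vapour as ideal and
# neglects the liquid's volume, so P*dV ~= R*T per mole.
R = 8.314         # J/(mol K), gas constant
T = 373.15        # K, normal boiling point of water
H_vap = 40_700.0  # J/mol, approximate enthalpy of vaporization

W_pdv = R * T          # isobaric expansion work against the atmosphere
dE = H_vap - W_pdv     # remaining internal-energy change

print(f"PdV work: {W_pdv / 1000:.1f} kJ/mol")  # 3.1 kJ/mol
print(f"dE      : {dE / 1000:.1f} kJ/mol")     # 37.6 kJ/mol
```

On these figures the PdV (atmosphere-displacement) portion is under a tenth of the total latent heat; the rest goes into the internal energy change of the molecules.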
It sounds so simple, until you ask: what is entropy? Well, it turns out that entropy has different interpretations, all dependent upon its application. When Shannon asked von Neumann what he should call the mathematical entity in his information theory, he was basically told: “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.” Interestingly, Shannon was going to call it information, and perhaps that would have been a superior choice over entropy.
The point remains that entropy lacks clarity, and yet it forms the basis of the most used and most pertinent differential equation in thermodynamics, that being eqn (1).
What does eqn (1) imply to most physicists? Consider that one accepts Boltzmann’s conceptualization that entropy signifies “the randomness of molecules in incessant motion”, or Frank Lambert’s interpretation that entropy signifies “the dispersal of a system’s energy”. Either way, the conceptualization of randomness/dispersal remains vague, and herein I am in agreement with the likes of Arieh Ben-Naim in saying that such terminology is not particularly scientific.
However, unlike everyone else, I believe that the association of randomness with lost work is basically due to the fact that we studied expanding systems, saw that the systems became more random, and then jumped to the erroneous conclusion that the lost work (PdV) is due to the randomness increase within expanding systems, rather than due to the displacement of the surrounding atmosphere by such systems.
To some readers this may sound like a simple mistake but unfortunately this gross oversight now pervades every aspect of the sciences. It has allowed entropy to attain a demigod status (for lack of a better word). Entropy is used to explain so much, yet it remains a variable without any concise meaning. Entropy may be nothing more than a mathematical contrivance, as is discussed in my other 2016 paper published in Physics Essays titled “Entropy: An ill-conceived mathematical contrivance?”
See:
- http://physicsessays.org/browse-journal-2/product/1377-13-kent-w-mayhew-entropy-an-ill-conceived-mathematical-contrivance.html
- http://www.newthermodynamics.com/#!mypapers/c1qsm
Some might argue that we use mathematical contrivances all the time, and they would be right. Is this to say: so what if entropy is one? Come on, folks. We must at least understand that if we are going to base a whole science upon a mathematical contrivance, then it would be nice if that contrivance eventually had an all-encompassing meaning that made sense. Some may argue that this eventuality may arise with further study. Perhaps, but after over 150 years of use, we might now be better off examining the sanity of our past. Moreover, the conceptualization that eqn (1) has anything to do with randomness makes no real sense.
Okay, your attitude may be: who cares, because the science as it stands can explain so much. My point becomes that we should care that our science complicates the simple. Moreover, we should man up and accept the punishment, that being a shot to all of our prides for being foolhardy. Everyone with a science degree has made the mistake of accepting entropy’s association with randomness, and the implications for its accomplice, the second law, all without realizing that it is all a complication of the simple.
The second law should be limited to isolated systems, as is stated in its definition. The mere fact that all useful systems here on Earth are expanding systems surrounded by our atmosphere means that no useful systems here on Earth are isolated. In other words, all useful expanding systems must upwardly displace our atmosphere! Therefore, such systems experience lost work (lost into the surrounding atmosphere’s potential energy increase). Hence no useful system is isolated, and therefore the second law of thermodynamics does not apply. This is discussed in my papers previously described herein.
A referee did not want my paper “Second law and lost work” to be published. Although he thought I was right, he could not allow 150 yrs of indoctrination to be taken down by such a simple argument. Luckily saner minds prevailed.
If the above is not enough, now consider Boltzmann’s constant (k). Is it really a universal constant? The fact is: no, it is not a universal constant. Rather, it is a variable that depends upon the mass of our atmosphere and our gravitational field. Since our gravitational field is constant and our atmosphere’s mass is relatively constant, then when we measure/calculate Boltzmann’s constant, it seems like a constant to us here on Earth.
We can take this a step further and realize that Boltzmann’s constant is what enables Boltzmann’s great eloquent math, commonly known as statistical thermodynamics, to work. If we were on another planet with a different atmosphere in a different gravitational field, then we could use the same basic statistical thermodynamics to explain what we see, so long as we used a new so-called constant. In my way of thinking, Boltzmann’s constant (k) basically describes the ability of a system to do work here on Earth, as a function of its temperature.
No wonder the likes of Planck and Mach did not endear themselves to Boltzmann’s probability-based world. Sure, we could explain lost work (PdV) in terms of Boltzmann’s statistics, but that is only because Boltzmann designed his constant (k) so that it equated to lost work (PdV) here on Earth. If you think about it, this is classic circular logic at its finest.
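For what it is worth, the standard ideal-gas relation already ties k to isobaric expansion work: heating N molecules at constant pressure gives PdV = Nk dT. A quick check using the SI-defined constants (this is textbook ideal-gas behaviour, offered as a check, not a verdict on either interpretation):

```python
# For an ideal gas heated at constant pressure, P*dV = N*k*dT,
# so k fixes the expansion work per molecule per kelvin.
k = 1.380649e-23      # J/K, Boltzmann constant (exact in the SI)
N_A = 6.02214076e23   # molecules per mole (Avogadro, exact in the SI)
dT = 1.0              # K

W = N_A * k * dT  # isobaric expansion work for one mole, one kelvin
print(f"P*dV = {W:.3f} J per mole per kelvin")  # 8.314 J
```

That product N_A·k is of course just the gas constant R.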
I like to think that Planck and Mach preferred more concrete ideologies based on rationale. Too bad that they did not think in terms of the upward displacement of Earth’s atmosphere, because then they may have closed the door on randomness and the domination of all those associated probabilities. Do not get me wrong, there is nothing wrong with learning about probabilities; it is just that one must remember that for the most part they give results, not reasons.
The consequences go further than randomness vs work required to upwardly displace our atmosphere by expanding systems. Just consider cosmology:
- Does entropy even apply to things like the big bang, assuming Hubble was right?
- CBR: Do entropy increases in radiation actually happen? Actually, I have another explanation for CBR, but that’s for another day!
- What about those so-called black hole paradoxes? Okay, some claim to have solved them, but whether they be right or wrong, would the paradoxes even have existed?
The questions become numerous.
We could equally argue that entropy and the second law do not belong in biology, or any of the sciences. Okay entropy can remain as a mathematical contrivance until we further our understanding, which might actually happen once we simplify thermodynamics. Subjects like physical chemistry will need an extensive overhaul, but this should be viewed as an opportunity to finally apply some constructive logic, rather than remain an awkward subject wherein differentials are randomly moved about until some sort of seemingly acceptable result arises.
Speaking of awkward uses of differentials, now consider what we do with our equation (1). Certainly (1) [TdS = dE + PdV] can be considered as the isothermal, isobaric version of:
d(TS) = dE + d(PV) (2)
Another non-sensible aspect of thermodynamics is the fact that we generally start off with (1) and then subtract versions of (2) from it to get the other differentials utilized in thermodynamics. Where else in mathematics do we start off with a part [eqn (1)] and then subtract the whole [eqn (2)] to get the other parts? Logic dictates that we should start off with a whole [eqn (2)] and then determine its parts, such as eqn (1). Sadly, logic in thermodynamics is long lost. And then there is the 150 years of indoctrination.
There are those who rightfully believe that relativity needs to be challenged. Perhaps you will see that thermodynamics is in more need of revision. Not because it carries as much glamour as challenging Einstein’s thought, but rather because it is taught at the high school level and above. Imagine your children learning that our atmosphere has mass and that its upward displacement requires work, rather than being daunted by mystical entropy and its awkward association with randomness. This sort of logic may pervade their minds, enabling some of them to grow up and apply their newly found logic to all realms of the sciences, including relativity. Who knows, maybe it will force any indignant physicist to actually open their minds to constructive criticism.
Herein I have just scratched the surface and could type for hours but will stop as I am wondering if there is any interest in further discussion by the good folk of Natural Philosophy. Thank you for your time.
Sincerely Kent Mayhew
P.S. There are some mistakes in my first edition of my book that have been changed and a new edition should be out in the new year if all goes to plan. Even so the mistakes remain minor when compared to the mistakes in thermodynamics that we all have accepted at some point in our lives.
I do not fully understand what you are saying (I have to purchase your Physics Essays paper in order to read it), but have the following comments. In general, chemists and bio-chemists understand thermodynamics better than physicists. I refer you to Peter Atkins’s (Professor of Chemistry at Oxford) book, “Four Laws That Drive the Universe.” Entropy is increasing disorder. Entropy is always increasing, which raises the question as to what drives the formation of structure. Mass is structure. Energy is neither created nor destroyed; it can always be tracked in any interaction. Entropy is the reason that nothing (including us) lasts forever. Entropy has a direction, but no time parameter. Work is constantly being lost. The thing about work is that there must be a “cold sink”, i.e. a place where entropy can drive energy, in order for work to be done. If there is no “cold sink”, work will not take place. This is why we die of heat stroke. Our bodies are systems far from equilibrium, and in order to maintain that status, work must be done. Lost work does not equate to lost energy. If our atmosphere is relocated or dispersed, that is entropy at work. Expansion, which is a decrease in the structural density of mass, is entropy at work. I have pondered Shannon’s work on information. Information deteriorates over time – increasing entropy.
First of all, you can get a copy of my papers from my website: http://www.newthermodynamics.com – look for the heading “mypapers”.
What am I saying? One must understand that entropy was a term devised by Clausius (circa 1850s) as being something that, when multiplied by temperature, defines energy. The concept that energy was lost in mechanical systems may have originated with Carnot (Carnot engine: circa 1830). It was eventually understood by the 19th century greats that work was lost by expanding systems and that this work was PdV. And then along came Boltzmann’s eloquent mathematical analysis (the basis of statistical thermodynamics). I am not sure at which point exactly in all this eqn (1): TdS = dE + PdV was devised.
No matter; eqn (1) has been used to explain lost work. So our formulation became a correlation between entropy change (dS) and lost work.
My point is this: entropy change is not needed to explain lost work. Rather, lost work is the energy that is lost in the displacement of our atmosphere. It is easy to show, from work equals force times distance, that this is the same as PdV. Furthermore, it can be shown that PdV is the energy required to lift our atmosphere’s mass.
Now I can lift a rock, tie a rope to it and harness the potential energy from that rock’s elevation. The same cannot be done with gas molecules of our atmosphere. There is simply no way to harness our atmosphere’s potential energy increase. If volume is removed from the Earth’s surface then this results in a kinetic energy increase of our atmosphere which is really nothing more than adding heat to those molecules.
In other words the work in lifting our atmosphere is lost, hence is lost work.
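The force-times-distance argument above can be sketched with a piston of area A lifted a height dh against atmospheric pressure; the area and displacement here are assumed illustrative values:

```python
# W = F*d versus W = P*dV for a piston of area A pushed a height dh
# against the atmosphere. The area and displacement are assumed figures.
P = 101_325.0  # Pa, atmospheric pressure
A = 0.01       # m^2, piston area (assumed)
dh = 0.05      # m, upward displacement (assumed)

F = P * A              # force exerted across the piston face
W_force = F * dh       # work as force times distance
W_pdv = P * (A * dh)   # the same work written as P*dV

# Both forms give the same number, since A*dh is exactly dV.
print(f"F*d  = {W_force:.4f} J")
print(f"P*dV = {W_pdv:.4f} J")
```

The algebra is trivial, but it makes the identification W = F·d = P·dV explicit.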
What does this mean? First of all, it means that we do not have to start off with eqn (1). Instead we can write
eqn (2): d(TS) = dE + d(PV), and realize that (1) is just the isobaric, isothermal version of (2).
Eqn (2) is useful in processes wherein both the pressure and volume increase within a system, the prime example being bubble nucleation.
Now as for Atkins’ assertion that entropy is always increasing disorder: this is a second-law-of-thermo based argument. I previously stated that the second law has major issues. However, without even considering entropy or its associated second law, I can make the following argument. Dispersion will make molecules become more random. That’s right: just the fact that gaseous molecules collide and exchange momentum means that they will disperse. Of course, walls will keep them from forever parting ways, and without walls there is always gravity, the force that seemingly prevents the universe from being completely random. The force of life, if you like.
So entropy is not needed. Gravity keeps things together, at least close enough so that other forces of structure can persist.
My point becomes: entropy is not really needed. Of course, what about all those differentials that entropy has been associated with? Well, we start by questioning entropy’s importance in its original guise as an explanation for lost work. Once you show that to be bogus, then questions arise. And it is about time that we seek answers to such questions, because basing a science upon entropy, when entropy lacks clarity, is a rather dangerous approach.
Cold sink: Yikes, why not just remove entropy and realize that thermodynamics can be better understood without it? Sure, it can still be used as a mathematical contrivance by those who choose to do so, especially those in statistical thermo and physical chemistry. But if we now start asking the right questions, maybe we can redefine entropy so that it has clarity and true meaning. Of course, its importance will have diminished.
If our atmosphere disperses it is because molecules continue to collide while someone has shut off gravity.
Shannon should have used another term other than entropy. It has become like calling every man John, having a room full of Johns, and then asking John to step forward.
Cheers, and thanks for your great response
Kent
Your new thermodynamics seeks to define “entropy” or absence thereof in terms of the slowly changing energy of a system, i.e. the displacement of the atmosphere. But if you wish to use energy as a variable then you must also take into consideration more dynamic systems with rapidly changing energy which are referred to as dissipative systems and are much more complex. See for example A. Libchaber in Nonlinear Phenomena at Phase Transitions and Instabilities, ed. T. Riste (NY: Plenum, 1982). An increased order occurs in these systems that is not explained by your methods.
Dear Richard Oldani, thanks for your point on dynamic systems.
I do not differentiate between dynamic and infinitesimal systems. That is something that you have chosen to do. I do not blame you, as it is part of how we are poorly taught.
We are taught that eqn 1) TdS=dE+PdV is valid for systems that expand infinitesimally
First of all, its validity is enshrined in math.
I stated we should start with eqn 2): d(TS)=dE+d(PV)
The truth is this: (2) becomes:
TdS + SdT + (ΔT)(ΔS) = dE + PdV + VdP + (ΔP)(ΔV)
I apologize for putting deltas in the same equation as differentials, but it is done to make a point. The (ΔT)(ΔS) and (ΔP)(ΔV) terms only vanish if the changes are small in comparison to the actual parameters.
When that is the case then and only then does:
d(TS)=dE+d(PV) equate to TdS+SdT = dE + PdV + VdP
and then the isothermal isobaric version becomes eqn 1) TdS=dE+PdV
In other words from a purely mathematical sense eqn 1) is only valid for infinitesimal changes.
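The claim that eqn (1) holds only for infinitesimal changes can be checked numerically: for finite changes the identity Δ(TS) = TΔS + SΔT + (ΔT)(ΔS) is exact, and the cross term only becomes negligible when the changes are small. A quick sketch with arbitrary illustrative values:

```python
# Finite-change check: (T+dT)(S+dS) - T*S = T*dS + S*dT + dT*dS holds
# exactly for ANY finite changes; the cross term dT*dS fades away only
# when the changes are small relative to T and S themselves.
# The values below are arbitrary illustrative numbers.
T, S = 300.0, 5.0
dT, dS = 10.0, 0.2

lhs = (T + dT) * (S + dS) - T * S  # finite change in the product TS
rhs = T * dS + S * dT + dT * dS    # expanded form

print(abs(lhs - rhs) < 1e-9)                     # True
print(f"cross-term share: {dT * dS / lhs:.3f}")  # ~0.018 here
```

With these values the cross term is under 2% of the total change; make dT and dS comparable to T and S and it stops being ignorable, which is the mathematical point being made above.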
Yet eqn (1) is often applied to all sorts of changes by so many. It is baffling. It is probably why we often have difficulty in dealing with dynamic systems: traditional thermodynamics, which is based upon (1), cannot handle them.
Does this mean that eqn (2) is better suited for dealing with dynamic systems? I am willing to bet that is the case. However, it really does not change the concept that entropy is not needed. Dynamic systems can probably be better dealt with by throwing out entropy and simply saying:
Changes to a system’s energy = dE + d(PV)
Again no need for entropy
Another point concerning eqn 1) TdS=dE+PdV and infinitesimal change
When a system expands, it does work in upwardly displacing our atmosphere; if no energy is flowing into that system, then that system’s temperature decreases. Hence there is a temperature difference between the expanding system and its surroundings. Now, if the walls are not 100% insulated, then heat will flow back into the expanding system.
The point becomes that we often fool ourselves with infinitesimal changes, as heat does often flow back into our expanding system; we just do not realize it. Of course, if the system expands rapidly, then a noticeable temperature change will exist. It is just that we do not consider the system until after thermal equilibrium occurs. This too is a mistake in logic. I often refer to thermal energy given in this manner as “freely given” energy, that being energy given by our surroundings.
We have to stop fooling ourselves with terms like equilibrium, especially when energy is freely given. Sure, the surroundings (i.e. the atmosphere) are only acting as a heat bath, hence the energy extracted from the surroundings is not measurable; but just because it is not measurable is not to say that it does not exist.
Cheers and many thanks Kent
“Thermodynamics is a funny subject. The first time you go through it, you don’t understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don’t understand it, but by that time you are so used to it, so it doesn’t bother you anymore.” – Arnold Sommerfeld
I have studied various aspects of thermodynamics more than three times and to be sure, I am not always sure that I completely understand it so what I’m about to say should be taken in that context. I am no expert.
The challenges do not come from the basic definition of the four laws of thermodynamics (zero through three), but from the interdependence between them. To be certain, thermodynamics is one of the most fundamental empirical studies in all of physics, to which no exceptions have been found, with one quirk to which I shall return shortly. Therefore, from my perspective, I place the significance of our knowledge about thermodynamics above all other aspects of physics, including Newton’s primary axiom F = ma. This is because we know that all systems will behave in very predictable ways, even if we do not completely understand why. No scientific theory can make those claims; a theory is just the formulation of plausible explanations as to why, and no matter how rigorous, it is always under scrutiny. By way of example, we know that every experiment concerning F = ma and gravitational effects will always produce the same results under the same conditions, accurately to a first order approximation and generally better. We do understand some of the unknowns that can cause these experiments to be skewed: frictional forces, EM considerations, etc. But we also know that Newton’s laws are not the final end-all be-all when it comes to kinematics. Our theory needs revision. While we do not know exactly how to define gravity, energy, mass, etc., we know that Newton’s laws seem to need to be extended. Thermodynamics, on the other hand, needs only to be understood (IMHO). The basic laws of thermodynamics cover all known physical phenomena to the extent that we know all of the conditions of state, including initial and boundary conditions for dynamical systems, even if they can be confusing. I may be wrong here, so some examples where this fails would be helpful and educational.
Now there are two aspects of thermodynamics that are not complete. I would say that the implication of the definition of entropy is definitely one of them, but perhaps not because of the physics, for entropy in physics is defined. It is defined both by macro-states and micro-states. But if we look carefully at the micro-state definition, it helps explain the macro-state definition in that the concept of reversibility (the notion from which entropy is defined) must be supported by the constituents of the micro-states. It should appear self-evident, when we consider a large ensemble of particles, that the probability that every micro-state will be in exactly the same state when the ‘reversible’ system goes from macro state A to macro state B and then back to A must be less than unity. Whether we call this randomness or dispersion would depend on whether we are dealing with a non-isolated or an isolated system. If the system is isolated, then there is no dispersion, just ‘random’ changes in the micro-states of the system. Again, logic here tells us that because of this there are no truly reversible physical processes, only approximately reversible physical processes. The clear implication here is that what has been defined as entropy must increase, that is, the system cannot maintain the same order after undergoing a so-called reversible process. Here it matters not what happens to any excess energy, regardless of how it is either rendered inaccessible by a system or dispersed into the system’s surroundings. This is why thermodynamic processes can be applied to both open and closed systems, as long as care is taken to completely define the system, its boundaries, and any mass/energy exchange. The reason I say that this aspect of thermodynamics is not complete is because of how we teach it. We who teach it often either do not understand it or we have a preconceived notion of what it means (or doesn’t mean).
While this should not happen, particularly in science but also in academia, it does. There is a LOT of indoctrination happening these days, so we need to ensure that whatever revisions we make are not easy to misinterpret.
The second aspect of thermodynamics, which adds insult to injury when considering the entropy question, is the third law, which from a logical perspective conflicts with the concept of absolute zero when we consider the fifth state of matter, the Bose-Einstein condensate. This is the quirk that I alluded to earlier. I was originally taught that absolute zero implies no motion, but that is simply not what we see. Without proposing an explanation for this conundrum, as I do not have one that is appropriate in this space, it adds to the colloquial meaning of the concept of increasing entropy leading toward less order. I deliberately am avoiding the use of the term disorder here, since order in the traditional thermodynamic framework is defined by the initial micro-states of a system prior to any reversible process. This concept of less order is exactly the concept that has been latched onto by other scientific disciplines, perhaps with good reason.
I do not believe that the science here is faulty, just our understanding. So any attempt to clarify what thermodynamics means and what entropy means must be done carefully – and empirically. We all know that elegant mathematical theories can look beautiful but say nothing about reality – and some of them are even deceptive. I am not saying that we should or should not try to better define thermodynamics or any of its laws; we just need to be certain that we maintain rigor, logic, and truth.
Respectfully submitted,
-Joe Bova
A continuation of my reply to Joe’s great insights
Joe you state:
“But if we look carefully at the micro-state definition, it helps explain the macro-state definition in that the concept of reversibility (the notion from which entropy is defined) must be supported by the constituents of the micro-states. “
Joe, what I say does apply to both the micro and macro state. You just have to be willing to let go of entropy. I know that it is hard; an example will be discussed later in this reply.
Joe states: “It should appear self-evident that, when we consider a large ensemble of particles, that the probability that every micro-state will be in exactly the same state when the ‘reversible’ system goes from macro state A to macro state B and then back to A must be less than unity.”
Again you are hanging onto traditional taught statistical physics/thermo
Joe states: “Whether we call this randomness or dispersion, would depend on whether we are dealing with a non-isolated or an isolated system. If the system is isolated, then there is no dispersion, just ‘random’ changes in the micro-states of the system.”
No, Joe. Though I wonder whether dispersion is the right term (because it also applies to light).
Maybe it is semantics. In an isolated system dispersive forces will act: like hitting a cue ball into a rack in billiards, the balls will tend to evenly disperse, more so with every collision (assuming no holes/pockets for the balls to go into). Such forces/actions that disperse matter apply equally to all systems, assuming that there is a sufficient number of entities/balls/particles.
Joe states: “ Again logic here tells us that because of this there are no truly reversible physical processes, only approximately reversible physical processes. The clear implication here is that what has been defined as entropy must increase,”
Please let go of such entropy mindsets. A process is only reversible if no energy/work is lost and if the energy exchanges can be suitably controlled and directed. For example, if a system expands and thus displaces our atmosphere, then there is “lost work” and that process is not reversible.
A possible exception is pulling on a hermetically sealed syringe, wherein you create a vacuum; let go, and the syringe goes crashing back to its original position.
Remember, useful systems here on Earth tend to experience expansion, the formation of steam in the steam engine being a prime example. No need for entropy-based arguments.
Also, mechanical systems tend to have friction as a component of their motion. Friction can be viewed as uncontrollable heat loss, wherein heat radiates away. Unless you can extract this radiated heat and then control its direction of flow, friction means the process is irreversible: again, no need for entropy-based arguments.
Always remember that heat tends to radiate in all directions, so the vast majority of heat exchanges are irreversible. Again, no need for entropy.
Joe then states: “that is the system cannot maintain the same order after undergoing a so-called reversible process. Here it matters not what happens to any excess energy regardless of how it is either rendered inaccessible by a system or dispersed into the system’s surroundings.”
No need for entropy. Just say heat is lost due to our inability to control its direction of flow. This renders entropy a description of the result and removes it from being some reason for irreversibility. Or, if you prefer, do not even think in terms of entropy.
Joe states: “There is a LOT of indoctrination happening these days, so we need to ensure that whatever revisions we make, we should be sure it is not easy to misinterpret.”
Yes, Joe, there is. Try this example:
1) Draw a box.
2) You now label energy in with an arrow pointing inward: Ein
3) You now label energy out with an arrow at the other end: Eout
4) Now put an arrow vertical going upwards and call this lost work: Wlost
5) For the temperature change inside the box, write dT.
6) For the heat capacity within the box, write Cisometric.
Case 1) For the isometric (constant V) case the box’s volume remains constant: Hence no work is lost into the surroundings thus: “lost work” = 0
The change in energy (dEbox) of the above box becomes:
dEbox = Cisometric dT = Ein – Eout
This process is reversible because there are no energy losses. This is of course assuming that we can control the direction and flow of energy, and that there is no friction.
Case 2) Let us consider that the box expands, hence work is lost onto the surrounding atmosphere. Since lost work does not equal zero, we now can write
dEbox = Cisometric dT = Ein – Eout – Wlost
This process is now not reversible because the work lost into the surrounding atmosphere is permanently lost to the system. Of course if there was friction involved then the amount of energy lost would increase. So even if we could control and direct the energy flow, this process will always be irreversible.
We can further our knowledge by realizing that the difference between isobaric (constant P) and isometric (constant V) heat capacities is the work lost into our atmosphere.
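The two box cases, and the heat-capacity difference just mentioned, can be sketched numerically. Below is a minimal Python illustration of my own (not from Kent's papers); it assumes an ideal gas, for which the isobaric and isometric heat capacities differ by nR, so the "lost work" per heating step is nR·dT = P·dV.

```python
# Sketch of the two box cases, assuming an ideal gas so that the
# isobaric and isometric heat capacities differ by n*R (Mayer's relation).
R = 8.314  # gas constant, J/(mol*K)

def dE_box_isometric(C_isometric, dT):
    # Case 1: constant volume -- no work lost to the surroundings
    return C_isometric * dT

def dE_box_isobaric(C_isometric, n, dT):
    # Case 2: the box expands; the extra isobaric term is the
    # "lost work", W_lost = n*R*dT = P*dV for an ideal gas
    W_lost = n * R * dT
    return C_isometric * dT + W_lost, W_lost

# One mole of a monatomic ideal gas heated by 10 K (illustrative numbers):
n, dT = 1.0, 10.0
Cv = 1.5 * n * R                      # isometric heat capacity
dE_total, W_lost = dE_box_isobaric(Cv, n, dT)
print(dE_total, W_lost)               # ~207.9 J total, of which ~83.1 J is "lost"
```

The point of the sketch is only bookkeeping: the isobaric case needs exactly nR·dT more input than the isometric case, and that surplus is the work done displacing the surroundings.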
Now the above principles apply to both macro and micro systems, unless the micro system is so small that we are only talking about a couple of molecules. When only talking about a couple of molecules, the same principles only apply if we repeat the process a large number of times and then take a mean or average.
My point remains: you do not need entropy to explain irreversibility, whether you are dealing with a micro or macro state.
Cheers and thanks again for your valuable time.
Kent
Oops: case 2) in my previous/above reply to Joe should be written in terms of the isobaric heat capacity (and not the isometric heat capacity), that being:
dEbox = Cisobaric dT = Ein – Eout – Wlost
I apologize
Kent
Kent,
No problem to your comment below…
I will probably respond more completely at a later time, but this caught my eye:
You said
“Please let go of such entropy mindsets. A process is only reversible if no energy/work is lost and if the energy exchanges can be suitably controlled and directed. For example if a system expands thus displaces our atmosphere therefore there is “lost work” and that process is not reversible…”
“Suitably controlled and directed” implies engineered intervention, which is okay since there are no 100 percent truly reversible processes. But just saying that there is “lost work” IMHO says something less than “entropy increased”. Perhaps it is my philosophical bent, but lost work (at least not at this time) does not convince me that the system is any less capable, but higher entropy does. Note that my ‘dissident’ position on most science does not mean that all current physics is wrong, just the pseudo-science stuff where we believe and preach something as fact even though it has no experimental support. I will get more into why I think the philosophical part is important if we cannot replace that part of the entropy implication within your framework. Also, I embrace some of the other uses of the term entropy that lie outside of thermodynamics, because the idea of things running down or decaying over time in everything we see makes very much sense to me. Entropy carries that implication – “lost work” only fits within the context of thermo.
-Joe
Dear Joe, thanks again for the reply.
Lost work means just that: energy which is lost. Hence a system that loses work is less capable of performing energy-related tasks, such as the work required to turn the wheels of a steam locomotive.
One must keep in mind that the conceptualization of the relation between entropy change and lost work was indeed meant to explain the work lost by expanding systems, as in eqn 1): TdS = dE + PdV.
Another issue is that we/science then inadvertently associated entropy change with all work. This leads to misconceptions. Just call the work what it is, “work”, whether it be a force-times-distance sort of work or an increase in the potential energy of a rock, etc. The point is that the intermixing of all types of work into the beast called entropy actually complicates matters.
Joe states: “but lost work (at least not at this time) does not convince me that the system is any less capable, but higher entropy does”. Now, Joe, I realize that this is a case of severe indoctrination. But instead of thinking a system is more capable if it has higher entropy, why not think of it as being more capable if it has higher energy?
Moreover, herein shows a complication of reality. When you speak of higher entropy being more capable, you should really say that a system with a higher heat capacity is more capable at a given temperature; in this way, with temperature multiplied by heat capacity equaling energy, we are all talking the same language, one of energy. Whereas if you think of entropy as the randomness of molecules in incessant motion (Boltzmann’s analogy), then the more entropy a system has, the more random it is and the more work it has done, hence the less capable the system has become. To better understand, consider eqn 1), where an entropy increase is associated with an isobaric volume increase; should we not then think that a system whose entropy is high has become less capable?
No matter, the point remains: energy defines a system’s capability, not entropy. Energy is as clear as water, while entropy is mud, depending on your interpretation of entropy (randomness or heat capacity).
Like I have said, in thermodynamics we do not even need entropy to understand its fundamentals at any level. Reversibility is about a system which loses no energy and in which the flow of energy can be readily controlled, or at least naturally flows in both directions.
Lose energy in any form and the process is not reversible. If energy cannot flow both ways without energy loss or input, then it is not reversible.
Cheers and all the best Kent
Dear Joe
I used Sommerfeld’s quote in my book. I love it.
Newton’s primary axiom, F = ma, easily leads to W = PdV.
And what follows is that this is the work required to displace our atmosphere.
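As a minimal numeric illustration of W = PdV (my own example, with an assumed one-litre expansion at standard atmospheric pressure):

```python
# Work done against the atmosphere by a small expansion, W = P*dV.
# The one-litre figure is an illustrative choice, not from the text.
P_atm = 101325.0   # Pa, standard atmospheric pressure
dV = 1.0e-3        # m^3 (one litre)
W_lost = P_atm * dV
print(W_lost)      # ~101.3 J of "lost work" displacing the atmosphere
```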
What I am saying is that there is a simpler way to envision the vast majority of thermodynamics without even touching entropy. It is basically as simple as Newton’s axiom. So, based upon Occam’s razor, which theory is right? Luckily I can explain phenomena that traditional thermodynamics cannot, so in some ways I should have an edge.
Joe states: “for entropy in physics is defined. It is defined both by macro-states and micro-states.” It may be well defined mathematically, but not logically. No matter; I am stating that thermodynamics simplifies once you throw out entropy. Let me explain another way. This journey started in the 1990s when I became interested in bubble nucleation. Because of entropy-based thermodynamics, basically all nucleation processes were thought of in terms of an isometric pressure change:
Energy required for nucleation = dE+VdP
But bubbles have dramatic volume changes. Anyhow, in my 2004 paper (Energetics of nucleation) I discussed how one could explain the energy required for bubble nucleation using:
Energy required for nucleation = dE+d(PV)
And I was the first person to properly explain bubble nucleation. Of course, the equation applies to all nucleation processes.
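The contrast between the two expressions can be sketched for a growing bubble. All numbers below are illustrative placeholders of mine, not values from the paper; surface and bond energy terms are set to zero so only the mechanical terms are compared.

```python
# Contrasting the two nucleation expressions for a growing bubble.
# Illustrative placeholder numbers; dE (bond/surface terms) ignored.
dE = 0.0
P1, V1 = 101325.0, 0.0         # bubble starts with negligible volume
P2, V2 = 105000.0, 1.0e-9      # modest pressure rise, finite final volume

E_isometric = dE + V1 * (P2 - P1)       # dE + V*dP: blind to the volume growth
E_full = dE + (P2 * V2 - P1 * V1)       # dE + d(PV): captures the expansion
print(E_isometric, E_full)
```

With a negligible starting volume, the V·dP form yields zero mechanical energy, while the d(PV) form picks up the bubble's dramatic volume change; that difference is the whole point of the argument above.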
So I spent the next few years rewriting thermodynamics, and I kept on getting into circular arguments. I then realized that the lost work was due to the upward displacement of our atmosphere. This then allowed me to throw out entropy. Once I did, thermodynamics became a constructive, logical science.
In other words, entropy complicated the science. Sure, it is used to explain so much, although no one knows what entropy is. But entropy-based thermodynamics cannot explain bubble nucleation: that rare process wherein both volume and pressure increase. That is correct, entropy-based thermodynamics cannot deal with bubble nucleation in a simple way.
Yet bubble nucleation is a simple process, explainable in simple terms.
Joe, you talk about the third law. Can I simply say that there are other plausible explanations? I just cannot let too many cats out of the bag at once, as it complicates things. We must start by showing, in the simplest terms, that traditional thermodynamics is flawed and, moreover, that it can be readily simplified.
Just so that you know, Joe, I did not throw entropy out easily. Due to my indoctrination, it was the hardest thing I have ever done. But once I threw it out (circa 2009), realizing that it complicated a simple science, then and only then did I begin a new, simpler understanding. I do not know everything and certainly will need help in making what I say into a complete science. BTW, it took me several years of convincing myself about entropy. It was not easy by any stretch of the imagination.
I can imagine how hard this will be for many. Hopefully there will be those who see what I am saying. Trust me when I say you do not need entropy to explain what we see, whether it be thermodynamics, cosmology, the solubility of gases, life, and the list goes on. How sciences like physical chemistry and physics will be restructured, well, I have ideas, but they are only a new beginning. Opportunities for you all, if you prefer. But I repeat: let us start by simplifying the fundamentals.
Cheers and many thanks for your interest
Kent
Hi Kent,
First, when I composed this (offline) I did not see your second response, so I will let this fly and then read what appears to be a more comprehensive response 🙂
I took some time to read your paper “The Second Law and Lost Work”, so I have a better idea of what you seem to be saying; let me frame my question as a scenario to get your response. Note that what I am proposing here is a thought experiment where we are assuming that a truly isolated system can exist – something we both know is not truly possible. This is analogous to the hypothetical Maxwell’s Demon apparatus, but without the demon…
Imagine that you have a completely sealed rigid container that is perfectly insulated, such that the walls cannot dissipate any thermal energy to the container’s surroundings. Furthermore, inside this container are two substances which, when combined, will exothermically react with each other in such a way that new substances will be produced, exhausting all of both of the original substances. These two substances are held apart by a frictionless door that can be moved by simply changing the orientation of the container (we could use a draw-string type mechanism, but let’s keep it theoretically sealed) to start the reaction. Finally, let’s assume that the total volumetric heat capacity, c_v (not sure how to make a subscript here yet), is unity to simplify everything.
From a traditional thermodynamics perspective, once the isochoric reaction starts, both pressure and temperature will increase. Without going through all the mathematical steps, we can say that when the reaction completes, the total change in entropy will be ΔS = ln(T2/T1) because c_v is unity. Because the temperature will increase, the entropy will increase. We also know that the reaction is not reversible from the problem definition, so the amount of ‘order’ has changed and become less organized (this is of course subject to philosophical debate – but the reagents are clearly in a lower energy state even if the system temperature has increased). Note that I selected this scenario specifically since there is no sink for the ‘entropy’ (no atmosphere in which to expand the increased pressure and dissipate the excess heat) and the process is not reversible.
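The ΔS = ln(T2/T1) bookkeeping is easy to check numerically. A quick sketch with illustrative temperatures of my own choosing:

```python
import math

# Isochoric entropy change with unit heat capacity, dS = c_v * ln(T2/T1),
# here with c_v = 1. The temperatures are illustrative only.
T1, T2 = 300.0, 600.0
dS = math.log(T2 / T1)
print(dS)   # ~0.693 > 0: entropy rises as the temperature rises
```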
How do you propose we handle the entropy concept in this situation? Consider that a power-plant (steam engine, Rankine engine, hydro-electric generator…), which is an open system, is still subject to the same type of processes for at least the compression phase except that both mass and energy are permitted to pass through the system boundary; but when the energy source has been exhausted the power-plant ceases to operate.
I’m always open to considering ideas that I can completely wrap my mind around. And to be fair, I have not had time to read your entropy paper.
-Joe
Dear Joe: First of all, thanks for reading my paper.
I sort of realize what you are saying.
Joe states: “Because the temperature will increase, the entropy will increase.” Now, I repeat, I hate the word entropy because it is meaningless. No matter, let us play the entropy game. Is it not possible that the entropy does not increase? I mean, if your exothermic reaction is such that the number of molecules does not change, and if you have the same volume, then it is just the same number of molecules (in a different form) at an increased temperature. Of course, herein I am considering entropy in its Boltzmann context, that being the randomness of molecules.
Let us continue playing games
1) What if the reaction is between gases and we have fewer gas molecules after the exothermic reaction than before? Fewer molecules means lower pressure (ideal gas law).
Yet we have a higher temperature because the reaction was exothermic. Has the entropy increased or decreased? It becomes an interesting headache. Sure, you can start crunching equations, but think of it: entropy change is generally thought of as an isobaric, isothermal process, so be careful what equation you use. My point herein becomes that you must know what the temperature change is vs. the pressure change due to a decrease in the number of gaseous molecules. Is entropy really needed herein? No, not really. You can calculate the expected pressure decrease due to the decrease in the number of molecules and then calculate the expected pressure due to the temperature increase. What about entropy? I am not a physical chemist, but my gut says you are simply complicating things.
2) What if we have more gaseous molecules after the exothermic reaction than before?
Again the temperature increases, and the pressure increases due to both the temperature increase and the increase in the number of molecules. Now I ask: why even bring entropy into this mess? You can calculate the expected pressure increase due to an increase in the number of gaseous particles plus the pressure increase due to the temperature increase. Again, what about entropy? Should we care?
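The bookkeeping for case 1) above can be done directly with the ideal gas law; the mole counts and temperatures below are illustrative placeholders of mine.

```python
# Case 1): fewer molecules but higher temperature, bookkept with the
# ideal gas law (P = n*R*T/V) instead of entropy. Illustrative numbers.
R, V = 8.314, 1.0e-3           # J/(mol*K); fixed rigid volume in m^3
n1, T1 = 2.0, 300.0            # before the exothermic reaction
n2, T2 = 1.0, 450.0            # after: half the molecules, hotter gas

P1 = n1 * R * T1 / V
P2 = n2 * R * T2 / V
print(P2 / P1)                 # 0.75: pressure falls despite the heating
```

With these numbers the halving of the molecule count outweighs the 50% temperature rise, so the final pressure is lower, which is exactly the "interesting headache" described above, computed without any entropy term.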
Joe states: “We also know that the reaction is not reversible from the problem definition so the amount of ‘order’ has changed and become less organized”. Yes, order has changed, but my point remains: toss out entropy and the concept of order (corollary to randomness), because it can differ depending upon how, and/or by whom, the system in question is being considered.
I have to emphasize that what I say completely changes the way one perceives chemical reactions. And herein I seek the help of an open-minded physical chemist, but where do I find one who does not believe in entropy? Seeking such a person may take forever.
Joe states: “Consider that a power-plant (steam engine, …”
Taken from my book. I am sorry, but my figure did not show. Anyhow, I hope that the words still make sense.
Often an engine functions as a closed system during the power stroke (the work-producing part of the cycle), and as an open system during other parts of the cycle. Consider the cyclic engine illustrated in Fig. 1.8.10, wherein a high-pressure gas first drives the piston in one direction, and then drives the piston back in the opposite direction. The system of high-pressure gas can be due to: 1) a lower mean molecular volume occupied by the gas molecules, and/or 2) a higher mean kinetic energy of the gas molecules than that associated with the surrounding atmospheric gas. Note: an engine whose cycle is illustrated in Figures 1.8.10 and 1.8.11, where water is boiled to create the high-pressure gas, is commonly known as a “steam engine”.
Such engines function by starting with valves 1 and 4 being open, causing the high-pressure gas to be on the L.H.S. of the piston thus driving the piston to the right, as shown in Fig. 1.8.10. When the piston makes it to the end of the cylinder, valves 1 and 4 close, while valves 2 and 3 open. With the high-pressure gas now on the R.H.S. of the piston, the piston is driven back in the other direction, i.e. towards the left.
Now consider that the piston is massless and frictionless, and that the engine performs no actual mechanical work. In this case, the higher pressure simply pushes the hot gas’s kinetic energy through the piston onto the surrounding atmosphere, resulting in a continuous upward displacement of the Earth’s atmosphere, i.e. Wlost = PdV, plus the continuous exhausting of warm gases into the atmosphere, which happens irrespective of which direction the piston is moving.
In real life, such cyclic engines experience friction while running and are constantly displacing the Earth’s atmosphere, hence the lost work, PdV, is constant. Combine this with the fact that only a portion of the input energy can be involved in the contribution of momentum onto the piston. Plus the fact that most gases involved are water vapours, hence are not monatomic, hence not all of the gas’s thermal energy is simply kinetic. All this helps explain why steam engines tend to have such low efficiencies.
Cyclic Inefficiency & Combustion
Of course in a multi-cylinder engine, one piston-cylinder is compressing while another is expanding. How that affects the overall scheme of things will require some modeling and due consideration. Even so we can still draw some conclusions.
Does the internal combustion engine (ICE) suffer the same demise as the steam engine? Reconsider the previously discussed cycle of the four-stroke engine. Obviously, unlike the steam engine, work is not continually lost, due to the continuous displacement of the atmosphere. But the energy loss is still substantial!
In order to understand, we must consider the various steps, realizing that temperature increases (T↑) will result in energy loss due to the consequential outflow of thermal radiation. During both the intake stroke (Step 1) and the exhaust stroke (Step 7), the only power loss should be a T↑ due to any viscous dissipation of the flowing gas. During the closing and opening of valves (Steps 2, 6 & 8), expect no loss.
During the compression stroke [Step 3)]: P↑, T↑; a volume of the atmosphere is compressed, which requires energy. Interestingly, the surrounding atmospheric gases should experience a change of their potential energy into a gain of kinetic energy as the atmosphere’s volume decreases. This is not lost energy from the engine’s perspective, but it results in heat being added to the atmosphere.
During Step 4, the explosion of the fuel–air mixture creates heat: T↑. During the power stroke [Step 5)], the rapid expansion of gas occurs in a closed system, causing the upward displacement of the atmosphere; hence during the power stroke an increase to the surrounding atmosphere’s potential energy occurs, signifying lost energy.
And all of the above can be added to our previous discussion concerning the inherent inefficiencies of a piston-cylinder apparatus.
In the above, perhaps you can now see how I am thinking. Thanks again, Joe, for the great questions.
Kent
Hi Joe
Based upon your question concerning the Rankine cycle, I decided to write a small narrative of how one could view it in terms of what I am saying. Let me know if it makes sense:
Rankine Cycle
The Rankine cycle closely emulates the Carnot cycle. The Rankine cycle is often used in power plants, wherein electricity-generating steam turbines are powered by anything from a nuclear reaction to the combustion of oil, natural gas, or coal. Water is commonly used as the working fluid, with the steam turbines typically operating around 838 K (565 °C), and condensation occurring at slightly above room temperature, i.e. above 300 K.
The Rankine cycle in power plants generally consists of four steps:
Step 1) Water is pressurized via a pump. Since water is incompressible, the energy required is considered minimal in comparison to the power generated.
Step 2) The high-pressure water enters a constant pressure & volume boiler, where it is heated into a dry saturated steam. Energy is required to break the liquid’s bonds.
Step 3) The dry saturated steam is then allowed to expand, thus spinning the turbine and generating power. As the steam expands it not only turns the turbine but also displaces Earth’s atmosphere, hence does work, hence the steam’s pressure and temperature both decrease.
Step 4) The vapours then enter a condensate chamber, wherein they return to the liquid state. Herein, changes in water’s bonding energy can be extracted.
If the process were as efficient as possible, the efficiency would still be limited because of step 3). This is due to the lost work (PdV) required to displace our atmosphere’s weight as the expanding steam drives the turbine. In a real power plant there would be additional energy losses, such as frictional losses in flowing fluids, heat losses to the surroundings, as well as frictional losses in machinery.
Cheers Kent
Hello Kent,
I didn’t mean to question your conclusions, which I believe are justified and correct. It is more fundamental to define something using energy, as opposed to spatial coordinates, as was done for disorder. Because of it you are able to define the system more precisely than in current definitions of entropy. I posted my reply a bit prematurely, thinking maybe you had some thoughts different than mine on the origins of dissipative structures.
With regards to what you did write in your paper, I wish to congratulate you for having found a flaw in the foundations of physical theory, of which I believe there are many. Physical theory was pasted together by many different people. If it wasn’t questioned at first then it just remained the way it was, and everybody had to get used to it, which, as you point out, is exactly what happened in the case of thermodynamics. I count the uncertainty principle as another of those glaring flaws: absolute space is used to define uncertainty when energy would be a more precise measure.
Thanks again Richard
To be honest, I have not given dissipative structures any real consideration. I suppose that the Earth’s atmosphere could be one, that being an open system whose real constraining force is gravity, which somewhat contains the atmosphere to Earth. Even so, I have never given such structures much thought in terms of the science of dissipative structuring.
I started with nucleation theory and realized its fundamentals were poorly conceived, all due to a misunderstanding of Gibbs’s fundamental 350-page paper concerning inhomogeneous systems. I then wrote a paper that no one wanted; it ended up at Physics Essays (recommended by the editor of American Physics). Physics Essays was not going to publish until I found proof. I found the irrefutable proof in data sets by other researchers, and got published, only after being called names: first lesson in human indignity. Of course, those researchers whose data I used snubbed me when I contacted them: a quick second lesson in human indignity. And then, strangely, my four-day-old (new) computer, which had never been online until I contacted these folks, got attacked: an even faster third lesson in human indignity. And my main working computers have never gone online since.
I then basically finished writing a book (500–700 pages) giving a whole new perspective to nucleation theory, tensile layers, probabilities, etc. But I never published the book, because I realized I still had to figure out what is wrong with thermodynamics.
And finally, several years later, I figured out what was wrong with thermodynamics. And I was in a daze when I realized that it was entropy. The beast. In hindsight it actually makes sense that the whole complication of the simple has its origins in the one parameter that completely lacks clarity.
I am sorry, but I have not given the uncertainty principle much thought. Perhaps it may be similar to my claim for Boltzmann’s constant (k), that being that it is only truly valid here on Earth. Change the gravitational field and k’s value changes. I am not as sure concerning Heisenberg’s thoughts.
For me now comes the tough part: spreading the word and having people actually open their minds. Sadly, I lack the marketing machinery required and/or the finances to set one up. Hopefully every blog helps; thus I thank the NPA for this opportunity.
So it is a daunting mess that I face. All help and expressions of appreciation are thanked.
Cheers and so many thanks, especially for spending the time to read what I write.
Kent
Hi Kent,
A thought provoking write-up on entropy.
Despite the lack of clarity of what entropy is, and notwithstanding whether it is a mathematical contrivance as you suggest, I have a few questions for you.
1. I wonder whether you would agree that introduction of energy into a pressureless system (so that PdV = 0) would increase entropy of that system?
2. Whether the scenario in 1. above can be mathematically described by
TdS = dE, which can be rearranged and also written dS = dE/T
where dS is the change in entropy, dE is the amount of energy introduced or removed, and T is the temperature in kelvin prevailing at the time of energy change.
3. If you agree with 1. and 2. above, what will be the mathematical, if not physical consequence of energy introduction in a system at absolute zero temperature, i.e. T = 0?
Akinbo
Hi Akinbo
Akinbo states: “I wonder whether you would agree that introduction of energy into a pressureless system (so that PdV = 0) would increase entropy of that system?”
Actually, entropy has little or no place in a zero-pressure system discussion. First of all, work has to be done onto something; hence work cannot be done onto a vacuum.
I discuss this in my paper titled: “Improving our thermodynamic perspective” which can be found:
1) http://physicsessays.org/browse-journal-2/product/226-4-pdf-kent-w-mayhew-improving-our-thermodynamic-perspective.html
2) On my website http://www.newthermodynamics.com under the heading of “mypapers”
I found it interesting that after my paper was published I saw that others on the internet discuss that work cannot be done onto a vacuum. I like to think that I initiated this thought process.
Since one of entropy’s early conceptions is eqn 1), TdS = dE + PdV, increasing a volume of nothingness (pressure = 0) does not change that volume’s entropy. I again must emphasize that you need not bring entropy into such arguments. They can be made purely by stating that the isobaric, isothermal work is W = dE + PdV. And from here you can argue that such work cannot be done onto a zero-pressure volume.
Akinbo states: Whether the scenario in 1. above can be mathematically described by
TdS = dE, which can be rearranged and also written dS = dE/T
where dS is the change in entropy, dE is the amount of energy introduced or removed, and T is the temperature in kelvin prevailing at the time of energy change.
You are now talking in circles. Let me explain: TdS was equated to work. Now, if I take TdS = dE + PdV and say P = 0, hence TdS = dE, you start walking the dangerous ground that traditional thermodynamics has walked. A path that lends itself to the complication of the simple.
Let me explain in more detail
Scenario 1): When I write W = dE + d(PV), I am saying that dE is the change in energy of everything other than that associated with pressure and volume (PV), i.e. the mechanical parameters. This only makes sense; so herein dE is associated with things like chemical bonds, the energy to form tensile layers, electric fields, etc.
What is great in the above way of thinking is that you can explain everything that we witness in thermodynamics, including bubble nucleation, the Rankine cycle, cosmology, etc. And of course the isobaric case for work becomes W = dE + PdV. And when there is no pressure, we assume that there is no matter within that described volume, and if there is no matter then dE = 0. And the universe and all that we know remains fairly simple. Of course, no pressure could also imply no gravity, but we will keep it simple.
Scenario 2): TdS was equated to work. Now, if you take TdS = dE + PdV, say P = 0, and then write TdS = dE, you are thinking in a convoluted manner, the sort that traditional thermodynamics employs. Herein dE now signifies the energy changes within a system, which may or may not include the mechanical capabilities associated with volume and pressure, an illogical consequence.
Anyhow, this all becomes my point. Thermodynamics can be dealt with using scenario 1). And when I say this, I mean all of thermo. Forget about entropy and write in terms of energy, with work being
W = dE + d(PV), and isobaric work being PdV, that being the energy required to displace our atmosphere.
Another way of viewing this is to consider the Rankine cycle.
Once the steam forms, we can write its potential to do work as (PV)initial, and once the gas has expanded, its potential to do work has diminished to (PV)final. So the work done is d(PV), and this equals the work required to turn the turbine plus the energy required to lift our atmosphere (PdV, where P is the atmospheric pressure).
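That d(PV) split can be sketched numerically. The pressures and volumes below are illustrative placeholders of my own, not real power-plant figures:

```python
# The d(PV) split just described: the steam's drop in PV "potential"
# equals the turbine work plus the work of lifting the atmosphere.
# All numbers are illustrative placeholders.
P_atm = 101325.0               # Pa
P_i, V_i = 6.0e6, 0.05         # steam before expansion (Pa, m^3)
P_f, V_f = 1.2e5, 0.50         # after expanding through the turbine

d_PV = P_i * V_i - P_f * V_f           # total work done by the steam
W_lost = P_atm * (V_f - V_i)           # spent displacing the atmosphere
W_turbine = d_PV - W_lost              # what remains to turn the turbine
print(W_lost, W_turbine)
```

With these placeholder numbers, a substantial slice of the steam's d(PV) goes to lifting the atmosphere rather than to the turbine, which is the "lost work" limit on efficiency described in the Rankine narrative above.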
Sorry, I am getting off topic.
Sadly, we have all been taught scenario 2): that unforgiving mishmash of illogical derivatives that we shuffle around until we get a result that resembles our empirical findings. I know that this is hard, but it is this indoctrination that renders thermodynamics into an extremely complex science, and that has to go.
Akinbo then states: “If you agree with 1. and 2. above, what will be the mathematical, if not physical consequence of energy introduction in a system at absolute zero temperature, i.e. T = 0?”
Akinbo, there are issues with our understanding of T = 0. And no, I do not have all the answers, but I will try to start you on the right path. We get our energy from the sun (for the most part), and our atmosphere/earth/oceans are nothing more than grandiose heat sinks/baths.
The thermal energy that our heat sinks contain is from the sun. And this, to a first approximation, obeys Wien’s law, which also means it obeys the Rayleigh–Jeans approximation. Herein I am considering thermal energy to be that energy which, when absorbed by condensed matter, becomes intermolecular and intramolecular vibrations within that condensed matter.
So our grandiose heat baths have thermal energy defined by (proportional to) the Rayleigh–Jeans approximation, which states that the energy density is proportional to temperature. And because of this, our thermodynamic relations tend to be proportional to temperature (at least for the most part).
So now we have a reason for the energy densities that we witness being proportional to temperature, and if you want to call it entropy, whatever; at this point I would choose another word.
Now, when we start getting to temperatures approaching absolute zero, the thermal energy densities will start doing funny things. I repeat that I do not have all the answers, but it is in the category of these funny things that your question is based. I have ideas written in the next (revised) edition of my book, but I also hope that others will contribute to these ideas, because my ideas are only in the preliminary stages.
No matter, hopefully you can see where I am going. But the most important realization has to be that I need no entropy-based arguments to explain what we witness. Sure, I accept that something, when multiplied by T, defines thermal energy. Should we still call it entropy? Perhaps. I certainly prefer heat capacity to entropy, because that leaves no mystery as to what I am saying, that being the energy contained within this system of matter.
Cheers and thanks for the great questions. Sorry if I babbled too much.
Kent
Hi Akinbo
Akinbo states: I wonder whether you would agree that introduction of energy into a pressureless system (so that PdV = 0) would increase the entropy of that system?
Actually, entropy has little or no place in a discussion of a zero-pressure system. First of all, work has to be done onto something; hence work cannot be done onto a vacuum.
I discuss this in my paper titled “Improving our thermodynamic perspective”, which can be found:
1) http://physicsessays.org/browse-journal-2/product/226-4-pdf-kent-w-mayhew-improving-our-thermodynamic-perspective.html
2) On my website http://www.newthermodynamics.com under the heading “mypapers”
I found it interesting that, after my paper was published, I saw others on the internet discussing how work cannot be done onto a vacuum. I like to think that I initiated this thought process.
Part of entropy’s early conception is founded in eqn 1): TdS = dE + PdV. By that equation, increasing a volume of nothingness (pressure = 0) does not change that volume’s entropy. I again must emphasize that you need not bring entropy into such arguments. The argument can be made purely by stating that isobaric isothermal work is W = dE + PdV, and from here you can argue that work cannot be done onto a zero-pressure volume.
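To make the isobaric work term W = PdV concrete, here is a minimal numeric sketch (the one-cubic-metre volume change is an illustrative number of my choosing; the pressure is the standard sea-level value):

```python
# Hedged sketch: isobaric work W = P*dV needed to upwardly displace the
# atmosphere when a system at sea level expands. Numbers are illustrative.

P_ATM = 101_325.0  # standard atmospheric pressure, Pa


def isobaric_work(p: float, dv: float) -> float:
    """Work (J) done against a constant external pressure p (Pa)
    by a volume change dv (m^3): W = P * dV."""
    return p * dv


# Expanding by one cubic metre against the atmosphere:
w = isobaric_work(P_ATM, 1.0)
print(f"W = {w:.0f} J")  # 101325 J, roughly 101 kJ of "lost work"
```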
Akinbo states: Whether the scenario in 1. above can be mathematically described by
TdS = dE, which can be rearranged and also written dS = dE/T
where dS is the change in entropy, dE is the amount of energy introduced or removed, and T is the temperature in kelvin prevailing at the time of energy change.
If you take TdS = dE + PdV, set P = 0, and hence write TdS = dE, you start walking the dangerous ground that traditional thermodynamics has walked: a path that lends itself to the complication of the simple.
Let me explain:
Scenario 1) When I write W = dE + d(PV), I am saying that dE is the change in energy of everything other than that associated with pressure and volume (PV), i.e. the mechanical parameters, which are the parameters that can physically do work. In this context dE is associated with things like chemical bonds, the energy to form tensile layers, etc.
What is great about this way of thinking is that you can explain everything we witness in thermodynamics, including bubble nucleation, the Rankine cycle, cosmology, etc. And of course the isobaric case for work becomes W = dE + PdV. When there is no pressure, we assume that there is no matter within the described volume, and if there is no matter then dE = 0. The universe and all that we know remains fairly simple. Of course you could argue that if we remove gravity then we can have zero pressure, but let us keep things simple.
Scenario 2) When you equate TdS with work, then take TdS = dE + PdV, set P = 0, and write that this leaves TdS = dE, you are thinking in the convoluted manner that traditional thermodynamics employs. Herein dE now signifies the energy changes within a system, which may or may not include the mechanical capabilities associated with volume and pressure. And you end up with a complicated logic that cannot necessarily explain all.
Anyhow, this is my point: thermodynamics can be dealt with using scenario 1). And when I say this, I mean all of thermodynamics.
For example, consider the Rankine cycle at the point where the steam is ready to turn the turbine. The change in the ability of that gas to do work is d(PV), or if you prefer:
(PV)initial – (PV)final
And this change in the ability to do work equals the energy delivered to the turbine plus the work required to displace our atmosphere, that being PdV. Hence:
d(PV) = PdV + Wturbine
Assuming that the only energy lost is that used to displace our atmosphere, then:
optimal efficiency = Wturbine/(PdV + Wturbine)
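The bookkeeping above can be sketched numerically. This is a minimal illustration of the relation d(PV) = PdV + Wturbine under the stated assumption that the only loss is atmospheric displacement; the PV values are invented, not real Rankine-cycle data:

```python
# Minimal sketch of the accounting d(PV) = P*dV + W_turbine, assuming the
# only lost work is the isobaric displacement of the atmosphere (P*dV).
# All numbers are illustrative.

def turbine_work(pv_initial: float, pv_final: float,
                 p_atm: float, dv: float) -> float:
    """W_turbine = d(PV) - P*dV, with d(PV) = (PV)_initial - (PV)_final."""
    d_pv = pv_initial - pv_final
    return d_pv - p_atm * dv


def optimal_efficiency(w_turbine: float, p_atm: float, dv: float) -> float:
    """Efficiency = W_turbine / (P*dV + W_turbine)."""
    return w_turbine / (p_atm * dv + w_turbine)


w = turbine_work(pv_initial=5.0e5, pv_final=1.0e5, p_atm=101_325.0, dv=1.0)
eta = optimal_efficiency(w, p_atm=101_325.0, dv=1.0)
print(f"W_turbine = {w:.0f} J, efficiency = {eta:.2f}")  # 298675 J, 0.75
```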
Sadly, we have all been taught scenario 2): an unforgiving mishmash of illogical derivatives that we shuffle around until we get a result resembling our empirical findings. I know this is hard to accept, but it is this indoctrination that renders thermodynamics into an extremely complex science, and it has to go.
Akinbo then states: If you agree with 1. and 2. above, what will be the mathematical, if not physical consequence of energy introduction in a system at absolute zero temperature, i.e. T = 0?
Akinbo, there are issues with our understanding of T = 0. No, I do not have all the answers, but I will try to start you on the right path. We get our energy from the Sun (for the most part), and our atmosphere/earth/oceans are nothing more than grandiose heat sinks/baths.
The thermal energy that our heat sinks contain is from the Sun. To a first approximation this obeys Wien’s law, which also means it obeys the Rayleigh-Jeans approximation. Herein I am considering thermal energy to be that energy which, when absorbed by condensed matter, becomes intermolecular and intramolecular vibrations within that condensed matter.
So our grandiose heat baths have thermal energy defined by (proportional to) the Rayleigh-Jeans approximation, which states that the energy density is proportional to temperature. And because of this, our thermodynamic relations tend to be proportional to temperature (for the most part).
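The linearity in T is easy to check: in the Rayleigh-Jeans approximation the spectral energy density is u(ν, T) = 8πν²kT/c³, so at any fixed frequency doubling T doubles the density. A small sketch (standard constants; the chosen infrared frequency is an illustrative value of mine):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s


def rayleigh_jeans_u(nu: float, t: float) -> float:
    """Rayleigh-Jeans spectral energy density:
    u(nu, T) = 8*pi*nu^2*k_B*T / c^3."""
    return 8.0 * math.pi * nu**2 * K_B * t / C**3


# Doubling T doubles the energy density at any fixed frequency:
nu = 1.0e13  # an infrared frequency, Hz (illustrative)
ratio = rayleigh_jeans_u(nu, 600.0) / rayleigh_jeans_u(nu, 300.0)
print(ratio)  # 2.0 -- linear in temperature
```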
So now, without entropy, we have a reason for the energy densities we witness being proportional to temperature.
Now, as we approach temperatures near absolute zero, the thermal energy densities will start doing funny things. I repeat that I do not have all the answers, but it is in the category of these funny things that your question is based. I have ideas written in the next (revised) edition of my book, but I also hope that others will contribute to them. Basically, at such low temperatures the thermal energy would no longer be directly proportional to temperature; this is not much different from blast-furnace temperatures, wherein thermal energy is also no longer directly proportional to temperature.
No matter, hopefully you can see where I am going. The most important realization has to be that I need no entropy-based arguments. Sure, I accept that something, when multiplied by T, defines thermal energy. Should we still call it entropy? Perhaps. I certainly prefer heat capacity, because that leaves no mystery as to what I am saying.
Cheers, and thanks for the great questions.
Kent
Dear Kent,
You are obviously very knowledgeable in thermodynamics, which is the reason I posed those questions. I agree the entropy subject needs a lot of clarification to make it properly scientific.
I have a write-up pending with the site’s administrators (Early cosmic densities as Mother Nature’s thorn in the flesh of cosmologists). When approved I will discuss more on “energy density is proportional to temperature”, and “Now when we start getting to temperatures approaching absolute zero, the thermal energy densities will start doing funny things…”
Suffice it for the moment to say that a Big Bang is a funny thing, and a universe beginning from nothing may just require a scenario where funny things can occur.
All the best.
Akinbo
Hi Akinbo
Thanks for the comment
Concerning other people’s consideration of energy density being proportional to temperature: I would be interested in what others say.
I must emphasize that I am talking about THERMAL energy density being proportional to T, namely because our Sun’s blackbody radiation at roughly 6000 K makes it so. Remember, thermal energy is mainly in the infrared spectrum, and it is that part of the spectrum whose energy density can be approximated so that it explains the temperature dependence, i.e. the linear proportionality.
If I took the energy density for the whole spectrum from the Sun, then I would no longer be considering just thermal energy, and the whole energy density is certainly not linearly proportional to temperature. I hope that you understand this, or at least that I was clear about how I am thinking.
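To put a number on the 6000 K figure: Wien’s displacement law, λmax = b/T with b ≈ 2.898 × 10⁻³ m·K, places the Sun’s emission peak near visible light, with much of the emitted energy tailing off into the infrared. A quick sketch:

```python
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K


def peak_wavelength(t_kelvin: float) -> float:
    """Wien's displacement law: lambda_max = b / T, in metres."""
    return WIEN_B / t_kelvin


# Sun's surface at roughly 6000 K:
lam = peak_wavelength(6000.0)
print(f"{lam * 1e9:.0f} nm")  # 483 nm, in the visible band
```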
Anyhow still lots to learn
Thanks again
Kent