Science topic

# Entropy - Science topic

The measure of that part of the heat or energy of a system which is not available to perform work. Entropy increases in all natural (spontaneous and irreversible) processes. (From Dorland, 28th ed)
Questions related to Entropy
• asked a question related to Entropy
Question
How can we integrate a statistical method (relative Shannon entropy) with remote sensing and GIS to quantify the urban growth patterns of mountain towns?
Shannon's entropy can be computed for a spatial phenomenon in order to quantify the built-up (impervious) area, i.e. urbanization. A higher value of the overall entropy for the whole urban area represents a higher dispersion of the impervious area, which gives an indication of urban sprawl.
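As an illustration of the computation, here is a minimal sketch of relative Shannon entropy over spatial zones. The zone areas are hypothetical; in a real study they would come from classified satellite imagery processed in a GIS:

```python
import math

def relative_shannon_entropy(built_up_areas):
    """Relative Shannon entropy of built-up area across n spatial zones.

    Values near 1 indicate dispersed growth (sprawl); values near 0
    indicate compact growth concentrated in a few zones.
    """
    total = sum(built_up_areas)
    n = len(built_up_areas)
    # p_i: share of the total impervious area found in zone i
    probs = [a / total for a in built_up_areas if a > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(n)  # normalize by ln(n), the maximum possible entropy

# Hypothetical built-up areas (ha) in 4 buffer zones around a town centre
compact = [90, 5, 3, 2]       # growth concentrated near the centre
dispersed = [25, 25, 25, 25]  # growth spread evenly across zones
print(relative_shannon_entropy(compact))    # well below 1
print(relative_shannon_entropy(dispersed))  # exactly 1.0
```

In practice the zones are often concentric buffers or administrative wards, and the entropy is tracked over several image dates to see whether sprawl is increasing.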
• asked a question related to Entropy
Question
How can I plot approximate entropy, permutation entropy, and a sensitivity analysis for a chaotic map to check its randomness and complexity? I tried to code these tests but couldn't get a good result. Any help would be appreciated.
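A minimal, self-contained sketch of one of these measures, permutation entropy (the Bandt-Pompe ordinal-pattern method), applied to the logistic map as an example chaotic map; approximate entropy would be coded in the same spirit. The embedding order and map parameters here are illustrative:

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3):
    """Normalized permutation entropy: 1 means fully random ordinal structure."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # ordinal pattern = argsort of the window values
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

# Logistic map x_{n+1} = r x_n (1 - x_n) in its chaotic regime (r = 4)
x, r, series = 0.1, 4.0, []
for _ in range(2000):
    x = r * x * (1 - x)
    series.append(x)
print(permutation_entropy(series))  # high, but below 1: chaos has forbidden patterns
```

To "plot" the behavior, one typically sweeps the map parameter r, computes the entropy at each value, and plots entropy versus r; a truly random sequence would sit near 1 for all orders.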
• asked a question related to Entropy
Question
We know that entropy is a measure of disorder and that it always increases, never decreases; the arrow of time moves forward, not in reverse.
Now my question: when we are born we are in a highly ordered state. As children our skin is tight and shiny; then we grow up, become young, grow old, our faces fade, and everything becomes more disordered, as we know. But at the same time we reproduce: we give birth to a baby, which is a highly ordered form. If entropy always increases, going from low to high, then the entropy of our baby is again low compared to ours, which was high. Doesn't this break the rules of entropy?
Imagine a food factory. At the beginning the factory is in a highly ordered state; over time the entropy of the factory increases, meaning the factory moves toward disorder.
The factory itself is in a lower state of disorder, while the food produced by the factory is in a highly disordered state.
The reason I am saying this is that the rule is "the entropy of an isolated system cannot decrease": it speaks only of a single system, not of its subsystems; the entropy of each subsystem is different.
Thank you
Kind regards,
Ritik bhardwaj
• asked a question related to Entropy
Question
And can you reference articles or texts giving answers to this question?
Reviewing the literature would be helpful before considering whether to update the 2015 ideas.
...Just a bit more to the answer by colleague V. V. Vedenyapin:
in the Boltzmann-Planck formula S = k*ln(W), with W = (1 + x^K) and x = T/Tc, T stands for the absolute (Kelvin) temperature, Tc is a temperature scale, and K is the efficiency of the process under study.
About 100 years ago, in the Journal of the American Chemical Society, Dr. George Augustus Linhart published the formal statistical inference of the above fact.
I have tried to answer the poser you have posted here - consequently and in detail:
Shorter versions:
• asked a question related to Entropy
Question
Entropy is one of the concepts used in textbooks and literature in a variety of descriptions and interpretations that often confuse students and teachers.
Recently, I published a manuscript in which I tried to simplify our understanding of the concept of entropy, relate it to potential energy and give a general description of entropy in the context of chemical reactions and biological processes.
Thanks, Olivier Denis. Did you get a chance to read the manuscript? I will be happy to get your feedback.
• asked a question related to Entropy
Question
On February 5, 2021, Casper A. Helder posted the question "Can the second law of thermodynamics be abandoned?" I was surprised to read that the second law (together with the first law), the only two laws that have remained unchanged since their formulation by Clausius in 1867, are boldly doubted. Moreover, the entropy, whose propensity to grow is the second law, is not even mentioned, and therefore I wrote a short cynical answer. To my surprise, every few days since I started following this question more people add answers, and as of today there are 6618 reads and 386 answers! For example, Henning Struchtrup, a professor at the University of Victoria working in the field, wrote: "No doubt one can criticize Carnot or Clausius but one should not forget that they were at the very beginning". Struchtrup received 13 recommendations for his answers. I wonder what causes a respected scientist in this field to say that there is no doubt that Carnot's and Clausius's works are problematic.
From reading part of the answers including Helder's argumentation, I believe that somehow in the last century, the definition and therefore the meaning of the second law was forgotten. Hereafter, I will summarize the definition of the second law and its immediate consequences.
2nd Law Definition: In any irreversible process, the entropy S increases.
Irreversibility: If we have a reservoir at a temperature T and one adds an amount of energy Q along an irreversible route, its entropy increases by ΔS > Q/T. If the process is reversible, then the entropy increase is ΔS = Q/T. This inequality is called the Clausius inequality.
The amount of energy added to or removed from a reservoir is a measurable quantity. However, we see that along an irreversible path "T" is smaller than T along a reversible path of the same system. Therefore, we cannot define temperature, and hence entropy, for a system along an irreversible route.
Equilibrium: If a closed system is left undisturbed for a long time, its entropy will increase to its maximum. The Clausius inequality means that energy flows from hot to cold, and therefore in equilibrium all the subsystems of an ensemble have equal temperature, i.e. all degrees of freedom have the same amount of energy; in an ideal gas, every degree of freedom of any molecule has energy kT/2. Here k is the Boltzmann constant (the gas constant divided by the Avogadro number) and T is the temperature. Moreover, in equilibrium, all the microstates (the distinguishable configurations of the ensemble) have an identical amount of energy. Therefore,
Temperature and entropy are defined only for systems in equilibrium: to find the temperature of an ensemble in equilibrium, one could take a single molecule, measure its energy, and know the temperature. However, this is seldom the case. Usually there are "hotter" molecules and "colder" ones, and this is the reason why both entropy and temperature are defined only in equilibrium. This ambiguity about "temperatures" out of equilibrium causes all kinds of anomalous behaviors, like supercooling and the Mpemba paradox, and it is why applying the 2nd law to microscopic systems is difficult. These phenomena confirm the Clausius inequality and the second law rather than disproving them.
Maximum entropy and quantum theory: Since in equilibrium the entropy of a system is maximal, every ensemble tends to reach equilibrium spontaneously. Therefore, one can calculate many properties of an ensemble by maximizing the statistical expression of the entropy (the Max Entropy technique). Planck did such a calculation for EM radiation in equilibrium with a material body and found the quantized nature of energy. Does abandoning the 2nd law mean giving up quantum theory? Can physics without entropy exist?
I post this question to find out if there is any concrete scientific evidence or argumentation that may cause scientists to declare that there is serious criticism against Clausius's inequality, Carnot's efficiency, and the second law.
If you would read my question, you would understand that I am a second law defender. I wrote several papers about information theory and thermodynamics and I agree that information is entropy. I think that the problem today is that people are not reading past papers!
• asked a question related to Entropy
Question
I carried out a similar reaction in two reactors (a batch reactor and a microwave reactor); Gibbs free energy is, of course, negative in both cases (same reaction). However, in the MW reactor the TΔS term becomes larger and ΔG becomes more negative (higher entropy) than with conventional heating.
Moreover, the thermodynamic advantage provided by the MW is realized at lower temperatures where the free energy (ΔG = ΔH -TΔS) of the MW reaction becomes more negative. The fact that the MW-driven reaction has a negative ΔG at lower temperatures than the CH stems from its significantly lower value of ΔH, as ΔH will be less than - TΔS at lower temperatures. Therefore, at a lower temperature, the reaction with a microwave-driven reaction will become more favourable than the CH reaction as (-TΔS)CH > (-TΔS)MW
This is becoming a chemistry debate. Chemists look at the chemical makeup of substances and investigate how they behave under different conditions.
• asked a question related to Entropy
Question
I invite anyone to participate in an open discussion of the latest "findings" in black-hole research. The motive for this thread is a set of articles that appeared in the September 2022 issue (pp. 26-51) of Scientific American magazine under the title "Black Hole Mysteries Solved".
I have proposed a new way of thinking about Nature/Reality, the NCS (Natural Coordinate System) (https://www.researchgate.net/publication/324206515_Natural_Coordinate_System_A_new_way_of_seeing_Nature?channel=doi&linkId=5c0e3a7d299bf139c74dbe81&showFulltext=true), and I would ask whether you recognize any basic distinction between the above preprint (and the following appendices) and the articles in Sci. Am. This thread is intended to be an open discussion forum, with respect to both time and subject, for the latest results of black-hole research, in order to advance new perspectives based on NCS and to put the proposals of NCS to public assessment.
In order to seed points of arguments, I picked up some phrases from the articles of SciAm in comparison to phrases or references from NCS preprint.
1. “Paradox Resolved” by G. Musser. “Space looks three-dimensional but acts as if it were two-dimensional.” (p.30) → NCS (p.11-13, 49-52).
2. - “It says that one of the spacial dimensions we experience is not fundamental to nature but instead emerges from quantum dynamics” (p.31) → NCS (p.11-13).
3. - “Meanwhile theorists think that what goes for black holes may go for the universe as a whole” (p.31) → NCS (p.31-38, 46-47).
4. “Black Holes, Wormholes and Entanglement” by A. Almheiri- “The island itself becomes nonlocally mapped to the outside” (p.39) → NCS (p.44-47), https://www.researchgate.net/publication/345761430_APPENDIX_18_About_Black_Holes?channel=doi&linkId=5facf0fe299bf18c5b6a0d4d&showFulltext=true .
5. “A Tale of Two Horizons” by E. Shaghoulian- The whole article is about BH-Horizon, Holographic Principle, Observer, and Entropy → NCS (p.31-38, 44-47, 54-61, 6-7), https://www.researchgate.net/post/What_is_Entropy_about_Could_the_concept_of_Entropy_or_the_evaluation_of_its_magnitude_lead_us_to_the_equilibrium_state_of_a_system .
6. "Portrait of a Black Hole" by S. Fletcher. The article is about the history of the observation of Sagittarius A* (the BH at the center of the Milky Way galaxy). There is no obvious connection with NCS.
PS. This discussion is NOT open for new “pet-theories” apart from NCS.(!!!)
All the articles mentioned are just essays about how not to do the calculation that has to be done (summing over metrics) while trying to guess the answer. Unfortunately the guesses, as expected, fail at some point.
• asked a question related to Entropy
Question
Dear research colleagues,
when I calculate the entropy S° of crystalline materials from heat capacity (Cp) measurements from (almost) 0 K to 298 K by integrating Cp/T, I have "the feeling" that I end up with a quite precise/reliable value of S° for the respective material. However, when I compare such a value with others in the literature (e.g. obtained from vapor pressure measurements), I frequently find a significant discrepancy between them; most of the time the value from the Cp measurement is considerably lower (by >20 J/(mol*K)) than the other values.
Now my question is whether my feeling is "right" that the values obtained from Cp measurements are actually more trustworthy, or whether there are some problems with this way of determining the entropy that I'm not aware of.
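For reference, the Cp/T integration itself can be sketched as follows. This is a minimal trapezoidal version; a real workflow would also add a Debye T^3 extrapolation below the lowest measured temperature, which is omitted here:

```python
def third_law_entropy(temps, cps):
    """Trapezoidal integration of Cp/T from the lowest measured T to the highest,
    approximating S(T_max) - S(T_min).  Below T_min a Debye extrapolation
    (Cp ~ a*T^3, so the missing entropy is ~ Cp(T_min)/3) is normally added."""
    s = 0.0
    for i in range(1, len(temps)):
        f0 = cps[i - 1] / temps[i - 1]
        f1 = cps[i] / temps[i]
        s += 0.5 * (f0 + f1) * (temps[i] - temps[i - 1])
    return s

# Sanity check against an exact case: for Cp = b*T, the integrand Cp/T is the
# constant b, so S(T1) - S(T0) = b*(T1 - T0) exactly.
b = 0.1
temps = [float(t) for t in range(5, 299)]   # 5 K .. 298 K, 1 K steps
cps = [b * t for t in temps]
print(third_law_entropy(temps, cps))  # ~ 0.1 * (298 - 5) = 29.3 J/(mol K)
```

Discrepancies of the size described are often traced to the low-temperature extrapolation, to missed phase transitions inside the integration range, or to residual (zero-point) entropy that the calorimetric route cannot see.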
• asked a question related to Entropy
Question
In our theoretical studies we calculated the entropy and heat capacity of the electron gas in semiconductor nanowires. However, we cannot find enough experimental results (references) to compare our results with.
Electronic measurements of entropy in meso- and nanoscale systems
• asked a question related to Entropy
Question
Hello.
I am trying to calculate enthalpy of mixing of a HEA alloy, using the dHmix equation from
I am trying to calculate a known alloy, CoCrFeNiAl, with a known value of dHmix (-12.32 kJ/mol), using values from http://www.entall.imim.pl/calculator/ since, by definition, Hmix(i,j) is taken for equimolar pairs.
The value I get is -18.94 kJ/mol.
I am attaching a xlsx file with all the numbers.
Have you done DSC/TGA/DTA for the sample?
• asked a question related to Entropy
Question
The phrase in the Title line imitates Karl Popper’s All Life is Problem Solving.
Since thermodynamics plays a role in life processes, it was surprising that searching “All life is thermodynamics” on Google on August 16, 2022 gave no results.
Don’t organisms seek to optimize and preserve the entropy of their internal energy distribution? And to optimize their use of energy and outcomes based on energy inputs? Aren’t survival and procreation ways of preserving previous products of energy use?
Is there justification for the statement, All life is thermodynamics? Or is the statement too simple to convey any insight?
Schrodinger in What is Life referred to thermodynamics, statistical mechanics; chapter 6 is Order, Disorder and Entropy. And more recently there is: J. Chem. Phys. 139, 121923 (2013); doi: 10.1063/1.4818538 Statistical physics of self-replication by Jeremy England.
Possibly the phrase is part of the dogma of our knowledge of physics (for now)... Perhaps the correct phrase would be "life as we know it..."
• asked a question related to Entropy
Question
If one uses a coordinate transformation, say t -> a t' +b_i x^i, does it change the thermodynamic quantities of a black hole, say entropy, temperature and others?
The general question is: does a coordinate transformation change the Smarr relation (the generalized form of the first law of black hole thermodynamics)?
Mojtaba Shahbazi But you changed the vectors, so in this regard it's not the physical senses of thermodynamics and the various preservation laws, just the distributions of space and time. In another regard, the detectors only have amplification effects on the detections, not the 1:1 energy collisions with them :)
But philosophically, they don't change. What has changed is only our understandings, perceptions on the detections, and acceptance on the inexact methods of observational astronomy (or observational cosmology).
• asked a question related to Entropy
Question
I want to compute entropy of system in nanofluid filled in a square cavity.
Dear Shan Ali Khan, Cavity problem is new for me
• asked a question related to Entropy
Question
I am trying to calculate the change in enthalpy and entropy for a heat transfer fluid (HTF) for modeling purposes. I have one pressure-enthalpy diagram or chart for the HTF and I am also using software 'EES' for the calculation of change in enthalpy and entropy.
For the chart, the reference state at 0 C saturated liquid is: h=43 kJ/kg and s=0.175 kJ/kg.K.
For the software, the reference state at 0 C saturated liquid is: h=200 kJ/kg and s=1 kJ/kg.K.
When I calculate the change in h or s using each method (i.e. chart and software), I get different results. Is it normal that the change in h or s is different when different reference states are used? Because of this I have two sets of power and efficiency calculations which are different from one another under the same conditions.
Hi Petr Lepcio. Thank you for the response. I understand that different reference states will lead to different values of enthalpy and entropy. Now I want to understand the change between two arbitrary states and the use of the scaling equation. For example, in my case, I am calculating the enthalpy change between the inlet and outlet of the turbine. The difference between the enthalpies at the inlet and outlet of the turbine calculated from the thermodynamic chart and from 'EES' is not the same. How must I incorporate the scaling equation in this case to match the results from each method? Thanks again.
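To see why a pure reference-state shift cannot explain the discrepancy, consider this minimal sketch (the enthalpy numbers are invented for illustration): any constant offset added to every tabulated h cancels when a difference is taken, so if Δh between two states differs between the chart and EES, the underlying property correlations must differ, not just the reference states.

```python
# Two property tables for the same hypothetical fluid, differing only in the
# reference state offset added to every enthalpy value (kJ/kg).
chart = {"turbine_in": 843.0, "turbine_out": 643.0}    # referenced to h_ref = 43
ees = {k: v - 43.0 + 200.0 for k, v in chart.items()}  # same data, h_ref = 200

dh_chart = chart["turbine_in"] - chart["turbine_out"]
dh_ees = ees["turbine_in"] - ees["turbine_out"]
print(dh_chart, dh_ees)  # both 200.0: a pure reference shift cancels in differences
```

So no scaling equation is needed for differences; if the two Δh values disagree, check that both sources use the same equation of state and units for the HTF.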
• asked a question related to Entropy
Question
I want the procedure to simulate the entropy generation in the enclosure with the effect of natural convection and radiation heat transfer using FLUENT / ANSYS.
More details can be found in
• asked a question related to Entropy
Question
I am conducting latent class analysis with 8 binary indicators and a sample size of 6757 observations. The entropy for the 2-class, 3-class, and 4-class solutions is very low (around 0.54). But when I reduce the sample size and drop some of the observations randomly, I obtain better entropies: for example, the entropy for a sample size of 4500 is 0.68, and for 4280 it is 0.75. (For all of the sample sizes, BIC suggests that 3 classes is the best number of classes.) I am wondering: is it acceptable to reduce the sample size to get a better entropy? Is there any justification for reducing the sample size in this case? Any hint would be of great help. Thanks
Entropy summarizes the average probabilities of each individual's assignment to the classes. If you drop part of the sample, the entropy will certainly change, since those individuals' assignments are no longer included in the calculation. So the question is: what criteria did you use to exclude these individuals? Who are the excluded individuals? The answers might give you the justification for exclusion; for instance, you might exclude all the outliers.
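For concreteness, the relative entropy reported by LCA software is commonly given as E_K = 1 - Σ_i Σ_k (-p_ik ln p_ik) / (n ln K), computed from the posterior class-probability matrix. A minimal sketch (the probability rows below are illustrative, not from any real model):

```python
import math

def relative_entropy(posteriors):
    """Relative entropy E_K = 1 - sum_i sum_k (-p_ik ln p_ik) / (n ln K).
    posteriors: one row per observation, each row the posterior class probabilities."""
    n = len(posteriors)
    k = len(posteriors[0])
    total = sum(-p * math.log(p) for row in posteriors for p in row if p > 0)
    return 1 - total / (n * math.log(k))

# Perfectly certain assignment (0/1 probabilities) gives entropy 1;
# maximal uncertainty (uniform probabilities) gives entropy 0.
print(relative_entropy([[1.0, 0.0], [0.0, 1.0]]))  # 1.0
print(relative_entropy([[0.5, 0.5], [0.5, 0.5]]))  # ~0.0
```

This makes clear why dropping observations with fuzzy assignments mechanically raises entropy: the low-certainty rows are exactly the ones pulling the sum up.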
• asked a question related to Entropy
Question
Is that right? (From an article, Using graph theory to analyze biological networks.)
Things flow in biological systems: energy, nutrients, blood, air, as examples.
A graph is a set of points with connections.
A graph is like a photograph. Biological systems are like movies. If that analogy is valid (well, maybe it is not?), then graphs are not the optimal way to model biological systems; it is necessary to also model flow.
The mathematical discipline which underpins the study of complex networks in biological and other applications is graph theory. It has been successfully applied to the study of biological network topology, from the global perspective of their scale-free, small world, hierarchical nature, to the zoomed-in view of interaction motifs, clusters and modules and the specific interactions between different biomolecules. The structure of biological networks proves to be far away from randomness but rather linked to function. Furthermore, the power of network topology analysis is limited, as it provides a static perspective of what is otherwise a highly dynamic system, such that additional tools should be combined with this approach in order to obtain a deeper understanding of cellular processes.
• asked a question related to Entropy
Question
How to select etchant for high entropy alloys?
go through the above site hope you get the information you were looking for
• asked a question related to Entropy
Question
My sample is La0.95Sr0.05MnO3; another sample, La0.67Sr0.33MnO3, has one peak in the entropy change vs. temperature curve. Thanks for explaining the physics of the problem to me.
I'm not so sure, but even if there are two phases, the second phase won't have much effect on your magnetic profile because its quantity is so small, as shown by your refinement. I think the changes are due to the partial substitution, grain or particle size, defects, or maybe occupancy.
• asked a question related to Entropy
Question
The general consensus about the brain and various neuroimaging studies suggest that brain states exhibit variable entropy levels under different conditions. On the other hand, from the thermodynamic point of view entropy is an increasing quantity in nature, and biological systems seem to contradict this law for various reasons. This can also be thought of as the transformation of energy from one form to another. This situation makes me think about the possibility of the existence of distinct energy forms in the brain. Briefly, I would like to ask:
Could we find a representation for the different forms of energy rather than the classical power spectral approach? For example, useful energy, useless energy, reserved energy, and so on.
If you find my question ridiculous, please don't answer, I am just looking for some philosophical perspective on the nature of the brain.
Hi,
The mitochondrion in cells is a powerhouse of energy. There are some articles on the topics of your interest:
Jeffery KJ, Rovelli C. Transitions in Brain Evolution: Space, Time and Entropy. Trends Neurosci. 2020;43(7):467-474. doi:10.1016/j.tins.2020.04.008
Lynn CW, Cornblath EJ, Papadopoulos L, Bertolero MA, Bassett DS. Broken detailed balance and entropy production in the human brain. Proc Natl Acad Sci U S A. 2021;118(47):e2109889118. doi:10.1073/pnas.2109889118
Carhart-Harris RL. The entropic brain - revisited. Neuropharmacology. 2018;142:167-178. doi:10.1016/j.neuropharm.2018.03.010
Sen B, Chu SH, Parhi KK. Ranking Regions, Edges and Classifying Tasks in Functional Brain Graphs by Sub-Graph Entropy. Sci Rep. 2019;9(1):7628. Published 2019 May 20. doi:10.1038/s41598-019-44103-8
Tobore TO. On Energy Efficiency and the Brain's Resistance to Change: The Neurological Evolution of Dogmatism and Close-Mindedness. Psychol Rep. 2019;122(6):2406-2416. doi:10.1177/0033294118792670
Raichle ME, Gusnard DA. Appraising the brain's energy budget. Proc Natl Acad Sci U S A. 2002;99(16):10237-10239. doi:10.1073/pnas.172399499
Matafome P, Seiça R. The Role of Brain in Energy Balance. Adv Neurobiol. 2017;19:33-48. doi:10.1007/978-3-319-63260-5_2
Engl E, Attwell D. Non-signalling energy use in the brain. J Physiol. 2015;593(16):3417-3429. doi:10.1113/jphysiol.2014.282517
Kang J, Jeong SO, Pae C, Park HJ. Bayesian estimation of maximum entropy model for individualized energy landscape analysis of brain state dynamics. Hum Brain Mapp. 2021;42(11):3411-3428. doi:10.1002/hbm.25442
• asked a question related to Entropy
Question
How can I calculate entropy, and exergy efficiency in ANSYS FLUENT?
The article "Exergy efficiency" in Wikipedia is not correct. Have a look at the following references:
Tsatsaronis, G., "Thermoeconomic Analysis and Optimization of Energy Systems," Progress in Energy and Combustion Systems 19 (1993), pp. 227-257
A. Bejan, G. Tsatsaronis, and M. Moran, Thermal Design and Optimization, J. Wiley, New York, 1996
Lazzaretto, A. and Tsatsaronis, G., “SPECO: A Systematic and General Methodology for Calculating Efficiencies and Costs in Thermal Systems”, Energy – The International Journal 31, (2006), pp.1257-1289.
• asked a question related to Entropy
Question
I am trying to add entropy generation for a hybrid nanofluid in my research article. What should I note before and after? Please suggest some articles to get a clear idea, and a MATLAB program to plot graphs for entropy.
The command below may be applied:
se = pentropy(xt)
se = pentropy(x,sampx)
se = pentropy(p,fp,tp)
se = pentropy(___,Name=Value)
[se,t] = pentropy(___)
pentropy(___)
--------------------
Plot the spectral entropy of a signal expressed as a timetable and as a time series.
Generate a random series with normal distribution (white noise).
xn = randn(1000,1);
Create time vector t and convert to duration vector tdur. Combine tdur and xn in a timetable.
fs = 10; ts = 1/fs; t = 0.1:ts:100; tdur = seconds(t); xt = timetable(tdur',xn);
Plot the spectral entropy of the timetable xt.
pentropy(xt)
title('Spectral Entropy of White Noise Signal Timetable')
• asked a question related to Entropy
Question
During denaturation, hydrogen bonds and hydrophobic interactions are broken. This results in an increase in entropy, even at the highest severity of molecular breakdown. Although denaturation can be reversible through renaturation, it is generally not possible to restore the protein to its original form, so solubility can be reduced and biological activity lost.
My question: if a natural ingredient such as the sago caterpillar has a potentially high protein content, it could be used as an innovation in the form of a preparation. Given the effects of denaturation described above, can denaturation interfere with the process, or even have a negative impact, once the preparation has been taken into the human body?
Hi, Ammanda
• asked a question related to Entropy
Question
1. The gas diffuses into a vacuum: dQ = 0, dS = dQ/T = 0, so the entropy in the diffusion process cannot be calculated: S(T1).
2. If S(T1) has no physical meaning, then S(T0) and S(T1) have no physical meaning.
This is a screenshot from the scientific and technical literature. The theory of the second law of thermodynamics is inconsistent with experiment, which for a theory is equivalent to a death sentence. As a result, many people are still making silly excuses.
Remember: this is not an isolated case. There are many more examples.
• asked a question related to Entropy
Question
Could anyone help me? Thanks
I've analysed a dataset including 11 continuous variables and 1 categorical variable for 1031 patients. I tried to classify these patients into two classes by latent class analysis using Mplus 8.3, but the entropy was 1. These are the details of the results:
Information Criteria
Akaike (AIC)                   39043.715
Bayesian (BIC)                 39122.805
(n* = (n + 2) / 24)
Class Counts and Proportions
Latent
Classes
1              597          0.57625
2              439          0.42375
CLASSIFICATION QUALITY
Entropy 1.000
VUONG-LO-MENDELL-RUBIN LIKELIHOOD RATIO TEST P<0.001
With just 2 classes, classification accuracy can be quite high (not a lot of error in the classification since there are only 2 choices for classifying individuals). The high entropy value may reflect the fact that your participants are for the most part assigned to their latent class with high certainty given their response patterns.
You can take a look at the average latent class assignment probabilities for the most likely class membership. I would expect these to be high in your case, perhaps > .9. I find the assignment probabilities to be more intuitive and informative than the entropy summary measure.
• asked a question related to Entropy
Question
I am currently doing research on high entropy alloys with varying Al-Ni content, and I was wondering what the process is for selecting the most suitable etchant. I currently use aqua regia, and it often over-etches, which can be a bit of a burden.
Dear Kevin Jun. Nowadays I research composite materials of the Al-Ni system produced by additive manufacturing. In my research, I used Keller's etch (2.5 ml HNO3, 1.5 ml HCl, 1 ml HF, 95 ml H2O) to reveal the macro- and microstructure, etching by immersion. My samples have 80-95% aluminum. Etching lasted 15 seconds.
• asked a question related to Entropy
Question
I have done an MRF and a sliding-mesh analysis of a centrifugal pump. I need to know how to calculate and show the entropy production.
Dudley J Benton, thank you so much
• asked a question related to Entropy
Question
I'm looking for a good etchant for OM analysis of a high entropy alloy. The composition is Cr15Cu5Fe20Mn25Ni35.
hi
please find the attachment
best wishes
• asked a question related to Entropy
Question
- The entropy of the attributes of the alternatives for each criterion is a measure of the significance of that criterion. It is believed that the lower the entropy of a criterion, the more valuable the information the criterion contains.
- Criteria Importance Through Inter-criteria Correlation (CRITIC): in the CRITIC method, the standard deviation is the measure of the significance of a criterion.
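The entropy weighting described in the first bullet can be sketched as follows, for a positive decision matrix; the matrix values are hypothetical:

```python
import math

def entropy_weights(matrix):
    """Entropy weighting for an m x n decision matrix (m alternatives, n criteria).
    Criteria with lower entropy (more dispersion across alternatives) get more weight."""
    m, n = len(matrix), len(matrix[0])
    weights = []
    for j in range(n):
        col = [row[j] for row in matrix]
        s = sum(col)
        probs = [x / s for x in col if x > 0]
        e = -sum(p * math.log(p) for p in probs) / math.log(m)  # entropy in [0, 1]
        weights.append(1 - e)  # degree of diversification of criterion j
    total = sum(weights)
    return [w / total for w in weights]

# 3 alternatives x 2 criteria: criterion 2 discriminates, criterion 1 does not
matrix = [[5, 1], [5, 5], [5, 9]]
print(entropy_weights(matrix))  # nearly all weight goes to criterion 2
```

A constant column carries no information (entropy 1, weight 0), which is exactly the intuition behind both the entropy method and, via dispersion, CRITIC.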
• asked a question related to Entropy
Question
Chaos leads to order, but can entropy be stopped? Please explain if you can find the time.
At extremely low temperatures the entropy S tends to a constant, and therefore to a small value. That is called the Nernst theorem, or the 3rd law of thermodynamics.
The growth of S can be slowed by decreasing the number of degrees of freedom. At very high temperatures molecules translate and rotate; if T is decreased, the rotational degrees of freedom are suppressed and the growth of S slows down, since the number of microscopic rotational states Nrot -> 0.
S = kB ln W, and usually W ~ e^N, but the real problem is that N is a huge number of microscopic states in a system, so if T -> 0 K then N can be reduced.
Please look at the numerical example in the following Open Learn resource:
Moore, Justin Shorb, Xavier Prat-Resina, Tim Wendorff, E. V., John W., & Hahn, A. (2020, November 5). Thermodynamic Probability W and Entropy. Chemical Education Digital Library (ChemEd DL). https://chem.libretexts.org/@go/page/49564
Please look at the value of W in Example 16.5.1: Entropy
For Nernst theorem, see:
cc Wiki:
• asked a question related to Entropy
Question
After integrating the DSC peak, I obtained the change in enthalpy. How can I then find delta S using the Boltzmann equation?
Dear all, please have a look at the following RG thread. My Regards
• asked a question related to Entropy
Question
Dear all experts
How can the entropy generation be added or calculated in COMSOL Multiphysics?
Best Regards
Direct Contact: +8801759-731605 (whatsapp no.)
If the formula for the entropy is known, it can be derived from the calculation results of COMSOL.
• asked a question related to Entropy
Question
A very interesting topic, "quantification of randomness". In mathematics it is sometimes referred to as "complexity theory" (although it is more about pseudorandomness than randomness), based on the idea that a more complicated series is more random. Then there are tests for randomness in statistics, and perhaps the most intriguing test is related to information theory, "entropy" (also relevant to, and a result of, the second law of thermodynamics). There are also pseudorandom number generators and true random number generators using quantum computing.
So, what I've been trying to do is make a complete list of all available algorithms, books, or even random number generators that will tell me how random a series is, allowing me to "quantify randomness".
There are 125 unique infinite pseudorandom series that I have discovered and generated based on a rule. Now, how do I test for randomness and quantify it? Whether the series is random or there is probably a pattern: something that will allow me to predict the next number in the series, given that I don't know what the next number is.
Now, does anyone know of any GitHub links related to any of the above? (Anything related to quantifying randomness in general that you think will be helpful.)
A book or books on quantifying randomness would be very helpful too. Actually, anything at all...
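Two simple, commonly used proxies for quantifying how random a finite series is, a frequency-based Shannon entropy and a compression ratio in the spirit of algorithmic (Kolmogorov) complexity, can be sketched like this. They are illustrative only: passing both says little, but failing either reveals structure:

```python
import math
import zlib

def entropy_rate(seq, base=2):
    """Shannon entropy (bits/symbol) of the empirical symbol distribution.
    Blind to order: a sorted sequence scores the same as a shuffled one."""
    counts = {}
    for s in seq:
        counts[s] = counts.get(s, 0) + 1
    n = len(seq)
    return -sum(c / n * math.log(c / n, base) for c in counts.values())

def compression_ratio(seq):
    """Crude algorithmic-complexity proxy: incompressible ~ random."""
    data = bytes(seq)  # assumes symbols are ints in 0..255
    return len(zlib.compress(data, 9)) / len(data)

periodic = [1, 2, 3, 4] * 250          # perfectly patterned series
print(entropy_rate(periodic))          # close to 2 bits: frequencies look uniform
print(compression_ratio(periodic))     # small: the repeating pattern compresses away
```

The contrast between the two numbers is the point: symbol-frequency entropy alone cannot distinguish a repeating pattern from true randomness, while compressibility can, which is why batteries of tests (and complexity-based measures) are used together.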
You should check out the seminal and fundamental work by Gregory Chaitin, starting in 1965 when he was a student at CUNY (City University of New York) and continuing through the 1970s.
• asked a question related to Entropy
Question
Hello,
I have a dataset containing the concentration of a marker protein for different experiments, for which I wish to calculate the Shannon entropy. To do this, I create a discretized space between the minimal and maximal observed concentrations, assign each concentration to an interval, and then calculate the probabilities from there. My problem is that I can choose the number of intervals in this discretized space, and I do not know how to optimize the entropy value with respect to the number of intervals.
For example, imagine I have three concentration values: 1, 4, and 10. I can choose to discretize the range into the two intervals [1,5] and [5,10], or into the nine intervals [1,2], [2,3], [3,4] ... [9,10]. The Shannon entropy for the first choice will be -(2/3*ln(2/3) + 1/3*ln(1/3)), while the Shannon entropy for the second will be -3*(1/3*ln(1/3)) = ln(3). How do I know which discretization to choose? In other words, I'm looking for a way to discriminate between various discretizations of the concentration space so as to calculate the most accurate Shannon entropy value, if such a thing exists.
Thanks
If you can compute the value of H(p) (the Shannon entropy) for known probabilities, say (p_1, p_2, ..., p_8), then the higher the value of H(p), the higher the uncertainty.
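A minimal pure-Python sketch of how the entropy estimate depends on the bin count, using the three-value example from the question (equal-width bins; Sturges' rule, n_bins = 1 + ceil(log2(n)), is one common heuristic for choosing the count, though no single choice is canonical):

```python
import math

def binned_entropy(values, n_bins):
    """Shannon entropy (nats) of values discretized into n_bins equal-width bins."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for v in values:
        # clamp the maximum value into the last bin
        idx = min(int((v - lo) / width), n_bins - 1)
        counts[idx] += 1
    n = len(values)
    return -sum((c / n) * math.log(c / n) for c in counts if c)

data = [1, 4, 10]
print(binned_entropy(data, 2))  # 2 bins: p = (2/3, 1/3), H ~ 0.637 nats
print(binned_entropy(data, 9))  # 9 bins: each value isolated, H = ln 3
```

The point the question raises is real: with enough bins every value lands in its own bin and the entropy saturates at ln(n), so the absolute value is only meaningful relative to a fixed, justified binning rule.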
• asked a question related to Entropy
Question
I am currently considering to purchase the bench Top Mini high energy planetary ball mill. As I have never used it before, I need your perspectives on its suitability for synthesizing the above materials.
Thank you for your contribution. The paper you shared was a great help.
• asked a question related to Entropy
Question
Hello, everyone!
I'm using the GSAS software to do Rietveld refinement of an equiatomic FeCoCrNi high-entropy alloy, but I don't know how to get the CIF file. How can I get the CIF file for the FeCoCrNi high-entropy alloy?
Thanks!
I don't think you can get a .cif file for an HEA from a database.
Instead, you can build your own FCC/BCC file, or just use Ni (FCC) or Fe (BCC) for a first estimate.
• asked a question related to Entropy
Question
I want to calculate the Gibbs free energy and phase transitions for my high-entropy alloy. I saw that the CALPHAD approach is widely used. How can I do this calculation, and what type of file do I need as input? I have never used this program or performed these calculations. I would be happy if you could help me.
Hello,
You have to use a specific database dedicated to high-entropy alloys; in the Thermo-Calc package it is called TCHEA.
What is the composition of your alloy?
Regards
• asked a question related to Entropy
Question
Economic entropy:
Suppose a nation gathers all the resources its citizens have and then divides the total equally among all citizens. Will this nation progress? It would fail badly due to economic entropy. Economic entropy means that the same resources are there, but they cannot be employed to benefit the people and the nation: megastructures like factories and companies cannot be established, and no new resources can be generated.
Felipe Morelli da Silva Interesting thoughts. Thanks!
• asked a question related to Entropy
Question
Arrow of time (e.g. entropy's arrow of time): Why does time have a direction? Why did the universe have such low entropy in the past, and why does time correlate with the universal (but not local) increase in entropy from past to future, according to the second law of thermodynamics? Is this phenomenon justified by the Gibbs law and irreversible processes?
With respect to all the answers, in my opinion, no answer to such questions is completely correct.
An irreversible process in an isolated system is accompanied by an increase of the entropy of the system.
An irreversible process in a system in contact with its surroundings is accompanied by a decrease in the Gibbs energy of the system (and an increase in the entropy of the system+surroundings, which constitutes an isolated system).
Then, the Gibbs energy is a thermodynamic potential that arises naturally when considering systems in thermal and mechanical contact with their surroundings. The increase of the system+surroundings entropy is equivalent to the decrease of the Gibbs energy of the system.
Now, the connection between time and entropy is another issue.
The universe is expanding, and therefore the entropy is increasing globally. All processes are inherently irreversible, and therefore entropy increases globally.
Time goes in one direction. Entropy goes (globally) in one direction (always increasing, though not always at the same rate). Then, should there necessarily be a connection between entropy and time?
• asked a question related to Entropy
Question
In my current thinking/writing I have been exploring ideas behind quantum social theory, for example the potential of an entropic society. Here, such a society exhibits a default (temporal) tendency toward disorder. Entropy increases unless society works to reduce it. Why? Because, from a quantum super-positional perspective on a society of individuals, there is an infinite potential for interference through quantum interdependency: there is an indeterminate potentiality to disorder, with only a limited number of determinable, observable events that may signify order.
Statistically, unless we invest in reducing the range of interdependencies and thus work to reduce the indeterminacy of state changes and/or interferences, by implementing (social) negentropic constraints, we will experience emergent disorder. Such constraints, including our social institutions, laws, ethics and morals, are designed to increase the probability that a given/anticipated/expected/desired state change within society may be observable. This is society’s desire for normativity.
Yet, as I think on these lines, I begin to see the potential of the autistic mind and its consciousness as a radical free agent unbound to the idea of negentropic normativity. This, to my mind is a positive prospect: autism’s value to society. Society needs its free radicals to prevent excessive negentropy. By attention to the radical free agents of society, we can be reminded that social normativity cannot rule out indeterminacy entirely: society must respect its entropic potential. And, while all about us seek to normalise our activities, we can look to autism to remind us of our full, unrealised potential.
Thoughts? All opinions, normative and non-normative are welcome.
• asked a question related to Entropy
Question
I have a range of different ion pairs; some of them formed ionic liquids and others didn't. I've read that for ionic liquids to form, the entropy term (TΔS) should outweigh the enthalpy. I am wondering whether there is a certain range of entropy, enthalpy, or ΔG within which ionic liquids can form.
• asked a question related to Entropy
Question
The problem of self-interaction effects and errors arises in studies of, for example, anions, electrons, atoms and molecules.
It also arises in developing a theory of network effects in connection with network entropy (for example, https://arxiv.org/abs/0803.1443 ). In the network case, the concept of degrees of freedom leads to an apparent resolution.
Does the network case generalize?
For the case of quantum solid-state theory there are previous studies, mostly based on Prof. E. Wigner's probability distribution and kinetic equation. Prof. Robert Shour, there are important advances; the literature can be found by following the citations of
the work in which the effect of degrees of freedom is studied, but in another kind of system: coherent bosons. They use a concept called the quadrature, with the real and imaginary parts of the scattering cross-section that contains the interaction.
Best Regards.
• asked a question related to Entropy
Question
To understand how gravity cannot be an entropic force, please see comment (5) in [1]. Please assume that the “atoms-of-spacetime” defined in [1] can be heated up to the Davies-Unruh temperature (T), to produce the desired entropic inertial force,
f = T∇S
where (∇S) is some entropy gradient [1]. What are the problems with this entropic definition of inertial force? Could entropic inertial force not be compatible with the principle-of-equivalence, etcetera?
Thank you
P.S. Remember what Boltzmann always used to say, “If you can heat it up, then it’s made out of atoms!”
[1] Thanu Padmanabhan, Atoms of Spacetime and the Nature of Gravity (ResearchGate)
The question needs rephrasing.
What does inertia have to do with force? One only has conservation of linear momentum in the absence of forces.
An inertial force is only produced when the reference frame is in some way accelerated relative to a background.
• asked a question related to Entropy
Question
Recently, black holes were demonstrated to exert a pressure on their adjacent surrounding space (see "Physicists' Total Surprise: Discover Black Holes Exert a Pressure on Their Environment", SciTechDaily: https://scitechdaily.com/physicists-total-surprise-discover-black-holes-exert-a-pressure-on-their-environment/amp/).
Reference: “Quantum gravitational corrections to the entropy of a Schwarzschild black hole” by Xavier Calmet and Folkert Kuipers, 9 September 2021, Physical Review D. DOI: 10.1103/PhysRevD.104.066012
In two old preprints of mine (2018 and 2020), I also predicted that all black holes exert a mechanical pressure on their adjacent environment, which I defined as a reaction force produced by a predicted universal black-hole-associated Casimir force.
More precisely, a black hole (bh) limits the spontaneous appearance of virtual pairs (VPs) inside it, which creates a positive gradient between the number of VPs per unit volume outside versus inside that bh. This gradient translates into a (bh-associated) Casimir force (bhaCF) exerting an additional out-to-in pressure on the bh; the bhaCF also generates a reaction force acting from inside to outside, manifesting as a pressure exerted by the bh on its adjacent surrounding space. See the two preprints cited next:
[1] “(eZEH working paper - version 1.0 - 10 pages - 2.08.2018) An extended zero-energy hypothesis: on some possible quantum implications of a zero-energy universe, including the existence of negative-energy spin-1 gravitons (as the main spacetime “creators”) and a (macrocosmic) black-hole (bh) Casimir effect (bhCE) which may explain the accelerated expansion of our universe”:
[2] “(bhCE-HR-AE-ObU - version 1.0 - 18.11.2020- 2 A4 pages without references) A proposed black-hole-associated Casimir effect (bhCE) possibly inhibiting Hawking radiation (HR) and creating a spatial expansion around any macro/micro black-hole (possibly driving the global accelerated expansion of our observable universe)”:
What do you think of my proposed black-hole-associated Casimir force and its reactional force (manifested as a pressure exerted by black-holes on their surrounding space)?
If dark matter is actually the sum of all existing primordial black holes (pBHs) (https://scitechdaily.com/black-holes-could-be-dark-matter-and-may-have-existed-since-the-beginning-of-the-universe/), then my proposed bhCE would more plausibly explain the accelerated expansion of our universe through the pressure exerted by all these pBHs (which is the reaction force of my proposed bh-associated Casimir force).
• asked a question related to Entropy
Question
These excitations at low temperatures give rise to disorder beyond the long/short-range ordered magnetic states. In spin-ice systems, quantum fluctuations act as a significant perturbation that drives the system far from the two-in-two-out spin configuration; this gives rise to the three-in-one-out configuration and forms dynamic monopole-antimonopole pairs. Surprisingly, no change in the spin-ice entropy is observed.
Why is a dynamic ground state (or an increase in entropy) not observed beyond the spin-ice state, given the dominance of quantum fluctuations at low temperatures?
Quantum fluctuations are just what the term says: They denote a bath, just like a thermal bath, only, in this case, the parameter that plays the role of temperature is Planck's constant.
It isn't, however, true that they are the reason a 2-in-2-out configuration can become a 3-in-1-out configuration; that is just wrong.
Quantum fluctuations don't imply that the ground state of a quantum system is degenerate-which would mean that the entropy is non-zero. For quantum systems, in flat worldvolume (as are the magnetic systems referred to) the ground state is unique and the entropy, therefore, vanishes. The fact that these are spin systems refers to the geometry of the target space, the space of the spins, not the worldvolume.
• asked a question related to Entropy
Question
I used the "Solve/set/expert" command to obtain the temperature gradient and a "Custom Field Function" to manually enter the entropy equation. However, the result is not plausible and I cannot validate my case against other studies.
• asked a question related to Entropy
Question
I am working on the lid-driven cavity problem in porous media. I have calculated the fluid flow using the finite difference method and the average Nusselt number by Simpson's rule, varying y from 0 to 1. I don't know how to calculate the entropy generation and Bejan number described in the attached figure.
Thanks!
• asked a question related to Entropy
Question
I carried out DFT calculations using Gaussian. By varying the pressure (0.8 GPa, 5 GPa, 8 GPa, ...), I could observe a change in the value of the following parameters:
• Thermal correction to the Gibbs free energy
• Thermal energy, specific heat capacity and entropy caused by translation
• Also, no change in the thermal energy, specific heat capacity and entropy caused by rotational and vibrational motion
Can anyone give a clear explanation?
This question can be answered on a lot of different levels. I'll give a few ideas and you can decide where to go with the discussion.
Phenomenologically, in classical (chemical) thermodynamics the total differential of G is dG = Vdp - SdT. Therefore, at constant temperature, an increase of the pressure should lead to an increase of G. If that is the case for you, everything is still normal.
Also, the heat capacity is directly connected to G, so if G changes at the same T, the heat capacity should also change.
Now let's go to the microscopic level. For a meaningful discussion, it would be good to know if you perform your statistical thermodynamics on single-molecule data or whether you are using an ensemble, maybe in some sort of a supercell approach. If you have a scenario in which intermolecular interactions play a role, it is clear that the translation will have to change since additional binding energies start playing a role. The vibrations, on the other hand, will barely be affected, the major effect in reality would probably be that at a higher pressure the equilibrium state might be reached faster due to more collisions. Discussing entropic changes without knowing the actual scenario is a bit difficult.
This last paragraph was, of course, quite a "random facts compilation" and every one of these sentences could be expanded into a full textbook chapter, so please let us know what topic you want to discuss in more detail.
• asked a question related to Entropy
Question
Dear colleagues.
Considering that ASCII characters span the range 32 to 127, so that we need 7 bits to represent each character, what is the highest theoretical value of the information entropy of a given text?
For example for grayscale images, since we represent each pixel using 8 bits and there is a maximum of 2^8=256 shades, the highest entropy value is 8.
Does this hold for text? Is the highest attainable value 7?
I am trying to configure this for a text encryption design.
Does the entropy command in MATLAB provide the correct result for this?
There are several methods for determining the entropy of a text or of its individual segments, called n-grams. Shannon's method is one of the most popular; it is based on representing the text by a Markov chain of depth n.
A second is a dictionary-based method for determining the entropy of n-grams, which combines Kolmogorov's combinatorial approach with a corollary of Shannon's second theorem that gives an asymptotic estimate of the number of meaningful texts.
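One caveat on the "7 bits" intuition: the entropy ceiling is log2 of the alphabet size, which for the 96 codes 32-127 is log2(96) ≈ 6.58 bits per character, slightly below 7 (seven bits is the code width, not the entropy maximum). A minimal first-order sketch:

```python
import math
from collections import Counter

def char_entropy_bits(text):
    """First-order Shannon entropy of a text, in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The maximum is log2(alphabet size), reached for a uniform distribution.
# For the 96 ASCII codes 32..127 that is log2(96) ~= 6.585 bits, not 7.
uniform = ''.join(chr(c) for c in range(32, 128))
print(char_entropy_bits(uniform))        # log2(96) ~= 6.585
print(char_entropy_bits('aaaaabbbbb'))   # 1.0 bit
```

For evaluating a cipher, first-order entropy close to this ceiling is a necessary but not sufficient indicator of randomness; higher-order (n-gram) entropy catches correlations that single-character counts miss.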
• asked a question related to Entropy
Question
I am implementing GLCM on MRI images, have extracted its features, and obtained results, but I don't understand the significance and use of features such as entropy, contrast, etc. I know the definitions and equations, but I want to know how to use these features for classification.
I used the GLCM along with LBP to classify vegetation density from RGB satellite images; please refer to this study:
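To make the features concrete, here is a minimal pure-NumPy sketch (my own illustration, not scikit-image's implementation) of a horizontal-offset GLCM with two of the usual scalar features. In practice each image (or region) yields one such feature vector, and those vectors are what you feed to a classifier:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Contrast and entropy from a horizontal-offset GLCM (illustrative sketch)."""
    glcm = np.zeros((levels, levels))
    # count co-occurrences of gray levels one pixel apart horizontally
    for row in img:
        for a, b in zip(row[:-1], row[1:]):
            glcm[a, b] += 1
    p = glcm / glcm.sum()                  # normalize to a joint probability table
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)    # large for abrupt gray-level changes
    nz = p[p > 0]
    entropy = -np.sum(nz * np.log2(nz))    # large for "busy", disordered textures
    return contrast, entropy

flat = np.zeros((4, 4), dtype=int)            # uniform image
noisy = np.random.randint(0, 8, (64, 64))     # random texture
print(glcm_features(flat))    # zero contrast and zero entropy: no variation
print(glcm_features(noisy))   # high contrast and entropy
```

Intuitively, homogeneous tissue regions give low contrast/entropy while textured or pathological regions give high values, which is why these scalars discriminate between classes. For production use, scikit-image's graycomatrix/graycoprops provide a tested implementation of the same idea.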
• asked a question related to Entropy
Question
Dear experts, colleagues
In MCDM, is it possible to apply the entropy method to derive criteria weights without predefined alternatives?
In my case, the criteria data are available as raster layers covering the entire area of interest, and I want to explore optimal sites on it.
No, in my opinion, since entropy weighting is based on examining the dispersion of the criteria values, which depends on the alternatives.
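That said, one workaround sometimes used in GIS settings is to treat sampled raster cells as the alternatives. A minimal sketch of the entropy weight method itself (the decision matrix X below is invented purely for illustration; rows are alternatives, columns are criteria, all values positive):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (alternatives x criteria) matrix X > 0.

    Criteria whose values are spread out across alternatives (lower
    entropy of the normalized column) carry more discriminating
    information and therefore receive a higher weight.
    """
    P = X / X.sum(axis=0)                  # normalize each criterion column
    n = X.shape[0]
    # column-wise entropy, scaled to [0, 1] by ln(n)
    E = -np.sum(np.where(P > 0, P * np.log(P), 0), axis=0) / np.log(n)
    d = 1 - E                              # degree of diversification
    return d / d.sum()

# rows = alternatives (e.g. sampled raster cells), columns = criteria
X = np.array([[7.0, 1.0, 3.0],
              [5.0, 1.1, 9.0],
              [6.0, 0.9, 6.0]])
print(entropy_weights(X))  # the near-constant middle column gets the smallest weight
```

This also makes the answer above concrete: without alternatives (rows) there are no column distributions, hence no dispersion to measure and no weights.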
• asked a question related to Entropy
Question
I did XRD on my powder alloy samples and obtained the major peaks of all samples with different concentrations, but besides the major peaks there are some small distortion peaks. I think the high-entropy alloy powder has not crystallized well after 45 h of mechanical alloying (ball milling). How can I improve the crystallinity of my powder alloy samples?
• asked a question related to Entropy
Question
It is suggested that the zero-point energy (which causes measurable effects like the Casimir force and the Van der Waals force) cannot be a source of energy for energy-harvesting devices, because the ZPE entropy cannot be raised: it is said to be already maximal in general, and one cannot violate the second law of thermodynamics. However, I am not aware of a good theoretical or empirical proof that ZPE entropy is at its highest value always and everywhere. So I assume that ZPE can be used as a source of energy to power all our technology. Am I wrong or right?
It isn't the ``zero point energy'' that is the origin of either the Casimir force or the van der Waals force. First of all, these two forces don't have anything to do with each other: The van der Waals force is the classical force between electric dipoles, the Casimir force is the force that expresses the fluctuations of energy about its average value in the state where this average value is equal to zero.
That's why zero-point energy is a misnomer.
The entropy of any physical system, in flat spacetime, in the vacuum state vanishes, since the vacuum state of a quantum system in flat spacetime is unique.
• asked a question related to Entropy
Question
As far as I know it is not really a matter of thermocalc version but the database. In thermocalc there is a separate database for high entropy alloys.
• asked a question related to Entropy
Question
My goal is to discretize a set of attributes using entropy discretization. The plan for this program is the following:
1. Discretize the attribute A1 using entropy-based discretization, stopping when either:
a. The number of distinct classes within a partition is 1, or
b. The ratio of the minimum to maximum frequencies among the distinct values of the Class attribute in the partition is < 0.5, and the number of distinct values of Class in the partition is Floor(n/2), where n is the number of distinct values in the original dataset.
I've attached a sample of my Python program and a sample dataset to this discussion. The issue I am running into is minimizing the information gain. My hypothesis is that the number of distinct classes within a partition is 1 exactly when the information gain is 0. I would appreciate any feedback on this.
From your answer it is unclear what data you shared; it is not possible to proceed further this way.
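For what it's worth, the hypothesis in the question can be checked directly: a partition containing a single class has zero entropy, so splitting it further yields zero information gain (the converse also holds for a binary cut of a pure partition). A minimal sketch of one entropy-based split (simplified; not the full MDL stopping rule):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Return the cut point that maximizes information gain, and the gain."""
    pairs = sorted(zip(values, labels))
    base = entropy(labels)
    best_gain, best_cut = 0.0, None
    for k in range(1, len(pairs)):
        left = [lab for _, lab in pairs[:k]]
        right = [lab for _, lab in pairs[k:]]
        cut = (pairs[k - 1][0] + pairs[k][0]) / 2
        # gain = parent entropy minus size-weighted child entropies
        gain = base - (len(left) * entropy(left)
                       + len(right) * entropy(right)) / len(pairs)
        if gain > best_gain:
            best_gain, best_cut = gain, cut
    return best_cut, best_gain

values = [1, 2, 3, 10, 11, 12]
labels = ['a', 'a', 'a', 'b', 'b', 'b']
print(best_split(values, labels))  # cut at 6.5 with gain 1.0 (pure partitions)
```

Note that once a partition is pure (entropy 0), every candidate cut inside it has gain 0, which is exactly why rule (a) works as a stopping condition.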
• asked a question related to Entropy
Question
Has anyone had experience using the AFLOW framework to predict the mechanical and thermal properties of high-entropy alloys?
For calculating mechanical properties there is no difference between high-entropy alloys and other alloys and metals. Furthermore, research indicates that some high-entropy alloys have considerably better strength-to-weight ratios, a higher degree of fracture resistance, higher tensile strength, and better corrosion and oxidation resistance than conventional alloys.
• asked a question related to Entropy
Question
1. Can we use the equation "delta S = delta q / T" for any process in which T is constant? Can it be used for any isothermal process? Are all isothermal processes reversible? Can an adiabatic process be reversible if it is carried out infinitesimally slowly? Can we use "delta S = delta q / T" for such an adiabatic change to get "delta S = 0"?
2. What is the meaning of non-PV work? How can it be extracted?
I would welcome any comments on these questions.
Dear Gema
dS > δq/T for an irreversible process in a thermodynamic system
dS = δq/T for a reversible process
Here dS is the entropy change of the system and δq/T is the entropy transfer by heat, so in general dS ≥ δq/T, or equivalently
dS = δq/T + S_gen,
where S_gen is the entropy generated by irreversibility (S_gen = 0 for a reversible process).
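A tiny numerical illustration of this entropy balance (the temperatures and heat are chosen arbitrarily): heat leaking across a finite temperature difference always generates entropy.

```python
# Entropy balance for 1 kJ of heat leaking irreversibly from a hot
# reservoir to a cold one (illustrative numbers only).
Q = 1000.0      # J of heat transferred
T_hot = 500.0   # K
T_cold = 300.0  # K

dS_hot = -Q / T_hot       # entropy lost by the hot reservoir
dS_cold = Q / T_cold      # larger entropy gained by the cold reservoir
S_gen = dS_hot + dS_cold  # net entropy generated by the irreversibility
print(S_gen)              # positive, consistent with S_gen >= 0
```

As T_cold approaches T_hot the generated entropy goes to zero, recovering the reversible limit S_gen = 0.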
• asked a question related to Entropy
Question
(Is the study given at the link https://doi.org/10.26434/chemrxiv.11553057.v6 useful?)
Sure. Without thermodynamics, physical chemistry is blind.
• asked a question related to Entropy
Question
Can we define entropy for a particle?
In quantum mechanics, when we deal with a quantum system, the measurement process forces us to suppose we are dealing with an ensemble of systems. Is it legitimate to define entropy for a single quantum system in this way?
My answer included the calculation of the thermodynamic entropy of a single particle. There have been answers opposing single-particle entropy; the idea of ensemble-only entropy fits well with what I call invented entropies.
Invented entropies are entropies that have borrowed Clausius' inequality, delta S >= 0. They do not derive it; they borrow it and use it in ways that are not justified by Clausius' derivation, applying it in their own ways without establishing how their use agrees with thermodynamic entropy. Thermodynamic entropy is what Clausius says it is. All interpretations that disagree with Clausius' meaning must derive their own equations from their own starting point. Those that do so can call their solutions 'entropy'; they just cannot continue to give the impression that their work is an advanced development of Clausius' thermodynamic entropy.
Using Clausius' thermodynamic entropy, I solved for the magnitude of the thermodynamic entropy of a single particle, and I gave reasons why writing about single-particle entropy is useful. The two major findings included in my work are a formal physical definition of temperature and the meaning of Boltzmann's constant. The value of Clausius' thermodynamic entropy is shown to be a magnitude of time, and I argue that its inequality is neither physically nor mathematically justified. These are major findings.
Whether or not single-particle entropy applies to some other form of entropy does not matter for what I write about Clausius' thermodynamic entropy. The other entropies are invented entropies. They are useful, and the inequality fits them; however, they have borrowed it.
• asked a question related to Entropy
Question
I did solid-liquid adsorption studies and thermodynamic parametric studies. In the thermodynamic calculations I obtained negative enthalpy and entropy values and positive Gibbs free energy values. According to the literature, if the Gibbs free energy is negative the adsorption process is spontaneous, and if it is positive the process is non-spontaneous. I would like to know whether, if the process is non-spontaneous, desorption will occur. Can anyone please help me explain this study with the above values and give me some idea about the thermodynamic analysis?
• asked a question related to Entropy
Question
Globally, the interpretations of the sign of the standard Gibbs free energy (ΔG°), the standard enthalpy (ΔH°) and the standard entropy (ΔS°) are the same across different publications.
Nevertheless, when the absolute value of one thermodynamic parameter is sufficiently high or low, we read some small differences in interpretation, especially for specific phenomena, adsorbents, adsorbates, metallic ions, etc.
The goal of this discussion is to help researchers in this field to enrich their interpretations in new experimental data by reading our different answers and replies on different situations already published.
Another delicate point that can enrich this discussion: if one thermodynamic parameter varies slightly with temperature, how does that affect the usual interpretations (the increase or decrease) for the case of positive or negative values?
The thermodynamic parameters applied to different systems are the Gibbs free energy (ΔG°), the enthalpy (ΔH°) and the entropy (ΔS°). The Gibbs free energy indicates whether the process is spontaneous (ΔG° < 0), non-spontaneous (ΔG° > 0) or at equilibrium (ΔG° = 0). The enthalpy indicates whether the process is endothermic (ΔH° > 0) or exothermic (ΔH° < 0); furthermore, its absolute value indicates whether the process is chemisorption (80 < |ΔH°| < 200 kJ/mol) or physisorption (|ΔH°| < 80 kJ/mol). The entropy indicates the degree of disorder of the studied process, which increases (ΔS° > 0) or decreases (ΔS° < 0).
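Since the discussion revolves around extracting ΔH° and ΔS° from data, here is a minimal sketch of the usual van't Hoff route, ln K = -ΔH°/(RT) + ΔS°/R, so that a linear fit of ln K against 1/T yields ΔH° from the slope and ΔS° from the intercept. The equilibrium constants below are invented purely for illustration:

```python
# Van't Hoff sketch: fit ln K vs 1/T by ordinary least squares.
# The K values are hypothetical, chosen only to illustrate the algebra.
R = 8.314                   # J/(mol K)
T = [298.0, 308.0, 318.0]   # K
lnK = [2.0, 1.7, 1.4]       # hypothetical ln(equilibrium constant)

n = len(T)
x = [1.0 / t for t in T]
xbar = sum(x) / n
ybar = sum(lnK) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, lnK))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar

dH = -R * slope           # J/mol  (negative -> exothermic)
dS = R * intercept        # J/(mol K)
dG_298 = dH - 298.0 * dS  # J/mol at 298 K
print(dH, dS, dG_298)
```

With these invented data both ΔH° and ΔS° come out negative (exothermic, ordering), and ΔG° at 298 K stays negative because the |ΔH°| term dominates; whether the same holds for a real system depends entirely on the measured K(T).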
• asked a question related to Entropy
Question
We know that entropy is associated with every matter. Can it be associated with the expansion of the universe too? Is there any spiritual angle associated with the entropy and universe expansion?
Note: one would say that Hubble's measurement that galaxies recede from us at about 70 km/s per megaparsec is proof of the universe's metric expansion; the answer to that is that what was measured is not an expansion but a mathematically incorrect setting of our coordinates for the universe's inertial system.
• asked a question related to Entropy
Question
Firstly, I use historical time-series data of 20 ports' throughput to predict future output. Then I calculate the information entropy of these time series. I find that the prediction accuracy within a certain interval of entropy is very good. How can this phenomenon be explained? Why is the prediction accuracy related to the entropy, and what is the mechanism in the ARIMA method?
It is a known fact that past data explain the future to a certain extent. Therefore, the entropy of the past time series may, to some degree, predict the information entropy of the future series.
• asked a question related to Entropy
Question
------------
INNER ENERGY
------------
The inner energy is: U= E(el) + E(ZPE) + E(vib) + E(rot) + E(trans)
E(el) - is the total energy from the electronic structure calculation
= E(kin-el) + E(nuc-el) + E(el-el) + E(nuc-nuc)
E(ZPE) - the the zero temperature vibrational energy from the frequency calculation
E(vib) - the the finite temperature correction to E(ZPE) due to population
of excited vibrational states
E(rot) - is the rotational thermal energy
E(trans)- is the translational thermal energy
Summary of contributions to the inner energy U:
Electronic energy ... -1650.67074291 Eh
Zero point energy ... 0.16891920 Eh 106.00 kcal/mol
Thermal vibrational correction ... 0.00502963 Eh 3.16 kcal/mol
Thermal rotational correction ... 0.00141627 Eh 0.89 kcal/mol
Thermal translational correction ... 0.00141627 Eh 0.89 kcal/mol
-----------------------------------------------------------------------
Total thermal energy -1650.49396154 Eh
Summary of corrections to the electronic energy:
(perhaps to be used in another calculation)
Total thermal correction 0.00786217 Eh 4.93 kcal/mol
Non-thermal (ZPE) correction 0.16891920 Eh 106.00 kcal/mol
-----------------------------------------------------------------------
Total correction 0.17678137 Eh 110.93 kcal/mol
--------
ENTHALPY
--------
The enthalpy is H = U + kB*T
kB is Boltzmann's constant
Total free energy ... -1650.49396154 Eh
Thermal Enthalpy correction ... 0.00094421 Eh 0.59 kcal/mol
-----------------------------------------------------------------------
Total Enthalpy ... -1650.49301733 Eh
Note: Only C1 symmetry has been detected, increase convergence thresholds
if your molecule has a higher symmetry. Symmetry factor of 1.0 is
used for the rotational entropy correction.
Note: Rotational entropy computed according to Herzberg
Infrared and Raman Spectra, Chapter V,1, Van Nostrand Reinhold, 1945
Point Group: C1, Symmetry Number: 1
Rotational constants in cm-1: 0.073719 0.034947 0.034938
Vibrational entropy computed according to the QRRHO of S. Grimme
Chem.Eur.J. 2012 18 9955
-------
ENTROPY
-------
The entropy contributions are T*S = T*(S(el)+S(vib)+S(rot)+S(trans))
S(el) - electronic entropy
S(vib) - vibrational entropy
S(rot) - rotational entropy
S(trans)- translational entropy
The entropies will be listed as mutliplied by the temperature to get
units of energy
Electronic entropy ... 0.00000000 Eh 0.00 kcal/mol
Vibrational entropy ... 0.00764430 Eh 4.80 kcal/mol
Rotational entropy ... 0.01390861 Eh 8.73 kcal/mol
Translational entropy ... 0.01975045 Eh 12.39 kcal/mol
-----------------------------------------------------------------------
Final entropy term ... 0.04130337 Eh 25.92 kcal/mol
In case the symmetry of your molecule has not been determined correctly
or in case you have a reason to use a different symmetry number we print
out the resulting rotational entropy values for sn=1,12 :
--------------------------------------------------------
| sn= 1 | S(rot)= 0.01390861 Eh 8.73 kcal/mol|
| sn= 2 | S(rot)= 0.01325416 Eh 8.32 kcal/mol|
| sn= 3 | S(rot)= 0.01287133 Eh 8.08 kcal/mol|
| sn= 4 | S(rot)= 0.01259970 Eh 7.91 kcal/mol|
| sn= 5 | S(rot)= 0.01238901 Eh 7.77 kcal/mol|
| sn= 6 | S(rot)= 0.01221687 Eh 7.67 kcal/mol|
| sn= 7 | S(rot)= 0.01207132 Eh 7.57 kcal/mol|
| sn= 8 | S(rot)= 0.01194524 Eh 7.50 kcal/mol|
| sn= 9 | S(rot)= 0.01183404 Eh 7.43 kcal/mol|
| sn=10 | S(rot)= 0.01173456 Eh 7.36 kcal/mol|
| sn=11 | S(rot)= 0.01164457 Eh 7.31 kcal/mol|
| sn=12 | S(rot)= 0.01156241 Eh 7.26 kcal/mol|
--------------------------------------------------------
-------------------
GIBBS FREE ENERGY
-------------------
The Gibbs free energy is G = H - T*S
Total enthalpy ... -1650.49301733 Eh
Total entropy correction ... -0.04130337 Eh -25.92 kcal/mol
-----------------------------------------------------------------------
Final Gibbs free energy ... -1650.53432070 Eh
For completeness - the Gibbs free energy minus the electronic energy
G-E(el) ... 0.13642221 Eh 85.61 kcal/mol
Dear all,
Could you please help me with this ORCA output thread. Which value should be considered the thermal correction to the Gibbs free energy?
Sincerely,
Ajeet
"in the harmonic approximation"
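For reference, if what is wanted is the Gaussian-style "thermal correction to Gibbs free energy", it corresponds to G minus the electronic energy, which this output already prints as "G-E(el)". A quick arithmetic check on the numbers above:

```python
# Consistency check on the ORCA output above: the correction to G
# relative to the electronic energy is G - E(el).
E_el = -1650.67074291   # Eh, "Electronic energy"
G = -1650.53432070      # Eh, "Final Gibbs free energy"
correction = G - E_el
print(round(correction, 8))  # 0.13642221 Eh, matching the printed G-E(el)
```

Note that this quantity already contains the ZPE; the "Total thermal correction" line (0.00786217 Eh) excludes both the ZPE and the entropy term.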
• asked a question related to Entropy
Question
Should exergy analysis be taught to undergraduate engineering students? I am wondering to what extent I should include it in my undergraduate thermodynamics course for 3rd-year chemical engineering students. We already cover the energy balance and the entropy balance.
Dear Isabella.
Exergy is a very complex concept. Actually, exergy is the limiting (highest or lowest) value of energy that can be useful in a thermodynamic process, taking into account the restrictions imposed by the laws of thermodynamics.
That is, exergy is the maximum work that a macroscopic system can perform during a quasi-static transition from a given initial state to a state of equilibrium with the environment. In other words, this is the minimum work that must be spent on the quasi-static transition of the system from the state of equilibrium with the environment to a given initial state.
Unlike energy, exergy is not a parameter of the state of the system, but a parameter of the process. Comparison of the exergy of an ideal quasi-static process with the energy of a real nonequilibrium process shows the degree of perfection of a real thermodynamic process.
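As a concrete illustration of the "maximum work relative to the environment" idea, the physical (flow) exergy of a stream is commonly written as ex = (h − h0) − T0(s − s0), where the subscript 0 denotes the dead (environment) state. A minimal sketch, with made-up property values and my own function names:

```python
# Minimal sketch: specific physical (flow) exergy relative to a dead state,
#   ex = (h - h0) - T0 * (s - s0)
# The property values used below are illustrative, not real steam-table data.
def flow_exergy(h, s, h0, s0, T0):
    """h, h0 in kJ/kg; s, s0 in kJ/(kg*K); T0 in K. Returns kJ/kg."""
    return (h - h0) - T0 * (s - s0)

# A superheated-steam-like state vs. a 298.15 K dead state (illustrative):
ex = flow_exergy(h=3000.0, s=6.5, h0=105.0, s0=0.367, T0=298.15)
print(f"specific exergy ~ {ex:.1f} kJ/kg")
```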
Therefore, I believe that it is the topic for the postgraduate, certainly.
V. Dimitrov,
Professor (Emeritus) in Chemical Physics.
• asked a question related to Entropy
Question
Hi everyone,
I'm trying to determine the Gibbs free energy change for some possible reaction pathways to see whether they're feasible or not.
No current literature data can be found for these reactions. Initially, I thought to use the bond dissociation energy to find the change in enthalpy, but this doesn't take into account the entropy of the reaction.
Any ideas?
Thanks!
You do not write whether you are looking for inorganic or organic reactions, or whether they occur in solution or in the gas phase.
For reactions between organic molecules there exist increment methods. The Landolt-Börnstein encyclopedia contains such a method (Landolt-Börnstein, vol. II/4 (1962), p. 22ff.).
If you want to use a computational chemistry software package and are interested in chemical reactions in solution, please make sure that the solvent is included in the computation.
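Once you have an enthalpy estimate (from bond dissociation energies) and a rough entropy estimate (e.g. from increment or group-additivity tables), the feasibility check itself is just ΔG = ΔH − TΔS. A minimal sketch with placeholder numbers (they are not data for any real reaction):

```python
# Minimal sketch: feasibility check via dG = dH - T*dS.
# dH could come from bond dissociation energies, dS from increment tables.
# All numbers below are placeholders for illustration only.
def delta_g(dH_kJ, dS_J_per_K, T=298.15):
    """Returns dG in kJ/mol, given dH in kJ/mol and dS in J/(mol*K)."""
    return dH_kJ - T * dS_J_per_K / 1000.0

dG = delta_g(dH_kJ=-50.0, dS_J_per_K=-120.0)
print(f"dG = {dG:.1f} kJ/mol, spontaneous at 298 K: {dG < 0}")
```

Note the unit conversion: entropy tables are usually in J/(mol·K) while enthalpies are in kJ/mol, a classic source of sign-of-ΔG mistakes.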
• asked a question related to Entropy
Question
When a system, say a heat engine, draws heat from a hot reservoir (a body at high temperature compared to the system), it does work. Since heat is a low-grade energy, not all of the heat is converted to work; the remaining heat is released into a cold reservoir (say, the environment, which is at a lower temperature compared to the system's temperature). Now, this released heat (which is unavailable to the system) is somehow related to the entropy of the system, according to some literature. My question here is: how is this heat related to entropy?
In classical thermodynamics, entropy is defined, following Clausius, in terms of the energy that is unavailable to generate mechanical work. Within classical thermodynamics, the increase of entropy is treated as an absolute law that always holds.
Boltzmann changed our understanding of entropy forever when he discovered that entropy can be defined statistically, relating a given macrostate to its associated microstates. His theory showed that entropy can change in both directions.
Sometimes, even in an isolated system, entropy can spontaneously decrease; see the paper below for details.
This is possible because of the statistical distribution of the velocities of a gas in a chamber. To extract work in a heat engine you need to go from one distribution to another, with the velocities becoming lower. As you take energy from the system, the distribution shifts to lower values and it becomes harder and harder to take energy from it, until you reach the temperature of the surrounding bath. The terminal energy has a corresponding distribution associated with that specific temperature.
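The link the question asks about can be made quantitative with the Clausius relation: for a reversible engine, the entropy drawn from the hot reservoir, ΔS = Q_h/T_h, must be discharged to the cold reservoir, so the minimum rejected (unavailable) heat is Q_c = T_c·ΔS. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of the Clausius relation for a reversible (ideal) engine.
# Entropy drawn from the hot reservoir must be dumped into the cold one, so
# the rejected ("unavailable") heat is fixed by the cold temperature.
def rejected_heat(Q_h, T_h, T_c):
    dS = Q_h / T_h          # entropy taken from the hot reservoir, J/K
    return T_c * dS         # minimum heat that must be released, J

Q_h, T_h, T_c = 1000.0, 500.0, 300.0   # illustrative values (J, K, K)
Q_c = rejected_heat(Q_h, T_h, T_c)
W = Q_h - Q_c
print(Q_c, W)               # 600.0 J rejected, 400.0 J of work (Carnot limit)
```

The ratio W/Q_h here is exactly the Carnot efficiency 1 − T_c/T_h; any real engine destroys additional entropy and rejects more than Q_c.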
I recommend reading the following chronological account of the evolution of the notion of entropy. Please remember that Boltzmann's theory of entropy changed physics forever: it proposed atomism and proved it, and it showed how statistics arises from the motion of perfectly defined deterministic particles!
***
If you want a somewhat longer version, explained in depth (about five pages long, including pictures), the section on entropy of the following paper is worth reading:
I also recommend reading the papers and books cited in this methodological paper. There is a lot of very important information hidden in them.
• asked a question related to Entropy
Question
The immediate answer, without too much thinking, would probably be: yes, we need them, because not all criteria have the same importance, and this is true.
Fine, but what are criteria weights? They are subjective numerical transcriptions of verbal ordinal expressions, themselves subjective. Analyzing this step, one realizes that they are totally arbitrary and, in addition, subject to the decision of one DM, which may differ from that of another DM.
Really, a very suspicious procedure, with no mathematical support.
We agree that weights are used to rank criteria according to their relative importance. And now the crucial question comes: 'Importance relative to what?'
If you analyze a fact such as for instance, the relative importance of quality and price regarding the purchase of a car, you can legitimately say, that in this respect, and for YOU, quality is more important, without assigning a quantitative value to that preference, because it would be meaningless. Nobody can put a value to a feeling or a preference.
Observe, and this is important, that your preference must be related to something, in this case the car, because in other aspects you may prefer price to quality, according to your tastes and budget, for instance, in purchasing a necktie, and thus, it depends on the object of the comparison.
In the case of criteria weights in MCDM, a practitioner may assert these two postulates:
1) Weights are necessary to evaluate alternatives,
2) Due to the fact that not all criteria have the same importance.
Is this answer correct? NOT in the first statement, and YES in the second.
The reason for the fallacy of the first statement is that the relative importance between criteria is NOT associated with the evaluation of alternatives, unless the weights are objective weights derived from entropy.
Probably you will say: but these entropy weights also establish a ranking between criteria, as subjective weights do; why, then, is the evaluation of alternatives valid with entropy weights but not with preference-based weights?
Because Shannon's theorem, the basis of information theory as we know it today, demonstrates that to evaluate something you need the capacity for evaluation, that is, a certain quantity of information, and that quantity depends on how well the values of a criterion discriminate between alternatives. As an analogy, the current system is equivalent to asking a five-year-old child to evaluate different cars.
At that age, he or she does not have the amount of information about cars that an adult possesses.
This amount of information in a criterion can be found by entropy; this is the great discovery of Shannon, and he even developed a formula to compute it.
If in a criterion the different values corresponding to the alternatives are very similar or close, the entropy or uncertainty is high, the quantity of information low, and thus, this criterion has a low significance for evaluating alternatives.
The best example is dice. The uncertainty about which number will appear when casting is 100%, because all outcomes have the same probability (1/6); the entropy will be 1, and the quantity of information 0.
Since weighting does not have this property, it can’t be used for evaluation.
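The procedure the paragraphs above describe can be sketched in a few lines: normalize each criterion column, compute its normalized Shannon entropy, and weight each criterion by the complement of that entropy. The decision matrix below is purely illustrative, and the variable names are my own:

```python
import numpy as np

# Minimal sketch of the entropy-weighting procedure for a decision matrix X
# (alternatives in rows, criteria in columns). Matrix values are illustrative.
def entropy_weights(X):
    P = X / X.sum(axis=0)                    # normalize each criterion column
    k = 1.0 / np.log(X.shape[0])             # scales entropy into [0, 1]
    logP = np.log(P, where=P > 0, out=np.zeros_like(P))
    E = -k * (P * logP).sum(axis=0)          # normalized Shannon entropy
    d = 1.0 - E                              # degree of diversification
    return d / d.sum()                       # objective criteria weights

X = np.array([[7.0, 9.0, 9.0],
              [8.0, 7.0, 8.0],
              [9.0, 6.0, 8.0],
              [6.0, 7.0, 8.0]])
w = entropy_weights(X)
print(w.round(3))
```

Note how the third criterion, whose values (9, 8, 8, 8) barely discriminate among the alternatives, has an entropy close to 1 and receives a far smaller weight than the other two, exactly as argued above.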
Therefore, why spend thousands of hours trying to find better subjective weights, when they are useless, at least for MCDM?
I am not claiming that I am right, but at least this is what I conclude based on reasoning and science.
Therefore, if subjective weights are inappropriate to evaluate alternatives, why are we using them?
I would very much like it if some of my colleagues on RG could add something, supporting my arguments or not. As a bottom line, I believe that weights other than entropy-based ones should not be used in MCDM methods. I am not saying that we should consider all criteria to have the same importance, because that is not true most of the time; rather, we should either use entropy weights or let the method itself determine the importance of each criterion based on the input data, as is done in Linear Programming.
I will be more than glad to discuss this issue, but please, with arguments, not with mere words or vague references to what other authors wrote.
• asked a question related to Entropy
Question
Hello,
I designed a single-effect absorption cooling system working with LiBr/H2O in Aspen Plus software. But the exergy flow rates don't seem right. For example, when I try to find the exergy destruction of the pump (and other components), the result is negative. What is the reason for this? My reference conditions are 25 °C and 101.325 kPa.
Hi
I think the exergy of material streams is incorrect, and Aspen does not calculate the chemical exergy.
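One quick sanity check: for an adiabatic pump the exergy destruction is Ex_d = W_in + m·(ex_in − ex_out), and it must be non-negative for any real device. If the stream exergies omit chemical exergy or use an inconsistent reference state, this balance can easily come out negative. A minimal sketch with illustrative numbers (the function and variable names are my own):

```python
# Minimal sketch: exergy destruction balance for an adiabatic pump,
#   Ex_d = W_in + m_dot * (ex_in - ex_out),
# which must be >= 0 for a real device. A negative result usually signals
# inconsistent stream exergies (e.g. missing chemical exergy or a mismatched
# reference state). All numbers below are illustrative.
def pump_exergy_destruction(W_in, m_dot, ex_in, ex_out):
    """W_in in kW, m_dot in kg/s, ex in kJ/kg. Returns kW."""
    return W_in + m_dot * (ex_in - ex_out)

Ex_d = pump_exergy_destruction(W_in=0.5, m_dot=1.0, ex_in=50.0, ex_out=50.4)
print(f"{Ex_d:.2f} kW destroyed")   # 0.10 kW; negative would mean bad inputs
```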
Best wishes;
• asked a question related to Entropy
Question