What is Entropy?
In physics, entropy is a measure of the disorder, or degree of randomness, associated with a thermodynamic system. When the entropy of a system is high, the system is poorly organized. For instance, when a system is at an elevated temperature, its molecules move randomly, vibrating and colliding with one another. In this scenario, the probability of finding a given molecule at the same place twice is low, and the system is said to be in a state of high entropy.
Entropy is a point function and an exact differential. It is a thermodynamic property and denotes the amount of unavailable energy of a system. In engineering thermodynamics, entropy is a measure of the thermal energy that is unavailable for mechanical work. Entropy is represented by $S$ and its SI unit is joules per kelvin ($\mathrm{J/K}$).
Entropy of a reversible process
A reversible process is defined for a closed system as one in which, after the process terminates, the system can be brought back to its original conditions.
Consider the image below. A system is undergoing a cyclic process. The system is initially at state 1 and by undergoing a certain process it moves to state 2, following path 1-a-2. The system later returns to its original state, state 1, following path 2-b-1.
For a thermodynamic system undergoing a cyclic process, the expression for Clausius inequality is given as,
$$\oint \frac{\delta Q}{T} \leq 0$$

where $\delta Q$ is the heat interaction of the system with the surroundings and $T$ is the temperature of the surroundings.
The Clausius inequality for a system undergoing a cyclic reversible process is given as,
$$\oint_R \frac{\delta Q}{T} = 0$$

where $\oint_R$ denotes the integral over a cyclic reversible path.
And, the Clausius inequality for a system undergoing a cyclic irreversible process is given as,
$$\oint_I \frac{\delta Q}{T} < 0$$

where $\oint_I$ denotes the integral over a cyclic irreversible path.
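As a numerical illustration of these three statements, the short sketch below sums $\delta Q/T$ over the boundary heat interactions of a simple two-reservoir cycle. The heat and temperature values are assumed purely for illustration and are not taken from the text.

```python
# Illustrative check of the Clausius inequality for a simple two-reservoir cycle.
# All heat and temperature values below are assumed for illustration.

def clausius_sum(heat_interactions):
    """Sum delta_Q / T over the boundary heat interactions of one cycle.

    heat_interactions: list of (Q, T) pairs, with Q in joules (positive when
    heat enters the system) and T the boundary temperature in kelvin.
    """
    return sum(q / t for q, t in heat_interactions)

# Reversible (Carnot-like) cycle: heat rejected is Q_H * T_L / T_H, so the sum is zero.
reversible = [(1000.0, 500.0), (-600.0, 300.0)]     # 2.0 - 2.0 = 0
# Irreversible cycle: more heat is rejected for the same heat input, so the sum is negative.
irreversible = [(1000.0, 500.0), (-750.0, 300.0)]   # 2.0 - 2.5 < 0

print(f"Reversible cycle:   {clausius_sum(reversible):+.3f} J/K")
print(f"Irreversible cycle: {clausius_sum(irreversible):+.3f} J/K")
```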
Now, apply the Clausius inequality for the reversible cyclic process, 1-a-2-b-1,
$$\int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{a} + \int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{b} = 0 \quad \ldots(1)$$
Assume that, instead of taking path 2-b-1, the system returns to its original state following path 2-c-1 (also reversible).
Hence, apply the Clausius inequality for the reversible cyclic process, 1-a-2-c-1,
$$\int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{a} + \int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{c} = 0 \quad \ldots(2)$$
Subtract equation (2) from equation (1),

$$\int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{b} = \int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{c}$$

It is evident from this expression that the integral $\int \frac{\delta Q}{T}$ along a reversible path depends only on the initial and final states, no matter which path the system follows. Hence, the quantity $\int \frac{\delta Q}{T}$ must represent the change in a property. This change is known as the entropy change, and the property is known as entropy. Therefore, the final expression can be written as:
$$S_2 - S_1 = \Delta S = \int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{\text{rev}}$$

where $\Delta S$ is the entropy change of the system.
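As a concrete case, the sketch below evaluates this integral for an assumed reversible, isothermal heat addition, where the temperature stays constant and the integral reduces to $Q/T$; the numbers are illustrative only.

```python
# Entropy change for a reversible, isothermal heat addition: S2 - S1 = Q / T.
# The heat amount and temperature are assumed example values.

Q = 2000.0   # heat added reversibly to the system, J
T = 400.0    # constant absolute temperature during the process, K

delta_S = Q / T   # the integral of (delta_Q / T) reduces to Q/T at constant T
print(f"Entropy change: {delta_S:.3f} J/K")   # 5.000 J/K
```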
Entropy of an irreversible process
A process is said to be irreversible when the system's initial conditions cannot be restored after the process has been executed, or when the system is left in a non-equilibrium condition after the process terminates.
Consider the same image above, but this time the system follows an irreversible path 2-c-1 (denoted by the dotted line) while returning to its initial state, state 1. Refer to the image below.
It is known that for a closed system undergoing a cyclic irreversible process, the expression for Clausius inequality is given by,
$$\oint_I \frac{\delta Q}{T} < 0$$

where $\oint_I$ represents the cyclic integral over a closed irreversible cyclic path.
Apply the Clausius inequality condition for the reversible cyclic process, 1-a-2-b-1,
$$\int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{a} + \int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{b} = 0 \quad \ldots(3)$$
Apply Clausius inequality for the irreversible cyclic process, 1-a-2-c-1,
$$\int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{a} + \int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{c} < 0 \quad \ldots(4)$$
From equation (3),
$$\int_{1}^{2} \left(\frac{\delta Q}{T}\right)_{a} = -\int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{b} \quad \ldots(5)$$
Substitute equation (5) into equation (4). Since path 2-b-1 is reversible, $\int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{b} = S_1 - S_2$, so

$$\int_{2}^{1} \left(\frac{\delta Q}{T}\right)_{c} < S_1 - S_2 \quad \ldots(6)$$
Hence, for an irreversible process, the quantity $\int \frac{\delta Q}{T}$ is always less than the entropy change $\Delta S$.
Therefore, equation (6) can be rewritten as,
$$S_2 - S_1 = \int_{1}^{2} \frac{\delta Q}{T} + S_{\text{gen}}$$

where $S_{\text{gen}}$ is always a positive quantity known as the entropy production or entropy generation.
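A minimal sketch of entropy generation, assuming heat transfer between two reservoirs across a finite temperature difference. Taking both reservoirs together as an isolated system, no heat crosses the combined boundary, so the net entropy change equals the entropy generated; the values below are assumed for illustration.

```python
# Entropy generation when heat crosses a finite temperature difference.
# Reservoir temperatures and heat amount are assumed example values.

Q   = 500.0   # heat transferred from the hot to the cold reservoir, J
T_H = 600.0   # hot reservoir temperature, K
T_L = 300.0   # cold reservoir temperature, K

# Entropy gained by the cold reservoir minus entropy lost by the hot reservoir.
S_gen = Q / T_L - Q / T_H
print(f"Entropy generated: {S_gen:.3f} J/K")   # positive, so the process is irreversible
```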
Boltzmann entropy
Introduced by Ludwig Boltzmann, this form of entropy finds its application in statistical mechanics, where matter is assumed to be continuous and the number of possible microstates is consequently infinite. Under this assumption, the microstates of a system are characterized by the positions and momenta of all the atoms it contains. Boltzmann gave a microscopic definition of entropy, unlike Clausius, whose definition was framed from a macroscopic point of view. According to Boltzmann, entropy is proportional to the logarithm of the number of microscopic states that share the physical quantities of the macroscopic state. The idea of assigning probabilities to the microscopic states within the entropy was later introduced by Gibbs.
The Gibbs entropy is a function of the probability distribution spread over phase space. Gibbs's definition of entropy remains meaningful even if the system is isolated and far from equilibrium. The concept can be extended to quantum systems, as given by the von Neumann entropy. If an ensemble of classical systems evolves according to the Liouville equation (or its quantum analogue, the Liouville-von Neumann equation), the entropy is a constant of the motion.
The Boltzmann entropy is generally represented by the following equation,
$$S = k_B \ln W$$

where $k_B$ is the Boltzmann constant and $W$ is the thermodynamic probability, that is, the number of microscopic states compatible with the macrostate of the system.
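A brief numerical sketch of Boltzmann's formula, using an assumed (purely illustrative) value for the number of microstates $W$.

```python
import math

# Boltzmann entropy S = k_B * ln(W).
k_B = 1.380649e-23        # Boltzmann constant, J/K
W = 1e25                  # assumed thermodynamic probability (number of microstates)

S = k_B * math.log(W)     # natural logarithm, as in Boltzmann's formula
print(f"S = {S:.3e} J/K")
```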
Determination of the probability of the microstates
The probability of a microstate can be determined by a simple expression.
Consider an isolated system that contains $N$ molecules distributed over energy states $\varepsilon_1, \varepsilon_2, \ldots, \varepsilon_i, \ldots$. Let $n_i$ be the occupation number of energy state $\varepsilon_i$, where $\sum_i n_i = N$. Define $\{n_i\}$ as the set of occupation numbers.
The probability distribution of the set of occupation numbers $\{n_i\}$ is given as,

$$P(\{n_i\}) = N!\,\prod_i \frac{g_i^{\,n_i}}{n_i!}$$

where $g_i$ is the prior probability of finding a molecule in the energy state $\varepsilon_i$.
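A small sketch evaluating this distribution for a hypothetical set of occupation numbers and prior probabilities; the values below are assumptions for illustration only.

```python
from math import factorial, prod

# Probability of a set of occupation numbers {n_i} when each of the N molecules
# falls into energy state i with prior probability g_i (multinomial form).
# The occupation numbers and priors below are assumed values.

def occupation_probability(occupation, priors):
    """P({n_i}) = N! * prod_i (g_i**n_i / n_i!)"""
    N = sum(occupation)
    return factorial(N) * prod(g**n / factorial(n) for n, g in zip(occupation, priors))

occupation = [3, 2, 1]          # n_i for three energy states, N = 6 molecules
priors     = [0.5, 0.3, 0.2]    # g_i, prior probability of each energy state

print(f"P = {occupation_probability(occupation, priors):.4f}")   # about 0.135
```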
Shannon entropy
Introduced by Claude Shannon in 1948, Shannon entropy quantifies the amount of uncertainty, or surprise, associated with a random variable in information theory. Information theory is a branch of mathematics that deals with the transmission of data along a noisy channel.
The theory originated in the field of communications. According to communication theory, a data communication system consists of three parts: a source of data, a communication channel, and a receiver. Shannon's aim was to determine what data the source generated based on the signal received through the channel.
The concept of entropy in information theory is approximately analogous to entropy in statistical thermodynamics, where the values of the random variable designate the energies of the microstates. Entropy provides a means of measuring the average amount of information needed to represent an event drawn from the probability distribution of a random variable.
In mechanical engineering, machinery with rolling-element components, such as rolling-element bearings, generates largely random vibration signals. When a fault develops, the signal becomes more concentrated within certain intervals. Shannon entropy is used in this case to quantify the uncertainty of these weak, random signals.
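A hedged sketch of this idea, following the description above: the Shannon entropy of the binned amplitude distribution of a broadband random signal is compared against a signal whose amplitudes are concentrated near one level. All signals and bin counts here are synthetic assumptions, not measured bearing data.

```python
import math
import random

def histogram_entropy(samples, bins=16):
    """Shannon entropy (bits) of the binned amplitude distribution of a signal."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for x in samples:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

random.seed(0)
# Broadband random signal: amplitudes spread over the whole range (assumed healthy case).
broadband = [random.uniform(-1.0, 1.0) for _ in range(4096)]
# Concentrated signal: mostly small amplitudes with occasional large impulses (assumed fault case).
concentrated = [random.gauss(0, 0.02) if random.random() < 0.95 else random.gauss(0, 1.0)
                for _ in range(4096)]

print(f"Broadband signal:    {histogram_entropy(broadband):.2f} bits")     # higher entropy
print(f"Concentrated signal: {histogram_entropy(concentrated):.2f} bits")  # lower entropy
```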
In medical research, entropy-based approaches have recently been adopted. Researchers have used the Shannon entropy formula to combine signaling networks with transcription data and found that the entropy level of cancer samples was higher than that of normal samples. In growing tumors, more advanced-stage tumors were also found to have higher entropies than lower-stage ones.
The concept has also been applied to the COVID-19 pandemic: a higher entropy value signifies a more rapid spread of the virus, while a lower value indicates a slower spread.
Shannon denoted the entropy by $H$, after Boltzmann's H-theorem. The expression for $H$ is given by,

$$H(X) = -\sum_{i} p(x_i)\,\log_b p(x_i)$$

where $X$ is the discrete random variable taking values $x_i$, $p(x_i)$ is the probability of a single event $x_i$, and $b$ is the base of the logarithm.
The entropy in information theory has different units depending on the base of the logarithm. For example, when dealing with computers, the base of the logarithm is $2$, and the unit is bits.
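A minimal sketch computing the Shannon entropy of a discrete probability distribution in bits (logarithm base 2); the example distributions are assumed.

```python
import math

# Shannon entropy H(X) = -sum_i p(x_i) * log_b p(x_i), here with base b = 2 (bits).

def shannon_entropy(probabilities, base=2):
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

fair_coin   = [0.5, 0.5]
biased_coin = [0.9, 0.1]

print(f"Fair coin:   {shannon_entropy(fair_coin):.3f} bits")    # 1.000 bit of uncertainty
print(f"Biased coin: {shannon_entropy(biased_coin):.3f} bits")  # about 0.469 bits
```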
Context and Applications
The topic finds its application in the following areas:
- Masters in Science (Mathematics)
- Bachelors in Science (Mathematics)
- Bachelors in Technology (Mechanical)
- Bachelors in Commerce (Economics)
- Masters in Commerce (Economics)
Practice Problems
1. Which of the following characteristics of a system represents entropy?
- A random event
- A disordered behavior
- Unorganized behavior
- All of the above
Correct option- d
Explanation: Entropy means the amount of randomness or disorder present within a system; when entropy is high, the self-organization of the system's molecules is low.
2. Shannon entropy relates to the uncertainty of which of the following quantities?
- Random variable
- Boltzmann H-theorem
- Clausius inequality
- Microstate
Correct option- a
Explanation: Shannon entropy relates to the uncertainty or surprise of a random variable in information theory.
3. Which of the following gives the equation for the Clausius inequality for an irreversible process?
- $\oint \frac{\delta Q}{T} < 0$
- $\oint \frac{\delta Q}{T} = 0$
- $\oint \frac{\delta Q}{T} > 0$
- $\oint \frac{\delta Q}{T} \geq 0$
Correct option- a
Explanation: The expression for the Clausius inequality for an irreversible process is given by $\oint \frac{\delta Q}{T} < 0$.
4. What will happen to the entropy of a reversible system if the heat supplied to the system increases?
- Decrease
- Remains unchanged
- Increase
- May either increase or decrease
Correct option - c
Explanation: For a reversible process, the expression for the entropy change is $\Delta S = \int \frac{\delta Q}{T}$; if the heat supplied $\delta Q$ increases, the quantity $\Delta S$ also increases.
5. Which of the following is the nature of entropy production or generation in the expression of change in entropy for an irreversible process?
- Always negative
- Always positive
- Either negative or positive depending on system conditions
- Cannot be predicted as entropy is random behavior
Correct option - b
Explanation: The expression $S_2 - S_1 = \int_{1}^{2} \frac{\delta Q}{T} + S_{\text{gen}}$ holds true, where $S_{\text{gen}}$ has to be a positive quantity.