The question is: how does this relationship hold up for irreversible engines? Well, that is exactly what Clausius worked out! For a reversible engine, the two terms on the left hand side are equal to one another, but what happens in an irreversible engine? As you can imagine, an irreversible engine contains many irreversibilities, friction being one major example. These irreversibilities cause the work output of the engine to be smaller and the heat rejected to be larger. Therefore, for an irreversible engine, the second term on the left hand side of the above equation is greater than the same term for a reversible engine. If we divide a power cycle into an infinite number of small cycles and sum the quantity δQ/T over each one, we obtain the following relationship. (The integral takes into account heat entering AND leaving.)
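In the usual notation, with δQ the differential heat transferred across the boundary at the absolute temperature T where it crosses, that relationship is:

$$\oint \frac{\delta Q}{T} \le 0$$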
This is what is known as the Clausius inequality. Notice that the circle through the integral sign indicates that the integral is taken over the entire cycle. The above expression is an equality when the cycle is reversible and an inequality when the cycle is irreversible.
If we think about what this means, we realize that the quantity inside the integral is acting like a thermodynamic property: just as the net change in internal energy, volume, and every other property over a cycle is zero, this cyclic integral is zero for a reversible cycle. Clausius realized that this represents a new thermodynamic property, which he called entropy. He defined it as follows.
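In standard form, evaluated along an internally reversible path, the definition reads:

$$dS = \left(\frac{\delta Q}{T}\right)_{\text{int rev}}$$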
To obtain the change in entropy, one would integrate both sides. Notice that the differential for heat is an inexact differential; this is because heat is a path function, not a state function! Also realize that this defines only the change in entropy, not an absolute value of entropy.
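Carrying out that integration between the two end states gives:

$$\Delta S = S_2 - S_1 = \int_1^2 \left(\frac{\delta Q}{T}\right)_{\text{int rev}}$$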
So what does this mysterious property tell us? Entropy is abstract and takes a bit of thought to understand at first. When you boil it down, entropy is a measure of disorder. The definition of entropy is part of the second law of thermodynamics, which basically says that energy has quality as well as quantity, and when energy is used in a device with irreversibilities, some of that energy gets degraded into a less useful, more disordered form, essentially low-grade heat. If we go back to the engine at the beginning of this post, the second term turned out to be larger than the first because of the internal irreversibilities, which is why the cyclic integral is negative. In other words, more entropy leaves with the rejected heat than enters with the supplied heat, and since entropy is a property (its net change over a cycle is zero), that extra entropy must have been generated inside the engine by the irreversibilities. This is in contrast to energy, which can never be created or destroyed. So because the Clausius inequality is less than or equal to zero, and since a truly reversible process is impossible in real life, we can say that all real processes generate entropy, and the entropy, or disorder, of the universe is constantly increasing.
An interesting thing happens when we look at a system that is undergoing an isothermal process. (T=constant)
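Since the temperature is constant it comes out of the integral, so for an internally reversible isothermal process the entropy change reduces to:

$$\Delta S = \frac{1}{T_0}\int_1^2 \delta Q = \frac{Q}{T_0}$$

where T0 is the constant absolute temperature and Q is the total heat transfer during the process.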
Imagine a cycle made up of two paths: one path from state 1 to state 2, and a different path back from state 2 to state 1. If the path from 2 to 1 is reversible, while the path from 1 to 2 may be either irreversible or reversible, then from the Clausius inequality...
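Splitting the cyclic integral over the two paths, with the return path internally reversible:

$$\oint \frac{\delta Q}{T} = \int_1^2 \frac{\delta Q}{T} + \int_2^1 \left(\frac{\delta Q}{T}\right)_{\text{int rev}} \le 0$$

The second integral is, by the definition of entropy, S1 − S2, so rearranging gives:

$$S_2 - S_1 \ge \int_1^2 \frac{\delta Q}{T}$$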
So the change in entropy is always greater than or equal to the entropy transferred by heat, the integral of δQ/T: the inequality holds for an irreversible process and the equality holds for a reversible one. If we want to turn this into an equality, we need to add another term on the right hand side.
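That extra term is the entropy generated by irreversibilities within the system, usually written S_gen:

$$S_2 - S_1 = \int_1^2 \frac{\delta Q}{T} + S_{\text{gen}}, \qquad S_{\text{gen}} \ge 0$$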
This relationship is valid for any process, reversible or irreversible. When the process is irreversible, the entropy generated is some positive value, and when the process is reversible, the entropy generated is zero. A trick for calculating the entropy generation is to take the system and its surroundings together as one big isolated system; the entropy generated is then equal to the change in entropy of the system plus the change in entropy of the surroundings. (One of the two may be negative, but it can never outweigh the positive one, because of the entropy generated.)
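In equation form, that trick is:

$$S_{\text{gen}} = \Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0$$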
The reason entropy is at the heart of the second law is that it lets us determine whether a process can proceed in a given direction. The first law of thermodynamics does not; it would have no problem with a cup of hot coffee getting even hotter as it sits on the table in a cold room, as long as energy is conserved. This defies the second law, however, because the second law states that all processes must proceed in the direction of increasing total entropy. If the coffee were to get hotter, heat would have to flow from the cooler surroundings into the hotter coffee; the surroundings would lose more entropy than the coffee gains, so without any work input the total entropy would decrease.
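As a quick sanity check of the coffee-cup argument, here is a minimal sketch in Python with made-up numbers (a 350 K cup in a 290 K room exchanging 1 kJ of heat, both idealized as constant-temperature reservoirs; none of these values come from the post):

```python
# Entropy bookkeeping for heat flowing between two bodies that are each
# idealized as constant-temperature reservoirs (illustrative numbers only).
T_coffee = 350.0   # K, temperature of the hot coffee
T_room = 290.0     # K, temperature of the cooler room
Q = 1000.0         # J, amount of heat transferred

def entropy_generated(q_into_coffee):
    """Total entropy change of coffee + room when q_into_coffee joules
    of heat flow into the coffee (negative = heat leaves the coffee)."""
    dS_coffee = q_into_coffee / T_coffee   # entropy change of the coffee
    dS_room = -q_into_coffee / T_room      # the room gives up or absorbs the same heat
    return dS_coffee + dS_room             # this sum is S_gen; it must be >= 0

# Coffee cooling off: heat leaves the coffee and enters the room.
print(entropy_generated(-Q))   # ~ +0.59 J/K -> entropy is generated, process can happen
# Coffee spontaneously getting hotter: heat flows from the cold room into the hot coffee.
print(entropy_generated(Q))    # ~ -0.59 J/K -> total entropy would drop, impossible
```

The sign of the total entropy change tells you which direction of heat flow the second law permits.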
If we rearrange the definition of entropy and then plug it into the first law...
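Using δQ = T dS for an internally reversible process in the first law, δQ = dU + P dV, gives:

$$T\,dS = dU + P\,dV$$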
This is what is known as the combined statement of the first and second laws. If we recall the definition of enthalpy, we can get another expression in terms of enthalpy.
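Since H = U + PV, we have dH = dU + P dV + V dP; substituting into the relation above gives:

$$T\,dS = dH - V\,dP$$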
We are going to use these formulas to find out how to calculate the entropy change of solids and liquids, and then of ideal gases. For solids and liquids, we use the first equation and note that the change in volume of a solid or liquid is negligible, so the P dV term is approximately zero. Remembering that du = c dT for an incompressible substance...
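Dropping the volume-change term and writing everything per unit mass:

$$ds = \frac{du}{T} = c\,\frac{dT}{T} \quad\Rightarrow\quad \Delta s = \int_1^2 c(T)\,\frac{dT}{T}$$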
If we wish to approximate this integral by treating the specific heat as constant, the temperature change shouldn't be more than about 100 K, otherwise the error keeps growing. To do this, we take the average value of the specific heat over the temperature range.
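With a constant average specific heat c_avg, the integral evaluates to:

$$\Delta s \approx c_{\text{avg}}\,\ln\frac{T_2}{T_1}$$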
If we undergo an isentropic process, the change in entropy is equal to zero. (Isentropic means constant entropy.) So then...
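Setting the entropy change to zero:

$$c_{\text{avg}}\,\ln\frac{T_2}{T_1} = 0 \quad\Rightarrow\quad \frac{T_2}{T_1} = 1 \quad\Rightarrow\quad T_2 = T_1$$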
Therefore, an isentropic process with an incompressible substance is also isothermal.
So now we move on to ideal gases. We again use the combined statements of the first and second law. All we need to do is substitute du = Cv dT and P = RT/v into the first equation, and dh = Cp dT and v = RT/P into the second equation (working per unit mass, with v the specific volume and R the specific gas constant), and then integrate. (Note: We will again be using the average specific heat approximation.)
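Carrying out the integration with average specific heats gives the two ideal-gas entropy-change relations:

$$\Delta s \approx c_{v,\text{avg}}\,\ln\frac{T_2}{T_1} + R\,\ln\frac{v_2}{v_1}$$

$$\Delta s \approx c_{p,\text{avg}}\,\ln\frac{T_2}{T_1} - R\,\ln\frac{P_2}{P_1}$$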
Various relations can be derived from these equations in an isentropic process. (The change in entropy is equal to zero.) If we set the first equation equal to zero..
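Using R = Cp − Cv and the specific heat ratio k = Cp/Cv (so R/Cv = k − 1):

$$\ln\frac{T_2}{T_1} = -\frac{R}{c_v}\,\ln\frac{v_2}{v_1} \quad\Rightarrow\quad \left(\frac{T_2}{T_1}\right)_{s=\text{const}} = \left(\frac{v_1}{v_2}\right)^{k-1}$$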
In a similar manner, we can do the same thing with the other equation.
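Setting the second equation equal to zero, with R/Cp = (k − 1)/k:

$$\left(\frac{T_2}{T_1}\right)_{s=\text{const}} = \left(\frac{P_2}{P_1}\right)^{(k-1)/k}$$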
To get the third relation, we simply combine both of these expressions since they both equal the ratio of the two temperatures.
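Equating the two right hand sides:

$$\left(\frac{P_2}{P_1}\right)^{(k-1)/k} = \left(\frac{v_1}{v_2}\right)^{k-1} \quad\Rightarrow\quad \left(\frac{P_2}{P_1}\right)_{s=\text{const}} = \left(\frac{v_1}{v_2}\right)^{k}$$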
These three equations can be expressed in a compact form as...
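Treating k as constant, the three isentropic relations can be written as:

$$T v^{\,k-1} = \text{constant}, \qquad T P^{(1-k)/k} = \text{constant}, \qquad P v^{\,k} = \text{constant}$$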
Entropy allows us to construct an entropy balance for a problem. Entropy is not a conserved quantity, but we can still write an entropy balance if we include the entropy generation term.
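For any system, the balance can be written as:

$$S_{\text{in}} - S_{\text{out}} + S_{\text{gen}} = \Delta S_{\text{system}}$$

where the transfer terms account for entropy carried by heat and by mass, and S_gen is never negative.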
Let's recap the ways entropy can interact with the system.
Entropy is transferred into or out of a system through its boundaries by heat and by mass. Therefore, in an adiabatic closed system there is no entropy transfer, and any change in the system's entropy is due entirely to entropy generation within it.
1. Heat Transfer:
Heat transfer is a major factor in entropy gain or loss. Heat transfer into the system causes the entropy of the system to rise; heat transfer out of the system, on the other hand, causes the system to lose entropy.
2. Mass Flow:
Another way entropy can change in the system is due to mass flowing through the boundaries, since mass carries entropy with it. If the system is closed, there is no mass flow, and hence no entropy transfer by mass.
3. Entropy Generation:
Remember, all real processes have irreversibilities, hence all processes generate entropy. The only type of process that does not generate entropy is the idealized reversible process.
That about wraps up this post on Entropy!