Friday, July 17, 2015

Deflating the Enigma Associated with ENTROPY


(Fundas Master Vidyadhar Tilak clarified my physics concepts, which has helped this article.)


Entropy is introduced as a variable in Thermodynamics. Then its nature is declared by a value-loaded term: 'Disorder'. Then the Second Law of Thermodynamics is paraphrased as: "Even when you manage to decrease entropy in the enclosure under consideration (the enclosure is unnecessarily called a 'system'), it is only at the 'cost' of increasing entropy in the universe." There is further news that Entropy involves Randomness. If Randomness is going to increase, one thinks, the human grip on causality will be loosened. Pessimist philosophers rejoice at the scientific news as a proof of their position. Engineers impressionistically remember that after drawing the Pressure-Volume diagram there was a custom of also drawing a Temperature-Entropy diagram; if you follow the custom your Thermodynamics papers are cleared, and then you can permanently pass out from the little embarrassment of never having understood it. You never bothered to ask what the dimensions of Entropy are in terms of length, mass and time. Entropy is not perceptible as temperature, pressure, volume, etc. are. There is no Entropy Meter in our labs. It remains something far from 'clear' and much less 'intuitive'.

In all walks of life everyone accepts that
(Available form of) Energy expended = (Application form of) Energy reaped + Losses. 

This applies to materials too. You know that the area of cloth that went into your garment is less than the area of the cloth you purchased, and this without maligning the bona fides of the tailor, because it is technically inevitable. Of course we know that energy and matter (even if inter-convertible) are never generated and never destroyed; they are conserved. All equations are based on this axiom.

Then what do we mean by losses? It is none other than conversion into a form that we don't value. When a sculpture is carved, the chips can be used as rubble for filling up, say, a plinth. The value of the chips is much lower than the value of the sculpture. But the rock which was converted into sculpture + rubble was worth even less than the rubble. So economically there is an outright gain. No one ever despairs over the fact that the whole of the rock could not be converted into sculpture. Rubble is a loss as compared to sculpture; it is not wastage. Had the sculpture been broken while finishing it, that event would count as wastage. Chipping out material is an integral part of carving, as constructive as retaining the material which constitutes the sculpture.

Suppose I tell you, "The pigment that you put in water for coloring the water gets dispersed into the water. As it disperses it loses the coloring power which it would have had if it were not dispersed." How can I prove (or refute) my statement? We simply cannot have it concentrated and disperse it too! We cannot know its coloring power had it not been dispersed. But wait! We can color different volumes of water to exactly the same shade, see how much pigment each takes, and see whether the relation is linear or otherwise. Before going into Entropy at full throttle, let me simply mention that it has a lot to do with dispersal.

'Losses' is a very important variable, not for physical but for economic considerations. Get maximum output from minimum input is one of the dictums of engineering. Always remember that 'input' and 'output' are anthropocentric terms depending upon what you are ready to forgo (cost) for what you want to get (price). In nature, without any cost-sensitive animal, there is neither any input nor any output. Cheetahs are cost-sensitive. They mutely calculate how much to tire themselves for game, how much energy they would get by eating the game, and the probability of catching it. Although Thermodynamics is more notorious for its losses, no process is loss-free.

To see the relation between losses and re-distribution, let us conduct a simple experiment.

There are devices called capacitors. A capacitor is two conducting plates separated by an insulating layer. If we push electrons in on one side, it attracts holes (positive charge) on the other. So it can store static electricity. As it goes on storing charge it develops an opposing voltage (electrons will try to repel each other, won't they?). If you try to push beyond its capacity it explodes. Q = CV: charge is capacitance times the voltage built up.

Suppose we have one charged capacitor and one uncharged capacitor, and we connect them in parallel.
First let us find the energy available.
The energy stored in the charged capacitor is
½(capacitance)(voltage²) = ½CV².
Now, instead of using conservation of energy, let us use conservation of charge, because electrons cannot escape.
The initial voltage = charge/capacitance, V = Q/C.
Charge conserved = Q
Capacitance doubled = 2C
Voltage = charge/capacitance = Q/2C = ½V
The stored energy = ½(capacitance)(voltage²) = ½(2C)(½V)² = ¼CV².

Now the question is: where has half of the energy gone? The answer is that we assumed no resistance in the circuit, which is never the case. When charge is transferred there is bound to be a flow of current (decreasing as equilibrium is reached), and energy is consumed in the transfer at the rate I²R.
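For readers who like to watch it happen, here is a minimal numerical sketch (the component values are assumed purely for illustration). It lets the charge trickle through a resistance and adds up the I²R losses step by step; whatever positive resistance you assume, the dissipated energy converges to the same ¼CV².

```python
# Two equal capacitors, one charged to V0 and one empty, joined through a
# resistor R. All component values are assumed for illustration only.
C, V0, R = 1e-6, 10.0, 100.0           # farads, volts, ohms (assumed)

tau = R * C / 2                        # time constant of the equalization
dt = tau / 10_000                      # step much smaller than tau
q1, q2 = C * V0, 0.0                   # initial charges on the two capacitors
dissipated = 0.0

for _ in range(200_000):               # run for ~20 time constants
    i = (q1 - q2) / (C * R)            # current driven by the voltage difference
    q1 -= i * dt
    q2 += i * dt
    dissipated += i * i * R * dt       # accumulate the I^2 R loss

print(f"initial energy : {0.5 * C * V0**2:.3e} J")            # 1/2 C V0^2
print(f"final energy   : {(q1**2 + q2**2) / (2 * C):.3e} J")  # ~ 1/4 C V0^2
print(f"dissipated     : {dissipated:.3e} J")                 # ~ 1/4 C V0^2
```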
This is an extreme example, but it makes clear what losses are involved in re-distribution.

The main point is that any re-distribution of energy consumes some energy, even in the electrical form, which is the most efficient form. Heat is the worst form of energy, and we will soon see why.

Temperature is much analogous to 'head' in hydraulics, 'voltage' in electricity, and 'force' in mechanical systems at a gross level. Similarly, friction in mechanical processes is analogous to resistance in an electrical circuit. Let us repeat the above experiment with conduction of heat. There are two identical cubes, made of a very good heat-conducting material and insulated from the surroundings. One is initially heated to 80°C, the other to 40°C, and they are joined fast with their full surfaces pressing on each other. After letting some time while away, would we expect them to settle at 60°C? For the temperature, yes: the energy has nowhere to escape, and very nearly 60°C is what is observed. Yet something is lost in the re-distribution all the same, namely the capacity of that energy to be converted into work, as we shall see. Don't jump to the conclusion that Entropy can simply be countenanced as 'resistance to heat transfer'. Entropy accompanies such losses, but not all losses are caused by it. Also, in that case its dimensions would have been the same as those of energy, M(L/T)², since losses are measured in energy; this, however, is not correct. Its statistical core is a pure number like radians, while its conventional dimensions are energy per temperature, as we shall see with Boltzmann.
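A small sketch (assuming, for illustration, an equal heat capacity of 1 J/K for each cube) shows both halves of the point numerically: conservation of energy pins the settling temperature at the mean, yet the entropy change is positive, which is the signature of the lost work-extractability.

```python
# Two identical insulated cubes brought into contact -- a sketch with an
# assumed heat capacity of 1 J/K per cube (any equal value tells the same story).
import math

mc = 1.0                                    # heat capacity of each cube, J/K (assumed)
T_hot, T_cold = 80 + 273.15, 40 + 273.15    # 80 degC and 40 degC in kelvin

T_final = (T_hot + T_cold) / 2              # conservation of energy: exactly 60 degC
dS = mc * (math.log(T_final / T_hot) + math.log(T_final / T_cold))

print(f"settling temperature: {T_final - 273.15:.1f} degC")
print(f"entropy change      : {dS:+.5f} J/K  (> 0: work-extractability lost)")
```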

Now let us observe this curious thing that we call temperature, and what it means at the macro level and at the micro level. In Thermodynamics the terms macro and micro have a particular meaning. Any thing or enclosure under consideration (unnecessarily called a 'system', which contributes to the enigma) is at some temperature. One cube was at 80°C, another at 40°C. There was certainly a potential difference, and heat does move from higher to lower, but there was one more difference hidden in the thermodynamic entity.

Temperature acts as a potential at the macro (solid piece/enclosure) level, but what constitutes temperature at the micro (molecular) level is kinetic energy, viz. the vibrations and collisions of molecules.

In the case of the capacitors in the first example, it was not an issue at all whether the charge and voltage were equally distributed within a single capacitor or not.
The macro entity (i.e. the enclosure) contains, in its turn, the holders and carriers of heat within itself, and the energy in the enclosure is not equally distributed amongst those holders/carriers, viz. the molecules. It is precisely this difference that makes Thermodynamics less intelligible than other branches of physics (save nuclear physics).

How can we be sure that molecules are vibrating or colliding with different amplitudes and velocities? Take the example of the evaporation of water at a temperature much below its boiling point. If all the molecules of water at, say, 30°C were vibrating with equal kinetic energies corresponding to the 30°C level, no molecule could have escaped the liquid state and merrily entered the gaseous state.

So there has to be a huge variation in the kinetic energies of the molecules. There is further bad news: these energy levels are discrete rather than continuous. There is still worse news: the distribution of the number of molecules amongst the energy levels available at a given macro temperature is probabilistic. Which molecule will happen to be at a particular energy level at a given point in time is not knowable. However, the pattern and the total energy (thank God) are knowable. All this breaking news was delivered by Boltzmann, who went on to calculate Entropy at the micro level.
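To make the variation visible, here is a minimal sketch sampling molecular speeds from the Maxwell-Boltzmann distribution, for water-like molecules at 30°C (all values assumed for illustration). A long tail of exceptionally fast molecules appears, and it is such molecules that escape as vapour.

```python
# Sample a million molecular speeds at 30 degC from the Maxwell-Boltzmann
# distribution: each velocity component is Gaussian, and the speed is the
# length of the 3-D velocity vector. Molecule mass is taken as that of H2O.
import numpy as np

k_B = 1.380649e-23                     # Boltzmann constant, J/K
T = 30 + 273.15                        # 30 degC in kelvin
m = 18 * 1.66054e-27                   # approximate mass of one H2O molecule, kg

rng = np.random.default_rng(0)
sigma = np.sqrt(k_B * T / m)           # std-dev of each velocity component
v = np.linalg.norm(rng.normal(0.0, sigma, size=(1_000_000, 3)), axis=1)

print(f"mean speed            : {v.mean():7.1f} m/s")
print(f"fastest sampled       : {v.max():7.1f} m/s")   # the escaping tail
print(f"fraction above 2x mean: {(v > 2 * v.mean()).mean():.3%}")
```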

But we are not going to get disheartened. Analogy is far better than mathematics when it comes to intelligibility and, if possible, perceptibility. Let us go for an altogether different 'input' and 'output'. Suppose we are in the material-packaging business. Batches of small uniform items are to be sent in containers requiring minimum volume. The customer is very kind to us in allowing the items to be stacked in whatever way. Had the items been spheres, they would have spontaneously settled into minimal volume with sufficient percussions given to the containers. But unfortunately the items are cubes! If we stack very meticulously we can form big cubes of stack from the small cubes, thus requiring the lowest volume. But the labor cost of doing this is too high, and the customer's workers are ready to pick up cubes from any configuration in which they come. So we can pour the cubes into containers and let them form a heap wherein they are randomly oriented. Each container must carry a fixed weight of cubes without any condition of array. We are least bothered about which corner of which cube will be touching (pricking into) which surface of which cube! (They are tough.) So the volume required by each of our batches will be different. We will choose the worst possible disarray of maximum volume for designing our containers. Therefore almost all of our containers will be underutilized. However, the container cost is far lower than the labor cost involved in a meticulous array. Volume was our input and the number of cubes delivered was our output. We are making losses on the input but still doing good business.

As we accepted dilution of our cubes, similarly a thermodynamic enclosure has to accept dilution of energy because of differential internal distribution. If more volume is allowed to the nasty molecules, they have more opportunity for more differential internal distribution. Now we have an explanation (and a scientific one) for the result of the joined conductors: when they were allowed to redistribute their internal energies, the total energy was conserved, but it became less available for extraction as work than before.

By dilution of energy, the energy per se is not depleted, but (as we shall see soon) its convertibility to mechanical work is reduced. It reduces drastically, which is one of the reasons why thermodynamic engines are inefficient.

Before going into Entropy, which is a measure of this differential internal distribution, we must take into account another independent and important cause of the under-efficiency of thermodynamic engines.

If we take a thermodynamic enclosure for giving heat to and taking work out of (the essence of the concept 'engine'), we face a problem which we never face in the case of an electric motor or a mechanical converter, say a gear box. When we supply electricity to an electric motor, no mass is put in or thrown out. No copper, no iron, no insulator goes into or comes out of the motor. But the gross occupier of a thermodynamic enclosure, which is called the working substance, has to be replaced en masse in each cycle. For example, the weakened steam that is thrown out of a steam engine contains a lot of heat, which is an outright loss. But this loss is much simpler to understand than the dreaded Entropy.

Now we will consider what Boltzmann did. Suppose there is a pair of dice, six surfaces marked by 1 to 6 dots. Drawing a 2 has only one possibility: {1,1}. Drawing a 12 also has only one possibility: {6,6}. But drawing a 7 can happen in six ways: {1,6}, {2,5}, {3,4}, {4,3}, {5,2}, {6,1}. So there is a pattern of distribution derivable from the number of dice and the number of sides they have.
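The same pattern can be counted mechanically. This tiny sketch enumerates all 36 throws and tallies how many micro-arrangements stand behind each total:

```python
# Enumerate all 36 throws of two dice and tally the micro-arrangements
# behind each total -- Boltzmann's counting in miniature.
from collections import Counter
from itertools import product

ways = Counter(a + b for a, b in product(range(1, 7), repeat=2))
for total in sorted(ways):
    print(f"sum {total:2d}: {ways[total]} way(s)")
# sums 2 and 12 have one way each; sum 7 has six -- the most probable total.
```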

Like our heap of cubes, the underutilization can be theoretically calculated from the number of energy levels available and the number of molecules available. Boltzmann derived a dimensionless number, the logarithm of the count of micro-arrangements W (dimensionless like radians, which are length upon length), and the Boltzmann constant k, which gives it the dimensions of energy per temperature: S = k ln W. This coefficient of differential internal distribution is Entropy, and multiplied by temperature (which varies from instant to instant, so calculus is involved) it yields the dimension of energy. We will skip the derivation, for it would only further discourage us.
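Here is a toy version of that counting (the occupation numbers are assumed purely for readability): W ways of arranging N molecules among energy levels, and the entropy S = k ln W.

```python
# Toy Boltzmann counting: W = N! / (n1! n2! ...) arrangements of N molecules
# among energy levels with the given occupation numbers, and S = k ln W.
import math

k_B = 1.380649e-23                 # Boltzmann constant, J/K
occupations = [6, 3, 1]            # assumed: 10 molecules over 3 levels

W = math.factorial(sum(occupations))
for n in occupations:
    W //= math.factorial(n)        # the divisions are exact here

print(f"W = {W} micro-arrangements")               # 10!/(6! 3! 1!) = 840
print(f"S = k ln W = {k_B * math.log(W):.3e} J/K")
```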

Let us assume that Boltzmann was right mathematically. But Boltzmann made a huge terminological goof by calling it the coefficient of Disorder(!) instead of the coefficient of differential internal distribution. Suppose N molecules are colliding in volume V at temperature T. Then somehow we manage to double the volume. Using Boltzmann's equation we predict the new temperature (which, for an adiabatic expansion, falls non-linearly rather than simply halving) and empirically verify it; then where is the disorder? Why should it be that at the initial volume the molecules were behaving in a more orderly fashion, and in double the volume they have qualitatively changed into nastier molecules? The double volume could just as well have been the initial volume. Granted, the work-extractability of a gas goes down faster than in linear proportion to volume. There are non-linearities in many functions in physics. If our macro-level calculations are coming out right, why should we be uneasy simply because the derivation involved probability calculations?
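A hedged numeric check of the volume-doubling thought experiment (an ideal monoatomic gas with γ = 5/3 and illustrative starting values are assumed): a reversible adiabatic doubling cools the gas non-linearly, while a free expansion into vacuum does not cool it at all, yet the entropy still rises.

```python
# Volume-doubling check for an ideal monoatomic gas (gamma = 5/3 assumed).
import math

T1, ratio, gamma = 300.0, 2.0, 5.0 / 3.0   # illustrative starting values

# Reversible adiabatic expansion: T * V^(gamma - 1) stays constant.
T2 = T1 * ratio ** (1.0 - gamma)
print(f"adiabatic doubling: {T1:.0f} K -> {T2:.1f} K (not {T1 / 2:.0f} K)")

# Free expansion into vacuum: temperature unchanged, entropy still rises.
R_gas, n = 8.314, 1.0                      # J/(mol K); one mole assumed
dS = n * R_gas * math.log(ratio)
print(f"free expansion: temperature stays {T1:.0f} K, delta-S = {dS:.2f} J/K")
```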

'Disorder' is a misnomer, and philosophers and ideologues are always lurking around to latch on to misnomers. In natural languages, 'disorder' is read in the context of health, wellbeing, etc. Boltzmann was, after all, a scientist. What the hell were the good philosophers doing while Entropy ballooned scandalously?

Randomness is not always a liability. It has been the greatest asset in evolution. Had there been no random mutations in the genes, where would be the question of which will be eliminated and which will be retained (don't call it Selected)? Many possibilities are produced by Randomness. Some are good, some are bad, and many are redundant. Is it not better to have some good possibilities than no possibilities at all? Creativity in humans is possible due to the possibility of making new combinations. Constructive labor creates Order out of what, until the product is made, would look like disorder. If figurative use is allowed (most of it is already figurative), evolution and human progress are Negentropic processes taking place in the same universe!

In physicists' own derivations, they call the hot conductor the 'system' (enclosure) and its cooler partner the 'environment'! Furthermore, everything except what you take into consideration is pitched against it as the Universe! How are you going to calculate the entropy of the Universe? Are you not overstepping the limited scope of science by bringing in the infinite? They also envisage a heat death of the universe, when all temperatures will become equal and all stars, including our Sun, will be extinguished. Who knows the process by which stars might be getting generated? Whatever existed before the big bang must have been infinitely negentropic! Let us not indulge in speculative metaphysics any further.

That concentration of internal thermal energy is better than its dilution for work-extractability is all that Entropy means. Is this always disadvantageous to humans?

Let us take light as an example. Laser rays are made parallel and synchronized. This not only increases their burning power but also allows very precise and accurate human control over them. We know how much easier and safer eye operations have become owing to the advent of laser techniques. At the other end there lies diffused light. What a huge probabilistic calculation would be required to measure the diffused-ness of light! Does that make diffused light useless? The reality is completely opposite. It is diffused light that enables us to see in areas where direct and hot sunlight is not available. In fact, what we generally mean by light is diffused light. Otherwise there would be dazzling objects requiring sunglasses to look at, on the one hand, and we would have to light our torches in the darkest shadows, on the other. Indirect lighting is more comforting when designing the interiors of houses. The output in the case of light is not 'work done' but soothing visibility. Output increases with the 'entropy' of light!

Drip irrigation and spray irrigation are far more productive than flow irrigation in agriculture. Water is a scarce resource, and plants are in fact harmed if subjected to too much irrigation. Even the land turns saline due to over-irrigation. The probability of locating a given water molecule is higher in the case of flow irrigation than in spray irrigation: the 'entropy' of water is greater in spray than in flow. Water productivity increases with water 'entropy'. People enjoying themselves in a garden or at a funfair are far more 'entropic' than soldiers on a parade.

When the unscientific concept of 'Entropy as a bane' ballooned, academicians started seeing Entropy everywhere. There came a thing called information entropy. They calculated the probability with which each letter occurs in English. They defined the logarithm of the reciprocal of such a letter's probability as its 'information content'! Why? Because less likely news is more of a news! If a less likely letter goes uncorrupted through your noisy channel, you have decreased the Entropy of the information. I have never come across a greater misuse of the category 'Form-Content' than this.
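For what it is worth, here is a minimal sketch of the calculation they perform (the letter probabilities are approximate English frequencies, quoted for illustration): the 'information content' of a letter is the logarithm of the reciprocal of its probability.

```python
# Information content per letter = log2(1/p): the rarer the letter, the more
# 'news' it carries. Probabilities are approximate, for illustration only.
import math

letter_prob = {"e": 0.127, "t": 0.091, "z": 0.0007}

for letter, p in letter_prob.items():
    bits = math.log2(1 / p)            # rarer letter -> more bits of 'news'
    print(f"'{letter}': p = {p:.4f} -> {bits:5.2f} bits")
```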
As dQ = dU + dW, that is, heat supplied equals internal energy gained plus work done, they found an analogy in economics, Income = Expenditure + Savings, and went on calculating economic Entropy.

The paradox, viz. that temperature acts as a potential at the enclosure level while being kinetic energy at the molecular level, is particular to Heat. Unless you find such a paradox you cannot go on applying Entropy indiscriminately everywhere.

The moral of the story is: "With dilution of insight, 'Entropy in usage of terms' certainly increases."



