[Rapaport and Knoller say at this point: "The author evidently uses the word 'ominous' in the sense that the possibility of realizing the proposed arrangement threatens the validity of the Second Law."]

For brevity we shall talk about a "measurement," if we succeed in coupling the value of the parameter y (for instance the position coordinate of a pointer of a measuring instrument) at one moment with the simultaneous value of a fluctuating parameter x of the system, in such a way that, from the value y, we can draw conclusions about the value that x had at the moment of "measurement." Then let x and y be uncoupled after the measurement, so that x can change, while y retains its value for some time. Such measurements are not harmless interventions. A system in which such measurements occur shows a sort of memory faculty, in the sense that one can recognize by the state parameter y what value another state parameter x had at an earlier moment, and we shall see that simply because of such a memory the second law would be violated, if the measurement could take place without compensation (p. 303, brackets added).

The last sentence indicates that a violation of the second law would take place unless there was compensation. He is about to develop the idea of compensation, and thereby, as it were, save the second law. But let me dwell for a moment on what is being said here so far. The factual event is x, and y is the measurement of the event. And even though the factual event continues to change, the y persists through time as the memory of an earlier x. There is a coupling of x and y at the instant of measurement, but then they get uncoupled in the sense that x changes and y remains as, and let me introduce the word at this time, the information about where x was but is no longer. It is precisely because of the existence of y after the occurrence of x, and after the momentary coupling of x and y, that the second law might be violated.

He then goes on to explain what he means by compensation:

We shall realize that the Second Law is not threatened as much by this entropy decrease as one would think, as soon as we see that the entropy decrease resulting from the intervention would be compensated completely in any event if the execution of such a measurement were, for instance, always accompanied by production of k log 2 unit of entropy. In that case it will be possible to find a more general entropy law, which applies universally to all measurements (p. 303).

This is the crux of the argument. The decrease in entropy in the physical system is compensated for by at least an equal amount of entropy production, "in a more general entropy law." In effect, Szilard has posited a larger system than the relentlessly physical one. This larger system contains both the physical system and the intelligent, or measurement, system.
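To make the compensation explicit as bookkeeping, the balance Szilard is proposing can be sketched as follows. This is a minimal rendering rather than Szilard's own notation: it assumes the simplest case of a measurement that distinguishes two equally likely alternatives (which is where the k log 2 comes from), and the subscripts "physical" and "measurement" are labels introduced here for the two parts of the larger system.

```latex
% A sketch of the entropy balance, assuming a binary measurement
% (two equally likely alternatives for x). The labels are introduced
% here for the two entropy values Szilard distinguishes.
\begin{align*}
  \Delta S_{\text{physical}}    &\geq -k \log 2
    && \text{the most the intervention can lower the entropy of the gas}\\
  \Delta S_{\text{measurement}} &\geq +k \log 2
    && \text{entropy produced in establishing the record } y\\
  \Delta S_{\text{physical}} + \Delta S_{\text{measurement}} &\geq 0
    && \text{the ``more general entropy law''}
\end{align*}
```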
To put it precisely, he writes, "we have to distinguish here between two entropy values" (p. 304). The compensation he envisages is a rise in entropy in the intelligence system at least as large as the decrease in entropy in the physical system. By conceiving of this larger system, which includes the intelligence system, with its memory that goes beyond physical facticity, the second law of thermodynamics is saved. For whereas he allows the possibility of a decrease in entropy in the part of the system which is physical, there is a compensating increase in another part of the system at least as large. Thus entropy only increases, as the law indicates.

In 1948, a paper appeared in the Bell System Technical Journal by an engineer who worked for the telephone company. The paper, entitled "A Mathematical Theory of Communication," presented a theory which has played an extremely important role in all subsequent developments in the technology of processing and transmitting information. The essential feature of information theory as developed by Shannon was the transfer of the mode of thought which had developed in connection with thermodynamics to the characterization of essential features of information-handling equipment. He applied the entropy formula which had emerged from the work of Bernoulli, Maxwell, and Boltzmann, labeled the quantity H and called it the entropy, and identified that quantity with the average information of a message set.

Thus, when a message is transmitted, the amount of information in the message is identified with all that had to be driven away, as it were, for that message to get through. And its amount of information is precisely a function of its prior probability. Simply put, the amount of information that is received in a message depends on how likely it was in the first place, and on what reasonable alternatives had to be negated for it to be transmitted.
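Before turning to the details of that identification, it may help to set the two formulas side by side. This is a sketch rather than Shannon's own presentation: the thermodynamic entropy is written here in its discrete (Gibbs) form, the form closest to Shannon's, and the index i runs over microstates in the one case and over possible messages in the other.

```latex
% The two formulas set side by side; they differ only by Boltzmann's
% constant k and by the conventional base of the logarithm.
\begin{align*}
  S &= -k \sum_i p_i \log p_i
    && \text{statistical-mechanical entropy over microstate probabilities } p_i\\
  H &= -\sum_i p_i \log p_i
    && \text{Shannon's average information of a message set}
\end{align*}
```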
More specifically, every possible message in a message set is said to have an amount of information associated with it which is log(1/p), that is, the logarithm of the reciprocal of the probability of the occurrence of the message. Thus, if the chance of a message occurring is ½, then its reciprocal is 2, and the amount of information associated with it, taking logarithms to base 2, is (log 2 - log 1). The latter is equal to (1 - 0), which is 1. Similarly, the information which would be associated with, say, an alternative message having the same probability is also 1.

The average amount of information in a message set is then precisely the average of the amounts of information in the alternatives. To get this average, one simply multiplies each amount of information by its corresponding probability and adds the products. This gives the formula

H = p₁ log(1/p₁) + p₂ log(1/p₂) + … + pₙ log(1/pₙ),

or, in the more usual form in which it is presented,

H = -Σ pᵢ log pᵢ.

This is identical to the formula for entropy in thermodynamics. (A short numerical sketch of this calculation appears at the end of this section.) Of course, the fact that these virtually identical mathematical formulations appear to apply both to the behavior of a gas and to information raises the question of whether some deep underlying condition has been identified, or whether this is another example in which a branch of mathematics just happens to have usefulness beyond the area in which it was developed. The proliferation of literature on this question has been considerable.

I suggest that the reason for the parallel between entropy and information is readily understood in terms of the considerations of the third world. As has been indicated, with Boltzmann the nonactual alternatives that lived only in the world of possibility were included in the measurement of the states of a gas. It is, of course, this feature which Shannon picked up for the study of message processing. Certainly, the amount of information in a message which has been transmitted can be interpreted in terms of all the other possible messages which were "overcome" in the transmission; and certainly the amount of information in any one message which is actually transmitted can be understood as a function of how small the probability of its transmission was in the first place. But the latter probability is contingent on all the contrary, and excluded, alternatives which, while not being transmitted, exist only in some nonactual world.

One of the great disappointments for many who got very excited at the appearance of Shannon's information theory is that their hope of in some sense cracking the meaning of meaning with a new mathematical tool was never realized. The fact is that the Shannon theory was a theory of information measurement and not one of information. It took for granted, without analyzing further, that messages had probabilities associated with them.
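The worked example and the formula above can be checked with a few lines of code. The following is a minimal sketch rather than anything from Shannon's paper: the function names self_information and entropy are introduced here for illustration, and logarithms are taken to base 2, the convention under which each of two equally likely messages carries exactly 1 unit of information.

```python
import math

def self_information(p):
    """Information carried by a message of probability p: log2(1/p)."""
    return math.log2(1 / p)

def entropy(probabilities):
    """Average information of a message set: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities)

# Two equally likely messages, as in the worked example above.
message_set = [0.5, 0.5]
print([self_information(p) for p in message_set])  # [1.0, 1.0]
print(entropy(message_set))                        # 1.0

# A less uniform message set: the likelier message carries less
# information, and the average H of the set falls below 1.
skewed = [0.9, 0.1]
print([round(self_information(p), 3) for p in skewed])  # [0.152, 3.322]
print(round(entropy(skewed), 3))                        # 0.469
```

The second message set illustrates the other half of the claim in the text: the more probable a message is in the first place, the less information its transmission carries, and the average H of the set drops accordingly.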
