Dualism Review, Vol. 2, pp.11-21 (2017)

 

Paul Løvland

 

 

Notes on Information,
Entropy, Energy, and Mind

Mental energy – does it exist? Was Freud right?

 

The notion of mental or psychic energy has been employed in psychology since the late 19th century. It is an integral part of Freud's metapsychology, but has been heavily criticised for lacking scientific proof. In the present paper I argue for an analogy between part of this mental energy and physical energy. My method is to compare uncertainty, as defined in information theory, with physical entropy. It turns out that the former is quantitatively analogous to the latter and can be named «mental entropy». When this entropy is employed in an energy equation from chemical thermodynamics, we obtain a mental energy that is part of the total mental energy, comparable to motivation and to Freud's energy.

 

1          Introduction

Many psychologists, psychoanalysts, and even philosophers have been criticised for using the notion of mental or psychic energy, which has often been dismissed as a mere metaphor of the «phlogistic» sort. The purpose of the present paper is to find a scientific way to explain the existence of this kind of energy. My method is to combine information theory with thermodynamics, presupposing that information is a mental entity.

 

The notion of mental energy has not been explicitly and thoroughly treated by philosophers in the past. But fortunately, there are some philosophies where energy is described as will or force: Leibniz with his constant force, Schopenhauer with his will and representation, Nietzsche with his will to power, Spencer with his fundamental force. Ranheimsæter (1962, p. 254) holds that Schopenhauer's will is unconscious and in agreement with Freud's theories of the Id, and that Schopenhauer thus became a forerunner of psychoanalysis, although Freud probably would not have accepted this connection.

 

In contemporary philosophy, Hart's thorough analysis of psychic energy, belief, desire, quantity, action, and causation is a milestone (Hart 1988). He even discusses the conversion of pure wishful thinking into belief, and he presents an imaginary quantitative law for it (p. 129). Hart summarises: the crux of an economic (energetic) model, in the present state of the art, is whether or not we can make sense of the possibility of psychic energy and, what is necessary for that in the light of our discussion of causation, its conservation at least through wholly intrapsychic processes.

 

Later in the present paper I try to respond to his crux, at least with respect to the quantity of mental energy. Moreover, I have previously suggested a model for mental processes based on irreversible chemical thermodynamics and statistical mechanics, in which motivation, thoughts, and emotions were considered analogous to energy and entropy (Løvland 2006).

 

During the second half of the 20th century an extreme philosophy of the opposite kind has had, and still has, a surprising influence: physicalism (or materialism). It is well described by Kim (2008) and asserts that the mind is matter and behaves in accordance with the laws of physics. This hard monism is metaphysically grounded and difficult to explain logically, but its widespread popularity has made Freud's metapsychology harder to defend.

 

2          Mental energy in Freud's metapsychology

 

In the early part of the last century quite a few psychologists and physicians were occupied with the physical sciences; the school of the famous physicist von Helmholtz, for example, was often consulted. Freud (1915/1991) held that unconscious «forces» such as instincts, needs, wishes, libido, and cathexes work as a source of potential energy for both primary and secondary mental processes. (See especially his paper on the vicissitudes of instincts.)

 

The Penguin Dictionary of Psychology (Drever, 1952/1964) defines cathexis as the accumulation of mental energy on some particular idea, memory, or line of thought or action.

 

In his paper on repression Freud assumes that an idea (Vorstellung) or a group of ideas is «cathected with a definite quota of psychical energy (libido or interest) coming from an instinct». He says that, with the repression of an instinctual representative, clinical observation obliges us to divide what we have hitherto regarded as a single entity into two parts: the idea and the instinctual energy linked to it. This is what Ricoeur (1970, p. 92), in his brilliant interpretation of Freud, has in mind when combining two universes of discourse: one relating to meaning and ideas, subject to interpretation, and one relating to force or energy, subject to explanation.

 

In addition to Freud, Jung (1928/1973) and Harding (1948/1973) have also given valuable contributions to the understanding of mental energy.

 

Even though the existence of mental energy has been found quite probable by many scholars, no proof has been presented. It is mostly regarded as an intuitive entity, and this has of course tempted several scientific researchers to phrase devastating criticisms of the concept. Ricoeur (1970, p. 344) has summarised some of these criticisms, which focus on the following major points:

 

-     Epistemologists, logicians, semanticists, philosophers of language have generally come to the conclusion that psychoanalysis does not satisfy the most elementary requirements of a scientific theory, e.g. it lacks empirical verification.

-     Psychoanalysis is not a science of observation: it is an interpretation, more comparable to history than to psychology.

-     The energy notions of Freudian theory are so vague and metaphorical that it seems impossible to deduce from them any determinate conclusions.

 

Rycroft (1968, p. 43), in his critical dictionary of psychoanalysis, says that Freud's theory of energy has little to do with the concept of energy as used by «the other natural sciences», but is really a theory of meaning in disguise.

 

In the following, physical entropy is used to establish a quantitative analogy between this physical entity and a mental one.

 

3          Quantity of mental energy

 

3.1       Amount of information

 

Attneave (1959) has presented an excellent explanation of the information theory for use in mental processes, and the reader is referred to him for a fundamental understanding of the subject.

 

Information can only be gained if one is to a certain extent ignorant or uncertain. It may therefore be defined as «that which removes or reduces uncertainty» (Attneave 1959, p. 1). In order to quantify uncertainty we can consider a subject in a choice situation, where he can choose between several alternatives. His job is to reduce the number of alternatives one by one. In this way we get a numerical measure of the reduction. Information theorists apply the old parlour game of «Twenty Questions» to carry out this job, which for our purpose can be described as follows:

 

The alternative to be found is known by a subject who answers questions from a panel of players in the game, but he can only answer with one of two words, either Yes or No. Gradually the players eliminate alternatives to finally arrive at the relevant one. If we designate the two words with digits, 1 for Yes and 0 for No, we get two possible digits for each question, that is, a binary digit or bit, as it is usually called. Each question is thus associated with two possible alternatives and one bit. If we need, e.g., six questions to select the wanted alternative, we have 2^6 = 64 possible alternatives, which constitute the uncertainty that must be removed to gain the needed information in bits. Generally then (Attneave 1959, p. 4):

 

            m = 2^H        which gives

 

            H = log2 m                                                                                    (1)

 

where m is the number of equally likely alternatives, and H is the amount of uncertainty expressed in bits. When uncertainty is eliminated and information is at a maximum, H expresses the amount of information gained (see §3.4).

 

Eq. 1 is of central importance and well established in information theory.

 

In some cases it is helpful to express information in terms of probability instead of alternatives. Probability is best described with an example: throwing a die gives you 6 possible sides, or alternatives; thus the chance or probability of getting one particular side is

 

            p = 1/6,           or, generally    p = 1/m,          thus     m = 1/p          

 

where m, as above, is the number of possible alternatives. When 1/p is substituted for m in eq. 1 we get

 

            H = log2 (1/p) = − log2 p                                                                                                   (2)

 

The negative sign means that the uncertainty H is reduced, and the information goes up (see §3.4), as the probability p goes up.
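To make the arithmetic of eqs. 1 and 2 concrete, the following minimal sketch computes the uncertainty of the Twenty Questions example both from the number of alternatives and from the probability. It is written in Python purely for illustration; the language and variable names are my own choices, not part of information theory itself.

    import math

    # Uncertainty from the number of alternatives (eq. 1): H = log2(m).
    m = 64                    # the Twenty Questions example above
    H = math.log2(m)          # 6.0 bits, i.e. six Yes/No questions

    # The same quantity from probability (eq. 2): H = -log2(p), with p = 1/m.
    p = 1 / m
    H_from_p = -math.log2(p)  # also 6.0 bits

    print(H, H_from_p)        # 6.0 6.0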

           


3.2       Amount of statistical entropy

  

Eq. 1 may be compared with the statistical entropy of a physical system, which is

 

            S = k ln mphys     (Boltzmann–Planck formula; e.g. Brillouin 1971, p. 120;

                                   Fast 1962, p. 61; Prigogine 1954, p. 45)                  (3)

 

where mphys, called P by Brillouin and Ω by Prigogine, is the number of «elementary complexions», as Planck called it. mphys can also be explained as the number of possible microstates or configurations of atoms or molecules. Fast (1962, p. 3) explains it as «the possibilities of realisation or microstates of the thermodynamic state». mphys increases with more structural disorder. k is Boltzmann's constant, which links the number of microstates mphys with the thermodynamic or calorimetric entropy expressed in energy units (k ≈ 1.4×10^-23 J/K).

 

In a normal physical process mphys represents an enormous number of molecules or possible microstates, which have to be treated statistically. A tiny portion of energy, related to Boltzmann's k, is attributed to each molecule, and the sum becomes the statistical entropy.

 

How is this statistical entropy related to probability? Imagine an isolated box with two kinds of perfect gases, say neon and helium, where there are no forces between the molecules. A spontaneous process from state 1 to state 2 in this box is described below:

 

State 1: All of the neon molecules are gathered at one end and all of the helium molecules at the other end of the box. This is a highly improbable and highly ordered state, where the two kinds of molecules have very little chance of mixing with each other. Thus there is a very low number of possible microstates, i.e. mphys is extremely low, and the statistical entropy is consequently low too (cf. eq. 3).

 

Over time the freely moving molecules gradually mix, and the possibility for molecules of one type to mingle with the other increases. This means that the number of possible microstates goes up, so that mphys and S also go up.

 

State 2: At a certain moment the mixing process comes to an end, and all the molecules can freely mingle with the other type, so that the number of possible microstates mphys is at its maximum. This is also the equilibrium state, which is the most probable one and in which the entropy is highest (cf. eq. 3). This is a simple statistical description of the 2nd law of thermodynamics, viz. that the entropy of an isolated system with no interacting parts tends to a maximum and to the highest possible probability. If this spontaneous process is in some way reversed, the values of mphys, p, and S are all reduced.
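The mixing process can be illustrated numerically with a toy model; the following Python sketch counts microstates on a small lattice. The 20-site box and the molecule counts are my own illustrative assumptions (a real gas has astronomically more microstates), but the entropy formula is eq. 3.

    import math

    k = 1.380649e-23  # Boltzmann's constant, J/K

    def entropy(m_phys):
        """Statistical entropy S = k ln(m_phys), eq. 3."""
        return k * math.log(m_phys)

    # Toy box: 20 sites holding 10 neon and 10 helium molecules.
    N, n_neon = 20, 10

    # State 1: all neon at one end, all helium at the other -> 1 arrangement.
    m_state1 = 1

    # State 2: fully mixed, neon may occupy any 10 of the 20 sites.
    m_state2 = math.comb(N, n_neon)  # 184756 arrangements

    print(entropy(m_state1))  # 0.0        J/K (ordered state, lowest entropy)
    print(entropy(m_state2))  # ~1.67e-22  J/K (equilibrium, maximum entropy)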

 

3.3       Comparison of information and entropy

 

The 2nd law describes a process that is the opposite of gaining information. In the latter, the probability increases when the number of alternatives is reduced (§3.1). If the number of microstates in the physical process is reduced, the probability decreases (§3.2). Thus we see qualitatively that the two processes are not analogous. This is shown in table 1, where the processes are compared on equal terms, i.e. both mphys and m are decreasing. We see that increasing information corresponds to decreasing entropy.

 

Table 1

 

                       Information                            Corresponding
                       gaining                                physical process

                       Decreasing m                           Decreasing mphys
                       (eq. 1)                                (eq. 3)

                       Increasing p                           Decreasing p
                       (eq. 2)                                (§3.2, reversed 2nd law)

                       Decreasing                             Decreasing entropy S
                       uncertainty Hth                        (eq. 3)
                       (eqs. 1, 2; §3.4)

                       Increasing
                       information H
                       (eq. 1; §3.4)

 

3.4       Mental entropy

 

In order to interpret eq. 1 we must consider the whole process of gaining information (see §3.1). It starts with a number of alternatives, an amount of uncertainty, and ends with the selected alternative, the information. Thus we must divide the interpretation into two parts.

 

 

i) Before the process starts:

Eq. 1, H = log2 m, expresses the amount of uncertainty. It corresponds to S = k ln mphys (eq. 3), the physical entropy. H (uncertainty) is then analogous to S in this equation and can be named «mental entropy», designated Hth (th refers to the thermodynamically construed H). The relations are clarified in table 1. Hence:

 

            Hth = log2 m                                                                                     (4)

 

ii) After the alternatives are removed:

Eq. 1 expresses the amount of information. It has the same number of alternatives and the same value as Hth, but is interpreted conversely to the latter, i.e. information is the opposite of uncertainty: the more information, the less uncertainty, and vice versa (see also §3.1). If m is increased by one alternative before the process starts, the uncertainty Hth goes up accordingly. But this extra alternative has to be eliminated, subtracted, when information is to be gained, so that H gets a negative sign in relation to Hth (table 1):

 

            Hth = −H                                                                                   (5)

 

This is a quantitative relation in which increased information H means less uncertainty and less mental entropy Hth. The two interpretations of eq. 1 are also explained in another context (Løvland 2017).

 

The significance of the whole process is illustrated with the example in table 2, where m, as before, is the number of alternatives. The process starts with 6 possible cases, alternatives m and microstates mphys (6 microstates are merely symbolic, since physical matter consists of an extremely high number of these states). The process ends with 1 selected alternative and 1 microstate, i.e. maximum information and no uncertainty or entropy. The amounts of mental and physical entropy are calculated according to eq. 4 and eq. 3, respectively.

 

                                                          Table 2

 

            No. of cases               Mental entropy Hth                 Phys. entropy S

            Initial     6              Hth = log2 6                       S = k ln 6
            Final       1              Hth = log2 1                       S = k ln 1

 

This shows what I explained above in the present subsection: reducing the number of possible cases diminishes the mental and the physical entropy while information is gained. The mental entity Hth in bits is quantitatively analogous to the physical entity S in energy units.
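The entries of table 2 can be computed directly from eqs. 4 and 3; a short Python sketch, again my own illustration:

    import math

    k = 1.380649e-23  # Boltzmann's constant, J/K

    for m in (6, 1):             # initial and final number of cases
        H_th = math.log2(m)      # mental entropy in bits, eq. 4
        S = k * math.log(m)      # physical entropy in J/K, eq. 3
        print(m, H_th, S)
    # 6 -> H_th ~ 2.585 bits, S ~ 2.47e-23 J/K
    # 1 -> H_th = 0 bits,     S = 0 J/K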

 

3.5       Entropy and energy

 

My initial question was: does «mental energy» exist? If it does, I contend that there must be a «mental entropy» to satisfy the energy equations of chemical thermodynamics, e.g. for systems at constant temperature where only energy and entropy can be exchanged with the surroundings, as in the following differential equation:

 

            dF = dU − T dS          (Prigogine 1954, p. 36; Fast 1962, p. 90)     (6)

 

where F is the (Helmholtz) free energy, U is the internal energy, T is the absolute temperature, and S is the thermodynamic, calorimetric entropy, which is equal to the statistical entropy when Boltzmann's constant is applied. F is the driving or potential energy of the process, which may perform work on, or eject heat to, the surroundings; it can be increased if energy is supplied to the system from outside.

 

In an ordering process, where the number of possible microstates mphys, and therefore S, diminishes, dS becomes negative, dF positive, and F higher. If there are no forces between the constituents, and no energy is exchanged with the surroundings, U will be constant and dU zero. Thus we can simplify our analysis by applying the shorter equation:

 

            dF = −T dS                                                                                      (7)

 

In some processes energy may be exchanged with the surroundings, making dU different from zero and thus affecting dF. However, this only disguises the effect of dS, which remains the same, unaffected by dU. And dS is the major point of our analysis.

 

We see that a change in the energy F equals the negative of the change in the entropy S multiplied by the constant T, both sides expressed in joules. Thus energy varies inversely with entropy in similar units.

 

Since mental entropy is analogous to physical entropy, and the latter is related to energy, it is reasonable to presuppose that mental entropy relates to mental energy in a similar way; generally, in our context, every sort of energy requires a corresponding entropy.

 

In mental processes the constant T is not applicable, since it relates only to physical units, but there may be a factor in the mental domain similar to T that we have to apply in order to obtain a complete analogy. Let us call it M, a positive constant that may analogously be called «mental temperature». The mental entropy Hth is directly determined by the number of alternatives and is measured in bits. We can then present our «mental» equation by substituting Hth for S in eq. 7 as follows:

 

            dFana = −M dHth                                                                              (8)

 

where Fana is the «mental free energy», likened to motivation; this energy in bits is analogous, but not identical or equivalent, to the physical free energy in joules, i.e. equations 7 and 8 are analogous. We see that lower amounts of mental entropy result in higher amounts of mental free energy, just as in physical processes. In other words: higher amounts of information, which mean lower uncertainty, give more intensive motivation. A preliminary experiment described in §5 indicates that this is the case. Could this favour Freud's understanding of mental energy?
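As a numerical illustration of eq. 8, the following sketch evaluates the change in mental free energy when six alternatives are reduced to one, as in table 2 and the experiment of §5. The value M = 1 and the function name are my own assumptions; the text leaves the magnitude of the «mental temperature» open.

    import math

    M = 1.0  # hypothetical «mental temperature»; its magnitude is assumed here

    def delta_F_ana(m_initial, m_final):
        """Finite-difference form of eq. 8: dF_ana = -M * dH_th."""
        dH_th = math.log2(m_final) - math.log2(m_initial)  # change in mental entropy, eq. 4
        return -M * dH_th

    # Reducing six alternatives to one:
    print(delta_F_ana(6, 1))  # ~ +2.585 -> mental free energy (motivation) rises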

 

It is worth noticing that Hth is equal to −H (eq. 5), which gives

 

            dFana = M dH                                                                                   (9)

 

Calling the mental free energy simply mental energy, I suggest a definition of the latter:

 

            The amount of mental energy in bits is analogous to physical energy and is proportional to the amount of corresponding information in bits.

 

4          Remarks

 

The mental free energy Fana described in this paper is the objectively created part of the total mental free energy, which is likened to total motivation and to Freud's mental energy. The objective part relates to numbers of possibilities and to order-disorder changes in the mental structure. Psychologists speak solely of subjective emotional forces, but I contend that Freud's energy, and thus the total mental free energy, consists of both a subjective and an objective part (Løvland 2006). In any case my calculations suggest that mental energy exists, whatever one wants to compare it with in psychology, and that it is a mental entity separate from, but analogous to, the physical one.

 

Having contended that mental energy exists and can be quantified, an interesting question follows: can this energy influence brain functions, or be transmitted to the brain? This vast topic lies outside the present scope, but I will mention that the will may affect electrical potentials in the brain without any form of energy being transmitted from the mind (Løvland 2009). The needed energy comes from the body or brain itself, due to a diminution of physical entropy in the brain.

 

 

 

5          Experimental

 

The present hypothesis can be tested experimentally with a psychological method that can measure «mental free energy». The latter drives a mental process and is characterised by the intensity of motivation and aroused emotions. Such arousal can be measured with a special version of Osgood's «semantic differential» (Osgood, Suci, and Tannenbaum 1957/1978), which is primarily a psycholinguistic method:

 

A subject is shown a picture of concepts (words) that is associated with a set of bipolar adjectives such as weak-strong, soft-hard, and white-black. Each bipolar adjective is evaluated and given a ranking number related to the emotional intensity that it creates. An average ranking number is calculated for the whole picture, a number that represents the motivation, or the mental free energy, created by that picture.
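The calculation of the average ranking number can be sketched as follows; the scales, the rating range, and the numbers are hypothetical placeholders of my own, not data from the experiment described below.

    # Hypothetical ratings of one picture on Osgood-style bipolar scales,
    # each scored here on an assumed 1-7 range.
    ratings = {
        "weak-strong": 5,
        "soft-hard":   4,
        "white-black": 6,
    }

    # The average ranking number is taken as the index of motivation,
    # i.e. of the mental free energy created by the picture.
    average_ranking = sum(ratings.values()) / len(ratings)
    print(average_ranking)  # 5.0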

 

Løvland (unpublished) carried out a preliminary experiment in which a subject was shown two pictures, first one with 6 identical words randomly distributed, and quickly afterwards another with only one of the same words. The number of words was thus reduced from 6 to 1. I carefully minimised the influence of affective factors such as the meaning of the words, and evaluated the effect of several such changes of pictures on the average ranking numbers. I found that this number increased by some 30% owing to the reduction of the number of words. This preliminary result corroborates the present hypothesis that lowering the mental entropy raises the mental free energy/motivation. The result concerns the objective part of total motivation and of the total mental (free) energy.

 

In a future experiment it would be interesting to study the relation between mental entropy and cerebral electrical activity.

 

6          Conclusions

 

The amount of information as calculated by information theory is not analogous to physical entropy; when the former is reduced, the latter increases.

 

On the other hand, when uncertainty is calculated according to information theory, it turns out to be quantitatively analogous to statistical physical entropy and can be named «mental entropy».

 

Substituting this «mental entropy» into an equation for free energy according to chemical thermodynamics, we obtain the analogous «mental free energy» in bits, which is proportional to the amount of information.

 

This mental energy is the objective part of the total mental free energy, which is likened to total motivation and is similar to Freud's mental energy.

 

The comparison of mental entropy and energy with the corresponding physical entities indicates that the former behave like the physical ones, but are not identical to them. Can we then say that mental energy exists?

 

It is possible to test the hypothesis experimentally by employing a psychological method based on the work of Osgood et al.

 

                            Paul Løvland

Email: kloevlan@online.no

Oct 17, 2017

References

 

Attneave, F. (1959). Applications of Information Theory to Psychology: A Summary of Basic Concepts, Methods, and Results. New York: Henry Holt & Co.

Brillouin, L. (1971). Science and Information Theory. New York: Academic Press Inc. (First printed 1962)

Drever, J. (1964). A Dictionary of Psychology. Harmondsworth: Penguin Books Ltd. (First published 1952, rev. ed. 1964)

Fast, J.D. (1962). Entropy. Eindhoven (Holland): N.V. Philips.

Freud, S. (1991). On Metapsychology: The Theory of Psychoanalysis. The Penguin Freud Library, Vol. 11. Harmondsworth: Penguin Books. (Original work published 1915)

Harding, M.E. (1973). Psychic Energy. Princeton: Princeton University Press, Bollingen Paperback Edition. (Original work published 1948)

Hart, W.D. (1988). The Engines of the Soul. Cambridge: Cambridge University Press.

Jung, C.G. (1973). On Psychic Energy. In W. McGuire (Ed.), R.F.C. Hull (Trans.). Collected Works of C.G. Jung, Bollingen Series XX (Vol. 8, 3rd paperback printing). Princeton: Princeton University Press. (Original work published 1928)

Løvland, P. (2006). On explanation, interpretation, and natural science with reference to Freud, Ricoeur, and Von Wright. In A. Batthyany & A. Elitzur (Eds.), Mind and Its Place in the World (pp. 225-244). Frankfurt: Ontos Verlag.

Løvland, P. (2009). On thermodynamics, mind, and the cerebral readiness potential: A novel aspect of the mind-brain relationship. In A. Batthyany & A. Elitzur (Eds.), Irreducibly Conscious (pp. 163-178). Heidelberg: Universitätsverlag WINTER.

Løvland, P. (2017). Physicalism Begs the Question and Violates the 2nd Law of Thermodynamics. Dualism Review, 2, pp. 1-10. (Online)

Osgood, C.E., Suci, G.J., & Tannenbaum, P.H. (1978). The Measurement of Meaning. Urbana: University of Illinois Press. (Original work published 1957)

Prigogine, I. (1954). Chemical Thermodynamics. London: Longmans Green and Co.

Ranheimsæter, H. (1962). Arthur Schopenhauer. In E. Skar & H. Winsnes (Eds.), Vestens tenkere II (p. 254). Oslo: H. Aschehoug & Co. (In Norwegian)

Ricoeur, P. (1970). Freud and Philosophy. An Essay on Interpretation. (D. Savage, Trans.). New Haven: Yale University Press.

Rycroft, C. (1979). A Critical Dictionary of Psychoanalysis. Harmondsworth: Penguin Books Ltd. (Original work published 1968)