Information as a flow of form in mass-energy systems

Summary

The notion of information emerged in the mid-twentieth century, not only in cybernetics and communication theory, but also in the natural sciences, including physics, chemistry, and biology. An interesting approach to information is presented by catastrophe theory, which laid the foundations for the notion of information as a flow of form in totally mass-energy systems, identified as such by Wiener in the 1950s. It follows that the fundamental creative and control mechanisms of the universe lie in its use of forms as information.

Keywords: information, entropy, catastrophe theory, form

1. Introduction

According to modern science, the universe is a mass-energy system. Its origins, destiny, and many of its inner mechanisms remain unknown, but the mass-energy nature of all events and objects is an a priori requirement of the scientific view. Among science's major failings, however, is its inability to define mind, consciousness, and mental phenomena (whether in humans or others) in entirely mass-energy terms. The best it has been able to achieve so far is to characterize mind as a mass-energy system that manipulates or processes something we call information. Contrary to much popular belief, however, information turns out to be a phenomenon with no clear-cut, general meaning, one that has never been fully defined for all its uses. So, enticing as it may seem to think of ourselves as information processors, characterizing mental events as information processes has no value, scientific or otherwise, until we understand the nature of whatever it is we call information.

This article provides such an understanding, defining information in entirely mass-energy terms in all senses in which it is or could be used. This could make possible a unification of mind and matter and enable us to approach the universe in its entirety as a wholly mass-energy system. The more detailed objectives of this article include a brief exploration of the concept of information, the core ingredient of all communication and control processes in living and nonliving systems. This will show that in each and every case what we know as information is the precise equivalent of what traditionally has been referred to as form.

2. The emergence of information

We are all information-dependent, inescapably embedded in a universe of flows - not only of matter and energy but also of whatever it is we call information.

Something called information is now a basic descriptive concept, not only in communication theory, cybernetics, and computer sciences but also in some of the most important areas of physics, chemistry, biology, and psychology, particularly psychobiology and neuroscience. It is found in electromagnetic-field disturbances, sound waves, atomic excitations, and other physical phenomena; in the structure and function of chemical molecules; in the sensory activities and behavioral displays of animals; in the activities of neurons, nervous systems, and brains; and in the entire complex of psychological or mental processes. All cognitive functions - including perception, knowledge, thought, learning, and memory, as well as emotion, volition, consciousness, and the entire phenomenon of mind - are now generally characterized by scientists as information-processing activities. So widespread is its influence in contemporary science that information has come to be viewed by many as a fundamental universal phenomenon alongside and related to matter and energy.

P. Young argues that if we knew exactly what the word information meant in every area in which it was used, the universe should be characterized as "a mass-energy information-processing system" [18]. Although surprising mathematical similarities have been uncovered between aspects of energy and aspects of information, outside its technical uses in the mathematics of communication theory the word has come to mean virtually whatever anyone using it wants it to mean. Generally, however, the term information seems to be used primarily as a figure of speech, implying only the obvious, namely, that what is taking place is something we call communication between mass-energy systems.

According to the Oxford English Dictionary, the earliest historical meaning of the word information in English was the act of informing, or giving form or shape to the mind, as in education, instruction, or training. A quote from 1387: "Five books come down from heaven for information of mankind" [7]. The English word was apparently derived by adding the common "noun of action" ending "-ation" (descended through French from Latin "-tio") to the earlier verb to inform, in the sense of to give form to the mind, to discipline, instruct, teach - another quote, from 1330: "Men so wise should go and inform their kings" [7]. Inform itself comes (via French) from the Latin verb informare, meaning to give form, shape, or character to, and therefore to be the formative principle of, or to imbue with some specific character or quality. Furthermore, Latin already contained the word informatio, meaning concept or idea, but the extent to which this may have influenced the development of the word information in English is unclear. As a final note, the ancient Greek word for form was eidos, and this word was famously used in a technical philosophical sense by Plato (and later Aristotle) to denote the ideal identity or essence of something [5].

Contemporary dictionary definitions provide versions of what we all think information means - something to do with knowledge. When we receive, store, or transmit information, we appear to be dealing in knowledge (a piece of information is considered an item of knowledge). Since the late 1940s the word has also become associated with the concepts of signal and message, the central ingredients of communication processes, resulting largely from the formalization of these ideas in the mathematics of communication theory and cybernetics. This approach was developed by C. Shannon and N. Wiener, discussed later in this article. When we send a signal or message, however, even though the physical units employed, such as patterns of radiation or chemical molecules, do not embody meaningful human knowledge, we are still manipulating knowledge in symbolically coded forms. Even an inherently meaningless signal, by being able to produce knowledge in some receiver, is an aspect of a knowledge process. In one way or another, signals, messages, and information are inextricably bound up with whatever it is we call knowledge and, of course, with the systems that code and decode this knowledge.

Widespread usage of the word "information" in science did not even begin until the late 1940s and early 1950s, after which it began to appear in many areas of scientific description. In physics, although J. C. Maxwell published his field equations describing electromagnetic waves in 1873, Hertz produced the first radio signals in the late 1880s, Marconi built the first radio transmitting/receiving equipment in 1895, and the first crude television systems were produced by Baird in England and Jenkins in the United States in 1923, radio and television transmissions were not generally referred to as information processes until the 1964 publication of McLuhan's "Understanding Media," in which he heralded the age of "electric information."

In biology, particularly biochemistry, there was a similar change. In 1871, twelve years after the initial publication of "The Origin of Species," Charles Darwin published "The Descent of Man, and Selection in Relation to Sex," in which he drew on his theory of "pangenesis" to account for the transmission of hereditary characteristics, proposing gemmules as the organs or atoms of the transmission process. In 1893, August Weismann developed his germ-plasm theory, in which germ-plasm, carrying hereditary characteristics, is transmitted from generation to generation. More than fifty years later, in 1943, Oswald Avery and his colleagues demonstrated that the active component of genetic material was the DNA molecule, yet it was not until ten years after that, in 1953, following publication by Watson and Crick of their now classic paper on the structure of DNA (in which the term information appeared only once), that the word came into vogue in biology to characterize the storage and transmission of hereditary characteristics [15]. Since then it has been widely used by biologists to describe activities as diverse as the genetic code, sensory perception, animal gestural and behavioral displays, and more.

The same sort of transition occurred in psychology. From the early days of scientific psychology in the late nineteenth century onward, various schools dominated - Freudians, Gestaltists, and behaviorists competing for the central role - although by the 1950s this approach had largely given way to studies of individual areas and problems. Yet here again, the word information was not used in psychology in any systematic way until the 1950s. After some initial attempts to use the Shannon-Wiener measure to quantify psychological activities, most of which did not bear significant fruit owing to the inherent limitations of the mathematics and the complexity of psychological processes, use of the mathematics and of the term information itself seemed to reach a peak by the early 1960s, then fell into a somewhat fallow period, emerging again in the 1970s and 1980s as a basic descriptor of a vast number of both the physical and the mental activities studied by psychology and its related disciplines.

So the question is: why did this broad, scientific use of the word information develop in so many areas around the late 1940s and early 1950s? There are undoubtedly many contributing factors, but the major reasons can probably be reduced to three:

1. The development in 1948 of what has come to be called the Shannon-Wiener measure H for the "amount of information" in a message, which seemed to provide a scientific and technological identity for information as a result of its appearing to have been given a rigorous mathematical treatment (which, it turned out, was not the case);

2. The fact that this measure, in addition to other evidence linking information and entropy, suggested a deep-seated relationship between information and energy; and

3. The concurrent development of electronic computers, the first of which became commercially viable during the 1940s, and which manipulated data that, under certain conditions, could be turned into information.

The word information seemed to come from nowhere, appearing suddenly not only in a major mathematical expression, but in very short order in a slew of other scientific descriptions. It was a word that seemed to fit almost anywhere, able to describe a vast array of diverse physical, chemical, biological, and psychological activities. It was a new, exciting scientific concept.

3. Information and entropy in communication theory and computer science

With the development of telegraphy in the nineteenth century, particularly through the work of Samuel Morse, the reliable transmission of intelligence or information became important, and engineers began to explore methods by which a signal or message could be made more reliable in the presence of noise - electrical or other interference with the signal.

The work of a succession of scientists, including Nyquist, Hartley, Fisher, Gabor, and others, paved the way for the almost simultaneous publication in 1948 of Norbert Wiener's book "Cybernetics" and Claude Shannon's paper "A Mathematical Theory of Communication," each of which provided essentially the same expression for something called the "amount of information" in a message. Importantly, "the word information, in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning" [12]. In communication theory an actual message is selected from a known set of possible messages and measured probabilistically by the degree to which it reduces the uncertainty about what selections could have been made.
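
Written out explicitly (this is Shannon's standard formula [11], reproduced here for reference rather than an expression original to this article), the measure is

H = -\sum_{i=1}^{n} p_i \log_2 p_i ,

where p_i is the probability that the i-th message of the ensemble is selected and H is expressed in bits; H is maximal when all n messages are equally likely.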

In fact the word information as used in communication theory differs from its use as associated with knowledge in several important ways:

• it has nothing to do with meaning,

• it refers not to a single message, but probabilistically to an entire ensemble of possible messages (the amount of information generated varies according to the size of the ensemble from which the message was selected),

• the ensemble must be unambiguously understood by both sender and receiver,

• it is associated with uncertainty - that is, with the prior state of knowledge of the receiver; each bit of information reduces uncertainty (as to what selections could have been made from the ensemble) - more prior uncertainty, more information, as the short sketch below illustrates.
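
A minimal sketch in plain Python (the example ensembles are hypothetical; only the formula itself is Shannon's):

import math

def shannon_entropy(probabilities):
    # H = -sum(p * log2(p)) over the ensemble, in bits per message.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages: maximal prior uncertainty, H = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Four messages, one almost certain: little prior uncertainty, hence
# little information in the Shannon-Wiener sense.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # about 0.24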

And so, communication theory is an engineering theory through and through, not a theory of knowledge. Weaver comments, "The concept of information developed by this theory at first seems disappointing and bizarre - disappointing because it has nothing to do with meaning, and bizarre because it deals not with a single message but rather with the statistical character of a whole ensemble of messages, bizarre also because in these statistical terms the two words information and uncertainty find themselves to be partners" [12].

Aside from any other consideration, the Shannon-Wiener formula is a measuring device, and so to equate H with information in its general sense is to confuse a measuring device with what it measures. A formula that measures the number of apples in a barrel obviously is not the same as the apples. This has been one of the details most commonly overlooked by those who co-opted the term from the mathematics of communication theory. Despite this, despite the fact that the mathematical usage of the word is highly technical and entirely different from its general, linguistic usage, and despite warnings by Shannon that the "semantic aspects of communication are irrelevant to the engineering problem" [11], a warning echoed by others in the field, scientists from numerous disciplines have flocked to the word in droves since the 1948 publication of the works of Wiener and Shannon, using the mathematics and/or the word alone to describe whatever it was they were studying.

A second reason (the first being the development of telegraphy and successive communication disciplines) for the dynamic rise of the information concept has to do with the fact that the Shannon-Wiener measure H for the amount of information in a message bears a striking formal similarity to the mathematical expression S for entropy (they are essentially the reverse of one another), a fact that contributed to Shannon's decision to call his measure the "entropy of the signal." This fundamental information-entropy connection was also arrived at independently by another route, and the combination of these results gave credibility and force to the idea that information was to be considered an energetic, rather than an abstract, phenomenon, heightening its appeal to scientists in all sorts of fields.

This alternate route developed through the analysis of problems in thermodynamics, the study of heat, rather than through those of telegraphy or other communication areas. The second law of thermodynamics, known as the entropy law, describes the dissipation of energy from closed thermodynamic systems. Formulated in 1850 by the mathematical physicist Rudolf Clausius, its best-known statement has been in the form, "Heat cannot of itself pass from a colder to a hotter system" [2]. While work can be derived from the heat of a body whose temperature is higher than its surroundings, none can be obtained from one whose temperature is lower. The less energy available for work, the higher a mathematical quantity called entropy (from the Greek word meaning transformation). The entropy of a thermodynamic system is, therefore, a mathematical expression of the amount of unavailable energy in such a system. According to this statement, in any closed system the entropy never diminishes; available energy is continually being lost. Another important dimension of the law of increasing entropy is that it implies a direction in which all universal processes take place - from lower to higher entropy states - involving a fundamental asymmetry between past and future.

As a result of later work in statistical mechanics by other physicists, particularly Ludwig Boltzmann, the initially abstract identity of the entropy measure was given a physical and probabilistic interpretation. Boltzmann's work showed that entropy could be understood as a statistical law measuring the probable states of the particles in a closed system. According to this interpretation, a mechanical system will approach a state of thermodynamic equilibrium because equilibrium (equalization of pressure, temperature, etc.) is overwhelmingly the most probable state of the system. In statistical mechanics, each particle in a system occupies a point in a "phase space," and so the entropy of a system came to constitute a measure of the probability of the macroscopic state (distribution of particles) of any such system. Boltzmann may have been the first scientist to make an explicit connection between energy and information by noting in 1894 that the entropy of a system is related to "missing information" [9].
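
For comparison (these are the standard Boltzmann and Gibbs expressions from statistical mechanics, quoted here to make the formal parallel visible, not formulas taken from this article's sources):

S = k_B \ln W        and        S = -k_B \sum_i p_i \ln p_i ,

where W is the number of microstates compatible with a macroscopic state, p_i the probability of the i-th microstate, and k_B Boltzmann's constant. The Gibbs form differs from the Shannon-Wiener H = -\sum_i p_i \log_2 p_i only by the constant k_B and the base of the logarithm, which is the "striking formal similarity" noted above.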

The probabilistic interpretation of entropy is one of the cornerstones of the modern relationship between measures of entropy and the amount of information in a message. The connecting concept is that of order or organization. In the known universe the number of disorderly arrangements or states is very large in comparison to the number of orderly ones, so disorderly states are inherently more probable than orderly ones. The entropy law is therefore frequently stated as confirming that when a system containing a large number of particles is left to itself, it spontaneously assumes a state of maximum entropy - that is, it becomes as disorderly as possible. So entropy measures not only the most probable but also the most disordered state of a system.

And then the physicist Erwin Schrödinger proposed the concept of negative entropy, i.e., entropy taken with a negative sign, as a measure of information, the order in a system [4], and Leon Brillouin showed that "negentropy," as he renamed it, was the same as information - information representing a negative term in the entropy of a system [1]. According to Brillouin, before an intelligent being can use its intelligence, it must perceive objects, a process involving physical energies (e.g., light).

Information and energy were thus inextricably linked both mathematically and physically, a fact that, together with the enormous success of the Shannon-Wiener formula, contributed to the acceptance by scientists of the concept of information as a genuine, mathematically definable phenomenon with applications across a range of scientific disciplines.

The third major reason for the emergence of the information concept is its role in the meteoric rise of the computer sciences from the 1940s onward, and the fact that the Shannon-Wiener measure for the amount of information in a message lies at the heart of electronic computing. This association, however, has only further muddled the differences between information in the sense of knowledge and its technical, Shannon-Wiener use.

The Shannon-Wiener measure proved ideal for such systems as computers, because the binary choices it measures equate naturally with the two-state (yes/no) switching devices that are the heart of electronic computer systems. Computers therefore came to be characterized as processing something measured by H and called information, and also as generating knowledge. Upon analysis, however, these characterizations are both inaccurate and deceptive. What flows through computers is not information in the knowledge sense, but symbolically coded patterns; and what is read out from them is only sometimes information in the knowledge sense. As we now know, the symbols measured by H could just as easily be nonsense, as the short sketch below shows.
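
A minimal sketch in plain Python of that last point (the bit string here is randomly generated, purely for illustration): a meaningless string can still score close to the maximum of one bit per symbol.

import math
import random
from collections import Counter

bits = [random.randint(0, 1) for _ in range(10_000)]  # pure noise
freqs = [n / len(bits) for n in Counter(bits).values()]
h = -sum(p * math.log2(p) for p in freqs)  # empirical Shannon-Wiener measure
print(h)  # close to 1.0 bit per symbol, although the string means nothing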

Exactly when in the history of calculating machines the product of their activities became generally referred to as information is not clear, although the shift seems to have coincided with the development of the Shannon-Wiener measure and of electronic components for computers. Regardless of when the information terminology was adopted, suffice it to say that for most of the history of calculating technology there was no formal characterizing of what was being calculated as information, a situation that appears to have undergone a radical shift beginning in the 1940s. From this period onward, the activity of computers came to be known as information processing, and given their enormous influence throughout society, there seems little doubt that their activities, coupled with the success of the Shannon-Wiener measure and the information-energy connection, contributed substantially to the massive and historic rise of the information concept in twentieth-century science and culture.

There are undoubtedly other reasons for the explosion of the information terminology in science, many of them probably having to do with ease of usage. Information is a word that can be used to describe any communicative process, whether we understand it or not - an obvious appeal to scientists and nonscientists alike. Whatever the reasons, information has become one of the glamour concepts of twentieth-century science and culture, despite the fact that, contrary to much belief, information as knowledge has not been given a mathematical or scientific definition, and that its true identity as the central ingredient of the communication activities of mass-energy systems remains largely misunderstood and sorely in need of clarification.

4. Information as a form in physical and chemical systems

Form (Lat. forma), in general, refers to the external shape, appearance, or configuration of an object, in contrast to the matter, content, or substance of which it is composed [6].

Although the Shannon-Wiener formula is not a measure of human knowledge, what it does measure can be interpreted as the amount of patterning or order in a communication channel, and therefore what it measures is in some sense a manifestation of what we call form. In addition, in all information processes in physical, chemical, and biological systems, the information stored, transmitted, or manipulated is identical with one or another of the above definitions of form - shape, structure, configuration, pattern, arrangement, order, organization, or relations - so that whatever information is, it appears to be in all senses a form phenomenon.

Physical energies used for information processing include electromagnetic radiation in the transmission of radio and television signals, air pressure (acoustical) waves that produce sound, and traditional photography and holography, both of which involve the interaction of radiation with chemical storage processes. What these events have in common is that the information they manipulate is found entirely in their wave characteristics - amplitude, wavelength, wave form, frequency, phase, or period, and in the physical or chemical patterns they generate in receptive systems.

Regardless of the energy medium, the information generated at each stage of all these processes is found entirely in the vibratory patterns of electrons, electromagnetic waves, speaker diaphragms, sound waves, or other participating systems, and in the structure and arrangement of chemical molecules in a receptive surface. The only difference between one signal or type of information and another in a given medium is in the form characteristics of the wave disturbances and in the structure and arrangement of the receptive substances involved. All information is coded, transmitted, and received using the same basic mass-energy mechanisms; only the patterns change. What is more, a transfer of energy and/or information can occur in the universe without an actual transfer of matter, without any actual forward movement of the medium itself.
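
A minimal illustration in plain Python of the point about wave characteristics (the carrier frequency, sampling rate, and envelope frequencies are all hypothetical values chosen for the example): two signals on the identical carrier differ only in the form of their envelope.

import math

def am_sample(t, envelope, carrier_hz=1000.0):
    # Amplitude modulation: the carrier physically transports the signal,
    # but the information resides in the envelope pattern alone.
    return (1.0 + envelope(t)) * math.cos(2 * math.pi * carrier_hz * t)

rate = 8000  # samples per second
signal_a = [am_sample(n / rate, lambda t: 0.5 * math.sin(2 * math.pi * 5 * t))
            for n in range(rate)]
signal_b = [am_sample(n / rate, lambda t: 0.5 * math.sin(2 * math.pi * 9 * t))
            for n in range(rate)]
# Same medium, same mechanism, same energy scale; the two signals differ
# only in the wave form impressed on the carrier.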

In contrast to most classes of molecules, which are utilized by organisms for energy, some, including nucleic acids (DNA and at least three types of RNA), hormones, proteins, neurotransmitters, and membrane receptors (most of which are proteins), are characterized as informational molecules, transmitting from place to place not quantities of energy, but information or instructions that activate or deactivate energetic reactions [15]. In all known cases, the information manipulated by these systems is embodied in their shape, configuration, conformation, or structural (form) characteristics, some or all of which can undergo dynamic changes and/or can be transmitted from one system to another.

According to this model, the information contained in the DNA molecule, which is copied and restated in the next generation of DNA during replication, and recoded in several stages of RNA during transcription and translation, is contained solely in the specific order of the base pairs along the chain - that is, in the structural form characteristics of these molecules - and has little to do with their energetic characteristics per se. Many protein chains are coiled into a helix before being folded into their characteristic configuration and conformation. In all these cases, the information stored or transmitted is found only in the precise structural characteristics of the molecule and in dynamic shifts in these characteristics. So vital are the precise form characteristics of a protein to its biochemical identity and function that any major change in its geometry generally reduces or destroys its biochemical activity (its information), even if the order of its amino acids along the chain remains unchanged.

The importance of shape and shape changes in the information-processing activities of proteins can be illustrated by the activity of enzymes. According to Koshland [8], not only enzyme/substrate interactions but the biochemical activity of many - perhaps all - regulatory proteins result from their capacity to undergo alterations in shape under external influences; these proteins include antibodies, many hormones, some neurotransmitters, and membrane receptors, all of which, by undergoing shape changes, may provide the on-off controls characteristic of so many biochemical processes. All these essential regulations rest on the ability of a protein molecule to bend flexibly from one shape to another.

Next, hormones are produced in quantities far too small to supply energy for cellular processes and are thought of as providing instructions or information to cells, usually involving either the acceleration or the inhibition of otherwise ongoing energy processes.

Neurotransmitters are essential constituents of the information-processing activity of nerve cells and nervous systems, and therefore of brains and minds.

Whether in the copying or delivering of precise structural characteristics, the altering of crucially shaped molecular sites, or shifts in an entire geometrical identity, the information manipulated by chemical molecules in all cases appears to result not from their energetic characteristics themselves, but from their geometrical or form characteristics, and from the effects of these on receptive systems.

5. Information as a form in biological and nervous systems

The probable basis from which the complex information-processing activities of all living organisms evolved is the inherent capacity of protoplasm itself to receive and conduct signals (irritability). Information-processing capability in unicellular organisms appears to have the same character as in all its other biological manifestations, embodied either in the precise patterning of the wave characteristics of electrical excitations and/or in the specific structural or geometrical character of the chemical molecules and anatomical equipment possessed by the organism. The information processed at this level is found only in the specific form characteristics of the energy medium in which the informational activity takes place. We must also recognize that information-processing activities came into existence in living organisms long before evolution produced anything remotely resembling a nervous system or brain. This is consistent with the most recent views on systemic intelligence, in which it is assumed that an organism does not even need a brain in order to be intelligent. Intelligence is a property that emerges when a certain level of organization is reached which enables the system to process information. The greater the ability to process information, the greater the intelligence. If a system has the capacity to process information, to notice and respond, then that system possesses the quality of intelligence [16]. Any entity that has capacities for generating and absorbing information, for feedback, for self-regulation, possesses mind. This approach offers us a means to contemplate systemic intelligence or, in a narrower sense, organizational intelligence.

To sum up, nowhere is there the slightest evidence that information flow between or within living systems is embodied in anything other than the form characteristics of the types of mass-energy involved.

All sensation and perception, emotional and motivational activity, all cognition, volition, and consciousness - in fact, the sum total of mental activity, now generally referred to in psychobiology and neuroscience as information processing - results from the functioning of nervous systems, aggregates of nerve cells specialized in the conduction of electrical excitation together with the secretion of chemical substances.

The primary signaling unit of nervous systems and brains is the neuron, a specialized type of cell that derives excitation intrinsically, from the environment, from sensory receptor cells, or from other neurons, and delivers a representation of it as information, signal, or message to other neurons or to the effector systems of the body (muscles, glands, etc.).

Neuronal information processing is an electrochemical activity dependent on the selective properties of the cell membrane. Patterns of electrical excitation are generated by changes in the flow of ions (charged atoms) through pores or channels in the membrane, altering their distribution pattern across the membrane surface. Plastic changes are clearly seen in the alterations of patterns of circuitry within a system consisting of otherwise structurally fixed connections, so that a network of neurons can produce alterations in circuitry, in the flow of electrical activity, through functional changes in the synapses that govern neuronal connections. In many cases these alterations have been shown to result from the past history of a given synapse; that is, synapses can remember and learn.
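
A minimal sketch in plain Python of how a spiking pattern arises (this is the standard leaky integrate-and-fire textbook abstraction, not a model from this article's sources, and all constants are illustrative): the informative output is the temporal form of the spike train, not the energy of any single current.

def lif_spikes(input_current, threshold=1.0, leak=0.95):
    # Leaky integrate-and-fire: the membrane potential leaks, integrates
    # input, and emits a spike whenever it crosses the threshold.
    v, spikes = 0.0, []
    for step, i in enumerate(input_current):
        v = leak * v + i
        if v >= threshold:
            spikes.append(step)
            v = 0.0  # reset after firing
    return spikes

print(lif_spikes([0.3] * 30))       # steady drive: a regular spike train
print(lif_spikes([0.3, 0.0] * 15))  # patterned drive: a different spike train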

Not only are the activities of nerve cells and their aggregates now generally referred to as information-processing events, but most scientists tend to believe that cognitions, emotions, volitions, and consciousness - in fact, all mental events - result from the information-processing activity of nervous systems and their components. Information storage and flow, therefore, is not only an essential characteristic of the activity of nervous systems, but is essential to an understanding of mind. Psychobiologist William R. Uttal writes, "It is the pattern of activity and the organization of the paths of information flow, rather than the mechanics of the neural components, that really matter in the representation of mental process by nervous tissue" [14].

In nervous systems we see, albeit at a far more complex level than in other systems, that every instance of what is referred to as information is once again a manifestation of any of a wide variety of form processes. In all cases - whether in the distribution patterns of ions across a membrane surface, the shape and size of membrane channels, the structure of neurotransmitters, membrane receptors, and other chemical molecules, the precise anatomical structure of a neuron and its processes, the amplitude and/or frequency patterns of electrical excitation constituting action and other neural potentials, the anatomical patterns of interconnection of neurons, or the functional organization of neural circuits and aggregates that form nervous systems and brains - the information is embodied entirely in the form characteristics of the electrical, chemical, and mechanical components of the systems involved. So if our ability to understand mind depends on understanding information and information flow in nervous systems, we can expect this to be possible only when we can define information for all its uses in wholly mass-energy terms; and this, in turn, will not be possible until we understand its relationship to something we call form.

6. Information as a form in catastrophe theory

The idea of information as a form phenomenon was discussed and given some mathematical basis in 1966 by the mathematician René Thom, creator of catastrophe theory, a topological theory that models discontinuous changes that can occur in systems. Topology is a branch of geometry, but whereas Euclidean geometry describes the interactions of static forms - squares, circles, and so on - topology studies the properties of forms that remain invariant under any of a number of transformations, like bending or stretching. Specifically, it is a branch of mathematics that deals with the properties of systems that endure when the system undergoes any of various distortions, provided the system remains intact [3].

Catastrophe theory is a topological theory that can describe or model the various ways in which systems undergo sudden and abrupt changes from one state to another; in particular, it describes the process by which previously smooth or continuous changes can suddenly trigger abrupt (catastrophic) changes of state, a slight change in conditions suddenly producing a large, discontinuous alteration in the behavior of the system [10], as the sketch below illustrates.
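
A minimal numerical sketch in plain Python (this uses the standard cusp-catastrophe normal form from the catastrophe-theory literature, not an example from Thom's text, and the parameter values are hypothetical): the state x sits in a minimum of the potential V(x) = x^4/4 + a*x^2/2 + b*x, and sweeping the control parameter b smoothly produces a sudden jump in x.

def equilibrium(a, b, x, steps=20_000, rate=0.001):
    # Follow the potential downhill from the current state x;
    # dV/dx = x**3 + a*x + b, so equilibria satisfy x**3 + a*x + b = 0.
    for _ in range(steps):
        x -= rate * (x**3 + a * x + b)
    return x

a = -3.0   # a < 0 puts the system in the bistable (two-minima) regime
x = 2.0    # start on the upper branch
for k in range(13):          # sweep b smoothly from 0.0 to 3.0
    b = 0.25 * k
    x = equilibrium(a, b, x)
    print(f"b = {b:.2f} -> x = {x:+.3f}")
# x drifts smoothly along the upper branch, then, near the fold at b = 2,
# jumps discontinuously to the lower branch: a catastrophe.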

According to Thom, the universe of our experience is fundamentally a ceaseless creation, evolution, and destruction of forms, which it is the business of science to study and explain. In this view, all living phenomena can be interpreted as manifestations of a geometric object, a life field, similar to the electromagnetic or gravitational field, in which living systems are seen as particles or singularities of the field, and in which information can be treated as a form phenomenon. Thom writes, "It is sometimes said that all information is a message, that is to say, a finite sequence of letters taken from an alphabet, but this is only one of the possible aspects of information; any geometric form whatsoever can be the carrier of information, and in the set of geometric forms carrying information of the same type the topological complexity of the form is the quantitative scalar measure of the information" [13].

Concluding that the mathematics of communication theory has been misinterpreted, and that information is a far more complex phenomenon than scientific interpretations of the Shannon-Wiener measure of the amount of information in a message have acknowledged, Thom proposes that both the energy state and the informational state of a system can be seen to depend on a topological property of the configuration of the system: "all information is first a form, and the meaning of a message is a topological relation between the form of a message and the eigenforms of the receptor (the forms that can provoke an excitation of the receptor)."

Among his examples are the following:

1. Book + reader: A book can transmit information only when illuminated. Therefore, the intervention of a metabolic field, the light field, is necessary. Reading is seen as a process in which the book undergoes very small deformations, absorbing energy from the electromagnetic field. These deformations degrade the impinging energy, which is absorbed as heat without affecting the global form of the letters. The resulting energy patterns impinge on the eye, the receptor, which is presumed to resonate metabolically with the incoming light forms.

2. Transmission of information by the DNA molecule during cell division: The process is considered a dynamical system with structurally stable duplication. In this duplication, there are "spectral elements" or singularities, which correspond to the chromosomes and molecules of nucleic acid. Self-reproduction is interpreted as a change of structure (geometrical or chemical) of these spectral elements, and it is Thom's view that this is the only sense in which DNA can be called the support of genetic information.

3. DNA molecules involved in the metabolism of small molecules: Thom suggests that energy from the system's metabolism excites, by resonance, "a geometricochemical deformation" of a part of the chromosome, these form changes not only liberating energy to sustain their own action, but their end products also constituting the informational activity of the molecule.

What is of value here is the realization that information can be interpreted to some extent mathematically as a form phenomenon, a conclusion in keeping with previous evidence that informational activity in all mass-energy systems appears to be embodied entirely in flows of form and form characteristics.

It is clear, however, that information and form are not simply equivalent. While all information processes appear to be form activities, the reverse is not true. Many form processes involved in mass-energy interactions have nothing to do with what we normally call information. Radiation patterns above certain frequencies will destroy living tissue and are therefore obviously not informational (they may be considered energetic). Most chemical molecules (e.g., carbohydrates) are used for their energy content alone, and are therefore also examples of mass-energy interactions involving forms and form characteristics in noninformational roles.

7. Conclusion

The concepts presented in this article are sure to provoke a number of new questions. If information is a form phenomenon, what type of form is it? If it isn't always and everywhere the exact equivalent of form, what are the differences? What is the relationship between information, form, and energy? Can information and form be defined in entirely mass-energy terms? What are knowledge and mind interpreted in mass-energy terms?

What we tried, and, hopefully, managed to show in this study is that a redefinition of form, and subsequently of information, both heretofore considered abstract concepts, is possible. Form and information can be viewed as entirely mass-energy phenomena. This makes it clear that the fundamental creative and control mechanisms of the universe lie in its use of forms as information, and that information must be viewed as a flow of mass-energy forms. This conclusion can be extended into the realm of living organisms by proposing that mental events - knowledge, emotions, volitions, consciousness, and mind itself, all heretofore viewed as abstract informational events - can be understood as flows of mass-energy forms. Mind and knowledge, in all their manifestations, whether human or other, can then be viewed as mass-energy, form-manipulating processes. Last but not least, the universe as a total mass-energy system can be seen to exercise its creative, control, and communicative functions by manipulating forms of itself (e.g., structures, patterns, arrangements), many of which remain constant over multiple inward and outward mass-energy flows, enabling us to view ourselves, finally, as forms of a self-organizing, self-regulating mass-energy universe, joined to the rest of the world in a natural and fundamental way.

Bibliography

1. Brillouin L. (1956/2004): Science and Information Theory. Mineola, N.Y.: Dover.
2. Clausius, Rudolf (2007). In Encyclopædia Britannica. Retrieved March 26, 2007 from Encyclopædia Britannica Online: http://www.britannica.com/eb/article-9024255
3. Crossley M.D. (2007): Essential Topology. Springer.
4. Dronamraju K.R. (1999/2007): Erwin Schrödinger and the Origins of Molecular Biology. Genetics, Vol. 153, 1071-1076, November 1999. Retrieved March 26, 2007 from: http://www.genetics.org/cgi/content/full/153/3/1071#Negative_entropy
5. Eidos (2007). In "Britannica Internet Guide Selection: Philosophy Pages." Retrieved March 19, 2007 from: http://www.philosophypages.com/dy/e.htm
6. Form (2007). In Wikipedia: The Free Encyclopedia. Retrieved March 26, 2007 from: http://en.wikipedia.org/wiki/Form
7. Information (2007). In Wikipedia: The Free Encyclopedia. Retrieved March 26, 2007 from: http://en.wikipedia.org/wiki/Information
8. Koshland D.E. (1958/2007): Application of a Theory of Enzyme Specificity to Protein Synthesis. Proc. National Academy of Sciences 44 (2): 98-104. PMID 16590179. Retrieved March 26, 2007 from: http://www.pnas.org/cgi/reprint/44/2/98
9. Salthe S.N. (2001/2007): The Mutual Implication of Physical and Informational Entropies. Retrieved March 26, 2007 from: http://www.nbi.dk/natphil/salthe/index
10. Saunders P.T. (2004): An Introduction to Catastrophe Theory. Cambridge University Press.
11. Shannon C.E. (1948/2007): A Mathematical Theory of Communication. Bell System Technical Journal, 27, pp. 379-423 & 623-656, July & October 1948. Retrieved March 26, 2007 from: http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
12. Shannon C., Weaver W. (1949/1998): The Mathematical Theory of Communication. The University of Illinois Press, Urbana.
13. Thom R. (1989): Structural Stability and Morphogenesis: An Outline of a General Theory of Models. Addison-Wesley Publishing Company.
14. Uttal W.R. (1978): Psychobiology of Mind. John Wiley & Sons Inc.
15. Watson J.D. (2004): DNA. Arrow.
16. Wheatley M. (2005): Finding Our Way: Leadership for an Uncertain Time. San Francisco: Berrett-Koehler.
17. Wiener N. (1948/1965): Cybernetics: or Control and Communication in the Animal and the Machine. The M.I.T. Press and John Wiley & Sons, Inc., New York.
18. Young P. (1987): The Nature of Information. Praeger Publishers, New York.

Jacek Unold

Networking, Operations and Information Systems
Boise State University
1910 University Drive
Boise, USA 83725
Jacek.unold@ae.wroc.pl
