AI: The Technological Trickster

By ai-depot | October 26, 2002

Other Possible Differences in Machine and Evolved Thought

In the brain, “…the code…is changed from chemical, to electrical, then back again to chemical as it moves from one nerve cell to another.” In addition, “…the signal may inhibit the neuron, and make it less likely to fire.” (Campbell, p. 191) Even with variable capacitors and parallel processing architectures, information is likely processed very differently in the binary codes and electrical circuits of a computer.

We may assume that “…how the brain creates individual experience is that each of us processes our experiences differently, in terms of representational systems…those internal sense media that we use to experience the world…that mimic the brain’s input sensors.” (Pimental and Teixeira, p. 146) The brain works on total impressions with partial knowledge. The philosophy of science teaches us there are two types of knowledge: theoretical knowledge, which concerns everything, and empirical knowledge, which concerns only sense experience. The current view is that both types must coincide to some extent in order to be considered real. It also helps if the hypotheses developed in extending our knowledge are testable. This metaphor has been used to explain how the human brain may work.

Regardless of the ability of the scientific paradigm to explain thought, “The brain’s strong point is its flexibility. It is unsurpassed at making shrewd guesses and at grasping the total meaning of information presented to it.” (Campbell, p. 190)

As Gregory Paul and Earl Cox have noted, “…our brains have not been subjected to the competition of other brains of similar performance…” (Paul and Cox, p. 195), at least not until recent history, as far as we know. In some respects, computers are providing exactly that competition. People have been replaced by machines in the work force to a greater extent than ever before, many displaced from jobs in which they considered thinking an integral part: drafting, secretarial work, even medical diagnosis and the analysis of machine results in chromatography, mineralogy and particle acceleration, among others.

While Landauer notes “…computing has… pushed forward the replacement of human motor skills with more accurately controlled mechanical ones… in simple acts of arithmetic and filing,” the computer has also encroached on areas many consider to be thought. Thus, AI has proved “…chess machines can sometimes outplay grand masters, isolated words and handwritten characters can be recognized… with fair accuracy, artificial visual control can guide smart bombs… [but] language processing continues to be more difficult… truck driving is far too difficult…tutoring involves mysteries yet to be cracked…” (Landauer, p. 142, 152-153).

Further, while computers are very fast at manipulating information, they are not so accurate at some things the evolved brain is quite good at. In a nutshell, “Providing the right information has proven much more difficult than providing lots of it.” (Landauer, p. 175)

Basically, in industrial applications, computers have been used successfully to increase productivity in four major ways: 1) they reduce unnecessary and duplicate work, important in almost all industries, especially banks, hospitals, insurance firms and stock markets; 2) they coordinate and synchronize, especially in banks and airlines; 3) they support high-productivity products and services, as in libraries; and 4) they help individuals work more smoothly: engineers, writers, publishers and others. (Landauer, p. 234) They seem much more appropriate as an aid to the human brain than as a replacement for it. Most of the ‘thinking’ tasks a computer is good at seem to be precisely those boring and repetitious tasks humans dislike, or are ill prepared to undertake with a processing organ that requires some amount of variation, excitement, romance, or whatever.

Another aspect of brain function that appears to differ from computer functioning is that the brain is apparently able to override its conditioning or programming from either a higher or lower order of complexity, depending on which ‘subprocessors’ are considered to be influencing which processors. Hunger, a level of complexity we apparently share with the amoeba, may be sufficient to override conditioning against cannibalism or eating grubs. It is difficult, perhaps impossible, to think of a computer overriding its programming from a lower order of complexity, a hypothesis that may be open to empirical testing.

How May Meaning Be Derived from a Physical System, Whether Organic or Mechanical?

How could meaning come about from physical complexity?

John Eccles has suggested that “…some fields, such as the probability field of quantum mechanics, carry neither energy or matter…” Further, the “…mind may be regarded… as a non-material field, its closest analogue…a probability field.” Noting that the number of interactions and sizes of operants in the brain fall well within the parameters necessary for quantum effects, he states that “…calculation on the basis of the Heisenberg uncertainty principle shows that a vesicle of the presynaptic vesicular grid… could conceivably be selected for exocytosis by a mental intention acting analogously to a quantum probability field.” (Eccles, p. 187, 189, 190-191)

While this is no more than a working hypothesis, it does formulate the question of where meaning comes from in an empirically testable form. Studies of primitive brains suggest “…presynaptic vesicular grids in the synapses of the mollusk…” may indicate a mechanism for “…anticipatory evolution… utilized, according to the microsite hypothesis, in mind-brain interaction whereby animals became conscious.” (Eccles, p. 193) Some other researchers have begun to explore this avenue, notably “Penrose…[and] Hameroff… propose that microtubules in neurons are the sites of quantum interactions with molecules.” (Paul and Cox, p. 176) The search for the site of the interaction between symbol and meaning is of some importance.

As I’ve suggested earlier, the problem with this approach, aside from the obvious ambiguity of searching for testable hypotheses in the quagmire of quantum reality, is the matter-to-meaning direction of analysis. Meaning may precede the material complexity that allows symbolization to take place. The brain or mind may be adding symbolization to meanings that are already extant in the world, or at least in the electro-chemical interactions of the brain. An evolutionary view suggests meanings were, and probably are, existent prior to language development, or the ability to code those meanings using symbolization. Nevertheless, “…information must be formed out of the stimuli in terms of individual needs and desires.” (Rosenfield, p. 120-121) It is difficult to understand how these might be programmed into a computer, regardless of the amount of symbol manipulation of which the machine is capable.

At the SFU Conference on Cognition, Dellarosa told of the female chimp who sat on the provisions given her by the anthropologist until the dominant male learned to push her away to retrieve the food. She then learned to sit some distance away from the food, until the dominant male learned to circle in increasingly wider circles until he found it. It is quite difficult to understand how such behaviour might evolve without the realization of other minds, self-awareness, and the ability on the part of both brains to understand from a variety of perspectives. Metaphor and symbolization may have arisen from this ability, so necessary to the survival of a hunting and scavenging human or prehuman.

Zenon Pylyshyn, in his reply to John Searle’s “Chinese room” thought experiment, notes that “…a symbol acquires meaning through the way it influences the behaviour of the system as a whole.” (Waldrop, p. 138-139) This definition assumes a total world within which symbolization takes place, a world that is necessary to provide meaning to symbols.

In the evolution of information processing, Stephen Jay Gould’s punctuated equilibrium model seems to fit the hypothesis. In a premature attempt to reduce the communication revolution to an exponential sequence, we might say: spoken language (50,000 BP or earlier) → written language (5,000 BP) → printing press (500 BP) → hypertext (50 BP, counting from today, not 1950), with concomitant upheavals in each new paradigm that have served to redefine humans and the relationship between us and our world. At each level, increased quantization has indeed led to a qualitative difference in our definitions of ourselves. At least so it seems to our subjective selves.
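The exponential sequence above can be checked with a few lines: each medium arrives roughly ten times sooner, measured in years before present, than its predecessor. A minimal sketch in Python, using the essay's own rough dates:

```python
# Rough milestone dates from the sequence above, in years before present (BP).
milestones = {
    "spoken language": 50_000,
    "written language": 5_000,
    "printing press": 500,
    "hypertext": 50,
}

years = list(milestones.values())

# Ratio between successive intervals: a constant factor means the
# sequence is geometric, i.e. "exponential" in the essay's sense.
ratios = [years[i] / years[i + 1] for i in range(len(years) - 1)]
```

Each ratio comes out to exactly 10, which is of course a consequence of the round figures chosen; the real dates are far less tidy, but the accelerating pattern is the point.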

The alphabet and printing press suggest to Pamela McCorduck that “…technologies of the intellect… [are] the concern that separates us… from the other species” and that just “As the microscope revealed the underlying structure shared by all living organisms, the cell, so the computer is revealing the underlying structures of all symbols.” (McCorduck, p. 3, 16)

There must nevertheless be a context to provide any realistic picture of what we are describing when we say we symbolize and process information by so doing. “Information is information, not matter or energy.” (Waldrop, p. 20) It is all the more important that it be studied in interaction rather than in isolation. “Meaning is not a… fixed quality, but one which words… acquire in use…” and “Metaphor provides the means by which words are elevated to living things…” (Hawkes, p. 53, 58).

Our study assumes a definition of information that allows us to manipulate it through measures of entropy and redundancy within the system, with probability as the main tool and metaphor as the glue holding the whole thing together. Symbolization and codes may be understood as the information carrying structures in a context of ‘noise’ modified by the brain or the computer. There are then several ways we may proceed within these parameters to evaluate what is taking place within either system, and to discover how these two systems differ in the ways they process information. How we interpret the results may serve to create the next phase of the communication revolution that began with the discovery of speech, and that caused the Raven so much discomfort.
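The entropy and redundancy measures assumed here can be made concrete. A minimal sketch in Python: per-symbol entropy estimated from observed frequencies, and redundancy as the fraction of the maximum possible entropy a message leaves unused. Note that this single-symbol estimate ignores context between symbols, so it only approximates the figures Shannon derived for English:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Entropy in bits per symbol, estimated from symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def redundancy(text: str) -> float:
    """Fraction of the maximum possible entropy the message does not use."""
    max_entropy = math.log2(len(set(text)))  # all symbols equally likely
    if max_entropy == 0:
        return 0.0
    return 1 - shannon_entropy(text) / max_entropy

# English text is redundant: its per-symbol entropy falls below the
# maximum for its alphabet, which is what makes it compressible and
# robust against 'noise'.
sample = "the rain in spain stays mainly in the plain"
h = shannon_entropy(sample)
r = redundancy(sample)
```

Shannon's own estimate, accounting for context between letters, put the entropy of printed English near one bit per character (see the Shannon entry in the bibliography), far below the roughly 4.7-bit maximum for a 26-letter alphabet; that gap is the redundancy the essay treats as structure against noise.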

“It is human diversity, not just common goals, that has given rise to culture.” In other words, the “…central conception in Darwinian thought is that variations in populations occur from which selection may take place. It is the variation…that is real.” (Rosenfield, p. 127, 167) Meaning and metaphor are, as Plato noted in his presumed distrust of poetry and poets, variable entities that live in the realm of information rather than in the scientifically measurable realm of matter and energy. They depend on some means of representation, such as we find in the brain and in the computer, but they also require a world, an interaction with other meanings and metaphors which may be grounded in the world of matter/energy. And they must be able to change as information is added and modified.

As Weizenbaum has noted, “…truth is not equivalent to formal provability…” and “…scientific demonstrations, even mathematical proofs, are fundamentally acts of persuasion.” (Weizenbaum, p. 15, 222) We do not make much headway when we assume machines are thinking on empirical evidence unbacked by a sound theoretical basis. And there are dangers in assuming a particular metaphor is correct, or will always be accepted as correct for a given action. Calling a computer an electronic brain doesn’t mean it thinks.

McCorduck believes “…the computer has permitted an explanation …at a scientific level of what symbols are.” (McCorduck, p. 76) We can make some quite sound predictions about how manipulating symbols will change other symbols and even whole systems. On the other hand, there is the danger Weizenbaum has noted that we will develop “…a computing system that permits…only certain kinds of questions…” (Weizenbaum, p. 38), and make the mistake of assuming it will answer all our questions, as we hope the brain eventually will. Yet as Waldrop has said, “…for the first time we are having to explain ourselves to an entity that knows nothing about us.” (Waldrop, p. 128) The exercise can expand our understanding if we are careful to analyze our results in the light of total experience rather than confining them to the narrow realm of some particular specialty.

The dovetailing elements of the emerging paradigm (relativity and quantum mechanics in physics, punctuated evolution in the human sciences, and virtual reality in the artistic/spiritual realm) can be as exciting as a minefield in Disneyland. Great possibilities can be lost with a single step.

Bibliography

Baer, Robert M. The Digital Villain. Reading, Mass; Addison-Wesley. 1972.
Casti, John L. Paradigms Lost. New York; William Morrow. 1989.
Calvin, William H. The Cerebral Symphony. New York; Bantam. 1990.
Campbell, Jeremy. Grammatical Man. New York; Simon & Schuster. 1982.
Chomsky, Noam. Language and Mind. New York; Harcourt, Brace. 1968.
Chomsky, Noam. Language and Responsibility. New York; Pantheon. 1977.
Crane, Tim. The Mechanical Mind. London; Penguin. 1995.
Dreyfus, Hubert R. What Computers Can’t Do. New York; Harper & Row. 1979.
Eccles, John C. Evolution of the Brain. London; Routledge. 1989.
Fischler, Martin A. and Firschein, Oscar. Intelligence: The Eye, the Brain, and the Computer. Reading, Mass; Addison-Wesley. 1987.
Furst, Charles. Origins of the Mind. Englewood Cliffs; Prentice-Hall. 1979.
Gatlin, Lila L. Information Theory and the Living System. New York; Columbia Press. 1972.
Goldenson, Robert M. (ed). Longman Dictionary of Psychology and Psychiatry. New York; Longman. 1984.
Hartnell, Tim. Exploring Artificial Intelligence on Your IBM PC. New York; Bantam. 1986.
Hawkes, Terence. Metaphor. London; Methuen & Co. 1972.
Jaynes, Julian. The Origin of Consciousness in the Breakdown of the Bicameral Mind. Boston; Houghton-Mifflin. 1976.
Kent, Ernest W. The Brains of Men and Machines. Peterborough, NH; BYTE/McGraw-Hill. 1981.
Landauer, Thomas K. The Trouble with Computers. Cambridge, Mass; MIT Press. 1995.
Lenneberg, Eric H. Biological Foundations of Language. Malabar, FL; Krieger Publishing. 1984.
Lycan, William. (ed). Mind and Cognition: A Reader. Oxford; Blackwell. 1997.
McCorduck, Pamela. The Universal Machine. New York; McGraw-Hill. 1985.
Paul, Gregory S. and Cox, Earl. Beyond Humanity. Rockland, Mass; Charles River Media. 1996.
Pimental, Ken and Teixeira, Kevin. Virtual Reality. New York; McGraw-Hill. 1993.
Posner, Michael I. and Raichle, Marcus E. Images of Mind. New York; Scientific American Library. 1994.
Rosenfield, Israel. The Invention of Memory. New York; Basic Books. 1988.
Sacks, Sheldon. (ed). On Metaphor. Chicago; University of Chicago Press. 1979.
Searle, John. Minds, Brains and Science. Cambridge; Harvard Press. 1984.
Shannon, Claude E. “Prediction and Entropy of Printed English.” Bell System Technical Journal 30(1): 50-64. 1951.
Stam, James H. Inquiries into the Origin of Language. New York; Harper & Row. 1976.
Sternberg, Robert J. Metaphors of Mind. Cambridge; Cambridge University Press. 1990.
Stich, Stephen P. (ed). Innate Ideas. Berkeley; University of California Press. 1975.
Wagman, Morton. The Sciences of Cognition. Westport, Conn; Praeger. 1995.
Waldrop, M. Mitchell. Man Made Minds. Rexdale, Ont; John Wiley & Sons. 1987.
Weizenbaum, Joseph. Computer Power and Human Reason. San Francisco; W.H. Freeman. 1976.

Written by Jim Erkiletian.
