Stevan Harnad has defended Searle's argument against Systems Reply critics in two papers. Understanding doesn't magically exist elsewhere in the motor cortex or the optic nerve or some other non-language-processing part of the brain. But I still find that your Chinese Room is not analogous to computers, as you claimed. Searle argues that however the program is written or however the machine is connected to the world, the mind is being simulated by a simple step-by-step digital machine or machines. I am the robot's homunculus, but unlike the traditional homunculus, I don't know what's going on.
Berkeley philosopher John Searle introduced a short and widely discussed argument intended to show conclusively that it is impossible for digital computers to understand language or think. The problem is not that a computer cannot be conscious or understand; it may well have intentionality one day, but describing it in terms that we ascribe to humans gives computers anthropomorphic qualities. Objections: first, chauvinism, based on the sort of information-processing difference that exists; second, that the Aunt Bertha machine is impossible to build, since it would need more strings than there are particles in the universe. Block argues that its mere possibility proves false the neo-Turing Test conception of intelligence. The main aim is to reconcile two claims that can seem to be in tension: first, as Block's Aunt Bertha machine shows, having the capacity to pass the Turing Test is not logically sufficient for being intelligent. At the level of the synapses, what would or could be different about the program of the computer and the program of the Chinese brain? Perhaps other physical and chemical processes could produce exactly these effects; perhaps, for example, Martians also have intentionality but their brains are made of different stuff.
The brain simulator reply (Berkeley and M.I.T.). The focus belongs on the program's Turing machine rather than on the person's. And the connection between minds that transfers meaning through behavior is real. The reasons we find it natural are, roughly, two: we can't make sense of the animal's behavior without the ascription of intentionality, and we can see that the beasts are made of similar stuff to ourselves -- that is an eye, that a nose, this is its skin, and so on. Going out of the room means essentially the same as changing the causal relations between the person and his environment. On the basis of these two assumptions we assume that even if Schank's program isn't the whole story about understanding, it may be part of the story.
They enable me to correlate one set of formal symbols with another set of formal symbols, and all that 'formal' means here is that I can identify the symbols entirely by their shapes. As far as the Chinese is concerned, I simply behave like a computer; I perform computational operations on formally specified elements. In any case, Searle's short reply to the Other Minds Reply may be too short.
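The purely "formal" symbol manipulation described above can be sketched in code. This is an illustrative toy, not anything from Searle's actual rule books: the table entries below are invented, and the point is only that matching works by literal shape (string equality), with nothing in the program representing what any symbol means.

```python
# A rule book modeled as a lookup table from input symbol strings to
# output symbol strings. Entries are hypothetical placeholders.
RULE_BOOK = {
    "你好": "你好吗",   # invented example pairing
    "谢谢": "不客气",
}

def manipulate(symbols: str) -> str:
    """Return the output symbols paired with the input symbols.

    Matching is by literal shape (string equality) alone, which is all
    that 'formal' amounts to in the passage above. No semantics is
    consulted anywhere in this function.
    """
    return RULE_BOOK.get(symbols, "")

print(manipulate("你好"))
```

The operator of such a program can answer correctly without understanding either string, which is exactly the situation the passage describes.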
What is it like to be a bat? The argument counts especially against that form of functionalism known as the Computational Theory of Mind, which treats minds as information-processing systems. The internalist approaches, such as Schank's and Rapaport's conceptual-representation approaches, and also Conceptual Role Semantics, hold that a state of a physical system gets its semantics from causal connections to other states of the same system. Imagine, if you will, a Chinese gymnasium, with many monolingual English speakers working in parallel, producing output indistinguishable from that of native Chinese speakers: each follows their own more limited set of instructions in English. When you are sitting in the room and getting meaningless input, you can't even know whether it is language or something else. Leading the opposition to Searle's lead article in that issue were philosophers Paul and Patricia Churchland. We can't know the subjective experience of another entity.
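The Chinese gymnasium can be sketched as a parallel program, assuming (purely for illustration) that each worker knows only a small slice of the rules and that the slices jointly cover the input. The worker names and rule contents below are invented.

```python
# Hedged sketch of the "Chinese gymnasium": many workers run in
# parallel, each following its own limited instruction set; together
# they produce the room's output. All rule entries are placeholders.
from concurrent.futures import ThreadPoolExecutor

WORKER_RULES = [          # each worker holds only its own slice of rules
    {"你": "您"},
    {"好": "好"},
]

def worker(rules: dict, symbol: str):
    # A worker answers only for symbols its slice covers; otherwise None.
    return rules.get(symbol)

def gymnasium(message: str) -> str:
    out = []
    with ThreadPoolExecutor() as pool:
        for symbol in message:
            # Every worker tries the symbol concurrently; exactly one
            # slice is assumed to match in this toy setup.
            results = pool.map(lambda rules: worker(rules, symbol), WORKER_RULES)
            out.append(next(r for r in results if r is not None))
    return "".join(out)

print(gymnasium("你好"))
```

No individual worker sees the whole exchange, which is the feature the gymnasium variant of the thought experiment trades on.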
And the mental-nonmental distinction cannot be just in the eye of the beholder; it must be intrinsic to the systems, otherwise it would be up to any beholder to treat people as nonmental and, for example, hurricanes as mental if he likes. A computer might have propositional attitudes if it has the right causal connections to the world, but those are not ones mediated by a man sitting in the head of the robot. This discussion includes several noteworthy threads. If we are to conclude that there must be cognition in me on the grounds that I have a certain sort of input and output and a program in between, then it looks like all sorts of noncognitive subsystems are going to turn out to be cognitive. A human mind is considered as a piece of software that the human brain implements (see Hudetz 2012, a review article). Searle accepts mechanism and rejects computationalism.
Fodor's semantic externalism is influenced by Fred Dretske, but they come to different conclusions with regard to the semantics of states of computers. If Searle's programmed activity causes Otto's artificial neuron to behave just as his disabled natural neuron once did, the behavior of the rest of Otto's nervous system will be unchanged. There isn't anything at all to the system that he does not encompass. But in this thought experiment we are to imagine that the only person inside the room understands no Chinese and speaks only English. Searle's job is to look through the books until he finds the string of symbols that looks exactly like the one written on the piece of paper. Now where is the understanding in this system? A program is a sequence of instructions in a formal language.
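The job described above, searching through the books for an exactly matching string of symbols, amounts to a linear scan with shape-based comparison. A minimal sketch, with invented rule entries standing in for the books:

```python
# The "books" as an ordered list of (pattern, response) pairs.
# All entries are hypothetical; only the lookup procedure matters.
BOOKS = [
    ("今天天气好吗", "今天天气很好"),
    ("你叫什么名字", "我没有名字"),
]

def look_up(paper: str) -> str:
    """Scan the books until a string is found that looks exactly like
    the one on the piece of paper, then copy out its paired response.
    Comparison is purely by shape (string equality)."""
    for pattern, response in BOOKS:
        if pattern == paper:
            return response
    return ""  # no matching shape found in any book

print(look_up("你叫什么名字"))
```

This also makes concrete the closing remark that a program is a sequence of instructions in a formal language: the loop and comparison above are such instructions, and they operate on uninterpreted strings throughout.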
The Many Mansions Reply suggests that even if Searle is right that programming cannot suffice to cause computers to have intentionality and cognitive states, other means besides programming might be devised by which computers could be imbued with whatever does suffice for intentionality. To the extent that intentionality is subjective, we must examine these states in any machine as we would in any person, and this, too, is just an alteration of the Turing Test. Translating and understanding are not separable in the brain. Searle correctly characterizes this as a formal symbol-processing relationship wherein the Searle-computer manipulates the symbols purely syntactically, according to their shapes alone, without doing any subjective interpretation of them. Most people don't think that the Chinese Room would necessarily be a false positive. Now suppose further that after this first batch of Chinese writing I am given a second batch of Chinese script together with a set of rules for correlating the second batch with the first batch.
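The "rules for correlating the second batch with the first" can be sketched as a table pairing second-batch symbols with first-batch symbols, again with matching done by shape alone. All symbols and pairings below are invented placeholders:

```python
# First batch of Chinese script, and rules correlating second-batch
# symbols with first-batch symbols. Entries are illustrative only.
FIRST_BATCH = ["山", "水", "火"]
CORRELATION_RULES = {"岭": "山", "河": "水", "焰": "火"}  # second -> first

def correlate(second_batch):
    """Pair each second-batch symbol with a first-batch symbol by
    applying the rules; no interpretation of any symbol occurs."""
    return [(s, CORRELATION_RULES[s]) for s in second_batch
            if CORRELATION_RULES.get(s) in FIRST_BATCH]

print(correlate(["岭", "河"]))
```

The person in the room can apply such rules flawlessly while remaining ignorant of what either batch says, which is the crux of Searle's example.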