SEARLE AND FUNCTIONALISM. Is the mind software?


Materialist Theories of the Mind

The Identity Theory: Each type of mental property is identical to a certain type of physical property. E.g. pain is just stimulation of the C-fibres.

Functionalism: A given mental state (e.g. pain) depends on the software, not the hardware. Pain can be realised in many different physical ways.

Eliminative Materialism: Traditional mental states, like beliefs and desires, do not exist.

Functionalism and AI

AI (artificial intelligence) tries to design computer programs that will perform mental tasks of some kind. The whole idea of AI assumes functionalism. Functionalism says that the mind is software, not hardware. In his Chinese Room paper, Searle attacks functionalism.

Weak AI: The programmed computer simulates the mind. It is a research tool for testing psychological explanations. It can perform tasks that require thought in humans.

Strong AI: The programmed computer is a mind. The computer really understands and is conscious.

No one questions the possibility of weak AI, but strong AI is controversial. Searle focuses on intentionality in this paper, though conscious intentionality is probably an even bigger problem for computer programs.

Intentionality = meaning, significance, aboutness, understanding.

Intentionality

Random marks on a piece of paper have no meaning. They are not about anything. But the words "the moon has no atmosphere" are about the moon. They have a meaning. They somehow connect with an external state of affairs. But the real source of the intentionality is the mind that understands the sentence, and thinks the thought (proposition). Without that, the sentence is just a set of marks on paper. Searle doubts that computer programs can have intentionality in this sense.

Do chatbots have intentionality?

Roger Schank's program can answer questions about restaurants. It has a representation of general knowledge about restaurants. Does it understand the story, the questions, and its own responses, however? (Similar to programs like Elbot, Cleverbot, and igod today.)

Functionally equivalent

Imagine two black boxes (you can't see what's inside). Each box has buttons labelled A, B, C, and a red, a green and a blue light. Suppose you press buttons ABCA on one box, and the green and blue lights turn on. You try the same ABCA on the other box, and the same thing happens.

N.B. A "black box" description of a system specifies its output (response) for every possible input (stimulus).

Functionally equivalent

So you try other inputs, and get various outputs. But the two boxes always react in the same way as each other. When the two boxes are given the same input, they always give the same output. In that case they're functionally equivalent.

If you open the boxes and look inside, must they be exactly the same inside as well? No. E.g. two calculators both give the output 4 for the input "2+2=", and so on, but the calculators might have very different circuitry inside.
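
To picture this in code, here is a minimal Python sketch (not from the slides; the function names are illustrative) of two "calculators" whose internals differ completely yet which are functionally equivalent as black boxes:

```python
def add_by_arithmetic(a: int, b: int) -> int:
    # Internals: ordinary machine addition.
    return a + b

def add_by_counting(a: int, b: int) -> int:
    # Internals: count up or down one step at a time, like counting on fingers.
    result = a
    step = 1 if b >= 0 else -1
    for _ in range(abs(b)):
        result += step
    return result

# Viewed from outside, the two give the same output for every input:
for a in range(-5, 6):
    for b in range(-5, 6):
        assert add_by_arithmetic(a, b) == add_by_counting(a, b)
```

Nothing about the shared input/output profile tells you which internal mechanism is at work; that is exactly the sense of "functionally equivalent" used below.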

Functionalism: functionally equivalent ⇒ mentally equivalent. I.e. if two systems are functionally equivalent (same outputs for the same inputs) then they're mentally equivalent (same consciousness, intentionality, etc.)

The Stanford Encyclopedia of Philosophy: "Functionalism is the doctrine that what makes something a thought, desire, pain (or any other type of mental state) depends not on its internal constitution, but solely on its function, or the role it plays, in the cognitive system of which it is a part. More precisely, functionalist theories take the identity of a mental state to be determined by its causal relations to sensory stimulations, other mental states, and behavior."

Mental states are black boxes

It doesn't matter what's going on inside. The mental state is whatever it is that is turning input experiences and other mental states into behaviour.
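
On the machine-functionalist picture, that causal role can be pictured as a transition table. A minimal Python sketch (illustrative only; the states and stimuli are made up for this handout, not Searle's own example):

```python
# A mental state, on this picture, is just a node in a table mapping
# (current state, stimulus) to (next state, behaviour). Only the causal
# role matters, not what the node is made of.
TRANSITIONS = {
    ("calm", "tissue damage"): ("pain", "wince"),
    ("pain", "tissue damage"): ("pain", "cry out"),
    ("pain", "aspirin"):       ("calm", "sigh of relief"),
}

def step(state: str, stimulus: str) -> tuple[str, str]:
    # Anything not covered by the table leaves the state unchanged.
    return TRANSITIONS.get((state, stimulus), (state, "do nothing"))

state = "calm"
for stimulus in ["tissue damage", "tissue damage", "aspirin"]:
    state, behaviour = step(state, stimulus)
    print(stimulus, "->", state, "/", behaviour)
```

Whatever plays the "pain" role in this table, whether neurons, silicon or water pipes, counts as pain for the functionalist. That is the claim Searle will attack.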

Programs and functional equivalence

Two computers with different architectures can run the same program. (E.g. a Mac can run Windows.) Computers running the same program are functionally equivalent. (Why is this?) If functionalism is true, then mental states are just a matter of the program that is running.

How chatbots work

script: general information that the virtual person has (e.g. people won't usually eat a badly burned hamburger).
story: e.g. the waiter brings a man a burned hamburger and he storms out without paying.
program: some very complicated rules, based on dissecting the sentences, seeing formal (structural) relationships, reordering words, substituting terms, etc.

Program instructions are things like: To any question of the form "Do you like [X]?" reply "Yes, I like [X]" if X is a member of the set {chocolate, money, fast cars, philosophy, ...}, but reply "No, I don't like [X]" if X is a member of {the smell of a wet dog, watching Glee, ...}, and reply "I'm not sure, I don't know what [X] is" if X is on neither list.
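
A rough Python sketch of that single rule (illustrative only; the lists are just the examples above, and real programs of the kind Searle discusses are far more complex):

```python
import re

LIKES    = {"chocolate", "money", "fast cars", "philosophy"}
DISLIKES = {"the smell of a wet dog", "watching glee"}

def reply(question: str) -> str:
    # Match the purely formal pattern "Do you like X?" and pull out X.
    match = re.fullmatch(r"do you like (.+)\?", question.strip().lower())
    if not match:
        return "I can only answer questions of the form 'Do you like X?'"
    x = match.group(1)
    if x in LIKES:
        return f"Yes, I like {x}"
    if x in DISLIKES:
        return f"No, I don't like {x}"
    return f"I'm not sure, I don't know what {x} is"

print(reply("Do you like philosophy?"))     # Yes, I like philosophy
print(reply("Do you like watching Glee?"))  # No, I don't like watching glee
print(reply("Do you like quarks?"))         # I'm not sure, I don't know what quarks is
```

Notice that the rule operates on strings purely by their form: nothing in the program supplies what Searle calls intentionality, which is exactly the point of the Chinese Room below.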

questions: e.g. "Did the man eat the hamburger?"
responses: e.g. "No, he didn't eat the hamburger."

Suppose that the chatbot's answers to the questions are convincing, as good as those of a real English speaker. Does the computer understand English? "No. Not a word of it," says Searle. Is Searle right about this?

Chinese Room Thought Experiment

To show this, Searle imagines that he himself does the job of the computer, obeying the chatbot program's commands. Searle is in a room with a script, a story, some questions and a program. The trick is that the script, story, questions and answers are all in Chinese, a language that Searle doesn't speak at all. (The program is in English, so Searle understands that.)

Suppose that the answers to the questions are convincing, as good as those of a real Chinese speaker. Does Searle understand Chinese? "No. Not a word of it," says Searle. He has no idea what any of the questions, or his answers, mean. He is just cutting and pasting symbols, according to rules.

"1. As regards the first claim, it seems to me quite obvious in the example that I do not understand a word of the Chinese stories. I have inputs and outputs that are indistinguishable from those of the native Chinese speaker, and I can have any formal program you like, but I still understand nothing." (p. 351)

In other words, meaning does not arise from running a program, no matter how sophisticated. (Searle argues)

Clarifications

1. What is understanding anyway? More than a mere function, says Searle. When a thermostat turns the furnace off, does it think "it is too hot in here"? No. (Certainly not conscious intentionality.)

2. Could a machine think? Yes, says Searle, we are such machines. But not in virtue of the program, he thinks. The hardware of the machine is relevant.

"Yes, but could an artificial, a man-made machine, think? Assuming it is possible to produce artificially a machine with a nervous system, neurons, with axons and dendrites, and all the rest of it, sufficiently like ours, again the answer to the question seems to be obviously, yes. If you can exactly duplicate the causes, you could duplicate the effects. And indeed it might be possible to produce consciousness, intentionality, and all the rest of it using some other sorts of chemical principles than those that human beings use. It is, as I said, an empirical question." (p. 352)

"OK, but could a digital computer think? If by digital computer we mean anything at all that has a level of description where it can correctly be described as the instantiation of a computer program, then again the answer is, of course, yes, we are the instantiations of any number of computer programs, and we can think." (p. 353)

"But could something think, understand, and so on solely by virtue of being a computer with the right sort of program? Could instantiating a program, the right program of course, by itself be a sufficient condition of understanding? This I think is the right question to ask, though it is usually confused with one or more of the earlier questions, and the answer to it is no." (p. 353)

Functionalism: If two systems are functionally equivalent (same outputs for the same inputs) then they're mentally equivalent (same consciousness, intentionality, etc.)

Searle opposes functionalism, not materialism. A complex system of water pipes, etc. cannot be conscious, even if it executes the right program, Searle thinks.

What are the pipes thinking about?

Going deeper

"The idea that computer simulations could be the real thing ought to have seemed suspicious in the first place because the computer isn't confined to simulating mental operations, by any means. No one supposes that computer simulations of a five-alarm fire will burn the neighborhood down or that a computer simulation of a rainstorm will leave us all drenched. Why on earth would anyone suppose that a computer simulation of understanding actually understood anything?" (p. 353)

Concretism?

Is thought intrinsically concrete? For programs are abstract. Abstract objects don't feel pain.

Comparison with idealism

According to Berkeley, ordinary objects like sticks and stones are really ideas. But even Berkeley didn't go so far as to say that we (humans) are ideas! That notion would have been ridiculous. We are minds, active producers of ideas. Ideas are passive, inert. They cannot make free choices. They are not conscious. Aren't abstract objects also passive and inert?

Now, computer programs are also abstract ideas, at least in the sense of being fully accessible to the mind (and not to the senses). Yet functionalism says that consciousness is just a matter of running the right computer program. Searle's intuition seems to be that consciousness depends on the concrete aspects of a system.

Concreteness?

N.B. The notion of concreteness is rather obscure, and is basically the same as the notion of "substance" or "substratum" that Berkeley ridiculed. A concrete object isn't just a collection of properties, or a bundle of ideas. There's some (concrete) substance that has the properties, or instantiates those ideas.

However, if computer programs cannot be minds because they are abstract rather than concrete, then Searle's position looks precarious. For material systems are also abstract, in a sense, if they correspond precisely to (abstract) mathematical structures.

Argument for functionalism: neuron-replacement therapy

Suppose you're starting to have some mental problems, perhaps memory loss, confusion, emotional instability, or difficulty solving math equations. It's gradually getting worse. Your GP refers you to a specialist, who says that some of your neurons are breaking down. The best treatment is NRT, or neuron-replacement therapy. This unfortunately cannot undo the existing damage, but will prevent further decline. They identify neurons that are close to failure, remove them, and replace them with digital circuits that are (you guessed it!) functionally equivalent to the old neurons. (Of course the electronic neurons will last indefinitely.) By replacing all the neurons that might fail in the next 5 years, the treatment gives you 5 years with no further mental decline.

You're understandably nervous about the procedure, worried that you'll no longer be fully human, but part machine. The specialist reassures you with this argument: There'll be no loss of function at all, since the replacement neurons are functionally equivalent to the old ones. If you replace part of a system with another part that is functionally equivalent to it, then the whole system is functionally unchanged.
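
The specialist's premise, that swapping a part for a functionally equivalent part leaves the whole system functionally unchanged, is something programmers rely on all the time. A minimal Python sketch (illustrative only; these toy "neurons" are not part of the original argument):

```python
def biological_neuron(signal: float) -> float:
    # Original part: fires (returns 1.0) when the input exceeds a threshold.
    return 1.0 if signal > 0.5 else 0.0

def silicon_neuron(signal: float) -> float:
    # Replacement part: different internals, same input/output profile.
    return float(signal > 0.5)

def whole_system(signals, neuron):
    # The "system": counts how many inputs make the given neuron fire.
    return sum(neuron(s) for s in signals)

inputs = [0.1, 0.6, 0.9, 0.4, 0.7]
# Swapping the part leaves the behaviour of the whole system unchanged.
assert whole_system(inputs, biological_neuron) == whole_system(inputs, silicon_neuron)
```

The question the thought experiment raises is whether what holds for function here also holds for how things feel.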

But, you reply, even if my behaviour is the same, under all possible stimuli, might I not feel different? Not a chance, he says. For if you felt different, you might talk about it, saying things like "I feel funny." But in that case your behaviour is also different, which we know is impossible! So you cannot feel any different either. Don't worry.

Every 5 years you need another round of NRT, until eventually your brain is entirely electronic. But, of course, all is well. You recommend NRT to all your friends. This argument seems to establish the view that there cannot be a change of mental state as long as everything is functionally the same. Does it?

Note that this argument assumes materialism as a premise. (It assumes that your mental life depends only on what your neurons are doing, which a dualist could deny.) So if it works, it means that every materialist should be a functionalist.

Argument for functionalism: the problem of other minds

How do I know that other people are conscious, as I am? The only evidence I have is their behaviour, in response to different situations. If functionalism is false, then of course this wouldn't be very good evidence at all, so that our belief in other minds would be quite unjustified!

The Turing Test of intelligence

Suppose we can program a computer so that it is able to hold (apparently) intelligent conversation, just like a human being. Such a machine would, in conversation at least, be functionally equivalent to a human. Now how could you regard the words of such a machine as meaningless to it, or claim that it has no idea what it's saying? If it overhears such talk, then it will firmly set the matter straight! "You might just as easily think that your own mother lacks intentional states," the machine protests. "It's discrimination, plain and simple."

Further arguments against functionalism

1. The inverted spectrum. A person with an inverted spectrum will be functionally equivalent to us, but have different mental states. (Ned Block)

2. Functional zombies. We can conceive of a being that is functionally equivalent to a human, but which has no consciousness at all.

1. The inverted spectrum

Such a person will be functionally equivalent to us. But their mental states will be different.

2. Functional zombies are conceivable.

A functional zombie, as philosophers use the term in this context, is someone who is functionally equivalent to a human and yet "no one is home." Such zombies are not conscious, any more than electronic calculators are. Hence our conception of consciousness, at any rate, is distinct from any functionally-defined state.

E.g. Ned Block's "Chinese nation" may be a functional zombie. The population of China is connected together so as to be functionally equivalent to a human brain. Then, while the individual Chinese people will have conscious experiences, the whole system (the "Blockhead") will conceivably have no experiences at all.

So having a certain functional role isn't sufficient for a conscious experience. Is it necessary? Could a person have conscious experiences that didn't cause any behaviour? "People may have mild, but distinctive, twinges that have no typical causes or characteristic effects." (Stanford Encyclopedia)