Re: What the mind's eye tells the mind's brain

From: Harnad, Stevan (harnad@cogsci.soton.ac.uk)
Date: Mon Nov 20 1995 - 21:50:05 GMT


> From: "Smith, Wendy" <WS93PY@psy.soton.ac.uk>
> Date: Mon, 20 Nov 1995 08:57:30 GMT
>
ws> People certainly appear to experience images (based on introspection)
ws> but this does not explain either how the image occurs, or the nature
ws> of the image. The experience may be open to introspection, but the
ws> mechanism behind it is not.
ws>
ws> ...images are not the means
ws> by which thinking is carried out, but an effect of the thinking
ws> process.
>
ws> the representation is constructed from an interpretation of,
ws> rather than "snapshot" of, pictures and words.
>
ws> Three possibilities for the form this representation could take have
ws> been suggested. It could be in the form of a proposition (ie an
ws> assertion) and knowledge exists as a list of propositions. From this
ws> initial list, all other valid propositions can be derived on a true
ws> or false basis. I don't really understand this. I don't get where
ws> the initial list comes from.

You're quite right to ask that question. And a propositional theory like
Pylyshyn's can't really answer it: Are they inborn? Am I born knowing a
certain set of propositions, from which the others are derived? Unlikely. I
have to learn a lot about the world, and I also have to learn a
language, before I can have propositions. That prior stuff cannot, then,
be just more propositions, can it?
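
To make the worry concrete, here is a toy sketch in Python of what "a
list of propositions from which the others are derived" might amount
to. It is purely illustrative (the predicates "bird" and "flies" are
mine, not Pylyshyn's, and nothing this specific appears in his account);
note that the symbols are bare tokens the programme never interprets:

    # Knowledge as an initial list of propositions, with further ones
    # derived by a crude modus ponens. The tokens mean nothing to the
    # programme itself; that is just the problem raised above.

    initial_propositions = {"bird(tweety)", "bird(x) -> flies(x)"}

    def derive(propositions):
        """One round of derivation over whatever implications are present."""
        derived = set(propositions)
        for p in propositions:
            if "->" in p:
                antecedent, consequent = (s.strip() for s in p.split("->"))
                for q in propositions:
                    if "(" in q and "->" not in q:
                        predicate, argument = q.rstrip(")").split("(")
                        if antecedent == predicate + "(x)":
                            derived.add(consequent.replace("x", argument))
        return derived

    print(sorted(derive(initial_propositions)))
    # ['bird(tweety)', 'bird(x) -> flies(x)', 'flies(tweety)']

And the question stands: where does that initial list come from, if not
from something that is not itself propositional?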

ws> By theory testing of the proposition
ws> does it mean:
>
ws> Proposition: If A, then B, but not C
ws> Observation: B
ws> Conclusion: Therefore A
>
ws> The representation could be as a data-structure. I didn't understand
ws> this either! Is it a set of propositions structured such that the
ws> relationship between each of the propositions is also specified? Is
ws> it like the data-base of a computer?

Good questions, and it's hard to say exactly what Pylyshyn means, but
probably yes: he means something like not just the database in a
computer but also the programmes that operate on it.
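
Purely as an illustration of what "the database plus the programmes"
might mean (the concepts and links below are invented, a caricature
rather than anything Pylyshyn commits himself to), think of structured
stored propositions plus a procedure that operates over them:

    # A structured store of propositions plus a 'programme' that walks it.
    # The entries and relations are invented for illustration only.

    knowledge_base = {
        "canary": {"is_a": "bird", "colour": "yellow"},
        "bird":   {"is_a": "animal", "has": "wings"},
        "animal": {"has": "skin"},
    }

    def lookup(concept, attribute):
        """Follow the is_a links upward until the attribute is found."""
        while concept in knowledge_base:
            entry = knowledge_base[concept]
            if attribute in entry:
                return entry[attribute]
            concept = entry.get("is_a")
        return None

    print(lookup("canary", "has"))   # 'wings', inherited from 'bird'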

ws> The representation could also be as a system of rules. When a
ws> concept is abstracted, it is via a set of rules which determine
ws> necessary and sufficient grounds for inclusion within the concept.
ws> Does this mean:
ws>
ws> Proposition: If A then B, but not C
ws> Rule 1: Look for C, if present then not A
ws> Rule 2: If no C, then look for B, if present then A

Yes, but there's something missing: What is the meaning of the A's, B's
and C's? Are those propositions too? But then what do they and their
components mean? More propositions? All the way down (up?) the
hierarchy? It sounds like a database alright, but who's the user?
Another homunculus, but this time not one that gazes at images but one
that understands the meaning of propositions?

No, Pylyshyn would like to say that we ARE those propositions, rather
than their users and understanders, but how can that be? Unless I
interpret them, they are just a bunch of meaningless symbols.
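
Here, for concreteness, is your Rule 1 and Rule 2 written out as a toy
Python sketch. The tokens "A", "B" and "C" are just that, tokens;
nothing in the programme supplies their meaning, which is the point:

    # The two rules above, operating on bare, uninterpreted tokens.

    def classify(observed_features):
        """Rule 1: if C is present, conclude not-A.
        Rule 2: otherwise, if B is present, conclude A."""
        if "C" in observed_features:
            return "not A"
        if "B" in observed_features:
            return "A"
        return "undecided"

    print(classify({"B"}))         # 'A'
    print(classify({"B", "C"}))    # 'not A'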

ws> Whatever the representation, it's organised hierarchically. There is
ws> a permanent structure which specifies the relationships between
ws> concepts, but with no details of 3-D positioning. This is how
ws> knowledge is represented in memory. When a particular concept is in
ws> current use, it becomes very active, and goes to the top of the
ws> hierarchy. Closely related concepts are also activated. As the
ws> concepts become more distant, they are less active. So, there will
ws> be several active concepts at the top of the hierarchy, and the rest
ws> are dormant. This temporary 3-D structure is the image we have when
ws> we are thinking. If another concept becomes the one in use, then the
ws> hierarchy is re-structured into a different 3-D configuration, and
ws> the image changes.
ws>
ws> Am I getting close?

Hard to say, until someone specifies the device that contains all this
propositional stuff, and shows what it can do with it. To dub it
"cognition" based only on this kind of characterisation (not yours,
Pylyshyn's) seems little better than introspection.
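
For what it's worth, the kind of mechanism your description gestures at
can at least be sketched: something like spreading activation over a
fixed network of concept links, with activation falling off with
distance from the concept currently in use. The little network below is
invented, and nothing this specific is in Pylyshyn, which is rather the
point: it is this sort of device that would have to be specified and
then tested against what people can actually do.

    # Toy spreading-activation sketch: a fixed network of concept links,
    # with activation halving at each step away from the concept in use.

    from collections import deque

    links = {
        "dog":    ["animal", "bark", "pet"],
        "animal": ["dog", "cat"],
        "cat":    ["animal", "pet"],
        "pet":    ["dog", "cat"],
        "bark":   ["dog"],
    }

    def activate(start, decay=0.5):
        """Breadth-first spread from the concept currently in use."""
        activation = {start: 1.0}
        queue = deque([start])
        while queue:
            concept = queue.popleft()
            for neighbour in links.get(concept, []):
                if neighbour not in activation:
                    activation[neighbour] = activation[concept] * decay
                    queue.append(neighbour)
        return activation

    print(activate("dog"))
    # {'dog': 1.0, 'animal': 0.5, 'bark': 0.5, 'pet': 0.5, 'cat': 0.25}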

ws> If this is so, then as S was unable to abstract information, it
ws> suggests that the nature of his images was more in the form of
ws> intact sensory information than images referred to here. What about
ws> the storage of information in his memory and his thinking processes?
ws> What a pity that all that was tested was how long a list he could
ws> remember and for how long a duration.......!!

Pylyshyn would want to deny that. Sensory information comes in, alright,
but cognition only begins where it turns into propositions. So
sensorimotor systems are not cognitive systems.

Yes, more tasks (more behavioural capacity) and a more specific causal
explanation would help...


