> From: Dunsdon, Graham <firstname.lastname@example.org>
> So I must now add feelings to the equation, i.e., mind = feelings =
> extraspection + introspection. And, as you say, cogsci deals with
> extraspection representations. How then, will cogsci recognise the
> inputs and workings of introspection in terms of the EFFECTS on the
> cogsci models?
By their works shall ye know them: The DATA of cogsci are (1) the
performance capacities of minds -- what minds can DO -- and perhaps (2)
the measurable activity and structure of brains. (I say perhaps, because
brain data are only helpful if they help explain performance capacity;
otherwise they are just correlations.)
> Stevan, Presumably, cogsci says introspection is hard coded in the
> machine - the black box. But introspection's logic may have a big
> impact upon extraspection. Take, for example, the first few e-mail
> notes from the PY104 group. Introspection may have produced a coherent
> (to the writer) message (which now becomes a physical representation)
> to be shared with the group; but was considered unintelligible or at
> best unclear by some. Thus, introspection expressed as extraspection
> through language can either be sense or nonsense - depending not just
> on the use of a common spoken language but also, I suggest, on a
> congruent introspective experience/association.
Let me repeat the best advice I can give any of you during your
three-year undergraduate voyage: The litmus test for whether you have
understood something or said something understandable is whether kid-sib
would understand it. Kid-sib would not understand the foregoing.
I THINK what you may have been saying is that what we say in words
depends on our having had some of the same experiences. That's why
colours cannot be understood by someone who is sightless from birth;
something similar is true of any experience that has not been shared by
both speaker and hearer.
This is true, but the remarkable power of language is that once
enough of its words have been "grounded" in experience, a lot
more can be learnt from words alone.
> Do cogsci models cope with this sense - nonsense continuum(?) by
> assuming in effect no introspective intervention; or do they assume
> that introspective structures in each individual mind/brain
> relationship are the same? In fact, back to my starting point: mind =
> awareness = (extraspection+introspection).
Kid-sib says: "What?"
The only difference between introspection and extraspection is
the object of the experience: If it's outside your body, it's
extraspection (otherwise known as perception). If the object of your
experience is your experience itself, i.e., if you are reflecting on
your experiences -- on what's going on in your mind -- then you're
introspecting.
> If this statement is true, has anyone in cogsci any short-medium term
> plans to begin to integrate introspective psychology as a fellow member
> of the interdisciplinary team? Dunsdon, Graham.
Introspection not only fails to reveal how the mind works (we're going
to have to work harder than that to find how the mind works), but it is
unlikely that even the eventual complete explanation of how the mind
works will explain either introspection or extraspection to our
intuitive satisfaction: We will not feel, once all the cognitive
questions have been answered -- once all of our capacities have been
explained, and their bases in our brains too -- that there are no more
questions to ask.
Then again, maybe some of us will.
This archive was generated by hypermail 2b30 : Tue Feb 13 2001 - 16:23:49 GMT