To appear in J. D. Cohen and J. W. Schooler (Eds.)
Cognitive and neuroscientific approaches to consciousness:
The Twenty-fifth Annual Carnegie Symposium on Cognition.
Hillsdale, NJ: Lawrence Erlbaum Associates, in press.
Copyright (C) 1993 George Mandler
Not to be copied electronically or to hard copy.
I am grateful to Jean Mandler for critical comments on an earlier draft. For correspondence: George Mandler, Center for Human Information Processing, University of California, San Diego; La Jolla, CA, 92093-0109. e-mail: email@example.com
I start with a review of 20 years of proposals on the functions of consciousness. I then present a minimal number of functions that consciousness subserves, as well as some remaining puzzles about its psychology. In the process I stress a psychologist's functional approach, asking what consciousness is for. The result is an attempt to place conscious processes within the usual flow of human information processing.
Since then the proliferation of interest in consciousness has been truly awesome. Philosophers --- as usual on the hunt for a juicy topic --- joined the fray, and as of 1993 no respectable cognitive scientist can be without a position on consciousness. An informal recent survey suggests that for the N (a large number) proponents of theoretical positions on consciousness, there are now N+1 (a larger number) theoretical positions. And there is little sign of any centripetal tendency to find a core agreement among those N+1 positions. As a result, I shall first describe briefly my own development from the early speculations to a more recent position (Mandler, 1975a, 1984a, 1984b, 1985, 1986, 1988, 1989, 1992, 1993), followed by a discussion of several central notions: conscious construction, the feedback function of consciousness, and the seriality and limited capacity of consciousness. I will then attempt to spell out some minimal requirements for a conscious mechanism together with a sampling of several puzzles of consciousness that need more work. Finally I shall return to the big picture, offer some speculations about the uses of "mind," and end with a defense of a functionalist psychological approach.
In the 1975a paper I listed five adaptive functions that seemed to me at the time to require a conscious mechanism:
1. Choice and the selection of action - short-term actions are reviewed and selected, and possible and desirable outcomes and possible alternative actions are consciously represented.
2. Modification and interrogation of long-range plans - alternative actions for long-term plans are considered, and different substructures and outcomes are evaluated.
3. Retrieval from long-term memory - explicit remembering is achieved, including the use of simple addresses to access complex structures.
4. Construction of storable representations of current activities and events - (i) social/cultural products are stored and retrieved, in part by the use of language as an effective instrument for communication and storage, and (ii) information is stored and retrieved for future comparisons of present and past events.
5. Troubleshooting - representations and structures are brought into consciousness when repair or emergency action is necessary on various - usually unconscious - structures.
Over the next ten years I reconsidered more precisely which functions need consciousness, i.e., which could not be performed without some mechanism like consciousness. I also focused more on automatic and simpler processes, though I did pursue problems of consciousness and memory in depth (Mandler, 1989). In a new book on emotion (Mandler, 1984b) and a small volume on cognitive psychology (Mandler, 1985, Ch. 3), I summarized a general view of unconscious and conscious processes:
First, consciousness is limited in capacity and it is constructed so as to respond to the current situational and intentional imperatives.
Second, unconscious representations and processes generate all thoughts and actions, whether conscious or not; the unconscious is where the action is!
Third, all underlying (unconscious) representations are subject to activations, both by external events and by internal (conceptual) processes. The three levels of representation are: unconscious and not recently activated; unconscious but activated (essentially the same as Freud's preconscious); and conscious.
Fourth, activated structures (e.g., schemas) are necessary for the eventual occurrence of effective thought and actions. Only activated structures can be used in conscious constructions. Current models of schema theory and the more sophisticated, but compatible, models of parallel distributed processes are all based on these assumptions.
Fifth, and a new assumption, conscious events prime; they provide additional activations to the relevant underlying structures.
Assumptions 2, 3, and 4 are shared by many cognitive scientists; assumptions 1 and 5 need further elaboration. Both of them emphasize more automatic than deliberate processes in consciousness. That emphasis on automatic effects is found in particular in an analysis of the feedback effects of conscious contents. I shall discuss those at greater length later, but first wish to talk about the construction of consciousness, in general, as well as in the way it differs between daily life and dreams.
We are customarily conscious of the important aspects of the environs, but never conscious of all the evidence that enters the sensory gateways or of all our potential knowledge of the event. A number of experiments have shown that people may be aware of what are usually considered higher order aspects of an event without being aware of its constituents. Thus, subjects are sometimes able to specify the category membership of a word without being aware of the specific meaning or even the occurrence of the word itself (Marcel, 1983a; Nakamura, 1989). A similar disjunction between the awareness of categorical and of event-specific information has been reported for some clinical observations (Warrington, 1975).
This approach to consciousness suggests highly selective constructions that may be either abstract/general or concrete/specific, depending on what is appropriate to current needs and demands. It is also consistent with arguments that claim that we have immediate access to complex meanings of events. These higher order "meanings" will be readily available whenever the set is to find a relatively abstract construction, a situation frequent in our daily interactions with the world. We do not need to analyze the constituent features or figures to be very quickly aware/conscious of the import of a picture or scene. In general, it seems to be the case that "we are aware of [the] significance [of a set of cues] instead of and before we are aware of the cues" (Marcel, 1983b).
Conscious constructions represent the most general interpretation of the current scene that is consistent with preconscious information and with the demands of the environment. Thus, we are aware of looking at a landscape when viewing the land from a mountaintop, but we become aware of a particular road when asked how we might get down or of an approaching storm when some dark clouds "demand" inclusion in the current construction.
One of the best demonstrations that conscious contents respond not merely to "veridical" representations is the work of Nisbett & Wilson (1977). They show that conscious reconstructions of previous events reflect not just what "actually happened," but also respond to variables and structures of which we are not conscious and which distort "veridicality." Distortions (constructions!) of conscious memory, as for example in eyewitness testimony, provide many instances of this process. Vibration-induced illusions (sensory misinformation) of limb motion produce novel but "sensible" apparent body configurations, so that, for example, biceps vibration of the arm while one's finger rests on one's nose produces the experience of an elongated nose (see Lackner, 1988, for this and many other examples). Similarly, misleading information about one's hand movements apparently requires and produces the experience of involuntary hand movements (Nielsen, 1963).
It is the hallmark of sane "rational" adults that they are conscious of a world that is consistent with its usual constraints as well as with the evidential constraints experienced by others in the same situation and at the same time. But there is another frequent human activity that is relatively unconstrained by reality, yet is conscious - namely our nightly dreams.
It is in this fashion that abstract (and unconscious) preoccupations and "complexes" may find their expression in the consciousness of dreams. It is what Freud (1900/1975) has called the "residue" of daily life that produces some of the actors and events, whereas the scenario is free to be constructed by otherwise quiescent higher order schemas. The higher order schemas - the themes of dreams - may be activated by events of the preceding days or they may be activated simply because a reasonable number of their features have been left over as residues from the days before. I should note that dream theories that concentrate only on the residues in dreams fail to account for the obviously organized nature of dream sequences - however bizarre these might be. In contrast to mere residue theories, Hobson's activation-synthesis hypothesis of dreaming (Hobson, 1988) supposes that, apart from aminergic neurons, "the rest of the system is buzzing phrenetically, especially during REM sleep" (Hobson, 1988, p.291). Such additional activations provide ample material to construct dreams and, as Hobson suggests, to be creative and to generate solutions to old and new problems.
This view is not discrepant with some modern as well as more ancient views about the biological function of dreams (in modern times specifically REM dreams), which are seen as cleaning up unnecessary, unwanted, and irrelevant leftovers from daily experiences. However, these views of dreams as "garbage collecting" fail to account for their organized character (Crick & Mitchison, 1983; Robert, 1886).
In short, dreams are an excellent example of the constructive nature of consciousness: they are constructed out of a large variety of mental contents, either directly activated or activated by a wide ranging process of spreading activation, and they are organized by existing mental structures.
I now turn to the issue of feedback, the effect of consciousness on later constructions.
The feedback assumption states that the alternatives, choices, or competing hypotheses that have been represented in consciousness will receive additional activation and thus will be enhanced. Given the capacity limitation of consciousness combined with the intentional selection of conscious states, very few preconscious candidates for actions and thoughts will achieve this additional, consciousness-mediated activation. What structures are most likely to be available for such additional activation? It will be those preconscious structures that have been selected as most responsive to current demands and intentions. Whatever structures are used for a current conscious construction will receive additional activation, and they will have been those selected as most relevant to current concerns. In contrast, alternatives that were candidates for conscious thought or action but were not selected will be relegated to a relatively lower probability of additional activation and therefore less likely to be accessed on subsequent occasions.
The evidence for this general effect is derived from the vast amount of current research showing that the sheer frequency of activation affects subsequent accessibility for thought and action, whether in the area of perceptual priming, recognition memory, preserved amnesic functions, or decision making (for a summary of some of these phenomena, see Mandler, 1989). The proposal extends such activations to internally generated events and, in particular, to the momentary states of consciousness constructed to satisfy internal and external demands. Thus, just as reading a sentence produces activation of the underlying schemas, so does (conscious) thinking of that sentence or its gist activate these structures. In the former case, what is activated depends on what the world presents to us; in the latter the activation is determined and limited by the conscious construction. Note that in order for the feedback function to make sense, we must assume that the "adaptive" function of construction that selects appropriate mental contents is also operating.
This hypothesis of selective and limited activation of situationally relevant structures requires no homunculus-like function for consciousness in which some independent agency controls, selects, and directs thoughts and actions that have been made available in consciousness. Given an appropriate database, it should be possible to simulate this particular function of consciousness without an appeal to an independent decision-making agency.
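Such a simulation can be sketched in a few lines. The following toy model (the schema names, the capacity of 5, the relevance weighting, and the boost value are all illustrative assumptions of mine, not quantities from this chapter) implements only the two mechanisms just described: a construction that selects the activated structures most responsive to current demands, and a feedback step that further activates whatever was selected. No homunculus appears anywhere; selection is simply a ranking.

```python
# Toy model of conscious construction plus feedback activation.
# Assumed, illustrative values throughout -- not claims from the text.

CAPACITY = 5          # limited capacity of consciousness (~5 chunks)
FEEDBACK_BOOST = 0.3  # extra activation conferred by being conscious

def conscious_construction(structures, relevance, capacity=CAPACITY):
    """Select the few structures most responsive to current demands:
    rank by activation weighted by situational relevance."""
    ranked = sorted(structures,
                    key=lambda s: structures[s] * relevance.get(s, 0.0),
                    reverse=True)
    return ranked[:capacity]

def feedback(structures, conscious_contents, boost=FEEDBACK_BOOST):
    """Conscious contents prime their underlying structures."""
    for s in conscious_contents:
        structures[s] += boost
    return structures

# Eight candidate structures, initially of comparable activation;
# only five are relevant to the (hypothetical) current situation.
structures = {f"schema_{i}": 1.0 for i in range(8)}
relevance = {"schema_0": 1.0, "schema_1": 0.9, "schema_2": 0.8,
             "schema_3": 0.7, "schema_4": 0.6}

for _ in range(3):  # three successive conscious constructions
    contents = conscious_construction(structures, relevance)
    structures = feedback(structures, contents)

# Selected structures now dominate; unselected alternatives are left
# with a lower probability of entering later constructions.
print(sorted(structures.items(), key=lambda kv: -kv[1]))
```

The design point is that the "decision" is nothing over and above ranking by activation and relevance; the conservative bias discussed below falls out of the loop, since whatever is selected once becomes more likely to be selected again.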
The proposal can easily be expanded to account for some of the phenomena of human problem solving. I assume that activation is necessary but not sufficient for conscious construction and that activation depends in part on prior conscious constructions. The search for problem solutions and the search for memorial targets (as in recall) typically have a conscious counterpart, frequently expressed in introspective protocols. What appear in consciousness in these tasks are exactly those points in the course of the search when steps toward the solution have been taken and a choice point has been reached at which the immediate next steps are not obvious. At that point the current state of the world is reflected in consciousness. That state reflects the progress toward the goal as well as some of the possible steps that could be taken next. A conscious state is constructed that reflects those aspects of the current search that do (partially and often inadequately) respond to the goal of the search. Consciousness at these points depicts waystations toward solutions and serves to restrict and focus subsequent pathways by selectively activating those that are currently within the conscious construction. Preconscious structures that construct consciousness at the time of impasse, delay, or interruption receive additional activation, as do those still unconscious structures linked with them. The result is a directional flow of activation that would not have happened without the extra boost derived from the conscious state.
Another phenomenon that argues for the re-presentation and re-activation of conscious contents is our ability to "think about" previous conscious contents; we can be aware of our awareness. There is anecdotal as well as experimental evidence that we are sometimes confused between events that "actually" happened and those that we merely imagined, i.e., events that were present in consciousness but not in the surrounds. Clearly the latter must have been stored in a manner similar to the way "actual" events are stored (Anderson, 1984; Johnson and Raye, 1981). It has been argued that this awareness of awareness (self-awareness) is in principle indefinitely self-recursive, that is, that we can perceive a lion, be aware that we are perceiving a lion, be conscious of our awareness of perceiving a lion, and so forth (e.g., Johnson-Laird, 1983). In fact, I have never been able to detect any such extensive recursion in myself, nor has anybody else to my knowledge. We can certainly be aware of somebody (even ourselves) asserting the recursion, but observing it is another matter. The recursiveness in consciousness ends after two or three steps, that is, within the structural limit of conscious organization.
The positive feedback that consciousness provides for activated and constructed mental contents is, of course, not limited to problem-solving situations. It is, for example, evident in the course of self-instructions. We frequently keep reminding ourselves (consciously) of tasks to be performed, actions to be undertaken. "Thinking about" these future obligations makes it more likely that we will remember to undertake them when the appropriate time arrives. Thus, self-directed comments, such as, "I must remember to write to Mary" or "I shouldn't forget to pick up some bread on the way home," make remembering more and forgetting less likely. Such self-reminding not only keeps the relevant information highly activated but also repeatedly elaborated in different contexts, thus ready to be brought into consciousness when the appropriate situation for execution appears. Self-directed comments can, of course, be deleterious as well as helpful. The recurrence of obsessive thoughts is a pathological example, but everyday "obsessions" are the more usual ones. Our conscious constructions may end up in a loop of recurring thoughts that preempt limited capacity and often prevent more constructive and situationally relevant "thinking." One example is trying to remember a name and getting stuck with an obviously erroneous target that keeps interfering with more fruitful attempts at retrieval. The usual advice to stop thinking about the problem, because it will "come to us" later, appeals to an attempt to let the activation of the "error" return to lower levels before attempting the retrieval once again. The fact that a delay may produce a spontaneous "popping" of the required information speaks to unconscious spreading of activation on the one hand and the apparent restricting effect of awareness on the other (see Mandler, 1994, for extensive discussion of these issues).
Another example of the deleterious effects of haphazard activation is represented in the likelihood of consciousness being captured by a mundane occurrence. Thus, as we drive home, planning to pick up that loaf of bread, conscious preoccupation with a recent telephone call may capture conscious contents to the exclusion of other, now less activated, candidates for conscious construction, such as the intent to stop at the store. Or, planning to go to the kitchen to turn off the stove, we may be "captured" by a more highly activated and immediate conscious content of a telephone call. The "kitchen-going" intention loses out unless we refresh its activation by reminding ourselves, while on the phone, about the intended task. If we fail to keep that activation strong enough and the plan in mind -- our dinner is burned.
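The capture-and-decay dynamic just illustrated can also be rendered as a toy computation. In this sketch (the decay rate, the reminder boost, and the two competing intentions are arbitrary assumptions chosen only to make the point), the single conscious "slot" simply goes to the most activated candidate, and an unrefreshed intention decays until it loses out:

```python
# Capture of consciousness by the most activated candidate.
# Decay and boost values are illustrative assumptions.

DECAY = 0.8   # multiplicative decay of activation per time step
REMIND = 0.5  # activation added by a conscious self-reminder

def step(activations, remind=None):
    """Decay all activations, optionally refresh one intention,
    and return the content that captures consciousness."""
    acts = {k: v * DECAY for k, v in activations.items()}
    if remind:
        acts[remind] += REMIND
    winner = max(acts, key=acts.get)  # single conscious slot
    return acts, winner

acts = {"turn_off_stove": 1.0, "phone_call": 1.2}

# Without reminding, the more activated phone call captures
# consciousness and the stove intention quietly decays...
a, w = step(acts)

# ...but a self-reminder keeps the stove intention competitive.
a2, w2 = step(acts, remind="turn_off_stove")
print(w, w2)
```

On this rendering, "keeping the plan in mind" is nothing more than repeated re-entry into the conscious construction, each entry buying the intention another round of activation.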
The additional function of consciousness as outlined here is generally conservative in that it underlines and reactivates those mental contents that are currently used in conscious constructions and are apparently the immediately most important ones. It also encompasses the observation that under conditions of stress people tend to repeat previously unsuccessful attempts at problem solution. Despite this unadaptive consequence, a reasonable argument can be made that it is frequently useful for the organism to continue to do what apparently is successful and appears to be most appropriate. Finally, the priming functions of consciousness interact in important ways with its construction. If, as I have argued, conscious construction responds to (subjectively) important aspects of the world, then it will be exactly those aspects that will be primed and enhanced for future use and access.
Given our recent insights into the parallel and distributed nature of (unconscious) mental processing, the human mind (broadly interpreted) needed a buffer between the multitude of possible thoughts and actions of comparable "strengths" competing for expression and the need for considered, effective action in the environment. Consciousness handles that problem by imposing limited capacity and seriality. Conscious and unconscious processes are - in major ways - contrasted by their differences in seriality and capacity. Conscious processes are serial and limited in capacity to some 5 contemporaneous items or chunks, whereas unconscious processes operate in parallel and are - for all practical purposes - "unlimited" in capacity. Any speculation about the evolution of consciousness needs to take these distinctions into account. And finally, given the assumption that current conscious contents are constructed out of available activated structures and current demands, it follows that under different demands the same underlying structures should give rise to different conscious representations (see Mandler, 1992).
To illustrate the importance of limited, serial conscious representation, imagine consciousness as it is, behaving as yours does, but with one - and only one - exception, namely its seriality. Imagine consciousness as a parallel machine that permits everything currently relevant (or unconsciously active) to come to consciousness all at once. You would be overwhelmed by thoughts, potential choices, feelings, attitudes, etc. of comparable "strength" and relevance. As you read a book, all the characters and their implications would cascade in your mental life. Consider the story of Lord Nelson and Lady Hamilton: As you read of one of their trysts you would also be aware/conscious of his victory at Trafalgar, his defeats in the Mediterranean, his anti-republicanism, his narcissism - and her eventual obesity, her Lancashire beginnings, her lovers - and her husband's interest in classical vases and volcanoes - and so on. A huge mishmash of associations and ideas would envelop you, and that discounts simple environmental events such as the chair you are sitting on, the lamp that illumines your book, and so forth. A "humanly" impossible situation. All of this would come in simultaneous snippets, still constrained by the limited capacity of the machine. In this account I have not relaxed the constraint of limited capacity. To relax that restriction too, to permit all unconscious content to become conscious, might strain the capacity of the reader to suspend disbelief. But wait just one more moment; would that consciousness not remind you of a consciousness discussed in some other place? Is that not a description of God - aware of all that all of "his children" (including the merest sparrow) ever do and think? Can one really move that easily from humanity to deity - by just suspending seriality, limited capacity, and the current relevance of consciousness?
At a minimum, then, a conscious mechanism involves three processes:
a. The selective/constructive representation of unconscious structures.
b. The conversion from a parallel and vast unconscious to a serial and limited conscious representation.
c. The selective activation (priming) by conscious representations that changes the unconscious landscape by producing new privileged structures.
Of these processes, the priming function is more directly and obviously associated with consciousness, whereas the others may be more indirect and inferred. However, all of these characteristics are amenable to empirical investigations, and in the end the question is whether these minimalist assumptions are adequate to handle the most obvious or inferred functions of consciousness. If not, what else is needed, and is it consistent with these assumptions? Or is there another core of assumptions that might command assent from a large number of theorists?
Until that question can be settled (or even asked?), there are a number of specific questions with which I have been concerned, and which deserve further investigation.
Consciousness and short-term memory (STM). The distinction between short-term and long-term memory goes back at least to the beginnings of the information-processing movement. Is it not about time that we bring STM into line with what we know about consciousness? William James coined the term "primary memory" to designate information that is currently available in consciousness. If in STM we "retrieve" only whatever consciousness will "hold," then we are limited to retrieving some five or so items. The limitation is the same for STM and limited-capacity consciousness, both of which are restricted to a single organized set with about 5 discernible constituent attributes, features, items, etc. But any operation on the limited material held in consciousness will further activate and bind that material, providing a small set of highly activated preconscious representations. Thus STM consists of "primary" currently conscious contents and additional material that is very easily retrieved because it is the product of these short-term retrievals and activations. In addition, items "in" STM may have been elaborated or merely activated, a difference that may determine their rate of decay or accessibility. Does this do justice to what we know about STM?
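One way to make the proposal concrete is a small sketch in which "STM" is just the few currently conscious items plus any recently boosted preconscious material that remains cheap to retrieve. The item names, the capacity of 5, and the retrieval threshold are illustrative assumptions, not parameters proposed in the text:

```python
# Toy rendering of the STM proposal: primary (conscious) contents
# plus highly activated preconscious residue. Assumed values only.

CAPACITY = 5               # items held in the conscious construction
RETRIEVAL_THRESHOLD = 1.4  # activation above which preconscious
                           # items are still very easily retrieved

def short_term_memory(activations):
    """Split activated material into 'primary' conscious contents
    and the highly activated preconscious remainder."""
    ranked = sorted(activations, key=activations.get, reverse=True)
    conscious = ranked[:CAPACITY]
    preconscious = [item for item in ranked[CAPACITY:]
                    if activations[item] >= RETRIEVAL_THRESHOLD]
    return conscious, preconscious

# Eight recently presented items with linearly declining activation.
activations = {"word_%d" % i: 3.0 - 0.3 * i for i in range(8)}
conscious, near = short_term_memory(activations)
print(conscious, near)
```

On this sketch, "forgetting from STM" is nothing but activation falling below the cheap-retrieval threshold, which is at least compatible with the elaboration-versus-activation distinction raised above.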
Why is the limited capacity what it is? Whether one wishes to define the limited capacity as 3 or 5 or 7 items/chunks, some such magnitude has been accepted ever since George Miller's seminal paper in 1956 on the "magical number." Of all the possible genetic determinants of human cognition, the one that defines the limited capacity of consciousness seems to demand more serious attention than some of the more extravagant evolutionary conjectures that circulate these days. It seems intuitively reasonable that the number needs to be more than 2 and probably less than 10 if fast decision processes on a reasonable number of alternatives are required for survival. But why the number we've got?
How do we determine conscious contents? Nearly thirty years ago Adrian (1966) noted that psychology's "uncertainty principle" may well be the fact that the very interrogation of conscious contents may alter these contents. Can we circumvent this problem? What alternatives, such as Dennett's (1991) heterophenomenology, are available?
Esoteric and other states of consciousness. Currently there is a rather wide gulf between the cognitive science community on the one hand and equally passionate investigators of esoteric and altered states of consciousness on the other. Neither side seems to pay much attention to what the other has to say, and given that we speak from the cognitive science side it may be time to take a look at the phenomena that the "others" cultivate. I tried to do that early on (Mandler, 1975b) when I suggested that a variety of different esoteric and meditative methods produce "conscious stopping," i.e., a frame-freezing experience. How does that come about?
On not being conscious. Patients with very dense amnesias have given us some anecdotal guides on the experience of "not being conscious." Tulving (1985) reports such a patient's description of living in a permanent present and not being able to think about future plans or events. More extensive follow-ups to these interesting leads should be most useful for a better understanding of "being conscious."
I want to conclude with comments on the place of consciousness in the discussions of "mind," even though the confusion between the two has created more heat than light.
There are specific, and sometimes very precise, concepts associated with the function of larger units such as organs, organisms, and machines, concepts that cannot without loss of meaning be reduced to the constituent processes of the larger units. The speed of a car, the conserving function of the liver, and the notion of a noun phrase are not reducible to internal-combustion engines, liver cells, or neurons. But nobody talks about the Cadillac-acceleration, the liver-sugar, or the noun-phrase-cell problem. Complex entities may develop new functions - a notion that has sometimes been referred to as emergence. The mind has functions that are different from those of the central nervous system qua nervous system, just as societies function in ways that cannot be reduced to the function of individual minds. This is, of course, true even within bounded scientific fields; mechanics and optics cannot be reduced to nuclear physics.
Some of the difficulty that has been generated by the mind-body distinction stems from the failure to consider the relation between well-developed mental and physical theories. Typically, mind and body are discussed in terms of ordinary-language definitions of one or the other. Because these descriptions are far from being well-developed theoretical systems, it is doubtful whether the problems of mind and body as developed by the philosophers are directly relevant to the scientific distinction between mental and physical systems.
Once it is agreed that the scientific mind-body problem concerns the relation between two sets of theories, the enterprise becomes theoretical and empirical, not metaphysical. And the conclusion would be that we do not yet know enough about either system to develop a satisfactory bridging system/language. If, however, we restrict our discussion of the mind-body problem to the often vague and frequently contradictory speculations of ordinary language, then, as centuries of philosophical literature have shown, the morass is unavoidable and bottomless.
For example, we can and do, in the ordinary-language sense, ask how it is that physical systems can have "feelings." A recurring philosophical blockbuster has been the question how a physical brain can generate mental qualia such as color sensations. The question has produced many premature explications, however ingenious some of them are (such as Dennett's, 1991). A healthy agnosticism, a resounding "I don't know" might be well-placed at the beginning of these interchanges. We don't know, and we might know sometime in the future, but is the question really different from any other island of human ignorance? Such questions assume that we know the exact nature of the physical system and of a mental system that produces "feelings." Usually, however, the question is phrased as if "feelings" were a basic characteristic of the physical and mental system instead of one of its products. Not only is the experience of a feeling a product, but its verbal expression is the result of complex mental structures that intervene between its occurrence in consciousness and its expression in language. If we have truly abandoned Cartesian dualism, then one may permit the question of how the brain "does" consciousness, seen as just another thing that it does.
The study of consciousness has also had a very modern hurdle in its way. In their preoccupation with the computer analogy, many cognitive scientists have become uneasy with consciousness as a characteristic of one aspect of mind. In part because the problem of computer consciousness is at the least complex (though not difficult for science-fiction writers), some philosophers and others have become closet epiphenomenalists - refusing to assign to consciousness any function in mental life (e.g., Jackendoff, 1987; Thagard, 1986 - who is, however, willing to let consciousness have some functions in applied aspects of behavior!).
Dennett, for example, much like most philosophers, is primarily concerned with the appearance and "feel" of consciousness and becomes uncharacteristically vague when talking about its possible functions (Dennett, 1991, pp. 275 ff.; see also Mandler, 1993). Another inside-out theorist is very specific in his defense of the approach: Jackendoff (1987, p. 327) rejects any inquiry as to what functions consciousness might serve. He specifically endorses the preferential use of evidence that is directed toward what consciousness is, not what it is for. The attractiveness of the inside-out approach is found in Chomsky's approach to linguistics. Questions of the function of language are secondary - in contrast to the linguistic "functionalists" who preferentially consider contemporaneously several aspects of language, including its communicative, cognitive, pragmatic, and social "functions," in order to understand its origin and structure. The inside-out approach is also related to the preference for some sort of central homunculus that directs and knows all. Not all homunculi are bad, but this particular one usually adopts the language of the board room, with "executives" directing "slaves" and similar metaphors. It is, of course, inevitable that consciousness-talk at the end of the 20th century will reflect 20th-century mores and prejudices, whether these are phrased in computer language, boardroom talk, or whatever. The best we can do is to be aware of such obvious lures and to try to avoid evanescent sociocentric approaches that are likely to have a relatively short life. On the other hand, such pious exhortations may be useless, since it is highly probable that we cannot truly escape our current situation and past history.
Adrian, E. D. (1966). Consciousness. In J. C. Eccles (Ed.), Brain and conscious experience. New York: Springer
Anderson, R. E. (1984). Did I do it or did I imagine doing it?. Journal of Experimental Psychology: General, 113, 594-613.
Atkinson, R. C. and Shiffrin, R. M. (1968). Human memory: A proposed system and its control processes. In K. W. Spence and J. T. Spence (Eds.), The psychology of learning and motivation (Vol. 2). New York: Academic Press
Baddeley, A. (1989). The uses of working memory. In P. R. Solomon, G. R. Goethals, C. M. Kelley, & B. R. Stephens (Eds.), Memory: Interdisciplinary approaches (pp. 107-123). New York: Springer Verlag
Craik, F. I. M. and Watkins, M. J. (1973). The role of rehearsal in short-term memory. Journal of Verbal Learning and Verbal Behavior, 12, 599-607.
Crick, F. and Mitchison, G. (1983). The function of dream sleep. Nature, 304, 111-114.
Dennett, D. C. (1991). Consciousness explained. Boston: Little, Brown & Company
Freud, S. (1900). The interpretation of dreams. In The Standard Edition of the Complete Psychological Works of Sigmund Freud (Vols. 4 and 5). London: Hogarth Press, 1975
Gregory, R. L. (1981). Mind in science. New York: Cambridge University Press
Hobson, J. A. (1988). The dreaming brain. New York: Basic Books
Hobson, J. A., Hoffman, S. A., Helfand, R., and Kostner, D. (1987). Dream bizarreness and the activation-synthesis hypothesis. Human Neurobiology, 6, 157-164.
Jackendoff, R. (1987). Consciousness and the computational mind. Cambridge, Mass.: MIT Press
James, W. (1890). The principles of psychology. New York: Holt
Johnson-Laird, P. (1983). Mental models. Cambridge: Cambridge University Press
Johnson, M. K. and Raye, C. L. (1981). Reality monitoring. Psychological Review, 88, 67-85.
Kahneman, D. and Treisman, A. (1984). Changing views of attention and automaticity. In R. Parasuraman & D. R. Davies (Eds.), Varieties of attention (pp. 29-61). New York: Academic Press
Lackner, J. R. (1988). Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain, 111, 281-297.
Mandler, G. (1975a). Consciousness: Respectable, useful, and probably necessary. In R. Solso (Ed.), Information processing and cognition: The Loyola symposium (Also in: Technical Report No. 41, Center for Human Information Processing, University of California, San Diego. March, 1974., pp. 229-254). Hillsdale, N.J.: Lawrence Erlbaum Associates
Mandler, G. (1975b). Mind and emotion. New York: Wiley
Mandler, G. (1984a). The construction and limitation of consciousness. In V. Sarris and A. Parducci (Eds.), Perspectives in psychological experimentation: Toward the year 2000. Hillsdale, N.J.: Lawrence Erlbaum Associates
Mandler, G. (1984b). Mind and body: Psychology of emotion and stress. New York: Norton
Mandler, G. (1985). Cognitive psychology: An essay in cognitive science. Hillsdale, N.J.: Lawrence Erlbaum Associates
Mandler, G. (1986). Aufbau und Grenzen des Bewusstseins. In V. Sarris & A. Parducci (Eds.), Die Zukunft der experimentellen Psychologie. Weinheim und Basel: Beltz
Mandler, G. (1988). Problems and directions in the study of consciousness. In M. Horowitz (Ed.), Psychodynamics and cognition (pp. 21-45). Chicago: Chicago University Press
Mandler, G. (1989). Memory: Conscious and unconscious. In P. R. Solomon, G. R. Goethals, C. M. Kelley, & B. R. Stephens (Eds.), Memory: Interdisciplinary approaches (pp. 84-106). New York: Springer Verlag
Mandler, G. (1992). Toward a theory of consciousness. In H.-G. Geissler, S. W. Link & J. T. Townsend (Eds.), Cognition, information processing, and psychophysics: Basic issues (pp. 43-65). Hillsdale, N.J.: Lawrence Erlbaum Associates
Mandler, G. (1993). Review of Dennett's "Consciousness explained". Philosophical Psychology, 6, 335-339.
Mandler, G. (1994). Hypermnesia, incubation, and mind-popping: On remembering without really trying. In C. Umilta & M. Moscovitch (Eds.), Attention and Performance XV: Conscious and unconscious information processing (pp. 3-33). Cambridge, Mass.: MIT Press
Mandler, G. (1995). Origins and consequences of novelty. In S. M. Smith, T. B. Ward & R. Finke (Eds.), The creative cognition approach. Cambridge, MA: MIT Press
Marcel, A. J. (1983a). Conscious and unconscious perception: Experiments on visual masking and word recognition. Cognitive Psychology, 15, 197-237.
Marcel, A. J. (1983b). Conscious and unconscious perception: An approach to the relations between phenomenal experience and perceptual processes. Cognitive Psychology, 15, 238-300.
McClelland, J. L. and Rumelhart, D. E. (1981). An interactive activation model of context effects in letter perception: Part 1. An account of basic findings. Psychological Review, 88, 375-407.
Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81-97.
Nakamura, Y. (1989). Explorations in implicit perceptual processing: Studies of preconscious information processing. Unpublished doctoral dissertation, University of California, San Diego.
Neisser, U. (1963). The multiplicity of thought. British Journal of Psychology, 54, 1-14.
Neisser, U. (1967). Cognitive psychology. New York: Appleton-Century-Crofts
Nielsen, T. I. (1963). Volition: A new experimental approach. Scandinavian Journal of Psychology, 4, 225-230.
Nisbett, R. E. and Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84, 231-259.
Posner, M. I. and Boies, S. J. (1971). Components of attention. Psychological Review, 78, 391-408.
Posner, M. I. and Keele, S. W. (1970). Time and space as measures of mental operations. Paper presented at the Annual Meeting of the American Psychological Association.
Posner, M. I. and Snyder, C. R. R. (1975). Attention and cognitive control. In R. Solso (Ed.), Information processing and cognition: The Loyola symposium. Potomac, Md.: Lawrence Erlbaum Associates
Robert, W. (1886). Der Traum als Naturnothwendigkeit erklärt. Hamburg: H. Seippel
Shallice, T. (1972). Dual functions of consciousness. Psychological Review, 79, 383-393.
Shallice, T. (1991). The revival of consciousness in cognitive science. In W. Kessen, A. Ortony & F. Craik (Eds.), Memories, thoughts, emotions: Essays in honor of George Mandler (pp. 213-226). Hillsdale, NJ: Lawrence Erlbaum Associates
Thagard, P. (1986). Parallel computation and the mind-body problem. Cognitive Science, 10, 301-318.
Thatcher, R. W. and John, E. R. (1977). Foundations of cognitive processes. Hillsdale, N.J.: Lawrence Erlbaum Associates
Tulving, E. (1985). Memory and consciousness. Canadian Psychology, 26, 1-12.
Warrington, E. K. (1975). The selective impairment of semantic memory. Quarterly Journal of Experimental Psychology, 27, 635-657.
Woodward, A. E., Bjork, R. A., and Jongeward, R. H., Jr. (1973). Recall and recognition as a function of primary rehearsal. Journal of Verbal Learning and Verbal Behavior, 12, 608-617.