A nice recent development in philosophy publishing is the "5 Questions" series, in which philosophers in various fields offer personal and autobiographical ruminations. Snippets from a few of these volumes are available online, including formal philosophy, foundations of physics, normative ethics, philosophy of mathematics, political philosophy, and a few others.
The latest in the series is Mind and Consciousness: 5 Questions, edited by Patrick Grim, with an impressive cast of contributors. I've now written a draft of my contribution to this volume. This is mainly autobiographical rambling and metaphilosophical pronouncement rather than philosophy per se, so it won't be to everyone's tastes. But any thoughts are welcome. I see that the contributions by David Rosenthal and Michael Tye are also available online.
Very interesting reading, thanks.
I have a - most likely mainly terminological - question concerning what you said about the 'extended mind' thesis. You say "I have ended up being both an internalist and an externalist, though I don’t think that there’s really a contradiction here."
What kind of externalism would this be - 'of mind with respect to organism'? Is there a name for it?
I would have thought that this issue is completely independent from the question of semantic internalism or externalism.
Posted by: Kappatoo | October 10, 2008 at 12:38 AM
Yes, there are multiple kinds of externalism/internalism at issue here. There are even multiple distinctions. One distinction is that between semantic E/I (meaning of words) and content E/I (contents of thoughts). Here the issue is content E/I. Another distinction is that between organismic E/I (internal/external to the biological organism) and systemic E/I (internal/external to the cognitive system). Here I am a systemic internalist (about the relevant sort of content) but an organismic externalist. A third distinction is the distinction between active externalism (which holds that the relevant external features are those active in the current cognitive loop) and passive externalism (which allows that past and distal features may be relevant). Here I'm an active but not a passive externalist (about the relevant sort of content).
I suppose that the systemic/organismic distinction is the key to resolving the contradiction. I don't know whether one or the other of these is standard in defining internalism/externalism in the post-Putnam/Burge literature. My sense is that both are often used.
Posted by: djc | October 10, 2008 at 06:52 PM
Okay, thanks. Some of these distinctions are indeed new to me. I should read your article on the topic (with Andy Clark).
Admittedly, content internalism is often characterized as saying that mental content is shared by 'intrinsic duplicates' or 'duplicates from the skin in'. At least the second of these formulations would be incompatible with your view, which would thus make you a content externalist. But I guess most of those formulations are given with the implicit assumption that the mind at least does not extend beyond the body.
To me, these seem to be totally independent issues. I would thus think that defining content internalism as systemic internalism would allow for a more useful way of structuring the debate.
Posted by: Kappatoo | October 11, 2008 at 12:46 AM
Thanks for the reading, it was very interesting and inspiring.
While I was reading, the following question occurred to me: if there is something like a mind, and there is something like consciousness, do they necessarily go together? My first intuition was that they don't; but if so, what can the unconscious mind contribute to consciousness?
Posted by: Lena | October 11, 2008 at 01:55 PM
I enjoyed the article; it really offered some insight into how your ideas about consciousness gradually took shape over time.
Posted by: Matthew C. | October 13, 2008 at 01:59 PM
Thank you very much, David. This text is really very informative. It's interesting, however, that you believe your "hard problem" was not a big contribution to the philosophy of mind. As far as I know, the question "Why is our brain activity accompanied by subjective experience?" was never the focus of attention before 1994.
Posted by: Vadim Vasilyev | October 15, 2008 at 04:38 PM
Vadim: Well, there has been any amount of discussion of this question in the last 150 years or so (surprisingly, less before then): Huxley, Tyndall, Broad, Sherrington, Nagel, and Levine are just a few of those who come to mind. It may be that the intensity of the focus in the 1990s was greater, and maybe the "hard problem" formulation helped with that, but if so I'd see that as more of a sociological contribution than a philosophical contribution per se.
Posted by: djc | October 15, 2008 at 06:11 PM
David, it is clear enough that your hard problem includes two questions: "How is it possible for the brain to give rise to consciousness?" and "Why does consciousness exist at all?" As far as I know, Huxley and the others asked only the first question. The second question is quite innovative, I believe. Still, I can't exclude that it can be reduced to the question about the function of consciousness, and that question is not new, of course; it can already be found in James' article "Are We Automata?" (1879), for example. But that doesn't change anything. Descartes' cogito can be found everywhere, but we believe it was he who invented that principle, because, as the history of philosophy shows, to invent something in philosophy is to shout about it so loudly that everyone hears you. I think you have done precisely that in the case of your hard problem.
Posted by: Vadim Vasilyev | October 16, 2008 at 07:57 AM
Vadim: The question "Why does consciousness exist at all?" is one of the best. Nagel seems to have done some work with it, although he would probably phrase it as "why should subjectivity exist?". The whole subjectivity problem can be further summed up in the question, "why is reality being experienced from the perspective of my body, instead of another?" (the childhood "why am I me" question). I have found this to be one of the most effective arguments against those who have a mechanistic view of consciousness. There is no mechanistic or physical answer to this question, since in physicalist models there are no preferred points in space or time (nor is there a preferred universe). Why should there be this obviously preferred perspective in space and time? Why should this bias or asymmetry exist? Yet, that is simply what is observed. If physicalism were a complete description of reality, we should be zombies. The universe should be one big machine, if presentism were the correct view of time, or one big structure, if block time were correct.
To paraphrase Nagel's take on it: In our physical system (and you can extend that to other universes, if you like), you have everything you need to construct any number of conscious brains, except for one item of information that's missing - which one of those brains is "you"? Where is that index information encoded in the physical system? It's missing, and it's not a trivial piece of information. Everything depends on who and what you are. Reality would be quite different if you were a dog, or an advanced computer, or an alien in another galaxy, or a being in another universe. Why you, and why here - on this rock, around an ordinary star in an ordinary galaxy, among hundreds of billions of other galaxies, in a universe that may be one of countless others? It is as if the entire multiverse has a preference for viewing itself through your eyes. That is simply what is observed. Doesn't it seem odd? Subjectivity...
Posted by: Jeff | October 22, 2008 at 04:46 AM
Jeff, thank you for your comments. I think, however, that David Chalmers' question is not an indexical one. As far as I understand it, it is not a question about why my consciousness exists here and now. Rather, it is a question about why consciousness exists at all.
Posted by: Vadim Vasilyev | October 23, 2008 at 05:44 AM
Vadim - well, my take on it would be this: if you ask why consciousness should exist at all, you must first define what consciousness is. It is certainly an overused word and there may be many definitions with slightly different shades of meaning, but to me at least, subjectivity is a vital element. I cannot conceive of an objective consciousness or an objective experience. Both "consciousness" and "experience" necessarily imply subjectivity, and subjectivity implies an indexical. Just my relatively naive opinion, of course ;)
Posted by: Jeff | October 23, 2008 at 01:48 PM
Jeff: here comes a naive reply... I would question your claim that subjectivity implies an indexical. I think we can ask the question "why is there any subjectivity anywhere?" in a way that is quite distinct from the question "why is there my particular subjectivity?" Of course, to make sense of the question, we must first be acquainted with subjectivity in the form of our own, but we can still abstract the concept from our particular experience of a specific instantiation of it when asking the question.
Posted by: CERL | October 23, 2008 at 09:10 PM
"Of course, to make sense of the question, we must first be acquainted with subjectivity in the form of our own"
CERL: And that is why I would say that you need an indexical - the thing about subjectivity is that it's, well, subjective ;) Unlike thermodynamic systems, subjective experience is a 100% closed "system" - it does not dissipate or mix like matter (as far as I know, that was first stated by the pre-Socratic Greek philosopher Anaxagoras, fr. 12, c. 450 B.C.E.). Consciousness is a strongbox from which there is no escape, other than possibly unconsciousness or death. Yes, I can infer that some other arrangement of atoms or software "experiences" something perhaps like I do, but I can never really know. No matter how drunk or stoned I get, I will never, ever experience reality from the perspective of another mind (future Borg implants or mind melds excepted ;) Everything that is known or can be known (science, math, philosophy, religion, art, noumenal, phenomenal, etc.) is a part of that subjective first-person experience, and has never actually been separated from it. Not ever. Objectivity is a third-person model in the first-person mind, one that agrees well with most of what is observed (subjectivity and why-am-I-me excepted).
BTW, solipsism will make the subjectivity and why-am-I-me problems vanish, but then you're trading these problems for a whole set of even more severe problems (not that solipsism is refutable in an absolute sense).
And of course, all of this is my highly subjective opinion...
Posted by: Jeff | October 27, 2008 at 08:20 AM
OK, so Jeff, basically you're invoking the problem of other minds - but I still don't see how it supports your contention that subjectivity implies an indexical. Basically we have three options - either we solve the problem of OM, we become solipsists, or we ignore the problem and just assume that other minds exist. I take it the first two options are nigh-impossible and absurd, respectively, so I (and most others, I think) proceed under the assumption that others do in fact have minds like mine.
This being taken for granted, it makes just as much sense to ask "why is there consciousness at all, the thing that is instantiated in all these different minds" as it does to ask "why is there blood at all, the thing that is instantiated in all these different circulatory systems"? And the term "consciousness" in the first question need no more be indexical than "blood" in the second. It is true that we must have firsthand experience of consciousness to know whereof we speak, but this is also true of green, for instance - yet I can still unproblematically quantify over all green things, and when I ask "why are all the green things green", the term 'green' is not an indexical.
Posted by: CERL | October 27, 2008 at 03:48 PM