Comments on Sub-sub-librarian: The Traveler's Dilemma

Comment by Silentio (2008-06-30 08:35):

One more thing. I think your idea about the pragmatic nature of rationality is a good one. The question "what do we want rationality for?" is exactly the right question, in my opinion. (And you know how Dennett says that good philosophy is about asking the right questions.) I wonder, though: are you really a pragmatist about scientific truth? (It's okay if you are. I think I am too, at some level.) But then, what is the "use" or purpose of science if not just the desire to know the truth? And will you then give a pragmatic account of truth? (Again, it's okay if you do; I'm just curious.)

Also, the idea that rationality can sometimes lead to an unsatisfying life is right on. This is the excellent point that William James makes against Clifford's "snarling logicality" with the example of the man who doesn't know (doesn't have sufficient evidence) that a certain woman likes him, and so refuses to act until he has sufficient evidence that she likes him. James says he'll bet ten to one that the woman will never come to like him. Sometimes it takes action without sufficient justification that some fact is the case to bring that fact about.

-- Silentio

Comment by Silentio (2008-06-30 08:04):

Sub-sub:

Interesting post. I remember talking about this with you while on a hike somewhere.
I can't remember where.

After reading your post and some of the replies, it seems to me like there's an ambiguity in the way the paradox is set up. What is the goal of the game: to get more money than the other person, or to get a large amount of money? The rationality of the decision depends on what the goal is, and from what you described it isn't clear what the goal of the game is. I think it is clear that if the goal is to beat your opponent, then the best thing to do is to say $2. That way, you are guaranteed that your opponent won't underbid you and thus get more money. Any bid higher than $2 makes it more likely that you won't come out on top. But if the point is to get a large sum of money (though not necessarily more than your opponent gets), then the best thing to do would be to go high. And if both participants had that same goal, they would both bid $100 and each collect the full $100.

On funny paradoxes: the Monty Hall paradox is pretty cool: http://en.wikipedia.org/wiki/Monty_Hall_problem

On human rationality: some of the most interesting work on human irrationality comes from the social psychologists up at the University of Michigan, Nisbett and Ross. They have all these crazy experiments about how irrational humans are. My friend Max is writing a dissertation on this stuff.
I think one really interesting finding is that people often give post hoc rationalizations for decisions when in reality they are following some far more basic decision strategy, such as "choose the one on the left." An interesting article on this is Nisbett and Wilson, 1977: http://www.psychwiki.com/wiki/Nisbett_&_Wilson_(1977)

Another interesting line of research on the failure of human reason to measure up to the standards of logic and decision theory is the Wason selection task: http://en.wikipedia.org/wiki/Wason_selection_task

The interesting thing about research with the Wason selection task is that if you change the subject matter, people's performance improves drastically. Thus, even though the formal logic of the situation is the same, performance changes with the content of what is reasoned about. Human reasoning doesn't seem to match up with the strictures of formal logic, according to which inferences are made in virtue of their form, not their content.

I think it is probably correct to say that human cognition did not evolve to solve logical problems. What this area of research seems to suggest is that humans are not computers, or even computer-like in our processing. Big surprise (much to the chagrin of Fodor). One interesting thought here is how an immersion in language (the capacity for which may very well be innate, which doesn't mean no environmental stimulus is necessary in order to actually acquire language) can actually change the way we think. Andy Clark has some interesting (though, I think, ultimately unsuccessful) ideas about how language transforms our thought. Think, for example, about the ability (shared by nonlinguistic creatures) to make rough estimations of differences between quantities. Someone could tell the difference between 2 and 5 apples (but probably not between 9 and 10). Chimps can do this.
But chimps cannot add to get a precise sum, and they can't count to tell the difference between, say, 82 and 79. Clark thinks that symbols are needed to pull off a task like this. Our ability to use symbols brings in its wake new skills unavailable to nonlinguistic creatures. Perhaps an immersion in language enables us to become more logical and more rational. Or perhaps not. Maybe the cognitive mechanisms that account for many of our decisions are not linguistic. But it seems that at least some are. So even if we aren't perfectly rational, we're more so than chimps. And maybe that is because we have language.

Finally, I wanted to say a few things about rationality. I think you're right to say that rationality is about acting for reasons. Perhaps acting for reasons is integrally tied up with presenting justifications for actions, and perhaps justification for action is essentially linguistic. Typically, when we think of acting for reasons, we are attributing to either ourselves or others some propositional attitude. I think it is plausible that propositional thinking requires language, and so animals probably don't act for reasons. This isn't to say they don't think. Of course they do. They just don't have propositional thought. And propositional thought is where reasons reside.

So acting for reasons is certainly part of rationality. But there's another question (one that is the purview of epistemology) of whether reasons are justified. Call this Rationality with a capital "R." This issue of justification is what philosophers are mostly concerned with. And the question of justification and the challenges to it (e.g., "naturalized epistemology," "naturalized ethics," and, in general, attempts to naturalize normativity) are some of the deepest and most difficult there are in philosophy today (or so I think).
I would submit that these are the real questions and contentions at the root of much disagreement across a wide range of philosophy, from reductionist theories of mind to free will debates and beyond. Are we nothing more than a bunch of relatively stupid homunculi that, when working in concert, produce what we call rationality (and Rationality with a big "R" too, for that matter)? And if so, does that mean we have eliminated Rationality?

-- Silentio

Comment by Sub-sub-librarian (2007-06-23 07:57):

The Crazy Dreamer: Yes, I think your re-re-revised thesis statement works well for both Basu and me. On the free will point, I'm not sure Descartes' evil genius does quite what you say it does. As far as I understand it, the evil genius is an exercise in hyperbolic doubt, used to see what we can know without relying on intuition. The fact that we can think of such a thing as intuition, be conscious of our intuition, means that we exist. We are not relying on intuition to come to that conclusion, although intuition might be the content of our consciousness while we consider whether we exist or not.

My point is that there are situations where it makes sense to rely on intuition and some where it doesn't. You have an intuition that you have free will, and I think it is reasonable to act on that belief. There is no satisfactory way to explain how free will works in a cause-and-effect, material world without having recourse to something like an immaterial soul that works in some mysterious, otherworldly way. An analogy here would be relativity. Our intuition is that time is not relative, and there is a good reason for that, considering our dimensions and speed. But the fact is that time is relative.
All my life I will probably get away with assuming that time is constant, but if I ever start doing some serious intergalactic traveling (or become an engineer for GPS technology), I had better think ahead. The determined human mind has not been demonstrated experimentally as conclusively as relativity has, but in my opinion it is not a matter of if we can do so, only when.

What You Dream: I see the point of your third paragraph. The most successful choices in the Traveler's Dilemma are successful for a reason, and eventually we might be able to explain that based on a better understanding of reality. We both think that's true because we believe that the world acts according to natural laws, and that these laws can be understood using logic, mathematics, etc. This is my point about a pragmatic understanding of rational behavior. If the logic of game theory is pointing us in the wrong direction, then it is better to go with our intuition and hope we can come up with a better rational explanation for why it works.

When it comes to religious texts, I agree that they are not likely to tell us the truth about the origin of the universe or of life, and they should not be seen as infallible historical documents. But they do a little more than provide a social life. A belief in a loving god is deeply satisfying and gives a sense of meaning that is confirmed by everyday experience in the minds of believers. This intuition is so common, and makes sense to so many, that it is hard to deny that it is rational behavior for coping with an otherwise "meaningless" life.
I personally find meaning in a life without god, but many find this to be impossible.

-- Sub-sub-librarian

Comment by The Unapologetic (2007-06-20 08:06):

Comment disclaimer: I am absolutely ignorant when it comes to game theory.

I'm going to try to change the direction of these comments, because your underlying question is very interesting: are there times when it is better, existentially, to behave irrationally on a consistent basis?

Before game theory came along, TCD's logic/probabilities made the most sense. Game theory brought an added layer: a kind of mathematics/logic of psychology. In the same way, it's possible that acting instinctually (irrationally) may later be discovered to be rational as our knowledge of the brain and psychology increases. For example, the first people who thought it was a good idea to wash their hands before doing open-heart surgery probably didn't realize how rationally they were behaving, and their behavior was later justified by scientific discovery. Similarly, I think it's possible that behavior that is only based on likelihoods right now could eventually be proven logical. I do not, however, believe that behavior based on religious texts like the Bible will eventually be proven rational, because of how unlikely it is that they are based on truth.

People who believe the Bible tells a good story are probably right, but taking the further step of basing your life on it isn't looking at it rationally at all, when so much of it is internally inconsistent and fails to describe the world as it really is.
If it fails in those regards, how can it be expected to direct us in any truly helpful way?

Your decision to let Diamond's story enrich your life is based on the likelihood of its truth, its seeming accuracy. His book is full of scientific evidence and logical theories, and it can serve as a basis for explaining a hell of a lot of stuff. Religious texts cannot make this claim, since it is not likely, based on evidence, that they're true.

With our current breadth and depth of knowledge about the world and humans, the best we can do at this point is behave as rationality, science, and likelihood dictate. I truly believe that starting with a rational, scientific foundation sets a person off better down the path of self-discovery and the search for universal truth, and that starting with a religious foundation only puts us behind on that path.

The only value I can see in religious belief and behavior is purely social, since, yes, it would make my social and family life run much more smoothly if I believed the same things they do. In the end, the choice to behave irrationally has to benefit the parts of life someone feels are most important, and if the social element wins out over something like a personal quest for truth, there isn't much I can argue against that.

-- The Unapologetic

Comment by thecrazydreamer (2007-06-20 07:37):

Thanks for pointing that out. That really seems to change the tenor of the entire article for me.

My protest is more with the advert-teaser of "When It Pays to Be Irrational." Perhaps Basu had no hand in that subtitle, and it is a misrepresentation of what he was attempting to say.

Based on that, I'd say I mostly agree with his assertions, since that sort of takes the teeth out of it.
I guess I'd have to change his thesis to something like "Game theory is rational, but it is also rational not to rely solely on game theory, as it may be flawed." I think that ends up supporting your argument better as well.

As for the free will debate, I'm willing to admit that I don't know enough about it. My current stance is that determinism is simply counter-intuitive. I can imagine the theories, but I do not know of any compelling arguments that would lead me to discard my intuition in this case. From what I can see, determinism is sound as a theory in a way similar to Descartes' evil genius: in both cases, while the theory doesn't collapse on itself, I don't find that sufficient evidence to adopt it. I could not argue against Descartes without relying on my intuition, and I cannot argue against determinism without my intuition either.

-- thecrazydreamer

Comment by Sub-sub-librarian (2007-06-20 06:51):

Actually, going back to my original post, I quote Basu as saying, "there is something rational about choosing not to be rational when playing the Traveler's Dilemma." So he doesn't think it is irrational to go against the most formal rational arguments.
I'm suggesting the same goes for free will and (some) religious beliefs.

-- Sub-sub-librarian

Comment by Sub-sub-librarian (2007-06-20 05:40):

Yes, the thesis statement for Basu's article would be "game theory is rational, and going against game theory is irrational… Therefore, sometimes it is beneficial to be irrational." What I'm asking is: if going against formal rational conclusions is consistently beneficial in certain circumstances, would we want to call such acts irrational?

You make an interesting point about starting at $2 and working your way up. The reason that doesn't work is that all answers from $3 to $100 are values from which an individual "player can do better by deviating unilaterally." Starting at $2 would only stop an individual from raising the value at all, because by giving a value of $3 you are running the risk of being underbid. That's the game theory answer.

I have a hard time believing game theory's conclusions as well. My concern is with rationality as a process that we all engage in at different levels and to different extents. As I said in my original essay, rationality at its best is abstraction in terms of logic and mathematics. There are different reasons we use for doing things, which I still consider rationality, and sometimes our less formal reasons are at odds with our abstract formulations. One way to decide between the two is to use the one that works best.

The free will situation is the most convincing to me. I'd be interested in your thoughts on this topic when you consider the arguments for and against human free will.
Again, I believe there is no such thing, but I also believe that living as if there is free will is rational behavior.

-- Sub-sub-librarian

Comment by thecrazydreamer (2007-06-18 08:28):

I can somewhat see what you're saying... Correct me if I'm misunderstanding this: game theory suggests that humans will benefit most from a bid of $2. In reality, humans actually benefit more from higher bids. Basu is asserting that game theory is rational, and going against game theory is irrational. Therefore, sometimes it is beneficial to be irrational.

Is that a fair summary? If so, I figuratively pull out my red sharpie and circle the statement "game theory is rational, and going against game theory is irrational."

I understand what game theory is saying... that you're trying to guess what your opponent will choose, and if you keep guessing one lower than your opponent then you end up better off. However, what is stopping the same pattern of hypotheticals from occurring in the opposite direction? Start at $2, and if your opponent chose $4, you realize you could've earned more money if you had chosen $3. And if they had chosen $5, you could've earned more having chosen $4, etc. If anything, I think it is an irrational assumption that your opponent will start at $100 and follow the recursion process that Basu predicts.

In other words, Basu is saying that this game theory is rational but incorrect. I am saying this game theory is incorrect, therefore irrational.

Perhaps we are at odds over the definition of the word irrational, or perhaps we are at odds over the acceptance of the rationality of game theory.

As for your free will topic, I haven't been exposed to a convincing argument against free will.
If I were willing to concede the point, or for others who are willing to concede it, it's an acceptable example. Perhaps we can revisit this discussion after I learn more about the free will vs. determinism arguments.

-- thecrazydreamer

Comment by Sub-sub-librarian (2007-06-17 16:10):

Thanks for the comment, Crazy Dreamer. I too found it hard to believe that the rational answer is $2. The trick, I think, is that there is a psychology built into the analysis. Basu says that game theorists describe this kind of situation, where I think that you think that I think that you think I will say X, by "saying that rationality is common knowledge among the players." This means that a formal analysis can't just be a matter of figuring out the odds, because this is not a random event. For this kind of situation, theorists rely on equilibrium concepts, such as the Nash equilibrium (of A Beautiful Mind fame). "A Nash equilibrium is an outcome from which no player can do better by deviating unilaterally. Consider the outcome (100, 100) in TD (the first number is Lucy's choice, and the second is Pete's). If Lucy alters her selection to 99, the outcome will be (99, 100), and she will earn $101. Because Lucy is better off by this change, the outcome (100, 100) is not a Nash equilibrium." Basu says that there are other kinds of equilibria that game theorists use, and they all predict $2.

In regards to your third paragraph, I'm not sure what you mean by "faith." I'm assuming you mean some kind of absurd belief system, like worshiping the porcupine god of my dreams, who has 97 needles, making 97 the most Holy number. Picking 97 and having success would be seen as a justification for the porcupine god, and you find that objectionable (correct me if I've misunderstood).
Yes, I agree that this would be unjustified, but I'm not suggesting that sort of scenario. In the Traveler's Dilemma, having "faith" that the other person is going to pick something in the 90s, and not $2, is justified. It is based on an intuition, a hunch, about what other people are like. The reality of a personal god, and the assurance that comes from such a belief, is an intuition that many people feel.

Here's another example that I think it is appropriate to bring up. I would never say that a person who believes in human free will, and acts on the basis of that belief, is being irrational. But I think the most formal rational arguments on the subject indicate that there is no such thing as free will. Therefore, it is strictly irrational to believe in free will, and yet all people (even the most thorough determinists) behave as if free will exists. This is rational behavior based on irrational conclusions. You may not find this a good analogy either. For me, this confirms that there are actually Traveler's Dilemma situations out there, and they may be more common than we think.

Yes, I'm going to join Soul & Meat. I still have your original e-mail. I've just been procrastinating.

-- Sub-sub-librarian

Comment by thecrazydreamer (2007-06-15 21:03):

I have an objection to the assertion that the most rational choice would be $2. Perhaps that is true if you were on some sort of game show and the objective was to end up with more money than your opponent, but in this real-life scenario, I think rationality does not dictate that you select $2. If we're willing to assume an equal chance of your opponent selecting any amount between $2 and $100, then a selection of $2 will give you a 1/99 chance at $2 and a 98/99 chance at $4.
If you select $100, then there is a 5/99 chance that your earning will be $4 or less, and a 94/99 chance that it will be more than $4. In fact, the most rational choice, statistically speaking, would be $97 or $96. Both give you a mean earning of $49 and 8/99ths of a dollar (yes, I'm geeky enough to make a spreadsheet to determine the exact value one ought to choose).

Perhaps I am missing the logic behind selecting $2, or have misunderstood the question. I feel like you were attempting to address this in your fourth paragraph, but either I don't find your or Basu's explanation satisfactory, or I am just not understanding it.

Even if I were to concede, though, a single instance of irrationality having a better outcome than rationality doesn't seem a compelling defense of faith. A person enjoying this game may have a faith that dictates they always choose the number 97 every time they are asked to pick a number between 2 and 100. While choosing 97 will end up netting them the most money over an extremely large sample size, and also provides the highest probability of a pleasant outcome in the one instance we have, that is certainly no justification for their faith.

I tend to agree with your conclusion that a life without emotion or irrationality is an incomplete life, but I'm not satisfied with the Traveler's Dilemma as support for that conclusion.

Also, I hope this does not come off as a hostile response. I enjoyed the post, and I am just trying to continue the conversation. I readily concede that I am probably misunderstanding the game or the argument, because it seems unlikely that an economist who's getting his work published on the cover of Scientific American was incapable of making a spreadsheet to calculate the probabilities himself. Also, were you interested in being added as an author to Soul & Meat? If so, let me know the email address of your blogger account so I can send you an invite.
You can post it here or on my blog in a comment, or send it to my username @gmail.com. The blog is at soulandmeat.blogspot.com if you would like to join. We used your suggestion of "What say ye, pagans?" as the tagline.

-- thecrazydreamer
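Editorial addendum: the two quantitative claims made in this thread can be checked with a short script. This is a sketch assuming the standard Traveler's Dilemma payoff rule implied by the Basu quote above (the lower bidder earns their bid plus $2, the higher bidder earns the lower bid minus $2, and a tie pays each traveler the common bid); the function and variable names are the sketch's own, not anything from the original discussion.

```python
# Verify two claims from the comment thread, assuming the standard
# Traveler's Dilemma payoff rule: lower bidder gets bid + 2, higher
# bidder gets (lower bid) - 2, ties pay the common bid.
from fractions import Fraction

BIDS = range(2, 101)  # legal bids: $2 through $100

def payoff(mine, theirs):
    """My earnings when I bid `mine` and the other traveler bids `theirs`."""
    if mine < theirs:
        return mine + 2
    if mine > theirs:
        return theirs - 2
    return mine

# Claim 1 (Basu, via Sub-sub-librarian): equilibrium analysis predicts $2.
# A pair (a, b) is a Nash equilibrium if neither player can do better by
# deviating unilaterally. Brute-force search over all 99 x 99 pairs.
def is_nash(a, b):
    return (all(payoff(x, b) <= payoff(a, b) for x in BIDS) and
            all(payoff(y, a) <= payoff(b, a) for y in BIDS))

equilibria = [(a, b) for a in BIDS for b in BIDS if is_nash(a, b)]
print(equilibria)  # [(2, 2)] -- the only Nash equilibrium

# Claim 2 (thecrazydreamer's spreadsheet): against a uniformly random
# opponent, bids of $96 and $97 maximize the expected payoff, each worth
# exactly $49 + 8/99.
expected = {b: Fraction(sum(payoff(b, x) for x in BIDS), 99) for b in BIDS}
best = max(expected.values())
print(best, sorted(b for b, v in expected.items() if v == best))
# 4859/99 [96, 97]   (4859/99 dollars = $49 + 8/99 of a dollar)
```

Both claims check out: exhaustive search finds (2, 2) as the unique Nash equilibrium, and exact rational arithmetic confirms the $49-and-8/99 figure at bids of $96 and $97.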