<p>This weekend, I've started to try to teach myself microeconomics. I'm failing miserably, and to make matters worse, a question at the back of a chapter has stumped me.</p>
<p>Here it is:</p>
<p>A mathematically "fair bet" is one in which a gambler bets, say, $100, for a 10 percent chance to win $1,000. Explain why this is not a fair bet in terms of diminishing marginal utility of dollars. Why is it an even less fair bet when the "house" takes a cut of each dollar bet? So is gambling irrational?</p>
<p>I'm thinking it through like this:</p>
<p>A $100 bet can win $1,000, and, by the same mathematical token, a $99 bet should be able to win the gambler $990 if he or she is successful. However, diminishing marginal utility says that, counting down from $1,000, each successive dollar holds less and less value for the gambler than the one before it. That is, $1,000 will be worth more to the gambler than $990 will, even though he or she is putting up a dollar less. A possible reason might be that $1,000 will buy a flat-screen TV whereas $990 will not be enough.</p>
<p>It is even more unfair when the house takes a cut, because the house's cut will turn a $1,000 payout into a $990 payout even though the gambler still puts down $100 (or some other number of dollars, depending on the house's rake rules).</p>
<p>In light of the above two paragraphs, gambling is indeed irrational, because the gambler will never be able to maximize the utility of his or her dollars.</p>
<p>My book doesn't have an answer key, but I feel that my explanation is completely wrong. So can someone here explain it to me? Does it have to do with the 90% likelihood that the gambler will squander his or her bet?</p>
<p>My guess is that it means that, because of the diminishing marginal utility of dollars, the first 100 dollars (the ones that you are risking) are more valuable than the second 100 dollars (the ones that you could potentially win).</p>
<p>I think it’s a bad question, because marginal utility is much easier to understand, and more relevant, when dealing with consumer items. For example, would you bet two TVs for a 10% chance to win two TVs? No, because what in the heck would you do with four TVs? It’s a totally worthless bet whether you win or lose, due to diminishing marginal utility.</p>
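The point that the risked dollars are worth more than the dollars you might win can be checked numerically. Here is a minimal sketch (my own, not from the book), assuming a starting wealth of $1,000 and u(w) = √w as a stand-in concave utility function for diminishing marginal utility:

```python
import math

# Assumed numbers, not from the textbook: $1,000 starting wealth,
# a $100 stake, and a 10% chance at a $1,000 prize (an expected-value-fair bet:
# 0.1 * 1000 - 100 = 0).
wealth = 1000.0
stake = 100.0
prize = 1000.0

def u(w):
    # A concave utility function: each extra dollar adds less utility.
    return math.sqrt(w)

utility_decline = u(wealth)                                  # don't take the bet
utility_bet = 0.9 * u(wealth - stake) + 0.1 * u(wealth - stake + prize)

print(round(utility_decline, 3))  # 31.623
print(round(utility_bet, 3))      # 31.359 -> lower, so the gambler declines
```

Even though the bet's expected dollar value is exactly zero, its expected utility is lower than just keeping the money, which is the textbook's point about why a "fair" bet is unattractive.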
<p>Dollars are different because you’d need to have a heck of a lot of them to get sick of them or to be able to find no use from them. At the scale that the question uses, the utility doesn’t even diminish really, but that’s the point it’s trying to get at.</p>
<p>Right, but if you win, you get $1,000. So “more valuable” could mean that $1,000 has more total utility than the $100, which is obvious, but that doesn’t use the law of diminishing marginal utility. My explanation was that the dollars from $990 to $1,000 are worth more than those from, say, $400 to $500, even if the gambler only puts up $50 or $40 to win them, because $1,000 can buy a lot more than $500 can, even though both are mathematically fair bets.</p>
<p>I do agree that it is a bad question.</p>
<p>Here’s another one, from the same book. It only touches briefly on game theory, with almost nothing as to which combination of a matrix is most likely to take place–just that there are many that are possible.</p>
<p>C’s prices are on top and D’s are on the left. In each cell, D’s profit is to the left of the slash.</p>
<table>
<tr><td></td><td>C: 40</td><td>C: 35</td></tr>
<tr><td>D: 40</td><td>60/57</td><td>55/59</td></tr>
<tr><td>D: 35</td><td>69/50</td><td>58/55</td></tr>
</table>
<p>I said that C will probably choose the low price, because it is only with a low price that it can ever earn more profit than D. Knowing that C will choose the low price, D will also choose the low price, because charging the high price against C’s low price would send consumers flocking to C. If, instead, I had observed D’s possibilities first, I would have reached the same conclusion: D will choose the low price because it maximizes profit if C chooses the high price, and even if C doesn’t, D still maintains an advantage, whereas D risks losing to C if it charges the high price. Knowing D will charge the low price, C will charge the low price to compete.</p>
<p>However, not every payoff matrix will look like the one above, which let me tackle each firm’s decision possibilities one at a time. How do you determine the most likely pricing outcome in general cases?</p>
<p>^Actually, for AP Micro that is seriously all you need to know. The Game Theory questions are easy as you-know-what. The hardest thing they will throw at you is maybe one firm will not have a dominant pricing strategy and you have to realize that.</p>
<p>It’s not for the exam; it’s about knowing the material. For me, at least.</p>
<p>That is the AP material, and it’s as far as my textbook covered. If you want to know more, check out a game theory book.</p>