I'm probably slow to this, but I recently read the Wikipedia article on the Monty Hall Problem (http://en.wikipedia.org/wiki/Monty_Hall_Problem) and found it fascinating. The basic setup is this: you're given a choice of 3 concealed options, 1 of which is the one you want and 2 of which are ones you don't. You select (but don't reveal) one of them. A knowledgeable observer then eliminates one of the remaining options (always an undesirable one) and offers to let you switch your choice to the other concealed option. By switching, you raise your chance of getting the option you want from 1/3 to 2/3 - 33 percentage points better. Please read the wiki link for a better understanding, as it explains it better than I can.
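If you'd rather convince yourself empirically than follow the argument, a quick simulation sketch works (the function name `play` and the trial count are my own choices, not from the article):

```python
import random

def play(switch, trials=100_000):
    """Simulate the 3-option Monty Hall game; return the empirical win rate."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)    # the concealed option you want
        choice = random.randrange(3)   # your initial pick
        # The observer reveals an option that is neither your pick nor the prize.
        revealed = random.choice([c for c in range(3) if c not in (choice, prize)])
        # Exactly one option is neither picked nor revealed.
        remaining = [c for c in range(3) if c not in (choice, revealed)][0]
        if switch:
            choice = remaining
        wins += (choice == prize)
    return wins / trials

print(f"stay:   {play(switch=False):.3f}")   # ~0.333
print(f"switch: {play(switch=True):.3f}")    # ~0.667
```

Run it a few times: staying hovers around one-third, switching around two-thirds, exactly as the article claims.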
Part of what intrigues me is how counter-intuitive it is. I've been wondering how to incorporate this concept into game design - partly as a skill test (knowledgeable players will gain an advantage if they understand how the system works), and partly as a way to tweak the scenario to give desired results.
For a couple of examples of what I mean - I'll describe some other similar scenarios. You'll need to understand how the above works to get what I'm talking about...
Gambling Game Scenario
You pay $50. There are 3 cards: 2 marked $0 and 1 marked $100. The Monty Hall scenario applies: you pick a card, the dealer picks one of the remaining cards and flips it over, revealing a $0. He then gives you the option to pay $15 to switch.
Optimal play (switching) leaves the player ahead by about $1 per game in the long run. Non-optimal play (staying) leaves the player down about $17 per game.
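Working the expected values out exactly (assuming the $50 entry, $15 switch fee, and $100 prize from the setup above):

```python
# Expected value per game for each strategy, under the stated stakes.
ENTRY, FEE, PRIZE = 50, 15, 100

ev_stay   = (1/3) * PRIZE - ENTRY          # win 1/3 of the time, pay entry only
ev_switch = (2/3) * PRIZE - ENTRY - FEE    # win 2/3 of the time, pay entry + fee

print(f"stay:   {ev_stay:+.2f}")    # -16.67
print(f"switch: {ev_switch:+.2f}")  # +1.67
```

So the unrounded figures are a loss of $16.67 per game for staying and a gain of $1.67 per game for switching.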
4-door scenario - MH opens 1
This time there are 4 concealed objects (for board game design purposes, I see these working best as cards). You choose one, then the opponent (dealer, GM, moderator, etc.) looks at the remaining 3 cards and flips 1 over (a negative one). Choosing to switch only improves your chance of getting what you want by 12.5 percentage points (from 25% to 37.5%).
4-door scenario - MH opens 2
Same as above, but the opponent flips 2 negative cards over. Switching improves your chances to 75% (from 25%).
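Both 4-card variants can be checked with one parameterized simulation (the helper name `switch_win_rate` is my own):

```python
import random

def switch_win_rate(doors, opened, trials=100_000):
    """Empirical win rate for an always-switch player, with `doors` cards
    and the opponent revealing `opened` negative cards."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(doors)
        choice = random.randrange(doors)
        # Negative cards the opponent may reveal: not chosen, not the prize.
        goats = [d for d in range(doors) if d != choice and d != prize]
        revealed = set(random.sample(goats, opened))
        # Switch uniformly among the still-concealed, unchosen cards.
        new_choice = random.choice(
            [d for d in range(doors) if d != choice and d not in revealed])
        wins += (new_choice == prize)
    return wins / trials

print(switch_win_rate(4, 1))  # ~0.375
print(switch_win_rate(4, 2))  # ~0.75
```

The general pattern: switching wins with probability (doors − 1)/doors × 1/(doors − 1 − opened), which gives 3/8 and 3/4 for the two scenarios above.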
2 Colors of Doors
This one was proposed by a friend of mine.
2 red-backed cards, 2 blue-backed cards - still only 1 card that's desirable.
You choose a card, the opponent looks at the cards of the OPPOSITE color, and flips 1 undesirable one over - and offers a chance to switch.
Stay - 25% chance at the good card; switch to the remaining opposite-color card - 50% chance at the good card. And I THINK (please confirm or deny my suspicions on this) that if you switch to the other card of the same color you picked, you'd have a 33% chance at the good card (this seems weird though...)
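The suspicion about the same-color switch can be tested empirically. A minimal sketch (the card indices and the simplification of always picking red card 0, which is valid by symmetry, are my own assumptions):

```python
import random

def two_color_rates(trials=200_000):
    """Simulate the 2-red/2-blue variant. Cards 0,1 are red; 2,3 are blue.
    The player picks red card 0; the opponent flips one undesirable blue card."""
    tallies = {"chosen red": 0, "other red": 0, "remaining blue": 0}
    for _ in range(trials):
        prize = random.randrange(4)
        # Opponent flips a blue card (2 or 3) that is not the prize.
        flipped = random.choice([c for c in (2, 3) if c != prize])
        remaining_blue = 5 - flipped  # the blue card left face down (2 + 3 = 5)
        if prize == 0:
            tallies["chosen red"] += 1
        elif prize == 1:
            tallies["other red"] += 1
        elif prize == remaining_blue:
            tallies["remaining blue"] += 1
    return {k: v / trials for k, v in tallies.items()}

print(two_color_rates())
# → roughly {'chosen red': 0.25, 'other red': 0.25, 'remaining blue': 0.5}
```

Running this suggests the other same-color card stays at 25% rather than 33%: the opponent's reveal only tells you about the blue cards, so the information concentrates entirely on the remaining blue card.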
Anyway - anyone have any interesting ways to incorporate these into game design? I'll post more as I come up with them - or further implementations.
Two things:
1. The numbers of your first example are slightly misleading, because you've rounded them up and down (as I'm sure you're aware). The expected return per game of switching is really $1.67, and the expected loss of not doing so is $16.67. Essentially the player either pays $50 for a one-third chance of winning $100 (an obvious loss, and a choice no player would make) or pays $65 for a two-thirds chance of winning $100 (a small but worthwhile return). But there's no game (I would say), since there's only one possible 'strategy' that doesn't result in an obvious and predictable loss for the player.
Yes, I was rounding. And I agree it's not a "game" outside of the way the term would be used for other games of chance (like Candy Land, or Chutes and Ladders). But it could be interesting in the context of another game, especially if the outcome is less certain than 1 card you want and 2 you don't. For example: 1 card you want, 1 you're indifferent about, and 1 that would be certainly negative for you. Though I think the 3-card scenario is far less interesting than the 4-card scenario for game design purposes, because while the player is better off switching, they have 2 choices to switch to, making it still a gamble.
Well that's what I'm wondering...
Look at it this way: there's a 50% chance the prize is a red-backed card, and a 50% chance it's a blue-backed card. I choose a red card. There's a 25% chance the prize is the card I chose. My opponent looks at the blue cards, and then flips over a "goat". That means the unflipped blue card has a 50% chance to be the "car", doesn't it?
Now, the unchosen red card is where (to me) it gets confusing. The card I picked doesn't have improved chances of being the "car", but the card I didn't pick previously now has a 1 in 3 chance of being the car..
I know it's counter-intuitive... especially since the percentages don't all add up to 100%. But the other red card just can't really be a 25% chance, can it? I'm also wondering if the new chances would change to be 50% - blue card, 37.5% - unchosen red card, 12.5% - chosen red card. But it also doesn't make sense that the originally chosen card would have a lower than 25% chance.
If I'm understanding your response correctly, you feel the 2 red cards each have a 25% chance, and the unchosen blue card has a 50% chance? Is that correct?