Tag Archives: Probabilities

#195: Probabilities in Dishwashing

This is mark Joseph “young” blog entry #195, on the subject of Probabilities in Dishwashing.

I was going to call this What Are the Odds?, but that title is too useful to spend here.  Actually, almost every time my bill rings up to an exact dollar amount, ending in “.00”, I ask the cashier that very question, and usually they have no idea, so I tell them.  But I’m a game master–I’ve been running Multiverser™ for over twenty years, and Dungeons & Dragons™ for nearly as long before that.  I have to know these things.  After all, whenever a player says to me, “What do I have to roll?”, he really means “What are the odds that this will work?”  Then, usually very quickly by the seat of my pants, I have to estimate what chance there is that something will happen the way the player wants it.  So I find myself wondering about the odds frequently–and in an appendix in the back of the Multiverser rule book, there were a number of tools provided to help figure out the odds in a lot of situations.

And so when I saw an improbable circumstance, I immediately wondered what the odds were, and then I wondered how I would calculate them, and then I had the answer.  It has something in common with the way I cracked the probabilities of dice pools decades back (that’s in the book), but has more to do with card probabilities, as we examined in web log post #1:  Probabilities and Solitaire, than with dice.

So here’s the puzzle.

At some point I bought a set of four drinking cups in four distinct colors.  I think technically the colors were orange, green, cyan, and magenta, although we call the cyan one blue and the magenta one red, and for our purposes all that matters is that there are four colors, A, B, C, and D.  We liked them enough, and they were cheap enough, that on my next trip to that store I bought another identical set.  That means that there are two tumblers of each color.

I was washing dishes, and I realized that among those dishes were exactly four of these cups, one of each of the four colors.  I wondered immediately what the odds were, and rapidly determined how to calculate them.  I did not finish the calculation while I was washing dishes, for reasons that will become apparent, but thought I’d share the process here, to help other game masters estimate odds.  This is a problem in the probabilities of non-occurrence, that is, what are the odds of not drawing a pair.

The color of the first cup does not matter, because when you have none and you draw one, it is guaranteed not to match any previously drawn cup, because there aren’t any.  Thus there is a one hundred percent chance that the first cup will be one that you need and not one that you don’t want.  Whatever color it is, it is our color A.

In drawing the second cup, what you know is that there are now seven cups that you do not have, one of which will be a match.  That means there is one chance in seven of a match, six chances in seven of not matching.  This is where I stopped the math, because I hate sevenths.  I know that they create a six-digit repeating decimal that shifts its position–1/7th is 0.142857 repeating, and 2/7ths is 0.285714 repeating, and in each case the digits are in the same sequence, but I can never remember that sequence (I don’t use it frequently enough to matter, and I can look it up on the table in the back of the Multiverser book as I just did here, or plug it into a calculator to get it).  So the probability of the second cup matching the first–of drawing the other A–is 14.285714…%, and the probability of not drawing a match is 85.714285…%.

So with a roughly 86% chance we have two cups that do not match, colors A and B, and we are drawing the third from a pool of six cups, of which there are one A, one B, two Cs and two Ds.  That means there are two chances that our draw will match one of the two cups we already have, against four chances that we will get a new color.  There is thus a 33.33…% chance of a match, a 66.66…% chance that we will not get a match.

We thus have a roughly 67% chance of drawing color C, but that assumes that we have already drawn colors A and B.  We had a 100% chance of drawing color A, and an 86% chance of drawing color B.  That means our current probability of having three differently-colored cups is 67% of 86% of 100%, a simple multiplication problem which yields about 58%.  Odds slightly favor getting three different colors.

As we go for the fourth, though, our chances drop significantly.  There are now three colors to match, and five cups in the deck, three of which match–three chances in five, or 60%, to match, which means two in five, or 40%, to get the fourth color.  That’s 40% of 67% of 86% of 100%, and that comes to, roughly, a 23% chance (the exact figure is 8/35, about 22.86%).  That’s closer to 3/13ths (according to my chart), but close enough to one chance in four, 25%.
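For anyone who wants to check the chain of multiplications exactly rather than estimate it at the sink, here is a minimal sketch in Python (my own illustration, not anything from the Multiverser appendix); it carries the per-draw odds as exact fractions and confirms the result against a simple counting argument.

    from fractions import Fraction
    from math import comb

    # Eight cups, two of each of four colors; multiply the per-draw chances
    # of not matching anything already drawn, exactly as worked out above.
    p_all_different = Fraction(1) * Fraction(6, 7) * Fraction(4, 6) * Fraction(2, 5)
    print(p_all_different, float(p_all_different))   # 8/35, about 0.2286

    # Cross-check by counting: 2**4 ways to take one cup of each color,
    # out of comb(8, 4) = 70 possible sets of four cups.
    assert p_all_different == Fraction(2**4, comb(8, 4))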

A quicker way to do it in game, though, would be to assign each of the eight cups a number, and roll four eight-sided dice to see which four of the cups were drawn.  You don’t have to know the probabilities to do it that way, but if you had any matching rolls you would have to re-roll them (one of any pair), because it would not be possible to select the same cup twice.  In that sense, it would be easier to do it with eight cards, assigning each to a cup.
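If you prefer the dice-or-cards shortcut, the same answer falls out of a quick simulation.  The sketch below (again Python, purely illustrative) labels the eight cups 0 through 7, gives cup n the color n modulo 4, and deals four at a time–which is exactly what the re-rolled eight-sided dice or the eight dealt cards accomplish.

    import random

    trials = 100_000
    hits = 0
    for _ in range(trials):
        drawn = random.sample(range(8), 4)          # four distinct cups, like dealt cards
        if len({cup % 4 for cup in drawn}) == 4:    # all four colors represented?
            hits += 1
    print(hits / trials)                            # hovers around 0.229, i.e. 8/35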

I should note that this math fails to address the more difficult questions–first, what are the odds that exactly four of the eight cups would be waiting to be washed, as opposed to three or five or some other number; second, how likely is it that someone has absconded with one of the cups of a particular color because he likes that color and is keeping it in his car or his room or elsewhere.  However, the first question is an assumption made in posing the problem, and the second is presumably equally likely to apply to any one of the four colors (even if I can’t imagine someone taking a liking to the orange one, someone in the house does like orange).  Even so, it should give you a bit better understanding of how to figure out the odds of something happening.

For what it’s worth, the probability of the cost of the purchase coming to an even dollar amount, assuming random values and numbers of items purchased, is one chance in one hundred.  That, of course, assumes that the sales tax scheme in the jurisdiction doesn’t skew the odds.


#45: The Math of Charging Your Phone

This is mark Joseph “young” blog entry #45, on the subject of The Math of Charging Your Phone.

I once had a charger for my phone that I could plug into the cigarette lighter outlet (now called “power outlet”) of a car.  I used it sometimes when I would leave the house and then discover my phone was dying, or when I was headed to a convention and knew I was not going to be in the hotel room long enough or frequently enough to support the battery, or when the wires on the one in the house came loose and I couldn’t justify buying another house charger right away.  I don’t use it now because the lighter outlet in the one car we still have on the road broke.  However, I’m given to understand that newer cars are more and more coming equipped with USB ports for the specific purpose of charging cell phones or powering similar equipment, and people are doing this far more.

So of course now someone has come along and said that we shouldn’t do that because it’s contributing to an environmental disaster.


He’s not a nutcase.  He’s an automotive electronics engineer, retired.  In general, he makes a good point; but in making it, he does a few things that create a misleading result.  We will get to the good point eventually here.

He begins his calculation by estimating that a smartphone requires 4.8W (four and eight-tenths watts) to recharge.  That’s fascinating, because Universal Serial Bus (USB) ports don’t deliver that much.  All USB ports deliver five volts (5V).  The common USB 1.0 and USB 2.0 ports are limited to a maximum of five hundred milliamps (500mA), or half an amp.  That means maximum output is two and a half watts (2.5W).  The newer USB 3.0 ports can deliver nine hundred milliamps (900mA), nine tenths of an amp, which comes to a maximum output of four and a half watts (4.5W).  You can’t get as much as he says the phone draws from a USB port.  We might presume that the ports in a car, not being directly tied to a computer, might have higher current capabilities, but the way USB works, the connected device controls the current flow (amperage) and thus the total power (wattage), and the smartphone is not going to assume the port can provide more than specifications dictate.
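Those port limits are just volts times amps; a two-line check, for illustration only:

    # Maximum power at the spec limits quoted above: watts = volts * amps.
    print(5.0 * 0.5)   # USB 1.0/2.0: 2.5 W
    print(5.0 * 0.9)   # USB 3.0: 4.5 W -- still short of the claimed 4.8 W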

Our author gets his number not from what USB ports provide but from the amount required to charge the phone from completely dead to fully charged.  It might take a long time to do that on a trickle charge, but he’s right to the degree that if you are completely charging your phone from nothing, you’re going to use that much power to do it.  But then he assumes that you get that 4.8W in one hour–in effect, 4.8 watt-hours of energy–from which he calculates that this will cost you 0.03 (zero-point-zero-three, or three one-hundredths) of a mile per gallon.  That’s not negligible–it’s about half a football field per gallon, a bit more than the distance around the high school track on a ten gallon tank–but if you’re getting thirty miles per gallon, it’s point one percent (0.1%), one part in one thousand.  He then multiplies that by the three trillion road miles traveled by all United States drivers in a year, assuming an average speed of thirty miles per hour, and comes up with one hundred million gallons of gasoline spent to charge phones.  That’s two hundred million dollars.  It also produces as much greenhouse gas as burning nine hundred forty-five million pounds of coal.
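Working backward from the quoted figures, the arithmetic hangs together roughly as follows.  This is a sketch of the cited article’s assumptions, not measurements of mine; the two-dollars-a-gallon price is simply what the quoted totals imply.

    annual_miles = 3e12       # U.S. road miles per year, per the article
    baseline_mpg = 30.0       # assumed fuel economy
    mpg_penalty  = 0.03       # mpg lost while charging, per the article

    total_gallons  = annual_miles / baseline_mpg    # about 1e11 gallons burned
    charge_share   = mpg_penalty / baseline_mpg     # 0.1% of the fuel
    charge_gallons = total_gallons * charge_share   # about 1e8 gallons
    dollars        = charge_gallons * 2.00          # implied ~$2 per gallon
    print(f"{charge_gallons:.2e} gallons, about ${dollars:,.0f}")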

The article does make one excellent point:  It will cost you thirty times as much to charge your phone on your car’s engine as it will to charge it on your house current.  That’s because electric companies don’t use gasoline engines to generate electricity, but go for the least expensive options at all times, and automobiles are designed to be efficient transportation, not efficient electrical generators.  It will cost you about two cents an hour to charge your phone in your car, about six one-hundredths of a cent for the same hour of charging at home.  It will cost you, personally, even less to charge it in your hotel room if you’re on the road.  Car chargers should be the backup option, not the primary choice.
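The thirty-to-one figure is easy to reproduce under an assumed residential rate of about twelve cents per kilowatt-hour (my assumption; the two-cents-an-hour figure for the car is taken from the article as given):

    wh_per_hour = 4.8                                   # the article's charging figure
    home_rate   = 0.12                                  # assumed dollars per kWh at the wall
    home_cents  = wh_per_hour / 1000 * home_rate * 100  # about 0.058 cents per hour
    car_cents   = 2.0                                   # cents per hour, per the article
    print(f"home {home_cents:.3f}c/hr, car {car_cents}c/hr, "
          f"ratio ~{car_cents / home_cents:.0f}x")      # on the order of thirty to one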

On the other hand, he’s using the phantom of big numbers to frighten us.  It is reminiscent of the famous “National Geographics Disaster” covered thoroughly (and facetiously) in The Journal of Irreproducible Results, in which scientists jokingly calculated the long-term consequences of the fact that the relatively heavy National Geographic Magazine is never scrapped but rather stored in growing piles in basements and garages, such that in millions of years the accumulated weight would cause continents to buckle and sink.  Because we’re multiplying that tiny two cents an hour by three trillion miles of driving, of course we get a huge number.  If all of that charging was shifted to wall outlets, the cost would still be over six million dollars–a lot less than two hundred million, but still one of those huge frightening numbers.  The amount of power we’re talking about for one phone is still a very small amount.  Your car stereo probably draws several times that.  If you don’t have the new light-emitting diode (LED) or similar high-technology low-power headlights, they almost certainly do.  Besides, even were you to leave your phone connected to the charger for every minute that you drive, one of the functions of USB charging systems is that when the device is fully charged it stops drawing power.  So if in that first hour of driving your phone is fully charged, it doesn’t charge more until you’ve used it.  It’s absolutely foolish to imagine that we are, or ever will be, charging our cell phones every mile that we drive.  We charge them until they tell us they’re charged, then we put them away until we notice that they’re getting low again.  The scary numbers are inflated by this critically unreal assumption.

So do the reasonable thing and charge your phone from less expensive, more environmentally sound wall current instead of the power system of a gasoline engine, but don’t obsess over the number of charging ports in new vehicles.  Driving is already an expensive, environmentally unsound convenience.  Using the charging ports in the car is another, a far smaller one in the grand scheme of things, and one which can more easily be replaced by something better.  Do so when you can.


#40: Multiverser Cover Value

This is mark Joseph “young” blog entry #40, on the subject of Multiverser Cover Value.

In a thread on Facebook on a completely different issue (an article I encountered on an effective non-lethal weapon) posters made some comments about the complexity of the Multiverser game system.  I don’t happen to think it that complex, really (to create an Original Advanced Dungeons & Dragons™ character without limiting in advance what the player might want to be, the referee needs to have access to twelve of the thirteen hard-cover volumes), but they did tackle two of the more complicated areas:  the spell system and the way to calculate cover value for armor.  I promised to provide answers, and since I no longer have the Gaming Outpost forum for such things, the answers are going to land here.  We previously addressed the issue of Multiverser magic; this entry will deal with the cover value problem.

Combat image from Multiverser: The Game: Referee's Rules, by Jim Denaxas, (c)E. R. Jones & M. Joseph Young

This part of it was raised by one of the most experienced Multiverser referees out there, my own son Kyler:

While you’re talking about complicated math in multiverser, I’m surprised no one has brought up Cover. That was one of the first things I changed when I was trying to streamline the system.

The math for Cover can get ridiculously complicated when you’re wearing layers of armor. “Add this, divide that. Take into account material density.” I abandoned it in favor of a system that focused more on where you were hit and ascribed a damage value to each piece of armor.

I’m not saying that the Multiverser system’s way of dealing with it is bad. I’m just saying that it’s needlessly complex, basically no matter what we’re trying to do.

Ouch.

Well, in my defense, the rule book does say that calculating cover is a complicated bit of math–but at the same time, that you don’t have to do it generally, as once for any piece of armor is sufficient.  Reading some of the other comments on the thread, I’ll note that if for Multiverser purposes you’ve calculated the “cover value” of five different pieces of armor, and you wear them all, your cover value is simply the sum of all the pieces you’re wearing, even if they cover the same body parts.  So the math is only difficult when a particular piece of armor is created or acquired, and after that the only question is whether you’re wearing the same pieces or left something off.

So, what is the complication?

How well armor protects is based on two factors, one of which is itself based on two factors.  The first factor is how much of the body the armor covers.  It is something of a joke that people wear bulletproof vests but are easily killed by a shot to the head.  That’s why combat and riot gear includes helmets.  The system would be complicated indeed if we required the referee to work out how much protection was afforded to each part of the body, but we allow a sort of fiction here–if you’re wearing a bulletproof vest, you are that much harder to hit, and the “cover value” takes into account that blows against your torso are less likely to penetrate, even though your head is still vulnerable.  In theory, someone can aim for an unprotected head, but they’d take a size penalty on the shot.

The second factor is how difficult it is to penetrate.  We know from history that iron armor protects better than bronze armor, because iron weapons are more likely to penetrate bronze armor than iron armor.  It thus follows that a suit of white dwarf alloy (if such a thing could be obtained and worn) would protect better than a suit of aluminum.  We cover this factor with a density number–nothing too scientific, just the application of a game concept of “density” extended to cover materials that have not yet been created.  We also allow for thickness, when it comes to armor–if you make your armor twice as thick, it’s more difficult to penetrate–but that particular factor is usually ignored because thicker armor of that sort is overly restrictive:  armor that is twice as heavy is only twenty-five percent more protective.

So the system really comes down to these two factors:

  1. How much of your body is covered by the armor?
  2. How hard is it to penetrate the material covering it?

It’s not usually difficult.  For example, let’s suppose someone gets a full suit of jointed full plate armor.  The book suggests that such a suit covers ninety-five percent of the body–there are some slots for vision and air in the front of the visor, and a few small gaps where the metal comes together, most of which open and close as the body moves.  It would be made of a relatively hard metal, but that could be a softer one like bronze or a harder one like steel.  Thus there’s a range of densities for hard metals, from 2@6 to 4@8.  From there it’s simple to convert the values to “decimalized” numbers and multiply.  If we’re looking at 95% coverage at 2@6 density, that comes to 26 × 0.95 = 24.7, which we round to 25, a 25 percentage point penalty on incoming attacks.  If we have heavier, denser metal, say a 4@8, that’s 48 × 0.95 = 45.6, again rounded to 46.
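As a sketch of that arithmetic in Python (the function name and the parsing of the density notation are my own choices; the decimalization rule–reading 2@6 as 26–is just as illustrated above, not text from the rule book):

    def cover_value(coverage_pct, density):
        """Coverage percentage times decimalized density, rounded.

        density uses the 'd@l' notation from the post; '2@6' decimalizes to 26."""
        decimalized = int(density.replace("@", ""))
        return round(decimalized * coverage_pct / 100)

    print(cover_value(95, "2@6"))   # 25 -- full plate in a softer hard metal
    print(cover_value(95, "4@8"))   # 46 -- the same suit in a denser metal

    # Multiple worn pieces simply add their cover values, as noted earlier in the post.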

It probably looks complicated for two reasons.  One is because of that table in the book that looks like this (you don’t have to read this table, it’s just here so you can see it):

From Multiverser: The Game: Referee's Rules, (c)E. R. Jones & M. Joseph Young

That makes it look complicated–add this, subtract that, put it all together to get a number–but ultimately, all it’s really saying is, figure out how completely the wearer is covered.  It tries to take into account things that should be considered–chain doesn’t really cover your entire body because it has little holes in it, and we’ve all read stories about the arrow or knife that went through the holes in the chain armor.  Ultimately, though, all the referee really needs to do is decide what percentage of the body is covered–or conversely not covered–to get his basic “percent covered”.  That’s all that that table is for.

The second complication arises, though, when players attempt to “game the system”.  They’ll usually try to make armor thicker to get more protection out of it–and sure, a phone book is harder to penetrate than a manila envelope, so thickness does matter.  Layering, by contrast, adds no complication–that is, if you’re wearing a chain shirt under a solid breastplate and backplate, you get the full value of both.  It’s only complicated if you make the material thicker, such as making the breastplate half an inch thick instead of a (standard) quarter inch.  That requires a bit of math–but the thickness of the armor is not going to change, and wearing multiple layers of armor is simple addition, so you only have to do the complicated bit once.

After all, how many times does someone get a new suit of armor?  A few minutes to work out how effective it is should not be that much of a problem.

The game also has rules for ablative armor–armor which protects by absorbing damage–but these rules in essence say that unless the ablative armor is also stated to provide cover value, it does not provide cover value and so isn’t part of this calculation at all.  There can also be complications if someone is hiding behind a wall and someone else is destroying the wall, but that’s an attack on cover or structures, not at all about armor, so it’s not part of the usual “cover value” issue.

Or did I miss something?


#1: Probabilities and Solitaire

This is mark Joseph “young” blog entry #0001, on the subject of Probabilities and Solitaire.


I expect that this blog is going to tackle a lot of issues–I am already working on another on marriage and another on copyright and another on why I left TheExaminer–and judging from past response I will get a lot more hate mail than thank you notes (although I appreciate both). However, I thought it best to begin with something light and inconsequential, something that has been nagging at me periodically for a long time that more than anything else shows how foolish we “superintelligent” people can be, as we get mentally stuck on little things that bother us.

I had been thinking about part of this article for a number of years, and kept saying it would be a silly waste of time. Then probably a year or more ago I was watching the TV series Scorpion. As one episode opens, math savant Sylvester has been charged with entertaining Paige until the rest of the team returns from somewhere, and he hands her a deck of cards, recommends she play Solitaire for a while, and then–this was the part that bothered me–tells her the odds of winning. The more I think about it, the more I think he can’t know that.

I have played Solitaire since before high school, and I won’t argue as to whether he did the math right; my argument is that there are too many variables, things he cannot know. Never mind that Hoyle has an entire chapter on solitaire card games; even if we assume agreement that it is the standard seven-pile variant, in which the piles hold from one to seven cards each with the top card faced, there are still too many variants. Even in the popular Microsoft® computer version you can switch between advancing the deck three cards at a time (the traditional version) or each card individually, and with the latter your odds of winning rise significantly (because with the three-at-a-time rules there are often cards you cannot put in play that would move the game forward). Too, I was taught that with the three-card variant when you got to the end of the pile you went back to the top, but with the one-card variant you got only one pass through the deck–which significantly lowers your odds. I have also known players who believe that after each pass they are permitted to shuffle the deck before beginning the next pass, or that if as they reach the end they have only one or two cards (not three) in the pile these go on top of the others so that what was the first card becomes the second or third (both rules making it much easier to free cards from the deck). Before you can calculate the odds, you have to know the rules.

So maybe Sylvester was thinking of “standard” rules–three cards at a time, repeated passes, no rollover or shuffle–and maybe on that basis you might calculate the odds. However, there is still the matter of strategy, and some people enforce rules that interfere with strategy.

I know about this because when I played Solitaire for years as a youth (what, you thought I was a popular kid always out with friends?) I played with real playing cards. Whenever I lost, I faced all the cards to see why I lost–I learned that you could be stopped if a card on top of a pile was sitting on the card you had to have to move it, and exactly what that meant, and how sometimes to avoid it. I think it a shame that the computer version does not let you do this, look at cards in the piles when you lose. It was a significant part of my education in game probabilities. In the game, you can make choices, and the way you choose impacts your ability to win.

What you have to understand is that winning Solitaire is achieved by freeing all the trapped cards. As the game begins, twenty-one cards are trapped on the board–six under the right-hand pile, five to its left, down to one in the second pile from the left end. In order to free these cards you must legally move the cards above them. There are also cards trapped in the deck from which cards are drawn. In traditional rules games the hardest of these to reach is the top card, as you must move both the second and the third to reach it; note that you can reach the fourth card in any of several ways, as it can be reached by moving the sixth and fifth, or by removing the third and waiting for the second pass, or by removing the third and second then on the second pass removing what was the original fifth and is now the third. Because of this, cards in the deck are the easier ones to free, and progressively more so the further down the deck they are. (There are initially twenty-four cards in the deck, and on the first pass eight will be accessible if none are removed.)
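To make the counting concrete, here is a small Python sketch (my own labeling; card 1 is the top of the stock, and the deal is the traditional three at a time):

    # Face-down ("trapped") cards in the seven piles: 0 + 1 + ... + 6 = 21.
    trapped = sum(size - 1 for size in range(1, 8))
    print(trapped)                       # 21

    # The 24-card stock, dealt three at a time: the last card of each fan shows.
    stock = list(range(1, 25))
    first_pass_visible = stock[2::3]
    print(first_pass_visible)            # [3, 6, 9, 12, 15, 18, 21, 24] -- eight cards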

So how do you improve your odds of winning?

The first rule is do not make a move simply because you can; make a move because it improves your position. There are people who play that if they can move they must move, but if for example the three of hearts is sitting in the left pile (atop nothing) and the four of spades is on the right pile (atop six cards), there is no advantage to moving the three of hearts to the four of spades, and in fact it can cost you the game. It might be that the only way to move that four of spades is to play it to the spade pile atop the three of spades, and putting the three of hearts on it will prevent that. It might be that the three of diamonds will appear in a position in which it must be moved. Assuming the rule does not say that you must make any move you can make, the only reason to move a card that leaves an open space is that you have a king to place in that space immediately. As long as the three of diamonds does not appear, you can move the three of hearts when it becomes useful; if the three of diamonds appears and must be moved, you will be glad you did not move the three of hearts.

Second, always target moves that release the maximum number of cards. At the beginning of the game, there are six cards blocked by the card on top of the right-hand pile. That is at that moment the most important card to move. Once it has been moved, there are five still blocked–the same as the pile adjacent to it on the left–and so they become the most important cards to move. Throughout the game this changes, and when you have a choice of moves you want to be aware of what move will free the largest number of cards. It is almost always the case that moving the top card from the piles is a better move than moving one from the deck, by this measure. The computer version is your friend in this regard, because at the top of each pile the edges of the cards below appear, permitting you to count how many are still in each pile. Absent that, you probably have to remember.

As to the deck, keep track of how many cards remain in it. If the number is evenly divisible by three, you are going to see the same cards on the next pass. This is the most difficult bit strategically, as unless you have the kind of memory that allows you to keep track of the order of all the cards in the deck (and I do not) you are not going to know what moves are still possible from the remaining cards in the deck. However, on the first pass through the deck you need either to remove a number of cards from the deck, preferably nearer the top, that is not divisible by three, or you are going to have to change the board sufficiently that cards near the top are going to come into play in the next pass. Sometimes you will pass on a possible move because it will worsen your situation rather than improving it. It is better to play a card that will shift the deck on the next pass than to play a card that will restore it to the same sequence.
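The divisible-by-three effect is easy to see in a toy model.  The sketch below simply redeals whatever remains in the stock three at a time on each pass–a simplification of real play, but enough to show why removing a number of cards that is not divisible by three changes what you see next time.

    def visible(stock):
        """Cards exposed in one pass, dealing three at a time; the last,
        possibly short, fan still shows its final card."""
        shown = [stock[i] for i in range(2, len(stock), 3)]
        if len(stock) % 3:
            shown.append(stock[-1])
        return shown

    stock = list(range(1, 25))     # 24 cards: divisible by three
    print(visible(stock))          # [3, 6, 9, ..., 24] -- and the same on the next pass

    stock.remove(6)                # play one visible card: 23 left, not divisible by three
    print(visible(stock))          # [3, 7, 10, 13, ...] -- new cards come up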

As an example, with the situation previously suggested, the three of hearts on the left pile atop nothing and the four of spades to the right atop six cards, you might well turn up the three of diamonds in the deck. At this point you have to decide whether or not to play the three of diamonds on the four of spades, and there are several competing issues in answering that. If the three of diamonds is the first faced card, that is, the third card in the deck, or if you have not yet played a card out of the deck on this pass, there is a strong argument not to play it–it will be in exactly the same place on the next pass, and you can see what other moves are possible before making that decision (e.g., if the king of hearts appears as the next card and you need to move the queen of spades off the fifth pile onto it, then moving the three of hearts to open the space is the better choice). This applies, too, if the three of diamonds is the last card in the deck, because it will be there on the next pass. On the other hand, if moving the three of diamonds out of the deck will give you new cards on the next pass, you want to do that, as it frees up cards in the deck. Note that deferring the decision to the next pass in the first instance has merit, because you might play two more cards from the deck in the next turn or two, and had you played the three of diamonds that would mean you played three cards from the deck and will see mostly or all the same cards on the next pass.

Another factor in the probabilities is that there are more ways to move a low card than a high one. If you are trying to decide whether to open a space for a king by moving the three of hearts or the eight of diamonds, it is probably better to put the three of hearts on the black four, because once the ace and two of hearts are played it will be possible to remove the three. If you put the eight of diamonds on the nine of clubs, it is going to sit there until you get seven other cards played to the ace pile, or you have the unlikely opportunity to move it to the nine of spades (which again is something some players do not allow: splitting a pile to move part of it).

It is also advisable that you not let your ace piles become too disparate. If your diamonds pile gets up around seven or eight and you still don’t have your black aces, it is going to be much harder to find places for all those black cards that have no ace piles and no diamonds on which to be played. This is again a balancing issue: it is more important to get the cards in the piles into play than to worry about the disparity on the ace piles, but that ace pile disparity can prevent you from doing so if it goes wrong. You can (in most games, again some have a rule against this) play cards back from the ace piles to the main piles, but only if there are places for them, and that, too, can be blocked.

One last note: kings are ultimately the easiest cards to move after aces. (It is never a bad move to start an ace pile, unless moving the ace will lock your draw deck.) A queen can be moved to one of three places–the two opposite-color kings and the proper suit ace pile. A king can go to the ace pile and to any of the seven board piles once they are open. That makes moving kings a lower priority than moving any other card, and the other cards should be moved first if you have both moves available, unless it is clear that the king is blocking a significant number of other cards and the other move is not.

Hopefully this is enough to get you thinking about what moves in Solitaire will improve your position and what ones will reduce your chance of winning. Before I drop the subject completely, I will mention a strategy rule I got from a Contract Bridge expert: if you can only be prevented from winning if the cards fall one way, you must play as if that is how they fall; if you can only win if the cards fall one way, you must play as if that is how they fall. Note, then, that understanding the odds of how the cards might fall will help you win more games than solitaire, and will even carry to other games of chance such as dice games.

I hope this nonsense was at least entertaining; and perhaps it was educational as well. It also probably won’t be too controversial, but if anyone has comments you know how to find me.