1. Expected Value vs Expected Growth (Kelly Criterion, Part I)

A quantitative introduction to the Kelly criterion

Part I -- Expected Value vs Expected Growth

A question I'm often asked is how exactly expected value differs from expected growth. The difference is somewhat subtle but understanding it is essential to risk management in general and the Kelly criterion in particular.

The question frequently arises (as it did here in the next-to-last paragraph) in the context of the idea that betting one's entire bankroll implies -100% bankroll growth. (That's 100% bankroll shrinkage -- a bankroll that shrinks to \$0.) What's more, if you bet your entire bankroll in one go, 100% bankroll shrinkage is implied regardless of both the probability of the bet winning (as long as it wins less than 100% of the time) and the odds paid out on the bet (as long as the odds are less than infinity).

Think about that for a moment, because it's an important point: If when you bet you wager your entire bankroll each time then you expect your bankroll to eventually shrink to zero.

Well, at least it should be important, but the truth is it doesn't really get us any closer to understanding what exactly bankroll growth is and how it differs from expected value. We'll get back to this later.

Let's start with a brief review of expected value. I described it in relative depth here. And I quote:
Originally Posted by Ganchrow
The notion of expectation is central to probability and statistics and may be thought of as an average with an extra syllable. If you were to flip a coin 10 times then you could expect it would land on heads 5 times and you could expect it would land on tails 5 times. In reality of course the coin’s not always going to land on heads exactly 5 times out of 10 (in fact it would only do so about 24.6% of the time), but if you were to repeat the experiment (flipping a coin ten times) many, many times over then on average it would land on heads 5 times each trial.

The same thought process is also applicable to sports. If the Yankees can be expected to win a particular game 60% of the time, then this would mean that if the exact same game were repeated under the exact same conditions across many, many parallel universes, we would expect the Yankees to win 60% of those encounters.

So let’s say you bet \$1 straight up that the Yankees are going to win that game. Now that’s quite obviously a good bet. But just how “good” is it?

That’s where expectations come in with sports betting. If you made the same bet in each of those parallel universes you’d win \$1 60% of the time, and lose \$1 40% of the time. Now let’s say that there are actually 1,000,000 of these universes. Exactly how much money would you make? Well, in 600,000 of those universes you’d make \$1, for a total of \$600,000, and in the remaining 400,000 of those universes you’d lose \$1 in each game, for a total of \$400,000. So you'd receive \$600,000 and would pay out \$400,000, meaning that your total profit would be \$200,000. Winning \$200,000 across 1,000,000 games means on average you would have won \$200,000 / 1,000,000 games = \$0.20 per game.

Now of course 1,000,000 is just a made-up number in this context. There aren’t really 999,999 other universes where we could make such a bet. This bet can only be made once. But that doesn’t actually matter in the world of statistics. Whether you can make this bet only one time or you can make it multiple times, the expectation per game is precisely the same, namely 20%.
So to summarize, the expected value of a bet is the amount we would receive on average if we were to repeat the exact same bet a very large number of times. As such, the expected value of a bet is a metric by which one might judge the relative attractiveness of that bet. If one bet has an expected value of 5% (meaning that for every \$10,000 we bet we would expect to win \$500) and another has an expected value of 10% (meaning that for every \$10,000 we bet we would expect to win \$1,000), then one would naturally prefer the latter bet to the former.
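The parallel-universe arithmetic above can be sketched in a few lines (Python here purely for illustration):

```python
# Enumerate 1,000,000 hypothetical universes for a $1 bet on a team
# that wins 60% of the time, as in the Yankees example above.
universes = 1_000_000
wins = int(0.60 * universes)            # 600,000 universes: we collect $1
losses = universes - wins               # 400,000 universes: we pay out $1
total_profit = wins * 1 - losses * 1    # $600,000 - $400,000 = $200,000
ev_per_game = total_profit / universes  # $0.20 expected value per game
print(ev_per_game)  # 0.2
```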

But there's a bit of a difficulty here -- namely, expected value alone ignores any consideration of the relative likelihoods of the outcomes. For example, a \$10,000 bet on a 0.0000000000000000000000000000000000001% likelihood event paying out at +110,000,000,000,000,000,000,000,000,000,000,000,000,000 odds corresponds to an expected value of 10% (+\$1,000). But who among us would be willing to essentially throw away \$10,000 on such a long shot? To put it in perspective, you'd be about 1,870 times more likely to win the New Jersey State Lottery five times in a row than you would be to win this particular bet. Does it really matter that if by some fluke of nature you actually did win you'd have an unfathomably huge amount of money? If you're like most people, the answer is probably not.

So now here's the difficulty ... there's no way whatsoever to account for this very real phenomenon of preferences by appealing to the theory of expected value alone.

(Enter stage right, expected bankroll growth.)

One major problem with the proposed bet is that for most people, \$10,000 represents a rather large chunk of one’s bankroll to be throwing away on a bet that’s nearly certain to lose. But while a \$10,000 bet is probably too large a quantity to risk on this bet, there’s still a sufficiently small dollar amount that most people would be willing to risk to make this bet. Granted, for most people that dollar amount would be somewhere in the neighborhood of a tiny fraction of a penny, but it nevertheless would still be a positive dollar amount.

The fundamental issue with bets such as these is that, despite being positive EV, placing them is an excellent way to go broke. The apparent contradiction is easily reconciled. If you were to repeat this bet once in each of a gigantically huge number of parallel universes, in nearly all of the universes you’d lose your bet, but in a tiny, tiny, tiny, tiny, tiny fraction of those universes you’d win the bet, and that single win would make up for all the losses plus an additional 10% of the total amount risked.

The fact is that most people just aren’t willing to live through billions of trillions of bets just to have a vanishingly minuscule probability of winning a huge odds bet once. So while the bet may have positive expected value, the expected outcome is for your bankroll to shrink by \$10,000 each time the bet’s made. If your bankroll were \$1,000,000 and you made the bet 100 times, you could expect to be broke after the 100th bet (even though your expected value would be 10% × \$1,000,000 = +\$100,000).

So let’s look at some more practical numbers. Assume you’re considering a bet that wins with 50% probability and pays out at odds of +200. Further assume your total bankroll is \$100,000 and that you want to place 1% of your bankroll on this wager.

Question: Where do you expect your bankroll to be after 2 wagers?

Answer: There are 4 possible outcomes after placing two wagers:
1. Win both bets.
2. Win 1st bet, lose 2nd bet
3. Lose 1st bet, win 2nd bet
4. Lose both bets
Now because winning and losing the bet are both equally likely, all 4 outcomes occur with equal probability, namely 25%. Recall that you’d be betting 1% of your bankroll on each bet and would be paid off at odds of +200. Therefore, your ending bankroll under each of the 4 outcomes would be:
1. B = \$100,000 × (1 + 2×1%) × (1 + 2×1%) = \$104,040
2. B = \$100,000 × (1 + 2×1%) × (1 - 1%) = \$100,980
3. B = \$100,000 × (1 - 1%) × (1 + 2×1%) = \$100,980
4. B = \$100,000 × (1 - 1%) × (1 - 1%) = \$98,010
(The derivation of these equations is simple. Every time you win your bankroll would grow to 102% of its previous value, and every time you lose your bankroll would shrink to 99%.) The expected value from betting in this manner would be 25%×\$104,040 + 25%×\$100,980 + 25%×\$100,980 + 25%×\$98,010 = \$101,002.50. To calculate expected growth, we would first need to recognize that given our 50% win probability, our expected outcome would be to win a bet and to lose a bet (# of wins = 50% × 2 bets, # of losses = 50% × 2 bets). Therefore our expected growth would be that associated with that outcome (with expected growth, the relative ordering of wins/losses is irrelevant), namely \$100,980.

Therefore, the expected value from the two bets is \$1,002.50 or 1.0025%, and the expected growth is \$980 or 0.9800%. Notice that expected value is higher than expected growth -- this is what you’re always going to see. Expected value will always be higher than expected growth (except at win probabilities of 0% or 100%, where they’ll be equal) because a few relatively large, relatively uncommon outcomes disproportionately increase EV. Another way to think about this is by realizing that the worst-case scenario is losing your bankroll one time over, while the best-case scenario is winning your bankroll infinitely many times over -- in other words, while your maximum possible profit is unlimited, your maximum possible loss is limited to your bankroll.
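The four-outcome calculation above is easy to check by brute force; a minimal Python sketch:

```python
from itertools import product

bankroll = 100_000.0
stake = 0.01               # 1% of bankroll per wager
win_mult = 1 + 2 * stake   # +200 odds: a win grows the bankroll to 102%
lose_mult = 1 - stake      # a loss shrinks it to 99%

# Enumerate the 4 equally likely win/lose sequences of two wagers.
outcomes = []
for seq in product([True, False], repeat=2):
    b = bankroll
    for won in seq:
        b *= win_mult if won else lose_mult
    outcomes.append(b)

expected_value = sum(outcomes) / len(outcomes)       # $101,002.50
expected_outcome = bankroll * win_mult * lose_mult   # one win + one loss: $100,980
print(expected_value, expected_outcome)
```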

So in this instance our expected outcome would be a bankroll of:
B* = \$100,000 × (1 + 2×1%)^(2×50%) × (1 - 1%)^(2×50%) = \$100,980,
implying expected bankroll growth of
E(G) = \$100,980/\$100,000 - 1 = 0.9800%

It should be readily apparent our expected outcome after n bets would be a bankroll of:
B* = \$100,000 × (1 + 2×1%)^(n×50%) × (1 - 1%)^(n×50%) = \$100,000 × (100.48881%)^n,
implying expected bankroll growth of
E(G) = (100.48881%)^n - 1.

By extension, our expected outcome after just 1 bet would be:
B* = \$100,000 × (1 + 2×1%)^50% × (1 - 1%)^50% = \$100,488.81

And our expected bankroll growth would be
E(G) = (1 + 2×1%)^50% × (1 - 1%)^50% - 1 = 0.48881%

(This last result bears a little discussion. We can talk about expected growth after only 1 bet in the same manner as we can talk about expected value after just one bet. In the same way as we’d never see a real result equal to our expected value, we’d never actually see growth after one bet equal to expected growth. This should cause absolutely no concern.)
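Numerically, the per-bet growth rate is just the geometric mean of the two equally likely multipliers, and it compounds; a quick check in Python:

```python
# A win takes the bankroll to 102%, a loss to 99%; each occurs with
# probability 50%, so the expected per-bet growth factor is the
# geometric mean (1.02^0.5 × 0.99^0.5).
per_bet = (1.02 ** 0.5) * (0.99 ** 0.5)   # 1.0048881... per bet

print(100_000 * per_bet)       # ~100488.81: expected outcome after 1 bet
print(100_000 * per_bet ** 2)  # ~100980.00: after 2 bets, matching above
```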

So let’s generalize our results with expected outcomes and growth. Given a starting bankroll of B0, decimal odds of O, a win probability of p, and a bet size of X (as a percentage of starting bankroll, B0), the bankroll associated with the expected outcome from placing the bet would be:
B* = B0 * (1 + (O-1) * X)^p * (1 - X)^(1-p)

And expected growth would be:
E(G) = (1 + (O-1) * X)^p * (1 - X)^(1-p) - 1
Expected value, you’ll recall, would be:
EV = p*(O-1)*X - (1-p)*X = (pO - 1)*X

Q: So let’s look at a concrete example: What are the expected value and bankroll growth associated with a bet equal to 1% of bankroll paying out at -110 and winning with probability 54%?

A:
EV = 1% × (54% × 1.909091 - 1) = 0.03091% of bankroll
E(G) = (1 + 0.909091 × 1%)^54% × (1 - 1%)^46% - 1 = 0.02638% bankroll growth.

Q: Now let’s consider the same terms, but in the case of a player placing a bet 25% of bankroll. What would expected value and growth be in this case?

A:
EV = 25% × (54% × 1.909091 - 1) = 0.7727% of bankroll
E(G) = (1 + 0.909091 × 25%)^54% × (1 - 25%)^46% - 1 = -2.1510% bankroll growth = 2.1510% bankroll shrinkage
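The two formulas above translate directly into code; a sketch (Python, purely illustrative, with -110 converted to decimal odds) reproducing both worked examples:

```python
def expected_value(p, O, X):
    """EV as a fraction of bankroll: (pO - 1) * X."""
    return (p * O - 1) * X

def expected_growth(p, O, X):
    """Expected growth: (1 + (O-1)X)^p * (1 - X)^(1-p) - 1."""
    return (1 + (O - 1) * X) ** p * (1 - X) ** (1 - p) - 1

O = 1 + 100 / 110   # -110 American odds as decimal odds (~1.909091)

print(expected_value(0.54, O, 0.01))   # ~ +0.0309% of bankroll
print(expected_growth(0.54, O, 0.01))  # ~ +0.0264% growth
print(expected_value(0.54, O, 0.25))   # ~ +0.7727% of bankroll
print(expected_growth(0.54, O, 0.25))  # ~ -2.1510% growth, despite positive EV
```

Note how the same 54%/-110 edge flips from positive to negative expected growth purely by increasing the stake.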

So think about these results for a moment. We have a positive expectation bet and hence, quite naturally, the more we bet on it the more we expect to make. However, if we were to wager too much on this bet then we’d expect our bankroll to shrink by 2.1510% per wager (were we to place this positive expectation bet 32 times, for example, we’d expect our bankroll to depreciate by roughly half).

So this should help elucidate the huge odds bet above. No matter how positive EV a bet might be, if you bet too much on it then you expect your bankroll to shrink. This is the concept to which people are referring when they talk about "money management". Even if you could pick NFL spreads at 75% (which you can’t), were you to bet too much, you'd expect to head towards bankruptcy.

So as a limiting case let’s look at one more example, the example of betting one’s entire bankroll mentioned at the start of this article: win probability = p, bet size = 100% of bankroll.

EV = 100% × (pO - 1) = pO - 1 (EV > 0 for p > 1/O)
E(G) = (1 + (O - 1))^p × 0^(1-p) - 1 = -100% (for p < 1 and O < ∞)
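This limiting case can be confirmed with the same growth formula (a Python sketch; note the p < 1 assumption):

```python
def expected_growth(p, O, X):
    return (1 + (O - 1) * X) ** p * (1 - X) ** (1 - p) - 1

# Betting the entire bankroll (X = 100%): the loss factor (1 - X) is 0,
# and 0^(1-p) = 0 for any p < 1, so growth is -100% regardless of edge.
print(expected_growth(0.99, 10.0, 1.0))   # -1.0, even at 99% to win at 9/1
print(expected_growth(0.50, 2.0, 1.0))    # -1.0 for a fair coin flip, too
```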

So what does this tell us? Well for one thing it tells us that even if you were the “best handicapper ever”, were you to risk your entire bankroll on every bet, you would expect to go broke. More generally, it illustrates the concept that looking solely at expected value as a metric for the attractiveness of a given bet is not the proper way to maintain long term growth.

In the next part of this article we’ll discuss how one might use the concept of expected growth to determine bet size. This is the essence of the Kelly criterion.

SBR Founder Join Date: 8/28/2005

2. Good stuff, Ganch. The scary part is I think I'm actually starting to understand this stuff.

3. Originally Posted by Willie Bee
The scary part is I think I'm actually starting to understand this stuff
Well, I would certainly hope so.

If anything's unclear feel free to ask.


4. I've been playing around with the Kelly calculator for simultaneous wagers, but I can't seem to figure out how it's calculating expected growth.

Based on what this article presents (along with a basic understanding of the geometric mean) I've been trying to calculate the expected growth of 2 simultaneous wagers.

Assume a quarter Kelly stake on 2 wagers at +100 odds with a 55% chance of winning. The calculator says to bet 2.4450% on each, and the expected growth is 0.4397%.

When I try to calculate the expected growth by hand:

1.048899^0.3025 x 1.00^0.4950 x (1-0.048899)^0.2025 - 1

I obtain 0.4299% (I know the middle piece isn't necessary, I just included it for clarity).

I assume I'm messing up somewhere, so any thoughts would be great!

5. Originally Posted by rjp
I've been playing around with the Kelly calculator for simultaneous wagers, but I can't seem to figure out how it's calculating expected growth.

Based on what this article presents (along with a basic understanding of the geometric mean) I've been trying to calculate the expected growth of 2 simultaneous wagers.

Assume a quarter Kelly stake on 2 wagers at +100 odds with a 55% chance of winning. The calculator says to bet 2.4450% on each, and the expected growth is 0.4397%.

When I try to calculate the expected growth by hand:

1.048899^0.3025 x 1.00^0.4950 x (1-0.048899)^0.2025 - 1

I obtain 0.4299% (I know the middle piece isn't necessary, I just included it for clarity).

I assume I'm messing up somewhere, so any thoughts would be great!
You've neglected the 0.0629% stake on the 2-team parlay at +300. This makes the complete EV equation:

(1.048899 + 0.000629*3)^0.3025 x (1.00 - 0.000629) ^0.4950 x (1-0.048899-0.000629)^0.2025 - 1
0.4397%
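The corrected figure is straightforward to reproduce by hand; a sketch using the stakes the calculator reports:

```python
single = 0.024450   # stake on each +100 single (quarter Kelly, per the calculator)
parlay = 0.000629   # stake on the 2-team parlay at +300

p = 0.55                                  # win probability of each leg
both_win  = 1 + 2 * single + 3 * parlay   # both singles win; parlay pays 3:1
split     = 1 - parlay                    # one win, one loss: singles cancel
both_lose = 1 - 2 * single - parlay

growth = (both_win ** (p * p)                 # 0.3025
          * split ** (2 * p * (1 - p))        # 0.4950
          * both_lose ** ((1 - p) * (1 - p))  # 0.2025
          ) - 1
print(f"{growth:.4%}")   # ~ 0.4397%
```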


6. OK, I think I was misled by how the calculator is set up.

What you're saying is the optimal solution is listed under the 'Stakes for parlay sizes of: All' tab? Thus the appropriate bets would be:

1 2.4450%
2 2.4450%
1+2 0.0629%

Thanks!

7. Originally Posted by rjp
OK, I think I was misled by how the calculator is set up.

What you're saying is the optimal solution is listed under the 'Stakes for parlay sizes of: All' tab? Thus the appropriate bets would be:

1 2.4450%
2 2.4450%
1+2 0.0629%

Thanks!
You got it. Let me know if you think the documentation is unclear and I'll change it.

The "Stakes for parlays of size n" tab, shows all stakes for that parlay size (except for mutually exclusive outcomes, for which all parlays of size > 1 will obviously always be zero).

These values may be edited (in every tab except for the "All" tab). Clicking "Calculate Expectations" will then recalculate the values on the bottom third of the tool using the new stake sizes.


8. Makes much more sense reading it a second time. I guess if it defaulted to 'All' then I might have understood it better, as for me that's what I'm really interested in.

Now that I understand how it works, would the optimal stakes change if I didn't want to make any parlay bets? If so, is it possible to use the calculator to compute these stakes?

9. Originally Posted by rjp
Would the optimal stakes change if I didn't want to make any parlay bets? If so, is it possible to use the calculator to compute these stakes?
1. Yes
2. No

In general, you'll probably be ok (meaning you won't be giving up very much expectation) just placing the recommended single bets and parlay of size 2. You can check the impact that this will have on expected growth by editing the "Stakes" textbox and clicking the "Calculate Expectations" button.


10. Thanks, I appreciate your help. I'm working on a perl script that calculates these for me, and your work has been a great help.

11. Originally Posted by rjp
Thanks, I appreciate your help. I'm working on a perl script that calculates these for me, and your work has been a great help.
Cool. I'd love to see what you've got.

I'll just add that as long as you're writing it in Perl you might want to include a nonlinear optimization module (or even just a bidirectional pipe to an external exe) so as to calculate optimal solutions for boundary cases (for example, constraining all parlays of size ≥ n to be zero). http://www.ampl.com/ has some sample optimization engines.


12. Never heard of AMPL--checking it out now. Once I get this done then I have to tackle the issue of betting simultaneous correlated bets that are not parlay-able.

Any idea if any published work covers this topic?

13. Originally Posted by rjp
Never heard of AMPL--checking it out now. Once I get this done then I have to tackle the issue of betting simultaneous correlated bets that are not parlay-able.

Any idea if any published work covers this topic?
The methodology for determining an optimal solution via calculus, with one or more bets constrained to zero, is no different than in the general case.

Check out footnote 2 of Kelly Part II.


14. What exactly do you mean when you say constrained to zero in regards to correlated bets?

15. Originally Posted by rjp
What exactly do you mean when you say constrained to zero in regards to correlated bets?
You had stated that in some cases you might want to determine an optimal solution that did not include any allocation for one or more given parlays. In such a circumstance we'd say that the weightings of these parlays would then be "constrained to zero".

Or have I not understood your question?


16. Ah, I see... yes, I did ask about that earlier, but now I'm asking about correlated bets. I think my mentioning that these correlated bets can't be parlayed threw you off.

17. Originally Posted by rjp
Ah, I see... yes, I did ask about that earlier, but now I'm asking about correlated bets. I think my mentioning that these correlated bets can't be parlayed threw you off.
Whether the bets are correlated or not is largely irrelevant.

You just need to specify the outcome set and associated probabilities, constrain the undesired parlays to zero and then perform the global utility maximization.

The point is that minor complications such as outcome correlation and constraints on bet sizes leave the optimization equation and procedure completely unchanged. In fact, the best way to look at it is that uncorrelated outcomes and no wagering constraints are simply a special case of the generic utility maximization problem.


18. That's good to hear. I had just assumed there was something more that needed to be done.

19. Let's say I have a \$100,000 bankroll and the full Kelly stake for a particular bet is \$5,000. Now, if I understand correctly, if I bet \$7,000 it will result in negative expected growth.

However, this assumes that my bankroll is finite and that if I lose it that I will never be able to raise more funds and I will never place another bet. But, that wouldn't be the case. At some point (even if it were years later), I would replenish my bankroll to some degree and start betting again.

Does this leave me in the odd situation where a bet has negative expected growth on the immediate bankroll situation but positive bankroll growth expectation on the true long-term bankroll? Is it possible that if you look at only your current "cash on hand" bankroll as the basis that you are sub-optimizing your bet size?

20. Originally Posted by Utah
Let's say I have a \$100,000 bankroll and the full Kelly stake for a particular bet is \$5,000. Now, if I understand correctly, if I bet \$7,000 it will result in negative expected growth.
Technically speaking this is not possible. If edge/(odds-1) = 5%, then a stake of 7% will never result in negative growth. Nevertheless, your point is clear. Given even odds and a win probability of 52.5% (Kelly stake of 5%), a stake of 10% would result in expected growth of about -0.00084%.
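These figures can be checked with the growth formula from the article (a sketch; growth turns negative only at roughly twice the Kelly stake):

```python
def expected_growth(p, O, X):
    return (1 + (O - 1) * X) ** p * (1 - X) ** (1 - p) - 1

# Even odds (decimal O = 2), win probability 52.5% => full Kelly stake = 5%.
print(expected_growth(0.525, 2.0, 0.05))  # full Kelly: growth is maximized
print(expected_growth(0.525, 2.0, 0.07))  # over-bet, but growth still positive
print(expected_growth(0.525, 2.0, 0.10))  # ~2x Kelly: ~ -0.00084%, negative
```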

Originally Posted by Utah
However, this assumes that my bankroll is finite and that if I lose it that I will never be able to raise more funds and I will never place another bet. But, that wouldn't be the case. At some point (even if it were years later), I would replenish my bankroll to some degree and start betting again.

Does this leave me in the odd situation where a bet has negative expected growth on the immediate bankroll situation but positive bankroll growth expectation on the true long-term bankroll? Is it possible that if you look at only your current "cash on hand" bankroll as the basis that you are sub-optimizing your bet size?
When dealing with optimization problems it's always very important to accurately define your terms. In theory, "bankroll", as far as Kelly is concerned, wouldn't just be cash on hand but is the fully discounted present value (discounted for both prevailing interest rates and personal preferences) of all future earnings and mortgageable assets (including your heart, kidneys, liver, soul, etc. -- all of which you'd be likely to heavily discount based on your probable affinity for each.) For the utility function to make sense, losing your entire "bankroll" needs to be an infinitely bad outcome.

In practice, of course, this is rather unrealistic. I'm not going to get into the mathematics of how one might modify Kelly to use a bankroll function as opposed to a single scalar bankroll quantity, but I will say that one's bankroll should generally be construed as broadly as possible.

I wrote this a while back with respect to narrowly construing bankroll:
Originally Posted by Ganchrow

Now people might very well object when they read this, saying that this bankroll valuation just doesn't make any sense, and that no one would want to bet in this manner, etc. etc. And you know what? You'd probably be right. Kelly assumes logarithmic preferences and as I've mentioned many times before most humans just don't have log prefs. So to get around this issue, people often claim (in fact I don't know anyone who doesn't) that a Kelly bankroll is only what the bettor would feel comfortable losing. That's all well and good -- but to be perfectly clear that's a compromise position and doesn't represent "true" Kelly.

In conclusion, the Kelly stake represents the optimal bet size as a percentage of total bankroll that should be bet if the bettor's goal were to maximize the expected growth rate of that bankroll. (In fact, this is equivalent to saying that the bettor has log preferences.) Were that not your goal (and it probably isn't), then, strictly speaking, Kelly's not for you. (The ambitious might consider implementing Kelly using a more appropriate utility function. This actually isn't too difficult to compute for well-behaved, convex preferences.) But that doesn't mean that you couldn't use a version of it with which you're sufficiently comfortable.


21. Bump

22. Much needed.

23. Very detailed post; it seems I'll need to read it twice.

Originally Posted by Ganchrow
A quantitative introduction to the Kelly criterion

Part I -- Expected Value vs Expected Growth

A question I'm often asked is how exactly expected value differs from expected growth. The difference is somewhat subtle but understanding it is essential to risk management in general and the Kelly criterion in particular.

The question frequently arises (as it did here in the the next-to-last paragraph) in the context of the idea that betting one's entire bankroll implies -100% bankroll growth. (That's 100% bankroll shrinkage -- a bankroll that shrinks to \$0.) What's more, if you bet your entire bankroll in one go, 100% bankroll shrinkage is implied regardless of both the probability of the bet winning (as long as it wins less than 100% of time) and the odds paid out on the bet (as long as the odds are less than infinity).

Think about that for a moment, because it's an important point: If when you bet you wager your entire bankroll each time then you expect your bankroll to eventually shrink to zero.

Well, at least it should be important, but the truth is doesn't really get us any closer to understanding what exactly bankroll growth is and how it differs from expected value. We’ll get back to this later.

Let's start with a brief review of expected value. I described it in relative depth here. And I quote: So to summarize, the expected value of a bet is the amount we would receive on average if we were to repeat the exact same bet a very large number of times. As such, the expected value of a bet is a a metric by which one might judge the relative attractiveness of that bet. If one bet has an expected value of 5% (meaning that for every \$10,000 we bet we would expect to win \$500) and another has an expected value of 10% (meaning that for every \$10,000 we bet we would expect to win \$1,0000), then we would tend to think that one would prefer the latter bet to the former.

But there's a bit of a difficulty here -- namely, expected value ignores any consideration of the relative likelihoods of given outcomes alone. For example a \$10,000 bet on a 0.0000000000000000000000000000000000001% likelihood event paying out at +110,000,000,000,000,000,000,000,000,000 ,000,000,000,000 odds corresponds to an expected value of 10% (+\$1,000). But who among us would be willing to essentially throw away \$10,000 on such a long shot? To put it in perspective you'd be about 1,870 times more likely to win the the New Jersey State Lottery five times in a row, than you would be to win this particular bet. Does it really matter that if by some fluke of nature you actually did win you'd have an unfathomably huge amount of money? If you're like most people, the answer is probably not surebets

So now here's the difficulty ... there's no way whatsoever to account for this very real phenomenon of preferences by appealing to the theory of expected value alone.

(Enter stage right, expected bankroll growth.)

One major problem with the proposed bet is that for most people, \$10,000 represents a rather large chunk of one’s bankroll to be throwing away on a bet that’s nearly certain to lose. But while a \$10,000 bet is probably too large a quantity to risk on this bet, there’s still a sufficiently small dollar amount that most people would be willing to risk to make this bet. Granted, for most people that dollar amount would be somewhere in the neighborhood of a tiny fraction of a penny, but it nevertheless would still be a positive dollar amount.

The fundamental issue with bets such as these is that, despite being positive EV, placing them is an excellent way to go broke. The apparent contradiction is easily reconciled. If you were to repeat this bet once in each of a gigantically huge number of parallel universes, in nearly all of the universes you’d lose your bet, but in a tiny, tiny, tiny, tiny, tiny fraction of those universes you’d have win the bet and that win quantity would make up for all the losses plus an additional 10% of the amount risked.

The fact is that most people just aren’t willing to live through billions of trillions worth of bets just to have a vanishingly minuscule probability of winning a huge odds bet once. So while the bet may have positive expected value, the expected outcome is for your bankroll to shrink by \$10,000 each time the bet’s made. If your bankroll were \$1,000,000 and you made the bet 100 times, you could expect to be broke after the 100th bet (even though your expected value would be 10% × \$1,000,000 = +\$100,000).

So let’s look at some more practical numbers. Assume you’re considering at a bet that wins with 50% probability and pays out at odds of +200. Further assume your total bankroll is \$100,000 and that you want to place 1% of your bankroll on this wager.

Question: Where do you expect your bankroll to be after 2 wagers?

Answer: There are 4 possible outcomes after placing two wagers:
1. Win both bets.
2. Win 1st bet, lose 2nd bet
3. Lose 1st bet, win 2nd bet
4. Lose both bets

Now because winning and losing the bet are both equally likely, all 4 outcomes occur with equal probability, namely 25%. Recall that you’d be betting 1% of your bankroll on each bet and would be paid off at odds of +200. Therefore, your ending bankroll under each of the 4 outcomes would be:
1. B = \$100,000 × (1 + 2×1%) × (1 + 2×1%) = \$104,040
2. B = \$100,000 × (1 + 2×1%) × (1 - 1%) = \$100,980
3. B = \$100,000 × (1 - 1%) × (1 + 2×1%) = \$100,980
4. B = \$100,000 × (1 - 1%) × (1 - 1%) = \$98,010

(The derivation of these equations is simple. Every time you win your bankroll would grow to 102% of its previous value, and every time you lose your bankroll would shrink to 99%.) The expected value from betting in this manner would be 25%×\$104,040 + 25%×\$100,980 + 25%×\$100,980 + 25%×\$98,010 = \$101,002.50. To calculate expected growth, we would first need to recognize that given our 50% win probability, our expected outcome would be to win a bet and to lose a bet (# of wins = 50% × 2 bets, # of losses = 50% × 2 bets). Therefore our expected growth would be that associated with that outcome (with expected growth, the relative ordering of wins/losses is irrelevant), namely \$100,980.

Therefore, the expected value from the two bets is \$1,002.50 or 1.0025%, and the expected growth is \$980 or 0.9800%. Notice that expected value is higher than expected growth -- this is what you’re always going to see. Expected value will always be higher than expected growth (except for probabilities of 0 or 100%, we’ll they’ll be equal) because a few relatively large, relatively uncommon outcomes will increase EV. Another way to think about this is by realizing that the worst case scenario is losing everything one time over., while the best case scenario would be winning your bankroll Infinity times over – in other words you while your maximum possible profit is unlimited, your maximum possible loss is limited to your bankroll.

So in this instance our expected outcome would be a bankroll of:
B* = \$100,000 × (1 + 2×1%)2×50% × (1 - 1%)2×50% = \$100,980,
implying expected bankroll growth of
E(G) = \$100,980/\$100,000 = 0.9800%
It should be readily apparent our expected outcome after n bets would be a bankroll of:
B* = \$100,000 × (1 + 2×1%)n×50% × (1 - 1%)n×50% = \$100,000 × (100.48881%)n,
implying expected bankroll growth of
E(G) = (100.48881%)n -1.
By extension, our expected outcome after just 1 bet would be:
B* = $100,000 × (1 + 2×1%)^(50%) × (1 - 1%)^(50%) = $100,488.81
And our expected bankroll growth would be
E(G) = (1 + 2×1%)^(50%) × (1 - 1%)^(50%) - 1 = 0.48881%
(This last result bears a little discussion. We can talk about expected growth after only 1 bet in the same manner as we can talk about expected value after just one bet. In the same way as we’d never see a real result equal to our expected value, we’d never actually see growth after one bet equal to expected growth. This should cause absolutely no concern.)
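The per-bet growth factor is easy to verify numerically. A sketch, using the same 1.02/0.99 win/loss multipliers as above:

```python
# Expected-outcome growth factor per bet: each bet contributes a 50%
# "share" of a win and a 50% "share" of a loss as fractional exponents.
g = (1 + 2 * 0.01) ** 0.5 * (1 - 0.01) ** 0.5

print(round(g - 1, 7))             # per-bet expected growth, ~0.0048881
print(round(100_000 * g ** 2, 2))  # after 2 bets: 100980.0, as derived above
```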

So let’s generalize our results with expected outcomes and growth. Given a starting bankroll of B0, decimal odds of O, a win probability of p, and a bet size of X (as a percentage of starting bankroll, B0), the bankroll associated with the expected outcome from placing the bet would be:
B* = B0 * (1 + (O-1) * X)^p * (1 - X)^(1-p)
And expected growth would be:
E(G) = (1 + (O-1) * X)^p * (1 - X)^(1-p) - 1
Expected value, you’ll recall, would be:
EV = p*(O-1)*X - (1-p)*X = (pO - 1)*X
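These general formulas translate directly into code. A minimal sketch (the function names are mine):

```python
def expected_value(p: float, O: float, x: float) -> float:
    """EV as a fraction of bankroll: (pO - 1) * x."""
    return (p * O - 1) * x

def expected_growth(p: float, O: float, x: float) -> float:
    """E(G) = (1 + (O - 1) * x)^p * (1 - x)^(1 - p) - 1."""
    return (1 + (O - 1) * x) ** p * (1 - x) ** (1 - p) - 1

# Sanity check against the running example: O = 3.0 (2-to-1 net odds),
# p = 50%, stake x = 1% of bankroll.
print(expected_value(0.5, 3.0, 0.01))             # 0.005 (0.5% of bankroll)
print(round(expected_growth(0.5, 3.0, 0.01), 7))  # ~0.0048881 (0.48881%)
```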
Q: So let’s look at a concrete example: What are the expected value and bankroll growth associated with a bet equal to 1% of bankroll paying out at -110 and winning with probability 54%?

A:
EV = 1% × (54% × 1.909091 - 1) = 0.03091% of bankroll
E(G) = (1 + 0.909091 × 1%)^(54%) × (1 - 1%)^(46%) - 1 = 0.02638% bankroll growth.
Q: Now let’s consider the same terms, but in the case of a player placing a bet 25% of bankroll. What would expected value and growth be in this case?

A:
EV = 25% × (54% × 1.909091 - 1) = 0.7727% of bankroll
E(G) = (1 + 0.909091 × 25%)^(54%) × (1 - 25%)^(46%) - 1 = -2.1510% bankroll growth = 2.1510% bankroll shrinkage
So think about these results for a moment. We have a positive-expectation bet, and hence, quite naturally, the more we bet on it the more we expect to make. However, if we were to wager too much on this bet, we'd expect our bankroll to shrink by 2.1510% per wager (were we to place this positive-expectation bet 32 times, for example, we'd expect our bankroll to shrink to roughly half its starting value).
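Both worked examples can be reproduced with the general formulas. A sketch (function names are mine):

```python
def expected_value(p, O, x):
    """EV as a fraction of bankroll: (pO - 1) * x."""
    return (p * O - 1) * x

def expected_growth(p, O, x):
    """E(G) = (1 + (O - 1) * x)^p * (1 - x)^(1 - p) - 1."""
    return (1 + (O - 1) * x) ** p * (1 - x) ** (1 - p) - 1

O = 1 + 100 / 110  # decimal odds for a -110 line, ~1.909091
p = 0.54

print(round(expected_value(p, O, 0.01), 7))   # ~0.0003091  (0.03091% of bankroll)
print(round(expected_growth(p, O, 0.01), 7))  # ~0.0002638  (0.02638% growth)
print(round(expected_growth(p, O, 0.25), 6))  # ~-0.02151   (2.1510% shrinkage)
```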

So this should help elucidate the huge-odds bet above. No matter how positive a bet's EV might be, if you bet too much on it, you expect your bankroll to shrink. This is the concept to which people are referring when they talk about "money management". Even if you could pick NFL spreads at 75% (which you can't), were you to bet too much, you'd expect to head towards bankruptcy.

So as a limiting case, let's look at one more example, the example of betting one's entire bankroll mentioned at the start of this article: win probability = p, bet size = 100% of bankroll.

EV = 100% × (pO - 1) = pO - 1 (EV > 0 for p > 1/O)
E(G) = (1 + (O - 1))^p × 0^(1-p) - 1 = -100% (for p < 1 and O < ∞)

So what does this tell us? Well for one thing it tells us that even if you were the “best handicapper ever”, were you to risk your entire bankroll on every bet, you would expect to go broke. More generally, it illustrates the concept that looking solely at expected value as a metric for the attractiveness of a given bet is not the proper way to maintain long term growth.
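The limiting case is easy to see numerically. A sketch, using a hypothetical 75%-win, even-money bet (far better than anything real):

```python
def expected_growth(p, O, x):
    """E(G) = (1 + (O - 1) * x)^p * (1 - x)^(1 - p) - 1."""
    return (1 + (O - 1) * x) ** p * (1 - x) ** (1 - p) - 1

# p = 75% at even money (O = 2.0) is hugely +EV, yet staking the whole
# bankroll (x = 100%) makes the (1 - x)^(1-p) factor 0^0.25 = 0.
print((0.75 * 2.0 - 1) * 1.0)           # EV = 0.5, i.e. +50% per bet
print(expected_growth(0.75, 2.0, 1.0))  # -1.0, i.e. -100% expected growth
```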

In the next part of this article we’ll discuss how one might use the concept of expected growth to determine bet size. This is the essence of the Kelly criterion.

24. bump

26. this is worth bumping.

thanks for the nice work ganchrow!

27. Should be stickied.

28. I miss Ganch.

29. In the practical example, I'm confused how he gets this line for the EV calculation:

EV = 1% × (54% × 1.909091 - 1) = 0.03091% of bankroll

How do you get 1.909091 from a line of -110?

30. Decimal odds.


31. I used the odds converter tool and it says 1.9091... How did he get his number?

32. Favorite (Line of F): 1 - (100/F)

Line of -110:
1 - (100/-110) = 1.9090909090909090909090909090909

Dog (Line of D): 1 + (D/100)

Line of +110:
1+(110/100) = 2.1
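The two rules above fit in one small function (a sketch; the function name is mine):

```python
def american_to_decimal(line: float) -> float:
    """Convert an American line to decimal odds."""
    if line < 0:
        return 1 - 100 / line  # favorite, e.g. -110
    return 1 + line / 100      # dog, e.g. +110

print(american_to_decimal(-110))  # ~1.909091 (the 1.9091 above, unrounded)
print(american_to_decimal(110))   # 2.1
```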


33. Originally Posted by MonkeyF0cker
Line of -110: 1 - (100/-110) = 1.9090909090909090909090909090909
Originally Posted by thesoxman101
How do you get 1.909091 from a line of -110?
Originally Posted by thesoxman101
I used the tool for odds converter and it says 1.9091.. How did he get his number?
Hmm, perhaps there is a bug in the odds converter?


34. Yeah it gave me 1.9091, but in the example you used 1.909091

35. Are you serious? Is that a third grade math question in the Think Tank?
