Imagine you get the opportunity to flip a coin for just 1 dollar: if it turns up heads, you win 50% of your bet, and if it turns up tails, you lose 40% of your initial bet. Of course we’re all familiar with calculating expected payoffs: since this bet has an expected payoff of 0.5 * 1.5 + 0.5 * 0.6 = 1.05 per dollar, you should take it. Right? Well, it might be a little less obvious than that.
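As a quick sanity check, here is that arithmetic in Python (a minimal sketch; the 1.5 and 0.6 multipliers are just the bet above restated as growth factors):

```python
# Heads multiplies your bet by 1.5, tails by 0.6, each with probability 1/2.
p_heads = 0.5
expected_payoff = p_heads * 1.5 + (1 - p_heads) * 0.6
print(expected_payoff)  # 1.05: a 5% expected gain per flip
```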
Suppose you let your winnings ride and flip the coin a second time. Your wealth per dollar bet will, depending on the flips, be 2.25 (two heads), 0.9 (one heads, one tails) or 0.36 (two tails). A sample trajectory of your wealth can be seen below, tossing a coin every minute for sixty minutes, starting with an initial wealth of 100 dollars. Based on this single sequence it is impossible to say anything about where you’ll end up: it just looks like noise.
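If you want to reproduce such a trajectory yourself, a minimal simulation could look like this (the function name and seed are my own choices; the 1.5/0.6 multipliers and the 100 dollars of starting wealth come from the setup above):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def trajectory(n_flips=60, start=100.0):
    """One wealth path: every flip multiplies wealth by 1.5 (heads) or 0.6 (tails)."""
    multipliers = rng.choice([1.5, 0.6], size=n_flips)
    return start * np.cumprod(multipliers)

wealth = trajectory()
print(wealth[-1])  # final wealth after 60 flips; varies wildly from run to run
```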
What is more interesting is plotting several trajectories (in this case 20) and looking at their average. That average still looks like noise, so we need more data to make an educated guess about where the trajectory is heading.
As we increase the number of sequences, something interesting appears: flip a coin for 60 minutes and repeat that a million times (you’ll end up with very sore fingers, but it is all for science). The average trajectory gets straighter and we ultimately end up with the following graph. Below, on the left-hand side you can see the ensemble average: the average over all the trajectories of flipping a coin for 60 minutes. It surely looks like a good game!
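A sketch of that ensemble experiment (self-contained; a full million paths works the same way but needs a few hundred megabytes of memory, so the call below uses 100,000):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def ensemble_average(n_paths, n_flips=60, start=100.0):
    """Average wealth across many independent paths, minute by minute."""
    multipliers = rng.choice([1.5, 0.6], size=(n_paths, n_flips))
    paths = start * np.cumprod(multipliers, axis=1)
    return paths.mean(axis=0)

avg = ensemble_average(n_paths=100_000)
print(avg[-1])  # near 100 * 1.05**60 ≈ 1868: the ensemble average keeps growing
```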
On the right you can see your wealth after flipping a coin every minute for a year. Of course the scaling is different, but the two graphs are moving in opposite directions. What is going on here?
The difference between ensemble- and time-averaging
In order to understand this difference we need to be a bit more precise when we talk about averaging. On the left-hand side we looked at an ensemble average (think in this case of a million people playing the game). But is this relevant for an individual? On average you’re making money, but for the individual only one sequence matters: we can’t go back in time or access parallel universes where the coin tosses might have ended up differently. On the right-hand side you can see the time average for one individual. Over time you’re definitely not making money (and you’ve spent a year flipping a coin, time you could perhaps have invested better by studying). The reason is that repeated bets compound multiplicatively: what matters for a single trajectory is not the arithmetic mean of the multipliers, 0.5 * 1.5 + 0.5 * 0.6 = 1.05, but their geometric mean, (1.5 * 0.6)^(1/2) ≈ 0.95, which is below 1, as the sketch below makes concrete.
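A minimal sketch of that calculation (the yearly figure of 525,600 minute-flips is my own back-of-the-envelope addition):

```python
import numpy as np

multipliers = np.array([1.5, 0.6])  # heads and tails, each with probability 1/2

arithmetic_mean = multipliers.mean()                 # 1.05: ensemble growth per flip
geometric_mean = np.exp(np.log(multipliers).mean())  # sqrt(0.9) ≈ 0.9487

print(arithmetic_mean, geometric_mean)
# A typical trajectory scales like 0.9487**n after n flips, so over a year of
# minute-by-minute flipping (n = 525,600) it collapses to essentially zero.
```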
One thing to note is that the ensemble average is dominated by the extreme cases. It is clear from the time average that almost all of the people will end up with next to no money, but the lucky few who still have money will see an enormous increase in their wealth. The ensemble average doesn’t show this: it looks like the game is going well because the extremes cover up the bad cases. As another example, think about GDP, one of the most used measures of the wealth of a country. If a billionaire makes another billion dollars while 20,000 school teachers simultaneously lose their jobs, the GDP of that country will rise. Is this then a good measure to base national policies on?
Non-ergodic processes
The example above is one of many examples of a non-ergodic process. Ergodic processes are best known from the physicist Ludwig Boltzmann, who formulated the ergodic hypothesis for statistical physics. An ergodic process is a process whose time average equals its expected value (the ensemble average, well known to most econometrics students by now as the sample mean). But thinking about randomness has been a topic of interest for centuries, starting with Pascal and Fermat. Around a century later Daniel Bernoulli (probably the most famous mathematician born in Groningen) incorporated randomness into economic theory. After Bernoulli, physicists such as Maxwell and Boltzmann focused on the time-average aspects of randomness.
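To make the definition concrete, here is a small check of my own (an illustration, not from the article’s sources): an additive version of the bet is ergodic, while the multiplicative version we have been playing is not:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
heads = rng.random(1_000_000) < 0.5  # one long sequence of fair coin flips

# Additive game: win 0.5 dollars on heads, lose 0.4 on tails. The time average
# of one long sequence converges to the expected value, 0.05 -> ergodic.
payoffs = np.where(heads, 0.5, -0.4)
print(payoffs.mean())  # ≈ 0.05

# Multiplicative game: wealth is multiplied by 1.5 or 0.6. The time-average
# growth factor converges to sqrt(0.9) ≈ 0.9487, not to the expected
# multiplier 1.05 -> non-ergodic.
multipliers = np.where(heads, 1.5, 0.6)
print(np.exp(np.log(multipliers).mean()))  # ≈ 0.9487
```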
The St. Petersburg paradox revisited
Bernoulli (Nicolas, that is, Daniel’s cousin) actually devised another thought experiment to illustrate that sample and time averages are not always equal: the famous St. Petersburg paradox. It works as follows: imagine you get the possibility to flip a coin. If it hits tails on the first flip, you win 2 dollars. If it hits heads on the first flip and tails on the second, you win 4 dollars, and so forth. So if it hits tails for the first time on the kth flip, you win 2^k dollars. This bet has an infinite expected value: 1/2 * 2 + 1/4 * 4 + 1/8 * 8 + 1/16 * 16 + … = 1 + 1 + 1 + 1 + … = ∞. So you would pay a huge amount to participate in this bet, right? Yet, when over 10,000 people were asked how much they would pay for such a bet, the average answer was less than 5 euros!
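You can get a feel for why intuition and expected value disagree by simulating the game (a sketch under the 2^k payoff rule above; the sample size is an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def st_petersburg_payoff():
    """Flip until the first tails; pay out 2**k dollars if it lands on flip k."""
    k = 1
    while rng.random() < 0.5:  # heads: keep flipping
        k += 1
    return 2 ** k

payoffs = [st_petersburg_payoff() for _ in range(100_000)]
print(np.mean(payoffs))  # unstable: dominated by a handful of rare, huge wins
```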
I hope this article gave you a bit of an idea about the complexity of randomness and another way of thinking about bets. If this article got you thinking, I would recommend checking out the research by Ole Peters. The next time you get an offer to flip a coin, think twice! Or make sure you’re able to flip multiple coins at the same time: that way you’re certain to make money on average!
References:
Peters, O. Time and Chance. Talk at TEDxGoodenoughCollege.
Peters, O. The ergodicity problem in economics. Nat. Phys. 15, 1216–1221 (2019). https://doi.org/10.1038/s41567-019-0732-0
Peters, O. The time resolution of the St. Petersburg paradox. arXiv:1011.4404 (2010; revised Mar 2011).
This article was written by Simon Elgersma