Gambler’s ruin
Consider a symmetric random walk on the line $x\ge 0$ starting from some $c>0$, and suppose the walker stops when he reaches $x=0$. Let $X(t)$ denote the location of the walker at time $t$. Then $\Pr(X(t)=0\ \mbox{for some}\ t)=1$, but $E[X(t)]=c$ for all $t$. The latter can be seen by noting that, as long as the walker has not stopped, $X(t) = X(t-1) + Y$ where $\Pr(Y=1)=\Pr(Y=-1)=0.5$, so $E[X(t)]=E[X(t-1)]$ whether the walker has stopped or not. (It might be surprising that $E[X(t)]$ is the same as in an unbounded random walk.)

Furthermore, if we define $T_k=\inf\{t:X(t)=k\}$, so that $T_0$ is the time to stop, then for any $c>0$ we have $\Pr\{T_0<\infty\}=1$ but $E[T_0]=\infty$. This is because the distribution of $T_0$ is very heavy-tailed; in fact, one can show that $\Pr\{T_0\ge n\}\approx \frac{a}{\sqrt{n}}$ for some constant $a$. An intuitive way to see $E[T_0]=\infty$ is as follows. Since the walk must descend one level $c$ times and the descents are i.i.d., $E[T_0] = c\cdot E[T_{c-1}]$, so it suffices to consider $E[T_{c-1}]$. Conditioning on the direction of the first step, and noting that after a step to the right the walk must descend two levels, which takes $2\cdot E[T_{c-1}]$ steps in expectation,
\begin{eqnarray*}
E[T_{c-1}] & = & E[T_{c-1}\ |\ \mbox{left}] \Pr(\mbox{left}) + E[T_{c-1}\ |\ \mbox{right}] \Pr(\mbox{right})\\
& = & 1\cdot 2^{-1} + (1 + 2\cdot E[T_{c-1}]) \cdot 2^{-1} = 1 + E[T_{c-1}].
\end{eqnarray*}
Hence the only possibility is $E[T_{c-1}] = \infty.$
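A minimal simulation sketch of these claims (the starting point $c=5$, the step cap, the number of trials, and the seed are all arbitrary choices): the empirical tail probability $\Pr\{T_0\ge n\}$ decays roughly like $a/\sqrt{n}$, while the truncated sample mean of $T_0$ keeps growing as the cap grows.

```python
# Monte Carlo sketch of the gambler's-ruin claims above.
import random

def hitting_time(c, cap):
    """Steps of a symmetric walk started at c until it hits 0 (or `cap` if it hasn't by then)."""
    x, t = c, 0
    while x > 0 and t < cap:
        x += random.choice((-1, 1))
        t += 1
    return t

random.seed(0)
c, cap, trials = 5, 100_000, 2_000
times = [hitting_time(c, cap) for _ in range(trials)]

for n in (100, 1_000, 10_000):
    tail = sum(t >= n for t in times) / trials
    print(f"Pr{{T_0 >= {n:>6}}} ~ {tail:.3f}   (tail * sqrt(n) = {tail * n ** 0.5:.2f})")

# The truncated mean keeps growing as `cap` grows, consistent with E[T_0] = infinity.
print("truncated sample mean of T_0:", sum(times) / trials)
```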
The St. Petersburg paradox I
Consider a game where a player flips a fair coin repeatedly until the first head turns up, and wins $2^n$ dollars if that head appears on the $n$th flip. Then the expected amount of money a player can win from the game is $\sum_{n\ge1} 2^n\cdot 2^{-n} = \infty$. Hence, to make this game fair, a player should pay an infinite amount of money to play it!
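A small simulation sketch of the divergent expectation (the number of games and the random seed are arbitrary): the empirical average payoff typically keeps drifting upward as more games are averaged, instead of settling near a finite value.

```python
# Sketch: the sample mean of St. Petersburg payoffs does not stabilize,
# consistent with the game's infinite expectation.
import random

def st_petersburg():
    """One game: flip a fair coin until the first head; pay 2**n dollars if it lands on flip n."""
    n = 1
    while random.random() < 0.5:  # tails
        n += 1
    return 2 ** n

random.seed(1)
payoffs = [st_petersburg() for _ in range(1_000_000)]
for m in (100, 10_000, 1_000_000):
    print(f"average payoff over {m:>9} games: {sum(payoffs[:m]) / m:8.1f}")
```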
The St. Petersburg paradox II
Following the previous example, how much should a player be charged if he is only allowed to play the game $N$ times? Denote the player's total reward by $S_N$. It is tempting to charge about $N$ dollars, for instance by truncating a single game after $N$ flips and noting that such a truncated game is worth $\sum_{n=1}^N 2^n\cdot 2^{-n} = N$. However, it can be shown that $\frac{S_N}{N\lg N}\rightarrow 1$ in probability. Namely, when $N$ is large, the player is very likely to win about $N\lg N$ dollars in total from the $N$ games. Hence, $N\lg N$ (about $\lg N$ per game) is arguably a more reasonable price than $N$ to charge the player.
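A rough simulation sketch of the $N\lg N$ claim (the values of $N$ and the seed are arbitrary, and individual runs are noisy because the payoff distribution is heavy-tailed): the ratio $S_N/(N\lg N)$ tends to hover around 1 for large $N$.

```python
# Sketch: total winnings S_N over N games, compared against N * log2(N).
import math
import random

def st_petersburg():
    """One game: flip a fair coin until the first head; pay 2**n dollars if it lands on flip n."""
    n = 1
    while random.random() < 0.5:  # tails
        n += 1
    return 2 ** n

random.seed(2)
for N in (10_000, 100_000, 1_000_000):
    S_N = sum(st_petersburg() for _ in range(N))
    print(f"N = {N:>9}:  S_N / (N lg N) = {S_N / (N * math.log2(N)):.3f}")
```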
A winning strategy for a fair game
Consider a game where a player bets $X_n\ge0$ dollars on the $n$th round, $n\ge1$. After $X_n$ is decided, a fair coin is flipped. The player loses the bet if a tail turns up, and wins another $X_n$ dollars otherwise. Here is a strategy that guarantees the player wins 1 dollar almost surely: (i) set $X_1=1$; (ii) set $X_{n+1}=2^n$ if he lost the $n$th round; (iii) set $X_n=0$ for all $n>N$ once he wins the $N$th round. Suppose the first head turns up on the $N$th flip. Then he collects $2X_N=2^N$ dollars on the winning round (his stake $X_N=2^{N-1}$ plus the same amount in winnings), while having wagered $1+2^1+\cdots+2^{N-1}$ dollars in total, so he wins exactly $2^N-(1+2^1+\cdots+2^{N-1})=1$ dollar. However, although the average number of rounds for this strategy is $1/0.5=2$, the expected amount the player loses before the winning flip is $\sum_{n\ge1}(2^n-1)\cdot 2^{-(n+1)}=\frac14+\frac38+\frac7{16}+\cdots=\infty$ dollars, so in expectation he must risk an infinite amount before he can get his dollar.
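A minimal simulation sketch of this doubling strategy (the numbers of trials and the seed are arbitrary): every trial nets exactly 1 dollar, yet the average amount lost before the winning flip grows as more trials are averaged, reflecting its infinite expectation.

```python
# Sketch: the doubling strategy always nets exactly 1 dollar,
# but the amount lost before the winning flip has infinite expectation.
import random

def run_strategy():
    """Play until the first win; return (net gain, total amount lost before the win)."""
    bet, lost = 1, 0
    while random.random() < 0.5:      # tails: lose this round
        lost += bet
        bet *= 2                      # double the next bet
    return bet - lost, lost           # the winning round pays another `bet` dollars

random.seed(3)
for trials in (1_000, 100_000):
    results = [run_strategy() for _ in range(trials)]
    assert all(gain == 1 for gain, _ in results)           # the player always ends up +1
    avg_lost = sum(lost for _, lost in results) / trials
    print(f"{trials:>7} trials: average loss before the win = {avg_lost:.1f}")
```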
Waiting time for a better offer
Suppose you want to sell your car on the Internet. It turns out that if the offers are i.i.d., then you may expect to wait forever for an offer better than the first one. To see this, denote the sequence of offers by $\{X_n:n\ge0\}$, with $X_0$ the first offer, and let $N=\inf\{n:X_n>X_0\}$ be the waiting time until the first offer that beats $X_0$. Note that $$\Pr\{N>n\}=\Pr\{X_0=\max_{0\le k \le n} X_k\}\ge \frac{1}{n+1}$$ by symmetry. Hence $E[N]=\sum_{n\ge0} \Pr\{N>n\} \ge \sum_{n\ge1} \frac{1}{n} = \infty$. This fact suggests that you should accept the first offer if you have to make a decision immediately after each offer is made and cannot change your mind afterwards.
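A small simulation sketch with continuous i.i.d. uniform offers (the cap, the number of trials, and the seed are arbitrary): the empirical tail matches $\Pr\{N>n\}=1/(n+1)$, and the truncated sample mean of $N$ keeps growing as the cap grows.

```python
# Sketch: waiting time N for an offer better than the first, with i.i.d. uniform offers.
import random

def waiting_time(cap):
    """Index of the first offer exceeding the first one, or `cap` if none appears by then."""
    first = random.random()
    for n in range(1, cap + 1):
        if random.random() > first:
            return n
    return cap

random.seed(4)
cap, trials = 1_000_000, 100_000
samples = [waiting_time(cap) for _ in range(trials)]

for n in (10, 100, 1_000):
    tail = sum(s > n for s in samples) / trials
    print(f"Pr{{N > {n:>5}}} ~ {tail:.4f}   (1/(n+1) = {1/(n+1):.4f})")
print("truncated sample mean of N:", sum(samples) / trials)
```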
Power of hints in a fair game
A banker tosses two fair coins and then covers them. To play this game, you choose one of the two coins to uncover, without looking. You win the game if the coin you choose shows a head.
Case 1. Without further information, you have a 1/2 chance to win regardless of which coin you choose.
Case 2. Suppose the banker (who has not peeked at the coins) uncovers one of them. If you decide to choose the other coin, your chance of winning is still 1/2 regardless of what the uncovered coin shows. Hence, uncovering a coin provides no information about the other coin.
Case 3. Suppose the banker peeked at the coins before they were covered, and he tells you that at least one of the coins is a head. Then whichever coin you choose, you have a 2/3 probability of winning.
Case 4. Following case 3, suppose the banker not only tells you that one of the coins is a head, but also uncovers that coin. If you choose the other coin, your chance of winning drops to 1/3.
Case 5. Suppose the banker peeked at the coins before they were covered, and he tells you that at least one of the coins is a tail. Then whichever coin you choose, you have a 1/3 probability of winning.
Case 6. Following case 5, suppose the banker not only tells you that one of the coins is a tail, but also uncovers that coin. If you choose the other coin, then your chance of winning increases to 2/3. (Cases 3 through 6 are verified numerically in the sketch below.)
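A short sketch that checks cases 3 through 6 by exact enumeration over the four equally likely outcomes (assuming the banker tells the truth and, when both coins qualify, reveals one of them uniformly at random):

```python
# Exact check of cases 3-6 by enumerating the four equally likely coin outcomes.
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))   # HH, HT, TH, TT, each with probability 1/4

def prob_win(hint, pick_other):
    """Winning probability given the banker truthfully announces that a `hint` coin exists.
    If pick_other is True, you take the coin the banker did NOT reveal (he reveals a coin
    showing `hint`, chosen uniformly if both qualify); otherwise you pick a coin at random."""
    num = den = Fraction(0)
    for a, b in outcomes:
        if hint not in (a, b):
            continue                                   # this announcement never happens here
        den += 1
        if pick_other:
            revealed = [i for i, c in enumerate((a, b)) if c == hint]
            num += Fraction(sum((a, b)[1 - i] == "H" for i in revealed), len(revealed))
        else:
            num += Fraction((a, b).count("H"), 2)      # you choose either coin with prob 1/2
    return num / den

print("case 3 (told a head, pick either):  ", prob_win("H", pick_other=False))  # 2/3
print("case 4 (head revealed, pick other): ", prob_win("H", pick_other=True))   # 1/3
print("case 5 (told a tail, pick either):  ", prob_win("T", pick_other=False))  # 1/3
print("case 6 (tail revealed, pick other): ", prob_win("T", pick_other=True))   # 2/3
```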
Pairwise independence does not imply true independence
Let a set $\{X_i\}$ of random variables be $k$-independent if $$\Pr(X_i=x) = \Pr(X_i=x\ |\ X_{a_1},...,X_{a_k})$$ for every $x$ and any distinct indices $a_1,...,a_k$ with $i\not\in\{a_1,...,a_k\}$. Namely, one cannot infer anything about the value of $X_i$ from the values of up to $k$ other random variables. Given any $k\ge2$, it is easy to construct $k+1$ random variables $X_0,...,X_k$ that are pairwise independent but not $k$-independent. Let $U_0,...,U_k$ be i.i.d. random variables uniformly distributed over $\{0,1,...,N-1\}$, and define $X_i=U_i-U_{(i+1)\bmod (k+1)} \bmod N$. Since $X_0+\cdots+X_k\equiv 0 \pmod N$, knowing any $k$ of them determines the remaining one, so they are obviously not $k$-independent. It is also easy to show that $X_i$ and $X_j$ are independent for $i\neq j$; in fact, $X_0,...,X_k$ are $m$-independent for any $1\le m <k$.

An immediate consequence of this fact (the case $k=2$) is: $\Pr(X) = \Pr(X\ |\ Y)$ and $\Pr(X) = \Pr(X\ |\ Z)$ do not imply that $\Pr(X) = \Pr(X\ |\ Y,Z)$.
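A brute-force check of the construction for the hypothetical parameters $k=2$ and $N=3$: enumerating all outcomes confirms that the $X_i$ are pairwise independent and uniform, yet only $N^2$ of the $N^3$ possible triples $(X_0,X_1,X_2)$ can occur, since any two of them determine the third.

```python
# Exhaustive check of X_i = (U_i - U_{(i+1) mod (k+1)}) mod N for k = 2, N = 3.
from collections import Counter
from itertools import product

k, N = 2, 3
joint = Counter()
for U in product(range(N), repeat=k + 1):              # all (U_0, ..., U_k), equally likely
    X = tuple((U[i] - U[(i + 1) % (k + 1)]) % N for i in range(k + 1))
    joint[X] += 1
total = N ** (k + 1)

# Pairwise independence: every pair (X_i, X_j) takes each of the N*N value pairs equally often.
for i in range(k + 1):
    for j in range(i + 1, k + 1):
        pair = Counter((x[i], x[j]) for x, count in joint.items() for _ in range(count))
        assert all(pair[v] == total // N ** 2 for v in product(range(N), repeat=2))

# Not k-independent: the X_i always sum to 0 mod N, so any two determine the third.
assert all(sum(x) % N == 0 for x in joint)
print("pairwise checks passed; attainable X-triples:", len(joint), "out of", N ** (k + 1))
```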