Consider a Markov chain with two states, [imath]\Omega=\{0,1\}[/imath], where state 0 corresponds to winning and state 1 to losing. The transition probabilities are given by the matrix [imath]P[/imath]:
[math]P=\left[\begin{matrix} 0.8 & 0.2\\ 0.3 & 0.7 \end{matrix}\right][/math]
We are interested in the fraction of time the Markov chain spends in each state (winning/losing) as [imath]n[/imath] grows large, i.e. the limiting distribution [imath]\lim_{n \to \infty} \pi^{(n)}[/imath], where [imath]\pi^{(n)}=\pi^{(0)}P^{n}[/imath] and [imath]\pi^{(0)}= [P(X_0=0)\quad P(X_0=1)][/imath].
Since the chain is irreducible and aperiodic, every row of [imath]P^{n}[/imath] converges to the stationary distribution; for a two-state chain with off-diagonal entries [imath]0.2[/imath] and [imath]0.3[/imath], this limit is
[math]\lim_{n \to \infty} P^n=\frac{1}{0.2+0.3} \left[\begin{matrix} 0.3 & 0.2\\ 0.3 & 0.2 \end{matrix}\right]= \left[\begin{matrix} 0.6 & 0.4\\ 0.6 & 0.4 \end{matrix}\right][/math]
Then we have:
[math]\lim_{n \to \infty} \pi^{(n)}= \lim_{n \to \infty} \pi^{(0)}P^{n}= [P(X_0=0)\quad P(X_0=1)]\cdot \frac{1}{0.5} \left[\begin{matrix} 0.3 & 0.2\\ 0.3 & 0.2 \end{matrix}\right]= [0.6\quad 0.4],[/math]
where the last equality holds because [imath]P(X_0=0)+P(X_0=1)=1[/imath] and the rows of the limiting matrix are identical.
Thus the probability of winning is [imath]P(W) = 0.6[/imath] and the probability of losing is [imath]P(L) = 0.4[/imath].
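As a sanity check, here is a minimal NumPy sketch of the computation above (the exponent [imath]n=50[/imath] and the starting distribution are arbitrary choices for illustration):
[code]
import numpy as np

# Transition matrix: state 0 = winning, state 1 = losing
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# For large n, both rows of P^n converge to the stationary distribution
P_limit = np.linalg.matrix_power(P, 50)
print(P_limit)  # ~[[0.6, 0.4], [0.6, 0.4]]

# The limit is the same for any initial distribution pi0
pi0 = np.array([0.5, 0.5])  # arbitrary starting distribution
print(pi0 @ P_limit)        # ~[0.6, 0.4]
[/code]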
Additionally, we are given [imath]P(S|W)=0.2[/imath] and [imath]P(S|L)=0.7[/imath].
Therefore,
[math]P(S) = P(S \cap W) + P(S\cap L)=P(S|W)P(W) + P(S|L)P(L)=(0.2)(0.6)+(0.7)(0.4)=\boxed{0.4}[/math]
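The last step is just the law of total probability; for completeness, a short sketch of the same arithmetic (variable names are illustrative):
[code]
# Law of total probability: P(S) = P(S|W)P(W) + P(S|L)P(L)
p_w, p_l = 0.6, 0.4                  # limiting probabilities from above
p_s_given_w, p_s_given_l = 0.2, 0.7  # given conditional probabilities
p_s = p_s_given_w * p_w + p_s_given_l * p_l
print(p_s)  # ~0.4 (0.12 + 0.28, up to float rounding)
[/code]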