You should be able to find a proof of this fact in any undergraduate stochastic processes text; for example, Durrett's book Essentials of Stochastic Processes gives a good proof.

I'll give an outline of how to prove it. Suppose that the Markov chain starts at $X_0=x$. Let $0 = R_0 < R_1 < R_2 < \ldots$ be the sequence of return times to the site $x$. By the strong Markov property the excursion lengths $R_n - R_{n-1}$ are i.i.d., and since the Markov chain is positive recurrent, $E[R_n - R_{n-1}] = E[R_1] < \infty$. Next, let $N_n$ be the number of returns that have occurred by time $n$ (that is, $R_{N_n} \leq n < R_{N_n+1}$).
Finally, let $Y_k = \sum_{i=R_{k-1}+1}^{R_k} X_i$. With this notation we then have (assuming, for simplicity, that the $X_i$ are nonnegative, so that discarding terms only decreases the sum; the general case follows by splitting into positive and negative parts)
$$
\frac{1}{n} \sum_{k=1}^{N_n} Y_k \leq \frac{1}{n} \sum_{i=1}^n X_i \leq \frac{1}{n} \sum_{k=1}^{N_n} Y_k + \frac{Y_{N_n+1}}{n}.
$$
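To see the sandwich concretely, here is a quick numeric check on a small simulated chain. The two-state chain, its transition matrix, and all variable names below are illustrative choices, not part of the proof; since the states are $0$ and $1$, the $X_i$ are nonnegative and the lower bound applies.

```python
import random

def next_state(rng, row):
    """Sample the next state from one row of the transition matrix."""
    u = rng.random()
    acc = 0.0
    for y, p in enumerate(row):
        acc += p
        if u < acc:
            return y
    return len(row) - 1  # guard against floating-point round-off

# Illustrative two-state chain started at x = 0.
P = [[0.7, 0.3],
     [0.4, 0.6]]
rng = random.Random(2)
n = 10_000
X = [0]
for _ in range(n + 200):   # extra steps so the (N_n + 1)-th return exists
    X.append(next_state(rng, P[X[-1]]))

R = [0] + [i for i in range(1, len(X)) if X[i] == 0]  # R_0 = 0 < R_1 < R_2 < ...
# Y_k = sum of X_i over the k-th excursion (R_{k-1}, R_k]; stored 0-indexed
Y = [sum(X[R[k - 1] + 1 : R[k] + 1]) for k in range(1, len(R))]

N_n = max(k for k in range(len(R)) if R[k] <= n)  # number of returns by time n
lower = sum(Y[:N_n]) / n
middle = sum(X[1 : n + 1]) / n
upper = lower + Y[N_n] / n   # Y[N_n] is Y_{N_n + 1} in 0-indexed storage
print(lower <= middle <= upper)   # the sandwich holds
```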

Next, note that the renewal theorem implies that
$$
\lim_{n\rightarrow\infty} \frac{N_n}{n} = \frac{1}{E[R_1]},
$$
and so
$$
\lim_{n\rightarrow \infty} \frac{1}{n} \sum_{k=1}^{N_n} Y_k = \lim_{n\rightarrow \infty} \frac{N_n}{n} \frac{1}{N_n} \sum_{k=1}^{N_n} Y_k = \frac{E[Y_1]}{E[R_1]},
$$
where the last equality follows from the strong law of large numbers, since the $Y_k$ are i.i.d. (again by the strong Markov property) and $N_n \rightarrow \infty$. Now, it can also be shown that $Y_{N_n+1}/n \rightarrow 0$, and so the upper and lower bounds on $n^{-1} \sum_{i=1}^n X_i$ given above imply that
$$
\lim_{n\rightarrow \infty} \frac{1}{n} \sum_{i=1}^n X_i = \frac{E[Y_1]}{E[R_1]}.
$$
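As a sanity check (not part of the proof), you can simulate a small chain and watch the time average settle down to a constant. The two-state chain below is an illustrative choice; its stationary distribution works out to $\pi = (4/7, 3/7)$, so the limit should be $E^\pi[X_0] = 3/7$.

```python
import random

def next_state(rng, row):
    """Sample the next state from one row of the transition matrix."""
    u = rng.random()
    acc = 0.0
    for y, p in enumerate(row):
        acc += p
        if u < acc:
            return y
    return len(row) - 1  # guard against floating-point round-off

# Illustrative two-state chain; solving pi P = pi gives pi = (4/7, 3/7),
# so E^pi[X_0] = 0 * 4/7 + 1 * 3/7 = 3/7.
P = [[0.7, 0.3],
     [0.4, 0.6]]
rng = random.Random(0)
n = 200_000
x = 0
total = 0
for _ in range(n):
    x = next_state(rng, P[x])
    total += x
time_avg = total / n
print(time_avg)   # should be close to 3/7 ≈ 0.4286
```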

The last step of the proof is to show that $ \frac{E[Y_1]}{E[R_1]} = E^\pi[X_0]$, where $\pi$ is the unique stationary distribution. This can be shown by noting that the stationary distribution $\pi$ has the formula
$$
\pi(y) = \frac{1}{E[R_1]} E\left[ \sum_{i=1}^{R_1} \mathbf{1}_{X_i = y} \right].
$$
Multiplying by $y$ and summing over $y$ then gives
$$
E^\pi[X_0] = \sum_y y \, \pi(y) = \frac{1}{E[R_1]} E\left[ \sum_{i=1}^{R_1} X_i \right] = \frac{E[Y_1]}{E[R_1]},
$$
which completes the proof.
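You can also verify this excursion formula by Monte Carlo. The two-state chain below is again an illustrative choice: started at $x = 0$ it has $\pi = (4/7, 3/7)$, hence $E[R_1] = 1/\pi(0) = 7/4$.

```python
import random

# Monte Carlo check of the excursion formula for an illustrative
# two-state chain started at x = 0.
P = [[0.7, 0.3],
     [0.4, 0.6]]
rng = random.Random(1)

def next_state(x):
    return 0 if rng.random() < P[x][0] else 1

n_cycles = 50_000
total_len = 0          # accumulates cycle lengths -> estimates E[R_1]
occupation = [0, 0]    # visit counts -> estimates E[sum_{i=1}^{R_1} 1_{X_i = y}]
for _ in range(n_cycles):
    x = 0
    while True:
        x = next_state(x)
        total_len += 1
        occupation[x] += 1
        if x == 0:     # excursion ends on the return to the start state
            break

E_R1 = total_len / n_cycles
pi_est = [occupation[y] / total_len for y in range(2)]
print(E_R1, pi_est)    # roughly 7/4 = 1.75 and (4/7, 3/7)
```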