Basic properties of Markov Matrices

What's up people? A few minutes ago, while I was browsing AoPS, I saw a post about Markov matrices. Being the curious guy that I am, I immediately jumped on Wikipedia to see what they are all about. Apparently, a Markov matrix, also known as a stochastic matrix (you've got to admit though, Markov matrix sounds a little bit more complex), is a matrix used in probability and statistics to describe how a system evolves from one state to another. A Markov matrix usually encodes a Markov chain (this guy surely has many things named after him). Initially, Andrey Markov wanted to use these matrices for linguistic analysis and card shuffling, yet they ended up being used in many other fields. In this article, I want to discuss several interesting properties of these matrices.

First of all, let me define a Markov matrix. A matrix \(A=(a_{ij})_{1\le i,j\le n}\) with nonnegative elements is a right stochastic matrix if the sum of the elements on each row is 1, and a left stochastic matrix if the sum of the elements on each column is 1. A first, rather obvious property would be the following:

1. If \(A\in M_n(\mathbb{R})\) is a right stochastic matrix, then \(A^T\) is a left stochastic matrix.

The proof of this property is simple and straightforward: transposing swaps rows and columns, so the row sums of \(A\) become the column sums of \(A^T\).
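As a quick numerical sanity check, here is a small NumPy sketch; the 3×3 matrix below is just made up for illustration, not taken from anywhere:

```python
import numpy as np

# A made-up 3x3 right stochastic matrix: every row sums to 1.
A = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

print(np.allclose(A.sum(axis=1), 1))    # True: the rows of A sum to 1
print(np.allclose(A.T.sum(axis=0), 1))  # True: the columns of A^T sum to 1
```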

Since a Markov matrix represents how a system goes from one state to another in the Markov chain, it is natural that the product of two stochastic matrices is also a stochastic matrix.

2. If \(A, B\) are right/left stochastic matrices, then \(AB\) is also a right/left stochastic matrix.

We shall prove that the sum of the elements on the first row of \(C=AB\) is 1; one can observe that the other rows work in exactly the same way.

The entry in row 1, column \(j\) is \(c_{1j}=a_{11}b_{1j}+a_{12}b_{2j}+\dots+a_{1n}b_{nj}\). Summing over \(j\) and taking each \(a_{1k}\) as a common factor, we get \(\sum_{j}c_{1j}=\sum_{k}a_{1k}\left(\sum_{j}b_{kj}\right)=\sum_{k}a_{1k}=1\), since every row of \(B\) sums to 1, and so does the first row of \(A\), which is exactly what we wanted.
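This, too, is easy to check numerically; the two matrices below are arbitrary examples chosen for the sketch:

```python
import numpy as np

# Two made-up right stochastic matrices.
A = np.array([[0.5, 0.5],
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],
              [0.3, 0.7]])

C = A @ B
print(np.allclose(C.sum(axis=1), 1))  # True: the rows of AB still sum to 1
```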

Now, we shall get to the eigenvalues of a Markov matrix.

3. Let \(A\) be a right stochastic matrix. Then all its eigenvalues have absolute value smaller than or equal to 1.

Let us consider an eigenvector \(v\) and let \(v_i\) be its element of greatest absolute value. We know that \(Av=av\), where \(a\) is the corresponding eigenvalue of \(A\). Reading off the \(i\)-th row of the product, we get that \(a_{i1}v_1+\dots+a_{in}v_n=av_i\). Taking absolute values, \(|a||v_i|=|a_{i1}v_1+\dots+a_{in}v_n|\le a_{i1}|v_1|+\dots+a_{in}|v_n|\le (a_{i1}+\dots+a_{in})|v_i|=|v_i|\), so \(|a|\le 1\), as required.
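Property 3 can be observed numerically as well. Below, a random right stochastic matrix is built by normalising the rows of a random nonnegative matrix; this construction is my own assumption for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random right stochastic matrix: nonnegative entries, rows normalised to sum to 1.
M = rng.random((5, 5))
M /= M.sum(axis=1, keepdims=True)

eigvals = np.linalg.eigvals(M)
print(np.abs(eigvals).max() <= 1 + 1e-9)  # True: every eigenvalue has |a| <= 1
```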

In fact, I believe that a stronger result can be established.

4. Let \(A\) be both a right and a left stochastic matrix. If \(a\) is an eigenvalue that has an eigenvector whose elements sum to something different from 0, then \(a=1\).

Yet again, we look at \(Av=av\). For every row \(i\), we have that \(a_{i1}v_1+\dots+a_{in}v_n=av_i\). We sum up these relations over all rows; grouping by \(v_j\), each \(v_j\) is multiplied by its column sum \(a_{1j}+\dots+a_{nj}=1\), so we get that \(v_1+v_2+\dots+v_n=a(v_1+v_2+\dots+v_n)\). Since the sum of the elements of the eigenvector is different from 0, we can just divide by it and obtain that \(a=1\).
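Here is a small sketch of property 4 with a hand-made doubly stochastic matrix: every eigenvector whose elements do not sum to 0 should belong to the eigenvalue 1.

```python
import numpy as np

# A made-up doubly stochastic matrix: rows and columns both sum to 1.
D = np.array([
    [0.2, 0.5, 0.3],
    [0.5, 0.2, 0.3],
    [0.3, 0.3, 0.4],
])

eigvals, eigvecs = np.linalg.eig(D)
for lam, v in zip(eigvals, eigvecs.T):
    if abs(v.sum()) > 1e-9:        # eigenvector whose elements sum to something nonzero
        print(np.isclose(lam, 1))  # True: such an eigenvalue must be 1
```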

This property is really interesting, because it tells us that the Markov matrices have one known eigenvalue, if such an eigenvector exists. In fact...

5. 1 is always an eigenvalue of a Markov matrix.

In order to prove this, observe first that if \(A\) is a right stochastic matrix, then \(A[1\ 1\ \dots\ 1]^T=[1\ 1\ \dots\ 1]^T\), since every row sums to 1, so \([1\ 1\ \dots\ 1]^T\) is an eigenvector of \(A\) with eigenvalue 1. If \(A\) is instead a left stochastic matrix, then \(A^T\) is a right stochastic matrix, so \(A^T\) has eigenvalue 1; since \(A\) and \(A^T\) have the same eigenvalues, we conclude that \(A\) has eigenvalue 1 as well.
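Both halves of the argument are easy to test numerically; the random right stochastic matrix below is, again, just an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.random((4, 4))
M /= M.sum(axis=1, keepdims=True)   # right stochastic: rows sum to 1

ones = np.ones(4)
print(np.allclose(M @ ones, ones))                # True: A [1 ... 1]^T = [1 ... 1]^T
print(np.isclose(np.linalg.eigvals(M), 1).any())  # True: 1 is an eigenvalue
```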

The property that \(A\) and \(A^T\) have the same eigenvalues is well known, yet I believe that a short explanation is in order. Since a matrix and its transpose have the same determinant, \(\det(A^T-\lambda I)=\det((A-\lambda I)^T)=\det(A-\lambda I)\), so \(A\) and \(A^T\) have the same characteristic polynomial. Thus, we know that not only does \(A^T\) have the same eigenvalues as \(A\), but the eigenvalues also have the same multiplicities.
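NumPy can compute the coefficients of a matrix's characteristic polynomial via np.poly, which makes this easy to confirm on an example; the 2×2 matrix below is an arbitrary choice:

```python
import numpy as np

A = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# np.poly of a square matrix returns its characteristic polynomial's coefficients.
print(np.allclose(np.poly(A), np.poly(A.T)))  # True: same characteristic polynomial
```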

This was but an introduction to the study of Markov matrices. I believe that the properties I described are good enough for a first impression of this topic. I don't know too much probability (you may have seen that I only tackled the STEP 2 probability problems, because I don't know advanced probability), but when I find out more, I'll be sure to discover more properties of these matrices. Tune in next time for probably something regarding the IMO or the IMC (since the IMC starts this week), or something completely random!

Cris.



