
Matrix rank


The rank of a matrix is an important subject that comes up frequently when studying and using matrices. Looking at the name "rank", you would probably think this matrix property defines a certain hierarchy, and that is correct.

In this topic, you will find out what the rank of a matrix is and discover how rank can be used to work out how many solutions there are to a linear equation system.

Definitions

Generally speaking, the rank of a matrix is the number of linearly independent rows that it contains, and it is denoted as rank A.

Linear independence is a subject that you have already studied. A set of rows is linearly independent if it is impossible to express any of them linearly via the others. The same definition applies to matrix columns. So, you can count the rank of a matrix by rows (known as row rank) or columns (column rank).

Let's imagine a 3×3 matrix as a 3×3 table with colored rows/columns. A row is colored red if it is linearly dependent on some other rows of the matrix, and green otherwise. Then do the same with the columns, without worrying about the rows. In such a simple representation, the row rank of the matrix equals the number of green rows, and the column rank equals the number of green columns.

Surprisingly, the row and column ranks are always equal to each other. Why is that so? Wait until you learn how to determine the rank, and everything will become clear.
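To make this concrete, here is a quick check using NumPy's `matrix_rank` (which computes the rank numerically via SVD). The matrix below is a hypothetical example chosen so that its third row is the sum of the first two:

```python
import numpy as np

# The third row equals the first row plus the second row,
# so only two rows are linearly independent.
M = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])

rank = np.linalg.matrix_rank(M)
print(rank)  # 2

# Transposing swaps rows and columns, but the result is the same:
# row rank equals column rank.
rank_T = np.linalg.matrix_rank(M.T)
print(rank_T)  # 2
```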

Illustration of linearly independent rows and columns

And now, we're ready to discuss properties of the matrix rank.

  • If a matrix only contains zeros, it has a rank of 0. That's quite obvious, since any row or column can be expressed linearly via the others, because 0·a = 0 for any real number a.

  • If all rows (or columns) of a square n×n matrix are linearly independent, it will have a rank equal to its size, n. If you once more imagine a 3×3 matrix as the 3×3 table, you'll get a table with only green rows and columns, since all of them are linearly independent.

    Matrix with independent rows and columns

  • Since some of the rows or columns of a square n×n matrix can be linearly dependent, the rank of a square matrix can be any integer between 0 and n. Here are some illustrations with different ranks of a 3×3 matrix.

    Ranks of a 3×3 matrix

  • And what about rectangular m×n matrices? You can define the rank similarly for them. Thus, the rank of a rectangular matrix is an integer from 0 to min(m, n), where m and n are the numbers of rows and columns it contains.
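The properties above can be verified numerically; the matrices below are arbitrary examples picked to hit each bound:

```python
import numpy as np

# An all-zero matrix has rank 0.
zero_rank = np.linalg.matrix_rank(np.zeros((3, 3)))

# The identity matrix has all rows independent, so its rank
# equals its size n.
full_rank = np.linalg.matrix_rank(np.eye(3))

# For a rectangular m x n matrix the rank is at most min(m, n):
# here 2 x 4, so the rank cannot exceed 2.
R = np.array([[1, 0, 2, 4],
              [0, 1, 3, 5]])
rect_rank = np.linalg.matrix_rank(R)

print(zero_rank, full_rank, rect_rank)  # 0 3 2
```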

Let's consider an example with a specific matrix, as shown below:

A = \begin{pmatrix} 1 & 2 & 3 \\ 0 & 4 & 6 \end{pmatrix}

The row rank of this matrix is equal to 2. Why is it so? The matrix has only two rows, and they are linearly independent. Why? Neither row can be expressed via the other, because the first element of the second row is 0 while the first element of the first row is 1, so neither row is a multiple of the other.
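A quick numerical check of this example confirms the reasoning, and also shows that the column rank agrees (here the third column happens to be 1.5 times the second):

```python
import numpy as np

# The 2x3 matrix from the example above.
A = np.array([[1, 2, 3],
              [0, 4, 6]])

# Both rows are independent: neither is a multiple of the other.
rank_A = np.linalg.matrix_rank(A)
print(rank_A)  # 2

# Column view: the third column is 1.5 * the second column,
# so only two of the three columns are independent, and the
# column rank is 2 as well.
assert np.allclose(A[:, 2], 1.5 * A[:, 1])
```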

Full rank

The rank of a matrix is called full if it is equal to the maximum possible rank. For a square n×n matrix, full rank is equal to n, and for a rectangular m×n matrix it is equal to min(m, n), as you've seen in the previous paragraph. Full-rank matrices are crucial since they have useful properties, which makes them valuable in various applications of linear algebra and related fields: finding solutions of linear systems, inverting matrices, finding a basis, etc.

And what about matrices whose rank is not full? These are called rank-deficient matrices. Let's look at them from a geometric point of view.

Imagine you have a collection of arrows in a two-dimensional plane. Each arrow represents a vector, showing both its direction and magnitude. Now, let's say you have a set of arrows that are not pointing in independent directions: some arrows may point in the same direction or lie on the same line. In terms of rank-deficient matrices, these arrows represent the columns (or rows) of the matrix. The matrix is rank-deficient if the arrows are not pointing in independent directions, which means that some arrows can be expressed as a linear combination of the others.

Geometrically, the rank deficiency of a matrix corresponds to the arrows lying on lower-dimensional subspaces instead of covering the entire plane. It's as if some of the arrows are collapsed or squeezed onto the same line or plane. This collapse or squeezing is due to the linear dependencies among the arrows. If two arrows lie on the same line, it means they are linearly dependent, and one can be obtained by scaling the other. If three arrows lie on the same plane, they are linearly dependent, and one can be expressed as a combination of the others.

Vector on a plane
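The collapse onto a line can be seen directly in code; the two vectors below are a made-up example where one is a scaled copy of the other:

```python
import numpy as np

# Two "arrows" that lie on the same line: the second is just
# the first scaled by 3, so they are linearly dependent.
v1 = np.array([1.0, 2.0])
v2 = 3 * v1

# Stacking them as columns gives a 2x2 matrix whose columns span
# only a line, not the whole plane: the matrix is rank-deficient.
M = np.column_stack([v1, v2])
rank = np.linalg.matrix_rank(M)
print(rank)  # 1, not the full rank 2
```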

Now you know why the full rank is so important. But how can you determine whether the matrix is rank-deficient or not? Let's learn how to do it now!

If the determinant of a square matrix is equal to 0, the matrix is rank-deficient; if the determinant is nonzero, the matrix has full rank.

However, the determinant doesn't tell you the exact rank of a rank-deficient matrix, and it isn't defined for rectangular matrices at all. What to do in those cases? You'll see in the following topic, where you'll also learn general methods for determining the rank of a matrix, since not only the full rank is important for linear algebra and its applications.
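Here is the determinant test in action on two small made-up matrices, one singular and one not:

```python
import numpy as np

# A singular matrix: the second row is twice the first,
# so the determinant is 0 and the matrix is rank-deficient.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
det_S = np.linalg.det(S)
rank_S = np.linalg.matrix_rank(S)  # 1, less than the size 2

# A matrix with a nonzero determinant has full rank.
F = np.array([[1.0, 2.0],
              [3.0, 4.0]])
det_F = np.linalg.det(F)           # 1*4 - 2*3 = -2
rank_F = np.linalg.matrix_rank(F)  # 2, the full rank
```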

How is the rank used?

The rank of a matrix mainly helps to find out how many solutions a system of linear equations has. Let's consider a system of linear equations:

\begin{equation*} \left\{ \begin{array}{ll} a_1x+b_1y+c_1z=1 \\ a_2x+b_2y+c_2z=0 \\ a_3x+b_3y+c_3z=-3 \end{array} \right. \end{equation*}

Here a_i, b_i, c_i,\ i = 1, 2, 3 are real numbers.

Let's write down the coefficients as a matrix A:

A = \begin{pmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{pmatrix}

When you solve systems of linear equations, you should also write down the extended matrix B of this system. In order to do it, just add the column of constants \begin{pmatrix} 1 \\ 0 \\ -3 \end{pmatrix} to the matrix A:

B = \begin{pmatrix} a_1 & b_1 & c_1 & 1 \\ a_2 & b_2 & c_2 & 0 \\ a_3 & b_3 & c_3 & -3 \end{pmatrix}

And now the best part: based on this single matrix and the concept of matrix rank, you can determine the number of solutions the system of equations has. Yes, that's right: that's all you need in order to gain such valuable information, and this is how it works:

  • If the rank of the extended matrix is higher than the rank of the matrix of coefficients, it means that the system doesn't have any solutions.

  • If the ranks are equal, then the system has at least one solution.

  • And if these equal ranks are also equal to the number of unknowns, then the system has exactly one solution. Otherwise, the general solution has k free parameters, where k is the difference between the number of unknowns and the common rank.
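The three rules above can be sketched as a small helper; `count_solutions` is a hypothetical function name, and the example systems are made up to hit each case:

```python
import numpy as np

def count_solutions(A, b):
    """Classify the system A x = b by comparing rank(A) with the
    rank of the extended matrix [A | b]."""
    rank_A = np.linalg.matrix_rank(A)
    rank_B = np.linalg.matrix_rank(np.column_stack([A, b]))
    n_unknowns = A.shape[1]
    if rank_B > rank_A:
        return "no solutions"
    if rank_A == n_unknowns:
        return "one solution"
    k = n_unknowns - rank_A
    return f"infinitely many solutions, {k} free parameter(s)"

# Inconsistent system: x + y = 1 and x + y = 2.
A = np.array([[1.0, 1.0], [1.0, 1.0]])
print(count_solutions(A, np.array([1.0, 2.0])))   # no solutions

# Unique solution: x = 1, y = 2.
print(count_solutions(np.eye(2), np.array([1.0, 2.0])))  # one solution

# Consistent but underdetermined: x + y = 1 twice.
print(count_solutions(A, np.array([1.0, 1.0])))
# infinitely many solutions, 1 free parameter(s)
```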

Conclusion

Here are the most essential facts you've learned in this topic:

  • The rank of a matrix is the number of linearly independent rows (or columns) that it contains

  • A matrix is full-rank if its rank is equal to the maximum possible rank, and it is a rank-deficient matrix otherwise

  • The determinant of a rank-deficient square matrix is equal to 0

  • The rank of a matrix is used in solving systems of linear equations
