You might think that invertible matrices are the best around. After all, they are easy to manipulate and have many useful properties. But let's go a step further and build even better matrices that also take into account the geometry of space.
In this topic, you'll start studying orthogonal matrices, which are well known for their geometric behavior and their essential role in various matrix decompositions. You will learn about their key properties and geometric interpretation. Get ready for a significant step into the more advanced concepts of linear algebra!
Nice matrices
Bases are very important because they reconstruct the entire space from the minimum necessary information. But orthonormal bases are even better: their vectors are unit vectors orthogonal to each other, which allows any vector to be described in terms of its dot product with each element of the basis.
Since invertible matrices, whose columns form a basis, are very useful and easy to handle, it is natural to expect that matrices whose columns form an orthonormal basis have even better properties. These special matrices are known as orthogonal matrices.
A square matrix $Q$ of size $n \times n$ is called an orthogonal matrix if its columns form an orthonormal basis of $\mathbb{R}^n$.
The first thing to notice about any orthogonal matrix is that, since its columns form a basis, it's invertible. As the canonical basis forms the columns of the identity matrix, the identity is orthogonal. But in general, constructing an orthogonal matrix is simple. For example, since $\left\{ \tfrac{1}{\sqrt{2}}(1, 1),\ \tfrac{1}{\sqrt{2}}(1, -1) \right\}$ is an orthonormal basis of $\mathbb{R}^2$, the matrix formed by putting these vectors as its columns is orthogonal:

$$Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$$
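If you want to double-check this with a computer, here is a minimal NumPy sketch (NumPy is just one convenient choice for the demonstration):

```python
import numpy as np

# The columns of Q are the orthonormal basis vectors from the example above.
Q = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

# Each column has length 1...
print(np.linalg.norm(Q[:, 0]), np.linalg.norm(Q[:, 1]))  # 1.0 1.0

# ...and the two columns are orthogonal to each other.
print(np.dot(Q[:, 0], Q[:, 1]))  # 0.0
```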
However, to determine if a given matrix is orthogonal, you must check three things about its columns:
They form a basis
They're orthogonal to each other
Their size is $1$.
Although checking all of this directly isn't easy, in the next section you'll find a test that makes the task extremely simple.
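In the meantime, you can always verify the three conditions numerically. Below is a rough NumPy sketch; the helper name `is_orthonormal_columns` is just an illustrative choice:

```python
import numpy as np

def is_orthonormal_columns(A, tol=1e-10):
    """Check the three conditions on the columns of the square matrix A."""
    n = A.shape[0]
    # 1. The columns form a basis: n linearly independent columns.
    if A.shape != (n, n) or np.linalg.matrix_rank(A) < n:
        return False
    for i in range(n):
        # 3. Each column has length 1.
        if abs(np.linalg.norm(A[:, i]) - 1) > tol:
            return False
        # 2. Distinct columns are orthogonal to each other.
        for j in range(i + 1, n):
            if abs(np.dot(A[:, i], A[:, j])) > tol:
                return False
    return True

Q = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(is_orthonormal_columns(Q))                                   # True
print(is_orthonormal_columns(np.array([[1.0, 2.0], [0.0, 1.0]])))  # False
```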
You already know that orthonormal bases are the best kind of basis, so it isn't a surprise that the matrices that have them as columns are undoubtedly the best ones. Not convinced yet? Keep reading!
The main properties
But what advantages do these matrices have? The following result condenses the most important ones, and you'll see them one by one.
The following conditions are equivalent to $Q$ being orthogonal:
$Q^T Q = Q Q^T = I$, which means that $Q^{-1} = Q^T$
$\langle Qx, Qy \rangle = \langle x, y \rangle$ for every $x, y \in \mathbb{R}^n$
$\|Qx\| = \|x\|$ for any $x \in \mathbb{R}^n$
Surely the first condition caught you off guard, right? Orthogonal matrices are invertible matrices whose inverse you know without any effort: it is simply their transpose. By now, you know that calculating the inverse of a matrix is generally not easy, so avoiding that cumbersome work is a great help.
But this isn't all. Thanks to this property, it's really easy to check whether a matrix is orthogonal: just multiply it by its transpose and see if the result is the identity. As an easy example, look at the following matrix:

$$Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}$$

You can easily compute its transpose, so:

$$Q^T Q = \frac{1}{2} \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} = \frac{1}{2} \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = I$$

Similarly, $Q Q^T = I$. Thus $Q^T Q = Q Q^T = I$, which means that $Q$ is orthogonal.
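In code this test is a one-liner. A minimal NumPy sketch, using the same matrix $Q$ as above:

```python
import numpy as np

Q = np.array([[1, -1],
              [1,  1]]) / np.sqrt(2)  # the example matrix from above

# Q is orthogonal exactly when Q^T Q (and Q Q^T) is the identity.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
print(np.allclose(Q @ Q.T, np.eye(2)))  # True
```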
The other two properties have a strong geometric meaning, so let's take a look.
The geometry of isometries
The linear operator $T(x) = Qx$ associated with an orthogonal matrix $Q$ has a comfortably simple behavior. You can rewrite the last two properties of these matrices as $\langle T(x), T(y) \rangle = \langle x, y \rangle$ and $\|T(x)\| = \|x\|$ for any $x, y \in \mathbb{R}^n$.
Then the first one means that $T$ preserves the angle between vectors, and the second one says that $T$ preserves the length of vectors. So you can think of $T$ as a rotation, a reflection, or a composition of rotations and reflections:
Linear operators with all of these features (which are equivalent to one another) are called isometries. Because they have a rigid behavior, they don't distort the space in any strange way, like stretching or shrinking. Compare their behavior with that of a generic square matrix $A$:
In particular, the first property means that orthogonal vectors are sent to orthogonal vectors. If you remember, invertible operators convert bases into bases. Well, you've just shown that isometries transform orthonormal bases into orthonormal bases.
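You can watch this rigidity numerically. The sketch below contrasts a rotation matrix with an invertible but non-orthogonal matrix; the matrices and the small `angle` helper are illustrative choices:

```python
import numpy as np

def angle(u, v):
    """Angle between two vectors, in radians."""
    return np.arccos(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthogonal: rotation by 60 degrees
A = np.array([[2.0, 1.0],
              [0.0, 1.0]])                       # invertible, but not orthogonal

x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])

print(np.linalg.norm(x), np.linalg.norm(Q @ x))  # 5.0 5.0 -> length preserved
print(np.linalg.norm(A @ x))                     # ~10.77  -> length distorted
print(angle(x, y), angle(Q @ x, Q @ y))          # equal   -> angle preserved
print(angle(A @ x, A @ y))                       # different -> angle distorted
```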
Importance
With everything you've learned so far, you can compare orthogonal matrices with invertible ones:
With an orthogonal matrix, not only do you know that the inverse exists, but you can also calculate it with ease.
When you transform vectors with an invertible matrix, it isn't easy to relate the results to the original vectors. However, if you use an orthogonal matrix, you know that the lengths of the resulting vectors, and the angles between them, are exactly the same as the originals.
While the geometric behavior of an invertible matrix can be difficult to describe, it turns out that orthogonal matrices generate rigid transformations in space.
The very simple and controlled behavior of orthogonal matrices makes them some of the easiest non-trivial matrices to manipulate. For this reason, they play a relevant role in matrix decompositions, which are ways of writing complicated matrices as products of much simpler ones.
Some of the deeper decompositions are the spectral decomposition and the polar decomposition. You can use them to compute the square root of a matrix or to simplify the equations of conic sections. However, the decomposition with the most applications is the famous singular value decomposition, where orthogonal matrices have a leading role.
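As a small preview, NumPy's `np.linalg.svd` makes this structure visible: it factors a matrix as $U \Sigma V^T$, and the factors $U$ and $V$ come out orthogonal. A minimal sketch with an arbitrary example matrix:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A)  # A = U @ diag(s) @ Vt (with suitable padding)

# U and V are orthogonal: transpose times matrix gives the identity.
print(np.allclose(U.T @ U, np.eye(U.shape[0])))    # True
print(np.allclose(Vt @ Vt.T, np.eye(Vt.shape[0]))) # True
```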
Conclusion
Let's review everything you just learned. For this purpose, consider a matrix $Q$ of size $n \times n$.
$Q$ is orthogonal if its columns form an orthonormal basis.
For any orthogonal matrix, its inverse is easy to compute: $Q^{-1} = Q^T$.
Orthogonal matrices preserve angles and lengths: $\langle Qx, Qy \rangle = \langle x, y \rangle$ and $\|Qx\| = \|x\|$ for every $x, y \in \mathbb{R}^n$.
The linear operators associated with orthogonal matrices have a rigid behavior and are known as isometries.
Orthogonal matrices are very important in matrix decomposition, mainly in the singular value decomposition.