Just as vectors are essential in understanding the movement and relationships within a space, linear transformations provide a framework to explore how one vector space can be transformed into another.
You're already familiar with the basics of linear transformations, but now you'll learn to manipulate them just as fluently as vectors. You'll also start using a powerful operation that only functions have and that behaves like the product of matrices.
Linear transformations are vectors, too!
Vectors can be quite abstract objects, don't you think? But so far, you've been able to work with them fluidly and get a lot of useful results thanks to their structure: addition and the scalar product have given you all the flexibility you need. What's more, we often don't think about vectors individually but rather exploit these operations. It doesn't matter whether you are adding vectors in the plane or matrices; in the end, the operation behaves in the same way, and you get similar results.
Well, it turns out that among linear transformations you can also define a sum and a product by a scalar. But remember, the sum of two vectors and the product of a vector by a scalar must again be vectors, so the result of combining transformations must be another transformation.
Let's take two vector spaces $V$ and $W$. From now on, denote the set of all linear transformations from $V$ to $W$ as $\mathcal{L}(V, W)$. If $T, S \in \mathcal{L}(V, W)$, then define their sum, $T + S$, as the transformation that assigns to each vector $v \in V$ the vector $T(v) + S(v)$. Similarly, for any scalar $\lambda$, you can define a new transformation $\lambda T$ whose value at $v$ is simply $\lambda T(v)$. In summary:

$$(T + S)(v) = T(v) + S(v), \qquad (\lambda T)(v) = \lambda\, T(v).$$
But here comes the litmus test: do this sum and this scalar product make $\mathcal{L}(V, W)$ a vector space? The answer is yes! And the proof is simple; the only fine point is that the set needs to have a zero vector. Well, this role is fulfilled by the constant transformation $0$ given by $0(v) = 0$ for all $v \in V$.
In particular, the set of linear operators on $V$, denoted by $\mathcal{L}(V, V)$, is simply written as $\mathcal{L}(V)$.
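To make these definitions concrete, here is a minimal Python sketch (not taken from the text above, with helper names of my own choosing): linear maps on the plane are modeled as plain functions on pairs, and the sum and scalar multiple are built pointwise, exactly as defined.

```python
# Minimal sketch: linear maps on R^2 as Python functions on pairs,
# with the sum and the scalar multiple defined pointwise.

def add_maps(T, S):
    """(T + S)(v) = T(v) + S(v), computed coordinate by coordinate."""
    return lambda v: tuple(t + s for t, s in zip(T(v), S(v)))

def scale_map(c, T):
    """(c T)(v) = c * T(v)."""
    return lambda v: tuple(c * t for t in T(v))

# The zero transformation: the "zero vector" of L(V, W).
zero = lambda v: (0.0, 0.0)

# Two sample linear maps on the plane.
rotate90 = lambda v: (-v[1], v[0])        # rotation by 90 degrees
shear    = lambda v: (v[0] + v[1], v[1])  # horizontal shear

v = (2.0, 3.0)
print(add_maps(rotate90, shear)(v))  # (2.0, 5.0)
print(scale_map(2.0, rotate90)(v))   # (-6.0, 4.0)
print(add_maps(rotate90, zero)(v))   # adding zero changes nothing: (-3.0, 2.0)
```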
Composing transformations
Functions are like processes that are applied to objects in a set, so they can be chained together to form more complex processes that connect different sets. Since linear transformations are functions, of course you can compose them; you only need one more space, say $U$. If $T \in \mathcal{L}(U, V)$ and $S \in \mathcal{L}(V, W)$, then their composition is:

$$(S \circ T)(u) = S(T(u)) \quad \text{for all } u \in U.$$
That is, first $T$ transforms $U$ into $V$, and then $S$ transforms the latter into $W$.
But what really matters is whether the composition preserves linearity, that is, whether $S \circ T$ is linear. Fortunately, the answer is also affirmative and will have invaluable consequences later. So $S \circ T \in \mathcal{L}(U, W)$.
Proof that $S \circ T$ is linear: for any $u, u' \in U$ and any scalar $\lambda$,

$$(S \circ T)(u + \lambda u') = S\big(T(u) + \lambda T(u')\big) = S(T(u)) + \lambda\, S(T(u')) = (S \circ T)(u) + \lambda (S \circ T)(u'),$$

using first the linearity of $T$ and then that of $S$.
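If you want to see the composition and this linearity check in action, here is a small Python sketch under the same illustrative conventions as before (the maps and the helper `compose` are my own choices): it chains two maps on the plane and verifies the linearity identity on sample vectors.

```python
# Sketch: composition of linear maps as ordinary function composition,
# plus a numerical check that S ∘ T is linear.

def compose(S, T):
    """(S ∘ T)(u) = S(T(u)): first apply T, then S."""
    return lambda u: S(T(u))

T = lambda u: (u[0] + u[1], u[1])  # a shear on the plane
S = lambda v: (-v[1], v[0])        # a rotation by 90 degrees

ST = compose(S, T)

# Check (S ∘ T)(u + c u') == (S ∘ T)(u) + c (S ∘ T)(u') on sample data.
u, u_prime, c = (1.0, 2.0), (3.0, -1.0), 4.0
left  = ST((u[0] + c * u_prime[0], u[1] + c * u_prime[1]))
right = tuple(a + c * b for a, b in zip(ST(u), ST(u_prime)))
print(left, right, left == right)  # both sides agree: (2.0, 11.0)
```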
This new operation is quite peculiar and unique; it is extra structure, independent of the vector space operations. In reality, it is similar to the product of matrices. Actually, you'll find out later that this relationship is much more intimate than it may seem, and incidentally it has similar properties (whenever the compositions are defined):

$$R \circ (S \circ T) = (R \circ S) \circ T, \qquad R \circ (S + T) = R \circ S + R \circ T, \qquad (R + S) \circ T = R \circ T + S \circ T.$$
Looking carefully at these properties, it is impossible not to notice their resemblance to multiplication. This is why the composition is often also called a product. But this is only an appearance, because the composition is not commutative: in general, the expression $T \circ S$ doesn't even make sense, since the spaces need not match. And even when it does make sense, it is rare for commutativity to hold, which is another similarity with the product of matrices.
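As a quick illustration of this resemblance (a sketch of my own, using NumPy and two matrices chosen only for the example), the following checks numerically that composing the maps induced by two matrices corresponds to multiplying the matrices, and that the order matters.

```python
# Illustrative sketch: composition of matrix-induced maps behaves like
# the matrix product, and it is not commutative in general.
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # a shear
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # a rotation by 90 degrees

T = lambda v: A @ v          # the map induced by A
S = lambda v: B @ v          # the map induced by B

v = np.array([2.0, 3.0])
print(np.allclose(S(T(v)), (B @ A) @ v))  # True: S ∘ T matches the product B A
print(np.allclose(A @ B, B @ A))          # False: the product does not commute
```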
Taking powers
Let's go further with the product. As with the multiplication of numbers, you can define the powers of an operator $T \in \mathcal{L}(V)$ through composition. For example, the square of $T$ is the product of $T$ with itself, that is, the composition of $T$ with $T$:

$$T^2 = T \circ T.$$
Larger powers are defined in a similar way, but in general, you can write them compactly as $T^n = \underbrace{T \circ T \circ \cdots \circ T}_{n \text{ times}}$. Again in analogy with numbers, you can define the powers $T^0$ and $T^1$ as $T^0 = I$ (the identity function) and $T^1 = T$. Thinking of the composition as the successive application of the same transformation, it is evident that the powers comply with the elementary properties:

$$T^m \circ T^n = T^{m+n}, \qquad (T^m)^n = T^{mn}.$$
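Here is a short sketch of powers as repeated composition (the helper `power` and the sample map are my own, chosen only to illustrate); it checks $T^m \circ T^n = T^{m+n}$ on a sample vector.

```python
# Sketch: the n-th power of an operator as n-fold composition.

def compose(S, T):
    return lambda u: S(T(u))

def power(T, n):
    """T^n = T ∘ T ∘ ... ∘ T (n times); T^0 is the identity."""
    result = lambda v: v           # T^0 = I
    for _ in range(n):
        result = compose(T, result)
    return result

T = lambda v: (v[0] + v[1], v[1])  # a shear: T^k(x, y) = (x + k*y, y)

v = (1.0, 1.0)
m, n = 2, 3
print(power(T, m)(power(T, n)(v))) # T^m applied after T^n ...
print(power(T, m + n)(v))          # ... equals T^(m+n): both give (6.0, 1.0)
```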
But functions are not numbers, and the composition is not a true product, so it has other properties. Sometimes, after applying an operator once, applying it again produces no further effect, that is, $T^2 = T$. These operators are called idempotent, and the simplest example is the projection onto a line.
Take a vector in the plane: after projecting it onto a line, no matter how many more times you project it again, the result doesn't change.
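The following sketch makes that picture concrete, projecting onto the line spanned by a direction chosen just for the example and checking idempotence numerically.

```python
# Sketch: orthogonal projection onto the line spanned by u is idempotent.

def project_onto_line(u):
    """Return the map v -> ((v·u)/(u·u)) u, the orthogonal projection onto span{u}."""
    uu = u[0] * u[0] + u[1] * u[1]
    def P(v):
        c = (v[0] * u[0] + v[1] * u[1]) / uu
        return (c * u[0], c * u[1])
    return P

P = project_onto_line((1.0, 2.0))
v = (3.0, 1.0)
print(P(v), P(P(v)))  # projecting again changes nothing: P(P(v)) == P(v)
```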
The geometry of transformations
Since linear transformations are vectors, let's start treating them as such with some examples. Let's go to the plane and take a vector $v = (x, y)$; you are going to investigate the geometry of the projections onto the coordinate axes, given by:

$$P_x(x, y) = (x, 0), \qquad P_y(x, y) = (0, y).$$
Let's start combining the projections. Just by adding them, you have that $(P_x + P_y)(x, y) = (x, 0) + (0, y) = (x, y)$, and so $P_x + P_y = I$.
If you now scale $P_x$ by a scalar, say $2$, then you get that $(2P_x)(x, y) = (2x, 0)$, which means that the result is simply a stretch of the projection.
A curious combination is $P_x - P_y$. Note that $(P_x - P_y)(x, y) = (x, -y)$, and therefore $P_x - P_y$ is the transformation that reflects each vector about the horizontal axis. What geometric meaning do you imagine $P_y - P_x$ has?
Since $P_x$ and $P_y$ are projections, they are idempotent. For example, $P_x^2(x, y) = P_x(x, 0) = (x, 0) = P_x(x, y)$, so any power of $P_x$ is equal to $P_x$; you can verify that the same thing happens with $P_y$.
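This Python sketch mirrors the walkthrough numerically (writing the axis projections as `px` and `py`): the projections add up to the identity, scaling stretches, their difference reflects, and each one is idempotent.

```python
# Sketch of the walkthrough with the projections onto the coordinate axes.

px = lambda v: (v[0], 0.0)  # projection onto the horizontal axis
py = lambda v: (0.0, v[1])  # projection onto the vertical axis

v = (3.0, 4.0)

# Sum: (P_x + P_y)(v) = v, so P_x + P_y is the identity.
print((px(v)[0] + py(v)[0], px(v)[1] + py(v)[1]))  # (3.0, 4.0)

# Scaling: (2 P_x)(v) stretches the projection.
print((2 * px(v)[0], 2 * px(v)[1]))                # (6.0, 0.0)

# Difference: (P_x - P_y)(v) reflects about the horizontal axis.
print((px(v)[0] - py(v)[0], px(v)[1] - py(v)[1]))  # (3.0, -4.0)

# Idempotence: projecting twice is the same as projecting once.
print(px(px(v)) == px(v), py(py(v)) == py(v))      # True True
```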
It's time to compose the transformations! On one hand, $(P_x \circ P_y)(x, y) = P_x(0, y) = (0, 0)$, and on the other hand, $(P_y \circ P_x)(x, y) = P_y(x, 0) = (0, 0)$. This means that $P_x \circ P_y = P_y \circ P_x = 0$, the zero transformation.
Even more surprising, you can easily calculate their powers, since $(P_x \circ P_y)^n = (P_y \circ P_x)^n = 0$ for every $n \geq 1$.
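A final short sketch (same conventions as above) checks that the two compositions collapse everything to the origin, so all their powers do too.

```python
# Sketch: composing the two axis projections gives the zero transformation.

px = lambda v: (v[0], 0.0)
py = lambda v: (0.0, v[1])

v = (3.0, 4.0)
print(px(py(v)), py(px(v)))  # both are (0.0, 0.0): P_x ∘ P_y = P_y ∘ P_x = 0

# Hence every power of the composition is also the zero transformation.
print(px(py(px(py(v)))))     # still (0.0, 0.0)
```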
Conclusion
Let $V$ and $W$ be two vector spaces.
- $\mathcal{L}(V, W)$ is the set of linear transformations from $V$ into $W$
- $\mathcal{L}(V) = \mathcal{L}(V, V)$ is the set of linear operators on $V$
- $\mathcal{L}(V, W)$ is a vector space with the operations: $(T + S)(v) = T(v) + S(v)$ and $(\lambda T)(v) = \lambda T(v)$
- If $U$ is another vector space, $T \in \mathcal{L}(U, V)$, and $S \in \mathcal{L}(V, W)$, then their composition $S \circ T$ is in $\mathcal{L}(U, W)$ and is given by: $(S \circ T)(u) = S(T(u))$
- Usually $S \circ T \neq T \circ S$
- If $T \in \mathcal{L}(V)$, then its powers are: $T^n = T \circ T \circ \cdots \circ T$ ($n$ times), with $T^0 = I$ and $T^1 = T$
- If $T \in \mathcal{L}(V)$, then it's idempotent if $T^2 = T$