Vector spaces by themselves have a lot of useful properties. But it's not until we connect them through special functions that we discover their deepest qualities and find the strongest results.
In this topic, we'll get to know these functions, their most important properties, and some examples. Later we will investigate more elaborate results and their relationship with matrices. As in previous topics, we'll denote by $V$ and $W$ two vector spaces.
Preserving the structure
A function connects sets, so it acts as a bridge between them. But vector spaces are not mere sets; they have an algebraic structure.
This suggests that we should focus primarily on the functions that preserve such structure, i.e. those that get along well with the sum of vectors and the product of a vector by a scalar. These functions are called linear transformations (usually denoted by a more dramatic $T$) and are the core of linear algebra.
From now on in this course, practically all the concepts and results will involve linear transformations in one way or another.
- Central notions such as matrices, eigenvectors, and orthogonal projections are closely related to these functions.
- We can also use them to calculate the dimensions of spaces, interpret many geometric transformations, change coordinates, exploit the bases of spaces, simplify many calculations, approximate more complicated functions, fully understand systems of linear equations, and much more.
We are now ready to introduce the definition of a linear transformation and explore its immediate properties.
Definition and basic properties
A function $T: V \to W$ is a linear transformation if for all $u, v \in V$ and for every scalar $c$:

1. $T(u + v) = T(u) + T(v)$
2. $T(cv) = c\,T(v)$
The first point means that if you have calculated the value of $T$ at both $u$ and $v$, then the calculation of $T$ at the sum $u + v$ reduces to simply the sum of those values. In the same way, the second point establishes that once you have transformed $v$ under $T$, you automatically know the transformation of any scalar multiple of $v$.
The two properties merge when we analyze how linear transformations interact with linear combinations:
$$T(c_1 v_1 + c_2 v_2 + \cdots + c_k v_k) = c_1 T(v_1) + c_2 T(v_2) + \cdots + c_k T(v_k).$$
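For instance, applied to a specific combination (with coefficients picked here just as an illustration), for any linear $T$ and any vectors $u, v, w \in V$:
$$T(3u - 2v + w) = 3T(u) - 2T(v) + T(w).$$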
It's time to meet a fundamental property of linear transformations: they map the zero vector $0_V$ of $V$ to the zero vector $0_W$ of $W$, that is, $T(0_V) = 0_W$. You can verify this quickly: since $T(0_V) = T(0_V + 0_V) = T(0_V) + T(0_V)$, subtracting $T(0_V)$ from both sides of the equation we get that $T(0_V) = 0_W$.
This gives us a simple test to determine if a transformation is not linear:
If $T(0_V) \neq 0_W$, then $T$ is not linear.
The most important linear transformations are those that map a space to itself. Their properties are so special that they deserve their own name: if $T: V \to V$ is linear, then we'll call it a linear operator.
Examples of linear transformations
Undoubtedly, the simplest linear transformation is the zero constant function $T_0: V \to W$, which is given by $T_0(v) = 0_W$ for every $v \in V$. It is immediate that it is linear since $T_0(u + v) = 0_W = 0_W + 0_W = T_0(u) + T_0(v)$ and $T_0(cv) = 0_W = c\,0_W = c\,T_0(v)$.
Another well-known function is the identity function $\operatorname{id}: V \to V$, which is simply given by $\operatorname{id}(v) = v$. It's also linear, and you can quickly check it. It is a linear operator.
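Spelled out, that quick check fits in one line:
$$\operatorname{id}(u + v) = u + v = \operatorname{id}(u) + \operatorname{id}(v), \qquad \operatorname{id}(cv) = cv = c\,\operatorname{id}(v).$$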
Let us consider a slightly more elaborate function $T: \mathbb{R}^2 \to \mathbb{R}^2$. Let's prove that this function is linear:
Proof that $T$ is linear
Let $u, v \in \mathbb{R}^2$ and $c \in \mathbb{R}$; plugging them into the formula for $T$ and expanding, one checks directly that $T(u + v) = T(u) + T(v)$ and that $T(cu) = c\,T(u)$.
In future topics, you will learn to analyze linear transformations geometrically, but for now, just note that this operator collapses the entire plane onto the line generated by a single vector.
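A representative operator of this kind, taken here just as an illustration, is $T(x, y) = (x + y,\ x + y)$, whose image is precisely the line generated by $(1, 1)$. For it, the two defining properties can be checked directly:
$$T\big((x_1, y_1) + (x_2, y_2)\big) = (x_1 + y_1 + x_2 + y_2,\ x_1 + y_1 + x_2 + y_2) = T(x_1, y_1) + T(x_2, y_2),$$
$$T\big(c(x, y)\big) = (cx + cy,\ cx + cy) = c\,T(x, y).$$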
Now let's look at a really important example. Let's take an $m \times n$ matrix $A$. We can define a function that sends each vector $x$ in $\mathbb{R}^n$ to the product $Ax$, which is a vector in $\mathbb{R}^m$. This function is the transformation associated with $A$ and is denoted as:
$$T_A: \mathbb{R}^n \to \mathbb{R}^m, \qquad T_A(x) = Ax.$$
From the properties of the matrix product, it is clear that $T_A$ is a linear transformation: $T_A(x + y) = A(x + y) = Ax + Ay = T_A(x) + T_A(y)$ and $T_A(cx) = A(cx) = c\,Ax = c\,T_A(x)$.
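For concreteness, here is a small instance with a matrix chosen only for illustration: taking
$$A = \begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & -1 \end{pmatrix},$$
the associated transformation $T_A: \mathbb{R}^3 \to \mathbb{R}^2$ acts as
$$T_A(x, y, z) = \begin{pmatrix} 1 & 0 & 2 \\ 0 & 1 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} x + 2z \\ y - z \end{pmatrix}.$$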
Now we look at a slightly more extravagant transformation. If $M_{n \times n}$ denotes the set of $n \times n$ square matrices, then we can define the trace function $\operatorname{tr}: M_{n \times n} \to \mathbb{R}$ given by $\operatorname{tr}(A) = a_{11} + a_{22} + \cdots + a_{nn}$, the sum of the entries on the main diagonal. We can easily check that it's linear:
Proof that $\operatorname{tr}$ is linear
Let $A, B \in M_{n \times n}$ and let $c$ be a scalar; then:
$$\operatorname{tr}(A + B) = \sum_{i=1}^{n}(a_{ii} + b_{ii}) = \sum_{i=1}^{n} a_{ii} + \sum_{i=1}^{n} b_{ii} = \operatorname{tr}(A) + \operatorname{tr}(B),$$
$$\operatorname{tr}(cA) = \sum_{i=1}^{n} c\,a_{ii} = c\sum_{i=1}^{n} a_{ii} = c\,\operatorname{tr}(A).$$
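A quick numerical sanity check, with matrices picked here only for illustration: taking $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ and $B = \begin{pmatrix} 5 & 0 \\ 1 & -2 \end{pmatrix}$, we get $\operatorname{tr}(A + B) = 6 + 2 = 8$ and $\operatorname{tr}(A) + \operatorname{tr}(B) = 5 + 3 = 8$.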
The transpose defines a transformation from the set of $m \times n$ matrices to the set of $n \times m$ matrices. It's easy to see that it's linear thanks to its properties, since $(A + B)^T = A^T + B^T$ and $(cA)^T = c\,A^T$.
But let's not rush: not all functions are linear; in fact, very few are. Think about a line that does not pass through the origin, $f(x) = mx + b$ with $b \neq 0$. It is clear that $f(0) = b \neq 0$, so it cannot be linear. Another example is the cosine function: since we know that $\cos(0) = 1 \neq 0$, it's impossible for it to be linear, due to the last result of the previous section.
Relationship with the basis
A vector space $V$ is completely determined by a basis $\beta = \{v_1, \dots, v_n\}$. This means that any vector $u \in V$ can be uniquely written as a linear combination of the basis vectors:
$$u = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n.$$
But wait a second: linear transformations get along perfectly with linear combinations. So let's see how a linear transformation $T: V \to W$ interacts with the basis:
$$T(u) = T(c_1 v_1 + \cdots + c_n v_n) = c_1 T(v_1) + \cdots + c_n T(v_n).$$
Bingo: to know the value of $T$ at $u$ it is enough to know the values of $T$ at the basis vectors! So if we had another vector $u' = c_1' v_1 + \cdots + c_n' v_n$, we could reuse this information, $T(u') = c_1' T(v_1) + \cdots + c_n' T(v_n)$, where only the coefficients are different.
In summary, having computed the value of $T$ at just a few vectors, we have virtually computed all of its values. And this also means that if we want to build a linear transformation, we only have to choose its values on the basis.
$T$ is completely determined by the values it takes on the basis. Just as the basis describes the whole space, it also describes any linear transformation defined on it.
This is very useful: if you have a linear transformation that is very complicated to evaluate, you only have to calculate its values on the basis and then derive a general formula valid for every vector.
Let's see a simple example. We'll reconstruct a linear transformation $T: \mathbb{R}^2 \to \mathbb{R}^2$. The only thing we know is its values at the standard basis vectors, $T(e_1)$ and $T(e_2)$. Then we obtain that in general:
$$T(x, y) = T(x e_1 + y e_2) = x\,T(e_1) + y\,T(e_2).$$
Thanks to the flexibility of linear transformations, we can sometimes easily construct their inverses. In the previous example, it is enough to denote $w_1 = T(e_1)$ and $w_2 = T(e_2)$ and, as long as $w_1$ and $w_2$ form a basis, construct the transformation $S$ given by $S(w_1) = e_1$ and $S(w_2) = e_2$. Thus, $S$ "returns" each vector to its place of origin and carries the deformed grid back to the original rectangular one.
Conclusion
Let $\beta = \{v_1, \dots, v_n\}$ be a basis for $V$.
- Linear transformations are functions that preserve the structure of vector spaces.
- A function $T: V \to W$ is a linear transformation if for all $u, v \in V$ and for every scalar $c$:
  $$T(u + v) = T(u) + T(v) \qquad \text{and} \qquad T(cv) = c\,T(v).$$
- When $T: V \to V$ is linear it's called a linear operator.
- If $T$ is linear, then $T(0_V) = 0_W$.
- Every $m \times n$ matrix $A$ generates a linear transformation defined by:
  $$T_A: \mathbb{R}^n \to \mathbb{R}^m, \qquad T_A(x) = Ax.$$
- The transformation $T$ is determined by its values on the basis. For every vector $u = c_1 v_1 + \cdots + c_n v_n$:
  $$T(u) = c_1 T(v_1) + \cdots + c_n T(v_n).$$