Tensors

Rank 1 Tensors

Geometric vectors

Vectors are an example of a tensor. They are geometric objects that possess both a direction and a magnitude. The components of a vector change depending on the basis used to represent the space, but regardless of the basis chosen, the magnitude and direction of the vector remain the same.

As a vector needs to maintain the same magnitude and direction at all times, when the basis is scaled up by a transformation, the components must be scaled by the inverse of the basis transformation to ensure that the magnitude and direction remain the same (invariant).
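This inverse scaling can be checked numerically. The sketch below (using NumPy, with an arbitrarily chosen basis-change matrix `B` as an assumption) shows that when the components transform by the inverse of `B`, the reconstructed geometric vector, and hence its magnitude, is unchanged.

```python
import numpy as np

# Vector components in the original basis (rows of `basis` are the basis vectors).
basis = np.eye(2)
components = np.array([3.0, 4.0])

# The geometric vector is the sum of components times basis vectors.
v = components @ basis

# Hypothetical basis change: columns of B express the new basis
# vectors in terms of the old ones.
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
new_basis = B.T @ basis

# Contravariance: the components transform by the INVERSE of B.
new_components = np.linalg.inv(B) @ components

# The reconstructed vector is unchanged, so magnitude and direction are invariant.
v_after = new_components @ new_basis
print(np.allclose(v, v_after))  # True
```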

Covariant and Contravariant

To ensure that a vector is invariant in magnitude and direction when the basis is transformed, its components are transformed by the inverse basis transformation. Geometric vectors are therefore an example of contravariant vectors.

Covariant vectors are objects whose invariance is maintained by transforming their components with the basis transformation matrix itself (not its inverse).

An example of a covariant vector is a linear function that takes in a geometric vector and outputs a scalar value.
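The point of the two transformation rules is that the scalar output of such a linear function stays the same under a change of basis. A minimal sketch, assuming a hypothetical basis-change matrix `B` and using the column convention (so the covariant components transform via `B.T`):

```python
import numpy as np

# A covector (linear function) f(v) = a . c, with covariant components a,
# acting on a vector with contravariant components c.
a = np.array([1.0, 2.0])
c = np.array([3.0, 4.0])
scalar = a @ c  # the scalar output in the original basis

# Hypothetical change of basis.
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Contravariant: vector components transform by the inverse of B.
c_new = np.linalg.inv(B) @ c
# Covariant: covector components transform WITH B (B transposed, column convention).
a_new = B.T @ a

# The scalar output is invariant under the change of basis.
print(np.isclose(scalar, a_new @ c_new))  # True
```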

Rank 2 Tensors

Linear transformations as Tensors

A function $L: V \to W$ is a linear map if, for any two vectors u and v and any scalar c, the following are satisfied:

$L(u+v) = L(u) + L(v)$

$L(cu) = cL(u)$
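Both conditions can be verified directly for a map defined by a matrix (any matrix gives a linear map in this sense). A small sketch, with the matrix and test vectors chosen arbitrarily:

```python
import numpy as np

# A matrix defines a linear map L(u) = A @ u.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def L(u):
    return A @ u

u = np.array([1.0, -2.0])
v = np.array([0.5, 3.0])
c = 2.5

# Additivity: L(u + v) = L(u) + L(v)
print(np.allclose(L(u + v), L(u) + L(v)))  # True
# Homogeneity: L(c u) = c L(u)
print(np.allclose(L(c * u), c * L(u)))     # True
```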

To recap:

A tensor is an object that remains invariant under a change of basis. A covariant tensor's components transform with the forward basis transformation, while a contravariant tensor's components transform with the inverse of the basis transformation.

In the case where a geometric vector u is acted upon by a linear transformation L, we have $w = Lu$ in the original basis.
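How L itself must change under a basis transformation can be checked numerically. In the sketch below, `L`, `u`, and the basis-change matrix `B` are arbitrary choices; with both vectors' components transforming contravariantly, the transformation seen in the new basis works out to $B^{-1} L B$:

```python
import numpy as np

# Original basis: w = L u.
L = np.array([[2.0, 0.0],
              [1.0, 3.0]])
u = np.array([1.0, 2.0])
w = L @ u

# Hypothetical change of basis with matrix B.
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
B_inv = np.linalg.inv(B)

# Vector components are contravariant: they transform by B inverse.
u_hat = B_inv @ u
w_hat = B_inv @ w

# The same transformation viewed in the new basis is L_hat = B^-1 L B,
# so that w_hat = L_hat u_hat still holds.
L_hat = B_inv @ L @ B
print(np.allclose(w_hat, L_hat @ u_hat))  # True
```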

After a change of basis, w is changed to $\hat{w}$ and we need to use $\hat{u}$ instead of u. To map $\hat{u}$ to $\hat{w}$ we need a different view of L, namely $\hat{L}$, and therefore: