Vector subtraction
The subtraction of one vector $\vec{b}$ from another vector $\vec{a}$, written $\vec{a} - \vec{b}$, is the vector going from the tip of $\vec{b}$ to the tip of $\vec{a}$.
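A minimal NumPy sketch of this tip-to-tip picture (the numbers are arbitrary examples):

```python
import numpy as np

a = np.array([3.0, 2.0])
b = np.array([1.0, 1.0])

d = a - b  # the vector from the tip of b to the tip of a

# Starting at the tip of b and walking along d lands exactly on the tip of a.
print(d)                      # [2. 1.]
print(np.allclose(b + d, a))  # True
```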
Scalar Product
The scalar product is defined as: $\vec{a} \cdot \vec{b} = \sum_{i=1}^{n} a_i b_i = a_1 b_1 + a_2 b_2 + \dots + a_n b_n$
Commutativity and distributivity hold. We also have $\vec{a} \cdot \vec{a} = |\vec{a}|^2 \geq 0$, with equality exactly if $\vec{a} = \vec{0}$. We can also take out scalars: $(\lambda \vec{a}) \cdot \vec{b} = \lambda (\vec{a} \cdot \vec{b}) = \vec{a} \cdot (\lambda \vec{b})$.
The scalar product measures “how much the vectors point in the same direction”. It projects the length of one onto the other and then multiplies those lengths. The more they are aligned, the longer this projection will be.
If $\vec{a}$ and $\vec{b}$ are orthogonal, we have $\vec{a} \cdot \vec{b} = 0$, which holds because $\vec{a} \cdot \vec{b} = |\vec{a}|\,|\vec{b}|\cos(\theta)$ and $\cos(\theta) = 0$ if $\theta = 90°$. From this definition we can also see that $\vec{0}$ is orthogonal to all vectors.
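A small NumPy sketch of these properties (the vectors are arbitrary examples):

```python
import numpy as np

a = np.array([2.0, 0.0])
b = np.array([1.0, 1.0])

dot = np.dot(a, b)  # componentwise products summed: 2*1 + 0*1 = 2.0

# Geometric reading: project b onto a (length |b| cos(theta)),
# then multiply by |a|. The scalar product is exactly that product.
proj_len = dot / np.linalg.norm(a)   # length of b's projection onto a
print(dot, proj_len)                 # 2.0 1.0

# Orthogonal vectors have scalar product 0, since cos(90°) = 0.
c = np.array([0.0, 3.0])
print(np.dot(a, c))  # 0.0

# The zero vector is orthogonal to everything.
print(np.dot(np.zeros(2), b))  # 0.0
```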
Coplanarity and Collinearity
If $\exists \lambda \in \mathbb{R}$ such that $\vec{a} = \lambda \vec{b}$, then $\vec{a}$ and $\vec{b}$ are collinear. This means that they lie on the same line.
Two vectors in $\mathbb{R}^3$ cannot span the whole space, but they automatically span a plane (except if they are collinear). To check if three vectors $\vec{a}, \vec{b}, \vec{c}$ are coplanar, you have to check if $\exists \lambda, \mu \in \mathbb{R}$ such that $\vec{c} = \lambda \vec{a} + \mu \vec{b}$.
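One way to run this check numerically, sketched with NumPy: the three vectors are coplanar exactly if the matrix with them as columns has rank at most 2.

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = np.array([2.0, 3.0, 0.0])  # c = 2a + 3b, so the three are coplanar

# Rank test: coplanar iff the 3x3 matrix of columns has rank <= 2.
M = np.column_stack([a, b, c])
print(np.linalg.matrix_rank(M) <= 2)  # True

# Equivalently, recover lambda and mu with a least-squares solve.
coeffs, *_ = np.linalg.lstsq(np.column_stack([a, b]), c, rcond=None)
print(coeffs)  # [2. 3.]
```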
Hyperplanes
A hyperplane is defined as the set of all vectors orthogonal to a given normal vector $\vec{n} \neq \vec{0}$: $H = \{\vec{x} \mid \vec{n} \cdot \vec{x} = 0\}$.
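A minimal membership test, sketched with NumPy (the helper name `in_hyperplane` and the tolerance are illustrative choices):

```python
import numpy as np

n = np.array([1.0, 2.0, -1.0, 0.5])  # normal vector of a hyperplane in R^4

def in_hyperplane(x, n, tol=1e-12):
    """x lies in the hyperplane iff it is orthogonal to the normal n."""
    return abs(np.dot(n, x)) < tol

print(in_hyperplane(np.array([2.0, -1.0, 0.0, 0.0]), n))  # 2 - 2 = 0 -> True
print(in_hyperplane(np.array([1.0, 0.0, 0.0, 0.0]), n))   # dot = 1 -> False
```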
Covectors
The scalar product can also be written as $\vec{a} \cdot \vec{b} = \vec{a}^T \vec{b}$. The transpose $\vec{a}^T$ is a covector, written as a row vector instead of a column vector.
Formally, a covector is a linear function $f: \mathbb{R}^n \to \mathbb{R}$.
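A small NumPy sketch of both views: the covector as a row applied to a column, and as a function eating a vector:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Row times column: a^T b is just the scalar product.
print(a @ b)  # 32.0

# The same covector viewed as a function R^3 -> R.
f = lambda x: a @ x
print(f(b))   # 32.0
```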
Linear dependence and independence
Dependence:
- At least one vector in the set is a linear combination of the other ones.
- $\vec{0}$ is a non-trivial linear combination of the vectors in the set (this means that there is a combination $\lambda_1 \vec{v}_1 + \dots + \lambda_n \vec{v}_n = \vec{0}$ in which at least one $\lambda_i \neq 0$).
- At least one of the vectors is a linear combination of the previous ones.
Independence:
- None of the vectors is a linear combination of the other ones.
- There are no $\lambda_i$ besides all $\lambda_i = 0$ such that the linear combination sums to $\vec{0}$ (i.e. there is only the trivial combination; see the rank-test sketch after this list).
- None of the vectors is a linear combination of the previous ones.
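A numeric rank test, sketched with NumPy (the helper `independent` is an illustrative name): $n$ vectors are linearly independent exactly if the matrix with them as columns has rank $n$.

```python
import numpy as np

def independent(*vectors):
    """True iff the given vectors are linearly independent (rank test)."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

print(independent(np.array([1.0, 0.0]), np.array([0.0, 1.0])))  # True
print(independent(np.array([1.0, 2.0]), np.array([2.0, 4.0])))  # False: 2nd = 2 * 1st
```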
IMPORTANT
A linear combination of linearly independent vectors can only be written in one way: there is a unique linear combination of the spanning vectors for each vector in $C(A)$! (If two different combinations gave the same vector, their difference would be a non-trivial combination of independent vectors summing to $\vec{0}$, contradicting independence.)
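A NumPy sketch of this uniqueness: with independent columns, `np.linalg.solve` returns the one and only coefficient vector.

```python
import numpy as np

# Independent spanning vectors as the columns of A.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# Because the columns are independent, this is the *only* coefficient
# vector writing v as a combination of them.
coeffs = np.linalg.solve(A, v)
print(coeffs)      # [1. 2.] : v = 1*column1 + 2*column2
print(A @ coeffs)  # [3. 2.] : recovers v
```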
Linear dependence of a sequence containing $\vec{0}$
Any sequence of vectors which contains the vector $\vec{0}$ is linearly dependent by definition: set all other $\lambda_i$'s to $0$ and the scalar factor of the zero vector to any $\lambda \neq 0$, e.g. $1$.
Linear dependence of a sequence containing a vector twice
Any sequence of vectors which contains a vector twice is linearly dependent, since you can set the two copies' $\lambda$'s to be opposite (e.g. $1$ and $-1$) and all others to $0$, giving a non-trivial combination that sums to $\vec{0}$.
Empty set
The empty set is always linearly independent by definition, since it contains no vector that could be a linear combination of the others.
Theorems and Lemmas
Cauchy-Schwarz Inequality
$|\vec{a} \cdot \vec{b}| \leq |\vec{a}|\,|\vec{b}|$. Equality holds exactly if $\vec{a}$ and $\vec{b}$ are collinear (i.e. $\vec{a} = \lambda \vec{b}$ for some $\lambda$).
Triangle Inequality
$|\vec{a} + \vec{b}| \leq |\vec{a}| + |\vec{b}|$. Equality holds exactly if $\vec{a}$ and $\vec{b}$ point in the same direction (i.e. $\vec{a} = \lambda \vec{b}$ with $\lambda \geq 0$).
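Both inequalities and their equality cases can be checked numerically; a small NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = rng.normal(size=3), rng.normal(size=3)

# Cauchy-Schwarz: |a . b| <= |a| |b|
print(abs(np.dot(a, b)) <= np.linalg.norm(a) * np.linalg.norm(b))  # True

# Triangle inequality: |a + b| <= |a| + |b|
print(np.linalg.norm(a + b) <= np.linalg.norm(a) + np.linalg.norm(b))  # True

# Equality cases: c = 2.5*a is collinear with a and points the same way,
# so both inequalities become equalities.
c = 2.5 * a
print(np.isclose(abs(np.dot(a, c)), np.linalg.norm(a) * np.linalg.norm(c)))      # True
print(np.isclose(np.linalg.norm(a + c), np.linalg.norm(a) + np.linalg.norm(c)))  # True
```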
Angle between vectors