The Basis of a Vector Space

In linear algebra, a basis is a minimal generating system of the vector space V.

Definition

In a vector space V over the field K, a set of vectors B={v1,...,vn} that generates V is a basis of V if it consists of linearly independent vectors.
$$ B=\{v_1,...,v_n\} $$

It is also known as a free generating system.

Characteristics of a Basis, Coordinates, and Dimension

There are two essential conditions for a vector space basis:

  1. Generators of a vector space
    The set of vectors must be a generating system.

    Note. This means that every vector in the vector space V can be represented as a linear combination of the vectors in the set. This is true of any generating system.

  2. Linear Independence
    The vectors must be linearly independent: no vector vi of the basis can be written as a linear combination of the other basis vectors.

    Difference between a Basis and a Generating System. The linear independence of all vectors in the basis B is what distinguishes a basis from a generating system: in a generating system, the vectors may also be linearly dependent. (A computational check of both conditions is sketched right after this list.)
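
The two conditions can also be checked together: n vectors of R^n form a basis exactly when the matrix having them as columns has rank n, since full rank guarantees both linear independence and the spanning property. Below is a minimal sketch in Python with NumPy; the vectors used are only illustrative.

```python
import numpy as np

def is_basis(vectors):
    """Check whether the given vectors form a basis of R^n.

    n vectors of R^n are a basis exactly when the matrix having them
    as columns has full rank n: this guarantees both linear
    independence and the spanning property.
    """
    A = np.column_stack(vectors)   # matrix with the vectors as columns
    n, k = A.shape
    return k == n and np.linalg.matrix_rank(A) == n

# The standard basis of R^2 passes the test, a dependent pair does not.
print(is_basis([np.array([1, 0]), np.array([0, 1])]))   # True
print(is_basis([np.array([1, 1]), np.array([2, 2])]))   # False
```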

Coordinates or Weights of Vectors

In a vector basis B, each vector in the vector space V is defined by a unique linear combination, meaning there exists a unique n-tuple of scalars (a1,...,an) known as coordinates or weights.

$$ v = a_1 v_1 + ... + a_n v_n $$

Note. This characteristic of vector bases is demonstrated by the theorem of the uniqueness of vector representation through a basis.

Therefore, the linear system that expresses a vector in terms of the basis vectors admits only one solution.

Thus, determining a basis is equivalent to establishing that a linear system has a unique solution.
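
For example, finding the coordinates of a vector with respect to a basis of R^2 amounts to solving a square linear system with exactly one solution. A minimal sketch in Python with NumPy; the basis and the vector below are chosen arbitrarily for illustration.

```python
import numpy as np

# An arbitrary example basis of R^2, written as the columns of B.
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])

v = np.array([5.0, 7.0])     # the vector whose coordinates we want
a = np.linalg.solve(B, v)    # unique solution of B @ a = v
print(a)                     # [1.6 1.8] -> v = 1.6*(2,1) + 1.8*(1,3)
```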

Dimension of the Basis

The number of elements { v1,...,vn } in the basis is called the dimension of the vector space. $$ \dim(V) = n \:\:\: with \:\: n \in \mathbb{Z}_{\ge 0} $$

The dimension is denoted by dim followed by the vector space, where n is the cardinality of the basis, i.e., the number of its elements.

A basis can have either a finite or an infinite number of elements.

  • Finite Dimension. A basis has a finite dimension of n if it consists of n vectors.
  • Infinite Dimension. A basis has an infinite dimension if it consists of an infinite set of vectors.

Zero Dimension. Only the trivial space {0v} has a dimension of zero. The trivial space consists only of the zero vector, which is always linearly dependent. Having no linearly independent vectors within, the trivial space does not have a vector basis. For this reason, the trivial space {0v} is the only space with a zero dimension.

Example

The basis B consists of two vectors.

$$ B = \{ v_1 , v_2 \} $$

Therefore, the basis has a dimension of 2.

Examples and Exercises on Vector Bases

Example 1

In the vector space V=R2 over the field K=R, a generating system is formed by the vectors v1 and v2.

$$ v_1=(1,0) $$ $$v_2=(0,1) $$

To determine if they form a basis, I need to check the linear independence of the vectors.

Their linear combination is

$$ v = a_1 v_1 + a_2 v_2 $$ $$ v = a_1 (1,0) + a_2 (0,1) $$ $$ v = (a_1,0) + (0,a_2) $$ $$ v = (a_1,a_2) $$

Expressed in x,y coordinates, this becomes

$$ (x,y) = (a_1,a_2) $$

Which corresponds to the linear system

$$ \begin{cases} a_1=x \\ a_2=y \end{cases} $$

This linear system has an obvious unique solution: for any coordinates (x,y) in the plane, there is exactly one pair of scalars a1 and a2 that identifies them.

Therefore, the two vectors v1 and v2 are linearly independent.

This constitutes a basis of the vector space V.

Also, writing the system in matrix form, it is immediately apparent that the coefficient matrix has maximum rank.

$$ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} $$

In the matrix, there is a minor of order 2 with a non-zero determinant.

$$ \begin{vmatrix} 1 & 0 \\ 0 & 1 \end{vmatrix} = 1 \cdot 1 - 0 \cdot 0 = 1 \ne 0 $$

The rank equals the number of columns in the matrix (2), that is, the number of unknown variables in the linear system.

This confirms the linear independence of the vectors.
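
The same rank and determinant check can be reproduced numerically; a short sketch in Python with NumPy follows.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0]])       # columns are v1=(1,0) and v2=(0,1)

print(np.linalg.det(A))          # 1.0  (non-zero minor of order 2)
print(np.linalg.matrix_rank(A))  # 2    (rank = number of columns)
```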

Example 2

Staying within the vector space V=R2 over the field K=R, let's analyze another pair of vectors.

$$ v_1=(1,1) $$ $$v_2=(-1,1) $$

To determine if they form a basis, I need to check their linear independence.

Their linear combination is

$$ v = a_1 v_1 + a_2 v_2 $$ $$ v = a_1 (1,1) + a_2 (-1,1) $$ $$ v = (a_1,a_1) + (-a_2,a_2) $$ $$ v = (a_1-a_2,a_1+a_2) $$

Expressed in x,y coordinates, this becomes

$$ (x,y) = (a_1-a_2,a_1+a_2) $$

Which corresponds to the linear system

$$ \begin{cases} a_1-a_2=x \\ a_1+a_2=y \end{cases} $$

When the system is expressed in matrix form, its coefficient matrix has rank two.

$$ \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix} $$

The minor of order 2 has a non-zero determinant, so the rank is 2.

$$ \begin{vmatrix} 1 & -1 \\ 1 & 1 \end{vmatrix} = 1 \cdot 1 - (-1) \cdot 1 = 2 \ne 0 $$

The rank is equal to the number of columns in the matrix.

Note. Linear independence of vectors can be verified by observing the rank of the matrix.

Therefore, the vectors v1 and v2 are linearly independent and the system has exactly one solution for every (x,y).

This constitutes a basis of the vector space.

To verify this, let's try solving the system of linear equations using the substitution method.

$$ \begin{cases} a_1-a_2=x \\ a_1+a_2=y \end{cases} $$

$$ \begin{cases} a_1-a_2=x \\ a_1=y-a_2 \end{cases} $$

$$ \begin{cases} (y-a_2)-a_2=x \\ a_1=y-a_2 \end{cases} $$

$$ \begin{cases} a_2=(y-x)/2 \\ a_1=y-a_2 \end{cases} $$

$$ \begin{cases} a_2=(y-x)/2 \\ a_1=y-(y-x)/2 \end{cases} $$

$$ \begin{cases} a_2=(y-x)/2 \\ a_1=(x+y)/2 \end{cases} $$

The system has a unique solution for any coordinates (x,y) in the plane.

The linear independence of vectors v1 and v2 is confirmed.
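
The closed-form coordinates a1=(x+y)/2 and a2=(y-x)/2 can be spot-checked numerically; a brief sketch in Python with NumPy, using an arbitrary test point.

```python
import numpy as np

B = np.array([[1.0, -1.0],   # columns are v1=(1,1) and v2=(-1,1)
              [1.0,  1.0]])

x, y = 3.0, 5.0
a = np.linalg.solve(B, np.array([x, y]))
print(a)                           # [4. 1.]
print((x + y) / 2, (y - x) / 2)    # 4.0 1.0 -> matches a1 and a2
```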

How Many Bases Exist in a Vector Space

There isn't just one basis in a vector space.

Every non-trivial real vector space contains an infinite number of bases.

If there are various bases in a vector space, every vector in the space can be represented differently depending on the chosen basis.

The Case of the Trivial Vector Space.

The trivial vector space {0v} is an exception to the rule.

The trivial vector space {0v} has no basis.

Demonstration

The trivial vector space lacks a basis because it contains only the null vector.

A null vector is always linearly dependent.

Note. A set of vectors is linearly independent if its linear combination equals the null vector only when all the coefficients are zero. $$ k_1 \vec{v_1} + ... + k_n \vec{v_n} = \vec{0} $$ In the case of the null vector, however, the null vector is obtained even when a coefficient is non-zero. For example, taking k1≠0 and v1 equal to the null vector: $$ k_1 \vec{0} = \vec{0} $$ Therefore, the null vector can never be linearly independent. Hence, it is always linearly dependent.

Thus, the trivial vector space has no linearly independent vectors within it and cannot have a basis.

The Canonical Basis

A basis is called canonical if each vector vi has all components equal to zero except for the i-th component, which is equal to 1.

In every vector space Kn, there always exists a canonical basis.

Example

In a vector space R4 over the field R, I have four vectors.

$$ v_1 = (1,0,0,0) $$ $$ v_2 = (0,1,0,0) $$ $$ v_3 = (0,0,1,0) $$ $$ v_4 = (0,0,0,1) $$

Arranging these vectors as the rows of a matrix gives the identity matrix, whose diagonal is composed of 1s while the other elements are 0s.

$$ \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} $$

It is immediately evident that the vectors are linearly independent because the rank of the matrix is 4, equaling the number of columns.

The linear combination of vectors is as follows:

$$ v = a_1 v_1 + a_2 v_2 + a_3 v_3 + a_4 v_4 $$ $$ v = a_1 (1,0,0,0) + a_2 (0,1,0,0) + a_3 (0,0,1,0) + a_4 (0,0,0,1) $$ $$ v=(a_1,0,0,0)+(0,a_2,0,0)+(0,0,a_3,0)+(0,0,0,a_4) $$ $$ v=(a_1,a_2,a_3,a_4) $$

Therefore,

$$ (x,y,z,w)=(a_1,a_2,a_3,a_4) $$

can also be expressed as a system:

$$ \begin{cases} a_1 = x \\ a_2=y \\ a_3=z \\ a_4 = w \end{cases} $$

It's a basis because the system admits a single solution.
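
In the canonical basis, the coordinates of a vector coincide with its components, as the system above shows. A short numerical check in Python with NumPy, using an arbitrary vector:

```python
import numpy as np

# Canonical basis of R^4: the columns of the 4x4 identity matrix.
E = np.eye(4)

v = np.array([3.0, -1.0, 2.0, 7.0])   # an arbitrary vector of R^4
a = np.linalg.solve(E, v)             # coordinates in the canonical basis
print(a)                              # [ 3. -1.  2.  7.] same as the components
```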

Theorems on Vector Bases

Main theorems of a vector space basis:

Theorem of Unique Vector Representation in a Basis

Every vector v in the vector space V is representable by a vector basis B through a unique combination of scalar numbers a1,...,an.

Theorem of Linear Dependence of Every Vector Relative to the Basis

Every vector in a vector space V over the field K is linearly dependent on the vectors of the basis B.

Theorem of the Dimension of a Basis

If a vector space over the field K has a basis B with a finite number of elements (dimension), then every other basis B' of V has the same number of elements, i.e., the same dimension as B.

Corollary

The number of elements in a basis depends not on the choice of the basis but on the vector space itself.

Theorem of Completing the Basis

If a set of linearly independent vectors has k<n elements, where n is the dimension of the space, it is not yet a basis, but it can always be completed to one by adding n-k suitable linearly independent vectors.
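
The completion procedure can be sketched algorithmically: append canonical basis vectors one at a time and keep only those that increase the rank, until n independent vectors are obtained. A minimal sketch in Python with NumPy; the starting vector is an arbitrary example.

```python
import numpy as np

def complete_to_basis(vectors, n):
    """Complete a set of linearly independent vectors of R^n to a basis.

    Canonical basis vectors are appended one at a time and kept only
    when they increase the rank, until n independent vectors are found.
    """
    basis = list(vectors)
    for e in np.eye(n):                      # e1, e2, ..., en
        if len(basis) == n:
            break
        candidate = basis + [e]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis = candidate
    return basis

# Example: complete {(1,1,0)} to a basis of R^3.
print(complete_to_basis([np.array([1.0, 1.0, 0.0])], 3))
```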

Other Theorems on Bases

  • In a finitely generated vector space V of dimension n, from any set of generators {v1,v2,..,vs} with s>n, it is always possible to obtain a basis B of the vector space by eliminating linearly dependent vectors from the set of generators (proof); a computational sketch of this procedure follows the list.
  • In a finitely generated vector space V of dimension n, from any set of linearly independent vectors {v1,v2,..,vp} with p<n, it is always possible to obtain a basis B of the vector space by adding linearly independent vectors to the set of vectors (proof).
  • In a vector space V with a known dimension dim(V)=n, if {v1,v2,..,vn} are a set of generators of V, then they are also a basis of V (proof)
  • In a vector space V with a known dimension dim(V)=n, if {v1,v2,..,vn} are a set of linearly independent vectors, then they are also a basis of V (proof)
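
The first procedure above, extracting a basis from a set of generators by discarding linearly dependent vectors, can be sketched computationally as well. A minimal example in Python with NumPy; the generators are illustrative.

```python
import numpy as np

def extract_basis(generators):
    """Extract a basis from a set of generators of a subspace of R^n.

    A vector is kept only when it increases the rank of the matrix
    built so far, i.e., when it is independent of those already kept.
    """
    basis = []
    for v in generators:
        candidate = basis + [v]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis = candidate
    return basis

# Example: three generators of R^2, one of which is redundant.
gens = [np.array([1.0, 0.0]), np.array([2.0, 0.0]), np.array([0.0, 1.0])]
print(extract_basis(gens))   # keeps (1,0) and (0,1)
```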
