Uniqueness Theorem for Vector Representation in a Basis

Every vector in a vector space is represented, with respect to a basis B, by a unique linear combination of the basis vectors.

The Theorem

Let B = {v1, v2, ..., vn} be a basis of a vector space V over a field K. Then every vector v in V can be expressed as a unique linear combination $$ v = \alpha_1 v_1 + \alpha_2 v_2 + ... + \alpha_n v_n $$ with coefficients α1, ..., αn ∈ K.
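For instance (a simple worked example of mine), take V = R² with the basis B = {(1,0), (1,1)} and the vector v = (3,5). Equating components determines the coefficients:

$$ (3,5) = \alpha_1 (1,0) + \alpha_2 (1,1) = ( \alpha_1 + \alpha_2 , \ \alpha_2 ) $$

The second component forces α2 = 5, and then the first forces α1 = -2. No other pair of coefficients works, so the representation is unique.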

    Proof

    Consider a generic vector v in the vector space V

    $$ \vec{v} \in V $$

    The basis of the vector space consists of n vectors {v1,v2,...,vn}

    $$ B = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_n \} $$

    Since the basis spans V, the vector v can be written as a linear combination of the n basis vectors

    $$ \vec{v} = \alpha_1 \vec{v}_1 + \alpha_2 \vec{v}_2 +... + \alpha_n \vec{v}_n $$

    Now suppose that the vector can also be written as a second linear combination of the vectors {v1, v2, ..., vn}

    $$ \vec{v} = \beta_1 \vec{v}_1 + \beta_2 \vec{v}_2 +... + \beta_n \vec{v}_n $$

    As it is the same vector, the two linear combinations must be equal

    $$ \vec{v} = \vec{v} $$

    $$ \alpha_1 \vec{v}_1 + \alpha_2 \vec{v}_2 +... + \alpha_n \vec{v}_n = \beta_1 \vec{v}_1 + \beta_2 \vec{v}_2 +... + \beta_n \vec{v}_n $$

    Move all terms to one side of the equation

    $$ \alpha_1 \vec{v}_1 + \alpha_2 \vec{v}_2 + ... + \alpha_n \vec{v}_n - \beta_1 \vec{v}_1 - \beta_2 \vec{v}_2 - ... - \beta_n \vec{v}_n = \vec{0} $$

    Then, factor out the common terms

    $$ ( \alpha_1 - \beta_1 ) \cdot \vec{v}_1 + ( \alpha_2 - \beta_2 ) \cdot \vec{v}_2 + ... + ( \alpha_n - \beta_n ) \cdot \vec{v}_n = \vec{0} $$

    Since the set of vectors {v1,v2,...,vn} forms a basis of the vector space, by the definition of a basis, the vectors are linearly independent.

    When vectors are linearly independent, their linear combination equals zero only in the trivial case, namely when all coefficients are zero.

    $$ \underbrace{ ( \alpha_1 - \beta_1 ) }_{0} \cdot \vec{v}_1 + \underbrace{ ( \alpha_2 - \beta_2 ) }_{0} \cdot \vec{v}_2 + ... + \underbrace{ ( \alpha_n - \beta_n ) }_{0} \cdot \vec{v}_n = \vec{0} $$

    Therefore, the differences in coefficients are zero

    $$ \alpha_1 - \beta_1 = 0 \\ \alpha_2 - \beta_2 = 0 \\ \vdots \\ \alpha_n - \beta_n = 0 $$

    This implies that the coefficients are equal

    $$ \alpha_1 = \beta_1 \\ \alpha_2 = \beta_2 \\ \vdots \\ \alpha_n = \beta_n $$

    The two linear combinations are, in fact, the same.

    This proves that the linear combination representing a vector through a basis of the vector space is unique.
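    To make the argument concrete (continuing my earlier example), suppose (3,5) had two representations in the basis {(1,0), (1,1)}. Subtracting them gives

    $$ ( \alpha_1 - \beta_1 ) (1,0) + ( \alpha_2 - \beta_2 ) (1,1) = (0,0) $$

    Reading off the components, α2 - β2 = 0 from the second component and then α1 - β1 = 0 from the first, exactly as the general proof predicts.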

    Alternate Proof

    A basis is a minimal set of generators of the vector space.

    Thus, any vector in the vector space can be represented by a linear combination of the n vectors of the basis using n scalars.

    $$ v = a_1 v_1 + ... + a_n v_n $$

    I need to prove that this representation of a vector using the basis is the only possible one.

    According to the theorem, every vector in the space has a unique representation in coordinates relative to the basis, meaning there is a unique n-tuple of scalars a1,...,an such that v=a1v1+...+anvn.

    To prove this, I assume the contrary.

    Suppose that two distinct linear combinations of basis vectors (where the w1, ..., wm are basis vectors as well) determined the same vector v in the vector space

    $$ v = a_1 v_1 +...+ a_n v_n $$ $$ v = b_1 w_1 +...+ b_m w_m $$

    this leads to the following equation:

    $$ a_1 v_1 +...+ a_n v_n = b_1 w_1 +...+ b_m w_m $$

    $$ a_1 v_1 +...+ a_n v_n - b_1 w_1 - ... - b_m w_m = 0 $$

    Take the smaller of n and m, i.e., the length of the shorter of the two linear combinations

    $$ k = \min(n,m) $$

    Now suppose that, after reordering the terms if necessary, the two combinations use the same basis vectors up to the k-th term, while from the (k+1)-th term onwards the vectors differ.

    $$ v_1 = w_1 , \ \dots \ , v_k = w_k $$

    $$ v_{k+1} \ne w_{k+1} , \ \dots \ , v_n \ne w_m $$

    So, I can rewrite the equation as follows:

    $$ v_1 ( a_1 - b_1 ) + ... + v_k ( a_k - b_k ) + a_{k+1} v_{k+1} - b_{k+1} w_{k+1} + ... + a_n v_n - b_m w_m = 0 $$

    The equation equals zero only if:

    $$ a_1 = b_1 , \ \dots \ , a_k = b_k $$

    $$ a_{k+1} = 0 , \ \dots \ , a_n = 0 $$

    $$ b_{k+1} = 0 , \ \dots \ , b_m = 0 $$

    Up to the k-th term, the two combinations use the same basis vectors (vi = wi), so, by linear independence, that part of the sum vanishes only when the corresponding scalars are equal (a1 = b1, ..., ak = bk).

    $$ v_1 ( 0 ) + ... + v_k ( 0 ) + a_{k+1} v_{k+1} - b_{k+1} w_{k+1} + ... + a_n v_n - b_m w_m = 0 $$

    From the (k+1)-th term onwards, the vectors vi and wj differ. Since they all belong to a basis, they are linearly independent, so that part of the sum vanishes only when the scalars are zero (ak+1 = 0, ..., an = 0 and bk+1 = 0, ..., bm = 0).

    $$ v_1 ( 0 ) + ... + v_k ( 0 ) + 0 \, v_{k+1} - 0 \, w_{k+1} + ... + 0 \, v_n - 0 \, w_m = 0 $$

    This is the only way the equation can hold.

    The terms beyond the k-th therefore form a trivial linear combination and contribute nothing to the vector.

    Hence, to represent the vector v through the basis B, one must use a single, unique combination of k scalars.

    $$ a_1 = b_1 , \ \dots \ , a_k = b_k $$

    These coefficients are known as coordinates or weights of the vector with respect to the basis.
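    For example (using the basis from my earlier illustration, and the common bracket notation for coordinate vectors), the vector (3,5) has coordinates

    $$ [ (3,5) ]_B = (-2, 5) $$

    with respect to B = {(1,0), (1,1)}, since (3,5) = -2(1,0) + 5(1,1).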

    In conclusion, since the vectors of a basis are linearly independent, there is a unique n-tuple of scalars representing each vector of the vector space.
