Grassmann's Theorem or Formula

Understanding Grassmann's Theorem

In a finite-dimensional vector space V of dimension n over the field K, given two vector subspaces A and B, the dimension of their sum A+B is given by
$$ \dim_K(A+B) = \dim_K(A)+\dim_K(B)-\dim_K(A \cap B) $$

In essence, to determine the dimension of the sum of the two subspaces, one adds the dimensions of the subspaces and subtracts the dimension of their intersection.

Thus, in general, the dimension of the sum A+B is not equal to the sum of the dimensions of the two subspaces.

$$ \dim_K(A+B) \ne \dim_K(A)+\dim_K(B) $$

Equality holds only when the two subspaces are in direct sum, that is, when their intersection is the zero subspace.

Note. This theorem rests on the same principle as inclusion-exclusion in set theory, where the cardinality of the union of two non-disjoint sets |A∪B| equals the sum of the cardinalities |A|+|B| minus the cardinality of their intersection |A⋂B|. $$ |A \cup B| = |A|+|B| - |A \cap B| $$ Otherwise, the elements common to A and B would be counted twice.
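To make the formula concrete, here is a minimal numerical sketch in Python (the subspaces are chosen purely for illustration). It relies on the fact that the dimension of a span equals the rank of the matrix whose rows are the spanning vectors.

```python
import numpy as np

# Illustrative example in R^3: A = span{e1, e2}, B = span{e2, e3}.
A = np.array([[1, 0, 0],
              [0, 1, 0]])
B = np.array([[0, 1, 0],
              [0, 0, 1]])

dim_A = np.linalg.matrix_rank(A)                    # 2
dim_B = np.linalg.matrix_rank(B)                    # 2
dim_sum = np.linalg.matrix_rank(np.vstack([A, B]))  # dim(A+B) = 3

# By construction, A ⋂ B = span{e2}, so its dimension is 1.
dim_int = 1

# Grassmann's formula: 3 == 2 + 2 - 1
assert dim_sum == dim_A + dim_B - dim_int
```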

    The Proof

    Consider two vector subspaces U and W within the vector space V.

    The intersection U⋂W of these vector subspaces U and W contains vectors common to both U and W.

    $$ U \cap W $$

    By hypothesis, the basis of the subspace U⋂W consists of r vectors

    $$ B_{U \cap W} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r \} $$

    Therefore, the dimension of the subspace U⋂W is equal to r

    $$ \dim (U \cap W) = r $$

    As vectors of the intersection, v1, v2, ..., vr belong to both U and W.
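    As an aside (not in the original notes), in concrete computations a basis of U⋂W can be found by solving a·MU = b·MW for coefficient rows a and b, that is, by computing the left null space of the stacked matrix [MU; −MW]. A sketch with SymPy, on illustrative bases:

    ```python
    import sympy as sp

    # Illustrative bases (as matrix rows) of U and W in R^3 -- an
    # assumption made only for this example.
    MU = sp.Matrix([[1, 0, 0],
                    [0, 1, 1]])
    MW = sp.Matrix([[0, 1, 1],
                    [1, 1, 0]])

    # x lies in U ⋂ W iff x = a*MU = b*MW for some coefficient rows a, b,
    # i.e. iff (a | b) lies in the left null space of [MU; -MW].
    stacked = sp.Matrix.vstack(MU, -MW)
    left_null = stacked.T.nullspace()  # null space of the transpose

    # Each basis vector (a | b) of that null space yields the
    # intersection vector a*MU.
    basis_int = [n[:MU.rows, 0].T * MU for n in left_null]
    print(basis_int)  # [Matrix([[0, 1, 1]])] -> U ⋂ W = span{(0, 1, 1)}
    ```

    Note that the left null space has dimension (dim U + dim W) − rank[MU; −MW], which is Grassmann's formula in disguise.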

    By hypothesis, the basis of the subspace U is made up of r+s vectors

    Thus, the dimension of the subspace U is equal to r+s

    $$ \dim (U) = r+s $$

    The basis of the subspace U is formed by the vectors v1, v2, ..., vr of U⋂W, which also belong to U, completed with s further vectors u1, u2, ..., us of U (basis-extension theorem)

    $$ B_{U} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s \} $$

    Since the vectors v1, v2, ...,vr form the basis of U⋂W, they are linearly independent

    $$ B_{U} = \{ \underbrace{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r }_{\text{lin. independent}}, \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s \} $$

    By hypothesis, the basis of the subspace W consists of r+t vectors

    Thus, the dimension of the subspace W is equal to r+t

    $$ \dim (W) = r+t $$

    Likewise, the basis of the subspace W is formed by the vectors v1, v2, ..., vr of U⋂W, which also belong to W, completed with t further vectors w1, w2, ..., wt of W

    $$ B_{W} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t \} $$

    Since the vectors v1, v2, ...,vr form the basis of U⋂W, they are linearly independent

    $$ B_{W} = \{ \underbrace{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r }_{\text{lin. independent}}, \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t \} $$
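    Computationally, this completion can be carried out greedily: append candidate vectors and keep those that strictly increase the rank. A minimal sketch (the vectors are illustrative assumptions):

    ```python
    import numpy as np

    def extend_to_basis(independent, candidates):
        """Extend a linearly independent list of vectors to a basis of
        the span of (independent + candidates), keeping each candidate
        that raises the rank of the stacked matrix."""
        basis = list(independent)
        for c in candidates:
            if np.linalg.matrix_rank(np.vstack(basis + [c])) > len(basis):
                basis.append(c)
        return basis

    # Illustrative: complete {v1} = {(0, 1, 1)} to a basis of
    # U = span{(1, 0, 0), (0, 1, 1)}.
    v1 = np.array([0, 1, 1])
    B_U = extend_to_basis([v1], [np.array([1, 0, 0]), np.array([0, 1, 1])])
    print(B_U)  # [(0, 1, 1), (1, 0, 0)]: a basis of U containing v1
    ```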

    Next, I need to find a basis for U+W

    The subspace U+W consists of all vectors that can be written as the sum of a vector from U and a vector from W.

    The vectors of U are generated by the basis

    $$ B_{U} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s \} $$

    The vectors of W are generated by the basis

    $$ B_{W} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t \} $$

    Therefore, a generating system for U+W is

    $$ G_{U+W} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s , \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t \} $$

    However, I must determine if they are also linearly independent to form a basis for U+W

    Since the vectors v1, ..., vr, u1, ..., us form a basis of U, they are certainly linearly independent among themselves

    $$ G_{U+W} = \{ \underbrace{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s }_{\text{lin. independent}} , \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t \} $$

    Since the vectors w1, ..., wt are part of a basis of W, they too are linearly independent among themselves

    $$ G_{U+W} = \{ \underbrace{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s }_{\text{lin. independent}} , \underbrace{ \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t }_{\text{lin. independent}} \} $$

    However, it's not certain that these two groups are linearly independent when taken together.
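    Before the algebraic argument, note that on a concrete example this joint independence can be tested numerically: the family G(U+W) is a basis of U+W exactly when the matrix having these vectors as rows has rank r+s+t. A minimal sketch with illustrative vectors (r = s = t = 1):

    ```python
    import numpy as np

    # Illustrative subspaces of R^4 with r = s = t = 1 (an assumption).
    v = [np.array([1, 0, 0, 0])]  # basis of U ⋂ W
    u = [np.array([0, 1, 0, 0])]  # completes the basis of U
    w = [np.array([0, 0, 1, 0])]  # completes the basis of W

    G = np.vstack(v + u + w)
    print(np.linalg.matrix_rank(G) == len(v) + len(u) + len(w))  # True: rank 3
    ```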

    To verify their linear independence, consider a linear combination of the vectors of G(U+W) set equal to the zero vector

    $$ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + \beta_1 \vec{u}_1 + ...+ \beta_s \vec{u}_s + \lambda_1 \vec{w}_1 + ... + \lambda_t \vec{w}_t = \vec{0} $$

    I need to prove that all coefficients are zero.

    I move the vectors w1, ..., wt to the right-hand side of the equation

    $$ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + \beta_1 \vec{u}_1 + ...+ \beta_s \vec{u}_s = - \lambda_1 \vec{w}_1 - ... - \lambda_t \vec{w}_t $$

    This demonstrates an equality between a generic vector of U (on the left) and a generic vector of W (on the right)

    $$ \underbrace{ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + \beta_1 \vec{u}_1 + ...+ \beta_s \vec{u}_s }_{ \in U }= \underbrace{ - \lambda_1 \vec{w}_1 - ... - \lambda_t \vec{w}_t }_{ \in W } $$

    Since the left-hand side belongs to U and the right-hand side belongs to W, their equality implies that this common vector belongs to both U and W, i.e., to the intersection U⋂W.

    $$ \underbrace{ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + \beta_1 \vec{u}_1 + ...+ \beta_s \vec{u}_s }_{ \in U \cap W }= \underbrace{ - \lambda_1 \vec{w}_1 - ... - \lambda_t \vec{w}_t }_{ \in U \cap W } $$

    Since the left-hand side belongs to U⋂W, it can be written as a linear combination of the basis vectors v1, ..., vr alone. But it is also expressed in the basis BU of U with coefficients β on the vectors u1, ..., us, and coordinates with respect to a basis are unique, so all the β coefficients must be zero

    $$ \beta_1 = \beta_2 = ... = \beta_s = 0 $$

    For the same reason, all the λ coefficients on the right-hand side are zero: the right-hand side also belongs to U⋂W, so it is a linear combination of v1, ..., vr, and comparing coordinates in the basis BW of W forces the coefficients of w1, ..., wt to vanish

    $$ \lambda_1 = \lambda_2 = ... = \lambda_t = 0 $$

    Therefore, since β1 = ... = βs = 0 and λ1 = ... = λt = 0, the linear combination becomes

    $$ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + \beta_1 \vec{u}_1 + ...+ \beta_s \vec{u}_s = - \lambda_1 \vec{w}_1 - ... - \lambda_t \vec{w}_t $$

    $$ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + 0 \cdot \vec{u}_1 + ...+ 0 \cdot \vec{u}_s = - 0 \cdot \vec{w}_1 - ... - 0 \cdot \vec{w}_t $$

    $$ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r = \vec{0} $$

    Under the initial assumption, the vectors v1, v2, ..., vr form a basis of U⋂W. Thus, they are linearly independent.

    This means that all the coefficients α1, ..., αr are zero, because the only way to obtain the null linear combination α1v1+...+αrvr=0 is the trivial solution.

    $$ \alpha_1 = \alpha_2 = ... = \alpha_r = 0 $$

    In conclusion, I have shown that the coefficients α1, ..., αr, β1, ..., βs, and λ1, ..., λt are all zero.

    In other words, the only way for the following linear combination to equal the null vector is the trivial solution

    $$ \alpha_1 \vec{v}_1 + ...+ \alpha_r \vec{v}_r + \beta_1 \vec{u}_1 + ...+ \beta_s \vec{u}_s + \lambda_1 \vec{w}_1 + ... + \lambda_t \vec{w}_t = \vec{0} $$

    This proves that the vectors v1, ..., vr, u1, ..., us, w1, ..., wt are linearly independent

    Therefore, the generating system G(U+W) is also a basis of U+W

    $$ G_{U+W} = B_{U+W} = \{ \vec{v}_1 , \vec{v}_2 , ... , \vec{v}_r , \vec{u}_1 , \vec{u}_2 , ... , \vec{u}_s , \vec{w}_1 , \vec{w}_2 , ... , \vec{w}_t \} $$

    The dimension of U+W is equal to the number of vectors in this basis

    $$ \dim(U+W) = r+s+t $$

    Now I know the dimensions of all the subspaces involved

    At this point, I can verify whether Grassmann's formula is correct

    $$ \dim(U+W) = \dim(U) + \dim(W) - \dim(U ∩ W) $$

    I substitute the respective dimensions

    $$ \underbrace{ \dim(U+W) }_{r+s+t} = \underbrace{\dim(U)}_{r+s} + \underbrace{ \dim(W) }_{r+t} - \underbrace{ \dim(U ∩ W) }_{r} $$

    $$ r+s+t = (r+s)+(r+t)-r $$

    $$ r+s+t = r+s+r+t-r $$

    $$ r+s+t = r+s+t $$

    The equality of both sides of the equation demonstrates that Grassmann's formula is correct.
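    For instance (an illustrative example, not part of the proof), take U as the xy-plane and W as the yz-plane in R^3: their intersection U⋂W is the y-axis, so r = 1, s = 1, t = 1, and indeed

    $$ \underbrace{ \dim(U+W) }_{3} = \underbrace{ \dim(U) }_{2} + \underbrace{ \dim(W) }_{2} - \underbrace{ \dim(U \cap W) }_{1} $$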

    The Case of Direct Sum Subspaces

    The dimension of the sum of two subspaces coincides with the sum of their dimensions only if the subspaces A and B are in direct sum.

    $$ \dim_K(A \oplus B) = \dim_K(A)+\dim_K(B) $$

    In the case of a direct sum, the intersection of the two subspaces is the zero subspace.

    $$ A \cap B = \{ \vec{0} \} $$

    thus

    $$ \dim_K(A+B) = \dim_K(A)+\dim_K(B)-\dim_K(A \cap B) $$

    $$ \dim_K(A+B) = \dim_K(A)+\dim_K(B)-\underbrace{ \dim_K(\{ \vec{0} \}) }_{0} $$

    $$ \dim_K(A+B) = \dim_K(A)+\dim_K(B) $$

    So, in the direct-sum case, the formula reduces to the simple sum of the dimensions.
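    As a final numerical illustration (subspaces again chosen purely as an example), two complementary coordinate subspaces of R^4 intersect only in the zero vector, so their ranks simply add:

    ```python
    import numpy as np

    # A = span{e1, e2} and B = span{e3, e4} in R^4, so A ⋂ B = {0}.
    A = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]])
    B = np.array([[0, 0, 1, 0],
                  [0, 0, 0, 1]])

    dim_sum = np.linalg.matrix_rank(np.vstack([A, B]))
    print(dim_sum == np.linalg.matrix_rank(A) + np.linalg.matrix_rank(B))
    # True: dim(A ⊕ B) = 2 + 2 = 4
    ```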


    Please feel free to point out any errors or typos, or share suggestions to improve these notes. English isn't my first language, so if you notice any mistakes, let me know, and I'll be sure to fix them.
