Basis of a Vector Space and its Subspace


Definition

Let $V$ be a vector space over a field $F$ and let $W$ be a non-empty subset of $V$.
We say that $W$ is a Subspace of $V$ if $W$ is itself a vector space over $F$ under the operations of $V$. Equivalently, $W$ is a subspace of $V$ if whenever $w_1, w_2 \in W$ and $c_1, c_2 \in F$, it follows that $c_1w_1 + c_2w_2 \in W$.
A subset $B$ of $V$ is called a Basis for $V$ if $B$ is linearly independent over $F$ and every element in $V$ is a linear combination of elements of $B$.


Motivation

The subspace in linear algebra is analogous to the idea of a subset: a set all of whose elements are contained in another set.
Many functions cannot be defined everywhere, so we restrict their domains. For example, $f(x) = 1/x$ is not defined everywhere in $\mathbb{R}$, specifically at $x = 0$, so we restrict its domain to the non-zero real numbers, which is a subset of $\mathbb{R}$.
A vector space is also a set, whose elements are vectors, and since it is a set we can take subsets of it; such a subset may be a subspace. But we cannot choose just any subset: a subset forms a subspace if and only if it is closed under vector addition and scalar multiplication. Therefore, a subspace can be defined as a subset of a vector space on which we can still perform the linear algebraic operations of addition and scalar multiplication.
 

        Consider the vector space $V = \mathbb{R}^2$ and take any vector in it, say $(2, 5)$. We know that this vector can be expressed as
$(2, 5) = 2(1, 0) + 5(0, 1)$.
Likewise, we can choose any vector in $\mathbb{R}^2$ and express it as a linear combination of $(1, 0)$ and $(0, 1)$.
In general, $(a, b) = a(1, 0) + b(0, 1)$, i.e., we can obtain any possible vector in the space from the two vectors $(1, 0)$ and $(0, 1)$. These two vectors are sufficient to generate any other vector in the space, and hence they generate the whole space. A set of vectors that is just enough to generate the whole vector space is called a 'Basis' of the vector space. In this case, a basis of $\mathbb{R}^2$ is $\{ (1, 0), (0, 1) \}$.
Similarly, every vector space has a basis with which it can be fully generated.
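The decomposition above can be checked numerically. The following is a minimal sketch using NumPy (an assumption; the notes themselves use no software):

```python
import numpy as np

# Standard basis of R^2
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Any vector (a, b) is recovered as a*e1 + b*e2
a, b = 2.0, 5.0
v = a * e1 + b * e2
assert np.allclose(v, [2.0, 5.0])
```

The same two lines of arithmetic work for any choice of $a$ and $b$, which is exactly what it means for $\{(1,0), (0,1)\}$ to generate the whole space.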
 


Figure 1: Basis of a 2-D Vector Space

Bird's Eye View

Consider a vector space $V$ such that $X, Y$ are its subspaces.
The intersection of the two subspaces, $X$ $\cap$ $Y$ is a subset of $V$.
$X \cap Y$ = $\{u$ | $u \in X$ and $u \in Y\}$
Since $X$ and $Y$ are subspaces, the zero vector belongs to both $X$ and $Y$, so $\vec{0} \in X \cap Y$.

Let $u, v \in X \cap Y$,
$\implies$ $u \in X$ and $u \in Y$
Similarly, $v \in X$ and $v \in Y$
Since $X$ is a subspace and the vectors $u$ and $v$ are both in $X$, their sum $u + v$ is also in $X$.
Similarly, since $Y$ is a subspace and the vectors $u$ and $v$ are both in $Y$, their sum $u + v$ is also in $Y$.
$\implies$ $u + v \in X \cap Y$

Now suppose $c \in F$ and $u \in X \cap Y$. Since $X$ and $Y$ are subspaces, they are closed under scalar multiplication
$\implies$ $cu \in X$ and $cu \in Y$
$\implies$ $cu \in X \cap Y$

The subset $X \cap Y$ contains the zero vector and is closed under addition and scalar multiplication.

Hence, the intersection of the two subspaces is a subspace.

Moreover, if $X_1, X_2, \ldots, X_n$ are subspaces of a vector space $V$, then the intersection of all the subspaces, $X_1 \cap X_2 \cap \ldots \cap X_n$, is also a subspace.
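The argument can be verified on a concrete pair of subspaces of $\mathbb{R}^3$, say the planes $x = 0$ and $z = 0$ (an illustrative choice, not from the original; sketched here with NumPy):

```python
import numpy as np

# Two subspaces of R^3 (illustrative choice):
# X = the plane x = 0,  Y = the plane z = 0.
def in_X(v): return np.isclose(v[0], 0.0)
def in_Y(v): return np.isclose(v[2], 0.0)
def in_XY(v): return in_X(v) and in_Y(v)   # membership in X ∩ Y

u = np.array([0.0, 3.0, 0.0])    # on the y-axis, hence in X ∩ Y
w = np.array([0.0, -1.0, 0.0])

assert in_XY(np.zeros(3))        # zero vector lies in the intersection
assert in_XY(u + w)              # closed under addition
assert in_XY(4.2 * u)            # closed under scalar multiplication
```

Here the intersection is the y-axis, which is itself a subspace, matching the conclusion of the proof.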
 


Figure 2: Intersection of two subspaces is a subspace.

Context of the Definition


Subspace

The subspace is a subset of a vector space that itself is a vector space.
Consider a vector space $V$ over the field $F$ and let $W$ $\subseteq$ $V$, then for the subset $W$ to be a subspace, there are some rules to be obeyed which are as follows:

       (a)  Non-emptiness
             The subset $W$ must contain the zero vector $\vec{0}$.

       (b)  Closed under addition
              If  $u, v \in$ $W$, then $u + v \in$ $W$.

        (c)  Closed under scalar multiplication
              If $v \in W$ and $c \in F$, then $cv$ $\in W$.

Let's look at some examples to get an intuitive idea of what a subspace implies.

1)  Let $\mathbb{R}^2$ be our vector space and $W \subset \mathbb{R}^2$,
     $W$ = { $(x, y) \in \mathbb{R}^2$ | $ax + by = 0$ where $a, b \in \mathbb{R}$ and are fixed}
     Now we need to show that $W$ is a subspace.
     Let's interpret it geometrically!
         (a)  If $a = 0 = b$, then $W$ = $\mathbb{R}^2$ itself, since every ordered pair $(x, y) \in \mathbb{R}^2$ satisfies the equation.
         (b)  If $a = 0$ and $b \neq 0$, then $W$ is the X-axis; similarly, if $b = 0$ and $a \neq 0$, then $W$ is the Y-axis.
         (c)  If $a \neq 0$ and $b \neq 0$, then $W$ is a line through the origin other than the coordinate axes.
    Therefore, in every case $(0, 0) \in W$
     $\implies$ $W$ is non-empty.
   Let $(x_1, y_1), (x_2, y_2) \in W$, so $ax_1 + by_1 = 0$ and $ax_2 + by_2 = 0$.
   Adding, $(ax_1 + by_1) + (ax_2 + by_2) = 0$
   $\implies a(x_1 + x_2) + b(y_1 + y_2) = 0$
   $\implies (x_1 + x_2, y_1 + y_2) \in W$, so $W$ is closed under addition.
   Similarly, for $c \in \mathbb{R}$, $a(cx_1) + b(cy_1) = c(ax_1 + by_1) = 0$, so $(cx_1, cy_1) \in W$ and $W$ is closed under scalar multiplication.
   Hence, $W$ is a subspace of $\mathbb{R}^2$.
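The closure properties of such a line can be spot-checked numerically. A minimal sketch in NumPy, with the coefficients $a = 3$, $b = -2$ chosen purely for illustration:

```python
import numpy as np

a, b = 3.0, -2.0                         # fixed coefficients (illustrative choice)
def in_W(p): return np.isclose(a * p[0] + b * p[1], 0.0)

p1 = np.array([2.0, 3.0])                # 3*2 + (-2)*3 = 0, so p1 is in W
p2 = np.array([-4.0, -6.0])              # also on the line

assert in_W(p1) and in_W(p2)
assert in_W(p1 + p2)                     # closed under addition
assert in_W(7.5 * p1)                    # closed under scalar multiplication
assert not in_W(np.array([1.0, 1.0]))    # 3 - 2 = 1 != 0, so (1,1) is not in W
```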

What if ax + by $\neq$ 0?

    In this case, $a(0) + b(0) \neq 0$
    $\implies$ $(0, 0) \notin$ $W$
    Hence, the set of straight lines will not be a subspace if this is the case.


Figure 3: Straight lines as subspaces

2)  Similarly, $W = \{ (x, y, z) \in \mathbb{R}^3$ | $ax + by + cz = 0$, where $a, b, c \in \mathbb{R}$ are fixed$\}$ is a subspace of $\mathbb{R}^3$: it is the plane through the origin with normal vector $(a, b, c)$.


Figure 4: The planes passing through the origin are the subspaces of $\mathbb{R}^3$ but the line is not a subspace.

Note: The set $\mathbb{R}^n$ is a subspace of itself, since it contains the zero vector and is closed under addition and scalar multiplication.

3)  Consider an ordinary differential equation $\frac{d^2y}{dx^2} + ay$ = 0, $a \in \mathbb{R}$.
      Let $W$ be a set of all the solutions of the above differential equation.
      Suppose $y_1$ and $y_2 \in$ $W$,

      $\implies \frac{d^2y_1}{dx^2} + ay_1$ = 0 and $\frac{d^2y_2}{dx^2} + ay_2$ = 0

      Consider $\frac{d^2(y_1 + y_2)}{dx^2} + a (y_1 + y_2)$

                   = $\frac{d^2y_1}{dx^2} + ay_1 + \frac{d^2y_2}{dx^2} + ay_2$

                   = 0 + 0 = 0
     
     $\implies y_1 + y_2 \in$ $W$
      
     Now let $\lambda \in \mathbb{R}$ and $y \in W$, and consider $\frac{d^2(\lambda y)}{dx^2} + a (\lambda y)$

                            = $\lambda \frac{d^2y}{dx^2} + \lambda ay$

                            = $\lambda \left[\frac{d^2y}{dx^2} + ay\right] = \lambda \cdot 0 = 0$

    $\implies \lambda y \in W$

    $W$ is closed under addition and scalar multiplication.
    Hence, it is a subspace of the vector space of all twice-differentiable real-valued functions.
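This closure can also be checked numerically. The sketch below (an addition, not from the notes) picks $a = 1$, for which $\sin x$ and $\cos x$ are solutions, and tests the equation on a grid using a finite-difference second derivative:

```python
import numpy as np

a = 1.0                          # y'' + a*y = 0 has solutions sin(x), cos(x)
x = np.linspace(0, 2 * np.pi, 2001)
h = x[1] - x[0]

def residual(y):
    """Finite-difference estimate of y'' + a*y at the interior grid points."""
    ypp = (y[2:] - 2 * y[1:-1] + y[:-2]) / h**2
    return ypp + a * y[1:-1]

y1, y2 = np.sin(x), np.cos(x)
assert np.max(np.abs(residual(y1))) < 1e-4
assert np.max(np.abs(residual(y2))) < 1e-4
# The solution set is closed under addition and scalar multiplication:
assert np.max(np.abs(residual(y1 + y2))) < 1e-4
assert np.max(np.abs(residual(3.0 * y1))) < 1e-4
```

The residual is not exactly zero only because of the finite-difference approximation; the tolerance `1e-4` comfortably covers the $O(h^2)$ discretization error.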

 

Is the unit circle a Subspace?

Suppose $V = \mathbb{R}^2$ is our vector space over the field $\mathbb{R}$, and let $W \subset V$, where the subset $W = \{ (x, y) \in \mathbb{R}^2$ | $x^2 + y^2 = 1 \}$ is the unit circle in the
2-dimensional plane.

  • (0, 1) $\in W$ but $c(0,1) = (0, c) \notin W$ for $c \neq \pm 1$.
    Since $0^2 + c^2 = c^2 \neq 1$
    $\implies W$ is not closed under scalar multiplication.

  • (0, 1), (1, 0) $\in W$ but (0,1) + (1, 0) = (1, 1) $\notin W$, since $1^2 + 1^2 = 2 \neq 1$
    $\implies W$ is not closed under vector addition.

Hence, it is not a subspace.
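Both failures are easy to confirm numerically. A short sketch in NumPy (an addition for illustration):

```python
import numpy as np

def on_circle(p): return np.isclose(p[0]**2 + p[1]**2, 1.0)

p, q = np.array([0.0, 1.0]), np.array([1.0, 0.0])
assert on_circle(p) and on_circle(q)

assert not on_circle(p + q)        # (1,1) has squared norm 2: not closed under +
assert not on_circle(2.0 * p)      # (0,2): not closed under scalars
assert not on_circle(np.zeros(2))  # the zero vector is not even on the circle
```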


Figure 5: A unit circle

Basis

A non-empty subset $B$ of a vector space $V$ over the field $F$ is said to be a basis if $B$ is linearly independent over $F$ and spans the vector space $V$.
If either of the conditions fails to satisfy, then it will not be a basis.

What does the span of a set of vectors mean?

The set of all vectors that are linear combinations of the vectors in the set $S = \{ v_1, v_2, \ldots, v_n\}$ is called the span of $S$.
When one vector is a linear combination of the others, the vectors are said to be Linearly Dependent.

A set of vectors $\{ v_1, v_2, \ldots, v_n\}$ is said to be linearly dependent if there are scalars $c_1, c_2, \ldots, c_n$, not all zero, such that
$c_1v_1 + c_2v_2 + \ldots + c_nv_n = 0$


What does linear independence mean and what is the condition for the basis to be linearly independent?

Let $v_1, v_2 \in V$. These vectors are called linearly independent when $v_1 \neq cv_2$ for all $c \in F$.
OR
If $v_1$ does not lie in the span of $v_2$, then the two vectors are linearly independent. More generally, a set of vectors is linearly independent if no vector in it lies in the span of the others.


For example, the set $K$ = { (1, 0), (0, 1) } is a basis for $\mathbb{R}^2$, since every element of $\mathbb{R}^2$ can be written as a linear combination of the elements of $K$. The elements of $K$ are linearly independent, as neither can be written as a multiple of the other.
In a similar way, { (1, 0, 0), (0, 1, 0), (0, 0, 1) } is a basis of $\mathbb{R}^3$.
In general, { $e_1$ = (1, 0, $\ldots$, 0), $e_2$ = (0, 1, $\ldots$, 0), $\ldots$, $e_n$ = (0, 0, $\ldots$, 1) } is the standard basis for $\mathbb{R}^n$.
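For $\mathbb{R}^n$, "linearly independent and spanning" is equivalent to the matrix of the candidate vectors being square and full-rank, so a basis check can be automated. A sketch using NumPy's rank computation (an addition for illustration):

```python
import numpy as np

def is_basis(vectors):
    """Vectors form a basis of R^n iff stacking them as rows gives a
    full-rank n x n matrix (right count + linear independence)."""
    A = np.array(vectors, dtype=float)
    n = A.shape[1]
    return A.shape[0] == n and np.linalg.matrix_rank(A) == n

assert is_basis([[1, 0], [0, 1]])        # standard basis of R^2
assert is_basis([[1, 1], [1, -1]])       # a non-standard basis
assert not is_basis([[1, 2], [2, 4]])    # dependent: (2,4) = 2*(1,2)
```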


Figure 6: Linear Dependence and Independence of the vectors

Remark:

1)  Suppose $B$ is a basis of the vector space $V$ and $B \subsetneq T \subseteq V$.
     Then $T$ cannot be a basis of the vector space $V$.
     Because $B \subsetneq T$ $\implies \exists$ $v \in T$ such that $v \notin B$.
     But every element of $V$ is a linear combination of the elements of $B$, so in particular $v$ is.
     So, $B$ $\cup$ {$v$} is linearly dependent
     $\implies T$ is linearly dependent, and hence not a basis.

2) Suppose $Q \subsetneq B$, where $B$ is a basis of $V$. Then $\exists$ $w \in B$ such that $w \notin Q$.
    We cannot write $w$ as a linear combination of the elements of $Q$: if we could, $Q \cup$ {$w$} would be linearly
    dependent, which in turn would make $B$ linearly dependent and hence not a basis. So $Q$ does not span $V$.
    Hence, "No proper subset of a basis can be a basis".

3) By the above two points, we can conclude that
    "A basis is the minimum information required to generate the whole space".


 

Existence of Basis

Every finite-dimensional vector space admits a basis. This is because every spanning set contains a Basis.
Let $V$ be a vector space over a field $F$, and suppose the vectors $v_1, v_2, \ldots, v_n$ span $V$.
We will remove some vectors from $v_1, v_2, \ldots, v_n$ and observe that the set of remaining vectors is a Basis of $V$.
Begin with the set $B = \{v_1, v_2, \ldots, v_n\}$.
Step 1: If $v_1 = 0$, remove it from the set $B$; if $v_1 \neq 0$, leave $B$ unchanged.
Step $r$: If $v_r$ is in the span of $\{v_1, v_2, \ldots, v_{r-1}\}$, remove $v_r$; otherwise leave it unchanged.
Repeat the process up to Step $n$ and then stop. You are now left with a modified set $B$.

OBSERVE:

The new set $B$ still spans the vector space $V$, because the original set spanned $V$ and we removed only vectors that were already in the span of the preceding vectors.
The process ensures that no vector in $B$ lies in the span of the previous ones.
Therefore, $B$ is linearly independent and hence a Basis.
This also shows that the spanning set always contains a Basis.
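The step-by-step removal above can be sketched in code, detecting "$v_r$ is in the span of the kept vectors" by whether adding $v_r$ increases the matrix rank (a NumPy sketch; the rank test is an implementation choice, not from the notes):

```python
import numpy as np

def extract_basis(vectors, tol=1e-10):
    """Keep v_r only if it is NOT in the span of the vectors kept so far,
    detected by a rank increase; this mirrors Steps 1..n above."""
    kept = []
    for v in vectors:
        candidate = kept + [v]
        if np.linalg.matrix_rank(np.array(candidate, dtype=float), tol=tol) > len(kept):
            kept.append(v)
    return kept

# A redundant spanning set of R^3 shrinks to a basis:
spanning = [[1, 0, 0], [2, 0, 0], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
B = extract_basis(spanning)
assert B == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

Note that the result spans the same space as the input and is linearly independent by construction, which is exactly the argument in the OBSERVE paragraph.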


Dimension

The number of elements in a basis of a vector space $V$ over a field $F$ is called the Dimension of $V$ and is denoted by $n$ (or $\dim V$).
 For completeness, the trivial space $\{0\}$ is said to be spanned by the empty set and to have dimension zero.
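Since the dimension of a span equals the number of independent generators, it can be computed as a matrix rank. A short NumPy sketch (an illustrative addition):

```python
import numpy as np

# The dimension of a span is the rank of the matrix whose rows
# are the generating vectors.
S = np.array([[1, 0, 0],
              [0, 1, 0],
              [1, 1, 0]])            # third row = sum of the first two
assert np.linalg.matrix_rank(S) == 2          # span(S) is a plane: dim 2
assert np.linalg.matrix_rank(np.eye(3)) == 3  # dim R^3 = 3
```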

 

Example Problem:  Let $M$ be the set of all matrices of the form $\begin{bmatrix} u & u + v \\ u + v & v \end{bmatrix}$ with $u, v \in \mathbb{R}$, which is a vector space over $\mathbb{R}$. Show that the set $B = \left\{ \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix}\right\}$ is a basis for $M$.

Solution:  Suppose there are real numbers $u$ and $v$ such that 

$u \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} + v \begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$ 

$\implies \begin{bmatrix} u & u + v \\ u + v & v \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$

so that $u = v = 0$, i.e., $B$ is linearly independent.

Also, every element of M is of the form $\begin{bmatrix} u & u + v \\ u + v & v \end{bmatrix} = u \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} + v \begin{bmatrix} 0 & 1 \\ 1 & 1 \end{bmatrix}$ 

Therefore, we can say that $B$ is a basis for $M$.
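The coordinates $u, v$ of any element of $M$ can be read straight off its diagonal, which makes the basis property easy to verify numerically (a NumPy sketch for illustration):

```python
import numpy as np

B1 = np.array([[1.0, 1.0], [1.0, 0.0]])
B2 = np.array([[0.0, 1.0], [1.0, 1.0]])

for u, v in [(2.0, -3.0), (0.5, 4.0)]:
    M = u * B1 + v * B2
    # Spanning: every u*B1 + v*B2 has the required form ...
    assert np.allclose(M, [[u, u + v], [u + v, v]])
    # ... and independence: the coordinates u, v are recoverable
    # from the (0,0) and (1,1) entries, so M = 0 forces u = v = 0.
    assert np.isclose(M[0, 0], u) and np.isclose(M[1, 1], v)
```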


Applications
 

Subspace Clustering in DNA Sequence Analysis

Subspace clustering of genes, applied to orthologous genes, plays a significant role in building evolutionary classifications, such as predicting functional proteins in a newly sequenced species.
The underlying idea is that the genetic changes between the protein-coding nucleotide sequences of the selected species may lie in a union of subspaces, one per cluster of orthologous groups.
To estimate the subspace dimensions, a small population sample is recorded and a series of experiments is conducted to cluster randomly selected sequences. The main hypothesis is consistent with the clustering results, and a random-mutation binary tree is then used to reproduce events that exhibit the interdependence of mutation rate and subspace rank over time.
This mutation model has been shown to be highly compatible with the singular values recorded in subspace clustering.
Therefore, subspace clustering can be of great help in orthology exploration.



Pause and Ponder

1)  Does a combination of a line and a plane in $\mathbb{R}^3$ form a subspace?

2)  What happens if we take the union of two subspaces? Will it also be a subspace?

3)  What will you call the empty set of vectors $\emptyset$: dependent or independent?

4)  Think about whether the concept of linear independence or dependence can be applied to individual vectors or only to sets of vectors.


References

[1]  Gallian, J. (2012). Contemporary abstract algebra. Nelson Education.

[2]  Stover, Christopher and Weisstein, Eric W. "Euclidean Space." From MathWorld--A Wolfram Web Resource.

[3]  Wallace, T., Sekmen, A., & Wang, X. (2015). Application of subspace clustering in DNA sequence analysis. Journal of Computational Biology, 22(10), 940-952.

[4]   Margalit, D., & Rabinoff, J. (2018). Interactive Linear Algebra. Georgia Institute of Technology.

[5]  https://link.springer.com/content/pdf/10.1007%2F978-3-319-11080-6_2.pdf

[6]  http://homepage.math.uiowa.edu/~roseman/m33/m33_chap_2_sec_8.pdf

[7]  http://math.mit.edu/~trasched/18.700.f11/lect5-article.pdf

[8]  https://www.math.csi.cuny.edu/~ikofman/review_exam2_F17.pdf


Further Reading

[1]  http://mathonline.wikidot.com/the-intersection-and-union-of-subspaces

[2]  https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4589114/

[3]  http://math.mit.edu/~trasched/18.700.f11/lect5-article.pdf



Contributor:
Mentor & Editor:
Verified by:
Approved On:

The following notes and their corresponding animations were created by the above-mentioned contributor and are freely available under the CC BY-SA licence. The source code for the said animations is available on GitHub and is licensed under the MIT licence.




The work on this website is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA).