The Four Fundamental Subspaces


Introduction

Every Linear Transformation $T$ has four fundamental subspaces associated with it, namely

  • Column Space
  • Null Space
  • Row Space
  • Left Null Space

Let $A$ denote the matrix of the Linear Transformation $T$.

  • Column Space of $A$, $\mathbf{CS}(A)$ - the vector space generated by all linear combinations of the columns of $A$.
  • Null Space of $A$, $\mathbf{N}(A)$ - the vector space consisting of all solutions $x$ to the equation $Ax = 0$.
  • Row Space of $A$, $\mathbf{RS}(A)$ - the vector space generated by all linear combinations of the columns of $A^{T}$.
  • Left Null Space of $A$, $\mathbf{N}(A^T)$ - the vector space consisting of all solutions $x$ to the equation $A^{T}x=0$.


Motivation

Every Linear Transformation $T: V \rightarrow W$ is associated with four fundamental subspaces. Examining the nature of these subspaces (such as their dimension) can often provide better clarity about the nature of the linear transformation. For example, consider a nonzero linear transformation from $\mathbb{R}^2 \rightarrow \mathbb{R}$. We say that the dimension of the Null Space, in this case, is $1$, because an entire line of vectors (spanned by a single basis vector) gets transformed to the $0$ vector.

Consider a linear system of equations denoted by $Ax=b$ (Video 1). Essentially, this system of equations has a solution only if the vector $b$ (made up of the constants of the equations) lies in the column space of $A$ (Video 2). These subspaces also play a critical role in understanding the Rank-Nullity Theorem and the fundamental theorems of linear algebra, since those theorems are stated in terms of the subspaces.


Bird's Eye View

The four subspaces are generally expressed in terms of the matrix $A$ of a linear transformation, say $T: V \rightarrow W$, and its transpose $A^T$. The column space is defined as the span of the columns of the matrix $A$. Similarly, the row space is defined as the span of the columns of $A^{T}$. The null space is defined as the set of all vectors $x$ which are solutions to the equation $Ax=0$. This formulation is quite intuitive, as the null space consists of all vectors in the vector space $V$ that get transformed into the null vector. Likewise, the left null space is the set of solutions to the equation $A^{T}x=0$.

It is important to note that $\mathbf{CS}(A)$ and the left null space are subspaces of the image space $W$, whereas the row space and null space are subspaces of the original vector space $V$.

In the following notes, we shall examine certain properties of each of these subspaces, both visually and through the matrix representation of the linear transformation.


Context of the Definitions

Let us consider a system of linear equations -
$x_{1}+x_{2}+x_{3} =b_{1}$
$x_{1}+2x_{2}+x_{3} =b_{2}$
$x_{1}+x_{2}+3x_{3} =b_{3}$
Representing this system in the form $Ax=b$:
$\left( \begin{array}{c c c} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 3 \end{array}\right) \left[ \begin{array} {c} x_{1} \\ x_{2} \\ x_{3} \end{array}\right] = \left[ \begin{array}{c} b_{1} \\ b_{2} \\ b_{3} \end{array}\right]$        (See Video 1)
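As a quick illustration, this system can also be set up and solved in code. The following is a minimal sketch using Python's SymPy library (the right-hand side $b = (3, 4, 5)^{T}$ is an arbitrary choice for demonstration, not part of the original example):

```python
from sympy import Matrix, linsolve, symbols

x1, x2, x3 = symbols("x1 x2 x3")

# Coefficient matrix of the system above.
A = Matrix([[1, 1, 1],
            [1, 2, 1],
            [1, 1, 3]])

# An arbitrary right-hand side chosen for illustration.
b = Matrix([3, 4, 5])

# Solve Ax = b for (x1, x2, x3).
print(linsolve((A, b), x1, x2, x3))   # {(1, 1, 1)}
```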


Video 1: Writing a system of linear equations in the form $Ax=b$

Column Space

$CS(A)$ is the set of all linear combinations of the columns of $A$ (see Video 2).

If $A = \left( \begin{array}{c c c} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 3 \end{array}\right) \Rightarrow CS(A) = x_{1}\left( \begin{array}{c} 1 \\ 1 \\ 1 \end{array}\right) + x_{2} \left( \begin{array}{c} 1 \\ 2 \\ 1 \end{array}\right) + x_{3} \left( \begin{array}{c} 1  \\ 1 \\  3 \end{array}\right) = \left( \begin{array}{c c c} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 3 \end{array}\right) \left[ \begin{array} {c} x_{1} \\ x_{2} \\ x_{3} \end{array}\right]=Ax$

Therefore, we can say that the set of products $Ax$ for $x\in V$ is exactly $CS(A)$. The columns of the matrix $A$ associated with a linear transformation $T: V\rightarrow W$ are the transformed standard basis vectors of $V$. Hence, $CS(A)$ is the set of linear combinations of the transformed standard basis of $V$. Therefore it is also called the range or image of $A$ (as $CS(A) \subseteq W$). (See Video 2)


Video 2: The Column Space is the same as the image of the Linear Transformation

The equation $Ax = b$ has a solution if and only if $b\in CS(A)$. Refer to Video 3.
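This membership test can be carried out mechanically: $b\in CS(A)$ exactly when appending $b$ as an extra column does not increase the rank of $A$. A small sketch, again assuming SymPy is available:

```python
from sympy import Matrix

def in_column_space(A, b):
    # b lies in CS(A) iff the augmented matrix [A | b] has the same rank as A.
    return A.rank() == A.row_join(b).rank()

A = Matrix([[1, -1],
            [1, -1]])

print(in_column_space(A, Matrix([1, 1])))  # True:  (1, 1) spans CS(A)
print(in_column_space(A, Matrix([1, 0])))  # False: Ax = (1, 0)^T has no solution
```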


Video 3: Existence of a solution w.r.t. the vector $b$

Finding the Basis of Column Space

For a matrix $A$, let $R$ be the RREF of $A$. From $R$, we can identify the pivot columns (columns corresponding to pivot variables, or basic variables). A basis of the column space of $A$ is given by the columns of $A$ corresponding to the pivot columns of $R$.
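SymPy's rref() returns both the RREF and the indices of the pivot columns, so this recipe translates directly into code. A sketch, using the rank-deficient matrix that appears later in these notes:

```python
from sympy import Matrix

A = Matrix([[1, -1],
            [1, -1]])

R, pivot_cols = A.rref()                # RREF and pivot column indices
basis = [A.col(j) for j in pivot_cols]  # take the columns of A, not of R
print(pivot_cols, basis)                # (0,) [Matrix([[1], [1]])]
```

SymPy's built-in A.columnspace() returns the same basis directly.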


Null Space

The null space $N(A)$, or kernel of the matrix $A$ ($\ker(A)$), associated with a linear transformation $T: V\rightarrow W$ is the set of all vectors $x$ for which $Ax=0$, i.e. it is the vector subspace of $V$ that is mapped to the zero vector by the linear transformation (see Video 4).


Video 4: Null Space (Visually)

Finding the Basis of Null Space

Let us assume that the matrix $A$ is:

$A= \left( \begin{array}{c c} 1 & -1 \\ 1 & -1 \end{array} \right)$

$R$ = RREF of $A= \left( \begin{array}{c c} 1 & -1 \\ 0 & 0 \end{array} \right)$

$Ax = 0$ and $Rx=0$ have the same solutions. Let $x$ be $\left( \begin{array}{c} x_{1} \\ x_{2} \end{array} \right)$

$\left( \begin{array}{c c} 1 & -1 \\ 0 & 0 \end{array} \right) \left( \begin{array}{c} x_{1} \\ x_{2}\end{array} \right)=0$

Using $R$, we can see that $x_{1}$ is a basic variable whereas $x_{2}$ is a free variable.

$\Rightarrow x_{1} = x_{2}$

$\Rightarrow x = \left( \begin{array}{c} x_{1} \\ x_{2} \end{array} \right) = \left( \begin{array}{c} x_{2}\\x_{2}\end{array} \right) = x_{2} \left( \begin{array}{c} 1 \\ 1 \end{array} \right)$.

To find the basis, set one of the free variables to $1$ and the other free variables to zero, then repeat this process, one by one, for all the free variables. Each time you do this, you obtain one basis vector of $\ker(A)$.

$\Rightarrow$ Basis of ker(A) $= \left( \begin{array}{c} 1 \\ 1 \end{array} \right)$
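We can cross-check this hand computation with SymPy's built-in nullspace() method (a sketch, assuming SymPy is installed):

```python
from sympy import Matrix

A = Matrix([[1, -1],
            [1, -1]])

# One basis vector per free variable, matching the hand computation.
print(A.nullspace())   # [Matrix([[1], [1]])]
```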

The method illustrated above is a general method to find the basis of the null space. You can verify this from Video 4, where the same matrix of the linear transformation was used. Let us look at another example.

Let us assume that matrix $A$ is:
$A= \left( \begin{array}{c c c c c c} 1 & 1 & 1 & 2 & 3 & 2 \\ 1 & 3 & 2 & 2 & 7 & 4 \\ 0 & 1 &1 & 0 & 2 & 2 \end{array} \right)$

$R$ = RREF of $A= \left( \begin{array}{c c c c c c} 1 & 0 & 0 & 2 & 1 & 0 \\ 0 & 1 & 0 & 0 & 2 & 0 \\ 0 & 0 & 1 & 0 & 0 & 2 \end{array} \right)$

$Ax = 0$ and $Rx=0$ have the same solutions. Let $x$ be $\left( \begin{array}{c} x_{1} \\ x_{2} \\ x_{3} \\ x_{4} \\ x_{5} \\ x_{6} \end{array} \right)$

$\left( \begin{array}{c c c c c c} 1 & 0 & 0 & 2 & 1 & 0 \\ 0 & 1 & 0 & 0 & 2 & 0 \\ 0 & 0 & 1 & 0 & 0 & 2 \end{array} \right) \left( \begin{array}{c} x_{1} \\ x_{2} \\ x_{3} \\ x_{4} \\ x_{5} \\ x_{6} \end{array} \right)=0$

Using $R$, we can see that $x_{1}, x_{2}$ and $x_{3}$ are basic variables whereas $x_{4}, x_{5}$ and $x_{6}$ are free variables.

$\Rightarrow x_{1}+2x_{4}+x_{5} = 0$, $x_{2}+2x_{5} = 0$ and $x_{3}+2x_{6} = 0$

$\Rightarrow x = \left( \begin{array}{c} x_{1} \\ x_{2} \\ x_{3} \\ x_{4} \\ x_{5} \\ x_{6} \end{array} \right) = \left( \begin{array}{c} -2x_{4}-x_{5} \\ -2x_{5} \\ -2x_{6} \\ x_{4} \\ x_{5} \\ x_{6} \end{array} \right) = x_{4} \left( \begin{array}{c} -2 \\ 0 \\ 0 \\ 1 \\ 0 \\ 0 \end{array} \right) + x_{5} \left( \begin{array}{c} -1\\ -2 \\ 0 \\ 0 \\ 1 \\ 0 \end{array} \right) + x_{6} \left( \begin{array}{c} 0 \\ 0 \\ -2 \\ 0 \\ 0 \\ 1 \end{array} \right)$.

As before, set each free variable to $1$ in turn, keeping the other free variables at zero; each choice yields one basis vector of $\ker(A)$.

$\Rightarrow$ Basis of ker(A) $= \left( \begin{array}{c} -2 \\ 0 \\ 0 \\ 1 \\ 0 \\ 0 \end{array} \right),  \left( \begin{array}{c} -1\\ -2 \\ 0 \\ 0 \\ 1 \\ 0 \end{array} \right),  \left( \begin{array}{c} 0 \\ 0 \\ -2 \\ 0 \\ 0 \\ 1 \end{array} \right)$
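The same cross-check works for this larger example; rref() exposes the pivot columns (basic variables) and nullspace() returns one basis vector per free variable:

```python
from sympy import Matrix

A = Matrix([[1, 1, 1, 2, 3, 2],
            [1, 3, 2, 2, 7, 4],
            [0, 1, 1, 0, 2, 2]])

R, pivot_cols = A.rref()
print(pivot_cols)           # (0, 1, 2): x1, x2, x3 are the basic variables

for v in A.nullspace():     # one basis vector each for x4, x5, x6
    print(v.T)              # [-2, 0, 0, 1, 0, 0], [-1, -2, 0, 0, 1, 0], [0, 0, -2, 0, 0, 1]
```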


Row Space of a Matrix $A$

As its name suggests (by analogy with the column space), it is the set of all linear combinations of the rows of a matrix. But the formal definition of the row space given above appears, at first glance, to be quite different.

Definition 1: The Row Space of a matrix is the set of all linear combinations of the rows of that matrix.
Definition 2: It is the vector space generated by all linear combinations of the columns of $A^{T}$.

Let us see how these two definitions are the same (see Video 5).

Hence, definitions 1 and 2 are equivalent. The only difference between the two definitions is -

  • In definition 1, vectors are represented horizontally.
  • In definition 2, vectors are represented vertically.

But we usually consider definition 2 as the standard definition, as we normally represent vectors as column vectors.


Video 5: Definition 1 is equivalent to Definition 2

Finding the Basis of Row Space

Let $R$ be the RREF of the matrix $A$ and let $r$ be its rank. Then a basis of the row space of $A$ is given by the first $r$ rows of $R$.
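In code, this amounts to taking the first $r$ rows of the RREF (a sketch, assuming SymPy):

```python
from sympy import Matrix

A = Matrix([[1, 1, 1, 2, 3, 2],
            [1, 3, 2, 2, 7, 4],
            [0, 1, 1, 0, 2, 2]])

R, _ = A.rref()
r = A.rank()
row_basis = [R.row(i) for i in range(r)]   # the first r (nonzero) rows of R
print(row_basis)
# SymPy's built-in A.rowspace() returns a basis spanning the same space.
```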


Here, we introduce the term orthogonal complements. Two subspaces are orthogonal complements of each other when every vector of one is orthogonal to every vector of the other, i.e. the inner product (dot product) of any vector from one with any vector from the other is $0$, and together they span the whole space. When we talk about the perpendicularity of two vectors, we say that the inner product of the two vectors has to be zero (refer to the lecture notes on Inner Product Spaces for details). It is recommended that, when you read about orthogonal complements, you come back to this part.

Now, consider a matrix $A$ corresponding to a Linear Transformation. The Row Space of $A$ will be the orthogonal complement of the Null Space of $A$. Let us consider the linear transformation represented by the matrix $A = \left[\begin{array}{c c} 1 & -1 \\ 1 & -1 \end{array}\right]$. In Video 4, the transformation shown is the same transformation represented by this matrix. From that video we can conclude that the basis of the null space of $A$ is $\left[\begin{array}{c} 1 \\ 1 \end{array}\right]$, or you may calculate it using the method described above. If you calculate the basis of the row space using the method given above, it comes out to be $\left[\begin{array}{c} 1 \\ -1 \end{array}\right]$. Now, if you take the dot product (inner product) of the two basis vectors, it comes out to be zero, i.e. $(1,1)\cdot(1,-1) = (1)(1)+(1)(-1) = 0$. This verifies our statement that the Row Space of $A$ is the orthogonal complement of the Null Space of $A$. Refer to Video 6 for a visual explanation (a numerical check also follows the list below). Key things to notice in Video 6 are -

  • All vectors belonging to the Null Space of $A$ transform to the zero vector.
  • All vectors of the Row Space of $A$ transform into the Column Space of $A$.
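As a quick numerical check of this orthogonality, every row-space basis vector can be tested against every null-space basis vector (a sketch, assuming SymPy):

```python
from sympy import Matrix

A = Matrix([[1, -1],
            [1, -1]])

# Every basis vector of RS(A) should be orthogonal to every basis vector of N(A).
for u in A.rowspace():
    for v in A.nullspace():
        print(u.dot(v))   # 0
```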


Video 6: Relation between Row Space and Null Space

You may wonder what happens to vectors which belong neither to the row space nor to the null space but do belong to $\mathbb{R}^2$, e.g. $\left[\begin{array}{c} 0 \\ 1\end{array}\right]$. Since the row space and null space together span $\mathbb{R}^2$, we can write any such vector as a linear combination of the basis of the row space and the basis of the null space, e.g.

$\left[\begin{array}{c} 0 \\ 1\end{array}\right] = \frac{1}{2}\left[\begin{array}{c} -1 \\ 1\end{array}\right]+\frac{1}{2}\left[\begin{array}{c} 1 \\ 1\end{array}\right]$

Now we can find the vector to which $\left[\begin{array}{c} 0 \\ 1\end{array}\right]$ is transformed using the properties of linear transformation - 

$T(\left[\begin{array}{c} 0 \\ 1\end{array}\right]) = \frac{1}{2}T(\left[\begin{array}{c} -1 \\ 1\end{array}\right])+\frac{1}{2}T(\left[\begin{array}{c} 1 \\ 1\end{array}\right])$

$T(\left[\begin{array}{c} 0 \\ 1\end{array}\right]) = \frac{1}{2}T(\left[\begin{array}{c} -1 \\ 1\end{array}\right])+\frac{1}{2}\cdot 0$

$T(\left[\begin{array}{c} 0 \\ 1\end{array}\right]) = \frac{1}{2}T(\left[\begin{array}{c} -1 \\ 1\end{array}\right])$

We know $\left[\begin{array}{c} -1 \\ 1\end{array}\right]\in \mathbf{RS}(A)$ and $T(\left[\begin{array}{c} -1 \\ 1\end{array}\right]) \in \mathbf{CS}(A)$. And we saw in Video 6 how the vectors of the row space get transformed into $\mathbf{CS}(A)$. A point to note here is that $\frac{1}{2}\left[\begin{array}{c} -1 \\ 1\end{array}\right]$ and $\left[\begin{array}{c} 0 \\ 1\end{array}\right]$ map to the same vector under this particular linear transformation.
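This decomposition can also be verified numerically; the split of $[0, 1]^{T}$ into its row-space and null-space components below is taken directly from the equations above (a sketch, assuming SymPy):

```python
from sympy import Matrix, Rational

A = Matrix([[1, -1],
            [1, -1]])

v         = Matrix([0, 1])
row_part  = Rational(1, 2) * Matrix([-1, 1])   # component in RS(A)
null_part = Rational(1, 2) * Matrix([1, 1])    # component in N(A)

assert v == row_part + null_part
# Only the row-space component survives the transformation:
print(A * v)         # Matrix([[-1], [-1]])
print(A * row_part)  # Matrix([[-1], [-1]]) -- the same image
```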


Left Null Space

By definition, it is clear that the left null space of $A$ is the null space of $A^{T}$. But why is it called the left null space? See Figure 1.


Fig. 1 Naming of the left null space

Taking the transpose of the equation $A^{T}x = 0$ gives $x^{T}A = 0^{T}$, in which the vector $x^{T}$ multiplies $A$ from the left. With $A$ as the standard matrix of reference, this is why the null space of $A^{T}$ is named the left null space of the matrix $A$.

We know that the row space and null space are orthogonal complements. Similarly, the column space and left null space are also orthogonal complements (Video 7). Do refer to Pause and Ponder 3.


Video 7: Left Null Space (Visually)

Finding the Basis of Left Null Space

To find the basis of the left null space of a matrix $A$, treat it as the null space of the matrix $A^{T}$. We can then follow the method for finding the basis of the null space, applied to the transpose $A^{T}$.
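In SymPy this is a one-line computation: take the null space of the transpose. Using the same matrix as in the null space example (a sketch):

```python
from sympy import Matrix

A = Matrix([[1, -1],
            [1, -1]])

left_null_basis = A.T.nullspace()
print(left_null_basis)   # [Matrix([[-1], [1]])]

# Consistent with CS(A) and N(A^T) being orthogonal complements:
print(left_null_basis[0].dot(Matrix([1, 1])))   # 0, since (1, 1) spans CS(A)
```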


Dimension of the Four Fundamental Subspaces

The column and row spaces of an $m\times n$ matrix $A$ (where $A$ represents the matrix of a linear transformation) both have dimension $r$. Here, $r$ is the rank of the matrix (this is elaborated in the lecture notes on the Rank-Nullity Theorem). The null space has dimension $n-r$ and the left null space has dimension $m-r$.
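These dimension formulas are easy to check on the $3\times 6$ matrix used in the null space example (a sketch, assuming SymPy):

```python
from sympy import Matrix

A = Matrix([[1, 1, 1, 2, 3, 2],
            [1, 3, 2, 2, 7, 4],
            [0, 1, 1, 0, 2, 2]])
m, n = A.shape            # m = 3, n = 6
r = A.rank()              # r = 3

assert len(A.columnspace()) == r        # dim CS(A)  = r
assert len(A.rowspace())    == r        # dim RS(A)  = r
assert len(A.nullspace())   == n - r    # dim N(A)   = n - r
assert len(A.T.nullspace()) == m - r    # dim N(A^T) = m - r
```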


Applications

These four fundamental subspaces have several applications in linear algebra, including finding and analyzing the rank of a matrix. The important Rank-Nullity Theorem is one of the consequences of studying the Four Fundamental Subspaces, and they also appear in the fundamental theorem of linear algebra. You may refer to point 3 of the Further Reading section, where Professor Gilbert Strang discusses graphs, networks, incidence matrices, and how the four fundamental subspaces can be used there.


Pause And Ponder

  1. Try to prove that all four fundamental subspaces are vector spaces.
  2. Refer to the method of finding the basis of the row space. Think about why this method works.
  3. Knowing that the row space and null space are orthogonal complements, try to come up with a proof that the column space and left null space are orthogonal complements.


History

The earliest examples of vector spaces are found in the work of H. Grassmann, from the 1840s onwards. In 1844 he wrote a treatise, the Ausdehnungslehre (meaning 'theory of extension'), which contained the first seeds of the idea of a vector space. In 1862 Grassmann produced a reworked edition. The book discussed linear independence, vector subspaces and dimension. But it was only in the twentieth century that the importance of vector spaces and vector subspaces was recognized. As the four fundamental subspaces are subspaces, the work on them must have started after this point.


References

  • Gilbert Strang. 18.06 Linear Algebra. Spring 2010. Massachusetts Institute of Technology: MIT OpenCourseWare, https://ocw.mit.edu. License: Creative Commons BY-NC-SA.
  • Barile, Margherita. "Column Space." From MathWorld--A Wolfram Web Resource, created by Eric W. Weisstein. https://mathworld.wolfram.com/ColumnSpace.html
  • Weisstein, Eric W. "Null Space." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/NullSpace.html
  • https://ocw.mit.edu/courses/mathematics/18-06sc-linear-algebra-fall-2011/ax-b-and-the-four-subspaces/column-space-and-nullspace/MIT18_06SCF11_Ses1.6sum.pdf
  • https://web.mit.edu/18.06/www/Essays/newpaper_ver3.pdf
  • Fundamental Subspaces. Brilliant.org. Retrieved 04:59, May 28, 2020, from https://brilliant.org/wiki/fundamental-subspaces/
  • Jacqueline Stedall. Mathematics Emerging: A Source Book 1540–1900.
  • https://www.math.tamu.edu/~yvorobet/MATH304-504/Lect2-12web.pdf


Further Reading

  1. https://www.khanacademy.org/math/linear-algebra/vectors-and-spaces/null-column-space/v/showing-relation-between-basis-cols-and-pivot-cols
  2. https://www.khanacademy.org/math/linear-algebra/vectors-and-spaces/null-column-space/v/showing-that-the-candidate-basis-does-span-c-a
  3. https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/lecture-12-graphs-networks-incidence-matrices/



Contributor:
Mentor & Editor:
Verified by:
Approved On:

These notes and their corresponding animations were created by the above-mentioned contributor and are freely available under a CC BY-SA licence. The source code for the said animations is available on GitHub and is licensed under the MIT licence.




The work on this website is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA).