SymPy Demo 2: General Vector Spaces#

Demo by Jakob Lemvig, Christian Mikkelstrup, Hans Henrik Hermansen, Karl Johan Måstrup Kristiansen, and Magnus Troen. Revised 24-10-24 by shsp.

from sympy import *
init_printing()

Some Vector Operations#

Consider the following vectors:

xs = symbols('x:7')
v1 = Matrix(xs[:3])
v2 = Matrix(xs[4:])
v1,v2
\[\begin{split}\displaystyle \left( \left[\begin{matrix}x_{0}\\x_{1}\\x_{2}\end{matrix}\right], \ \left[\begin{matrix}x_{4}\\x_{5}\\x_{6}\end{matrix}\right]\right)\end{split}\]

The common dot product and cross product are defined in SymPy as .dot() and .cross().

v1.dot(v2), v1.cross(v2)
\[\begin{split}\displaystyle \left( x_{0} x_{4} + x_{1} x_{5} + x_{2} x_{6}, \ \left[\begin{matrix}x_{1} x_{6} - x_{2} x_{5}\\- x_{0} x_{6} + x_{2} x_{4}\\x_{0} x_{5} - x_{1} x_{4}\end{matrix}\right]\right)\end{split}\]

Norms for vectors are defined by .norm(ord = None). By default SymPy uses the Euclidean 2-norm. For other types of norms and how to use them, please refer to the SymPy Matrix documentation.

v1.norm()
\[\displaystyle \sqrt{\left|{x_{0}}\right|^{2} + \left|{x_{1}}\right|^{2} + \left|{x_{2}}\right|^{2}}\]
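Other norms can be obtained by passing an ord argument. As a small illustration (our addition, based on the ord values described in the SymPy Matrix documentation), the 1-norm and the infinity-norm of the same vector are:

# 1-norm (sum of absolute values) and infinity-norm (largest absolute value)
v1.norm(1), v1.norm(oo)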

Be aware that it makes a big difference for the norm whether the symbols are defined as real or complex.
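To illustrate, we can build a vector of real symbols (the names ys and w below are ours, not part of the original demo) and compute its norm; with real symbols the absolute values disappear from the result:

# With real symbols the norm becomes sqrt(y0**2 + y1**2 + y2**2)
ys = symbols('y:3', real=True)
w = Matrix(ys)
w.norm()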

Thinning out a Span#

Consider the following vectors in \(\mathbb{C}^4\):

\[\begin{split} \mathbf v_1 = \left[\begin{matrix}1 + i\\3\\0\\7 i\end{matrix}\right], \ \mathbf v_2 = \left[\begin{matrix}2\\4 - i\\2 i\\8 - i\end{matrix}\right], \ \mathbf v_3 = \left[\begin{matrix}3 + i\\7 - i\\2 i\\8 + 6 i\end{matrix}\right], \ \mathbf v_4 = \left[\begin{matrix}3\\-1 - i\\7 i\\0\end{matrix}\right]. \end{split}\]
v1 = Matrix([1+I, 3, 0 , 7*I])
v2 = Matrix([2, 4-I, 2*I, 8-I])
v3 = Matrix([3+I, 7-I, 2*I, 8+6*I])
v4 = Matrix([3,-1-I, 7*I, 0])

These vectors span a vector space, \(\operatorname{span}(\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4)\). But can this span be expressed using fewer vectors? What we are asking for is the largest possible linearly independent subset of them. If the four vectors are merged as columns in a matrix:

V = Matrix.hstack(v1,v2,v3,v4)
V
\[\begin{split}\displaystyle \left[\begin{matrix}1 + i & 2 & 3 + i & 3\\3 & 4 - i & 7 - i & -1 - i\\0 & 2 i & 2 i & 7 i\\7 i & 8 - i & 8 + 6 i & 0\end{matrix}\right]\end{split}\]

then it is the column space of this matrix that we are asking for. SymPy can find that for us with the command .columnspace():

V.columnspace()
\[\begin{split}\displaystyle \left[ \left[\begin{matrix}1 + i\\3\\0\\7 i\end{matrix}\right], \ \left[\begin{matrix}2\\4 - i\\2 i\\8 - i\end{matrix}\right], \ \left[\begin{matrix}3\\-1 - i\\7 i\\0\end{matrix}\right]\right]\end{split}\]

From the above we see that

\[ \operatorname{span}(\mathbf v_1,\mathbf v_2,\mathbf v_3,\mathbf v_4) = \{c_1\mathbf v_1 + c_2\mathbf v_2 + c_4\mathbf v_4 \,|\, c_1,c_2,c_4 \in \mathbb{C} \} = \operatorname{span}(\mathbf v_1,\mathbf v_2,\mathbf v_4). \]
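As an optional sanity check (our addition, not part of the original demo), we can confirm that \(\mathbf v_3\) indeed lies in the span of the remaining vectors by solving for its coordinates with respect to \(\mathbf v_1,\mathbf v_2,\mathbf v_4\):

# A solution exists precisely if v3 is a linear combination of v1, v2 and v4
linsolve((Matrix.hstack(v1, v2, v4), v3))

The solution \((1, 1, 0)\) shows that \(\mathbf v_3 = \mathbf v_1 + \mathbf v_2\), so dropping \(\mathbf v_3\) does not change the span.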

Basis#

Consider the following matrices \(\mathbf A\), \(\mathbf B\), \(\mathbf C\), and \(\mathbf D\), which span a subspace of \(\mathbb R^{2\times 2}\):

\[\begin{split} \begin{gather*} \mathbf{A} = \begin{bmatrix} 1 & -6 \\ 2 & 0 \end{bmatrix},\quad \mathbf{B} = \begin{bmatrix} 1 & 2 \\ -2 & 0 \end{bmatrix} \\ \mathbf{C} = \begin{bmatrix} -2 & 2 \\ 1 & 0 \end{bmatrix},\quad \mathbf{D} = \begin{bmatrix} 3 & -4 \\ -1 & 0 \end{bmatrix}. \end{gather*} \end{split}\]
A = Matrix([[1,-6],[2,0]])
B = Matrix([[1,2],[-2,0]])
C = Matrix([[-2,2],[1,0]])
D = Matrix([[3,-4],[-1,0]])
A, B, C, D
\[\begin{split}\displaystyle \left( \left[\begin{matrix}1 & -6\\2 & 0\end{matrix}\right], \ \left[\begin{matrix}1 & 2\\-2 & 0\end{matrix}\right], \ \left[\begin{matrix}-2 & 2\\1 & 0\end{matrix}\right], \ \left[\begin{matrix}3 & -4\\-1 & 0\end{matrix}\right]\right)\end{split}\]

We want to find the dimension of and a basis for this subspace \(\mathrm{span}(\mathbf{A},\mathbf{B},\mathbf{C},\mathbf{D})\). The standard basis for \(\mathbb{R}^{2 \times 2}\) is

\[\begin{split} E_{\mathbb{R}^{2\times 2}} = \left( \left[\begin{matrix}1 & 0\\0 & 0\end{matrix}\right], \ \left[\begin{matrix}0 & 0\\1 & 0\end{matrix}\right], \ \left[\begin{matrix}0 & 1\\0 & 0\end{matrix}\right], \ \left[\begin{matrix}0 & 0\\0 & 1\end{matrix}\right]\right), \end{split}\]

and SymPy can create the vector representations of each matrix in this basis using the command .vec():

A.vec(),B.vec(),C.vec(),D.vec()
\[\begin{split}\displaystyle \left( \left[\begin{matrix}1\\2\\-6\\0\end{matrix}\right], \ \left[\begin{matrix}1\\-2\\2\\0\end{matrix}\right], \ \left[\begin{matrix}-2\\1\\2\\0\end{matrix}\right], \ \left[\begin{matrix}3\\-1\\-4\\0\end{matrix}\right]\right)\end{split}\]

So, the above output shows the coordinate vectors of the matrices with respect to the standard basis. These are merged as columns into a coordinate matrix:

V = Matrix.hstack(A.vec(),B.vec(),C.vec(),D.vec())
V
\[\begin{split}\displaystyle \left[\begin{matrix}1 & 1 & -2 & 3\\2 & -2 & 1 & -1\\-6 & 2 & 2 & -4\\0 & 0 & 0 & 0\end{matrix}\right]\end{split}\]

We now have a couple of ways for finding the dimension of and a basis for the subspace.

Method 1 - Investigate the Column Space of \(\mathbf V\)#

Use the .columnspace() command:

V.columnspace()
\[\begin{split}\displaystyle \left[ \left[\begin{matrix}1\\2\\-6\\0\end{matrix}\right], \ \left[\begin{matrix}1\\-2\\2\\0\end{matrix}\right]\right]\end{split}\]

The output contains two vectors, which we recognize as the coordinate vectors for the matrices \(\mathbf A\) and \(\mathbf B\). Hence, the subspace \(\mathrm{span}(\mathbf{A},\mathbf{B},\mathbf{C},\mathbf{D})\) has a dimension of \(2\) and is spanned by \(\mathbf A\) and \(\mathbf B\), meaning these two matrices constitute a basis for the subspace.
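As a quick cross-check (our addition, not part of the original demo), the dimension can also be read off as the number of basis vectors returned by .columnspace(), which equals the rank of \(\mathbf V\):

# Both values give the dimension of the subspace, namely 2
len(V.columnspace()), V.rank()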

Method 2 - Investigate the Reduced Row-Echelon Form#

Use the .rref() command:

V.rref()
\[\begin{split}\displaystyle \left( \left[\begin{matrix}1 & 0 & - \frac{3}{4} & \frac{5}{4}\\0 & 1 & - \frac{5}{4} & \frac{7}{4}\\0 & 0 & 0 & 0\\0 & 0 & 0 & 0\end{matrix}\right], \ \left( 0, \ 1\right)\right)\end{split}\]

Since the reduced row-echelon form reveals a rank of \(2\), the dimension of \(\mathrm{span}(\mathbf{A},\mathbf{B},\mathbf{C},\mathbf{D})\) is also \(2\). From the rref we can read that \(\mathbf A\) and \(\mathbf B\) are linearly independent, since their columns are the only ones containing pivots, and therefore they constitute a basis for \(\mathrm{span}(\mathbf{A},\mathbf{B},\mathbf{C},\mathbf{D})\) (see the course textbook for the relevant definition). Let us denote the basis consisting of \(\mathbf A\) and \(\mathbf B\) by \(\beta\). The coordinates of \(\mathbf C\) and \(\mathbf D\) with respect to \(\beta\) are found by solving the following systems of equations:

\[\begin{split} x_1 \cdot \mathbf{A} + x_2 \cdot \mathbf{B} = \mathbf{C}\\ x_3 \cdot \mathbf{A} + x_4 \cdot \mathbf{B} = \mathbf{D}. \end{split}\]

These are solved by setting up augmented matrices: the coefficient matrix consists of the coordinate vectors for \(\mathbf A\) and \(\mathbf B\) as columns, and the right-hand side is the coordinate vector for \(\mathbf C\) and for \(\mathbf D\), respectively. Note that the coordinate vector for \(\mathbf C\) is the third column of V, and that for \(\mathbf D\) is the fourth column.

# Coefficient matrix as left-hand side
V_12 = V[:,[0,1]] # This extracts the coordinate vectors for A and B

# Solving for C
linsolve((V_12, V.col(2)))
\[\displaystyle \left\{\left( - \frac{3}{4}, \ - \frac{5}{4}\right)\right\}\]
# Solving for D
linsolve((V_12, V.col(3)))
\[\displaystyle \left\{\left( \frac{5}{4}, \ \frac{7}{4}\right)\right\}\]

The coefficients are now known, and we have the linear combinations:

\[ \mathbf{C} = -\frac{3}{4} \cdot \mathbf{A} -\frac{5}{4} \cdot \mathbf{B}\quad\text{and}\quad \mathbf{D} = \frac{5}{4} \cdot \mathbf{A} + \frac{7}{4} \cdot \mathbf{B}, \]

and the coordinate vectors for matrices \(\mathbf C\) and \(\mathbf D\) in basis \(\beta\) are thus:

\[\begin{split} [\mathbf{C}]_\beta = \begin{bmatrix} -\frac 3 4 \\[3pt] -\frac 5 4 \end{bmatrix} \quad\text{and}\quad [\mathbf{D}]_\beta = \begin{bmatrix} \frac 5 4 \\[3pt] \frac 7 4 \end{bmatrix}. \end{split}\]

A check to be certain:

Eq(C, -Rational(3,4) * A - Rational(5,4) * B), \
Eq(D, Rational(5,4) * A + Rational(7,4) * B)
\[\displaystyle \left( \text{True}, \ \text{True}\right)\]

Difference between Real and Complex Vector Spaces#

Consider these three vectors in \(\mathbb{C}^3\):

\[\begin{split} \mathbf v_1 = \left[\begin{matrix}1 + i\\3\\0\end{matrix}\right], \ \mathbf v_2 = \left[\begin{matrix}2\\4 - i\\2 i\end{matrix}\right], \ \mathbf v_3 = \left[\begin{matrix}2 + 2 i\\7 + 2 i\\2 i\end{matrix}\right]. \end{split}\]
v1 = Matrix([1+I, 3, 0])
v2 = Matrix([2, 4-I, 2*I])
v3 = Matrix([2 + 2*I, 7 + 2*I, 2*I])
A = Matrix.hstack(v1,v2,v3)
A
\[\begin{split}\displaystyle \left[\begin{matrix}1 + i & 2 & 2 + 2 i\\3 & 4 - i & 7 + 2 i\\0 & 2 i & 2 i\end{matrix}\right]\end{split}\]

We want to determine whether the vectors \(\mathbf v_1,\mathbf v_2,\mathbf v_3\) constitute a basis for, respectively, the real vector space \(\mathbb C^3\) (so, over \(\mathbb R\)) and the complex vector space \(\mathbb C^3\) (so, over \(\mathbb C\)). Over \(\mathbb C\), the space \(\mathbb C^3\) is 3-dimensional, and three vectors constitute a basis for a vector space of dimension \(3\) precisely if they are linearly independent (see the relevant theorem in the course textbook). Over \(\mathbb R\), however, \(\mathbb C^3\) has dimension \(6\), so three vectors can never constitute a basis for all of it; there the question is whether they are linearly independent and hence a basis for the subspace they span.

As \(\mathbf A = [\mathbf v_1,\mathbf v_2,\mathbf v_3] \in \mathbb{C}^{3 \times 3}\) is a square matrix, we could use the determinant to investigate linear independence of \(\mathbf v_1,\mathbf v_2,\mathbf v_3\) (see the relevant theorem in the course textbook). This settles whether the vectors are linearly independent in, and thus whether they constitute a basis for, the vector space \(\mathbb C^3\) over \(\mathbb C\). However, it will not determine linear independence over \(\mathbb R\). For that we instead consider their linear combinations directly (see the relevant definition in the course textbook).
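For completeness (this check is our addition and not part of the original demo), the determinant of \(\mathbf A\) can be computed directly; a value of zero confirms that the columns are linearly dependent over \(\mathbb C\):

# A zero determinant means the columns of A are linearly dependent over C
simplify(A.det())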

Let \(c_1,c_2,c_3 \in \mathbb F\), where \(\mathbb F\) denotes either \(\mathbb R\) or \(\mathbb C\), depending on which of the two vector spaces we consider. The vectors are linearly independent if the equation

\[ c_1 \mathbf v_1 + c_2 \mathbf v_2 + c_3 \mathbf v_3 = \mathbf 0 \]

only holds true for \(c_1= c_2= c_3 = 0\). Written as a matrix-vector product,

\[\begin{split} \left[\begin{matrix}1 + i & 2 & 2 + 2 i\\3 & 4 - i & 7 + 2 i\\0 & 2 i & 2 i\end{matrix}\right]\left[\begin{matrix}c_{1}\\c_{2}\\c_{3}\end{matrix}\right] = \left[\begin{matrix}0\\0\\0\end{matrix}\right], \end{split}\]

we solve as follows:

cs = symbols('c1:4', real=True)
sol = linsolve((A, zeros(3,1)))
Eq(Matrix(cs), Matrix(list(sol)[0]), evaluate=False)
\[\begin{split}\displaystyle \left[\begin{matrix}c_{1}\\c_{2}\\c_{3}\end{matrix}\right] = \left[\begin{matrix}\tau_{0} \left(-1 - i\right)\\- \tau_{0}\\\tau_{0}\end{matrix}\right]\end{split}\]

We see that there are solutions other than the zero solution, so at first glance we might conclude that this means linear dependence, but we should not be too quick!

  • Indeed, \(\mathbf v_1,\mathbf v_2,\mathbf v_3\) are linearly dependent in the complex vector space \(\mathbb C^3\) (so, over \(\mathbb C\)), since choosing e.g. \(\tau_0 = 1\) gives non-zero \(c_1,c_2,c_3 \in \mathbb C\) that satisfy the equation.

  • But another look at the solution shows that it contains non-real values. In the vector space \(\mathbb C^3\) over \(\mathbb R\) we must have \(c_1,c_2,c_3 \in \mathbb R\). This is only fulfilled when \(\tau_0=0\), meaning that \(c_1=c_2=c_3 = 0\) is the only solution in this case. Hence \(\mathbf v_1,\mathbf v_2,\mathbf v_3\) are linearly independent in the vector space \(\mathbb C^3\) over \(\mathbb R\), and they constitute a basis for the 3-dimensional real subspace \(\operatorname{span}(\mathbf v_1,\mathbf v_2,\mathbf v_3)\) of \(\mathbb C^3\) (not for all of \(\mathbb C^3\) over \(\mathbb R\), which has dimension \(6\)).
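One way to verify the independence over \(\mathbb R\) in SymPy (a sketch of our own, not part of the original demo; the name A_real is ours) is to split \(\mathbf A\) into its real and imaginary parts, stack them into a real \(6 \times 3\) matrix, and check that its rank is \(3\):

# Stack real and imaginary parts; full column rank (3) means the columns
# of A are linearly independent over the reals
A_real = Matrix.vstack(A.applyfunc(re), A.applyfunc(im))
A_real.rank()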