====== Lecture 02 ======
{{youtube>6VbbYXpBIqA?medium}}
===== Content =====
Chapter 1: Mathematical Background: Linear Algebra
* Vector Spaces
* Linear Transformations and Matrices
* Properties of Matrices
Chapter 2:
Keywords
===== Annotations / Comments / Remarks =====
==== Special Euclidean Group SE(3) ====
==== Eigenvalue Problem ====
A linear transformation scales the eigenvector $\mathbf{v}$; the scaling factor $\lambda$ is the eigenvalue.
The set of eigenvalues is the spectrum:
$\sigma(A) = \{ \lambda_1 \ldots \lambda_n \} $
$A\mathbf{v} = \lambda \mathbf{v}$
$(A-\lambda I)\mathbf{v} = \mathbf{0}$
**Question:**
If $P$ is invertible and $B = P^{-1}AP \Rightarrow \sigma(A) = \sigma(B)$. Why?
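A quick numerical check in Octave (a minimal sketch; the matrices $A$ and $P$ below are arbitrary example values):
A = [2 1; 0 3]
P = [1 1; 0 2]             % any invertible matrix (example)
B = inv(P)*A*P             % similar matrix B = P^{-1}*A*P
eig(A)                     % spectrum of A: {2, 3}
eig(B)                     % same spectrum as A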
==== Symmetric Matrices ====
A real symmetric matrix has only real eigenvalues.
$S^T = S$
Positive semidefinite: $x^T S x \ge 0$ for all $x$
Positive definite: $x^T S x > 0$ for all $x \neq 0$
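A small Octave sketch (the matrix $S$ and vector $x$ are example values) checking definiteness via the eigenvalues of a symmetric matrix: all eigenvalues $\ge 0$ means positive semidefinite, all $> 0$ means positive definite.
S = [2 1; 1 2]             % symmetric: S' == S
eig(S)                     % eigenvalues 1 and 3, all > 0 -> positive definite
x = [1 -1]'
x'*S*x                     % = 2 > 0 for this particular x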
==== Vector Spaces, Keywords ====
Vector space $V$ over the field $\mathbb{R}$; $(V, +)$ is a commutative group.
A subspace is a subset closed under addition and scalar multiplication, e.g. a plane or line through $\vec{0}$ is a subspace.
Span, linear dependency (or dependence), linear independence
Basis, $\mathbf{B} = \{\mathbf{b}_1, \ldots , \mathbf{b}_n\}$
$\mathbf{v} = \sum_{i = 1}^n \alpha_i \mathbf{b}_i$
**Inner product, dot product**
$\left<\cdot,\cdot\right>: V \times V \rightarrow \mathbb{R}$
Canonical and Induced Inner Product:
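As an Octave sketch (example vectors; the induced form assumes an injective linear map $A$): the canonical inner product is $\left<u,v\right> = u^T v$, and $A$ induces the inner product $\left<u,v\right>_A = u^T A^T A v$.
u = [1 2]'
v = [3 4]'
u'*v                       % canonical inner product <u,v>
A = [1 0; 1 2]             % example injective linear map
(A*u)'*(A*v)               % induced inner product <u,v>_A = u'*A'*A*v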
Kronecker Product (the Octave demo below verifies $u^T A v = (v \otimes u)^T \mathrm{vec}(A)$):
A = [10 20 30; 40 50 60]
As = reshape(A, prod(size(A)), 1)   % vec(A): stack the columns of A into one vector
u = [1 2]'
v = [2 3 4]'
K = kron(v,u)                       % Kronecker product v (x) u
disp("res1 = u'*A*v ")
res1 = u'*A*v
disp("res2 = kron(v,u)'*As ")
res2 = kron(v,u)'*As                % equals res1
**Linear Transformation**
disp("Kanonical Basis")
e1 = [1 0 0]'
e2 = [0 1 0]'
e3 = [0 0 1]'
E = [e1 e2 e3]
A = [1 2 3; 4 5 6]
disp("This defines a lin. transf. L(v) = A*v")
disp("The columns of A are the images of the basis verctors")
disp("Image of basis vector e1: b1 = L(e1)=A*e1")
b1 = A*e1
disp("Image of basis vector e2: b2 = L(e2)=A*e2")
b2 = A*e2
disp("Image of basis vector e3: b3 = L(e3)=A*e3")
b3 = A*e3
disp("v = 2*e1 + 3*e2 + 4*e3")
v = 2*e1 + 3*e2 + 4*e3
disp("w=L(v)=A*v = L(2*e1+3*e2+4*e3)")
w = A*v
disp("w=2*L(e1)+3*L(e2)+4*L(e3)")
ww = 2*b1 + 3*b2 + 4*b3
A Hilbert space is a vector space with an inner product (dot product) that is complete with respect to the metric induced by that inner product.
**Ring**
$\mathcal{M}(m,n)$ is the set of all $m \times n$ matrices; $\mathcal{M}(n)$ is the set of all square $n \times n$ matrices. With $V = \mathcal{M}(n)$, both operations map back into $V$:
$\cdot: V \times V \rightarrow V$
$+: V \times V \rightarrow V$
==== Group ====
Group requirements:
* $g_1\circ g_2 \in G$
* $e\circ g = g \circ e = g$
* $\exists g^{-1} \in G: g \circ g^{-1} = g^{-1} \circ g = e$
* $g_1 \circ g_2 \circ g_3 = (g_1 \circ g_2) \circ g_3 = g_1 \circ (g_2 \circ g_3)$
Example: Rotations form a group!
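As an illustration, a short Octave sketch (using planar rotations $R(\phi) \in SO(2)$ as an example) that checks the group requirements numerically:
R = @(phi) [cos(phi) -sin(phi); sin(phi) cos(phi)];   % 2D rotation by angle phi
R1 = R(0.3); R2 = R(0.5); R3 = R(1.1);
R1*R2 - R(0.3 + 0.5)        % closure: product is again a rotation (difference ~ 0)
R1*eye(2) - R1              % identity element e = I
R1*R1' - eye(2)             % inverse: R^{-1} = R', product gives the identity
(R1*R2)*R3 - R1*(R2*R3)     % associativity (difference ~ 0)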
Citation (DC):
A group G has a matrix representation or can be realized as a matrix group if there exists an **injective** transformation:
$R: G \rightarrow GL(n)$
Why not surjective, i.e. bijective in total? $R(\phi)$ and $R(n\cdot 2\pi+\phi)$ are mapped to the same element $M \in SO(n) \subset GL(n)$.
The set of invertible square matrices forms the General Linear Group $GL(n)$; it is closed with respect to multiplication.
$\det(M) \neq 0$ for $M \in GL(n)$ (the special subgroups, e.g. $SO(n)$, additionally require $\det(M) = 1$)
$R: G \rightarrow GL(n)$
$R(e) = I, \quad R(g\circ h) = R(g)R(h) \quad \forall g,h \in G$
"It preserves the group structure!" (DC)
$R$ is a group **homomorphism**.
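A minimal Octave sketch of this homomorphism property, assuming planar rotations parametrized by the angle $\phi$ (group operation: adding angles), so that $R$ maps into $GL(2)$:
R = @(phi) [cos(phi) -sin(phi); sin(phi) cos(phi)];   % representation R: phi -> GL(2)
R(0)                            % R(e) = I
R(0.4 + 0.7) - R(0.4)*R(0.7)    % R(g o h) = R(g)*R(h), difference ~ 0
R(0.4) - R(0.4 + 2*pi)          % phi and phi + 2*pi give the same matrix (cf. the remark above)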
=== Affine Group $A(n)$ ===
$L(\mathbf{x}) = A\mathbf{x} + \mathbf{b}$
In homogeneous coordinates, $L: \mathbb{R}^{n+1}\rightarrow \mathbb{R}^{n+1}$ with the matrix
$\left( \begin{array}{cc}
A & \mathbf{b} \\\\
0 & 1 \\\\
\end{array}\right) $
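A short Octave sketch (with example values for $A$, $\mathbf{b}$, and $\mathbf{x}$) of the affine map and its homogeneous $(n+1)\times(n+1)$ matrix form:
A = [0 -1; 1 0]             % example linear part (rotation by 90 degrees)
b = [2 3]'                  % example translation
T = [A b; 0 0 1]            % homogeneous matrix [A b; 0 1]
x = [1 1]'
A*x + b                     % affine map L(x) = A*x + b
T*[x; 1]                    % same result in homogeneous coordinates (last entry 1)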
===== Appendix: My MathJax Template for a Matrix =====
$\left( \begin{array}{rrrr}
1 & 0 & \cdots & 0 \\\\
0 & \ddots & 0 & \vdots \\\\
\vdots & 0 & \ddots & 0 \\\\
0 & \cdots & 0 & 1
\end{array}\right) $