Diffstat (limited to 'li/03_inverse')
-rw-r--r--  li/03_inverse  94
1 files changed, 94 insertions, 0 deletions
diff --git a/li/03_inverse b/li/03_inverse
new file mode 100644
index 0000000..8ba7887
--- /dev/null
+++ b/li/03_inverse
@@ -0,0 +1,94 @@
+f : X -> Y
+g : Y -> X
+if an inverse exists, g = f^{-1}
+g(f(x)) = x for all x \in X
+f(g(y)) = y for all y \in Y
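A minimal executable sketch of the two conditions, with a hypothetical pair f(x) = 2x, g(y) = y/2 on R:

```python
# Hypothetical pair on the reals: f doubles, g halves, so g = f^{-1}.
def f(x):
    return 2 * x

def g(y):
    return y / 2

# g(f(x)) = x for all x, and f(g(y)) = y for all y (spot-checked here).
assert all(g(f(x)) == x for x in [-3, 0, 1.5, 7])
assert all(f(g(y)) == y for y in [-4, 0, 2.5, 10])
```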
+
+For simplicity's sake, we will require bijectivity to define the
+inverse, although one-sided inverses can be defined in the degenerate
+cases (e.g. a left inverse when f is only injective).
+
+ Matrix Inverse
+A := mxn matrix.
+
+Ax where x is an nx1 vector. A can be considered as a function from R^n to R^m.
+
+Definition:
+nxn matrix A is invertible iff there exists B nxn such that AB = BA =
+I_n. A^{-1} := B.
+
+Thm: If A, B are nxn matrices s.t. AB = I_n, then also BA = I_n.
+
+A = [ a1 | a2 | ... an ]
+B = [ b1 | b2 | ... bn ]
+
+AB = [Ab1 | Ab2 | ... Abn ]
+
+Let e_i = [ 0 0 ... 1 ... 0 ] where 1 is in the ith position.
+This gives systems Ab1 = e1, Ab2 = e2 ...
+Each could be solved as its own augmented matrix [A | e_i].
+However, we can solve them all at once by row-reducing
+
+[A | e1 | e2 | e3 ... ] (*)
+Two possibilities:
+- n pivots (every column has pivot)
+ Reduced echelon form is I_n
+ Right matrix = B = A^{-1}
+- <n pivots (implies at least one row of zeroes at the bottom)
+  The right matrix is the product of the elementary matrices used, so it
+  is always invertible; its bottom row is therefore non-zero, so at
+  least one of the systems in (*) has no solution, and A is not
+  invertible.
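The whole procedure can be sketched in code. This is a hypothetical pure-Python Gauss-Jordan pass over [A | I_n] (exact Fraction arithmetic, no round-off); it returns None when there are fewer than n pivots:

```python
from fractions import Fraction

def invert(A):
    """Row-reduce [A | I_n]; if the left block reaches I_n, the right block is A^{-1}."""
    n = len(A)
    # Build the augmented matrix [A | I_n] with exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        # Find a pivot in this column at or below the diagonal.
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return None  # fewer than n pivots: A is not invertible
        M[col], M[pivot] = M[pivot], M[col]
        # Scale the pivot row to 1, then clear the column above and below.
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                c = M[r][col]
                M[r] = [a - c * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2, 1], [5, 3]]
Ainv = invert(A)  # [[3, -1], [-5, 2]]
```

Row-reducing against I_n is exactly solving all n systems Ab_i = e_i at once.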
+
+If we only use row operations R_j + cR_i -> R_j where j > i (no swaps,
+no scaling) to solve
+[ A | I_n ],
+we get [ U | L^{-1} ]
+
+U is invertible <=> all diagonal elements of U are non-zero
+<=> every column of U has a pivot
+L is always invertible, so A = LU is invertible iff U is invertible.
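The restricted elimination can be sketched as follows (hypothetical helper; it assumes the pivots met along the way are non-zero, so no row swaps are needed):

```python
from fractions import Fraction

def forward_eliminate(A):
    """Apply only R_j <- R_j + c*R_i with j > i to [A | I_n]; return (U, L^{-1})."""
    n = len(A)
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if M[i][i] != 0:  # assumes non-zero pivots; no swaps performed
                c = -M[j][i] / M[i][i]
                M[j] = [a + c * b for a, b in zip(M[j], M[i])]
    U = [row[:n] for row in M]
    Linv = [row[n:] for row in M]
    return U, Linv

# For A = [[2, 1], [6, 4]]: U = [[2, 1], [0, 1]], L^{-1} = [[1, 0], [-3, 1]],
# i.e. L = [[1, 0], [3, 1]] and A = LU.
U, Linv = forward_eliminate([[2, 1], [6, 4]])
```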
+
+ Transpose
+
+A := mxn matrix.
+A^T = B
+B := nxm where b_ji = a_ij
+
+A : R^n -> R^m
+B : R^m -> R^n (swaps domain and codomain, but B need not be A's inverse)
+
+If A is invertible, then A^T is invertible, and
+(A^{-1})^T = (A^T)^{-1}
+But why?
+(1)
+If A, B are invertible, AB is invertible, and:
+ (AB)^{-1} = B^{-1}A^{-1}, since (AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1}
+ = AA^{-1} = I_n [with (2), this verifies the previous identity:
+ A^T (A^{-1})^T = (A^{-1}A)^T = I_n^T = I_n, so (A^T)^{-1} = (A^{-1})^T]
+(2)
+(AB)^T = B^T A^T [could be proved by brute calculation]
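Identities (1) and (2) can be checked numerically on a 2x2 example; the helpers below are hypothetical, with the inverses entered by hand:

```python
def matmul(A, B):
    # Row-by-column product; zip(*B) iterates over the columns of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[2, 1], [5, 3]]       # A^{-1} = [[3, -1], [-5, 2]]
B = [[1, 2], [0, 1]]       # B^{-1} = [[1, -2], [0, 1]]
Ainv = [[3, -1], [-5, 2]]
Binv = [[1, -2], [0, 1]]
I = [[1, 0], [0, 1]]

# (2): (AB)^T = B^T A^T
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
# (1): (AB)^{-1} = B^{-1} A^{-1}
assert matmul(matmul(A, B), matmul(Binv, Ainv)) == I
# and therefore (A^T)^{-1} = (A^{-1})^T
assert matmul(transpose(A), transpose(Ainv)) == I
```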
+
+Definition: nxn matrix A is symmetric if A = A^T
+
+If A is symmetric and invertible, A = LU = LDL^T, where D is a diagonal
+(and therefore symmetric) matrix (Thm!)
+Then D is invertible. If A is not invertible, then U is not invertible
+and D need not be invertible.
+This is the LDL^T (Cholesky-type) decomposition: it "keeps the
+symmetry," since LDL^T is symmetric whenever D is.
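A hypothetical LDL^T sketch for a symmetric matrix, assuming non-zero pivots (no pivoting):

```python
from fractions import Fraction

def ldlt(A):
    """Return (L, D) with A = L * diag(D) * L^T; assumes non-zero pivots."""
    n = len(A)
    A = [[Fraction(x) for x in row] for row in A]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    D = [Fraction(0)] * n
    for j in range(n):
        # Pivot: diagonal entry minus contributions from earlier columns.
        D[j] = A[j][j] - sum(L[j][k] ** 2 * D[k] for k in range(j))
        for i in range(j + 1, n):
            L[i][j] = (A[i][j] - sum(L[i][k] * L[j][k] * D[k]
                                     for k in range(j))) / D[j]
    return L, D

# For the symmetric A = [[4, 2], [2, 5]]:
# L = [[1, 0], [1/2, 1]], D = [4, 4], and L * diag(D) * L^T = A.
L, D = ldlt([[4, 2], [2, 5]])
```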
+
+
+Chapter 2
+---------
+
+A vector space is a collection V of objects called vectors, a binary
+addition operator, and an operator to multiply a vector by a scalar
+(taken from R or C).
+
+(u + v) + w = u + (v + w)
+a(u + v) = au + av
++ some more rules: commutativity of +, existence of negatives,
+  1u = u, (ab)u = a(bu).
+  (a+b)u = au + bu. Taking a = b = 0 shows 0u + 0u = 0u, i.e. 0u acts
+  as the zero vector.
+
+V must be closed under + and scalar multiplication.
+
+Ex: Let V = polynomials of degree <= 2.
+Ex: Upper-triangular 2x2 matrices
+Ex: R^2
+Ex: Subspace of R^2
+Not ex: Line in R^2 not containing origin.
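The non-example can be checked directly: on a hypothetical line y = x + 1 in R^2 (which misses the origin), closure under + already fails:

```python
# Two points on the line y = x + 1 in R^2 (a line not through the origin).
p = (0, 1)
q = (1, 2)
s = (p[0] + q[0], p[1] + q[1])  # vector sum: (1, 3)
# (1, 3) is not on the line, since 3 != 1 + 1: not closed under +.
assert s == (1, 3)
assert s[1] != s[0] + 1
# The zero vector (0, 0) is also missing from the line: 0 != 0 + 1.
```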