f : X -> Y
g : Y -> X
if the inverse exists, g = f^{-1}
g(f(x)) = x for all x \in X
f(g(y)) = y for all y \in Y

For simplicity's sake, we will require bijectivity to define the
inverse, although weaker one-sided inverses can be defined in
degenerate (e.g. non-bijective) cases.

    Matrix Inverse
A := mxn matrix.

The map x |-> Ax, where x is an nx1 vector, lets A be considered as a
function from R^n to R^m.

Definition:
nxn matrix A is invertible iff there exists B nxn such that AB = BA =
I_n. A^{-1} := B.

Thm: If A, B are nxn and AB = I_n, then also BA = I_n (a one-sided
inverse of a square matrix is automatically two-sided).

A = [ a1 | a2 | ... an ]
B = [ b1 | b2 | ... bn ]

AB = [Ab1 | Ab2 | ... Abn ]

Let e_i = [ 0 0 ... 1 ... 0 ]^T where the 1 is in the ith position.
This gives the systems Ab_1 = e_1, Ab_2 = e_2, ..., Ab_n = e_n.
Each could be solved as a separate augmented system.
However, we can solve them all at once:

[A | e1 | e2 | e3 ... ] (*)
Two possibilities:
- n pivots (every column has pivot)
    Reduced echelon form is I_n
    Right matrix = B = A^{-1}
- <n pivots (implies at least one row of zeroes at the bottom)
    The right matrix is always invertible (it is I_n acted on by
    invertible elementary row operations), so its bottom row is not all
    zero; hence at least one of the systems in (*) has no solution, and
    A is not invertible.
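The row reduction of [ A | I_n ] described above can be sketched in a few lines (a hypothetical pure-Python sketch; the function name `invert` and the partial-pivoting detail are not from the notes):

```python
# Sketch of inverting A by row-reducing the augmented block [A | I_n].
# Assumes A is a square list-of-lists matrix; raises if A is singular.
def invert(A):
    n = len(A)
    # Build [A | I_n] as one n x 2n augmented matrix.
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: pick the row with the largest entry in this column.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[piv][col]) < 1e-12:
            raise ValueError("matrix is not invertible")
        M[col], M[piv] = M[piv], M[col]
        # Scale the pivot row so the pivot becomes 1.
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # Eliminate this column in every other row.
        for r in range(n):
            if r != col:
                c = M[r][col]
                M[r] = [x - c * y for x, y in zip(M[r], M[col])]
    # Left block is now I_n; right block is A^{-1}.
    return [row[n:] for row in M]

A = [[2.0, 1.0], [1.0, 1.0]]
print(invert(A))  # A^{-1} = [[1, -1], [-1, 2]]
```

The n pivots / fewer-than-n pivots dichotomy shows up here as the singularity check on the pivot entry.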

If we only use row operations A_j + cA_i -> A_j with j > i on
[ A | I_n ],
we get [ U | L^{-1} ].

U is invertible <=> all diagonal elements of U are non-zero
                <=> every column of U has a pivot
L is always invertible, so A = LU is invertible iff U is invertible.
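A tiny worked instance of the step above (a hypothetical sketch; the example matrix is not from the notes): forward elimination with j > i on [ A | I_2 ] leaves an upper-triangular left block U and L^{-1} on the right.

```python
# Forward elimination only (row_j += c * row_i with j > i) on [A | I_2],
# for the example A = [[2, 1], [4, 5]].
A_aug = [[2.0, 1.0, 1.0, 0.0],
         [4.0, 5.0, 0.0, 1.0]]
c = -A_aug[1][0] / A_aug[0][0]           # multiplier c = -2
A_aug[1] = [x + c * y for x, y in zip(A_aug[1], A_aug[0])]

U = [row[:2] for row in A_aug]           # [[2, 1], [0, 3]], upper triangular
Linv = [row[2:] for row in A_aug]        # [[1, 0], [-2, 1]] = L^{-1}
print(U, Linv)
```

Undoing the multiplier gives L = [[1, 0], [2, 1]], and indeed LU recovers A.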

    Transpose

A := mxn matrix.
A^T = B
B := nxm matrix where b_ji = a_ij

A : R^n -> R^m
B : R^m -> R^n (note: B is in general not the inverse of A)

If A is invertible, then A^T is invertible, and
(A^{-1})^T = (A^T)^{-1}
But why?
(1)
If A, B are invertible, then AB is invertible, and:
    (AB)^{-1} = B^{-1}A^{-1}
    (check: (B^{-1}A^{-1})(AB) = B^{-1}(A^{-1}A)B = B^{-1}B = I_n, and
    similarly on the other side)
(2)
(AB)^T = B^T A^T [can be proved by direct calculation on the entries]

Combining: A^T (A^{-1})^T = (A^{-1}A)^T = I_n^T = I_n by (2), so
(A^{-1})^T is indeed the inverse of A^T.
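Both identities can be spot-checked on small matrices (a hypothetical sketch using the closed-form 2x2 inverse; the helper names `mul`, `T`, `inv` and the example matrices are not from the notes):

```python
# 2x2 helpers: matrix multiply, transpose, and the closed-form inverse.
def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def T(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def inv(X):
    d = X[0][0] * X[1][1] - X[0][1] * X[1][0]   # determinant
    return [[X[1][1] / d, -X[0][1] / d],
            [-X[1][0] / d, X[0][0] / d]]

A = [[2.0, 1.0], [3.0, 2.0]]
B = [[1.0, 3.0], [0.0, 1.0]]

# (AB)^{-1} = B^{-1} A^{-1}: undo B first, then A.
assert inv(mul(A, B)) == mul(inv(B), inv(A))
# (A^T)^{-1} = (A^{-1})^T.
assert inv(T(A)) == T(inv(A))
```

Both example matrices have determinant 1, so the float arithmetic here is exact.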

Definition: nxn matrix A is symmetric if A = A^T

If A is symmetric and invertible, A = LU = LDL^T (Thm!)
Then D is invertible. If A is not invertible, then U is not invertible,
and D need not be invertible.
This is the LDL^T decomposition (a variant of the Cholesky
decomposition); it "keeps the symmetry" of A.
D is a diagonal (and therefore symmetric) matrix.
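A concrete symmetric instance (a hypothetical hand computation; the matrix [[4, 2], [2, 3]] is not from the notes): one elimination step gives the unit lower-triangular L, and D collects the pivots.

```python
# A = L D L^T for symmetric A = [[4, 2], [2, 3]]:
# the single elimination multiplier is 2/4 = 0.5.
L  = [[1.0, 0.0], [0.5, 1.0]]   # unit lower triangular
D  = [[4.0, 0.0], [0.0, 2.0]]   # pivots: 4, and 3 - 0.5*2 = 2
LT = [[1.0, 0.5], [0.0, 1.0]]   # L transposed

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Multiplying back recovers the original symmetric A.
assert mul(mul(L, D), LT) == [[4.0, 2.0], [2.0, 3.0]]
```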


Chapter 2
---------

A vector space is a collection V of objects called vectors, together
with a binary addition operator and an operator multiplying a vector by
a scalar (from R or C).

(u + v) + w = u + (v + w)
u + v = v + u
a(u + v) = au + av
(a + b)u = au + bu
a(bu) = (ab)u, and 1u = u
There exists a zero vector 0 with u + 0 = u, and every u has an
additive inverse -u.

V must be closed under + and scalar multiplication.

Ex: Let V = polynomials of degree <= 2.
Ex: Upper-triangular 2x2 matrices.
Ex: R^2.
Ex: A subspace of R^2.
Non-ex: A line in R^2 not containing the origin (it has no zero
vector).
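The closure requirement for the upper-triangular example can be spot-checked (a hypothetical sketch; the helpers `is_upper`, `add`, `scale` are not from the notes):

```python
# Upper-triangular 2x2 matrices: the (1,0) entry is zero.
def is_upper(M):
    return M[1][0] == 0

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def scale(a, X):
    return [[a * X[i][j] for j in range(2)] for i in range(2)]

U1 = [[1.0, 2.0], [0.0, 3.0]]
U2 = [[4.0, 0.0], [0.0, -1.0]]
assert is_upper(add(U1, U2))       # closed under +
assert is_upper(scale(5.0, U1))    # closed under scalar multiplication
# The non-example fails closure differently: scaling a point on a line
# not through the origin by 0 gives the zero vector, which is off the line.
```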