Diffstat (limited to 'li/hw3.tex')
-rw-r--r--  li/hw3.tex  378
1 files changed, 378 insertions, 0 deletions
diff --git a/li/hw3.tex b/li/hw3.tex
new file mode 100644
index 0000000..870daf0
--- /dev/null
+++ b/li/hw3.tex
@@ -0,0 +1,378 @@
+\def\bmatrix#1{\left[\matrix{#1}\right]}
+
+ {\noindent\bf Section 2.2}
+
+{\noindent\bf 12.}
+
+{\it (a)}
+
+This is correct. The number of nonzero rows of $R$ equals the number of
+linearly independent rows, the dimension of the row space, and the
+rank, because if any of the nonzero rows were linearly dependent,
+elimination would have reduced one of them to a zero row when forming
+$R.$
+
+{\it (b)}
+
+This is false. A zero matrix has rank zero, but this count can be
+nonzero for it when the matrix has more columns than rows.
+
+{\it (c)}
+
+This is true. All columns are either pivot columns or free columns, and
+the rank is the number of pivot columns.
+
+{\it (d)}
+
+No. The following matrix has four ones but rank one:
+$$\bmatrix{1&1&1&1}$$
+
+{\noindent\bf 26.}
+
+The maximum rank of a matrix is the smaller of its number of rows and
+its number of columns, because the number of pivots can't exceed either
+one. Therefore, $C$ and $A$ have at most rank 2, and $CA$ also has at
+most rank 2 (the column space of $C$ is a superset of the column space
+of $CA,$ which becomes obvious if the matrices are treated like
+functions: $(CA)x = C(Ax)$ always lands in the column space of $C$).
+$CA$ is a $3\times 3$ matrix, and $I_3$ has rank 3, so $CA \neq I.$
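+
+Concretely, the argument is the rank chain
+$${\rm rank}(CA) \leq {\rm rank}(C) \leq 2 < 3 = {\rm rank}(I_3).$$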
+
+$AC = I$ if
+$$A = \bmatrix{1 & 0 & 0\cr
+ 0 & 1 & 0}$$
+ and
+$$C = \bmatrix{1 & 0\cr
+ 0 & 1\cr
+ 0 & 0}$$
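+
+Multiplying these confirms the claim directly:
+$$AC = \bmatrix{1 & 0 & 0\cr 0 & 1 & 0}
+\bmatrix{1 & 0\cr 0 & 1\cr 0 & 0} = \bmatrix{1 & 0\cr 0 & 1} = I_2.$$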
+
+{\noindent\bf 42.}
+
+If $Ax = b$ has infinitely many solutions, then $Ay = 0$ also has
+infinitely many solutions: each solution $x$ gives $y = x - x_0$ in the
+null space, where $x_0$ is a particular solution to $Ax_0 = b.$ So if
+there is one particular solution $x_1$ to $Ax_1 = B,$ there must be
+infinitely many solutions $A(x_1+y) = B,$ where $y$ is in the null
+space of $A$ as noted earlier.
+
+However, $Ax = B$ could have zero solutions. The matrix
+$$A = \bmatrix{1&0\cr 0&0}$$
+does not include $b_0 = \bmatrix{0\cr 1}$ in its column space, so $Ax =
+b_0$ would have zero solutions even though $Ax = \bmatrix{1\cr 0}$ has
+an infinite number of solutions.
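+
+Explicitly, every vector of the form $x = \bmatrix{1\cr t}$ works:
+$$\bmatrix{1&0\cr 0&0}\bmatrix{1\cr t} = \bmatrix{1\cr 0}
+\qquad\hbox{for every $t$.}$$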
+
+\iffalse % practice problems
+
+{\noindent\bf 7.}
+
+$$R_3 = R_2 + R_1 \to c = 5 + 2 = 7.$$
+
+{\noindent\bf 9.}
+
+{\it (a)}
+
+$$\bmatrix{1&2&3&4\cr 0&0&1&2\cr 0&0&0&0}\bmatrix{x_1\cr x_2\cr x_3\cr
+x_4} = \bmatrix{0\cr 0\cr 0} \to x = \bmatrix{2\cr 0\cr -2\cr 1}x_4 +
+\bmatrix{-2\cr 1\cr 0\cr 0}x_2$$
+$$R = \bmatrix{1&2&0&-2\cr 0&0&1&2\cr 0&0&0&0}.$$
+$$Rx = 0 \to x = \bmatrix{2\cr 0\cr -2\cr 1}x_4 +
+\bmatrix{-2\cr 1\cr 0\cr 0}x_2.$$
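+
+Checking the first special solution against $U$:
+$$\bmatrix{1&2&3&4\cr 0&0&1&2\cr 0&0&0&0}\bmatrix{2\cr 0\cr -2\cr 1} =
+\bmatrix{2 - 6 + 4\cr -2 + 2\cr 0} = \bmatrix{0\cr 0\cr 0}.$$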
+
+{\it (b)}
+
+If the right-hand side is $(a, b, 0),$ the solution set will be the null
+space plus a particular solution. In the case of $U,$ setting the free
+variables to zero gives the particular solution $(a - 3b, 0, b, 0).$
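+
+Indeed,
+$$\bmatrix{1&2&3&4\cr 0&0&1&2\cr 0&0&0&0}\bmatrix{a - 3b\cr 0\cr b\cr 0}
+= \bmatrix{(a - 3b) + 3b\cr b\cr 0} = \bmatrix{a\cr b\cr 0}.$$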
+
+{\noindent\bf 10.}
+
+$$\bmatrix{0&1&-1\cr 1&0&-1}x = \bmatrix{1\cr -2}.$$
+$$\bmatrix{0&1&-1\cr 1&0&-1\cr 1&1&-2}x = \bmatrix{1\cr -2\cr 0}.$$
+
+{\noindent\bf 14.}
+
+$$R_A = \bmatrix{1&2&0\cr 0&0&1\cr 0&0&0}.$$
+$$R_B = \bmatrix{1&2&0&1&2&0\cr 0&0&1&0&0&1\cr 0&0&0&0&0&0}.$$
+$$R_C = \bmatrix{1&2&0&0&0&0\cr 0&0&1&0&0&0\cr 0&0&0&1&2&0\cr
+0&0&0&0&0&1\cr 0&0&0&0&0&0\cr 0&0&0&0&0&0}.$$
+(The zero rows of each block move to the bottom of $R_C.$)
+
+{\noindent\bf 21.}
+
+The rank $r$ equals both the number of pivot rows and the number of
+pivot columns, so the intersection of those rows and columns is an
+$r\times r$ matrix. Its rows are by construction linearly independent,
+so they form a basis for $R^r,$ which makes the submatrix invertible.
+
+{\noindent\bf 24.}
+
+The rank of a matrix is the same as the rank of its transpose, and a
+product's rank never exceeds the first factor's, so
+$${\rm rank}(AB) = {\rm rank}((AB)^T) = {\rm rank}(B^TA^T) \leq
+{\rm rank}(B^T) = {\rm rank}(B).$$
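+
+As a quick check, take $A = \bmatrix{1&0\cr 0&0}$ and $B = I_2$: then
+$AB = A$ and ${\rm rank}(AB) = 1 \leq 2 = {\rm rank}(B).$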
+
+{\noindent\bf 25.}
+
+
+
+{\noindent\bf 36.}
+
+{\it (a)}
+
+All vectors in $R^3$ are in the column space, so only the trivial
+combination of the rows of $A$ gives zero.
+
+{\it (b)}
+
+Only vectors where $b_3 = 2b_2$ are within the column space. This means
+the combination $-2({\rm row}_2) + {\rm row}_3$ of the rows of $A$ is
+zero, since $(0, -2, 1)$ is orthogonal to every such right-hand side.
+
+{\noindent\bf 40.}
+
+$x_5$ is a free variable, the zero vector isn't the only solution to
+$Ax=0,$ and if $Ax=b$ has a solution, then it has infinitely many.
+
+{\noindent\bf 43.}
+
+{\it (a)}
+
+$q=6$ gives a rank of 1 for $B,$ and $q=3$ gives a rank of 1 for the
+first matrix.
+
+{\it (b)}
+
+$q = 7$ gives a rank of 2 for both matrices.
+
+{\it (c)}
+
+A rank of 3 is impossible for both matrices.
+
+{\noindent\bf 45.}
+% idk come back to this.
+
+{\it (a)}
+
+$r < n.$
+
+{\it (b)}
+
+$r > m.$ $r\geq n.$ % ???
+
+{\it (c)}
+
+$r < n.$
+
+{\it (d)}
+
+{\noindent\bf 53.}
+
+{\it (a)}
+
+False. The zero matrix has $n$ free variables.
+
+{\it (b)}
+
+True. If the linear function corresponding to the matrix can be
+inverted, it must be injective, so its null space contains only the
+zero vector.
+
+{\noindent\bf 60.}
+
+$$\bmatrix{1&0&-2&-3\cr0&1&-2&-1}$$ has this nullspace.
+
+{\noindent\bf 61.}
+
+% simple enough to construct
+
+{\noindent\bf 62.}
+
+
+
+{\noindent\bf 63.}
+{\noindent\bf 64.}
+
+{\noindent\bf 65.}
+
+$$\bmatrix{0&0\cr 1&0}$$
+
+{\noindent\bf 66.}
+
+The dimension of the null space is $n-r = 3-r,$ and the dimension of
+the column space is $r.$ Since $3-r = r$ has no integer solution, the
+two spaces cannot have the same dimension and therefore cannot be
+equal.
+
+\fi
+
+ {\noindent\bf Section 2.3}
+
+{\noindent\bf 22.}
+
+{\it (a)}
+
+They might not span ${\bf R}^4$ if, for example, they are all the zero
+vector, but they could span it, say if the first four were the
+elementary vectors $e_1$ to $e_4.$
+
+{\it (b)}
+
+They are not linearly independent because at most four vectors in
+${\bf R}^4$ can be linearly independent.
+
+{\it (c)}
+
+Any four might be a basis for ${\bf R}^4,$ because they could be
+linearly independent, and four independent vectors in ${\bf R}^4$
+automatically span it.
+
+{\it (d)}
+
+$Ax = b$ might not have a solution. It could have one depending on the
+$b,$ but if $A$ is the zero matrix, $Ax = e_1$ has no solutions.
+
+{\noindent\bf 27.}
+
+The column space of $A$ has basis $\{(1, 0, 1)^T, (3, 1, 3)^T\},$ and
+the column space of $U$ has basis $\{(1, 0, 0)^T, (3, 1, 0)^T\}.$
+The two matrices have the same row space, with basis $\{(1, 3, 2), (0, 1,
+1)\}.$
+They also have the same null space, with basis $\{(-1, 1, -1)\}.$
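+
+As a check, the null space vector is orthogonal to both row space basis
+vectors:
+$$(1, 3, 2)\cdot(-1, 1, -1) = -1 + 3 - 2 = 0, \quad
+(0, 1, 1)\cdot(-1, 1, -1) = 1 - 1 = 0.$$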
+
+{\noindent\bf 32.}
+
+{\it (a)}
+
+The dimension is 3 because this is the set of vectors in ${\bf R}^4$
+under one linear constraint: $v_4 = -(v_3 + v_2 + v_1).$
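+
+One explicit basis: $(1, 0, 0, -1),$ $(0, 1, 0, -1),$ and
+$(0, 0, 1, -1).$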
+
+{\it (b)}
+
+The dimension is 0 because the identity matrix, by definition, returns
+0 only when given 0.
+
+{\it (c)}
+
+The dimension is 16 because there are 16 unconstrained components.
+
+{\noindent\bf 36.}
+
+By the rank theorem, there are 6 independent solutions of $Ax=0.$ $A^T$
+has the same rank, so there are 53 independent solutions of $A^Ty = 0.$
+
+{\noindent\bf 42.}
+
+$\{x^3, x^2, x, 1\}$ is a basis for the polynomials of degree at most
+3, and the subspace of those with $p(1) = 0$ has basis $\{x^3-1,
+x^2-1, x-1\}.$
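+
+Each basis element vanishes at $x = 1,$ and the dimension drops by one
+for the single constraint $p(1) = 0$:
+$$1^3 - 1 = 1^2 - 1 = 1 - 1 = 0, \qquad \dim = 4 - 1 = 3.$$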
+
+\iffalse % practice problems
+
+{\noindent\bf 7.}
+
+$$v_1 - v_2 + v_3 = w_2 - w_3 - w_1 + w_3 + w_1 - w_2 = 0,$$
+proving dependence of these vectors.
+
+{\noindent\bf 8.} % not an actual problem
+
+$$c_1v_1 + c_2v_2 + c_3v_3 = c_1(w_2 + w_3) + c_2(w_1 + w_3) + c_3(w_1 +
+w_2) = (c_2+c_3)w_1 + (c_1+c_3)w_2 + (c_1+c_2)w_3 = 0.$$
+Because the $w$ vectors are independent, the sum is zero only if the
+coefficients of each $w_i$ vanish: $c_2 + c_3 = 0 \to c_3 = -c_2,$
+$c_1 + c_3 = 0 \to c_1 = -c_3 = +c_2,$ and $c_1+c_2 = 0 \to c_2 = c_1 =
+0 \to c_3 = 0.$
+
+{\noindent\bf 9.}
+
+{\it (a)}
+
+If $v_1$ to $v_3$ are linearly independent, their span must have
+dimension 3 (making it all of $R^3$), so $v_4 \in R^3$ can be written
+as a combination of the other three.
+
+{\it (b)}
+
+$v_2 = kv_1$ for some $k\in\bf R.$
+
+{\it (c)}
+
+$0v_1 + k(0,0,0) = 0$ for any $k \neq 0,$ giving a non-trivial
+combination with the value 0.
+
+{\noindent\bf 12.}
+
+The vector $b$ is in the subspace spanned by the columns of $A$ when
+there is a solution to $Ax = b.$ The vector $c$ is in the row space of
+$A$ when there is a solution to $A^Tx = c,$ or equivalently $x^TA =
+c^T.$
+
+The zero vector is in every row space, so the rows may still be
+independent. (The statement is false.)
+
+{\noindent\bf 13.}
+
+The dimensions of the column spaces and of the row spaces of $A$ and $U$
+are the same (2), the row spaces themselves are equal, and likewise the
+null spaces.
+
+{\noindent\bf 21.}
+
+% easy
+
+{\noindent\bf 23.}
+
+If they are linearly independent, the rank of $A$ is $n.$ If they span
+$R^m,$ the rank is $m.$ If they are a basis for $R^m,$ then both are
+true and $n = m.$
+
+{\noindent\bf 25.}
+
+{\it (a)}
+
+The columns are linearly independent, so there is no nontrivial linear
+combination equal to 0.
+
+{\it (b)}
+
+The columns of $A$ span $R^5,$ so there must be a linear combination
+(value of $x$) equal to $b.$
+
+{\noindent\bf 26.}
+
+{\it (a)}
+
+True, by a theorem in the book.
+
+{\it (b)}
+
+False. See 31.
+
+{\noindent\bf 31.}
+
+If we let $v_k = e_k,$ the subspace spanned by $(0, 0, 1, 1)$ has no
+basis consisting of elementary vectors.
+
+{\noindent\bf 34.}
+
+% seems simple enough, don't know if I can do it.
+
+{\noindent\bf 35.}
+
+{\it (a)}
+
+False. The matrix whose single column is the unit vector $e_1$ has
+linearly independent columns, but except in ${\bf R},$ it doesn't span
+${\bf R}^k,$ and $e_1x = e_2$ has no solution.
+
+{\it (b)}
+
+True. The rank is at most $5,$ meaning there must be at least two free
+variables.
+
+{\noindent\bf 41.}
+
+{\it (a)}
+
+For dimension 1, $y_k = kx.$
+
+{\it (b)}
+
+For dimension 2, $y_1 = x^2,$ $y_2 = 2x,$ and $y_3 = 3x.$
+
+{\it (c)}
+
+For dimension 3, $y_k = x^k.$
+
+\fi
+
+\bye