Assignments
Due date: Fri, 12 Jan 2018
In this exercise, we explore further the relation between norms and inner-products. Let $V$ be a vector space over $\mathbb{R}$.
We shall see that:
- We can associate to each inner product $\langle\cdot, \cdot \rangle$ on $V$ a norm $\Vert\cdot\Vert$ on $V$.
- Such a norm satisfies some useful properties which we take as the definition of a norm on $V$.
- A norm derived from an inner-product satisfies an additional property called parallelogram law.
- Given the norm on $V$ associated to an inner-product, we can recover the inner-product using polarization.
- However, not all norms are associated to inner-products.
Exercises
- Let $\langle\cdot, \cdot \rangle$ be an inner-product on $V$ and define $\Vert x \Vert = \sqrt{\langle x, x\rangle}$ for $x \in V$. Show that
- for all $x\in V$, $\Vert x\Vert \geq 0$ with equality if and only if $x = 0$,
- for all $x \in V$ and $c \in \mathbb{R}$, $\Vert c \cdot x\Vert = \vert c \vert \cdot \Vert x\Vert$,
- (triangle inequality) for all $x, y \in V$, $\Vert x + y\Vert \leq \Vert x\Vert + \Vert y \Vert$.
We take these properties to be the definition of a norm, i.e., a norm is a function $\Vert\cdot\Vert: V \to \mathbb{R}$ that satisfies the properties (1) - (3).
- Let $\langle\cdot, \cdot \rangle$ be an inner-product on $V$ and define $\Vert x \Vert = \sqrt{\langle x, x\rangle}$ for $x \in V$. Show that for $x, y \in V$, $$\Vert x + y\Vert^2 + \Vert x - y\Vert^2 = 2\Vert x\Vert^2 + 2\Vert y\Vert^2.$$ This is called the parallelogram law.
- Let $V = \mathbb{R}^2$ and define $\Vert (x, y)\Vert = \vert x \vert + \vert y \vert$. Show that $\Vert\cdot\Vert$ satisfies the properties (1) - (3) of problem 1 (i.e., it is a norm), but not the parallelogram law (and is thus not obtained from an inner product).
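A quick numerical check (not a substitute for the proof) that this norm violates the parallelogram law, using the vectors $x = (1, 0)$ and $y = (0, 1)$:

```python
# Numerical check: the 1-norm on R^2 fails the parallelogram law,
# so by problem 4 it cannot come from an inner product.

def norm1(v):
    """The norm ||(x, y)|| = |x| + |y| from problem 3."""
    return abs(v[0]) + abs(v[1])

x, y = (1.0, 0.0), (0.0, 1.0)
xy_sum = (x[0] + y[0], x[1] + y[1])
xy_diff = (x[0] - y[0], x[1] - y[1])

lhs = norm1(xy_sum) ** 2 + norm1(xy_diff) ** 2   # 4 + 4 = 8
rhs = 2 * norm1(x) ** 2 + 2 * norm1(y) ** 2      # 2 + 2 = 4
print(lhs, rhs)  # 8.0 4.0 -- the parallelogram law fails
```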
- Suppose a norm $\Vert\cdot\Vert$ on a vector space $V$ is of the form $\Vert x \Vert = \sqrt{\langle x, x\rangle}$ for an inner product $\langle\cdot, \cdot \rangle$. Show that the inner product is given by $$\langle x, y \rangle = \frac{\Vert x + y\Vert^2 - \Vert x - y\Vert^2}{4}.$$ This is called the polarization identity.
Remark: In fact, if a norm satisfies the parallelogram law, then the polarization identity gives an inner product (this is harder to prove than the above statements).
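As an illustration of the polarization identity (not a proof), the sketch below recovers the standard dot product on $\mathbb{R}^3$ from its associated norm; the vectors are arbitrary test data.

```python
import random

def ip(u, v):
    # standard dot product on R^3 (sample inner product for illustration)
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return ip(u, u) ** 0.5

def polarize(u, v):
    # polarization identity: <u, v> = (||u+v||^2 - ||u-v||^2) / 4
    s = [a + b for a, b in zip(u, v)]
    d = [a - b for a, b in zip(u, v)]
    return (norm(s) ** 2 - norm(d) ** 2) / 4

random.seed(0)
u = [random.uniform(-1, 1) for _ in range(3)]
v = [random.uniform(-1, 1) for _ in range(3)]
assert abs(polarize(u, v) - ip(u, v)) < 1e-9
```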
Due date: Mon, 22 Jan 2018
Let $V$ be the vector space of continuous functions $f : [-1, 1] \to \mathbb{R}$, with inner product given by $$\langle f, g \rangle = \frac{1}{2}\int_{-1}^{1} f(x)g(x)\,dx.$$
- Show that polynomials $x^n$ and $x^m$ (as elements of $V$) are orthogonal if and only if $n-m$ is odd.
- Show that $f(x) = 1$ and $g(x) = \sqrt{3}x$ form an orthonormal basis for the subspace $L \subset V$ consisting of linear functions (you may use the fact that the constant function $1$ and the function $x$ form a basis for the space $L$).
- Find the linear function $h(x) = ax + b$ that minimizes
- Find an orthonormal basis for the subspace $Q \subset V$ consisting of polynomials of degree at most $2$.
- Let $S$ be the subspace of $V$ consisting of symmetric continuous functions, i.e., functions $f$ such that $f(-x) = f(x)$ for all $x \in [-1, 1]$. Let $g\in V$ be an anti-symmetric function, i.e., $g(-x) = -g(x)$ for all $x \in [-1, 1]$. Show that $\langle f, g \rangle = 0$ for all $f\in S$.
- For a function $f \in V$, let $\hat{f}$ be given by $\hat{f}(x) = \frac{f(x) + f(-x)}{2}$. Show that $f - \hat{f}$ is anti-symmetric, and deduce that $\hat{f}$ is the orthogonal projection onto $S$ of $f$.
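The orthogonality claims in problems 1 and 2 can be sanity-checked numerically. This sketch assumes the inner product $\langle f, g\rangle = \frac{1}{2}\int_{-1}^{1} f(x)g(x)\,dx$ (the normalization under which the constant function $1$ has unit norm) and approximates it by the midpoint rule:

```python
def inner(f, g, n=100000):
    # midpoint-rule approximation of (1/2) * integral_{-1}^{1} f(x) g(x) dx
    h = 2.0 / n
    total = 0.0
    for k in range(n):
        x = -1.0 + (k + 0.5) * h
        total += f(x) * g(x)
    return 0.5 * total * h

one = lambda x: 1.0
g = lambda x: 3 ** 0.5 * x

# {1, sqrt(3) x} is orthonormal, as in problem 2
assert abs(inner(one, one) - 1.0) < 1e-6
assert abs(inner(g, g) - 1.0) < 1e-6
assert abs(inner(one, g)) < 1e-6

# x^2 and x^3 are orthogonal (n - m = 1 is odd); x^2 and x^4 are not
assert abs(inner(lambda x: x**2, lambda x: x**3)) < 1e-6
assert abs(inner(lambda x: x**2, lambda x: x**4)) > 1e-3
```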
Due date: Mon, 29 Jan 2018
- If $\det \begin{pmatrix}x & y & z \\ 3 & 0 & 2 \\ 1 & 1 & 1 \end{pmatrix} = 1$, compute the determinant of each of the following matrices:
- (a) $\begin{pmatrix} 2x & 2y & 2z \\ \frac{3}{2} & 0 & 1 \\ 1 & 1 & 1 \end{pmatrix}$.
- (b) $\begin{pmatrix} x & y & z \\ 3x+3 & 3y & 3z+2 \\ x + 1 & y + 1 & z + 1 \end{pmatrix}$.
- (c) $\begin{pmatrix} x - 1 & y - 1 & z - 1 \\ 4 & 1 & 3 \\ 1 & 1 & 1 \end{pmatrix}$.
- Show that $\det \begin{pmatrix} 1 & 1 & 1 \\ a & b & c \\ a^2 & b^2 & c^2 \end{pmatrix} = (b - a)(c - a)(c - b)$.
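The Vandermonde identity in problem 2 can be spot-checked for a few integer triples (a check, not a proof):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# sample triples; the last one has a repeated entry, so both sides vanish
for (a, b, c) in [(1, 2, 3), (-1, 0, 5), (2, 2, 7)]:
    lhs = det3([[1, 1, 1], [a, b, c], [a**2, b**2, c**2]])
    rhs = (b - a) * (c - a) * (c - b)
    assert lhs == rhs
```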
- Let $U$ and $V$ be upper-triangular matrices.
- (a) Show that $U+ V$ is upper-triangular.
- (b) Prove or disprove: $\det(UV) = \det(U)\det(V)$.
- (c) Prove or disprove: $\det(U+V) = \det(U) + \det(V)$.
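For part (c), a single counterexample suffices; here is a sketch with two sample $2\times 2$ upper-triangular matrices (part (b), by contrast, holds for all matrices):

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def matmul2(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add2(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

# sample upper-triangular matrices
U = [[1, 5], [0, 2]]
V = [[3, -1], [0, 4]]

assert det2(matmul2(U, V)) == det2(U) * det2(V)   # 24 == 2 * 12
# det(U + V) = det([[4, 4], [0, 6]]) = 24, but det(U) + det(V) = 14
assert det2(add2(U, V)) != det2(U) + det2(V)
```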
Due date: Mon, 05 Feb 2018
Fix $n > 0$ and let $V$ be the vector space of polynomials $p(x)$ over $\mathbb{R}$ of degree at most $n$. Let $L: V \to V$ be the linear transformation $L(p) = \frac{dp}{dx}$, i.e., $L$ maps a polynomial $p$ to its derivative $\frac{dp}{dx}$.
- Prove or disprove: $L$ is injective.
- Prove or disprove: $L$ is surjective.
- Either describe a left inverse for $L$ (and prove that it is a left inverse) or prove that $L$ has no left inverse.
- Either describe a right inverse for $L$ (and prove it is a right inverse) or prove that $L$ has no right inverse.
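A concrete sketch for $n = 3$, representing a polynomial by its coefficient list (lowest degree first); it exhibits a nonzero kernel element and shows that $x^3$ is never in the image, which is the evidence the injectivity and surjectivity questions turn on:

```python
n = 3  # work in the space of polynomials of degree <= 3

def deriv(coeffs):
    # d/dx of sum coeffs[k] x^k is sum (k * coeffs[k]) x^(k-1);
    # pad with 0 so the result stays in the same space
    return [k * coeffs[k] for k in range(1, len(coeffs))] + [0]

# the constant polynomial 1 maps to 0, so L has nonzero kernel:
# L is not injective (hence has no left inverse)
assert deriv([1, 0, 0, 0]) == [0, 0, 0, 0]

# every derivative has zero top coefficient, so x^3 = [0, 0, 0, 1]
# is never hit: L is not surjective (hence has no right inverse)
assert all(deriv(p)[n] == 0 for p in [[1, 2, 3, 4], [0, 0, 0, 1]])
```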
Due date: Mon, 12 Feb 2018
- Suppose $V$ is a vector space over any field with $\dim(V) = 2$ and $L : V \to V$ is a linear transformation such that $L^2 = 0$ but $L \neq 0$. Let $v \in V$ be a vector such that $L(v) \neq 0$.
- Show that $v$ and $L(v)$ are independent. (Hint: Consider a linear combination which is zero and apply $L$).
- It follows from the above that $v$, $L(v)$ is a basis of $V$. Find the matrix of $L$ with respect to this basis.
- Let $V = \mathbb{R}^3$ and consider the linear transformation $L : V \to V$ given by $L(x, y, z) = (0, 0, x)$.
- Show that $L \neq 0$ and $L^2 = 0$.
- Find the matrix of $L$ with respect to the standard basis of $\mathbb{R}^3$.
- Let $V = \mathbb{R}^5$ and consider the linear transformation $L : V \to V$ given by $L(x_1, x_2, x_3, x_4, x_5) = (0, x_1, x_2, 0, x_4)$.
- Show that $L^2 \neq 0$ and $L^3 = 0$.
- Find the matrix of $L$ with respect to the standard basis of $\mathbb{R}^5$.
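The nilpotency claims in problem 3 can be checked on a sample vector (checking $L^2 \neq 0$ on one vector is conclusive; $L^3 = 0$ on one vector is only a sanity check, the full statement needs the matrix):

```python
# L(x1, ..., x5) = (0, x1, x2, 0, x4) from problem 3
def apply_L(v):
    x1, x2, x3, x4, x5 = v
    return (0, x1, x2, 0, x4)

def compose(f, times, v):
    # apply f to v the given number of times
    for _ in range(times):
        v = f(v)
    return v

v = (1, 2, 3, 4, 5)
assert compose(apply_L, 2, v) == (0, 0, 1, 0, 0)   # L^2 != 0
assert compose(apply_L, 3, v) == (0, 0, 0, 0, 0)   # consistent with L^3 = 0
```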
Due date: Mon, 12 Mar 2018
- Consider the function $f : \mathbb{R}^2 \to \mathbb{R}$ given by
- $f(0, 0) = 0$,
- $f(x, y) = 1$ if $y = x^2$ and $x \neq 0$,
- $f(x, y) = 0$ if $y \neq x^2$.
Show that
- (a) $f$ is not continuous at $(0, 0)$.
- (b) for every vector $v \in \mathbb{R}^2$, the directional derivative of $f$ at $(0, 0)$ in the direction $v$ exists.
- Consider the function $f : \mathbb{R}^2 \to \mathbb{R}$ given by
- $f(0, 0) = 0$,
- $f(x, y) = x$ if $y = x^2$ and $x \neq 0$,
- $f(x, y) = 0$ if $y \neq x^2$.
Show that
- (a) $f$ is continuous at $(0, 0)$.
- (b) for every vector $v \in \mathbb{R}^2$, the directional derivative of $f$ at $(0, 0)$ in the direction $v$ exists.
- (c) The total derivative $Df((0, 0))$ of $f$ at $(0, 0)$ does not exist.
- Consider the function $f : \mathbb{R}^2 \to \mathbb{R}$ given by
- $f(0, 0) = 0$,
- $f(x, y) = \frac{xy}{x^2 + y^2}$ if $(x, y) \neq (0, 0)$.
Show that
- (a) $f$ is not continuous at $(0, 0)$.
- (b) For every $(x, y)\in \mathbb{R^2}$, the partial derivatives $\frac{\partial f}{\partial x}$ and $\frac{\partial f}{\partial y}$ exist.
- (c) The partial derivatives $\frac{\partial f}{\partial x}$ and $\frac{\partial f}{\partial y}$ are not continuous functions.
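A numerical illustration of part (a): along the line $y = x$ the function is constantly $\frac{1}{2}$, while $f(0, 0) = 0$, so no single value can make $f$ continuous at the origin.

```python
# f(x, y) = xy / (x^2 + y^2) away from the origin, f(0, 0) = 0
def f(x, y):
    if (x, y) == (0.0, 0.0):
        return 0.0
    return x * y / (x ** 2 + y ** 2)

# along y = x the value is exactly 1/2, however close to the origin
for t in [1.0, 0.1, 0.001, 1e-6]:
    assert abs(f(t, t) - 0.5) < 1e-12

# along the axes f vanishes identically, so both partials at (0, 0) are 0
assert f(0.01, 0.0) == 0.0 and f(0.0, 0.01) == 0.0
```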
- Consider the function $f : \mathbb{R}^2 \to \mathbb{R}$ given by
- $f(0, 0) = 0$,
- $f(x, y) = \frac{xy}{x^{4/3} + y^{4/3}}$ if $(x, y) \neq (0, 0)$.
Show that
- (a) $f$ is continuous at all $(x, y)\in \mathbb{R}^2$.
- (b) For every $(x, y)\in \mathbb{R^2}$, the partial derivatives $\frac{\partial f}{\partial x}$ and $\frac{\partial f}{\partial y}$ exist.
- (c) The partial derivatives $\frac{\partial f}{\partial x}$ and $\frac{\partial f}{\partial y}$ are not continuous functions.
- (d) The total derivative $Df((0, 0))$ of $f$ at $(0, 0)$ does not exist.
Due date: Mon, 19 Mar 2018
Consider two rectangles with length $L$ and breadth $B$, placed one on top of the other to form an $L \times 2B$ rectangle. Consider a connected path joining the bottom left corner of the lower rectangle to the top right corner of the upper rectangle consisting of two line segments $l_1$ and $l_2$, one in each rectangle and meeting at the common edge.
- Let $\theta_1$ and $\theta_2$ be the angles made by $l_1$ and $l_2$ to the line perpendicular to the common edge. Then show that $B\cdot(\tan(\theta_1) + \tan(\theta_2)) = L$.
- If light travels along $l_1$ and $l_2$ at speeds $v_1$ and $v_2$, respectively, find the total travel time $T(\theta_1, \theta_2)$ as a function of $\theta_1$ and $\theta_2$.
- Show that the path (of the above form) that minimizes the total travel time satisfies Snell’s law $\sin(\theta_1)/\sin(\theta_2) = v_1/v_2$.
- Consider all cuboids with length $L$ breadth $B$ and height $H$ with surface area $6$. Show that the cuboid with largest volume among these is the cube with unit side.
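For the light-path problem, each segment has length $B/\cos(\theta_i)$, so the travel time is $T = \frac{B}{v_1\cos\theta_1} + \frac{B}{v_2\cos\theta_2}$ subject to $B(\tan\theta_1 + \tan\theta_2) = L$. A crude grid search (a sanity check with sample values, not a proof) confirms that Snell's ratio holds at the minimizer:

```python
import math

B, L, v1, v2 = 1.0, 1.0, 1.0, 2.0   # sample parameters

def total_time(t1):
    # the constraint B*(tan t1 + tan t2) = L determines t2 from t1
    t2 = math.atan(L / B - math.tan(t1))
    return B / (v1 * math.cos(t1)) + B / (v2 * math.cos(t2)), t2

n = 100000
t_max = math.atan(L / B)   # keep tan(t1) in (0, L/B) so t2 > 0
best_T, best_t1, best_t2 = float("inf"), 0.0, 0.0
for k in range(1, n):
    t1 = t_max * k / n
    T, t2 = total_time(t1)
    if T < best_T:
        best_T, best_t1, best_t2 = T, t1, t2

# Snell's law at the minimizer: sin(t1)/sin(t2) = v1/v2
ratio = math.sin(best_t1) / math.sin(best_t2)
assert abs(ratio - v1 / v2) < 1e-3
```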
Due date: Mon, 26 Mar 2018
- Let $S$ be the unit disc in the plane, i.e., $S = \{(x, y)\in \mathbb{R}^2 : x^2 + y^2 \leq 1\}$. Let $f(x, y) = e^{x^2 + y^2}$.
- (a) Express $\int_S f(x, y)dxdy$ in polar co-ordinates.
- (b) Compute this integral by expressing it as an iterated integral.
- Let $C$ be the unit circle traversed in the anti-clockwise direction. Compute the following line integrals along $C$.
- (a) $\int_C ydx + x dy$.
- (b) $\int_C ydx - x dy$.
- (c) $\int_C xdx + y dy$.
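The three line integrals can be approximated by parametrizing $C$ as $x = \cos t$, $y = \sin t$, $t \in [0, 2\pi]$ (a numerical check of the expected answers, not a computation to submit):

```python
import math

def line_integral(P, Q, n=100000):
    # integrate P dx + Q dy over the unit circle traversed anti-clockwise,
    # using the midpoint rule in the parameter t
    h = 2 * math.pi / n
    total = 0.0
    for k in range(n):
        t = (k + 0.5) * h
        x, y = math.cos(t), math.sin(t)
        dx, dy = -math.sin(t), math.cos(t)   # x'(t), y'(t)
        total += (P(x, y) * dx + Q(x, y) * dy) * h
    return total

a = line_integral(lambda x, y: y, lambda x, y: x)    # y dx + x dy
b = line_integral(lambda x, y: y, lambda x, y: -x)   # y dx - x dy
c = line_integral(lambda x, y: x, lambda x, y: y)    # x dx + y dy

assert abs(a) < 1e-6                    # exact 1-form d(xy): integral 0
assert abs(b + 2 * math.pi) < 1e-6      # integrand is -1: integral -2*pi
assert abs(c) < 1e-6                    # exact 1-form d((x^2+y^2)/2): 0
```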
Due date: Mon, 02 Apr 2018
We identify $\mathbb{C} = \mathbb{R}^2$ as usual. Let $U \subset \mathbb{C}$ be an open set and let $f: U \to \mathbb{C}$, $f(z) = u(z) + iv(z)$ be a smooth function with $u(z)$ and $v(z)$ the real and imaginary parts of $f(z)$. Let $dz = dx + idy$ and let $f(z)dz$ be given by complex multiplication as usual.
We say that $f(z)$ is holomorphic if $f(z)dz$ is a closed $1$-form, i.e., the real and imaginary parts of $d(f(z)dz)$ are both $0$.
- Let $\frac{\partial f}{\partial \bar{z}} = \frac{1}{2}\left(\frac{\partial f}{\partial x} + i\frac{\partial f}{\partial y}\right)$. Show that $f(z)$ is holomorphic if and only if $\frac{\partial f}{\partial \bar{z}} = 0$.
- Show that if $f(z)$ and $g(z)$ are holomorphic functions on $U$, then $f(z) + g(z)$ is holomorphic.
- Show that if $f(z)$ and $g(z)$ are holomorphic functions on $U$, then $f(z)g(z)$ is holomorphic.
- Show that every polynomial $p(z)$ over complex numbers is holomorphic on $\mathbb{C}$.
- Show that if $f(z)$ is holomorphic on $\mathbb{C}$ and $S^1$ is the unit circle, then $\int_{S^1} f(z)dz = 0$.
- Show that $f(z) = \frac{1}{z}$ is a holomorphic function on $\mathbb{C} \setminus \{0\}$, but $\int_{S^1} f(z)dz \neq 0$.
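Both integrals in problems 5 and 6 can be checked numerically by parametrizing $S^1$ as $z = e^{it}$, $dz = ie^{it}\,dt$ (a sanity check; the value of $\int_{S^1} dz/z$ is $2\pi i$):

```python
import cmath
import math

def contour_integral(f, n=100000):
    # integrate f(z) dz over the unit circle z = e^{it}, t in [0, 2*pi],
    # using the midpoint rule; dz = i e^{it} dt
    h = 2 * math.pi / n
    total = 0.0 + 0.0j
    for k in range(n):
        t = (k + 0.5) * h
        z = cmath.exp(1j * t)
        total += f(z) * 1j * z * h
    return total

# a polynomial (holomorphic on all of C) integrates to ~0
assert abs(contour_integral(lambda z: z ** 2 + 3 * z + 1)) < 1e-6

# 1/z is holomorphic away from 0, yet its integral is 2*pi*i, not 0
assert abs(contour_integral(lambda z: 1 / z) - 2j * math.pi) < 1e-6
```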
Due date: Mon, 16 Apr 2018
Please do not submit this assignment. It is only meant to help with understanding the material on ODEs, and is not a model for final examination questions.
Let $A : \mathbb{R}^2 \to \mathbb{R}^2$ be a linear transformation. We can also regard this as a linear transformation $A : \mathbb{C}^2 \to \mathbb{C}^2$.
- For what values of $\operatorname{tr}(A)$ and $\det(A)$ does $A$ have eigenvalues $\alpha \pm i\beta$ for $\alpha, \beta \in\mathbb{R}$ with $\beta\neq 0$?
Assume henceforth that $A$ has complex eigenvalues $\alpha \pm i\beta$ for $\alpha, \beta \in\mathbb{R}$ with $\beta\neq 0$.
- Let $\lambda = \alpha + i\beta$ and let $v\in\mathbb{C}^2$ be an eigenvector of $A$ with $Av = \lambda v$. Show that its (component-wise) complex conjugate $\bar{v}$ is also an eigenvector.
- Let $w_1 = (v + \bar{v})/2$ and $w_2 = (v - \bar{v})/(2i)$. Show that $w_1, w_2 \in\mathbb{R}^2\subset{\mathbb{C}}^2$ and $w_1$, $w_2$ form a basis of $\mathbb{R}^2$ (as a real vector space).
- Find the matrix of $A$ with respect to the basis $w_1$, $w_2$.
- Recall that $\mathbb{C}$ is a real vector space with basis $1, i$. Let $P: \mathbb{R}^2 \to \mathbb{C}$ be the (invertible) linear transformation such that $P(w_1) = 1$ and $P(w_2) = i$. Let $B = P\circ A \circ P^{-1}$.
Show that $B(z) = \lambda z$ for all $z \in \mathbb{C}$.
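A numerical sanity check of problems 2-4 with the sample matrix $A = \begin{pmatrix}1 & -2\\ 2 & 1\end{pmatrix}$, which has eigenvalues $1 \pm 2i$ (the eigenvector and the expected matrix $\begin{pmatrix}\alpha & \beta\\ -\beta & \alpha\end{pmatrix}$ are worked out by hand for this example):

```python
alpha, beta = 1.0, 2.0
A = [[1.0, -2.0], [2.0, 1.0]]          # sample matrix, eigenvalues 1 +/- 2i

lam = complex(alpha, beta)
v = (1.0 + 0j, -1j)                    # eigenvector: A v = lam * v

def matvec(M, x):
    return (M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1])

Av = matvec(A, v)
assert abs(Av[0] - lam * v[0]) < 1e-12 and abs(Av[1] - lam * v[1]) < 1e-12

# w1 = (v + conj(v))/2 and w2 = (v - conj(v))/(2i) are real vectors
w1 = tuple(((a + a.conjugate()) / 2).real for a in v)      # (1, 0)
w2 = tuple(((a - a.conjugate()) / (2j)).real for a in v)   # (0, -1)

# in the basis w1, w2 the matrix of A is [[alpha, beta], [-beta, alpha]]:
# A w1 = alpha*w1 - beta*w2 and A w2 = beta*w1 + alpha*w2
assert matvec(A, w1) == (alpha * w1[0] - beta * w2[0],
                         alpha * w1[1] - beta * w2[1])
assert matvec(A, w2) == (beta * w1[0] + alpha * w2[0],
                         beta * w1[1] + alpha * w2[1])
```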