Definition
The orthogonal complement of a vector space V is the set of all vectors x such that x · v = 0 for every v in V.
In set notation,
$$V^{\perp} = \{x \mid x \cdot v = 0,\ \forall v \in V\}$$
Consider the vector space spanned by $$\begin{bmatrix}1 \\ 2 \\ 3\end{bmatrix}, \begin{bmatrix}3\\0\\-1\end{bmatrix}$$
The orthogonal complement is the set of all x such that
$$\begin{bmatrix}1 \\ 2 \\ 3\end{bmatrix} \cdot x = 0 \text{ and } \begin{bmatrix}3\\0\\-1\end{bmatrix} \cdot x = 0$$
We can rewrite this as
$$\begin{bmatrix} 3 & 0 & -1 \\1 & 2 & 3 \end{bmatrix} x = \vec{0}$$
The row space of this matrix equals the span of those two vectors. The null space of the matrix is the orthogonal complement of the span. We thus get our first equation
$$\boxed{R(A)^{\perp} = N(A)}$$
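To make the boxed equation concrete, here is a quick numerical check of the example above. This is just a sketch using SymPy (my own choice of tool, not something the example depends on): every basis vector it returns for $N(A)$ should be orthogonal to both rows of $A$, and therefore to everything in $R(A)$.

```python
from sympy import Matrix

A = Matrix([[3, 0, -1],
            [1, 2, 3]])

# A basis for N(A); by the boxed equation, this is a basis for R(A)^perp
for n in A.nullspace():
    for i in range(A.rows):
        # each row of A (a spanning vector of R(A)) is orthogonal to n
        assert A.row(i).dot(n) == 0
```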
It's also worth noting that in a previous post, we showed that
$$\boxed{C(A) = R(A^T)}$$
This is pretty intuitive: when you transpose a matrix, the rows become columns. From these two boxed equations alone, we can solve any problem that asks us to find an orthogonal complement.
Find the orthogonal complement of the vector space given by the following equations:
$$\begin{cases}x_1 + x_2 - 2x_4 = 0\\x_1 - x_2 - x_3 + 6x_4 = 0\\x_2 + x_3 - 4x_4 = 0\end{cases}$$
We can create a matrix out of this:
$$A = \begin{bmatrix}1 & 1 & 0 & -2 \\ 1 & -1 & -1 & 6 \\ 0 & 1 & 1 & -4 \end{bmatrix}$$
So $V = N(A)$. The boxed equation above says $R(A)^{\perp} = N(A) = V$; taking the orthogonal complement of both sides (a step we justify at the end of this post) gives $V^{\perp} = R(A)$, meaning the orthogonal complement is the span of the following vectors:
$$\begin{bmatrix}1\\1\\0\\-2\end{bmatrix}, \begin{bmatrix}1\\-1\\-1\\6\end{bmatrix}, \begin{bmatrix}0\\1\\1\\-4\end{bmatrix}$$
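As a sanity check, here is a small SymPy sketch (again, my own verification, not part of the original problem): every solution of the system is orthogonal to each of these three vectors, and the dimensions of the two spaces add up to 4.

```python
from sympy import Matrix

A = Matrix([[1, 1, 0, -2],
            [1, -1, -1, 6],
            [0, 1, 1, -4]])

V_basis = A.nullspace()                        # V = N(A), the solution set
W_basis = [A.row(i).T for i in range(A.rows)]  # claimed complement: span of the rows

# Every vector in V is orthogonal to every spanning vector of V^perp
for v in V_basis:
    for w in W_basis:
        assert v.dot(w) == 0

# Rank-nullity: dim N(A) + dim R(A) equals the number of columns
assert len(V_basis) + A.rank() == A.cols
```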
$$\text{Let } W = \text{Span}\left(\begin{bmatrix}1 \\ 2 \\ 3 \\ 4 \end{bmatrix}, \begin{bmatrix}1\\1\\1\\1\end{bmatrix}, \begin{bmatrix}1 \\ 4\\ 7 \\ 10\end{bmatrix}\right). \text{ Find } W^{\perp}.$$
In other words $$W = C\left(\begin{bmatrix}1 & 1 & 1\\2 & 1 & 4\\3 & 1 & 7\\4 & 1 & 10\end{bmatrix}\right) = R\left(\begin{bmatrix}1 & 2 & 3 & 4 \\1 & 1 & 1 & 1\\1 & 4 & 7 & 10\end{bmatrix}\right)$$
Therefore, $W^{\perp} = N\left(\begin{bmatrix}1 & 2 & 3 & 4 \\1 & 1 & 1 & 1\\1 & 4 & 7 & 10\end{bmatrix}\right)$
$$\begin{bmatrix}1 & 2 & 3 & 4 \\1 & 1 & 1 & 1\\1 & 4 & 7 & 10\end{bmatrix} \to \begin{bmatrix}1 & 2 & 3 & 4 \\0 & -1 & -2 & -3\\0 & 2 & 4 & 6\end{bmatrix}$$
$$ \to \begin{bmatrix}1 & 2 & 3 & 4 \\0 & -1 & -2 & -3\\0 & 0 & 0 & 0\end{bmatrix} \to \begin{bmatrix}1 & 0 & -1 & -2 \\0 & -1 & -2 & -3\\0 & 0 & 0 & 0\end{bmatrix}$$
We now have the following equations: $x_1 = x_3 + 2x_4$, $x_2 = -2x_3 - 3x_4$
The solution is $$\begin{bmatrix}x_3 + 2x_4\\-2x_3 - 3x_4 \\ x_3 \\ x_4\end{bmatrix} = x_3\begin{bmatrix}1\\-2\\1\\0\end{bmatrix} + x_4\begin{bmatrix}2\\-3\\0\\1\end{bmatrix}$$
Therefore, the orthogonal complement is the span of $\begin{bmatrix}1\\-2\\1\\0\end{bmatrix} \text{ and } \begin{bmatrix}2\\-3\\0\\1\end{bmatrix}$
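To double-check the row reduction, here is a short SymPy sketch (my own verification): both hand-computed vectors are sent to zero by the matrix, and its null space is exactly two-dimensional, so these two independent vectors span all of $W^{\perp}$.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 4],
            [1, 1, 1, 1],
            [1, 4, 7, 10]])

candidates = [Matrix([1, -2, 1, 0]), Matrix([2, -3, 0, 1])]

# Each hand-computed vector really lies in N(A) = W^perp ...
for x in candidates:
    assert A * x == Matrix.zeros(3, 1)

# ... and N(A) is two-dimensional, so the two independent vectors span it
assert len(A.nullspace()) == 2
```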
Find the orthogonal complement of the column space of $\begin{bmatrix}1 & 1 & -1 & 0 \\2 & 2 & 0 & 1\\ -1 & -1 & -1 & -1\end{bmatrix}$
This is equivalent to finding the orthogonal complement of the row space of $\begin{bmatrix}1 & 2 & -1\\1 & 2 & -1\\-1 & 0 & -1\\0 & 1 & -1\end{bmatrix}$
This equals the null space of that matrix.
$$\begin{bmatrix}1 & 2 & -1\\1 & 2 & -1\\-1 & 0 & -1\\0 & 1 & -1\end{bmatrix} \to \begin{bmatrix}1 & 2 & -1 \\ 0 & 0 & 0 \\0 & 2 & -2\\0 & 1 & -1\end{bmatrix} \to \begin{bmatrix}1 & 0 & 1 \\ 0 & 1 & -1 \\0 & 0 & 0\\0 & 0 & 0\end{bmatrix} $$
We now have the following equations: $x_1 = -x_3$, $x_2 = x_3$
The solution is $$x_3\begin{bmatrix}-1\\1\\1\end{bmatrix}$$
Therefore our solution is the span of $\begin{bmatrix}-1\\1\\1\end{bmatrix}$
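One more quick SymPy check (my own, using the transpose trick above): the vector $\begin{bmatrix}-1\\1\\1\end{bmatrix}$ is orthogonal to every column of the original matrix, and the complement is one-dimensional, so it alone spans the answer.

```python
from sympy import Matrix

A = Matrix([[1, 1, -1, 0],
            [2, 2, 0, 1],
            [-1, -1, -1, -1]])

x = Matrix([-1, 1, 1])

# x is orthogonal to every column of A, i.e. x is in C(A)^perp = N(A^T)
assert A.T * x == Matrix.zeros(4, 1)

# C(A)^perp is one-dimensional, so x alone spans it
assert len(A.T.nullspace()) == 1
```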
Orthogonal Complement of the Orthogonal Complement
$$\left(V^{\perp}\right)^{\perp} = V$$
This mirrors many other operations: the inverse of the inverse is the original matrix, the transpose of the transpose is the original matrix, and so on.
Now, let's prove it. The standard way to prove that two sets are equal is to show that each contains the other.
Proof Part 1: $V \subseteq \left(V^{\perp}\right)^{\perp}$
In order to do this, we show that every point in V is also in $\left(V^{\perp}\right)^{\perp}$. We start with a few definitions.
$$\text{Definition: } V^{\perp} = \{w \in \mathbb{R}^n|\text{ }w \cdot y = 0, \forall y \in V\}$$
$$\text{Definition: } \left(V^{\perp}\right)^{\perp} = \{w \in \mathbb{R}^n|\text{ }w \cdot y = 0, \forall y \in V^{\perp}\}$$
$$\text{Let } x \in V$$
We now want to show that x satisfies the definition of $\left(V^{\perp}\right)^{\perp}$, so we choose an arbitrary element in $V^{\perp}$.
$$\text{Let } y \in V^{\perp}$$
$$\text{By definition of } V^{\perp},\ y \cdot x = 0$$
$$\text{and since the dot product is commutative, } x \cdot y = 0$$
Now, we generalize. Because $y$ was an arbitrary element of $V^{\perp}$, the same argument works for every element of $V^{\perp}$.
$$\forall y \in V^{\perp}, x \cdot y = 0$$
As we can see, this is the definition of $\left(V^{\perp}\right)^{\perp}$. Therefore, $$x \in \left(V^{\perp}\right)^{\perp}$$
This works for every $x \in V$, so
$$V \subseteq \left(V^{\perp}\right)^{\perp}$$
Proof Part 2: $\left(V^{\perp}\right)^{\perp} \subseteq V$
In order to do this, we show that every point in $\left(V^{\perp}\right)^{\perp}$ is also in V.
$$\text{Let } x \in \left(V^{\perp}\right)^{\perp}$$
The orthogonal complement of a subspace is just that: a complement. Any vector in $\mathbb{R}^n$ can be written as the sum of a vector in $V$ and a vector in $V^{\perp}$. A really quick justification: if the dimension of $V$ is $k$, then the dimension of $V^{\perp}$ is $n - k$, and the only vector the two spaces share is $\vec{0}$, so together they account for all of $\mathbb{R}^n$.
$$x = v + w, v \in V \text{ and } w \in V^{\perp}$$
$$x \cdot w = 0 \text{ by definition of } \left(V^{\perp}\right)^{\perp} \text{, since } w \in V^{\perp}$$
$$0 = x \cdot w = (v + w) \cdot w = v \cdot w + ||w||^2 = 0 + ||w||^2 = ||w||^2$$
$$\text{Therefore, } w = \vec{0}$$
$$\text{Therefore, } x = v \in V$$
$$x \in V$$
$$\left(V^{\perp}\right)^{\perp} \subseteq V$$
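To close, here is a numerical illustration of the theorem we just proved. It is a sketch in SymPy; `complement_basis` is a little helper I am defining here (not a library function), and it uses the boxed equation from the start of the post: the complement of a span is the null space of the matrix whose rows are the spanning vectors.

```python
from sympy import Matrix

def complement_basis(vectors):
    """Basis of the orthogonal complement of span(vectors):
    the null space of the matrix whose rows are the given vectors."""
    return Matrix.vstack(*[v.T for v in vectors]).nullspace()

# V = span of the two vectors from the first example in this post
V = [Matrix([1, 2, 3]), Matrix([3, 0, -1])]
V_perp = complement_basis(V)
V_perp_perp = complement_basis(V_perp)

A = Matrix.vstack(*[v.T for v in V])            # rows span V
B = Matrix.vstack(*[w.T for w in V_perp_perp])  # rows span (V^perp)^perp

# Same dimension, and every spanning vector of V lies in (V^perp)^perp
# (appending it as an extra row does not raise the rank), so the spans are equal.
assert A.rank() == B.rank()
for v in V:
    assert B.col_join(v.T).rank() == B.rank()
```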