\documentclass{article}
\usepackage{graphicx}
\usepackage{latexsym}
\usepackage{amsfonts}
\usepackage{amssymb}
\usepackage{amsmath}
\usepackage{verbatim}
% THEOREMS -------------------------------------------------------
\usepackage[standard, thmmarks]{ntheorem}
\newtheorem{prob}{Problem}
\newtheorem{ax}{Axiom}[section]
\newtheorem{thm}{Theorem}[section]
\newtheorem{cor}[thm]{Corollary}
\newtheorem{lem}[thm]{Lemma}
\newtheorem{prop}[thm]{Proposition}
%\theoremstyle{definition}
\newtheorem{defn}[thm]{Definition}
%\theoremstyle{remark}
\newtheorem{rem}[thm]{Remark}
%\numberwithin{equation}{section}
% MATH -----------------------------------------------------------
\newcommand{\norm}[1]{\left\Vert#1\right\Vert}
\newcommand{\abs}[1]{\left\vert#1\right\vert}
\newcommand{\set}[1]{\left\{#1\right\}}
\newcommand{\R}{\mathbb R}
\newcommand{\RP}{\mathbb R P}
\newcommand{\CP}{\mathbb C P}
\newcommand{\Z}{\mathbb Z}
\newcommand{\C}{\mathbb C}
\newcommand{\Q}{\mathbb Q}
\newcommand{\F}{\mathcal F}
\newcommand{\hyp}{\mathbb H}
\newcommand{\M}{\mathcal M}
\newcommand{\Lob}{\mathcal L}
\newcommand{\Vect}{\operatorname{Vect}}
\newcommand{\im}{\operatorname{Im}}
\newcommand{\St}{\operatorname{St}}
\newcommand{\Cl}{\operatorname{Cl}}
\newcommand{\Li}{\operatorname{Li}}
\newcommand{\sgn}{\operatorname{sgn}}
\newcommand{\vol}{\operatorname{vol}}
\newcommand{\Conv}{\operatorname{Conv}}
\newcommand{\Distr}{\operatorname{Distr}}
\newcommand{\Cont}{\operatorname{Cont}}
\newcommand{\Fol}{\operatorname{Fol}}
\newcommand{\eps}{\varepsilon}
\newcommand{\To}{\longrightarrow}
\newcommand{\BX}{\mathbf{B}(X)}
\newcommand{\A}{\mathcal{A}}
\newcommand{\N}{\mathbb N}
\newcommand{\diagram}{\vspace*{5cm}}
\renewcommand{\theenumi}{(\roman{enumi})}
\renewcommand{\labelenumi}{\theenumi}
%\newenvironment{proof}{\noindent \emph{Proof. }}{ \begin{flushright}$\square$\end{flushright} }
% ----------------------------------------------------------------
\begin{document}
\title{Complex vector spaces, duals, and duels: \\ Fun with a number, or two, or four}
\author{Daniel Mathews}%
\maketitle
% ----------------------------------------------------------------
\tableofcontents
\section{Complex complexity}
Physicists have an unpleasant habit (unpleasant to mathematicians, but perhaps useful to themselves) of writing notation to mean whatever they want. For instance, a letter might refer to a matrix, or an operator, or a physical quantity --- or simultaneously all of them, so that an equation can have multiple meanings depending on how you read it. The postmodernists, it seems, lagged far behind the scientists in the realm of deliberate ambiguity!
So, let us take a real number, $z$.
\emph{For my first trick}, I don my physicist (or postmodernist?) hat and hereby declare that $z$ is no longer real, but complex! So, let us write $z=x+yi$! So now, $z$ looks like a complex number, with $x,y$ being respectively the real and imaginary parts of $z$.
Being given $z$, then, is the same as being given the ordered pair $z = (x,y)$. We can express this by saying $\C \cong \R^2$: $\C$ and $\R^2$ are isomorphic --- as real vector spaces. Not as complex vector spaces, because it's not clear how to multiply a pair of real numbers by $i$!
But, if we give the isomorphism $\C \cong \R^2$, $z = x+yi \mapsto (x,y)$, then we \emph{can} multiply a pair of real numbers by $i$: we say that
\[
i(x,y) = iz = i(x+yi) = -y+xi = (-y,x).
\]
In particular, multiplication by $i$ is given by
\[
(x,y) \mapsto (-y,x) = \begin{bmatrix} 0 & -1 \\1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}.
\]
In a non-rigorous and physicist-sort of way, then,
\[
i = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}.
\]
To be rigorous, we will say that we have equipped $\R^2$ with a \emph{complex structure}. That is, we have defined an operator on $\R^2$ which is like multiplication by $i$. We will call this operator $J: \R^2 \To \R^2$, and $J$ is given by the matrix above. Note that
\[
J^2 = \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix} = -I
\]
so that $J$ applied twice multiplies everything by $-1$. This is just like multiplication by $i$, as we should expect.
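As a sanity check, this is easy to verify numerically. Here is a small Python (numpy) sketch --- the matrix is just the one displayed above, and $(x,y) \leftrightarrow x+yi$ is the identification we have been using:

```python
import numpy as np

# The complex structure J on R^2: "multiplication by i" in coordinates (x, y).
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# J applied twice multiplies everything by -1, just as i^2 = -1:
assert np.array_equal(J @ J, -np.eye(2))

# And J agrees with honest complex multiplication under (x, y) <-> x + yi:
x, y = 3.0, 5.0
assert 1j * complex(x, y) == complex(*(J @ np.array([x, y])))
```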
Thus, we can perhaps say (in our incorrigibly unrigorous way) that while $\R^2 \cong \C$ is true, it doesn't capture all the structure of $\C$, and in fact that $(\R^2, J) \cong (\C, i)$ is better, whatever it means. In fact, it doesn't mean much. All we really have is $\C$, a (very trivial!) vector space over the complex numbers; and $\R^2$, an (almost as trivial) vector space over the real numbers, with an extra operator $J$ to give it a complex structure. If we didn't do this, $\R^2$ might be feeling rather shallow without any complex structure!
This investigation into the difference between $\R^2$ and $\C$, and how to put a complex structure on $\R^2$, is all rather edifying but unfortunately not very interesting. So let us move straight to ---
\emph{For my second trick}, I hereby declare that $x,y$ are complex numbers! They looked very real of course, but no longer.
This is a bit silly. If in my first trick I declared that $z = x+yi$, and I now declare that $x,y$ are actually complex, this seems to defeat the purpose of introducing $x,y$ in the first place --- as real and imaginary parts of $z$. And, sure enough, it doesn't make much sense: $z$ here was already complex; no need to complexify it further.
However! If we forget about our original $z$, and only think of our $x,y$ as real numbers, $(x,y) \in \R^2$, then there is no such absurdity in now calling $(x,y)$ complex. We had an ordered pair --- now we still have an ordered pair. The only difference is that previously the ordered pair was of real numbers; now it is a pair of complex numbers. Really, we now have $(x,y) \in \C^2$; but our $\C^2 \cong \R^4$, again, as real vector spaces.
Well, now we certainly have something with a complex structure --- if only because we declared everything to be complex!
However, there is a slight hitch: there are two different ways we might define the complex structure.
\emph{The first way}, and perhaps the more obvious, would simply be to note that $x,y$ are now complex numbers, and so we can multiply them by $i$! That is, we can define
\[
i(x,y) = (ix, iy).
\]
But that's not the only way!
\emph{The second way} would be simply to continue where we left off. We had a complex structure $J$ on our real vector space $\R^2$, defined by $J(x,y) = (-y,x)$. We just declared that $x,y$ are now complex, but that doesn't make this definition of $J$ invalid; $J$ still makes sense. Moreover, we still have $J^2 = -I$. So this is just as good a complex structure:
\[
J(x,y) = (-y,x).
\]
These are quite different! For instance, if we take the pair $(1,0)$, we have $i(1,0) = (i,0)$, but $J(1,0) = (0,1)$. How do these two complex structures relate?
We can easily write them as matrices. If we write $x = a + b i$, $y = c + d i$, then we can regard $(x,y) = ( a + b i, c + d i)$ as $(a,b,c,d)$. That is, $\C^2 \cong \R^4$. Then we have $i(a + b i, c + d i) = (-b + a i, -d + c i)$ and so
\[
i = \begin{bmatrix} & -1 && \\ 1 &&& \\ &&& -1 \\ && 1 & \end{bmatrix}.
\]
On the other hand, we have $J(a+bi, c+di) = (-c-di, a+bi)$ and so
\[
J = \begin{bmatrix} && - 1 & \\ &&& -1 \\ 1 &&& \\ & 1 && \end{bmatrix}.
\]
On $\R^4$, we have two complex structures. On $\C^2$, however, it is $i$ that is more natural, being, after all, an actual complex number. The operator $J$, considered in this way, is a real linear operator on a real vector space that happens to satisfy $J^2 = -1$, so it is \emph{like} complex multiplication by $i$; this is what a complex structure is. It can also be considered as a \emph{complex} linear operator on a \emph{complex} vector space --- the operator being given by, of course,
\[
\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}
\]
as it was before, but now acting on $\C^2$.
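These two $4 \times 4$ matrices can be checked numerically as well. The following Python (numpy) sketch simply transcribes the matrices above, and confirms that both square to $-I$, that they commute (as they must, $J$ being complex linear on $\C^2$), and that they act quite differently on $(1,0)$:

```python
import numpy as np

# (x, y) = (a + bi, c + di) recorded as (a, b, c, d) in R^4.
# Multiplication by i acts on each complex coordinate separately:
I4 = np.array([[0, -1, 0,  0],
               [1,  0, 0,  0],
               [0,  0, 0, -1],
               [0,  0, 1,  0]])

# J(x, y) = (-y, x) swaps the coordinate pairs, with a sign:
J4 = np.array([[0, 0, -1,  0],
               [0, 0,  0, -1],
               [1, 0,  0,  0],
               [0, 1,  0,  0]])

# Both are complex structures on R^4:
assert np.array_equal(I4 @ I4, -np.eye(4, dtype=int))
assert np.array_equal(J4 @ J4, -np.eye(4, dtype=int))
# They commute, which is what makes J complex linear on C^2:
assert np.array_equal(I4 @ J4, J4 @ I4)
# But they differ: i(1,0) = (i,0) while J(1,0) = (0,1), as vectors in R^4:
e = np.array([1, 0, 0, 0])
assert np.array_equal(I4 @ e, [0, 1, 0, 0])  # (i, 0)
assert np.array_equal(J4 @ e, [0, 0, 1, 0])  # (0, 1)
```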
Poor old $J$ must be feeling a bit inferior, compared to $i$! However, the operator $J$ gives rise to a lot of structure! We just need to look a little closer. It turns out that $J$ is sometimes like multiplication by $i$ --- how often? Does $J$ ever coincide with multiplication by $i$? It's easy to see:
\[
J(x,y) = i(x,y) \; \Rightarrow \; (-y,x) = (ix,iy) \; \Rightarrow \; x = iy.
\]
So, whenever $x=iy$, $J$ coincides with $i$. That is, $J(iy,y) = (-y,iy) = i(iy,y)$. Another way of saying this is that $i$ is an \emph{eigenvalue} of $J$ with eigenvector $(i,1)$. This makes sense, since $J$ is not only a real linear operator on $\R^4$, but also a complex linear operator on $\C^2$ --- and the eigenspace of $i$ describes precisely where $J$ coincides with the more natural complex structure on $\C^2$. Moreover, as it turns out, we can \emph{diagonalize} $J$ over the complex numbers. Since $J^2 = -1$, the only possible eigenvalues are $\pm i$, and it is easy to find the $(-i)$-eigenspace:
\[
J(x,y) = -i(x,y) \; \Rightarrow \; (-y,x) = (-ix, -iy) \; \Rightarrow \; y = ix.
\]
Thus $J(x,ix) = (-ix,x) = -i(x,ix)$ and the $(-i)$-eigenspace is spanned by $(1,i)$. Since $(i,1)$ and $(1,i)$ span $\C^2$ (over $\C$), $J$ is indeed diagonalizable over the complex numbers.
What does this mean? There is a $1$-complex-dimensional subspace ($2$-real-dimensional) of $\C^2$ on which $J$ --- the complex linear operator on a complex vector space --- is the same as the more natural multiplication by $i$, that is, $J(x,y) = i(x,y)$. Then there is another $1$-complex-dimensional subspace on which $J$ is \emph{complex anti-linear} with respect to the natural multiplication by $i$, that is, $J(x,y) = -i(x,y)$. And these two subspaces span $\C^2$.
So, given any pair of complex numbers $(\alpha,\beta)$, we can write it as a sum of two pairs, on which $J$ respectively acts as multiplication by $i$ and $-i$. In fact, explicitly:
\[
(\alpha, \beta) = \left( \frac{\alpha + \beta i}{2}, \frac{\beta - \alpha i}{2} \right) + \left( \frac{\alpha - \beta i}{2}, \frac{\beta + \alpha i}{2} \right) = \frac{\beta - \alpha i}{2} \left( i,1 \right) + \frac{\alpha - \beta i}{2} \left( 1,i \right).
\]
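For the unconvinced, this decomposition is easy to verify numerically. The following Python (numpy) sketch checks the eigenvectors $(i,1)$ and $(1,i)$, and the explicit splitting above for a sample pair $(\alpha, \beta)$ (the particular numbers are arbitrary):

```python
import numpy as np

# J(x, y) = (-y, x), now as a complex-linear operator on C^2.
J = np.array([[0, -1],
              [1,  0]], dtype=complex)

# Eigenvectors: (i, 1) with eigenvalue i, and (1, i) with eigenvalue -i.
assert np.allclose(J @ np.array([1j, 1]),  1j * np.array([1j, 1]))
assert np.allclose(J @ np.array([1, 1j]), -1j * np.array([1, 1j]))

# The explicit splitting of an arbitrary pair (alpha, beta):
alpha, beta = 2 + 3j, -1 + 4j
v = np.array([alpha, beta])
plus  = (beta - alpha * 1j) / 2 * np.array([1j, 1])  # i-eigenspace part
minus = (alpha - beta * 1j) / 2 * np.array([1, 1j])  # (-i)-eigenspace part
assert np.allclose(plus + minus, v)
assert np.allclose(J @ plus,  1j * plus)    # J acts as i here
assert np.allclose(J @ minus, -1j * minus)  # ... and as -i here
```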
Thus, $J$ is much more interesting than $i$ --- it is sometimes $i$, and sometimes $-i$, and usually a bit of both.
What does this mean? In one sense, all we have done is diagonalize the matrix $\begin{bmatrix} 0 & -1 \\1 & 0 \end{bmatrix}$ over $\C$, and found it has eigenvalues $\pm i$. But there is more to it than that, because this matrix was originally a complex multiplication itself. We started with $\R$ and turned it into $\C \cong \R^2$. Thus $\R^2$ was furnished with a complex structure $J$. But then we turned $\R^2$ into $\C^2 \cong \R^4$. This $\R^4$ still inherits a complex structure $J$ from $\R^2 \cong \C$; but now also obtains the obvious complex structure $i$.
Of course, this trick of declaring real numbers suddenly to be complex really corresponds to nothing other than \emph{tensoring a real vector space with $\C$} (over $\R$). Our magic is really tensor magic! First we found that $\R \otimes_\R \C \cong \C \cong (\R^2, J)$. Then we found that $\C \otimes_\R \C \cong \C^2 \cong \R^4$. Keeping track of the complex structures, $J$ extends (real-)linearly over the complex vector space and we find that
\[
\C \otimes_\R \C \cong (\R^2, J) \otimes_\R \C = \left( (i,1) \C, i \right) \oplus \left( (1,i) \C, -i \right).
\]
Thus, if we tensor a complex vector space with the complex numbers --- and extend the original complex structure $J$ linearly over the new tensor product --- then the result does \emph{not} coincide with multiplication by $i$ on that tensor product. The tensor product decomposes into a \emph{complex linear} and a \emph{complex anti-linear} part. A succinct way of saying this is
\[
\C \otimes_\R \C \cong \C \oplus \bar{\C}
\]
where $\bar{\C}$ corresponds to $\C$ with its orientation reversed: multiplication by $i$ becomes multiplication by $-i$.
Using the same argument, one can prove that for any complex vector space $V$,
\[
V \otimes_\R \C \cong V \oplus \bar{V}
\]
where, again, $\bar{V}$ means $V$ with orientation reversed, or equivalently, conjugated.
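One can watch this splitting happen for a sample $V$. The sketch below (Python with numpy) takes $V = \R^{2n}$ with a block-diagonal complex structure $J$ --- an arbitrary choice, purely for illustration --- extends $J$ complex-linearly to $\C^{2n}$, and checks that the eigenvalues $i$ and $-i$ each appear with multiplicity $n$:

```python
import numpy as np

n = 3  # the complex dimension of V; as a real vector space, V = R^{2n}

# A block-diagonal complex structure on R^{2n}: n copies of [[0,-1],[1,0]].
block = np.array([[0.0, -1.0], [1.0, 0.0]])
J = np.kron(np.eye(n), block)
assert np.allclose(J @ J, -np.eye(2 * n))

# Tensoring with C amounts to extending J complex-linearly and diagonalizing.
eigenvalues = np.linalg.eigvals(J)

# The eigenvalues i and -i each occur with multiplicity n: the i-eigenspace
# is a copy of V, and the (-i)-eigenspace a copy of V-bar.
assert np.sum(np.isclose(eigenvalues, 1j)) == n
assert np.sum(np.isclose(eigenvalues, -1j)) == n
```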
Complex numbers are even more complex than we thought!
\section{Dual duel!}
Now, let us continue in a slightly different vein our considerations of complex vector spaces. Let us consider the \emph{dual} $V^*$ of a complex vector space $V$.
Of course, $V^*$ should also be a complex vector space. But there are two competing ways we can define the dual $V^*$, leading to a dual duel!
Firstly, a complex vector space $V$ can be thought of as a real vector space with a complex structure $J: V \To V$ on it, with $J^2 = -1$. The dual is then described as for real vector spaces:
\[
V^* = \text{Hom}_\R \left( V, \R \right),
\]
i.e. the \emph{real}-linear maps from $V$ to $\R$. Note that, if we take a random (real) inner product on $V$, then for given $v \in V$, the map $x \mapsto \langle v, x \rangle$ is a real-linear map from $V$ to $\R$; it turns out that \emph{all} such linear maps can be written in this way. But let us not take any old random inner product on $V$; let us take one which is \emph{compatible} with $J$. We should intuitively think of $J$ as ``a rotation by $90$ degrees", and hence we should think of our inner product as ``a metric in which $J$ \emph{actually is} a rotation by $90$ degrees". Formally, this means that $\langle v,w \rangle = \langle Jv, Jw \rangle$.
In any case, with an inner product we have set up a 1-1 correspondence between $V$ and $V^*$, given by:
\[
\begin{array}{ccc}
V & \cong & V^* \\
v & \sim & \left(x \mapsto \langle v, x \rangle \right)
\end{array}
\]
(Another way of writing $x \mapsto \langle v, x \rangle$ is simply to write $\langle v, \cdot \rangle$: this is a function waiting for an argument!) Now, this defines $V^*$ as a real vector space, but not a complex one. Our aim now is to give it a complex structure $J^*$.
Well, if there is a complex structure $J$ on $V$, one way to put a complex structure on $V^*$ would be to do so through the isomorphism $V \cong V^*$! So, if $f \in V^*$ corresponds to $v \in V$ (i.e. $f = \langle v, \cdot \rangle$) then we should have $J^* f$ corresponding to $Jv$. Namely,
\[
J^* f = \langle Jv, \cdot \rangle, \quad \text{or} \quad J^* f(x) = \langle Jv, x \rangle.
\]
We can rewrite this, since our inner product is compatible with $J$:
\[
J^*f(x) = \langle Jv, x \rangle = \langle -v, Jx \rangle = - \langle v, Jx \rangle = - f(Jx).
\]
This equation $J^*f(x) = -f(Jx)$ is excellent, because it \emph{does not depend} on the choice of inner product! It should be clear (or an easy exercise!) that $(J^*)^2 = -1$; it actually is a complex structure. So we have defined a canonical complex structure on $V^*$, making it into a complex vector space; we can write this as $(V^*, J^*)$.
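Both the inner-product-free formula and $(J^*)^2 = -1$ admit a quick numerical check. In the Python (numpy) sketch below, the standard inner product on $\R^2$ (which is compatible with $J$) stands in for our chosen inner product, and the vectors are random:

```python
import numpy as np

J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# With the standard (J-compatible) inner product on R^2, the functional
# f = <v, .> is represented by the vector v, and J* sends f to <Jv, .>.
rng = np.random.default_rng(0)
v, x = rng.standard_normal(2), rng.standard_normal(2)

# The inner-product-free formula  (J* f)(x) = -f(Jx):
lhs = (J @ v) @ x     # (J* f)(x) = <Jv, x>
rhs = -(v @ (J @ x))  # -f(Jx)    = -<v, Jx>
assert np.isclose(lhs, rhs)

# Applying J* twice negates f, so (J*)^2 = -1: <J(Jv), x> = -<v, x>.
assert np.isclose((J @ J @ v) @ x, -(v @ x))
```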
But, recall, there is a second method, duelling with the first method in the dual duel!
In our second method, we regard $V^*$ as a complex vector space from the beginning. After all, $V$ was! Then
\[
V^* = \text{Hom}_\C \left( V, \C \right),
\]
so it makes perfect sense to multiply elements of $V^*$ by $i$. An element of $V^*$ is a complex-linear map $f: V \To \C$, and hence the map $if: V \To \C$ can be defined by $(if)(x) = i f(x)$. Such a simple definition that it is almost written into our notation! (The physicists would be proud!)
Note that now $(if)(x) = if(x) = f(ix)$, since $f \in V^*$ was defined to be complex linear. Note that here we are writing $i$ instead of $J^*$ because it is so natural to do so; if we wrote $J$ and $J^*$ instead, then the previous equation becomes $J^* f(x) = f(Jx)$.
We will write the complex structure on the dual defined by this second method as $V_\C^*$ (and we will write $V$ as $V_\C$ to emphasise this when necessary).
Note that both methods, duelling for the right to dual $V$, give isomorphisms. For the first method we described how $V^* \cong V$; for the second method it is true by a similar argument. But are these isomorphisms complex linear?
In the first case, complex linearity requires that if $v \in V$ corresponds to $f \in V^*$, then $Jv$ corresponds to $J^* f$. But this is exactly how we defined $J^*$! So the first method, although perhaps more convoluted, gives us a $(V^*, J^*)$ which is isomorphic to $V$ via a \emph{complex-linear} isomorphism.
In the second case, this is not so clear, not least because we waved our hands about ``a similar argument"! So let us see how this works. To define the isomorphism, we again take an inner product on $V$ --- but this time, it must be a \emph{complex} inner product, with the properties $\langle \lambda v, w \rangle = \langle v, \bar{\lambda} w \rangle = \lambda \langle v, w \rangle$ for $\lambda \in \C$. (Note the shiftiness --- conjugation --- in the second coordinate; this is standard, and necessary for positive-definiteness.) Then we again note that, for $v \in V$, the map $V \To \C$, $x \mapsto \langle x, v \rangle$ is complex linear --- this is true since $x$ is in the first coordinate of the inner product --- and that any complex linear map to $\C$ can be written in this way. So we have an isomorphism
\[
\begin{array}{ccc}
V & \cong & V_\C^* \\
v & \sim & \langle \cdot, v \rangle.
\end{array}
\]
We can then see how the complex structure on $V_\C^*$ works in relation to $v$: if $f \in V_\C^*$ corresponds to $v \in V$, then what does $if$ correspond to? Well, then $f(x) = \langle x, v \rangle$, so $(if)(x) = i \langle x, v \rangle = \langle x, -iv \rangle$. (Note we have used the shiftiness --- more accurately, complex anti-linearity --- of the second coordinate.) So $if$ corresponds to $-iv$.
That is, the second method, which seems more natural, leads to an isomorphism between $V_\C^*$ and $V$ which is not complex linear --- in fact, complex \emph{anti-linear}.
That is, $(V^*, J^*) \cong (V, J)$ but $(V_\C^*, i) \cong (V, -i) \cong \bar{V}$, where $\bar{V}$ denotes $V$ with the orientation reversed, $i$ becoming $-i$.
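The anti-linearity is easy to see in coordinates. In the Python (numpy) sketch below, the Hermitian inner product is the standard one on $\C^2$ (linear in the first slot, as above), and the sample vectors are arbitrary:

```python
import numpy as np

# A Hermitian inner product on C^2, complex linear in the FIRST slot:
def herm(x, v):
    return np.vdot(v, x)  # np.vdot conjugates its first argument

v = np.array([2 + 1j, -1 + 3j])
x = np.array([1 - 2j,  4 + 1j])

f = lambda y: herm(y, v)  # the functional <., v> corresponding to v

# Multiplying the functional by i corresponds to multiplying v by -i:
assert np.isclose(1j * f(x), herm(x, -1j * v))

# So v -> <., v> is complex ANTI-linear: scaling v by c scales f by conj(c).
c = 0.5 - 2j
assert np.isclose(herm(x, c * v), np.conj(c) * f(x))
```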
The result of the dual duel is therefore: the most natural way of defining the dual of a complex vector space leads to the dual being isomorphic to the conjugate of the original vector space! \emph{The dual is the conjugate.} If we are prepared to do something more convoluted, going back to a real vector space with complex structure operator $J$, and transplant this to the dual over the reals, then we can obtain a dual which is complex-isomorphic to the original vector space.
\section{1, 2, 4, dual!}
Putting these two sets of considerations together, we note the conclusion of the first section,
\[
V \otimes_\R \C \cong V \oplus \bar{V},
\]
can now be extended! Upon tensoring a complex vector space by the complex numbers over the reals, we obtain the direct sum of a copy of the vector space, and its conjugate. But (provided we take the more natural version) this is just the dual of $V$.
\[
V \otimes_\R \C \cong V \oplus V^*
\]
\section{Why is this important?}
These may seem like rather obscure, indeed, arcane musings. But this is not the case at all. These considerations are at the heart of much algebra, geometry and topology. For instance:
\begin{enumerate}
\item
Our considerations of duals and conjugates explain why complex dual bundles correspond to conjugate bundles, and in turn why their characteristic classes pick up minus signs.
\item
The splitting of $V \otimes \C \cong V \oplus \bar{V}$ is the starting point of Dolbeault cohomology, which is central to complex geometry and topology. Just as (real-valued) differential forms are central to (real!) differential geometry and smooth topology, not least through de Rham cohomology --- so too complex-valued differential forms are central to complex geometry and topology, not least through Dolbeault cohomology.
\item
The process of tensoring with $\C$ producing a complex vector space, and its conjugate, is the key to the definition of the Pontryagin characteristic classes. And the Pontryagin classes are very important objects in topology, not least because of the Hirzebruch signature theorem.
\end{enumerate}
\section{Postscript}
These are whimsical notes based on a discussion in a book on $4$-manifolds, \cite{Scorpan}, pp. 134--137. The Hirzebruch signature theorem, Pontryagin classes, and complex geometry are of course all very important in dimension $4$, as of course are vector spaces of dimension $4$! For the applications of this idea to characteristic classes, see \cite{MS} or \cite{BT} or any other book on the topic. For the application to complex geometry, see e.g. \cite{H}. For the algebraic side of the picture with dual bundles, see e.g. \cite{GH}. A more specialized but beautiful realm where all this applies is complex line bundles on Riemann surfaces (i.e. complex curves); see e.g. \cite{J}.
\begin{thebibliography}{99}
\bibitem{GS}
Gompf and Stipsicz, \emph{4-Manifolds and Kirby Calculus}.
\bibitem{MS}
Milnor and Stasheff, \emph{Characteristic Classes}.
\bibitem{BT}
Bott and Tu, \emph{Differential Forms in Algebraic Topology}.
\bibitem{H}
Huybrechts, \emph{Complex Geometry: An Introduction}.
\bibitem{GH}
Griffiths and Harris, \emph{Principles of Algebraic Geometry}.
\bibitem{J}
Jost, \emph{Compact Riemann Surfaces: An Introduction to Contemporary Mathematics}.
\bibitem{Scorpan}
Scorpan, \emph{The Wild World of $4$-Manifolds}.
\end{thebibliography}
\end{document}