March 5, 2020

eigenvalue decomposition of Pauli matrices

At the heart of linear algebra is a task called eigenvalue decomposition. This task, in the simplest words possible, allows you to analyze a given matrix to determine how it’s constructed from a combination of basis vectors, called eigenvectors, and scalars, called eigenvalues. Many statistical models, machine learning algorithms, and scientific theories use eigenvalue decomposition to go from a muddle of data to an understandable theory.

Here, I’ll talk about using eigenvalue decomposition to inspect the fundamental logic gates used in quantum computing.

What are the Pauli matrices?

A Pauli matrix describes a rotation of a complex vector $v \in \mathbb{C}^{2}$. Visually, imagine a unit sphere; the vector represents an arrow from the center to the surface of the sphere. The rotation performed by a Pauli matrix occurs about the X, Y, or Z axis, respectively, of our visualization. We use these matrices to represent fundamental gates in quantum computing.

The Pauli $X$ matrix is $$\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$$ and performs a flip (a 180° rotation) about the x-axis. The Pauli $Y$ matrix is $$\begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix}$$ and performs a flip about the y-axis. The Pauli $Z$ matrix is $$\begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$$ and performs a flip about the z-axis.
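These three matrices can be written directly in NumPy; a minimal sketch (assuming only `numpy` as a dependency) that also checks the "flip" intuition — applying the same flip twice returns you to where you started:

```python
import numpy as np

# The three Pauli matrices as complex 2x2 arrays
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

I2 = np.eye(2, dtype=complex)

# Each flip undoes itself: P applied twice is the identity (P @ P == I)
for P in (X, Y, Z):
    assert np.allclose(P @ P, I2)
```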

What are eigenvalues and eigenvectors?

An eigenvalue and an eigenvector form a pair of data that describe a matrix. Specifically, the description has the form $Av = \lambda v$. We’ll assume as above that $v \in \mathbb{C}^{2}$ and therefore $A \in \mathbb{C}^{2 \times 2}$. $\lambda$ is a scalar in $\mathbb{C}$. Since there may be many such pairs, we narrow down the possibilities with a characteristic function that states $c(\lambda) = det(A - \lambda I) = 0$, where $I$ is the identity matrix.
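To make the definition concrete, here is a small numeric check of $Av = \lambda v$, using a diagonal matrix chosen purely for illustration (its eigenpairs are obvious by inspection):

```python
import numpy as np

# For a diagonal matrix, the standard basis vectors are eigenvectors
# and the diagonal entries are the matching eigenvalues.
A = np.array([[2, 0], [0, 3]], dtype=complex)

v = np.array([1, 0], dtype=complex)  # eigenvector paired with lambda = 2
lam = 2.0

# A v and lambda v agree: exactly the statement A v = lambda v
assert np.allclose(A @ v, lam * v)
```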

The process of eigenvalue decomposition takes a given matrix $A$ and produces its eigenvalues $\lambda_{i}$ and eigenvectors $v_{i}$. The process goes as follows (described in MML Example 4.5):

  1. Write the matrix representation into the characteristic function.
  2. Solve for the eigenvalues.
  3. Substitute each eigenvalue $\lambda$ into $(A - \lambda I)v = 0$ and solve for the corresponding eigenvector.

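These three steps are what `numpy.linalg.eig` carries out numerically; a sketch, using an arbitrary symmetric example matrix (not from the original text):

```python
import numpy as np

A = np.array([[2, 1], [1, 2]], dtype=complex)  # example matrix

# eig returns the eigenvalues and a matrix whose columns are eigenvectors
vals, vecs = np.linalg.eig(A)

# Each (eigenvalue, eigenvector) pair satisfies A v = lambda v
for i in range(len(vals)):
    v = vecs[:, i]
    assert np.allclose(A @ v, vals[i] * v)
```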
I’m going to demonstrate using the Pauli matrices.

Pauli X - eigenvalue decomposition

Step 1

Write the matrix representation into the characteristic function.

$$ \begin{align} det( \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} ) = 0 \end{align} $$

Step 2

Solve for the eigenvalues.

$$ det( \begin{bmatrix} - \lambda & 1 \\ 1 & - \lambda \end{bmatrix} ) = 0 $$

$$ \lambda^2 - 1 = 0 $$

$$ \lambda = \pm 1 $$

Step 3

For $\lambda = 1$, substitute into $(A - \lambda I)v = 0$:

$$ \begin{bmatrix} -1 & 1 \\ 1 & -1 \end{bmatrix} v = 0 $$

$$ v = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \end{bmatrix} $$

Similarly, for $\lambda = -1$ we get $v = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -1\end{bmatrix}$. The scalar $\frac{1}{\sqrt{2}}$ on each solution comes from the requirement that the eigenvector $v$ be a unit vector, i.e., its distance from the origin equals one.
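We can check the hand calculation numerically. Since $X$ is Hermitian, `numpy.linalg.eigh` applies (it returns eigenvalues in ascending order); this verification is my addition, not part of the original derivation:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
vals, vecs = np.linalg.eigh(X)  # ascending eigenvalues for Hermitian input

# Eigenvalues come out as -1 and +1
assert np.allclose(vals, [-1, 1])

# Eigenvectors match the hand calculation only up to a global phase,
# so compare via |<u, v>| = 1 instead of direct equality
v_plus = np.array([1, 1]) / np.sqrt(2)
v_minus = np.array([1, -1]) / np.sqrt(2)
assert np.isclose(abs(vecs[:, 1].conj() @ v_plus), 1)
assert np.isclose(abs(vecs[:, 0].conj() @ v_minus), 1)
```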

Pauli Y - eigenvalue decomposition

Step 1

$$ det( \begin{bmatrix} 0 & -i \\ i & 0 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} ) = 0 $$

Step 2

$$ det( \begin{bmatrix} - \lambda & -i \\ i & - \lambda \end{bmatrix} ) = 0 $$

$$ \lambda^2 - 1 = 0 $$

$$ \lambda = \pm 1 $$

Step 3

For $\lambda = 1$, substitute into $(A - \lambda I)v = 0$:

$$ \begin{bmatrix} -1 & -i \\ i & -1 \end{bmatrix} v = 0 $$

$$ v = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ i \end{bmatrix} $$

Similarly, for $\lambda = -1$ we get $v = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ -i\end{bmatrix}$. As with the $X$ matrix, the scalar $\frac{1}{\sqrt{2}}$ normalizes each eigenvector to unit length.
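As with $X$, the result can be checked numerically. $Y$ is also Hermitian (so its eigenvalues are real, ±1), and `numpy.linalg.eigh` again applies; this check is my addition:

```python
import numpy as np

Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
vals, vecs = np.linalg.eigh(Y)  # Y is Hermitian, so eigenvalues are real

assert np.allclose(vals, [-1, 1])

# Compare to the hand-computed eigenvectors up to a global phase
v_plus = np.array([1, 1j]) / np.sqrt(2)    # pairs with lambda = +1
v_minus = np.array([1, -1j]) / np.sqrt(2)  # pairs with lambda = -1
assert np.isclose(abs(vecs[:, 1].conj() @ v_plus), 1)
assert np.isclose(abs(vecs[:, 0].conj() @ v_minus), 1)
```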

Pauli Z - eigenvalue decomposition

Step 1

$$ det( \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} ) = 0 $$

Step 2

$$ det( \begin{bmatrix} 1 - \lambda & 0 \\ 0 & -1 - \lambda \end{bmatrix} ) = 0 $$

$$ \lambda^2 - 1 = 0 $$

$$ \lambda = \pm 1 $$

Step 3

For $\lambda = 1$, substitute into $(A - \lambda I)v = 0$:

$$ \begin{bmatrix} 0 & 0 \\ 0 & -2 \end{bmatrix} v = 0 $$

$$ v = \begin{bmatrix} 1 \\ 0 \end{bmatrix} $$

Similarly, for $\lambda = -1$ we get $v = \begin{bmatrix} 0 \\ 1\end{bmatrix}$. Unlike with the $X$ matrix, these solutions are already unit vectors, so no extra normalization is needed.
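A final numerical check, again using `numpy.linalg.eigh` since $Z$ is Hermitian (my addition, not part of the original walkthrough):

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
vals, vecs = np.linalg.eigh(Z)  # ascending order: -1 first, then +1

assert np.allclose(vals, [-1, 1])

# lambda = +1 pairs with (1, 0); lambda = -1 pairs with (0, 1),
# matching the hand calculation up to a global phase
assert np.isclose(abs(vecs[:, 1].conj() @ np.array([1, 0])), 1)
assert np.isclose(abs(vecs[:, 0].conj() @ np.array([0, 1])), 1)
```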

Content by © Jared Davis 2019-2020