# Basics of Quantum Mechanics 2

We continue our study of the basics of quantum mechanics. Part 1 of this blog series can be found here.

## 4. Matrix formulation of quantum mechanics

The matrix formulation of quantum mechanics was first proposed by Werner Heisenberg, Max Born, and Pascual Jordan. Later, it was shown to be equivalent to the wave mechanics formulation introduced by Erwin Schrödinger.

The matrix formulation is particularly useful when we work with finite, discrete bases. This is because quantum mechanical quantities can then be expressed mostly as matrix multiplications — an operation we understand well from linear algebra, and one that has been highly optimized both in software (e.g. by leveraging structure such as sparsity) and in hardware (e.g. on GPUs).

Let’s start with the basics. Recall that a ket $| \psi \rangle = \sum_i c_i |u_i \rangle$ where $c_i = \langle u_i | \psi \rangle$; the coefficients $c_i$ can be thought of as the representation of the ket $|\psi\rangle$ in the $\{ |u_i\rangle \}$ basis. To write the ket $|\psi\rangle$ in matrix form, we simply stack these coefficients as a column vector as follows:

$$
\begin{bmatrix}
\langle u_1 | \psi \rangle\\
\langle u_2 | \psi \rangle\\
\vdots\\
\langle u_i | \psi \rangle\\
\vdots
\end{bmatrix}
= \begin{bmatrix}
c_1\\
c_2\\
\vdots\\
c_i\\
\vdots
\end{bmatrix}.
$$

Similarly, to express a bra $\langle \psi | = \sum_i c_i^* \langle u_i |$ where $c_i^* = \langle \psi | u_i \rangle = \langle u_i | \psi \rangle^*$, we write it as a row vector:

$$[\langle \psi | u_1 \rangle~~\langle \psi | u_2 \rangle~\cdots~\langle \psi | u_i \rangle~\cdots] = [c_1^*~~c_2^*~\cdots~c_i^*~\cdots]. $$
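As a quick numerical sketch (plain Python, made-up coefficients in a hypothetical 3-dimensional basis): the ket is just the list of coefficients $c_i$, the bra is its conjugate transpose, and their product gives $\langle \psi | \psi \rangle$.

```python
# A toy ket in a 3-dimensional basis {|u_1>, |u_2>, |u_3>}:
# the column vector is just the list of coefficients c_i = <u_i|psi>.
# (Hypothetical numbers, chosen only for illustration.)
ket = [1 + 2j, 0.5j, -1.0]

# The corresponding bra is the conjugate transpose: a row vector of c_i*.
bra = [c.conjugate() for c in ket]

# <psi|psi> is then the bra-ket (row-times-column) product,
# which is real and non-negative: sum_i |c_i|^2.
norm_sq = sum(b * k for b, k in zip(bra, ket))
print(norm_sq)  # → (6.25+0j)
```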

Now, we look at an operator $\hat{A} = \sum_{ij} A_{ij} |u_i \rangle \langle u_j|$ where $A_{ij} = \langle u_i | \hat{A} | u_j \rangle$, which can be expressed as a matrix:

$$
\begin{bmatrix}
A_{11} & A_{12} & \cdots & A_{1j} & \cdots \\
A_{21} & A_{22} & \cdots & A_{2j} & \cdots \\
\vdots & \vdots & & \vdots & \\
A_{i1} & A_{i2} & \cdots & A_{ij} & \cdots \\
\vdots & \vdots & & \vdots &
\end{bmatrix}
$$

As an example, let’s write the expression $|\psi'\rangle = \hat{A} |\psi\rangle$ in terms of matrix formulation. First, recall the following:

- $|\psi'\rangle = \sum_i c_i' |u_i\rangle$
- $|\psi\rangle = \sum_i c_i |u_i\rangle$

Then, we can express the coefficient $c_i'$ as follows:

$$
\begin{aligned}
c_i' &= \langle u_i | \psi' \rangle = \langle u_i | \hat{A}| \psi \rangle = \langle u_i | \hat{A} \mathbb{I} | \psi \rangle \\
&= \langle u_i | \hat{A} \left( \sum_j |u_j \rangle \langle u_j | \right) |\psi \rangle = \sum_j \langle u_i | \hat{A} | u_j \rangle \langle u_j | \psi \rangle \\
&= \sum_j A_{ij} c_j.
\end{aligned}
$$

The last expression from the above can be expressed in matrix formulation as follows:

$$
\begin{bmatrix}
A_{11} & A_{12} & \cdots & A_{1j} & \cdots \\
A_{21} & A_{22} & \cdots & A_{2j} & \cdots \\
\vdots & \vdots & & \vdots & \\
A_{i1} & A_{i2} & \cdots & A_{ij} & \cdots \\
\vdots & \vdots & & \vdots &
\end{bmatrix}
\cdot
\begin{bmatrix}
c_1\\
c_2\\
\vdots\\
c_i\\
\vdots
\end{bmatrix} =
\begin{bmatrix}
A_{11}c_1 + \cdots + A_{1j}c_j + \cdots \\
A_{21}c_1 + \cdots + A_{2j}c_j + \cdots \\
\vdots\\
A_{i1}c_1 + \cdots + A_{ij}c_j + \cdots \\
\vdots
\end{bmatrix} =
\begin{bmatrix}
c_1'\\
c_2'\\
\vdots\\
c_i'\\
\vdots
\end{bmatrix}.
$$
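The matrix-vector product above can be sketched numerically. Here is a minimal example in plain Python, with a made-up $2 \times 2$ operator matrix and made-up coefficients (not tied to any physical system):

```python
# Sketch of |psi'> = A |psi> in matrix form, with made-up 2x2 numbers.
A = [[1 + 0j, 0 + 2j],
     [0 - 2j, 3 + 0j]]      # A_ij = <u_i| A |u_j>  (Hermitian here)
c = [1 + 1j, 2 + 0j]        # c_j  = <u_j| psi>

# c'_i = sum_j A_ij c_j  -- plain matrix-vector multiplication.
c_prime = [sum(A[i][j] * c[j] for j in range(len(c)))
           for i in range(len(c))]
print(c_prime)  # → [(1+5j), (8-2j)]
```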

Therefore, the matrix product reproduces the expression $c_i' = \sum_j A_{ij} c_j$ we derived above. Similarly, we can express other quantum mechanical quantities we have seen in the matrix formulation.

## 5. Change of basis in quantum mechanics

Recall that we can represent a state space with an orthonormal basis. For example, we have been working with the basis $\{ | u_i \rangle \}$ where $\langle u_i | u_j \rangle = \delta_{ij}$. Then, we can express a ket $|\psi\rangle$ in the $\{ |u_i\rangle \}$ basis, i.e.

$$ |\psi \rangle = \sum_i c_i | u_i \rangle \quad\text{where}\quad c_i = \langle u_i | \psi \rangle. $$

We will look at how to express the ket $|\psi\rangle$ in a different basis, $\{ |v_j \rangle \}$, i.e.

$$ |\psi \rangle = \sum_j d_j | v_j \rangle \quad\text{where}\quad d_j = \langle v_j | \psi \rangle. $$

The steps go as follows: first, express the coefficient $d_j$ as above. Then, insert an identity and, using the resolution of the identity *in the* $\{ |u_i \rangle \}$ basis, find an expression that relates the coefficients $d_j$ and $c_i$. Mathematically,

$$
\begin{aligned}
d_j &= \langle v_j | \psi \rangle = \langle v_j | \mathbb{I} | \psi \rangle = \langle v_j | \left( \sum_i |u_i \rangle \langle u_i | \right) | \psi \rangle \\
&= \sum_i \langle v_j | u_i \rangle \langle u_i | \psi \rangle = \sum_i S_{ji} c_i.
\end{aligned}
$$

Thus, to compute $d_j$ given $c_i$, we simply need to compute the quantity $S_{ji} = \langle v_j | u_i \rangle$, which is called an “overlap.” Using the matrix formulation we studied in the previous chapter, we can also express this relationship as follows:

$$
\begin{bmatrix}
d_1\\
d_2\\
\vdots
\end{bmatrix} =
\begin{bmatrix}
S_{11} & S_{12} & \cdots \\
S_{21} & S_{22} & \cdots \\
\vdots & \vdots & \ddots
\end{bmatrix}
\cdot
\begin{bmatrix}
c_1\\
c_2\\
\vdots
\end{bmatrix}.
$$

We can also compute $c_i$ given $d_j$ in a similar fashion. If you follow steps similar to the derivation above, it turns out that:

$$ c_i = \sum_j \langle u_i | v_j \rangle d_j \quad\text{where}\quad \langle u_i | v_j \rangle = \langle v_j | u_i \rangle^* = S_{ji}^*. $$
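Both directions of the change of basis can be sketched with a small numerical example in plain Python. The numbers are made up: the $2 \times 2$ overlap matrix below is a rotation by 45 degrees, which is unitary and therefore keeps both bases orthonormal.

```python
import math

# Overlap matrix S_ji = <v_j|u_i> for a hypothetical 2-dimensional state
# space where {|v_j>} is {|u_i>} rotated by 45 degrees (real and unitary).
s = 1 / math.sqrt(2)
S = [[s, s],
     [s, -s]]

c = [1 + 0j, 0 + 1j]   # made-up coefficients in the {|u_i>} basis

# Forward: d_j = sum_i S_ji c_i
d = [sum(S[j][i] * c[i] for i in range(2)) for j in range(2)]

# Back: c_i = sum_j S_ji^* d_j  -- we should recover the original c.
c_back = [sum(S[j][i].conjugate() * d[j] for j in range(2))
          for i in range(2)]
assert all(abs(a - b) < 1e-12 for a, b in zip(c, c_back))
```

The round trip works precisely because $S$ is unitary, which is guaranteed whenever both bases are orthonormal.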

To close this chapter, let’s look at how we perform a change of basis with an operator $\hat{A}$. We write the elements of operator $\hat{A}$ in the $\{ |u_i\rangle \}$ basis and the $\{ |v_j\rangle \}$ basis as $A_{ik}^u = \langle u_i | \hat{A} | u_k \rangle$ and $A_{j\ell}^v = \langle v_j | \hat{A} | v_\ell \rangle$, respectively. Then, we can compute $A_{j\ell}^v$ in terms of $A_{ik}^u$ as follows:

$$
\begin{aligned}
A_{j\ell}^v &= \langle v_j | \hat{A} | v_\ell \rangle = \langle v_j | \mathbb{I}\hat{A}\mathbb{I} | v_\ell \rangle \\
&= \langle v_j | \left( \sum_i |u_i\rangle\langle u_i| \right) \hat{A} \left( \sum_k |u_k\rangle \langle u_k| \right) | v_\ell \rangle \\
&=\sum_{i,~k} \langle v_j | u_i \rangle \langle u_i | \hat{A} | u_k \rangle \langle u_k | v_\ell \rangle \\
&= \sum_{i,~k} S_{ji} A_{ik}^u S_{\ell k}^*.
\end{aligned}
$$

Again, we can also go in the opposite direction. Following similar steps, $A_{ik}^u$ expressed in terms of $A_{j\ell}^v$ is:

$$ A_{ik}^u = \sum_{j,~\ell}S_{ji}^*A_{j\ell}^v S_{\ell k}. $$
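In matrix form, the sum $\sum_{i,k} S_{ji} A_{ik}^u S_{\ell k}^*$ is just $S A^u S^\dagger$. Here is a hedged sketch in plain Python (made-up numbers: the same 45-degree rotation overlap as before, and a diagonal toy operator in the $u$ basis):

```python
import math

# Overlap matrix S_ji = <v_j|u_i>: a hypothetical 45-degree rotation
# between two orthonormal bases of a 2-dimensional state space.
s = 1 / math.sqrt(2)
S = [[s, s],
     [s, -s]]

Au = [[1 + 0j, 0 + 0j],
      [0 + 0j, -1 + 0j]]   # toy operator, diagonal in the u basis

def conj_t(M):
    """Conjugate transpose of a 2x2 matrix."""
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

def matmul(X, Y):
    """2x2 complex matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# A^v_{jl} = sum_{i,k} S_ji A^u_{ik} S*_{lk}  ==  S @ Au @ S^dagger
Av = matmul(matmul(S, Au), conj_t(S))
print(Av)  # the rotated operator is off-diagonal in the v basis
```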

## 6. Eigenvalues and eigenstates in quantum mechanics

As we saw in Chapter 2 of the previous post, operators are mathematical objects that allow us to describe physical properties. Eigenvalues and eigenstates are particularly important in quantum mechanics because, when we measure a physical property, the only possible outcomes are the eigenvalues of the associated operator; moreover, the state of the system after the measurement is the corresponding eigenstate. We start with the third postulate of quantum mechanics, which summarizes the above:

**Postulate III:** The result of a measurement of a physical quantity is one of the eigenvalues of the associated observable.

Mathematically, we can write down the eigenvalue equation as follows:

$$ \hat{A} |\psi\rangle = \lambda |\psi \rangle, $$

where $\hat{A}$ is an operator, $|\psi\rangle$ is a “special” ket called an eigenstate (i.e. eigenvector) of $\hat{A}$, and $\lambda$ is the corresponding eigenvalue. As can be seen, the operator $\hat{A}$ takes the ket $|\psi\rangle$ as input and outputs the same ket $|\psi\rangle$, only scaled by $\lambda$.
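We can verify the eigenvalue equation numerically for a small matrix whose eigenvectors are easy to see by inspection. The numbers below are purely illustrative, not tied to any physical system:

```python
# Minimal sketch: check A|psi> = lambda|psi> for a 2x2 Hermitian matrix.
A = [[0 + 0j, 1 + 0j],
     [1 + 0j, 0 + 0j]]       # toy off-diagonal operator; eigenvalues +1, -1

psi = [1 + 0j, 1 + 0j]       # (unnormalized) eigenstate with lambda = +1
lam = 1

# Apply the operator: (A psi)_i = sum_j A_ij psi_j
A_psi = [sum(A[i][j] * psi[j] for j in range(2)) for i in range(2)]

# The output is the same ket, scaled by the eigenvalue.
assert all(abs(a - lam * p) < 1e-12 for a, p in zip(A_psi, psi))
print(A_psi)  # → [(1+0j), (1+0j)]
```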