Basics of Quantum Mechanics 1

In this series of blog posts, I want to introduce some basics of quantum mechanics that can be a helpful starting point for learning about quantum computing. I myself do not have any physics background and found this series of video lectures extremely helpful. This post is mainly based on those lectures, and I hope it is useful to anyone who wants to start learning quantum computing.

1. Dirac notation in state space

State space is the name we give to the vector space in which the quantum system lives. Just like Euclidean space, which is used to describe the “world” in classical mechanics, state space is a vector space that shares many similarities with Euclidean space. For instance, like Euclidean space, state space is equipped with an inner product, which (together with completeness) makes it a Hilbert space.

Even though we use a lot of linear algebra in quantum mechanics, we use quite different notation, called “Dirac notation,” invented by one of the founders of quantum mechanics, Paul Dirac. We start with the first postulate of quantum mechanics:

Postulate I: The state of a physical system is characterized by a state vector that belongs to a complex vector space $\mathcal{V}$, called the state space of the system.

It turns out that the vector space properties of Euclidean space mostly translate to state space, with a few tweaks. Let’s recap the properties of (3-dimensional) Euclidean space first.

Euclidean space (in 3 dimensions for illustration) has the following properties:

  • An element $r$ is called a (3-dimensional) “vector”
  • Vector addition: $r_1 + r_2 = r_3 \in \mathbb{R}^3$
  • Commutativity of vector addition: $r_1 + r_2 = r_2 + r_1$
  • Associativity of vector addition: $(r_1 + r_2) + r_3 = r_1 + (r_2 + r_3)$
  • Identity for vector addition: $0 + r = r$
  • Inverse of vector addition: $r + (-r) = 0$
  • Scalar multiplication: $a \cdot r \in \mathbb{R}^3$ for $a \in \mathbb{R}$
  • Associativity of scalar multiplication: $a(br) = (ab)r$
  • Distributivity of scalar multiplication: $(a + b )r = ar + br$, and $a(r_1 + r_2) = ar_1 + ar_2$
  • Identity for scalar multiplication: $1\cdot r = r$

The above properties more or less translate directly to state space.

State space $\mathcal{V}$ has the following properties (a short numerical sketch follows this list):

  • An element $| \psi \rangle$ is called a “ket”
  • Vector addition: $|\psi_1 \rangle + |\psi_2\rangle = |\psi_3 \rangle \in \mathcal{V}$
  • Commutativity of vector addition: $|\psi_1 \rangle + |\psi_2\rangle = |\psi_2 \rangle + |\psi_1 \rangle$
  • Associativity of vector addition: $(|\psi_1 \rangle + |\psi_2\rangle) + |\psi_3\rangle = |\psi_1 \rangle + (|\psi_2\rangle +|\psi_3\rangle)$
  • Identity for vector addition: $0 + |\psi\rangle = |\psi \rangle$
  • Inverse of vector addition: $|\psi \rangle + (- |\psi \rangle) = 0$
  • Scalar multiplication: $a | \psi \rangle \in \mathcal{V}$ for $a \in \mathbb{C}$
  • Associativity of scalar multiplication: $a( b | \psi \rangle ) = (ab) | \psi \rangle$
  • Distributivity of scalar multiplication: $(a+b) | \psi \rangle = a| \psi \rangle + b | \psi \rangle$, and $a ( | \psi_1 \rangle + | \psi_2 \rangle ) = a| \psi_1 \rangle + a| \psi_2 \rangle$
  • Identity for scalar multiplication: $1 | \psi \rangle = | \psi \rangle$
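
Here is the promised numerical sketch: a minimal check of a few of these rules, using NumPy arrays with complex entries as stand-ins for kets in a finite-dimensional state space (the particular vectors and scalars are arbitrary):

```python
import numpy as np

# Two "kets" as complex vectors in a 2-dimensional state space (for illustration)
psi1 = np.array([1 + 1j, 0.5j])
psi2 = np.array([2.0, -1j])
a, b = 0.3 - 0.7j, 1.2 + 0.4j   # complex scalars

# Commutativity of vector addition
assert np.allclose(psi1 + psi2, psi2 + psi1)

# Distributivity of scalar multiplication
assert np.allclose((a + b) * psi1, a * psi1 + b * psi1)
assert np.allclose(a * (psi1 + psi2), a * psi1 + a * psi2)

# Identity for scalar multiplication
assert np.allclose(1 * psi1, psi1)
```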

Now, we move to some differences. Both Euclidean space and state space are examples of Hilbert spaces, and thus are equipped with a scalar (inner) product.

Scalar product in Euclidean space:

$SP(r_1, r_2) = r_1 \bullet r_2 = c, \quad c \in \mathbb{R}$

  • Conjugation: $r_1 \bullet r_2 = r_2 \bullet r_1$
  • Linearity: $r_1 \bullet a (r_2) = a(r_1) \bullet r_2 = a(r_1 \bullet r_2)$ and $r_1 \bullet (r_2 + r_3) = r_1 \bullet r_2 + r_1 \bullet r_3$
  • Positivity: $r_1 \bullet r_1 \geq 0$, and $r_1 \bullet r_1 = 0$ if and only if $r_1 = 0$

Scalar product in state space:

$SP(|\psi_1\rangle,~|\psi_2\rangle) = c, \quad c \in \mathbb{C}$

Notice that the scalar $c$ is now a complex number, which is one of the crucial differences between state space and Euclidean space. Now we look at the properties of the scalar product in state space (a small numerical check follows the list):

  • Conjugation: $SP(|\psi \rangle,~|\phi \rangle) = [SP(|\phi \rangle,~|\psi \rangle)]^*$
  • Linearity in second argument: $SP(|\psi \rangle,~a |\phi \rangle) = a SP(|\psi \rangle,~|\phi \rangle)$ and $SP(|\psi \rangle,~|\phi \rangle + |\chi\rangle) = SP(|\psi \rangle,~|\phi \rangle) + SP(|\psi \rangle,~|\chi \rangle)$
  • Anti-linearity in first argument: $SP(a|\psi \rangle,~ |\phi \rangle) = [SP(|\phi \rangle,~a|\psi \rangle)]^* = a^*[SP(|\phi \rangle,~ |\psi\rangle) ]^* = a^* SP(|\psi \rangle,~ |\phi \rangle)$, where $a^*$ is the complex conjugate of $a$.
  • Positivity: $SP( |\psi \rangle, |\psi \rangle ) \geq 0$, and $SP( |\psi \rangle, |\psi \rangle ) = 0$ iff $|\psi \rangle = 0$
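
As promised, here is a small numerical check of these properties, using NumPy’s `vdot` (which conjugates its first argument, matching the convention above); the particular vectors are arbitrary:

```python
import numpy as np

psi = np.array([1 + 1j, 2 - 1j])
phi = np.array([0.5j, 1.0])
a = 0.8 - 0.6j

# np.vdot conjugates its first argument, so it matches SP(|psi>, |phi>)
sp = np.vdot(psi, phi)

# Conjugation: SP(psi, phi) = [SP(phi, psi)]^*
assert np.isclose(sp, np.conj(np.vdot(phi, psi)))

# Linearity in the second argument, anti-linearity in the first
assert np.isclose(np.vdot(psi, a * phi), a * sp)
assert np.isclose(np.vdot(a * psi, phi), np.conj(a) * sp)

# Positivity: SP(psi, psi) is real and non-negative
assert np.vdot(psi, psi).real >= 0
```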

So the main difference comes from the fact that state space is a complex vector space, rather than a real one. Through the scalar product, we can define some basic notation for quantum mechanics (a short sketch follows the list):

  • $SP(|\psi\rangle, |\phi\rangle) \implies \langle \psi |\phi \rangle$ is called a “bra-ket.”
  • $\langle \psi |$ is called a “bra,” and it corresponds to a row vector. (And of course the ket $|\phi \rangle$ corresponds to a column vector.)
  • A “ket” $|\psi \rangle \in \mathcal{V}$ maps to a “bra” $\langle \psi | \in \mathcal{V}^*$ in the dual space
  • The scalar product is anti-linear in the first argument: $a|\psi\rangle \longleftrightarrow a^* \langle \psi |$
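
The sketch below makes the bra/ket correspondence concrete in a finite-dimensional state space: a ket is a column vector, its bra is the conjugate transpose (a row vector), and the bra-ket is an ordinary matrix product (the particular vectors are arbitrary):

```python
import numpy as np

ket_psi = np.array([[1 + 1j], [2j]])   # |psi> as a 2x1 column vector
ket_phi = np.array([[0.5], [1 - 1j]])  # |phi> as a 2x1 column vector

# <psi| is the conjugate transpose of |psi>: a 1x2 row vector
bra_psi = ket_psi.conj().T

# The bra-ket <psi|phi> is then an ordinary matrix product (a 1x1 array here)
braket = bra_psi @ ket_phi
assert np.isclose(braket[0, 0], np.vdot(ket_psi.ravel(), ket_phi.ravel()))
```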

With this basic notation equipped, let’s study a concept called “operators” in the next chapter.

2. Operators in quantum mechanics

Operators are mathematical objects that allow us to describe physical properties, such as position, momentum, and energy. Let’s start with the second postulate of quantum mechanics:

Postulate II: A physical quantity $\mathcal{A}$ is described by an operator $\hat{A}$ acting on the state space $\mathcal{V}$, and this operator is an observable.

In other words, an operator acts on elements of state space $\mathcal{V}$, which are kets, and these kets are modified by the operator $\hat{A}$ in some manner. This can be written as $\hat{A} |\psi \rangle = |\psi' \rangle$, meaning that the operator $\hat{A}$ acts on the ket $| \psi \rangle$, which is modified to another ket $|\psi' \rangle$. It’s important to remember that operators in quantum mechanics can act on a superposition of different states:

$$\hat{A} (a_1 |\psi_1 \rangle + a_2 |\psi_2 \rangle) = a_1 \hat{A} |\psi_1 \rangle + a_2 \hat{A} | \psi_2 \rangle.$$
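
As a quick numerical check of this linearity, here is a minimal sketch using a matrix as a stand-in for a linear operator on a 2-dimensional state space (the operator and amplitudes are arbitrary choices):

```python
import numpy as np

A = np.array([[0, 1], [1, 0]])   # a linear operator (here, the Pauli-X matrix)
psi1 = np.array([1.0, 0.0])      # |psi_1>
psi2 = np.array([0.0, 1.0])      # |psi_2>
a1, a2 = 0.6, 0.8j               # complex amplitudes

# A(a1|psi_1> + a2|psi_2>) = a1 A|psi_1> + a2 A|psi_2>
lhs = A @ (a1 * psi1 + a2 * psi2)
rhs = a1 * (A @ psi1) + a2 * (A @ psi2)
assert np.allclose(lhs, rhs)
```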

As can be guessed from this expression, such operators are called “linear operators.” Fortunately, in quantum mechanics, we can always work with linear operators, which makes the study of quantum mechanics a lot easier. Similar to the properties of addition and scalar multiplication of kets we studied earlier, (linear) operators have the following basic properties:

  • Associativity of addition: $\hat{A} + (\hat{B} + \hat{C}) = (\hat{A} + \hat{B}) + \hat{C}$
  • Commutativity of addition: $\hat{A} + \hat{B} = \hat{B} + \hat{A}$

Before stating the properties of multiplication of operators, let’s start with its definition:

  • Multiplication of operators: $(\hat{A}\hat{B}) |\psi\rangle = \hat{A} ( \hat{B} |\psi \rangle )$

The above can also be thought of as $\hat{A}$ acting on the new state $\hat{B} |\psi \rangle = |\psi' \rangle$, i.e. $\hat{A} |\psi' \rangle$. Now we state the properties of multiplication of operators:

  • Associativity of multiplication: $\hat{A} (\hat{B}\hat{C}) = (\hat{A} \hat{B}) \hat{C}$
  • Non-commutativity of multiplication: $\hat{A} \hat{B} \neq \hat{B} \hat{A}$ (in general)

The non-commutativity of multiplication is one of the most important properties of operators in quantum mechanics. Due to this aspect, we also define commutators:

  • Commutator: $[\hat{A}, \hat{B}] = \hat{A}\hat{B} - \hat{B} \hat{A}$

The non-commutativity of operator multiplication, and consequently the notion of the commutator, plays a fundamental role in quantum mechanics. Specifically, two operators that do not commute are associated with properties that cannot be measured simultaneously in a quantum system, such as the position and momentum operators.
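
A minimal sketch of non-commutativity, using the Pauli-X and Pauli-Z matrices as concrete operators on a 2-dimensional state space:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])    # Pauli-X
Z = np.array([[1, 0], [0, -1]])   # Pauli-Z

# Operator multiplication is matrix multiplication, and it need not commute
assert not np.allclose(X @ Z, Z @ X)

# The commutator [X, Z] = XZ - ZX is therefore non-zero
commutator = X @ Z - Z @ X
print(commutator)   # [[ 0 -2]
                    #  [ 2  0]]
```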

Now, we introduce the notion of the adjoint operator. The adjoint operator can be thought of as the dual of the operator introduced above, just like a ket corresponds to a bra in the dual space. We can write this as follows:

$$|\psi'\rangle =\hat{A}|\psi \rangle \longleftrightarrow \langle \psi' | = \langle \psi|\hat{A}^\dagger.$$

The adjoint operator is also linear, just like the operator introduced previously. With this notion, we can introduce some particularly important classes of operators in quantum mechanics (checked numerically after the list):

  • Hermitian operator: $\hat{A} = \hat{A}^\dagger$
  • Unitary operator: $\hat{A}^{-1} = \hat{A}^\dagger$
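
A small check of these definitions, using the Pauli-Z matrix (Hermitian) and the Hadamard matrix (unitary) as standard examples:

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]])                # Pauli-Z
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard

# Hermitian: equal to its own adjoint (conjugate transpose)
assert np.allclose(Z, Z.conj().T)

# Unitary: the adjoint is the inverse, so H H^dagger = I
assert np.allclose(H @ H.conj().T, np.eye(2))
```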

Another important way to write an operator is through the “outer product.” It turns out that the outer product is also an operator, as can be seen below:

$$(|\phi \rangle \langle \psi | ) |\chi \rangle = | \phi \rangle ( \langle \psi | \chi \rangle ) = a|\phi \rangle,$$

where we denoted the scalar (inner) product $\langle \psi | \chi \rangle$ as $a \in \mathbb{C}$.
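
A numerical sketch of the outer product acting as an operator (the vectors are arbitrary; `np.outer` with a conjugated second argument builds the matrix $|\phi\rangle\langle\psi|$):

```python
import numpy as np

phi = np.array([1.0, 1j])
psi = np.array([0.5, -0.5j])
chi = np.array([2.0, 1.0 + 1j])

# The outer product |phi><psi| as a matrix: entries phi_i * conj(psi_j)
op = np.outer(phi, psi.conj())

# Acting on |chi> gives the scalar <psi|chi> times |phi>
a = np.vdot(psi, chi)
assert np.allclose(op @ chi, a * phi)
```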

We close this chapter with some basic mathematical properties of operators (verified numerically after the list):

  • $\langle \psi | \hat{A}^\dagger | \rho \rangle = \langle \rho |\hat{A}|\psi\rangle^*$
  • $(\hat{A}^\dagger)^\dagger = \hat{A}$
  • $(a \hat{A})^\dagger = a^* \hat{A}^\dagger$
  • $(\hat{A} + \hat{B})^\dagger = \hat{A}^\dagger + \hat{B}^\dagger$
  • $(\hat{A}\hat{B})^\dagger = \hat{B}^\dagger \hat{A}^\dagger$
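
These identities are easy to verify numerically for matrices; a minimal sketch with randomly generated complex matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
a = 0.3 + 0.9j

dag = lambda M: M.conj().T   # the adjoint (dagger) of a matrix

assert np.allclose(dag(dag(A)), A)                   # (A^dagger)^dagger = A
assert np.allclose(dag(a * A), np.conj(a) * dag(A))  # (aA)^dagger = a* A^dagger
assert np.allclose(dag(A + B), dag(A) + dag(B))      # additivity
assert np.allclose(dag(A @ B), dag(B) @ dag(A))      # order reverses under the adjoint
```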

3. Representations in quantum mechanics

Just like choosing a convenient basis in Euclidean space leads to simpler notation, choosing a convenient basis in quantum mechanics can simplify the mathematical representation as well. We represent state space with an orthonormal basis. An orthonormal basis is mathematically defined as a set $\{ |u_i\rangle \}$ such that $\langle u_i | u_j \rangle = \delta_{ij}$, where $\delta_{ij} = 1$ if $i = j$, and $0$ otherwise (this is known as the “Kronecker delta”). Also, every ket in the state space, $|\psi \rangle \in \mathcal{V}$, can be represented as a unique linear combination of the set $\{ |u_i\rangle \}$, i.e. $|\psi \rangle = \sum_i c_i |u_i \rangle$.

Since we’re working with an orthonormal basis, it is quite easy to find a specific coefficient as follows:

$$\langle u_j | \psi \rangle = \langle u_j | \left( \sum_i c_i |u_i \rangle \right) = \sum_i c_i \langle u_j | u_i \rangle = \sum_i c_i \delta_{ij} = c_j.$$
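
A minimal numerical sketch of extracting coefficients this way, using the orthonormal basis $\{|+\rangle, |-\rangle\}$ of a 2-dimensional state space (the state $|\psi\rangle$ is arbitrary):

```python
import numpy as np

# An orthonormal basis of a 2-dimensional state space
u0 = np.array([1.0, 1.0]) / np.sqrt(2)   # |+>
u1 = np.array([1.0, -1.0]) / np.sqrt(2)  # |->
basis = [u0, u1]

psi = np.array([0.6, 0.8j])

# Coefficients c_j = <u_j|psi>
c = [np.vdot(u, psi) for u in basis]

# The expansion sum_j c_j |u_j> reproduces |psi>
assert np.allclose(c[0] * u0 + c[1] * u1, psi)
```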

In words, we can say that the coefficients $\{ c_i \}$ are a representation of the ket $|\psi \rangle$ in the $\{ |u_i\rangle \}$ basis. An important concept in representations is the “closure relation”:

$$\begin{aligned} |\psi \rangle &= \sum_i \langle u_i | \psi \rangle |u_i \rangle = \left( \sum_i |u_i \rangle \langle u_i | \right) | \psi \rangle \\
&= \sum_i c_i | u_i\rangle \quad \text{where}\quad c_i = \langle u_i | \psi \rangle. \end{aligned}$$

Recall from Chapter 2 that an operator can be written as an outer product. In that sense, from the above, we can see that $\sum_i |u_i \rangle \langle u_i |$ is an operator (notice that I emphasized this with parentheses) that, given a state $|\psi\rangle$, returns the same state $|\psi \rangle$. Thus, we can also write $\sum_i |u_i \rangle \langle u_i | = \mathbb{I}$, where $\mathbb{I}$ is the identity operator. This result is sometimes also called the “resolution of the identity (in the $\{|u_i\rangle \}$ basis).” A similar result holds for a bra $\langle \psi | \in \mathcal{V}^*$ :

$$\begin{aligned} \langle \psi | = \langle \psi | \mathbb{I} = \langle \psi | \left( \sum_i |u_i \rangle \langle u_i | \right) &= \sum_i \langle \psi | u_i \rangle \langle u_i | \\
&= \sum_i c_i^* \langle u_i | \quad \text{where} \quad c_i^* = \langle u_i | \psi \rangle^* = \langle \psi | u_i \rangle. \end{aligned}$$
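
A quick numerical check of the closure relation, in the same 2-dimensional basis used in the sketch above:

```python
import numpy as np

u0 = np.array([1.0, 1.0]) / np.sqrt(2)
u1 = np.array([1.0, -1.0]) / np.sqrt(2)

# Resolution of the identity: sum_i |u_i><u_i| = I
closure = np.outer(u0, u0.conj()) + np.outer(u1, u1.conj())
assert np.allclose(closure, np.eye(2))

# Consequently it maps any ket back to itself
psi = np.array([0.6, 0.8j])
assert np.allclose(closure @ psi, psi)
```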

Now, let’s look at how we represent operators. Recall the definition of an operator $\hat{A} |\psi\rangle = |\psi' \rangle$. Using the same $\{ |u_i \rangle \}$ basis as before, we can write down both kets as below:

$$\begin{aligned} |\psi\rangle &= \sum_i c_i | u_i \rangle \quad\text{for}\quad c_i = \langle u_i | \psi \rangle \\
|\psi'\rangle &= \sum_i c_i' | u_i \rangle \quad\text{for}\quad c_i' = \langle u_i | \psi' \rangle. \end{aligned}$$

Notice that we can write $c_i'$ in an alternative way as follows:

$$\begin{aligned} c_i' &= \langle u_i | \psi' \rangle = \langle u_i | \hat{A} | \psi \rangle = \langle u_i | \hat{A}\mathbb{I} | \psi \rangle \quad\text{(recall closure relation)} \\ &= \langle u_i | \hat{A} \left( \sum_j |u_j \rangle \langle u_j | \right) |\psi\rangle = \sum_j \langle u_i |\hat{A} |u_j \rangle \langle u_j | \psi \rangle. \end{aligned}$$

Using this alternative form of $c_i'$, we can write down the ket $|\psi'\rangle$ in a different form:

$$\begin{aligned} |\psi'\rangle &= \sum_i \left( \sum_j \langle u_i|\hat{A}|u_j \rangle \langle u_j|\psi\rangle \right) |u_i\rangle \\
&= \left( \sum_{ij} |u_i \rangle \langle u_i | \hat{A}| u_j \rangle \langle u_j| \right) |\psi\rangle \\
&= \hat{A} |\psi\rangle. \end{aligned}$$

Therefore, we see that we can write the operator $\hat{A}$ as

$$\hat{A} = \left( \sum_{ij} |u_i \rangle \langle u_i | \hat{A}| u_j \rangle \langle u_j| \right) = \sum_{ij} A_{ij} |u_i\rangle \langle u_j|,$$

where $A_{ij} = \langle u_i | \hat{A} | u_j \rangle \in \mathbb{C}$. Therefore, we arrive at scalar values $A_{ij}$ that represent the operator $\hat{A}$ in the $\{ |u_i\rangle \}$ basis, just like the coefficients $c_i = \langle u_i | \psi \rangle$ represent the ket $|\psi \rangle$ in the same basis.
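
A minimal sketch of computing the matrix elements $A_{ij} = \langle u_i|\hat{A}|u_j\rangle$ and using them to obtain the coefficients of $|\psi'\rangle = \hat{A}|\psi\rangle$, with the Pauli-X operator and the $\{|+\rangle, |-\rangle\}$ basis as a concrete example:

```python
import numpy as np

A = np.array([[0, 1], [1, 0]])           # Pauli-X operator
u0 = np.array([1.0, 1.0]) / np.sqrt(2)   # |+>
u1 = np.array([1.0, -1.0]) / np.sqrt(2)  # |->
basis = [u0, u1]

# Matrix elements A_ij = <u_i|A|u_j>; Pauli-X is diagonal, diag(1, -1), in this basis
A_rep = np.array([[np.vdot(ui, A @ uj) for uj in basis] for ui in basis])

# The coefficients of |psi'> = A|psi> follow from c'_i = sum_j A_ij c_j
psi = np.array([0.6, 0.8j])
c = np.array([np.vdot(u, psi) for u in basis])
c_prime = A_rep @ c
assert np.allclose(sum(cp * u for cp, u in zip(c_prime, basis)), A @ psi)
```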

With the above, we close this chapter, and thus part 1 of this blog post about the basics of quantum mechanics. In the next part, we will study the matrix formulation of quantum mechanics, how to change basis, and a particularly important basis called the eigenbasis.