Formulas for commutators and anticommutators

When an addition and a multiplication are both defined for all elements of a set \(\set{A, B, \dots}\), we can check whether multiplication is commutative by calculating the commutator: \[\begin{equation} \require{physics} \comm{A}{B} = AB - BA \thinspace . \end{equation}\] \(A\) and \(B\) are said to commute if their commutator is zero. We can analogously define the anticommutator between \(A\) and \(B\) as \[\begin{equation} \comm{A}{B}_+ = AB + BA \thinspace . \end{equation}\]

From these definitions, we can easily see that \[\begin{align} & \comm{A}{B} = - \comm{B}{A} \\ & \comm{A}{B}_+ = \comm{B}{A}_+ \thinspace . \end{align}\]
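These symmetry properties are easy to verify numerically with random matrices; a minimal sketch with NumPy (the helper names `comm` and `acomm` are ours, not from any library):

```python
import numpy as np

def comm(A, B):
    """Commutator [A, B] = AB - BA."""
    return A @ B - B @ A

def acomm(A, B):
    """Anticommutator [A, B]_+ = AB + BA."""
    return A @ B + B @ A

rng = np.random.default_rng(0)
A, B = rng.standard_normal((2, 3, 3))  # two random 3x3 matrices

assert np.allclose(comm(A, B), -comm(B, A))   # antisymmetry of the commutator
assert np.allclose(acomm(A, B), acomm(B, A))  # symmetry of the anticommutator
```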

Letting \(\dagger\) stand for the Hermitian adjoint, we can write for operators or matrices \(A\) and \(B\): \[\begin{align} & \comm{A}{B}^\dagger = \comm{B^\dagger}{A^\dagger} = - \comm{A^\dagger}{B^\dagger} \\ & \comm{A}{B}^\dagger_+ = \comm{A^\dagger}{B^\dagger}_+ \thinspace . \end{align}\]
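A quick numerical check of both adjoint identities, using random complex matrices (the helpers `comm`, `acomm`, and `dag` are illustrative names of our own):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A
def acomm(A, B): return A @ B + B @ A
def dag(M): return M.conj().T  # Hermitian adjoint (conjugate transpose)

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

assert np.allclose(dag(comm(A, B)), comm(dag(B), dag(A)))
assert np.allclose(dag(comm(A, B)), -comm(dag(A), dag(B)))
assert np.allclose(dag(acomm(A, B)), acomm(dag(A), dag(B)))
```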

If \(U\) is a unitary operator or matrix, we can see that \[\begin{equation} \comm{U^\dagger A U}{U^\dagger B U } = U^\dagger \comm{A}{B} U \thinspace . \end{equation}\]
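The invariance of the commutator under a unitary similarity transformation can be checked with a random unitary obtained from a QR decomposition (again, `comm` and `dag` are our own helper names):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A
def dag(M): return M.conj().T

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# QR decomposition of a random complex matrix yields a unitary Q
U, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
Ud = dag(U)

assert np.allclose(comm(Ud @ A @ U, Ud @ B @ U), Ud @ comm(A, B) @ U)
```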

Using the definitions, we can derive some useful formulas for converting commutators of products to sums of commutators: \[\begin{align} & \comm{A}{BC} = B \comm{A}{C} + \comm{A}{B} C \\ & \comm{AB}{C} = A \comm{B}{C} + \comm{A}{C}B \\ & \comm{AB}{CD} = A \comm{B}{C} D + AC \comm{B}{D} + \comm{A}{C} DB + C \comm{A}{D} B \\ & \comm{ABC}{D} = AB \comm{C}{D} + A \comm{B}{D} C + \comm{A}{D} BC \\ & \comm{A}{BCD} = BC \comm{A}{D} + B \comm{A}{C} D + \comm{A}{B} CD \end{align}\]
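All five product rules can be verified at once on random matrices; a sketch (helper name `comm` is ours):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A

rng = np.random.default_rng(3)
A, B, C, D = rng.standard_normal((4, 3, 3))  # four random 3x3 matrices

assert np.allclose(comm(A, B @ C), B @ comm(A, C) + comm(A, B) @ C)
assert np.allclose(comm(A @ B, C), A @ comm(B, C) + comm(A, C) @ B)
assert np.allclose(comm(A @ B, C @ D),
                   A @ comm(B, C) @ D + A @ C @ comm(B, D)
                   + comm(A, C) @ D @ B + C @ comm(A, D) @ B)
assert np.allclose(comm(A @ B @ C, D),
                   A @ B @ comm(C, D) + A @ comm(B, D) @ C + comm(A, D) @ B @ C)
assert np.allclose(comm(A, B @ C @ D),
                   B @ C @ comm(A, D) + B @ comm(A, C) @ D + comm(A, B) @ C @ D)
```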

In general, we can summarize these formulas as \[\begin{equation} \comm{A}{B_1 B_2 \cdots B_n} = \comm{A}{\prod_{k=1}^n B_k} = \sum_{k=1}^n B_1 \cdots B_{k-1} \comm{A}{B_k} B_{k+1} \cdots B_n \thinspace . \end{equation}\]
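The general \(n\)-factor formula translates directly into a loop over the factors; a sketch for \(n = 4\) (the helper `comm` and the list name `Bs` are our own):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))
Bs = [rng.standard_normal((3, 3)) for _ in range(4)]  # B_1, ..., B_4

# Left-hand side: [A, B_1 B_2 B_3 B_4]
lhs = comm(A, np.linalg.multi_dot(Bs))

# Right-hand side: sum_k  B_1 ... B_{k-1} [A, B_k] B_{k+1} ... B_n
rhs = np.zeros_like(lhs)
for k in range(len(Bs)):
    factors = Bs[:k] + [comm(A, Bs[k])] + Bs[k + 1:]
    rhs = rhs + np.linalg.multi_dot(factors)

assert np.allclose(lhs, rhs)
```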

Concerning sufficiently well-behaved functions \(f\) of \(B\), we can prove that \[\begin{equation} \comm{\comm{A}{B}}{B} = 0 \qquad\Rightarrow\qquad \comm{A}{f(B)} = f'(B) \comm{A}{B} \thinspace . \end{equation}\]
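The hypothesis \(\comm{\comm{A}{B}}{B} = 0\) is restrictive, but concrete matrices satisfying it are easy to construct. A small check for \(f(B) = B^2\), where the rule gives \(\comm{A}{B^2} = 2B\comm{A}{B}\) (this particular choice of \(A\) and \(B\) is our own, engineered so that \(\comm{A}{B} = B\)):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A

A = np.diag([2.0, 1.0, 0.0])   # diagonal matrix
B = np.diag([1.0, 1.0], k=1)   # nilpotent upper-shift matrix

# By construction [A, B] = B, so [[A, B], B] = 0 while [A, B] != 0
assert np.allclose(comm(A, B), B)
assert np.allclose(comm(comm(A, B), B), 0)

# For f(B) = B^2: [A, f(B)] = f'(B) [A, B] = 2 B [A, B]
assert np.allclose(comm(A, B @ B), 2 * B @ comm(A, B))
```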

In electronic structure theory, we often want to end up with anticommutators: \[\begin{align} & \comm{A}{BC} = \comm{A}{B}_+ C - B \comm{A}{C}_+ \\ & \comm{AB}{C} = A \comm{B}{C}_+ - \comm{A}{C}_+ B \end{align}\]
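Both commutator-to-anticommutator conversions can be checked on random matrices (helpers `comm` and `acomm` are our own names):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A
def acomm(A, B): return A @ B + B @ A

rng = np.random.default_rng(5)
A, B, C = rng.standard_normal((3, 3, 3))  # three random 3x3 matrices

assert np.allclose(comm(A, B @ C), acomm(A, B) @ C - B @ acomm(A, C))
assert np.allclose(comm(A @ B, C), A @ acomm(B, C) - acomm(A, C) @ B)
```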

If there are still products inside an anticommutator, we can use the following formulas: \[\begin{align} & \comm{A}{BC}_+ = \comm{A}{B} C + B \comm{A}{C}_+ \\ & \comm{A}{BC}_+ = \comm{A}{B}_+ C - B \comm{A}{C} \\ & \comm{AB}{C}_+ = A \comm{B}{C}_+ - \comm{A}{C} B \\ & \comm{AB}{C}_+ = \comm{A}{C}_+ B + A \comm{B}{C} \end{align}\]
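The four anticommutator product rules pass the same kind of random-matrix test (`comm` and `acomm` are our own helper names):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A
def acomm(A, B): return A @ B + B @ A

rng = np.random.default_rng(6)
A, B, C = rng.standard_normal((3, 3, 3))

assert np.allclose(acomm(A, B @ C), comm(A, B) @ C + B @ acomm(A, C))
assert np.allclose(acomm(A, B @ C), acomm(A, B) @ C - B @ comm(A, C))
assert np.allclose(acomm(A @ B, C), A @ acomm(B, C) - comm(A, C) @ B)
assert np.allclose(acomm(A @ B, C), acomm(A, C) @ B + A @ comm(B, C))
```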

The elementary BCH (Baker-Campbell-Hausdorff) formula reads \[\begin{equation} \exp(A) \exp(B) = \exp(A + B + \frac{1}{2} \comm{A}{B} + \cdots) \thinspace , \end{equation}\] where higher order nested commutators have been left out. From this, two special consequences can be formulated: \[\begin{align} \exp(-A) \thinspace B \thinspace \exp(A) &= B + \comm{B}{A} + \frac{1}{2!} \comm{\comm{B}{A}}{A} + \cdots \\ &= \sum_{n=0}^{+ \infty} \frac{1}{n!} \thinspace {}_n\comm{B}{A} \thinspace , \end{align}\] in which \({}_n\comm{B}{A}\) is the \(n\)-fold nested commutator in which the increased nesting is in the left argument, and \[\begin{align} \exp(A) \thinspace B \thinspace \exp(-A) &= B + \comm{A}{B} + \frac{1}{2!} \comm{A}{\comm{A}{B}} + \cdots \\ &= \sum_{n=0}^{+ \infty} \frac{1}{n!} \comm{A}{B}_n \thinspace , \end{align}\] in which \(\comm{A}{B}_n\) is the \(n\)-fold nested commutator in which the increased nesting is in the right argument.
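The first special consequence can be verified numerically by comparing \(\exp(-A)\, B \exp(A)\) against a truncated nested-commutator series. A sketch, using a small \(A\) so both series converge quickly (`expm` here is our own Taylor-series implementation, not a library routine, and the truncation depths are illustrative):

```python
import numpy as np
from math import factorial

def comm(A, B): return A @ B - B @ A

def expm(M, terms=30):
    """Matrix exponential via its Taylor series (adequate for small norms)."""
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for n in range(1, terms):
        term = term @ M / n          # M^n / n!
        out = out + term
    return out

rng = np.random.default_rng(7)
A = 0.1 * rng.standard_normal((3, 3))  # small, so the expansion converges fast
B = rng.standard_normal((3, 3))

# exp(-A) B exp(A) = sum_n (1/n!) [B, A]_(n-fold, nesting in the left argument)
lhs = expm(-A) @ B @ expm(A)
rhs, nested = np.zeros((3, 3)), B
for n in range(15):
    rhs = rhs + nested / factorial(n)
    nested = comm(nested, A)         # deepen the nesting on the left argument

assert np.allclose(lhs, rhs)
```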

Let \(A\) be an anti-Hermitian operator, and \(H\) be a Hermitian operator. We can then show that \(\comm{A}{H}\) is Hermitian: \[\begin{equation} \comm{A}{H}^\dagger = \comm{A}{H} \thinspace . \end{equation}\]
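This follows from the adjoint rule above: \(\comm{A}{H}^\dagger = \comm{H^\dagger}{A^\dagger} = \comm{H}{-A} = \comm{A}{H}\). A numerical check, building an anti-Hermitian \(A\) and a Hermitian \(H\) from random complex matrices (`comm` and `dag` are our own helper names):

```python
import numpy as np

def comm(A, B): return A @ B - B @ A
def dag(M): return M.conj().T

rng = np.random.default_rng(8)
X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

A = X - dag(X)  # anti-Hermitian: A^dagger = -A
H = Y + dag(Y)  # Hermitian:       H^dagger = H

assert np.allclose(dag(comm(A, H)), comm(A, H))  # [A, H] is Hermitian
```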